Monte Carlo techniques in radiation therapy
Verhaegen, Frank
2013-01-01
Modern cancer treatment relies on Monte Carlo simulations to help radiotherapists and clinical physicists better understand and compute radiation dose from imaging devices as well as exploit four-dimensional imaging data. With Monte Carlo-based treatment planning tools now available from commercial vendors, a complete transition to Monte Carlo-based dose calculation methods in radiotherapy could likely take place in the next decade. Monte Carlo Techniques in Radiation Therapy explores the use of Monte Carlo methods for modeling various features of internal and external radiation sources, including light ion beams. The book, the first of its kind, addresses applications of the Monte Carlo particle transport simulation technique in radiation therapy, mainly focusing on external beam radiotherapy and brachytherapy. It presents the mathematical and technical aspects of the methods in particle transport simulations. The book also discusses the modeling of medical linacs and other irradiation devices; issues specific...
Golden, L. M.
1979-01-01
To account for surface roughness, the transmission of microwave radiation through a planetary surface to an observer is treated by a Monte Carlo technique. Sizable effects are found near the limb of the planet, and they should be included in analyses of high-resolution observations and high-precision integrated disk observations.
Parallel processing Monte Carlo radiation transport codes
International Nuclear Information System (INIS)
McKinney, G.W.
1994-01-01
Issues related to distributed-memory multiprocessing as applied to Monte Carlo radiation transport are discussed. Measurements of communication overhead are presented for the radiation transport code MCNP, which employs the communication software package PVM, and average efficiency curves are provided for a homogeneous virtual machine.
Monte Carlo applications to radiation shielding problems
International Nuclear Information System (INIS)
Subbaiah, K.V.
2009-01-01
Monte Carlo methods are a class of computational algorithms that rely on repeated random sampling of physical and mathematical systems to compute their results. The basic concepts of MC are simple and straightforward and can be learned using a personal computer. Monte Carlo methods require large quantities of random numbers, and it was this need that spurred the development of pseudorandom number generators, which are far quicker to use than the tables of random numbers previously employed for statistical sampling. In Monte Carlo simulation of radiation transport, the history (track) of a particle is viewed as a random sequence of free flights, each ending in an interaction event where the particle changes its direction of motion, loses energy and, occasionally, produces secondary particles. The Monte Carlo simulation of a given experimental arrangement (e.g., an electron beam coming from an accelerator and impinging on a water phantom) consists of the numerical generation of random histories. To simulate these histories we need an interaction model, i.e., a set of differential cross sections (DCSs) for the relevant interaction mechanisms. The DCSs determine the probability distribution functions (pdfs) of the random variables that characterize a track: (1) the free path between successive interaction events, (2) the type of interaction taking place, and (3) the energy loss and angular deflection in a particular event (and the initial state of any emitted secondary particles). Once these pdfs are known, random histories can be generated using appropriate sampling methods. If the number of generated histories is large enough, quantitative information on the transport process may be obtained by simply averaging over the simulated histories. The Monte Carlo method yields the same information as the solution of the Boltzmann transport equation, with the same interaction model, but is easier to implement. In particular, the simulation of radiation
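The sampling loop described above (free path, interaction type, angular deflection, averaging over histories) can be sketched in a few lines. The following toy model is an illustration only, not any of the codes discussed in these records: it transports photons through a one-dimensional slab with isotropic scattering and a made-up absorption fraction, and estimates the transmitted fraction by averaging over random histories.

```python
import math
import random

def simulate_history(mu_t, mu_a, thickness, rng):
    """Follow one photon history through a 1-D slab (analog Monte Carlo).

    mu_t: total interaction coefficient [1/mfp units], mu_a: absorption part.
    Returns True if the photon escapes through the far face.
    """
    x, direction = 0.0, 1.0          # start at the left face, moving forward
    while True:
        # (1) sample the free path between interactions: s = -ln(xi) / mu_t
        s = -math.log(rng.random()) / mu_t
        x += direction * s           # direction is the cosine w.r.t. the slab axis
        if x >= thickness:
            return True              # transmitted through the far face
        if x <= 0.0:
            return False             # escaped backwards
        # (2) sample the interaction type from the cross-section ratio
        if rng.random() < mu_a / mu_t:
            return False             # absorbed: the history ends here
        # (3) sample the angular deflection: isotropic scattering means the
        #     new direction cosine is uniform on [-1, 1]
        direction = rng.uniform(-1.0, 1.0)

rng = random.Random(42)
histories = 100_000
transmitted = sum(simulate_history(1.0, 0.5, 2.0, rng) for _ in range(histories))
print(f"transmission ~ {transmitted / histories:.4f}")
```

Averaging the 0/1 scores over histories gives the transmission estimate; its statistical uncertainty shrinks as the square root of the number of histories, which is why variance-reduction techniques matter for deep-penetration problems.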
Monte Carlo method in radiation transport problems
International Nuclear Information System (INIS)
Dejonghe, G.; Nimal, J.C.; Vergnaud, T.
1986-11-01
In neutral-radiation transport problems (neutrons, photons), two quantities are important: the flux in phase space and the particle density. Solving the problem with the Monte Carlo method involves, among other things, building a statistical process (called the play) and assigning a numerical value to a variable x (this assignment is called the score). Sampling techniques are presented. The necessity of biasing the play is demonstrated, and a biased simulation is performed. Finally, current developments (for instance, the rewriting of programs) are presented, motivated by, among other factors, the advent of vector computation and the treatment of photon and neutron transport in void regions
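The play/score distinction, and why biasing the play pays off, can be illustrated with a deliberately simple deep-penetration problem. This is a hypothetical sketch, not the authors' code: the probability that a photon crosses a purely absorbing slab five mean free paths thick is estimated twice, once with the analog game and once with a biased game whose adjusted score keeps the mean unchanged.

```python
import math
import random

MU_A, THICKNESS, N = 1.0, 5.0, 10_000   # absorption coeff., slab depth, histories

def analog_play(rng):
    """Analog game: score 1 if the photon's sampled flight to its first
    absorption exceeds the slab thickness, else score 0. For a thick slab
    almost every score is 0, so the estimate is noisy."""
    hits = sum(1 for _ in range(N)
               if -math.log(rng.random()) / MU_A > THICKNESS)
    return hits / N

def biased_play():
    """Biased game: force every photon across the slab, but score its
    statistical weight w = exp(-mu_a * d), which exactly compensates the
    biased sampling. The expected score is unchanged; in this degenerate
    example the variance even drops to zero."""
    return math.exp(-MU_A * THICKNESS)

rng = random.Random(1)
print(f"exact  : {math.exp(-MU_A * THICKNESS):.5f}")
print(f"analog : {analog_play(rng):.5f}")   # noisy: mostly zero scores
print(f"biased : {biased_play():.5f}")      # every history contributes
```

Real biasing schemes (implicit capture, splitting, exponential transform) are less trivial than this forced-transmission example, but the principle is the same: change the play, adjust the score by the ratio of the true to the biased probabilities, and the estimator stays unbiased while its variance drops.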
Radiation protection for human interplanetary spaceflight and planetary surface operations
Energy Technology Data Exchange (ETDEWEB)
Clark, B.C. [Armed Forces Radiobiology Research Inst., Bethesda, MD (United States); DLR Inst. of Aerospace Medicine, Cologne (Germany); NASA Goddard Space Flight Center, Greenbelt, MD (United States)]
1993-12-31
Radiation protection issues are reviewed for five categories of radiation exposure during human missions to the moon and Mars: trapped radiation belts, galactic cosmic rays, solar flare particle events, planetary surface emissions, and on-board radiation sources. Relative hazards are dependent upon spacecraft and vehicle configurations, flight trajectories, human susceptibility, shielding effectiveness, monitoring and warning systems, and other factors. Crew cabins, interplanetary mission modules, surface habitats, planetary rovers, and extravehicular mobility units (spacesuits) provide various degrees of protection. Countermeasures that may be taken are reviewed relative to added complexity and risks that they could entail, with suggestions for future research and analysis.
Radiation Belts of Antiparticles in Planetary Magnetospheres
Pugacheva, G. I.; Gusev, A. A.; Jayanthi, U. B.; Martin, I. M.; Spjeldvik, W. N.
2007-05-01
The Earth's radiation belts could be populated, besides electrons and protons, by antiparticles such as positrons (Basilova et al., 1982) and antiprotons (pbar). Positrons are born in the decay of pions that are directly produced in nuclear reactions of trapped relativistic inner-zone protons with the residual atmosphere at altitudes of about 500 to 3000 km above the Earth's surface. Antiprotons are born from high-energy (E > 6 GeV) cosmic rays in the reactions p + p → p + p + p + pbar and p + p → p + p + n + nbar. The trapping and storage of these charged antiparticles in the magnetosphere result in radiation belts similar to the classical Van Allen belts of protons and electrons. We describe the mathematical techniques used for numerical simulation of the trapped positron and antiproton belt fluxes. The pion and antiproton yields were simulated on the basis of the Russian nuclear-reaction Monte Carlo code MSDM, a Multy Stage Dynamical Model (Dementyev and Sobolevsky, 1999). For the positron flux estimates we accounted for ionisation, bremsstrahlung, and synchrotron energy losses. The resulting numerical estimates show that the positron flux with energy >100 MeV trapped in the radiation belt at L=1.2 is of the order of ~1000 m-2 s-1 sr-1, and that it is very sensitive to the shape of the trapped proton spectrum. This confined positron flux is found to be greater than the albedo (untrapped) mixed electron/positron flux of about 50 m-2 s-1 sr-1 produced by cosmic rays in the same region at the top of the geomagnetic field line at L=1.2. As we show in this report, the albedo flux also consists mostly of positrons. The trapped antiproton fluxes produced by cosmic rays in the Earth's upper rarefied atmosphere were calculated in the energy range from 10 MeV to several GeV. In the simulations we included a mathematical treatment of the radial diffusion process, both inner and outer antiproton sources, and losses of particles due to the ionization process
Discrete diffusion Monte Carlo for frequency-dependent radiative transfer
Energy Technology Data Exchange (ETDEWEB)
Densmore, Jeffrey D.; Thompson, Kelly G.; Urbatsch, Todd J. [Los Alamos National Laboratory]
2010-11-17
Discrete Diffusion Monte Carlo (DDMC) is a technique for increasing the efficiency of Implicit Monte Carlo radiative-transfer simulations. In this paper, we develop an extension of DDMC for frequency-dependent radiative transfer. We base our new DDMC method on a frequency-integrated diffusion equation for frequencies below a specified threshold. Above this threshold we employ standard Monte Carlo. With a frequency-dependent test problem, we confirm the increased efficiency of our new DDMC technique.
International Nuclear Information System (INIS)
Basit, Abdul; Raza, S Shoaib; Irfan, Naseem
2006-01-01
In this paper a Monte Carlo model for describing the atmospheric dispersion of radionuclides (represented by Lagrangian particles/neutral tracers) continuously released into a stable planetary boundary layer is presented. The effect of variation in release height and wind directional shear on plume dispersion is studied. The resultant plume concentration and dose rate at the ground is also calculated. The turbulent atmospheric parameters, like vertical profiles of fluctuating wind velocity components and eddy lifetime, were calculated using empirical relations for a stable atmosphere. The horizontal and vertical dispersion coefficients calculated by a numerical Lagrangian model are compared with the original and modified Pasquill-Gifford and Briggs empirical σs. The comparison shows that the Monte Carlo model can successfully predict dispersion in a stable atmosphere using the empirical turbulent parameters. The predicted ground concentration and dose rate contours indicate a significant increase in the affected area when wind shear is accounted for in the calculations
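The random-walk treatment of tracer particles summarized above is straightforward to prototype. The sketch below is illustrative only, with invented parameter values rather than the paper's empirical stable-atmosphere profiles: each Lagrangian tracer's vertical velocity follows a Langevin (Ornstein-Uhlenbeck) process with standard deviation sigma_w and eddy timescale tau, and the vertical dispersion coefficient sigma_z is estimated from the particle ensemble.

```python
import math
import random

def disperse_particles(n_particles, n_steps, dt, sigma_w, tau, rng):
    """Lagrangian particle dispersion sketch: each tracer's vertical
    velocity w evolves as an Ornstein-Uhlenbeck process (exact discrete
    update), and its height z is integrated from w. Returns final heights."""
    a = math.exp(-dt / tau)                  # velocity autocorrelation over dt
    b = sigma_w * math.sqrt(1.0 - a * a)     # matching random-forcing amplitude
    heights = []
    for _ in range(n_particles):
        z = 0.0                              # release height (ground-level here)
        w = rng.gauss(0.0, sigma_w)          # start from the stationary velocity pdf
        for _ in range(n_steps):
            w = a * w + b * rng.gauss(0.0, 1.0)
            z += w * dt
        heights.append(z)
    return heights

rng = random.Random(3)
zs = disperse_particles(5_000, 200, 1.0, 0.3, 100.0, rng)
mean = sum(zs) / len(zs)
sigma_z = math.sqrt(sum((z - mean) ** 2 for z in zs) / len(zs))
# Taylor's (1921) formula sigma_z^2 = 2 sigma_w^2 tau^2 (t/tau - 1 + exp(-t/tau))
# predicts roughly 45 m for these (invented) parameters
print(f"plume spread sigma_z ~ {sigma_z:.1f} m after 200 s")
```

Comparing the ensemble sigma_z against empirical Pasquill-Gifford or Briggs curves, as the paper does, then amounts to repeating this estimate at successive downwind travel times.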
Continental Ice Sheets and the Planetary Radiation Budget
Oerlemans, J.
1980-01-01
The interaction between continental ice sheets and the planetary radiation budget is potentially important in climate-sensitivity studies. A simple ice-sheet model incorporated in an energy-balance climate model provides a tool for studying this interaction in a quantitative way. Experiments in which
A Monte Carlo transport code study of the space radiation environment using FLUKA and ROOT
Wilson, T; Carminati, F; Brun, R; Ferrari, A; Sala, P; Empl, A; MacGibbon, J
2001-01-01
We report on the progress of a current study aimed at developing a state-of-the-art Monte-Carlo computer simulation of the space radiation environment using advanced computer software techniques recently available at CERN, the European Laboratory for Particle Physics in Geneva, Switzerland. By taking the next-generation computer software appearing at CERN and adapting it to known problems in the implementation of space exploration strategies, this research is identifying changes necessary to bring these two advanced technologies together. The radiation transport tool being developed is tailored to the problem of taking measured space radiation fluxes impinging on the geometry of any particular spacecraft or planetary habitat and simulating the evolution of that flux through an accurate model of the spacecraft material. The simulation uses the latest known results in low-energy and high-energy physics. The output is a prediction of the detailed nature of the radiation environment experienced in space as well a...
Development of a space radiation Monte Carlo computer simulation based on the FLUKA and ROOT codes
Pinsky, L; Ferrari, A; Sala, P; Carminati, F; Brun, R
2001-01-01
This NASA funded project is proceeding to develop a Monte Carlo-based computer simulation of the radiation environment in space. With actual funding only initially in place at the end of May 2000, the study is still in the early stage of development. The general tasks have been identified and personnel have been selected. The code to be assembled will be based upon two major existing software packages. The radiation transport simulation will be accomplished by updating the FLUKA Monte Carlo program, and the user interface will employ the ROOT software being developed at CERN. The end-product will be a Monte Carlo-based code which will complement the existing analytic codes such as BRYNTRN/HZETRN presently used by NASA to evaluate the effects of radiation shielding in space. The planned code will possess the ability to evaluate the radiation environment for spacecraft and habitats in Earth orbit, in interplanetary space, on the lunar surface, or on a planetary surface such as Mars. Furthermore, it will be usef...
Morse Monte Carlo Radiation Transport Code System
Energy Technology Data Exchange (ETDEWEB)
Emmett, M.B.
1975-02-01
The report contains sections describing the MORSE and PICTURE codes, input descriptions, sample problems, derivations of the physical equations, and explanations of the various error messages. The MORSE code is a multipurpose neutron and gamma-ray transport Monte Carlo code. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry may be used, with an albedo option available at any material surface. The PICTURE code provides aid in preparing correct input data for the combinatorial geometry package CG. It provides a printed view of arbitrary two-dimensional slices through the geometry. By inspecting these pictures one may determine whether the geometry specified by the input cards is indeed the desired geometry. 23 refs. (WRF)
A study of Monte Carlo radiative transfer through fractal clouds
Energy Technology Data Exchange (ETDEWEB)
Gautier, C.; Lavallec, D.; O'Hirok, W.; Ricchiazzi, P. [Univ. of California, Santa Barbara, CA (United States)] [and others]
1996-04-01
An understanding of radiation transport (RT) through clouds is fundamental to studies of the earth's radiation budget and climate dynamics. The transmission through horizontally homogeneous clouds has been studied thoroughly using accurate, discrete-ordinates radiative transfer models. However, the applicability of these results to general problems of the global radiation budget is limited by the plane-parallel assumption and by the fact that real cloud fields show variability, both vertically and horizontally, on all size scales. To understand how radiation interacts with realistic clouds, we have used a Monte Carlo radiative transfer model to compute the details of the photon-cloud interaction on synthetic cloud fields. Synthetic cloud fields, generated by a cascade model, reproduce the scaling behavior as well as the cloud variability observed in and estimated from satellite cloud data.
Barão, Fernando; Nakagawa, Masayuki; Távora, Luis; Vaz, Pedro
2001-01-01
This book focuses on the state of the art of Monte Carlo methods in radiation physics and particle transport simulation and applications, the latter involving, in particular, the use and development of electron-gamma, neutron-gamma and hadronic codes. Besides the basic theory and the methods employed, special attention is paid to algorithm development for modeling, and to the analysis of experiments and measurements in a variety of fields ranging from particle physics to medical physics.
Radiation pressure dynamics in planetary exospheres - a natural framework
International Nuclear Information System (INIS)
Bishop, J.; Chamberlain, J.W.
1989-01-01
Exospheric theory is reformulated to provide for the analysis of dynamical underpinning of exospheric features. The formulation is based on the parabolic-cylindrical separability of the Hamiltonian that describes particle motions in the combined fields of planetary gravity and solar radiation pressure. An approximate solution for trajectory evolution in terms of orbital elements is derived and the role of the exopause in the tail phenomenon is discussed. Also, an expression is obtained for the bound constituent atom densities at outer planetocoronal positions along the planet-sun axis for the case of an evaporative, uniform exobase. This expression is used to estimate midnight density enhancements as a function of radial distance for the terrestrial planets. 19 refs
Monte Carlo radiation transport: A revolution in science
International Nuclear Information System (INIS)
Hendricks, J.
1993-01-01
When Enrico Fermi, Stan Ulam, Nicholas Metropolis, John von Neumann, and Robert Richtmyer invented the Monte Carlo method fifty years ago, little could they imagine the far-flung consequences, the international applications, and the revolution in science epitomized by their abstract mathematical method. The Monte Carlo method is used in a wide variety of fields to solve exact computational models approximately by statistical sampling. It is an alternative to traditional physics modeling methods which solve approximate computational models exactly by deterministic methods. Modern computers and improved methods, such as variance reduction, have enhanced the method to the point of enabling a true predictive capability in areas such as radiation or particle transport. This predictive capability has contributed to a radical change in the way science is done: design and understanding come from computations built upon experiments rather than being limited to experiments, and the computer codes doing the computations have become the repository for physics knowledge. The MCNP Monte Carlo computer code effort at Los Alamos is an example of this revolution. Physicians unfamiliar with physics details can design cancer treatments using physics buried in the MCNP computer code. Hazardous environments and hypothetical accidents can be explored. Many other fields, from underground oil well exploration to aerospace, from physics research to energy production, from safety to bulk materials processing, benefit from MCNP, the Monte Carlo method, and the revolution in science
Mazzola, Guglielmo; Helled, Ravit; Sorella, Sandro
2018-01-01
Understanding planetary interiors is directly linked to our ability of simulating exotic quantum mechanical systems such as hydrogen (H) and hydrogen-helium (H-He) mixtures at high pressures and temperatures. Equation of state (EOS) tables based on density functional theory are commonly used by planetary scientists, although this method allows only for a qualitative description of the phase diagram. Here we report quantum Monte Carlo (QMC) molecular dynamics simulations of pure H and H-He mixture. We calculate the first QMC EOS at 6000 K for a H-He mixture of a protosolar composition, and show the crucial influence of He on the H metallization pressure. Our results can be used to calibrate other EOS calculations and are very timely given the accurate determination of Jupiter's gravitational field from the NASA Juno mission and the effort to determine its structure.
Radiative heat transfer by the Monte Carlo method
Hartnett, James P; Cho, Young I; Greene, George A; Taniguchi, Hiroshi; Yang, Wen-Jei; Kudo, Kazuhiko
1995-01-01
This book presents the basic principles and applications of radiative heat transfer used in energy, space, and geo-environmental engineering, and can serve as a reference book for engineers and scientists in research and development. A PC disk containing software for numerical analyses by the Monte Carlo method is included to provide hands-on practice in analyzing actual radiative heat transfer problems. Advances in Heat Transfer is designed to fill the information gap between regularly scheduled journals and university-level textbooks by providing in-depth review articles over a broader scope than journals or texts usually allow. Key features: offers solution methods for the integro-differential formulation to help avoid difficulties; includes a computer disk for numerical analyses by PC; discusses energy absorption by gas and scattering effects by particles; treats non-gray radiative gases; provides example problems for direct applications in energy, space, and geo-environmental engineering.
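To give a concrete flavor of the kind of analysis such Monte Carlo software performs (this sketch is illustrative and not taken from the book's disk), a diffuse view factor between two directly opposed parallel squares can be estimated by ray tracing: launch rays from random points on the emitter in cosine-weighted (Lambertian) directions and count the fraction that strike the receiver.

```python
import math
import random

def view_factor_parallel_squares(side, gap, rays, rng):
    """Estimate the diffuse view factor F(1->2) between two directly
    opposed parallel squares of the given side length, separated by gap.
    Each ray starts at a uniform point on the emitter and leaves in a
    cosine-weighted direction; F is the fraction of rays hitting the
    receiver, since diffuse emission is cosine-distributed."""
    hits = 0
    for _ in range(rays):
        # uniform emission point on the lower square
        x0, y0 = rng.random() * side, rng.random() * side
        # cosine-weighted direction via Malley's method:
        # uniform point on the unit disk, projected onto the hemisphere
        r1, r2 = rng.random(), rng.random()
        sin_t, phi = math.sqrt(r1), 2.0 * math.pi * r2
        dx, dy = sin_t * math.cos(phi), sin_t * math.sin(phi)
        dz = math.sqrt(1.0 - r1)             # strictly > 0 since r1 < 1
        # intersect the ray with the receiver plane z = gap
        t = gap / dz
        x, y = x0 + t * dx, y0 + t * dy
        if 0.0 <= x <= side and 0.0 <= y <= side:
            hits += 1
    return hits / rays

rng = random.Random(7)
F = view_factor_parallel_squares(1.0, 1.0, 200_000, rng)
# tabulated value for unit squares at unit separation is about 0.20
print(f"F(1->2) ~ {F:.3f}")
```

The same ray-counting machinery generalizes directly to the absorbing, scattering, and non-gray gas problems the book treats; only the per-ray physics along each path changes.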
Wilkins, Richard
2010-01-01
The Center for Radiation Engineering and Science for Space Exploration (CRESSE) at Prairie View A&M University, Prairie View, Texas, USA, is establishing an integrated, multi-disciplinary research program on the scientific and engineering challenges faced by NASA and the international space community caused by space radiation. CRESSE focuses on space radiation research directly applicable to astronaut health and safety during future long term, deep space missions, including Martian, lunar, and other planetary body missions beyond low earth orbit. The research approach will consist of experimental and theoretical radiation modeling studies utilizing particle accelerator facilities including: 1. NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory; 2. Proton Synchrotron at Loma Linda University Medical Center; and 3. Los Alamos Neutron Science Center (LANSCE) at Los Alamos National Laboratory. Specifically, CRESSE investigators are designing, developing, and building experimental test beds that simulate the lunar and Martian radiation environments for experiments focused on risk assessment for astronauts and instrumentation. The testbeds have been designated the Bioastronautics Experimental Research Testbeds for Environmental Radiation Nostrum Investigations and Education (BERT and ERNIE). The designs of BERT and ERNIE will allow for a high degree of flexibility and adaptability to modify experimental configurations to simulate planetary surface environments, planetary habitats, and spacecraft interiors. In the nominal configuration, BERT and ERNIE will consist of a set of experimental zones that will simulate the planetary atmosphere (solid CO2 in the case of the Martian surface), the planetary surface, and sub-surface regions. These experimental zones can be used for dosimetry, shielding, biological, and electronic effects radiation studies in support of space exploration missions. BERT and ERNIE are designed to be compatible with the
Acceleration of a Monte Carlo radiation transport code
International Nuclear Information System (INIS)
Hochstedler, R.D.; Smith, L.M.
1996-01-01
Execution time for the Integrated TIGER Series (ITS) Monte Carlo radiation transport code has been reduced by careful re-coding of computationally intensive subroutines. Three test cases for the TIGER (1-D slab geometry), CYLTRAN (2-D cylindrical geometry), and ACCEPT (3-D arbitrary geometry) codes were identified and used to benchmark and profile program execution. Based upon these results, sixteen top time-consuming subroutines were examined and nine of them modified to accelerate computations with equivalent numerical output to the original. The results obtained via this study indicate that speedup factors of 1.90 for the TIGER code, 1.67 for the CYLTRAN code, and 1.11 for the ACCEPT code are achievable. copyright 1996 American Institute of Physics
Françoise Benz
2006-01-01
2005-2006 ACADEMIC TRAINING PROGRAMME LECTURE SERIES 27, 28, 29 June 11:00-12:00 - TH Conference Room, bldg. 4 The use of Monte Carlo radiation transport codes in radiation physics and dosimetry F. Salvat Gavalda, Univ. de Barcelona, A. FERRARI, CERN-AB, M. SILARI, CERN-SC Lecture 1. Transport and interaction of electromagnetic radiation F. Salvat Gavalda, Univ. de Barcelona Interaction models and simulation schemes implemented in modern Monte Carlo codes for the simulation of coupled electron-photon transport will be briefly reviewed. Different schemes for simulating electron transport will be discussed. Condensed algorithms, which rely on multiple-scattering theories, are comparatively fast, but less accurate than mixed algorithms, in which hard interactions (with energy loss or angular deflection larger than certain cut-off values) are simulated individually. The reliability, and limitations, of electron-interaction models and multiple-scattering theories will be analyzed. Benchmark comparisons of simu...
Monte Carlo simulation of radiation streaming from a radioactive material shipping cask
International Nuclear Information System (INIS)
Liu, Y.Y.; Schwarz, R.A.; Tang, J.S.
1996-01-01
Simulated detection of gamma radiation streaming from a radioactive material shipping cask has been performed with the Monte Carlo codes MCNP4A and MORSE-SGC/S. Despite inherent difficulties in simulating deep penetration of radiation and streaming, the simulations have yielded results that agree within one order of magnitude with the radiation survey data, with reasonable statistics. These simulations have also provided insight into modeling radiation detection, notably on the location and orientation of the radiation detector with respect to photon streaming paths, and on techniques used to reduce variance in the Monte Carlo calculations. 13 refs., 4 figs., 2 tabs
Monte Carlo analysis of radiative transport in oceanographic lidar measurements
Energy Technology Data Exchange (ETDEWEB)
Cupini, E.; Ferro, G. [ENEA, Divisione Fisica Applicata, Centro Ricerche Ezio Clementel, Bologna (Italy)]; Ferrari, N. [Bologna Univ., Bologna (Italy), Dipt. Ingegneria Energetica, Nucleare e del Controllo Ambientale]
2001-07-01
The analysis of oceanographic lidar systems measurements is often carried out with semi-empirical methods, since there is only a rough understanding of the effects of many environmental variables. The development of techniques for interpreting the accuracy of lidar measurements is needed to evaluate the effects of various environmental situations, as well as of different experimental geometric configurations and boundary conditions. A Monte Carlo simulation model represents a tool that is particularly well suited for answering these important questions. The PREMAR-2F Monte Carlo code has been developed taking into account the main molecular and non-molecular components of the marine environment. The laser radiation interaction processes of diffusion, re-emission, refraction and absorption are treated. In particular are considered: the Rayleigh elastic scattering, produced by atoms and molecules with small dimensions with respect to the laser emission wavelength (i.e. water molecules); the Mie elastic scattering, arising from atoms or molecules with dimensions comparable to the laser wavelength (hydrosols); the Raman inelastic scattering, typical of water; the absorption by water and by inorganic (sediments) and organic (phytoplankton and CDOM) hydrosols; and the fluorescence re-emission of chlorophyll and yellow substances. PREMAR-2F is an extension of a code for the simulation of radiative transport in atmospheric environments (PREMAR-2). The approach followed in PREMAR-2 was to combine conventional Monte Carlo techniques with analytical estimates of the probability that the receiver records a contribution from photons coming back after an interaction in the field of view of the lidar fluorosensor collecting apparatus. This offers an effective means of modelling a lidar system with realistic geometric constraints. The resulting semianalytic Monte Carlo radiative transfer model has been developed in the frame of the Italian Research Program for Antarctica (PNRA) and it is
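The combination described here, a conventional random walk plus an analytical estimate of the probability of reaching the receiver, is the classic next-event (semianalytic) estimator. A stripped-down illustration follows; it uses a one-dimensional rod model with isotropic scattering, nothing like the full PREMAR-2F geometry, to show how every collision can contribute an analytic escape probability to the tally while the walk itself continues.

```python
import math
import random

def transmission_next_event(mu_t, mu_s, thickness, n, rng):
    """Every collision scores the analytic probability that the photon
    scatters forward (prob. 1/2 in the 1-D rod model) and then crosses the
    remaining material uncollided. The walk continues with a sampled
    direction, but its eventual escape is never scored directly, so
    nothing is double-counted."""
    tally = 0.0
    for _ in range(n):
        tally += math.exp(-mu_t * thickness)      # uncollided source term
        x, direction, weight = 0.0, 1.0, 1.0
        while True:
            x += direction * (-math.log(rng.random()) / mu_t)
            if not 0.0 < x < thickness:
                break                             # walk left the rod
            weight *= mu_s / mu_t                 # implicit capture at collision
            tally += weight * 0.5 * math.exp(-mu_t * (thickness - x))
            direction = rng.choice((-1.0, 1.0))   # isotropic in the rod model
    return tally / n

def transmission_analog(mu_t, mu_s, thickness, n, rng):
    """Reference analog game: count photons that actually cross the rod."""
    hits = 0
    for _ in range(n):
        x, direction = 0.0, 1.0
        while True:
            x += direction * (-math.log(rng.random()) / mu_t)
            if x >= thickness:
                hits += 1
                break
            if x <= 0.0:
                break
            if rng.random() >= mu_s / mu_t:
                break                             # absorbed
            direction = rng.choice((-1.0, 1.0))
    return hits / n

rng = random.Random(11)
print(f"next-event: {transmission_next_event(1.0, 0.7, 3.0, 50_000, rng):.4f}")
print(f"analog    : {transmission_analog(1.0, 0.7, 3.0, 50_000, rng):.4f}")
```

The two estimators agree in expectation, but the next-event tally receives a contribution from every collision rather than only from the rare histories that escape, which is exactly why a semianalytic estimator suits a small, distant receiver such as a lidar aperture.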
International Nuclear Information System (INIS)
Gualdrini, G.F.; Casalini, L.; Morelli, B.
1994-12-01
The present report summarizes the numerical-dosimetry activities on photon dosimetric quantities carried out at the Radiation Protection Institute of ENEA (Italian Agency for New Technologies, Energy and the Environment). The first part concerns MCNP Monte Carlo calculations of field parameters and operational quantities for the ICRU sphere with reference photon beams, for the design of personal dosemeters. The second part concerns studies on the ADAM anthropomorphic phantom using the SABRINA and MCNP codes. The results of other Monte Carlo studies on electron conversion factors for various tissue-equivalent slab phantoms are about to be published in other ENEA reports. The report has been produced in the framework of the EURADOS WG4 (numerical dosimetry) activities, within a collaboration between the ENEA Environmental Department and the ENEA Energy Department
International Nuclear Information System (INIS)
Zazula, J.M.
1983-01-01
The general-purpose code BALTORO was written to couple three-dimensional Monte Carlo (MC) and one-dimensional Discrete Ordinates (DO) radiation transport calculations. The quantity of a radiation-induced (neutron or gamma-ray) nuclear effect, or the score from a radiation-yielding nuclear effect, can be analysed in this way. (author)
Application of Monte Carlo method in determination of secondary characteristic X radiation in XFA
International Nuclear Information System (INIS)
Roubicek, P.
1982-01-01
Secondary characteristic radiation is excited by primary radiation from the X-ray tube and by the secondary radiation of other elements, so that excitations of several orders result. The Monte Carlo method was used to account for all these possibilities, and the resulting flux of characteristic radiation was simulated for samples of silicate raw materials. A comparison of the results of these computations with experiments makes it possible to determine the effect of sample preparation on the characteristic radiation flux. (M.D.)
Monte Carlo simulation of breast imaging using synchrotron radiation
Energy Technology Data Exchange (ETDEWEB)
Fitousi, N. T.; Delis, H.; Panayiotakis, G. [Department of Medical Physics, Faculty of Medicine, University of Patras, 26504 Patras (Greece)
2012-04-15
Purpose: Synchrotron radiation (SR), being the brightest artificial source of x-rays with a very promising geometry, has raised the scientific expectations that it could be used for breast imaging with optimized results. The "in situ" evaluation of this technique is difficult to perform, mostly due to the limited available SR facilities worldwide. In this study, a simulation model for SR breast imaging was developed, based on Monte Carlo simulation techniques, and validated using data acquired in the SYRMEP beamline of the Elettra facility in Trieste, Italy. Furthermore, primary results concerning the performance of SR were derived. Methods: The developed model includes the exact setup of the SR beamline, considering that the x-ray source is located at almost 23 m from the slit, while the photon energy was considered to originate from a very narrow Gaussian spectrum. Breast phantoms, made of Perspex and filled with air cavities, were irradiated with energies in the range of 16-28 keV. The model included a Gd2O2S detector with the same characteristics as the one available in the SYRMEP beamline. Following the development and validation of the model, experiments were performed in order to evaluate the contrast resolution of SR. A phantom made of adipose tissue and filled with inhomogeneities of several compositions and sizes was designed and utilized to simulate the irradiation under conventional mammography and SR conditions. Results: The validation results of the model showed an excellent agreement with the experimental data, with the correlation for contrast being 0.996. Significant differences only appeared at the edges of the phantom, where phase effects occur. The initial evaluation experiments revealed that SR shows very good performance in terms of the image quality indices utilized, namely subject contrast and contrast to noise ratio. The response of subject contrast to energy is monotonic; however, this does not stand for
VUV photochemistry simulation of planetary upper atmosphere using synchrotron radiation.
Carrasco, Nathalie; Giuliani, Alexandre; Correia, Jean Jacques; Cernogora, Guy
2013-07-01
The coupling of a gas reactor, named APSIS, with a vacuum-ultraviolet (VUV) beamline at the SOLEIL synchrotron radiation facility, for a photochemistry study of gas mixtures, is reported. The reactor may be irradiated windowless with gas pressures up to hundreds of millibar, and thus allows the effect of energetic photons below 100 nm wavelength to be studied on possibly dense media. This set-up is perfectly suited to atmospheric photochemistry investigations, as illustrated by a preliminary report of a simulation of the upper atmospheric photochemistry of Titan, the largest satellite of Saturn. Titan's atmosphere is mainly composed of molecular nitrogen and methane. Solar VUV irradiation with wavelengths no longer than 100 nm on the top of the atmosphere enables the dissociation and ionization of nitrogen, involving a nitrogen chemistry specific to nitrogen-rich upper atmospheres.
Monte Carlo and analytic simulations in nanoparticle-enhanced radiation therapy
Directory of Open Access Journals (Sweden)
Paro AD
2016-09-01
Autumn D Paro,1 Mainul Hossain,2 Thomas J Webster,1,3,4 Ming Su1,4 1Department of Chemical Engineering, Northeastern University, Boston, MA, USA; 2NanoScience Technology Center and School of Electrical Engineering and Computer Science, University of Central Florida, Orlando, Florida, USA; 3Excellence for Advanced Materials Research, King Abdulaziz University, Jeddah, Saudi Arabia; 4Wenzhou Institute of Biomaterials and Engineering, Chinese Academy of Sciences, Wenzhou Medical University, Zhejiang, People’s Republic of China. Abstract: Analytical and Monte Carlo simulations have been used to predict dose enhancement factors in nanoparticle-enhanced X-ray radiation therapy. Both simulations predict an increase in dose enhancement in the presence of nanoparticles, but the two methods predict different levels of enhancement over the studied range of energies, nanoparticle materials, and concentrations, for several reasons. The Monte Carlo simulation calculates energy deposited by electrons and photons, while the analytical one only calculates energy deposited by source photons and photoelectrons; the Monte Carlo simulation accounts for electron–hole recombination, while the analytical one does not; and the Monte Carlo simulation randomly samples photon or electron paths and accounts for particle interactions, while the analytical simulation assumes a linear trajectory. This study demonstrates that the Monte Carlo simulation is the better choice to evaluate dose enhancement with nanoparticles in radiation therapy. Keywords: nanoparticle, dose enhancement, Monte Carlo simulation, analytical simulation, radiation therapy, tumor cell, X-ray
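The trajectory argument above can be made concrete with a toy slab model. All cross sections below are invented for illustration: a linear-trajectory analytical estimate counts only primary-photon absorption, while a small Monte Carlo loop also follows scattered photons, so the two give different dose enhancement factors.

```python
import math
import random

random.seed(7)

def energy_deposited_mc(sigma_abs, sigma_scat, thickness, n_hist=50000):
    """Monte Carlo estimate of the fraction of incident photons absorbed in
    the slab; scattered photons are followed until absorption or escape."""
    sigma_tot = sigma_abs + sigma_scat
    absorbed = 0
    for _ in range(n_hist):
        x, mu = 0.0, 1.0
        while True:
            x += mu * (-math.log(random.random()) / sigma_tot)
            if not 0.0 <= x < thickness:
                break                                # escaped the slab
            if random.random() < sigma_abs / sigma_tot:
                absorbed += 1                        # deposited locally
                break
            mu = 2.0 * random.random() - 1.0         # isotropic re-scatter
    return absorbed / n_hist

def energy_deposited_linear(sigma_abs, sigma_tot, thickness):
    """Linear-trajectory model: primary photons only, no scatter transport."""
    return (sigma_abs / sigma_tot) * (1.0 - math.exp(-sigma_tot * thickness))

# Hypothetical attenuation coefficients (1/cm) for a water-like slab and the
# same slab doped with a high-Z nanoparticle additive (absorption tripled).
water = dict(sigma_abs=0.03, sigma_scat=0.17, thickness=5.0)
doped = dict(sigma_abs=0.09, sigma_scat=0.17, thickness=5.0)

def_mc = energy_deposited_mc(**doped) / energy_deposited_mc(**water)
def_lin = (energy_deposited_linear(0.09, 0.26, 5.0)
           / energy_deposited_linear(0.03, 0.20, 5.0))
print(f"dose enhancement factor  MC: {def_mc:.2f}  linear model: {def_lin:.2f}")
```

Both approaches agree that the additive enhances dose, but not on the magnitude, mirroring the discrepancy the abstract describes.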
The use of Monte Carlo radiation transport codes in radiation physics and dosimetry
CERN. Geneva; Ferrari, Alfredo; Silari, Marco
2006-01-01
Transport and interaction of electromagnetic radiation. Interaction models and simulation schemes implemented in modern Monte Carlo codes for the simulation of coupled electron-photon transport will be briefly reviewed. In these codes, photon transport is simulated by using the detailed scheme, i.e., interaction by interaction. Detailed simulation is easy to implement, and the reliability of the results is limited only by the accuracy of the adopted cross sections. Simulations of electron and positron transport are more difficult, because these particles undergo a large number of interactions in the course of their slowing down. Different schemes for simulating electron transport will be discussed. Condensed algorithms, which rely on multiple-scattering theories, are comparatively fast, but less accurate than mixed algorithms, in which hard interactions (with energy loss or angular deflection larger than certain cut-off values) are simulated individually. The reliability, and limitations, of electron-interacti...
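The "detailed" photon scheme described here can be sketched in a few lines. The cross sections below are invented single-energy placeholders, and the scattering law is simplified to isotropic rather than Klein-Nishina:

```python
import math
import random

random.seed(1)

# Hypothetical macroscopic cross sections (1/cm) at one photon energy; real
# codes interpolate these from tabulated data per material and energy.
SIGMA_PHOTO = 0.02    # photoelectric absorption
SIGMA_COMPTON = 0.18  # incoherent (Compton) scattering
SIGMA_TOTAL = SIGMA_PHOTO + SIGMA_COMPTON

def track_photon(slab_thickness_cm):
    """Detailed scheme: simulate each photon interaction individually."""
    x, mu = 0.0, 1.0                     # depth and direction cosine
    while True:
        # free path length from the exponential attenuation law
        x += mu * (-math.log(random.random()) / SIGMA_TOTAL)
        if x >= slab_thickness_cm:
            return "transmitted"
        if x < 0.0:
            return "backscattered"
        # choose the interaction type in proportion to its cross section
        if random.random() < SIGMA_PHOTO / SIGMA_TOTAL:
            return "absorbed"
        # Compton scatter: new direction (isotropic here for brevity; a real
        # code samples the Klein-Nishina angular distribution instead)
        mu = 2.0 * random.random() - 1.0

n = 20000
results = [track_photon(5.0) for _ in range(n)]
transmission = results.count("transmitted") / n
print(f"transmitted fraction: {transmission:.3f}")
```

The reliability of such a history-by-history loop is indeed limited only by the cross-section data fed into it, which is the point the abstract makes.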
Beth, A.; Garnier, P.; Toublanc, D.; Dandouras, I.; Mazelle, C.
2016-12-01
The planetary exospheres are poorly known in their outer parts, since the neutral densities are low compared with the instruments' detection capabilities. Exospheric models are thus often the main source of information at such high altitudes. We present a new way to take into account analytically the additional effect of the stellar radiation pressure on planetary exospheres. In a series of papers, we present with a Hamiltonian approach the effect of the radiation pressure on dynamical trajectories, density profiles and escaping thermal flux. Our work is a generalization of the study by Bishop and Chamberlain [1989] Icarus, 81, 145-163. In this third paper, we investigate the effect of the stellar radiation pressure on the Circular Restricted Three Body Problem (CR3BP), also called the photogravitational CR3BP, and its implications for the escape and the stability of planetary exospheres, especially for hot Jupiters. In particular, we describe the transformation of the equipotentials and the location of the Lagrange points, and we provide a modified equation for the Hill sphere radius that includes the influence of the radiation pressure. Finally, an application to the hot Jupiter HD 209458b and the hot Neptune GJ 436b reveals the existence of a blow-off escape regime induced by the stellar radiation pressure.
Monte Carlo simulation of radiative heat transfer in coarse fibrous media
Energy Technology Data Exchange (ETDEWEB)
Nisipeanu, E.; Jones, P.D.
1999-07-01
Radiative transfer through a medium made up of a multitude of randomly oriented opaque cylindrical fibers is examined using Monte Carlo simulation of multiple surface radiative exchange for energy bundles interacting with each fiber in their path. The method is termed Monte Carlo Discontinuous Medium (MCDM). As compared to radiative continuum methods, the present approach does not require specification of extinction coefficient, scattering albedo, or scattering phase function. Instead, only volume fraction, fiber diameter, and fiber material complex index of refraction are required as parameters. Although the MCDM method is only strictly valid in the geometric limit, comparison with previous experiments on the edge of this limit (5 < x < 11) is qualitatively good. For the low (solid) volume fractions considered here, agreement is excellent between MCDM results and radiative continuum results, the latter being solved both by Monte Carlo simulation and by exact integral solution of the Radiative Transfer Equation (RTE). MCDM results show a sensitivity to directional bias of the fibers in the medium, suggesting that bias parameters are necessary to solve radiative transfer in media with non-random fiber orientations. MCDM results for fibrous media are very similar to those for spherical suspensions at the same volume fraction and scatterer diameter, suggesting that the precise shape of a scattering particle may be relatively less important for radiation heat transfer through randomly oriented solid matrix materials.
Monte Carlo simulations of the radiation environment for the CMS Experiment
Bayshev, I.; Bergstrom, I.; Cooijmans, T.; Dabrowski, A.; Glöggler, L.; Guthoff, M.; Kurochkin, I.; Vincke, H.; Tajeda, S.
2016-01-01
Monte Carlo radiation transport codes are used by the CMS Beam Radiation Instrumentation and Luminosity (BRIL) project to estimate the radiation levels due to proton-proton collisions and machine-induced background. Results are used by the CMS collaboration for various applications: comparison with detector hit rates, pile-up studies, predictions of radiation damage based on various models (Dose, NIEL, DPA), shielding design, and estimates of the residual dose environment. Simulation parameters and the maintenance of the input files are summarised, and key results are presented. Furthermore, an overview of additional programs developed by the BRIL project to meet the specific needs of the CMS community is given.
Implicit Monte Carlo methods and non-equilibrium Marshak wave radiative transport
International Nuclear Information System (INIS)
Lynch, J.E.
1985-01-01
Two enhancements to the Fleck implicit Monte Carlo method for radiative transport are described, for use in transparent and opaque media respectively. The first introduces a spectral mean cross section, which applies to pseudoscattering in transparent regions with a high-frequency incident spectrum. The second provides a simple Monte Carlo random walk method for opaque regions, without the need for a supplementary diffusion equation formulation. A time-dependent transport Marshak wave problem of radiative transfer, in which a non-equilibrium condition exists between the radiation and material energy fields, is then solved. These results are compared to published benchmark solutions and to new discrete ordinate S-N results, both for spatially integrated radiation-material energies versus time and for new spatially dependent temperature profiles. Multigroup opacities, which are independent of both temperature and frequency, are used in addition to a material specific heat which is proportional to the cube of the temperature. 7 refs., 4 figs
Combining four Monte Carlo estimators for radiation momentum deposition
International Nuclear Information System (INIS)
Hykes, Joshua M.; Urbatsch, Todd J.
2011-01-01
Using four distinct Monte Carlo estimators for momentum deposition - analog, absorption, collision, and track-length estimators - we compute a combined estimator. In the wide range of problems tested, the combined estimator always has a figure of merit (FOM) equal to or better than the other estimators. In some instances the FOM of the combined estimator is only a few percent higher than the FOM of the best solo estimator, the track-length estimator, while in one instance it is better by a factor of 2.5. Over the majority of configurations, the combined estimator's FOM is 10-20% greater than any of the solo estimators' FOM. The numerical results show that the track-length estimator is the most important term in computing the combined estimator, followed far behind by the analog estimator. The absorption and collision estimators make negligible contributions. (author)
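As an illustration of the idea (not the paper's actual weighting, which also accounts for estimator covariances), here is a minimal sketch that scores a slab absorption rate with collision and track-length estimators simultaneously and combines them with inverse-variance weights:

```python
import math
import random
import statistics

random.seed(3)
SIG_A, SIG_S = 0.3, 0.7        # illustrative cross sections (1/cm)
SIG_T = SIG_A + SIG_S
L = 2.0                        # slab thickness; source enters at x = 0

def one_history():
    """Score the slab absorption rate with two estimators at once:
    collision (SIG_A/SIG_T per collision) and track-length (SIG_A * path)."""
    x, mu = 0.0, 1.0
    col = trk = 0.0
    while True:
        s = -math.log(random.random()) / SIG_T
        d_bound = (L - x) / mu if mu > 0 else x / -mu
        if s >= d_bound:                      # escapes through a face
            trk += SIG_A * d_bound
            return col, trk
        x += mu * s
        trk += SIG_A * s
        col += SIG_A / SIG_T
        if random.random() < SIG_A / SIG_T:   # analog absorption
            return col, trk
        mu = 2.0 * random.random() - 1.0      # isotropic scatter
        while abs(mu) < 1e-12:                # avoid a zero direction cosine
            mu = 2.0 * random.random() - 1.0

n = 50000
scores = [one_history() for _ in range(n)]
col_scores = [c for c, _ in scores]
trk_scores = [t for _, t in scores]
m_col, v_col = statistics.fmean(col_scores), statistics.pvariance(col_scores)
m_trk, v_trk = statistics.fmean(trk_scores), statistics.pvariance(trk_scores)

# Inverse-variance weights; ignoring the covariance between the two
# estimators is a simplification relative to the paper's combination.
w_col, w_trk = 1.0 / v_col, 1.0 / v_trk
combined = (w_col * m_col + w_trk * m_trk) / (w_col + w_trk)
print(f"collision {m_col:.4f}  track-length {m_trk:.4f}  combined {combined:.4f}")
```

Both solo estimators are unbiased for the same quantity, so the weighted average is too; the gain, as in the paper, is a lower variance for the same number of histories.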
International Nuclear Information System (INIS)
Densmore, Jeffery D.; Larsen, Edward W.
2004-01-01
The equations of nonlinear, time-dependent radiative transfer are known to yield the equilibrium diffusion equation as the leading-order solution of an asymptotic analysis when the mean-free path and mean-free time of a photon become small. We apply this same analysis to the Fleck-Cummings, Carter-Forest, and N'kaoua Monte Carlo approximations for grey (frequency-independent) radiative transfer. Although Monte Carlo simulation usually does not require the discretizations found in deterministic transport techniques, Monte Carlo methods for radiative transfer require a time discretization due to the nonlinearities of the problem. If an asymptotic analysis of the equations used by a particular Monte Carlo method yields an accurate time-discretized version of the equilibrium diffusion equation, the method should generate accurate solutions if a time discretization is chosen that resolves temperature changes, even if the time steps are much larger than the mean-free time of a photon. This analysis is of interest because in many radiative transfer problems, it is a practical necessity to use time steps that are large compared to a mean-free time. Our asymptotic analysis shows that: (i) the N'kaoua method has the equilibrium diffusion limit, (ii) the Carter-Forest method has the equilibrium diffusion limit if the material temperature change during a time step is small, and (iii) the Fleck-Cummings method does not have the equilibrium diffusion limit. We include numerical results that verify our theoretical predictions
Czech Academy of Sciences Publication Activity Database
Barone, F.; La Nave, E.; Matzeu, M.; Mazzei, F.; Sy, D.; Běgusová, Marie
2000-01-01
Vol. 76, No. 6 (2000), pp. 731-740. ISSN 0955-3002. Institutional research plan: CEZ:AV0Z1048901. Keywords: DNA triplex * ionizing radiation * footprinting * Monte Carlo simulation. Subject RIV: AQ - Safety, Health Protection, Human - Machine. Impact factor: 2.586, year: 2000
Local dose enhancement in radiation therapy: Monte Carlo simulation study
International Nuclear Information System (INIS)
Silva, Laura E. da; Nicolucci, Patricia
2014-01-01
The development of nanotechnology has boosted the use of nanoparticles in radiation therapy in order to achieve a greater therapeutic ratio between tumor and healthy tissues. Gold has been shown to be most suitable for this task due to its high biocompatibility and high atomic number, which contribute to a better in vivo distribution and to local energy deposition. This study therefore uses Monte Carlo simulation to investigate the local dose enhancement produced by a gold nanoparticle in the tumor cell. At a range of 11 nm from the nanoparticle surface, results have shown an absorbed dose 141 times higher for the medium with the gold nanoparticle compared to water, for an incident spectrum with maximum photon energy of 50 keV. It was also noted that when only scattered radiation interacts with the gold nanoparticle, the dose was 134 times higher than in water, an enhanced local dose that remains significant even for scattered radiation. (author)
Osmane, Adnane; Wilson, Lynn B., III; Blum, Lauren; Pulkkinen, Tuija I.
2016-01-01
Using a dynamical-system approach, we have investigated the efficiency of large-amplitude whistler waves for causing microburst precipitation in planetary radiation belts by modeling the microburst energy and particle fluxes produced as a result of nonlinear wave-particle interactions. We show that wave parameters, consistent with large-amplitude oblique whistlers, can commonly generate microbursts of electrons with energies of hundreds of keV as a result of Landau trapping. Relativistic microbursts (greater than 1 MeV) can also be generated by a similar mechanism, but require waves with large propagation angles θ_kB > 50° and phase speeds v_φ ≥ c/9. Using our result for precipitating density and energy fluxes, we argue that holes in the distribution function of electrons near the magnetic mirror point can result in the generation of double layers and electron solitary holes consistent in scales (of the order of Debye lengths) with nonlinear structures observed in the radiation belts by the Van Allen Probes. Our results indicate a relationship between nonlinear electrostatic and electromagnetic structures in the dynamics of planetary radiation belts and their role in the cyclical production of energetic electrons (E ≥ 100 keV) on kinetic timescales, which is much faster than previously inferred.
SKIRT: The design of a suite of input models for Monte Carlo radiative transfer simulations
Baes, M.; Camps, P.
2015-09-01
The Monte Carlo method is the most popular technique to perform radiative transfer simulations in a general 3D geometry. The algorithms behind and acceleration techniques for Monte Carlo radiative transfer are discussed extensively in the literature, and many different Monte Carlo codes are publicly available. By contrast, the design of a suite of components that can be used for the distribution of sources and sinks in radiative transfer codes has received very little attention. The availability of such models, with different degrees of complexity, has many benefits. For example, they can serve as toy models to test new physical ingredients, or as parameterised models for inverse radiative transfer fitting. For 3D Monte Carlo codes, this requires algorithms to efficiently generate random positions from 3D density distributions. We describe the design of a flexible suite of components for the Monte Carlo radiative transfer code SKIRT. The design is based on a combination of basic building blocks (which can be either analytical toy models or numerical models defined on grids or a set of particles) and the extensive use of decorators that combine and alter these building blocks into more complex structures. For a number of decorators, e.g. those that add spiral structure or clumpiness, we provide a detailed description of the algorithms that can be used to generate random positions. Advantages of this decorator-based design include code transparency, the avoidance of code duplication, and an increase in code maintainability. Moreover, since decorators can be chained without problems, very complex models can easily be constructed out of simple building blocks. Finally, based on a number of test simulations, we demonstrate that our design using customised random position generators is superior to a simpler design based on a generic black-box random position generator.
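The decorator-based design can be sketched as follows; the Plummer-like profile, the clump parameters, and the generic rejection sampler are illustrative stand-ins, not SKIRT's actual classes:

```python
import random

random.seed(42)

class PlummerSphere:
    """Analytical toy building block: spherical Plummer-like density."""
    def __init__(self, scale=1.0):
        self.scale = scale
    def density(self, x, y, z):
        r2 = (x * x + y * y + z * z) / self.scale ** 2
        return (1.0 + r2) ** -2.5
    def bounding_box(self):
        h = 4.0 * self.scale
        return (-h, -h, -h, h, h, h)

class ClumpyDecorator:
    """Decorator: wraps any component and boosts the density inside a set of
    randomly placed spherical clumps (a toy version of SKIRT's decorators)."""
    def __init__(self, base, n_clumps=10, clump_radius=0.3, boost=5.0):
        self.base, self.radius, self.boost = base, clump_radius, boost
        x0, y0, z0, x1, y1, z1 = base.bounding_box()
        self.clumps = [(random.uniform(x0, x1), random.uniform(y0, y1),
                        random.uniform(z0, z1)) for _ in range(n_clumps)]
    def density(self, x, y, z):
        rho = self.base.density(x, y, z)
        for cx, cy, cz in self.clumps:
            if (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 < self.radius ** 2:
                return self.boost * rho          # inside a clump
        return rho
    def bounding_box(self):
        return self.base.bounding_box()

def random_position(model, rho_max, n_tries=200000):
    """Generic black-box rejection sampler; the paper argues that customised
    per-component generators beat this simple design in efficiency."""
    x0, y0, z0, x1, y1, z1 = model.bounding_box()
    for _ in range(n_tries):
        p = (random.uniform(x0, x1), random.uniform(y0, y1),
             random.uniform(z0, z1))
        if random.random() * rho_max < model.density(*p):
            return p
    raise RuntimeError("rejection sampling failed to accept a point")

model = ClumpyDecorator(PlummerSphere())
pos = [random_position(model, rho_max=5.0) for _ in range(100)]
print(f"generated {len(pos)} positions from the decorated model")
```

Because the decorator exposes the same `density`/`bounding_box` interface as the block it wraps, decorators can be chained arbitrarily, which is the design property the abstract highlights.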
Monte Carlo simulation for radiation dose in children radiology
International Nuclear Information System (INIS)
Mendes, Hitalo R.; Tomal, Alessandra
2016-01-01
The dosimetry in pediatric radiology is essential due to the higher risk that children have in comparison to adults. The focus of this study is to present how the dose varies with depth in a 10-year-old and in a newborn; for this purpose, simulations were made using the Monte Carlo method. Tube potentials of 70 and 90 kVp were considered for the 10-year-old and of 70 and 80 kVp for the newborn. The results show that in both cases the dose at the skin surface is larger for the smaller potential value; however, it decreases faster for larger potential values. Another observation made is that, because the newborn is thinner, the ratio between the entrance dose and the exit dose is lower compared to the case of the 10-year-old, showing that it is possible to make an image using a smaller entrance dose at the skin while keeping the same level of exposure at the detector. (author)
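The depth dependence discussed here follows from simple exponential attenuation; the sketch below uses invented effective attenuation coefficients and ignores scatter buildup:

```python
import math

# Illustrative effective attenuation coefficients for soft tissue (made up
# for this sketch; real beams are polyenergetic and harden with depth).
MU = {70: 0.50, 90: 0.40}   # 1/cm at 70 and 90 kVp "effective" energies

def relative_depth_dose(kvp, depth_cm, entrance=1.0):
    """Primary-beam Beer-Lambert falloff, ignoring scatter buildup."""
    return entrance * math.exp(-MU[kvp] * depth_cm)

# Same detector signal behind a 12 cm (10-year-old) vs 6 cm (newborn)
# patient: the thinner patient needs a smaller entrance dose.
target = 0.01                                     # required exit dose
entrance_child = target / math.exp(-MU[70] * 12)
entrance_newborn = target / math.exp(-MU[70] * 6)
print(entrance_newborn < entrance_child)   # True: thinner => lower skin dose
```

The same arithmetic also shows the beam-quality effect the abstract notes: the harder 90 kVp beam starts lower at the surface (per unit detector signal) but falls off more slowly with depth.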
Energy Technology Data Exchange (ETDEWEB)
Villafan-Vidales, H.I.; Arancibia-Bulnes, C.A.; Dehesa-Carrasco, U. [Centro de Investigacion en Energia, Universidad Nacional Autonoma de Mexico, Privada Xochicalco s/n, Col. Centro, A.P. 34, Temixco, Morelos 62580 (Mexico); Romero-Paredes, H. [Departamento de Ingenieria de Procesos e Hidraulica, Universidad Autonoma Metropolitana-Iztapalapa, Av. San Rafael Atlixco No.186, Col. Vicentina, A.P. 55-534, Mexico D.F 09340 (Mexico)
2009-01-15
Radiative heat transfer in a solar thermochemical reactor for the thermal reduction of cerium oxide is simulated with the Monte Carlo method. The directional characteristics and the power distribution of the concentrated solar radiation that enters the cavity are obtained by carrying out a Monte Carlo ray tracing of a paraboloidal concentrator. It is considered that the reactor contains a gas/particle suspension directly exposed to concentrated solar radiation. The suspension is treated as a non-isothermal, non-gray, absorbing, emitting, and anisotropically scattering medium. The transport coefficients of the particles are obtained from Mie-scattering theory by using the optical properties of cerium oxide. From the simulations, the aperture radius and the particle concentration were optimized to match the characteristics of the considered concentrator. (author)
Radiation response of inorganic scintillators: Insights from Monte Carlo simulations
Energy Technology Data Exchange (ETDEWEB)
Prange, Micah P.; Wu, Dangxin; Xie, YuLong; Campbell, Luke W.; Gao, Fei; Kerisit, Sebastien N.
2014-07-24
The spatial and temporal scales of hot particle thermalization in inorganic scintillators are critical factors determining the extent of second- and third-order nonlinear quenching in regions with high densities of electron-hole pairs, which, in turn, leads to the light yield nonproportionality observed, to some degree, for all inorganic scintillators. Therefore, kinetic Monte Carlo simulations were performed to calculate the distances traveled by hot electrons and holes as well as the time required for the particles to reach thermal energy following γ-ray irradiation. CsI, a common scintillator from the alkali halide class of materials, was used as a model system. Two models of quasi-particle dispersion were evaluated, namely, the effective mass approximation model and a model that relied on the group velocities of electrons and holes determined from band structure calculations. Both models predicted rapid electron-hole pair recombination over short distances (a few nanometers) as well as a significant extent of charge separation between electrons and holes that did not recombine and reached thermal energy. However, the effective mass approximation model predicted much longer electron thermalization distances and times than the group velocity model. Comparison with limited experimental data suggested that the group velocity model provided more accurate predictions. Nonetheless, both models indicated that hole thermalization is faster than electron thermalization and thus is likely to be an important factor determining the extent of third-order nonlinear quenching in high-density regions. The merits of different models of quasi-particle dispersion are also discussed.
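A heavily simplified kinetic Monte Carlo sketch of the effective-mass thermalization picture described above (all material parameters are invented placeholders, and real simulations use energy-dependent scattering rates and band-structure group velocities):

```python
import math
import random

random.seed(9)

E_PHONON = 0.01           # eV lost per phonon-emission event (assumed)
E_THERMAL = 0.025         # stop tracking at thermal energy (eV)
TAU_S = 1e-14             # mean time between scattering events (s, assumed)
M_EFF = 0.3 * 9.109e-31   # effective electron mass (kg, assumed)
EV = 1.602e-19            # J per eV

def thermalization(e0_ev):
    """One hot electron: ballistic flight in a random direction between
    scattering events, losing one phonon quantum per event, with the
    effective-mass speed v = sqrt(2E/m*)."""
    e, t = e0_ev, 0.0
    x = y = z = 0.0
    while e > E_THERMAL:
        v = math.sqrt(2.0 * e * EV / M_EFF)        # speed from energy (m/s)
        dt = -TAU_S * math.log(random.random())    # free-flight time
        mu = 2.0 * random.random() - 1.0           # isotropic direction
        phi = 2.0 * math.pi * random.random()
        sq = math.sqrt(1.0 - mu * mu)
        x += v * dt * sq * math.cos(phi)
        y += v * dt * sq * math.sin(phi)
        z += v * dt * mu
        t += dt
        e -= E_PHONON
    return math.sqrt(x * x + y * y + z * z), t

dists, times = zip(*(thermalization(2.0) for _ in range(2000)))
mean_nm = 1e9 * sum(dists) / len(dists)
mean_ps = 1e12 * sum(times) / len(times)
print(f"mean thermalization distance {mean_nm:.1f} nm in {mean_ps:.2f} ps")
```

Swapping the effective-mass speed for tabulated group velocities is, in spirit, the comparison the paper makes between its two dispersion models.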
Lobanov, S.; Goncharov, A. F.; Holtgrewe, N.; Konopkova, Z.; McWilliams, R. S.
2017-12-01
Thermal conductivity of deep planetary materials determines the planetary heat transport mode and properties (e.g. magnetic field) and can be used to decipher the planetary thermal history. Due to the lack of direct measurements of the lattice and radiative conductivity of the relevant materials at planetary conditions, current geodynamical models use theoretical calculations and extrapolations of the available experimental data. Here we describe our pulsed laser techniques that enable direct measurements of the lattice and radiative conductivity of the Earth's mantle and core materials and also of noble gases and simple molecules present in the interiors of giant planets (e.g. hydrogen). Flash-heating laser techniques working in a pump-probe mode, which include time-resolved two-side radiative and thermoreflection temperature probes, employ various laser and photo-detector configurations that provide a measure of the thermal fluxes propagating through the samples confined in the diamond anvil cell cavity. A supercontinuum ultra-bright broadband laser source empowers accurate measurements of the optical properties of planetary materials used to extract the radiative conductivity. Finite element calculations serve to extract the temperature- and pressure-dependent thermal conductivity and temperature gradients across the sample. We report thermal conductivity measurements of the Earth's minerals (postperovskite, bridgmanite, ferropericlase) and their assemblies (pyrolite) and core materials (Fe and alloys with Si and O) at realistic deep-Earth pressure-temperature conditions. We thank J.-F. Lin, M. Murakami, J. Badro for contributing to this work.
A hybrid transport-diffusion method for Monte Carlo radiative-transfer simulations
International Nuclear Information System (INIS)
Densmore, Jeffery D.; Urbatsch, Todd J.; Evans, Thomas M.; Buksas, Michael W.
2007-01-01
Discrete Diffusion Monte Carlo (DDMC) is a technique for increasing the efficiency of Monte Carlo particle-transport simulations in diffusive media. If standard Monte Carlo is used in such media, particle histories will consist of many small steps, resulting in a computationally expensive calculation. In DDMC, particles take discrete steps between spatial cells according to a discretized diffusion equation. Each discrete step replaces many small Monte Carlo steps, thus increasing the efficiency of the simulation. In addition, given that DDMC is based on a diffusion equation, it should produce accurate solutions if used judiciously. In practice, DDMC is combined with standard Monte Carlo to form a hybrid transport-diffusion method that can accurately simulate problems with both diffusive and non-diffusive regions. In this paper, we extend previously developed DDMC techniques in several ways that improve the accuracy and utility of DDMC for nonlinear, time-dependent, radiative-transfer calculations. The use of DDMC in these types of problems is advantageous since, due to the underlying linearizations, optically thick regions appear to be diffusive. First, we employ a diffusion equation that is discretized in space but is continuous in time. Not only is this methodology theoretically more accurate than temporally discretized DDMC techniques, but it also has the benefit that a particle's time is always known. Thus, there is no ambiguity regarding what time to assign a particle that leaves an optically thick region (where DDMC is used) and begins transporting by standard Monte Carlo in an optically thin region. Also, we treat the interface between optically thick and optically thin regions with an improved method, based on the asymptotic diffusion-limit boundary condition, that can produce accurate results regardless of the angular distribution of the incident Monte Carlo particles. Finally, we develop a technique for estimating radiation momentum deposition during the
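The efficiency argument can be illustrated with a toy 1D pure scatterer. The symmetric hop probabilities below merely stand in for those DDMC derives from the discretized diffusion equation (including its boundary conditions), so only the step-count comparison should be taken seriously:

```python
import math
import random

random.seed(5)

TAU = 20.0      # slab optical thickness in mean free paths (pure scatterer)
N_CELLS = 10    # DDMC cells of optical thickness TAU / N_CELLS each

def analog_history():
    """Standard Monte Carlo: many small scattering flights per history."""
    x, mu, ncol = 0.0, 1.0, 0
    while 0.0 <= x <= TAU:
        x += mu * -math.log(random.random())   # flight length in mfp units
        if not 0.0 <= x <= TAU:
            break
        ncol += 1
        mu = 2.0 * random.random() - 1.0       # isotropic scatter
    return x > TAU, ncol

def ddmc_history():
    """DDMC sketch: one discrete cell-to-cell hop per event."""
    cell, nhop = 0, 0
    while 0 <= cell < N_CELLS:
        cell += 1 if random.random() < 0.5 else -1
        nhop += 1
    return cell >= N_CELLS, nhop

n = 20000
a = [analog_history() for _ in range(n)]
d = [ddmc_history() for _ in range(n)]
t_analog = sum(t for t, _ in a) / n
t_ddmc = sum(t for t, _ in d) / n
steps_analog = sum(c for _, c in a) / n
steps_ddmc = sum(c for _, c in d) / n
print(f"transmission: analog {t_analog:.3f} vs DDMC {t_ddmc:.3f}")
print(f"events/history: analog {steps_analog:.1f} vs DDMC {steps_ddmc:.1f}")
```

Each discrete hop replaces many small flights, which is exactly the source of the speedup in diffusive regions that the abstract describes.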
Sky-Radiance Models for Monte Carlo Radiative Transfer Applications
Santos, I.; Dalimonte, D.; Santos, J. P.
2012-04-01
Photon-tracing can be initialized through sky-radiance (Lsky) distribution models when executing Monte Carlo simulations for ocean color studies. To be effective, the Lsky model should: 1) properly represent sky-radiance features of interest; 2) require low computing time; and 3) depend on a limited number of input parameters. The present study verifies whether these prerequisites are satisfied by comparing results from different Lsky formulations. Specifically, two Lsky models were considered as reference cases because of their different approaches among solutions presented in the literature. The first model, developed by Harrisson and Coombes (HC), is based on a parametric expression where the sun geometry is the unique input. The HC model is one of the analytical sky-radiance distributions applied in state-of-the-art simulations for ocean optics. The coefficients of the HC model were set upon broad-band field measurements, and the result is a model that requires few implementation steps. The second model, implemented by Zibordi and Voss (ZV), is based on physical expressions that account for the optical thickness of permanent gases, aerosol, ozone and water vapour at specific wavelengths. Inter-comparisons between normalized Lsky^ZV and Lsky^HC (i.e., with unitary scalar irradiance) are discussed by means of individual polar maps and percent differences between sky-radiance distributions. Sky-radiance cross-sections are presented as well. Considered cases include different sun zenith values and wavelengths (i.e., λ=413, 490 and 665 nm, corresponding to selected center-bands of the MEdium Resolution Imaging Spectrometer MERIS). Results have shown a significant convergence between Lsky^HC and Lsky^ZV at 665 nm. Differences between models increase with the sun zenith and mostly with wavelength. For instance, relative differences up to 50% between Lsky^HC and Lsky^ZV can be observed in the antisolar region for λ=665 nm and θ*=45°. The effects of these
Application of OMEGA Monte Carlo codes for radiation therapy treatment planning
International Nuclear Information System (INIS)
Ayyangar, Komanduri M.; Jiang, Steve B.
1998-01-01
The accuracy of conventional dose algorithms for radiosurgery treatment planning is limited, due to the inadequate consideration of the lateral radiation transport and the difficulty of acquiring accurate dosimetric data for very small beams. In the present paper, some initial work on the application of Monte Carlo method in radiation treatment planning in general, and in radiosurgery treatment planning in particular, has been presented. Two OMEGA Monte Carlo codes, BEAM and DOSXYZ, are used. The BEAM code is used to simulate the transport of particles in the linac treatment head and radiosurgery collimator. A phase space file is obtained from the BEAM simulation for each collimator size. The DOSXYZ code is used to calculate the dose distribution in the patient's body reconstructed from CT slices using the phase space file as input. The accuracy of OMEGA Monte Carlo simulation for radiosurgery dose calculation is verified by comparing the calculated and measured basic dosimetric data for several radiosurgery beams and a 4 × 4 cm² conventional beam. The dose distributions for three clinical cases are calculated using OMEGA codes as the dose engine for an in-house developed radiosurgery treatment planning system. The verification using basic dosimetric data and the dose calculation for clinical cases demonstrate the feasibility of applying OMEGA Monte Carlo code system to radiosurgery treatment planning. (author)
Foucart, Francois
2018-04-01
General relativistic radiation hydrodynamic simulations are necessary to accurately model a number of astrophysical systems involving black holes and neutron stars. Photon transport plays a crucial role in radiatively dominated accretion discs, while neutrino transport is critical to core-collapse supernovae and to the modelling of electromagnetic transients and nucleosynthesis in neutron star mergers. However, evolving the full Boltzmann equations of radiative transport is extremely expensive. Here, we describe the implementation in the general relativistic SPEC code of a cheaper radiation hydrodynamic method that theoretically converges to a solution of Boltzmann's equation in the limit of infinite numerical resources. The algorithm is based on a grey two-moment scheme, in which we evolve the energy density and momentum density of the radiation. Two-moment schemes require a closure that fills in missing information about the energy spectrum and higher order moments of the radiation. Instead of the approximate analytical closure currently used in core-collapse and merger simulations, we complement the two-moment scheme with a low-accuracy Monte Carlo evolution. The Monte Carlo results can provide any or all of the missing information in the evolution of the moments, as desired by the user. As a first test of our methods, we study a set of idealized problems demonstrating that our algorithm performs significantly better than existing analytical closures. We also discuss the current limitations of our method, in particular open questions regarding the stability of the fully coupled scheme.
Monte Carlo simulation of the RBE of I-131 radiation using DNA damage as biomarker.
Ezzati, Ahad Ollah; Mahmoud-Pashazadeh, Ali; Studenski, Matthew T
2017-06-01
In general, a weighting factor of one is applied for low linear energy transfer radiations. However, several studies indicate that the relative biological effectiveness (RBE) of low-energy photons and electrons is greater than one. The aim of the current study was to calculate the RBE of I-131 radiation relative to Co-60 gamma photons in 100 μm spheroid cells using Monte Carlo (MC) simulations. These calculations were compared to experimentally measured results. MCNPX2.6 was used to simulate the I-131 and Co-60 irradiation setups and calculate the secondary electron spectra at energies higher than 1 keV with varying oxygen concentrations. The electron spectra at energies lower than 1 keV were obtained by extrapolation (down to 10 eV). The calculated electron spectra were input into the MCDS micro-dosimetric Monte Carlo code to calculate the DSB induction and related RBE. The calculated RBE of I-131 radiation relative to Co-60 photons, as the reference radiation recommended by the International Commission on Radiological Protection (ICRP), was 1.06, 1.03 and 1.02 for oxygen concentrations of 0, 5 and 100%, respectively. Results of the MC simulations indicate the RBE of I-131 is greater than one. This finding, despite a 10% discrepancy with the findings of a previous in vitro study by one of the authors of this paper, reemphasizes that I-131 radiation induces more severe biological damage than current ICRP recommendations suggest.
Monte Carlo simulation of gas-filled radiation detectors
International Nuclear Information System (INIS)
Kundu, A.
2000-06-01
A new simulation code has been developed that allows the response of gas-filled proportional counters to be calculated. The code is an electron transport code that simulates the elastic and inelastic scattering processes that occur as a result of electron-impact collisions with the gas atoms. The simulation concentrates on the avalanche development after the primary ionising particle has freed electrons in the gas volume, tracking electrons until they reach the anode of the counter. The dynamics of the ions that accumulate in the gas volume are also considered. A major motivation for this work is the general renewed interest in proportional counters over the last decade, since the advent of micro-pattern detectors such as the micro-strip and the micro-gap detector. It is argued that the low relative cost, intrinsic amplification and environmental stability of these detectors give them considerable advantages over other types of radiation detectors. The code has been benchmarked against experimental data. The manner in which the variation in the avalanche statistics affects the energy resolution properties of the detector is examined for single-wire, micro-strip and micro-gap counters. The stability of micro-gap detectors when subjected to high rates of irradiation is also examined. It is envisaged that these detectors will be used in the future as part of a multiphase flow tomography device for imaging the flow of oil/water/natural gas mixtures that have been pumped through pipes from the seabed. (author)
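How avalanche statistics feed into energy resolution can be illustrated with a drastically simplified toy model (the model and every parameter are our assumptions, not this thesis's transport code): an Fe-55-like 5.9 keV line, W = 26 eV per ion pair, Fano factor 0.17, and exponential (Furry) single-electron avalanche gains.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy pulse-height model: sub-Poissonian (Fano-reduced) primary ionisation,
# then an independent exponential avalanche gain per primary electron.

def pulse_heights(n_events, e_kev=5.9, w_ev=26.0, fano=0.17, mean_gain=1e4):
    n0_mean = e_kev * 1000.0 / w_ev          # mean number of primary electrons
    heights = np.empty(n_events)
    for i in range(n_events):
        # Fano-reduced spread about the mean primary-electron number
        n0 = max(1, int(round(rng.normal(n0_mean, np.sqrt(fano * n0_mean)))))
        # each primary seeds an independent exponential (Furry) avalanche
        heights[i] = rng.exponential(mean_gain, size=n0).sum()
    return heights

h = pulse_heights(2000)
resolution = h.std() / h.mean()   # relative energy resolution (sigma / mean)
```

In this model the relative variance is (F + b)/n0 with b = 1 for exponential gains, so the resolution is about sqrt(1.17/227), roughly 7%, which is what the sampled spectrum reproduces.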
International Nuclear Information System (INIS)
Del Pino Albuja, Norma Josefina
2005-01-01
Ionizing radiation represents a daily risk for occupationally exposed workers at Carlos Andrade Marin hospital. For that reason, knowledge of the basic physics of ionizing radiation and study of the dosimetry performed on occupationally exposed workers at Carlos Andrade Marin hospital are very important for managing ionizing radiation as a risk factor. This study describes the dosimetry system of Carlos Andrade Marin hospital. Moreover, it compares the doses received by occupationally exposed workers of the hospital with the internationally recommended dose limits. The investigation used bibliographic review; a descriptive, historical and inductive study design; and descriptive statistics computed with Microsoft Office Excel 2003. The hypothesis of this research is that the workplaces exposed to ionizing radiation at Carlos Andrade Marin hospital have an appropriate dosimetry system. Superficial and deep doses of occupationally exposed workers of both genders and all ages were considered. The results for the studied period, 1998 to 2000, are: i) 99% of the occupationally exposed workers used their dosimeters. ii) The highest superficial dose, 13.34 mSv, corresponds to a haemodynamics physician. iii) The highest deep dose, 7.1 mSv, corresponds to a Nuclear Medicine medical technologist. iv) The doses mentioned above are below the limits recommended by the International Commission on Radiological Protection, namely 20 mSv per year and 100 mSv per 5 years, respectively. The conclusions of the investigation are: i) Carlos Andrade Marin hospital has an adequate dosimetry system and its occupationally exposed workers are permanently monitored with dosimeters. ii) The Nuclear Medicine workers receive the highest exposure doses relative to the other areas of Carlos Andrade Marin hospital. iii) The most exposed
Spada, F.M.; Krol, M.C.; Stammes, P.
2006-01-01
A new multiple-scattering Monte Carlo 3-D radiative transfer model named McSCIA (Monte Carlo for SCIAmachy) is presented. The backward technique is used to efficiently simulate narrow field of view instruments. The McSCIA algorithm has been formulated as a function of the Earth’s radius, and can
FTREE. Single-history Monte Carlo analysis for radiation detection and measurement
International Nuclear Information System (INIS)
Chin, M.P.W.
2015-01-01
This work introduces FTREE, which describes radiation cascades following impingement of a source particle on matter. The ensuing radiation field is characterised interaction by interaction, accounting for each generation of secondaries recursively. Each progeny is uniquely differentiated and catalogued into a family tree; the kinship is identified without ambiguity. This mode of observation, analysis and presentation goes beyond present-day detector technologies, beyond conventional Monte Carlo simulations and beyond standard pedagogy. It is able to observe rare events far out in the Gaussian tail which would have been lost in averaging: events less probable, but no less correct in physics. (author)
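The kinship bookkeeping described above can be sketched with a toy cascade (the two-secondary energy split is invented purely for illustration; FTREE itself catalogues real physics interactions): every progeny gets a unique lineage ID, so each particle's ancestry is recoverable without ambiguity.

```python
# Toy FTREE-style family tree: recurse over generations of secondaries,
# giving each progeny a unique lineage ID such as "0.1.0".

def cascade(energy, lineage="0", cutoff=1.0, tree=None):
    if tree is None:
        tree = {}
    tree[lineage] = energy                    # catalogue this particle
    if energy > cutoff:                       # interaction produces secondaries
        for k, frac in enumerate((0.6, 0.4)):
            cascade(energy * frac, f"{lineage}.{k}", cutoff, tree)
    return tree

tree = cascade(10.0)
# e.g. tree["0.0.1"] is the second secondary of the first secondary
```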
Multiple compton scattering effect on the spectrum of X-ray radiation. Monte-Carlo computations
International Nuclear Information System (INIS)
Pozdnyakov, L.A.; Sobol', I.M.; Syunyaev, R.A.; AN SSSR, Moscow. Inst. Prikladnoj Matematiki)
1977-01-01
Computation of the X-ray spectrum formed by multiple scattering of low-frequency photons on relativistic electrons is carried out. A spherical cloud of relativistic plasma with Thomson-scattering optical depth τ and a given Maxwellian electron temperature kT_e is considered, with a point source of low-frequency radiation with a Planckian spectrum at the centre of the cloud. Monte Carlo computations and analytical estimates show that in the case of small optical depth, τ < 1, the radiation escaping from the cloud has a power-law spectrum I_ν ∝ ν^(-α), where α is the spectral index. In the case of an optically thick cloud, the escaping radiation spectrum tends to the Wien equilibrium shape. The energy loss rate of the cloud is computed. The transfer of hard radiation from a central point source through a plasma cloud with kT_e ≈ 3 keV is also considered. Monte Carlo techniques for computing such problems are described.
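The power-law formation in the optically thin regime can be illustrated with a drastically simplified toy (ours, not this paper's transfer code): if each scattering multiplies the photon energy by a mean amplification A and the photon escapes with probability p_esc per scattering, the geometric distribution of scattering numbers yields a power law with index α = −ln(1 − p_esc)/ln A.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy unsaturated Comptonization: geometric number of scatterings before
# escape, constant mean energy amplification per scattering.

def comptonized_energies(n_photons, amp=1.3, p_esc=0.3, e0=1.0):
    n_scat = rng.geometric(p_esc, size=n_photons) - 1  # scatterings before escape
    return e0 * amp ** n_scat

energies = comptonized_energies(200_000)
alpha = -np.log(1.0 - 0.3) / np.log(1.3)     # predicted spectral index, about 1.36
frac_above = (energies > 1.3 ** 4.5).mean()  # fraction scattered at least 5 times
```

The sampled fraction above any energy threshold matches (1 − p_esc)^k, i.e. the counts fall off as a power law in energy, which is the qualitative behaviour reported for τ < 1.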
An Approach in Radiation Therapy Treatment Planning: A Fast, GPU-Based Monte Carlo Method.
Karbalaee, Mojtaba; Shahbazi-Gahrouei, Daryoush; Tavakoli, Mohammad B
2017-01-01
An accurate and fast radiation dose calculation is essential for successful radiotherapy. The aim of this study was to implement a new graphics processing unit (GPU) based treatment planning code for accurate and fast dose calculation in radiotherapy centers. A program was written to run in parallel on a GPU. The code was validated against EGSnrc/DOSXYZnrc. Moreover, a semi-automatic, rotary, asymmetric phantom was designed and produced using bone, lung, and soft-tissue equivalent materials. All measurements were performed using a MapCHECK dosimeter. The accuracy of the code was validated using the experimental data, obtained from the anthropomorphic phantom, as the gold standard. The findings agreed with those of DOSXYZnrc in the virtual phantom for most of the voxels (>95%), indicating that the GPU-based Monte Carlo method may be useful in routine radiation therapy centers as the core component of a treatment planning verification system.
Bouchard, Hugo; Bielajew, Alex
2015-07-07
To establish a theoretical framework for generalizing Monte Carlo transport algorithms by adding external electromagnetic fields to the Boltzmann radiation transport equation in a rigorous and consistent fashion. Using first principles, the Boltzmann radiation transport equation is modified by adding a term describing the variation of the particle distribution due to the Lorentz force. The implications of this new equation are evaluated by investigating the validity of Fano's theorem. Additionally, Lewis' approach to multiple scattering theory in infinite homogeneous media is redefined to account for the presence of external electromagnetic fields. The modified equation yields a description consistent with the deterministic laws of motion as well as with probabilistic methods of solution. The time-independent Boltzmann radiation transport equation is generalized to account for the electromagnetic forces through an additional operator similar to the interaction term. Fano's and Lewis' approaches are restated in terms of this new equation. Fano's theorem is found not to apply in the presence of electromagnetic fields. Lewis' theory for electron multiple scattering and its moments, accounting for the coupling between the Lorentz force and multiple elastic scattering, is derived. However, further investigation is required to develop useful algorithms for Monte Carlo and deterministic transport methods. To test the accuracy of Monte Carlo transport algorithms in the presence of electromagnetic fields, the Fano cavity test, as currently defined, cannot be applied; therefore, new tests must be designed for this specific application. A multiple scattering theory that accurately couples the Lorentz force with elastic scattering could improve Monte Carlo efficiency. The present study proposes a new theoretical framework to develop such algorithms.
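The ingredient such a generalized transport algorithm must interleave with collision sampling is a Lorentz-force push of the charged particle between interactions. A nonrelativistic Boris rotation is sketched below as one standard way to do this push (our illustrative choice of integrator, not one mandated by the paper).

```python
import numpy as np

# Boris push for dv/dt = (q/m)(E + v x B): half electric kick, exact-norm
# magnetic rotation, half electric kick.

def boris_push(v, e_field, b_field, qm, dt):
    v_minus = v + 0.5 * qm * dt * e_field          # first half electric kick
    t = 0.5 * qm * dt * b_field                    # magnetic rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)        # norm-preserving rotation
    return v_plus + 0.5 * qm * dt * e_field        # second half electric kick

# pure magnetic field: the particle only gyrates, so its speed is conserved
v = np.array([1.0, 0.0, 0.0])
B = np.array([0.0, 0.0, 1.0])
for _ in range(1000):
    v = boris_push(v, np.zeros(3), B, qm=1.0, dt=0.01)
```

The rotation step conserves the speed exactly in a pure magnetic field, which is precisely the kind of consistency property (kinetic energy unchanged by B) that a coupled field-plus-scattering scheme must preserve.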
Subcritical Growth of Electron Phase-space Holes in Planetary Radiation Belts
Osmane, A.; Wilson, L. B., III; Turner, D. L.; Dimmock, A. P.; Pulkkinen, T. I.
2017-12-01
The discovery of self-sustained coherent structures with large-amplitude electric fields (E ~ 10-100 mV/m) by the Van Allen Probes has revealed alternative routes through which energy-momentum exchange can take place in planetary radiation belts. When originating from energetic electrons in Landau resonance with large-amplitude whistlers, electron phase-space holes form with small amplitudes of the order of the hot-to-cold electron density ratio, i.e., qφ/T_e ≈ n_h/n_c ≈ 10⁻³, orders of magnitude smaller than the observed amplitudes of the largest phase-space holes, i.e., qφ/T_e ≈ 1. In this report we present a mechanism through which electron holes can grow nonlinearly (i.e., γ ∝ √φ) and subcritically as a result of momentum exchange with passing (untrapped) electrons. Growth rates are computed analytically for plasma parameters consistent with those measured in the Earth's radiation belts under quiet and disturbed conditions. Our results provide an explanation for the fast growth of electron phase-space holes in the Earth's radiation belts from small initial values, qφ/T_e ≈ 10⁻³, to values of order qφ/T_e ≈ 1.
Chapoutier, Nicolas; Mollier, François; Nolin, Guillaume; Culioli, Matthieu; Mace, Jean-Reynald
2017-09-01
In the context of the rise of Monte Carlo transport calculations for all kinds of applications, AREVA recently improved its suite of engineering tools in order to produce an efficient Monte Carlo workflow. Monte Carlo codes, such as MCNP or TRIPOLI, are recognized as reference codes for dealing with a large range of radiation transport problems. However, the inherent drawbacks of these codes, namely laborious input file creation and long computation times, contrast with the maturity of their treatment of the physical phenomena. The goal of the recent AREVA developments was to reach efficiency similar to that of other mature engineering disciplines such as finite element analysis (e.g. structural or fluid dynamics). Among the main objectives, the creation of a graphical user interface offering CAD tools for geometry creation and other graphical features dedicated to the radiation field (source definition, tally definition) has been reached. Computation times are drastically reduced compared to a few years ago thanks to the use of massively parallel runs and, above all, the implementation of hybrid variance reduction techniques. Engineering teams are now able to deliver much more prompt support to any nuclear project dealing with reactors or fuel cycle facilities, from the conceptual phase to decommissioning.
Minimizing the cost of splitting in Monte Carlo radiation transport simulation
Energy Technology Data Exchange (ETDEWEB)
Juzaitis, R.J.
1980-10-01
A deterministic analysis of the computational cost associated with geometric splitting/Russian roulette in Monte Carlo radiation transport calculations is presented. Appropriate integro-differential equations are developed for the first and second moments of the Monte Carlo tally, as well as for the time per particle history, given that splitting with Russian roulette takes place at one (or several) internal surfaces of the geometry. The equations are solved using a standard S_n (discrete ordinates) solution technique, allowing the prediction of the computer cost (formulated as the product of sample variance and time per particle history, σ²_s τ_p) associated with a given set of splitting parameters. Optimum splitting surface locations and splitting ratios are determined. The benefits of such an analysis are particularly noteworthy for transport problems in which splitting is apt to be extensively employed (e.g., deep penetration calculations).
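What geometric splitting with Russian roulette at an internal surface actually does can be shown on a toy deep-penetration problem (ours, not the report's S_n analysis): a purely absorbing 1-D slab of optical thickness L, whose transmission is exp(-L). Tracks crossing the surface x = L/2 are split into copies carrying divided weights, which leaves the tally unbiased while boosting deep-region statistics.

```python
import math
import random

random.seed(20)

def transmission(n_hist, L=5.0, nsplit=4, w_min=0.05):
    total = 0.0
    for _ in range(n_hist):
        stack = [(0.0, 1.0)]                      # (position, statistical weight)
        while stack:
            x, w = stack.pop()
            x_end = x + random.expovariate(1.0)   # flight to the absorption point
            if x < L / 2 <= x_end:
                # crossed the splitting surface: restart nsplit copies there
                # (valid because the exponential free path is memoryless)
                w_child = w / nsplit
                for _ in range(nsplit):
                    if w_child < w_min:
                        # Russian roulette on low-weight children (not
                        # triggered with these parameters: w_child = 0.25)
                        if random.random() < 0.5:
                            stack.append((L / 2, 2.0 * w_child))
                    else:
                        stack.append((L / 2, w_child))
            elif x_end >= L:
                total += w                        # transmitted: score the weight
    return total / n_hist

t = transmission(200_000)
```

The estimator reproduces exp(-5) ≈ 0.0067 while scoring far more nonzero histories than the analog game, which is the variance-versus-time trade-off the report optimizes.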
Pandya, Tara M.; Johnson, Seth R.; Evans, Thomas M.; Davidson, Gregory G.; Hamilton, Steven P.; Godfrey, Andrew T.
2016-03-01
This work discusses the implementation, capabilities, and validation of Shift, a massively parallel Monte Carlo radiation transport package authored at Oak Ridge National Laboratory. Shift has been developed to scale well from laptops to small computing clusters to advanced supercomputers and includes features such as support for multiple geometry and physics engines, hybrid capabilities for variance reduction methods such as the Consistent Adjoint-Driven Importance Sampling methodology, advanced parallel decompositions, and tally methods optimized for scalability on supercomputing architectures. The scaling studies presented in this paper demonstrate good weak and strong scaling behavior for the implemented algorithms. Shift has also been validated and verified against various reactor physics benchmarks, including the Consortium for Advanced Simulation of Light Water Reactors' Virtual Environment for Reactor Analysis criticality test suite and several Westinghouse AP1000® problems presented in this paper. These benchmark results compare well to those from other contemporary Monte Carlo codes such as MCNP5 and KENO.
Energy Technology Data Exchange (ETDEWEB)
T.J. Urbatsch; T.M. Evans
2006-02-15
We have released Version 2 of Milagro, an object-oriented C++ code that performs radiative transfer using Fleck and Cummings' Implicit Monte Carlo method. Milagro, part of the Jayenne program, is a stand-alone driver code used as a methods research vehicle and to verify its underlying classes. These underlying classes are used to construct Implicit Monte Carlo packages for external customers. Milagro-2 represents a design overhaul that allows better parallelism and extensibility. New features in Milagro-2 include verified momentum deposition, restart capability, graphics capability, exact energy conservation, and improved load balancing and parallel efficiency. A users' guide also describes how to configure, make, and run Milagro-2.
Monte Carlo simulations of ultra high vacuum and synchrotron radiation for particle accelerators
Rivkin, Leonid
With preparation of the Hi-Lumi LHC fully underway, and the FCC machines under study, accelerators will reach unprecedented energies and, along with them, very large amounts of synchrotron radiation (SR). The SR desorbs photoelectrons and molecules from accelerator walls, which contribute to electron cloud buildup and increase the residual pressure, both effects reducing the beam lifetime. In current accelerators these two effects are among the principal limiting factors, so precise calculation of synchrotron radiation and pressure properties is very important, desirably in the early design phase. This PhD project describes the modernization and major upgrade of two codes, Molflow and Synrad, originally written by R. Kersevan in the 1990s, which are based on the test-particle Monte Carlo method and allow ultra-high vacuum and synchrotron radiation calculations. The new versions contain new physics and are built as an all-in-one package available to the public. Existing vacuum calculation methods are overvi...
Review of the Monte Carlo and deterministic codes in radiation protection and dosimetry
International Nuclear Information System (INIS)
Tagziria, H.
2000-02-01
Modelling a physical system can be carried out either stochastically or deterministically. An example of the former is the Monte Carlo technique, in which statistically approximate methods are applied to exact models. No transport equation is solved: individual particles are simulated and some specific aspect (tally) of their average behaviour is recorded. The average behaviour of the physical system is then inferred using the central limit theorem. In contrast, deterministic codes use mathematically exact methods applied to approximate models to solve the transport equation for the average particle behaviour. The physical system is subdivided into boxes in phase space and particles are followed from one box to the next; the smaller the boxes, the better the approximations become. Although the Monte Carlo method has been used for centuries, its more recent manifestation really emerged from the Manhattan project of World War II. Its invention is attributed mainly to Metropolis, Ulam (through his interest in poker), Fermi, von Neumann and Richtmyer. Over the last 20 years or so, the Monte Carlo technique has become a powerful tool in radiation transport. This is due to users taking full advantage of richer cross-section data, more powerful computers and improved Monte Carlo techniques for radiation transport, with high-quality physics and better-known source spectra. The method is a common-sense approach to radiation transport, and its success and popularity are often also due to necessity, because measurements are not always possible or affordable. In the Monte Carlo method, which is inherently realistic because nature is statistical, more detailed physics is made possible by the isolation of events, while rather elaborate geometries can be modelled. Provided that the physics is correct, a simulation is exactly analogous to an experimenter counting particles. In contrast to the deterministic approach, however, a disadvantage of the
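The "average behaviour inferred via the central limit theorem" amounts to this: score one number per particle history, then report the sample mean with a standard error sqrt(s²/N). The exponential per-history score below is a stand-in for a real tally, chosen purely for illustration.

```python
import math
import random

random.seed(4)

# Minimal Monte Carlo tally: per-history scores, sample mean, and the
# CLT-based one-sigma statistical uncertainty on that mean.

def run_tally(n_histories, true_mean=2.0):
    scores = [random.expovariate(1.0 / true_mean) for _ in range(n_histories)]
    mean = sum(scores) / n_histories
    s2 = sum((s - mean) ** 2 for s in scores) / (n_histories - 1)
    return mean, math.sqrt(s2 / n_histories)  # estimate and its standard error

mean, sigma = run_tally(100_000)
```

The error bar shrinks as 1/sqrt(N), which is why variance reduction (buying a smaller s² per unit computing time) matters so much in transport calculations.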
Evaluation of radiation dose to patients in intraoral dental radiography using Monte Carlo Method
Energy Technology Data Exchange (ETDEWEB)
Park, Il; Kim, Kyeong Ho; Oh, Seung Chul; Song, Ji Young [Dept. of Nuclear Engineering, Kyung Hee University, Yongin (Korea, Republic of)
2016-11-15
The use of dental radiographic examinations is common, although the radiation dose resulting from dental radiography is relatively small. It is therefore necessary to evaluate the radiation dose from dental radiography for radiation safety purposes. The objectives of the present study were to develop a dosimetry method for intraoral dental radiography using a Monte Carlo radiation transport code and to calculate organ doses and effective doses to patients from different types of intraoral radiographies. The radiological properties of the dental radiography equipment were characterized for the evaluation of patient radiation dose. The properties, including the x-ray energy spectrum, were simulated using the MCNP code. Organ doses and effective doses to patients were calculated by MCNP simulation with computational adult phantoms. At the typical equipment settings (60 kVp, 7 mA, and 0.12 s), the entrance air kerma was 1.79 mGy and the measured half-value layer was 1.82 mm. The half-value layer calculated by MCNP simulation agreed well with the measured value. Effective doses from intraoral radiographies ranged from 1 μSv for the maxilla premolar to 3 μSv for the maxilla incisor. The oral cavity layer (23-82 μSv) and salivary glands (10-68 μSv) received relatively high radiation doses. The thyroid also received a high radiation dose (3-47 μSv). The dosimetry method developed and the radiation doses evaluated in this study can be utilized for policy making, patient dose management, and the development of low-dose equipment. In addition, this study can ultimately contribute to decreasing radiation dose to patients for radiation safety.
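The step from organ doses to effective dose is the tissue-weighted sum E = Σ_T w_T H_T. The ICRP 103 weighting factors below are real; the organ equivalent doses are illustrative values chosen within the reported ranges, not the paper's actual table.

```python
# Effective dose as the ICRP 103 tissue-weighted sum (subset of tissues).

ICRP103_W = {
    "thyroid": 0.04,
    "salivary_glands": 0.01,
    "oral_mucosa": 0.12 / 13,   # one of 13 "remainder" tissues sharing w = 0.12
}

# illustrative organ equivalent doses in microsievert (not the paper's data)
organ_dose_usv = {"thyroid": 25.0, "salivary_glands": 40.0, "oral_mucosa": 50.0}

effective_dose_usv = sum(w * organ_dose_usv[t] for t, w in ICRP103_W.items())
```

The small weights explain why tens of microsievert to individual tissues compress into an effective dose of only a few microsievert per exposure.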
Monte Carlo method for polarized radiative transfer in gradient-index media
International Nuclear Information System (INIS)
Zhao, J.M.; Tan, J.Y.; Liu, L.H.
2015-01-01
Light transfer in gradient-index media generally follows curved ray trajectories, which will cause light beam to converge or diverge during transfer and induce the rotation of polarization ellipse even when the medium is transparent. Furthermore, the combined process of scattering and transfer along curved ray path makes the problem more complex. In this paper, a Monte Carlo method is presented to simulate polarized radiative transfer in gradient-index media that only support planar ray trajectories. The ray equation is solved to the second order to address the effect induced by curved ray trajectories. Three types of test cases are presented to verify the performance of the method, which include transparent medium, Mie scattering medium with assumed gradient index distribution, and Rayleigh scattering with realistic atmosphere refractive index profile. It is demonstrated that the atmospheric refraction has significant effect for long distance polarized light transfer. - Highlights: • A Monte Carlo method for polarized radiative transfer in gradient index media. • Effect of curved ray paths on polarized radiative transfer is considered. • Importance of atmospheric refraction for polarized light transfer is demonstrated
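The curved-ray bookkeeping described above follows from the ray equation d/ds(n dr/ds) = ∇n, whose tangent form is dt/ds = (∇n − (t·∇n)t)/n; advancing positions to second order in the step length gives the scheme sketched below (the linear index profile is an assumed example, not an atmosphere model from the paper).

```python
import numpy as np

# Second-order ray-trajectory stepping in a gradient-index medium.

def trace(r, t, n_func, grad_func, ds, nsteps):
    r, t = np.asarray(r, float), np.asarray(t, float)
    for _ in range(nsteps):
        n, g = n_func(r), grad_func(r)
        a = (g - np.dot(t, g) * t) / n     # ray curvature vector
        r = r + t * ds + 0.5 * a * ds**2   # position advanced to O(ds^2)
        t = t + a * ds
        t /= np.linalg.norm(t)             # keep the tangent a unit vector
    return r, t

# index increasing with height y: an initially horizontal ray bends upward
n_func = lambda r: 1.0 + 0.1 * r[1]
grad_func = lambda r: np.array([0.0, 0.1])
r_end, t_end = trace([0.0, 0.0], [1.0, 0.0], n_func, grad_func, 1e-3, 1000)
```

After unit path length the tangent has rotated by roughly |∇n|/n ≈ 0.1 rad toward the higher-index region, the bending that a polarized Monte Carlo tracer must account for between scattering events.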
Bayesian modelling of uncertainties of Monte Carlo radiative-transfer simulations
Beaujean, Frederik; Eggers, Hans C.; Kerzendorf, Wolfgang E.
2018-04-01
One of the big challenges in astrophysics is the comparison of complex simulations to observations. As many codes do not directly generate observables (e.g. hydrodynamic simulations), the last step in the modelling process is often a radiative-transfer treatment. For this step, the community relies increasingly on Monte Carlo radiative transfer due to the ease of implementation and scalability with computing power. We show how to estimate the statistical uncertainty given the output of just a single radiative-transfer simulation in which the number of photon packets follows a Poisson distribution and the weight (e.g. energy or luminosity) of a single packet may follow an arbitrary distribution. Our Bayesian approach produces a posterior distribution that is valid for any number of packets in a bin, even zero packets, and is easy to implement in practice. Our analytic results for a large number of packets show that we generalise existing methods that are valid only in limiting cases. The statistical problem considered here appears in identical form in a wide range of Monte Carlo simulations, including particle physics and importance sampling. It is particularly powerful in extracting information when the available data are sparse or quantities are small.
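The Poisson part of such an uncertainty model can be sketched directly: with k packets observed in a bin, the posterior for the expected packet count is a Gamma distribution. The Gamma(k + 1/2, 1) form below assumes a Jeffreys prior, which is our choice for illustration rather than necessarily the paper's; the key property it shares with the paper's result is that even an empty bin yields a proper, informative posterior.

```python
import numpy as np

rng = np.random.default_rng(42)

# Posterior samples for the expected number of packets in a bin,
# Gamma(k + 1/2, 1) under a Jeffreys prior (assumed here).

def packet_rate_posterior(k_packets, n_samples=200_000):
    return rng.gamma(shape=k_packets + 0.5, scale=1.0, size=n_samples)

zero_bin = packet_rate_posterior(0)      # proper posterior even with no packets
full_bin = packet_rate_posterior(100)
lo, hi = np.percentile(zero_bin, [16, 84])   # 68% credible interval for k = 0
```

For a well-populated bin the posterior mean approaches k and the relative width shrinks as 1/sqrt(k), recovering the familiar Poisson error bar in the large-count limit.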
International Nuclear Information System (INIS)
Jabbari, N.; Hashemi-Malayeri, B.; Farajollahi, A. R.; Kazemnejad, A.
2007-01-01
In radiotherapy with electron beams, scattered radiation from the electron applicator influences the dose distribution in the patient. The contribution of this radiation to the patient dose is significant, even in modern accelerators. In most radiotherapy treatment planning systems, this component is not explicitly included. In addition, the scattered radiation produced by applicators varies with the applicator design as well as the field size and the distance from the applicator. The aim of this study was to calculate the scattered dose contribution from applicators. We also tried to provide an extensive set of calculated data that could be used as input or benchmark data for advanced treatment planning systems that use Monte Carlo algorithms for dose distribution calculations. Electron beams produced by a NEPTUN 10PC medical linac were modeled using the BEAMnrc system. Central-axis depth dose curves of the electron beams were measured and calculated, with and without the applicators in place, for different field sizes and energies. The scattered radiation from the applicators was determined by subtracting the central-axis depth dose curves obtained without the applicators from those obtained with the applicators. The results of this study indicate that the scattered radiation from the electron applicators of the NEPTUN 10PC is significant and cannot be neglected in advanced treatment planning systems. Furthermore, our results show that the scattered radiation depends on the field size and decreases almost linearly with depth. (author)
Applying graphics processor units to Monte Carlo dose calculation in radiation therapy
Directory of Open Access Journals (Sweden)
Bakhtiari M
2010-01-01
We investigate the potential of using a graphics processor unit (GPU) for Monte Carlo (MC) based radiation dose calculations. The percent depth dose (PDD) of photons in a medium with known absorption and scattering coefficients is computed using an MC simulation running on both a standard CPU and a GPU. We demonstrate that the GPU's capability for massively parallel processing provides a significant acceleration of the MC calculation, and offers a significant advantage for distributed stochastic simulations on a single computer. Harnessing this potential of GPUs will help the early adoption of MC for routine planning in a clinical environment.
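The kernel being accelerated is small enough to sketch on the CPU (the coefficients and the isotropic-scatter model are our illustrative assumptions, not the paper's code); on a GPU, each independent photon history would simply map to one thread.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy percent-depth-dose MC: exponential flights, partial absorption per
# collision via survival weighting, isotropic redirection.

def depth_dose(n_photons, mu_a=0.03, mu_s=0.07, zmax=30.0, nbins=60):
    mu_t = mu_a + mu_s
    dose = np.zeros(nbins)
    for _ in range(n_photons):
        z, u, w = 0.0, 1.0, 1.0          # depth, direction cosine, weight
        while True:
            z += u * rng.exponential(1.0 / mu_t)   # flight to next collision
            if not (0.0 <= z < zmax):
                break                     # photon left the phantom
            dose[int(z / zmax * nbins)] += w * mu_a / mu_t  # absorbed fraction
            w *= mu_s / mu_t              # survival weighting: photon scatters on
            u = rng.uniform(-1.0, 1.0)    # isotropic scatter (direction cosine)
            if w < 1e-3:
                break                     # cheap weight cutoff
    return 100.0 * dose / dose.max()      # normalize to percent depth dose

pdd = depth_dose(20_000)
```

Each history touches only its own accumulator contributions, which is why the algorithm parallelizes so cleanly across GPU threads, with the dose array updated by atomic adds in a real implementation.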
International Nuclear Information System (INIS)
Gerlach, M.; Krumrey, M.; Cibik, L.; Mueller, P.; Ulm, G.
2009-01-01
Monte Carlo techniques are powerful tools to simulate the interaction of electromagnetic radiation with matter. One of the most widespread simulation program packages is Geant4. Almost all physical interaction processes can be included. However, it is not evident what accuracy can be obtained by a simulation. In this work, results of scattering experiments using monochromatized synchrotron radiation in the X-ray regime are quantitatively compared to the results of simulations using Geant4. Experiments were performed for various scattering foils made of different materials such as copper and gold. For energy-dispersive measurements of the scattered radiation, a cadmium telluride detector was used. The detector was fully characterized and calibrated with calculable undispersed as well as monochromatized synchrotron radiation. The obtained quantum efficiency and the response functions are in very good agreement with the corresponding Geant4 simulations. At the electron storage ring BESSY II the number of incident photons in the scattering experiments was measured with a photodiode that had been calibrated against a cryogenic radiometer, so that a direct comparison of scattering experiments with Monte Carlo simulations using Geant4 was possible. It was shown that Geant4 describes the photoeffect, including fluorescence as well as the Compton and Rayleigh scattering, with high accuracy, resulting in a deviation of typically less than 20%. Even polarization effects are widely covered by Geant4, and for Doppler broadening of Compton-scattered radiation the extension G4LECS can be included, but the fact that both features cannot be combined is a limitation. For most polarization-dependent simulations, good agreement with the experimental results was found, except for some orientations where Rayleigh scattering was overestimated in the simulation.
Energy Technology Data Exchange (ETDEWEB)
Gerlach, M. [Physikalisch-Technische Bundesanstalt, Abbestr. 2-12, 10587 Berlin (Germany); Krumrey, M. [Physikalisch-Technische Bundesanstalt, Abbestr. 2-12, 10587 Berlin (Germany)], E-mail: Michael.Krumrey@ptb.de; Cibik, L.; Mueller, P.; Ulm, G. [Physikalisch-Technische Bundesanstalt, Abbestr. 2-12, 10587 Berlin (Germany)
2009-09-11
Monte Carlo-based dose calculation engine for minibeam radiation therapy.
Martínez-Rovira, I; Sempau, J; Prezado, Y
2014-02-01
Minibeam radiation therapy (MBRT) is an innovative radiotherapy approach based on the well-established tissue sparing effect of arrays of quasi-parallel micrometre-sized beams. In order to guide the preclinical trials in progress at the European Synchrotron Radiation Facility (ESRF), a Monte Carlo-based dose calculation engine has been developed and successfully benchmarked with experimental data in anthropomorphic phantoms. Additionally, a realistic example of a treatment plan is presented. Despite the micron scale of the voxels used to tally dose distributions in MBRT, the combination of several efficiency optimisation methods made it possible to achieve acceptable computation times for clinical settings (approximately 2 h). The calculation engine can be easily adapted with little or no programming effort to other synchrotron sources or to dose calculations in the presence of contrast agents. Copyright © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
ROEED, K; BRUGGER, M; CALVIANI, M; CERUTTI, F; CHIN, P W; CHRISTOV, A; FERRARI, A; KRAMER, D; KWEE, R E; LEBBOS, E; LECHNER, A; LOSITO, R; MALA, P; MEREGHETTI, A; NOWAK, E M; SINUELA PASTOR, D; SPIEZIA, G; THORNTON, A; VERSACI, R; VLACHOUDIS, V; WEISS, C; CERN. Geneva. ATS Department
2011-01-01
At the LHC various underground areas are partly equipped with commercial electronic devices not specifically designed to be radiation tolerant. A major concern is therefore radiation-induced failures, in particular due to Single Event Upsets (SEU). To ensure safe and acceptable operation of the LHC electronics, a combination of FLUKA Monte Carlo simulations and dedicated online monitoring is applied to determine the expected radiation levels in critical areas. The LHC Radiation Monitor (RadMon) used for this purpose has already been extensively calibrated for its radiation response in various irradiation facilities. It is nevertheless of high importance to also provide a real LHC application benchmark to validate the approach of combining simulations and monitoring to correctly measure and predict radiation levels. This report therefore presents a comparison between FLUKA Monte Carlo simulations and measurement results obtained with the RadMon in the LHC collimation region IR7. The work is carried out with...
International Nuclear Information System (INIS)
Vautrin, M.
2011-01-01
Contrast-enhanced stereotactic synchrotron radiation therapy (SSRT) is an innovative technique based on localized dose-enhancement effects obtained by reinforced photoelectric absorption in the tumor. Medium energy monochromatic X-rays (50 - 100 keV) are used for irradiating tumors previously loaded with a high-Z element. Clinical trials of SSRT are being prepared at the European Synchrotron Radiation Facility (ESRF), where an iodinated contrast agent will be used. In order to compute the energy deposited in the patient (dose), a dedicated treatment planning system (TPS) has been developed for the clinical trials, based on the ISOgray TPS. This work focuses on the SSRT-specific modifications of the TPS, especially to the PENELOPE-based Monte Carlo dose engine. The TPS uses a dedicated Monte Carlo simulation of medium energy polarized photons to compute the deposited energy in the patient. Simulations are performed considering the synchrotron source, the modeled beamline geometry and finally the patient. Specific materials were also implemented in the voxelized geometry of the patient, to account for iodine concentrations in the tumor. The computation process has been optimized and parallelized. Finally, a specific computation of absolute doses and associated irradiation times (instead of monitor units) was implemented. The dedicated TPS was validated with depth dose curves, dose profiles and absolute dose measurements performed at the ESRF in a water tank and solid water phantoms with or without bone slabs. (author)
PREMAR-2: a Monte Carlo code for radiative transport simulation in atmospheric environments
International Nuclear Information System (INIS)
Cupini, E.
1999-01-01
The peculiarities of the PREMAR-2 code, aimed at Monte Carlo simulation of radiation transport in atmospheric environments in the infrared-ultraviolet frequency range, are described. With respect to the previously developed PREMAR code, the new code handles not only plane multilayers but also spherical multilayers and finite sequences of vertical layers, each with its own atmospheric behaviour, together with the refraction phenomenon, so that long-range, highly slanted paths can now be taken into account more faithfully. A zenithal angular dependence of the albedo coefficient has moreover been introduced. Lidar systems, with spatially independent source and telescope, can again be simulated, and in this latest version of the code sensitivity analyses can be performed: the consequences for radiation transport of small perturbations in physical components of the atmospheric environment may be analyzed and the related effects on the sought results estimated. The code requires a library of physical data (reaction coefficients, phase functions and refraction indexes) providing the essential features of the environment of interest needed for the Monte Carlo simulation. Variance reduction techniques have been enhanced in the PREMAR-2 code, for instance by introducing a local forced-collision technique, especially apt for Lidar system simulations. Encouraging comparisons between code and experimental results carried out at the Brasimone Centre of ENEA have so far been obtained, even if further checks of the code are to be performed
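The forced-collision variance reduction technique mentioned above can be sketched as follows. This is an illustrative sketch of the general idea, not PREMAR-2's actual implementation: the particle is split into an uncollided part, whose weight is the transmission probability, and a collided part whose flight distance is drawn from the exponential distribution truncated to the segment.

```python
import math
import random

def forced_collision(weight, sigma_t, d, rng=random):
    """Split a particle crossing a segment of length d (total interaction
    coefficient sigma_t) into an uncollided part and a part forced to
    collide inside the segment.
    Returns (w_uncollided, w_collided, collision_distance)."""
    p_coll = 1.0 - math.exp(-sigma_t * d)   # probability of a collision in [0, d]
    w_unc = weight * (1.0 - p_coll)         # transmitted particle, reduced weight
    w_col = weight * p_coll                 # particle forced to collide
    u = rng.random()
    # flight distance from the exponential truncated to [0, d)
    s = -math.log(1.0 - u * p_coll) / sigma_t
    return w_unc, w_col, s
```

Because the total weight is conserved, the estimator stays unbiased while every history contributes a collision — useful when, as in a lidar geometry, collisions in a thin region drive the detector signal.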
Model planetary nebulae: the effect of shadowed filaments on low ionization potential ion radiation
International Nuclear Information System (INIS)
Katz, A.
1977-01-01
Previous homogeneous model planetary nebulae calculations have yielded emission strengths for low ionization potential ions which are considerably lower than those observed. Several attempts were made to correct this problem by the inclusion of optically thin condensations, the use of energy flux distributions from stellar model calculations instead of blackbody spectrum stars, and the inclusion of dust in the nebulae. The effect that shadowed filaments have on the ionization and thermal structure of model nebulae and the resultant line strengths is considered here. These radial filaments are shielded from the direct stellar ionizing radiation by optically thick condensations in the nebula. Theoretical and observational evidence exists for the presence of condensations and filaments. Since the only source of ionizing photons in the shadowed filaments is the diffuse photons produced by recombination, ions of lower ionization potential are expected to exist there in greater numbers than in the rest of the nebula. This leads to increased line strengths from these ions, bringing their values closer to the observed ones. It is shown that these line strengths in the filaments increase by one to two orders of magnitude relative to values found in homogeneous models. This results in an increase of approximately one order of magnitude for these lines when contributions from both components of the nebula are considered. The parameters that determine the exact value of the increase are the radial location of the filaments in the nebula and the fraction of the nebular volume occupied by the filaments
Energy Technology Data Exchange (ETDEWEB)
Manchado de Sola, F.; Vilches Pacheco, M.; Lallena Rojo, A. M.; Prezado, Y.
2013-07-01
Radiation therapy with minibeams, still in the testing phase, is a promising form of treatment. The irradiation uses beams composed of a group of parallel strips of radiation and shadow (peaks and valleys), each of which has a width of the order of microns. Using Monte Carlo simulation, we studied the effect of the pulsatile motion of the brain caused by the heartbeat on the peak-to-valley dose ratio in cranial minibeam radiotherapy, as a function of the peak width and the irradiation rate. (Author)
International Nuclear Information System (INIS)
Lim, Chang Hwy; Park, Jong Won; Lee, Junghee; Moon, Myung Kook; Kim, Jongyul; Lee, Suhyun
2015-01-01
A plastic scintillator in an RPM is well suited for detecting γ rays over a wide energy range and is a cost-effective radiation detection material. In order to properly inspect radiation emitted from container cargo, the radiation detection area of a plastic scintillator should be larger than that of a general-purpose radiation detector. However, a large plastic scintillator reduces the light collection efficiency at the photosensitive sensor because of the long light transport distance and light collisions inside the scintillator. The improvement of light collection efficiency in an RPM is therefore one of the major issues in high-performance RPM development. We calculated the change in the amount of collected light as a function of the attachment position and the number of PMTs. To calculate the amount of collected light, the DETECT2000 and MCNP6 Monte Carlo simulation software tools were used. The response signal of an RPM system depends on the position of the incident radiation: if the distance between the radiation source and a PMT is long, more of the signal is lost. PMTs for signal detection in RPM systems have generally been attached on one side of the plastic scintillator. In contrast, the RPM model in this study has two PMTs, attached on two sides of the plastic scintillator. We estimated the difference between results obtained with the one-sided method and with our method. According to the results, the uniformity of the response signal was better than with the one-sided method. If additional simulations and experiments are performed, it will be possible to develop an improved RPM system. In the future, we will perform additional simulations of many different RPM models
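The benefit of two-sided PMT readout described above can be illustrated with a toy light-collection model: exponential bulk attenuation over the transport distance to each PMT. This is a deliberately simplified sketch (not DETECT2000 optics); all numbers and names are illustrative.

```python
import math

def collected(x, length, lam, two_sided):
    """Toy model: relative amount of scintillation light collected for an
    event at position x along a slab of given length, with bulk attenuation
    length lam. One PMT sits at x = 0; two-sided readout adds one at x = length."""
    signal = math.exp(-x / lam)
    if two_sided:
        signal += math.exp(-(length - x) / lam)
    return signal

def uniformity(length, lam, two_sided, n=101):
    """Max/min ratio of the collected signal over event positions;
    values closer to 1 mean a more position-independent response."""
    vals = [collected(i * length / (n - 1), length, lam, two_sided)
            for i in range(n)]
    return max(vals) / min(vals)
```

For a slab twice as long as its attenuation length, the one-sided response varies by a factor of about e² across positions, while the two-sided response stays within roughly 60% — the qualitative effect reported in the abstract.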
Díaz, Oliver; García, Eloy; Oliver, Arnau; Martí, Joan; Martí, Robert
2017-03-01
Scattered radiation is an undesired signal largely present in most digital breast tomosynthesis (DBT) projection images, as no physical rejection methods, i.e. anti-scatter grids, are regularly employed, in contrast to full-field digital mammography. This scatter signal can reduce the visibility of small objects in the image and potentially affect the detection of small breast lesions. Thus, accurate scatter models are needed to minimise the scattered radiation signal via post-processing algorithms. All prior work on scattered radiation estimation has assumed a rigid breast compression paddle (RP) and reported a large contribution of scatter signal from the RP at the detector. In this work, flexible paddles (FPs) tilting from 0° to 10° are studied using Monte Carlo simulations to analyse whether the scatter distribution differs from RP geometries. After reproducing the Hologic Selenia Dimensions geometry (narrow angle) with two (homogeneous and heterogeneous) compressed breast phantoms, the results show that the scatter distribution recorded at the detector varies by up to 22% between RP and FP geometries (depending on the location), mainly due to the decrease in breast thickness observed for the FP. However, the relative contribution from the paddle itself (3-12% of the total scatter) remains approximately unchanged for both setups, and its magnitude depends on the distance to the breast edge.
Monte Carlo simulation of muon radiation environment in China Jinping Underground Laboratory
International Nuclear Information System (INIS)
Su Jian; Zeng Zhi; Liu Yue; Yue Qian; Ma Hao; Cheng Jianping
2012-01-01
The muon radiation background of the China Jinping Underground Laboratory (CJPL) was simulated by the Monte Carlo method. A model of cosmic-ray muons was established using the Gaisser formula and the MUSIC code. The yield and the average energy of muon-induced photons and muon-induced neutrons were then simulated with FLUKA. With the single-energy approximation, the contribution of secondary photons and neutrons to the radiation background of the shielding structure was evaluated. The estimated results show that the average energy of residual muons is 369 GeV and the flux is 3.17 × 10⁻⁶ m⁻²·s⁻¹. The fluence rate of secondary photons is about 1.57 × 10⁻⁴ m⁻²·s⁻¹, and the fluence rate of secondary neutrons is about 8.37 × 10⁻⁷ m⁻²·s⁻¹. The muon radiation background of CJPL is lower than those of most other underground laboratories in the world. (authors)
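The Gaisser formula used above to model the surface muon source term is a short closed-form parameterization of the sea-level muon spectrum; a sketch follows (units cm⁻² s⁻¹ sr⁻¹ GeV⁻¹, valid at high energies where muon decay and Earth-curvature effects are negligible — the regime relevant for muons that survive 2400 m of rock overburden).

```python
def gaisser_flux(e_gev, cos_theta):
    """Gaisser parameterization of the differential sea-level muon flux
    for muon energy e_gev (GeV) at zenith angle with cosine cos_theta.
    The two terms represent muons from pion and kaon decay, with critical
    energies 115 GeV and 850 GeV respectively."""
    y = e_gev * cos_theta
    return 0.14 * e_gev ** -2.7 * (
        1.0 / (1.0 + 1.1 * y / 115.0) +
        0.054 / (1.0 + 1.1 * y / 850.0))
```

The decay-suppression factors make the spectrum steepen beyond the E⁻²·⁷ power law at high energy, which is why deep labs see a hard residual spectrum (the 369 GeV average quoted above).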
Modeling of radiation-induced bystander effect using Monte Carlo methods
Xia, Junchao; Liu, Liteng; Xue, Jianming; Wang, Yugang; Wu, Lijun
2009-03-01
Experiments have shown that the radiation-induced bystander effect exists in cells, tissues, and even whole organisms irradiated with energetic ions or X-rays. In this paper, a Monte Carlo model is developed to study the mechanisms of the bystander effect under sparsely populated cell conditions. This model, based on our previous experiment in which cells were sparsely located in a round dish, focuses mainly on the spatial characteristics. The simulation results agree with the experimental data. Moreover, another bystander effect experiment was also simulated with this model, whose results it successfully predicts. The agreement of simulations with the experimental results indicates the feasibility of the model and the validity of some of the vital mechanisms assumed.
International Nuclear Information System (INIS)
Arias Pullaguari, Ines Yolanda
2003-01-01
The objective of this study was to establish the biological effects of ionizing radiation on occupationally exposed workers. A bibliographic review was made of skin changes in 217 professionals between 21 and 70 years of age (radiologists, X-ray technicians, radioisotope workers, nurses and others) exposed to ionizing radiation in the departments of Diagnosis and Treatment of the Hospital Carlos Andrade Marin in Quito. From this universe, 133 workers were excluded from the analysis. Of the lesions produced on the skin, depilation constituted 40.18%, hyperpigmentation 19.34%, hypopigmentation 9%, capillary fragility 13.39%, erythema 13.39%, and alopecia 5.37%. Of the lesions produced in blood, leukopenia constituted 20.23% among all workers. The percentage method was used for statistical calculation. A bibliographic update is given and the most relevant clinical aspects are reviewed. (The author)
Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes
International Nuclear Information System (INIS)
Smith, L.M.; Hochstedler, R.D.
1997-01-01
Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code)
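The first acceleration technique listed above — replacing linear search algorithms with binary versions — can be illustrated in Python (the actual ITS routines are FORTRAN; these function names are invented for the sketch). A typical use is locating the energy group of a particle in a sorted cross-section grid.

```python
import bisect

def linear_lookup(grid, x):
    """Find i such that grid[i] <= x < grid[i+1] by scanning -- O(n),
    analogous to the original ITS linear search."""
    for i in range(len(grid) - 1):
        if grid[i] <= x < grid[i + 1]:
            return i
    raise ValueError("x outside grid")

def binary_lookup(grid, x):
    """Same lookup via binary search -- O(log n), the accelerated version."""
    if not grid[0] <= x < grid[-1]:
        raise ValueError("x outside grid")
    return bisect.bisect_right(grid, x) - 1
```

Because the lookup is executed once per collision for every history, shaving its cost compounds into the whole-code speed-up factors reported (1.7-2.0).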
Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes
Smith, L. M.; Hochstedler, R. D.
1997-02-01
Spence, H. E.
2017-12-01
We examine and compare the energetic particle ionizing radiation environments at airless planetary surfaces throughout the solar system. Energetic charged particles fill interplanetary space and bathe the environments of planetary objects with a ceaseless source of sometimes powerful yet ever-present ionizing radiation. In turn, these charged particles interact with planetary bodies in various ways, depending upon the properties of the body as well as upon the nature of the charged particles themselves. The Cosmic Ray Telescope for the Effects of Radiation (CRaTER) on the Lunar Reconnaissance Orbiter (LRO), launched in 2009, continues to provide new insights into the ways by which the lunar surface is influenced by these energetic particles. In this presentation, we briefly review some of these mechanisms and how they operate at the Moon, and then compare and contrast the radiation environments at other atmosphereless planetary objects within our solar system that are potential future human exploration targets. In particular, we explore two primary sources of ionizing radiation, galactic cosmic rays (GCR) and solar energetic particles (SEP), in the environments of planetary objects that have weak or absent atmospheres and intrinsic magnetic fields. We motivate the use of simplified scaling relationships with heliocentric distance to estimate their intensity, which then serves as a basis for estimating the relative importance of various energetic particle and planetary surface physical interactions, in the context of humankind's expanding explorations beyond low-Earth orbit.
International Nuclear Information System (INIS)
Zazula, J.M.
1984-01-01
This work concerns the calculation of a neutron response caused by a neutron field perturbed by materials surrounding the source or the detector. The solution of the problem is obtained by coupling a Monte Carlo radiation transport computation for the perturbed region with a discrete ordinates transport computation for the unperturbed system. (author). 62 refs
Energy Technology Data Exchange (ETDEWEB)
Rodrigues, Bruno L.; Tomal, Alessandra [Universidade Estadual de Campinas (UNICAMP), Campinas, SP (Brazil). Instituto de Fisica Gleb Wataghin
2016-07-01
Mammography is the main tool for breast cancer diagnosis, and it is based on the use of X-rays to obtain images. However, the glandular tissue present within the breast is highly sensitive to ionizing radiation, and therefore requires strict quality control in order to minimize the absorbed dose. The quantification of the absorbed dose in the breast tissue can be done by using Monte Carlo simulation, which allows a detailed study of the deposition of energy in different regions of the breast. In addition, the results obtained from the simulation can be associated with experimental data and provide dose values of interest, such as the dose deposited in glandular tissue. (author)
Directory of Open Access Journals (Sweden)
Robert Pincus
2009-06-01
Large-eddy simulation (LES) refers to a class of calculations in which the large energy-rich eddies are simulated directly and are insensitive to errors in the modeling of sub-grid scale processes. Flows represented by LES are often driven by radiative heating and therefore require the calculation of radiative transfer along with the fluid-dynamical simulation. Current methods for detailed radiation calculations, even those using simple one-dimensional radiative transfer, are far too expensive for routine use, while popular shortcuts are either of limited applicability or run the risk of introducing errors on time and space scales that might affect the overall simulation. A new approximate method is described that relies on Monte Carlo sampling of the spectral integration in the heating rate calculation and is applicable to any problem. The error introduced when using this method is substantial for individual samples (single columns at single times but is uncorrelated in time and space and so does not bias the statistics of scales that are well resolved by the LES. The method is evaluated through simulation of two test problems; these behave as expected. A scaling analysis shows that the errors introduced by the method diminish as flow features become well resolved. Errors introduced by the approximation increase with decreasing spatial scale but the spurious energy introduced by the approximation is less than the energy expected in the unperturbed flow, i.e. the energy associated with the spectral cascade from the large scale, even on the grid scale.
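The Monte Carlo spectral-sampling idea described above can be sketched in a few lines: instead of summing the heating-rate contribution of every spectral interval at every column and timestep, evaluate one randomly chosen interval and rescale. Each call is noisy, but the estimator is unbiased, so errors average out over well-resolved scales. The sketch below is illustrative (uniform sampling of equal-weight intervals; names invented), not the paper's production code.

```python
import random

def heating_rate_full(absorptivity, fluxes):
    """Reference broadband heating rate: sum over all spectral intervals."""
    return sum(a * f for a, f in zip(absorptivity, fluxes))

def heating_rate_mc(absorptivity, fluxes, rng=random):
    """Monte Carlo estimate: evaluate a single randomly chosen spectral
    interval and scale by the number of intervals. Noisy per call, but
    its expectation equals the full sum."""
    n = len(fluxes)
    g = rng.randrange(n)
    return n * absorptivity[g] * fluxes[g]
```

In an LES, a different random interval would be drawn for each column and timestep, so the per-sample error is uncorrelated in space and time, as the abstract emphasizes.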
International Nuclear Information System (INIS)
Serikov, A.; Fischer, U.; Leichtle, D.; Pitcher, C.S.
2012-01-01
Highlights: ► Systematic neutronics analyses were conducted to assess the ITER Equatorial Port Plug radiation shielding performance. ► Shielding optimization was achieved by parametric analyses of several design variants using the MCNP5, FISPACT-2007, and R2Smesh codes. ► A dominant effect of radiation streaming along the port plug gaps was recognized. ► Combination of the gap labyrinths and streaming stoppers or rails reduces shutdown doses by 2 orders of magnitude. ► Using the proposed shielding, the shutdown dose in the ITER port interspace is less than the personnel access limit of 100 μSv/h. - Abstract: This paper addresses neutronics aspects of the design development of the Diagnostic Generic Equatorial Port Plug (EPP) in ITER. To secure personnel access at the EPP back-end interspace, parametric neutronics analyses of the EPP radiation environment have been performed and practical shielding solutions have been found. Radiation transport was performed with the Monte Carlo MCNP5 code. Activation calculations were conducted with the FISPACT-2007 inventory code. The R2Smesh approach was applied to couple transport and activation calculations. A newly created EPP local MCNP5 model was devised by extracting the EPP and adjacent blanket modules from the ITER Alite-4.1 model, with proper modification of the EPP geometry in accordance with a recent 3D CAD CATIA model. The EPP local model reproduces the neutronically important features of the EPP and allows investigation of the EPP neutronics effects in isolation from all other ITER components. Thorough EPP parametric analyses revealed a dominant effect of the gaps around the EPP, and several EPP design improvements were implemented as outcomes of the analyses. Gap labyrinths and streaming stoppers inserted into the gaps were shown to be capable of reducing the shutdown dose rate to below the 100 μSv/h personnel access limit, 2 orders of magnitude less than the value in the model with straight gaps.
Monte Carlo calculation of the energy response characteristics of a RadFET radiation detector
Belicev, P.; Spasic Jokic, V.; Mayer, S.; Milosevic, M.; Ilic, R.; Pesic, M.
2010-07-01
The metal-oxide-semiconductor field-effect transistor (MOSFET, RadFET) is frequently used as a sensor of ionizing radiation in nuclear medicine, diagnostic radiology, radiotherapy quality assurance, and the nuclear and space industries. We focused our investigations on calculating the energy response of a p-type RadFET to low-energy photons in the range from 12 keV to 2 MeV and on understanding the influence of uncertainties in the composition and geometry of the device on the calculated energy response function. All results were normalized to unit air kerma incident on the RadFET for an incident photon energy of 1.1 MeV. The calculations of the energy response characteristics of the RadFET radiation detector were performed via Monte Carlo simulations using the MCNPX code; for a limited number of incident photon energies, the FOTELP code was also used for comparison. The geometry of the RadFET was modeled as a simple stack of appropriate materials. Our goal was to obtain results with statistical uncertainties better than 1% (fulfilled in the MCNPX calculations for all incident energies), which resulted in simulations with 1-2 × 10⁹ histories.
Monte Carlo simulation of the sequential probability ratio test for radiation monitoring
International Nuclear Information System (INIS)
Coop, K.L.
1984-01-01
A computer program simulates the Sequential Probability Ratio Test (SPRT) using Monte Carlo techniques. The program, SEQTEST, performs random-number sampling of either a Poisson or normal distribution to simulate radiation monitoring data. The results are in terms of the detection probabilities and the average time required for a trial. The computed SPRT results can be compared with tabulated single interval test (SIT) values to determine the better statistical test for particular monitoring applications. Use of the SPRT in a hand-and-foot alpha monitor shows that the SPRT provides better detection probabilities while generally requiring less counting time. Calculations are also performed for a monitor where the SPRT is not permitted to take longer than the single interval test. Although the performance of the SPRT is degraded by this restriction, the detection probabilities are still similar to the SIT values, and average counting times are always less than 75% of the SIT time. Some optimal conditions for use of the SPRT are described. The SPRT should be the test of choice in many radiation monitoring situations. 6 references, 8 figures, 1 table
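Wald's SPRT on Poisson count data, the test the SEQTEST program simulates, can be sketched as follows (an illustrative Python analogue, not the original program; parameter names are invented). The cumulative log-likelihood ratio between the background-only and background-plus-source hypotheses is compared against two thresholds derived from the target false-alarm rate α and miss rate β.

```python
import math

def sprt_poisson(counts, bkg_rate, src_rate, t, alpha=0.05, beta=0.05):
    """Sequential probability ratio test on a stream of Poisson counts
    observed in intervals of length t: H0 mean = bkg_rate*t,
    H1 mean = (bkg_rate + src_rate)*t.
    Returns ('H1' or 'H0' or 'undecided', number of intervals used)."""
    m0 = bkg_rate * t
    m1 = (bkg_rate + src_rate) * t
    upper = math.log((1.0 - beta) / alpha)   # accept H1 (source present)
    lower = math.log(beta / (1.0 - alpha))   # accept H0 (background only)
    llr = 0.0
    for n, c in enumerate(counts, start=1):
        # Poisson log-likelihood ratio increment for one interval
        llr += c * math.log(m1 / m0) - (m1 - m0)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(counts)
```

Unlike a fixed-length single interval test, the SPRT stops as soon as the evidence is decisive, which is the source of the shorter average counting times reported above.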
A Monte Carlo Radiation Transfer Study of Photospheric Emission in Gamma-Ray Bursts
Parsotan, Tyler; Lazzati, Davide
2018-01-01
We present the analysis of photospheric emission for a set of hydrodynamic simulations of long duration gamma-ray burst jets from massive compact stars. The results are obtained by using the Monte Carlo Radiation Transfer code (MCRaT) to simulate thermal photons scattering through the collimated outflows. MCRaT allows us to study explicitly the time evolution of the photosphere within the photospheric region, as well as the gradual decoupling of the photon and matter counterparts of the jet. The results of the radiation transfer simulations are also used to construct light curves and time-resolved spectra at various viewing angles, which are then used to make comparisons with observed data and outline the agreement and strain points between the photospheric model and long duration gamma-ray burst observations. We find that our fitted time-resolved spectral Band β parameters are in agreement with observations, even though we do not consider the effects of nonthermal particles. Finally, the results are found to be consistent with the Yonetoku correlation, but bear some strain with the Amati correlation.
Radiation dose performance in the triple-source CT based on a Monte Carlo method
Yang, Zhenyu; Zhao, Jun
2012-10-01
Multiple-source structures are promising in the development of computed tomography, for they could effectively eliminate motion artifacts in cardiac scanning and other time-critical implementations requiring high temporal resolution. However, concerns about dose performance cloud this technique, as few reports evaluating the dose performance of multiple-source CT have been published. Our experiments focus on the dose performance of one specific multiple-source CT geometry, the triple-source CT scanner, whose theory and implementation have already been well established and verified by our previous work. We modeled the triple-source CT geometry with the EGSnrc Monte Carlo radiation transport code system and simulated CT examinations of a digital chest phantom with our modified version of the software, using an x-ray spectrum according to the data of a physical tube. A single-source CT geometry is also modeled and tested for evaluation and comparison. The absorbed dose of each organ is calculated according to its actual physical characteristics. Results show that the absorbed radiation dose of organs with the triple-source CT is almost equal to that with the single-source CT system. Given its temporal-resolution advantage, the triple-source CT would be a better choice for x-ray cardiac examinations.
GPU-BASED MONTE CARLO DUST RADIATIVE TRANSFER SCHEME APPLIED TO ACTIVE GALACTIC NUCLEI
International Nuclear Information System (INIS)
Heymann, Frank; Siebenmorgen, Ralf
2012-01-01
A three-dimensional parallel Monte Carlo (MC) dust radiative transfer code is presented. To overcome the huge computing-time requirements of MC treatments, the computational power of vectorized hardware is used, utilizing either multi-core computer power or graphics processing units. The approach is a self-consistent way to solve the radiative transfer equation in arbitrary dust configurations. The code calculates the equilibrium temperatures of two populations of large grains and stochastically heated polycyclic aromatic hydrocarbons. Anisotropic scattering is treated applying the Henyey-Greenstein phase function. The spectral energy distribution (SED) of the object is derived at low spatial resolution by a photon counting procedure and at high spatial resolution by a vectorized ray tracer. The latter allows computation of high signal-to-noise images of the objects at any frequency and arbitrary viewing angle. We test the robustness of our approach against other radiative transfer codes. The SED and dust temperatures of one- and two-dimensional benchmarks are reproduced at high precision. The parallelization capability of various MC algorithms is analyzed and included in our treatment. We utilize the Lucy algorithm for the optically thin case where the Poisson noise is high, the iteration-free Bjorkman and Wood method to reduce the calculation time, and the Fleck and Canfield diffusion approximation for extremely optically thick cells. The code is applied to model the appearance of active galactic nuclei (AGNs) at optical and infrared wavelengths. The AGN torus is clumpy and includes fluffy composite grains of various sizes made up of silicates and carbon. The dependence of the SED on the number of clumps in the torus and the viewing angle is studied. The appearance of the 10 μm silicate features in absorption or emission is discussed. The SED of the radio-loud quasar 3C 249.1 is fit by the AGN model and a cirrus component to account for the far-infrared emission.
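The Henyey-Greenstein phase function used above for anisotropic scattering has a closed-form inverse CDF, so a scattering-angle cosine can be drawn with one uniform random number. A minimal sketch (standard textbook formula; the function name is invented):

```python
import random

def sample_hg(g, rng=random):
    """Draw a scattering-angle cosine from the Henyey-Greenstein phase
    function with asymmetry parameter g (-1 < g < 1; g = 0 is isotropic,
    g > 0 is forward-peaked)."""
    u = rng.random()
    if abs(g) < 1e-6:
        return 2.0 * u - 1.0  # isotropic limit, avoids division by ~0
    frac = (1.0 - g * g) / (1.0 - g + 2.0 * g * u)
    return (1.0 + g * g - frac * frac) / (2.0 * g)
```

By construction the mean sampled cosine equals g, a convenient check when wiring this into an MC scattering loop.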
Development of an accurate 3D Monte Carlo broadband atmospheric radiative transfer model
Jones, Alexandra L.
Radiation is the ultimate source of energy that drives our weather and climate. It is also the fundamental quantity detected by satellite sensors from which earth's properties are inferred. Radiative energy from the sun and emitted from the earth and atmosphere is redistributed by clouds, in one of their most important roles in the atmosphere. Without accurately representing these interactions we greatly reduce our ability to predict climate change and weather patterns and to observe our environment from space. The remote sensing algorithms and dynamic models used to study and observe earth's atmosphere all parameterize radiative transfer with approximations that reduce or neglect horizontal variation of the radiation field, even in the presence of clouds. Despite complete knowledge of the underlying physics at work, these approximations persist due to perceived computational expense. In the current context of high-resolution modeling and remote sensing observations of clouds, from shallow cumulus to deep convective clouds, and given our ever-advancing technological capabilities, these approximations have been exposed as inappropriate in many situations. This presents a need for accurate 3D spectral and broadband radiative transfer models to provide bounds on the interactions between clouds and radiation, to judge the accuracy of similar but less expensive models, and to aid in new parameterizations that take 3D effects into account when coupled to dynamic models of the atmosphere. Developing such a state-of-the-art model based on the open-source, object-oriented framework of the I3RC Monte Carlo Community Radiative Transfer ("IMC-original") Model is the task at hand. It has involved incorporating (1) thermal emission sources of radiation ("IMC+emission model"), allowing it to address remote sensing problems involving scattering of light emitted at earthly temperatures as well as spectral cooling rates, (2) spectral integration across an arbitrary
Energy Technology Data Exchange (ETDEWEB)
Fomin, B.A. [CPTEC/INPE, Rod. Presidente Dutra, km.40, Cachoeira Paulista, Sao Paulo, 12630-000 (Brazil)]. E-mail: fomin@cptec.inpe.br
2006-03-15
An algorithm for calculations of the longwave radiation in cloudy and aerosol-laden slab atmospheres is described. It is based on the line-by-line and Monte Carlo methods and is suitable for accurate treatment of both gaseous absorption and particulate multiple scattering in any spectral region; other published algorithms of comparable accuracy can only perform calculations in narrow spectral regions. The algorithm is therefore well suited for radiation code validation, for theoretical investigations of radiative transfer in clouds and aerosols, and for satellite signal simulations.
Measurement of the He II radiation field in planetary nebulae through Bowen fluorescence
Bhatia, A. K.; Kastner, S. O.
1987-01-01
Excitation of O III by He II is treated for sources over a useful range of densities to give accurate predictions of Bowen/non-Bowen line ratios. These are applied to recent observations of planetary nebulae to show that Bowen excitation increases monotonically with excitation class, and to deduce other important consequences.
International Nuclear Information System (INIS)
Mercier, B.; Meurant, G.; Tassart, J.
1985-04-01
The equations have recently been described in the fluid frame. This simplifies the collision term, but the streaming term must then include angular deviation and the Doppler shift. We choose the latter description, which is more convenient for our purpose. We introduce some notation and recall some facts about stochastic kernels and the Monte Carlo method. We show how to apply the Monte Carlo method to a transport equation with an arbitrary streaming term; in particular, we show that the track-length estimator is unbiased. We review some properties of the radiation hydrodynamics equations and show how energy conservation is obtained. Then, we apply the Monte Carlo method explained in section 2 to the particular case of the transfer equation in the fluid frame. Finally, we describe a physical example and give some numerical results.
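The unbiasedness of the track-length estimator mentioned above can be illustrated in a toy setting: for a purely absorbing slab, the mean path length travelled inside the slab per source particle equals the analytic volume-integrated flux (1 - exp(-sigma*L))/sigma. A minimal sketch under simplifying assumptions (monoenergetic particles, normal incidence, absorption only); this is not the authors' code:

```python
import math
import random

def track_length_flux(sigma, L, n, seed=1):
    """Monte Carlo estimate of the volume-integrated flux in a purely
    absorbing slab of thickness L (attenuation coefficient sigma),
    using the track-length estimator: each history scores the path
    length it travels inside the slab before absorption or leakage."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        d = -math.log(1.0 - rng.random()) / sigma   # distance to absorption
        total += min(d, L)                          # leaks out beyond L
    return total / n

sigma, L = 2.0, 1.5
estimate = track_length_flux(sigma, L, 100_000)
analytic = (1.0 - math.exp(-sigma * L)) / sigma
print(round(estimate, 3), round(analytic, 3))
```

The estimate agrees with the analytic value to within statistical noise, which is what "unbiased" means in practice.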
International Nuclear Information System (INIS)
Medeiros, Marcos P.C.; Rebello, Wilson F.; Andrade, Edson R.; Silva, Ademir X.
2015-01-01
Nuclear explosions are usually described in terms of their total yield and the associated shock wave, thermal radiation, and nuclear radiation effects. The nuclear radiation produced in such events has several components, consisting mainly of alpha and beta particles, neutrinos, X-rays, neutrons, and gamma rays. For practical purposes, the radiation from a nuclear explosion is divided into 'initial nuclear radiation', referring to what is emitted within one minute after the detonation, and 'residual nuclear radiation', covering everything else. The initial nuclear radiation can also be split between 'instantaneous' or 'prompt' radiation, which involves neutrons and gamma rays from fission and from interactions between neutrons and nuclei of surrounding materials, and 'delayed' radiation, comprising emissions from the decay of fission products and from interactions of neutrons with nuclei of the air. This work presents isodose curve calculations at ground level by Monte Carlo simulation, allowing risk assessment and consequence modeling in a radiation protection context. The isodose curves are for neutrons produced by the prompt nuclear radiation from a hypothetical nuclear explosion with a total yield of 20 kt. The neutron fluence and emission spectrum were based on data available in the literature. Doses were calculated as the neutron ambient dose equivalent H*(10)n. (author)
Energy Technology Data Exchange (ETDEWEB)
Medeiros, Marcos P.C.; Rebello, Wilson F.; Andrade, Edson R., E-mail: rebello@ime.eb.br, E-mail: daltongirao@yahoo.com.br [Instituto Militar de Engenharia (IME), Rio de Janeiro, RJ (Brazil). Secao de Engenharia Nuclear; Silva, Ademir X., E-mail: ademir@nuclear.ufrj.br [Corrdenacao dos Programas de Pos-Graduacao em Egenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear
2015-07-01
Nuclear explosions are usually described in terms of their total yield and the associated shock wave, thermal radiation, and nuclear radiation effects. The nuclear radiation produced in such events has several components, consisting mainly of alpha and beta particles, neutrinos, X-rays, neutrons, and gamma rays. For practical purposes, the radiation from a nuclear explosion is divided into 'initial nuclear radiation', referring to what is emitted within one minute after the detonation, and 'residual nuclear radiation', covering everything else. The initial nuclear radiation can also be split between 'instantaneous' or 'prompt' radiation, which involves neutrons and gamma rays from fission and from interactions between neutrons and nuclei of surrounding materials, and 'delayed' radiation, comprising emissions from the decay of fission products and from interactions of neutrons with nuclei of the air. This work presents isodose curve calculations at ground level by Monte Carlo simulation, allowing risk assessment and consequence modeling in a radiation protection context. The isodose curves are for neutrons produced by the prompt nuclear radiation from a hypothetical nuclear explosion with a total yield of 20 kt. The neutron fluence and emission spectrum were based on data available in the literature. Doses were calculated as the neutron ambient dose equivalent H*(10)n. (author)
Monte Carlo simulation of proton boron fusion reaction for radiation therapy
Energy Technology Data Exchange (ETDEWEB)
Kim, Sun Mi; Yoon, Do Kun; Suh, Tae Suk [Catholic University of Korea, Seoul (Korea, Republic of)
2016-05-15
The principle of proton boron fusion therapy (PBFT) as a radiation therapy technique is based on this reaction. Because three alpha particles can contribute to the death of a tumor cell for each proton used, high therapeutic efficiency can be achieved with a smaller flux than in conventional proton therapy or boron neutron capture therapy (BNCT). In BNCT, after a thermal neutron is captured by the boron labeled in the tumor region, an alpha particle is emitted from the capture reaction point; a single alpha particle induces tumor cell death per capture reaction. In contrast, three alpha particles are emitted from the point of the proton boron fusion reaction. If this reaction is applied to radiation therapy, the treatment could be more effective at inducing tumor cell death with a smaller flux. In addition, the proton's energy loss during its propagation through matter is described by the Bragg peak. After the boron-labeled compound has accumulated in the tumor region, if the region of the proton's maximum dose (the Bragg peak) falls within the tumor region, which is the boron uptake region (BUR), a dramatic therapeutic effect with less damage to normal tissue can be expected. This study introduces a therapy method using the proton boron fusion reaction and verifies the theoretical validity of PBFT using Monte Carlo simulations. The simulation had two parts. First, the variation of the proton's Bragg peak depending on the location of the BUR was examined. Second, a simulation was performed to confirm the existence of the 719 keV prompt gamma ray peak in the simulated energy spectrum. Since the PBFT method is still at the conceptual stage, verification of its effectiveness through a physical approach is required.
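The appeal of the p + 11B -> 3 alpha reaction described above comes from its Q-value, which is recoverable from standard atomic masses; the released energy is shared among the three alpha particles. A quick hedged check (mass values quoted from standard tables, not from the abstract):

```python
# Q-value of the proton-boron fusion reaction p + 11B -> 3 alpha,
# from atomic mass differences (masses in unified atomic mass units).
U_TO_MEV = 931.494          # energy equivalent of 1 u, in MeV
m_H1 = 1.0078250            # 1H atomic mass
m_B11 = 11.0093054          # 11B atomic mass
m_He4 = 4.0026033           # 4He atomic mass

q_mev = (m_H1 + m_B11 - 3.0 * m_He4) * U_TO_MEV
print(round(q_mev, 2))      # about 8.7 MeV, shared among three alphas
```

Using atomic (rather than nuclear) masses is valid here because the electron counts on both sides of the reaction balance.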
Monte Carlo simulations of relativistic radiation-mediated shocks - I. Photon-rich regime
Ito, Hirotaka; Levinson, Amir; Stern, Boris E.; Nagataki, Shigehiro
2018-02-01
We explore the physics of relativistic radiation-mediated shocks (RRMSs) in the regime where photon advection dominates over photon generation. For this purpose, a novel iterative method for deriving a self-consistent steady-state structure of RRMS is developed, based on a Monte Carlo code that solves the transfer of photons subject to Compton scattering and pair production/annihilation. A systematic study is performed by imposing various upstream conditions which are characterized by the following three parameters: the photon-to-baryon inertia ratio ξu*, the photon-to-baryon number ratio ñ, and the shock Lorentz factor γu. We find that the properties of RRMSs vary considerably with these parameters. In particular, while a smooth decline in the velocity, accompanied by a gradual temperature increase, is seen for ξu* ≫ 1, an efficient bulk Comptonization, which leads to a heating precursor, is found for ξu* ≲ 1. As a consequence, although particle acceleration is highly inefficient in these shocks, a broad non-thermal spectrum is produced in the latter case. The generation of high-energy photons through bulk Comptonization leads, in certain cases, to a copious production of pairs that provide the dominant opacity for Compton scattering. We also find that for certain upstream conditions a weak subshock appears within the flow. For a choice of parameters suitable to gamma-ray bursts, the radiation spectrum within the shock is found to be compatible with that of the prompt emission, suggesting that subphotospheric shocks may give rise to the observed non-thermal features despite the absence of accelerated particles.
Tauscher, Courtney; Schuerger, Andrew C; Nicholson, Wayne L
2006-08-01
Bacterial spores have been considered as microbial life that could survive interplanetary transport by natural impact processes or human spaceflight activity. Deposition of terrestrial microbes or their biosignature molecules onto the surface of Mars could negatively impact life detection experiments and planetary protection measures. Simulated Mars solar radiation, particularly the ultraviolet component, has been shown to reduce spore viability, but its effect on spore germination and resulting production of biosignature molecules has not been explored. We examined the survival and germinability of Bacillus subtilis spores exposed to simulated martian conditions that include solar radiation. Spores of B. subtilis that contain luciferase resulting from expression of an sspB-luxAB gene fusion were deposited on aluminum coupons to simulate deposition on spacecraft surfaces and exposed to simulated Mars atmosphere and solar radiation. The equivalent of 42 min of simulated Mars solar radiation exposure reduced spore viability by nearly 3 logs, while germination-induced bioluminescence, a measure of germination metabolism, was reduced by less than 1 log. The data indicate that spores can retain the potential to initiate germination-associated metabolic processes and produce biological signature molecules after being rendered nonviable by exposure to Mars solar radiation.
Radiation field characterization of a BNCT research facility using Monte Carlo method - code MCNP-4B
International Nuclear Information System (INIS)
Hernandez, Antonio Carlos
2002-01-01
Boron Neutron Capture Therapy (BNCT) is a selective cancer treatment that arises as an alternative therapy when the usual techniques (surgery, chemotherapy or radiotherapy) show no satisfactory results. The main purpose of this work is to design a facility for BNCT studies. This facility relies on an Am-Be neutron source and a set of moderators, filters and shielding that will provide the best neutron/gamma beam characteristics for these BNCT studies, i.e., high-intensity thermal and/or epithermal neutron fluxes with the minimum feasible gamma ray and fast neutron contamination. A computational model of the experiment was used to obtain the radiation field at the sample irradiation position. The calculations were performed with the MCNP 4B Monte Carlo code and the results obtained can be regarded as satisfactory, i.e., a thermal neutron fluence N_T = 1.35x10^8 n/cm^2, a fast neutron dose of 5.86x10^-10 Gy/N_T and a gamma ray dose of 8.30x10^-14 Gy/N_T. (author)
Monte Carlo simulation of direct and indirect effects of radiation on DNA target structure
International Nuclear Information System (INIS)
Tomita, H.; Kai, M.; Kusama, T.; Aoki, Y.; Ito, A.
1993-01-01
A new theoretical model for estimating the yields of DNA strand breaks induced by monoenergetic electrons is presented. It is based on Monte Carlo track structure simulation and on new DNA structure models (one turn of double-strand DNA, nucleosome, solenoid), and links the physical and chemical stages of radiation action. Direct and indirect effects are strictly distinguished. Some results of the calculations indicated: (i) the number of single strand breaks per nucleus (6 μm diameter) per Gy in pure water was about ten times that in a cell environment; (ii) the contribution of indirect effects to total damage decreased as the order of the DNA target model structure used in the simulation increased (e.g., one-turn model of double-strand DNA, ∼98.4%, but 30 nm solenoid model, ∼86.1%). The present study indicates that information from morphological and biochemical examinations of the cell environment must be considered more carefully in computer simulations.
Evaluation of the scattered radiation components produced in a gamma camera using Monte Carlo method
Energy Technology Data Exchange (ETDEWEB)
Polo, Ivon Oramas, E-mail: ivonoramas67@gmail.com [Department of Nuclear Engineering, Faculty of Nuclear Sciences and Technologies, Higher Institute of Applied Science and Technology (InSTEC), La Habana (Cuba)
2014-07-01
Introduction: this paper presents a simulation for evaluating the scattered radiation components produced in a PARK gamma camera using the Monte Carlo code SIMIND. It simulates a whole-body study with the MDP (methylene diphosphonate) radiopharmaceutical based on the Zubal anthropomorphic phantom, with some spinal lesions. Methods: the simulation compared three configurations for the detected photons. The corresponding energy spectra were obtained using a Low Energy High Resolution collimator. The parameters related to the interactions and the fraction of events in the energy window, the simulated events of the spectrum, and the scatter events were calculated. Results: the simulation confirmed that images without the influence of scattering events have a higher number of valid recorded events, improving their statistical quality. A comparison among different collimators was made. The parameters and detector energy spectrum were calculated for each simulation configuration with these collimators using 99mTc. Conclusion: the simulation corroborated that the LEHS collimator has higher sensitivity and the HEHR collimator has lower sensitivity when used with low-energy photons. (author)
International Nuclear Information System (INIS)
Fraass, Benedick A.; Smathers, James; Deye, James
2003-01-01
Due to the significant interest in Monte Carlo dose calculations for external beam megavoltage radiation therapy from both the research and commercial communities, a workshop was held in October 2001 to assess the status of this computational method with regard to use for clinical treatment planning. The Radiation Research Program of the National Cancer Institute, in conjunction with the Nuclear Data and Analysis Group at the Oak Ridge National Laboratory, gathered a group of experts in clinical radiation therapy treatment planning and Monte Carlo dose calculations, and examined issues involved in clinical implementation of Monte Carlo dose calculation methods in clinical radiotherapy. The workshop examined the current status of Monte Carlo algorithms, the rationale for using Monte Carlo, algorithmic concerns, clinical issues, and verification methodologies. Based on these discussions, the workshop developed recommendations for future NCI-funded research and development efforts. This paper briefly summarizes the issues presented at the workshop and the recommendations developed by the group.
Energy Technology Data Exchange (ETDEWEB)
Zucca Aparcio, D.; Perez Moreno, J. M.; Fernandez Leton, P.; Garcia Ruiz-Zorrila, J.
2016-10-01
The commissioning procedure of a Monte Carlo (MC) treatment planning system for photon beams from a dedicated stereotactic body radiosurgery (SBRT) unit is reported in this document. XVMC is the MC code available in the treatment planning system evaluated (BrainLAB iPlan RT Dose). It is based on virtual source models that simulate the primary and scattered radiation, as well as the electron contamination, using Gaussian components whose modelling requires measurements of dose profiles, percentage depth doses and output factors, performed both in water and in air. The dosimetric accuracy of the particle transport simulation was analyzed by validating the calculations in homogeneous and heterogeneous media against measurements made under the same conditions as the dose calculation, and by checking the stochastic behaviour of the Monte Carlo calculations when using different statistical variances. Likewise, it was verified how the planning system performs the conversion from dose-to-medium to dose-to-water, applying the water-to-medium stopping-power ratio, in the presence of heterogeneities where this phenomenon is relevant, such as high-density media (cortical bone). (Author)
Directory of Open Access Journals (Sweden)
Daniel G Zhang
MRI is often used in tumor localization for radiotherapy treatment planning, with gadolinium (Gd)-containing materials often introduced as a contrast agent. Motexafin gadolinium is a novel radiosensitizer currently being studied in clinical trials. Nanoparticle technologies can target tumors with high concentrations of high-Z materials. This Monte Carlo study is the first detailed quantitative investigation of high-Z material Gd-induced dose enhancement in megavoltage external beam photon therapy. BEAMnrc, a radiotherapy Monte Carlo simulation package, was used to calculate dose enhancement as a function of Gd concentration. Published phase-space files for the TrueBeam flattening filter free (FFF) and conventional flattened 6 MV photon beams were used. High dose rate (HDR) brachytherapy with an Ir-192 source was also investigated as a reference. The difference in energy spectra caused a dose enhancement difference between the two beams. Since the Ir-192 photons have still lower energies, the photoelectric effect in the presence of Gd leads to even higher dose enhancement in HDR. At a depth of 1.8 cm, the percent mean dose enhancement for the FFF beam was 0.38±0.12, 1.39±0.21, 2.51±0.34, 3.59±0.26, and 4.59±0.34 for Gd concentrations of 1, 5, 10, 15, and 20 mg/mL, respectively. The corresponding values for the flattened beam were 0.09±0.14, 0.50±0.28, 1.19±0.29, 1.68±0.39, and 2.34±0.24. For Ir-192 with direct contact, the enhancements were 0.50±0.14, 2.79±0.17, 5.49±0.12, 8.19±0.14, and 10.80±0.13. Gd-containing materials used as MRI contrast agents can thus also potentially serve as radiosensitizers in radiotherapy. This study demonstrates that Gd can be used to enhance radiation dose in target volumes not only in HDR brachytherapy but also in 6 MV FFF external beam radiotherapy, although concentrations higher than those currently used clinically (>5 mg/mL) would be needed.
Tryggestad, E; Armour, M; Iordachita, I; Verhaegen, F; Wong, J W
2009-09-07
Our group has constructed the small animal radiation research platform (SARRP) for delivering focal, kilovoltage radiation to targets in small animals under robotic control using cone-beam CT guidance. The present work was undertaken to support the SARRP's treatment planning capabilities. We have devised a comprehensive system for characterizing the radiation dosimetry in water for the SARRP and have developed a Monte Carlo dose engine with the intent of reproducing these measured results. We find that the SARRP provides sufficient therapeutic dose rates, ranging from 102 to 228 cGy/min at 1 cm depth, for the available set of high-precision beams ranging from 0.5 to 5 mm in size. In terms of depth-dose, the mean of the absolute percentage differences between the Monte Carlo calculations and measurement is 3.4% over the full range of sampled depths, spanning 0.5-7.2 cm, for the 3 and 5 mm beams. The measured and computed profiles for these beams agree well overall; of note, good agreement is observed in the profile tails. Especially for the smallest 0.5 and 1 mm beams, including a more realistic description of the effective x-ray source in the Monte Carlo model may be important.
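The 3.4% agreement figure quoted above is a mean of absolute percentage differences between calculation and measurement, a metric that is simple to reproduce. A sketch with made-up illustrative depth-dose numbers (not the SARRP data):

```python
def mean_abs_percent_diff(calculated, measured):
    """Mean of |calc - meas| / meas * 100 over paired samples."""
    if len(calculated) != len(measured):
        raise ValueError("inputs must be paired")
    diffs = [abs(c - m) / m * 100.0 for c, m in zip(calculated, measured)]
    return sum(diffs) / len(diffs)

# Hypothetical depth-dose values (relative dose) at matching depths.
mc = [100.0, 81.0, 66.0, 53.5]
meas = [100.0, 80.0, 65.0, 54.0]
print(round(mean_abs_percent_diff(mc, meas), 2))  # -> 0.93
```

Normalizing each difference by the measured value (rather than by the maximum dose) weights deep, low-dose points equally with shallow ones, which is one common convention for depth-dose comparisons.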
Yeh, Peter C. Y.; Lee, C. C.; Chao, T. C.; Tung, C. J.
2017-11-01
Intensity-modulated radiation therapy is an effective treatment modality for nasopharyngeal carcinoma. One important aspect of this cancer treatment is the need for an accurate dose algorithm dealing with the complex air/bone/tissue interfaces in the head-neck region, to achieve cure without radiation-induced toxicities. The Acuros XB algorithm explicitly solves the linear Boltzmann transport equation in voxelized volumes to account for tissue heterogeneities such as lungs, bone, air, and soft tissues in the treatment field receiving radiotherapy. With a single-beam setup in phantoms, this algorithm has already been demonstrated to achieve accuracy comparable to Monte Carlo simulations. In the present study, five nasopharyngeal carcinoma patients treated with intensity-modulated radiation therapy were examined for their dose distributions calculated using the Acuros XB in the planning target volume and the organs-at-risk. Corresponding results of Monte Carlo simulations were computed from the electronic portal image data and the BEAMnrc/DOSXYZnrc code. Analysis of dose distributions in terms of the clinical indices indicated that the Acuros XB was comparable in accuracy to Monte Carlo simulations and better than the anisotropic analytical algorithm for dose calculations in real patients.
Bahadori, Amir Alexander
Astronauts are exposed to a unique radiation environment in space. United States terrestrial radiation worker limits, derived from guidelines produced by scientific panels, do not apply to astronauts. Limits for astronauts have changed throughout the Space Age, eventually reaching the current National Aeronautics and Space Administration limit of 3% risk of exposure-induced death, with an administrative stipulation that the risk be assured to the upper 95% confidence limit. Much effort has been spent on reducing the uncertainty associated with evaluating astronaut risk of radiogenic cancer mortality, while the tools that affect the accuracy of the calculations have largely remained unchanged. In the present study, the impacts of using more realistic computational phantoms with size variability to represent astronauts with simplified deterministic radiation transport were evaluated. Next, the impacts of microgravity-induced body changes on space radiation dosimetry using the same transport method were investigated. Finally, dosimetry and risk calculations resulting from Monte Carlo radiation transport were compared with results obtained using simplified deterministic radiation transport. The results of the present study indicated that the use of phantoms that more accurately represent human anatomy can substantially improve space radiation dose estimates, most notably for exposures from solar particle events under light shielding conditions. Microgravity-induced changes were less important, but results showed that flexible phantoms could assist in optimizing astronaut body position to reduce exposures during solar particle events. Finally, little overall difference between risk calculations using simplified deterministic radiation transport and 3D Monte Carlo radiation transport was found; however, for the galactic cosmic ray ion spectra, compensating errors were observed for the constituent ions, thus exhibiting the need to perform evaluations on a particle
ETRAN, Electron Transport and Gamma Transport with Secondary Radiation in Slab by Monte-Carlo
International Nuclear Information System (INIS)
1992-01-01
A - Nature of physical problem solved: ETRAN computes the transport of electrons and photons through plane-parallel slab targets that have a finite thickness in one dimension and are unbounded in the other two dimensions. The incident radiation can consist of a beam of either electrons or photons with specified spectral and directional distribution. Options are available by which all orders of the electron-photon cascade can be included in the calculation. Thus electrons are allowed to give rise to secondary knock-on electrons, continuous bremsstrahlung and characteristic x-rays; and photons are allowed to produce photoelectrons, Compton electrons, and electron-positron pairs. Annihilation quanta, fluorescence radiation, and Auger electrons are also taken into account. If desired, the Monte Carlo histories of all generations of secondary radiations are followed. The information produced by ETRAN includes the following items: 1) reflection and transmission of electrons or photons, differential in energy and direction; 2) the production of continuous bremsstrahlung and characteristic x-rays by electrons and the emergence of such radiations from the target (differential in photon energy and direction); 3) the spectrum of the amounts of energy left behind in a thick target by an incident electron beam; 4) the deposition of energy and charge by an electron beam as a function of depth in the target; 5) the flux of electrons, differential in energy, as a function of depth in the target. B - Method of solution: A programme called DATAPAC-4 takes data for a particular material from a library tape and further processes them. The function of DATAPAC-4 is to produce single-scattering and multiple-scattering data in the form of tabular arrays (again stored on magnetic tape) which facilitate the rapid sampling of electron and photon Monte Carlo histories in ETRAN. The photon component of the electron-photon cascade is calculated by conventional random sampling that imitates
Kovtanyuk, Andrey E.
2012-01-01
Radiative-conductive heat transfer in a medium bounded by two reflecting and radiating plane surfaces is considered. This process is described by a nonlinear system of two differential equations: an equation of radiative heat transfer and an equation of conductive heat exchange. The problem is characterized by anisotropic scattering of the medium and by specularly and diffusely reflecting boundaries. For the computation of solutions of this problem, two approaches based on iterative techniques are considered. First, a recursive algorithm based on a modification of the Monte Carlo method is proposed. Second, the diffusion approximation of the radiative transfer equation is utilized. Numerical comparisons of the proposed approaches are given for the case of isotropic scattering.
Premar-2: a Monte Carlo code for radiative transport simulation in atmospheric environments
Energy Technology Data Exchange (ETDEWEB)
Cupini, E. [ENEA, Centro Ricerche Ezio Clementel, Bologna, (Italy). Dipt. Innovazione
1999-07-01
The peculiarities of the PREMAR-2 code, aimed at Monte Carlo simulation of radiation transport in atmospheric environments in the infrared-ultraviolet frequency range, are described. With respect to the previously developed PREMAR code, the new code supports, besides plane multilayers, spherical multilayers and finite sequences of vertical layers, each with its own atmospheric behaviour, together with the refraction phenomenon, so that long-range, highly slanted paths can now be taken into account more faithfully. A zenithal angular dependence of the albedo coefficient has moreover been introduced. Lidar systems, with spatially independent source and telescope, can again be simulated and, in this latest version of the code, sensitivity analyses performed. With this last capability, the consequences for radiation transport of small perturbations in physical components of the atmospheric environment may be analysed and the related effects on the sought results estimated. The code requires a library of physical data (reaction coefficients, phase functions and refraction indexes) providing the essential features of the environment of interest needed for the Monte Carlo simulation. Variance reduction techniques have been enhanced in the PREMAR-2 code by introducing, for instance, a local forced-collision technique, especially apt for Lidar system simulations. Encouraging comparisons between code and experimental results carried out at the Brasimone Centre of ENEA have so far been obtained, even if further checks of the code are to be performed.
Radiation protection studies for medical particle accelerators using FLUKA Monte Carlo code
International Nuclear Information System (INIS)
Infantino, Angelo; Mostacci, Domiziano; Cicoria, Gianfranco; Lucconi, Giulia; Pancaldi, Davide; Vichi, Sara; Zagni, Federico; Marengo, Mario
2017-01-01
Radiation protection (RP) in the use of medical cyclotrons involves many aspects, both in routine use and for the decommissioning of a site. Guidelines for site planning and installation, as well as for RP assessment, are given in international documents; however, the latter typically offer analytic methods of calculation of shielding and materials activation, in approximate or idealised geometry set-ups. The availability of Monte Carlo (MC) codes with accurate up-to-date libraries for transport and interaction of neutrons and charged particles at energies below 250 MeV, together with the continuously increasing power of modern computers, makes the systematic use of simulations with realistic geometries possible, yielding equipment- and site-specific evaluation of the source terms, shielding requirements and all quantities relevant to RP at the same time. In this work, the well-known FLUKA MC code was used to simulate different aspects of RP in the use of biomedical accelerators, particularly for the production of medical radioisotopes. In the context of the Young Professionals Award, held at the IRPA 14 conference, only a part of the complete work is presented. In particular, the simulation of the GE PETtrace cyclotron (16.5 MeV) installed at the S. Orsola-Malpighi University Hospital evaluated: the effective dose distribution around the equipment; the effective number of neutrons produced per incident proton and their spectral distribution; the activation of the structure of the cyclotron and the vault walls; and the activation of the ambient air, in particular the production of 41Ar. The simulations were validated, in terms of the physical and transport parameters to be used in the energy range of interest, through an extensive measurement campaign of the neutron ambient dose equivalent using a rem counter and TLD dosemeters. The validated model was then used in the design and licensing request of a new Positron Emission Tomography facility. (authors)
Minibeam radiation therapy for the management of osteosarcomas: A Monte Carlo study
Energy Technology Data Exchange (ETDEWEB)
Martínez-Rovira, I.; Prezado, Y., E-mail: prezado@gmail.com [Laboratoire d’Imagerie et Modélisation en Neurobiologie et Cancérologie (IMNC), Centre National de la Recherche Scientifique (CNRS), Campus universitaire, Bât. 440, 1er étage, 15 rue Georges Clemenceau, 91406 Orsay cedex (France)
2014-06-15
Purpose: Minibeam radiation therapy (MBRT) exploits the well-established tissue-sparing effect provided by the combination of submillimetric field sizes and a spatial fractionation of the dose. The aim of this work is to evaluate the feasibility and potential therapeutic gain of MBRT, in comparison with conventional radiotherapy, for osteosarcoma treatments. Methods: Monte Carlo simulations (PENELOPE/PENEASY code) were used to study the dose distributions resulting from MBRT irradiations of rat femur and realistic human femur phantoms. As figures of merit, peak and valley doses and peak-to-valley dose ratios (PVDR) were assessed. Conversion of absorbed dose to normalized total dose (NTD) was performed in the human case. Several field sizes and irradiation geometries were evaluated. Results: It is feasible to deliver a uniform dose distribution in the target while the healthy tissue benefits from a spatial fractionation of the dose. Very high PVDR values (⩾20) were achieved in the entrance beam path in the rat case. PVDR values ranged from 2 to 9 in the human phantom. An NTD2.0 of 87 Gy might be reached in the tumor in the human femur while the healthy tissues might receive valley NTD2.0 lower than 20 Gy. The doses in the tumor and healthy tissues might thus be significantly higher and lower, respectively, than those commonly delivered in conventional radiotherapy. Conclusions: The obtained dose distributions indicate that a gain in normal tissue sparing might be expected. This would allow the use of higher (and potentially curative) doses in the tumor. Biological experiments are warranted.
Effect of gold nanoparticles on radiation doses in tumor treatment: a Monte Carlo study.
Al-Musywel, H A; Laref, A
2017-12-01
Radiotherapy is an extensively used treatment for most tumor types. However, ionizing radiation does not discriminate between cancerous cells and the normal cells surrounding the tumor, which can be considered a dose-limiting factor. This can reduce the effectiveness of tumor cell eradication with this treatment. A potential solution to this problem is loading the tumor with high-Z materials prior to radiotherapy, as this can induce higher toxicity in tumor cells compared to normal ones. New advances in nanotechnology have introduced the promising use of heavy metal nanoparticles to enhance tumor treatment. Early studies showed that gold nanoparticles (GNPs) have unique characteristics as biocompatible radiosensitizers for tumor cells. This study aimed to quantify the dose enhancement effect and its radial dose distribution by Monte Carlo simulations utilizing the EGSnrc code for a water-gold phantom loaded with seven different concentrations of Au: 0, 7, 18, 30, 50, 75, and 100 mg-Au/g-water. The phantom was irradiated with two different radionuclide sources, Ir-192 and Cs-137, which are commonly used in brachytherapy, for all concentrations. The results showed that gold nanoparticle-aided radiotherapy (GNRT) increases the efficacy of radiotherapy with low-energy photon sources accompanied by high Au concentration loads of up to 30 mg-Au/g-water. Our findings also point to dose enhancement effects within a short average range of 650 μm outside the region loaded with Au. This indicates that accurate determination of the nanoparticle location is highly important in this treatment method.
Peak Skin and Eye Lens Radiation Dose From Brain Perfusion CT Based on Monte Carlo Simulation
Zhang, Di; Cagnon, Chris H.; Pablo Villablanca, J.; McCollough, Cynthia H.; Cody, Dianna D.; Stevens, Donna M.; Zankl, Maria; Demarco, John J.; Turner, Adam C.; Khatonabadi, Maryam; McNitt-Gray, Michael F.
2014-01-01
OBJECTIVE. The purpose of our study was to accurately estimate the radiation dose to skin and the eye lens from clinical CT brain perfusion studies, investigate how well scanner output (expressed as volume CT dose index [CTDIvol]) matches these estimated doses, and investigate the efficacy of eye lens dose reduction techniques. MATERIALS AND METHODS. Peak skin dose and eye lens dose were estimated using Monte Carlo simulation methods on a voxelized patient model and 64-MDCT scanners from four major manufacturers. A range of clinical protocols was evaluated. CTDIvol for each scanner was obtained from the scanner console. Dose reduction to the eye lens was evaluated for various gantry tilt angles as well as scan locations. RESULTS. Peak skin dose and eye lens dose ranged from 81 mGy to 348 mGy, depending on the scanner and protocol used. Peak skin dose and eye lens dose were observed to be 66–79% and 59–63%, respectively, of the CTDIvol values reported by the scanners. The eye lens dose was significantly reduced when the eye lenses were not directly irradiated. CONCLUSION. CTDIvol should not be interpreted as patient dose; this study has shown it to overestimate dose to the skin or eye lens. These results may be used to provide more accurate estimates of actual dose to ensure that protocols are operated safely below thresholds. Tilting the gantry or moving the scanning region further away from the eyes are effective for reducing lens dose in clinical practice. These actions should be considered when they are consistent with the clinical task and patient anatomy. PMID:22268186
Efficient Radiation Simulation in Complex Geometries with Applications to Planetary Entry, Phase I
National Aeronautics and Space Administration — NASA aerocapture missions require an accurate evaluation of radiative thermal transport in order to simulate the aerothermal environment around space vehicles....
Europa Planetary Protection for Juno Jupiter Orbiter
Bernard, Douglas E.; Abelson, Robert D.; Johannesen, Jennie R.; Lam, Try; McAlpine, William J.; Newlin, Laura E.
2010-01-01
NASA's Juno mission launched in 2011 and will explore the Jupiter system starting in 2016. Juno's suite of instruments is designed to investigate the atmosphere, gravitational fields, magnetic fields, and auroral regions. Its low-perijove polar orbit will allow it to explore portions of the Jovian environment never before visited. While the Juno mission is not orbiting or flying close to Europa or the other Galilean satellites, planetary protection requirements for avoiding the contamination of Europa have been taken into account in the Juno mission design. The science mission is designed to conclude with a deorbit burn that disposes of the spacecraft in Jupiter's atmosphere. Compliance with planetary protection requirements is verified through a set of analyses including analysis of initial bioburden, analysis of the effect of bioburden reduction due to the space and Jovian radiation environments, probabilistic risk assessment of successful deorbit, Monte Carlo orbit propagation, and bioburden reduction in the event of impact with an icy body.
Dauchet, Jérémi; Blanco, Stéphane; Cornet, Jean-François; El Hafi, Mouna; Eymet, Vincent; Fournier, Richard
2013-10-01
The present text illustrates the practice of integral formulation, zero-variance approaches and sensitivity evaluations in the field of radiative transfer Monte Carlo simulation, as well as the practical implementation of the corresponding algorithms, for such realistic systems as photobioreactors involving spectral integration, multiple scattering and complex geometries. We try to argue that even in such non-academic contexts, strong benefits can be expected from the effort of translating the considered Monte Carlo algorithm into a rigorously equivalent integral formulation. Modifying the initial algorithm to simultaneously compute sensitivities is then straightforward (except for domain deformation sensitivities) and the question of enhancing convergence is turned into that of modeling a set of well identified physical quantities.
Energy Technology Data Exchange (ETDEWEB)
Jang, Dong Gun [Dept. of Nuclear Medicine, Dongnam Institute of Radiological and Medical Sciences Cancer Center, Pusan (Korea, Republic of); Kang, SeSik; Kim, Jung Hoon; KIm, Chang Soo [Dept. of Radiological Science, College of Health Sciences, Catholic University, Pusan (Korea, Republic of)
2015-12-15
Workers in nuclear medicine perform various tasks such as production, distribution, preparation, and injection of radioisotopes. These processes can cause high radiation exposure to workers' hands. The purpose of this study was to investigate the shielding effect for γ-rays of 140 and 511 keV by using Monte Carlo simulation. As a result, shielding was effective at 140 keV regardless of lead thickness. At 511 keV, however, shielding was effective only for material thicker than 1.1 mm; below 1.1 mm it was not effective, because secondary scattered rays actually increased the exposure dose. Consequently, both the energy of the radionuclide and the thickness of the shielding material should be considered to reduce radiation exposure.
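The thickness effect reported in this abstract can be contrasted with the ideal narrow-beam model, which ignores scatter entirely; the sketch below uses approximate textbook attenuation coefficients for lead (illustrative assumptions, not values from the paper) to show why a thin shield barely touches 511 keV photons while stopping most 140 keV photons:

```python
import math

# Approximate linear attenuation coefficients of lead (illustrative
# values, NOT taken from the paper): ~2.7 /mm at 140 keV, ~0.19 /mm at
# 511 keV.
MU_PB_PER_MM = {140: 2.7, 511: 0.19}

def narrow_beam_transmission(energy_kev, thickness_mm):
    """Ideal narrow-beam transmission exp(-mu * t). Real broad-beam
    geometry adds scattered (buildup) photons on top of this, which is
    why a thin shield can even increase the 511 keV exposure dose."""
    return math.exp(-MU_PB_PER_MM[energy_kev] * thickness_mm)
```

At the 1.1 mm threshold discussed above, this ideal model already transmits well over half of the 511 keV beam but only a few percent of the 140 keV beam; the scattered component the Monte Carlo simulation adds makes thin lead even less useful at 511 keV.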
Grisogono, Branko; Subanović, Nebojša; Koračin, Darko
1989-01-01
It is shown that the process of air cooling is dominated by the divergence of the longwave radiative flux under night-time clear-sky conditions with weak winds. The parameterization of the longwave radiative flux divergence is derived from the emissivity concept and the Stefan-Boltzmann law, assuming that water vapor is the only absorber of longwave radiation. The parameterization of the turbulent temperature flux divergence has been based on the O’Brie...
Comparison of Space Radiation Calculations from Deterministic and Monte Carlo Transport Codes
Adams, J. H.; Lin, Z. W.; Nasser, A. F.; Randeniya, S.; Tripathi, r. K.; Watts, J. W.; Yepes, P.
2010-01-01
The presentation outline includes motivation, the radiation transport codes being considered, the space radiation cases being considered, results for slab geometry, results for spherical geometry, and a summary. The main physics of the radiation transport codes HZETRN, UPROP, FLUKA, and GEANT4 is compared for slab geometry with SPE and GCR source spectra.
Li, Yongbao; Tian, Zhen; Shi, Feng; Song, Ting; Wu, Zhaoxia; Liu, Yaqiang; Jiang, Steve; Jia, Xun
2015-04-07
Intensity-modulated radiation treatment (IMRT) plan optimization needs beamlet dose distributions. Pencil-beam or superposition/convolution type algorithms are typically used because of their high computational speed. However, inaccurate beamlet dose distributions may mislead the optimization process and hinder the resulting plan quality. To solve this problem, the Monte Carlo (MC) simulation method has been used to compute all beamlet doses prior to the optimization step. The conventional approach samples the same number of particles from each beamlet. Yet this is not the optimal use of MC in this problem. In fact, there are beamlets that have very small intensities after solving the plan optimization problem. For those beamlets, it may be possible to use fewer particles in dose calculations to increase efficiency. Based on this idea, we have developed a new MC-based IMRT plan optimization framework that iteratively performs MC dose calculation and plan optimization. At each dose calculation step, the particle numbers for beamlets were adjusted based on the beamlet intensities obtained by solving the plan optimization problem in the previous iteration. We modified a GPU-based MC dose engine to allow simultaneous computation of a large number of beamlet doses. To test the accuracy of our modified dose engine, we compared the dose from a broad beam and the summed beamlet doses in this beam in an inhomogeneous phantom. Agreement within 1% for the maximum difference and 0.55% for the average difference was observed. We then validated the proposed MC-based optimization scheme in one lung IMRT case. It was found that the conventional scheme required 10^6 particles from each beamlet to achieve an optimization result within 3% difference in fluence map and 1% difference in dose from the ground truth. In contrast, the proposed scheme achieved the same level of accuracy with on average 1.2 × 10^5 particles per beamlet. Correspondingly, the computation
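The adaptive particle-allocation idea in this abstract can be sketched as follows; the function name, the proportional rule, and the fixed per-beamlet floor are illustrative assumptions, not the authors' actual scheme:

```python
def allocate_particles(intensities, total_budget, floor=1_000):
    """Distribute a Monte Carlo particle budget across beamlets in
    proportion to their current optimized intensities, so beamlets that
    barely contribute to the plan receive fewer histories.

    The small per-beamlet floor keeps every dose estimate finite
    (a hypothetical choice; the paper's allocation rule may differ)."""
    total_intensity = sum(intensities)
    if total_intensity == 0:
        # No optimization result yet: spread the budget uniformly.
        return [total_budget // len(intensities)] * len(intensities)
    return [max(floor, round(total_budget * w / total_intensity))
            for w in intensities]
```

For example, with beamlet intensities (10, 0, 30) and a budget of 100,000 histories, the zero-intensity beamlet is sampled only at the floor level while the dominant beamlet gets three quarters of the budget.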
Johnston, Christopher O.; Gnoffo, Peter A.; Mazaheri, Alireza
2013-01-01
A review of recently published coupled radiation and ablation capabilities involving the simulation of hypersonic flowfields relevant to Earth, Mars, or Venus entry is presented. The three fundamental mechanisms of radiation coupling are identified as radiative cooling, precursor photochemistry, and ablation-radiation interaction. The impact of these mechanisms is shown to be significant for a 3 m radius sphere entering Earth at hypothetical Mars return conditions (approximately 15 km/s). To estimate the influence of precursor absorption on the radiative flux for a wide range of conditions, a simplified approach is developed that requires only the non-precursor solution. Details of a developed coupled ablation approach, which is capable of treating both massively ablating flowfields in the sublimation regime and weakly ablating diffusion-limited oxidation cases, are presented. A review of the two primary uncoupled ablation approximations, identified as the blowing correction and film coefficient approximations, is made, and their impact on recession and convective heating predictions is shown to be significant for Earth and Mars entries. Fully coupled ablation and radiation simulations are presented for the Mars return sphere throughout its entire trajectory. Applying to the Mars return sphere the Pioneer-Venus heritage carbon phenolic heatshield, which has properties available in the open literature, the differences between steady-state ablation and coupling to a material response code are shown to be significant.
Directory of Open Access Journals (Sweden)
F. Spada
2006-01-01
A new multiple-scattering Monte Carlo 3-D radiative transfer model named McSCIA (Monte Carlo for SCIAMACHY) is presented. The backward technique is used to efficiently simulate narrow field-of-view instruments. The McSCIA algorithm has been formulated as a function of the Earth's radius, and can thus perform simulations for both plane-parallel and spherical atmospheres. The latter geometry is essential for the interpretation of limb satellite measurements, as performed by SCIAMACHY on board ESA's Envisat. The model can simulate UV-vis-NIR radiation. First the ray-tracing algorithm is presented in detail and successfully validated against literature references, both in plane-parallel and in spherical geometry. A simple 1-D model is used to explain two different ways of treating absorption. One method uses the single-scattering albedo, while the other uses the equivalence theorem. The equivalence theorem is based on a separation of absorption and scattering. It is shown that both methods give, in a statistical sense, identical results for a wide variety of scenarios. Both absorption methods are included in McSCIA, and it is shown that for a 3-D case both formulations also give identical results. McSCIA limb profiles for atmospheres with and without absorption compare well with those of the state-of-the-art Monte Carlo radiative transfer model MCC++. A simplification of the photon statistics may lead to very fast calculations of absorption features in the atmosphere. However, these simplifications potentially introduce biases in the results. McSCIA does not use such simplifications and is therefore a relatively slow implementation of the equivalence theorem.
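The two absorption treatments compared in this abstract can be illustrated with a minimal 1-D slab transmission estimate; the geometry, coefficients, and isotropic rescattering below are simplifying assumptions for the sketch, not McSCIA's actual implementation:

```python
import math
import random

def mc_transmission_albedo(k_sca, k_abs, depth, n=100_000, rng=None):
    """Single-scattering-albedo treatment: at each collision the photon
    survives with probability omega = k_sca / (k_sca + k_abs)."""
    rng = rng or random.Random(1)
    k_ext = k_sca + k_abs
    omega = k_sca / k_ext
    hits = 0
    for _ in range(n):
        x, mu = 0.0, 1.0                      # slab face, forward direction
        while True:
            x += mu * (-math.log(rng.random()) / k_ext)
            if x >= depth:                    # escaped through the far side
                hits += 1
                break
            if x < 0:                         # escaped backwards
                break
            if rng.random() > omega:          # absorbed at the collision
                break
            mu = 2.0 * rng.random() - 1.0     # isotropic rescattering (1-D)
    return hits / n

def mc_transmission_equivalence(k_sca, k_abs, depth, n=100_000, rng=None):
    """Equivalence-theorem treatment: trace a scattering-only path and
    weight the escaping photon by exp(-k_abs * path_length)."""
    rng = rng or random.Random(2)
    score = 0.0
    for _ in range(n):
        x, mu, path = 0.0, 1.0, 0.0
        while True:
            step = -math.log(rng.random()) / k_sca
            x += mu * step
            path += step
            if x >= depth:                    # clip last flight at boundary
                path -= (x - depth) / mu
                score += math.exp(-k_abs * path)
                break
            if x < 0:
                break
            mu = 2.0 * rng.random() - 1.0
    return score / n
```

Run with the same optical properties, the two estimators agree to within statistical noise, which is the point of the equivalence theorem: absorption can be separated from scattering and applied as a path-length weight.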
Monte Carlo-based treatment planning system calculation engine for microbeam radiation therapy
Energy Technology Data Exchange (ETDEWEB)
Martinez-Rovira, I.; Sempau, J.; Prezado, Y. [Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain) and ID17 Biomedical Beamline, European Synchrotron Radiation Facility (ESRF), 6 rue Jules Horowitz B.P. 220, F-38043 Grenoble Cedex (France); Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain); Laboratoire Imagerie et modelisation en neurobiologie et cancerologie, UMR8165, Centre National de la Recherche Scientifique (CNRS), Universites Paris 7 et Paris 11, Bat 440., 15 rue Georges Clemenceau, F-91406 Orsay Cedex (France)
2012-05-15
Purpose: Microbeam radiation therapy (MRT) is a synchrotron radiotherapy technique that explores the limits of the dose-volume effect. Preclinical studies have shown that MRT irradiations (arrays of 25-75 μm-wide microbeams spaced by 200-400 μm) are able to eradicate highly aggressive animal tumor models while healthy tissue is preserved. These promising results have provided the basis for the forthcoming clinical trials at the ID17 Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF). The first step includes irradiation of pets (cats and dogs) as a milestone before treatment of human patients. Within this context, accurate dose calculations are required. The distinct features of both beam generation and irradiation geometry in MRT with respect to conventional techniques require the development of a specific MRT treatment planning system (TPS). In particular, a Monte Carlo (MC)-based calculation engine for the MRT TPS has been developed in this work. Experimental verification in heterogeneous phantoms and optimization of the computation time have also been performed. Methods: The penelope/penEasy MC code was used to compute dose distributions from a realistic beam source model. Experimental verification was carried out by means of radiochromic films placed within heterogeneous slab-phantoms. Once validation was completed, dose computations in a virtual model of a patient, reconstructed from computed tomography (CT) images, were performed. To this end, decoupling of the CT image voxel grid (a few cubic millimeter volume) to the dose bin grid, which has micrometer dimensions in the transversal direction of the microbeams, was performed. Optimization of the simulation parameters, the use of variance-reduction (VR) techniques, and other methods, such as the parallelization of the simulations, were applied in order to speed up the dose computation. Results: Good agreement between MC simulations and experimental results was achieved, even at
Žukauskaite, A; Plukiene, R; Plukis, A
2007-01-01
Particle accelerators and other high-energy facilities produce penetrating ionizing radiation (neutrons and γ-rays) that must be shielded. The objective of this work was to model photon and neutron transport in various materials commonly used for shielding, such as concrete, iron, or graphite. The Monte Carlo method allows one to obtain answers by simulating individual particles and recording some aspects of their average behavior. In this work several nuclear experiments were modeled: AVF 65 – γ-ray beams (1-10 MeV), HIMAC and ISIS-800 – high-energy neutron (20-800 MeV) transport in iron and concrete. The results were then compared with experimental data.
International Nuclear Information System (INIS)
Kling, A.; Barao, F.J.C.; Nakagawa, M.; Tavora, L.
2001-01-01
The following topics were dealt with: electron and photon interactions and transport mechanisms, random number generation, applications in medical physics, microdosimetry, track structure, radiobiological modeling, the Monte Carlo method in radiotherapy, dosimetry, medical accelerator simulation, neutron transport, and high-energy hadron transport. (HSI)
Sun, Hai-Feng; Sun, Feng-Xian; Xia, Xin-Lin
2018-01-01
A hybrid method combining the unstructured finite volume method and the Monte Carlo method, and incorporating the line-by-line model, has been developed to simulate radiative transfer in highly spectral and inhomogeneous media. In this method, the unstructured finite volume method is adopted to solve the spectral radiative transfer equation at wave numbers or spectral locations determined by the Monte Carlo method. The Monte Carlo method operates by first defining monotonic random-number relations corresponding to the spectral emitted power density of every discretized element of the medium of interest, and then recovering the spectral location by comparing these relations with predefined random numbers. Through this Monte Carlo method, the actual number of spectral locations at which the spectral radiative transfer equations are solved may be reduced: spectral locations with higher spectral emissive power are more likely to be selected. To improve the performance of the presented method, the total variation diminishing scheme on unstructured grids is adopted in treating the spectral radiative intensity at interfaces between control volumes. Moreover, the discretized radiative transfer equation is solved implicitly and iteratively by an algebraic multigrid approach to accelerate convergence. The presented method was applied to 3D homogeneous and inhomogeneous cases for validation and performance studies. Results show that for both cases the presented method agrees well with pure Monte Carlo benchmark solutions with an acceptable number of spectral locations and computing time.
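The spectral-location selection step described in this abstract amounts to inverse-transform sampling of the discretized emitted-power distribution; the sketch below (names and API are illustrative, not the authors' code) builds the monotonic cumulative relation and inverts it with a random number:

```python
import bisect
import random

def build_spectral_sampler(wavenumbers, emitted_power):
    """Precompute the normalized cumulative emitted power of an element,
    then draw spectral locations by inverting it: lines with higher
    emissive power are selected proportionally more often."""
    cdf, running = [], 0.0
    for p in emitted_power:
        running += p
        cdf.append(running)
    cdf = [c / running for c in cdf]          # monotonic relation in [0, 1]

    def sample(rng=random):
        # Invert the cumulative relation against a predefined random number.
        return wavenumbers[bisect.bisect_left(cdf, rng.random())]

    return sample
```

With emitted powers (1, 0, 3), for instance, the strongest line is drawn about 75% of the time and the zero-power line is never selected, which is exactly how the method concentrates radiative-transfer solves on the important spectral locations.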
Brooks, S. M.; Spilker, L. J.; Pilorz, S.; Edgington, S. G.; Deau, E.; Morishima, R.
2012-12-01
Since arriving at Saturn in 2004, Cassini's Composite Infrared Spectrometer has recorded tens of millions of spectra of Saturn's rings (personal communication, M. Segura). CIRS records far infrared radiation (16.7-1000 microns) at focal plane 1 (FP1). Thermal emission from Saturn's rings peaks at FP1 wavelengths. CIRS spectra are well characterized as blackbody emission at an effective temperature Te, multiplied by a scalar factor related to ring emissivity (Spilker et al. [2005, 2006]). CIRS can therefore characterize the rings' temperature and study the thermal environment to which the ring particles are subject. We focus on CIRS data from the 2009 Saturnian equinox. As the Sun's disk crossed the ring plane, CIRS obtained several radial scans of the rings at a variety of phase angles, local hour angles and distances. With the Sun's rays striking the rings at an incidence angle of zero, solar heating is virtually absent, and thermal radiation from Saturn and sunlight reflected by Saturn dominate the thermal environment. These observations appear to present a paradox. Equinox data show that the flux of thermal energy radiated by the rings can even exceed the energy incident upon them as prescribed by thermal models, particularly in the C ring and Cassini Division (Ferrari and Leyrat [2006], Morishima et al. [2009, 2010]). Conservation principles suggest that such models underestimate heating of the rings in these cases, as it is clearly unphysical for the rings to radiate significantly more energy than is incident upon them. In this presentation, we will describe our efforts to resolve this paradox and determine what doing so can teach us about Saturn's rings. This research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under contract with NASA. Copyright 2012 California Institute of Technology. Government sponsorship acknowledged.
International Nuclear Information System (INIS)
Pölz, Stefan; Laubersheimer, Sven; Eberhardt, Jakob S; Harrendorf, Marco A; Keck, Thomas; Benzler, Andreas; Breustedt, Bastian
2013-01-01
The basic idea of Voxel2MCNP is to provide a framework supporting users in modeling radiation transport scenarios using voxel phantoms and other geometric models, generating corresponding input for the Monte Carlo code MCNPX, and evaluating simulation output. Applications at Karlsruhe Institute of Technology are primarily whole and partial body counter calibration and calculation of dose conversion coefficients. A new generic data model describing data related to radiation transport, including phantom and detector geometries and their properties, sources, tallies and materials, has been developed. It is modular and generally independent of the targeted Monte Carlo code. The data model has been implemented as an XML-based file format to facilitate data exchange, and integrated with Voxel2MCNP to provide a common interface for modeling, visualization, and evaluation of data. Also, extensions to allow compatibility with several file formats, such as ENSDF for nuclear structure properties and radioactive decay data, SimpleGeo for solid geometry modeling, ImageJ for voxel lattices, and MCNPX’s MCTAL for simulation results, have been added. The framework is presented and discussed in this paper, and example workflows for body counter calibration and calculation of dose conversion coefficients are given to illustrate its application. (paper)
International Nuclear Information System (INIS)
Kwok, S.
1980-01-01
A two-component dust model is suggested to explain the infrared emission from planetary nebulae. A cold dust component located in the extensive remnant of the red-giant envelope exterior to the visible nebula is responsible for the far-infrared emission. A warm dust component, condensed after the formation of the planetary nebula and confined within the ionized gas shell, emits most of the near- and mid-infrared radiation. The observations of NGC 7027 are shown to be consistent with such a model. The correlation of silicate emission in several planetary nebulae with an approximately +1 spectral index at low radio frequencies suggests that both the silicate and radio emissions originate from the remnant of the circumstellar envelope of the precursor star and are observable only while the planetary nebula is young. It is argued that oxygen-rich stars as well as carbon-rich stars can be progenitors of planetary nebulae.
The use of Monte Carlo codes in metrology of ionizing radiations
International Nuclear Information System (INIS)
Bathe, J.; Gouriou, J.; Daures, J.; Ostrowsky, A.; Bordy, J.M.
2003-01-01
The use of Monte Carlo codes makes it possible to obtain correction values that are more exact than, or inaccessible to, traditional methods. Several results obtained in the field of dose metrology (influence of vacuum interstices in a calorimeter, influence of walls in a chemical dosemeter) as well as in radioactivity metrology (efficiency and energy-deposition spectra in a detector, energy spectra of thick sources) are presented. (N.C.)
Energy Technology Data Exchange (ETDEWEB)
Liaparinos, Panagiotis [Department of Medical Physics, Medical School, University of Patras, 265 00 Patras (Greece); Kandarakis, Ioannis [Department of Medical Instruments Technology, Technological Educational Institution of Athens, Ag. Spyridonos Street, Aigaleo, 122 10 Athens (Greece); Cavouras, Dionisis [Department of Medical Instruments Technology, Technological Educational Institution of Athens, Ag. Spyridonos Street, Aigaleo, 122 10 Athens (Greece); Delis, Harry [Department of Medical Physics, Medical School, University of Patras, 265 00 Patras (Greece); Panayiotakis, George [Department of Medical Physics, Medical School, University of Patras, 265 00 Patras (Greece)]. E-mail: panayiot@upatras.gr
2006-12-20
The aim of this study was to evaluate the effect of K-characteristic radiation on the performance of scintillator crystals incorporated in nuclear medicine detectors (LSO, BGO, GSO). K-characteristic radiation is produced within materials of at least one high atomic number element (e.g. Lu, Gd, Bi). This radiation may either be reabsorbed or it may escape the scintillator. In both cases the light emission efficiency of the scintillator may be affected resulting in either spatial or energy resolution degradation. A computational program, based on Monte Carlo methods, was developed in order to simulate the transport of K-characteristic radiation within the most commonly used scintillator materials. Crystal thickness was allowed to vary from 0.5 up to 15 mm. A monoenergetic pencil beam, with energy varying from 0.60 to 0.511 MeV was considered to fall on the center of the crystal surface. The dominant γ-ray interactions (elastic and inelastic scattering and photoelectric absorption) were taken into account in the simulation. Results showed that, depending on crystal thickness, incident photon energy and scintillator's intrinsic properties (L or K-fluorescence yield, effective atomic number and density), the scintillator's emission efficiency may be significantly reduced and affect spatial or energy resolution.
Liaparinos, Panagiotis; Kandarakis, Ioannis; Cavouras, Dionisis; Delis, Harry; Panayiotakis, George
2006-12-01
The aim of this study was to evaluate the effect of K-characteristic radiation on the performance of scintillator crystals incorporated in nuclear medicine detectors (LSO, BGO, GSO). K-characteristic radiation is produced within materials of at least one high atomic number element (e.g. Lu, Gd, Bi). This radiation may either be reabsorbed or it may escape the scintillator. In both cases the light emission efficiency of the scintillator may be affected resulting in either spatial or energy resolution degradation. A computational program, based on Monte Carlo methods, was developed in order to simulate the transport of K-characteristic radiation within the most commonly used scintillator materials. Crystal thickness was allowed to vary from 0.5 up to 15 mm. A monoenergetic pencil beam, with energy varying from 0.60 to 0.511 MeV was considered to fall on the center of the crystal surface. The dominant γ-ray interactions (elastic and inelastic scattering and photoelectric absorption) were taken into account in the simulation. Results showed that, depending on crystal thickness, incident photon energy and scintillator's intrinsic properties (L or K-fluorescence yield, effective atomic number and density), the scintillator's emission efficiency may be significantly reduced and affect spatial or energy resolution.
Energy Technology Data Exchange (ETDEWEB)
Smekens, F; Freud, N; Letang, J M; Babot, D [CNDRI (Nondestructive Testing using Ionizing Radiations) Laboratory, INSA-Lyon, 69621 Villeurbanne Cedex (France); Adam, J-F; Elleaume, H; Esteve, F [INSERM U-836, Equipe 6 ' Rayonnement Synchrotron et Recherche Medicale' , Institut des Neurosciences de Grenoble (France); Ferrero, C; Bravin, A [European Synchrotron Radiation Facility, Grenoble (France)], E-mail: francois.smekens@insa-lyon.fr
2009-08-07
A hybrid approach, combining deterministic and Monte Carlo (MC) calculations, is proposed to compute the distribution of dose deposited during stereotactic synchrotron radiation therapy treatment. The proposed approach divides the computation into two parts: (i) the dose deposited by primary radiation (coming directly from the incident x-ray beam) is calculated in a deterministic way using ray casting techniques and energy-absorption coefficient tables and (ii) the dose deposited by secondary radiation (Rayleigh and Compton scattering, fluorescence) is computed using a hybrid algorithm combining MC and deterministic calculations. In the MC part, a small number of particle histories are simulated. Every time a scattering or fluorescence event takes place, a splitting mechanism is applied, so that multiple secondary photons are generated with a reduced weight. The secondary events are further processed in a deterministic way, using ray casting techniques. The whole simulation, carried out within the framework of the Monte Carlo code Geant4, is shown to converge towards the same results as the full MC simulation. The speed of convergence is found to depend notably on the splitting multiplicity, which can easily be optimized. To assess the performance of the proposed algorithm, we compare it to state-of-the-art MC simulations, accelerated by the track length estimator technique (TLE), considering a clinically realistic test case. It is found that the hybrid approach is significantly faster than the MC/TLE method. The gain in speed in a test case was about 25 for a constant precision. Therefore, this method appears to be suitable for treatment planning applications.
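The weight bookkeeping of the splitting mechanism described in this abstract can be sketched in a few lines. This is a toy illustration only (the interaction probability and the scored fraction are made-up numbers, not the authors' Geant4 code): each event spawns multiple secondaries with reduced weight, so the expected score is unchanged while more secondary trajectories are processed per history.

```python
import random

def transport_with_splitting(n_histories, n_split, p_interact=0.3):
    """Score "secondary radiation" using event splitting: every interaction
    spawns n_split secondaries, each carrying weight w / n_split, so the
    total statistical weight is conserved. p_interact and the scored
    fraction (0.5) are fictitious toy numbers."""
    deposited = 0.0
    for _ in range(n_histories):
        weight = 1.0
        if random.random() < p_interact:      # a scattering/fluorescence event
            for _ in range(n_split):          # split into n_split secondaries
                w = weight / n_split          # reduced weight per secondary
                # each secondary would be processed deterministically by
                # ray casting; here we just score a fictitious fraction
                deposited += w * 0.5
    return deposited / n_histories

random.seed(1)
est = transport_with_splitting(100000, 4)     # expectation: 0.3 * 0.5 = 0.15
```

Because the per-secondary weight is divided by the multiplicity, the estimate is independent of `n_split`; only its variance behavior changes, which is why the multiplicity can be tuned freely, as the abstract notes.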
Use of implicit Monte Carlo radiation transport with hydrodynamics and Compton scattering
International Nuclear Information System (INIS)
Fleck, J.A. Jr.
1971-03-01
It is shown that the combination of implicit radiation transport and hydrodynamics, Compton scattering, and any other energy transport can be simply carried out by a "splitting" procedure. Contributions to material energy exchange can be reckoned separately for hydrodynamics, radiation transport without scattering, Compton scattering, plus any other possible energy exchange mechanism. The radiation transport phase of the calculation would be implicit, but the hydrodynamics and Compton portions would not, leading to possible time step controls. The time step restrictions which occur in radiation transfer due to large Planck mean absorption cross-sections would not occur.
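The splitting procedure described here can be illustrated with a scalar toy model (the relaxation rates `a` and `c` are made-up, not from the report): the stiff radiation-exchange term is advanced implicitly, the Compton-like term explicitly, which is exactly where the time-step control mentioned above comes from.

```python
def split_step(T, T_rad, T_e, dt, a=5.0, c=0.5):
    """One time step of an operator-splitting scheme in the spirit of the
    abstract (a sketch with invented coefficients). The stiff radiation
    exchange is advanced with an unconditionally stable backward-Euler
    step; the Compton-like term is explicit, so it imposes dt < 1/c."""
    # implicit step for dT/dt = -a (T - T_rad):
    # T' = (T + a*dt*T_rad) / (1 + a*dt), stable for any dt
    T = (T + a * dt * T_rad) / (1.0 + a * dt)
    # explicit step for dT/dt = -c (T - T_e), stable only for small dt
    return T + dt * (-c * (T - T_e))

T = 1.0
for _ in range(100):
    T = split_step(T, T_rad=0.2, T_e=0.2, dt=0.1)
# T relaxes to the common equilibrium temperature 0.2
```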
A solution algorithm for calculating photon radiation fields with the aid of the Monte Carlo method
International Nuclear Information System (INIS)
Zappe, D.
1978-04-01
The MCTEST program and its subroutines for the solution of the Boltzmann transport equation are presented. The program makes it possible to calculate the photon radiation fields of point or plane gamma sources. After changing two subroutines, the calculation can also be carried out for the case of directed incidence of radiation on plane shields of iron or concrete. (author)
Monte Carlo simulation of radiative processes in electron-positron scattering
International Nuclear Information System (INIS)
Kleiss, R.H.P.
1982-01-01
The Monte Carlo simulation of scattering processes has turned out to be one of the most successful methods of translating theoretical predictions into experimentally meaningful quantities. It is the purpose of this thesis to describe how this approach can be applied to higher-order QED corrections to several fundamental processes. In chapter II a very brief overview of the currently interesting phenomena in e⁺e⁻ scattering is given. It is argued that accurate information on higher-order QED corrections is very important and that the Monte Carlo approach is one of the most flexible and general methods to obtain this information. In chapter III the author describes various techniques which are useful in this context, and makes a few remarks on the numerical aspects of the proposed method. In the following three chapters he applies this to the processes e⁺e⁻ → μ⁺μ⁻(γ) and e⁺e⁻ → qq̄(γ). In chapter IV he motivates his choice of these processes in view of their experimental and theoretical relevance. The formulae necessary for a computer simulation of all quantities of interest, up to order α³, are given. Chapters V and VI describe how this simulation can be performed using the techniques mentioned in chapter III. In chapter VII it is shown how additional dynamical quantities, namely the polarization of the incoming and outgoing particles, can be incorporated in the treatment, and the relevant formulae for the example processes mentioned above are given. Finally, in chapter VIII the author presents some examples of the comparison between theoretical predictions based on Monte Carlo simulations as outlined here, and the results from actual experiments. (Auth.)
Monte Carlo simulation of mixed neutron-gamma radiation fields and dosimetry devices
International Nuclear Information System (INIS)
Zhang, Guoqing
2011-01-01
Monte Carlo methods based on random sampling are widely used in different fields for their capability of solving problems with a large number of coupled degrees of freedom. In this work, Monte Carlo methods are successfully applied to the simulation of the mixed neutron-gamma field in an interim storage facility and of neutron dosimeters of different types. Details are discussed in two parts: In the first part, the method of simulating an interim storage facility loaded with CASTORs is presented. A CASTOR is rather large (several meters) and its wall is very thick (tens of centimeters). Obtaining dose rates outside a CASTOR with reasonable errors usually costs hours or even days, and the simulation of a large number of CASTORs in an interim storage facility needs weeks or even months. Variance reduction techniques were used to reduce the calculation time and to achieve reasonable relative errors, and source clones were applied to avoid unnecessary repeated calculations. In addition, the simulations were performed on a cluster system. With these calculation techniques, the efficiency of the calculations can be improved considerably. In the second part, the methods of simulating the response of neutron dosimeters are presented. An Alnor albedo dosimeter was modelled in MCNP and simulated in the facility to calculate the calibration factor giving the evaluated response to a Cf-252 source. The angular response of Makrofol detectors to fast neutrons has also been investigated. As a kind of SSNTD, Makrofol can detect fast neutrons by recording the neutron-induced heavy charged recoils. To obtain the information on charged recoils, general-purpose Monte Carlo codes were used for transporting incident neutrons. The response of Makrofol to fast neutrons depends on several factors. Based on the parameters which affect the track revealing, the formation of visible tracks was determined. For
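Russian roulette is one standard variance-reduction device of the kind used in such deep-penetration shielding calculations (a generic sketch, not the thesis' MCNP input): low-weight particles are probabilistically killed, survivors get a boosted weight, and the expected tally is preserved.

```python
import random

def russian_roulette(weight, w_min=0.1, w_survive=0.5):
    """Generic Russian roulette: particles with weight below w_min are
    killed with probability 1 - weight / w_survive; survivors continue
    with weight w_survive, so the expected carried weight is unchanged."""
    if weight >= w_min:
        return weight                      # heavy enough, no roulette
    if random.random() < weight / w_survive:
        return w_survive                   # survives with boosted weight
    return 0.0                             # killed, history terminated

random.seed(2)
w_in = 0.05
mean_out = sum(russian_roulette(w_in) for _ in range(200000)) / 200000
# unbiasedness check: the mean post-roulette weight equals the input weight
```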
Monte Carlo Radiative Transfer Modeling of Lightning Observed in Galileo Images of Jupiter
Dyudine, U. A.; Ingersoll, Andrew P.
2002-01-01
We study lightning on Jupiter and the clouds illuminated by the lightning using images taken by the Galileo orbiter. The Galileo images have a resolution of 25 km/pixel and are able to resolve the shape of the single lightning spots in the images, which have full widths at half the maximum intensity in the range of 90-160 km. We compare the measured lightning flash images with simulated images produced by our 3-D Monte Carlo light-scattering model. The model calculates Monte Carlo scattering of photons in a 3-D opacity distribution. During each scattering event, light is partially absorbed. The new direction of the photon after scattering is chosen according to a Henyey-Greenstein phase function. An image from each direction is produced by accumulating photons emerging from the cloud in a small range (bins) of emission angles. Lightning bolts are modeled either as points or vertical lines. Our results suggest that some of the observed scattering patterns are produced in a 3-D cloud rather than in a plane-parallel cloud layer. Lightning is estimated to occur at least as deep as the bottom of the expected water cloud. For the six cases studied, we find that the clouds above the lightning are optically thick (tau > 5). Jovian flashes are more regular and circular than the largest terrestrial flashes observed from space. On Jupiter there is nothing equivalent to the 30-40-km horizontal flashes which are seen on Earth.
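Sampling a new photon direction from the Henyey-Greenstein phase function, as in the scattering model above, uses a standard inversion formula; a minimal sketch:

```python
import random

def sample_hg_costheta(g):
    """Sample the scattering-angle cosine from the Henyey-Greenstein
    phase function (standard inversion of the cumulative distribution;
    g is the asymmetry parameter, g > 0 meaning forward scattering)."""
    xi = random.random()
    if abs(g) < 1e-6:
        return 2.0 * xi - 1.0                          # isotropic limit
    s = (1.0 - g * g) / (1.0 - g + 2.0 * g * xi)
    return (1.0 + g * g - s * s) / (2.0 * g)

random.seed(3)
g = 0.8                                   # strongly forward-scattering cloud
samples = [sample_hg_costheta(g) for _ in range(100000)]
mean_mu = sum(samples) / len(samples)
# the mean cosine of the HG phase function equals the asymmetry parameter g
```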
International Nuclear Information System (INIS)
Kadri, Omrane
2005-01-01
The present work gives an overview of the application of the Monte Carlo code GEANT4 in the gamma irradiation processing field. In order to check the validity of the code, a calculation of the expected dose rate and photon flux in the Tunisian gamma irradiation facility was successfully carried out. In the same course of study, an ample set of comparison tests was performed, using PMMA dosimeters for measurement and the GEANT4 version 8.2 code for calculation. The excellent agreement between data and calculations allowed us to apply the GEANT4-based tool to optimize some process parameters specific to the studied 60Co facility and to systematically improve the dose uniformity within irradiated targets of different densities and volumes. The study of three irradiation processing procedures led us to conclude that, for given carrier dimensions, once the product density exceeds a determined value, a specific procedure should be preferred. It is shown that Monte Carlo simulation improves the understanding of the gamma irradiation process. (Author)
Monte-Carlo study on primary knock-on atom energy spectrum produced by neutron radiation
International Nuclear Information System (INIS)
Zhou Wei; Liu Yongkang; Deng Yongjun; Ma Jimin
2012-01-01
A computational method for the energy distribution of primary knock-on atoms (PKAs) produced by neutron radiation is presented in this paper. Based on the DBCN card in MCNP, the reaction position, reaction type and energy transfer between neutrons and atoms were recorded. From the statistics of these data, the energy and space distributions of the PKAs were obtained. The method handles properly the randomness of the random numbers and the efficiency of the random sampling computation. The results show small statistical fluctuation and good statistics. Three-dimensional plots of the energy and space distributions of the PKAs were obtained, which are important for evaluating the radiation resistance of materials and for studying radiation damage by neutrons. (authors)
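The elastic-scattering kinematics underlying such a PKA energy tally can be sketched directly (a generic illustration, not the authors' MCNP/DBCN machinery): assuming isotropic scattering in the centre-of-mass frame, the recoil energy is uniformly distributed between 0 and the kinematic maximum.

```python
import random

def sample_pka_energy(e_n, mass_number):
    """Sample a PKA recoil energy for elastic neutron scattering off a
    nucleus of mass number A, assuming isotropy in the centre-of-mass
    frame: the recoil energy is uniform on [0, T_max] with
    T_max = 4A / (1 + A)^2 * E_n."""
    t_max = 4.0 * mass_number / (1.0 + mass_number) ** 2 * e_n
    mu_cm = 2.0 * random.random() - 1.0      # isotropic CM scattering cosine
    return t_max * (1.0 - mu_cm) / 2.0

random.seed(4)
# 1 MeV neutrons elastically scattered by iron (A = 56): T_max ~ 0.069 MeV
samples = [sample_pka_energy(1.0, 56) for _ in range(100000)]
mean_t = sum(samples) / len(samples)         # expected mean: T_max / 2
```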
Evaluating the radiation detection of the RbGd2Br7:Ce scintillator by Monte Carlo methods
Liaparinos, Panagiotis; Kandarakis, Ioannis; Cavouras, Dionisis; Delis, Harry; Panayiotakis, George
2006-12-01
The purpose of this study was to investigate the radiation detection efficiency of the recently introduced RbGd2Br7:Ce (RGB) scintillator material with a custom-developed Monte Carlo simulation code. Considering its fast principal decay constant (45 ns) and its high light yield (56 000 photons/MeV), RbGd2Br7:Ce appears to be a quite promising scintillator for applications in nuclear medical imaging systems. In this work, gamma-ray interactions within the scintillator mass were studied. In addition, the effect of K-characteristic fluorescence radiation emission, re-absorption or escape, as well as the effect of scattering events on the spatial distribution of absorbed energy, was examined. Various scintillator crystal thicknesses (5-25 mm), used in positron emission imaging, were considered to be irradiated by 511 keV photons. Similar simulations were performed on the well-known Lu2SiO5:Ce (LSO) scintillator for comparison purposes. Simulation results allowed the determination of the quantum detection efficiency as well as the fraction of the energy absorbed due to the K-characteristic radiation, both as a function of scintillator crystal thickness. The Lu2SiO5:Ce scintillator was shown to exhibit better radiation absorption properties than RbGd2Br7:Ce. However, RGB was shown to be less affected by the production of K-characteristic radiation. Taking into account its very short decay time and its high light yield, this material could be considered for positron emission tomography (PET) detectors.
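A first-order estimate of the quantum detection efficiency discussed above is the fraction of incident photons interacting at least once in the crystal. The LSO attenuation coefficient used below (~0.87 1/cm at 511 keV) is a commonly quoted figure assumed here for illustration, not a result from this paper.

```python
import math

def quantum_detection_efficiency(mu, thickness_cm):
    """Fraction of incident photons undergoing at least one interaction
    in a crystal of the given thickness, for linear attenuation
    coefficient mu (1/cm): QDE = 1 - exp(-mu * t)."""
    return 1.0 - math.exp(-mu * thickness_cm)

qde_lso_20mm = quantum_detection_efficiency(0.87, 2.0)
qde_lso_25mm = quantum_detection_efficiency(0.87, 2.5)
# efficiency grows monotonically with crystal thickness, as in the study
```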
Energy Technology Data Exchange (ETDEWEB)
Sarrut, David, E-mail: david.sarrut@creatis.insa-lyon.fr [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon (France); Université Lyon 1 (France); Centre Léon Bérard (France); Bardiès, Manuel; Marcatili, Sara; Mauxion, Thibault [Inserm, UMR1037 CRCT, F-31000 Toulouse, France and Université Toulouse III-Paul Sabatier, UMR1037 CRCT, F-31000 Toulouse (France); Boussion, Nicolas [INSERM, UMR 1101, LaTIM, CHU Morvan, 29609 Brest (France); Freud, Nicolas; Létang, Jean-Michel [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Université Lyon 1, Centre Léon Bérard, 69008 Lyon (France); Jan, Sébastien [CEA/DSV/I2BM/SHFJ, Orsay 91401 (France); Loudos, George [Department of Medical Instruments Technology, Technological Educational Institute of Athens, Athens 12210 (Greece); Maigne, Lydia; Perrot, Yann [UMR 6533 CNRS/IN2P3, Université Blaise Pascal, 63171 Aubière (France); Papadimitroulas, Panagiotis [Department of Biomedical Engineering, Technological Educational Institute of Athens, 12210, Athens (Greece); Pietrzyk, Uwe [Institut für Neurowissenschaften und Medizin, Forschungszentrum Jülich GmbH, 52425 Jülich, Germany and Fachbereich für Mathematik und Naturwissenschaften, Bergische Universität Wuppertal, 42097 Wuppertal (Germany); Robert, Charlotte [IMNC, UMR 8165 CNRS, Universités Paris 7 et Paris 11, Orsay 91406 (France); and others
2014-06-15
In this paper, the authors review the applicability of the open-source GATE Monte Carlo simulation platform, based on the GEANT4 toolkit, for radiation therapy and dosimetry applications. The many applications of GATE for state-of-the-art radiotherapy simulations are described, including external beam radiotherapy, brachytherapy, intraoperative radiotherapy, hadrontherapy, molecular radiotherapy, and in vivo dose monitoring. Investigations that have been performed using GEANT4 only are also mentioned to illustrate the potential of GATE. The very practical feature of GATE, making it easy to model both a treatment and an imaging acquisition within the same framework, is emphasized. The computational times associated with several applications are provided to illustrate the practical feasibility of the simulations using current computing facilities.
Žukauskaitė, A; Plukienė, R; Ridikas, D
2007-01-01
Particle accelerators and other high-energy facilities produce penetrating ionizing radiation (neutrons and γ-rays) that must be shielded. The objective of this work was to model photon and neutron transport in various materials usually used as shielding, such as concrete, iron or graphite. The Monte Carlo method obtains answers by simulating individual particles and recording some aspects of their average behavior. In this work several nuclear experiments were modeled: AVF 65 (AVF cyclotron of the Research Center for Nuclear Physics, Osaka University, Japan) – γ-ray beams (1-10 MeV); HIMAC (heavy-ion synchrotron of the National Institute of Radiological Sciences in Chiba, Japan) and ISIS-800 (ISIS intense spallation neutron source facility of the Rutherford Appleton Laboratory, UK) – high-energy neutron (20-800 MeV) transport in iron and concrete. The calculation results were then compared with experimental data.
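The core of such a shielding Monte Carlo is free-path sampling through a slab; a minimal single-flight sketch (a toy with arbitrary mu and thickness, not the codes used in the experiments above) reproduces the analytic uncollided transmission exp(-mu*t):

```python
import math, random

def uncollided_transmission(mu, thickness, n_histories=200000):
    """Minimal slab-shielding Monte Carlo: sample each photon's first
    free path from the exponential distribution with attenuation
    coefficient mu, and count it as transmitted (uncollided) if the
    flight exceeds the slab thickness."""
    escaped = 0
    for _ in range(n_histories):
        path = -math.log(1.0 - random.random()) / mu   # sampled free path
        if path > thickness:
            escaped += 1
    return escaped / n_histories

random.seed(5)
est = uncollided_transmission(mu=0.5, thickness=2.0)
# analytic answer: exp(-mu * t) = exp(-1) ~ 0.368
```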
Sarrut, David; Bardiès, Manuel; Boussion, Nicolas; Freud, Nicolas; Jan, Sébastien; Létang, Jean-Michel; Loudos, George; Maigne, Lydia; Marcatili, Sara; Mauxion, Thibault; Papadimitroulas, Panagiotis; Perrot, Yann; Pietrzyk, Uwe; Robert, Charlotte; Schaart, Dennis R; Visvikis, Dimitris; Buvat, Irène
2014-06-01
In this paper, the authors review the applicability of the open-source GATE Monte Carlo simulation platform, based on the GEANT4 toolkit, for radiation therapy and dosimetry applications. The many applications of GATE for state-of-the-art radiotherapy simulations are described, including external beam radiotherapy, brachytherapy, intraoperative radiotherapy, hadrontherapy, molecular radiotherapy, and in vivo dose monitoring. Investigations that have been performed using GEANT4 only are also mentioned to illustrate the potential of GATE. The very practical feature of GATE, making it easy to model both a treatment and an imaging acquisition within the same framework, is emphasized. The computational times associated with several applications are provided to illustrate the practical feasibility of the simulations using current computing facilities.
Stepanek, J; Laissue, J A; Lyubimova, N; Di Michiel, F; Slatkin, D N
2000-01-01
Microbeam radiation therapy (MRT) is a currently experimental method of radiotherapy which is mediated by an array of parallel microbeams of synchrotron-wiggler-generated X-rays. Suitably selected, nominally supralethal doses of X-rays delivered to parallel microslices of tumor-bearing tissues in rats can be either palliative or curative while causing little or no serious damage to contiguous normal tissues. Although the pathogenesis of MRT-mediated tumor regression is not understood, as in all radiotherapy such understanding will be based ultimately on our understanding of the relationships among the following three factors: (1) microdosimetry, (2) damage to normal tissues, and (3) therapeutic efficacy. Although physical microdosimetry is feasible, published information on MRT microdosimetry to date is computational. This report describes Monte Carlo-based computational MRT microdosimetry using photon and/or electron scattering and photoionization cross-section data in the 1 eV through 100 GeV range distrib...
International Nuclear Information System (INIS)
Densmore, J.D.; Park, H.; Wollaber, A.B.; Rauenzahn, R.M.; Knoll, D.A.
2015-01-01
We present a moment-based acceleration algorithm applied to Monte Carlo simulation of thermal radiative-transfer problems. Our acceleration algorithm employs a continuum system of moments to accelerate convergence of stiff absorption–emission physics. The combination of energy-conserving tallies and the use of an asymptotic approximation in optically thick regions remedy the difficulties of local energy conservation and mitigation of statistical noise in such regions. We demonstrate the efficiency and accuracy of the developed method. We also compare directly to the standard linearization-based method of Fleck and Cummings [1]. A factor of 40 reduction in total computational time is achieved with the new algorithm for an equivalent (or more accurate) solution as compared with the Fleck–Cummings algorithm
International Nuclear Information System (INIS)
Cechak, T.
1982-01-01
In Gardner's method of double evaluation, one detector should be positioned such that its response is independent of the material density, and the second detector should be positioned so as to maximize changes in response due to density changes. Experimental scanning for the optimal energy is extremely time-demanding. A program based on the Monte Carlo method was therefore written to determine the magnitude of the error made when the computation of gamma-radiation backscattering neglects multiply scattered photons, how this error depends on the atomic number of the scattering material, and whether the representation of individual scatterings in the spectrum of backscattered photons depends on the positioning of the detector. 42 detectors, 8 types of material and 10 different density values were considered. The computed dependences are given graphically. (M.D.)
International Nuclear Information System (INIS)
Amnuehl', P.R.
1985-01-01
The history of the discovery of planetary nebulae and of studies of their origin and evolution is discussed in a popular way. The problem of the planetary nebula central star is considered, and the connection between white dwarf stars and planetary nebula formation is shown. The available observational data support the hypothesis of a red giant – planetary nebula nucleus – white dwarf transition process. The masses of white dwarfs and of planetary nebula central stars are distributed practically identically: the median mass is close to 0.6 Msub(Sun) (where Msub(Sun) is the mass of the Sun)
International Nuclear Information System (INIS)
Kramer, R; Vieira, J W; Khoury, H J; Lima, F R A; Fuelle, D
2003-01-01
The MAX (Male Adult voXel) phantom has been developed from existing segmented images of a male adult body, in order to achieve a representation as close as possible to the anatomical properties of the reference adult male specified by the ICRP. The study describes the adjustments of the soft-tissue organ masses, a new dosimetric model for the skin, a new model for skeletal dosimetry and a computational exposure model based on coupling the MAX phantom with the EGS4 Monte Carlo code. Conversion coefficients between equivalent dose to the red bone marrow as well as effective MAX dose and air-kerma free in air for external photon irradiation from the front and from the back, respectively, are presented and compared with similar data from other human phantoms
Kwan, Betty P.; O'Brien, T. Paul
2015-06-01
The Aerospace Corporation performed a study to determine whether static percentiles of AE9/AP9 can be used to approximate dynamic Monte Carlo runs for radiation analysis of spiral transfer orbits. Solar panel degradation is a major concern for solar-electric propulsion because solar-electric propulsion depends on the power output of the solar panel. Different spiral trajectories have different radiation environments that could lead to solar panel degradation. Because the spiral transfer orbits only last weeks to months, an average environment does not adequately address the possible transient enhancements of the radiation environment that must be accounted for in optimizing the transfer orbit trajectory. Therefore, to optimize the trajectory, an ensemble of Monte Carlo simulations of AE9/AP9 would normally be run for every spiral trajectory to determine the 95th percentile radiation environment. To avoid performing lengthy Monte Carlo dynamic simulations for every candidate spiral trajectory in the optimization, we found a static percentile that would be an accurate representation of the full Monte Carlo simulation for a representative set of spiral trajectories. For 3 LEO to GEO and 1 LEO to MEO trajectories, a static 90th percentile AP9 is a good approximation of the 95th percentile fluence with dynamics for 4-10 MeV protons, and a static 80th percentile AE9 is a good approximation of the 95th percentile fluence with dynamics for 0.5-2 MeV electrons. While the specific percentiles chosen cannot necessarily be used in general for other orbit trade studies, the concept of determining a static percentile as a quick approximation to a full Monte Carlo ensemble of simulations can likely be applied to other orbit trade studies. We expect the static percentile to depend on the region of space traversed, the mission duration, and the radiation effect considered.
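The idea of replacing an ensemble of dynamic Monte Carlo runs with a single static percentile can be illustrated with synthetic numbers (a made-up lognormal fluence distribution standing in for AE9/AP9 output, not real model data): the 95th percentile of a simulated ensemble is checked against the analytic "static" percentile of the same distribution.

```python
import math, random

# "Dynamic" ensemble: many Monte Carlo realizations of a fluence, here
# drawn from a lognormal(0, 0.5) toy model; the ensemble 95th percentile
# is what a full Monte Carlo campaign would report.
rng = random.Random(6)
runs = sorted(rng.lognormvariate(0.0, 0.5) for _ in range(20000))
p95_dynamic = runs[int(0.95 * len(runs))]

# "Static" percentile: the analytic 95th percentile of the model,
# exp(z_0.95 * sigma) with z_0.95 ~ 1.645 - available without any
# Monte Carlo sampling.
p95_static = math.exp(1.645 * 0.5)
rel_err = abs(p95_dynamic - p95_static) / p95_static
```

When such a static percentile tracks the ensemble percentile closely, the expensive per-trajectory Monte Carlo runs can be skipped during trajectory optimization, which is the strategy the study validates for its LEO-to-GEO and LEO-to-MEO cases.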
Efficient Sequential Monte Carlo Sampling for Continuous Monitoring of a Radiation Situation
Czech Academy of Sciences Publication Activity Database
Šmídl, Václav; Hofman, Radek
2014-01-01
Vol. 56, No. 4 (2014), pp. 514-527. ISSN 0040-1706. R&D Projects: GA MV VG20102013018. Institutional support: RVO:67985556. Keywords: radiation protection * atmospheric dispersion model * importance sampling. Subject RIV: BD - Theory of Information. Impact factor: 1.814, year: 2014. http://library.utia.cas.cz/separaty/2014/AS/smidl-0433631.pdf
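A sequential Monte Carlo sampler of the generic kind used for such continuous monitoring can be sketched as a single importance-sampling/resampling step (a sketch, not the paper's algorithm; the release-rate scenario and Gaussian likelihood below are invented for illustration):

```python
import math, random

def sir_step(particles, weights, likelihood):
    """One sequential-importance-resampling (SIR) step: reweight each
    particle by the measurement likelihood, normalize, then resample
    in proportion to the weights to concentrate particles where the
    posterior mass is."""
    w = [wi * likelihood(p) for p, wi in zip(particles, weights)]
    total = sum(w)
    w = [wi / total for wi in w]                       # normalize
    resampled = random.choices(particles, weights=w, k=len(particles))
    return resampled, [1.0 / len(particles)] * len(particles)

random.seed(7)
# particles are hypothesized source-term magnitudes; the observation
# (via the likelihood) favors a value near 2.0
particles = [random.uniform(0.0, 5.0) for _ in range(5000)]
weights = [1.0 / 5000] * 5000
lik = lambda x: math.exp(-((x - 2.0) ** 2) / (2.0 * 0.3 ** 2))
particles, weights = sir_step(particles, weights, lik)
mean_est = sum(particles) / len(particles)             # posterior mean ~ 2.0
```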
Cohen, D; Stamnes, S; Tanikawa, T; Sommersten, E R; Stamnes, J J; Lotsberg, J K; Stamnes, K
2013-04-22
A comparison is presented of two different methods for polarized radiative transfer in coupled media consisting of two adjacent slabs with different refractive indices, each slab being a stratified medium with no change in optical properties except in the direction of stratification. One of the methods is based on solving the integro-differential radiative transfer equation for the two coupled slabs using the discrete ordinate approximation. The other method is based on probabilistic and statistical concepts and simulates the propagation of polarized light using the Monte Carlo approach. The emphasis is on non-Rayleigh scattering for particles in the Mie regime. Comparisons with benchmark results available for a slab with constant refractive index show that both methods reproduce these benchmark results when the refractive index is set to be the same in the two slabs. Computed results for test cases with coupling (different refractive indices in the two slabs) show that the two methods produce essentially identical results for identical input in terms of absorption and scattering coefficients and scattering phase matrices.
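At the interface between the two slabs, a Monte Carlo of this kind must decide between reflection and refraction; the standard Snell/Fresnel treatment for unpolarized light (textbook formulas, not code from the paper) looks like this:

```python
import math

def fresnel_unpolarized(n1, n2, cos_i):
    """Reflectance of unpolarized light crossing from refractive index n1
    to n2 with incidence cosine cos_i: average of the s- and p-polarized
    Fresnel reflectances; returns 1.0 under total internal reflection."""
    sin_t2 = (n1 / n2) ** 2 * (1.0 - cos_i ** 2)       # Snell's law
    if sin_t2 > 1.0:
        return 1.0                                     # total internal reflection
    cos_t = math.sqrt(1.0 - sin_t2)
    r_s = ((n1 * cos_i - n2 * cos_t) / (n1 * cos_i + n2 * cos_t)) ** 2
    r_p = ((n1 * cos_t - n2 * cos_i) / (n1 * cos_t + n2 * cos_i)) ** 2
    return 0.5 * (r_s + r_p)

# normal incidence, air (1.0) to water (1.33): R ~ 2%
r0 = fresnel_unpolarized(1.0, 1.33, 1.0)
```

In a photon-tracking loop, a uniform random number compared against this reflectance decides whether the photon is reflected back into the upper slab or refracted into the lower one.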
Directory of Open Access Journals (Sweden)
Kazempour M.
2015-06-01
Background: In diagnostic radiology, lead aprons are usually used to protect patients and radiology staff against ionizing radiation. The lead apron is a desirable shield due to the high absorption and effective attenuation of x-ray photons in the diagnostic radiology range. Objective: Although lead aprons have good radiation protection properties, in recent years researchers have been looking for alternative materials because of problems arising from the lead content of aprons: these radiation protection garments are heavy and uncomfortable for the staff to wear, particularly during long-time use, and lead is a toxic element whose disposal is associated with environmental and human-health hazards. Method: In this study, several new combinations of lead-free materials (W-Si, W-Sn-Ba-EPVC, W-Sn-Cd-EPVC) were investigated in the energy range of diagnostic radiology in two geometries, narrow and broad beam. The radiation attenuation characteristics of these materials were assessed at 40, 60, 90 and 120 kVp, and the results compared with those of some lead-containing materials (Pb-Si, Pb-EPVC). Results: Lead shields still provide better protection at low energies (below 40 kVp). The W-Sn-Cd-EPVC combination showed the best radiation attenuation at 60 and 90 kVp, and the W-Sn-Ba-EPVC composition the best attenuation at 120 kVp, even better than the previously mentioned lead-containing composites. Conclusion: Lead-free shields are completely effective for protection against X-ray energies in the range of 60 to 120 kVp.
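The narrow- vs broad-beam distinction made in this study reduces to one formula: good-geometry (narrow-beam) transmission is exp(-mu*t), while in broad-beam geometry scattered photons still reach the detector, modeled by a buildup factor B >= 1. The numeric values of mu and B below are illustrative, not the study's data.

```python
import math

def beam_transmission(mu, t_cm, buildup=1.0):
    """Transmission through a shield of thickness t_cm with linear
    attenuation coefficient mu (1/cm). buildup = 1 gives the narrow-beam
    (good-geometry) value; buildup > 1 accounts for scattered photons
    reaching the detector in broad-beam geometry."""
    return buildup * math.exp(-mu * t_cm)

narrow = beam_transmission(mu=2.0, t_cm=0.1)               # B = 1
broad = beam_transmission(mu=2.0, t_cm=0.1, buildup=1.4)   # B > 1
# broad-beam geometry always transmits more than narrow-beam geometry
```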
GATE Monte Carlo simulation in radiation therapy for complex and dynamic beams in IMRT
International Nuclear Information System (INIS)
Benhalouche, Saadia
2014-01-01
Radiotherapy is one of the three main cancer treatment modalities, along with surgery and chemotherapy. It has evolved with the development of treatment techniques such as IMRT and VMAT, together with IGRT for patient positioning. The aim is to treat tumors effectively while limiting the dose to healthy organs. In this work, we use the GATE Monte Carlo simulation platform to model a LINAC delivering a 6 MV photon beam. The resulting model is validated by a dosimetric study, computing the relevant beam quality parameters. The LINAC model is then used to simulate clinical IMRT treatment plans in the head-and-neck (ORL) domain, and the simulation results are compared with experimental measurements. We also explored modeling of the LINAC portal imaging system. This technique, referred to as MV-CBCT, combines the LINAC source with a flat-panel detector to acquire 3D images of the patient. This part was validated first by acquiring 2D projections of a patient and of an anthropomorphic phantom and reconstructing 3D volumes; here again, validation was performed by comparing simulated and actual images. As a second step, a dosimetric validation was carried out by evaluating the dose deposited by IMRT beams from the portal signal alone. We show in the present work the ability of GATE to simulate complex IMRT treatments and portal images as they are performed routinely for dosimetric quality control. (author)
An application benchmark between the LHC Radiation Monitor and FLUKA Monte Carlo simulations at CERF
Roeed, K; Lebbos, E; Lendaro, J; Kramer, D; Mala, P; Spiezia, G; Pignard, C; CERN. Geneva. ATS Department
2011-01-01
This report presents a comparison between FLUKA simulations and measurements performed with the LHC Radiation Monitor (RadMon) at the CERF facility in the North Area of CERN. The main objective of the work was to compare measurements of Single Event Upsets (SEU), and thereby of high-energy hadron and thermal neutron fluences, with the values predicted by FLUKA simulations. The measurements were performed in a mixed radiation field comparable to the LHC environment. The RadMon can be operated at two different bias voltages (3 V and 5 V), for which the sensitivity to High Energy Hadrons (HEH) and thermal neutrons differs; performing measurements at both voltages thus makes it possible to extract the corresponding high-energy hadron and thermal neutron fluences.
A Monte Carlo Code for Relativistic Radiation Transport Around Kerr Black Holes
Schnittman, Jeremy David; Krolik, Julian H.
2013-01-01
We present a new code for radiation transport around Kerr black holes, including arbitrary emission and absorption mechanisms, as well as electron scattering and polarization. The code is particularly useful for analyzing accretion flows made up of optically thick disks and optically thin coronae. We give a detailed description of the methods employed in the code and also present results from a number of numerical tests to assess its accuracy and convergence.
International Nuclear Information System (INIS)
Koukorava, C; Farah, J; Clairand, I; Donadille, L; Struelens, L; Vanhavere, F; Dimitriou, P
2014-01-01
Monte Carlo calculations were used to investigate the efficiency of radiation protection equipment in reducing eye and whole body doses during fluoroscopically guided interventional procedures. Eye lens doses were determined considering different models of eyewear with various shapes, sizes and lead thickness. The origin of scattered radiation reaching the eyes was also assessed to explain the variation in the protection efficiency of the different eyewear models with exposure conditions. The work also investigates the variation of eye and whole body doses with ceiling-suspended shields of various shapes and positioning. For all simulations, a broad spectrum of configurations typical for most interventional procedures was considered. Calculations showed that 'wrap around' glasses are the most efficient eyewear models, reducing the dose on average by 74% and 21% for the left and right eyes respectively. The air gap between the glasses and the eyes was found to be the primary source of scattered radiation reaching the eyes. The ceiling-suspended screens were more efficient when positioned close to the patient's skin and to the x-ray field. With the use of such shields, the Hp(10) values recorded at the collar, chest and waist level and the Hp(3) values for both eyes were reduced on average by 47%, 37%, 20% and 56% respectively. Finally, simulations proved that beam quality and lead thickness have little influence on eye dose, while beam projection, the position and head orientation of the operator, and the distance between the image detector and the patient are key parameters affecting eye and whole body doses. (paper)
Directory of Open Access Journals (Sweden)
Mary Yip
Detection of buried improvised explosive devices (IEDs) is a delicate task, creating a need for sensitive stand-off detection technology. The shape, composition and size of IEDs can be expected to change over time in an effort to defeat increasingly sophisticated detection methods. For example, landmines are for the most part found through metal detection, which has led to increasing use of non-ferrous materials such as wood or plastic containers for chemical-based explosives. Monte Carlo simulations have been undertaken for three commercially available detector materials (hyperpure germanium (HPGe), lanthanum(III) bromide (LaBr3) and thallium-activated sodium iodide (NaI(Tl))), applied at a stand-off distance of 50 cm from the surface and burial depths of 0, 5 and 10 cm, with sand as the obfuscating medium. Target materials representing medium-density wood and mild steel were considered. Each detector was modelled as a 10 cm thick cylinder of 20 cm diameter. HPGe appears to be the most promising detector for this application: although it is not the densest material studied, its excellent energy resolution yields the highest-quality spectra from which detection decisions can be inferred. The simulation work undertaken here suggests that a vehicle-borne threat detection system could be envisaged using a single betatron and a series of detectors operating in parallel, observing the space directly in front of the vehicle path. Furthermore, the results show that non-ferrous materials such as wood can be effectively discerned in such a remote-operated detection system, with the potential to apply a signature-analysis template-matching technique for real-time analysis of the data.
International Nuclear Information System (INIS)
Lee, Choonsik; Nagaoka, Tomoaki; Lee, Jai-Ki
2006-01-01
Japanese male and female tomographic phantoms, originally developed for radio-frequency electromagnetic-field dosimetry, were implemented into a multi-particle Monte Carlo transport code to evaluate realistic dose distributions in the human body exposed to radiation fields. The phantoms, constructed from whole-body magnetic resonance images of average adult Japanese males and females, were processed as follows for use with the general-purpose multi-particle Monte Carlo code MCNPX2.5. The original array sizes of the male and female phantoms, 320 x 160 x 866 voxels and 320 x 160 x 804 voxels respectively, were reduced to 320 x 160 x 433 voxels and 320 x 160 x 402 voxels because of memory limitations in MCNPX2.5. The 3D voxel arrays were processed using the built-in repeated-structure algorithm, in which the human anatomy is described by a repeated lattice of tiny cubes, each carrying a material composition and an organ index number. The original phantom data were converted into an ASCII file, which can be ported directly into the lattice card of the MCNPX2.5 input deck using an in-house code. A total of 30 material compositions taken from International Commission on Radiation Units and Measurements (ICRU) Report 46 were assigned to the 54 and 55 organs and tissues of the male and female phantoms, respectively, and imported into the material card of MCNPX2.5 along with the corresponding cross-section data. Illustrative calculations of absorbed dose for 26 internal organs and of effective dose were performed for idealized broad parallel photon and neutron beams in anterior-posterior irradiation geometry, which is typical for workers at nuclear power plants. The results were compared with data from other Japanese and Caucasian tomographic phantoms and from International Commission on Radiological Protection (ICRP) Report 74, and the differences in organ dose and effective dose among tomographic phantoms were investigated further.
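The two phantom-preparation steps described above, halving the z-resolution to fit memory (866 to 433 slices) and dumping the voxel array as plain ASCII for the lattice card, can be sketched as follows. The slice-pairing choice and the entries-per-line layout are assumptions, since the in-house code is not described in detail.

```python
def downsample_z(vox):
    """Keep every second z-slice so an N-slice phantom becomes N//2
    slices (866 -> 433, 804 -> 402); merging slice pairs by majority
    vote would be an alternative reduction."""
    return vox[:len(vox) // 2 * 2 : 2]

def to_lattice_ascii(vox, path, per_line=20):
    """Dump organ/material index numbers, x varying fastest, as plain
    ASCII integers that can be pasted into the fill array of a
    repeated-structure lattice card."""
    entries = [str(m) for zslice in vox for row in zslice for m in row]
    with open(path, "w") as f:
        for i in range(0, len(entries), per_line):
            f.write(" ".join(entries[i:i + per_line]) + "\n")

# Toy 4 x 4 x 6 phantom of material indices 0..2, indexed vox[z][y][x]
phantom = [[[(x + y + z) % 3 for x in range(4)] for y in range(4)]
           for z in range(6)]
small = downsample_z(phantom)
print(len(small))  # 3 z-slices after downsampling
to_lattice_ascii(small, "lattice.txt")
```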
Energy Technology Data Exchange (ETDEWEB)
Bathe, J.; Gouriou, J.; Daures, J.; Ostrowsky, A.; Bordy, J.M. [CEA Saclay, Dir. de la Recherche Technologique (DRT/DIMRI - LNHB), 91 - Gif sur Yvette (France)
2003-07-01
The use of Monte Carlo codes makes it possible to obtain correction values that are more accurate than, or inaccessible to, traditional methods. Several results obtained in the framework of dose metrology (influence of vacuum gaps in a calorimeter, influence of walls in a chemical dosemeter) and of radioactivity metrology (detector efficiency and energy-deposition spectra, energy spectra of thick sources) are presented here. (N.C.)
Directory of Open Access Journals (Sweden)
Boris Fomin
2012-10-01
This paper presents a new version of the radiative transfer model called the Fast Line-by-Line Model (FLBLM), which is based on the Line-by-Line (LbL) and Monte Carlo (MC) methods and rigorously treats particulate and molecular scattering alongside absorption. The advantage of this model is its use of the line-by-line treatment, which allows high-resolution spectra to be computed quite quickly. We have extended the model to take into account the polarization state of light and carried out validations by comparison against benchmark results. FLBLM calculates the Stokes-parameter spectra of shortwave radiation in vertically inhomogeneous atmospheres. This update makes the model applicable to assessing the influence of clouds and aerosols on radiances as measured by shortwave high-resolution polarization spectrometers. Sample results demonstrate that high-resolution spectra of the Stokes parameters contain more detailed information about clouds and aerosols than medium- and low-resolution spectra in which lines are not resolved. The model is fast enough for many practical applications (e.g., validations) and should be useful especially for remote sensing. FLBLM is suitable for developing reliable techniques for retrieving optical and microphysical properties of clouds and aerosols from high-resolution satellite data.
Dunn, William L
2012-01-01
Exploring Monte Carlo Methods is a basic text that describes the numerical methods that have come to be known as "Monte Carlo." The book treats the subject generically through the first eight chapters and thus should be of use to anyone who wants to learn to use Monte Carlo. The next two chapters focus on applications in nuclear engineering, which are illustrative of uses in other fields. Five appendices are included, which provide useful information on probability distributions, general-purpose Monte Carlo codes for radiation transport, and other matters. The famous "Buffon's needle problem"…
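The description breaks off at the famous "Buffon's needle problem", the textbook Monte Carlo example: a needle of length L dropped on a floor ruled with parallel lines a distance D apart (L <= D) crosses a line with probability 2L/(pi*D), so counting crossings yields an estimate of pi. A minimal sketch:

```python
import math
import random

def buffon_pi(n_throws=200_000, needle=1.0, spacing=2.0, seed=7):
    """Estimate pi by Buffon's needle (needle length <= line spacing).
    P(cross) = 2L / (pi * D), so pi ~ 2 L n / (D * crossings)."""
    rng = random.Random(seed)
    crossings = 0
    for _ in range(n_throws):
        centre = rng.uniform(0.0, spacing / 2.0)  # distance to nearest line
        theta = rng.uniform(0.0, math.pi / 2.0)   # acute angle with the lines
        if centre <= (needle / 2.0) * math.sin(theta):
            crossings += 1
    return 2.0 * needle * n_throws / (spacing * crossings)

print(buffon_pi())  # ~3.14 for large n_throws
```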
International Nuclear Information System (INIS)
Yang, Ching-Ching; Chan, Kai-Chieh
2013-06-01
Small-animal PET allows qualitative assessment and quantitative measurement of biochemical processes in vivo, but the accuracy and reproducibility of imaging results can be affected by several parameters. The first aim of this study was to investigate the performance of different CT-based attenuation correction strategies and assess the resulting impact on PET images. The absorbed dose in different tissues caused by the scanning procedures was also discussed, with a view to minimizing the biological damage generated by radiation exposure during PET/CT scanning. A small-animal PET/CT system was modeled by Monte Carlo simulation to generate imaging results and dose distributions. Three energy mapping methods, namely the bilinear scaling method, the dual-energy method, and a hybrid method combining kVp conversion with the dual-energy method, were compared by assessing the accuracy of the estimated linear attenuation coefficients at 511 keV and the bias introduced into PET quantification by CT-based attenuation correction. Our results showed that the hybrid method outperformed the bilinear scaling method, while the dual-energy method achieved the highest accuracy of the three. Overall, the accuracy of the PET quantification results follows a similar trend to that of the estimated linear attenuation coefficients, though the differences between the three methods are more pronounced in the attenuation coefficients than in the PET quantification. With regard to radiation exposure from CT, the absorbed dose ranged between 7.29 and 45.58 mGy for the 50 kVp scan and between 6.61 and 39.28 mGy for the 80 kVp scan. For an 18F radioactivity concentration of 1.86 x 10^5 Bq/ml, the PET absorbed dose was around 24 cGy for a tumor with a target-to-background ratio of 8. The radiation levels for CT scans are not lethal to the animal, but concurrent use of PET in a longitudinal study can increase the risk of biological effects.
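Of the three energy-mapping methods, the bilinear scaling method is the simplest to sketch: CT numbers are mapped to 511 keV attenuation with two straight-line segments that meet at water (HU = 0). The bone-segment slope below is an illustrative placeholder, not a fitted calibration from the study; the water value 0.096 cm^-1 at 511 keV is the standard textbook figure.

```python
def bilinear_hu_to_mu511(hu, mu_water=0.096, bone_slope=5.0e-5):
    """Bilinear scaling: one line from air (HU = -1000, mu ~ 0) to water
    (HU = 0), and a shallower line for HU > 0 toward bone, reflecting
    the weaker photoelectric contribution at 511 keV.
    The bone_slope value is an assumed placeholder."""
    if hu <= 0:
        return mu_water * (1.0 + hu / 1000.0)   # air-water mixtures
    return mu_water + bone_slope * hu           # water-bone mixtures

print(bilinear_hu_to_mu511(-1000))  # ~0 (air)
print(bilinear_hu_to_mu511(0))      # mu of water at 511 keV (cm^-1)
```

In practice the segment slopes are calibrated per scanner and per kVp, which is precisely the limitation the dual-energy and hybrid methods address.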
Kramer, R; Khoury, H J; Vieira, J W; Loureiro, E C M; Lima, V J M; Lima, F R A; Hoff, G
2004-12-07
The International Commission on Radiological Protection (ICRP) has created a task group on dose calculations, which, among other objectives, should replace the currently used mathematical MIRD phantoms by voxel phantoms. Voxel phantoms are based on digital images recorded from scanning of real persons by computed tomography or magnetic resonance imaging (MRI). Compared to the mathematical MIRD phantoms, voxel phantoms are true to the natural representations of a human body. Connected to a radiation transport code, voxel phantoms serve as virtual humans for which equivalent dose to organs and tissues from exposure to ionizing radiation can be calculated. The principal database for the construction of the FAX (Female Adult voXel) phantom consisted of 151 CT images recorded from scanning of trunk and head of a female patient, whose body weight and height were close to the corresponding data recommended by the ICRP in Publication 89. All 22 organs and tissues at risk, except for the red bone marrow and the osteogenic cells on the endosteal surface of bone ('bone surface'), have been segmented manually with a technique recently developed at the Departamento de Energia Nuclear of the UFPE in Recife, Brazil. After segmentation the volumes of the organs and tissues have been adjusted to agree with the organ and tissue masses recommended by ICRP for the Reference Adult Female in Publication 89. Comparisons have been made with the organ and tissue masses of the mathematical EVA phantom, as well as with the corresponding data for other female voxel phantoms. The three-dimensional matrix of the segmented images has eventually been connected to the EGS4 Monte Carlo code. Effective dose conversion coefficients have been calculated for exposures to photons, and compared to data determined for the mathematical MIRD-type phantoms, as well as for other voxel phantoms.
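The mass-adjustment step described above, growing or shrinking each segmented organ until its mass matches the ICRP Publication 89 reference, reduces to simple voxel bookkeeping. The organ, voxel volume and density below are hypothetical numbers chosen only for illustration, not values from the FAX phantom.

```python
def organ_mass_g(n_voxels, voxel_volume_cm3, density_g_cm3):
    """Mass of a segmented organ from its voxel count."""
    return n_voxels * voxel_volume_cm3 * density_g_cm3

def voxels_to_adjust(n_voxels, voxel_volume_cm3, density_g_cm3, ref_mass_g):
    """Voxels to add (positive) or remove (negative) at the organ
    surface so the segmented mass matches the reference mass."""
    target = ref_mass_g / (voxel_volume_cm3 * density_g_cm3)
    return round(target) - n_voxels

# Hypothetical liver-like organ: 0.0036 cm^3 voxels, density 1.05 g/cm^3
print(organ_mass_g(350_000, 0.0036, 1.05))                   # segmented mass, g
print(voxels_to_adjust(350_000, 0.0036, 1.05, ref_mass_g=1400.0))
```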
Mesbahi, Asghar; Khaldari, Rezvan
2017-09-01
In the current study, the neutron and photon scattering properties of some newly developed high-density concretes (HDCs) were calculated using the MCNPX Monte Carlo code. Five high-density concretes, namely Steel-Magnetite, Barite, Datolite-Galena, Ilmenite-Ilmenite and Magnetite-Lead, with densities of up to 5.11 g/cm3, and ordinary concrete with a density of 2.3 g/cm3 were studied in our simulations. Photon beam spectra of 4 and 18 MV from a Varian linac and the neutron spectrum of a clinical 18 MV photon beam were used for the calculations. The fluence of photons and neutrons scattered from all the studied concretes was calculated at different angles. Overall, ordinary concrete showed the highest scattered photon fluence and Datolite-Galena concrete (4.42 g/cm3) the lowest. The scattered neutron fluence was highest at 180 degrees, while the scattered photon fluence was maximum at 90 degrees. The scattered fluence of both photons and neutrons thus depended on the angle and the composition of the concrete. For the high-density concretes, the variation of scattered fluence with angle was very pronounced for neutrons but only slight for photons. The results can be used in the design of radiation therapy bunkers.
International Nuclear Information System (INIS)
Petrov, Eh.E.; Fadeev, I.A.
1979-01-01
The possibility of using displaced sampling from a bulk gamma source in calculating secondary gamma fields by the Monte Carlo method is discussed. The proposed algorithm is based on the concept of conjugate (adjoint) functions together with a variance-minimization technique. For simplicity a plane source is considered. The algorithm has been implemented on the M-220 computer, and the differential gamma current and flux spectra in 21 cm of lead have been calculated. The source of secondary gamma quanta was assumed to be distributed, constant and isotropic, emitting 4 MeV gamma quanta at a rate of 10^9 quanta/(cm^3 s). The calculations demonstrated that the last 7 cm of lead are responsible for the whole shape of the gamma spectrum. The spectra practically coincide with those calculated by the ROZ computer code. Thus the proposed algorithm can be used effectively in calculations of secondary gamma radiation transport, reducing the computation time by a factor of 2-4.
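The gain from biasing where source photons are born can be illustrated with a deliberately simple 1D analogue (not the paper's adjoint-based algorithm): estimate the uncollided escape probability of a uniform volume source through a slab, once with analog sampling and once with births biased toward the exit face via q(x) proportional to exp(b*x), each sample weighted by p(x)/q(x). Choosing the bias rate b equal to the attenuation coefficient makes this toy estimator zero-variance; the attenuation coefficient below is a hypothetical value.

```python
import math
import random

def analog_escape(mu, thickness, n, rng):
    """Analog estimate: births uniform in depth, score the uncollided
    escape factor exp(-mu * distance-to-exit)."""
    return sum(math.exp(-mu * (thickness - rng.uniform(0.0, thickness)))
               for _ in range(n)) / n

def biased_escape(mu, thickness, n, rng, b):
    """Importance-sampled estimate: births drawn from q(x) ~ exp(b*x),
    favouring the last few cm near the exit face; each sample carries
    the weight p(x)/q(x) to keep the estimator unbiased."""
    norm = (math.exp(b * thickness) - 1.0) / b    # integral of exp(b*x)
    total = 0.0
    for _ in range(n):
        u = rng.random()
        x = math.log(1.0 + u * b * norm) / b      # inverse-CDF sample of q
        weight = (1.0 / thickness) / (math.exp(b * x) / norm)
        total += weight * math.exp(-mu * (thickness - x))
    return total / n

mu, t = 0.7, 21.0   # hypothetical attenuation coefficient; 21 cm of lead
exact = (1.0 - math.exp(-mu * t)) / (mu * t)
rng = random.Random(3)
print(exact, analog_escape(mu, t, 10_000, rng),
      biased_escape(mu, t, 10_000, rng, b=mu))
```

With mu*t = 14.7, almost all the analog samples contribute essentially nothing, mirroring the paper's observation that only the last 7 cm of lead shape the spectrum; the biased estimator concentrates its samples exactly there.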
International Nuclear Information System (INIS)
Noblet, Caroline
2014-01-01
Innovative irradiators dedicated to small animals make it possible to mimic clinical image-guided radiation therapy treatments. Clinical practice is scaled down to the small animal by reducing the beam dimensions (from centimetres to millimetres) and energy (from MeV to keV): millimetre-sized medium-energy beams (<300 keV) are used to treat the animals. This scaling imposes stronger constraints than in clinical practice, especially for calculating the absorbed dose in the animal. Because of the beam dimensions and the medium energy range, clinical dose calculation methods are not easily applicable to preclinical practice, and Monte Carlo methods are needed. To this end, a Monte Carlo model of the XRAD225Cx preclinical irradiator was developed with the GATE (Geant4) framework. This model was validated by comparing simulation results against measurements and against results obtained with EGSnrc, a reference Monte Carlo code in external beam radiation therapy. A specific issue was highlighted: the significant dosimetric impact of tissue segmentation in the animal CT images. At medium energies, thresholding based on electron density alone cannot accurately account for heterogeneities; materials should be defined using both the tissue elemental composition and the mass density. An original segmentation method was therefore developed to obtain realistic dose distributions in small animals. Finally, our Monte Carlo platform has been successfully used for several radiobiological studies with mice and rats. (author)
Coherent Backscattering by Particulate Planetary Media of Nonspherical Particles
Muinonen, Karri; Penttila, Antti; Wilkman, Olli; Videen, Gorden
2014-11-01
The so-called radiative-transfer coherent-backscattering method (RT-CB) has been put forward as a practical Monte Carlo method to compute multiple scattering in discrete random media mimicking planetary regoliths (K. Muinonen, Waves in Random Media 14, p. 365, 2004). In RT-CB, the interaction between the discrete scatterers takes place in the far-field approximation and the wave propagation faces exponential extinction. There is a significant constraint in the RT-CB method: it has to be assumed that the form of the scattering matrix is that of a spherical particle. We aim to extend the RT-CB method to nonspherical single particles showing significant depolarization characteristics. First, ensemble-averaged single-scattering albedos and phase matrices of nonspherical particles are matched using a phenomenological radiative-transfer model within a microscopic volume element. Second, the phenomenological single-particle model is incorporated into the Monte Carlo RT-CB method. In the ray tracing, the electromagnetic phases within the microscopic volume elements are omitted as having negligible lengths, whereas the phases are duly accounted for in the paths between two or more microscopic volume elements. We assess the computational feasibility of the extended RT-CB method and show preliminary results for particulate media mimicking planetary regoliths. The present work can be utilized in the interpretation of astronomical observations of asteroids and other planetary objects. In particular, the work sheds light on the depolarization characteristics of planetary regoliths at small phase angles near opposition. The research has been partially funded by the ERC Advanced Grant No 320773 entitled “Scattering and Absorption of Electromagnetic Waves in Particulate Media” (SAEMPL), by the Academy of Finland (contract 257966), NASA Outer Planets Research Program (contract NNX10AP93G), and NASA Lunar Advanced Science and Exploration Research Program (contract NNX11AB25G).
International Nuclear Information System (INIS)
White, Travis; Hack, Joe; Nathan, Steve; Barnett, Marvin
2001-01-01
Analytical solutions for scattering of neutrons through multi-legged penetrations are readily available in the literature; similar analytical solutions for photon scattering through penetrations, however, are not. Therefore, computer modeling must be relied upon to perform our analyses. The computer code typically used by Westinghouse SMS in the evaluation of photon transport through complex geometries is the MCNP Monte Carlo code. Yet geometries of this nature can cause problems even for Monte Carlo codes: striking a balance between how the code handles bulk transport through the wall and transport through the penetration void, particularly with the use of typical variance reduction methods, is difficult when trying to ensure that all the important regions of the model are sampled adequately. The problem was broken down into several roughly independent cases. First, scatter through the penetration was considered. Second, bulk transport through the hot leg of the duct and then through the remaining thickness of wall was calculated to determine the amount of supplemental shielding required in the wall. Similar analyses were performed for the middle and cold legs of the penetration. Finally, additional external shielding against radiation streaming through the duct was determined for cases where the minimum offset distance was not feasible. Each case was further broken down into two phases. In the first phase, photons were transported from the source material to the face of the wall, or the opening of the duct, where the photon energy and angular distributions were tallied, representing the source incident on the wall or opening. Then a simplified model for each case was developed and analyzed using the data from the first phase as the new source term. (authors)
Directory of Open Access Journals (Sweden)
Nilseia Aparecida Barbosa
2014-08-01
… heterogeneous eye model, indicating that the homogeneous water eye model is a reasonable one. The determined isodose curves give a good visualization of the dose distributions inside the eye structures, pointing out their most exposed volumes. Cite this article as: Barbosa NA, da Rosa LAR, de Menezes AF, Reis JP, Facure A, Braz D. Assessment of ocular beta radiation dose distribution due to 106Ru/106Rh brachytherapy applicators using MCNPX Monte Carlo code. Int J Cancer Ther Oncol 2014; 2(3):02038. DOI: 10.14319/ijcto.0203.8
International Nuclear Information System (INIS)
Cho, S H
2005-01-01
A recent mouse study demonstrated that gold nanoparticles could be safely administered and used to enhance the tumour dose during radiation therapy. The use of gold nanoparticles seems more promising than earlier methods because of the high atomic number of gold and because nanoparticles can more easily penetrate the tumour vasculature. However, to date, the possible dose enhancement due to the use of gold nanoparticles has not been well quantified, especially for common radiation treatment situations. Therefore, the current preliminary study estimated this dose enhancement by Monte Carlo calculations for several phantom test cases representing radiation treatments with the following modalities: 140 kVp x-rays, 4 and 6 MV photon beams, and 192Ir gamma rays. The current study considered three levels of gold concentration within the tumour, two of which are based on the aforementioned mouse study, and assumed either no gold or a single gold concentration level outside the tumour. The dose enhancement over the tumour volume considered for the 140 kVp x-ray case can be at least a factor of 2 at an achievable gold concentration of 7 mg Au/g tumour, assuming no gold outside the tumour. The tumour dose enhancement for the cases involving the 4 and 6 MV photon beams, based on the same assumption, ranged from about 1% to 7%, depending on the amount of gold within the tumour and the photon beam quality. For the 192Ir cases, the dose enhancement within the tumour region ranged from 5% to 31%, depending on radial distance and on the gold concentration level within the tumour. For the 7 mg Au/g tumour cases, loading the surrounding normal tissue with gold at 2 mg Au/g increased the normal tissue dose by up to 30%, by a negligible amount, and by about 2% for the 140 kVp x-rays, the 6 MV photon beam, and the 192Ir gamma rays, respectively, while the magnitude of dose enhancement within the tumour was essentially unchanged. (note)
Technical Note: A Monte Carlo study of magnetic-field-induced radiation dose effects in mice
Energy Technology Data Exchange (ETDEWEB)
Rubinstein, Ashley E. [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 and The University of Texas Graduate School of Biomedical Sciences, Houston, Texas 77030 (United States); Liao, Zhongxing [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States); Melancon, Adam D.; Followill, David S.; Tailor, Ramesh C. [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States); Guindani, Michele [Department of Biostatistics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States); Hazle, John D. [Department of Imaging Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States); Court, Laurence E., E-mail: lecourt@mdanderson.org [Departments of Radiation Physics and Imaging Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States)
2015-09-15
Purpose: Magnetic fields are known to alter radiation dose deposition. Before patients receive treatment using an MRI-linear accelerator (MRI-Linac), preclinical studies are needed to understand the biological consequences of magnetic-field-induced dose effects. In the present study, the authors sought to identify a beam energy and magnetic field strength combination suitable for preclinical murine experiments. Methods: Magnetic field dose effects were simulated in a mouse lung phantom using various beam energies (225 kVp, 350 kVp, 662 keV [Cs-137], 2 MV, and 1.25 MeV [Co-60]) and magnetic field strengths (0.75, 1.5, and 3 T). The resulting dose distributions were compared with those in a simulated human lung phantom irradiated with a 6 or 8 MV beam and orthogonal 1.5 T magnetic field. Results: In the human lung phantom, the authors observed a dose increase of 45% and 54% at the soft-tissue-to-lung interface and a dose decrease of 41% and 48% at the lung-to-soft-tissue interface for the 6 and 8 MV beams, respectively. In the mouse simulations, the magnetic fields had no measurable effect on the 225 or 350 kVp dose distribution. The dose increases with the Cs-137 beam for the 0.75, 1.5, and 3 T magnetic fields were 9%, 29%, and 42%, respectively. The dose decreases were 9%, 21%, and 37%. For the 2 MV beam, the dose increases were 16%, 33%, and 31% and the dose decreases were 9%, 19%, and 30%. For the Co-60 beam, the dose increases were 19%, 54%, and 44%, and the dose decreases were 19%, 42%, and 40%. Conclusions: The magnetic field dose effects in the mouse phantom using a Cs-137, 3 T combination or a Co-60, 1.5 or 3 T combination most closely resemble those in simulated human treatments with a 6 MV, 1.5 T MRI-Linac. The effects with a Co-60, 1.5 T combination most closely resemble those in simulated human treatments with an 8 MV, 1.5 T MRI-Linac.
Connerney, J. E. P.
2007-01-01
The chapter on Planetary Magnetism by Connerney describes the magnetic fields of the planets, from Mercury to Neptune, including the large satellites (Moon, Ganymede) that have or once had active dynamos. The chapter describes the spacecraft missions and observations that, along with select remote observations, form the basis of our knowledge of planetary magnetic fields. Connerney describes the methods of analysis used to characterize planetary magnetic fields, and the models used to represent the main field (due to dynamo action in the planet's interior) and/or remnant magnetic fields locked in the planet's crust, where appropriate. These observations provide valuable insights into dynamo generation of magnetic fields, the structure and composition of planetary interiors, and the evolution of planets.
International Nuclear Information System (INIS)
Guerra, Pedro; Ledesma-Carbayo, María J; Santos, Andrés; Udías, José M; Herranz, Elena; Herraiz, Joaquín L; Santos-Miranda, Juan Antonio; Calvo, Felipe A; Valdivieso, Manlio F; Rodríguez, Raúl; Illana, Carlos; Calama, Juan A; Pascau, Javier
2014-01-01
This work analysed the feasibility of using a fast, customized Monte Carlo (MC) method to perform accurate computation of dose distributions during pre- and intraplanning of intraoperative electron radiation therapy (IOERT) procedures. The MC method that was implemented, which has been integrated into a specific innovative simulation and planning tool, is able to simulate the fate of thousands of particles per second, and it was the aim of this work to determine the level of interactivity that could be achieved. The planning workflow enabled calibration of the imaging and treatment equipment, as well as manipulation of the surgical frame and insertion of the protection shields around the organs at risk and other beam modifiers. In this way, the multidisciplinary team involved in IOERT has all the tools necessary to perform complex MC dosage simulations adapted to their equipment in an efficient and transparent way. To assess the accuracy and reliability of this MC technique, dose distributions for a monoenergetic source were compared with those obtained using a general-purpose software package used widely in medical physics applications. Once accuracy of the underlying simulator was confirmed, a clinical accelerator was modelled and experimental measurements in water were conducted. A comparison was made with the output from the simulator to identify the conditions under which accurate dose estimations could be obtained in less than 3 min, which is the threshold imposed to allow for interactive use of the tool in treatment planning. Finally, a clinically relevant scenario, namely early-stage breast cancer treatment, was simulated with pre- and intraoperative volumes to verify that it was feasible to use the MC tool intraoperatively and to adjust dose delivery based on the simulation output, without compromising accuracy. The workflow provided a satisfactory model of the treatment head and the imaging system, enabling proper configuration of the treatment planning
Energy Technology Data Exchange (ETDEWEB)
Silva, Laura E. da; Nicolucci, Patricia, E-mail: laura.emilia.fm@gmail.com [Universidade de Sao Paulo (USP), Ribeirao Preto, SP (Brazil). Faculdade de Filosofia, Ciencias e Letras
2014-04-15
The development of nanotechnology has boosted the use of nanoparticles in radiation therapy in order to achieve a greater therapeutic ratio between tumour and healthy tissues. Gold has been shown to be most suitable for this task due to its high biocompatibility and high atomic number, which contribute to a better in vivo distribution and to local energy deposition. This study therefore proposes to evaluate the local dose deposition around a gold nanoparticle in the tumour cell. At a range of 11 nm from the nanoparticle surface, results showed an absorbed dose 141 times higher for the medium with the gold nanoparticle than for water, for an incident spectrum with maximum photon energy of 50 keV. It was also noted that when only scattered radiation interacted with the gold nanoparticle, the dose was 134 times higher than in water, an enhanced local dose that remained significant even for scattered radiation. (author)
Energy Technology Data Exchange (ETDEWEB)
Duch, M. A.; Zaragoza, F. J.; Sempau, J.; Ginjaume, M.; Vano, E.; Sanchez, R.; Fernandez, J. M.
2013-07-01
The study shows that MC simulation is a useful tool for assessing the spatial distribution of the dose due to radiation scattered in interventional radiology procedures, as well as for determining the influence of various operational parameters on that distribution, avoiding experimental measurements that would require lengthy use of the catheterization laboratories. (Author)
Energy Technology Data Exchange (ETDEWEB)
May, Matthias S.; Kuettner, Axel; Lell, Michael M.; Wuest, Wolfgang; Scharf, Michael; Uder, Michael [University of Erlangen, Department of Radiology, Erlangen (Germany); Deak, Paul; Kalender, Willi A. [University of Erlangen, Department of Medical Physics, Erlangen (Germany); Keller, Andrea K.; Haeberle, Lothar [University of Erlangen, Department of Medical Informatics, Biometry and Epidemiology, Erlangen (Germany); Achenbach, Stephan; Seltmann, Martin [University of Erlangen, Department of Cardiology, Erlangen (Germany)
2012-03-15
To evaluate radiation dose levels in patients undergoing spiral coronary computed tomography angiography (CTA) on a dual-source system in clinical routine. Coronary CTA was performed for 56 patients with electrocardiogram-triggered tube current modulation (TCM) and heart-rate (HR) dependent pitch adaptation. Individual Monte Carlo (MC) simulations were performed for dose assessment. Retrospective simulations with constant tube current (CTC) served as reference. Lung tissue was segmented and used for organ and effective dose (ED) calculation. The estimated mean relative ED was 7.1 ± 2.1 mSv/100 mAs for TCM and 12.5 ± 5.3 mSv/100 mAs for CTC (P < 0.001). Relative dose reduction was highest at low HR (≤60 bpm; 49 ± 5%) compared with intermediate (60-70 bpm; 33 ± 12%) and high (>70 bpm; 29 ± 12%) HR. However, the lowest ED is achieved at high HR (5.2 ± 1.5 mSv/100 mAs), compared with intermediate (6.7 ± 1.6 mSv/100 mAs) and low (8.3 ± 2.1 mSv/100 mAs) HR when automated pitch adaptation is applied. Radiation dose savings of up to 52% are achievable by TCM at low and regular HR. However, the lowest ED is attained at high HR by pitch adaptation, despite the inferior radiation dose reduction by TCM there. Monte Carlo simulations allow for individual radiation dose calculations. (orig.)
International Nuclear Information System (INIS)
Anlauf, H.; Manakos, P.; Ohl, T.; Dahmen, H.D.; Mannel, T.
1991-09-01
We present the Monte Carlo event generator KRONOS for deep inelastic lepton-hadron scattering at HERA. KRONOS focuses on the description of electromagnetic corrections beyond the existing fixed-order calculations. (orig.)
Rodriguez, M.; Brualla, L.
2018-04-01
Monte Carlo simulation of radiation transport is computationally demanding if reasonably low statistical uncertainties of the estimated quantities are to be obtained. Therefore, it can benefit to a large extent from high-performance computing. This work is aimed at assessing the performance of the first generation of the many-integrated-core (MIC) Xeon Phi coprocessor with respect to that of a CPU consisting of a double 12-core Xeon processor in Monte Carlo simulation of coupled electron-photon showers. The comparison was made twofold: first, through a suite of basic tests including parallel versions of the random number generators Mersenne Twister and a modified implementation of RANECU; these tests were intended to establish a baseline comparison between both devices. Secondly, through the pDPM code developed in this work. pDPM is a parallel version of the Dose Planning Method (DPM) program for fast Monte Carlo simulation of radiation transport in voxelized geometries. A variety of techniques intended to obtain large scalability on the Xeon Phi were implemented in pDPM. Maximum scalabilities of 84.2× and 107.5× were obtained on the Xeon Phi for simulations of electron and photon beams, respectively. Nevertheless, in none of the tests involving radiation transport did the Xeon Phi perform better than the CPU. The disadvantage of the Xeon Phi with respect to the CPU owes to the low performance of its single core: a single core of the Xeon Phi was more than 10 times less efficient than a single core of the CPU for all radiation transport simulations.
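The abstract does not specify the modified RANECU implementation, but the classic RANECU generator (L'Ecuyer's combined multiplicative congruential generator) is compact enough to sketch. The constants below are the published ones; the class layout and default seeds are illustrative, and the per-thread sequence splitting a parallel code like pDPM would need is not shown.

```python
class Ranecu:
    """Minimal RANECU-style combined LCG (L'Ecuyer 1988 constants).

    In a parallel Monte Carlo code, each worker would receive its own
    (seed1, seed2) pair or a split subsequence; that logic is omitted here.
    """
    M1, A1 = 2147483563, 40014
    M2, A2 = 2147483399, 40692

    def __init__(self, seed1=12345, seed2=67890):
        # Seeds must be nonzero after reduction modulo each modulus
        self.s1 = seed1 % self.M1 or 1
        self.s2 = seed2 % self.M2 or 1

    def random(self):
        # Advance both multiplicative congruential streams
        self.s1 = (self.A1 * self.s1) % self.M1
        self.s2 = (self.A2 * self.s2) % self.M2
        # Combine the streams; map 0 to the top of the range
        z = (self.s1 - self.s2) % (self.M1 - 1)
        if z == 0:
            z = self.M1 - 1
        return z / self.M1  # uniform in (0, 1)

rng = Ranecu()
u = rng.random()
assert 0.0 < u < 1.0
```

Python's arbitrary-precision integers make the textbook form with Schrage's trick unnecessary; a C or Fortran port would need it to avoid 32-bit overflow.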
Reddell, Brandon
2015-01-01
Designing hardware to operate in the space radiation environment is a very difficult and costly activity. Ground-based particle accelerators can be used to test for exposure to the radiation environment, one species at a time; however, the actual space environment cannot be duplicated because of the range of energies and the isotropic nature of space radiation. The FLUKA Monte Carlo code is an integrated physics package based at CERN that has been under development for the last 40+ years and includes the most up-to-date fundamental physics theory and particle physics data. This work presents an overview of FLUKA and how it has been used in conjunction with ground-based radiation testing for NASA to improve our understanding of secondary particle environments resulting from the interaction of space radiation with matter.
Energy Technology Data Exchange (ETDEWEB)
Cramer, S.N.
1984-01-01
The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.
International Nuclear Information System (INIS)
Russell, C.T.
1980-01-01
Planetary spacecraft have now probed the magnetic fields of all the terrestrial planets, the Moon, Jupiter, and Saturn. These measurements reveal that dynamos are active in at least four of the planets, Mercury, the Earth, Jupiter, and Saturn, but that Venus and Mars appear to have at most only very weak planetary magnetic fields. The Moon may have once possessed an internal dynamo, for the surface rocks are magnetized. The large satellites of the outer solar system are candidates for dynamo action in addition to the large planets themselves. Of these satellites, the one most likely to generate its own internal magnetic field is Io.
Park, Jong Min; Kim, Kyubo; In Park, Jong; Shin, Kyung Hwan; Jin, Ung Sik; Kim, Jung-in
2017-06-01
To investigate the dosimetric effect of the internal metallic port (IMP) in a tissue expander (TE) on the dose distribution of postmastectomy radiation therapy (PMRT). A total of 10 patients who have received PMRT with a TE were selected retrospectively. For each patient, the dose distributions of treatment plans with a 10 MV photon beam were calculated using the Monte Carlo (MC) method with CT images. The dose distributions without the TE were also calculated by designating the mass densities of the TE including the IMP as those of tissue. From the MC calculations, the dose-volumetric parameters were calculated and analyzed for several structures: the planning target volume (PTV) including the TE, the PTV excluding the TE (PTVreal), the TE alone, heart, and lungs. For the PTV and PTVreal, dose-volumetric parameters did not appear to depend on the IMP. Within the TE volume, the maximum dose and D 1% were higher with the IMP than without the IMP (62.8 ± 1.4 Gy versus 57.9 ± 1.3 Gy with p < 0.001 and 58.6 ± 1.6 Gy versus 57.0 ± 1.2 Gy with p = 0.035). The values of V 100% and V 95% were lower with the IMP than without the IMP (77.9% ± 7.6% versus 87.2% ± 5.3% with p = 0.008 and 89.5% ± 5.6% versus 94.6% ± 2.9% with p = 0.027). The IMP did not affect dose-volumetric parameters of heart and lungs. Dosimetric changes due to the IMP occurred mainly within the TE, and not in the target volume, heart, and lungs.
MO-E-18C-02: Hands-On Monte Carlo Project Assignment as a Method to Teach Radiation Physics
International Nuclear Information System (INIS)
Pater, P; Vallieres, M; Seuntjens, J
2014-01-01
Purpose: To present a hands-on project on Monte Carlo methods (MC) recently added to the curriculum and to discuss the students' appreciation. Methods: Since 2012, a 1.5 hour lecture dedicated to MC fundamentals follows the detailed presentation of photon and electron interactions. Students also program all sampling steps (interaction length and type, scattering angle, energy deposit) of a MC photon transport code. A handout structured in a step-by-step fashion guides students in conducting consistency checks. For extra points, students can code a fully working MC simulation that simulates a dose distribution for 50 keV photons. A kerma approximation to dose deposition is assumed. A survey was conducted to which 10 out of the 14 attending students responded. It compared MC knowledge prior to and after the project, questioned the usefulness of radiation physics teaching through MC and surveyed possible project improvements. Results: According to the survey, 76% of students had no or a basic knowledge of MC methods before the class and 65% estimate to have a good to very good understanding of MC methods after attending the class. 80% of students feel that the MC project helped them significantly to understand simulations of dose distributions. On average, students dedicated 12.5 hours to the project and appreciated the balance between hand-holding and questions/implications. Conclusion: A lecture on MC methods with a hands-on MC programming project requiring about 14 hours was added to the graduate study curriculum since 2012. MC methods produce “gold standard” dose distributions and slowly enter routine clinical work, and a fundamental understanding of MC methods should be a requirement for future students. Overall, the lecture and project helped students relate cross-sections to dose depositions and presented the numerical sampling methods behind the simulation of these dose distributions. Research funding from governments of Canada and Quebec. PP acknowledges
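The sampling steps such a student project implements (interaction length, interaction type, scattering angle) can be sketched in a few lines. The cross-section values below are placeholders, not the tabulated data a real exercise would use, and the polar-angle sampling is a stand-in for a proper Klein-Nishina sampler.

```python
import math
import random

# Illustrative (made-up) macroscopic cross-sections in 1/cm for ~50 keV
# photons; a real project would interpolate tabulated data (e.g. NIST).
MU_PHOTO = 0.05    # photoelectric absorption
MU_COMPTON = 0.20  # Compton scattering
MU_TOTAL = MU_PHOTO + MU_COMPTON

def sample_interaction(rng=random.random):
    """One MC step: free path, interaction type, and scattering angle."""
    # Interaction length: inverse-CDF sampling of the exponential
    # free-path distribution p(x) = MU_TOTAL * exp(-MU_TOTAL * x)
    path = -math.log(rng()) / MU_TOTAL
    # Interaction type: chosen in proportion to the partial cross-sections
    kind = "photo" if rng() < MU_PHOTO / MU_TOTAL else "compton"
    # Placeholder polar angle: uniform in cos(theta); a full project would
    # use rejection sampling from the Klein-Nishina differential cross-section
    cos_theta = 2.0 * rng() - 1.0
    return path, kind, cos_theta

path, kind, cos_theta = sample_interaction()
assert path > 0 and kind in ("photo", "compton") and -1.0 <= cos_theta <= 1.0
```

Averaged over many histories, the sampled free path converges to the mean free path 1/MU_TOTAL, which is the usual first consistency check in such a handout.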
Argento, D.; Reedy, R. C.; Stone, J.
2010-12-01
Cosmogenic Nuclides (CNs) are a critical new tool for geomorphology, allowing researchers to date Earth surface events and measure process rates [1]. Prior to CNs, many of these events and processes had no absolute method for measurement and relied entirely on relative methods [2]. Continued improvements in CN methods are necessary for expanding analytic capability in geomorphology. In the last two decades, significant progress has been made in refining these methods and reducing analytic uncertainties [1,3]. Calibration data and scaling methods are being developed to provide a self-consistent platform for use in interpreting nuclide concentration values into geologic data [4]. However, nuclide-dependent scaling has been difficult to address due to analytic uncertainty and sparseness in altitude transects. Artificial target experiments are underway, but these experiments take considerable time for nuclide buildup at lower altitudes. In this study, the Monte Carlo radiation transport code MCNPX is used to model the galactic cosmic-ray radiation impinging on the upper atmosphere and track the resulting secondary particles through a model of the Earth's atmosphere and lithosphere. To address the issue of nuclide-dependent scaling, the neutron flux values determined by the MCNPX simulation are folded in with estimated cross-section values [5,6]. Preliminary calculations indicate that scaling of nuclide production potential in free air seems to be a function of both altitude and nuclide production pathway. At 0 g/cm2 (sea level) all neutron spallation pathways have attenuation lengths within 1% of 130 g/cm2. However, the differences in attenuation length are exacerbated with increasing altitude. At 530 g/cm2 atmospheric height (~5,500 m), the apparent attenuation lengths for aggregate SiO2(n,x)10Be, aggregate SiO2(n,x)14C and K(n,x)36Cl become 149.5 g/cm2, 151 g/cm2 and 148 g/cm2 respectively. At 700 g/cm2 atmospheric height (~8,400 m - close to the highest
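The attenuation lengths quoted above come from fitting the exponential falloff of secondary-particle flux with atmospheric depth. Under the single-exponential assumption f(x) ∝ exp(-x/Λ), two depths suffice to recover an apparent attenuation length; the flux numbers below are hypothetical, chosen only to round-trip a 130 g/cm² length.

```python
import math

def apparent_attenuation_length(x1, f1, x2, f2):
    """Apparent attenuation length Lambda (g/cm^2) from fluxes f1, f2
    at atmospheric depths x1 < x2, assuming f(x) ~ exp(-x / Lambda)."""
    return (x2 - x1) / math.log(f1 / f2)

# Hypothetical fluxes consistent with Lambda = 130 g/cm^2 between a
# high-altitude site (depth 533 g/cm^2) and sea level (depth 1033 g/cm^2)
f_alt = 1.0
f_sea = f_alt * math.exp(-(1033.0 - 533.0) / 130.0)
lam = apparent_attenuation_length(533.0, f_alt, 1033.0, f_sea)
print(round(lam, 1))  # → 130.0
```

The pathway dependence the study reports would show up here as different Λ values when the same two-depth fit is applied to the production-rate profile of each nuclide.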
Cooper, M A
2000-01-01
We present various approximations for the angular distribution of particles emerging from an optically thick, purely isotropically scattering region into a vacuum. Our motivation is to use such a distribution for the Fleck-Canfield random walk method [1] for implicit Monte Carlo (IMC) [2] radiation transport problems. We demonstrate that the cosine distribution recommended in the original random walk paper [1] is a poor approximation to the angular distribution predicted by transport theory. Then we examine other approximations that more closely match the transport angular distribution.
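For reference, the cosine (Lambertian) emergent-angle distribution that the original random walk paper recommends, p(μ) = 2μ on [0, 1], has the closed-form inverse-CDF sampler μ = √ξ; the transport-theory distribution the authors prefer has no such simple form. A minimal sketch:

```python
import math
import random

def sample_cosine_mu(rng=random.random):
    """Sample mu = cos(theta) from the cosine law p(mu) = 2*mu on [0, 1]
    by inverting its CDF, F(mu) = mu**2."""
    return math.sqrt(rng())

random.seed(0)
mus = [sample_cosine_mu() for _ in range(20000)]
mean = sum(mus) / len(mus)
# The exact mean of p(mu) = 2*mu is 2/3; the sample mean should be close
assert abs(mean - 2.0 / 3.0) < 0.01
```

Comparing histograms of such samples against the transport-theory prediction is exactly the kind of check the abstract describes.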
International Nuclear Information System (INIS)
Dinh Nhu Thao
2008-01-01
We have applied a self-consistent ensemble Monte Carlo simulation procedure using an extended valley model to consider the THz radiation from GaAs p-i-n diodes under high electric fields. The present calculation has shown an important improvement of the numerical results when using this model instead of the usual valley model. This demonstrates the importance of the full band structure in the simulation of processes in semiconductors, especially under the influence of high electric fields. (author)
International Nuclear Information System (INIS)
Nikolopoulos, Dimitrios; Kandarakis, Ioannis; Tsantilas, Xenophon; Valais, Ioannis; Cavouras, Dionisios; Louizi, Anna
2006-01-01
The radiation detection efficiency of four scintillators employed, or designed to be employed, in positron emission imaging (PET) was evaluated as a function of the crystal thickness by applying Monte Carlo methods. The scintillators studied were Lu2SiO5 (LSO), LuAlO3 (LuAP), Gd2SiO5 (GSO) and YAlO3 (YAP). Crystal thicknesses ranged from 0 to 50 mm. The study was performed via a previously generated photon transport Monte Carlo code. All photon track and energy histories were recorded and the energy transferred or absorbed in the scintillator medium was calculated together with the energy redistributed and retransported as secondary characteristic fluorescence radiation. Various parameters were calculated, e.g. the fraction of the incident photon energy absorbed, transmitted or redistributed as fluorescence radiation, the scatter to primary ratio, the photon and energy distribution within each scintillator block etc. As being most significant, the fraction of the incident photon energy absorbed was found to increase with increasing crystal thickness, tending to form a plateau above the 30 mm thickness. For LSO, LuAP, GSO and YAP scintillators, respectively, this fraction had the value of 44.8, 36.9 and 45.7% at the 10 mm thickness and 96.4, 93.2 and 96.9% at the 50 mm thickness. Within the plateau area approximately (57-59)%, (59-63)%, (52-63)% and (58-61)% of this fraction was due to scattered and reabsorbed radiation for the LSO, GSO, YAP and LuAP scintillators, respectively. In all cases, a negligible fraction (<0.1%) of the absorbed energy was found to escape the crystal as fluorescence radiation.
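To first order, the growth of the absorbed fraction with crystal thickness, and its plateau, follow a Beer-Lambert law. The attenuation coefficient below is a made-up illustration, and the scatter and fluorescence redistribution that the full Monte Carlo study quantifies is ignored.

```python
import math

def interacting_fraction(mu, t):
    """Beer-Lambert fraction of incident photons interacting in a crystal
    of thickness t (mm) with linear attenuation coefficient mu (1/mm).
    Ignores the scattered/fluorescence redistribution tracked by the MC."""
    return 1.0 - math.exp(-mu * t)

mu = 0.06  # hypothetical coefficient, for illustration only
for t in (10, 30, 50):
    print(t, round(interacting_fraction(mu, t), 3))
```

The saturating shape reproduces the qualitative plateau above ~30 mm; the study's point is that a substantial share of the energy deposited there arrives via scattered, re-absorbed photons, which this single-interaction picture cannot capture.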
Statistical-likelihood Exo-Planetary Habitability Index (SEPHI)
Rodríguez-Mozos, J. M.; Moya, A.
2017-11-01
A new index, the Statistical-likelihood Exo-Planetary Habitability Index (SEPHI), is presented. It has been developed to cover the current and future features required for a classification scheme disentangling whether any exoplanet discovered is potentially habitable compared with life on Earth. SEPHI uses likelihood functions to estimate the habitability potential. It is defined as the geometric mean of four sub-indexes related to four comparison criteria: Is the planet telluric? Does it have an atmosphere dense enough and a gravity compatible with life? Does it have liquid water on its surface? Does it have a magnetic field shielding its surface from harmful radiation and stellar winds? SEPHI can be estimated with only seven physical characteristics: planetary mass, planetary radius, planetary orbital period, stellar mass, stellar radius, stellar effective temperature and planetary system age. We have applied SEPHI to all the planets in the Exoplanet Encyclopaedia using a Monte Carlo method. Kepler-1229b, Kepler-186f and Kepler-442b have the largest SEPHI values assuming certain physical descriptions. Kepler-1229b is the most unexpected planet in this privileged position since no previous study pointed to this planet as a potentially interesting and habitable one. In addition, most of the tidally locked Earth-like planets present a weak magnetic field, incompatible with habitability potential. We must stress that our results are linked to the physics used in this study. Any change in the physics used implies only an updating of the likelihood functions. We have developed a web application allowing the online estimation of SEPHI (http://sephi.azurewebsites.net/).
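The definition above, a geometric mean of four likelihood sub-indexes, is straightforward to sketch. The function name and the example values are illustrative only; the paper's actual likelihood functions for each criterion are not reproduced here.

```python
def sephi(telluric, atmosphere_gravity, liquid_water, magnetic_field):
    """SEPHI as the geometric mean of four likelihood sub-indexes in [0, 1]:
    telluric composition, atmosphere/gravity, surface liquid water, and
    magnetic shielding."""
    subs = (telluric, atmosphere_gravity, liquid_water, magnetic_field)
    if any(not 0.0 <= s <= 1.0 for s in subs):
        raise ValueError("sub-indexes must be likelihoods in [0, 1]")
    prod = 1.0
    for s in subs:
        prod *= s
    return prod ** 0.25

# A single zero sub-index (e.g. no magnetic shielding, as for many
# tidally locked planets) zeroes the whole index
assert sephi(0.9, 0.8, 0.7, 0.0) == 0.0
```

The geometric mean is the natural choice here: unlike an arithmetic mean, it cannot be rescued by three high scores when one criterion rules habitability out entirely.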
Tiscareno, Matthew S.
Planetary rings are the only nearby astrophysical disks and the only disks that have been investigated by spacecraft (especially the Cassini spacecraft orbiting Saturn). Although there are significant differences between rings and other disks, chiefly the large planet/ring mass ratio that greatly enhances the flatness of rings (aspect ratios as small as 10^-7), understanding of disks in general can be enhanced by understanding the dynamical processes observed at close range and in real time in planetary rings. We review the known ring systems of the four giant planets, as well as the prospects for ring systems yet to be discovered. We then review planetary rings by type. The A, B, and C rings of Saturn, plus the Cassini Division, comprise our solar system's only dense broad disk and host many phenomena of general application to disks including spiral waves, gap formation, self-gravity wakes, viscous overstability and normal modes, impact clouds, and orbital evolution of embedded moons. Dense narrow rings are found both at Uranus (where they comprise the main rings entirely) and at Saturn (where they are embedded in the broad disk) and are the primary natural laboratory for understanding shepherding and self-stability. Narrow dusty rings, likely generated by embedded source bodies, are surprisingly found to sport azimuthally confined arcs at Neptune, Saturn, and Jupiter. Finally, every known ring system includes a substantial component of diffuse dusty rings. Planetary rings have shown themselves to be useful as detectors of planetary processes around them, including the planetary magnetic field and interplanetary impactors as well as the gravity of nearby perturbing moons. Experimental rings science has made great progress in recent decades, especially numerical simulations of self-gravity wakes and other processes but also laboratory investigations of coefficient of restitution and spectroscopic ground truth. The age of self-sustained ring systems is a matter of
International Nuclear Information System (INIS)
Yeh, C.Y.; Lee, C.C.; Chao, T.C.; Lin, M.H.; Lai, P.A.; Liu, F.H.; Tung, C.J.
2014-01-01
This study aims to utilize a measurement-based Monte Carlo (MBMC) method to evaluate the accuracy of dose distributions calculated using the Eclipse radiotherapy treatment planning system (TPS) based on the anisotropic analytical algorithm. Dose distributions were calculated for the nasopharyngeal carcinoma (NPC) patients treated with the intensity modulated radiotherapy (IMRT). Ten NPC IMRT plans were evaluated by comparing their dose distributions with those obtained from the in-house MBMC programs for the same CT images and beam geometry. To reconstruct the fluence distribution of the IMRT field, an efficiency map was obtained by dividing the energy fluence of the intensity modulated field by that of the open field, both acquired from an aS1000 electronic portal imaging device. The integrated image of the non-gated mode was used to acquire the full dose distribution delivered during the IMRT treatment. This efficiency map redistributed the particle weightings of the open field phase-space file for IMRT applications. Dose differences were observed in the tumor and air cavity boundary. The mean difference between MBMC and TPS in terms of the planning target volume coverage was 0.6% (range: 0.0–2.3%). The mean difference for the conformity index was 0.01 (range: 0.0–0.01). In conclusion, the MBMC method serves as an independent IMRT dose verification tool in a clinical setting. - Highlights: ► The patient-based Monte Carlo method serves as a reference standard to verify IMRT doses. ► 3D Dose distributions for NPC patients have been verified by the Monte Carlo method. ► Doses predicted by the Monte Carlo method matched closely with those by the TPS. ► The Monte Carlo method predicted a higher mean dose to the middle ears than the TPS. ► Critical organ doses should be confirmed to avoid overdose to normal organs
International Nuclear Information System (INIS)
Johnson, J.O.
2000-01-01
The Department of Energy (DOE) has given the Spallation Neutron Source (SNS) project approval to begin Title I design of the proposed facility to be built at Oak Ridge National Laboratory (ORNL), and construction is scheduled to commence in FY01. The SNS initially will consist of an accelerator system capable of delivering an ∼0.5 microsecond pulse of 1 GeV protons, at a 60 Hz frequency, with 1 MW of beam power, into a single target station. The SNS will eventually be upgraded to a 2 MW facility with two target stations (a 60 Hz station and a 10 Hz station). The radiation transport analysis, which includes the neutronic, shielding, activation, and safety analyses, is critical to the design of an intense high-energy accelerator facility like the proposed SNS, and the Monte Carlo method is the cornerstone of the radiation transport analyses
How Extreme is TRAPPIST-1? A look into the planetary system’s extreme-UV radiation environment
Peacock, Sarah; Barman, Travis; Shkolnik, Evgenya L.
2018-01-01
The ultracool dwarf star TRAPPIST-1 hosts three earth-sized planets at orbital distances where water has the potential to exist in liquid form on the planets’ surface. Close-in exoplanets, such as these, become vulnerable to water loss as stellar XUV radiation heats and expands their upper atmospheres. Currently, little is known about the high-energy radiation environment around TRAPPIST-1. Recent efforts to quantify the XUV radiation rely on empirical relationships based on X-ray or Lyman alpha line observations and yield very different results. The scaling relations used between the X-ray and EUV emission result in high-energy irradiation of the planets 10-1000x greater than present day Earth, stripping atmospheres and oceans in 1 Gyr, while EUV estimated from Lyman alpha flux is much lower. Here we present upper-atmosphere PHOENIX models representing the minimum and maximum potential EUV stellar flux from TRAPPIST-1. We use GALEX FUV and NUV photometry for similar aged M stars to determine the UV flux extrema in an effort to better constrain the high-energy radiation environment around TRAPPIST-1.
International Nuclear Information System (INIS)
Slavik, O.; Kucharova, D.; Listjak, M.; Fueloep, M.
2008-01-01
The aim of this paper is to evaluate the maximal dose rate (DR) of gamma radiation above different configurations of reservoirs holding spent nuclear fuel with a cooling period of 1.8 years, to compare the buildup factor method (Visiplan) with Monte Carlo simulations, and to assess the influence of scattered photons in the calculation for a fully filled fuel transfer storage (FTS). The calculations showed that the relative contributions of photons from adjacent reservoirs obtained with the buildup factor method (Visiplan) are similar to those from Monte Carlo simulations, meaning that Visiplan can also be used to evaluate dose-rate contributions from neighbouring reservoirs. For this radiation source and shielding thickness, the DR calculated by Visiplan is conservatively overestimated by a factor of approximately 2.6-3. The calculations also showed that storing reservoirs with a 1.8-year cooling period in the FTS requires no additional protective measures for workers beyond the primary safety report. The calculated DR above a fully filled FTS in Jaslovske Bohunice is very low, at the level of 0.03 μSv/h. (authors)
Murdin, P.
2000-11-01
Carl Sagan, Bruce Murray and Louis Friedman founded the non-profit Planetary Society in 1979 to advance the exploration of the solar system and to continue the search for extraterrestrial life. The Society has its headquarters in Pasadena, California, but is international in scope, with 100 000 members worldwide, making it the largest space interest group in the world. The Society funds a var...
International Nuclear Information System (INIS)
Lal, D.; Rao, M.N.
1986-01-01
Salient features of the atmospheres of Venus and Mars are described and compared with those of the Earth. Their temperature profiles are given. Degassing of planetary interiors by volcanic and plate-tectonic processes is described. Noble gas abundances in the atmospheres of these planets are compared. Information provided by the Pioneer and Venera space probes and by the Viking landers on Mars is reviewed. (B.G.W.)
Pollack, James B.; Sagan, Carl
1991-01-01
Assuming commercial fusion power, heavy lift vehicles and major advances in genetic engineering, the authors survey possible late-21st century methods of working major transformations in planetary environments. Much more Earthlike climates may be produced on Mars by generating low freezing point greenhouse gases from indigenous materials; on Venus by biological conversion of CO2 to graphite, by canceling the greenhouse effect with high-altitude absorbing fine particles, or by a sunshield at the first Lagrangian point; and on Titan by greenhouses and/or fusion warming. However, in our present state of ignorance we cannot guarantee a stable endstate or exclude unanticipated climatic feedbacks or other unintended consequences. Moreover, as the authors illustrate by several examples, many conceivable modes of planetary engineering are so wasteful of scarce solar system resources and so destructive of important scientific information as to raise profound ethical issues, even if they were economically feasible, which they are not. Global warming on Earth may lead to calls for mitigation by planetary engineering, e.g., emplacement and replenishment of anti-greenhouse layers at high altitudes, or sunshields in space. But here especially we must be concerned about precision, stability, and inadvertent side-effects. The safest and most cost-effective means of countering global warming - beyond, e.g., improved energy efficiency, CFC bans and alternative energy sources - is the continuing reforestation of approximately 2 × 10^7 sq km of the Earth's surface. This can be accomplished with present technology and probably at the least cost.
Brooks, Shawn M.; Spilker, L.; Edgington, S. G.; Déau, E.; Pilorz, S. H.
2012-10-01
Since arriving at Saturn in 2004, Cassini's Composite Infrared Spectrometer has recorded tens of millions of spectra of Saturn’s rings (personal communication, M. Segura). CIRS records far infrared radiation (16.7-1000 microns) at focal plane 1 (FP1). Thermal emission from Saturn’s rings peaks at FP1 wavelengths. CIRS spectra are well characterized as blackbody emission at an effective temperature Te, multiplied by a scalar factor related to ring emissivity (Spilker et al. [2005, 2006]). CIRS can therefore characterize the rings' temperature and study the thermal environment to which the ring particles are subject. We focus on CIRS data from the 2009 Saturnian equinox. As the Sun's disk crossed the ring plane, CIRS obtained several radial scans of the rings at a variety of phase angles, local hour angles and distances. With the Sun's rays striking the rings at an incidence angle of zero, solar heating is virtually absent, and thermal radiation from Saturn and sunlight reflected by Saturn dominate the thermal environment. These observations present an apparent paradox. Equinox data show that the flux of thermal energy radiated by the rings is roughly equivalent to or even exceeds the energy incident upon them as prescribed by thermal models (Froidevaux [1981], Ferrari and Leyrat [2006], Morishima et al. [2009, 2010]). This apparent energy excess is largest in the C ring and Cassini Division. Conservation principles suggest that models underestimate heating of the rings, as it is clearly unphysical for the rings to radiate significantly more energy than is incident upon them. In this presentation, we will attempt to resolve this paradox and determine what this can teach us about Saturn's rings. This research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under contract with NASA. Copyright 2012 California Institute of Technology. Government sponsorship acknowledged.
DEFF Research Database (Denmark)
Nathan, R.P.; Thomas, P.J.; Jain, M.
2003-01-01
D-e distributions, and it is important to characterise this effect, both to ensure that dose distributions are not misinterpreted and that an accurate beta dose rate is employed in dating calculations. In this study, we make a first attempt at providing a description of potential problems in heterogeneous environments … and identify the likely size of these effects on D-e distributions. The study employs the MCNP 4C Monte Carlo electron/photon transport model, supported by an experimental validation of the code in several case studies. We find good agreement between the experimental measurements and the Monte Carlo … simulations. It is concluded that the effect of beta heterogeneity in complex environments for luminescence dating is twofold: (i) the infinite matrix dose rate is not universally applicable; its accuracy depends on the scale of the heterogeneity, and (ii) the interpretation of D-e distributions is complex…
Quilligan, G.; DuMonthier, J.; Aslam, S.; Lakew, B.; Kleyner, I.; Katz, R.
2015-01-01
Thermal radiometers such as those proposed for the Europa Clipper flyby mission require low-noise signal processing for thermal imaging, with immunity to Total Ionizing Dose (TID) and Single Event Latchup (SEL). Described is a second-generation Multi-Channel Digitizer (MCD2G) Application Specific Integrated Circuit (ASIC) that accurately digitizes up to 40 thermopile pixels, with TID immunity greater than 50 Mrad(Si) and SEL immunity up to 174 MeV·cm²/mg. The MCD2G ASIC uses Radiation Hardened By Design (RHBD) techniques in a 180 nm CMOS process node.
Bublitz, Jesse; Kastner, Joel H.; Santander-García, Miguel; Montez, Rodolfo; Alcolea, Javier; Balick, Bruce; Bujarrabal, Valentín
2018-01-01
We report the results of a survey of mm-wave molecular line emission from nine nearby (…) planetary nebulae; the molecular line frequencies were chosen to investigate the molecular chemistry of these nebulae. New detections of one or more of five molecules -- the molecular mass tracer 13CO and the chemically important trace species HCO+, CN, HCN, and HNC -- were made in at least one PN. We present analysis of emission line flux ratios that are potential diagnostics of the influence that ultraviolet and X-ray radiation have on the chemistry of residual molecular gas in PNe.
Gasselt, Stephan
2018-01-01
This book provides an up-to-date interdisciplinary geoscience-focused overview of solid solar system bodies and their evolution, based on the comparative description of processes acting on them. Planetary research today is a strongly multidisciplinary endeavor with efforts coming from engineering and natural sciences. Key focal areas of study are the solid surfaces found in our Solar System. Some have a direct interaction with the interplanetary medium and others have dynamic atmospheres. In any of those cases, the geological records of those surfaces (and sub-surfaces) are key to understanding the Solar System as a whole: its evolution and the planetary perspective of our own planet. This book has a modular structure and is divided into 4 sections comprising 15 chapters in total. Each section builds upon the previous one but is also self-standing. The sections are: Methods and tools Processes and Sources Integration and Geological Syntheses Frontiers The latter covers the far-reaching broad topics of exo...
Infantino, Angelo
2017-01-01
The present Accelerator Note is a follow-up of the previous report CERN-ACC-NOTE-2016-12345. In the present work, the FLUKA Monte Carlo model of CERN’s CHARM facility has been improved to the most up-to-date configuration of the facility, including: new test positions, a global refinement of the FLUKA geometry, a careful review of the transport and physics parameters. Several configurations of the facility, in terms of target material and movable shielding configuration, have been simulated. The full set of results is reported in the following and can act as a reference guide to any potential user of the facility.
Energy Technology Data Exchange (ETDEWEB)
Hernandes, Antonio Carlos
2002-07-01
Boron Neutron Capture Therapy (BNCT) is a selective cancer treatment that arises as an alternative therapy when the usual techniques - surgery, chemotherapy or radiotherapy - show no satisfactory results. The main purpose of this work is to design a facility for BNCT studies. This facility relies on the use of an AmBe neutron source and on a set of moderators, filters and shielding that provide the best neutron/gamma beam characteristics for these BNCT studies, i.e., high-intensity thermal and/or epithermal neutron fluxes with the minimum feasible gamma-ray and fast-neutron contamination. A computational model of the experiment was used to obtain the radiation field at the sample irradiation position. The calculations were performed with the MCNP 4B Monte Carlo code and the results can be regarded as satisfactory, i.e., a thermal neutron fluence N(T) = 1.35×10^8 n/cm², a fast-neutron dose of 5.86×10^-10 Gy/N(T) and a gamma-ray dose of 8.30×10^-14 Gy/N(T). (author)
Heinrich, Josué Miguel; Niizawa, Ignacio; Botta, Fausto Adrián; Trombert, Alejandro Raúl; Irazoqui, Horacio Antonio
2012-01-01
In a previous study, we developed a methodology to assess the intrinsic optical properties governing the radiation field in algae suspensions. With these properties at our disposal, a Monte Carlo simulation program is developed and used in this study as a predictive autonomous program applied to the simulation of experiments that reproduce the common illumination conditions found in large-scale production of microalgae, especially in open ponds such as raceway ponds. The simulation module is validated by comparing the results of experimental measurements made on an artificially illuminated algal suspension with those predicted by the Monte Carlo program. This experiment deals with a situation that resembles that of an open pond or a raceway pond, except that, for convenience, the experimental arrangement appears as if those reactors were turned upside down. It serves the purpose of assessing to what extent scattering phenomena are important for the prediction of the spatial distribution of the radiant energy density. The simulation module developed can be applied to compute the local energy density inside photobioreactors, with the goal of optimizing their design and operating conditions. © 2012 Wiley Periodicals, Inc. Photochemistry and Photobiology © 2012 The American Society of Photobiology.
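The core of a Monte Carlo photon-transport program of the kind described can be reduced to a few lines for a one-dimensional slab with isotropic scattering. This is a deliberately minimal sketch under assumed optical properties, not the authors' validated code (which uses measured phase functions and full 3D geometry):

```python
import math
import random

def transmitted_fraction(n_photons, mu_s, mu_a, thickness, seed=1):
    """Minimal Monte Carlo estimate of the fraction of photons crossing
    a 1D slab suspension with isotropic scattering.
    mu_s, mu_a: scattering/absorption coefficients (1/cm), assumed values."""
    rng = random.Random(seed)
    mu_t = mu_s + mu_a                 # total interaction coefficient
    albedo = mu_s / mu_t               # survival probability per collision
    transmitted = 0
    for _ in range(n_photons):
        z, cos_th = 0.0, 1.0           # photon enters normally at z = 0
        while True:
            z += cos_th * (-math.log(rng.random()) / mu_t)   # free flight
            if z >= thickness:
                transmitted += 1       # escaped through the far face
                break
            if z < 0.0:
                break                  # backscattered out of the slab
            if rng.random() > albedo:
                break                  # absorbed at this collision
            cos_th = 2.0 * rng.random() - 1.0   # isotropic re-scatter
    return transmitted / n_photons
```

For a purely absorbing slab (mu_s = 0) the estimate converges to the Beer-Lambert value exp(-mu_a · thickness), which makes a convenient self-check before adding scattering.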
International Nuclear Information System (INIS)
Santana Leitner, Mario; Fasso, Alberto; Fisher, Alan S.; Nuhn, Heinz D.; Dooling, Jeffrey C.; Berg, William; Yang, Bin. X.
2010-01-01
In 2009 the Linac Coherent Light Source (LCLS) at the SLAC National Accelerator Laboratory started free electron laser (FEL) operation. In order to continue to produce the bright and short-pulsed x-ray laser demanded by FEL scientists, this pioneering hard x-ray FEL requires a precisely tailored magnetic field in the undulators, so that the photons generated along the electrons' wiggling path interact at the right phase with the electron beam. In such a precise system, small (>0.01%) radiation-induced alterations of the magnetic field in the permanent magnets could affect FEL performance. This paper describes the simulation studies of radiation fields in permanent magnets and the expected signal in the detectors. The transport of particles from the radiation sources (i.e. diagnostic insert) to the undulator magnets and to the beam loss monitors (BLM) was simulated with the intranuclear-cascade codes FLUKA and MARS15. In order to accurately reproduce the optics of LCLS, lattice capabilities and magnetic fields were enabled in FLUKA and betatron oscillations were validated against reference data. All electron events entering the BLMs were printed in data files. The paper also introduces the Radioactive Ion Beam Optimizer (RIBO) Monte Carlo 3-D code, which was used to read from the event files, to compute Cerenkov production and then to simulate the optical coupling of the BLM detectors, accounting for the transmission of light through the quartz. (author)
International Nuclear Information System (INIS)
Jarry, Genevieve; Verhaegen, Frank
2007-01-01
Electronic portal imagers have promising dosimetric applications in external beam radiation therapy. In this study a patient dose computation algorithm based on Monte Carlo (MC) simulations and on portal images is developed and validated. The patient exit fluence from primary photons is obtained from the portal image after correction for scattered radiation. The scattered radiation at the portal imager and the spectral energy distribution of the primary photons are estimated from MC simulations at the treatment planning stage. The patient exit fluence and the spectral energy distribution of the primary photons are then used to ray-trace the photons from the portal image towards the source through the CT geometry of the patient. Photon weights, which reflect the probability of a photon being transmitted, are computed during this step. A dedicated MC code is used to transport these photons back from the source through the patient CT geometry to obtain patient dose. Only Compton interactions are considered. This code also produces a reconstructed portal image which is used as a verification tool to ensure that the dose reconstruction is reliable. The dose reconstruction algorithm is compared against MC dose calculation (MCDC) predictions and against measurements in phantom. The reconstructed absolute absorbed doses and the MCDC predictions in homogeneous and heterogeneous phantoms agree within 3% for simple open fields. Comparison with film-measured relative dose distributions for IMRT fields yields agreement within 3 mm, 5%. This novel dose reconstruction algorithm allows for daily patient-specific dosimetry and verification of patient movement.
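The back-projection step described (ray-tracing portal-image photons toward the source through the patient CT) hinges on weighting each photon by its probability of unscattered transmission along the ray. A minimal sketch, with hypothetical function names and a uniform ray-step assumption:

```python
import math

def ray_transmission(mu_along_ray, step_cm):
    """Probability that a primary photon crosses the patient without
    interacting, given attenuation coefficients (1/cm) sampled at
    equal steps along the source-to-pixel ray."""
    return math.exp(-step_cm * sum(mu_along_ray))

def backprojected_weight(exit_fluence, mu_along_ray, step_cm):
    """Weight carried by a photon traced from a portal pixel back to
    the source: the measured exit fluence divided by the transmission,
    so that forward re-transport reproduces the portal signal."""
    return exit_fluence / ray_transmission(mu_along_ray, step_cm)
```

For a uniform 10 cm water-like path (mu = 0.2/cm, 1 cm steps) the transmission is exp(-2), so the back-projected weight is the exit fluence scaled up by exp(+2).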
Energy Technology Data Exchange (ETDEWEB)
Burns, T.J.
1994-03-01
An Xwindow application capable of importing geometric information directly from two Computer Aided Design (CAD) based formats for use in radiation transport and shielding analyses is being developed at ORNL. The application permits the user to graphically view the geometric models imported from the two formats for verification and debugging. Previous models, specifically formatted for the radiation transport and shielding codes can also be imported. Required extensions to the existing combinatorial geometry analysis routines are discussed. Examples illustrating the various options and features which will be implemented in the application are presented. The use of the application as a visualization tool for the output of the radiation transport codes is also discussed.
Energy Technology Data Exchange (ETDEWEB)
Liaparinos, Panagiotis [Department of Medical Physics, Medical School, University of Patras, Patras 265 00 (Greece); Kandarakis, Ioannis [Department of Medical Instruments Technology, Technological Educational Institution of Athens, Ag. Spyridonos Street, Aigaleo, Athens 122 10 (Greece); Cavouras, Dionisis [Department of Medical Instruments Technology, Technological Educational Institution of Athens, Ag. Spyridonos Street, Aigaleo, Athens 122 10 (Greece); Delis, Harry [Department of Medical Physics, Medical School, University of Patras, Patras 265 00 (Greece); Panayiotakis, George [Department of Medical Physics, Medical School, University of Patras, Patras 265 00 (Greece)]. E-mail: panayiot@upatras.gr
2006-12-20
The purpose of this study was to investigate the radiation detection efficiency of the recently introduced RbGd{sub 2}Br{sub 7}:Ce (RGB) scintillator material using a custom-developed Monte Carlo simulation code. Considering its fast principal decay constant (45 ns) and its high light yield (56 000 photons/MeV), RbGd{sub 2}Br{sub 7}:Ce appears to be a quite promising scintillator for applications in nuclear medical imaging systems. In this work, gamma-ray interactions within the scintillator mass were studied. In addition, the effect of K-characteristic fluorescence radiation emission, re-absorption or escape, as well as the effect of scattering events on the spatial distribution of absorbed energy, was examined. Various scintillator crystal thicknesses (5-25 mm), used in positron emission imaging, were considered to be irradiated by 511 keV photons. Similar simulations were performed on the well-known Lu{sub 2}SiO{sub 5}:Ce (LSO) scintillator for comparison purposes. Simulation results allowed the determination of the quantum detection efficiency as well as the fraction of the energy absorbed due to the K-characteristic radiation. Results were obtained as a function of scintillator crystal thickness. The Lu{sub 2}SiO{sub 5}:Ce scintillator material was shown to exhibit better radiation absorption properties than RbGd{sub 2}Br{sub 7}:Ce. However, RGB was shown to be less affected by the production of K-characteristic radiation. Taking into account its very short decay time and its high light yield, this material could be considered for use in positron emission tomography (PET) detectors.
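The quantum detection efficiency both simulations report is, to first order, the probability that a normally incident 511 keV photon interacts at least once in the crystal: QDE = 1 - exp(-μt). A sketch of that relation; the attenuation coefficients below are illustrative placeholders, not the measured LSO or RGB values:

```python
import math

def quantum_detection_efficiency(mu_total, thickness):
    """Probability that a normally incident photon interacts at least
    once in a scintillator slab (mu_total in 1/cm, thickness in cm)."""
    return 1.0 - math.exp(-mu_total * thickness)

# A denser crystal wins at every thickness across the 5-25 mm study range
# (0.87 and 0.45 1/cm are assumed, illustrative coefficients):
for t_cm in (0.5, 1.0, 1.5, 2.0, 2.5):
    assert quantum_detection_efficiency(0.87, t_cm) > \
           quantum_detection_efficiency(0.45, t_cm)
```

The exponential form also shows why the efficiency gap between two materials narrows as the crystal gets thick: both curves saturate toward 1.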
International Nuclear Information System (INIS)
Balick, B.
1987-01-01
The phases of stellar evolution and the development of planetary nebulae are examined. The relation between planetary nebulae and red giants is studied. Spherical and nonspherical cases of shaping planetaries with stellar winds are described. CCD images of nebulae are analyzed, and it is determined that the shape of planetary nebulae depends on ionization levels. Consideration is given to calculating the distances of planetaries using radio images, and molecular hydrogen envelopes which support the wind-shaping model of planetary nebulae
International Nuclear Information System (INIS)
Taylor, S.R.
1988-01-01
The present study of the density, major-element and trace-element compositions, oxygen isotopes, and noble gases of the metal, sulfide, and silicate components of meteorites shows that these properties do not match those of the terrestrial planets, and thereby suggests that there was not much lateral mixing in the solar nebula during planetary accretion. The planets would then have accumulated from narrow concentric zones, and the current zonal structure of the asteroid belt may be analogous to the structure of the inner portions of the solar nebula during the terrestrial planets' accretion. Localized heating during the material's infall to the median plane of the nebula is suggested to have occurred. 64 references
Lai, Priscilla; Cai, Zhongli; Pignol, Jean-Philippe; Lechtman, Eli; Mashouf, Shahram; Lu, Yijie; Winnik, Mitchell A.; Jaffray, David A.; Reilly, Raymond M.
2017-11-01
Permanent seed implantation (PSI) brachytherapy is a highly conformal form of radiation therapy but is challenged with dose inhomogeneity due to its utilization of low energy radiation sources. Gold nanoparticles (AuNP) conjugated with electron emitting radionuclides have recently been developed as a novel form of brachytherapy and can aid in homogenizing dose through physical distribution of radiolabeled AuNP when injected intratumorally (IT) in suspension. However, the distribution is unpredictable and precise placement of many injections would be difficult. Previously, we reported the design of a nanoparticle depot (NPD) that can be implanted using PSI techniques and which facilitates controlled release of AuNP. We report here the 3D dose distribution resulting from a NPD incorporating AuNP labeled with electron emitters (90Y, 177Lu, 111In) of different energies using Monte Carlo based voxel level dosimetry. The MCNP5 Monte Carlo radiation transport code was used to assess differences in dose distribution from simulated NPD and conventional brachytherapy sources, positioned in breast tissue simulating material. We further compare these dose distributions in mice bearing subcutaneous human breast cancer xenografts implanted with 177Lu-AuNP NPD, or injected IT with 177Lu-AuNP in suspension. The radioactivity distributions were derived from registered SPECT/CT images and time-dependent dose was estimated. Results demonstrated that the dose distribution from NPD reduced the maximum dose 3-fold when compared to conventional seeds. For simulated NPD, as well as NPD implanted in vivo, 90Y delivered the most homogeneous dose distribution. The tumor radioactivity in mice IT injected with 177Lu-AuNP redistributed while radioactivity in the NPD remained confined to the implant site. The dose distribution from radiolabeled AuNP NPD were predictable and concentric in contrast to IT injected radiolabeled AuNP, which provided irregular and temporally variant dose distributions
International Nuclear Information System (INIS)
White, Morgan C.
2000-01-01
The fundamental motivation for the research presented in this dissertation was the need to develop a more accurate prediction method for characterization of mixed radiation fields around medical electron accelerators (MEAs). Specifically, a model is developed for simulation of neutron and other particle production from photonuclear reactions and incorporated in the Monte Carlo N-Particle (MCNP) radiation transport code. This extension of the capability within the MCNP code provides for the more accurate assessment of the mixed radiation fields. The Nuclear Theory and Applications group of the Los Alamos National Laboratory has recently provided first-of-a-kind evaluated photonuclear data for a select group of isotopes. These data provide the reaction probabilities as functions of incident photon energy with angular and energy distribution information for all reaction products. The availability of these data is the cornerstone of the new methodology for state-of-the-art mutually coupled photon-neutron transport simulations. The dissertation includes details of the model development and implementation necessary to use the new photonuclear data within MCNP simulations. A new data format has been developed to include tabular photonuclear data. Data are processed from the Evaluated Nuclear Data Format (ENDF) to the new class ''u'' A Compact ENDF (ACE) format using a standalone processing code. MCNP modifications have been completed to enable Monte Carlo sampling of photonuclear reactions. Note that both neutron and gamma production are included in the present model. The new capability has been subjected to extensive verification and validation (V and V) testing. Verification testing has established the expected basic functionality. Two validation projects were undertaken. First, comparisons were made to benchmark data from literature. These calculations demonstrate the accuracy of the new data and transport routines to better than 25 percent. Second, the ability to
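Once tabular photonuclear data are available, Monte Carlo selection of a reaction reduces to the standard discrete inverse-CDF draw over channels, weighted by cross section. A generic sketch (the channel names and cross sections are invented, and this is not MCNP's internal code):

```python
import bisect
import random

def sample_channel(channels, cross_sections, rng):
    """Select a reaction channel with probability proportional to its
    cross section (discrete inverse-CDF sampling)."""
    cdf, running = [], 0.0
    for sigma in cross_sections:
        running += sigma            # cumulative cross section
        cdf.append(running)
    u = rng.random() * running      # uniform draw on [0, total)
    return channels[bisect.bisect_right(cdf, u)]

# Hypothetical channels with cross sections in ratio 3:1:1:
rng = random.Random(42)
draws = [sample_channel(["(g,n)", "(g,2n)", "(g,p)"], [3.0, 1.0, 1.0], rng)
         for _ in range(10000)]
```

With a 3:1:1 ratio the first channel should be drawn about 60% of the time, which is a quick sanity check on any such sampler.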
Molecular Dications in Planetary Atmospheric Escape
Directory of Open Access Journals (Sweden)
Stefano Falcinelli
2016-08-01
Fundamental properties of multiply charged molecular ions, such as energetics, structure, stability, lifetime and fragmentation dynamics, are relevant to understanding and modeling the behavior of gaseous plasmas as well as ionospheric and astrophysical environments. Experimental determinations are reported of the Kinetic Energy Release (KER) for ions originating from dissociation reactions induced by Coulomb explosion of doubly charged molecular ions (molecular dications) produced by double photoionization of CO2, N2O and C2H2, molecules of interest in planetary atmospheres. The KER measurement as a function of the ultraviolet (UV) photon energy in the range of 28-65 eV was extracted from the electron-ion-ion coincidence spectra obtained by using tunable synchrotron radiation coupled with ion imaging techniques at the ELETTRA Synchrotron Light Laboratory (Trieste, Italy). These experiments, coupled with a computational analysis based on a Monte Carlo trajectory simulation, allow assessing the probability of escape for simple ionic species in the upper atmospheres of Mars, Venus and Titan. The measured KER values for the H+, C+, CH+, CH2+, N+, O+, CO+, N2+ and NO+ fragment ions range between 1.0 and 5.5 eV, large enough to allow these ionic species to participate in the atmospheric escape from such planets into space. In the case of Mars, we suggest a possible explanation for the observed behavior of the O+ and CO2(2+) ion density profiles.
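Whether a fragment's kinetic energy suffices for escape is a direct comparison with the gravitational escape energy, (1/2)·m·v_esc². A sketch using textbook escape velocities (approximate values assumed here, not taken from the paper):

```python
AMU_KG = 1.66054e-27   # atomic mass unit in kg
EV_J = 1.60218e-19     # electron-volt in J

def escape_energy_eV(mass_amu, v_escape_m_s):
    """Minimum kinetic energy (eV) a fragment of the given mass needs
    to reach the planetary escape velocity."""
    return 0.5 * mass_amu * AMU_KG * v_escape_m_s ** 2 / EV_J

# Approximate escape velocities: Mars ~5.03 km/s, Titan ~2.64 km/s.
o_from_mars = escape_energy_eV(16.0, 5030.0)   # atomic oxygen from Mars
n_from_titan = escape_energy_eV(14.0, 2640.0)  # atomic nitrogen from Titan
```

Atomic oxygen needs roughly 2.1 eV to escape Mars and atomic nitrogen roughly 0.5 eV to escape Titan, so measured KER values of 1.0-5.5 eV put many fragments above these thresholds, consistent with the abstract's conclusion.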
International Nuclear Information System (INIS)
Szoke, A; Brooks, E D; McKinley, M; Daffin, F
2005-01-01
The equations of radiation transport for thermal photons are notoriously difficult to solve in thick media without resorting to asymptotic approximations such as the diffusion limit. One source of this difficulty is that in thick, absorbing media thermal emission is almost completely balanced by strong absorption. In a previous publication [SB03], the photon transport equation was written in terms of the deviation of the specific intensity from the local equilibrium field. We called the new form of the equations the difference formulation. The difference formulation is rigorously equivalent to the original transport equation. It is particularly advantageous in thick media, where the radiation field approaches local equilibrium and the deviations from the Planck distribution are small. The difference formulation for photon transport also clarifies the diffusion limit. In this paper, the transport equation is solved by the Symbolic Implicit Monte Carlo (SIMC) method and a comparison is made between the standard formulation and the difference formulation. The SIMC method is easily adapted to the derivative source terms of the difference formulation, and a remarkable reduction in noise is obtained when the difference formulation is applied to problems involving thick media.
Jarry, G; DeMarco, J J; Beifuss, U; Cagnon, C H; McNitt-Gray, M F
2003-08-21
The purpose of this work is to develop and test a method to estimate the relative and absolute absorbed radiation dose from axial and spiral CT scans using a Monte Carlo approach. Initial testing was done in phantoms and preliminary results were obtained from a standard mathematical anthropomorphic model (MIRD V) and voxelized patient data. To accomplish this we have modified a general purpose Monte Carlo transport code (MCNP4B) to simulate the CT x-ray source and movement, and then to calculate absorbed radiation dose in desired objects. The movement of the source in either axial or spiral modes was modelled explicitly while the CT system components were modelled using published information about x-ray spectra as well as information provided by the manufacturer. Simulations were performed for single axial scans using the head and body computed tomography dose index (CTDI) polymethylmethacrylate phantoms at both central and peripheral positions for all available beam energies and slice thicknesses. For comparison, corresponding physical measurements of CTDI in phantom were made with an ion chamber. To obtain absolute dose values, simulations and measurements were performed in air at the scanner isocentre for each beam energy. To extend the verification, the CT scanner model was applied to the MIRD V model and compared with published results using similar technical factors. After verification of the model, the generalized source was simulated and applied to voxelized models of patient anatomy. The simulated and measured absolute dose data in phantom agreed to within 2% for the head phantom and within 4% for the body phantom at 120 and 140 kVp; this extends to 8% for the head and 9% for the body phantom across all available beam energies and positions. For the head phantom, the simulated and measured absolute dose data agree to within 2% across all slice thicknesses at 120 kVp. Our results in the MIRD phantom agree within 11% of all the different organ dose values
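The CTDI phantom measurements referenced above are conventionally combined into a weighted index: one third of the central reading plus two thirds of the peripheral reading, and, for helical scans, divided by pitch. A minimal sketch with illustrative (not measured) values:

```python
def ctdi_w(center_mGy, periphery_mGy):
    """Weighted CTDI: one-third central plus two-thirds peripheral measurement."""
    return center_mGy / 3.0 + 2.0 * periphery_mGy / 3.0

def ctdi_vol(ctdiw_mGy, pitch):
    """Volume CTDI for a helical scan: weighted CTDI divided by pitch."""
    return ctdiw_mGy / pitch

# Hypothetical body-phantom readings in mGy (for illustration only)
w = ctdi_w(10.0, 20.0)
print(w, ctdi_vol(w, 1.375))
```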
Kalos, Melvin H
2008-01-01
This introduction to Monte Carlo methods seeks to identify and study the unifying elements that underlie their effective application. Initial chapters provide a short treatment of the probability and statistics needed as background, enabling those without experience in Monte Carlo techniques to apply these ideas to their research. The book focuses on two basic themes: The first is the importance of random walks as they occur both in natural stochastic systems and in their relationship to integral and differential equations. The second theme is that of variance reduction in general and importance sampling in particular as a technique for efficient use of the methods. Random walks are introduced with an elementary example in which the modeling of radiation transport arises directly from a schematic probabilistic description of the interaction of radiation with matter. Building on this example, the relationship between random walks and integral equations is outlined
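The elementary random-walk example the book opens with can be sketched in a few lines: sample exponential free paths, scatter or absorb at each collision, and tally transmission through a slab. This is a hedged toy model (cross section, slab thickness, and history count are invented); for a pure absorber the estimate can be checked against the analytic result exp(-sigma*t).

```python
import math
import random

def transmission(sigma_t, thickness, n_hist, rng, scatter_prob=0.0):
    """Analog Monte Carlo: photons enter a 1D slab normally; exponential free
    paths; isotropic scattering with probability scatter_prob, else absorption."""
    transmitted = 0
    for _ in range(n_hist):
        x, mu = 0.0, 1.0                       # position and direction cosine
        while True:
            x += mu * (-math.log(1.0 - rng.random()) / sigma_t)  # free flight
            if x >= thickness:
                transmitted += 1               # leaked out the back face
                break
            if x < 0.0:
                break                          # reflected out the front face
            if rng.random() < scatter_prob:
                mu = 2.0 * rng.random() - 1.0  # isotropic re-direction
            else:
                break                          # absorbed
    return transmitted / n_hist

rng = random.Random(1)
est = transmission(1.0, 2.0, 100_000, rng)     # pure absorber, 2 mean free paths
print(est, math.exp(-2.0))                     # MC estimate vs analytic ~0.1353
```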
2016-08-10
Kevin Kramer, Andy Li, Joe Madrigal, Brian Sanchez, and Kyle Millage ... a 10-kiloton fission device, detonated at ground level, through both an open-field and an urban environment. Scattered radiation from the atmosphere ... neutron and photon emissions, transmitting between 14 and 42% of the open-field dose depending on the building characteristics.
International Nuclear Information System (INIS)
Ben Hdech, Yassine
2011-01-01
To ensure the required accuracy and prevent mis-administration, cancer treatments by external-beam radiation therapy are simulated on a Treatment Planning System (TPS) before radiation delivery, in order to ensure that the prescription is achieved both in terms of target volume coverage and healthy tissue protection. The TPS calculates the patient dose distribution and the treatment time per beam required to deliver the prescribed dose. The TPS is a key system in the decision process of treatment by radiation therapy. It is therefore essential that the TPS be subject to a thorough check of its performance (quality control, or QC), and in particular of its ability to accurately compute dose distributions for patients in all clinical situations that may be met. The 'traditional' methods recommended to carry out dosimetric QC of algorithms implemented in the TPS are based on comparisons between dose distributions calculated with the TPS and doses measured in physical test objects (PTOs) using the treatment machine. In this thesis we propose to substitute, for the reference dosimetric measurements performed in PTOs, benchmark dose calculations in Digital Test Objects using the PENELOPE Monte Carlo code. This method has three advantages: (i) it allows simulation in situations close to the clinic that are often too complex to be experimentally feasible; (ii) due to the digital form of the reference data, the QC process may be automated; (iii) it allows a comprehensive TPS QC without hindering the use of equipment devoted primarily to patient treatments. This new QC method has been tested successfully on the Eclipse TPS from Varian Medical Systems. (author)
Energy Technology Data Exchange (ETDEWEB)
Dong, Han; Sharma, Diksha; Badano, Aldo, E-mail: aldo.badano@fda.hhs.gov [Division of Imaging, Diagnostics, and Software Reliability, Center for Devices and Radiological Health, U.S. Food and Drug Administration, Silver Spring, Maryland 20993 (United States)
2014-12-15
Purpose: Monte Carlo simulations play a vital role in the understanding of the fundamental limitations, design, and optimization of existing and emerging medical imaging systems. Efforts in this area have resulted in the development of a wide variety of open-source software packages. One such package, hybridMANTIS, uses a novel hybrid concept to model indirect scintillator detectors by balancing the computational load using dual CPU and graphics processing unit (GPU) processors, obtaining computational efficiency with reasonable accuracy. In this work, the authors describe two open-source visualization interfaces, webMANTIS and visualMANTIS, to facilitate the setup of computational experiments via hybridMANTIS. Methods: The visualization tools visualMANTIS and webMANTIS enable the user to control simulation properties through a user interface. In the case of webMANTIS, control via a web browser allows access through mobile devices such as smartphones or tablets. webMANTIS acts as a server back-end and communicates with an NVIDIA GPU computing cluster that can support multiuser environments where users can execute different experiments in parallel. Results: The output consists of point response, pulse-height spectrum, and optical transport statistics generated by hybridMANTIS. The users can download the output images and statistics through a zip file for future reference. In addition, webMANTIS provides a visualization window that displays a few selected optical photon paths as they get transported through the detector columns and allows the user to trace the history of the optical photons. Conclusions: The visualization tools visualMANTIS and webMANTIS provide features such as on-the-fly generation of pulse-height spectra and response functions for microcolumnar x-ray imagers while allowing users to save simulation parameters and results from prior experiments. The graphical interfaces simplify the simulation setup and allow the user to go directly from specifying
Doerner, Edgardo; Caprile, Paola
2017-12-01
To present the implementation of a new option for parallel processing of the EGSnrc Monte Carlo system using the OpenMP API, as an alternative to the provided method based on the use of a batch queuing system (BQS). The parallel solution presented, called OMP_EGS, makes use of OpenMP features to control the workload distribution between the compute units. These features were inserted into the original EGSnrc source code through properly defined macros. In order to validate the platform, the possibility of producing results in exact agreement with the serial implementation was assessed. The performance of OMP_EGS was evaluated against the BQS method, in terms of parallel speedup and efficiency. As the OpenMP features can be activated or deactivated depending on the compilation options, the implementation of the platform allowed the direct recovery of the original serial implementation. The validation tests showed that OMP_EGS was able to reproduce the exact same results as the serial implementation. The performance and scalability tests showed that OMP_EGS is a better alternative than the EGSnrc BQS parallel implementation, both in terms of runtime and parallel efficiency. The presented solution has several advantages over the BQS-based parallel implementation available for the EGSnrc system. One of the main advantages is that, in contrast to the BQS alternative, it can be implemented using different compilers and operating systems, which turns it into a compact and portable solution that can be used on a wide range of working environments. It does not introduce artifacts on the simulated distributions, as it only handles the distribution of work among the available computing resources, and it proved to have better performance. © 2017 American Association of Physicists in Medicine.
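The speedup and parallel-efficiency metrics used in this evaluation have standard definitions: speedup is serial runtime over parallel runtime, and efficiency is speedup divided by the number of workers. A small sketch with invented timings (not the paper's measurements):

```python
def speedup(t_serial, t_parallel):
    """Parallel speedup: serial runtime divided by parallel runtime."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_workers):
    """Parallel efficiency: speedup per worker (1.0 = ideal scaling)."""
    return speedup(t_serial, t_parallel) / n_workers

# Hypothetical timings (seconds) for a fixed-size simulation on 8 cores
t1, t8 = 640.0, 100.0
print(speedup(t1, t8), efficiency(t1, t8, 8))  # 6.4x speedup, 0.8 efficiency
```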
Energy Technology Data Exchange (ETDEWEB)
Manchado, F.; Vilches, M.; Guiraldo, D.; Lallena, A. M.
2011-07-01
In this paper we have studied, using Monte Carlo simulation, the properties of these beams: their degradation with the depth traversed, the influence of target motion during irradiation, how to reduce the absorbed dose between bands, and how to reduce simulation times.
Simulations of GCR interactions within planetary bodies using GEANT4
Mesick, K.; Feldman, W. C.; Stonehill, L. C.; Coupland, D. D. S.
2017-12-01
On planetary bodies with little to no atmosphere, Galactic Cosmic Rays (GCRs) can hit the body and produce neutrons primarily through nuclear spallation within the top few meters of the surfaces. These neutrons undergo further nuclear interactions with elements near the planetary surface and some will escape the surface and can be detected by landed or orbiting neutron radiation detector instruments. The neutron leakage signal at fast neutron energies provides a measure of average atomic mass of the near-surface material and in the epithermal and thermal energy ranges is highly sensitive to the presence of hydrogen. Gamma-rays can also escape the surface, produced at characteristic energies depending on surface composition, and can be detected by gamma-ray instruments. The intra-nuclear cascade (INC) that occurs when high-energy GCRs interact with elements within a planetary surface to produce the leakage neutron and gamma-ray signals is highly complex, and therefore Monte Carlo based radiation transport simulations are commonly used for predicting and interpreting measurements from planetary neutron and gamma-ray spectroscopy instruments. In the past, the simulation code that has been widely used for this type of analysis is MCNPX [1], which was benchmarked against data from the Lunar Neutron Probe Experiment (LPNE) on Apollo 17 [2]. In this work, we consider the validity of the radiation transport code GEANT4 [3], another widely used but open-source code, by benchmarking simulated predictions of the LPNE experiment to the Apollo 17 data. We consider the impact of different physics model options on the results, and show which models best describe the INC based on agreement with the Apollo 17 data. The success of this validation then gives us confidence in using GEANT4 to simulate GCR-induced neutron leakage signals on Mars in relevance to a re-analysis of Mars Odyssey Neutron Spectrometer data. References [1] D.B. Pelowitz, Los Alamos National Laboratory, LA-CP-05
Monte Carlo simulation for IRRMA
International Nuclear Information System (INIS)
Gardner, R.P.; Liu Lianyan
2000-01-01
Monte Carlo simulation is fast becoming a standard approach for many radiation applications that were previously treated almost entirely by experimental techniques. This is certainly true for Industrial Radiation and Radioisotope Measurement Applications - IRRMA. The reasons for this include: (1) the increased cost and inadequacy of experimentation for design and interpretation purposes; (2) the availability of low cost, large memory, and fast personal computers; and (3) the general availability of general purpose Monte Carlo codes that are increasingly user-friendly, efficient, and accurate. This paper discusses the history and present status of Monte Carlo simulation for IRRMA including the general purpose (GP) and specific purpose (SP) Monte Carlo codes and future needs - primarily from the experience of the authors
DeMarco, J J; Cagnon, C H; Cody, D D; Stevens, D M; McCollough, C H; Zankl, M; Angel, E; McNitt-Gray, M F
2007-05-07
The purpose of this work is to examine the effects of patient size on radiation dose from CT scans. To perform these investigations, we used Monte Carlo simulation methods with detailed models of both patients and multidetector computed tomography (MDCT) scanners. A family of three-dimensional, voxelized patient models previously developed and validated by the GSF was implemented as input files using the Monte Carlo code MCNPX. These patient models represent a range of patient sizes and ages (8 weeks to 48 years) and have all radiosensitive organs previously identified and segmented, allowing the estimation of dose to any individual organ and calculation of patient effective dose. To estimate radiation dose, every voxel in each patient model was assigned both a specific organ index number and an elemental composition and mass density. Simulated CT scans of each voxelized patient model were performed using a previously developed MDCT source model that includes scanner-specific spectra, including bowtie filter, scanner geometry and helical source path. The scan simulations in this work include a whole-body scan protocol and a thoracic CT scan protocol, each performed with fixed tube current. The whole-body scan simulation yielded a predictable decrease in effective dose as a function of increasing patient weight. Results from analysis of individual organs demonstrated similar trends, but with some individual variations. A comparison with a conventional dose estimation method using the ImPACT spreadsheet yielded an effective dose of 0.14 mSv mAs^-1 for the whole-body scan. This result is lower than the simulations on the voxelized model designated 'Irene' (0.15 mSv mAs^-1) and higher than the models 'Donna' and 'Golem' (0.12 mSv mAs^-1). For the thoracic scan protocol, the ImPACT spreadsheet estimates an effective dose of 0.037 mSv mAs^-1, which falls between the calculated values for Irene (0.042 mSv mAs^-1) and Donna (0.031 mSv mAs^-1) and is higher relative to Golem.
International Nuclear Information System (INIS)
Arter, W.; Loughlin, M.J.
2009-01-01
Accurate calculation of the neutron transport through the shielding of the IFMIF test cell, defined by CAD, is a difficult task for several reasons. The ability of the powerful deterministic radiation transport code Attila to do this rapidly and reliably has been studied. Three models of increasing geometrical complexity were produced from the CAD using the CADfix software. A fourth model was produced to represent transport within the cell. The work also involved the conversion of the Vitenea-IEF database for high energy neutrons into a format usable by Attila, and the conversion of a particle source specified in MCNP WSSA format to a form usable by Attila. The final model encompassed the entire test cell environment, with only minor modifications. On a state-of-the-art PC, Attila took approximately 3 h to perform the calculations, as a consequence of a careful mesh 'layering'. The results strongly suggest that Attila will be a valuable tool for modelling radiation transport in IFMIF, and for similar problems
Chang, Shu-Jun; Hsu, Jui-Ting; Hung, Shih-Yen; Liu, Yan-Lin; Jiang, Shiang-Huei; Wu, Jay
2017-05-01
Reference phantoms are widely applied to evaluate the radiation dose for external exposure. However, the frequently used reference phantoms are based on Caucasians. Dose estimation for Asians using a Caucasian phantom can result in significant errors. This study recruited 40 volunteers whose body sizes are close to the average Taiwanese population. Magnetic resonance imaging was performed to obtain the organ volumes for construction of the Taiwanese reference man (TRM) and Taiwanese reference woman (TRW). The dose conversion coefficients (DCC) resulting from photon beams in anterior-posterior, posterior-anterior, right-lateral, left-lateral, and isotropic irradiation geometries were estimated. In the anterior-posterior geometry, the mean DCC differences among organs between the TRM and ORNL phantom at 0.1, 1, and 10 MeV were 7.3%, 5.8%, and 5.2%, respectively. For the TRW, the mean differences from the ORNL phantom at the three energies were 10.6%, 7.4%, and 8.3%. The DCCs of the Taiwanese reference phantoms and the ORNL phantom presented similar trends in other geometries. The torso size of the phantom and the mass and geometric location of the organ have a significant influence on the DCC. The Taiwanese reference phantoms can be used to establish dose guidelines and regulations for radiation protection from external exposure.
Energy Technology Data Exchange (ETDEWEB)
Liu, B; Sajo, E [University of Massachusetts Lowell, Lowell, MA (United States); Ouyang, Z; Ngwa, W [University of Massachusetts Lowell, Lowell, MA (United States); Brigham and Women’s Hospital, Dana-Farber Cancer Institute and Harvard Medical School, Boston, MA (United States)
2016-06-15
Purpose: A recent publication has shown that by delivering titanium dioxide nanoparticles (titania) as a photosensitizer into tumors, Cerenkov radiation (CR) produced by radionuclides could be used for substantially boosting damage to cancer cells. The present work compares CR production by various clinically relevant radiation sources including internal radionuclides and external beam radiotherapy (EBRT), and provides preliminary computational results on CR absorption by titania. Methods: 1) Geant4.10.1 was used to simulate ionizing radiation-induced CR production in a 1 cm diameter spherical volume using external radiotherapy sources: a Varian Clinac iX 6 MV beam and an Eldorado 60Co unit, both with a 10 × 10 cm2 field size. In each case the volume was placed at the maximum dose depth (1.5 cm for the 6 MV source and 0.5 cm for 60Co). In addition, 18F, 192Ir and 60Co were simulated using Geant4 radioactive decay models as internal sources. Dose deposition and CR production spectra in the 200-400 nm range were calculated, as this is the excitation range of titania. 2) Using the 6 MV external source, the absorption by titania was calculated via the track length of CR in the spherical volume. The nanoparticle concentration was varied from 0.25 to 5 µg/g. Results: Among the radioactive sources, results showed that 18F induced the highest amount of CR per disintegration, but 60Co had the highest yield per unit dose. When compared with external sources, the 6 MV source was shown to be the most efficient for the same delivered dose. Simulations indicated increased absorption for increasing concentrations, with up to 68% absorption of generated CR for a 5 µg/g titania concentration. Conclusion: The results demonstrate that the 6 MV beam is favored with a higher CR yield, compared to radionuclides, and that the use of higher concentrations of titania may increase photosensitization. From the findings, we propose that if sufficiently potent concentrations of
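Cerenkov light is only emitted by charged particles faster than the local phase velocity of light (beta > 1/n), which sets an energy threshold. The standard threshold formula for electrons, evaluated for water (n = 1.33) as a quick sanity check on the simulated spectra:

```python
import math

ME_C2 = 0.5110  # electron rest energy, MeV

def cherenkov_threshold_MeV(n):
    """Kinetic energy threshold for Cerenkov emission by an electron in a
    medium of refractive index n: emission requires beta > 1/n."""
    beta = 1.0 / n
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return ME_C2 * (gamma - 1.0)

print(f"{cherenkov_threshold_MeV(1.33):.3f} MeV")  # ~0.264 MeV in water
```

Electrons below roughly 264 keV in water therefore contribute dose but no Cerenkov light, which is one reason yield per unit dose differs between beam qualities.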
Stam, D.M.; de Rooij, W.A.; Cornet, G.; Hovenier, J.W.
2006-01-01
We present an efficient numerical method for integrating planetary radiation over a planetary disk, which is especially interesting for simulating signals of extrasolar planets. Our integration method is applicable to calculating the full flux vector of the disk-integrated planetary radiation, i.e.
Energy Technology Data Exchange (ETDEWEB)
Blazy-Aubignac, L
2007-09-15
Treatment planning systems (TPS) occupy a key position in the radiotherapy service: they calculate the projected dose distribution and the treatment duration. Traditionally, the quality control of the calculated dose distributions relies on comparisons with dose distributions measured on the treatment device. This thesis proposes to replace these dosimetric measurements with reference dose calculations obtained with the PENELOPE Monte Carlo code. The Monte Carlo simulations give a broad choice of test configurations and make it possible to envisage quality control of the dosimetric aspects of a TPS without monopolizing the treatment devices. This quality control, based on Monte Carlo simulations, has been tested on a clinical TPS and has made it possible to simplify the TPS quality procedures. This more thorough, more precise and simpler-to-implement quality control could be generalized to every radiotherapy center. (N.C.)
National Aeronautics and Space Administration — The Planetary Data System (PDS) is an archive of data products from NASA planetary missions, which is sponsored by NASA's Science Mission Directorate. We actively...
International Nuclear Information System (INIS)
Makri, T; Yakoumakis, E; Papadopoulou, D; Gialousis, G; Theodoropoulos, V; Sandilos, P; Georgiou, E
2006-01-01
Seeking to assess the radiation risk associated with radiological examinations in neonatal intensive care units, thermoluminescence dosimetry was used for the measurement of entrance surface dose (ESD) in 44 AP chest and 28 AP combined chest-abdominal exposures of a sample of 60 neonates. The mean values of ESD were found to be equal to 44 ± 16 μGy and 43 ± 19 μGy, respectively. The MCNP-4C2 code, with a mathematical phantom simulating a neonate and appropriate x-ray energy spectra, was employed for the simulation of the AP chest and AP combined chest-abdominal exposures. Equivalent organ dose per unit ESD and energy imparted per unit ESD calculations are presented in tabular form. Combined with ESD measurements, these calculations yield an effective dose of 10.2 ± 3.7 μSv, regardless of sex, and an imparted energy of 18.5 ± 6.7 μJ for the chest radiograph. The corresponding results for the combined chest-abdominal examination are 14.7 ± 7.6 μSv (males)/17.2 ± 7.6 μSv (females) and 29.7 ± 13.2 μJ. The calculated total risk per radiograph was low, ranging between 1.7 and 2.9 per million neonates, per film, and being slightly higher for females. Results of this study are in good agreement with previous studies, especially in view of the diversity met in the calculation methods
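Converting per-organ equivalent doses into an effective dose is a weighted sum, E = sum over tissues of w_T * H_T. A minimal sketch: the weighting factors below are the ICRP 103 values for the listed tissues, but the organ doses are hypothetical and only a subset of tissues is included, so this is a partial-body illustration, not a reproduction of the paper's calculation.

```python
# ICRP 103 tissue weighting factors for a few tissues (partial set)
W_T = {"lung": 0.12, "stomach": 0.12, "liver": 0.04, "thyroid": 0.04}

def effective_dose(organ_dose_uSv):
    """Weighted sum E = sum_T w_T * H_T over the organs provided (uSv)."""
    return sum(W_T[tissue] * h for tissue, h in organ_dose_uSv.items())

# Hypothetical equivalent doses in uSv (for illustration only)
doses = {"lung": 30.0, "stomach": 12.0, "liver": 10.0, "thyroid": 25.0}
print(effective_dose(doses))  # 6.44 uSv from this subset of organs
```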
International Nuclear Information System (INIS)
Yeh, C.Y.; Tung, C.J.; Lee, C.C.; Lin, M.H.; Chao, T.C.
2014-01-01
Measurement-based Monte Carlo (MBMC) simulation using a high definition (HD) phantom was used to evaluate the dose distribution in nasopharyngeal cancer (NPC) patients treated with intensity modulated radiation therapy (IMRT). Around the nasopharyngeal cavity there exist many small-volume organs-at-risk (OARs), such as the optic nerves, auditory nerves, cochlea, and semicircular canals, which necessitate the use of a high definition phantom for accurate and correct dose evaluation. The aim of this research was to study the advantages of using an HD phantom for MBMC simulation in NPC patients treated with IMRT. The MBMC simulation in this study was based on the IMRT treatment plans of three NPC patients generated by the anisotropic analytical algorithm (AAA) of the Eclipse treatment planning system (Varian Medical Systems, Palo Alto, CA, USA) using a calculation grid of 2 mm². The NPC tumor was treated to a cumulative dose of 7000 cGy in 35 fractions using the shrinking-field sequential IMRT (SIMRT) method. The BEAMnrc MC code was used to simulate a Varian EX21 linear accelerator treatment head. The HD phantom contained 0.5 × 0.5 × 1 mm³ voxels for the nasopharyngeal area and 0.5 × 0.5 × 3 mm³ voxels for the rest of the head area. An efficiency map was obtained for the amorphous silicon aS1000 electronic portal imaging device (EPID) to adjust the weighting of each particle in the phase-space file for each IMRT beam. Our analysis revealed that small-volume organs such as the eighth cranial nerve, semicircular canal, cochlea and external auditory canal showed an absolute dose difference of ≥200 cGy, while the dose difference for larger organs such as the parotid glands and tumor was negligible for the MBMC simulation using the HD phantom. The HD phantom was found to be suitable for Monte Carlo dose volume analysis of small-volume organs. - Highlights: • HD dose evaluation for IMRT of NPC patients has been verified by the MC method. • MC results show
Su, Lin; Yang, Youming; Bednarz, Bryan; Sterpin, Edmond; Du, Xining; Liu, Tianyu; Ji, Wei; Xu, X. George
2014-01-01
Purpose: Using the graphical processing units (GPU) hardware technology, an extremely fast Monte Carlo (MC) code ARCHERRT is developed for radiation dose calculations in radiation therapy. This paper describes the detailed software development and testing for three clinical TomoTherapy® cases: the prostate, lung, and head & neck. Methods: To obtain clinically relevant dose distributions, phase space files (PSFs) created from optimized radiation therapy treatment plan fluence maps were used as the input to ARCHERRT. Patient-specific phantoms were constructed from patient CT images. Batch simulations were employed to facilitate the time-consuming task of loading large PSFs, and to improve the estimation of statistical uncertainty. Furthermore, two different Woodcock tracking algorithms were implemented and their relative performance was compared. The dose curves of an Elekta accelerator PSF incident on a homogeneous water phantom were benchmarked against DOSXYZnrc. For each of the treatment cases, dose volume histograms and isodose maps were produced from ARCHERRT and the general-purpose code, GEANT4. The gamma index analysis was performed to evaluate the similarity of voxel doses obtained from these two codes. The hardware accelerators used in this study are one NVIDIA K20 GPU, one NVIDIA K40 GPU, and six NVIDIA M2090 GPUs. In addition, to make a fairer comparison of the CPU and GPU performance, a multithreaded CPU code was developed using OpenMP and tested on an Intel E5-2620 CPU. Results: For the water phantom, the depth dose curve and dose profiles from ARCHERRT agree well with DOSXYZnrc. For clinical cases, results from ARCHERRT are compared with those from GEANT4 and good agreement is observed. The gamma index test is performed for voxels whose dose is greater than 10% of the maximum dose. For the 2%/2 mm criteria, the passing rates for the prostate, lung, and head & neck cases are 99.7%, 98.5%, and 97.2%, respectively. Due to the specific architecture of the GPU, modified
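The gamma index analysis used to compare ARCHERRT with GEANT4 combines a dose-difference criterion with a distance-to-agreement criterion. A minimal 1D sketch of the idea, with global 2%/2 mm criteria and a 10% low-dose threshold (function names and the toy Gaussian profile are ours, not from the paper):

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing_mm, dose_tol=0.02, dta_mm=2.0, threshold=0.10):
    """Simplified 1D global gamma analysis (default 2%/2 mm criteria).

    ref, ev: reference and evaluated dose arrays on the same grid.
    The dose criterion is a fraction of the reference maximum (global
    normalization); only voxels above `threshold` of the maximum are scored.
    """
    dmax = ref.max()
    x = np.arange(len(ref)) * spacing_mm
    gammas = []
    for i, d_ref in enumerate(ref):
        if d_ref < threshold * dmax:
            continue
        dd = (ev - d_ref) / (dose_tol * dmax)   # dose-difference term
        dta = (x - x[i]) / dta_mm               # distance-to-agreement term
        gammas.append(np.min(np.sqrt(dd ** 2 + dta ** 2)))
    return float(np.mean(np.array(gammas) <= 1.0))

# A distribution compared against itself passes every voxel:
d = np.exp(-((np.arange(100) - 50.0) / 20.0) ** 2)
print(gamma_pass_rate(d, d, spacing_mm=1.0))  # 1.0
```

A voxel passes when its minimum combined metric over all evaluated points is at most 1; the passing rates quoted in the abstract are the fraction of above-threshold voxels meeting this condition.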
Meier, R. R.; Lee, J.-S.
1982-01-01
The transport of resonance radiation under optically thick conditions is shown to be accurately described by a Monte Carlo model of the atomic oxygen 1304 A airglow triplet in which partial frequency redistribution, temperature gradients, pure absorption and multilevel scattering are accounted for. All features of the data can be explained by photoelectron impact excitation and the resonant scattering of sunlight, where the latter source dominates below 100 and above 500 km and is stronger at intermediate altitudes than previously thought. It is concluded that the OI 1304 A emission can be used in studies of excitation processes and atomic oxygen densities in planetary atmospheres.
International Nuclear Information System (INIS)
Grinin, V.P.
1982-01-01
It is shown that the inclination of spectral lines observed in a number of planetary nebulae when the spectrograph slit is placed along the major axis, which is presently ascribed to nonuniform expansion of the shells, actually may be due to rotation of the nebulae about their minor axes, as Campbell and Moore have suggested in their reports. It is assumed that the rotation of the central star (or, if the core is a binary system, circular motions of gas along quasi-Keplerian orbits) serves as the source of the original rotation of a protoplanetary nebula. The mechanism providing for strengthening of the original rotation in the process of expansion of the shell is the tangential pressure of Lα radiation due to the anisotropic properties of the medium and radiation field. The dynamic effect produced by them is evidently greatest in the epoch when the optical depth of the nebula in the Lc (Lyman continuum) becomes on the order of unity in the course of its expansion
Jaradat, Adnan Khalaf
The x-ray leakage from the housing of a therapy x-ray source is regulated to be less than 0.1% of the useful beam; here it was measured with ionization chamber and track-etch detectors. The leakage was measured at nine different positions over the rear wall using a 3 x 3 matrix with a 1 m separation between adjacent positions. In general, the leakage was less than the canonical value, but the exact value depends on energy, gantry angle, and measurement position. Leakage at 10 MV for some positions exceeded 0.1%. Electrons with energy greater than about 9 MeV have the ability to produce neutrons. Neutron leakage has been measured around the head of electron accelerators at a distance of 1 m from the target at 0°, 46°, 90°, 135°, and 180° azimuthal angles, for electron energies of 9, 12, 15, 16, 18, and 20 MeV and for 10, 15, and 18 MV x-ray photon beams, using BD-PND neutron bubble detectors and track-etch detectors. The highest neutron dose equivalent per unit electron dose was at 0° for all electron energies. The neutron leakage from photon beams was the highest among all the machines. Intensity modulated radiation therapy (IMRT) delivery consists of a summation of small beamlets having different weights that make up each field. A linear accelerator room designed exclusively for IMRT use would require different, probably lower, tenth-value layers (TVL) for determining the required wall thicknesses for the primary barriers. The first, second, and third TVLs of 60Co gamma rays and of photons from 4, 6, 10, 15, and 18 MV x-ray beams in concrete have been determined and modeled using a Monte Carlo technique (MCNP version 4C2) for cone beams of half-opening angles of 0°, 3°, 6°, 9°, 12°, and 14°.
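TVLs translate directly into barrier thicknesses: the first tenth-value layer differs from subsequent (equilibrium) ones because the beam spectrum hardens in the barrier. A minimal sketch of this standard two-TVL model; the numerical TVL values below are illustrative assumptions, not values determined in the thesis:

```python
import math

def barrier_thickness_cm(attenuation_factor, tvl1, tvle):
    """Concrete thickness giving the required attenuation, using the
    two-TVL model: the first tenth-value layer (tvl1) differs from the
    equilibrium tenth-value layer (tvle) because of spectral hardening.
    """
    n = math.log10(1.0 / attenuation_factor)   # number of tenth-value layers
    if n <= 1.0:
        return n * tvl1
    return tvl1 + (n - 1.0) * tvle

# Example: attenuate a primary beam by a factor of 1e-4, with
# illustrative concrete TVLs of 37 cm (first) and 33 cm (equilibrium).
print(round(barrier_thickness_cm(1e-4, 37.0, 33.0), 3))  # 136.0
```

Four tenth-value layers give a factor-of-10⁴ attenuation, so the barrier is one first TVL plus three equilibrium TVLs.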
International Nuclear Information System (INIS)
Zuckerman, B.
1978-01-01
A 'proto-planetary nebula' or 'planetary nebula progenitor' is the term used to describe those objects that are losing mass at a rate ≳10⁻⁵ solar masses/year (i.e. comparable to mass loss rates in planetary nebulae with ionized masses ≳0.2 solar masses) and which, it is believed, will become planetary nebulae themselves within approximately 10⁵ years. It is shown that most proto-planetary nebulae appear as very red objects, although a few have been 'caught' near the middle of the Hertzsprung-Russell diagram. The precursors of these proto-planetaries are the general red giant population, more specifically probably Mira and semi-regular variables. (Auth.)
Anderson, Danielle; Siegbahn, E. Albert; Fallone, B. Gino; Serduc, Raphael; Warkentin, Brad
2012-05-01
This work evaluates four dose-volume metrics applied to microbeam radiation therapy (MRT) using simulated dosimetric data as input. We seek to improve upon the most frequently used MRT metric, the peak-to-valley dose ratio (PVDR), by analyzing MRT dose distributions from a more volumetric perspective. Monte Carlo simulations were used to calculate dose distributions in three cubic head phantoms: a 2 cm mouse head, an 8 cm cat head and a 16 cm dog head. The dose distribution was calculated for a 4 × 4 mm² microbeam array in each phantom, as well as a 16 × 16 mm² array in the 8 cm cat head, and a 32 × 32 mm² array in the 16 cm dog head. Microbeam widths of 25, 50 and 75 µm and center-to-center spacings of 100, 200 and 400 µm were considered. The metrics calculated for each simulation were the conventional PVDR, the peak-to-mean valley dose ratio (PMVDR), the mean dose and the percentage volume below a threshold dose. The PVDR ranged between 3 and 230 for the 2 cm mouse phantom, and between 2 and 186 for the 16 cm dog phantom depending on geometry. The corresponding ranges for the PMVDR were much smaller, being 2-49 (mouse) and 2-46 (dog), and showed a slightly weaker dependence on phantom size and array size. The ratio of the PMVDR to the PVDR varied from 0.21 to 0.79 for the different collimation configurations, indicating a difference between the geometric dependence on outcome that would be predicted by these two metrics. For unidirectional irradiation, the mean lesion dose was 102%, 79% and 42% of the mean skin dose for the 2 cm mouse, 8 cm cat and 16 cm dog head phantoms, respectively. However, the mean lesion dose recovered to 83% of the mean skin dose in the 16 cm dog phantom in intersecting cross-firing regions. The percentage volume below a 10% dose threshold was highly dependent on geometry, with ranges for the different collimation configurations of 2-87% and 33-96% for the 2 cm mouse and 16 cm dog heads, respectively. The results of this study
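The PVDR and PMVDR compared in this work reduce to simple statistics over a lateral dose profile: peak dose divided by the minimum valley dose (PVDR) versus peak dose divided by the mean valley dose (PMVDR). A minimal sketch on a toy 1D profile; the function name, profile values and peak mask are illustrative assumptions:

```python
def mrt_metrics(dose, peak_mask):
    """PVDR and PMVDR from a 1D microbeam dose profile.

    dose: lateral dose values across the microbeam array.
    peak_mask: booleans, True inside the microbeam peaks.
    PVDR  = peak dose / minimum valley dose (sensitive to single voxels);
    PMVDR = peak dose / mean valley dose (a more volumetric measure).
    """
    peaks = [d for d, p in zip(dose, peak_mask) if p]
    valley = [d for d, p in zip(dose, peak_mask) if not p]
    pvdr = max(peaks) / min(valley)
    pmvdr = max(peaks) / (sum(valley) / len(valley))
    return pvdr, pmvdr

# Toy profile: two 100 Gy peaks with a valley ranging from 2 to 5 Gy
dose = [100.0, 5.0, 3.0, 2.0, 3.0, 5.0, 100.0]
mask = [True, False, False, False, False, False, True]
pvdr, pmvdr = mrt_metrics(dose, mask)
print(pvdr, round(pmvdr, 2))  # 50.0 27.78
```

Because the PMVDR averages over the valley rather than taking its single lowest voxel, it is always at most the PVDR, consistent with the 0.21-0.79 PMVDR/PVDR ratios reported above.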
Escape from planetary neighbourhoods
Waalkens, H.; Burbanks, A.; Wiggins, S.
2005-01-01
In this paper we use recently developed phase-space transport theory coupled with a so-called classical spectral theorem to develop a dynamically exact and computationally efficient procedure for studying escape from a planetary neighbourhood. The ‘planetary neighbourhood’ is a bounded region of
Planetary Atmospheric Electricity
Leblanc, F; Yair, Y; Harrison, R. G; Lebreton, J. P; Blanc, M
2008-01-01
This volume presents our contemporary understanding of atmospheric electricity at Earth and in other solar system atmospheres. It is written by experts in terrestrial atmospheric electricity and planetary scientists. Many of the key issues related to planetary atmospheric electricity are discussed. The physics presented in this book includes ionisation processes in planetary atmospheres, charge generation and separation, and a discussion of electromagnetic signatures of atmospheric discharges. The measurement of thunderstorms and lightning, including its effects and hazards, is highlighted by articles on ground and space based instrumentation, and new missions.Theory and modelling of planetary atmospheric electricity complete this review of the research that is undertaken in this exciting field of space science. This book is an essential research tool for space scientists and geoscientists interested in electrical effects in atmospheres and planetary systems. Graduate students and researchers who are new to t...
Hargitai, H.
building Lunar or Martian bases. Factors in this category are the presence of water, 24 h communication opportunity with Earth, a radio-noise-free sky, radiation, temperature and other conditions. Since the emergence of the discipline of astrobiology, potentially habitable niches - and especially the so far undiscovered de facto inhabited niches - give very high value to a given landscape. CONCLUSION: As we come into closer touch with planetary surfaces other than our own, and as human (and manned) exploration of the Solar System is again on the agenda, the value of a landscape in a given area will be determined not only by physical geographic or geologic factors but also by new ones: economic, cultural and aesthetic geofactors together. Its study will be more geographic than geologic. The elements listed above can be important when choosing a base or landing site on any planetary body. The landscape values can be merged in a GIS system, and this way we can more easily determine not only landscape types but also the optimal landing sites for future missions. References: [1] Mezősi, G.: A földrajzi táj (geographic landscape), in: Általános természetföldrajz, Budapest, 1993, pp. 807-818. [2] Baker, V. R.: Extraterrestrial Geomorphology: An Introduction. Geomorphology 37 (2001), pp. 175-178. [3] Jakucs, L.: A földrajzi burok kozmogén és endogén dinamikája (Endogenic and Cosmogenic Dynamics of the Geospheres). JATEPress, 1997.
Modeling Radar Scattering by Planetary Regoliths for Varying Angles of Incidence
Prem, P.; Patterson, G. W.; Zimmerman, M. I.
2017-12-01
Bistatic radar observations can play an important role in characterizing the texture and composition of planetary regoliths. Multiple scattering within a closely-packed particulate medium, such as a regolith, can lead to a response referred to as the Coherent Backscatter Opposition Effect (CBOE), associated with an increase in the intensity of backscattered radiation and an increase in Circular Polarization Ratio (CPR) at small bistatic angles. The nature of the CBOE is thought to depend not only on regolith properties, but also on the angle of incidence (Mishchenko, 1992). The latter factor is of particular interest in light of recent radar observations of the Moon over a range of bistatic and incidence angles by the Mini-RF instrument (on board the Lunar Reconnaissance Orbiter), operating in bistatic mode with a ground-based transmitter at the Arecibo Observatory. These observations have led to some intriguing results that are not yet well-understood - for instance, the lunar South Polar crater Cabeus shows an elevated CPR at only some combinations of incidence angle/bistatic angle, a potential clue to the depth distribution of water ice at the lunar poles (Patterson et al., 2017). Our objective in this work is to develop a model for radar scattering by planetary regoliths that can assist in the interpretation of Mini-RF observations. We approach the problem by coupling the Multiple Sphere T-Matrix (MSTM) code of Mackowski and Mishchenko (2011) to a Monte Carlo radiative transfer model. The MSTM code is based on the solution of Maxwell's equations for the propagation of electromagnetic waves in the presence of a cluster of scattering/absorbing spheres, and can be used to model the scattering of radar waves by an aggregation of nominal regolith particles. The scattering properties thus obtained serve as input to the Monte Carlo model, which is used to simulate radar scattering at larger spatial scales. The Monte Carlo approach has the advantage of being able to
Christoffersen, R.; Rahman, Z.; Keller, L. P.; Dukes, C.; Baragiola, R.
2012-01-01
Energetic ions present in the diverse plasma conditions in space play a significant role in the formation and modification of solid phases found in environments ranging from the interstellar medium (ISM) to the surfaces of airless bodies such as asteroids and the Moon. These effects are often referred to as space radiation processing, a term that encompasses changes induced in natural space-exposed materials that may be only structural, such as in radiation-induced amorphization, or may involve ion-induced nanoscale to microscale chemical changes, as occurs in preferential sputtering and ion-beam mixing. Ion sputtering in general may also be responsible for partial or complete erosion of space exposed materials, in some instances possibly bringing about the complete destruction of free-floating solid grains in the ISM or in circumstellar nebular dust clouds. We report here on two examples of the application of high-resolution and analytical transmission electron microscopy (TEM) to problems in space radiation processing. The first problem concerns the role of space radiation processing in controlling the overall fate of Fe sulfides as hosts for sulfur in the ISM. The second problem concerns the known, but as yet poorly quantified, role of space radiation processing in lunar space weathering.
International Nuclear Information System (INIS)
2013-01-01
Chapter one presents the composition of matter and atomic theory; matter structure; transitions; the origin of radiation; radioactivity; nuclear radiation; interactions in decay processes; and radiation produced by the interaction of radiation with matter
Gazetteer of Planetary Nomenclature
National Aeronautics and Space Administration — Planetary nomenclature, like terrestrial nomenclature, is used to uniquely identify a feature on the surface of a planet or satellite so that the feature can be...
X-ray observations of planetary nebulae
International Nuclear Information System (INIS)
Apparao, K.M.V.; Tarafdar, S.P.
1990-01-01
The Einstein satellite was used to observe 19 planetary nebulae, and X-ray emission was detected from four of them. The EXOSAT satellite observed 12 planetary nebulae and five new sources were detected. An Einstein HRI observation shows that NGC 246 is a point source, implying that the X-rays are from the central star. Most of the detected planetary nebulae are old, and the X-rays are observed during the later stage of planetary nebula/central star evolution, when the nebula has dispersed sufficiently and/or when the central star has aged and the heavy elements in its atmosphere have settled down due to gravitation. However, in two cases where the central star is sufficiently luminous, X-rays were observed even though the nebulae were young; the X-radiation ionizes the nebula to a degree that allows negligible absorption in the nebula. The temperature Tx is obtained using the X-ray flux and optical magnitude and assuming a blackbody spectrum; Tx agrees with the Zanstra temperature obtained from optical helium lines. (author)
Antitwilight II: Monte Carlo simulations.
Richtsmeier, Steven C; Lynch, David K; Dearborn, David S P
2017-07-01
For this paper, we employ the Monte Carlo scene (MCScene) radiative transfer code to elucidate the underlying physics giving rise to the structure and colors of the antitwilight, i.e., twilight opposite the Sun. MCScene calculations successfully reproduce colors and spatial features observed in videos and still photos of the antitwilight taken under clear, aerosol-free sky conditions. Through simulations, we examine the effects of solar elevation angle, Rayleigh scattering, molecular absorption, aerosol scattering, multiple scattering, and surface reflectance on the appearance of the antitwilight. We also compare MCScene calculations with predictions made by the MODTRAN radiative transfer code for a solar elevation angle of +1°.
Monte Carlo Particle Transport: Algorithm and Performance Overview
International Nuclear Information System (INIS)
Gentile, N.; Procassini, R.; Scott, H.
2005-01-01
Monte Carlo methods are frequently used for neutron and radiation transport. These methods have several advantages, such as relative ease of programming and dealing with complex meshes. Disadvantages include long run times and statistical noise. Monte Carlo photon transport calculations also often suffer from inaccuracies in matter temperature due to the lack of implicitness. In this paper we discuss the Monte Carlo algorithm as it is applied to neutron and photon transport, detail the differences between neutron and photon Monte Carlo, and give an overview of the ways the numerical method has been modified to deal with issues that arise in photon Monte Carlo simulations
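The heart of such a transport calculation is sampling free path lengths from an exponential distribution between collisions. A minimal uncollided-transmission sketch (a generic illustration under assumed parameters, not the algorithm of any particular production code):

```python
import math
import random

def transmit_fraction(mu, thickness, n=200_000, seed=1):
    """Monte Carlo estimate of uncollided photon transmission through a slab.

    Free path lengths are sampled from the exponential distribution
    s = -ln(xi)/mu; a photon 'survives' if its first collision lies beyond
    the slab. The analytic answer is exp(-mu * thickness).
    """
    rng = random.Random(seed)
    survived = 0
    for _ in range(n):
        s = -math.log(1.0 - rng.random()) / mu   # distance to first collision
        if s > thickness:
            survived += 1
    return survived / n

# mu in 1/cm, thickness in cm: mu*t = 1, so expect about exp(-1) = 0.3679
mc = transmit_fraction(mu=0.2, thickness=5.0)
print(mc)
```

The statistical noise mentioned above is visible directly: the estimate fluctuates around exp(-1) with a standard error that shrinks only as 1/sqrt(n), which is why variance reduction and long run times dominate practical Monte Carlo work.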
Variational Monte Carlo Technique
Indian Academy of Sciences (India)
ias
RESONANCE, August 2014, General Article. Variational Monte Carlo Technique: Ground State Energies of Quantum Mechanical Systems. Sukanta Deb. Keywords: variational methods, Monte Carlo techniques, harmonic oscillators, quantum mechanical systems. Sukanta Deb is an Assistant Professor in the
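For the harmonic oscillator treated in the article, the variational Monte Carlo recipe is Metropolis sampling of |ψ|² followed by averaging of the local energy. A minimal sketch in natural units (ħ = m = ω = 1); step size and sample counts are arbitrary choices:

```python
import math
import random

def vmc_energy(alpha, n_steps=50_000, step=1.0, seed=7):
    """Variational Monte Carlo for the 1D harmonic oscillator.

    Trial wavefunction psi(x) = exp(-alpha x^2 / 2); Metropolis sampling
    of |psi|^2 and averaging of the local energy
        E_L(x) = alpha/2 + x^2 (1 - alpha^2) / 2.
    """
    rng = random.Random(seed)
    x, e_sum = 0.0, 0.0
    for _ in range(n_steps):
        x_new = x + step * (2.0 * rng.random() - 1.0)
        # Metropolis acceptance on |psi|^2 = exp(-alpha x^2)
        if rng.random() < math.exp(-alpha * (x_new ** 2 - x ** 2)):
            x = x_new
        e_sum += alpha / 2.0 + x * x * (1.0 - alpha * alpha) / 2.0
    return e_sum / n_steps

# At alpha = 1 the trial function is exact, so E_L is constant:
print(vmc_energy(1.0))   # 0.5
print(vmc_energy(0.8))   # above 0.5, as the variational bound requires
```

Minimizing the estimated energy over alpha recovers the exact ground state at alpha = 1, where the local energy has zero variance.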
Indian Academy of Sciences (India)
Keywords: Gibbs sampling, Markov chain Monte Carlo, Bayesian inference, stationary distribution, convergence, image restoration. Arnab Chakraborty. We describe the mathematics behind the Markov chain Monte Carlo method of ...
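A standard illustration of Gibbs sampling, one of the MCMC methods referred to here, is the standard bivariate normal, whose full conditionals are univariate normals. A minimal sketch (the correlation value and chain lengths are illustrative):

```python
import math
import random

def gibbs_bivariate_normal(rho, n=20_000, burn=1_000, seed=3):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    The full conditionals are x | y ~ N(rho*y, 1-rho^2) and
    y | x ~ N(rho*x, 1-rho^2); alternately sampling them leaves the joint
    distribution stationary. Returns the sample correlation of the chain.
    """
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)
    x = y = 0.0
    xs, ys = [], []
    for i in range(n + burn):
        x = rng.gauss(rho * y, sd)   # draw x from its full conditional
        y = rng.gauss(rho * x, sd)   # draw y from its full conditional
        if i >= burn:                # discard burn-in samples
            xs.append(x)
            ys.append(y)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
    vx = sum((a - mx) ** 2 for a in xs) / n
    vy = sum((b - my) ** 2 for b in ys) / n
    return cov / math.sqrt(vx * vy)

print(gibbs_bivariate_normal(0.8))  # close to 0.8
```

The sample correlation converging to rho is exactly the stationary-distribution property the article's keywords point at.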
Nandipati, Giridhar; Setyawan, Wahyu; Heinisch, Howard L.; Roche, Kenneth J.; Kurtz, Richard J.; Wirth, Brian D.
2016-01-01
Object kinetic Monte Carlo was employed to study the effect of dose rate on the evolution of vacancy microstructure in polycrystalline tungsten under neutron bombardment. The evolution was followed up to 1.0 displacement per atom (dpa) with point defects generated in accordance with a primary knock-on atom (PKA) spectrum corresponding to 14-MeV neutrons. The present study includes the effect of grain size (2.0 and 4.0 µm) but excludes the impact of transmutation or pre-existing defects be...
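Object kinetic Monte Carlo advances such a simulation with the residence-time (BKL) algorithm: at each step an event is chosen with probability proportional to its rate, and the clock advances by an exponentially distributed waiting time. A minimal sketch with hypothetical event names and rates (not the defect catalogue or rates of this study):

```python
import math
import random

def kmc_run(rates, t_end, seed=11):
    """Residence-time (BKL) kinetic Monte Carlo loop.

    rates: dict mapping event name -> rate (1/s). Each step picks an event
    with probability rate/total and advances time by dt = -ln(u)/total.
    Returns how many times each event fired before t_end.
    """
    rng = random.Random(seed)
    total = sum(rates.values())
    counts = {k: 0 for k in rates}
    t = 0.0
    while True:
        t += -math.log(1.0 - rng.random()) / total   # exponential waiting time
        if t >= t_end:
            break
        r = rng.random() * total                     # roulette-wheel selection
        acc = 0.0
        for name, rate in rates.items():
            acc += rate
            if r < acc:
                counts[name] += 1
                break
    return counts

# Hypothetical vacancy hop vs. cluster dissociation rates (1/s):
print(kmc_run({"hop": 100.0, "dissociate": 1.0}, t_end=10.0))
```

Because the time step is set by the total rate rather than a fixed increment, the method spends no effort on intervals where nothing happens, which is what makes kMC efficient for slow microstructural evolution.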
Non-planetary Science from Planetary Missions
Elvis, M.; Rabe, K.; Daniels, K.
2015-12-01
Planetary science is naturally focussed on the issues of the origin and history of solar systems, especially our own. The implications of an early turbulent history of our solar system reach into many areas including the origin of Earth's oceans, of ores in the Earth's crust and possibly the seeding of life. There are however other areas of science that stand to be developed greatly by planetary missions, primarily to small solar system bodies. The physics of granular materials has been well-studied in Earth's gravity, but lacks a general theory. Because of the compacting effects of gravity, some experiments desired for testing these theories remain impossible on Earth. Studying the behavior of a micro-gravity rubble pile -- such as many asteroids are believed to be -- could provide a new route towards exploring general principles of granular physics. These same studies would also prove valuable for planning missions to sample these same bodies, as techniques for anchoring and deep sampling are difficult to plan in the absence of such knowledge. In materials physics, first-principles total-energy calculations for compounds of a given stoichiometry have identified metastable, or even stable, structures distinct from known structures obtained by synthesis under laboratory conditions. The conditions in the proto-planetary nebula, in the slowly cooling cores of planetesimals, and in the high speed collisions of planetesimals and their derivatives, are all conditions that cannot be achieved in the laboratory. Large samples from comets and asteroids offer the chance to find crystals with these as-yet unobserved structures as well as more exotic materials. Some of these could have unusual properties important for materials science. Meteorites give us a glimpse of these exotic materials, several dozen of which are known that are unique to meteorites. But samples retrieved directly from small bodies in space will not have been affected by atmospheric entry, warmth or
Public Participation in Planetary Exploration
Friedman, Louis
2000-07-01
In the past several years The Planetary Society has created several innovative opportunities for general public participation in the exploration of the solar system and the search for extraterrestrial life. The conduct of such exploration has traditionally been the province of, at most, a few thousand professionally involved scientists and engineers. Yet the rationale for spending the resources required by broad and far-reaching exploration involves a greater societal interest, it frequently being noted that the rationale cannot rely on science alone. This paper reports on the more notable of the opportunities for general public participation, in particular: 1) Visions of Mars: a CD containing works of science fiction about Mars, designed to be placed on Mars as the first library to be found by eventual human explorers; 2) MAPEX: a Microelectronics And Photonics Experiment, measuring the radiation environment for future human explorers of Mars, and containing an electron-beam lithograph of the names of all the members of The Planetary Society at a particular time; 3) Naming of spacecraft: involvement in the naming of the Magellan and Sojourner spacecraft; 4) The Mars Microphone: the first privately funded instrument to be sent to another world; 5) Red Rover Goes to Mars: the first commercial-education partnership on a planetary mission; 6) Student-designed nanoexperiments to fly on a Mars lander; and 7) SETI@home: a tool permitting millions to contribute to research and data processing in the search for extraterrestrial intelligence. A brief description of each of the projects will be given, and the opportunity it provided for public participation described. The evolving complexity of these projects suggests that more opportunities will be found, and that the role of public participation can increase while at the same time making substantive contributions to the flight missions. It will be suggested that these projects presage the day that planetary exploration will be truly
Measuring and interpreting X-ray fluorescence from planetary surfaces.
Owens, Alan; Beckhoff, Burkhard; Fraser, George; Kolbe, Michael; Krumrey, Michael; Mantero, Alfonso; Mantler, Michael; Peacock, Anthony; Pia, Maria-Grazia; Pullan, Derek; Schneider, Uwe G; Ulm, Gerhard
2008-11-15
As part of a comprehensive study of X-ray emission from planetary surfaces and in particular the planet Mercury, we have measured fluorescent radiation from a number of planetary analog rock samples using monochromatized synchrotron radiation provided by the BESSY II electron storage ring. The experiments were carried out using a purpose built X-ray fluorescence (XRF) spectrometer chamber developed by the Physikalisch-Technische Bundesanstalt, Germany's national metrology institute. The XRF instrumentation is absolutely calibrated and allows for reference-free quantitation of rock sample composition, taking into account secondary photon- and electron-induced enhancement effects. The fluorescence data, in turn, have been used to validate a planetary fluorescence simulation tool based on the GEANT4 transport code. This simulation can be used as a mission analysis tool to predict the time-dependent orbital XRF spectral distributions from planetary surfaces throughout the mapping phase.
Airships for Planetary Exploration
Colozza, Anthony
2004-01-01
The feasibility of utilizing an airship for planetary atmospheric exploration was assessed. The environmental conditions of the planets and moons within our solar system were evaluated to determine their applicability for airship flight. A station-keeping mission of 50 days in length was used as the baseline mission. Airship sizing was performed utilizing both solar power and isotope power to meet the baseline mission goal at the selected planetary location. The results show that an isotope-powered airship is feasible within the lower atmospheres of Venus and Saturn's moon Titan.
International Nuclear Information System (INIS)
Hudzietzova, J.; Sabol, J.; Fueloep, M.
2013-01-01
In this paper, absorbed doses in the organs of caregivers were calculated using the Monte Carlo method (MCNPX code); from these, the equivalent doses in those organs were derived using the appropriate formulas, and effective doses were then obtained for selected geometries involving protective shielding devices. The results show that the use of shielding aprons with a lead equivalent of 1 mm will reduce the exposure of workers caring for patients after administration of the radionuclide I-131 by about 30%. If a caregiver without a protective shielding apron is located between two patients, the gamma-ray exposure will be reduced by about 18% owing to the shielding effect of the caregiver's averted body, while the worker's personal dosimeter located at the chest will register an approximately 40% lower personal dose equivalent. (authors)
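As a rough cross-check of the reported ~30% reduction, a narrow-beam attenuation estimate for 1 mm of lead at the 364 keV I-131 photon energy can be sketched. The attenuation coefficient below is our assumed round figure, and scatter build-up and broad-beam geometry (which the MCNPX simulation handles) are ignored:

```python
import math

def apron_transmission(mu_per_mm, thickness_mm):
    """Narrow-beam photon transmission exp(-mu * t) through an apron.

    First-order estimate only: scatter build-up and broad-beam geometry
    are ignored. mu_per_mm is an ASSUMED linear attenuation coefficient
    for lead at 364 keV (taken here as roughly 0.3 per mm).
    """
    return math.exp(-mu_per_mm * thickness_mm)

# Fractional dose reduction behind a 1 mm lead-equivalent apron:
print(round(1.0 - apron_transmission(0.3, 1.0), 2))  # 0.26
```

The ~26% narrow-beam figure is of the same order as the ~30% reduction found in the simulation; an exact match is not expected, since build-up and geometry are neglected here.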
International Nuclear Information System (INIS)
Ramirez Montenegro, E.S. del
2000-01-01
In the present thesis the radiographic techniques used by students in the clinics of the Faculty of Odontology of the Universidad de San Carlos were evaluated. The sample comprised 56 fourth- and fifth-year students; a survey form was designed covering the radiographic technique, the patient, film set-up and cone alignment, as well as exposure repetitions and their causes. It was concluded that the paralleling technique is used by 46% of the students, the bisecting-angle technique by 41%, both techniques by 13%, and the bitewing technique by 100%. Regarding equipment set-up prior to exposure, 88% of the students set up the equipment in an acceptable way; 88% used the XCP accessory to hold the film, but without disinfection procedures and without setting it up properly. Some 92% of the evaluated students had to repeat exposures owing to incorrect application of the radiographic techniques
International Nuclear Information System (INIS)
Bakht, M.K.; Haddadi, A.; Sadeghi, M.; Ahmadi, S.J.; Sadjadi, S.S.; Tenreiro, C.
2013-01-01
Previously, a promising β−-emitting praseodymium-142 glass seed was proposed for brachytherapy of prostate cancer. In accordance with that study, a 142Pr capillary tube-based radioactive implant (CTRI) was suggested as a source with a new structure to broaden the application of β−-emitting radioisotopes such as 142Pr in brachytherapy. Praseodymium oxide powder was encapsulated in a glass capillary tube. Then, a thin and flexible fluorinated ethylene propylene Teflon layer sealed the capillary tube. The source was activated in the Tehran Research Reactor by the 141Pr(n,γ)142Pr reaction. Measurements of the dosimetric parameters were performed using GafChromic radiochromic film. In addition, the dose rate distribution of the 142Pr CTRI was calculated by modeling the 142Pr source in a water phantom using the Monte Carlo N-Particle Transport (MCNP5) code. The active source was unreactive and did not leak in water. In comparison with the earlier proposed 142Pr seed, the suggested source showed similarly desirable dosimetric characteristics. Moreover, the 142Pr CTRI production procedure may be technically and economically more feasible. The mass of praseodymium in the CTRI structure could be greater than that of the 142Pr glass seed; therefore, the required irradiation time and neutron flux could be reduced. A 142Pr CTRI was proposed for brachytherapy of prostate cancer. Dosimetric calculations by experimental measurement and Monte Carlo simulation were performed to fulfill the requirements of the American Association of Physicists in Medicine recommendations before the clinical use of new brachytherapy sources. The characteristics of the suggested source were compared with those of the previously proposed 142Pr glass seed. (author)
New and misclassified planetary nebulae
International Nuclear Information System (INIS)
Kohoutek, L.
1978-01-01
Since the 'Catalogue of Galactic Planetary Nebulae', 226 new objects have been classified as planetary nebulae. They are summarized in the form of designations, names, coordinates and references to the discovery. A further 9 new objects have been added and called 'proto-planetary nebulae', but their status is still uncertain. Only 34 objects have been included in the present list of misclassified planetary nebulae, although the number of doubtful cases is much larger. (Auth.)
International Nuclear Information System (INIS)
Mathis, J.S.
1978-01-01
The author's review concentrates on theoretical aspects of dust in planetary nebulae (PN). He considers the questions: how much dust is there in PN; what is its composition; and what effects does it have on the ionization structure and on the dynamics of the nebula. (Auth.)
On Aryabhata's Planetary Constants
Kak, Subhash
2001-01-01
This paper examines the theory of a Babylonian origin of Aryabhata's planetary constants. It shows that Aryabhata's basic constant is closer to the Indian counterpart than to the Babylonian one. Sketching connections between Aryabhata's framework and earlier Indic astronomical ideas on yugas and cyclic calendar systems, it is argued that Aryabhata's system is an outgrowth of an earlier Indic tradition.
The planetary scientist's companion
Lodders, Katharina
1998-01-01
A comprehensive and practical book of facts and data about the Sun, planets, asteroids, comets, meteorites, the Kuiper belt and Centaur objects in our solar system. Also covered are properties of nearby stars, the interstellar medium, and extra-solar planetary systems.
Directory of Open Access Journals (Sweden)
Kuczyński Paweł
2014-06-01
Full Text Available The paper deals with the solution of radiation heat transfer problems in enclosures filled with a nonparticipating medium, using ray tracing on hierarchical ortho-Cartesian meshes. The idea behind the approach is that radiative heat transfer problems can be solved on much coarser grids than their counterparts from computational fluid dynamics (CFD). The resulting code is designed as an add-on to OpenFOAM, an open-source CFD program. An ortho-Cartesian mesh involving boundary elements is created based upon the CFD mesh. Parametric non-uniform rational basis spline (NURBS) surfaces are used to define the boundaries of the enclosure, allowing domains of complex shapes to be handled. An algorithm for determining random, uniformly distributed locations of rays leaving the NURBS surfaces is described. The paper presents results of test cases assuming gray diffusive walls. In the current version of the model the radiation is not absorbed within gases. However, the ultimate aim of the work is to extend the model to problems in absorbing, emitting and scattering media, iteratively projecting the results of the radiative analysis onto the CFD mesh and the CFD solution onto the radiative mesh.
SPEX: the Spectropolarimeter for Planetary Exploration
Rietjens, J. H. H.; Snik, F.; Stam, D. M.; Smit, J. M.; van Harten, G.; Keller, C. U.; Verlaan, A. L.; Laan, E. C.; ter Horst, R.; Navarro, R.; Wielinga, K.; Moon, S. G.; Voors, R.
2017-11-01
We present SPEX, the Spectropolarimeter for Planetary Exploration, which is a compact, robust and low-mass spectropolarimeter designed to operate from an orbiting or in situ platform. Its purpose is to simultaneously measure the radiance and the state (degree and angle) of linear polarization of sunlight that has been scattered in a planetary atmosphere and/or reflected by a planetary surface with high accuracy. The degree of linear polarization is extremely sensitive to the microphysical properties of atmospheric or surface particles (such as size, shape, and composition), and to the vertical distribution of atmospheric particles, such as cloud top altitudes. Measurements such as those performed by SPEX are therefore crucial and often the only tool for disentangling the many parameters that describe planetary atmospheres and surfaces. SPEX uses a novel, passive method for its radiance and polarization observations that is based on a carefully selected combination of polarization optics. This method, called spectral modulation, is the modulation of the radiance spectrum in both amplitude and phase by the degree and angle of linear polarization, respectively. The polarization optics consists of an achromatic quarter-wave retarder, an athermal multiple-order retarder, and a polarizing beam splitter. We will show first results obtained with the recently developed prototype of the SPEX instrument, and present a performance analysis based on a dedicated vector radiative transport model together with a recently developed SPEX instrument simulator.
Valiente, D
2001-01-01
In this thesis, the radiation protection procedures used by dentistry students, as well as the infrastructure of equipment, protective barriers and protective devices at the faculty clinic, were evaluated. A sample of 76 students and two technicians was evaluated, along with 7 dental units with x-ray tubes. The conclusions are that only 2 x-ray units meet the requirements of radiation safety, and that the radiological techniques used by the students need to be improved in order to obtain good image quality and therefore enable better diagnoses.
International Nuclear Information System (INIS)
Valiente, Dalsy
2001-01-01
In this thesis, the radiation protection procedures used by dentistry students, as well as the infrastructure of equipment, protective barriers and protective devices at the faculty clinic, were evaluated. A sample of 76 students and two technicians was evaluated, along with 7 dental units with x-ray tubes. The conclusions are that only 2 x-ray units meet the requirements of radiation safety, and that the radiological techniques used by the students need to be improved in order to obtain good image quality and therefore enable better diagnoses
Monte Carlo methods for particle transport
Haghighat, Alireza
2015-01-01
The Monte Carlo method has become the de facto standard in radiation transport. Although powerful, if not understood and used appropriately, the method can give misleading results. Monte Carlo Methods for Particle Transport teaches appropriate use of the Monte Carlo method, explaining the method's fundamental concepts as well as its limitations. Concise yet comprehensive, this well-organized text: * Introduces the particle importance equation and its use for variance reduction * Describes general and particle-transport-specific variance reduction techniques * Presents particle transport eigenvalue issues and methodologies to address these issues * Explores advanced formulations based on the author's research activities * Discusses parallel processing concepts and factors affecting parallel performance Featuring illustrative examples, mathematical derivations, computer algorithms, and homework problems, Monte Carlo Methods for Particle Transport provides nuclear engineers and scientists with a practical guide ...
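The variance reduction ideas the book covers, such as implicit capture and Russian roulette, can be illustrated with a toy one-dimensional, forward-only slab transmission problem, where the exact answer exp(-Σa·T) is known. The model and all cross-section values below are invented for the sketch and are not taken from the book:

```python
import math
import random

def transmit_analog(n, sig_t, sig_a, T, rng):
    """Analog MC for a forward-only toy slab: absorb or keep going at each collision."""
    hits = 0
    for _ in range(n):
        x = 0.0
        while True:
            x += -math.log(rng.random()) / sig_t      # free flight to next collision
            if x >= T:                                # escaped through the slab
                hits += 1
                break
            if rng.random() < sig_a / sig_t:          # analog capture
                break
    return hits / n

def transmit_implicit(n, sig_t, sig_a, T, rng, w_cut=0.1):
    """Implicit capture: carry a weight instead of killing the particle at capture;
    Russian roulette terminates low-weight histories without biasing the mean."""
    total = 0.0
    surv = 1.0 - sig_a / sig_t
    for _ in range(n):
        x, w = 0.0, 1.0
        while True:
            x += -math.log(rng.random()) / sig_t
            if x >= T:
                total += w
                break
            w *= surv                                 # capture folded into the weight
            if w < w_cut:                             # Russian roulette
                if rng.random() < 0.5:
                    break
                w *= 2.0
    return total / n

rng = random.Random(1)
sig_t, sig_a, T = 1.0, 0.5, 4.0
exact = math.exp(-sig_a * T)                          # forward-only: only capture matters
est_a = transmit_analog(40000, sig_t, sig_a, T, rng)
est_i = transmit_implicit(40000, sig_t, sig_a, T, rng)
print(f"analog {est_a:.4f}  implicit {est_i:.4f}  exact {exact:.4f}")
```

Both estimators converge to the same answer; the point of the weighted scheme is that no history is wasted on an early capture.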
Bergström, Ida; Elfgren, Erik
2013-06-11
At the particle physics laboratory CERN in Geneva, Switzerland, the Neutron Time-of-Flight facility has recently started the construction of a second experimental line. The new neutron beam line will unavoidably induce radiation in both the experimental area and in nearby accessible areas. Computer simulations for the minimization of the background were carried out using the FLUKA Monte Carlo simulation package. The background radiation in the new experimental area needs to be kept to a minimum during measurements. This was studied with focus on the contributions from backscattering in the beam dump. The beam dump was originally designed for shielding the outside area using a block of iron covered in concrete. However, the backscattering was never studied in detail. In this thesis, the fluences (i.e. the flux integrated over time) of neutrons and photons were studied in the experimental area while the beam dump design was modified. An optimized design was obtained by stopping the fast neutrons in a high Z mat...
Specialized Monte Carlo codes versus general-purpose Monte Carlo codes
International Nuclear Information System (INIS)
Moskvin, Vadim; DesRosiers, Colleen; Papiez, Lech; Lu, Xiaoyi
2002-01-01
The possibilities of Monte Carlo modeling for dose calculations and treatment optimization are quite limited in radiation oncology applications. The main reason is that the Monte Carlo technique for dose calculations is time consuming, while treatment planning may require hundreds of possible cases of dose simulations to be evaluated for dose optimization. The second reason is that the general-purpose codes widely used in practice require an experienced user to customize them for calculations. This paper discusses a concept of Monte Carlo code design that can avoid the main problems that are preventing widespread use of this simulation technique in medical physics. (authors)
Approximating Sievert Integrals to Monte Carlo Methods to calculate ...
African Journals Online (AJOL)
Radiation dose rates along the transverse axis of a miniature 192Ir source were calculated using the Sievert integral (considered simple but inaccurate) and by the sophisticated and accurate Monte Carlo method. Using data obtained by the Monte Carlo method as a benchmark and applying a least squares regression curve ...
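A minimal sketch of the comparison described here: evaluating the Sievert integral S(θ, μt) = ∫₀^θ exp(-μt/cos u) du both by simple quadrature and by plain Monte Carlo sampling of the integrand. The angle and filtration values are arbitrary illustrations, not the 192Ir source data of the paper:

```python
import math
import random

def sievert_quadrature(theta, mu_t, n=10000):
    """Midpoint-rule evaluation of S(theta, mu*t) = integral_0^theta exp(-mu*t/cos u) du."""
    h = theta / n
    return h * sum(math.exp(-mu_t / math.cos((i + 0.5) * h)) for i in range(n))

def sievert_monte_carlo(theta, mu_t, n=200000, seed=0):
    """Plain Monte Carlo for the same integral: sample u uniformly on (0, theta)."""
    rng = random.Random(seed)
    acc = sum(math.exp(-mu_t / math.cos(rng.uniform(0.0, theta))) for _ in range(n))
    return theta * acc / n

theta, mu_t = math.radians(60.0), 0.3          # illustrative angle and filtration
q = sievert_quadrature(theta, mu_t)
mc = sievert_monte_carlo(theta, mu_t)
print(f"quadrature {q:.4f}  monte carlo {mc:.4f}")
```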
Johnson, Daniel; Chen, Yong; Ahmad, Salahuddin
2015-01-01
The factors influencing carbon ion therapy can be predicted from accurate knowledge of the production of secondary particles from the interaction of carbon ions in water/tissue-like materials, and subsequently the interaction of the secondary particles in the same materials. The secondary particles may have linear energy transfer (LET) values that potentially increase the relative biological effectiveness of the beam. Our primary objective in this study was to classify and quantify the secondary particles produced, their dose-averaged LETs, and their dose contributions in the absorbing material. A 1 mm diameter carbon ion pencil beam with energies per nucleon of 155, 262, and 369 MeV was used in a Geant4 (GEometry ANd Tracking) Monte Carlo simulation to interact in a 27 L water phantom containing 3000 rectangular detector voxels. The dose-averaged LET and the dose contributions of primary and secondary particles were calculated from the simulation. The results of the simulations show that the secondary particles contributing a major dose component had low LETs, while particles with LETs above 600 keV/µm contributed only <0.3% of the dose.
Directory of Open Access Journals (Sweden)
Daniel Johnson
2015-01-01
Full Text Available The factors influencing carbon ion therapy can be predicted from accurate knowledge of the production of secondary particles from the interaction of carbon ions in water/tissue-like materials, and subsequently the interaction of the secondary particles in the same materials. The secondary particles may have linear energy transfer (LET) values that potentially increase the relative biological effectiveness of the beam. Our primary objective in this study was to classify and quantify the secondary particles produced, their dose-averaged LETs, and their dose contributions in the absorbing material. A 1 mm diameter carbon ion pencil beam with energies per nucleon of 155, 262, and 369 MeV was used in a Geant4 (GEometry ANd Tracking) Monte Carlo simulation to interact in a 27 L water phantom containing 3000 rectangular detector voxels. The dose-averaged LET and the dose contributions of primary and secondary particles were calculated from the simulation. The results of the simulations show that the secondary particles contributing a major dose component had low LETs, while particles with LETs above 600 keV/µm contributed only <0.3% of the dose.
International Nuclear Information System (INIS)
Salem, Youbba-Ould
2014-01-01
We characterize a passive dosimeter capable of measuring both fast and thermal neutrons for ambient and personal dosimetry. These neutrons can be detected in a mixed neutron-gamma field with appropriate converters (polyethylene for fast neutrons, cadmium for thermal neutrons). Monte Carlo simulations with MCNPX helped with the geometrical conception of the dosimeter and the choice of materials. The responses of the RPL dosimeter to these neutrons are linear in H*(10) and Hp(10), with detection limits of 2 mSv for fast neutrons and 0.19 mSv for thermal neutrons. The angular dependencies are satisfactory according to the ISO 21909 norm. A calibration factor of (9.5 ± 0.5)×10^-2 mSv·cm²/RPL signal is obtained for the fast neutrons of the IPHC's 241Am-Be calibrator. This factor is (9.7 ± 0.3)×10^-3 mSv·cm²/RPL signal for the thermalized neutrons. (author)
Directory of Open Access Journals (Sweden)
Bardenet Rémi
2013-07-01
Full Text Available Bayesian inference often requires integrating some function with respect to a posterior distribution. Monte Carlo methods are sampling algorithms that allow these integrals to be computed numerically when they are not analytically tractable. We review here the basic principles and the most common Monte Carlo algorithms, among which are rejection sampling, importance sampling and Markov chain Monte Carlo (MCMC) methods. We give intuition on the theoretical justification of the algorithms as well as practical advice, trying to relate the two. We discuss the application of Monte Carlo methods in experimental physics, and point to landmarks in the literature for the curious reader.
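Two of the reviewed algorithms, rejection sampling and (self-normalized) importance sampling, can be sketched for a half-normal target with an Exp(1) proposal, where the exact mean sqrt(2/π) is available for checking. This is a generic textbook illustration, not code from the review:

```python
import math
import random

rng = random.Random(42)

def rejection_sample_halfnormal(n):
    """Rejection sampling from the half-normal p(x) ∝ exp(-x^2/2), x >= 0,
    using an Exp(1) proposal; the acceptance probability is exp(-(x-1)^2/2)."""
    out = []
    while len(out) < n:
        x = -math.log(rng.random())                      # Exp(1) draw
        if rng.random() < math.exp(-0.5 * (x - 1.0) ** 2):
            out.append(x)
    return out

def importance_mean(n):
    """Self-normalized importance sampling estimate of E_p[X] with an Exp(1)
    proposal; the weight exp(-x^2/2 + x) is proportional to p(x)/q(x)."""
    num = den = 0.0
    for _ in range(n):
        x = -math.log(rng.random())
        w = math.exp(-0.5 * x * x + x)
        num += w * x
        den += w
    return num / den

exact = math.sqrt(2.0 / math.pi)                         # half-normal mean, ~0.798
samples = rejection_sample_halfnormal(50000)
mean_rej = sum(samples) / len(samples)
mean_is = importance_mean(50000)
print(f"rejection {mean_rej:.3f}  importance {mean_is:.3f}  exact {exact:.3f}")
```

Both estimates agree with the analytic mean to Monte Carlo accuracy; the importance weights here are bounded by e^0.5, which keeps the estimator well behaved.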
An Overview of the Monte Carlo Methods, Codes, & Applications Group
Energy Technology Data Exchange (ETDEWEB)
Trahan, Travis John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-08-30
This report sketches the work of the Group to deliver first-principle Monte Carlo methods, production quality codes, and radiation transport-based computational and experimental assessments using the codes MCNP and MCATK for such applications as criticality safety, non-proliferation, nuclear energy, nuclear threat reduction and response, radiation detection and measurement, radiation health protection, and stockpile stewardship.
Planetary seismology and interiors
Toksoz, M. N.
1979-01-01
This report briefly summarizes knowledge gained in the area of planetary seismology in the period 1969-1979. Attention is given to the seismic instruments, the seismic environment (noise, characteristics of seismic wave propagation, etc.), and the seismicity of the moon and Mars as determined by the Apollo missions and Viking Lander experiments, respectively. The models of internal structures of the terrestrial planets are discussed, with the earth used for reference.
Gawlitza, J; Haubenreisser, H; Meyer, M; Hagelstein, C; Sudarski, S; Schoenberg, S O; Henzler, T
2016-01-01
The aim of this study was to systematically compare organ-specific radiation dose levels between a radiation-dose-optimized perfusion CT (dVPCT) protocol of the liver and a tri-phasic standard CT protocol of the liver using a Monte-Carlo-simulation-based analysis platform. The complete CT data of 52 patients (41 males; mean age 65 ± 12) with suspected HCC who underwent dVPCT examinations on a 3rd generation dual-source CT (Somatom Force, Siemens) with a dose-optimized tube voltage of 70 kVp or 80 kVp were exported to an analysis platform (Radimetrics, Bayer). The dVPCT studies were matched with a reference group of 50 patients (35 males; mean age 65 ± 14) who underwent standard tri-phasic CT (sCT) examinations of the liver at 130 kVp, using the calculated water-equivalent diameter of the patients. The analysis platform was used for the calculation of the organ-specific effective dose (ED) as well as global radiation-dose parameters (ICRP 103). The organ-specific ED of the dVPCT protocol was statistically significantly lower when compared to sCT in 14 of 21, and noninferior in a total of 18 of 21, examined items. The dose savings of the dVPCT examinations were especially pronounced in dose-sensitive organs such as the red marrow (17.3 mSv vs 24.6 mSv). In conclusion, the dose-optimized dVPCT protocol reduces effective organ dose levels, especially in dose-sensitive organs, while providing additional functional information, which is of paramount importance in patients undergoing novel targeted therapies.
Tinetti, Giovanna
2014-04-28
Planetary science beyond the boundaries of our Solar System is today in its infancy. Until a couple of decades ago, the detailed investigation of the planetary properties was restricted to objects orbiting inside the Kuiper Belt. Today, we cannot ignore that the number of known planets has increased by two orders of magnitude nor that these planets resemble anything but the objects present in our own Solar System. Whether this fact is the result of a selection bias induced by the kind of techniques used to discover new planets--mainly radial velocity and transit--or simply the proof that the Solar System is a rarity in the Milky Way, we do not know yet. What is clear, though, is that the Solar System has failed to be the paradigm not only in our Galaxy but even 'just' in the solar neighbourhood. This finding, although unsettling, forces us to reconsider our knowledge of planets under a different light and perhaps question a few of the theoretical pillars on which we base our current 'understanding'. The next decade will be critical to advance in what we should perhaps call Galactic planetary science. In this paper, I review highlights and pitfalls of our current knowledge of this topic and elaborate on how this knowledge might arguably evolve in the next decade. More critically, I identify what should be the mandatory scientific and technical steps to be taken in this fascinating journey of remote exploration of planets in our Galaxy.
Farah, J; Bonfrate, A; De Marzi, L; De Oliveira, A; Delacroix, S; Martinetti, F; Trompier, F; Clairand, I
2015-05-01
This study focuses on the configuration and validation of an analytical model predicting leakage neutron doses in proton therapy. Using Monte Carlo (MC) calculations, a facility-specific analytical model was built to reproduce out-of-field neutron doses while separately accounting for the contribution of intra-nuclear cascade, evaporation, epithermal and thermal neutrons. This model was first trained to reproduce in-water neutron absorbed doses and in-air neutron ambient dose equivalents, H*(10), calculated using MCNPX. Its capacity in predicting out-of-field doses at any position not involved in the training phase was also checked. The model was next expanded to enable a full 3D mapping of H*(10) inside the treatment room, tested in a clinically relevant configuration and finally consolidated with experimental measurements. Following the literature approach, the work first proved that it is possible to build a facility-specific analytical model that efficiently reproduces in-water neutron doses and in-air H*(10) values with a maximum difference less than 25%. In addition, the analytical model succeeded in predicting out-of-field neutron doses in the lateral and vertical direction. Testing the analytical model in clinical configurations proved the need to separate the contribution of internal and external neutrons. The impact of modulation width on stray neutrons was found to be easily adjustable while beam collimation remains a challenging issue. Finally, the model performance agreed with experimental measurements with satisfactory results considering measurement and simulation uncertainties. Analytical models represent a promising solution that substitutes for time-consuming MC calculations when assessing doses to healthy organs. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
International Nuclear Information System (INIS)
Meric, Ilker; Johansen, Geir A; Holstad, Marie B; Mattingly, John; Gardner, Robin P
2012-01-01
Prompt gamma-ray neutron activation analysis (PGNAA) has been and still is one of the major methods of choice for the elemental analysis of various bulk samples. This is mostly due to the fact that PGNAA offers a rapid, non-destructive and on-line means of sample interrogation. The quantitative analysis of the prompt gamma-ray data could, on the other hand, be performed either through the single peak analysis or the so-called Monte Carlo library least-squares (MCLLS) approach, of which the latter has been shown to be more sensitive and more accurate than the former. The MCLLS approach is based on the assumption that the total prompt gamma-ray spectrum of any sample is a linear combination of the contributions from the individual constituents or libraries. This assumption leads to, through the minimization of the chi-square value, a set of linear equations which has to be solved to obtain the library multipliers, a process that involves the inversion of the covariance matrix. The least-squares solution may be extremely uncertain due to the ill-conditioning of the covariance matrix. The covariance matrix will become ill-conditioned whenever, in the subsequent calculations, two or more libraries are highly correlated. The ill-conditioning will also be unavoidable whenever the sample contains trace amounts of certain elements or elements with significantly low thermal neutron capture cross-sections. In this work, a new iterative approach, which can handle the ill-conditioning of the covariance matrix, is proposed and applied to a hydrocarbon multiphase flow problem in which the parameters of interest are the separate amounts of the oil, gas, water and salt phases. The results of the proposed method are also compared with the results obtained through the implementation of a well-known regularization method, the truncated singular value decomposition. Final calculations indicate that the proposed approach would be able to treat ill-conditioned cases appropriately. (paper)
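A toy version of the ill-conditioning problem and the truncated-SVD remedy can be sketched with synthetic "library" spectra; two nearly identical Gaussian libraries make the least-squares system ill-conditioned, analogous to the paper's highly correlated case. All spectra, noise levels, and the truncation threshold below are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
E = np.linspace(0.0, 1.0, 50)
lib1 = np.exp(-((E - 0.40) ** 2) / 0.02)      # "library" spectrum 1
lib2 = np.exp(-((E - 0.41) ** 2) / 0.02)      # nearly identical: highly correlated
lib3 = np.exp(-((E - 0.80) ** 2) / 0.01)      # well-separated third library
A = np.column_stack([lib1, lib2, lib3])

m_true = np.array([2.0, 1.0, 0.5])            # true library multipliers
y = A @ m_true + rng.normal(0.0, 0.01, size=E.size)   # noisy measured "spectrum"

def tsvd_solve(A, y, rcond):
    """Least squares via truncated SVD: zero out singular values below rcond*s_max."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    inv_s = np.where(s > rcond * s[0], 1.0 / s, 0.0)
    return Vt.T @ (inv_s * (U.T @ y))

m_tsvd = tsvd_solve(A, y, rcond=0.05)         # drops the small singular value
resid = np.linalg.norm(A @ m_tsvd - y)
print(f"cond(A) = {np.linalg.cond(A):.1f}, residual = {resid:.3f}")
```

Because the two correlated libraries are individually almost indistinguishable, the truncated solution should be judged by how well it reproduces the spectrum rather than by the individual multipliers of the correlated pair.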
Kießling, N.; Bieberle, A.; Hampel, U.
2008-10-01
Limited energy resolution in scintillation-type gamma ray detectors leads to systematic errors in photon counting, because the pulse height discrimination stages cannot accurately discriminate interactions with full versus only partial deposition of the isotopic emission energy. The resulting error is a systematic positive count rate offset originating from erroneously counted scattered photons. The origin of scattering may be the detector itself (scintillation crystals and other construction material) as well as components of the setup, including the object of investigation. In this article, results of a simulation study are presented which was carried out to assess the role of different design parameters for the count rate accuracy of a high resolution gamma ray detector used for transmission tomography. The simulation software GATE (Geant4 Application for Tomographic Emission) was used. As a target parameter we evaluated the radiation cross-talk, which is the amount of erroneously counted interactions from photons that have undergone Compton scattering in neighbouring crystals. For the given detector design it was found that the cross-talk obtained from the simulated data is in good agreement with experimentally determined cross-talk. It could further be shown by virtual detector design changes that radiation cross-talk can be reduced only to a degree that would still require additional software correction measures, such as scattering correction algorithms, if quantitative accuracy is demanded.
Dynamics of Planetary Systems in Star Clusters
Spurzem, R.; Giersz, M.; Heggie, D. C.; Lin, D. N. C.
2009-05-01
At least 10%-15% of nearby Sunlike stars have known Jupiter-mass planets. In contrast, very few planets are found in mature open and globular clusters such as the Hyades and 47 Tuc. We explore here the possibility that this dichotomy is due to the postformation disruption of planetary systems associated with the stellar encounters in long-lived clusters. One supporting piece of evidence for this scenario is the discovery of freely floating low-mass objects in star forming regions. We use two independent numerical approaches, a hybrid Monte Carlo and a direct N-body method, to simulate the impact of the encounters. We show that the results of numerical simulations are in reasonable agreement with analytical determinations in the adiabatic and impulsive limits. They indicate that distant stellar encounters generally do not significantly modify the compact and nearly circular orbits. However, moderately close stellar encounters, which are likely to occur in dense clusters, can excite planets' orbital eccentricity and induce dynamical instability in systems that are closely packed with multiple planets. The disruption of planetary systems occurs primarily through occasional nearly parabolic, nonadiabatic encounters, though eccentricity of the planets evolves through repeated hyperbolic adiabatic encounters that accumulate small-amplitude changes. The detached planets are generally retained by the potential of their host clusters as free floaters in young stellar clusters such as σ Orionis. We compute effective cross sections for the dissolution of planetary systems and show that, for all initial eccentricities, dissolution occurs on timescales that are longer than the dispersion of small stellar associations, but shorter than the age of typical open and globular clusters. Although it is much more difficult to disrupt short-period planets, close encounters can excite modest eccentricity among them, such that subsequent tidal dissipation leads to orbital decay, tidal
Impact-driven planetary desiccation: The origin of the dry Venus
Kurosawa, Kosuke
2015-11-01
The fate of surface water on Venus is one of the most important outstanding problems in comparative planetology. Although Venus should have had a large amount of surface water (like the Earth) during its formation, the current water content of the Venusian surface is only 1 part in 100 000 of the mass of Earth's oceans. Here a new concept, referred to as 'impact-driven planetary desiccation', is proposed to explain water removal on a steam-covered proto-Venus. Since a steam atmosphere is photochemically unstable, water vapor dissociates into hydrogen and oxygen. Hydrogen then escapes easily into space through hydrodynamic escape driven by strong extreme ultraviolet radiation from the young Sun. The focus is on the intense impact bombardment during the terminal stage of planetary accretion as a generator of a significant amount of reducing agent. The fine-grained ejecta remove the residual oxygen, the counterpart of the escaped hydrogen, via the oxidation of iron-bearing rocks in a hot atmosphere. Thus, hypervelocity impacts cause net desiccation of the planetary surface. I constructed a stochastic cratering model using a Monte Carlo approach to investigate the cumulative mass of nonoxidized, ejected rocks due to the intense impact bombardment. The ejecta mass after each impact was calculated using the π-group scaling laws and a modified Maxwell's Z model. The effect of projectile penetration into the ground on the ejecta mass was also included. Next, an upper limit on the total amount of removed water was calculated using the stoichiometric limit of the oxidation of basaltic rocks, taking into account the effect of fast H2 escape. It is shown that a thick steam atmosphere with a mass equivalent to that of the terrestrial oceans would be removed. The cumulative mass of rocky ejecta released into the atmosphere reaches 1 wt% of the host planet, which is 10 000 times the current mass of the Earth's atmosphere. These results strongly suggest that chemical
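The stochastic cratering idea can be caricatured as follows: draw impactor sizes from a truncated power-law distribution and accumulate ejecta mass per impact. The actual model uses the π-group scaling laws and a modified Maxwell Z model; here a constant ejecta-to-projectile mass ratio stands in for them, and every number is illustrative rather than taken from the paper:

```python
import math
import random

def sample_diameter(rng, d_min=1.0e3, d_max=1.0e6, b=2.0):
    """Inverse-CDF draw from a power-law size distribution N(>D) ~ D^-b,
    truncated to [d_min, d_max] metres (all parameters illustrative)."""
    a, c = d_min ** -b, d_max ** -b
    return (a - rng.random() * (a - c)) ** (-1.0 / b)

def total_ejecta_mass(n_impacts, rng, rho=3000.0, ejecta_factor=50.0):
    """Accumulate ejecta over a bombardment; a constant ejecta-to-projectile
    mass ratio stands in for the pi-group scaling of the actual model."""
    total = 0.0
    for _ in range(n_impacts):
        d = sample_diameter(rng)
        m_imp = rho * math.pi / 6.0 * d ** 3     # spherical projectile mass, kg
        total += ejecta_factor * m_imp
    return total

rng = random.Random(7)
print(f"ejecta after 10000 impacts: {total_ejecta_mass(10000, rng):.3e} kg")
```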
The impact of Monte Carlo simulation: a scientometric analysis of scholarly literature
Pia, Maria Grazia; Bell, Zane W; Dressendorfer, Paul V
2010-01-01
A scientometric analysis of Monte Carlo simulation and Monte Carlo codes has been performed over a set of representative scholarly journals related to radiation physics. The results of this study are reported and discussed. They document and quantitatively appraise the role of Monte Carlo methods and codes in scientific research and engineering applications.
Ponomarev, Artem; Cucinotta, F.
2011-01-01
To create a generalized mechanistic model of DNA damage in human cells that will generate analytical and image data corresponding to experimentally observed DNA damage foci and will help to improve the experimental foci yields by simulating spatial foci patterns and resolving problems with quantitative image analysis. Material and Methods: The analysis of patterns of RIFs (radiation-induced foci) produced by low- and high-LET (linear energy transfer) radiation was conducted by using a Monte Carlo model that combines the heavy ion track structure with characteristics of the human genome on the level of chromosomes. The foci patterns were also simulated in the maximum projection plane for flat nuclei. Some data analysis was done with the help of image segmentation software that identifies individual classes of RIFs and colocalized RIFs, which is of importance to some experimental assays that assign DNA damage a dual phosphorescent signal. Results: The model predicts the spatial and genomic distributions of DNA DSBs (double strand breaks) and associated RIFs in a human cell nucleus for a particular dose of either low- or high-LET radiation. We used the model to perform analyses for different irradiation scenarios. In the beam-parallel-to-the-disk-of-a-flattened-nucleus scenario we found that the foci appeared to be merged due to their high density, while, in the perpendicular-beam scenario, the foci appeared as one bright spot per hit. The statistics and spatial distribution of regions of densely arranged foci, termed DNA foci chains, were predicted numerically using this model. Another analysis was done to evaluate the number of ion hits per nucleus, which were visible from streaks of closely located foci. In another analysis, our image segmentation software determined foci yields directly from images with single-class or colocalized foci. Conclusions: We showed that DSB clustering needs to be taken into account to determine the true DNA damage foci yield, which helps to
Europlanet Research Infrastructure: Planetary Simulation Facilities
Davies, G. R.; Mason, N. J.; Green, S.; Gómez, F.; Prieto, O.; Helbert, J.; Colangeli, L.; Srama, R.; Grande, M.; Merrison, J.
2008-09-01
EuroPlanet: The Europlanet Research Infrastructure consortium, funded under FP7, aims to provide the EU planetary science community with greater access to research infrastructure. A series of networking and outreach initiatives will be complemented by joint research activities and the formation of three Trans-National Access distributed service laboratories (TNAs) to provide a unique and comprehensive set of analogue field sites, laboratory simulation facilities, and extraterrestrial sample analysis tools. Here we report on the infrastructure that comprises the second TNA: Planetary Simulation Facilities. Eleven laboratory-based facilities are able to recreate the conditions found in the atmospheres and on the surfaces of planetary systems, with specific emphasis on Martian, Titan and Europa analogues. The strategy has been to offer some overlap in capabilities to ensure access to the highest number of users and to allow for progressive and efficient development strategies, for example initial testing of mobility capability prior to stepwise development within planetary atmospheres that can be made progressively more hostile through the introduction of extreme temperatures, radiation, wind and dust. Europlanet Research Infrastructure Facilities: Mars atmosphere simulation chambers at VUA and OU. These relatively large chambers (up to 1 x 0.5 x 0.5 m) simulate Martian atmospheric conditions, and the dual cooling options at VUA allow stabilised instrument temperatures while the remainder of the sample chamber can be varied between 220 K and 350 K. Researchers can therefore assess analytical protocols for instruments operating on Mars, e.g. the effect of pCO2, temperature and material (e.g., ± ice) on spectroscopic and laser ablation techniques, while monitoring the performance of detection technologies such as CCDs at low T and variable pH2O and pCO2. Titan atmosphere and surface simulation chamber at OU. The chamber simulates Titan's atmospheric composition under a range of
Glass, Brian J.; Thompson, S.; Paulsen, G.
2010-01-01
Several proposed or planned planetary science missions to Mars and other Solar System bodies over the next decade require subsurface access by drilling. This paper discusses the problems of remote robotic drilling, an automation and control architecture based loosely on observed human behaviors in drilling on Earth, and an overview of robotic drilling field test results using this architecture since 2005. Both rotary-drag and rotary-percussive drills are targeted. A hybrid diagnostic approach incorporates heuristics, model-based reasoning and vibration monitoring with neural nets. Ongoing work leads to flight-ready drilling software.
Curley, Casey Michael
Monte Carlo (MC) and Pencil Beam (PB) calculations are compared to their measured planar dose distributions using a 2-D diode array for lung Stereotactic Body Radiation Therapy (SBRT). The planar dose distributions were studied for two different phantom types: an in-house heterogeneous phantom and a homogeneous phantom. The motivation is to mimic the human anatomy during a lung SBRT treatment and incorporate heterogeneities into the pre-treatment Quality Assurance process, where measured and calculated planar dose distributions are compared before the radiation treatment. Individual and combined field dosimetry has been performed for both fixed gantry angle (anterior to posterior) and planned gantry angle delivery. A gamma analysis has been performed for all beam arrangements. The measurements were obtained using the 2-D diode array MapCHECK 2(TM). MC and PB calculations were performed using the BrainLAB iPlan RT(TM) Dose software. The results suggest that with the heterogeneous phantom as a quality assurance device, the MC calculations result in closer agreement with the measured values when using the planned gantry angle delivery method for composite beams. For the homogeneous phantom, the results suggest that the preferred delivery method is at the fixed anterior-to-posterior gantry angle. Furthermore, the MC and PB calculations do not show significant differences for dose-difference and distance-to-agreement criteria of 3%/3 mm. However, PB calculations are in better agreement with the measured values for more stringent gamma criteria when considering individual beams, whereas MC agreement is closer for composite beam measurements.
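The gamma analysis referred to here combines a dose-difference tolerance with a distance-to-agreement tolerance. A minimal 1-D global gamma sketch (not the vendor implementation used in the study; profiles and criteria are illustrative) looks like this:

```python
import numpy as np

def gamma_pass_rate(ref, meas, spacing_mm, dose_tol=0.03, dist_mm=3.0):
    """1-D global gamma analysis (illustrative sketch, not clinical code).
    ref, meas: dose profiles on the same grid; spacing_mm: grid spacing.
    For each measured point, gamma is the minimum over all reference points
    of sqrt((dose diff / tolerance)^2 + (distance / tolerance)^2)."""
    x = np.arange(len(ref)) * spacing_mm
    dmax = ref.max()                      # global normalisation
    gammas = []
    for xi, mi in zip(x, meas):
        dose_term = (mi - ref) / (dose_tol * dmax)
        dist_term = (xi - x) / dist_mm
        gammas.append(np.sqrt(dose_term**2 + dist_term**2).min())
    return np.mean(np.array(gammas) <= 1.0)

# Two Gaussian beam profiles; the "measured" one is shifted 1 mm and 1% hotter,
# well inside the 3%/3 mm criteria, so every point should pass
grid = np.arange(-40, 41, 1.0)
ref = np.exp(-grid**2 / (2 * 15**2))
meas = 1.01 * np.exp(-(grid - 1.0)**2 / (2 * 15**2))
print(f"gamma pass rate (3%/3 mm): {gamma_pass_rate(ref, meas, 1.0):.2%}")
```

Tightening `dose_tol` and `dist_mm` reproduces the "more stringent gamma criteria" comparison discussed in the abstract.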
Lin, Hui; Jing, Jia; Xu, Liangfeng; Mao, Xiaoli
2017-12-01
To evaluate the influence of energy spectra, mesh sizes and high-Z elements on dose and PVDR in Microbeam Radiation Therapy (MRT), based on a 1-D analogy mouse head model (1-D MHM) and a 3-D voxel mouse head phantom (3-D VMHP), by Monte Carlo simulation. A Microbeam-Array-Source-Model was implemented into EGSnrc/DOSXYZnrc. The microbeam size is assumed to be 25 μm, 50 μm or 75 μm in thickness and a fixed 1 mm in height, with 200 μm c-t-c (center-to-center spacing). The influence of the energy spectra of ID17@ESRF and BMIT@CLS was investigated. The mesh size was optimized. PVDR in the 1-D MHM and 3-D VMHP was compared with the homogeneous water phantom. The arc influence of the 3-D VMHP filled with water (3-D VMHWP) was compared with the rectangular phantom. The PVDR of the lower BMIT@CLS spectrum is 2.4 times that of ID17@ESRF owing to its lower valley dose. The optimized mesh is 5 μm for the 25 μm microbeam, and 10 μm for the 50 μm and 75 μm microbeams, with 200 μm c-t-c. A 500 μm skull layer could produce a PVDR difference of up to 62.5% for the 1-D MHM; however, this influence is limited at greater depths and for the 3-D heterogeneous media. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
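The PVDR (peak-to-valley dose ratio) quoted throughout this abstract is simply the ratio of the mean dose inside the microbeams to the mean dose between them. The bookkeeping can be shown on an idealized step profile (the 100 Gy peak and 2 Gy valley values below are hypothetical, not the study's results):

```python
import numpy as np

# Idealized microbeam array: 50 um wide peaks on a 200 um c-t-c pitch,
# scored on a 5 um mesh over 1 mm (all dose values are illustrative)
x = np.arange(0, 1000, 5.0)                     # scoring positions in um
pitch, width = 200.0, 50.0
in_peak = (x % pitch) < width                   # inside a microbeam "peak"
dose = np.where(in_peak, 100.0, 0.0) + 2.0      # uniform 2 Gy valley leakage

peak_dose = dose[in_peak].mean()
valley_dose = dose[~in_peak].mean()
pvdr = peak_dose / valley_dose
print(f"PVDR = {pvdr:.0f}")                     # 102 / 2 = 51
```

Because the valley dose sits in the denominator, a spectrum that halves the valley leakage roughly doubles the PVDR, which is why the lower-energy BMIT@CLS spectrum gives the larger ratio.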
Ackerman, Thomas P.; Lin, Ruei-Fong
1993-01-01
The radiation field over a broken stratocumulus cloud deck is simulated by the Monte Carlo method. We conducted four experiments to investigate the main factor behind the observed shortwave reflectivity over FIRE flight 2 leg 5, in which reflectivity decreases almost linearly from the cloud center to the cloud edge while the cloud top height and the brightness temperature remain almost constant throughout the clouds. Our results show that the geometry effect did not contribute significantly to what was observed. We found that the variation of the volume extinction coefficient as a function of relative position in the cloud affects the reflectivity most effectively. An additional check of the brightness temperature in each experiment also confirms this conclusion. The cloud microphysical data showed some interesting features. We found that the cloud droplet spectrum is nearly log-normally distributed when the clouds were solid; however, whether the cloud droplet spectrum shifts toward the larger end is not certain. The decrease of number density from cloud center to cloud edge seems to have the more significant effect on the optical properties.
Interstellar and Planetary Analogs in the Laboratory
Salama, Farid
2013-01-01
We present and discuss the unique capabilities of the laboratory facility, COSmIC, that was developed at NASA Ames to investigate the interaction of ionizing radiation (UV, charged particles) with molecular species (neutral molecules, radicals and ions) and carbonaceous grains in the Solar System and in the Interstellar Medium (ISM). COSmIC stands for Cosmic Simulation Chamber, a laboratory chamber where interstellar and planetary analogs are generated, processed and analyzed. It is composed of a pulsed discharge nozzle (PDN) that generates a free-jet supersonic expansion in a plasma cavity, coupled to two ultrahigh-sensitivity, complementary in situ diagnostics: a cavity ring down spectroscopy (CRDS) system for photonic detection and a Reflectron time-of-flight mass spectrometer (ReTOF-MS) for mass detection. This setup allows the study of molecules, ions and solids under the low temperature and high vacuum conditions that are required to simulate some interstellar, circumstellar and planetary physical environments, providing new fundamental insights on the molecular level into the processes that are critical to the chemistry in the ISM, circumstellar and planet forming regions, and on icy objects in the Solar System. Recent laboratory results that were obtained using COSmIC will be discussed, in particular the progress that has been achieved in monitoring in the laboratory the formation of solid particles from their gas-phase molecular precursors in environments as varied as circumstellar outflows and planetary atmospheres.
Energy Technology Data Exchange (ETDEWEB)
Cupini, E. [ENEA, Centro Ricerche 'Ezio Clementel', Bologna (Italy). Dipt. Innovazione; Borgia, M.G. [ENEA, Centro Ricerche 'Ezio Clementel', Bologna (Italy). Dipt. Energia; Premuda, M. [Consiglio Nazionale delle Ricerche, Bologna (Italy). Ist. FISBAT
1997-03-01
The Monte Carlo code PREMAR is described, which allows the user to simulate radiation transport in the atmosphere in the ultraviolet-infrared frequency interval. A plane multilayer geometry is at present foreseen by the code, with an albedo option at the lower boundary surface. For a given monochromatic point source, the main quantities computed by the code are the absorption spatial distributions of aerosols and molecules, together with the related atmospheric transmittances. Moreover, simulations of Lidar experiments are foreseen by the code, the source and telescope fields of view being assigned. To build up the appropriate probability distributions, an input data library is assumed to be read by the code. For this purpose the radiance-transmittance code LOWTRAN-7 has been conveniently adapted as a source of the library, so as to exploit its rich information content for a large variety of atmospheric simulations. Results of applications of the PREMAR code are finally presented, with special reference to simulations of Lidar-system and radiometer experiments carried out at the Brasimone ENEA Centre by the Environment Department.
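The kind of plane-layer photon transport with a bottom-surface albedo that PREMAR performs can be caricatured by a two-stream Monte Carlo walk in optical depth. This is a sketch under strong simplifying assumptions (isotropic up/down scattering, one homogeneous layer), not the PREMAR algorithm:

```python
import math
import random

def mc_reflectance(tau, omega0, albedo, n=20000, seed=1):
    """Two-stream Monte Carlo through a plane layer of optical depth tau.
    omega0: single-scattering albedo of the medium; albedo: reflectance of
    the lower boundary. Returns the fraction of photons escaping the top."""
    rng = random.Random(seed)
    escaped = 0
    for _ in range(n):
        depth, down = 0.0, True
        while True:
            step = -math.log(rng.random())      # exponential free path
            depth += step if down else -step
            if depth < 0.0:                     # escaped back out of the top
                escaped += 1
                break
            if depth > tau:                     # reached the lower boundary
                if rng.random() < albedo:       # surface reflection
                    depth, down = tau, False
                    continue
                break                           # absorbed by the surface
            if rng.random() < omega0:           # scattered: up or down
                down = rng.random() < 0.5
            else:
                break                           # absorbed in the layer
    return escaped / n

print(mc_reflectance(tau=1.0, omega0=0.9, albedo=0.3))
```

Raising the surface albedo increases the top-of-atmosphere reflectance, which is the qualitative behaviour the albedo option in a code like PREMAR captures.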
International Nuclear Information System (INIS)
Davidson, J.H.
1986-01-01
The basic facts about radiation are explained, along with some simple and natural ways of combating its ill-effects, based on ancient healing wisdom as well as the latest biochemical and technological research. Details are also given of the diet that saved thousands of lives in Nagasaki after the Atomic bomb attack. Special comment is made on the use of radiation for food processing. (U.K.)
Variational Monte Carlo Technique
Indian Academy of Sciences (India)
Variational Monte Carlo Technique: Ground State Energies of Quantum Mechanical Systems. Sukanta Deb. General Article Volume 19 Issue 8 August 2014 pp 713-739 ...
Planetary Sciences and Exploration Programme
Indian Academy of Sciences (India)
The Indian Space Research Organisation (ISRO) has taken a number of initiatives to plan for a National Research Programme in the area of planetary science and exploration. This announcement solicits proposals in the field of planetary science. Universities, research and educational institutions may submit proposals ...
International Nuclear Information System (INIS)
Joosten, A; Bochud, F; Moeckli, R
2014-01-01
The comparison of radiotherapy techniques regarding secondary cancer risk has yielded contradictory results, possibly stemming from the many different approaches used to estimate risk. The purpose of this study was to make a comprehensive evaluation of different available risk models applied to detailed whole-body dose distributions computed by Monte Carlo for various breast radiotherapy techniques including conventional open tangents, 3D conformal wedged tangents and hybrid intensity modulated radiation therapy (IMRT). First, organ-specific linear risk models developed by the International Commission on Radiological Protection (ICRP) and the Biological Effects of Ionizing Radiation (BEIR) VII committee were applied to mean doses for remote organs only and all solid organs. Then, different general non-linear risk models were applied to the whole-body dose distribution. Finally, organ-specific non-linear risk models for the lung and breast were used to assess the secondary cancer risk for these two specific organs. A total of 32 different calculated absolute risks resulted in a broad range of values (between 0.1% and 48.5%), underscoring the large uncertainties in absolute risk calculation. The ratio of risk between two techniques has often been proposed as a more robust assessment of risk than the absolute risk. We found that the ratio of risk between two techniques could also vary substantially considering the different approaches to risk estimation. Sometimes the ratio of risk between two techniques would range between values smaller and larger than one, which then translates into inconsistent results on the potential higher risk of one technique compared to another. We found, however, that the hybrid IMRT technique resulted in a systematic reduction of risk compared to the other techniques investigated, even though the magnitude of this reduction varied substantially with the different approaches investigated. Based on the epidemiological data available, a reasonable
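The first approach described, applying organ-specific linear risk models to Monte Carlo mean organ doses, reduces to a dose-weighted sum over organs. The sketch below uses placeholder organ doses and coefficients purely for illustration; they are not ICRP or BEIR VII values:

```python
# Hypothetical organ mean doses (Sv) from a whole-body Monte Carlo calculation
# and *placeholder* linear risk coefficients (excess cases per 10,000 persons
# per Sv). Neither set of numbers is taken from ICRP or BEIR VII.
organ_dose = {"lung": 0.8, "contralateral breast": 1.2, "thyroid": 0.05}
risk_coeff = {"lung": 30.0, "contralateral breast": 50.0, "thyroid": 10.0}

def linear_risk(doses, coeffs):
    """Organ-additive linear (no-threshold) secondary-cancer risk estimate:
    total risk is the sum over organs of coefficient x mean organ dose."""
    return sum(coeffs[organ] * d for organ, d in doses.items())

risk = linear_risk(organ_dose, risk_coeff)   # excess cases per 10,000 persons
print(f"estimated excess cases per 10,000: {risk:.1f}")
```

Comparing two techniques then means running the same sum on two dose distributions; as the abstract notes, the resulting risk ratio still depends on which model supplies the coefficients.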
Technology under Planetary Protection Research (PPR)
National Aeronautics and Space Administration — Planetary protection involves preventing biological contamination on both outbound and sample return missions to other planetary bodies. Numerous areas of research...
Planetary heat flow measurements.
Hagermann, Axel
2005-12-15
The year 2005 marks the 35th anniversary of the Apollo 13 mission, probably the most successful failure in the history of manned spaceflight. Naturally, Apollo 13's scientific payload is far less known than the spectacular accident and subsequent rescue of its crew. Among other instruments, it carried the first instrument designed to measure the flux of heat on a planetary body other than Earth. The year 2005 also should have marked the launch of the Japanese LUNAR-A mission, and ESA's Rosetta mission is slowly approaching comet Churyumov-Gerasimenko. Both missions carry penetrators to study the heat flow from their target bodies. What is so interesting about planetary heat flow? What can we learn from it and how do we measure it? Not only the Sun, but all planets in the Solar System are essentially heat engines. Various heat sources or heat reservoirs drive intrinsic and surface processes, causing 'dead balls of rock, ice or gas' to evolve dynamically over time, driving convection that powers tectonic processes and spawns magnetic fields. The heat flow constrains models of the thermal evolution of a planet and also its composition because it provides an upper limit for the bulk abundance of radioactive elements. On Earth, the global variation of heat flow also reflects the tectonic activity: heat flow increases towards the young ocean ridges, whereas it is rather low on the old continental shields. It is not surprising that surface heat flow measurements, or even estimates, where performed, contributed greatly to our understanding of what happens inside the planets. In this article, I will review the results and the methods used in past heat flow measurements and speculate on the targets and design of future experiments.
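At its core, a heat flow measurement of the kind reviewed here reduces to Fourier's law, q = k dT/dz, applied to a measured temperature-depth profile together with a measured or assumed thermal conductivity. The numbers below are illustrative only (loosely regolith-like), not data from any mission:

```python
# Planetary surface heat flow from a temperature-depth profile via Fourier's
# law, q = k * dT/dz. All values are assumed for illustration.
k = 0.01                      # thermal conductivity, W/(m K)  (assumed)
depths = [1.0, 2.0]           # sensor depths below the surface, m
temps = [252.0, 254.1]        # temperatures measured at those depths, K

gradient = (temps[1] - temps[0]) / (depths[1] - depths[0])   # K/m
q = k * gradient                                             # W/m^2
print(f"heat flow: {q * 1000:.0f} mW/m^2")                   # 21 mW/m^2
```

The practical difficulty, which penetrator missions such as LUNAR-A faced, is measuring both the gradient and the conductivity in situ, below the depth disturbed by the diurnal and annual thermal waves.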
Solar planetary systems stardust to terrestrial and extraterrestrial planetary sciences
Bhattacharya, Asit B
2017-01-01
The authors have put forth great efforts in gathering present-day knowledge about different objects within our solar system and universe. This book features the most current information on the subject, with information acquired from noted scientists in this area. The main objective is to convey the importance of the subject and provide detailed information on the physical makeup of our planetary system and the technologies used for research. Information on educational projects has also been included in the Radio Astronomy chapters. This information is a real plus for students and educators considering a career in Planetary Science or for increasing their knowledge about our planetary system.
Monte Carlo codes and Monte Carlo simulator program
International Nuclear Information System (INIS)
Higuchi, Kenji; Asai, Kiyoshi; Suganuma, Masayuki.
1990-03-01
Four typical Monte Carlo codes, KENO-IV, MORSE, MCNP and VIM, have been vectorized on the VP-100 at the Computing Center, JAERI. The problems in vector processing of Monte Carlo codes on vector processors have become clear through this work. As a result, it is recognized that there are difficulties in obtaining good performance in vector processing of Monte Carlo codes. A Monte Carlo computing machine, which processes Monte Carlo codes with high performance, has been under development at our Computing Center since 1987. The concept of the Monte Carlo computing machine and its performance have been investigated and estimated by using a software simulator. In this report the problems in vectorization of Monte Carlo codes, the Monte Carlo pipelines proposed to mitigate these difficulties, and the results of the performance estimation of the Monte Carlo computing machine by the simulator are described. (author)
Monte Carlo simulation of experiments
International Nuclear Information System (INIS)
Opat, G.I.
1977-07-01
An outline of the technique of computer simulation of particle physics experiments by the Monte Carlo method is presented. Useful special purpose subprograms are listed and described. At each stage the discussion is made concrete by direct reference to the program SIMUL8 and its variant MONTE-PION, written to assist in the analysis of the radiative decay experiments μ⁺ → e⁺ νₑ ν̄ γ and π⁺ → e⁺ νₑ γ, respectively. These experiments were based on the use of two large sodium iodide crystals, TINA and MINA, as e and γ detectors. Instructions for the use of SIMUL8 and MONTE-PION are given. (author)
2009-01-01
Carlo Rubbia turned 75 on March 31, and CERN held a symposium to mark his birthday and pay tribute to his impressive contribution to both CERN and science. Carlo Rubbia, 4th from right, together with the speakers at the symposium.On 7 April CERN hosted a celebration marking Carlo Rubbia’s 75th birthday and 25 years since he was awarded the Nobel Prize for Physics. "Today we will celebrate 100 years of Carlo Rubbia" joked CERN’s Director-General, Rolf Heuer in his opening speech, "75 years of his age and 25 years of the Nobel Prize." Rubbia received the Nobel Prize along with Simon van der Meer for contributions to the discovery of the W and Z bosons, carriers of the weak interaction. During the symposium, which was held in the Main Auditorium, several eminent speakers gave lectures on areas of science to which Carlo Rubbia made decisive contributions. Among those who spoke were Michel Spiro, Director of the French National Insti...
National Aeronautics and Space Administration — Radiation detectors that sense gamma and neutron radiation are critical to the exploration of planetary surface composition. Among the key technological challenges...
Planetary Protection Constraints For Planetary Exploration and Exobiology
Debus, A.; Bonneville, R.; Viso, M.
According to Article IX of the Outer Space Treaty (London/Washington, January 27, 1967) and in the frame of extraterrestrial missions, it is required to preserve planets and the Earth from contamination. For ethical, safety and scientific reasons, the space agencies have to comply with the Outer Space Treaty and to take into account the related COSPAR planetary protection recommendations. Planetary protection also takes into account the protection of exobiological science, because the results of life-detection experiments could have impacts on planetary protection regulations. The validation of their results depends strongly on how the samples have been collected, stored and analyzed, and particularly on their biological and organic cleanliness. Any risk of contamination by organic materials, chemical compounds and terrestrial microorganisms must be avoided. A large number of missions are presently scheduled, particularly to Mars, in order to search for life or traces of past life. In the frame of such missions, CNES is building a planetary protection organization in order to handle and take charge of all tasks linked to science and engineering concerned with planetary protection. Taking into account CNES's past experience in planetary protection related to the Mars 96 mission, its planned participation in exobiological missions with NASA, as well as its work and involvement in COSPAR activities, this paper will present the main requirements for avoiding the biological contamination of celestial bodies, focusing on Mars and including Earth, and for protecting exobiological science.
CdWO₄ scintillator as a compact gamma ray spectrometer for planetary lander missions
Eisen, Y; Starr, R; Trombka, J I
2002-01-01
The objective of this work is to develop a gamma ray spectrometer (GRS) suitable for use on planetary rover missions. The main characteristics of this detector are low weight, small volume, low power and resistance to cosmic ray radiation over a long period of time. We describe a 3 cm diameter by 3 cm thick CdWO₄ cylindrical scintillator coupled to a PMT as a GRS for the energy region 0.662-7.64 MeV. Its spectral performance and efficiency are compared to those of a CsI(Tl) scintillator 2.5 cm diameter by 6 cm thick coupled to a 28 mm × 28 mm PIN photodiode. The comparison is made experimentally using ¹³⁷Cs, ⁶⁰Co, 6.13 MeV gamma rays from a ¹³C(α,nγ)¹⁶O* source, 7.64 MeV thermal neutron capture gamma rays emitted from iron bars using a ²⁵²Cf neutron source, and natural radioactivity 1.46 MeV ⁴⁰K and 2.61 MeV ²³²Th gamma rays. We use a Monte Carlo method to calculate the total peak efficiency of these detectors and ...
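In the simplest picture, a Monte Carlo estimate of total peak efficiency factorizes into geometric acceptance, interaction probability in the crystal, and the fraction of interactions depositing the full energy. The toy sketch below uses entirely hypothetical 662 keV parameters, not the measured CdWO₄ values from this work:

```python
import math
import random

def peak_efficiency(solid_angle_frac, mu, thickness_cm, photo_frac,
                    n=100_000, seed=7):
    """Toy Monte Carlo estimate of the full-energy-peak efficiency of a
    scintillator: geometric acceptance x interaction probability x the
    (assumed) fraction of interactions depositing the full photon energy."""
    rng = random.Random(seed)
    peak = 0
    for _ in range(n):
        if rng.random() > solid_angle_frac:     # photon misses the crystal
            continue
        path = -math.log(rng.random()) / mu     # free path in the crystal (cm)
        if path < thickness_cm and rng.random() < photo_frac:
            peak += 1                           # full-energy event
    return peak / n

# Hypothetical numbers for a small crystal viewing a point source at 662 keV
eff = peak_efficiency(solid_angle_frac=0.05, mu=0.6,
                      thickness_cm=3.0, photo_frac=0.5)
print(f"total peak efficiency ~ {eff:.3%}")
```

A production calculation of the kind the authors describe would instead track Compton-scattered and pair-production secondaries explicitly rather than assuming a fixed full-energy fraction.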
Hrivnacova, I; Berejnov, V V; Brun, R; Carminati, F; Fassò, A; Futo, E; Gheata, A; Caballero, I G; Morsch, Andreas
2003-01-01
The concept of Virtual Monte Carlo (VMC) has been developed by the ALICE Software Project to allow different Monte Carlo simulation programs to run without changing the user code, such as the geometry definition, the detector response simulation or input and output formats. Recently, the VMC classes have been integrated into the ROOT framework, and the other relevant packages have been separated from the AliRoot framework and can be used individually by any other HEP project. The general concept of the VMC and its set of base classes provided in ROOT will be presented. Existing implementations for Geant3, Geant4 and FLUKA and simple examples of usage will be described.
Kinematics of galactic planetary nebulae
International Nuclear Information System (INIS)
Kiosa, M.I.; Khromov, G.S.
1979-01-01
The classical method of determining the components of the solar motion relative to the centroid of the system of planetary nebulae with known radial velocities is investigated. It is shown that this method is insensitive to random errors in the radial velocities and that the low accuracy in determining the coordinates of the solar apex and motion results from the insufficient number of planetary nebulae with measured radial velocities. The planetary nebulae are found not to satisfy well the law of differential galactic rotation with circular orbits. This is attributed to the elongation of their galactic orbits. A method for obtaining the statistical parallax of planetary nebulae is considered, and the parallax calculated from the τ components of their proper motion is shown to be the most reliable
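The classical solar-motion solution amounts to a linear least-squares fit of the radial velocities to the apex direction cosines, v_r = -(U cos b cos l + V cos b sin l + W sin b). A sketch with synthetic data (the (U, V, W) values, sample size, and noise level are all assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "nebulae": random galactic coordinates (l, b) and radial
# velocities generated from an assumed solar motion plus peculiar motions
U, V, W = 10.0, 15.0, 7.0                      # km/s, assumed for the demo
n = 500
l = rng.uniform(0.0, 2.0 * np.pi, n)
b = rng.uniform(-0.5, 0.5, n)                  # nebulae close to the plane

# Design matrix of (negative) apex direction cosines
A = -np.column_stack([np.cos(b) * np.cos(l),
                      np.cos(b) * np.sin(l),
                      np.sin(b)])
v_r = A @ [U, V, W] + rng.normal(0.0, 10.0, n)  # 10 km/s peculiar motions

# Classical least-squares solution for the solar motion components
sol, *_ = np.linalg.lstsq(A, v_r, rcond=None)
print("recovered (U, V, W):", np.round(sol, 1))
```

The fit recovers the input motion despite sizeable random velocity errors, illustrating the abstract's point that the method is limited by sample size rather than by noise in individual radial velocities.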
In Situ Planetary Geochronology Technology
National Aeronautics and Space Administration — This project's purpose was to determine whether a Pulsed Neutron Generator (PNG) could be used in an instrument that could perform in situ age dating of planetary...
NUEN-618 Class Project: Actually Implicit Monte Carlo
Energy Technology Data Exchange (ETDEWEB)
Vega, R. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brunner, T. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2017-12-14
This research describes a new method for the solution of the thermal radiative transfer (TRT) equations that is implicit in time which will be called Actually Implicit Monte Carlo (AIMC). This section aims to introduce the TRT equations, as well as the current workhorse method which is known as Implicit Monte Carlo (IMC). As the name of the method proposed here indicates, IMC is a misnomer in that it is only semi-implicit, which will be shown in this section as well.
Monte Carlo methods and applications in nuclear physics
International Nuclear Information System (INIS)
Carlson, J.
1990-01-01
Monte Carlo methods for studying few- and many-body quantum systems are introduced, with special emphasis given to their applications in nuclear physics. Variational and Green's function Monte Carlo methods are presented in some detail. The status of calculations of light nuclei is reviewed, including discussions of the three-nucleon-interaction, charge and magnetic form factors, the coulomb sum rule, and studies of low-energy radiative transitions. 58 refs., 12 figs
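Variational Monte Carlo, the first of the methods mentioned, can be illustrated on a textbook problem far simpler than the nuclear calculations reviewed here: the 1-D harmonic oscillator (ħ = m = ω = 1) with trial wavefunction ψ(x) = exp(-αx²), whose local energy is E_L(x) = α + x²(1/2 - 2α²), so α = 1/2 reproduces the exact ground-state energy 1/2:

```python
import numpy as np

rng = np.random.default_rng(5)

def vmc_energy(alpha, n_steps=40_000, step=1.0):
    """Variational Monte Carlo for the 1-D harmonic oscillator: sample
    |psi|^2 with Metropolis moves and average the local energy
    E_L(x) = alpha + x^2 (1/2 - 2 alpha^2)."""
    x, e_sum = 0.0, 0.0
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        # Metropolis acceptance ratio |psi(x')|^2 / |psi(x)|^2
        if rng.random() < np.exp(-2.0 * alpha * (x_new**2 - x**2)):
            x = x_new
        e_sum += alpha + x * x * (0.5 - 2.0 * alpha * alpha)
    return e_sum / n_steps

# At alpha = 0.5 the local energy is constant, so the estimate is exact
print(f"E(alpha=0.5) = {vmc_energy(0.5):.3f}")
```

The variational principle guarantees E(α) ≥ 1/2 for every α, so minimizing the Monte Carlo energy estimate over α locates the best trial state, which is the same strategy the many-body nuclear calculations use with far richer trial wavefunctions.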
Planetary Geologic Mapping Handbook - 2009
Tanaka, K. L.; Skinner, J. A.; Hare, T. M.
2009-01-01
Geologic maps present, in an historical context, fundamental syntheses of interpretations of the materials, landforms, structures, and processes that characterize planetary surfaces and shallow subsurfaces (e.g., Varnes, 1974). Such maps also provide a contextual framework for summarizing and evaluating thematic research for a given region or body. In planetary exploration, for example, geologic maps are used for specialized investigations such as targeting regions of interest for data collection and for characterizing sites for landed missions. Whereas most modern terrestrial geologic maps are constructed from regional views provided by remote sensing data and supplemented in detail by field-based observations and measurements, planetary maps have been largely based on analyses of orbital photography. For planetary bodies in particular, geologic maps commonly represent a snapshot of a surface, because they are based on available information at a time when new data are still being acquired. Thus the field of planetary geologic mapping has been evolving rapidly to embrace the use of new data and modern technology and to accommodate the growing needs of planetary exploration. Planetary geologic maps have been published by the U.S. Geological Survey (USGS) since 1962 (Hackman, 1962). Over this time, numerous maps of several planetary bodies have been prepared at a variety of scales and projections using the best available image and topographic bases. Early geologic map bases commonly consisted of hand-mosaicked photographs or airbrushed shaded-relief views and geologic linework was manually drafted using mylar bases and ink drafting pens. Map publishing required a tedious process of scribing, color peel-coat preparation, typesetting, and photo-laboratory work. Beginning in the 1990s, inexpensive computing, display capability and user-friendly illustration software allowed maps to be drawn using digital tools rather than pen and ink, and mylar bases became obsolete
Variational Monte Carlo Technique
Indian Academy of Sciences (India)
nonprobabilistic) problem [5]. ... In quantum mechanics, the MC methods are used to simulate many-particle systems using random ... D Ceperley, G V Chester and M H Kalos, Monte Carlo simulation of a many-fermion study, Physical Review Vol.
Indian Academy of Sciences (India)
Markov Chain Monte Carlo - Examples. Arnab Chakraborty. General Article Volume 7 Issue 3 March 2002 pp 25-34. Permanent link: https://www.ias.ac.in/article/fulltext/reso/007/03/0025-0034.
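The kind of worked example this article presents can be reproduced in a few lines with a random-walk Metropolis sampler; the target distribution and step size below are chosen purely for illustration:

```python
import math
import random

def metropolis(log_p, x0, step, n, seed=3):
    """Random-walk Metropolis sampler: propose x' = x + U(-step, step) and
    accept with probability min(1, p(x') / p(x)), working in log space."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n):
        x_new = x + rng.uniform(-step, step)
        if math.log(rng.random()) < log_p(x_new) - log_p(x):
            x = x_new
        samples.append(x)          # the chain keeps x on rejection
    return samples

# Target: standard normal density, known only up to a constant
samples = metropolis(lambda x: -0.5 * x * x, x0=0.0, step=1.0, n=50_000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(f"mean ~ {mean:.2f}, variance ~ {var:.2f}")
```

The chain's empirical moments converge to those of the target (0 and 1 here), which is the defining property that makes MCMC usable when the normalizing constant of p is unknown.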
Leonardo Rossi
Carlo Caso (1940 - 2007) Our friend and colleague Carlo Caso passed away on July 7th, after several months of courageous fight against cancer. Carlo spent most of his scientific career at CERN, taking an active part in the experimental programme of the laboratory. His long and fruitful involvement in particle physics started in the sixties, in the Genoa group led by G. Tomasini. He then carried out several experiments using the CERN liquid hydrogen bubble chambers (first the 2000HBC and later BEBC) to study various facets of the production and decay of meson and baryon resonances. He later formed his own group and joined the NA27 Collaboration to exploit the EHS Spectrometer with a rapid cycling bubble chamber as vertex detector. Amongst their many achievements, they were the first to measure, with excellent precision, the lifetime of the charmed D mesons. At the start of the LEP era, Carlo and his group moved to the DELPHI experiment, participating in the construction and running of the HPC electromagnetic c...
NASA Planetary Visualization Tool
Hogan, P.; Kim, R.
2004-12-01
NASA World Wind allows one to zoom from satellite altitude into any place on Earth, leveraging the combination of high-resolution LandSat imagery and SRTM elevation data to experience Earth in visually rich 3D, just as if one were really there. NASA World Wind combines LandSat 7 imagery with Shuttle Radar Topography Mission (SRTM) elevation data for a dramatic view of the Earth at eye level. Users can literally fly across the world's terrain from any location in any direction. Particular attention was paid to ease of use so that people of all ages can enjoy World Wind. All one needs to control World Wind is a two-button mouse. Additional guides and features can be accessed through a simplified menu. Navigation is automated with single clicks of a mouse, as well as the ability to type in any location and automatically zoom to it. NASA World Wind was designed to run on recent PC hardware with the same technology used by today's 3D video games. NASA World Wind delivers the NASA Blue Marble, spectacular true-color imagery of the entire Earth at 1 kilometer per pixel. Using NASA World Wind, you can continue to zoom past Blue Marble resolution to seamlessly experience the extremely detailed mosaic of LandSat 7 data at an impressive 15 meters per pixel. NASA World Wind also delivers other color bands such as the infrared spectrum. The NASA Scientific Visualization Studio at Goddard Space Flight Center (GSFC) has produced a set of visually intense animations that demonstrate a variety of subjects such as hurricane dynamics and seasonal changes across the globe. NASA World Wind takes these animations and plays them directly on the globe. The NASA Moderate Resolution Imaging Spectroradiometer (MODIS) produces a set of time-relevant planetary imagery that is updated every day. MODIS catalogs fires, floods, dust, smoke, storms and volcanic activity. NASA World Wind produces an easily customized view of this information and marks it directly on the globe. When one
Manaud, Nicolas; Rossi, Angelo Pio; Hare, Trent; Aye, Michael; Galluzzi, Valentina; van Gasselt, Stephan; Martinez, Santa; McAuliffe, Jonathan; Million, Chase; Nass, Andrea; Zinzi, Angelo
2016-10-01
"Open" has become attached to several concepts: science, data, and software are some of the most obvious. It is already common practice within the planetary science community to share spacecraft mission data freely and openly [1]. However, this is not historically the case for software tools, source code, and derived data sets, which are often reproduced independently by multiple individuals and groups. Sharing data, tools, and overall knowledge would increase scientific return and benefits [e.g. 2], and recent projects and initiatives are helping toward this goal [e.g. 3,4,5,6]. OpenPlanetary is a bottom-up initiative to address the need of the planetary science community for sharing ideas and collaborating on common planetary research and data analysis problems, new challenges, and opportunities. It started from an initial participants' effort to stay connected and share information related to and beyond ESA's first Planetary GIS Workshop [7]. It then continued during the 2nd (US) Planetary Data Workshop [8] and attracted more people. Our objective is to build an online distributed framework enabling open collaborations within the planetary science community. We aim to co-create, curate, and publish resource materials and data sets; to organise online events; to support the development of community-based projects; and to offer a real-time communication channel at and between conferences and workshops. We will present our current framework and resources and our developing projects and ideas, and solicit feedback and participation. OpenPlanetary is intended for research and education professionals: scientists, engineers, designers, teachers and students, as well as the general public, including enthusiasts and citizen scientists. All are welcome to join and contribute at openplanetary.co. [1] International Planetary Data Alliance, planetarydata.org. [2] Nosek et al (2015), dx.doi.org/10.1126/science.aab2374. [3] Erard S. et al. (2016), EGU2016-17527. [4] Proposal for a PDS
Planetary Transmission Diagnostics
Lewicki, David G. (Technical Monitor); Samuel, Paul D.; Conroy, Joseph K.; Pines, Darryll J.
2004-01-01
This report presents a methodology for detecting and diagnosing gear faults in the planetary stage of a helicopter transmission. This diagnostic technique is based on the constrained adaptive lifting algorithm. The lifting scheme, developed by Wim Sweldens of Bell Labs, is a time domain, prediction-error realization of the wavelet transform that allows for greater flexibility in the construction of wavelet bases. Classic lifting analyzes a given signal using wavelets derived from a single fundamental basis function. A number of researchers have proposed techniques for adding adaptivity to the lifting scheme, allowing the transform to choose from a set of fundamental bases the basis that best fits the signal. This characteristic is desirable for gear diagnostics as it allows the technique to tailor itself to a specific transmission by selecting a set of wavelets that best represent vibration signals obtained while the gearbox is operating under healthy-state conditions. However, constraints on certain basis characteristics are necessary to enhance the detection of local wave-form changes caused by certain types of gear damage. The proposed methodology analyzes individual tooth-mesh waveforms from a healthy-state gearbox vibration signal that was generated using the vibration separation (synchronous signal-averaging) algorithm. Each waveform is separated into analysis domains using zeros of its slope and curvature. The bases selected in each analysis domain are chosen to minimize the prediction error, and constrained to have the same-sign local slope and curvature as the original signal. The resulting set of bases is used to analyze future-state vibration signals and the lifting prediction error is inspected. The constraints allow the transform to effectively adapt to global amplitude changes, yielding small prediction errors. However, local wave-form changes associated with certain types of gear damage are poorly adapted, causing a significant change in the
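The prediction-error idea behind the lifting scheme can be illustrated with the simplest, non-adaptive case, the Haar predictor: predict each odd sample from its even neighbor, keep the prediction error as the detail signal, then update the even samples so that averages are preserved. This is a sketch only; the report's constrained adaptive algorithm chooses among multiple bases per analysis domain, which is not shown here.

```python
def haar_lifting_forward(signal):
    """One level of the classic Haar lifting scheme:
    split -> predict (odd from even; the error is the detail signal)
    -> update (evens corrected so local averages are preserved)."""
    evens = signal[0::2]
    odds = signal[1::2]
    details = [o - e for o, e in zip(odds, evens)]        # prediction error
    approx = [e + d / 2 for e, d in zip(evens, details)]  # running averages
    return approx, details

def haar_lifting_inverse(approx, details):
    """Lifting steps are trivially invertible: undo update, undo predict,
    then interleave the two streams back together."""
    evens = [a - d / 2 for a, d in zip(approx, details)]
    odds = [e + d for e, d in zip(evens, details)]
    out = []
    for e, o in zip(evens, odds):
        out += [e, o]
    return out

x = [2.0, 4.0, 6.0, 8.0, 9.0, 7.0]
a, d = haar_lifting_forward(x)
assert haar_lifting_inverse(a, d) == x  # perfect reconstruction
```

A gear fault shows up as a locally large prediction error: a predictor fitted to healthy-state waveforms adapts poorly to a changed tooth-mesh shape, which is exactly the detection signal the methodology inspects.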
Planetary Geophysics and Tectonics
Zuber, Maria
2005-01-01
The broad objective of this work is to improve understanding of the internal structures and the thermal and stress histories of the solid planets by combining results from analytical and computational modeling with geophysical data analysis of gravity, topography, and tectonic surface structures. During the past year we performed two quite independent studies in the attempt to explain the Mariner 10 magnetic observations of Mercury. In the first we revisited the possibility of crustal remanence by studying the conditions under which one could break the symmetry inherent in Runcorn's model of a uniformly magnetized shell to produce a remanent signal with a dipolar form. In the second we applied a thin-shell dynamo model to evaluate the range of intensity/structure for which such a planetary configuration can produce a dipole field consistent with Mariner 10 results. In the next full proposal cycle we will: (1) develop numerical and analytical models of thin-shell dynamos to address the possible nature of Mercury's present-day magnetic field and the demise of Mars's magnetic field; (2) study the effect of degree-1 mantle convection on a core dynamo as relevant to the early magnetic field of Mars; (3) develop models of how the deep mantles of terrestrial planets are perturbed by large impacts and address the consequences for mantle evolution; (4) study the structure, compensation, state of stress, and viscous relaxation of lunar basins, and address implications for the Moon's state of stress and thermal history by modeling and gravity/topography analysis; and (5) use a three-dimensional viscous relaxation model for a planet with a generalized vertical viscosity distribution to study the degree-two components of the Moon's topography and gravity fields to constrain the primordial stress state and spatial heterogeneity of the crust and mantle.
Dust Dynamics Near Planetary Surfaces
Colwell, Joshua; Hughes, Anna; Grund, Chris
Observations of a lunar "horizon glow" by several Surveyor spacecraft in the 1960s opened the study of the dynamics of charged dust particles near planetary surfaces. The surfaces of the Moon and other airless planetary bodies in the solar system (asteroids and other moons) are directly exposed to the solar wind and ionizing solar ultraviolet radiation, resulting in a time-dependent electric surface potential. Because these same objects are also exposed to bombardment by micrometeoroids, the surfaces are usually characterized by a power-law size distribution of dust that extends to sub-micron-sized particles. Individual particles can acquire a charge different from their surroundings, leading to electrostatic levitation. Once levitated, particles may simply return to the surface on nearly ballistic trajectories, escape entirely from the moon or asteroid if the initial velocity is large, or in some cases be stably levitated for extended periods of time. All three outcomes have observable consequences. Furthermore, the behavior of charged dust near the surface has practical implications for planned future manned and unmanned activities on the lunar surface. Charged dust particles also act as sensitive probes of the near-surface plasma environment. Recent numerical modeling of dust levitation and transport shows that charged micron-sized dust is likely to accumulate in topographic lows such as craters, providing a mechanism for the creation of dust "ponds" observed on the asteroid 433 Eros. Such deposition can occur when particles are supported by the photoelectron sheath above the dayside and drift over shadowed regions of craters where the surface potential is much smaller. Earlier studies of the lunar horizon glow are consistent with those particles being on simple ballistic trajectories following electrostatic launching from the surface. Smaller particles may be accelerated from the lunar surface to high altitudes consistent with observations of high altitude
Small reactor power systems for manned planetary surface bases
Bloomfield, Harvey S.
1987-12-01
A preliminary feasibility study of the potential application of small nuclear reactor space power systems to manned planetary surface base missions was conducted. The purpose of the study was to identify and assess the technology, performance, and safety issues associated with integration of reactor power systems into an evolutionary manned planetary surface exploration scenario. The requirements and characteristics of a variety of human-rated modular reactor power system configurations, selected for a range of power levels from 25 kWe to hundreds of kilowatts, are described. Trade-off analyses for reactor power systems utilizing both man-made and indigenous shielding materials are provided to examine performance, installation, and operational safety feasibility issues. The results of this study have confirmed the preliminary feasibility of a wide variety of small reactor power plant configurations for growth-oriented manned planetary surface exploration missions. The capability for power-level growth with increasing manned presence, while maintaining safe radiation levels, was favorably assessed for nominal 25 to 100 kWe modular configurations. No feasibility limitations or technical barriers were identified, and both distance and indigenous planetary soil were shown to be viable and attractive options for human-rated radiation shielding.
Planetary Torque in 3D Isentropic Disks
Energy Technology Data Exchange (ETDEWEB)
Fung, Jeffrey [Department of Astronomy, University of California at Berkeley, Campbell Hall, Berkeley, CA 94720-3411 (United States); Masset, Frédéric; Velasco, David [Instituto de Ciencias Físicas, Universidad Nacional Autónoma de México, Av. Universidad s/n, 62210 Cuernavaca, Mor. (Mexico); Lega, Elena, E-mail: jeffrey.fung@berkeley.edu [Université de la Côte d’Azur, Observatoire de la Côte d’Azur, CNRS, Laboratoire Lagrange UMR 7293, Nice (France)
2017-03-01
Planetary migration is inherently a three-dimensional (3D) problem, because Earth-size planetary cores are deeply embedded in protoplanetary disks. Simulations of these 3D disks remain challenging due to the steep resolution requirements. Using two different hydrodynamics codes, FARGO3D and PEnGUIn, we simulate disk–planet interaction for a one to five Earth-mass planet embedded in an isentropic disk. We measure the torque on the planet and ensure that the measurements are converged both in resolution and between the two codes. We find that the torque is independent of the smoothing length of the planet's potential (r_s), and that it has a weak dependence on the adiabatic index of the gaseous disk (γ). The torque values correspond to an inward migration rate qualitatively similar to previous linear calculations. We perform additional simulations with explicit radiative transfer using FARGOCA, and again find agreement between 3D simulations and existing torque formulae. We also present the flow pattern around the planets, showing that active flow is present within the planet's Hill sphere and that meridional vortices are shed downstream. The vertical flow speed near the planet is faster for a smaller r_s or γ, up to supersonic speeds for the smallest r_s and γ in our study.
H3+ cooling in planetary atmospheres.
Miller, Steve; Stallard, Tom; Melin, Henrik; Tennyson, Jonathan
2010-01-01
We review the role of H3+ in planetary atmospheres, with a particular emphasis on its effect in cooling and stabilising them, an effect that has been termed the "H3+ thermostat" (see Miller et al., Philos. Trans. R. Soc. London, Ser. A, 2000, 58, 2485). In the course of our analysis of this effect, we found that cooling functions that make use of the partition function Q(T), based on the calculated H3+ energy levels of Neale and Tennyson (Astrophys. J., 1995, 454, L169), may underestimate just how much energy this ion is radiating to space. We therefore present a new fit to the calculated values of Q(T) that is accurate to within 2% over the range 100 K to 10 000 K, a very significant improvement on the fit originally provided by Neale and Tennyson themselves. We also present a fit to Q(T) calculated from only those values that Neale and Tennyson computed from first principles, which may be more appropriate for planetary scientists wishing to calculate the amount of atmospheric cooling from the H3+ ion.
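The partition function behind such cooling calculations has the standard Boltzmann-sum form Q(T) = Σᵢ gᵢ exp(−Eᵢ/kT) over the calculated energy levels. A minimal sketch, using made-up (degeneracy, energy) pairs rather than the Neale and Tennyson H3+ values:

```python
import math

K_B_CM = 0.695035  # Boltzmann constant in cm^-1 per kelvin

def partition_function(levels, temperature):
    """Q(T) = sum_i g_i * exp(-E_i / (k_B T)) over (degeneracy, energy)
    pairs with energies in cm^-1.  Level populations, and hence emitted
    power, scale as g_i exp(-E_i/kT) / Q(T), so underestimating Q(T)
    overestimates how much the ion radiates per molecule."""
    return sum(g * math.exp(-e / (K_B_CM * temperature)) for g, e in levels)

# Illustrative, hypothetical levels (NOT the Neale & Tennyson values):
levels = [(4, 0.0), (2, 64.1), (6, 169.3), (4, 237.4)]
print(partition_function(levels, 300.0))
```

Note that Q(T) grows monotonically with temperature as higher levels become thermally accessible, which is why a single low-order fit struggles to stay accurate from 100 K to 10 000 K.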
Planetary Torque in 3D Isentropic Disks
International Nuclear Information System (INIS)
Fung, Jeffrey; Masset, Frédéric; Velasco, David; Lega, Elena
2017-01-01
Planetary migration is inherently a three-dimensional (3D) problem, because Earth-size planetary cores are deeply embedded in protoplanetary disks. Simulations of these 3D disks remain challenging due to the steep resolution requirements. Using two different hydrodynamics codes, FARGO3D and PEnGUIn, we simulate disk–planet interaction for a one to five Earth-mass planet embedded in an isentropic disk. We measure the torque on the planet and ensure that the measurements are converged both in resolution and between the two codes. We find that the torque is independent of the smoothing length of the planet's potential (r_s), and that it has a weak dependence on the adiabatic index of the gaseous disk (γ). The torque values correspond to an inward migration rate qualitatively similar to previous linear calculations. We perform additional simulations with explicit radiative transfer using FARGOCA, and again find agreement between 3D simulations and existing torque formulae. We also present the flow pattern around the planets, showing that active flow is present within the planet's Hill sphere and that meridional vortices are shed downstream. The vertical flow speed near the planet is faster for a smaller r_s or γ, up to supersonic speeds for the smallest r_s and γ in our study.
Optimizing bandpasses to separate planetary bodies
Teal, Dillon J.; Yarber, Aara'L.; Kopparapu, Ravi; Arney, Giada; Roberge, Aki
2018-01-01
Future telescopes will be able to directly image exoplanets, opening up a new era in comparative planetology. However, background point sources, including stars, brown dwarfs, and distant unresolved galaxies, may be confused with planetary sources. Observing time is precious, and methods are needed to efficiently and effectively distinguish between exoplanets and background objects. We present an optimized strategy using multi-color point-source photometry to distinguish planets from the background objects with the greatest potential for spectral confusion. By determining which photometric bandpasses most effectively separate planets from background sources, this strategy would enable optimization of future telescope designs. Such an approach is key to quickly characterizing a planet while minimizing observation time. To find this optimized strategy, we used comparative spectroscopy alongside linear regression and Markov chain Monte Carlo (MCMC) retrieval to identify the optimal bandpasses for the LUVOIR (Large UV-Optical-Infrared Telescope) mission study coronagraph. We consider bandpasses that maximize the color-color separation of background objects from planets; we also investigate the effectiveness of color photometry to distinguish between different classes of planets using Solar System and observed spectra. This optimization strategy would also be useful to other direct imaging missions, such as HabEx, and could in principle be applied to transit spectroscopy missions, such as JWST, ARIEL, and OST.
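The color-color separation criterion can be sketched as follows: form colors (magnitude differences) from pairs of band fluxes and measure the distance between two objects in that color-color plane; the bandpass optimization then amounts to choosing the band pairs that maximize this distance between planets and confusable background sources. The band fluxes and pairings below are hypothetical illustrations, not the LUVOIR bandpasses:

```python
import math

def color_index(flux_a, flux_b):
    """Astronomical color: the magnitude difference -2.5 log10(Fa/Fb)."""
    return -2.5 * math.log10(flux_a / flux_b)

def color_color_separation(obj1, obj2, bands=((0, 1), (1, 2))):
    """Euclidean distance between two objects in a color-color diagram
    built from the given pairs of bandpass-flux indices.  Larger values
    mean the two objects are easier to tell apart photometrically."""
    c1 = [color_index(obj1[a], obj1[b]) for a, b in bands]
    c2 = [color_index(obj2[a], obj2[b]) for a, b in bands]
    return math.dist(c1, c2)

# Hypothetical fluxes (arbitrary units) in three bands for a planet and
# a red background object such as a brown dwarf:
planet = [1.0, 0.8, 0.3]
background = [0.2, 0.5, 0.9]
print(color_color_separation(planet, background))
```

Sweeping this metric over candidate band pairs for libraries of planet and background spectra is one simple way to rank bandpass choices before committing observing time.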
Planetary Image Geometry Library
Deen, Robert C.; Pariser, Oleg
2010-01-01
The Planetary Image Geometry (PIG) library is a multi-mission library used for projecting images (EDRs, or Experiment Data Records) and managing their geometry for in-situ missions. A collection of models describes cameras and their articulation, allowing application programs such as mosaickers, terrain generators, and pointing correction tools to be written in a multi-mission manner, without any knowledge of parameters specific to the supported missions. Camera model objects allow transformation of image coordinates to and from view vectors in XYZ space. Pointing models, specific to each mission, describe how to orient the camera models based on telemetry or other information. Surface models describe the surface in general terms. Coordinate system objects manage the various coordinate systems involved in most missions. File objects manage access to metadata (labels, including telemetry information) in the input EDRs and RDRs (Reduced Data Records). Label models manage metadata information in output files. Site objects keep track of different locations where the spacecraft might be at a given time. Radiometry models allow correction of radiometry for an image. Mission objects contain basic mission parameters. Pointing adjustment ("nav") files allow pointing to be corrected. The object-oriented structure (C++) makes it easy to subclass just the pieces of the library that are truly mission-specific. Typically, this involves just the pointing model and coordinate systems, and parts of the file model. Once the library was developed (initially for Mars Polar Lander, MPL), adding new missions took from two days to a few months, resulting in significant cost savings compared to rewriting all the application programs for each mission. Currently supported missions include Mars Pathfinder (MPF), MPL, Mars Exploration Rover (MER), Phoenix, and Mars Science Lab (MSL). Applications based on this library create the majority of operational image RDRs for those missions. A
The fragility of planetary systems
Portegies Zwart, S. F.; Jílková, Lucie
2015-07-01
We specify the range to which perturbations penetrate a planetesimal system. Such perturbations can originate from massive planets or from encounters with other stars. The latter can have an origin in the star cluster in which the planetary system was born, or from random encounters once the planetary system has escaped its parental cluster. The probability of a random encounter, either in a star cluster or in the Galactic field, depends on the local stellar density, the velocity dispersion, and the time spent in that environment. By adopting order-of-magnitude estimates, we argue that the majority of planetary systems born in open clusters will have a Parking zone, in which planetesimals are affected by encounters in their parental star cluster but remain unperturbed after the star has left the cluster. Objects found in this range of semimajor axis and eccentricity preserve the memory of the encounter that last affected their orbits, and they can therefore be used to reconstruct this encounter. Planetary systems born in a denser environment, such as a globular cluster, are unlikely to have a Parking zone. We further argue that some planetary systems may have a Frozen zone, in which orbits are affected neither by the inner massive planets nor by external influences. Objects discovered in this zone will have preserved information about their formation in their orbital parameters.
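The order-of-magnitude encounter estimate described above amounts to N ≈ n σ v t for stellar number density n, encounter cross section σ, velocity dispersion v, and residence time t. A sketch under the simplest possible assumptions, with a purely geometric cross section and gravitational focusing deliberately neglected:

```python
import math

AU_IN_PC = 1.0 / 206265.0       # 1 AU expressed in parsecs
KMS_IN_PC_PER_MYR = 1.0 / 977.8 # 1 km/s expressed in pc/Myr

def expected_encounters(n_density_pc3, cross_radius_au, v_disp_kms, time_myr):
    """Order-of-magnitude encounter count N ~ n * sigma * v * t, with
    sigma = pi r^2 the geometric cross section.  Gravitational focusing,
    which boosts the rate at low velocity dispersions, is neglected.
    Units: stars/pc^3, AU, km/s, Myr."""
    sigma_pc2 = math.pi * (cross_radius_au * AU_IN_PC) ** 2
    return n_density_pc3 * sigma_pc2 * (v_disp_kms * KMS_IN_PC_PER_MYR) * time_myr

# Illustrative open-cluster numbers: 100 stars/pc^3, encounters within
# 1000 AU, 1 km/s velocity dispersion, 100 Myr spent in the cluster.
print(expected_encounters(100.0, 1000.0, 1.0, 100.0))
```

Because N scales linearly in each factor, the contrast between an open cluster, a globular cluster, and the Galactic field follows directly from plugging in their densities, dispersions, and residence times.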
Directory of Open Access Journals (Sweden)
Pedro Medina Avendaño
1981-01-01
Full Text Available Carlos Vega Duarte had the simplicity of elemental and pure beings. His heart was as clean as alluvial gold. His direct, colloquial manner revealed a Santanderean free of contamination, who loved the gleam of weapons and was dazzled by the flash of perfect phrases.
Energy Technology Data Exchange (ETDEWEB)
Wollaber, Allan Benton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-06-16
This is a powerpoint presentation which serves as lecture material for the Parallel Computing summer school. It goes over the fundamentals of the Monte Carlo calculation method. The material is presented according to the following outline: Introduction (background, a simple example: estimating π), Why does this even work? (The Law of Large Numbers, The Central Limit Theorem), How to sample (inverse transform sampling, rejection), and An example from particle transport.
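Two items from the outline, the simple π-estimation example and inverse transform sampling, can each be sketched in a few lines. This is a generic illustration, not the lecture's own code:

```python
import math
import random

def estimate_pi(n_samples, seed=0):
    """Monte Carlo estimate of pi: the fraction of uniform points in the
    unit square that fall inside the quarter circle, times 4.  By the
    Law of Large Numbers the estimate converges, and by the Central
    Limit Theorem the error shrinks like 1/sqrt(n)."""
    rng = random.Random(seed)
    inside = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
                 for _ in range(n_samples))
    return 4.0 * inside / n_samples

def sample_exponential(rate, rng):
    """Inverse transform sampling: if U ~ Uniform(0,1), then
    -ln(1 - U) / rate follows an Exponential(rate) distribution,
    because that expression inverts the exponential CDF."""
    return -math.log(1.0 - rng.random()) / rate

print(estimate_pi(100_000))  # close to 3.14159
```

Rejection sampling, the other technique listed, works the opposite way: propose from an easy distribution and keep a proposal with probability proportional to the target density at that point.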
Wormhole Hamiltonian Monte Carlo
Lan, S; Streets, J; Shahbaba, B
2014-01-01
Copyright © 2014, Association for the Advancement of Artificial Intelligence. In machine learning and statistics, probabilistic inference involving multimodal distributions is quite difficult. This is especially true in high dimensional problems, where most existing algorithms cannot easily move from one mode to another. To address this issue, we propose a novel Bayesian inference approach based on Markov Chain Monte Carlo. Our method can effectively sample from multimodal distributions, espe...
International Nuclear Information System (INIS)
Creutz, M.
1986-01-01
The author discusses a recently developed algorithm for simulating statistical systems. The procedure interpolates between molecular dynamics methods and canonical Monte Carlo. The primary advantages are extremely fast simulations of discrete systems such as the Ising model and a relative insensitivity to random number quality. A variation of the algorithm gives rise to a deterministic dynamics for Ising spins. This model may be useful for high speed simulation of non-equilibrium phenomena
Handbook of cosmic hazards and planetary defense
Allahdadi, Firooz
2015-01-01
Covers in comprehensive fashion all aspects of cosmic hazards and possible strategies for contending with these threats through a coordinated planetary defense strategy. This handbook brings together in a single reference work a rich blend of information about the various types of cosmic threats that are posed to human civilization by asteroids, comets, bolides, meteors, solar flares and coronal mass ejections, cosmic radiation and other types of threats that are only recently beginning to be understood and studied, such as investigation of the “cracks” in the protective shield provided by the Van Allen belts and the geomagnetosphere, of matter-antimatter collisions, orbital debris and radiological or biological contamination. Some areas that are addressed involve areas about which there is a good deal of information that has been collected for many decades by multiple space missions run by many different space agencies, observatories and scientific researchers. Other areas involving research and ...
Energy Technology Data Exchange (ETDEWEB)
Brockway, D.; Soran, P.; Whalen, P.
1985-01-01
A Monte Carlo algorithm to efficiently calculate static alpha eigenvalues, N = n e^(αt), for supercritical systems has been developed and tested. A direct Monte Carlo approach to calculating a static α is to simply follow the buildup in time of neutrons in a supercritical system and evaluate the logarithmic derivative of the neutron population with respect to time. This procedure is expensive, and the solution is very noisy and almost useless for a system near critical. The modified approach is to convert the time-dependent problem to a static α-eigenvalue problem and regress α on solutions of a k-eigenvalue problem. In practice, this procedure is much more efficient than the direct calculation and produces much more accurate results. Because the Monte Carlo codes are intrinsically three-dimensional and use elaborate continuous-energy cross sections, this technique is now used as a standard for evaluating other calculational techniques in odd geometries or with group cross sections.
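The direct approach described above, following n(t) = n₀ e^(αt) and taking the logarithmic derivative, reduces to fitting the slope of ln n versus t. A toy sketch on synthetic population data (not a transport calculation), which also hints at why the estimate gets noisy near critical, where the slope approaches zero:

```python
import math
import random

def estimate_alpha(times, populations):
    """Least-squares slope of ln(n) versus t: for n(t) = n0 * exp(alpha*t),
    the logarithmic derivative d(ln n)/dt is the static alpha."""
    logs = [math.log(n) for n in populations]
    t_mean = sum(times) / len(times)
    l_mean = sum(logs) / len(logs)
    num = sum((t - t_mean) * (l - l_mean) for t, l in zip(times, logs))
    den = sum((t - t_mean) ** 2 for t in times)
    return num / den

# Synthetic supercritical population with alpha = 2.0 and 2% noise,
# standing in for a tallied neutron population:
rng = random.Random(1)
times = [0.1 * i for i in range(20)]
pops = [1000.0 * math.exp(2.0 * t) * (1 + 0.02 * rng.gauss(0, 1))
        for t in times]
print(estimate_alpha(times, pops))  # close to 2.0
```

The modified approach in the abstract avoids this time-series regression entirely by recasting the problem as a static eigenvalue calculation.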
Directory of Open Access Journals (Sweden)
Charlie Samuya Veric
2001-12-01
Full Text Available The importance of Carlos Bulosan in Filipino and Filipino-American radical history and literature is indisputable. His eminence spans the Pacific, and he is known, diversely, as a radical poet, fictionist, novelist, and labor organizer. Author of the canonical America Is in the Heart, Bulosan is celebrated for chronicling the conditions in America in his time, such as racism and unemployment. In the history of criticism on Bulosan's life and work, however, there is an undeclared general consensus that views Bulosan and his work as coherent permanent texts of radicalism and anti-imperialism. Central to the existence of such a tradition of critical reception are the generations of critics who, in more ways than one, control the discourse on and of Carlos Bulosan. This essay inquires into the sphere of the critical reception that orders, for our time and for the time ahead, the reading and interpretation of Bulosan. What eye and seeing, the essay asks, determine the perception of Bulosan as the angel of radicalism? What is obscured in constructing Bulosan as an immutable figure of the political? What light does the reader conceive when the personal is brought into the open and situated against the political? The essay explores the answers to these questions in Bulosan's loving letters to various friends, strangers, and white American women. The presence of these interrogations, the essay believes, will ultimately secure the continuing importance of Carlos Bulosan to radical literature and history.
2009-01-01
On 7 April CERN will be holding a symposium to mark the 75th birthday of Carlo Rubbia, who shared the 1984 Nobel Prize for Physics with Simon van der Meer for contributions to the discovery of the W and Z bosons, carriers of the weak interaction. Following a presentation by Rolf Heuer, lectures will be given by eminent speakers on areas of science to which Carlo Rubbia has made decisive contributions. Michel Spiro, Director of the French National Institute of Nuclear and Particle Physics (IN2P3) of the CNRS, Lyn Evans, sLHC Project Leader, and Alan Astbury of the TRIUMF Laboratory will talk about the physics of the weak interaction and the discovery of the W and Z bosons. Former CERN Director-General Herwig Schopper will lecture on CERN’s accelerators from LEP to the LHC. Giovanni Bignami, former President of the Italian Space Agency and Professor at the IUSS School for Advanced Studies in Pavia will speak about his work with Carlo Rubbia. Finally, Hans Joachim Sch...
VARIATIONAL PRINCIPLE FOR PLANETARY INTERIORS
International Nuclear Information System (INIS)
Zeng, Li; Jacobsen, Stein B.
2016-01-01
In the past few years, the number of confirmed planets has grown above 2000. It is clear that they represent a diversity of structures not seen in our own solar system. In addition to very detailed interior modeling, it is valuable to have a simple analytical framework for describing planetary structures. The variational principle is a fundamental principle in physics, entailing that a physical system follows the trajectory that minimizes its action. It is an alternative to the differential-equation formulation of a physical system. Applying the variational principle to planetary interiors elegantly condenses the set of differential equations into a single one, providing insight into the problem. From this principle, a universal mass–radius relation, an estimate of the error propagation from the equation of state to the mass–radius relation, and a form of the virial theorem applicable to planetary interiors are derived.
Radiation Tolerant Temperature-Invariant Scintillation Modules, Phase II
National Aeronautics and Space Administration — Radiation detectors are an invaluable tool for space applications spanning planetary science, astrophysics, heliophysics, space weather, and dosimetry for human...
Status of Monte Carlo at Los Alamos
International Nuclear Information System (INIS)
Thompson, W.L.; Cashwell, E.D.
1980-01-01
At Los Alamos the early work of Fermi, von Neumann, and Ulam has been developed and supplemented by many followers, notably Cashwell and Everett, and the main product today is the continuous-energy, general-purpose, generalized-geometry, time-dependent, coupled neutron-photon transport code called MCNP. The Los Alamos Monte Carlo research and development effort is concentrated in Group X-6. MCNP treats an arbitrary three-dimensional configuration of arbitrary materials in geometric cells bounded by first- and second-degree surfaces and some fourth-degree surfaces (elliptical tori). Monte Carlo has evolved into perhaps the main method for radiation transport calculations at Los Alamos. MCNP is used in every technical division at the Laboratory by over 130 users about 600 times a month accounting for nearly 200 hours of CDC-7600 time
Monte Carlo simulation of gas Cerenkov detectors
International Nuclear Information System (INIS)
Mack, J.M.; Jain, M.; Jordan, T.M.
1984-01-01
Theoretical study of selected gamma-ray and electron diagnostics necessitates coupling Cerenkov radiation to electron/photon cascades. A Cerenkov production model and its incorporation into a general-geometry Monte Carlo coupled electron/photon transport code are discussed. A special optical-photon ray trace is implemented using bulk optical properties assigned to each Monte Carlo zone. Good agreement exists between experimental and calculated Cerenkov data in the case of a carbon-dioxide gas Cerenkov detector experiment. Cerenkov production and threshold data are presented for a typical carbon-dioxide gas detector that converts a 16.7 MeV photon source to Cerenkov light, which is collected by optics and detected by a photomultiplier
Status of Monte Carlo at Los Alamos
International Nuclear Information System (INIS)
Thompson, W.L.; Cashwell, E.D.; Godfrey, T.N.K.; Schrandt, R.G.; Deutsch, O.L.; Booth, T.E.
1980-05-01
Four papers were presented by Group X-6 on April 22, 1980, at the Oak Ridge Radiation Shielding Information Center (RSIC) Seminar-Workshop on Theory and Applications of Monte Carlo Methods. These papers are combined into one report for convenience and because they are related to each other. The first paper (by Thompson and Cashwell) is a general survey about X-6 and MCNP and is an introduction to the other three papers. It can also serve as a resume of X-6. The second paper (by Godfrey) explains some of the details of geometry specification in MCNP. The third paper (by Cashwell and Schrandt) illustrates calculating flux at a point with MCNP; in particular, the once-more-collided flux estimator is demonstrated. Finally, the fourth paper (by Thompson, Deutsch, and Booth) is a tutorial on some variance-reduction techniques. It should be required reading for a fledgling Monte Carlo practitioner
Nass, Andrea; van Gasselt, Stephan; Hargitai, Hendrik; Hare, Trent; Manaud, Nicolas; Karachevtseva, Irina; Kersten, Elke; Roatsch, Thomas; Wählisch, Marita; Kereszturi, Akos
2016-04-01
Cartography is one of the most important communication channels between users of spatial information and the public at large. This applies to all known real-world objects located either here on Earth or on any other body in our Solar System. In planetary sciences, however, the main use of cartography resides in a concept called planetary mapping, with its various attached meanings: it can be (1) systematic spacecraft observation from orbit, i.e., the retrieval of physical information; (2) the interpretation of discrete planetary surface units and their abstraction; or (3) planetary cartography sensu stricto, i.e., the technical and artistic creation of map products. As the concept of planetary mapping covers a wide range of different information and knowledge levels, the aims associated with it consequently range from a technical and engineering focus to a scientific distillation process. Among others, scientific centers focusing on planetary cartography are the United States Geological Survey (USGS, Flagstaff), the Moscow State University of Geodesy and Cartography (MIIGAiK, Moscow), Eötvös Loránd University (ELTE, Hungary), and the German Aerospace Center (DLR, Berlin). The International Astronomical Union (IAU), the Commission on Planetary Cartography within the International Cartographic Association (ICA), the Open Geospatial Consortium (OGC), the WG IV/8 Planetary Mapping and Spatial Databases within the International Society for Photogrammetry and Remote Sensing (ISPRS), and a range of other institutions contribute to definition frameworks in planetary cartography. Classical cartography is nowadays often (mis)understood mainly as a tool rather than as a scientific discipline and an art of communication. Consequently, concepts of information systems, mapping tools, and cartographic frameworks are used interchangeably, and cartographic workflows and the visualization of spatial information in thematic maps have often been
SPEX: The spectropolarimeter for planetary EXploration
Snik, F.; Rietjens, J.H.H.; Harten, G. van; Stam, D.M.; Keller, C.U.; Smit, J.M.; Laan, E.C.; Verlaan, A.L.; Horst, R. ter; Navarro, R.; Wielinga, K.; Moon, S.G.; Voors, R.
2010-01-01
SPEX (Spectropolarimeter for Planetary EXploration) is an innovative, compact instrument for spectropolarimetry, and in particular for detecting and characterizing aerosols in planetary atmospheres. With its ∼1-liter volume it is capable of full linear spectropolarimetry, without moving parts. The
Red giants as precursors of planetary nebulae
International Nuclear Information System (INIS)
Renzini, A.
1981-01-01
It is generally accepted that Planetary Nebulae are produced by asymptotic giant-branch stars. Therefore, several properties of planetary nebulae are discussed in the framework of the current theory of stellar evolution. (Auth.)
Usefulness of the Monte Carlo method in reliability calculations
International Nuclear Information System (INIS)
Lanore, J.M.; Kalli, H.
1977-01-01
Three examples of reliability Monte Carlo programs developed in the LEP (Laboratory for Radiation Shielding Studies at the Saclay Nuclear Research Center) are presented. First, an uncertainty analysis is given for a simplified spray system; the Monte Carlo program PATREC-MC was written to solve this problem, with the system components given in fault tree representation. The second program, MONARC 2, was written to solve the reliability of complex systems by Monte Carlo simulation; here again the system (a residual heat removal system) is given in fault tree representation. Third, the Monte Carlo program MONARC was used instead of the Markov diagram approach to simulate an electric power supply comprising two networks and two stand-by diesels
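The fault-tree approach described in this abstract can be sketched with a direct Monte Carlo trial loop. The system below (one valve in series with two redundant pumps) and its failure probabilities are hypothetical, not taken from PATREC-MC or MONARC:

```python
import random

def simulate_fault_tree(p_pump=0.05, p_valve=0.01, trials=100_000, seed=1):
    """Monte Carlo unreliability of a toy fault tree:
    TOP = valve fails OR (pump A fails AND pump B fails).
    Component probabilities are hypothetical illustration values."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        pump_a = rng.random() < p_pump   # basic event: pump A fails
        pump_b = rng.random() < p_pump   # basic event: pump B fails
        valve = rng.random() < p_valve   # basic event: valve fails
        if valve or (pump_a and pump_b):  # gate logic of the fault tree
            failures += 1
    return failures / trials

est = simulate_fault_tree()
# Exact top-event probability: 1 - (1 - 0.01) * (1 - 0.05**2) = 0.012475
```

The strength of the Monte Carlo approach over analytic evaluation is that the per-trial gate logic can be made arbitrarily complex (repair times, common-cause failures) without changing the estimator.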
Monte Carlo capabilities of the SCALE code system
International Nuclear Information System (INIS)
Rearden, B.T.; Petrie, L.M.; Peplow, D.E.; Bekar, K.B.; Wiarda, D.; Celik, C.; Perfetti, C.M.; Ibrahim, A.M.; Hart, S.W.D.; Dunn, M.E.; Marshall, W.J.
2015-01-01
Highlights: • Foundational Monte Carlo capabilities of SCALE are described. • Improvements in continuous-energy treatments are detailed. • New methods for problem-dependent temperature corrections are described. • New methods for sensitivity analysis and depletion are described. • Nuclear data, users interfaces, and quality assurance activities are summarized. - Abstract: SCALE is a widely used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE for nuclear safety analysis and design. SCALE provides a “plug-and-play” framework that includes three deterministic and three Monte Carlo radiation transport solvers that can be selected based on the desired solution, including hybrid deterministic/Monte Carlo simulations. SCALE includes the latest nuclear data libraries for continuous-energy and multigroup radiation transport as well as activation, depletion, and decay calculations. SCALE’s graphical user interfaces assist with accurate system modeling, visualization, and convenient access to desired results. SCALE 6.2 will provide several new capabilities and significant improvements in many existing features, especially with expanded continuous-energy Monte Carlo capabilities for criticality safety, shielding, depletion, and sensitivity and uncertainty analysis. An overview of the Monte Carlo capabilities of SCALE is provided here, with emphasis on new features for SCALE 6.2
GTR Component of Planetary Precession
Indian Academy of Sciences (India)
detection of gravitational waves has only augmented their enthusiasm about the General Theory of Relativity ... the GTR advance of the perihelion of planetary motion about the Sun. 1. Introduction. When you throw an ... Mercury's orbit was estimated to advance by about 565 seconds of arc per Earth-century. It is also now ...
Planetary imaging with amateur astronomical instruments
Papathanasopoulos, K.; Giannaris, G.
2017-09-01
Planetary imaging varies with the type and size of instrument and with the processing applied. With basic amateur telescopes and software, images can be captured of our planetary system, mainly Jupiter, Saturn and Mars, but also of solar eclipses, solar flares, and much more. Planetary photos can be useful to professional astronomers, and amateur astronomers can thus play a role in that field.
International Nuclear Information System (INIS)
Talley, T.L.; Evans, F.
1988-01-01
Prior work demonstrated the importance of nuclear scattering to fusion product energy deposition in hot plasmas. This suggests careful examination of nuclear physics details in burning plasma simulations. An existing Monte Carlo fast ion transport code is being expanded to be a test bed for this examination. An initial extension, the energy deposition of fast alpha particles in a hot deuterium plasma, is reported. The deposition times and deposition ranges are modified by allowing nuclear scattering. Up to 10% of the initial alpha particle energy is carried to greater ranges and times by the more mobile recoil deuterons. 4 refs., 5 figs., 2 tabs
Monte Carlo techniques in diagnostic and therapeutic nuclear medicine
International Nuclear Information System (INIS)
Zaidi, H.
2002-01-01
Monte Carlo techniques have become one of the most popular tools in different areas of medical radiation physics following the development and subsequent implementation of powerful computing systems for clinical use. In particular, they have been extensively applied to simulate processes involving random behaviour and to quantify physical parameters that are difficult or even impossible to calculate analytically or to determine by experimental measurements. The use of the Monte Carlo method to simulate radiation transport has turned out to be the most accurate means of predicting absorbed dose distributions and other quantities of interest in the radiation treatment of cancer patients using either external or radionuclide radiotherapy. The same trend has occurred for the estimation of the absorbed dose in diagnostic procedures using radionuclides. There is broad consensus that the earliest Monte Carlo calculations in medical radiation physics were made in the area of nuclear medicine, where the technique was used for dosimetry modelling and computations. Formalism and data based on Monte Carlo calculations, developed by the Medical Internal Radiation Dose (MIRD) committee of the Society of Nuclear Medicine, were published in a series of supplements to the Journal of Nuclear Medicine, the first one being released in 1968. Some of these pamphlets made extensive use of Monte Carlo calculations to derive specific absorbed fractions for electron and photon sources uniformly distributed in organs of mathematical phantoms. Interest in Monte Carlo-based dose calculations with β-emitters has been revived with the application of radiolabelled monoclonal antibodies to radioimmunotherapy. As a consequence of this generalized use, many questions are being raised, primarily about the need for and potential of Monte Carlo techniques, but also about how accurate it really is, what it would take to apply it clinically and make it available widely to the medical physics
Estimation of flux distributions with Monte Carlo functional expansion tallies
International Nuclear Information System (INIS)
Griesheimer, D. P.; Martin, W. R.; Holloway, J. P.
2005-01-01
Monte Carlo methods provide a powerful technique for estimating the average radiation flux in a volume (or across a surface) in cases where analytical solutions may not be possible. Unfortunately, Monte Carlo simulations typically provide only integral results and do not offer any further details about the distribution of the flux with respect to space, angle, time or energy. In the functional expansion tally (FET) a Monte Carlo simulation is used to estimate the functional expansion coefficients for flux distributions with respect to an orthogonal set of basis functions. The expansion coefficients are then used in post-processing to reconstruct a series approximation to the true distribution. Discrete event FET estimators are derived and their application in estimating radiation flux or current distributions is demonstrated. Sources of uncertainty in the FET are quantified and estimators for the statistical and truncation errors are derived. Numerical results are presented to support the theoretical development. (authors)
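A minimal sketch of a functional expansion tally, assuming Legendre polynomials as the orthogonal basis and a simple linear density standing in for a flux distribution; the basis choice and the test density are illustrative, not taken from the paper:

```python
import math
import random

def legendre(n, x):
    """Legendre polynomial P_n(x) via Bonnet's recursion."""
    p0, p1 = 1.0, x
    if n == 0:
        return p0
    for k in range(1, n):
        p0, p1 = p1, ((2 * k + 1) * x * p1 - k * p0) / (k + 1)
    return p1

def fet_coefficients(samples, order):
    """FET estimator: a_n = E[P_n(x)] averaged over the Monte Carlo sample."""
    n_s = len(samples)
    return [sum(legendre(n, x) for x in samples) / n_s for n in range(order + 1)]

def reconstruct(coeffs, x):
    """Series approximation f(x) ≈ sum_n (2n+1)/2 * a_n * P_n(x),
    using Legendre orthogonality on [-1, 1]."""
    return sum((2 * n + 1) / 2 * a * legendre(n, x) for n, a in enumerate(coeffs))

rng = random.Random(42)
# Draw from the linear density f(x) = (1 + x)/2 on [-1, 1] by inverse CDF:
# F(x) = (1 + x)**2 / 4, so x = 2*sqrt(u) - 1
samples = [2 * math.sqrt(rng.random()) - 1 for _ in range(50_000)]
coeffs = fet_coefficients(samples, order=3)
# reconstruct(coeffs, 0.5) should lie close to the true f(0.5) = 0.75
```

Note how the same particle histories that would score an integral tally also score every expansion coefficient, which is what lets the FET recover spatial detail in post-processing.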
Monte Carlo Generation of the 2BN Bremsstrahlung Distribution
Peralta, L; Trindade, A
2003-01-01
The 2BN bremsstrahlung cross-section is a well-adapted distribution to describe the radiative processes at low electron kinetic energy (Ek<500 keV). In this work a method to implement this distribution in a Monte Carlo generator is developed.
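The general method of implementing a cross-section in a Monte Carlo generator can be illustrated with acceptance-rejection sampling against an analytically invertible envelope. The spectrum below is a schematic 1/k-type shape, not the actual 2BN formula:

```python
import random

def sample_bremsstrahlung_k(e_kin, k_min, rng):
    """Rejection sampling of a photon energy k from a schematic
    bremsstrahlung-like spectrum f(k) ∝ (1/k) * (1 - k/E + (k/E)**2);
    NOT the 2BN cross-section. Envelope: g(k) ∝ 1/k (invertible)."""
    while True:
        # Inverse-CDF sample from the 1/k envelope on [k_min, e_kin]
        u = rng.random()
        k = k_min * (e_kin / k_min) ** u
        r = k / e_kin
        shape = 1 - r + r * r            # f/g ratio, always in (0, 1]
        if rng.random() < shape:         # accept with probability f/g
            return k

rng = random.Random(7)
# Photon energies in keV for a 500 keV electron, hypothetical 10 keV cutoff
energies = [sample_bremsstrahlung_k(500.0, 10.0, rng) for _ in range(10_000)]
```

The envelope is chosen so that its inverse CDF is cheap and the rejection ratio stays near one, which is the usual efficiency trade-off in such generators.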
The Anthropocene: A Planetary Perspective
Anbar, A. D.; Hartnett, H. E.; York, A.; Selin, C.
2016-12-01
The Anthropocene is a new planetary epoch defined by the emergence of human activity as one of the most important driving forces on Earth, rivaling and also stressing the other systems that govern the planet's habitability. Public discussions and debates about the challenges of this epoch tend to be polarized. One extreme denies that humans have a planetary-scale impact, while the other wishes that this impact could disappear. The tension between these perspectives is often paralyzing. Effective adaptation and mitigation requires a new perspective that reframes the conversation. We propose a planetary perspective according to which this epoch is the result of a recent major innovation in the 4 billion year history of life on Earth: the emergence of an energy-intensive planetary civilization. The rate of human energy use is already within an order of magnitude of that of the rest of the biosphere, and rising rapidly, and so this innovation is second only to the evolution of photosynthesis in terms of energy capture and utilization by living systems. Such energy use has and will continue to affect Earth at planetary scale. This reality cannot be denied nor wished away. From this pragmatic perspective, the Anthropocene is not an unnatural event that can be reversed, as though humanity is separate from the Earth systems with which we are co-evolving. Rather, it is an evolutionary transition to be managed. This is the challenge of turning a carelessly altered planet into a carefully designed and managed world, maintaining a "safe operating space" for human civilization (Steffen et al., 2011). To do so, we need an integrated approach to Earth systems science that considers humans as a natural and integral component of Earth's systems. Insights drawn from the humanities and the social sciences must be integrated with the natural sciences in order to thrive in this new epoch. This type of integrated perspective is relatively uncontroversial on personal, local, and even
Monte Carlo calculation of ''skyshine'' neutron dose from ALS [Advanced Light Source]
International Nuclear Information System (INIS)
Moin-Vasiri, M.
1990-06-01
This report discusses the following topics on ''skyshine'' neutron dose from ALS: Sources of radiation; ALS modeling for skyshine calculations; MORSE Monte-Carlo; Implementation of MORSE; Results of skyshine calculations from storage ring; and Comparison of MORSE shielding calculations
From red giants to planetary nebulae
International Nuclear Information System (INIS)
Kwok, S.
1982-01-01
The transition from red giants to planetary nebulae is studied by comparing the spectral characteristics of red giant envelopes and planetary nebulae. Observational and theoretical evidence both suggest that remnants of red giant envelopes may still be present in planetary nebula systems and should have significant effects on their formation. The dynamical effects of the interaction of stellar winds from central stars of planetary nebulae with the remnant red giant envelopes are evaluated and the mechanism found to be capable of producing the observed masses and momenta of planetary nebulae. The observed mass-radii relation of planetary nebulae may also be best explained by the interacting winds model. The possibility that red giant mass loss, and therefore the production of planetary nebulae, is different between Population I and II systems is also discussed
Energy Technology Data Exchange (ETDEWEB)
Lomax, Jamie R.; Wisniewski, John P.; Hashimoto, Jun [Homer L. Dodge Department of Physics, University of Oklahoma, Norman, OK 73071 (United States); Grady, Carol A. [Exoplanets and Stellar Astrophysics Laboratory, Code 667, Goddard Space Flight Center, Greenbelt, MD 20771 (United States); McElwain, Michael W. [NASA Goddard Space Flight Center, Code 6681, Greenbelt, MD 20771 (United States); Kudo, Tomoyuki; Currie, Thayne M; Egner, Sebastian; Guyon, Olivier; Hayano, Yutaka [Subaru Telescope, National Astronomical Observatory of Japan, 650 North A’ohoku Place, Hilo, HI 96720 (United States); Kusakabe, Nobuhiko; Hayashi, Masahiko [National Astronomical Observatory of Japan, 2-21-1, Osawa, Mitaka, Tokyo, 181-8588 (Japan); Okamoto, Yoshiko K. [Institute of Astrophysics and Planetary Sciences, Faculty of Science, Ibaraki University, 2-1-1 Bunkyo, Mito, Ibaraki 310-8512 (Japan); Fukagawa, Misato [Graduate School of Science, Osaka University, 1-1 Machikaneyama, Toyonaka, Osaka 560-0043 (Japan); Abe, Lyu [Laboratoire Lagrange (UMR 7293), Universite de Nice-Sophia Antipolis, CNRS, Observatoire de la Cote d’Azur, 28 avenue Valrose, F-06108 Nice Cedex 2 (France); Brandner, Wolfgang; Feldt, Markus [Max Planck Institute for Astronomy, Königstuhl 17, D-69117 Heidelberg (Germany); Brandt, Timothy D. [Astrophysics Department, Institute for Advanced Study, Princeton, NJ 08540 (United States); Carson, Joseph C. [Department of Physics and Astronomy, College of Charleston, 58 Coming Street, Charleston, SC 29424 (United States); Goto, Miwa, E-mail: Jamie.R.Lomax@ou.edu, E-mail: wisniewski@ou.edu, E-mail: carol.a.grady@nasa.gov [Universitäts-Sternwarte München, Ludwig-Maximilians-Universität, Scheinerstr. 1, D-81679 München (Germany); and others
2016-09-01
We present a new analysis of multi-epoch, H-band, scattered light images of the AB Aur system. We use a Monte Carlo radiative transfer code to simultaneously model the system’s spectral energy distribution (SED) and H-band polarized intensity (PI) imagery. We find that a disk-dominated model, as opposed to one that is envelope-dominated, can plausibly reproduce AB Aur’s SED and near-IR imagery. This is consistent with previous modeling attempts presented in the literature and supports the idea that at least a subset of AB Aur’s spirals originate within the disk. In light of this, we also analyzed the movement of spiral structures in multi-epoch H-band total light and PI imagery of the disk. We detect no significant rotation or change in spatial location of the spiral structures in these data, which span a 5.8-year baseline. If such structures are caused by disk–planet interactions, the lack of observed rotation constrains the location of the orbit of planetary perturbers to be >47 au.
An improved wavelength selection scheme for Monte Carlo solvers applied to hypersonic plasmas
International Nuclear Information System (INIS)
Feldick, Andrew; Modest, Michael F.
2011-01-01
A new databasing scheme is developed for Monte Carlo Ray Tracing methods applied to hypersonic planetary entry. In this scheme, the complex relationships for the emission wavelength selection of atomic and molecular species in nonequilibrium flows are simplified by developing random number relationships for individual transitions, as opposed to using relationships for the spectral emission coefficient of a given species. These new techniques speed up wavelength selection by about 2 orders of magnitude, and offer flexibility for use in weighted or part-spectrum Monte Carlo solvers.
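The idea of precomputing random-number relationships so that a single draw selects an emission transition can be sketched with a cumulative table and binary search; the line list below is hypothetical, invented for illustration:

```python
import bisect
import random

def build_line_cdf(lines):
    """Precompute a normalized cumulative table over discrete emission
    lines so one random number selects a transition (hypothetical data)."""
    total = 0.0
    cdf, wavelengths = [], []
    for wavelength_nm, emission in lines:
        total += emission
        cdf.append(total)
        wavelengths.append(wavelength_nm)
    return [c / total for c in cdf], wavelengths

def sample_wavelength(cdf, wavelengths, rng):
    """O(log n) table lookup instead of re-integrating the spectrum
    for every emitted photon bundle."""
    return wavelengths[bisect.bisect_left(cdf, rng.random())]

# Hypothetical line list: (wavelength in nm, relative emission strength)
lines = [(120.0, 5.0), (149.3, 2.0), (174.3, 1.5), (868.0, 0.5)]
cdf, wl = build_line_cdf(lines)
rng = random.Random(3)
picks = [sample_wavelength(cdf, wl, rng) for _ in range(10_000)]
```

In a nonequilibrium flow the emission strengths depend on the local state, so a real implementation would rebuild or interpolate such tables per cell; the lookup step itself stays the same.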
Solar Variability and Planetary Climates
Calisesi, Y; Gray, L; Langen, J; Lockwood, M
2007-01-01
Variations in solar activity, as revealed by variations in the number of sunspots, have been observed since ancient times. To what extent changes in the solar output may affect planetary climates, though, remains today more than ever a subject of controversy. In 2000, the SSSI volume on Solar Variability and Climate reviewed the to-date understanding of the physics of solar variability and of the associated climate response. The present volume on Solar Variability and Planetary Climates provides an overview of recent advances in this field, with particular focus on the Earth's middle and lower atmosphere. The book structure mirrors that of the ISSI workshop held in Bern in June 2005, the collection of invited workshop contributions and of complementary introductory papers synthesizing the current understanding in key research areas such as middle atmospheric processes, stratosphere-troposphere dynamical coupling, tropospheric aerosols chemistry, solar storm influences, solar variability physics, and terrestri...
Teaching, Learning, and Planetary Exploration
Brown, Robert A.
2002-01-01
This is the final report of a program that examined the fundamentals of education associated with space activities, promoted educational policy development in appropriate forums, and developed pathfinder products and services to demonstrate the utility of advanced communication technologies for space-based education. Our focus was on space astrophysics and planetary exploration, with a special emphasis on the themes of the Origins Program, with which the Principal Investigator (PI) had been involved from the outset. Teaching, Learning, and Planetary Exploration was also the core funding of the Space Telescope Science Institute's (ST ScI) Special Studies Office (SSO), and as such had provided basic support for such important NASA studies as the fix for Hubble Space Telescope (HST) spherical aberration, scientific conception of the HST Advanced Camera, specification of the Next-Generation Space Telescope (NGST), and the strategic plan for the second decade of the HST science program.
The PSA: Planetary Science Archive
Barthelemy, M.; Martinez, S.; Heather, D.; Vazquez, J. L.; Arviset, C.; Osuna, P.; PSA development Team
2012-04-01
Scientific and engineering data from ESA's planetary missions are made accessible to the world-wide scientific community via the Planetary Science Archive (PSA). The PSA consists of online services incorporating search, preview, download, notification and delivery basket functionality. Besides data from the GIOTTO spacecraft and several ground-based cometary observations, the PSA contains data from the Mars Express, Venus Express, Rosetta, SMART-1 and Huygens missions. The focus of the PSA activities is on the long-term preservation of data and knowledge from ESA's planetary missions. Scientific users can access the data online using several interfaces: - The Advanced Search Interface allows complex parameter-based queries, providing the end user with a facility to complete very specific searches on meta-data and geometrical parameters. By nature, this interface requires careful use and heavy interaction with the end-user to input and control the relevant search parameters. - The Map-based Interface is currently operational only for Mars Express HRSC and OMEGA data. This interface allows an end-user to specify a region-of-interest by dragging a box onto a base map of Mars. From this interface, it is possible to directly visualize query results. The Map-based and Advanced interfaces are linked and cross-compatible. If a user defines a region-of-interest in the Map-based interface, the results can be refined by entering more detailed search parameters in the Advanced interface. - The FTP Browser Interface is designed for more experienced users, and allows for direct browsing and access of the data set content through ftp-tree search. Each dataset contains documentation and calibration information in addition to the scientific or engineering data. All data are prepared by the corresponding instrument teams, mostly located in Europe. PSA supports the instrument teams in the full archiving process, from the definition of the data products, meta-data and product labels
Numerical models of planetary dynamos
International Nuclear Information System (INIS)
Glatzmaier, G.A.; Roberts, P.H.
1992-01-01
We describe a nonlinear, axisymmetric, spherical-shell model of planetary dynamos. This intermediate-type dynamo model requires a prescribed helicity field (the alpha effect) and a prescribed buoyancy force or thermal wind (the omega effect) and solves for the axisymmetric time-dependent magnetic and velocity fields. Three very different time dependent solutions are obtained from different prescribed sets of alpha and omega fields
Stream Lifetimes Against Planetary Encounters
Valsecchi, G. B.; Lega, E.; Froeschle, Cl.
2011-01-01
We study, both analytically and numerically, the perturbation induced by an encounter with a planet on a meteoroid stream. Our analytical tool is the extension of Öpik's theory of close encounters, which we apply to streams described by geocentric variables. The resulting formulae are used to compute the rate at which a stream is dispersed by planetary encounters into the sporadic background. We have verified the accuracy of the analytical model using a numerical test.
Planetary rovers and data fusion
Masuku, Anthony Dumisani
2012-01-01
This research investigates the problem of position estimation for planetary rovers. Diverse algorithmic filters are available for collecting input data and transforming those data into useful information for the position estimation process. The terrain has sandy soil, which might cause slipping of the robot, and small stones and pebbles, which can affect the trajectory. The Kalman filter, a state estimation algorithm, was used for fusing the sensor data to improve the p...
Gazetteer of planetary nomenclature 1994
Batson, Raymond M.; Russell, Joel F.
1995-01-01
Planetary nomenclature, like terrestrial nomenclature, is used to uniquely identify a feature on the surface of a planet or satellite so that the feature can be easily located, described, and discussed. This volume contains detailed information about all names of topographic and albedo features on planets and satellites (and some planetary ring and ring-gap systems) that the International Astronomical Union has named and approved from its founding in 1919 through its triennial meeting in 1994.This edition of the Gazetteer of Planetary Nomenclature supersedes an earlier informal volume distributed by the U.S. Geological Survey in 1986 as Open-File Report 84-692 (Masursky and others, 1986). Named features are depicted on maps of the Moon published first by the U.S. Defense Mapping Agency or the Aeronautical Chart and Information Center and more recently by the U.S. Geological Survey; on maps of Mercury, Venus, Mars, and the satellites of Jupiter, Saturn, and Uranus published by the U.S. Geological Survey; and on maps of the Moon, Venus, and Mars produced by the U.S.S.R.Although we have attempted to check the accuracy of all data in this volume, we realize that some errors will remain in a work of this size. Readers noting errors or omissions are urged to communicate them to the U.S. Geological Survey, Branch of Astrogeology, Rm. 409, 2255 N. Gemini Drive, Flagstaff, AZ 86001.
Evolution of planetary nebula nuclei
International Nuclear Information System (INIS)
Shaw, R.A.
1985-01-01
The evolution of planetary nebula nuclei (PNNs) is examined with the aid of the most recent available stellar evolution calculations and new observations of these objects. Their expected distribution in the log L-log T plane is calculated based upon the stellar evolutionary models of Paczynski, Schoenberner and Iben, the initial mass function derived by Miller and Scalo, and various assumptions concerning mass loss during post-main sequence evolution. The distribution is found to be insensitive both to the assumed range of main-sequence progenitor mass and to reasonable variations in the age and the star forming history of the galactic disk. Rather, the distribution is determined by the strong dependence of the rate of stellar evolution upon core mass, the steepness of the initial mass function, and to a lesser extent the finite lifetime of an observable planetary nebula. The theoretical distributions are rather different than any of those inferred from earlier observations. Possible observational selection effects that may be responsible are examined, as well as the intrinsic uncertainties associated with the theoretical model predictions. An extensive photometric and smaller photographic survey of southern hemisphere planetary nebulae (PNs) is presented
Angular biasing in implicit Monte-Carlo
International Nuclear Information System (INIS)
Zimmerman, G.B.
1994-01-01
Calculations of indirect drive Inertial Confinement Fusion target experiments require an integrated approach in which laser irradiation and radiation transport in the hohlraum are solved simultaneously with the symmetry, implosion and burn of the fuel capsule. The Implicit Monte Carlo method has proved to be a valuable tool for the two dimensional radiation transport within the hohlraum, but the impact of statistical noise on the symmetric implosion of the small fuel capsule is difficult to overcome. We present an angular biasing technique in which an increased number of low weight photons are directed at the imploding capsule. For typical parameters this reduces the required computer time for an integrated calculation by a factor of 10. An additional factor of 5 can also be achieved by directing even smaller weight photons at the polar regions of the capsule where small mass zones are most sensitive to statistical noise
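The weight bookkeeping behind such angular biasing can be illustrated on a two-region angular partition: photons aimed at the capsule cone are over-sampled and carry correspondingly reduced statistical weight, so all tallies stay unbiased. The cone fraction and bias probability below are invented for illustration:

```python
import random

def emit_biased(frac_cone, bias, rng):
    """Angular biasing sketch: a photon is emitted into the capsule cone
    (true angular probability frac_cone) with inflated probability `bias`,
    and its statistical weight is corrected by the ratio of true to
    biased probability so that weighted tallies remain unbiased."""
    if rng.random() < bias:
        return True, frac_cone / bias              # in cone, low weight
    return False, (1 - frac_cone) / (1 - bias)     # outside, high weight

rng = random.Random(11)
frac_cone, bias = 0.05, 0.5   # 10x more photons directed at the capsule
n = 200_000
in_cone_weight = 0.0
for _ in range(n):
    in_cone, w = emit_biased(frac_cone, bias, rng)
    if in_cone:
        in_cone_weight += w
# The weighted in-cone fraction recovers the true 5% despite 50% sampling
```

The variance reduction quoted in the abstract comes from exactly this trade: ten times the sample count on the capsule at the cost of tracking per-photon weights.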
Modeling of the Lunar Radiation Environment
International Nuclear Information System (INIS)
De Angelis, G.; Badavi, F.F.; Clem, J.M.; Blattnig, S.R.; Clowdsley, M.S.; Nealy, J.E.; Tripathi, R.K.; Wilson, J.W.
2007-01-01
In view of manned missions targeted to the Moon, for which radiation exposure is one of the greatest challenges to be tackled, it is of fundamental importance to have available a tool, which allows the determination of the particle flux and spectra at any time and at any point of the lunar surface. With this goal in mind, a new model of the Moon's radiation environment due to Galactic Cosmic Rays (GCR) and Solar Particle Events (SPE) has been developed. Primary particles reach the lunar surface, and are transported all throughout the subsurface layers, with backscattering patterns taken into account. The surface itself has been modeled as regolith and bedrock, with composition taken from the results of the instruments flown on the Apollo missions. Subsurface environments like lava tubes have been considered in the analysis. Particle transport has been performed with both deterministic and Monte Carlo codes with an adaptation for planetary surface geometry. Results are given in terms of fluxes, doses and LET, for most kinds of particles for various kinds of soil and rock chemical compositions
Monte Carlo Methods in Physics
International Nuclear Information System (INIS)
Santoso, B.
1997-01-01
The method of Monte Carlo integration is reviewed briefly and some of its applications in physics are explained. A numerical experiment on the random generators used in Monte Carlo techniques is carried out to show the randomness behavior of the various generation methods. To account for the weight function involved in the Monte Carlo integrand, the Metropolis method is used. The results of the experiment show no regular pattern in the generated numbers, indicating that the generators are reasonably good, while the experimental results follow the expected statistical distribution law. Further applications of the Monte Carlo method in physics are given. The physical problems are chosen such that the models have available solutions, either exact or approximate, against which the Monte Carlo calculations can be compared. The comparisons show that, for the models considered, good agreement is obtained
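A minimal sketch of the Metropolis method handling a weight function, here a standard Gaussian weight with the observable x² (whose exact expectation is 1); the step size and burn-in fraction are arbitrary choices, not from the paper:

```python
import math
import random

def metropolis_expectation(log_weight, observable, steps, step_size, rng):
    """Metropolis sampling of a target weight function, as used when the
    Monte Carlo integrand carries a non-uniform weight. Returns the
    sample average of `observable` after discarding a burn-in phase."""
    x = 0.0
    total, kept = 0.0, 0
    for i in range(steps):
        proposal = x + rng.uniform(-step_size, step_size)
        # Accept with probability min(1, w(proposal)/w(x)), in log form
        if math.log(rng.random() + 1e-300) < log_weight(proposal) - log_weight(x):
            x = proposal
        if i >= steps // 10:          # discard the first 10% as burn-in
            total += observable(x)
            kept += 1
    return total / kept

rng = random.Random(0)
# Weight exp(-x**2 / 2) is an (unnormalized) standard Gaussian, so the
# estimate of <x**2> should land near the exact value 1
est = metropolis_expectation(lambda x: -x * x / 2, lambda x: x * x,
                             200_000, 1.0, rng)
```

Because Metropolis only needs weight ratios, the normalization constant of the weight function never has to be computed, which is the method's main appeal here.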
The role of planetary waves in the tropospheric jet response to stratospheric cooling
Smith, Karen L.; Scott, Richard K.
2016-03-01
An idealized general circulation model is used to assess the importance of planetary-scale waves in determining the position of the tropospheric jet, specifically its tendency to shift poleward as winter stratospheric cooling is increased. Full model integrations are compared against integrations in which planetary waves are truncated in the zonal direction, and only synoptic-scale waves are retained. Two series of truncated integrations are considered, using (i) a modified radiative equilibrium temperature or (ii) a nudged-bias correction technique. Both produce tropospheric climatologies that are similar to the full model when stratospheric cooling is weak. When stratospheric cooling is increased, the results indicate that the interaction between planetary- and synoptic-scale waves plays an important role in determining the structure of the tropospheric mean flow and rule out the possibility that the jet shift occurs purely as a response to changes in the planetary- or synoptic-scale wave fields alone.
Monte Carlo simulation for radiographic applications
International Nuclear Information System (INIS)
Tillack, G.R.; Bellon, C.
2003-01-01
Standard radiography simulators are based on the attenuation law complemented by build-up factors (BUF) to describe the interaction of radiation with material. The BUF assumption implies that scattered radiation only reduces the contrast in radiographic images. This simplification holds for a wide range of applications, such as weld inspection, as known from practical experience. But only a detailed description of the different underlying interaction mechanisms can explain effects like mottling that every radiographer has experienced in practice. Monte Carlo models can handle the primary and secondary interaction mechanisms contributing to the image formation process: photon interactions (absorption, incoherent and coherent scattering including electron-binding effects, pair production) and electron interactions (electron tracing including X-ray fluorescence and bremsstrahlung production). This opens up possibilities such as the separation of influencing factors and an understanding of the functioning of the intensifying screens used in film radiography. The paper discusses the opportunities in applying the Monte Carlo method to investigate special features of radiography in terms of selected examples. (orig.) [de]
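The attenuation-law-plus-BUF model that standard simulators rest on fits in a few lines; the attenuation coefficient and build-up value below are illustrative placeholders, not validated data:

```python
import math

def transmitted_intensity(i0, mu_per_cm, thickness_cm, buildup=1.0):
    """Core of an attenuation-law simulator: I = B * I0 * exp(-mu * t).
    The build-up factor B >= 1 lumps all scattered radiation into a
    single multiplier; B = 1 is the narrow-beam (primary-only) case.
    Values used below are illustrative, not a validated BUF table."""
    return buildup * i0 * math.exp(-mu_per_cm * thickness_cm)

# 2 cm of material at a hypothetical 0.5 /cm attenuation coefficient
narrow_beam = transmitted_intensity(1.0, 0.5, 2.0)             # primary only
broad_beam = transmitted_intensity(1.0, 0.5, 2.0, buildup=1.4)  # with scatter
```

The contrast point in the abstract follows directly: the scalar B raises the overall signal but carries no spatial structure, so effects like mottling are invisible to this model and require full Monte Carlo transport.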
Collisional stripping of planetary crusts
Carter, Philip J.; Leinhardt, Zoë M.; Elliott, Tim; Stewart, Sarah T.; Walter, Michael J.
2018-02-01
Geochemical studies of planetary accretion and evolution have invoked various degrees of collisional erosion to explain differences in bulk composition between planets and chondrites. Here we undertake a full, dynamical evaluation of 'crustal stripping' during accretion and its key geochemical consequences. Crusts are expected to contain a significant fraction of planetary budgets of incompatible elements, which include the major heat producing nuclides. We present smoothed particle hydrodynamics simulations of collisions between differentiated rocky planetesimals and planetary embryos. We find that the crust is preferentially lost relative to the mantle during impacts, and we have developed a scaling law based on these simulations that approximates the mass of crust that remains in the largest remnant. Using this scaling law and a recent set of N-body simulations of terrestrial planet formation, we have estimated the maximum effect of crustal stripping on incompatible element abundances during the accretion of planetary embryos. We find that on average approximately one third of the initial crust is stripped from embryos as they accrete, which leads to a reduction of ∼20% in the budgets of the heat producing elements if the stripped crust does not reaccrete. Erosion of crusts can lead to non-chondritic ratios of incompatible elements, but the magnitude of this effect depends sensitively on the details of the crust-forming melting process on the planetesimals. The Lu/Hf system is fractionated for a wide range of crustal formation scenarios. Using eucrites (the products of planetesimal silicate melting, thought to represent the crust of Vesta) as a guide to the Lu/Hf of planetesimal crust partially lost during accretion, we predict the Earth could evolve to a superchondritic 176Hf/177Hf (3-5 parts per ten thousand) at present day. Such values are in keeping with compositional estimates of the bulk Earth. Stripping of planetary crusts during accretion can lead to
Monte Carlo systems used for treatment planning and dose verification
Energy Technology Data Exchange (ETDEWEB)
Brualla, Lorenzo [Universitaetsklinikum Essen, NCTeam, Strahlenklinik, Essen (Germany); Rodriguez, Miguel [Centro Medico Paitilla, Balboa (Panama); Lallena, Antonio M. [Universidad de Granada, Departamento de Fisica Atomica, Molecular y Nuclear, Granada (Spain)
2017-04-15
General-purpose radiation transport Monte Carlo codes have been used for several decades to estimate the absorbed dose distribution in patients undergoing external photon and electron beam radiotherapy. Results obtained with these codes are usually more accurate than those provided by treatment planning systems based on non-stochastic methods. Traditionally, absorbed dose computations based on general-purpose Monte Carlo codes have been used only for research, owing to the difficulties associated with setting up a simulation and the long computation time required. To bring radiation transport Monte Carlo codes into routine clinical practice, researchers and private companies have developed treatment planning and dose verification systems that are partly or fully based on fast Monte Carlo algorithms. This review presents a comprehensive list of the currently existing Monte Carlo systems that can be used to calculate or verify an external photon and electron beam radiotherapy treatment plan. Particular attention is given to those systems that are distributed, either freely or commercially, and that do not require programming tasks from the end user. These systems are compared in terms of features and of the simulation time required to compute a set of benchmark calculations. (orig.)
Metropolis Methods for Quantum Monte Carlo Simulations
Ceperley, D. M.
2003-01-01
Since its first description fifty years ago, the Metropolis Monte Carlo method has been used in a variety of different ways for the simulation of continuum quantum many-body systems. This paper will consider some of the generalizations of the Metropolis algorithm employed in quantum Monte Carlo: variational Monte Carlo, dynamical methods for projector Monte Carlo (i.e., diffusion Monte Carlo with rejection), multilevel sampling in path integral Monte Carlo, the sampling of permutations, ...
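As a concrete illustration of the basic accept/reject step that these quantum variants generalize, here is a minimal random-walk Metropolis sampler; the 1-D Gaussian target, step size, and seed are illustrative choices, not taken from the paper.

```python
import math
import random

def metropolis(log_density, x0, n_steps, step_size=1.0, seed=0):
    """Minimal random-walk Metropolis sampler (illustrative, 1-D)."""
    rng = random.Random(seed)
    x, logp = x0, log_density(x0)
    samples = []
    for _ in range(n_steps):
        proposal = x + rng.gauss(0.0, step_size)      # symmetric proposal
        logp_prop = log_density(proposal)
        # accept with probability min(1, p(proposal) / p(x))
        if math.log(rng.random()) < logp_prop - logp:
            x, logp = proposal, logp_prop
        samples.append(x)
    return samples

# Sample a standard normal; empirical moments approach 0 and 1.
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_steps=20000)
mean = sum(draws) / len(draws)
second_moment = sum(d * d for d in draws) / len(draws)
```

Variational, diffusion, and path-integral Monte Carlo all replace the proposal and acceptance ratio above with problem-specific choices while keeping the same detailed-balance structure.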
Monte Carlo capabilities of the SCALE code system
International Nuclear Information System (INIS)
Rearden, B.T.; Petrie, L.M.; Peplow, D.E.; Bekar, K.B.; Wiarda, D.; Celik, C.; Perfetti, C.M.; Ibrahim, A.M.; Dunn, M.E.; Hart, S.W.D.
2013-01-01
SCALE is a widely used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE for nuclear safety analysis and design. SCALE provides a 'plug-and-play' framework that includes three deterministic and three Monte Carlo radiation transport solvers (KENO, MAVRIC, TSUNAMI) that can be selected based on the desired solution, including hybrid deterministic/Monte Carlo simulations. SCALE includes the latest nuclear data libraries for continuous-energy and multigroup radiation transport as well as activation, depletion, and decay calculations. SCALE's graphical user interfaces assist with accurate system modeling, visualization, and convenient access to desired results. SCALE 6.2, to be released in 2014, will provide several new capabilities and significant improvements in many existing features, especially with expanded continuous-energy Monte Carlo capabilities for criticality safety, shielding, depletion, and sensitivity and uncertainty analysis. An overview of the Monte Carlo capabilities of SCALE is provided here, with emphasis on new features for SCALE 6.2. (authors)
A study on the shielding element using Monte Carlo simulation
Energy Technology Data Exchange (ETDEWEB)
Kim, Ki Jeong [Dept. of Radiology, Konkuk University Medical Center, Seoul (Korea, Republic of); Shim, Jae Goo [Dept. of Radiologic Technology, Daegu Health College, Daegu (Korea, Republic of)
2017-06-15
In this research, we used Monte Carlo simulation to evaluate the shielding ability of individual elements, with the aim of developing a medical radiation shielding sheet that could replace the lead currently in use. Twenty-one elements were selected, mainly metallic elements with large atomic numbers, which are known to offer high shielding performance; weight reduction, processability, and activation were also considered, since composite materials have recently improved shielding performance. As a result of simulating the shielding performance of each element, tungsten and gold showed the highest shielding ratios, at 98.82% and 98.44%, respectively.
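The shielding ratio compared above is one minus the transmitted fraction. A minimal Monte Carlo sketch of narrow-beam transmission through a slab follows; the attenuation coefficient and thickness are hypothetical values for illustration, not data from the study.

```python
import math
import random

def transmission_fraction(mu, thickness, n_photons=100_000, seed=1):
    """Monte Carlo estimate of narrow-beam photon transmission through a slab.

    mu: linear attenuation coefficient (1/cm); thickness in cm. A photon is
    counted as transmitted if its sampled free path exceeds the slab thickness.
    """
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_photons):
        path = -math.log(rng.random()) / mu   # exponential free path
        if path > thickness:
            transmitted += 1
    return transmitted / n_photons

# The analytic narrow-beam result is exp(-mu * t);
# the shielding ratio is 1 - transmission.
estimate = transmission_fraction(mu=3.0, thickness=1.0)
exact = math.exp(-3.0)
```

Real shielding studies also track scattered photons and energy deposition, which this single-interaction sketch deliberately omits.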
Parallelizing Monte Carlo with PMC
International Nuclear Information System (INIS)
Rathkopf, J.A.; Jones, T.R.; Nessett, D.M.; Stanberry, L.C.
1994-11-01
PMC (Parallel Monte Carlo) is a system of generic interface routines that allows easy porting of Monte Carlo packages of large-scale physics simulation codes to Massively Parallel Processor (MPP) computers. By loading various versions of PMC, simulation code developers can configure their codes to run in several modes: serial, Monte Carlo runs on the same processor as the rest of the code; parallel, Monte Carlo runs in parallel across many processors of the MPP with the rest of the code running on other MPP processor(s); distributed, Monte Carlo runs in parallel across many processors of the MPP with the rest of the code running on a different machine. This multi-mode approach allows maintenance of a single simulation code source regardless of the target machine. PMC handles passing of messages between nodes on the MPP, passing of messages between a different machine and the MPP, distributing work between nodes, and providing independent, reproducible sequences of random numbers. Several production codes have been parallelized under the PMC system. Excellent parallel efficiency in both the distributed and parallel modes results if sufficient workload is available per processor. Experiences with a Monte Carlo photonics demonstration code and a Monte Carlo neutronics package are described
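PMC's provision of "independent, reproducible sequences of random numbers" per node can be sketched by deriving each worker's seed from a hash of a base seed and a stream id, so results do not depend on how histories are distributed across processors. `stream_rng` is a hypothetical helper for illustration, not part of PMC.

```python
import hashlib
import random

def stream_rng(base_seed: int, stream_id: int) -> random.Random:
    """Derive an independent, reproducible RNG for one worker or MPP node.

    Hashing (base_seed, stream_id) gives each processor its own stream,
    so reruns reproduce the same histories regardless of node count.
    """
    digest = hashlib.sha256(f"{base_seed}:{stream_id}".encode()).digest()
    return random.Random(int.from_bytes(digest[:8], "big"))

# Each simulated "node" draws its own histories; rerunning reproduces them.
draws = {}
for node in range(3):
    rng = stream_rng(42, node)
    draws[node] = [rng.random() for _ in range(4)]
```

Production codes typically use counter-based or leapfrogged generators for the same purpose; the hashing trick above is just the simplest way to get decorrelated, reproducible streams from a standard library.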
The Planetary Data System: Archiving Planetary Data for the Use of the Planetary Science Community
Morgan, Thomas H.; McLaughlin, Stephanie A.; Grayzeck, Edwin J.; Vilas, Faith; Knopf, William P.; Crichton, Daniel J.
2014-11-01
NASA’s Planetary Data System (PDS) archives, curates, and distributes digital data from NASA’s planetary missions. PDS provides the planetary science community convenient online access to data from NASA’s missions so that they can continue to mine these rich data sets for new discoveries. The PDS is a federated system consisting of nodes for specific discipline areas ranging from planetary geology to space physics. Our federation includes an engineering node that provides systems engineering support to the entire PDS. In order to adequately capture complete mission data sets containing not only raw and reduced instrument data, but also the calibration, documentation, and geometry data required to interpret and use these data sets both singly and together (data from multiple instruments, or from multiple missions), PDS personnel work with NASA missions from the initial AO through the end of mission to define, organize, and document the data. This process includes peer review of data sets by members of the science community to ensure that the data sets are scientifically useful, effectively organized, and well documented. PDS makes its holdings easily searchable so that members of the planetary community can both query the archive to find data relevant to specific scientific investigations and easily retrieve the data for analysis. To ensure long-term preservation of data and to make data sets more easily searchable with the new capabilities in Information Technology now available (and as existing technologies become obsolete), the PDS (together with the COSPAR-sponsored IPDA) developed and deployed a new data archiving system known as PDS4, released in 2013. The LADEE, MAVEN, OSIRIS-REx, InSight, and Mars 2020 missions are using PDS4. ESA has adopted PDS4 for the upcoming BepiColombo mission. The PDS is actively migrating existing data records into PDS4 and developing tools to aid data providers and users. The PDS is also incorporating challenge
Computational radiology and imaging with the MCNP Monte Carlo code
Energy Technology Data Exchange (ETDEWEB)
Estes, G.P.; Taylor, W.M.
1995-05-01
MCNP, a 3D coupled neutron/photon/electron Monte Carlo radiation transport code, is currently used in medical applications such as cancer radiation treatment planning, interpretation of diagnostic radiation images, and treatment beam optimization. This paper will discuss MCNP's current uses and capabilities, as well as envisioned improvements that would further enhance MCNP's role in computational medicine. It will be demonstrated that the methodology exists to simulate medical images (e.g. SPECT). Techniques will be discussed that would enable the construction of 3D computational geometry models of individual patients for use in patient-specific studies that would improve the quality of care for patients.
Do planetary seasons play a role in attaining stable climates?
Olsen, Kasper Wibeck; Bohr, Jakob
2018-02-01
A simple phenomenological account of planetary climate instabilities is presented. The description is based on the standard model in which the balance of incoming stellar radiation and outgoing thermal radiation is described by the effective planet temperature. Often, three different points, or temperatures, are found at which the influx of radiation is balanced by the out-flux, even with conserved boundary conditions. Two of these points are relatively long-term stable, namely the point corresponding to a cold climate and the point corresponding to a hot climate. In a classical sense these points are equilibrium balance points. The hypothesis promoted in this paper is the possibility that the intermediate third point can become long-term stable by being driven dynamically. The initially unstable point is made relatively stable over a long period by the presence of seasonal climate variations.
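The three balance points can be reproduced with a toy ice-albedo feedback model: absorbed flux S(1 − α(T))/4 rises as ice melts, while outgoing flux εσT⁴ rises monotonically, so the two curves can cross three times. All constants below are illustrative Earth-like values, not parameters from the paper.

```python
import math

SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0               # stellar constant, W m^-2 (Earth-like, assumed)

def albedo(temp):
    """Toy temperature-dependent albedo: high when ice-covered (cold),
    low when ice-free (hot), with a smooth ramp in between."""
    return 0.7 - 0.4 / (1.0 + math.exp(-(temp - 265.0) / 10.0))

def flux_imbalance(temp, emissivity=0.61):
    """Absorbed stellar flux minus outgoing thermal flux; zero at balance."""
    return S0 / 4.0 * (1.0 - albedo(temp)) - emissivity * SIGMA * temp ** 4

def balance_points(t_lo=150.0, t_hi=350.0, step=0.5):
    """Scan for sign changes of the imbalance and refine each by bisection."""
    roots = []
    t = t_lo
    while t < t_hi:
        if flux_imbalance(t) * flux_imbalance(t + step) < 0:
            a, b = t, t + step
            for _ in range(60):
                m = 0.5 * (a + b)
                if flux_imbalance(a) * flux_imbalance(m) <= 0:
                    b = m
                else:
                    a = m
            roots.append(0.5 * (a + b))
        t += step
    return roots
```

With these numbers the scan finds a cold, an intermediate, and a hot balance temperature; the outer two are the classically stable points, while the middle one is the unstable point the paper proposes to stabilize dynamically.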
Lectures on Monte Carlo methods
Madras, Neal
2001-01-01
Monte Carlo methods form an experimental branch of mathematics that employs simulations driven by random number generators. These methods are often used when others fail, since they are much less sensitive to the "curse of dimensionality", which plagues deterministic methods in problems with a large number of variables. Monte Carlo methods are used in many fields: mathematics, statistics, physics, chemistry, finance, computer science, and biology, for instance. This book is an introduction to Monte Carlo methods for anyone who would like to use these methods to study various kinds of mathemati
Wormhole Hamiltonian Monte Carlo
Lan, Shiwei; Streets, Jeffrey; Shahbaba, Babak
2015-01-01
In machine learning and statistics, probabilistic inference involving multimodal distributions is quite difficult. This is especially true in high dimensional problems, where most existing algorithms cannot easily move from one mode to another. To address this issue, we propose a novel Bayesian inference approach based on Markov Chain Monte Carlo. Our method can effectively sample from multimodal distributions, especially when the dimension is high and the modes are isolated. To this end, it exploits and modifies the Riemannian geometric properties of the target distribution to create wormholes connecting modes in order to facilitate moving between them. Further, our proposed method uses the regeneration technique in order to adapt the algorithm by identifying new modes and updating the network of wormholes without affecting the stationary distribution. To find new modes, as opposed to rediscovering those previously identified, we employ a novel mode searching algorithm that explores a residual energy function obtained by subtracting an approximate Gaussian mixture density (based on previously discovered modes) from the target density function. PMID:25861551
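The mode-searching step described above, subtracting an approximate Gaussian mixture built on known modes from the target density, can be sketched in one dimension; equal mixture weights and unit variances are simplifying assumptions, not the paper's full construction.

```python
import math

def gaussian_mixture_density(x, modes, sigma=1.0):
    """Equal-weight Gaussian mixture built from previously found modes (1-D)."""
    k = len(modes)
    return sum(math.exp(-0.5 * ((x - m) / sigma) ** 2) /
               (sigma * math.sqrt(2.0 * math.pi)) for m in modes) / k

def residual_energy(x, target_density, modes, sigma=1.0):
    """Energy of the target with the known modes subtracted out.

    Low values of this function point toward modes not yet discovered;
    the residual is clamped to stay positive before taking the log."""
    resid = max(target_density(x) - gaussian_mixture_density(x, modes, sigma),
                1e-300)
    return -math.log(resid)

# Bimodal target with modes at -3 and +3; pretend only -3 is known, so the
# residual energy is far lower near the undiscovered mode at +3.
target = lambda x: 0.5 * (math.exp(-0.5 * (x + 3.0) ** 2) +
                          math.exp(-0.5 * (x - 3.0) ** 2)) / math.sqrt(2.0 * math.pi)
```

In the actual algorithm this residual energy drives a search whose minima seed new wormholes; the sketch only shows why subtracting the mixture hides the already-known modes.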
Castin, N.; Malerba, L.; Bonny, G.; Pascuet, M. I.; Hou, M.
2009-09-01
We apply a novel atomistic kinetic Monte Carlo model, which includes local chemistry and relaxation effects when assessing the migration energy barriers of point defects, to the study of the microchemical evolution driven by vacancy diffusion in FeCu and FeCuNi alloys. These alloys are of importance for nuclear applications because Cu precipitation, enhanced by the presence of Ni, is one of the main causes of hardening and embrittlement in reactor pressure vessel steels used in existing nuclear power plants. Local chemistry and relaxation effects are introduced using artificial intelligence techniques, namely a conveniently trained artificial neural network, to calculate the migration energy barriers of vacancies as functions of the local atomic configuration. We prove, through a number of results, that the use of the neural network is fully equivalent to calculating the migration energy barriers on-the-fly, using computationally expensive methods such as nudged elastic bands with an interatomic potential. The use of the neural network makes the computational cost affordable, so that simulations of the same type as those hitherto carried out using heuristic formulas for the assessment of the energy barriers can now be performed, at the same computational cost, using more rigorously calculated barriers. This method opens the way to properly treating more complex problems, such as the case of self-interstitial cluster formation, in an atomistic kinetic Monte Carlo framework.
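The kinetic Monte Carlo machinery that consumes such migration barriers can be sketched with the standard residence-time (BKL) algorithm: barriers become Arrhenius rates, one event is chosen with probability proportional to its rate, and time advances stochastically. The prefactor, temperature, and barrier values below are illustrative; in the paper the barriers would come from the trained neural network rather than a fixed list.

```python
import math
import random

def kmc_step(barriers, rng, temperature=600.0, prefactor=6e12):
    """One residence-time (BKL) kinetic Monte Carlo step.

    barriers: migration energy barriers (eV) of the candidate vacancy jumps
    for the current local configuration. Returns (chosen_event, dt)."""
    kB = 8.617e-5  # Boltzmann constant, eV/K
    rates = [prefactor * math.exp(-Em / (kB * temperature)) for Em in barriers]
    total = sum(rates)
    # pick an event with probability proportional to its rate
    r, acc = rng.random() * total, 0.0
    chosen = len(rates) - 1
    for i, rate in enumerate(rates):
        acc += rate
        if r < acc:
            chosen = i
            break
    dt = -math.log(rng.random()) / total   # stochastic residence time
    return chosen, dt

rng = random.Random(0)
event, dt = kmc_step([0.65, 0.70, 0.62, 0.80], rng)
```

Because the rates are exponential in the barrier, even a 0.3 eV difference makes the low-barrier jump overwhelmingly more likely, which is why accurate on-the-fly barriers matter so much for the predicted microchemical evolution.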
Teaching, learning, and planetary exploration
Brown, Robert A.
1992-01-01
The progress accomplished in the first five months of the three-year grant period of Teaching, Learning, and Planetary Exploration is presented. The objectives of this project are to discover new education products and services based on space science, particularly planetary exploration. An Exploration in Education is the umbrella name for the education projects as they are seen by teachers and the interested public. As described in the proposal, our approach consists of: (1) increasing practical understanding of the potential role and capabilities of the research community to contribute to basic education using new discoveries; (2) developing an intellectual framework for these contributions by supplying criteria and templates for the teacher's stories; (3) attracting astronomers, engineers, and technical staff to the project and helping them form productive education partnerships for the future; (4) exploring relevant technologies and networks for authoring and communicating the teacher's stories; (5) enlisting the participation of potential users of the teacher's stories in defining the products; (6) actually producing and delivering many educationally useful teacher's stories; and (7) reporting the pilot study results with critical evaluation. Technical progress was made by assembling our electronic publishing stations, designing electronic publications based on space science, and developing distribution approaches for electronic products. Progress was made addressing critical issues by developing policies and procedures for securing intellectual property rights and assembling a focus group of teachers to test our ideas and assure the quality of our products. The following useful materials are being produced: the TOPS report; three electronic 'PictureBooks'; one 'ElectronicArticle'; three 'ElectronicReports'; ten 'PrinterPosters'; and the 'FaxForum' with an initial complement of printed materials. We have coordinated with planetary scientists and astronomers
Institute of Geophysics, Planetary Physics, and Signatures
Federal Laboratory Consortium — The Institute of Geophysics, Planetary Physics, and Signatures at Los Alamos National Laboratory is committed to promoting and supporting high quality, cutting-edge...
Sealed Planetary Return Canister (SPRC), Phase I
National Aeronautics and Space Administration — Sample return missions have primary importance in future planetary missions. A basic requirement is that samples be returned in pristine, uncontaminated condition,...
Polarimetry of stars and planetary systems
National Research Council Canada - National Science Library
Kolokolova, Ludmilla; Hough, James; Levasseur-Regourd, Anny-Chantal
2015-01-01
... fields of polarimetric exploration, including proto-planetary and debris discs, icy satellites, transneptunian objects, exoplanets and the search for extraterrestrial life -- unique results produced...
Sealed Planetary Return Canister (SPRC), Phase II
National Aeronautics and Space Administration — Sample return missions have primary importance in future planetary missions. A basic requirement is that samples be returned in pristine, uncontaminated condition,...
PSUP: A Planetary SUrface Portal
Poulet, F.; Quantin-Nataf, C.; Ballans, H.; Dassas, K.; Audouard, J.; Carter, J.; Gondet, B.; Lozac'h, L.; Malapert, J.-C.; Marmo, C.; Riu, L.; Séjourné, A.
2018-01-01
The large size and complexity of planetary data acquired by spacecraft during the last two decades create a demand within the planetary community for access to the archives of raw and high level data and for the tools necessary to analyze these data. Among the different targets of the Solar System, Mars is unique as the combined datasets from the Viking, Mars Global Surveyor, Mars Odyssey, Mars Express and Mars Reconnaissance Orbiter missions provide a tremendous wealth of information that can be used to study the surface of Mars. The number and the size of the datasets require an information system to process, manage and distribute data. The Observatories of Paris Sud (OSUPS) and Lyon (OSUL) have developed a portal, called PSUP (Planetary SUrface Portal), for providing users with efficient and easy access to data products dedicated to the Martian surface. The objectives of the portal are: 1) to allow processing and downloading of data via a specific application called MarsSI (Martian surface data processing Information System); 2) to provide the visualization and merging of high level (image, spectral, and topographic) products and catalogs via a web-based user interface (MarsVisu), and 3) to distribute some of these specific high level data with an emphasis on products issued by the science teams of OSUPS and OSUL. As the MarsSI service is extensively described in a companion paper (Quantin-Nataf et al., companion paper, submitted to this special issue), the present paper focuses on the general architecture and the functionalities of the web-based user interface MarsVisu. This service provides access to many data products for Mars: albedo, mineral and thermal inertia global maps from spectrometers; mosaics from imagers; image footprints and rasters from the MarsSI tool; high level specific products (defined as catalogs or vectors). MarsVisu can be used to quickly assess the visualized processed data and maps as well as identify areas that have not been mapped yet
Planetary Radars Operating Centre PROC
Catallo, C.; Flamini, E.; Seu, R.; Alberti, G.
2007-12-01
Planetary exploration by means of radar systems, mainly using Ground Penetrating Radars (GPR), plays an important role in Italy. Numerous international scientific space programs are currently carried out by the Italian Space Agency, the scientific community, and industry, jointly with ESA and NASA. Three important experiments under Italian leadership (designed and manufactured by Italian industry), provided by ASI either as contributions to ESA programs or within a NASA/ASI joint-venture framework, are now operating: MARSIS on-board Mars Express, SHARAD on-board Mars Reconnaissance Orbiter, and the CASSINI Radar on-board the Cassini spacecraft. In order to support the scientific communities, institutional customers, and experiment teams' operations, three dedicated Italian operational centers have been realized, namely SHOC (Sharad Operating Centre), MOC (Marsis Operating Center), and CASSINI PAD (Processing Altimetry Data). Each center is dedicated to the management and control of a single instrument, and to its data processing and distribution. Although they had been conceived to operate autonomously and independently of one another, synergies and overlaps have been envisaged, leading to the suggestion of a unified center, the Planetary Radar Processing Center (PROC). PROC is conceived to include the three operational centers, namely SHOC, MOC, and CASSINI PAD, both from the logistics point of view and from the HW/SW capabilities point of view. The Planetary Radar Processing Center shall be conceived as the Italian support facility to the scientific community for ongoing and future Italian planetary exploration programs. Therefore, scalability and ease of use and management shall be the design drivers. The paper describes how PROC is designed and developed to allow SHOC, MOC, and CASSINI PAD to operate as before, and to offer improved functionalities to increase capabilities, mainly in terms of data exchange, comparison, interpretation, and exploitation. Furthermore, in the frame of
PLANETARY EMBRYO BOW SHOCKS AS A MECHANISM FOR CHONDRULE FORMATION
Energy Technology Data Exchange (ETDEWEB)
Mann, Christopher R.; Boley, Aaron C. [Department of Physics and Astronomy University of British Columbia Vancouver, BC V6T 1Z1 (Canada); Morris, Melissa A. [Physics Department State University of New York at Cortland Cortland, NY 13045 (United States)
2016-02-20
We use radiation hydrodynamics with direct particle integration to explore the feasibility of chondrule formation in planetary embryo bow shocks. The calculations presented here are used to explore the consequences of a Mars-size planetary embryo traveling on a moderately excited orbit through the dusty, early environment of the solar system. The embryo’s eccentric orbit produces a range of supersonic relative velocities between the embryo and the circularly orbiting gas and dust, prompting the formation of bow shocks. Temporary atmospheres around these embryos, which can be created via volatile outgassing and gas capture from the surrounding nebula, can non-trivially affect thermal profiles of solids entering the shock. We explore the thermal environment of solids that traverse the bow shock at different impact radii, the effects that planetoid atmospheres have on shock morphologies, and the stripping efficiency of planetoidal atmospheres in the presence of high relative winds. Simulations are run using adiabatic and radiative conditions, with multiple treatments for the local opacities. Shock speeds of 5, 6, and 7 km s⁻¹ are explored. We find that a high-mass atmosphere and inefficient radiative conditions can produce peak temperatures and cooling rates that are consistent with the constraints set by chondrule furnace studies. For most conditions, the derived cooling rates are potentially too high to be consistent with chondrule formation.
Advanced Multilevel Monte Carlo Methods
Jasra, Ajay
2017-04-24
This article reviews the application of advanced Monte Carlo techniques in the context of Multilevel Monte Carlo (MLMC). MLMC is a strategy employed to compute expectations which can be biased in some sense, for instance, by using the discretization of an associated probability law. The MLMC approach works with a hierarchy of biased approximations which become progressively more accurate and more expensive. Using a telescoping representation of the most accurate approximation, the method is able to reduce the computational cost for a given level of error versus i.i.d. sampling from this latter approximation. All of these ideas originated for cases where exact sampling from couples in the hierarchy is possible. This article considers the case where such exact sampling is not currently possible. We consider Markov chain Monte Carlo and sequential Monte Carlo methods which have been introduced in the literature and we describe different strategies which facilitate the application of MLMC within these methods.
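A minimal sketch of the telescoping identity E[P_L] = Σ_l E[P_l − P_{l−1}] follows, with the levels coupled through shared Brownian increments. The SDE, parameters, and fixed sample counts are illustrative assumptions; real MLMC also allocates samples per level adaptively, which is omitted here.

```python
import math
import random

def mlmc_estimate(levels, samples_per_level, seed=0):
    """Multilevel Monte Carlo sketch for E[X_T] of the toy SDE
    dX = mu*X dt + sig*X dW, using Euler steps of size T / 2**level.
    The level-l correction P_l - P_{l-1} is computed from the SAME
    Brownian increments for both paths (the coupling)."""
    mu, sig, T, X0 = 0.05, 0.2, 1.0, 1.0
    rng = random.Random(seed)

    def euler_pair(level):
        """Return (P_fine, P_coarse) driven by one Brownian path;
        at level 0 there is no coarse path and 0.0 is returned."""
        n_fine = 2 ** level
        dt_f = T / n_fine
        xf, xc, dw_c = X0, X0, 0.0
        for step in range(n_fine):
            dw = rng.gauss(0.0, math.sqrt(dt_f))
            xf += mu * xf * dt_f + sig * xf * dw
            dw_c += dw
            if level > 0 and step % 2 == 1:
                # coarse path takes one step per two fine steps,
                # driven by the summed fine increments
                xc += mu * xc * (2.0 * dt_f) + sig * xc * dw_c
                dw_c = 0.0
        return xf, (xc if level > 0 else 0.0)

    total = 0.0
    for level, n in zip(levels, samples_per_level):
        corr = 0.0
        for _ in range(n):
            pf, pc = euler_pair(level)
            corr += pf - pc
        total += corr / n          # telescoping sum of E[P_l - P_{l-1}]
    return total

est = mlmc_estimate(levels=[0, 1, 2, 3],
                    samples_per_level=[20000, 5000, 2000, 1000])
# exact E[X_T] = X0 * exp(mu * T)
```

Because the coupled corrections have small variance, most samples can be spent on the cheap coarse level; the article's subject is precisely what to do when such exact coupled sampling is unavailable and MCMC or SMC must supply the pairs instead.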
Handbook of Monte Carlo methods
National Research Council Canada - National Science Library
Kroese, Dirk P; Taimre, Thomas; Botev, Zdravko I
2011-01-01
... in rapid succession, the staggering number of related techniques, ideas, concepts and algorithms makes it difficult to maintain an overall picture of the Monte Carlo approach. This book attempts to encapsulate the emerging dynamics of this field of study"--
TARC: Carlo Rubbia's Energy Amplifier
Laurent Guiraud
1997-01-01
Transmutation by Adiabatic Resonance Crossing (TARC) is an experiment related to Carlo Rubbia's Energy Amplifier. This CERN experiment demonstrated that long-lived fission fragments, such as 99-Tc, can be efficiently destroyed.
Palmer, Grant; Prabhu, Dinesh; Cruden, Brett A.
2013-01-01
The 2013-2022 Decadal Survey for planetary exploration has identified probe missions to Uranus and Saturn as high priorities. This work endeavors to examine the uncertainty in determining aeroheating in such entry environments. Representative entry trajectories are constructed using the TRAJ software. Flowfields at selected points on the trajectories are then computed using the Data Parallel Line Relaxation (DPLR) Computational Fluid Dynamics code. A Monte Carlo study is performed on the DPLR input parameters to determine the uncertainty in the predicted aeroheating, and correlation coefficients are examined to identify which input parameters most influence the uncertainty. A review of the present best practices for input parameters (e.g. transport coefficients and vibrational relaxation times) is also conducted. It is found that the 2σ uncertainty for heating on Uranus entry is no more than 2.1%, assuming an equilibrium catalytic wall, with the uncertainty determined primarily by diffusion and the H₂ recombination rate within the boundary layer. However, if the wall is assumed to be partially or non-catalytic, this uncertainty may increase to as much as 18%. The catalytic wall model can contribute over a 3× change in heat flux and a 20% variation in film coefficient. Therefore, coupled material response/fluid dynamics models are recommended for this problem. It was also found that much of this variability is artificially suppressed when a constant Schmidt number approach is implemented. Because the boundary layer is reacting, it is necessary to employ self-consistent effective binary diffusion to obtain a correct thermal transport solution. For Saturn entries, the 2σ uncertainty for convective heating was less than 3.7%. The major uncertainty driver depended on shock temperature/velocity, changing from boundary layer thermal conductivity to diffusivity and then to shock layer ionization rate as velocity increases. While
Carlos Chagas: biographical sketch.
Moncayo, Alvaro
2010-01-01
Carlos Chagas was born on 9 July 1878 on the farm "Bon Retiro", located close to the city of Oliveira in the interior of the State of Minas Gerais, Brazil. He started his medical studies in 1897 at the School of Medicine of Rio de Janeiro. In the late 19th century, the works of Louis Pasteur and Robert Koch induced a change in the medical paradigm, with emphasis on experimental demonstration of the causal link between microbes and disease. During the same years, the pathological concept of disease, linking organic lesions with symptoms, appeared in Germany. All these innovations were adopted in the reforms of the medical schools in Brazil and influenced the scientific formation of Chagas. Chagas completed his medical studies between 1897 and 1903, earning consistently high grades in his examinations. Oswaldo Cruz accepted Chagas as a doctoral candidate and directed his thesis on "Hematological studies of Malaria", which was received with honors by the examiners. In 1903 the director appointed Chagas as research assistant at the Institute. In those years, the Institute of Manguinhos, under the direction of Oswaldo Cruz, initiated a process of institutional growth and gathered a distinguished group of Brazilian and foreign scientists. In 1907, he was requested to investigate and control a malaria outbreak in Lassance, Minas Gerais. At that moment Chagas could not have imagined that this field research was the beginning of one of the most notable medical discoveries. Chagas was, at the age of 28, a Research Assistant at the Institute of Manguinhos and was studying a new flagellate parasite isolated from triatomine insects captured in the State of Minas Gerais. Chagas made his discoveries in this order: first the causal agent, then the vector and finally the human cases. These notable discoveries were carried out by Chagas in twenty months. At the age of 33 Chagas had completed his discoveries and published the scientific articles that gave him world
Photoneutron spectrum measured with Bonner Spheres in Planetary method mode
Energy Technology Data Exchange (ETDEWEB)
Benites R, J. [Centro Estatal de Cancerologia de Nayarit, Servicio de Seguridad Radiologica, Calz. de la Cruz 118 Sur, 63000 Tepic, Nayarit (Mexico); Vega C, H. R. [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Apdo. Postal 336, 98000 Zacatecas (Mexico); Velazquez F, J., E-mail: jlbenitesr@prodigy.net.mx [Universidad Autonoma de Nayarit, Posgrado en Ciencias Biologico Agropecuarias, Carretera Tepic-Compostela Km 9, 63780 Jalisco-Nayarit (Mexico)
2012-10-15
We measured the photoneutron spectrum at 100 cm from the isocenter of a Varian iX linear accelerator (linac) operating in 15 MV bremsstrahlung mode. A radiation field of 20 x 20 cm{sup 2} was used at a depth of 5 cm in a solid-water phantom with dimensions of 30 x 30 x 15 cm{sup 3}. The measurement was performed with a Bonner Sphere spectrometric system operated in Planetary mode, using pairs of type 600 and 700 thermoluminescent dosimeters as the neutron detector of the spectrometer. (Author)
Energetic Techniques For Planetary Defense
Barbee, B.; Bambacus, M.; Bruck Syal, M.; Greenaugh, K. C.; Leung, R. Y.; Plesko, C. S.
2017-12-01
Near-Earth Objects (NEOs) are asteroids and comets whose heliocentric orbits tend to approach or cross Earth's heliocentric orbit. NEOs of various sizes periodically collide with Earth, and efforts are currently underway to discover, track, and characterize NEOs so that those on Earth-impacting trajectories are discovered far enough in advance that we would have opportunities to deflect or destroy them prior to Earth impact, if warranted. We will describe current efforts by the National Aeronautics and Space Administration (NASA) and the National Nuclear Security Administration (NNSA) to assess options for energetic methods of deflecting or destroying hazardous NEOs. These methods include kinetic impactors, which are spacecraft designed to collide with an NEO and thereby alter the NEO's trajectory, and nuclear engineering devices, which are used to rapidly vaporize a layer of NEO surface material. Depending on the amount of energy imparted, this can result in either deflection of the NEO via alteration of its trajectory, or robust disruption of the NEO and dispersal of the remaining fragments. We have studied the efficacies and limitations of these techniques in simulations, and have combined the techniques with corresponding spacecraft designs and mission designs. From those results we have generalized planetary defense mission design strategies and drawn conclusions that are applicable to a range of plausible scenarios. We will present and summarize our research efforts to date, and describe approaches to carrying out planetary defense missions with energetic NEO deflection or disruption techniques.
Interactive investigations into planetary interiors
Rose, I.
2015-12-01
Many processes in Earth science are difficult to observe or visualize due to the large timescales and lengthscales over which they operate. The dynamics of planetary mantles are particularly challenging as we cannot even look at the rocks involved. As a result, much teaching material on mantle dynamics relies on static images and cartoons, many of which are decades old. Recent improvements in computing power and technology (largely driven by game and web development) have allowed for advances in real-time physics simulations and visualizations, but these have been slow to affect Earth science education. Here I demonstrate a teaching tool for mantle convection and seismology which solves the equations for conservation of mass, momentum, and energy in real time, allowing users to make changes to the simulation and immediately see the effects. The user can ask and answer questions about what happens when they add heat in one place, or take it away from another place, or increase the temperature at the base of the mantle. They can also pause the simulation, and while it is paused, create and visualize seismic waves traveling through the mantle. These allow for investigations into and discussions about plate tectonics, earthquakes, hot spot volcanism, and planetary cooling. The simulation is rendered to the screen using OpenGL, and is cross-platform. It can be run as a native application for maximum performance, but it can also be embedded in a web browser for easy deployment and portability.
Visual lunar and planetary astronomy
Abel, Paul G
2013-01-01
With the advent of CCDs and webcams, the focus of amateur astronomy has to some extent shifted from science to art. The object of many amateur astronomers is now to produce "stunning images" that, although beautiful, are not intended to have scientific merit. Paul Abel has been addressing this issue by promoting visual astronomy wherever possible – at talks to astronomical societies, in articles for popular science magazines, and on BBC TV's The Sky at Night. Visual Lunar and Planetary Astronomy is a comprehensive modern treatment of visual lunar and planetary astronomy, showing that even in the age of space telescopes and interplanetary probes it is still possible to contribute scientifically with no more than a moderately priced commercially made astronomical telescope. It is believed that imaging and photography are somehow more objective and more accurate than the eye, and this has led to a peculiar "crisis of faith" in the human visual system and its amazing processing power. But by anal...
Adjoint electron Monte Carlo calculations
International Nuclear Information System (INIS)
Jordan, T.M.
1986-01-01
Adjoint Monte Carlo is the most efficient method for accurate analysis of space systems exposed to natural and artificially enhanced electron environments. Recent adjoint calculations for isotropic electron environments include: comparative data for experimental measurements on electronics boxes; benchmark problem solutions for comparing total dose prediction methodologies; preliminary assessment of sectoring methods used during space system design; and total dose predictions on an electronics package. Adjoint Monte Carlo, forward Monte Carlo, and experiment are in excellent agreement for electron sources that simulate space environments. For electron space environments, adjoint Monte Carlo is clearly superior to forward Monte Carlo, requiring one to two orders of magnitude less computer time for relatively simple geometries. The solid-angle sectoring approximations used for routine design calculations can err by more than a factor of 2 on dose in simple shield geometries. For critical space systems exposed to severe electron environments, these potential sectoring errors demand the establishment of large design margins and/or verification of shield design by adjoint Monte Carlo/experiment
Parallel MCNP Monte Carlo transport calculations with MPI
International Nuclear Information System (INIS)
Wagner, J.C.; Haghighat, A.
1996-01-01
The steady increase in computational performance has made Monte Carlo calculations for large/complex systems possible. However, in order to make these calculations practical, order-of-magnitude increases in performance are necessary. The Monte Carlo method is inherently parallel (particles are simulated independently) and thus has the potential for near-linear speedup with respect to the number of processors. Further, the ever-increasing accessibility of parallel computers, such as workstation clusters, facilitates the practical use of parallel Monte Carlo. Recognizing the nature of the Monte Carlo method and the trends in available computing, the code developers at Los Alamos National Laboratory implemented a message-passing version of the general-purpose Monte Carlo radiation transport code MCNP (version 4A). The PVM package was chosen by the MCNP code developers because it supports a variety of communication networks, several UNIX platforms, and heterogeneous computer systems. This PVM version of MCNP has been shown to produce speedups that approach the number of processors and thus is a very useful tool for transport analysis. Due to software incompatibilities on the local IBM SP2, PVM has not been available, and thus it is not possible to take advantage of this useful tool. Hence, it became necessary to implement an alternative message-passing library package into MCNP. Because the message-passing interface (MPI) is supported on the local system, takes advantage of the high-speed communication switches in the SP2, and is considered to be the emerging standard, it was selected
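The "inherently parallel" structure this abstract describes — independent particle histories mapped across processors, tallies reduced at the end — can be illustrated with a toy Monte Carlo estimate. The sketch below uses a hypothetical batch decomposition in plain Python rather than MCNP's PVM/MPI message passing; each batch gets its own random-number stream, just as each parallel rank would.

```python
import random

def run_batch(seed, n):
    # One batch of independent "histories" (here: dart throws estimating pi).
    # In message-passing MCNP, each rank runs such a batch with its own stream.
    rng = random.Random(seed)
    return sum(1 for _ in range(n) if rng.random() ** 2 + rng.random() ** 2 < 1.0)

n_per_batch, n_batches = 50_000, 8
# Map step: batches share no state, so this loop could be distributed
# across processors with no communication until the end.
hits = [run_batch(seed, n_per_batch) for seed in range(n_batches)]
# Reduce step: a single sum combines all batch tallies.
pi_est = 4.0 * sum(hits) / (n_per_batch * n_batches)
```

Because the only communication is the final reduction, wall-clock time scales almost inversely with the number of workers — the near-linear speedup the abstract refers to.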
Considerations in the Design of Future Planetary Laser Altimeters
Smith, D. E.; Neumann, G. A.; Mazarico, E.; Zuber, M. T.; Sun, X.
2017-12-01
Planetary laser altimeters have generally been designed to provide high-accuracy measurements of the nadir range to an uncooperative surface for deriving the shape of the target body, and sometimes specifically for identifying and characterizing potential landing sites. However, experience has shown that in addition to the range measurement, other valuable observations can be acquired, including surface reflectance and surface roughness, despite not being given high priority in the original altimeter design or even anticipated. After nearly two decades of planetary laser altimeter design, the requirements are evolving and additional capabilities are becoming equally important. The target bodies, once the terrestrial planets, are now equally asteroids and moons that in many cases do not permit simple orbital operations due to their small mass, radiation issues, or spacecraft fuel limitations. In addition, for a number of reasons, it has become necessary to perform shape determination from a much greater range, even thousands of kilometers, and thus ranging is becoming as important as nadir altimetry. Reflectance measurements have also proved important for assessing the presence of ice, water or CO2, and laser pulse spreading has informed knowledge of surface roughness, all indicating a need for improved instrument capability. Recently, the need to obtain accurate range measurements to laser reflectors on landers or on a planetary surface is presenting new science opportunities for which current designs are far from optimal. These changes to classic laser altimetry have consequences for many instrument functions and capabilities, including beam divergence, laser power, number of beams and detectors, pixelation, energy measurements, pointing stability, polarization, laser wavelengths, and pulse-rate-dependent range. We will discuss how a new consideration of these trades will help make lidars key instruments to execute innovative science in future planetary
Visualization of Kepler's Laws of Planetary Motion
Lu, Meishu; Su, Jun; Wang, Weiguo; Lu, Jianlong
2017-01-01
For this article, we use a 3D printer to print a surface similar to universal gravitation for demonstrating and investigating Kepler's laws of planetary motion describing the motion of a small ball on the surface. This novel experimental method allows Kepler's laws of planetary motion to be visualized and will contribute to improving the…
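The law that the 3D-printed surface demonstrates can also be checked numerically: integrating orbits of different semi-major axes around an inverse-square attractor and comparing periods recovers Kepler's third law, T² ∝ a³. This sketch is not from the article; it uses a simple symplectic integrator in normalized units where GM = 1.

```python
import math

GM = 1.0  # normalized gravitational parameter

def orbital_period(a, e, dt=1e-4):
    """Integrate one orbit starting at perihelion; return the period."""
    r0 = a * (1.0 - e)
    x, y = r0, 0.0
    vx, vy = 0.0, math.sqrt(GM * (2.0 / r0 - 1.0 / a))  # vis-viva speed
    t = 0.0
    while True:
        # Symplectic (kick-drift) Euler step for an inverse-square force.
        r3 = (x * x + y * y) ** 1.5
        vx -= GM * x / r3 * dt
        vy -= GM * y / r3 * dt
        x += vx * dt
        y += vy * dt
        t += dt
        # Full revolution: y crosses zero from below while moving upward.
        if y >= 0.0 and y - vy * dt < 0.0:
            return t

T1 = orbital_period(1.0, 0.3)
T2 = orbital_period(2.0, 0.3)
kepler_ratio = (T2 / T1) ** 2 / 2.0 ** 3  # T^2 / a^3 should be constant
```

Doubling the semi-major axis lengthens the period by a factor of 2^(3/2) ≈ 2.83, independent of eccentricity, so `kepler_ratio` comes out close to 1.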
Preparing Planetary Scientists to Engage Audiences
Shupla, C. B.; Shaner, A. J.; Hackler, A. S.
2017-12-01
While some planetary scientists have extensive experience sharing their science with audiences, many can benefit from guidance on giving presentations or conducting activities for students. The Lunar and Planetary Institute (LPI) provides resources and trainings to support planetary scientists in their communication efforts. Trainings have included sessions for students and early career scientists at conferences (providing opportunities for them to practice their delivery and receive feedback for their poster and oral presentations), as well as separate communication workshops on how to engage various audiences. LPI has similarly begun coaching planetary scientists to help them prepare their public presentations. LPI is also helping to connect different audiences and their requests for speakers to planetary scientists. Scientists have been key contributors in developing and conducting activities in LPI education and public events. LPI is currently working with scientists to identify and redesign short planetary science activities for scientists to use with different audiences. The activities will be tied to fundamental planetary science concepts, with basic materials and simple modifications to engage different ages and audience size and background. Input from the planetary science community on these efforts is welcome. Current results and resources, as well as future opportunities will be shared.
Introduction to the special issue: Planetary geomorphology
Burr, Devon M.; Howard, Alan D.
2015-07-01
Planetary geomorphology is the study of extraterrestrial landscapes. In recognition of the promise for productive interaction between terrestrial and planetary geomorphologists, the 45th annual Binghamton Geomorphology Symposium (BGS) focused on Planetary Geomorphology. The aim of the symposium was to bring planetary and terrestrial geomorphologists together for symbiotic and synthetic interactions that would enrich both subdisciplines. In acknowledgment of the crucial role of terrestrial field work in planetary geomorphology and of the BGS tradition, the symposium began with a field trip to the Appalachian Mountains, followed by a dinner talk of recent results from the Mars Science Laboratory. On Saturday and Sunday, the symposium was organized around major themes in planetary geomorphology, starting with the geomorphic processes that are most common in our Solar System-impact cratering, tectonism, volcanism-to set the stage for other geomorphic processes, including aeolian, fluvial, lacustrine, and glacial/polar. On Saturday evening, the banquet talk provided an historical overview of planetary geomorphology, including its roots in the terrestrial geosciences. The symposium concluded with a full-afternoon tutorial on planetary geomorphologic datasets. This special issue of Geomorphology consists of papers by invited authors from the 2014 BGS, and this introduction provides some context for these papers.
Interoperability in the Planetary Science Archive (PSA)
Rios Diaz, C.
2017-09-01
The protocols and standards currently supported by the recently released new version of the Planetary Science Archive are the Planetary Data Access Protocol (PDAP), the EuroPlanet Table Access Protocol (EPN-TAP) and Open Geospatial Consortium (OGC) standards. We explore these protocols in more detail, providing scientifically useful examples of their usage within the PSA.
Optical observations of southern planetary nebula candidates
VandeSteene, GC; Sahu, KC; Pottasch
1996-01-01
We present H alpha+[NII] images and low resolution spectra of 16 IRAS-selected, southern planetary nebula candidates previously detected in the radio continuum. The H alpha+[NII] images are presented as finding charts. Contour plots are shown for the resolved planetary nebulae. From these images
Multiple Scattering in Planetary Regoliths Using Incoherent Interactions
Muinonen, K.; Markkanen, J.; Vaisanen, T.; Penttilä, A.
2017-12-01
We consider scattering of light by a planetary regolith using novel numerical methods for discrete random media of particles. Understanding the scattering process is of key importance for spectroscopic, photometric, and polarimetric modeling of airless planetary objects, including radar studies. In our modeling, the size of the spherical random medium can range from microscopic to macroscopic sizes, whereas the particles are assumed to be of the order of the wavelength in size. We extend the radiative transfer and coherent backscattering method (RT-CB) to the case of dense packing of particles by adopting the ensemble-averaged first-order incoherent extinction, scattering, and absorption characteristics of a volume element of particles as input. In the radiative transfer part, at each absorption and scattering process, we account for absorption with the help of the single-scattering albedo and peel off the Stokes parameters of radiation emerging from the medium in predefined scattering angles. We then generate a new scattering direction using the joint probability density for the local polar and azimuthal scattering angles. In the coherent backscattering part, we utilize amplitude scattering matrices along the radiative-transfer path and the reciprocal path. Furthermore, we replace the far-field interactions of the RT-CB method with rigorous interactions facilitated by the Superposition T-matrix method (STMM). This gives rise to a new RT-RT method, radiative transfer with reciprocal interactions. For microscopic random media, we then compare the new results to asymptotically exact results computed using the STMM, succeeding in the numerical validation of the new methods. Acknowledgments: Research supported by the European Research Council with Advanced Grant No. 320773 SAEMPL (Scattering and Absorption of ElectroMagnetic waves in ParticuLate media). Computational resources provided by CSC - IT Centre for Science Ltd, Finland.
Planetary CubeSats Come of Age
Sherwood, Brent; Spangelo, Sara; Frick, Andreas; Castillo-Rogez, Julie; Klesh, Andrew; Wyatt, E. Jay; Reh, Kim; Baker, John
2015-01-01
Jet Propulsion Laboratory initiatives in developing and formulating planetary CubeSats are described. Six flight systems already complete or underway now at JPL for missions to interplanetary space, the Moon, a near-Earth asteroid, and Mars are described at the subsystem level. Key differences between interplanetary nanospacecraft and LEO CubeSats are explained, as well as JPL's adaptation of vendor components and development of system solutions to meet planetary-mission needs. Feasible technology-demonstration and science measurement objectives are described for multiple modes of planetary mission implementation. Seven planetary-science demonstration mission concepts, already proposed to NASA by Discovery-2014 PIs partnered with JPL, are described for investigations at Sun-Earth L5, Venus, NEA 1999 FG3, comet Tempel 2, Phobos, main-belt asteroid 24 Themis, and metal asteroid 16 Psyche. The JPL staff and facilities resources available to PIs for analysis, design, and development of planetary nanospacecraft are catalogued.
The Earth Radiation Budget (ERB) experiment
Jacobowitz, H.; Stowe, L. L.; Hickey, J. R.
1978-01-01
The radiation budget of the Earth was determined on both synoptic and planetary scales by simultaneous measurement of incoming solar radiation and outgoing Earth-reflected (shortwave) and emitted (longwave) radiation. Both fixed wide-angle sampling of terrestrial fluxes at the satellite altitude and scanned narrow-angle sampling of the angle-dependent radiance components are used to determine outgoing radiation. Measurements of radiation are obtained in 22 different optical channels.
International Nuclear Information System (INIS)
Jacoby, G.H.
1980-01-01
Identifications of 19 and 34 faint planetary nebulae have been made in the central regions of the SMC and LMC, respectively, using on-line/off-line filter photography at [O III] and Hα. The previously known brighter planetary nebulae in these fields, eight in both the SMC and the LMC, were also identified. On the basis of the ratio of the numbers of faint to bright planetary nebulae in these fields and the numbers of bright planetary nebulae in the surrounding fields, the total numbers of planetary nebulae in the SMC and LMC are estimated to be 285 ± 78 and 996 ± 253, respectively. Corrections have been applied to account for omissions due to crowding confusion in previous surveys, spatial and detectability incompleteness, and obscuration by dust. Equatorial coordinates and finding charts are presented for all the identified planetary nebulae. The coordinates have uncertainties smaller than 0.6 arcsec relative to nearby bright stars, thereby allowing acquisition of the planetary nebulae by blind offsetting. Monochromatic fluxes are derived photographically and used to determine the luminosity function for Magellanic Cloud planetary nebulae as faint as 6 mag below the brightest. The luminosity function is used to estimate the total numbers of planetary nebulae in eight Local Group galaxies in which only bright planetary nebulae have been identified. The derived luminosity-specific number of planetary nebulae per unit luminosity is nearly constant for all eight galaxies, having a value of 6.1 × 10⁻⁷ planetary nebulae per solar luminosity. The mass-specific number, based on the three galaxies with well-determined masses, is 2.1 × 10⁻⁷ planetary nebulae per solar mass. With estimates for the luminosity and mass of our Galaxy, its total number of planetary nebulae is calculated to be 10,000 ± 4000, in support of the Cudworth distance scale
Energy Technology Data Exchange (ETDEWEB)
Kiedrowski, Brian C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2014-03-11
The goals of this project are to develop Monte Carlo radiation transport methods and simulation software for engineering analysis that are robust, efficient and easy to use; and provide computational resources to assess and improve the predictive capability of radiation transport methods and nuclear data.
ISO Spectroscopy of Proto-Planetary Nebulae
Hrivnak, Bruce J.
2000-01-01
features at 3.3, 6.2, 7.7, and 11.3 micron, which are commonly observed in planetary nebulae and HII regions, are also seen in these PPNs. However, their strengths relative to the continuum plateaus at 8 and 12 micron are weaker than in planetary nebulae. The 6.9 micron feature, seen almost exclusively in PPNs, is strong. The spectral energy distributions of these PPNs were fitted with a radiative-transfer model, taking into account the emission features at 21, 26, and 30 micron. A significant fraction of the total energy output is emitted in these features: as high as 20% in the 30 micron feature and 8% in the 21 micron feature. The fact that so much energy is carried in these features suggests that the material responsible for them must be made of abundant elements, and most likely involves carbon. The change in feature strengths from stronger aliphatic bonds in PPNs to stronger aromatic bonds in PNs suggests a chemical and physical evolution of the carbonaceous circumstellar dust during this transition timescale of a few thousand years.
Interferometric observations of planetary nebulae
International Nuclear Information System (INIS)
Atherton, P.D.
1978-01-01
Studies of the velocity field of planetary nebulae can be used to derive important information concerning their structure and dynamics. A description is given of the design, construction and operation of a servo-controlled Fabry Perot Interferometer, for the Cassegrain focus, which was built to perform these studies. New evidence is presented concerning the structure and internal motions of NGC 3242, NGC 6720 and NGC 7027. A technique is described which uses the velocity field to map variations in the electron temperature and density along the line of sight as well as across the face of the nebula. It is shown how a Fabry Perot may be used in conjunction with multi-element array detectors to facilitate this technique. Finally some extensions to the technique of capacitance micrometry are discussed which allow the operation of a single air-spaced etalon over a wide range of capacitor gaps
Visualization Tools for Planetary Data
DeWolfe, Alexandria; Larsen, Kristopher; Brain, David; Chaffin, Michael; Harter, Bryan; Putnam, Brian
2017-04-01
We have developed a set of software tools for displaying and analyzing data from the MAVEN and MMS missions. In order to better visualize the science data and models, we have constructed 3D visualizations of MAVEN orbiting Mars and MMS orbiting Earth using the CesiumJS library. These visualizations allow viewing of not only spacecraft orientation and position over time, but also scientific data from the spacecraft, and atmospheric models as well. We have also developed a Python toolkit which replicates the functionality of the widely-used IDL "tplot" toolkit for analyzing planetary atmospheric data. We use the bokeh and matplotlib libraries to generate interactive line plots and spectrograms, providing additional functionality beyond the capabilities of IDL graphics. These Python tools are generalized to work with missions beyond MAVEN, and our open-source software is available on Github.
Planetary accretion in circumstellar disks
Lissauer, Jack J.; Stewart, Glen R.
1993-01-01
The formation of terrestrial planets and the cores of Jovian planets is reviewed in the framework of the planetesimal hypothesis, wherein planets are assumed to grow via the pairwise accumulation of small solid bodies. Emphasis is placed on the dynamics of solid body accretion from kilometer size planetesimals to terrestrial type planets. This stage of planetary growth is least dependent on the characteristics of the evolutionary state of the central star. It is concluded that the evolution of the planetesimal size distribution is determined by the gravitationally enhanced collision cross-section, which favors collisions between planetesimals with smaller velocities. Runaway growth of the largest planetesimal in each accretion zone appears to be a likely outcome. The subsequent accumulation of the resulting protoplanets leads to a large degree of radial mixing in the terrestrial planet region, and giant impacts are probable.
Sonic anemometry of planetary atmospheres
Cuerva, Alvaro; Sanz-Andrés, Angel; Lorenz, Ralph D.
2004-02-01
Sonic anemometers are robust, fast and reliable wind sensors which are able to measure the complete wind speed vector at high sampling rates. All these characteristics make sonic anemometers ideal candidates for atmospheric applications. Since sonic anemometers have no moving parts and can be designed for low mass and power consumption, they have become suitable for planetary exploration purposes, both for atmosphere studies and for flying-robot control. However, some challenges must be addressed before implementing their use. Problems such as sound attenuation in different atmospheres, sensor/air acoustic impedance matching, and the flow/fluid dependence of sonic measurements have to be considered when these sensors are used in other atmospheres.
A Monte Carlo code for ion beam therapy
Anaïs Schaeffer
2012-01-01
Initially developed for applications in detector and accelerator physics, the modern Fluka Monte Carlo code is now used in many different areas of nuclear science. Over the last 25 years, the code has evolved to include new features, such as ion beam simulations. Given the growing use of these beams in cancer treatment, Fluka simulations are being used to design treatment plans in several hadron-therapy centres in Europe. (Figure: Fluka-calculated dose distribution for a patient treated at CNAO with proton beams; the colour bar displays the normalized dose values.) Fluka is a Monte Carlo code that very accurately simulates electromagnetic and nuclear interactions in matter. In the 1990s, in collaboration with NASA, the code was developed to predict potential radiation hazards received by space crews during possible future trips to Mars. Over the years, it has become the standard tool to investigate beam-machine interactions, radiation damage and radioprotection issues in the CERN accelerator com...
DEVELOPMENT OF A MULTIMODAL MONTE CARLO BASED TREATMENT PLANNING SYSTEM.
Kumada, Hiroaki; Takada, Kenta; Sakurai, Yoshinori; Suzuki, Minoru; Takata, Takushi; Sakurai, Hideyuki; Matsumura, Akira; Sakae, Takeji
2017-10-26
To establish boron neutron capture therapy (BNCT), the University of Tsukuba is developing a treatment device and peripheral devices required in BNCT, such as a treatment planning system. We are developing a new multimodal Monte Carlo based treatment planning system (developing code: Tsukuba Plan). Tsukuba Plan allows for dose estimation in proton therapy, X-ray therapy and heavy ion therapy in addition to BNCT because the system employs PHITS as the Monte Carlo dose calculation engine. Regarding BNCT, several verifications of the system are being carried out for its practical usage. The verification results demonstrate that Tsukuba Plan allows for accurate estimation of thermal neutron flux and gamma-ray dose as fundamental radiations of dosimetry in BNCT. In addition to the practical use of Tsukuba Plan in BNCT, we are investigating its application to other radiation therapies.
Monte Carlo Transverse Emittance Study on Cs2Te
Banfi, F; Galimberti, P G; Giannetti, C; Pagliara, S; Parmigiani, F; Pedersoli, E
2005-01-01
A Monte Carlo study of electron transport in Cs2Te films is performed to investigate the transverse emittance ε at the cathode surface. We find the photoemitted electron angular distribution and explain the physical mechanism involved in the process, a mechanism hindered by the statistical nature of the Monte Carlo method. The effects of electron-phonon scattering are discussed. The transverse emittance is calculated for different radiation wavelengths and a laser spot size of 1.5 × 10⁻³ m. For laser radiation at 265 nm we find ε = 0.56 mm-mrad. The dependence of ε and the quantum yield on the electron affinity Ea is also investigated. The data show the importance of aging/contamination of the material.
Monte Carlo simulation of a CZT detector
International Nuclear Information System (INIS)
Chun, Sung Dae; Park, Se Hwan; Ha, Jang Ho; Kim, Han Soo; Cho, Yoon Ho; Kang, Sang Mook; Kim, Yong Kyun; Hong, Duk Geun
2008-01-01
The CZT detector is one of the most promising radiation detectors for hard X-ray and γ-ray measurement. The energy spectrum of a CZT detector has to be simulated to optimize the detector design. A CZT detector was fabricated with dimensions of 5×5×2 mm³. A Peltier cooler with a size of 40×40 mm² was installed below the fabricated CZT detector to reduce its operating temperature. Energy spectra were measured with the 59.5 keV γ-ray from 241Am. A Monte Carlo code was developed to simulate the energy spectrum of the planar-type CZT detector, and the simulated spectrum was compared with the measured one. The simulation was extended to a CZT detector with strip electrodes. (author)
Monte Carlo simulations of medical imaging modalities
Energy Technology Data Exchange (ETDEWEB)
Estes, G.P. [Los Alamos National Lab., NM (United States)]
1998-09-01
Because continuous-energy Monte Carlo radiation transport calculations can be nearly exact simulations of physical reality (within data limitations, geometric approximations, transport algorithms, etc.), it follows that one should be able to closely approximate the results of many experiments from first-principles computations. This line of reasoning has led to various MCNP studies that involve simulations of medical imaging modalities and other visualization methods such as radiography, Anger camera, computerized tomography (CT) scans, and SABRINA particle track visualization. It is the intent of this paper to summarize some of these imaging simulations in the hope of stimulating further work, especially as computer power increases. Improved interpretation and prediction of medical images should ultimately lead to enhanced medical treatments. It is also reasonable to assume that such computations could be used to design new or more effective imaging instruments.
The International Planetary Data Alliance
Stein, T.; Arviset, C.; Crichton, D. J.
2017-12-01
The International Planetary Data Alliance (IPDA) is an association of partners with the aim of improving the quality of planetary science data and services to the end users of space-based instrumentation. The specific mission of the IPDA is to facilitate global access to, and exchange of, high quality scientific data products managed across international boundaries. Ensuring proper capture, accessibility and availability of the data is the task of the individual member space agencies. The IPDA was formed in 2006 with the purpose of adopting standards and developing collaborations across agencies to ensure data is captured in common formats. Member agencies include: Armenian Astronomical Society, China National Space Agency (CNSA), European Space Agency (ESA), German Aerospace Center (DLR), Indian Space Research Organization (ISRO), Italian Space Agency (ASI), Japanese Aerospace Exploration Agency (JAXA), National Aeronautics and Space Administration (NASA), National Centre for Space Studies (CNES), Space Research Institute (IKI), UAE Space Agency, and UK Space Agency. The IPDA Steering Committee oversees the execution of projects and coordinates international collaboration. The IPDA conducts a number of focused projects to enable interoperability, construction of compatible archives, and the operation of the IPDA as a whole. These projects have helped to establish the IPDA and to move the collaboration forward. A key project that is currently underway is the implementation of the PDS4 data standard. Given the international focus, it has been critical that the PDS and the IPDA collaborate on its development. Also, other projects have been conducted successfully, including developing the IPDA architecture and corresponding requirements, developing shared registries for data and tools across international boundaries, and common templates for supporting agreements for archiving and sharing data for international missions. Several projects demonstrating interoperability across...
Evaluation of equivalent doses in 18F PET/CT using the Monte Carlo method with MCNPX code
International Nuclear Information System (INIS)
Belinato, Walmir; Santos, William Souza; Perini, Ana Paula; Neves, Lucio Pereira; Souza, Divanizia N.
2017-01-01
The present work used the Monte Carlo method, specifically the Monte Carlo N-Particle eXtended (MCNPX) code, to simulate the interaction of radiation involving photons and particles, such as positrons and electrons, with virtual adult anthropomorphic simulators in PET/CT scans, and to determine absorbed and equivalent doses in adult male and female patients.
Multilevel sequential Monte Carlo samplers
Beskos, Alexandros
2016-08-29
In this article we consider the approximation of expectations w.r.t. probability distributions associated to the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods which depend on the step-size level h_L. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the introduction of the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated to a Monte Carlo approximation of a sequence of probability distributions with discretization levels ∞ > h_0 > h_1 > ⋯ > h_L. In many practical problems of interest, one cannot achieve an i.i.d. sampling of the associated sequence, and a sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. It is shown that under appropriate assumptions, the attractive property of a reduction of the amount of computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context, relative to exact sampling and Monte Carlo for the distribution at the finest level h_L. The approach is numerically illustrated on a Bayesian inverse problem. © 2016 Elsevier B.V.
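The telescoping identity mentioned above writes the finest-level expectation as E[P_L] = E[P_0] + Σ_{l=1}^{L} E[P_l − P_{l−1}], with each difference estimated from coupled coarse/fine samples. A minimal sketch on a toy problem (Euler-discretized geometric Brownian motion rather than a PDE; all parameter values are illustrative):

```python
import math
import random

random.seed(1)
R, SIGMA, T, S0 = 0.05, 0.2, 1.0, 1.0   # toy GBM parameters (assumed values)

def coupled_level(l, n):
    """Mean of P_l - P_{l-1} over n coupled Euler paths of GBM,
    where P_l = S_T computed with 2**l time steps; at l == 0 it
    returns the plain mean of P_0. Coarse and fine paths share the
    same Brownian increments, which is what keeps the variance small."""
    nf = 2 ** l
    hf = T / nf
    total = 0.0
    for _ in range(n):
        inc = [random.gauss(0.0, math.sqrt(hf)) for _ in range(nf)]
        sf = S0
        for dw in inc:                       # fine Euler path
            sf += R * sf * hf + SIGMA * sf * dw
        if l == 0:
            total += sf
            continue
        sc, hc = S0, 2.0 * hf
        for k in range(0, nf, 2):            # coarse path: paired increments
            sc += R * sc * hc + SIGMA * sc * (inc[k] + inc[k + 1])
        total += sf - sc
    return total / n

# Telescoping estimator: E[P_L] ~ sum over levels of mean(P_l - P_{l-1}).
L_MAX = 4
estimate = sum(coupled_level(l, 20_000) for l in range(L_MAX + 1))
print(f"MLMC estimate: {estimate:.4f}  (exact E[S_T] = {math.exp(R * T):.4f})")
```

In a production MLMC estimator the small variance of the coupled differences lets fine levels use far fewer samples than coarse ones; here every level uses the same count for simplicity.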
Radiation Damage in Electronic Memory Devices
Fetahović, Irfan; Pejović, Milić; Vujisić, Miloš
2013-01-01
This paper investigates the behavior of semiconductor memories exposed to radiation in order to establish their applicability in a radiation environment. The experimental procedure has been used to test radiation hardness of commercial semiconductor memories. Different types of memory chips have been exposed to indirect ionizing radiation by changing radiation dose intensity. The effect of direct ionizing radiation on semiconductor memory behavior has been analyzed by using Monte Carlo simula...
Monte carlo dose calculation in dental amalgam phantom
Mohd Zahri Abdul Aziz; A L Yusoff; N D Osman; R Abdullah; N A Rabaie; M S Salikin
2015-01-01
It has become a great challenge in modern radiation treatment to ensure the accuracy of treatment delivery in electron beam therapy. Tissue inhomogeneity is one of the factors complicating accurate dose calculation, and this requires a complex calculation algorithm like Monte Carlo (MC). On the other hand, computed tomography (CT) images used in a treatment planning system need to be trustworthy, as they are the input in radiotherapy treatment. However, with the presence of metal amalgam in treatm...
The evolution of solar ultraviolet luminosity. [influence on planetary atmospheres
Zahnle, K. J.; Walker, J. C. G.
1982-01-01
Astronomical observations of stars analogous to the sun are used to construct a tentative account of the evolution of solar UV luminosity. Evidence exists that the young sun was a much more powerful source of energetic particles and radiation than it is today, and while on the main sequence, solar activity has declined as an inverse power law of age as a consequence of angular momentum loss to the solar wind. Observations of pre-main sequence stars indicate that before the sun reached the main sequence, it may have emitted as much as ten thousand times the amount of ultraviolet radiation that it does today. The impact of the results on knowledge of photochemistry and escape of constituents of primordial planetary atmospheres is discussed.
Torsello, Daniele; Mino, Lorenzo; Bonino, Valentina; Agostino, Angelo; Operti, Lorenza; Borfecchia, Elisa; Vittone, Ettore; Lamberti, Carlo; Truccato, Marco
2018-01-01
We investigate the microscopic mechanism responsible for the change of macroscopic electrical properties of the Bi2Sr2CaCu2O8+δ high-temperature superconductor induced by intense synchrotron hard x-ray beams. The possible effects of secondary electrons on the oxygen content via the knock-on interaction are studied by Monte Carlo simulations. The change in the oxygen content expected from the knock-on model is computed convoluting the fluence of photogenerated electrons in the material with the Seitz-Koehler cross section. This approach has been adopted to analyze several experimental irradiation sessions with increasing x-ray fluences. A close comparison between the expected variations in oxygen content and the experimental results allows determining the irradiation regime in which the knock-on mechanism can satisfactorily explain the observed changes. Finally, we estimate the threshold displacement energy of loosely bound oxygen atoms in this material, Td = 0.15 (+0.025/−0.01) eV.
Orfanelli, Styliani; Gazis, E
The Compact Linear Collider (CLIC) study is a feasibility study aiming at the development of an electron/positron linear collider with a centre-of-mass energy in the multi-TeV range. Each linac will have a length of 21 km, which means that very high accelerating gradients (>100 MV/m) are required. To achieve these gradients, a novel two-beam acceleration scheme has been designed, in which RF power is transferred from a high-current, low-energy drive beam to the low-current, high-energy main accelerating beam. A Beam Loss Monitoring (BLM) system will be designed for CLIC to meet the requirements of the accelerator complex. Its main role as part of the machine protection scheme will be to detect potentially dangerous beam instabilities and prevent subsequent injection into the main beam or drive beam decelerators. The first part of this work describes the GEANT4 Monte Carlo simulations performed to estimate the damage potential of high energy electron beams impacting a copper target. The second...
Therapeutic Applications of Monte Carlo Calculations in Nuclear Medicine
International Nuclear Information System (INIS)
Coulot, J
2003-01-01
Monte Carlo techniques are involved in many applications in medical physics, and the field of nuclear medicine has seen great development in the past ten years due to their wider use. Thus, it is of great interest to look at the state of the art in this domain, when improving computer performance allows one to obtain improved results in a dramatically reduced time. The goal of this book is to make, in 15 chapters, an exhaustive review of the use of Monte Carlo techniques in nuclear medicine, also giving key features which are not necessarily directly related to the Monte Carlo method, but mandatory for its practical application. As the book deals with 'therapeutic' nuclear medicine, it focuses on internal dosimetry. After a general introduction on Monte Carlo techniques and their applications in nuclear medicine (dosimetry, imaging and radiation protection), the authors give an overview of internal dosimetry methods (formalism, mathematical phantoms, quantities of interest). Then, some of the more widely used Monte Carlo codes are described, as well as some treatment planning software packages. Some original techniques are also mentioned, such as dosimetry for boron neutron capture synovectomy. It is generally well written, clearly presented, and very well documented. Each chapter gives an overview of its subject, and it is up to the reader to investigate it further using the extensive bibliography provided. Each topic is discussed from a practical point of view, which is of great help for non-experienced readers. For instance, the chapter about mathematical aspects of Monte Carlo particle transport is very clear and helps one to apprehend the philosophy of the method, which is often a difficulty with a more theoretical approach. Each chapter is put in the general (clinical) context, and this allows the reader to keep in mind the intrinsic limitations of each technique involved in dosimetry (for instance activity quantitation). Nevertheless, there are some minor remarks to...
Markov Chain Monte Carlo Methods-Simple Monte Carlo
Indian Academy of Sciences (India)
Resonance – Journal of Science Education, Volume 8, Issue 4.
An online planetary exploration tool: "Country Movers"
Gede, Mátyás; Hargitai, Henrik
2017-08-01
Results in astrogeologic investigations are rarely communicated towards the general public by maps despite the new advances in planetary spatial informatics and new spatial datasets in high resolution and more complete coverage. Planetary maps are typically produced by astrogeologists for other professionals, and not by cartographers for the general public. We report on an application designed for students, which uses cartography as framework to aid the virtual exploration of other planets and moons, using the concepts of size comparison and travel time calculation. We also describe educational activities that build on geographic knowledge and expand it to planetary surfaces.
Gravitational effects on planetary neutron flux spectra
Feldman, W. C.; Drake, D. M.; O'Dell, R. D.; Brinkley, F. W., Jr.; Anderson, R. C.
1989-01-01
The effects of gravity and of the finite lifetime of the neutron on the planetary neutron flux spectra of Mars were investigated using a modified one-dimensional diffusion-accelerated neutral-particle transport code, coupled with a multigroup cross-section library tailored specifically for Mars. The results showed the presence of a qualitatively new feature in planetary neutron leakage spectra in the form of a component of returning neutrons with kinetic energies less than the gravitational binding energy (0.132 eV for Mars). The net effect is an enhancement in flux at the lowest energies that is largest at and above the outermost layer of planetary matter.
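The quoted 0.132 eV is the gravitational binding energy of a neutron at the Martian surface, E = G·M·m_n / R. A quick check with standard constants (values rounded to four significant figures):

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_MARS = 6.417e23  # mass of Mars, kg
R_MARS = 3.390e6   # mean radius of Mars, m
M_N = 1.675e-27    # neutron rest mass, kg
EV = 1.602e-19     # joules per electronvolt

# Escape energy of a neutron from the Martian surface.
E_bind = G * M_MARS * M_N / R_MARS
print(f"{E_bind / EV:.3f} eV")   # ≈ 0.132 eV, matching the abstract
```

Neutrons leaking from the surface with less kinetic energy than this fall back, producing the low-energy flux enhancement the abstract describes.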
Planetary climates (princeton primers in climate)
Ingersoll, Andrew
2013-01-01
This concise, sophisticated introduction to planetary climates explains the global physical and chemical processes that determine climate on any planet or major planetary satellite--from Mercury to Neptune and even large moons such as Saturn's Titan. Although the climates of other worlds are extremely diverse, the chemical and physical processes that shape their dynamics are the same. As this book makes clear, the better we can understand how various planetary climates formed and evolved, the better we can understand Earth's climate history and future.
Lunar and Planetary Science XXXVI, Part 18
2005-01-01
Topics discussed include: PoDS: A Powder Delivery System for Mars In-Situ Organic, Mineralogic and Isotopic Analysis Instruments; Planetary Differentiation of Accreting Planetesimals with 26Al and 60Fe as the Heat Sources; Ground-based Observation of Lunar Surface by Lunar VIS/NIR Spectral Imager; Mt. Oikeyama Structure: First Impact Structure in Japan?; Central Mounds in Martian Impact Craters: Assessment as Possible Perennial Permafrost Mounds (Pingos); A Further Analysis of Potential Photosynthetic Life on Mars; New Insight into Valleys-Ocean Boundary on Mars Using 128 Pixels per Degree MOLA Data: Implication for Martian Ocean and Global Climate Change; Recursive Topography Based Surface Age Computations for Mars: New Insight into Surficial Processes That Influenced Craters Distribution as a Step Toward the Formal Proof of Martian Ocean Recession, Timing and Probability; Laser-induced Breakdown Spectroscopy: A New Method for Stand-Off Quantitative Analysis of Samples on Mars; Milk Spring Channels Provide Further Evidence of Oceanic, >1.7-km-Deep Late Devonian Alamo Crater, Southern Nevada; Exploration of Martian Polar Residual Caps from HEND/ODYSSEY Data; Outflow Channels Influencing Martian Climate: Global Circulation Model Simulations with Emplaced Water; Presence of Nonmethane Hydrocarbons on Pluto; Difference in Degree of Space Weathering on the Newborn Asteroid Karin; Circular Collapsed Features Related to the Chaotic Terrain Formation on Mars; A Search for Live 244Pu in Deep-Sea Sediments: Preliminary Results of Method Development; Some Peculiarities of Quartz, Biotite and Garnet Transformation in Conditions of Step-like Shock Compression of Crystal Slate; Error Analysis of Remotely-Acquired Mossbauer Spectra; Cloud Activity on Titan During the Cassini Mission; Solar Radiation Pressure and Transient Flows on Asteroid Surfaces; Landing Site Characteristics for Europa 1: Topography; and The Crop Circles of Europa.
Microwave transport in EBT distribution manifolds using Monte Carlo ray-tracing techniques
International Nuclear Information System (INIS)
Lillie, R.A.; White, T.L.; Gabriel, T.A.; Alsmiller, R.G. Jr.
1983-01-01
Ray-tracing Monte Carlo calculations have been carried out using an existing Monte Carlo radiation transport code to obtain estimates of the microwave power exiting the torus coupling links in EBT microwave manifolds. The microwave power loss and polarization at surface reflections were accounted for by treating the microwaves as plane waves reflecting off plane surfaces. Agreement on the order of 10% was obtained between the measured and calculated output power distribution for an existing EBT-S toroidal manifold. A cost-effective iterative procedure utilizing the Monte Carlo history data was implemented to predict design changes which could produce increased manifold efficiency and improved output power uniformity.
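The essence of such a ray-tracing calculation can be sketched in a few lines: launch rays, apply a power loss at each specular wall reflection, and tally what escapes. The toy below uses a 2D unit-square cavity with a single exit aperture; the geometry, reflectivity, and aperture are invented for illustration and bear no relation to the actual EBT manifold:

```python
import math
import random

random.seed(2)
REFLECTIVITY = 0.98   # power retained per wall bounce (assumed value)
EXIT = (0.4, 0.6)     # aperture on the bottom wall (y == 0), assumed

def trace(n_rays=20_000, max_bounces=500):
    """Launch rays from the cavity centre in random directions and
    specularly reflect them off the walls of the unit square; a ray
    escapes when it hits the aperture. Returns the mean escaping power."""
    total = 0.0
    for _ in range(n_rays):
        x, y = 0.5, 0.5
        ang = random.uniform(0.0, 2.0 * math.pi)
        dx, dy = math.cos(ang), math.sin(ang)
        power = 1.0
        for _ in range(max_bounces):
            # Distance to the next vertical / horizontal wall.
            tx = (1.0 - x) / dx if dx > 0 else (0.0 - x) / dx if dx < 0 else math.inf
            ty = (1.0 - y) / dy if dy > 0 else (0.0 - y) / dy if dy < 0 else math.inf
            t = min(tx, ty)
            x, y = x + t * dx, y + t * dy
            if ty <= tx and dy < 0 and EXIT[0] <= x <= EXIT[1]:
                total += power          # ray leaves through the aperture
                break
            if tx < ty:
                dx = -dx                # specular reflection, vertical wall
            else:
                dy = -dy                # specular reflection, horizontal wall
            power *= REFLECTIVITY
        # Rays exceeding max_bounces carry negligible power and are dropped.
    return total / n_rays

frac = trace()
print(f"transmitted power fraction ≈ {frac:.3f}")
```

The real calculation additionally tracks polarization and a wavelength-dependent surface loss, but the tally structure is the same.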
Therapeutic Applications of Monte Carlo Calculations in Nuclear Medicine
Sgouros, George
2003-01-01
This book examines the applications of Monte Carlo (MC) calculations in therapeutic nuclear medicine, from basic principles to computer implementations of software packages and their applications in radiation dosimetry and treatment planning. It is written for nuclear medicine physicists and physicians as well as radiation oncologists, and can serve as a supplementary text for medical imaging, radiation dosimetry and nuclear engineering graduate courses in science, medical and engineering faculties. With chapters written by recognised authorities in each particular field, the book covers the entire range of MC applications in therapeutic medical and health physics, from its use in imaging prior to therapy to dose distribution modelling in targeted radiotherapy. The contributions discuss the fundamental concepts of radiation dosimetry, radiobiological aspects of targeted radionuclide therapy and the various components and steps required for implementing a dose calculation and treatment planning methodology in ...
Exact Monte Carlo for molecules
Energy Technology Data Exchange (ETDEWEB)
Lester, W.A. Jr.; Reynolds, P.J.
1985-03-01
A brief summary of the fixed-node quantum Monte Carlo method is presented. Results obtained for binding energies, the classical barrier height for H + H2, and the singlet-triplet splitting in methylene are presented and discussed. 17 refs.
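The fixed-node method summarized above belongs to the projector (diffusion) Monte Carlo family. As a hedged illustration, the sketch below runs a bare-bones diffusion Monte Carlo on the 1D harmonic oscillator, where no nodal constraint is needed and the exact ground-state energy is 0.5 (ħ = m = ω = 1); the time step, target population, and feedback gain are invented tuning choices, and a real fixed-node molecular calculation additionally uses importance sampling and a trial wavefunction whose nodes constrain the walkers:

```python
import math
import random
import statistics

random.seed(4)
DT, TARGET = 0.01, 300   # time step and target walker population (tuning choices)

def dmc_energy(steps=2000):
    """Bare-bones diffusion Monte Carlo for V(x) = x^2/2.
    Walkers diffuse, then branch with weight exp(-(V - E_ref)*dt);
    E_ref is fed back to hold the population near TARGET, and its
    running average estimates the ground-state energy (exactly 0.5)."""
    walkers = [random.gauss(0.0, 1.0) for _ in range(TARGET)]
    e_ref, samples = 0.0, []
    for step in range(steps):
        new = []
        for x in walkers:
            x += random.gauss(0.0, math.sqrt(DT))            # diffusion move
            weight = math.exp(-(0.5 * x * x - e_ref) * DT)   # branching weight
            for _ in range(int(weight + random.random())):   # stochastic birth/death
                new.append(x)
        walkers = new or [0.0]                               # guard against extinction
        e_ref += 0.1 * math.log(TARGET / len(walkers))       # population feedback
        if step >= steps // 2:
            samples.append(e_ref)                            # collect after equilibration
    return statistics.mean(samples)

energy = dmc_energy()
print(f"DMC ground-state energy ≈ {energy:.3f}  (exact: 0.5)")
```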
Markov Chain Monte Carlo Methods
Indian Academy of Sciences (India)
Keywords: Markov chain, Monte Carlo sampling, Markov chain Monte Carlo.
Markov Chain Monte Carlo Methods
Indian Academy of Sciences (India)
2. The Markov Chain Case. K B Athreya, Mohan Delampady and T Krishnan.
Markov Chain Monte Carlo Methods
Indian Academy of Sciences (India)
3. Statistical Concepts. K B Athreya, Mohan Delampady and T Krishnan.
Monte Carlo calculations of nuclei
Energy Technology Data Exchange (ETDEWEB)
Pieper, S.C. [Argonne National Lab., IL (United States). Physics Div.
1997-10-01
Nuclear many-body calculations have the complication of strong spin- and isospin-dependent potentials. In these lectures the author discusses the variational and Green's function Monte Carlo techniques that have been developed to address this complication, and presents a few results.
Markov Chain Monte Carlo Methods
Indian Academy of Sciences (India)
...ter of the 20th century, due to rapid developments in computing technology ... early part of this development saw a host of Monte ... These iterative Monte Carlo procedures typically generate a random sequence with the Markov property such that the Markov chain is ergodic with a limiting distribution coinciding with the ...
Is Monte Carlo embarrassingly parallel?
International Nuclear Information System (INIS)
Hoogenboom, J. E.
2012-01-01
Monte Carlo is often stated as being embarrassingly parallel. However, running a Monte Carlo calculation, especially a reactor criticality calculation, in parallel using tens of processors shows a serious limitation in speedup, and the execution time may even increase beyond a certain number of processors. In this paper the main causes of the loss of efficiency when using many processors are analyzed using a simple Monte Carlo program for criticality. The basic mechanism for parallel execution is MPI. One of the bottlenecks turns out to be the rendezvous points in the parallel calculation used for synchronization and exchange of data between processors. This happens at least at the end of each cycle for fission source generation, in order to collect the full fission source distribution for the next cycle and to estimate the effective multiplication factor, which is not only part of the requested results but also input to the next cycle for population control. Basic improvements to overcome this limitation are suggested and tested. Other time losses in the parallel calculation are also identified. Moreover, the threading mechanism, which allows the parallel execution of tasks based on shared memory using OpenMP, is analyzed in detail. Recommendations are given to get the maximum efficiency out of a parallel Monte Carlo calculation. (authors)
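The per-cycle rendezvous described above can be sketched in miniature: every worker must report its fission-source tally before k can be estimated and the next cycle begun, so the slowest worker sets the pace. A toy criticality loop (a thread pool standing in for MPI ranks; the physics is reduced to a branching process with an assumed mean K_TRUE):

```python
import random
import statistics
from concurrent.futures import ThreadPoolExecutor

K_TRUE = 1.0                      # assumed multiplication factor of the toy system
WORKERS, CYCLES, HISTORIES = 4, 20, 4000

def run_batch(n_histories, seed):
    """One worker's share of a cycle: each history fissions into 2
    next-generation neutrons with probability K_TRUE/2 (mean K_TRUE)."""
    rng = random.Random(seed)
    return sum(2 for _ in range(n_histories) if rng.random() < K_TRUE / 2)

k_estimates = []
with ThreadPoolExecutor(max_workers=WORKERS) as pool:
    for cycle in range(CYCLES):
        share = HISTORIES // WORKERS
        futures = [pool.submit(run_batch, share, cycle * WORKERS + w)
                   for w in range(WORKERS)]
        # Rendezvous point: every rank must deliver its tally before the
        # fission source can be renormalized and the next cycle started.
        offspring = sum(f.result() for f in futures)
        k_estimates.append(offspring / (share * WORKERS))
        # (Population control would resample the offspring sites back
        #  down to HISTORIES before the next cycle.)

print(f"mean k over {CYCLES} cycles: {statistics.mean(k_estimates):.4f}")
```

The `sum(f.result() ...)` line is the serialization the paper analyzes: no worker can start cycle n+1 until all of them have finished cycle n.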
Extension of Einstein's Planetary Theory Based on Generalized ...
African Journals Online (AJOL)
In this article, the generalized Einstein radial equation of motion in the equatorial plane of the Sun is transformed to obtain additional correction terms to all orders of C2 to Einstein's planetary equation of motion and hence to the planetary parameters. Keywords: Radial Equation; Planetary Equation; Planetary Parameters ...
Expansion patterns and parallaxes for planetary nebulae
Schönberner, D.; Balick, B.; Jacob, R.
2018-02-01
Aims: We aim to determine individual distances to a small number of rather round, quite regularly shaped planetary nebulae by combining their angular expansion in the plane of the sky with a spectroscopically measured expansion along the line of sight. Methods: We combined up to three epochs of Hubble Space Telescope imaging data and determined the angular proper motions of rim and shell edges and of other features. These results are combined with measured expansion speeds to determine individual distances by assuming that line of sight and sky-plane expansions are equal. We employed 1D radiation-hydrodynamics simulations of nebular evolution to correct for the difference between the spectroscopically measured expansion velocities of rim and shell and of their respective shock fronts. Results: Rim and shell are two independently expanding entities, driven by different physical mechanisms, although their model-based expansion timescales are quite similar. We derive good individual distances for 15 objects, and the main results are as follows: (i) distances derived from rim and shell agree well; (ii) comparison with the statistical distances in the literature gives reasonable agreement; (iii) our distances disagree with those derived by spectroscopic methods; (iv) central-star "plateau" luminosities range from about 2000 L⊙ to well below 10 000 L⊙, with a mean value at about 5000 L⊙, in excellent agreement with other samples of known distance (Galactic bulge, Magellanic Clouds, and K648 in the globular cluster M 15); (v) the central-star mass range is rather restricted: from about 0.53 to about 0.56 M⊙, with a mean value of 0.55 M⊙. Conclusions: The expansion measurements of nebular rim and shell edges confirm the predictions of radiation-hydrodynamics simulations and offer a reliable method for the evaluation of distances to suited objects. Results of this paper are based on observations made with the NASA/ESA Hubble Space Telescope in Cycle 16 (GO11122
Planetary science: Haze cools Pluto's atmosphere
West, Robert A.
2017-11-01
Modelling suggests that Pluto's atmospheric temperature is regulated by haze, unlike the other planetary bodies in the Solar System. The finding has implications for our understanding of exoplanetary atmospheres. See Letter p.352
Observatory for Planetary Investigations from the Stratosphere
National Aeronautics and Space Administration — The Observatory for Planetary Investigation from the Stratosphere (OPIS) project demonstrated the ability of the Wallops Arc Second Pointing (WASP) system to provide...
Planetary Impacts by Clustered Quark Matter Strangelets
Labun, Lance; Rafelski, Jan
2011-01-01
We propose a model of clustered u-d-s quark matter that leads to stable bulk strange quark matter. We discuss qualitatively consequences of impacts by sub-planetary mass strangelets on rocky solar system bodies.
National Aeronautics and Space Administration — The project is to prototype a soft X-ray Imager for planetary applications that has the sensitivity to observe solar system sources of soft X-ray emission. A strong...
Subsurface Prospecting by Planetary Drones, Phase I
National Aeronautics and Space Administration — The proposed program innovates subsurface prospecting by planetary drones to seek a solution to the difficulty of robotic prospecting, sample acquisition, and sample...
Artificial Intelligence in planetary spectroscopy
Waldmann, Ingo
2017-10-01
The field of exoplanetary spectroscopy is as fast moving as it is new. Analysing currently available observations of exoplanetary atmospheres often invokes large and correlated parameter spaces that can be difficult to map or constrain. This is true both for the data analysis of observations and for the theoretical modelling of their atmospheres. Issues of low signal-to-noise data and large, non-linear parameter spaces are nothing new and commonly found in many fields of engineering and the physical sciences. Recent years have seen vast improvements in statistical data analysis and machine learning that have revolutionised fields as diverse as telecommunication, pattern recognition, medical physics and cosmology. In many aspects, data mining and non-linearity challenges encountered in other data-intensive fields are directly transferable to the field of extrasolar planets. In this talk, I will discuss how deep neural networks can be designed to facilitate solving said issues both in exoplanet atmospheres as well as for atmospheres in our own solar system. I will present a deep belief network, RobERt (Robotic Exoplanet Recognition), able to learn to recognise exoplanetary spectra and provide artificial intelligence to state-of-the-art atmospheric retrieval algorithms. Furthermore, I will present a new deep convolutional network that is able to map planetary surface compositions using hyper-spectral imaging and demonstrate its uses on Cassini-VIMS data of Saturn.
Sonar equations for planetary exploration.
Ainslie, Michael A; Leighton, Timothy G
2016-08-01
The set of formulations commonly known as "the sonar equations" have for many decades been used to quantify the performance of sonar systems in terms of their ability to detect and localize objects submerged in seawater. The efficacy of the sonar equations, with individual terms evaluated in decibels, is well established in Earth's oceans. The sonar equations have been used in the past for missions to other planets and moons in the solar system, for which they are shown to be less suitable. While it would be preferable to undertake high-fidelity acoustical calculations to support planning, execution, and interpretation of acoustic data from planetary probes, to avoid possible errors for planned missions to such extraterrestrial bodies in future, doing so requires awareness of the pitfalls pointed out in this paper. There is a need to reexamine the assumptions, practices, and calibrations that work well for Earth to ensure that the sonar equations can be accurately applied in combination with the decibel to extraterrestrial scenarios. Examples are given for icy oceans such as exist on Europa and Ganymede, Titan's hydrocarbon lakes, and for the gaseous atmospheres of (for example) Jupiter and Venus.
Electron densities in planetary nebulae
International Nuclear Information System (INIS)
Stanghellini, L.; Kaler, J.B.
1989-01-01
Electron densities for 146 planetary nebulae have been obtained by analyzing a large sample of forbidden lines, interpolating theoretical curves obtained from solutions of the five-level atom using up-to-date collision strengths and transition probabilities. Electron temperatures were derived from forbidden N II and/or forbidden O III lines or were estimated from the He II 4686 A line strengths. The forbidden O II densities are generally lower than those from forbidden Cl III by an average factor of 0.65. For data sets in which forbidden O II and forbidden S II were observed in common, the forbidden O II values drop to 0.84 that of the forbidden S II, implying that the outermost parts of the nebulae might have elevated densities. The forbidden Cl III and forbidden Ar IV densities show the best correlation, especially where they have been obtained from common data sets. The data give results within 30 percent of one another, assuming homogeneous nebulae. 106 refs
Monte Carlo - Advances and Challenges
International Nuclear Information System (INIS)
Brown, Forrest B.; Mosteller, Russell D.; Martin, William R.
2008-01-01
Abstract only, full text follows: With ever-faster computers and mature Monte Carlo production codes, there has been tremendous growth in the application of Monte Carlo methods to the analysis of reactor physics and reactor systems. In the past, Monte Carlo methods were used primarily for calculating k_eff of a critical system. More recently, Monte Carlo methods have been increasingly used for determining reactor power distributions and many design parameters, such as β_eff, l_eff, τ, reactivity coefficients, Doppler defect, dominance ratio, etc. These advanced applications of Monte Carlo methods are now becoming common, not just feasible, but bring new challenges to both developers and users: convergence of 3D power distributions must be assured; confidence interval bias must be eliminated; iterated fission probabilities are required, rather than single-generation probabilities; temperature effects including Doppler and feedback must be represented; isotopic depletion and fission product buildup must be modeled. This workshop focuses on recent advances in Monte Carlo methods and their application to reactor physics problems, and on the resulting challenges faced by code developers and users. The workshop is partly tutorial, partly a review of the current state-of-the-art, and partly a discussion of future work that is needed. It should benefit both novice and expert Monte Carlo developers and users. In each of the topic areas, we provide an overview of needs, perspective on past and current methods, a review of recent work, and discussion of further research and capabilities that are required. Electronic copies of all workshop presentations and material will be available. The workshop is structured as 2 morning and 2 afternoon segments: Criticality Calculations I - convergence diagnostics, acceleration methods, confidence intervals, and the iterated fission probability; Criticality Calculations II - reactor kinetics parameters, dominance ratio, temperature...
Galaxy dynamics with the Planetary Nebula Spectrograph
Napolitano, N. R.; Romanowsky, A. J.; Douglas, N. G.; Capaccioli, M.; Arnaboldi, M.; Kuijken, K.; Merrifield, M. R.; Freeman, K. C.; Gerhard, O.
2004-01-01
The Planetary Nebula Spectrograph is a dedicated instrument for measuring the radial velocities of individual planetary nebulae (PNe) in galaxies. This new instrument is providing crucial data with which to probe the structure of dark halos in the outskirts of elliptical galaxies in particular, which traditionally lack easily interpretable kinematical tracers at large distances from the center. Preliminary results on a sample of intermediate-luminosity galaxies have shown little dark matter ...
International Planetary Data Alliance (IPDA) Information Model
Hughes, John Steven; Beebe, R.; Guinness, E.; Heather, D.; Huang, M.; Kasaba, Y.; Osuna, P.; Rye, E.; Savorskiy, V.
2007-01-01
This document is the third deliverable of the International Planetary Data Alliance (IPDA) Archive Data Standards Requirements Identification project. The goal of the project is to identify a subset of the standards currently in use by NASA's Planetary Data System (PDS) that are appropriate for internationalization. As shown in the highlighted sections of Figure 1, the focus of this project is the Information Model component of the Data Architecture Standards, namely the object models, a data dictionary, and a set of data formats.
Migration-induced architectures of planetary systems.
Szuszkiewicz, Ewa; Podlewska-Gaca, Edyta
2012-06-01
The recent increase in the number of known multi-planet systems gives a unique opportunity to study the processes responsible for planetary formation and evolution. Special attention is given to the occurrence of mean-motion resonances, because they carry important information about the history of the planetary systems. At the early stages of the evolution, when planets are still embedded in a gaseous disc, tidal interactions between the disc and the planets cause planetary orbital migration. The convergent differential migration of two planets embedded in a gaseous disc may result in capture into a mean-motion resonance. The orbital migration taking place during the early phases of planetary system formation may thus play an important role in shaping stable planetary configurations. An understanding of this stage of the evolution will provide insight into the most frequently formed architectures, which in turn are relevant for determining planet habitability. The aim of this paper is to present the observational properties of those planetary systems which contain confirmed or suspected resonant configurations. A complete list of known systems with such configurations is given. We will keep this list updated from now on, and it will be a valuable reference for studying the dynamics of extrasolar systems and testing theoretical predictions concerning the origin and evolution of planets, which are the most plausible places for the existence and development of life.
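As a small sketch of the bookkeeping behind such a catalog (not the authors' code; the function, tolerance, and periods below are assumptions for illustration), flagging a suspected mean-motion resonance amounts to finding a small-integer commensurability near the observed period ratio:

```python
def nearest_resonance(period_ratio, max_int=5, tol=0.05):
    """Return (p, q, delta) with the smallest |period_ratio - p/q|
    among small-integer ratios p:q (p > q), or None if none is
    within the tolerance tol."""
    best = None
    for q in range(1, max_int + 1):
        for p in range(q + 1, q + max_int + 1):
            delta = abs(period_ratio - p / q)
            if delta < tol and (best is None or delta < best[2]):
                best = (p, q, delta)
    return best

# hypothetical pair of planets with periods of 61.0 and 30.2 days
hit = nearest_resonance(61.0 / 30.2)   # flags a 2:1 candidate
```

Such a period-ratio screen only identifies candidates; confirming a resonance requires showing that the corresponding resonant angle actually librates, which is where the dynamical modeling discussed in the paper comes in.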
Automatic Feature Extraction from Planetary Images
Troglio, Giulia; Le Moigne, Jacqueline; Benediktsson, Jon A.; Moser, Gabriele; Serpico, Sebastiano B.
2010-01-01
With the launch of several planetary missions in the last decade, a large number of planetary images have already been acquired, and many more will become available for analysis in the coming years. The image data need to be analyzed, preferably by automatic processing techniques, because of the huge amount of data. Although many automatic feature extraction methods have been proposed and utilized for Earth remote sensing images, these methods are not always applicable to planetary data, which often present low contrast and uneven illumination. Different methods have already been presented for crater extraction from planetary images, but the detection of other types of planetary features has not yet been addressed. Here, we propose a new unsupervised method for the extraction of different features from the surface of the analyzed planet, based on the combination of several image processing techniques, including watershed segmentation and the generalized Hough transform. The method has many applications, among them image registration, and can be applied to arbitrary planetary images.
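The Hough-transform idea used for crater-like features can be sketched in its simplest circular form (a hypothetical fixed-radius variant, not the authors' implementation): each edge point votes for every center that would place it on a circle of the given radius, and crater centers emerge as peaks in the vote accumulator.

```python
import numpy as np

def hough_circle(edge_points, shape, radius, n_angles=90):
    """Accumulate votes for circle centers at a fixed radius given a
    list of (row, col) edge-point coordinates; return the peak."""
    acc = np.zeros(shape, dtype=np.int32)
    thetas = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    for (y, x) in edge_points:
        # every center at distance `radius` from this edge point gets a vote
        cy = np.round(y - radius * np.sin(thetas)).astype(int)
        cx = np.round(x - radius * np.cos(thetas)).astype(int)
        ok = (cy >= 0) & (cy < shape[0]) & (cx >= 0) & (cx < shape[1])
        np.add.at(acc, (cy[ok], cx[ok]), 1)
    return np.unravel_index(acc.argmax(), acc.shape)

# synthetic crater rim: 60 edge points on a circle centered at (40, 50), r = 12
theta = np.linspace(0.0, 2.0 * np.pi, 60, endpoint=False)
pts = [(int(round(40 + 12 * np.sin(t))), int(round(50 + 12 * np.cos(t))))
       for t in theta]
center = hough_circle(pts, (80, 100), radius=12)
```

The generalized Hough transform named in the abstract extends this voting scheme from analytic circles to arbitrary template shapes, which is what makes it usable for non-crater planetary features.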
Effects of clouds on limb radiative transfer in the infrared
Ewen, G. B.; Grainger, R. G.; Lambert, A.
2003-04-01
A forward model (known as McClouds_FM, the Monte Carlo Cloud scattering Forward Model) is developed to predict the influence of cirrus clouds on radiances measured by an infrared limb-sounding instrument, e.g. MIPAS (Michelson Interferometer for Passive Atmospheric Sounding). A reverse-method three-dimensional Monte Carlo transfer model is combined with a forward model for radiative transfer through the non-cloudy atmosphere (the RFM, or Reference Forward Model) to explicitly account for the effects of multiple scattering by the clouds. The ice cloud microphysics are characterised by a size distribution of randomly oriented ice aggregate crystals, with the single-scattering properties of the distribution obtained from T-matrix calculations. McClouds_FM can also be adapted to simulate multiple scattering by water clouds, by characterising the cloud microphysics by a size distribution of spheroids and using single-scattering properties calculated from Mie theory. Initial results comparing McClouds_FM simulations with real MIPAS spectra of cirrus show good agreement. Of particular interest are several noticeable spectral features (i.e. inverted H_2O lines) in the data which are replicated in the simulations and can only be explained by tropospheric radiation scattered into the line of sight by the cloud ice particles. McClouds_FM will be used in a retrieval scheme to determine cloud optical properties from both MIPAS and HIRDLS (HIgh Resolution Dynamic Limb Sounder) infrared limb observations. The RFM was developed by Dr Anu Dudhia at the Department of Atmospheric, Oceanic and Planetary Physics, University of Oxford (http://www.atm.ox.ac.uk/RFM).
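A hedged one-dimensional sketch of the multiple-scattering effect (far simpler than the 3-D McClouds_FM model; the slab geometry and parameters are invented for illustration): photons take exponentially distributed free paths through a cloud layer, and isotropic scattering lets some radiation reach the far side that a pure-absorption calculation would miss — the same mechanism by which cloud particles scatter tropospheric radiation into the limb line of sight.

```python
import math
import random

def slab_transmittance(tau, omega0, n_photons=20000, seed=2):
    """1-D Monte Carlo: photons enter a slab of optical depth tau at
    normal incidence; free paths are Exp(1) in optical depth units,
    collisions scatter isotropically with single-scattering albedo
    omega0 or absorb.  Returns the fraction emerging from the far side."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_photons):
        z, mu = 0.0, 1.0            # optical-depth coordinate, direction cosine
        while True:
            z += mu * (-math.log(1.0 - rng.random()))  # sample free path
            if z >= tau:
                transmitted += 1     # emerged from the bottom of the slab
                break
            if z < 0.0:
                break                # escaped back out the top
            if rng.random() > omega0:
                break                # absorbed at the collision
            mu = 2.0 * rng.random() - 1.0  # isotropic re-emission
    return transmitted / n_photons

t_direct = slab_transmittance(1.0, 0.0)  # pure absorption: Beer-Lambert exp(-1)
t_cloud = slab_transmittance(1.0, 1.0)   # conservative scattering adds diffuse flux
```

The scattering case transmits more than exp(−τ) because multiply scattered photons can still escape, which is exactly the contribution that a clear-sky-only forward model like the RFM cannot represent on its own.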
Planetary Geologic Mapping Handbook - 2010. Appendix
Tanaka, K. L.; Skinner, J. A., Jr.; Hare, T. M.
2010-01-01
Geologic maps present, in an historical context, fundamental syntheses of interpretations of the materials, landforms, structures, and processes that characterize planetary surfaces and shallow subsurfaces. Such maps also provide a contextual framework for summarizing and evaluating thematic research for a given region or body. In planetary exploration, for example, geologic maps are used for specialized investigations such as targeting regions of interest for data collection and for characterizing sites for landed missions. Whereas most modern terrestrial geologic maps are constructed from regional views provided by remote sensing data and supplemented in detail by field-based observations and measurements, planetary maps have been largely based on analyses of orbital photography. For planetary bodies in particular, geologic maps commonly represent a snapshot of a surface, because they are based on available information at a time when new data are still being acquired. Thus the field of planetary geologic mapping has been evolving rapidly to embrace the use of new data and modern technology and to accommodate the growing needs of planetary exploration. Planetary geologic maps have been published by the U.S. Geological Survey (USGS) since 1962. Over this time, numerous maps of several planetary bodies have been prepared at a variety of scales and projections using the best available image and topographic bases. Early geologic map bases commonly consisted of hand-mosaicked photographs or airbrushed shaded-relief views and geologic linework was manually drafted using mylar bases and ink drafting pens. Map publishing required a tedious process of scribing, color peel-coat preparation, typesetting, and photo-laboratory work. Beginning in the 1990s, inexpensive computing, display capability and user-friendly illustration software allowed maps to be drawn using digital tools rather than pen and ink, and mylar bases became obsolete. Terrestrial geologic maps published by
Radiation Damage in Electronic Memory Devices
Directory of Open Access Journals (Sweden)
Irfan Fetahović
2013-01-01
This paper investigates the behavior of semiconductor memories exposed to radiation, in order to establish their applicability in a radiation environment. An experimental procedure was used to test the radiation hardness of commercial semiconductor memories. Different types of memory chips were exposed to indirectly ionizing radiation at varying dose intensities. The effect of directly ionizing radiation on semiconductor memory behavior was analyzed using the Monte Carlo simulation method. The results obtained show that gamma radiation causes a decrease in threshold voltage proportional to the absorbed dose. Monte Carlo simulation of radiation interaction with the material proved significant and can be a good estimation tool for probing semiconductor memory behavior in a radiation environment.
(U) Introduction to Monte Carlo Methods
Energy Technology Data Exchange (ETDEWEB)
Hungerford, Aimee L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-03-20
Monte Carlo methods are very valuable for representing solutions to particle transport problems. Here we describe a “cook book” approach to handling the terms in a transport equation using Monte Carlo methods. Focus is on the mechanics of a numerical Monte Carlo code, rather than the mathematical foundations of the method.
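In the spirit of the cook-book mechanics described (a generic sketch, not the report's own code), the two elementary sampling steps of a Monte Carlo transport history are drawing the free-flight distance by inverting the exponential collision kernel, and selecting the reaction type in proportion to its macroscopic cross section:

```python
import math
import random

def sample_free_path(sigma_t, rng):
    """Distance to the next collision follows p(s) = sigma_t * exp(-sigma_t * s);
    inverting the CDF gives s = -ln(xi) / sigma_t for xi uniform on (0, 1]."""
    return -math.log(1.0 - rng.random()) / sigma_t

def select_reaction(sigma_s, sigma_a, rng):
    """Choose the collision outcome with probability proportional to its
    cross section: scatter with sigma_s / (sigma_s + sigma_a), else absorb."""
    return "scatter" if rng.random() < sigma_s / (sigma_s + sigma_a) else "absorb"

rng = random.Random(0)
paths = [sample_free_path(2.0, rng) for _ in range(100000)]
mean_path = sum(paths) / len(paths)       # approaches 1 / sigma_t = 0.5

frac_scatter = sum(select_reaction(3.0, 1.0, rng) == "scatter"
                   for _ in range(100000)) / 100000   # approaches 0.75
```

A full transport code is essentially these two draws inside a loop, plus geometry tracking and tallies; the rest of the "cook book" is how each remaining term of the transport equation maps onto such a sampling step.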
Geophysics of Small Planetary Bodies
Asphaug, Erik I.
1998-01-01
As a SETI Institute PI from 1996-1998, Erik Asphaug studied impact and tidal physics and other geophysical processes associated with small (low-gravity) planetary bodies. This work included: a numerical impact simulation linking basaltic achondrite meteorites to asteroid 4 Vesta (Asphaug 1997), which laid the groundwork for an ongoing study of Martian meteorite ejection; cratering and catastrophic evolution of small bodies (with implications for their internal structure; Asphaug et al. 1996); genesis of grooved and degraded terrains in response to impact; maturation of regolith (Asphaug et al. 1997a); and the variation of crater outcome with impact angle, speed, and target structure. Research on impacts into porous, layered and prefractured targets (Asphaug et al. 1997b, 1998a) showed how shape, rheology and structure dramatically affect the sizes and velocities of ejecta, and the survivability and impact modification of comets and asteroids (Asphaug et al. 1998a). As an affiliate of the Galileo SSI Team, the PI studied problems related to cratering, tectonics, and regolith evolution, including an estimate of the impactor flux around Jupiter and the effect of impact on local and regional tectonics (Asphaug et al. 1998b). Other research included tidal breakup modeling (Asphaug and Benz 1996; Schenk et al. 1996), which is leading to a general understanding of the role of tides in planetesimal evolution. As a Guest Computational Investigator for NASA's BPCC/ESS supercomputer testbed, the PI helped graft SPH3D onto an existing tree code tuned for the massively parallel Cray T3E (Olson and Asphaug, in preparation), obtaining a factor-of-1000 speedup in code execution time (on 512 CPUs). Runs which once took months are now completed in hours.
Monte Carlo modelling of TRIGA research reactor
International Nuclear Information System (INIS)
El Bakkari, B.; Nacir, B.; El Bardouni, T.; El Younoussi, C.; Merroun, O.; Htet, A.; Boulaich, Y.; Zoubair, M.; Boukhal, H.; Chakir, M.
2010-01-01
The Moroccan 2 MW TRIGA MARK II research reactor at the Centre des Etudes Nucleaires de la Maamora (CENM) achieved initial criticality on May 2, 2007. The reactor is designed to support the various fields of basic nuclear research, manpower training, and the production of radioisotopes for use in agriculture, industry, and medicine. This study deals with the neutronic analysis of the 2-MW TRIGA MARK II research reactor at CENM and validation of the results by comparison with experimental, operational, and available final safety analysis report (FSAR) values. The study was prepared in collaboration between the Laboratory of Radiation and Nuclear Systems (ERSN-LMR) of the Faculty of Sciences of Tetuan (Morocco) and CENM. The 3-D continuous-energy Monte Carlo code MCNP (version 5) was used to develop a versatile and accurate full model of the TRIGA core. The model represents in detail all components of the core, with essentially no physical approximation. Continuous-energy cross-section data from the most recent nuclear data evaluations (ENDF/B-VI.8, ENDF/B-VII.0, JEFF-3.1, and JENDL-3.3), as well as the S(α, β) thermal neutron scattering functions distributed with the MCNP code, were used. The cross-section libraries were generated using the NJOY99 system updated to its most recent patch file, 'up259'. The consistency and accuracy of both the Monte Carlo simulation and the neutron transport physics were established by benchmarking against the TRIGA experiments. Core excess reactivity, total and integral control rod worths, as well as power peaking factors, were used in the validation process. Results of the calculations are analysed and discussed.
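As a small aside on the validation quantities named above (a standard convention, not code from the paper): excess reactivity and control rod worths are derived from the Monte Carlo k_eff values via ρ = (k_eff − 1)/k_eff, commonly quoted in pcm (10^−5). A minimal helper, with illustrative k_eff inputs:

```python
def reactivity_pcm(keff):
    """Reactivity rho = (k_eff - 1) / k_eff, expressed in pcm (1e-5)."""
    return (keff - 1.0) / keff * 1e5

def rod_worth_pcm(keff_withdrawn, keff_inserted):
    """Integral rod worth as the reactivity difference between the
    rod-withdrawn and rod-inserted core states."""
    return reactivity_pcm(keff_withdrawn) - reactivity_pcm(keff_inserted)

# hypothetical MCNP results for the two rod states
excess = reactivity_pcm(1.005)          # about +498 pcm excess reactivity
worth = rod_worth_pcm(1.005, 0.995)     # about 1000 pcm integral worth
```

Comparing such derived reactivities, rather than raw k_eff values, is what allows direct benchmarking against the measured and FSAR figures mentioned in the abstract.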