WorldWideScience

Sample records for codes phits fluka

  1. SU-E-T-254: Optimization of GATE and PHITS Monte Carlo Code Parameters for Uniform Scanning Proton Beam Based On Simulation with FLUKA General-Purpose Code

    Energy Technology Data Exchange (ETDEWEB)

    Kurosu, K [Department of Radiation Oncology, Osaka University Graduate School of Medicine, Osaka (Japan); Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Osaka (Japan); Takashina, M; Koizumi, M [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Osaka (Japan); Das, I; Moskvin, V [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN (United States)

    2014-06-01

    Purpose: Monte Carlo codes are becoming important tools for proton beam dosimetry. However, the relationships between the customizing parameters of the GATE and PHITS codes and the percentage depth dose (PDD) have not been reported; they are studied here for PDD and proton range in comparison with the FLUKA code and experimental data. Methods: The beam delivery system of the Indiana University Health Proton Therapy Center was modeled for the uniform scanning beam in FLUKA and transferred identically into GATE and PHITS. This computational model was built from the blueprint and validated with the commissioning data. The three parameters evaluated are the maximum step size, the cut-off energy, and the physics and transport models. The dependence of the PDDs on the customizing parameters was compared with the published results of previous studies. Results: The optimal parameters for the simulation of the whole beam delivery system were defined by referring to the calculation results obtained with each parameter. Although the PDDs from FLUKA and the experimental data show good agreement, those of GATE and PHITS obtained with our optimal parameters show a minor discrepancy. The measured proton range R90 was 269.37 mm, compared to calculated ranges of 269.63 mm, 268.96 mm, and 270.85 mm with FLUKA, GATE and PHITS, respectively. Conclusion: We evaluated the dependence of the PDDs obtained with the GATE and PHITS general-purpose Monte Carlo codes on the customizing parameters by using the whole computational model of the treatment nozzle. The optimal parameters for the simulation were then defined by referring to the calculation results. The physics model, particle transport mechanics and the different geometry-based descriptions need accurate customization in the three simulation codes to agree with experimental data for artifact-free Monte Carlo simulation. This study was supported by Grants-in-Aid for Cancer Research (H22-3rd Term Cancer Control-General-043) from the Ministry of Health
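
    The distal range metric R90 quoted above can be read off a sampled depth-dose curve by locating where the dose falls to 90% of its maximum beyond the Bragg peak. Below is a minimal sketch in Python, using a made-up depth-dose curve rather than the study's commissioning data (all function and variable names are illustrative only):

      import numpy as np

      def distal_r90(depth_mm, dose):
          """Depth (mm) at which the dose falls to 90% of its maximum on the distal side."""
          dose = np.asarray(dose, dtype=float)
          d90 = 0.9 * dose.max()
          i_peak = int(np.argmax(dose))
          # walk distally from the Bragg peak until the dose drops below the 90% level
          for i in range(i_peak, dose.size - 1):
              if dose[i] >= d90 >= dose[i + 1]:
                  frac = (dose[i] - d90) / (dose[i] - dose[i + 1])   # linear interpolation
                  return depth_mm[i] + frac * (depth_mm[i + 1] - depth_mm[i])
          return depth_mm[-1]

      # toy Bragg-like curve on a 0.5 mm depth grid (illustrative only)
      z = np.linspace(0.0, 300.0, 601)
      pdd = 0.35 + 0.1 * z / 300.0 + np.exp(-((z - 269.0) / 6.0) ** 2)
      print(f"R90 = {distal_r90(z, pdd):.2f} mm")

    Applying the same routine to measured and simulated PDDs gives the kind of range comparison reported above (269.37 mm measured versus 269.63, 268.96 and 270.85 mm calculated).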

  2. Optimization of GATE and PHITS Monte Carlo code parameters for spot scanning proton beam based on simulation with FLUKA general-purpose code

    Energy Technology Data Exchange (ETDEWEB)

    Kurosu, Keita [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Department of Radiation Oncology, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Department of Radiology, Osaka University Hospital, Suita, Osaka 565-0871 (Japan); Das, Indra J. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Moskvin, Vadim P. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Department of Radiation Oncology, St. Jude Children’s Research Hospital, Memphis, TN 38105 (United States)

    2016-01-15

    Spot scanning, owing to its superior dose-shaping capability, provides unsurpassed dose conformity, in particular for complex targets. However, the robustness of the delivered dose distribution and prescription has to be verified. Monte Carlo (MC) simulation has the potential to offer significant advantages for high-precision particle therapy, especially for media containing inhomogeneities. However, the choice of computational parameters in the MC simulation codes GATE, PHITS and FLUKA, already shown to matter for the uniform scanning proton beam, needs to be evaluated; that is, the relationship between the input parameters and the calculation results should be carefully scrutinized. The objective of this study was, therefore, to determine the optimal parameters for the spot scanning proton beam for both the GATE and PHITS codes by using data from FLUKA simulation as a reference. The proton beam scanning system of the Indiana University Health Proton Therapy Center was modeled in FLUKA, and the geometry was subsequently and identically transferred to GATE and PHITS. Although the beam transport is managed by the spot scanning system, the spot location is always set at the center of a 600 × 600 × 300 mm³ water phantom placed after the treatment nozzle. The percentage depth dose (PDD) is computed along the central axis using 0.5 × 0.5 × 0.5 mm³ voxels in the water phantom. The PDDs and the proton ranges obtained with several computational parameters are then compared to those of FLUKA, and optimal parameters are determined from the accuracy of the proton range, suppressed dose deviation, and computational time minimization. Our results indicate that the optimized parameters are different from those for uniform scanning, suggesting that a single gold standard for setting computational parameters for any proton therapy application cannot be determined, since the impact of the parameter settings depends on the proton irradiation technique
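
    The scoring and ranking described above can be pictured with a short sketch: given a 3-D dose grid tallied in the 0.5 mm voxels of the water phantom, the central-axis PDD is extracted and compared to a reference (e.g. FLUKA) curve through a dose-deviation and a range-difference figure. This is a hypothetical illustration in Python, not the actual tally or optimization code of GATE, PHITS or FLUKA:

      import numpy as np

      def central_axis_pdd(dose_xyz, voxel_mm=0.5):
          """Central-axis PDD (%) from a 3-D dose array indexed [ix, iy, iz], beam along z."""
          nx, ny, _ = dose_xyz.shape
          axis = dose_xyz[nx // 2, ny // 2, :]              # central column of voxels
          depth = (np.arange(axis.size) + 0.5) * voxel_mm   # voxel-centre depths in mm
          return depth, 100.0 * axis / axis.max()

      def agreement_metrics(depth, pdd_test, pdd_ref):
          """Crude figures of merit for ranking a parameter set against a reference PDD."""
          dose_dev = float(np.max(np.abs(pdd_test - pdd_ref)))   # max point-wise deviation (%)
          r80 = lambda p: depth[p >= 80.0][-1]                   # distal depth of the 80% level
          range_diff = float(r80(pdd_test) - r80(pdd_ref))       # range difference (mm)
          return dose_dev, range_diff

    A parameter set would then be preferred when both metrics are small and the run time is acceptable, mirroring the three criteria listed in the abstract.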

  3. Optimization of GATE and PHITS Monte Carlo code parameters for spot scanning proton beam based on simulation with FLUKA general-purpose code

    International Nuclear Information System (INIS)

    Kurosu, Keita; Das, Indra J.; Moskvin, Vadim P.

    2016-01-01

    Spot scanning, owing to its superior dose-shaping capability, provides unsurpassed dose conformity, in particular for complex targets. However, the robustness of the delivered dose distribution and prescription has to be verified. Monte Carlo (MC) simulation has the potential to offer significant advantages for high-precision particle therapy, especially for media containing inhomogeneities. However, the choice of computational parameters in the MC simulation codes GATE, PHITS and FLUKA, already shown to matter for the uniform scanning proton beam, needs to be evaluated; that is, the relationship between the input parameters and the calculation results should be carefully scrutinized. The objective of this study was, therefore, to determine the optimal parameters for the spot scanning proton beam for both the GATE and PHITS codes by using data from FLUKA simulation as a reference. The proton beam scanning system of the Indiana University Health Proton Therapy Center was modeled in FLUKA, and the geometry was subsequently and identically transferred to GATE and PHITS. Although the beam transport is managed by the spot scanning system, the spot location is always set at the center of a 600 × 600 × 300 mm³ water phantom placed after the treatment nozzle. The percentage depth dose (PDD) is computed along the central axis using 0.5 × 0.5 × 0.5 mm³ voxels in the water phantom. The PDDs and the proton ranges obtained with several computational parameters are then compared to those of FLUKA, and optimal parameters are determined from the accuracy of the proton range, suppressed dose deviation, and computational time minimization. Our results indicate that the optimized parameters are different from those for uniform scanning, suggesting that a single gold standard for setting computational parameters for any proton therapy application cannot be determined, since the impact of the parameter settings depends on the proton irradiation technique

  4. Benchmarking Heavy Ion Transport Codes FLUKA, HETC-HEDS, MARS15, MCNPX, and PHITS

    Energy Technology Data Exchange (ETDEWEB)

    Ronningen, Reginald Martin [Michigan State University; Remec, Igor [Oak Ridge National Laboratory; Heilbronn, Lawrence H. [University of Tennessee-Knoxville

    2013-06-07

    Powerful accelerators such as spallation neutron sources, muon-collider/neutrino facilities, and rare isotope beam facilities must be designed so that they handle the beam power reliably and safely, and they must be optimized to yield maximum performance relative to their design requirements. The simulation codes used for design purposes must produce reliable results. If not, component and facility designs can become costly, have limited lifetime and usefulness, and could even be unsafe. The objective of this proposal is to assess the performance of the currently available codes PHITS, FLUKA, MARS15, MCNPX, and HETC-HEDS that could be used for design simulations involving heavy ion transport. We plan to assess their performance by performing simulations and comparing results against experimental data of benchmark quality. Quantitative knowledge of the biases and uncertainties of the simulations is essential, as this potentially impacts the safe, reliable and cost-effective design of any future radioactive ion beam facility. Further benchmarking of heavy-ion transport codes was one of the actions recommended in the Report of the 2003 RIA R&D Workshop.

  5. Simulation of ALTEA calibration data with PHITS, FLUKA and GEANT4

    International Nuclear Information System (INIS)

    La Tessa, C.; Di Fino, L.; Larosa, M.; Lee, K.; Mancusi, D.; Matthiae, D.; Narici, L.; Zaconte, V.

    2009-01-01

    The ALTEA-Space detector has been calibrated by testing its response to several monochromatic beams. These measurements provided energy-deposition spectra in silicon for 100, 600 and 1000 MeV/nucleon ¹²C and 200 and 600 MeV/nucleon ⁴⁸Ti. The results have been compared to three Monte Carlo transport codes, namely PHITS, GEANT4 and FLUKA. The median, full width at half maximum (FWHM) and interquartile range (IQR) have been calculated for all datasets to characterize the location, width and asymmetry of the energy-deposition spectra. Particular attention has been devoted to the influence of δ rays on the shape of the energy-deposition spectrum, with the help of both analytical calculations and Monte Carlo simulations. The two approaches proved that, in this range of projectile charge, projectile energy and detector size, the leakage of secondary electrons might introduce a difference between the energy-loss and energy-deposition spectra, in particular by changing the location, width and symmetry of the distribution. The overall agreement between the Monte Carlo predictions and the measurements is fair and makes PHITS, FLUKA and GEANT4 all possible candidates for simulating the ALTEA-Space experiment.
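
    The three spectrum descriptors named above (median, FWHM and IQR) can be computed directly from the event-by-event energy deposits. A minimal sketch in Python with synthetic data, not the ALTEA calibration spectra:

      import numpy as np

      def spectrum_stats(edep_keV, bins=200):
          """Median, FWHM and interquartile range of an energy-deposition spectrum."""
          edep = np.asarray(edep_keV, dtype=float)
          median = np.percentile(edep, 50.0)
          iqr = np.percentile(edep, 75.0) - np.percentile(edep, 25.0)
          counts, edges = np.histogram(edep, bins=bins)     # FWHM taken from a histogram
          centers = 0.5 * (edges[:-1] + edges[1:])
          above = centers[counts >= 0.5 * counts.max()]
          fwhm = above.max() - above.min() if above.size else 0.0
          return median, fwhm, iqr

      rng = np.random.default_rng(0)                        # synthetic, roughly Gaussian deposits
      print(spectrum_stats(rng.normal(loc=300.0, scale=15.0, size=10000)))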

  6. Residual activity evaluation: a benchmark between ANITA, FISPACT, FLUKA and PHITS codes

    Science.gov (United States)

    Firpo, Gabriele; Viberti, Carlo Maria; Ferrari, Anna; Frisoni, Manuela

    2017-09-01

    The activity of residual nuclides dictates the radiation fields during periodic inspections and repairs (maintenance periods) and dismantling operations (decommissioning phase) of accelerator facilities (medical, industrial, research) and nuclear reactors. Therefore, correct prediction of material activation allows for more accurate planning of these activities, in line with the ALARA (As Low As Reasonably Achievable) principles. The scope of the present work is to show the results of a comparison of the residual total specific activity at a set of cooling times (from zero up to 10 years after irradiation) as obtained by two analytical codes (FISPACT and ANITA) and two Monte Carlo codes (FLUKA and PHITS), each using its default nuclear data libraries. A set of 40 irradiation scenarios is considered, i.e. neutrons and protons of different energies, ranging from zero to many hundreds of MeV, impinging on pure elements or on materials of standard composition typically used in industrial applications (namely, AISI SS316 and Portland concrete). In some cases, experimental results were also available for a more thorough benchmark.
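
    In its simplest form, the quantity being benchmarked is a sum of exponential decays over the nuclides present at the end of irradiation. A minimal sketch in Python, ignoring daughter build-up and decay chains (which the cited codes do treat) and using purely illustrative numbers:

      import math

      def residual_activity(inventory, cooling_times_s):
          """Total residual activity (Bq) at each cooling time, from end-of-irradiation
          activities and half-lives per nuclide (simple decay, no daughter build-up)."""
          totals = []
          for t in cooling_times_s:
              total = 0.0
              for a0_Bq, half_life_s in inventory.values():
                  total += a0_Bq * math.exp(-math.log(2.0) * t / half_life_s)
              totals.append(total)
          return totals

      # hypothetical end-of-irradiation inventory for an activated steel sample
      year = 3.156e7
      inventory = {"Co-60": (2.0e5, 5.27 * year),
                   "Mn-54": (8.0e4, 312.0 * 86400.0),
                   "Fe-55": (5.0e5, 2.74 * year)}
      print(residual_activity(inventory, [0.0, 3600.0, 30 * 86400.0, year, 10 * year]))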

  7. Optimization of GATE and PHITS Monte Carlo code parameters for uniform scanning proton beam based on simulation with FLUKA general-purpose code

    Energy Technology Data Exchange (ETDEWEB)

    Kurosu, Keita [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Department of Radiation Oncology, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Takashina, Masaaki; Koizumi, Masahiko [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Das, Indra J. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Moskvin, Vadim P., E-mail: vadim.p.moskvin@gmail.com [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States)

    2014-10-01

    Although the three general-purpose Monte Carlo (MC) simulation tools Geant4, FLUKA and PHITS have been used extensively, differences in their calculation results have been reported. The major causes are the implementation of the physical models, the preset value of the ionization potential, and the definition of the maximum step size. In order to achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed optimized lists, but those studies were performed with simple systems such as a water phantom only. Since particle beams undergo transport, interaction and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is therefore of major importance. The purpose of this study was to determine the optimized parameter lists for GATE and PHITS using a computational model of the proton treatment nozzle. The simulation was performed with the broad scanning proton beam. The influence of the customizing parameters on the percentage depth dose (PDD) profile and the proton range was investigated by comparison with the FLUKA results, and the optimal parameters were then determined. The PDD profile and the proton range obtained with our optimized parameter list showed different characteristics from the results obtained with a simple system. This led to the conclusion that the physics models, particle transport mechanics and the different geometry-based descriptions need accurate customization when planning computational experiments for artifact-free MC simulation.

  8. Optimization of GATE and PHITS Monte Carlo code parameters for uniform scanning proton beam based on simulation with FLUKA general-purpose code

    International Nuclear Information System (INIS)

    Kurosu, Keita; Takashina, Masaaki; Koizumi, Masahiko; Das, Indra J.; Moskvin, Vadim P.

    2014-01-01

    Although the three general-purpose Monte Carlo (MC) simulation tools Geant4, FLUKA and PHITS have been used extensively, differences in their calculation results have been reported. The major causes are the implementation of the physical models, the preset value of the ionization potential, and the definition of the maximum step size. In order to achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed optimized lists, but those studies were performed with simple systems such as a water phantom only. Since particle beams undergo transport, interaction and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is therefore of major importance. The purpose of this study was to determine the optimized parameter lists for GATE and PHITS using a computational model of the proton treatment nozzle. The simulation was performed with the broad scanning proton beam. The influence of the customizing parameters on the percentage depth dose (PDD) profile and the proton range was investigated by comparison with the FLUKA results, and the optimal parameters were then determined. The PDD profile and the proton range obtained with our optimized parameter list showed different characteristics from the results obtained with a simple system. This led to the conclusion that the physics models, particle transport mechanics and the different geometry-based descriptions need accurate customization when planning computational experiments for artifact-free MC simulation

  9. Inter-comparison of Dose Distributions Calculated by FLUKA, GEANT4, MCNP, and PHITS for Proton Therapy

    Science.gov (United States)

    Yang, Zi-Yi; Tsai, Pi-En; Lee, Shao-Chun; Liu, Yen-Chiang; Chen, Chin-Cheng; Sato, Tatsuhiko; Sheu, Rong-Jiun

    2017-09-01

    The dose distributions from proton pencil beam scanning were calculated with FLUKA, GEANT4, MCNP, and PHITS in order to investigate their applicability in proton radiotherapy. The first case studied was the integrated depth dose curves (IDDCs) from a 100 and a 226 MeV proton pencil beam impinging on a water phantom. The calculated IDDCs agree with each other as long as each code employs 75 eV for the ionization potential of water. The second case considered the same setup but with the proton energies following a Gaussian distribution. The comparison to measurement indicates that the inter-code differences may be due not only to different stopping powers but also to the nuclear physics models. How the physics parameter settings affect the computation time is also discussed. In the third case, the applicability of each code to pencil beam scanning was confirmed by delivering a uniform volumetric dose distribution based on the treatment plan; the results showed general agreement among the codes, the treatment plan, and the measurement, except for some deviations in the penumbra region. This study demonstrates that the selected codes are all capable of performing dose calculations for therapeutic scanning proton beams with proper physics settings.
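
    The third case above rests on the superposition principle of spot scanning: a uniform field is built up by summing many narrow, roughly Gaussian pencil beams with planned weights. A simplified 2-D lateral illustration in Python, assuming Gaussian spots on an equally weighted grid rather than the actual treatment plan:

      import numpy as np

      def scanned_field_dose(x_mm, y_mm, spots_mm, weights, sigma_mm=5.0):
          """Lateral dose (arbitrary units) at points (x, y) from weighted Gaussian spots."""
          x = np.asarray(x_mm, float)[:, None]
          y = np.asarray(y_mm, float)[:, None]
          sx, sy = spots_mm[:, 0][None, :], spots_mm[:, 1][None, :]
          g = np.exp(-((x - sx) ** 2 + (y - sy) ** 2) / (2.0 * sigma_mm ** 2))
          return g @ np.asarray(weights, float)

      # roughly uniform square field from a 5 mm grid of equally weighted spots
      spots = np.array([(i, j) for i in range(-20, 25, 5) for j in range(-20, 25, 5)], float)
      xs = np.linspace(-15.0, 15.0, 7)
      print(scanned_field_dose(xs, np.zeros_like(xs), spots, np.ones(len(spots))))

    In this toy model the dose is nearly flat inside the field and falls off at its edge, which is the penumbra region where the codes were reported to deviate from one another.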

  10. Particle and heavy ion transport code system; PHITS

    International Nuclear Information System (INIS)

    Niita, Koji

    2004-01-01

    Intermediate- and high-energy nuclear data are strongly required in design studies of many facilities such as accelerator-driven systems and intense pulsed spallation neutron sources, and also in medical and space technology. There are, however, few evaluated nuclear data for intermediate- and high-energy nuclear reactions. Therefore, we have to use models or systematics for the cross sections, which are essential ingredients of high-energy particle and heavy ion transport codes, to estimate the neutron yield, heat deposition and many other quantities of the transport phenomena in materials. We have developed a general-purpose particle and heavy ion transport Monte Carlo code system, PHITS (Particle and Heavy Ion Transport code System), based on the NMTC/JAM code, through a collaboration of Tohoku University, JAERI and RIST. PHITS has three important ingredients which enable us to calculate (1) high-energy nuclear reactions up to 200 GeV, (2) heavy ion collisions and their transport in materials, and (3) low-energy neutron transport based on evaluated nuclear data. In PHITS, the cross sections of high-energy nuclear reactions are obtained with the JAM model. JAM (Jet AA Microscopic Transport Model) is a hadronic cascade model which explicitly treats all established hadronic states, including resonances, with all hadron-hadron cross sections parametrized on the basis of the resonance model and string model by fitting the available experimental data. PHITS can describe the transport of heavy ions and their collisions by making use of the JQMD and SPAR codes. JQMD (JAERI Quantum Molecular Dynamics) is a simulation code for nucleus-nucleus collisions based on molecular dynamics. The SPAR code is widely used to calculate the stopping powers and ranges of charged particles and heavy ions. PHITS also includes part of the MCNP4C code, by which the transport of low-energy neutrons, photons and electrons based on evaluated nuclear data can be described. Furthermore, the high energy nuclear

  11. Applicability of the PHITS code to a tokamak fusion device

    International Nuclear Information System (INIS)

    Sukegawa, Atsuhiko; Okuno, Koichi; Kawasaki, Hiromitsu

    2011-01-01

    The three-dimensional Monte Carlo code PHITS (Particle and Heavy Ion Transport code System) has been employed to perform radiation transport analysis, design of the radiation shields, and neutronics calculations for tokamak-type D-D fusion reactors. A subroutine was included in PHITS to represent the toroidal neutron source of 2.45 MeV neutrons from the D-D reaction. Here, an example of preliminary tests using PHITS is given. (author)

  12. PHITS-a particle and heavy ion transport code system

    International Nuclear Information System (INIS)

    Niita, Koji; Sato, Tatsuhiko; Iwase, Hiroshi; Nose, Hiroyuki; Nakashima, Hiroshi; Sihver, Lembit

    2006-01-01

    The paper presents a summary of the recent development of the multi-purpose Monte Carlo Particle and Heavy Ion Transport code System, PHITS. In particular, we discuss in detail the development of two new models, JAM and JQMD, for high-energy particle interactions, incorporated in PHITS, and show comparisons between model calculations and experiments for the validation of these models. The paper presents three applications of the code: spallation neutron sources, heavy ion therapy and space radiation. The results and examples shown indicate that PHITS is capable of carrying out radiation transport analysis of almost all particles, including heavy ions, within a wide energy range

  13. FLUKA: A Multi-Particle Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Ferrari, A.; Sala, P.R.; /CERN /INFN, Milan; Fasso, A.; /SLAC; Ranft, J.; /Siegen U.

    2005-12-14

    This report describes the 2005 version of the Fluka particle transport code. The first part introduces the basic notions, describes the modular structure of the system, and contains an installation and beginner's guide. The second part complements this initial information with details about the various components of Fluka and how to use them. It concludes with a detailed history and bibliography.

  14. Features of the latest version of the PHITS code

    International Nuclear Information System (INIS)

    Sato, Tatsuhiko; Matsuda, Norihiro; Hashimoto, Shintaro; Iwamoto, Yosuke; Noda, Shusaku; Ogawa, Tatsuhiko; Nakajima, Hiroshi; Fukahori, Tokio; Okumura, Keisuke; Kai, Tetsuya; Chiba, Satoshi; Niita, Koji; Iwase, Hiroshi; Furuta, Takuya

    2013-01-01

    An overview is presented of the multi-purpose Monte Carlo Particle and Heavy Ion Transport code System, PHITS, developed through collaborations between JAEA and several institutes in Japan and Europe and recently upgraded and released as PHITS2.52. In the new version, higher simulation accuracy was achieved by implementing the latest nuclear reaction models, such as the Liege intra-nuclear cascade version 4.6 (INCL4.6) and a statistical multi-fragmentation model, alongside JAM and JQMD for the high-energy region. The reliability of the simulation was improved by modifying the algorithms for the electron-, positron-, and photon-transport simulations, and the contents of the package, such as the attached data libraries, were also updated. More than 800 researchers have been registered as PHITS users, and they apply the code to various research and development fields such as nuclear technology, accelerator design, medical physics, and cosmic-ray research. (S. Ohno)

  15. PHITS: Particle and heavy ion transport code system, version 2.23

    International Nuclear Information System (INIS)

    Niita, Koji; Matsuda, Norihiro; Iwamoto, Yosuke; Sato, Tatsuhiko; Nakashima, Hiroshi; Sakamoto, Yukio; Iwase, Hiroshi; Sihver, Lembit

    2010-10-01

    A Particle and Heavy-Ion Transport code System, PHITS, has been developed under the collaboration of JAEA (Japan Atomic Energy Agency), RIST (Research Organization for Information Science and Technology) and KEK (High Energy Accelerator Research Organization). PHITS can deal with the transport of all particles (nucleons, nuclei, mesons, photons, and electrons) over wide energy ranges, using several nuclear reaction models and nuclear data libraries. The geometrical configuration of the simulation can be set with GG (General Geometry) or CG (Combinatorial Geometry). Various quantities such as heat deposition, track length and production yields can be deduced from the simulation, using implemented estimator functions called 'tallies'. The code also has a function to draw 2D and 3D figures of the calculated results as well as the setup geometries, using the code ANGEL. Because of these features, PHITS has been widely used for various purposes such as accelerator shielding design, radiation therapy and space exploration. Recently, an event generator was introduced into PHITS for the particle transport in the low-energy region; the code was accordingly rewritten to incorporate this event generator for neutron-induced reactions at energies below 20 MeV. Furthermore, several new tallies were incorporated for estimation of the relative biological effects. This document provides a manual for the new PHITS. (author)

  16. New scope covered by PHITS. Particle and heavy ion transport code system

    International Nuclear Information System (INIS)

    Nakamura, Takashi; Niita, Koji; Iwase, Hiroshi; Sato, Tatsuhiko

    2006-01-01

    PHITS is a general-purpose high-energy transport code covering hadrons to heavy ions, built by embedding the JQMD code in NMTC/JAM. An outline of PHITS and many application examples are given. PHITS has been used for the shielding calculations of J-PARC, GSI, RIA and BigRIPS, and good results were reported. Evaluation of the radiation exposure of astronauts and aircrew, proton and heavy ion therapy, and estimation of the soft-error frequency of semiconductors are explained as application examples. The relation between the event generator and the Monte Carlo method, and future prospects, are described. (S.Y.)

  17. Applications of FLUKA Monte Carlo code for nuclear and accelerator physics

    CERN Document Server

    Battistoni, Giuseppe; Brugger, Markus; Campanella, Mauro; Carboni, Massimo; Empl, Anton; Fasso, Alberto; Gadioli, Ettore; Cerutti, Francesco; Ferrari, Alfredo; Ferrari, Anna; Lantz, Matthias; Mairani, Andrea; Margiotta, M; Morone, Christina; Muraro, Silvia; Parodi, Katerina; Patera, Vincenzo; Pelliccioni, Maurizio; Pinsky, Lawrence; Ranft, Johannes; Roesler, Stefan; Rollet, Sofia; Sala, Paola R; Santana, Mario; Sarchiapone, Lucia; Sioli, Maximiliano; Smirnov, George; Sommerer, Florian; Theis, Christian; Trovati, Stefania; Villari, R; Vincke, Heinz; Vincke, Helmut; Vlachoudis, Vasilis; Vollaire, Joachim; Zapp, Neil

    2011-01-01

    FLUKA is a general-purpose Monte Carlo code capable of handling all radiation components from thermal energies (for neutrons) or 1 keV (for all other particles) up to cosmic-ray energies, and can be applied in many different fields. Presently the code is maintained on Linux. The validity of the physical models implemented in FLUKA has been benchmarked against a variety of experimental data over a wide energy range, from accelerator data to cosmic-ray showers in the Earth's atmosphere. FLUKA is widely used for studies related both to basic research and to applications in particle accelerators, radiation protection and dosimetry, including the specific issue of radiation damage in space missions, radiobiology (including radiotherapy) and cosmic-ray calculations. After a short description of the main features that make FLUKA valuable for these topics, the present paper summarizes some of the recent applications of the FLUKA Monte Carlo code in nuclear as well as high-energy physics. In particular it addresses such top...

  18. A Calculation Method of PKA, KERMA and DPA from Evaluated Nuclear Data with an Effective Single-particle Emission Approximation (ESPEA) and Introduction of Event Generator Mode in PHITS Code

    International Nuclear Information System (INIS)

    Fukahori, Tokio; Iwamoto, Yosuke

    2012-01-01

    A method for calculating displacement damage from evaluated nuclear data files has been developed using an effective single-particle emission approximation (ESPEA). The ESPEA can be used effectively below about 50 MeV, where the multiplicity of emitted particles is small. These results are also reported in Ref. 24. The displacement calculation method in PHITS has also been developed. In the high-energy region (≥ 20 MeV) for proton and neutron beams, the DPA created by secondary particles increases due to nuclear reactions. For heavy-ion beams, the DPA created by the primaries dominates the total DPA due to the large Coulomb scattering cross sections. PHITS results agree with FLUKA results within a factor of 1.7. In the high-energy region above 10 MeV/nucleon, comparisons among codes and measurements of the displacement damage cross section are necessary. (author)
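
    For orientation, the standard NRT prescription commonly used to turn a PKA (recoil) spectrum into DPA is sketched below; the paper's own ESPEA-based prescription may differ in detail, and the notation here is generic rather than quoted from the reference.

      \nu_{\mathrm{NRT}}(T_{\mathrm{dam}}) =
      \begin{cases}
        0, & T_{\mathrm{dam}} < E_d,\\
        1, & E_d \le T_{\mathrm{dam}} < 2E_d/0.8,\\
        0.8\,T_{\mathrm{dam}}/(2E_d), & T_{\mathrm{dam}} \ge 2E_d/0.8,
      \end{cases}
      \qquad
      \sigma_{\mathrm{dpa}}(E) = \int \frac{d\sigma}{dT}(E,T)\,
      \nu_{\mathrm{NRT}}\bigl(T_{\mathrm{dam}}(T)\bigr)\,dT,
      \qquad
      \mathrm{DPA} = \sigma_{\mathrm{dpa}}\,\Phi

    Here T is the PKA energy, T_dam the damage energy remaining after the electronic-loss partition, E_d the displacement threshold energy of the material, and Phi the particle fluence.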

  19. FLUKA: A multi-particle transport code (program version 2005)

    CERN Document Server

    Ferrari, A; Fassò, A; Ranft, Johannes

    2005-01-01

    This report describes the 2005 version of the Fluka particle transport code. The first part introduces the basic notions, describes the modular structure of the system, and contains an installation and beginner’s guide. The second part complements this initial information with details about the various components of Fluka and how to use them. It concludes with a detailed history and bibliography.

  20. Modelling plastic scintillator response to gamma rays using light transport incorporated FLUKA code

    Energy Technology Data Exchange (ETDEWEB)

    Ranjbar Kohan, M. [Physics Department, Tafresh University, Tafresh (Iran, Islamic Republic of); Etaati, G.R. [Department of Nuclear Engineering and Physics, Amir Kabir University of Technology, Tehran (Iran, Islamic Republic of); Ghal-Eh, N., E-mail: ghal-eh@du.ac.ir [School of Physics, Damghan University, Damghan (Iran, Islamic Republic of); Safari, M.J. [Department of Energy Engineering, Sharif University of Technology, Tehran (Iran, Islamic Republic of); Afarideh, H. [Department of Nuclear Engineering and Physics, Amir Kabir University of Technology, Tehran (Iran, Islamic Republic of); Asadi, E. [Department of Physics, Payam-e-Noor University, Tehran (Iran, Islamic Republic of)

    2012-05-15

    The response function of the NE102 plastic scintillator to gamma rays has been simulated using a joint FLUKA+PHOTRACK Monte Carlo code. The multi-purpose particle transport code FLUKA has been responsible for the gamma transport, whilst the light transport code PHOTRACK has simulated the transport of scintillation photons through the scintillator and light guide. The simulation results for the plastic scintillator with and without light guides of different surface coverings have been successfully verified against experiments. Highlights: A multi-purpose code (FLUKA) and a light transport code (PHOTRACK) have been linked. The hybrid code has been used to generate the response function of an NE102 scintillator. The simulated response functions exhibit good agreement with experimental data.

  1. Particle and heavy ion transport code system, PHITS, version 2.52

    International Nuclear Information System (INIS)

    Sato, Tatsuhiko; Matsuda, Norihiro; Hashimoto, Shintaro; Iwamoto, Yosuke; Noda, Shusaku; Ogawa, Tatsuhiko; Nakashima, Hiroshi; Fukahori, Tokio; Okumura, Keisuke; Kai, Tetsuya; Niita, Koji; Iwase, Hiroshi; Chiba, Satoshi; Furuta, Takuya; Sihver, Lembit

    2013-01-01

    An upgraded version of the Particle and Heavy Ion Transport code System, PHITS2.52, was developed and released to the public. The new version has been greatly improved from the previously released version, PHITS2.24, in terms of not only the code itself but also the contents of its package, such as the attached data libraries. In the new version, a higher accuracy of simulation was achieved by implementing several latest nuclear reaction models. The reliability of the simulation was improved by modifying both the algorithms for the electron-, positron-, and photon-transport simulations and the procedure for calculating the statistical uncertainties of the tally results. Estimation of the time evolution of radioactivity became feasible by incorporating the activation calculation program DCHAIN-SP into the new package. The efficiency of the simulation was also improved as a result of the implementation of shared-memory parallelization and the optimization of several time-consuming algorithms. Furthermore, a number of new user-support tools and functions that help users to intuitively and effectively perform PHITS simulations were developed and incorporated. Due to these improvements, PHITS is now a more powerful tool for particle transport simulation applicable to various research and development fields, such as nuclear technology, accelerator design, medical physics, and cosmic-ray research. (author)

  2. The FLUKA Code: Developments and Challenges for High Energy and Medical Applications

    CERN Document Server

    Böhlen, T T; Chin, M P W; Fassò, A; Ferrari, A; Ortega, P G; Mairani, A; Sala, P R; Smirnov, G; Vlachoudis, V

    2014-01-01

    The FLUKA Monte Carlo code is used extensively at CERN for all beam-machine interactions, radioprotection calculations and facility design of forthcoming projects. Such needs require the code to be consistently reliable over the entire energy range (from MeV to TeV) for all projectiles (full suite of elementary particles and heavy ions). Outside CERN, among various applications worldwide, FLUKA serves as a core tool for the HIT and CNAO hadron-therapy facilities in Europe. Therefore, medical applications further impose stringent requirements in terms of reliability and predictive power, which demands constant refinement of sophisticated nuclear models and continuous code improvement. Some of the latest developments implemented in FLUKA are presented in this paper, with particular emphasis on issues and concerns pertaining to CERN and medical applications.

  3. Benchmark of neutron production cross sections with Monte Carlo codes

    Science.gov (United States)

    Tsai, Pi-En; Lai, Bo-Lun; Heilbronn, Lawrence H.; Sheu, Rong-Jiun

    2018-02-01

    Aiming to provide critical information for the fields of heavy ion therapy, radiation shielding in space, and facility design for heavy-ion research accelerators, the physics models in three Monte Carlo simulation codes (PHITS, FLUKA, and MCNP6) were systematically benchmarked against fifteen sets of experimental data for neutron production cross sections, covering various combinations of ¹²C, ²⁰Ne, ⁴⁰Ar, ⁸⁴Kr and ¹³²Xe projectiles and natLi, natC, natAl, natCu, and natPb target nuclides at incident energies between 135 MeV/nucleon and 600 MeV/nucleon. For neutron energies above 60% of the projectile energy per nucleon, LAQGSM03.03 in MCNP6, JQMD/JQMD-2.0 in PHITS, and RQMD-2.4 in FLUKA all show better agreement with data for heavy-projectile systems than for light-projectile systems, suggesting that the collective properties of projectile nuclei and nucleon interactions in the nucleus should be considered for light projectiles. For intermediate-energy neutrons, with energies below 60% of the projectile energy per nucleon and above 20 MeV, FLUKA is likely to overestimate the secondary neutron production, while MCNP6 tends towards underestimation. PHITS with JQMD shows a mild tendency for underestimation, but the JQMD-2.0 model, with a modified physics description for central collisions, generally improves the agreement between data and calculations. For low-energy neutrons (below 20 MeV), which are dominated by the evaporation mechanism, PHITS (which uses GEM linked with JQMD and JQMD-2.0) and FLUKA both tend to overestimate the production cross section, whereas MCNP6 tends to underestimate more systems than it overestimates. For total neutron production cross sections, the trends of the benchmark results over the entire energy range are similar to those seen in the dominant energy region. Also, the comparison of GEM coupled with either JQMD or JQMD-2.0 in the PHITS code indicates that the model used to describe the first

  4. The FLUKA code for space applications: recent developments

    CERN Document Server

    Andersen, V; Battistoni, G; Campanella, M; Carboni, M; Cerutti, F; Empl, A; Fassò, A; Ferrari, A; Gadioli, E; Garzelli, M V; Lee, K; Ottolenghi, A; Pelliccioni, M; Pinsky, L S; Ranft, J; Roesler, S; Sala, P R; Wilson, T L

    2004-01-01

    The FLUKA Monte Carlo transport code is widely used for fundamental research, radioprotection and dosimetry, hybrid nuclear energy systems and cosmic ray calculations. The validity of its physical models has been benchmarked against a variety of experimental data over a wide range of energies, from accelerator data to cosmic ray showers in the Earth's atmosphere. The code is presently undergoing several developments in order to better fit the needs of space applications. The generation of particle spectra according to up-to-date cosmic ray data, as well as the effect of solar and geomagnetic modulation, have been implemented and already successfully applied to a variety of problems. The implementation of suitable models for heavy ion nuclear interactions has reached an operational stage. At medium/high energy FLUKA uses the DPMJET model. The major task of incorporating heavy ion interactions from a few GeV/n down to the threshold for inelastic collisions is also progressing and promising results h...

  5. Use experience of FLUKA

    Energy Technology Data Exchange (ETDEWEB)

    Nakashima, Hiroshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-03-01

    In order to carry out the shield design calculations for the Large Hadron Collider (LHC) currently being planned at CERN, the CERN radiation group uses FLUKA (a Monte Carlo high-energy radiation transport code). An outline of FLUKA and experience with its use in the LHC-B detector shield design calculations for the LHC project are presented here. FLUKA can be said to represent the highest standard among the world's high-energy radiation transport codes in terms of its physical models, its Monte Carlo calculation techniques and its ease of use. The Japan Atomic Energy Research Institute (JAERI) has obtained the right to use FLUKA for target neutronics and facility shielding design at the neutron science research center, and it is expected to be an effective design tool for these future designs. However, because FLUKA is only partially open and the code cannot be independently verified, investigating the validity of designs based on it may pose a significant problem. (K.G.)

  6. The FLUKA code: An accurate simulation tool for particle therapy

    CERN Document Server

    Battistoni, Giuseppe; Böhlen, Till T; Cerutti, Francesco; Chin, Mary Pik Wai; Dos Santos Augusto, Ricardo M; Ferrari, Alfredo; Garcia Ortega, Pablo; Kozlowska, Wioletta S; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically-based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in-vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with bot...

  7. SU-E-T-323: The FLUKA Monte Carlo Code in Ion Beam Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Rinaldi, I [Heidelberg University Hospital (Germany); Ludwig-Maximilian University Munich (Germany)

    2014-06-01

    Purpose: Monte Carlo (MC) codes are increasingly used in the ion beam therapy community due to their detailed description of radiation transport and interaction with matter. The suitability of an MC code demands accurate and reliable physical models for the transport and the interaction of all components of the mixed radiation field. This contribution gives an overview of the recent developments in the FLUKA code oriented to its application in ion beam therapy. Methods: FLUKA is a general-purpose MC code which allows the calculation of particle transport and interactions with matter, covering an extended range of applications. The user can manage the code through a graphical interface (FLAIR) developed using the Python programming language. Results: This contribution presents recent refinements in the description of the ionization processes and comparisons between FLUKA results and experimental data from ion beam therapy facilities. Moreover, several validations of the largely improved FLUKA nuclear models for imaging applications to treatment monitoring are shown. The complex calculation of prompt gamma ray emission compares favorably with experimental data and can be considered adequate for the intended applications. New features in the modeling of proton-induced nuclear interactions also provide reliable cross section predictions for the production of radionuclides. Of great interest for the community are the developments introduced in FLAIR. The most recent efforts concern the capability of importing computed-tomography images in order to build patient geometries automatically, and the implementation of different types of existing positron-emission-tomography scanner devices for imaging applications. Conclusion: The FLUKA code has already been chosen as the reference MC code in many ion beam therapy centers, and is being continuously improved in order to match the needs of ion beam therapy applications. Parts of this work have been supported by the European

  8. Monte Carlo FLUKA code simulation for study of {sup 68}Ga production by direct proton-induced reaction

    Energy Technology Data Exchange (ETDEWEB)

    Mokhtari Oranj, Leila; Kakavand, Tayeb [Physics Faculty, Zanjan University, P.O. Box 451-313, Zanjan (Iran, Islamic Republic of); Sadeghi, Mahdi, E-mail: msadeghi@nrcam.org [Agricultural, Medical and Industrial Research School, Nuclear Science and Technology Research Institute, P.O. Box 31485-498, Karaj (Iran, Islamic Republic of); Aboudzadeh Rovias, Mohammadreza [Agricultural, Medical and Industrial Research School, Nuclear Science and Technology Research Institute, P.O. Box 31485-498, Karaj (Iran, Islamic Republic of)

    2012-06-11

    ⁶⁸Ga is an important radionuclide for positron emission tomography. ⁶⁸Ga can be produced by the ⁶⁸Zn(p,n)⁶⁸Ga reaction in common biomedical cyclotrons. To facilitate optimization of the target design and to study the activation of materials, a Monte Carlo code can be used to simulate the irradiation of the target materials with charged hadrons. In this paper, a FLUKA simulation was employed to prototype a Zn target for the production of ⁶⁸Ga by proton irradiation. Furthermore, the experimental data were compared with the thick-target yield estimated with the FLUKA code for the given irradiation time. In conclusion, the FLUKA code can be used for estimation of the production yield.
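
    For context, the analytical expression against which such Monte Carlo yield estimates are commonly checked is the standard thick-target activation formula; the notation below is generic and not taken from the paper.

      A(t_{\mathrm{irr}}) \;=\; \frac{N_A\,H}{M}\,\frac{I}{e}\,
      \bigl(1 - e^{-\lambda t_{\mathrm{irr}}}\bigr)
      \int_{E_{\mathrm{out}}}^{E_{\mathrm{in}}}
      \frac{\sigma(E)}{dE/d(\rho x)}\,dE

    Here H is the ⁶⁸Zn isotopic abundance of the target, M its molar mass, I/e the number of incident protons per second, lambda the ⁶⁸Ga decay constant, t_irr the irradiation time, sigma(E) the ⁶⁸Zn(p,n)⁶⁸Ga cross section, dE/d(rho x) the mass stopping power, and E_in and E_out the proton energies at the target entrance and exit.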

  9. Interactive fluka: a world wide web version for a simulation code in proton therapy

    International Nuclear Information System (INIS)

    Garelli, S.; Giordano, S.; Piemontese, G.; Squarcia, S.

    1998-01-01

    We have considered the possibility of using the simulation code FLUKA in the framework of TERA. We provide a World Wide Web interface through which an interactive version of the code is available. The user can find installation instructions, an on-line FLUKA manual and interactive windows for entering, in a very simple way, all the data required by the run configuration file. The database approach allows more versatile use for data verification and update, recall of old simulations and comparison with selected examples. A completely new tool for geometry drawing under Java has also been developed. (authors)

  10. Medical Applications of the PHITS Code (3): User Assistance Program for Medical Physics Computation.

    Science.gov (United States)

    Furuta, Takuya; Hashimoto, Shintaro; Sato, Tatsuhiko

    2016-01-01

    DICOM2PHITS and PSFC4PHITS are user assistance programs for medical physics applications of PHITS. DICOM2PHITS is a program that constructs the voxel PHITS simulation geometry from patient CT DICOM image data by using a conversion table from CT number to material composition. PSFC4PHITS is a program that converts IAEA phase-space file data to the PHITS format so that they can be used as a simulation source in PHITS. Both programs are useful for users who want to apply PHITS simulation to verification of radiation therapy treatment planning. We are now developing a program to convert dose distributions obtained by PHITS to the DICOM RT-dose format. As a future plan, we also intend to develop a program able to use the treatment information included in other DICOM files (RT-plan and RT-structure).

  11. Space Applications of the FLUKA Monte-Carlo Code: Lunar and Planetary Exploration

    International Nuclear Information System (INIS)

    Lee, Kerry; Wilson, Thomas; Zapp, Neal; Pinsky, Lawrence

    2007-01-01

    NASA has recognized the need for making additional heavy-ion collision measurements at the U.S. Brookhaven National Laboratory in order to support further improvement of several particle physics transport-code models for space exploration applications. FLUKA has been identified as one of these codes and we will review the nature and status of this investigation as it relates to high-energy heavy-ion physics

  12. The application of the Monte Carlo code FLUKA in radiation protection studies for the large hadron collider

    International Nuclear Information System (INIS)

    Battistoni, Giuseppe; Broggi, Francesco; Brugger, Markus

    2010-01-01

    The multi-purpose particle interaction and transport code FLUKA is an integral part of all radiation protection studies for the design and operation of the Large Hadron Collider (LHC) at CERN. It is one of the very few codes available for this type of calculation that is capable of calculating, in one and the same simulation, proton-proton and heavy-ion collisions at LHC energies as well as the entire hadronic and electromagnetic particle cascade initiated by secondary particles in detectors and beam-line components, from TeV energies down to the energies of thermal neutrons. The present paper reviews these capabilities of FLUKA, giving details of the relevant physics models along with examples of radiation protection studies for the LHC, such as shielding studies for underground areas occupied by personnel during LHC operation and the simulation of induced radioactivity around beam loss points. An integral part of the FLUKA development is careful benchmarking of specific models as well as of the code performance in complex, real-life applications, which is demonstrated with examples of studies relevant to radiation protection at the LHC. (author)

  13. Comparative study of Monte Carlo particle transport code PHITS and nuclear data processing code NJOY for recoil cross section spectra under neutron irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Iwamoto, Yosuke, E-mail: iwamoto.yosuke@jaea.go.jp; Ogawa, Tatsuhiko

    2017-04-01

    Because primary knock-on atoms (PKAs) create point defects and clusters in materials that are irradiated with neutrons, it is important to validate the calculations of the recoil cross section spectra that are used to estimate radiation damage in materials. Here, the recoil cross section spectra of fission- and fusion-relevant materials were calculated using the Event Generator Mode (EGM) of the Particle and Heavy Ion Transport code System (PHITS) and also using the data processing code NJOY2012 with the nuclear data libraries TENDL-2015, ENDF/B-VII.1, and JEFF-3.2. The heating number, which is the integral of the recoil cross section spectra, was also calculated using PHITS-EGM and compared with data extracted from the ACE files of TENDL-2015, ENDF/B-VII.1, and JENDL-4.0. In general, only a small difference was found between the PKA spectra of PHITS + TENDL-2015 and NJOY + TENDL-2015. From analyzing the recoil cross section spectra extracted from the nuclear data libraries using NJOY2012, we found that the recoil cross section spectra were incorrect for ⁷²Ge, ⁷⁵As, ⁸⁹Y, and ¹⁰⁹Ag in the ENDF/B-VII.1 library, and for ⁹⁰Zr and ⁵⁵Mn in the JEFF-3.2 library. From analyzing the heating number, we found that the data extracted from the ACE file of TENDL-2015 were problematic for all nuclides in the neutron capture region because of incorrect data for the emitted gamma energy. However, PHITS + TENDL-2015 can calculate PKA spectra and heating numbers correctly.
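
    As a point of reference for the quantity named above, the heating number (KERMA factor) at neutron energy E is essentially the energy-weighted integral of the recoil cross section spectra, summed over the charged products; the expression below is the generic definition, not a formula quoted from the paper.

      k(E) \;=\; \sum_{i} \int_{0}^{T_{\max,i}} T\,
      \frac{d\sigma_i}{dT}(E, T)\, dT

    Here the sum runs over the charged products i (the recoiling nucleus and any emitted charged particles) and dsigma_i/dT is the corresponding recoil or emission cross section spectrum.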

  14. Conceptual radiation shielding design of superconducting tokamak fusion device by PHITS

    International Nuclear Information System (INIS)

    Sukegawa, Atsuhiko M.; Kawasaki, Hiromitsu; Okuno, Koichi

    2010-01-01

    A complete 3D neutron and photon transport analysis with the Monte Carlo transport code system PHITS (Particle and Heavy Ion Transport code System) has been performed for superconducting tokamak fusion devices such as JT-60 Super Advanced (JT-60SA). PHITS can be used for the port streaming analysis around the devices of the tokamak fusion device, the duct streaming analysis in the building where the device is installed, and the skyshine analysis for the site boundary. The neutron transport analysis by PHITS shows, through the graphical results, that the shielding performance of the superconducting tokamak fusion device with the cryostat is improved. From the standpoint of port streaming and duct streaming, a 3D Monte Carlo code such as PHITS is necessary for the neutronics analysis of a superconducting tokamak fusion device. (author)

  15. Calculation of electron and isotopes dose point kernels with FLUKA Monte Carlo code for dosimetry in nuclear medicine therapy.

    Science.gov (United States)

    Botta, F; Mairani, A; Battistoni, G; Cremonesi, M; Di Dia, A; Fassò, A; Ferrari, A; Ferrari, M; Paganelli, G; Pedroli, G; Valente, M

    2011-07-01

    The calculation of patient-specific dose distributions can be achieved by Monte Carlo simulations or by analytical methods. In this study, the FLUKA Monte Carlo code has been considered for use in nuclear medicine dosimetry. Up to now, FLUKA has mainly been dedicated to other fields, namely high energy physics, radiation protection, and hadrontherapy. When first employing a Monte Carlo code for nuclear medicine dosimetry, its results concerning electron transport at energies typical of nuclear medicine applications need to be verified. This is commonly achieved by calculating a representative parameter and comparing it with reference data; the dose point kernel (DPK), quantifying the energy deposition all around a point isotropic source, is often the one chosen. FLUKA DPKs have been calculated in both water and compact bone for monoenergetic electrons (10 keV-3 MeV) and for beta-emitting isotopes commonly used for therapy (⁸⁹Sr, ⁹⁰Y, ¹³¹I, ¹⁵³Sm, ¹⁷⁷Lu, ¹⁸⁶Re, and ¹⁸⁸Re). Point isotropic sources have been simulated at the center of a water (bone) sphere, and the deposited energy has been tallied in concentric shells. The FLUKA outcomes have been compared to PENELOPE v.2008 results, calculated in this study as well. Moreover, in the case of monoenergetic electrons in water, a comparison with data from the literature (ETRAN, GEANT4, MCNPX) has been made. Maximum percentage differences within 0.8 RCSDA and 0.9 RCSDA for monoenergetic electrons (RCSDA being the continuous-slowing-down-approximation range) and within 0.8 X90 and 0.9 X90 for isotopes (X90 being the radius of the sphere in which 90% of the emitted energy is absorbed) have been computed, together with the average percentage difference within 0.9 RCSDA and 0.9 X90 for electrons and isotopes, respectively. Concerning monoenergetic electrons, within 0.8 RCSDA (where 90%-97% of the particle energy is deposited), FLUKA and PENELOPE agree mostly within 7%, except for 10 and 20 keV electrons (12% in water, 8.3% in bone). The
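
    The shell-tally geometry described above translates into a DPK in a few lines: the energy deposited per decay in each concentric shell is divided by the shell mass to obtain the absorbed dose per decay as a function of radius. A minimal sketch in Python (a generic helper, not the actual FLUKA or PENELOPE scoring):

      import numpy as np

      def dose_point_kernel(shell_edges_cm, edep_MeV_per_decay, density_g_cm3=1.0):
          """Absorbed dose per decay (MeV/g) in concentric spherical shells around a
          point isotropic source, from the energy tallied in each shell."""
          edges = np.asarray(shell_edges_cm, dtype=float)
          r_in, r_out = edges[:-1], edges[1:]
          shell_mass_g = density_g_cm3 * (4.0 / 3.0) * np.pi * (r_out**3 - r_in**3)
          r_mid_cm = 0.5 * (r_in + r_out)
          return r_mid_cm, np.asarray(edep_MeV_per_decay, float) / shell_mass_g

    Scaling the radial coordinate by RCSDA (for monoenergetic electrons) or by X90 (for isotopes) then puts kernels for different energies and media on a common axis, as in the comparisons quoted above.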

  16. Neutronic analysis of fusion tokamak devices by PHITS

    International Nuclear Information System (INIS)

    Sukegawa, Atsuhiko M.; Takiyoshi, Kouji; Amano, Toshio; Kawasaki, Hiromitsu; Okuno, Koichi

    2011-01-01

    A complete 3D neutronic analysis with PHITS (Particle and Heavy Ion Transport code System) has been performed for fusion tokamak devices such as the JT-60U device and the JT-60 superconducting tokamak device (JT-60 Super Advanced). Mono-energetic neutrons (En = 2.45 MeV) from the D-D fusion reaction are used as the neutron source in the analysis. The visual neutron flux distribution for the estimation of port streaming, and the dose rate around the fusion tokamak devices, have been calculated with PHITS. The PHITS analysis makes it clear that the effect of port streaming in a superconducting fusion tokamak device with a cryostat is crucial, and the neutron spectra calculated by PHITS agree with the MCNP-4C2 results. (author)

  17. Validating PHITS for heavy ion fragmentation reactions

    International Nuclear Information System (INIS)

    Ronningen, Reginald M.

    2015-01-01

    The performance of the Monte Carlo code system PHITS is validated for heavy-ion transport capabilities by performing simulations and comparing results against experimental data from heavy-ion reactions of benchmark quality. These data are from measurements of isotope yields produced in the fragmentation of a 140 MeV/u ⁴⁸Ca beam on a beryllium target and on a tantalum target. The results of this study show that PHITS performs reliably. (authors)

  18. Nuclear model developments in FLUKA for present and future applications

    Science.gov (United States)

    Cerutti, Francesco; Empl, Anton; Fedynitch, Anatoli; Ferrari, Alfredo; Ruben, GarciaAlia; Sala, Paola R.; Smirnov, George; Vlachoudis, Vasilis

    2017-09-01

    The FLUKA code [1-3] is used in research laboratories all around the world for challenging applications spanning a very wide range of energies, projectiles and targets. FLUKA is also extensively used in hadrontherapy research studies and clinical planning systems. In this paper some of the recent developments in the FLUKA nuclear physics models of relevance for very different application fields, including medical physics, are presented. A few examples are shown demonstrating the effectiveness of the upgraded code.

  19. Calculation of electron and isotopes dose point kernels with FLUKA Monte Carlo code for dosimetry in nuclear medicine therapy

    CERN Document Server

    Mairani, A; Valente, M; Battistoni, G; Botta, F; Pedroli, G; Ferrari, A; Cremonesi, M; Di Dia, A; Ferrari, M; Fasso, A

    2011-01-01

    Purpose: The calculation of patient-specific dose distributions can be achieved by Monte Carlo simulations or by analytical methods. In this study, the FLUKA Monte Carlo code has been considered for use in nuclear medicine dosimetry. Up to now, FLUKA has mainly been dedicated to other fields, namely high energy physics, radiation protection, and hadrontherapy. When first employing a Monte Carlo code for nuclear medicine dosimetry, its results concerning electron transport at energies typical of nuclear medicine applications need to be verified. This is commonly achieved by calculating a representative parameter and comparing it with reference data; the dose point kernel (DPK), quantifying the energy deposition all around a point isotropic source, is often the one chosen. Methods: FLUKA DPKs have been calculated in both water and compact bone for monoenergetic electrons (10 keV-3 MeV) and for beta-emitting isotopes commonly used for therapy (⁸⁹Sr, ⁹⁰Y, ¹³¹I, ¹⁵³Sm, ¹⁷⁷Lu, ¹⁸⁶Re, and ¹⁸⁸Re). Point isotropic...

  20. Development of Parallel Computing Framework to Enhance Radiation Transport Code Capabilities for Rare Isotope Beam Facility Design

    Energy Technology Data Exchange (ETDEWEB)

    Kostin, Mikhail [Michigan State Univ., East Lansing, MI (United States); Mokhov, Nikolai [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Niita, Koji [Research Organization for Information Science and Technology, Ibaraki-ken (Japan)

    2013-09-25

    A parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework is implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes written in Fortran 77, Fortran 90 or C. The module is largely independent of the radiation transport code it is coupled to, and is connected to the codes by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code. It can be used with other codes such as PHITS, FLUKA and MCNP after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows calculations to be restarted from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. The framework corrects some of the known problems with scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and on networks of workstations, where interference from other users is possible.
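
    The pattern described above, distribution of histories over MPI ranks plus periodic checkpointing for restart, can be sketched in a few lines of Python with mpi4py; this is only an illustration of the idea, not the C++ framework itself, and the history count, file names and toy scoring are invented:

        from mpi4py import MPI          # assumes mpi4py is installed
        import os
        import pickle
        import random

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        TOTAL_HISTORIES = 1_000_000
        my_histories = TOTAL_HISTORIES // size   # simple static load split
        ckpt_file = f"ckpt_{rank}.pkl"

        # Resume from a checkpoint if one exists (restart capability).
        if os.path.exists(ckpt_file):
            with open(ckpt_file, "rb") as f:
                done, local_tally = pickle.load(f)
        else:
            done, local_tally = 0, 0.0

        random.seed(rank)
        for i in range(done, my_histories):
            local_tally += random.random()       # stand-in for one transported history
            if (i + 1) % 100_000 == 0:           # periodic checkpoint
                with open(ckpt_file, "wb") as f:
                    pickle.dump((i + 1, local_tally), f)

        # Combine the partial tallies on rank 0, as the framework's interface
        # functions would do for the wrapped transport code.
        total = comm.reduce(local_tally, op=MPI.SUM, root=0)
        if rank == 0:
            print("mean score per history:", total / TOTAL_HISTORIES)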

  1. Monte Carlo Codes Invited Session

    International Nuclear Information System (INIS)

    Trama, J.C.; Malvagi, F.; Brown, F.

    2013-01-01

    This document lists 22 Monte Carlo codes used in radiation transport applications throughout the world. For each code the names of the organization and country and/or place are given. We have the following computer codes. 1) ARCHER, USA, RPI; 2) COG11, USA, LLNL; 3) DIANE, France, CEA/DAM Bruyeres; 4) FLUKA, Italy and CERN, INFN and CERN; 5) GEANT4, International GEANT4 collaboration; 6) KENO and MONACO (SCALE), USA, ORNL; 7) MC21, USA, KAPL and Bettis; 8) MCATK, USA, LANL; 9) MCCARD, South Korea, Seoul National University; 10) MCNP6, USA, LANL; 11) MCU, Russia, Kurchatov Institute; 12) MONK and MCBEND, United Kingdom, AMEC; 13) MORET5, France, IRSN Fontenay-aux-Roses; 14) MVP2, Japan, JAEA; 15) OPENMC, USA, MIT; 16) PENELOPE, Spain, Barcelona University; 17) PHITS, Japan, JAEA; 18) PRIZMA, Russia, VNIITF; 19) RMC, China, Tsinghua University; 20) SERPENT, Finland, VTT; 21) SUPERMONTECARLO, China, CAS INEST FDS Team Hefei; and 22) TRIPOLI-4, France, CEA Saclay

  2. Flukacad/Pipsicad: three-dimensional interfaces between Fluka and Autocad

    International Nuclear Information System (INIS)

    Helmut Vincke

    2001-01-01

    FLUKA is a widely used 3-D particle transport program. Until now there was no way to display the simulation geometry or the calculated tracks in three dimensions; FLUKA itself offers only an option to draw two-dimensional views through the geometry used. This paper describes two interface programs between the particle transport code FLUKA and the CAD program AutoCAD. These programs provide a three-dimensional facility not only for illustrating the simulated FLUKA geometry (FLUKACAD), but also for picturing simulated particle tracks (PIPSICAD) in a three-dimensional set-up. Additionally, the programming strategy for connecting FLUKA with AutoCAD is shown. A number of useful features of the programs themselves, and also of AutoCAD in the context of FLUKACAD and PIPSICAD, are explained. (authors)

  3. PHITS code improvements by Regulatory Standard and Research Department Secretariat of Nuclear Regulation Authority

    International Nuclear Information System (INIS)

    Goko, Shinji

    2017-01-01

    For the safety analysis carried out when a nuclear power company applies for facility or equipment installation permission, a business license, design approval, etc., the Regulatory Standard and Research Department, Secretariat of the Nuclear Regulation Authority, continuously conducts safety research on the introduction and improvement of various technologies used to evaluate the adequacy of such analyses. In the field of shielding analysis of nuclear fuel transport packages, this group improved the PHITS code to make it applicable to this field, and since FY2013 has been promoting its improvement as a tool used for regulation. This paper introduces the history and progress of this safety research. PHITS 2.88, the latest version as of November 2016, is equipped with an automatic generation function for variance reduction parameters [T-WWG] and other features, and has been developed into a tool with many functions that are effective in the practical application to nuclear power regulation. In addition, the group conducted verification analyses of nuclear fuel packages, which showed good agreement with analyses by MCNP, a code used extensively worldwide with abundant practical results. A relatively good agreement with measured values was also obtained, when considering the differences between analysis and measurement. (A.O.)

  4. Comparison of Radiation Transport Codes, HZETRN, HETC and FLUKA, Using the 1956 Webber SPE Spectrum

    Science.gov (United States)

    Heinbockel, John H.; Slaba, Tony C.; Blattnig, Steve R.; Tripathi, Ram K.; Townsend, Lawrence W.; Handler, Thomas; Gabriel, Tony A.; Pinsky, Lawrence S.; Reddell, Brandon; Clowdsley, Martha S.

    2009-01-01

    Protection of astronauts and instrumentation from galactic cosmic rays (GCR) and solar particle events (SPE) in the harsh environment of space is of prime importance in the design of personal shielding, spacecraft, and mission planning. Early entry of radiation constraints into the design process enables optimal shielding strategies, but demands efficient and accurate tools that can be used by design engineers in every phase of an evolving space project. The radiation transport code HZETRN is an efficient tool for analyzing the shielding effectiveness of materials exposed to space radiation. In this paper, HZETRN is compared to the Monte Carlo codes HETC-HEDS and FLUKA, for a shield/target configuration comprised of a 20 g/cm^2 aluminum slab in front of a 30 g/cm^2 slab of water exposed to the February 1956 SPE, as modeled by the Webber spectrum. Neutron and proton fluence spectra, as well as dose and dose equivalent values, are compared at various depths in the water target. This study shows that there are many regions where HZETRN agrees with both HETC-HEDS and FLUKA for this shield/target configuration and the SPE environment. However, there are also regions with appreciable differences between the three computer codes.

  5. Hadron production simulation by FLUKA

    CERN Document Server

    Battistoni, G; Ferrari, A; Ranft, J; Roesler, S; Sala, P R

    2013-01-01

    For the purposes of accelerator based neutrino experiments, the simulation of parent hadron production plays a key role. In this paper a quick overview of the main ingredients of the PEANUT event generator implemented in the FLUKA Monte Carlo code is given, together with some benchmarking examples.

  6. Shielding calculations using FLUKA

    International Nuclear Information System (INIS)

    Yamaguchi, Chiri; Tesch, K.; Dinter, H.

    1988-06-01

    The dose equivalent on the surface of concrete shielding has been calculated using the Monte Carlo code FLUKA86 for incident proton energies from 10 to 800 GeV. The results have been compared with some simple equations. The value of the angle-dependent parameter in Moyer's equation has been calculated from the locations where the maximum dose equivalent occurs. (author)

  7. R and D on automatic modeling methods for Monte Carlo codes FLUKA

    International Nuclear Information System (INIS)

    Wang Dianxi; Hu Liqin; Wang Guozhong; Zhao Zijia; Nie Fanzhi; Wu Yican; Long Pengcheng

    2013-01-01

    FLUKA is a fully integrated particle physics Monte Carlo simulation package. Geometry models must be created before calculation; however, describing them manually is time-consuming and error-prone. This study developed an automatic modeling method which can automatically convert computer-aided design (CAD) geometry models into FLUKA models. The conversion program was integrated into the CAD/image-based automatic modeling program for nuclear and radiation transport simulation (MCAM). Its correctness has been demonstrated. (authors)

  8. The FLUKA Monte Carlo code coupled with the local effect model for biological calculations in carbon ion therapy

    Energy Technology Data Exchange (ETDEWEB)

    Mairani, A [University of Pavia, Department of Nuclear and Theoretical Physics, and INFN, via Bassi 6, 27100 Pavia (Italy); Brons, S; Parodi, K [Heidelberg Ion Beam Therapy Center and Department of Radiation Oncology, Im Neuenheimer Feld 450, 69120 Heidelberg (Germany); Cerutti, F; Ferrari, A; Sommerer, F [CERN, 1211 Geneva 23 (Switzerland); Fasso, A [SLAC National Accelerator Laboratory, 2575 Sand Hill Road, Menlo Park, CA 94025 (United States); Kraemer, M; Scholz, M, E-mail: Andrea.Mairani@mi.infn.i [GSI Biophysik, Planck-Str. 1, D-64291 Darmstadt (Germany)

    2010-08-07

    Clinical Monte Carlo (MC) calculations for carbon ion therapy have to provide absorbed and RBE-weighted dose. The latter is defined as the product of the dose and the relative biological effectiveness (RBE). At the GSI Helmholtzzentrum fuer Schwerionenforschung as well as at the Heidelberg Ion Therapy Center (HIT), the RBE values are calculated according to the local effect model (LEM). In this paper, we describe the approach followed for coupling the FLUKA MC code with the LEM and its application to dose and RBE-weighted dose calculations for a superimposition of two opposed {sup 12}C ion fields as applied in therapeutic irradiations. The obtained results are compared with the available experimental data of CHO (Chinese hamster ovary) cell survival and the outcomes of the GSI analytical treatment planning code TRiP98. Some discrepancies have been observed between the analytical and MC calculations of absorbed physical dose profiles, which can be explained by the differences between the laterally integrated depth-dose distributions in water used as input basic data in TRiP98 and the FLUKA recalculated ones. On the other hand, taking into account the differences in the physical beam modeling, the FLUKA-based biological calculations of the CHO cell survival profiles are found to be in good agreement with the experimental data as well as with the TRiP98 predictions. The developed approach that combines the MC transport/interaction capability with the same biological model as in the treatment planning system (TPS) will be used at HIT to support validation/improvement of both dose and RBE-weighted dose calculations performed by the analytical TPS.

  9. Assessment of the production of medical isotopes using the Monte Carlo code FLUKA: Simulations against experimental measurements

    Energy Technology Data Exchange (ETDEWEB)

    Infantino, Angelo, E-mail: angelo.infantino@unibo.it [Department of Industrial Engineering, Montecuccolino Laboratory, University of Bologna, Via dei Colli 16, 40136 Bologna (Italy); Oehlke, Elisabeth [TRIUMF, 4004 Wesbrook Mall, V6T 2A3 Vancouver, BC (Canada); Department of Radiation Science & Technology, Delft University of Technology, Postbus 5, 2600 AA Delft (Netherlands); Mostacci, Domiziano [Department of Industrial Engineering, Montecuccolino Laboratory, University of Bologna, Via dei Colli 16, 40136 Bologna (Italy); Schaffer, Paul; Trinczek, Michael; Hoehr, Cornelia [TRIUMF, 4004 Wesbrook Mall, V6T 2A3 Vancouver, BC (Canada)

    2016-01-01

    The Monte Carlo code FLUKA is used to simulate the production of a number of positron emitting radionuclides, {sup 18}F, {sup 13}N, {sup 94}Tc, {sup 44}Sc, {sup 68}Ga, {sup 86}Y, {sup 89}Zr, {sup 52}Mn, {sup 61}Cu and {sup 55}Co, on a small medical cyclotron with a proton beam energy of 13 MeV. Experimental data collected at the TR13 cyclotron at TRIUMF agree within a factor of 0.6 ± 0.4 with the directly simulated data, except for the production of {sup 55}Co, where the simulation underestimates the experiment by a factor of 3.4 ± 0.4. The experimental data also agree within a factor of 0.8 ± 0.6 with the convolution of simulated proton fluence and cross sections from literature. Overall, this confirms the applicability of FLUKA to simulate radionuclide production at 13 MeV proton beam energy.

  10. Use of the FLUKA Monte Carlo code for 3D patient-specific dosimetry on PET-CT and SPECT-CT images*

    Science.gov (United States)

    Botta, F; Mairani, A; Hobbs, R F; Vergara Gil, A; Pacilio, M; Parodi, K; Cremonesi, M; Coca Pérez, M A; Di Dia, A; Ferrari, M; Guerriero, F; Battistoni, G; Pedroli, G; Paganelli, G; Torres Aroche, L A; Sgouros, G

    2014-01-01

    Patient-specific absorbed dose calculation for nuclear medicine therapy is a topic of increasing interest. 3D dosimetry at the voxel level is one of the major improvements for the development of more accurate calculation techniques, as compared to the standard dosimetry at the organ level. This study aims to use the FLUKA Monte Carlo code to perform patient-specific 3D dosimetry through direct Monte Carlo simulation on PET-CT and SPECT-CT images. To this aim, dedicated routines were developed in the FLUKA environment. Two sets of simulations were performed on model and phantom images. Firstly, the correct handling of PET and SPECT images was tested under the assumption of homogeneous water medium by comparing FLUKA results with those obtained with the voxel kernel convolution method and with other Monte Carlo-based tools developed to the same purpose (the EGS-based 3D-RD software and the MCNP5-based MCID). Afterwards, the correct integration of the PET/SPECT and CT information was tested, performing direct simulations on PET/CT images for both homogeneous (water) and non-homogeneous (water with air, lung and bone inserts) phantoms. Comparison was performed with the other Monte Carlo tools performing direct simulation as well. The absorbed dose maps were compared at the voxel level. In the case of homogeneous water, by simulating 10^8 primary particles a 2% average difference with respect to the kernel convolution method was achieved; such difference was lower than the statistical uncertainty affecting the FLUKA results. The agreement with the other tools was within 3–4%, partially ascribable to the differences among the simulation algorithms. Including the CT-based density map, the average difference was always within 4% irrespective of the medium (water, air, bone), except for a maximum 6% value when comparing FLUKA and 3D-RD in air. The results confirmed that the routines were properly developed, opening the way for the use of FLUKA for patient-specific, image
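
    The voxel kernel convolution method used above as a reference can be sketched as follows; the activity map and the voxel S-value kernel are invented placeholders, not the routines or data of the study:

        import numpy as np
        from scipy.ndimage import convolve

        # Hypothetical cumulated activity per voxel (decays) on a small grid.
        activity = np.zeros((32, 32, 32))
        activity[14:18, 14:18, 14:18] = 1.0e8

        # Hypothetical voxel S-value kernel (Gy per decay) for a homogeneous
        # water medium; in practice it comes from Monte Carlo or tabulations.
        kernel = np.zeros((5, 5, 5))
        kernel[2, 2, 2] = 1.0e-10
        kernel[1:4, 1:4, 1:4] += 1.0e-12

        # Voxel-level absorbed dose map (Gy) by kernel convolution.
        dose = convolve(activity, kernel, mode="constant", cval=0.0)
        print(dose.max())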

  11. Biological dose estimation for charged-particle therapy using an improved PHITS code coupled with a microdosimetric kinetic model

    International Nuclear Information System (INIS)

    Sato, Tatsuhiko; Watanabe, Ritsuko; Kase, Yuki; Niita, Koji; Sihver, Lembit

    2009-01-01

    High-energy heavy ions (HZE particles) have become widely used for radiotherapy of tumors owing to their high biological effectiveness. In the treatment planning of such charged-particle therapy, it is necessary to estimate not only physical but also biological dose, which is the product of physical dose and relative biological effectiveness (RBE). In the Heavy-ion Medical Accelerator in Chiba (HIMAC), the biological dose is estimated by a method proposed by Kanai et al., which is based on the linear-quadratic (LQ) model with its parameters α and β determined by the dose distribution in terms of the unrestricted linear energy transfer (LET). Thus, RBE is simply expressed as a function of LET in their model. However, RBE of HZE particles cannot be uniquely determined from their LET because of their large cross sections for high-energy δ-ray production. Hence, development of a biological dose estimation model that can explicitly consider the track structure of δ-rays around the trajectory of HZE particles is urgently needed. Microdosimetric quantities such as lineal energy y are better indexes for representing RBE of HZE particles in comparison to LET, since they can express the decrease of ionization densities around their trajectories due to the production of δ-rays. The difference of the concept between LET and y is illustrated in Figure 1. However, the use of microdosimetric quantities in computational dosimetry was severely limited because of the difficulty in calculating their probability densities (PDs) in macroscopic matter. We therefore improved the 3-dimensional particle transport simulation code PHITS, providing it with the capability of estimating the microdosimetric PDs in a macroscopic framework by incorporating a mathematical function that can instantaneously calculate the PDs around the trajectory of HZE particles with precision equivalent to a microscopic track-structure simulation. A new method for estimating biological dose from charged
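
    As a generic illustration of the linear-quadratic bookkeeping behind RBE-weighted dose (not the microdosimetric kinetic model actually coupled to PHITS), the sketch below computes cell survival from assumed alpha and beta values and derives the RBE and the RBE-weighted dose at one point:

        import numpy as np

        def survival(dose, alpha, beta):
            # Linear-quadratic model: S = exp(-alpha*D - beta*D^2)
            return np.exp(-alpha * dose - beta * dose**2)

        # Assumed (illustrative) LQ parameters for a reference photon field
        # and for an ion field at some point in the target volume.
        alpha_ref, beta_ref = 0.10, 0.05     # Gy^-1, Gy^-2
        alpha_ion, beta_ion = 0.45, 0.05

        d_ion = 2.0                               # physical ion dose, Gy
        s = survival(d_ion, alpha_ion, beta_ion)  # iso-effect survival level

        # Photon dose giving the same survival: solve alpha*D + beta*D^2 = -ln S.
        lnS = -np.log(s)
        d_ref = (-alpha_ref + np.sqrt(alpha_ref**2 + 4 * beta_ref * lnS)) / (2 * beta_ref)

        rbe = d_ref / d_ion
        print("RBE =", round(rbe, 2), " RBE-weighted dose =", round(rbe * d_ion, 2), "Gy (RBE)")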

  12. TVF-NMCRC: A powerful program for writing and executing simulation inputs for the FLUKA Monte Carlo Code system

    International Nuclear Information System (INIS)

    Mark, S.; Khomchenko, S.; Shifrin, M.; Haviv, Y.; Schwartz, J.R.; Orion, I.

    2007-01-01

    We at the Negev Monte Carlo Research Center (NMCRC) have developed a powerful new interface for writing and executing FLUKA input files: TVF-NMCRC. With the TVF tool a FLUKA user can easily write an input file without any previous experience. The TVF-NMCRC tool is a Linux program that has been verified on the most common Linux-based operating systems, and is suitable for the latest version of FLUKA (FLUKA 2006.3)
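
    For orientation, programmatic generation of FLUKA input cards of the kind such a tool automates can be sketched as below; this is not the TVF-NMCRC code, only a fragment showing the fixed-width card layout, and a real input additionally needs geometry, material and scoring cards:

        def card(keyword, whats=(), sdum=""):
            # One FLUKA fixed-format card: keyword in columns 1-10, six WHAT
            # fields of 10 columns each, SDUM from column 71 on.
            whats = list(whats) + [""] * (6 - len(whats))
            return f"{keyword:<10}" + "".join(f"{w:>10}" for w in whats) + sdum

        deck = [
            "TITLE",
            "Illustrative fragment, not a complete FLUKA input",
            # Negative WHAT(1) on the BEAM card means kinetic energy in GeV.
            card("BEAM", ["-0.160"], "PROTON"),
            card("START", ["100000."]),   # number of primary histories
            card("STOP"),
        ]
        print("\n".join(deck))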

  13. The FLUKA code: developments and challenges for high energy and medical applications

    Czech Academy of Sciences Publication Activity Database

    Böhlen, T.T.; Cerutti, F.; Chin, M.P.W.; Fasso, Alberto; Ferrari, A.; Ortega, P.G.; Mairani, A.; Sala, P.R.; Smirnov, G.; Vlachoudis, V.

    2014-01-01

    Vol. 120, Jul (2014), pp. 211-214, ISSN 0090-3752. Institutional support: RVO:68378271. Keywords: FLUKA; radioprotection; beams; ions. Subject RIV: BL - Plasma and Gas Discharge Physics. Impact factor: 4.571, year: 2014

  14. A new calculation of atmospheric neutrino flux: the FLUKA approach

    International Nuclear Information System (INIS)

    Battistoni, G.; Bloise, C.; Cavalli, D.; Ferrari, A.; Montaruli, T.; Rancati, T.; Resconi, S.; Ronga, F.; Sala, P.R.

    1999-01-01

    Preliminary results from a full 3-D calculation of atmospheric neutrino fluxes using the FLUKA interaction model are presented and compared to previously existing calculations. This effort is motivated mainly by the 3-D capability and the satisfactory degree of accuracy of the hadron-nucleus models embedded in the FLUKA code. Here we show examples of benchmarking tests of the model against cosmic ray experiment results. A comparison of our calculation of the atmospheric neutrino flux with that of the Bartol group, for E_ν > 1 GeV, is presented

  15. Development of a space radiation Monte Carlo computer simulation based on the FLUKA and ROOT codes

    CERN Document Server

    Pinsky, L; Ferrari, A; Sala, P; Carminati, F; Brun, R

    2001-01-01

    This NASA funded project is proceeding to develop a Monte Carlo-based computer simulation of the radiation environment in space. With actual funding only initially in place at the end of May 2000, the study is still in the early stage of development. The general tasks have been identified and personnel have been selected. The code to be assembled will be based upon two major existing software packages. The radiation transport simulation will be accomplished by updating the FLUKA Monte Carlo program, and the user interface will employ the ROOT software being developed at CERN. The end-product will be a Monte Carlo-based code which will complement the existing analytic codes such as BRYNTRN/HZETRN presently used by NASA to evaluate the effects of radiation shielding in space. The planned code will possess the ability to evaluate the radiation environment for spacecraft and habitats in Earth orbit, in interplanetary space, on the lunar surface, or on a planetary surface such as Mars. Furthermore, it will be usef...

  16. Flair: A powerful but user friendly graphical interface for FLUKA

    International Nuclear Information System (INIS)

    Vlachoudis, V.

    2009-01-01

    FLAIR is an advanced graphical user interface for FLUKA that enables the user to start and control FLUKA jobs completely from a GUI environment, without the need for command-line interactions. It is written entirely in Python and Tkinter, allowing easy portability across various operating systems and great programming flexibility, with a focus on being usable as an Application Programming Interface (API) for FLUKA. FLAIR is an integrated development environment (IDE) for FLUKA: it not only provides means for post-processing the output, but places a strong emphasis on the creation and checking of error-free input files. It contains a fully featured editor for editing the input files in a human-readable way with syntax highlighting, without hiding the inner functionality of FLUKA from the users. It also provides means for building the executable, debugging the geometry, running the code, monitoring the status of one or many runs, inspecting the output files, post-processing the binary files (data merging) and interfacing to plotting utilities like gnuplot and PovRay for high-quality plots or photo-realistic images. The program also includes a database of selected properties of all known nuclides and their known isotopic composition, as well as a reference database of ∼ 300 predefined materials together with their Sternheimer parameters. (authors)

  17. Monte Carlo simulation for neutron yield produced by bombarding thick targets with high energy heavy ions

    Energy Technology Data Exchange (ETDEWEB)

    Oranj, Leila Mokhtari; Oh, Joo Hee; Yoon, Moo Hyun; Lee, Hee Seock [POSTECH, Pohang (Korea, Republic of)

    2013-04-15

    One of the radiation shielding issues at heavy-ion accelerator facilities is estimating the neutron production by primary heavy ions. A few Monte Carlo transport codes such as FLUKA and PHITS can handle primary heavy ions. Recently IBS/RISP (Rare Isotope Science Project) started to design a high-energy, high-power rare isotope accelerator complex for nuclear physics, medical and material science and applications. There is a lack of experimental and simulated data on the interaction of the major beam, {sup 238}U, with materials. For the shielding design of the end of the first accelerating section, we calculated the differential neutron yield using the FLUKA code for the interaction of an 18.5 MeV/u uranium ion beam with a thin carbon stripper of 1.3 μm. Benchmarking studies were also done to validate the yield calculation for 400 MeV/n {sup 131}Xe and other heavy ions. In this study, the benchmarking for Xe-C, Xe-Cu, Xe-Al, Xe-Pb, U-C and other interactions was performed using the FLUKA code. All of the results show that FLUKA can evaluate heavy-ion-induced reactions with good uncertainty. For the evaluation of the neutron source term, the calculated neutron yields are shown in Fig. 2. The energy of the uranium ion beam is only 18.5 MeV/u, but the energy of the produced secondary neutrons extends above 100 MeV, so the neutron shielding and the damage caused by those neutrons are expected to be serious concerns. Because of the thin stripper, the neutron intensity in the forward direction was high, while the intensity of the produced secondary photons was relatively low and mostly isotropic. For the detailed shielding design of the stripper section of the RISP rare isotope accelerator, the benchmarking study and a preliminary evaluation of the neutron source term from the uranium beam have been carried out using the FLUKA code. This study is also compared with the evaluation results obtained concurrently with the PHITS code. Both studies show that the two Monte Carlo codes can give good results for

  18. Comparative study of Monte Carlo particle transport code PHITS and nuclear data processing code NJOY for PKA energy spectra and heating number under neutron irradiation

    International Nuclear Information System (INIS)

    Iwamoto, Y.; Ogawa, T.

    2016-01-01

    The modelling of damage in materials irradiated by neutrons is needed for understanding the mechanism of radiation damage in fission and fusion reactor facilities. Molecular dynamics simulations of damage cascades with full atomic interactions require information about the energy distribution of the primary knock-on atoms (PKAs). The most common procedure to calculate PKA energy spectra under low-energy neutron irradiation is to use the nuclear data processing code NJOY2012. It calculates group-to-group recoil cross-section matrices, i.e. energy and angular recoil distributions for many reactions, using nuclear data libraries in ENDF format. After the NJOY2012 process, SPKA6C is employed to produce PKA energy spectra by combining the recoil cross-section matrices with an incident neutron energy spectrum. However, an intercomparison of different processes and nuclear data libraries has not been carried out yet. In particular, the higher energy (~5 MeV) of the incident neutrons, compared to fission, opens many reaction channels, which produces a complex distribution of PKAs in energy and type. Recently, we have developed the event generator mode (EGM) in the Particle and Heavy Ion Transport code System PHITS for neutron-induced reactions in the energy region below 20 MeV. The main feature of EGM is to produce PKAs while keeping energy and momentum conservation in a reaction. It is used for event-by-event analysis in application fields such as soft-error analysis in semiconductors, microdosimetry in the human body, and estimation of displacement per atom (DPA) values in metals. The purpose of this work is to identify differences in PKA spectra and in the heating number related to kerma between the calculation methods using PHITS-EGM and NJOY2012+SPKA6C with the different libraries TENDL-2015, ENDF/B-VII.1 and JENDL-4.0 for fusion-relevant materials
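
    The folding step performed by SPKA6C, combining a group-to-group recoil cross-section matrix with an incident neutron group spectrum, is essentially a matrix-vector product; the sketch below uses invented group structures, cross sections and fluxes purely for illustration:

        import numpy as np

        n_in, n_out = 5, 4   # incident-neutron groups, PKA recoil-energy groups

        # Hypothetical group-to-group recoil cross-section matrix (barn):
        # element [i, j] = cross section for a neutron in group i to produce
        # a PKA in recoil-energy group j (summed over reaction channels).
        recoil_xs = np.random.default_rng(1).uniform(0.0, 0.2, size=(n_in, n_out))

        # Hypothetical incident neutron group fluxes (n/cm^2/s).
        phi = np.array([1e10, 5e9, 2e9, 1e9, 5e8])

        # Atom number density of the target (atoms/cm^3), illustrative value.
        n_atoms = 8.5e22

        # PKA production rate per recoil-energy group (PKAs/cm^3/s);
        # the factor 1e-24 converts barn to cm^2.
        pka_rate = n_atoms * 1e-24 * phi @ recoil_xs
        print(pka_rate)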

  19. Bulk Shielding Calculation for 90 deg. Bending Section of RISP

    Energy Technology Data Exchange (ETDEWEB)

    Oh, J. H.; Jung, N. S.; Lee, H. S. [Pohang Accelerator Laboratory, Pohang (Korea, Republic of); Oranj, L. Mokhtari [POSTECH, Pohang (Korea, Republic of); Ko, S. K. [Univ. of Ulsan, Ulsan (Korea, Republic of)

    2014-10-15

    The charge state of the {sup 238}U beams with maximum intensity was 79+ among the multi-charge states of 70+ to 89+, which were estimated using the LISE++ code. The bending section consists of twenty-four quadrupoles, two dipoles, two two-cell type superconducting RF cavities and eleven slits. A complicated radiation environment is caused by the beam losses that occur normally during the stripping process and when the produced {sup 238}U beams are transported along the beam line. Secondary radiations generated by {sup 238}U beam irradiation are very important for predicting the prompt and residual doses and the radiation damage to the components. The production characteristics of neutrons and photons from a thin carbon and a thick iron target were studied to set up the shielding strategy. The dose estimation was done for the pre-designed tunnel structure. In these calculations, the major Monte Carlo codes PHITS and FLUKA were used. The present study provides information on the shielding analysis for the 90 deg. bending section of the RISP facility. The source term was evaluated to determine the fundamental parameters of the shielding analysis using the PHITS and FLUKA codes, and the distribution of the dose rate outside the thick shielding wall is presented.

  20. Estimation of Airborne Radioactivity Induced by 8-GeV-Class Electron LINAC Accelerator.

    Science.gov (United States)

    Asano, Yoshihiro

    2017-10-01

    Airborne radioactivity induced by high-energy electrons from 6 to 10 GeV is estimated by using analytical methods and the Monte Carlo codes PHITS and FLUKA. Measurements using a gas monitor with a NaI(Tl) scintillator are carried out in air from a dump room at SACLA, an x-ray free-electron laser facility with 7.8-GeV electrons and are compared to the simulations.

  1. A Benchmarking Study of High Energy Carbon Ion Induced Neutron Using Several Monte Carlo Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, D. H.; Oh, J. H.; Jung, N. S.; Lee, H. S. [Pohang Accelerator Laboratory, Pohang (Korea, Republic of); Shin, Y. S.; Kwon, D. Y.; Kim, Y. M. [Catholic Univ., Gyeongsan (Korea, Republic of); Oranj, L. Mokhtari [POSTECH, Pohang (Korea, Republic of)

    2014-10-15

    In this study, a benchmarking study was done for the representative particle interaction of a heavy ion accelerator, especially the carbon-induced reaction. The secondary neutron is an important particle in the shielding analysis for defining the source term and the penetration ability of radiation fields. The performance of each Monte Carlo code was verified for the selected codes: MCNPX 2.7, PHITS 2.64 and FLUKA 2011.2b.6. For this benchmarking study, the experimental data of Kurosawa et al. in the SINBAD database of the NEA were applied. The calculated results for the differential neutron yield produced from several materials irradiated by a high energy carbon beam reproduced the experimental data well, within small uncertainty, but the MCNPX results showed a large discrepancy with the experimental data, especially at forward angles. The calculated results were slightly lower than the experimental ones, and this was clearest for lower incident carbon energies, thinner targets and forward angles. As expected, the influence of the different models was found clearly in the forward direction. In shielding analysis, these characteristics of each Monte Carlo code should be considered and utilized to determine the safety margin of a shield thickness.

  2. A Benchmarking Study of High Energy Carbon Ion Induced Neutron Using Several Monte Carlo Codes

    International Nuclear Information System (INIS)

    Kim, D. H.; Oh, J. H.; Jung, N. S.; Lee, H. S.; Shin, Y. S.; Kwon, D. Y.; Kim, Y. M.; Oranj, L. Mokhtari

    2014-01-01

    In this study, a benchmarking study was done for the representative particle interaction of a heavy ion accelerator, especially the carbon-induced reaction. The secondary neutron is an important particle in the shielding analysis for defining the source term and the penetration ability of radiation fields. The performance of each Monte Carlo code was verified for the selected codes: MCNPX 2.7, PHITS 2.64 and FLUKA 2011.2b.6. For this benchmarking study, the experimental data of Kurosawa et al. in the SINBAD database of the NEA were applied. The calculated results for the differential neutron yield produced from several materials irradiated by a high energy carbon beam reproduced the experimental data well, within small uncertainty, but the MCNPX results showed a large discrepancy with the experimental data, especially at forward angles. The calculated results were slightly lower than the experimental ones, and this was clearest for lower incident carbon energies, thinner targets and forward angles. As expected, the influence of the different models was found clearly in the forward direction. In shielding analysis, these characteristics of each Monte Carlo code should be considered and utilized to determine the safety margin of a shield thickness

  3. Depth profiles of production yields of natPb(p, xn)206,205,204,203,202Bi reactions using 100-MeV proton beam

    Directory of Open Access Journals (Sweden)

    Oranj Leila Mokhtari

    2017-01-01

    In this study, results of an experimental study on the depth profiles of the production yields of 206,205,204,203,202Bi radionuclides in a natural Pb target irradiated by a 100-MeV proton beam are presented. Irradiation was performed at the proton linac facility (KOMAC) in Korea. The target, irradiated by 100-MeV protons, was arranged in a stack consisting of natural Pb, Al and Au foils and Pb plates. The proton beam intensity was determined by the activation analysis method using the 27Al(p, 3p1n)24Na, 197Au(p, p1n)196Au, and 197Au(p, p3n)194Au monitor reactions, and also by a dosimetry method using a Gafchromic film. The production yields of the Bi radionuclides produced in the natural Pb foils and the monitor reactions were measured by gamma-ray spectroscopy. Monte Carlo simulations were performed with the FLUKA, PHITS, and MCNPX codes and compared with the measurements in order to verify the validity of the physical models and nuclear data libraries in the Monte Carlo codes. A fairly good agreement was observed between the present experimental data and the simulations by FLUKA, PHITS, and MCNPX. However, the physical models and the nuclear data relevant to the end of the proton range in the codes need to be improved.
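
    The monitor-reaction step used above to fix the beam intensity rests on the standard activation relation A0 = I_p * n * sigma * (1 - exp(-lambda * t_irr)); the snippet below solves it for the proton rate with invented activity and cross-section values (only the 24Na half-life and aluminium bulk properties are physical constants):

        import numpy as np

        # Illustrative (made-up) values for a 27Al(p, 3p1n)24Na monitor foil.
        A0 = 2.0e4              # 24Na activity at end of bombardment, Bq
        sigma = 10.0e-27        # assumed monitor cross section, cm^2 (10 mb)
        t_half = 14.96 * 3600   # 24Na half-life (~15 h), s
        lam = np.log(2) / t_half
        t_irr = 3600.0          # irradiation time, s

        # Areal atom density of the foil (atoms/cm^2), illustrative thickness.
        foil_thick = 0.01       # cm
        rho, M = 2.70, 26.98    # g/cm^3 and g/mol for Al
        n_areal = rho * foil_thick / M * 6.022e23

        # Activation relation: A0 = I_p * n_areal * sigma * (1 - exp(-lam*t_irr))
        I_p = A0 / (n_areal * sigma * (1.0 - np.exp(-lam * t_irr)))
        print(f"beam intensity ~ {I_p:.2e} protons/s")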

  4. The FLUKA Monte Carlo code coupled with the local effect model for biological calculations in carbon ion therapy

    CERN Document Server

    Mairani, A; Kraemer, M; Sommerer, F; Parodi, K; Scholz, M; Cerutti, F; Ferrari, A; Fasso, A

    2010-01-01

    Clinical Monte Carlo (MC) calculations for carbon ion therapy have to provide absorbed and RBE-weighted dose. The latter is defined as the product of the dose and the relative biological effectiveness (RBE). At the GSI Helmholtzzentrum fur Schwerionenforschung as well as at the Heidelberg Ion Therapy Center (HIT), the RBE values are calculated according to the local effect model (LEM). In this paper, we describe the approach followed for coupling the FLUKA MC code with the LEM and its application to dose and RBE-weighted dose calculations for a superimposition of two opposed C-12 ion fields as applied in therapeutic irradiations. The obtained results are compared with the available experimental data of CHO (Chinese hamster ovary) cell survival and the outcomes of the GSI analytical treatment planning code TRiP98. Some discrepancies have been observed between the analytical and MC calculations of absorbed physical dose profiles, which can be explained by the differences between the laterally integrated depth-d...

  5. FLUKA and PENELOPE simulations of 10 keV to 10 MeV photons in LYSO and soft tissue

    CERN Document Server

    Chin, M P W; Fassò, A; Ferrari, A; Ortega, P G; Sala, P R

    2014-01-01

    Monte Carlo simulations of electromagnetic particle interactions and transport by FLUKA and PENELOPE were compared. Incident photon beams from 10 keV to 10 MeV impinged on a LYSO crystal and a soft-tissue phantom. Central-axis as well as off-axis depth doses agreed within 1 s.d.; no systematic under- or overestimate of the pulse height spectra was observed from 100 keV to 10 MeV and, for both materials, agreement was within 5%. Simulation of photon and electron transport and interactions at this level of precision and reliability is of significant impact, for instance, on treatment monitoring of hadrontherapy, where a code like FLUKA is needed to simulate the full suite of particles and interactions (not just electromagnetic). At the interaction-by-interaction level, apart from known differences in condensed history techniques, two-quanta positron annihilation at rest was found to differ between the two codes: PENELOPE produced a sharp 511 keV line, whereas FLUKA produced visible acolinearity, a feature recently implemen...

  6. A Platform to Build Mobile Health Apps: The Personal Health Intervention Toolkit (PHIT).

    Science.gov (United States)

    Eckhoff, Randall Peter; Kizakevich, Paul Nicholas; Bakalov, Vesselina; Zhang, Yuying; Bryant, Stephanie Patrice; Hobbs, Maria Ann

    2015-06-01

    Personal Health Intervention Toolkit (PHIT) is an advanced cross-platform software framework targeted at personal self-help research on mobile devices. Following the subjective and objective measurement, assessment, and plan methodology for health assessment and intervention recommendations, the PHIT platform lets researchers quickly build mobile health research Android and iOS apps. They can (1) create complex data-collection instruments using a simple extensible markup language (XML) schema; (2) use Bluetooth wireless sensors; (3) create targeted self-help interventions based on collected data via XML-coded logic; (4) facilitate cross-study reuse from the library of existing instruments and interventions such as stress, anxiety, sleep quality, and substance abuse; and (5) monitor longitudinal intervention studies via daily upload to a Web-based dashboard portal. For physiological data, Bluetooth sensors collect real-time data with on-device processing. For example, using the BinarHeartSensor, the PHIT platform processes the heart rate data into heart rate variability measures, and plots these data as time-series waveforms. Subjective data instruments are user data-entry screens, comprising a series of forms with validation and processing logic. The PHIT instrument library consists of over 70 reusable instruments for various domains including cognitive, environmental, psychiatric, psychosocial, and substance abuse. Many are standardized instruments, such as the Alcohol Use Disorder Identification Test, Patient Health Questionnaire-8, and Post-Traumatic Stress Disorder Checklist. Autonomous instruments such as battery and global positioning system location support continuous background data collection. All data are acquired using a schedule appropriate to the app's deployment. The PHIT intelligent virtual advisor (iVA) is an expert system logic layer, which analyzes the data in real time on the device. This data analysis results in a tailored app of interventions

  7. An integral test of FLUKA nuclear models with 160 MeV proton beams in multi-layer Faraday cups

    International Nuclear Information System (INIS)

    Rinaldi, I; Ferrari, A; Mairani, A; Parodi, K; Paganetti, H; Sala, P

    2011-01-01

    Monte Carlo (MC) codes are useful tools to simulate the complex processes of proton beam interactions with matter. In proton therapy, nuclear reactions influence the dose distribution. Therefore, the validation of nuclear models adopted in MC codes is a critical requisite for their use in this field. A simple integral test can be performed using a multi-layer Faraday cup (MLFC). This method allows separation of the nuclear and atomic interaction processes, which are responsible for secondary particle emission and the finite primary proton range, respectively. In this work, the propagation of 160 MeV protons stopping in two MLFCs made of polyethylene and copper has been simulated by the FLUKA MC code. The calculations have been performed with and without secondary electron emission and transport, as well as charge sharing in the dielectric layers. Previous results with other codes neglected those two effects. The impact of this approximation has been investigated and found to be relevant only in the proximity of the Bragg peak. Longitudinal charge distributions computed with FLUKA with both approaches have been compared with experimental data from the literature. Moreover, the contribution of different processes to the measurable signal has been addressed. A thorough analysis of the results has demonstrated that the nuclear and electromagnetic models of FLUKA reproduce the two sets of experimental data reasonably well.

  8. A dedicated tool for PET scanner simulations using FLUKA

    International Nuclear Information System (INIS)

    Ortega, P.G.; Boehlen, T.T.; Cerutti, F.; Chin, M.P.W.; Ferrari, A.; Mancini, C.; Vlachoudis, V.; Mairani, A.; Sala, Paola R.

    2013-06-01

    Positron emission tomography (PET) is a well-established medical imaging technique. It is based on the detection of pairs of annihilation gamma rays from a beta+-emitting radionuclide, usually inoculated in the body via a biologically active molecule. Apart from its wide-spread use for clinical diagnosis, new applications are proposed. This includes notably the usage of PET for treatment monitoring of radiation therapy with protons and ions. PET is currently the only available technique for non-invasive monitoring of ion beam dose delivery, which was tested in several clinical pilot studies. For hadrontherapy, the distribution of positron emitters, produced by the ion beam, can be analyzed to verify the correct treatment delivery. The adaptation of previous PET scanners to new environments and the necessity of more precise diagnostics by better image quality triggered the development of new PET scanner designs. The use of Monte Carlo (MC) codes is essential in the early stages of the scanner design to simulate the transport of particles and nuclear interactions from therapeutic ion beams or radioisotopes and to predict radiation fields in tissues and radiation emerging from the patient. In particular, range verification using PET is based on the comparison of detected and simulated activity distributions. The accuracy of the MC code for the relevant physics processes is obviously essential for such applications. In this work we present new developments of the physics models with importance for PET monitoring and integrated tools for PET scanner simulations for FLUKA, a fully-integrated MC particle-transport code, which is widely used for an extended range of applications (accelerator shielding, detector and target design, calorimetry, activation, dosimetry, medical physics, radiobiology, ...). The developed tools include a PET scanner geometry builder and a dedicated scoring routine for coincident event determination. The geometry builder allows the efficient
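
    A scoring routine for coincident event determination of the kind mentioned above reduces, in essence, to pairing single 511 keV detections that fall within a coincidence time window; the snippet below shows that logic on fabricated hit data and is not the FLUKA routine itself:

        # Hypothetical singles: (time in ns, crystal id, deposited energy in keV).
        singles = [
            (10.0, 3, 511.0), (10.4, 57, 508.0),   # a true coincidence pair
            (25.0, 12, 511.0),                     # unpaired single
            (40.0, 8, 300.0), (40.2, 61, 511.0),   # rejected: energy out of window
        ]

        TIME_WINDOW = 4.0             # ns
        E_LOW, E_HIGH = 450.0, 560.0  # keV energy acceptance window

        singles.sort(key=lambda hit: hit[0])
        coincidences = []
        i = 0
        while i < len(singles) - 1:
            t1, id1, e1 = singles[i]
            t2, id2, e2 = singles[i + 1]
            in_energy = E_LOW <= e1 <= E_HIGH and E_LOW <= e2 <= E_HIGH
            if (t2 - t1) <= TIME_WINDOW and id1 != id2 and in_energy:
                coincidences.append((id1, id2, 0.5 * (t1 + t2)))
                i += 2                # both singles consumed by this coincidence
            else:
                i += 1

        print(coincidences)           # [(3, 57, 10.2)]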

  9. Radiation protection studies for medical particle accelerators using FLUKA Monte Carlo code

    International Nuclear Information System (INIS)

    Infantino, Angelo; Mostacci, Domiziano; Cicoria, Gianfranco; Lucconi, Giulia; Pancaldi, Davide; Vichi, Sara; Zagni, Federico; Marengo, Mario

    2017-01-01

    Radiation protection (RP) in the use of medical cyclotrons involves many aspects both in the routine use and for the decommissioning of a site. Guidelines for site planning and installation, as well as for RP assessment, are given in international documents; however, the latter typically offer analytic methods of calculation of shielding and materials activation, in approximate or idealised geometry set-ups. The availability of Monte Carlo (MC) codes with accurate up-to-date libraries for transport and interaction of neutrons and charged particles at energies below 250 MeV, together with the continuously increasing power of modern computers, makes the systematic use of simulations with realistic geometries possible, yielding equipment and site-specific evaluation of the source terms, shielding requirements and all quantities relevant to RP at the same time. In this work, the well-known FLUKA MC code was used to simulate different aspects of RP in the use of biomedical accelerators, particularly for the production of medical radioisotopes. In the context of the Young Professionals Award, held at the IRPA 14 conference, only a part of the complete work is presented. In particular, the simulation of the GE PETtrace cyclotron (16.5 MeV) installed at S. Orsola-Malpighi University Hospital evaluated the effective dose distribution around the equipment; the effective number of neutrons produced per incident proton and their spectral distribution; the activation of the structure of the cyclotron and the vault walls; the activation of the ambient air, in particular the production of {sup 41}Ar. The simulations were validated, in terms of physical and transport parameters to be used at the energy range of interest, through an extensive measurement campaign of the neutron environmental dose equivalent using a rem-counter and TLD dosemeters. The validated model was then used in the design and the licensing request of a new Positron Emission Tomography facility. (authors)

  10. The FLUKA code for application of Monte Carlo methods to promote high precision ion beam therapy

    CERN Document Server

    Parodi, K; Cerutti, F; Ferrari, A; Mairani, A; Paganetti, H; Sommerer, F

    2010-01-01

    Monte Carlo (MC) methods are increasingly being utilized to support several aspects of commissioning and clinical operation of ion beam therapy facilities. In this contribution two emerging areas of MC applications are outlined. The value of MC modeling to promote accurate treatment planning is addressed via examples of application of the FLUKA code to proton and carbon ion therapy at the Heidelberg Ion Beam Therapy Center in Heidelberg, Germany, and at the Proton Therapy Center of Massachusetts General Hospital (MGH) Boston, USA. These include generation of basic data for input into the treatment planning system (TPS) and validation of the TPS analytical pencil-beam dose computations. Moreover, we review the implementation of PET/CT (Positron-Emission-Tomography / Computed-Tomography) imaging for in-vivo verification of proton therapy at MGH. Here, MC is used to calculate irradiation-induced positron-emitter production in tissue for comparison with the β+-activity measurement in order to infer indirect infor...

  11. A new three-tier architecture design for multi-sphere neutron spectrometer with the FLUKA code

    Science.gov (United States)

    Huang, Hong; Yang, Jian-Bo; Tuo, Xian-Guo; Liu, Zhi; Wang, Qi-Biao; Wang, Xu

    2016-07-01

    The currently commercially available Bonner sphere neutron spectrometer (BSS) has high sensitivity to neutrons below 20 MeV, which leaves it poorly placed to measure neutrons ranging from a few MeV to 100 MeV. In this paper, moderator layers and an auxiliary material layer were added around 3He proportional counters with the FLUKA code, with a view to improving the response. The results showed that the response peaks for neutrons below 20 MeV gradually shift to a higher energy region and decrease slightly with increasing moderator thickness. On the contrary, the response for neutrons above 20 MeV remained very low until auxiliary materials such as copper (Cu), lead (Pb) or tungsten (W) were embedded into the moderator layers. The most suitable auxiliary material, Pb, was chosen to design a three-tier architecture multi-sphere neutron spectrometer (NBSS). Through calculation and comparison, the NBSS was found advantageous in terms of response for 5-100 MeV, and the highest response was 35.2 times the response of a polyethylene (PE) ball with the same PE thickness.
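
    The reading of each sphere in such a spectrometer is the fold of its energy-dependent response function with the neutron spectrum; the sketch below evaluates that fold for three hypothetical configurations (all response values and fluences are invented) to show why an embedded high-Z layer raises the contribution of neutrons above 20 MeV:

        import numpy as np

        # Illustrative energy group midpoints (MeV).
        e_mid = np.array([1e-8, 1e-6, 1e-3, 1.0, 20.0, 100.0])

        # Hypothetical response functions R(E) in cm^2 for three configurations:
        # bare moderator, thicker moderator, moderator with an embedded Pb shell.
        responses = {
            "PE 3 in":      np.array([2.0, 3.5, 2.5, 0.8, 0.05, 0.01]),
            "PE 8 in":      np.array([0.2, 1.0, 2.8, 3.0, 0.30, 0.05]),
            "PE 8 in + Pb": np.array([0.2, 0.9, 2.5, 2.8, 1.50, 1.10]),
        }

        # Hypothetical neutron fluence per group (n/cm^2).
        phi = np.array([1e4, 5e4, 2e5, 1e5, 3e4, 1e4])

        for name, R in responses.items():
            # Reading = sum over groups of R(E) * phi(E).
            print(f"{name:12s} counts = {np.dot(R, phi):.3e}")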

  12. A Monte Carlo code for ion beam therapy

    CERN Multimedia

    Anaïs Schaeffer

    2012-01-01

    Initially developed for applications in detector and accelerator physics, the modern Fluka Monte Carlo code is now used in many different areas of nuclear science. Over the last 25 years, the code has evolved to include new features, such as ion beam simulations. Given the growing use of these beams in cancer treatment, Fluka simulations are being used to design treatment plans in several hadron-therapy centres in Europe.   Fluka calculates the dose distribution for a patient treated at CNAO with proton beams. The colour-bar displays the normalized dose values. Fluka is a Monte Carlo code that very accurately simulates electromagnetic and nuclear interactions in matter. In the 1990s, in collaboration with NASA, the code was developed to predict potential radiation hazards received by space crews during possible future trips to Mars. Over the years, it has become the standard tool to investigate beam-machine interactions, radiation damage and radioprotection issues in the CERN accelerator com...

  13. An investigation of the neutron flux in bone-fluorine phantoms comparing accelerator based in vivo neutron activation analysis and FLUKA simulation data

    International Nuclear Information System (INIS)

    Mostafaei, F.; McNeill, F.E.; Chettle, D.R.; Matysiak, W.; Bhatia, C.; Prestwich, W.V.

    2015-01-01

    We have tested the Monte Carlo code FLUKA for its ability to assist in the development of a better system for the in vivo measurement of fluorine. We used it to create a neutron flux map of the inside of the in vivo neutron activation analysis irradiation cavity at the McMaster Accelerator Laboratory. The cavity is used in a system that has been developed for assessment of fluorine levels in the human hand. This study was undertaken to (i) assess the FLUKA code, (ii) find the optimal hand position inside the cavity and assess the effects on precision of a hand being in a non-optimal position and (iii) to determine the best location for our γ-ray detection system within the accelerator beam hall. Simulation estimates were performed using FLUKA. Experimental measurements of the neutron flux were performed using Mn wires. The activation of the wires was measured inside (1) an empty bottle, (2) a bottle containing water, (3) a bottle covered with cadmium and (4) a dry powder-based fluorine phantom. FLUKA was used to simulate the irradiation cavity, and used to estimate the neutron flux in different positions both inside, and external to, the cavity. The experimental results were found to be consistent with the Monte Carlo simulated neutron flux. Both experiment and simulation showed that there is an optimal position in the cavity, but that the effect on the thermal flux of a hand being in a non-optimal position is less than 20%, which will result in a less than 10% effect on the measurement precision. FLUKA appears to be a code that can be useful for modeling of this type of experimental system

  14. SU-E-T-590: Optimizing Magnetic Field Strengths with Matlab for An Ion-Optic System in Particle Therapy Consisting of Two Quadrupole Magnets for Subsequent Simulations with the Monte-Carlo Code FLUKA

    International Nuclear Information System (INIS)

    Baumann, K; Weber, U; Simeonov, Y; Zink, K

    2015-01-01

    Purpose: Aim of this study was to optimize the magnetic field strengths of two quadrupole magnets in a particle therapy facility in order to obtain a beam quality suitable for spot beam scanning. Methods: The particle transport through an ion-optic system of a particle therapy facility consisting of the beam tube, two quadrupole magnets and a beam monitor system was calculated with the help of Matlab by using matrices that solve the equation of motion of a charged particle in a magnetic field and field-free region, respectively. The magnetic field strengths were optimized in order to obtain a circular and thin beam spot at the iso-center of the therapy facility. These optimized field strengths were subsequently transferred to the Monte-Carlo code FLUKA and the transport of 80 MeV/u C12-ions through this ion-optic system was calculated by using a user-routine to implement magnetic fields. The fluence along the beam-axis and at the iso-center was evaluated. Results: The magnetic field strengths could be optimized by using Matlab and transferred to the Monte-Carlo code FLUKA. The implementation via a user-routine was successful. Analyzing the fluence-pattern along the beam-axis the characteristic focusing and de-focusing effects of the quadrupole magnets could be reproduced. Furthermore the beam spot at the iso-center was circular and significantly thinner compared to an unfocused beam. Conclusion: In this study a Matlab tool was developed to optimize magnetic field strengths for an ion-optic system consisting of two quadrupole magnets as part of a particle therapy facility. These magnetic field strengths could subsequently be transferred to and implemented in the Monte-Carlo code FLUKA to simulate the particle transport through this optimized ion-optic system
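
    The matrix method referred to above can be illustrated as follows; the abstract's tool is written in Matlab, but the same idea is shown here in Python with invented magnet strengths and lengths: drifts and thick-lens quadrupoles are represented by 2x2 transfer matrices whose product propagates a particle's transverse position and angle to the iso-center.

        import numpy as np

        def drift(L):
            # Transfer matrix of a field-free drift of length L (m).
            return np.array([[1.0, L], [0.0, 1.0]])

        def quad_focusing(k, L):
            # Thick-lens quadrupole, focusing plane; k = sqrt(|K|) in 1/m,
            # where K is the normalized gradient and L the magnet length (m).
            c, s = np.cos(k * L), np.sin(k * L)
            return np.array([[c, s / k], [-k * s, c]])

        def quad_defocusing(k, L):
            # The same magnet seen in the defocusing plane.
            c, s = np.cosh(k * L), np.sinh(k * L)
            return np.array([[c, s / k], [k * s, c]])

        # Invented lattice: drift, quad 1 (focusing), drift, quad 2 (defocusing),
        # drift; transfer matrices compose right to left (first element rightmost).
        k1, k2, Lq = 2.0, 1.8, 0.3
        M = drift(1.0) @ quad_defocusing(k2, Lq) @ drift(0.5) @ quad_focusing(k1, Lq) @ drift(1.0)

        x0 = np.array([0.002, 0.001])   # initial offset (m) and angle (rad)
        x_end = M @ x0                  # position and angle at the end of the line
        print(x_end)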

  15. Energy deposition profile on ISOLDE Beam Dumps by FLUKA simulations

    CERN Document Server

    Vlachoudis, V

    2014-01-01

    In this report an estimate of the energy deposited in the current ISOLDE beam dumps, obtained by means of the FLUKA simulation code, is presented for both the GPS and the HRS dumps. Some estimates of the temperature rise are given based on the assumption of an adiabatic increase from the energy deposited by the impinging protons. However, the results obtained here in relation to temperature are only a rough estimate; they are meant to be studied further through thermomechanical simulations using the energy profiles hereby obtained.
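
    The adiabatic estimate mentioned above follows directly from dT = dE / (m * c_p); the few lines below evaluate it for invented pulse and dump parameters, purely to show the arithmetic, not the actual ISOLDE values:

        # Illustrative values only (not the actual ISOLDE dump parameters).
        protons_per_pulse = 3.0e13        # protons in one pulse
        edep_per_proton_GeV = 0.5         # energy deposited in the hot region per proton
        mass_hot_region_kg = 0.010        # mass of that region
        c_p = 385.0                       # specific heat of copper, J/(kg*K)

        GeV_to_J = 1.602e-10
        delta_E = protons_per_pulse * edep_per_proton_GeV * GeV_to_J   # J per pulse
        delta_T = delta_E / (mass_hot_region_kg * c_p)                 # adiabatic rise, K

        print(f"energy per pulse: {delta_E:.1f} J  ->  adiabatic dT: {delta_T:.1f} K")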

  16. Reduction and resource recycling of high-level radioactive wastes through nuclear transmutation with PHITS code

    International Nuclear Information System (INIS)

    Fujita, Reiko

    2017-01-01

    In the ImPACT program of the Cabinet Office, programs are underway to reduce the long-lived fission products (LLFP) contained in high-level radioactive waste through nuclear transmutation, or to recycle and utilize useful nuclear species. This paper outlines the program and describes recent achievements. The program consists of five projects: (1) separation/recovery technology, (2) acquisition of nuclear transmutation data, (3) nuclear reaction theory models and simulation, (4) novel nuclear reaction control and development of elemental technology, and (5) discussions on the process concept. Project (1) develops a technology for dissolving vitrified solids, a technology for recovering LLFP from high-level waste liquid, and a laser-based technology for separating odd- and even-mass isotopes. Project (2) acquires new nuclear reaction data for Pd-107, Zr-93, Se-79, and Cs-135 using RIKEN's RIBF or JAEA's J-PARC. Project (3) improves nuclear reaction theory and structure models using the nuclear reaction data measured in (2), improves and upgrades the nuclear reaction simulation code PHITS, and proposes promising nuclear transmutation pathways. Project (4) develops an accelerator that realizes the proposed transmutation routes and its elemental technology. Project (5) performs the conceptual design of the process to realize (1) to (4), and constructs a scenario for reducing and utilizing high-level radioactive waste based on this design. (A.O.)

  17. FLUKA-LIVE: an embedded framework for enabling a computer to execute FLUKA under the control of a Linux OS

    International Nuclear Information System (INIS)

    Cohen, A.; Battistoni, G.; Mark, S.

    2008-01-01

    This paper describes a Linux-based OS framework for integrating the FLUKA Monte Carlo software (currently distributed only for Linux) into a CD-ROM, resulting in a complete environment in which a scientist can edit, link and run FLUKA routines without the need to install a UNIX/Linux operating system. The building process includes generating from scratch a complete operating system distribution which will, when operative, build all necessary components for successful operation of the FLUKA software and libraries. Various source packages, as well as the latest kernel sources, are freely available from the Internet. These sources are used to create a functioning Linux system that integrates several core utilities in line with the main idea: enabling FLUKA to act as if it were running under a popular Linux distribution or even a proprietary UNIX workstation. On boot-up a file system is created and the contents of the CD are uncompressed and completely loaded into RAM, after which the presence of the CD is no longer necessary and it can be removed for use on a second computer. The system can operate on any i386 PC as long as it can boot from a CD

  18. The FLUKA Monte Carlo, Non-Perturbative QCD and Cosmic Ray Cascades

    International Nuclear Information System (INIS)

    Battistoni, G.

    2005-01-01

    The FLUKA Monte Carlo code, presently used in cosmic ray physics, contains packages to sample soft hadronic processes which are built according to the Dual Parton Model. This is a phenomenological model capable of reproducing many of the features of hadronic collisions in the non-perturbative QCD regime. The basic principles of the model are summarized and, as an example, the associated Lambda-K production is discussed. This is a process which has some relevance for the calculation of atmospheric neutrino fluxes

  19. CERN Technical Training 2008: Learning for the LHC! FLUKA Workshop 2008: 23-27 June 2008

    CERN Multimedia

    2008-01-01

    http://www.cern.ch/Fluka2008 FLUKA is a fully integrated particle physics Monte-Carlo simulation package. It has many applications in high energy experimental physics and engineering, shielding, detector and telescope design, cosmic ray studies, dosimetry, medical physics and radio-biology. More information, as well as related publications, can be found on the official FLUKA website (http://www.fluka.org). This year, the CERN FLUKA Team, in collaboration with INFN and SC/RP, is organizing a FLUKA beginner's course, held at CERN for the first time. Previous one-week courses were given in Frascati (Italy), twice in Houston (Texas, US), Pavia (Italy), as well as in Legnaro (Italy). At CERN, continuous lectures are provided in the framework of locally scheduled ‘FLUKA User Meetings’ (http://www.cern.ch/info-fluka-discussion). This new dedicated one-week CERN training course will be an opportunity for new users to learn the basics of FLUKA, as well as offering the possibility to broaden their knowledge about t...

  1. Using the FLUKA Monte Carlo Code to Simulate the Interactions of Ionizing Radiation with Matter to Assist and Aid Our Understanding of Ground Based Accelerator Testing, Space Hardware Design, and Secondary Space Radiation Environments

    Science.gov (United States)

    Reddell, Brandon

    2015-01-01

    Designing hardware to operate in the space radiation environment is a very difficult and costly activity. Ground based particle accelerators can be used to test for exposure to the radiation environment, one species at a time; however, the actual space environment cannot be duplicated because of the range of energies and the isotropic nature of space radiation. The FLUKA Monte Carlo code is an integrated physics package based at CERN that has been under development for the last 40+ years and includes the most up-to-date fundamental physics theory and particle physics data. This work presents an overview of FLUKA and how it has been used in conjunction with ground based radiation testing for NASA to improve our understanding of secondary particle environments resulting from the interaction of space radiation with matter.

  2. Using FLUKA to Study Concrete Square Shield Performance in Attenuation of Neutron Radiation Produced by APF Plasma Focus Neutron Source

    Science.gov (United States)

    Nemati, M. J.; Habibi, M.; Amrollahi, R.

    2013-04-01

    In 2010, representatives from the Nuclear Engineering and Physics Department of Amirkabir University of Technology (AUT) requested the development of a project with the objective of determining the performance of a concrete shield for their plasma focus neutron source. The project team in the laboratory of the Nuclear Engineering and Physics Department chose several shield shapes whose performance was to be studied with a Monte Carlo code. In the present work, the capability of the Monte Carlo code FLUKA is explored to model the APF plasma focus and to investigate the neutron fluence on the square concrete shield in each region of the problem. The physical models embedded in FLUKA are mentioned, as well as examples of benchmarking against future experimental data. As a result of this study, a suitable thickness of concrete for shielding the APF is proposed.

  3. Measurements and FLUKA simulations of bismuth and aluminium activation at the CERN Shielding Benchmark Facility (CSBF)

    Science.gov (United States)

    Iliopoulou, E.; Bamidis, P.; Brugger, M.; Froeschl, R.; Infantino, A.; Kajimoto, T.; Nakao, N.; Roesler, S.; Sanami, T.; Siountas, A.

    2018-03-01

    The CERN High Energy AcceleRator Mixed field facility (CHARM) is located in the CERN Proton Synchrotron (PS) East Experimental Area. The facility receives a pulsed proton beam from the CERN PS with a beam momentum of 24 GeV/c, with 5·10^11 protons per pulse, a pulse length of 350 ms and a maximum average beam intensity of 6.7·10^10 p/s, which impacts on the CHARM target. The shielding of the CHARM facility also includes the CERN Shielding Benchmark Facility (CSBF) situated laterally above the target. This facility consists of 80 cm of cast iron and 360 cm of concrete, with barite concrete in some places. Activation samples of bismuth and aluminium were placed in the CSBF and in the CHARM access corridor in July 2015. Monte Carlo simulations with the FLUKA code have been performed to estimate the specific production yields for these samples. The results estimated by the FLUKA Monte Carlo simulations are compared to activation measurements of these samples. The comparison between FLUKA simulations and the measured values from γ-spectrometry gives an agreement better than a factor of 2.

  4. PHIT for Duty, a Mobile Application for Stress Reduction, Sleep Improvement, and Alcohol Moderation.

    Science.gov (United States)

    Kizakevich, Paul N; Eckhoff, Randall; Brown, Janice; Tueller, Stephen J; Weimer, Belinda; Bell, Stacey; Weeks, Adam; Hourani, Laurel L; Spira, James L; King, Laurel A

    2018-03-01

    Post-traumatic stress and other problems often occur after combat, deployment, and other military operations. Because techniques such as mindfulness meditation show efficacy in improving mental health, our team developed a mobile application (app) for individuals in the armed forces with subclinical psychological problems as secondary prevention of more significant disease. Based on the Personal Health Intervention Toolkit (PHIT), a mobile app framework for personalized health intervention studies, PHIT for Duty integrates mindfulness-based relaxation, behavioral education in sleep quality and alcohol use, and psychometric and psychophysiological data capture. We evaluated PHIT for Duty in usability and health assessment studies to establish app quality for use in health research. Participants (N = 31) rated usability on a 1 (very hard) to 5 (very easy) scale and also completed the System Usability Scale (SUS) questionnaire (N = 9). Results were (mean ± SD) overall (4.5 ± 0.6), self-report instruments (4.5 ± 0.7), pulse sensor (3.7 ± 1.2), sleep monitor (4.4 ± 0.7), sleep monitor comfort (3.7 ± 1.1), and wrist actigraphy comfort (2.7 ± 0.9). The average SUS score was 85 ± 12, indicating a rank of 95%. A comparison of PHIT-based assessments to traditional paper forms demonstrated a high overall correlation (r = 0.87). These evaluations of usability, health assessment accuracy, physiological sensing, system acceptability, and overall functionality have shown positive results and affirmation for using the PHIT framework and PHIT for Duty application in mobile health research.

  5. FLUKA shielding calculations for the FAIR project

    International Nuclear Information System (INIS)

    Fehrenbacher, Georg; Kozlova, Ekaterina; Radon, Torsten; Sokolov, Alexey

    2015-01-01

    FAIR is an international accelerator project under construction at the GSI Helmholtz Centre for Heavy Ion Research in Darmstadt. The Monte Carlo program FLUKA is used to study radiation protection problems. The contribution deals with the general application possibilities of FLUKA for FAIR with respect to radiation protection planning. The necessity of simulating the radiation transport through shielding of several meters thickness, and of determining the equivalent doses outside the shielding with sufficient accuracy, is demonstrated using two examples that make use of variance reduction. Results of simulation calculations for the estimation of activation in accelerator facilities are presented.

  6. Simulations of MATROSHKA experiments at ISS using PHITS

    CERN Document Server

    Sihver, L; Puchalska, M; Reitz, G

    2010-01-01

    Concerns about the biological effects of space radiation are increasing rapidly due to the prospect of long-duration manned missions, both in relation to the International Space Station (ISS) and to manned interplanetary missions to the Moon and Mars in the future. As a preparation for these long duration space missions it is important to ensure an excellent capability to evaluate the impact of space radiation on human health, in order to secure the safety of the astronauts/cosmonauts and minimize their risks. It is therefore necessary to measure the radiation load on the personnel both inside and outside the space vehicles and to certify that organ and tissue equivalent doses can be simulated as accurately as possible. In this paper we present simulations, using the three-dimensional Monte Carlo Particle and Heavy Ion Transport code System (PHITS), of long term dose measurements performed with the ESA supported experiment MATROSHKA (MTR), which is an anthropomorphic phantom containing over 6000 radiation detecto...

  7. Development of particle and heavy ion transport code system

    International Nuclear Information System (INIS)

    Niita, Koji

    2004-01-01

    The Particle and Heavy Ion Transport code System (PHITS) is a three-dimensional general-purpose Monte Carlo simulation code for describing the transport and reactions of particles and heavy ions in materials. It was developed on the basis of NMTC/JAM for the design and safety assessment of J-PARC. What PHITS is, its physical processes and models, and the development history of the code are described. As examples of application, the evaluation of neutron optics, cancer treatment with heavy-particle beams, and cosmic radiation are presented. The JAM and JQMD models are used as the physical models. Neutron motion in a sextupole magnetic field and in a gravitational field, PHITS simulations of the tracks of a 12C beam and of the secondary neutrons in a small model of the HIMAC cancer treatment device, and the neutron flux in the Space Shuttle are explained. (S.Y.)

  8. Simulation of thermal-neutron-induced single-event upset using particle and heavy-ion transport code system

    International Nuclear Information System (INIS)

    Arita, Yutaka; Kihara, Yuji; Mitsuhasi, Junichi; Niita, Koji; Takai, Mikio; Ogawa, Izumi; Kishimoto, Tadafumi; Yoshihara, Tsutomu

    2007-01-01

    The simulation of a thermal-neutron-induced single-event upset (SEU) was performed on a 0.4-μm-design-rule 4 Mbit static random access memory (SRAM) using the Particle and Heavy-Ion Transport code System (PHITS). The SEU rates obtained by the simulation were in very good agreement with the results of experiments. PHITS is a useful tool for simulating SEUs in semiconductor devices. To further improve the accuracy of the simulation, additional methods for tallying the energy deposition are required in PHITS. (author)

  9. The FLUKA atmospheric neutrino flux calculation

    CERN Document Server

    Battistoni, G.; Montaruli, T.; Sala, P.R.

    2003-01-01

    The 3-dimensional (3-D) calculation of the atmospheric neutrino flux by means of the FLUKA Monte Carlo model is described here in full detail, starting from the latest data on primary cosmic ray spectra. The importance of a 3-D calculation and of its consequences has already been debated in a previous paper. Here instead the focus is on the absolute flux. We stress the relevant aspects of the hadronic interaction model of FLUKA in the atmospheric neutrino flux calculation. This model is constructed and maintained so as to provide a high degree of accuracy in the description of particle production. The accuracy achieved in the comparison with data from accelerators, cross-checked with data on particle production in the atmosphere, certifies the reliability of the shower calculation in the atmosphere. The results presented here can already be used for analysis by current experiments on atmospheric neutrinos. However, they represent an intermediate step towards a final release, since this calculation does not yet include the...

  10. Measurement of neutron production double-differential cross-sections on carbon bombarded with 430 MeV/nucleon carbon ions

    Energy Technology Data Exchange (ETDEWEB)

    Itashiki, Yutaro; Imahayashi, Youichi; Shigyo, Nobuhiro; Uozumi, Yusuke [Kyushu University, Fukuoka (Japan); Satoh, Daiki [Japan Atomic Energy Agency, Ibaraki (Japan); Kajimoto, Tsuyoshi [Hiroshima University, Hiroshima (Japan); Sanami, Toshiya [High Energy Accelerator Research Organization, Ibaraki (Japan); Koba, Yusuke; Matufuji, Naruhiro [Institutes for Quantum and Radiological Science and Technology, Chiba (Japan)

    2016-12-15

    Carbon ion therapy has achieved satisfactory results. However, patients have a risk of developing a secondary cancer. In order to estimate this risk, it is essential to understand particle transport and nuclear reactions in the patient's body. Particle transport Monte Carlo simulation codes are useful tools for this purpose, but since their validation for heavy-ion induced reactions is not sufficient, experimental data on the elementary reaction processes are needed. We measured neutron production double-differential cross-sections (DDXs) for carbon bombarded with a 430 MeV/nucleon carbon beam at the PH2 beam line of the HIMAC facility in NIRS. Neutrons produced in the target were measured by the time-of-flight method with NE213 liquid organic scintillators located at six angles of 15, 30, 45, 60, 75, and 90°. The cross-sections were obtained from 1 MeV to several hundred MeV. The experimental data were compared with calculated results obtained with the Monte Carlo simulation codes PHITS, Geant4, and FLUKA. Of the three codes, PHITS reproduced the neutron production in the elementary carbon-carbon reaction most precisely.
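
    The time-of-flight method converts a measured flight time over a known path into a relativistic neutron kinetic energy, and the DDX then follows from the counts per energy bin normalised to the number of beam particles, target atoms, solid angle and detector efficiency. A minimal Python sketch of both steps is given below; the 3 m flight path and all other numbers are illustrative assumptions, not the experimental parameters.

        import numpy as np

        M_N = 939.565          # neutron rest mass energy [MeV]
        C = 299.792458e6       # speed of light [m/s]

        def tof_to_kinetic_energy(t_ns, flight_path_m):
            """Relativistic neutron kinetic energy [MeV] from a time of flight [ns]."""
            beta = flight_path_m / (t_ns * 1e-9 * C)
            gamma = 1.0 / np.sqrt(1.0 - beta**2)
            return (gamma - 1.0) * M_N

        def ddx(counts, e_edges_mev, n_beam, n_target_per_cm2, solid_angle_sr, efficiency):
            """Double-differential cross-section [mb / (sr MeV)] per energy bin."""
            de = np.diff(e_edges_mev)
            per_reaction = counts / (n_beam * n_target_per_cm2 * solid_angle_sr * de * efficiency)
            return per_reaction * 1e27   # cm^2 -> mb

        # illustrative use: a 3 m flight path and a few measured flight times
        times_ns = np.array([40.0, 60.0, 120.0])
        print(tof_to_kinetic_energy(times_ns, 3.0))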

  11. Inter-comparison of MARS and FLUKA: Predictions on Energy Deposition in LHC IR Quadrupoles

    CERN Document Server

    Hoa, C; Cerutti, F; Ferrari, A

    2008-01-01

    Detailed modellings of the LHC insertion regions (IR) have earlier been performed to evaluate energy deposition in the IR superconducting magnets [1-4]. Proton-proton collisions at 14 TeV in the centre of mass lead to debris, depositing energy in the IR components. To evaluate uncertainties in those simulations and gain further confidence in the tools and approaches used, inter-comparison calculations have been performed with the latest versions of the FLUKA (2006.3b) [5, 6] and MARS15 [7, 8] Monte Carlo codes. These two codes, used worldwide for multi particle interaction and transport in accelerator, detector and shielding components, have been thoroughly benchmarked by the code authors and the user community (see, for example, recent [9, 10]). In the study described below, a better than 5% agreement was obtained for energy deposition calculated with these two codes - based on different independent physics models - for the identical geometry and initial conditions of a simple model representing the IR5 and ...

  12. Inter-comparison of MARS and FLUKA: Predictions on energy deposition in LHC IR quadrupoles

    International Nuclear Information System (INIS)

    Hoa, Christine; Cerutti, F.; Ferrari, A.; Mokhov, N.V.

    2008-01-01

    Detailed modelings of the LHC insertion regions (IR) have earlier been performed to evaluate energy deposition in the IR superconducting magnets [1-4]. Proton-proton collisions at 14 TeV in the centre of mass lead to debris, depositing energy in the IR components. To evaluate uncertainties in those simulations and gain further confidence in the tools and approaches used, inter-comparison calculations have been performed with the latest versions of the FLUKA (2006.3b) [5, 6] and MARS15 [7, 8] Monte Carlo codes. These two codes, used worldwide for multi particle interaction and transport in accelerator, detector and shielding components, have been thoroughly benchmarked by the code authors and the user community (see, for example, recent [9, 10]). In the study described below, a better than 5% agreement was obtained for energy deposition calculated with these two codes--based on different independent physics models--for the identical geometry and initial conditions of a simple model representing the IR5 and its first quadrupole

  13. Technical Note: Defining cyclotron-based clinical scanning proton machines in a FLUKA Monte Carlo system.

    Science.gov (United States)

    Fiorini, Francesca; Schreuder, Niek; Van den Heuvel, Frank

    2018-02-01

    Cyclotron-based pencil beam scanning (PBS) proton machines nowadays represent the majority and the most affordable choice for proton therapy facilities; however, their representation in Monte Carlo (MC) codes is more complex than for passively scattered proton systems or synchrotron-based PBS machines. This is because degraders are used to decrease the energy from the cyclotron maximum to the desired energy, resulting in a spot size, divergence, and energy spread that are unique to the amount of degradation. This manuscript outlines a generalized methodology to characterize a cyclotron-based PBS machine in a general-purpose MC code. The code can then be used to generate clinically relevant plans starting from commercial TPS plans. The described beam is produced at the Provision Proton Therapy Center (Knoxville, TN, USA) using cyclotron-based IBA Proteus Plus equipment. We characterized the Provision beam in the MC code FLUKA using the experimental commissioning data. The code was then validated using experimental data in water phantoms for single pencil beams and larger irregular fields. Comparisons with RayStation TPS plans are also presented. Comparisons of experimental, simulated, and planned dose depositions in water show that the same doses are calculated by both programs inside the target areas, while penumbra differences are found at the field edges. These differences are lower for the MC, with a γ(3%-3 mm) index never below 95%. Extensive explanations of how MC codes can be adapted to simulate cyclotron-based scanning proton machines are given, with the aim of using the MC as a TPS verification tool to check and improve clinical plans. For all the tested cases, we showed that dose differences with respect to experimental data are lower for the MC than for the TPS, implying that the created FLUKA beam model better describes the experimental beam. © 2017 The Authors. Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists
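
    The γ(3%-3 mm) figure of merit quoted above combines a dose-difference and a distance-to-agreement criterion. Below is a minimal one-dimensional sketch of a global gamma-index evaluation between a reference and an evaluated depth-dose curve; the curves, grid and criteria are illustrative assumptions, not the commissioning data.

        import numpy as np

        def gamma_index_1d(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta_mm=3.0):
            """Global 1D gamma index: for each reference point, the minimum over the
            evaluated curve of sqrt((dose diff / dd)^2 + (distance / dta)^2).
            The dose criterion dd is taken relative to the reference maximum."""
            d_norm = dd * d_ref.max()
            gammas = np.empty_like(d_ref)
            for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
                dose_term = (d_eval - dr) / d_norm
                dist_term = (x_eval - xr) / dta_mm
                gammas[i] = np.sqrt(dose_term**2 + dist_term**2).min()
            return gammas

        # illustrative curves (depth in mm, arbitrary dose units), 1 mm range shift
        x = np.linspace(0.0, 150.0, 301)
        measured = np.exp(-((x - 100.0) / 25.0) ** 2)
        simulated = np.exp(-((x - 101.0) / 25.0) ** 2)
        g = gamma_index_1d(x, measured, x, simulated)
        print("pass rate (gamma <= 1):", np.mean(g <= 1.0))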

  14. PHIT for Duty, a Personal Health Intervention Tool for Psychological Health and Traumatic Brain Injury

    Science.gov (United States)

    2015-04-01

    Award Number: W81XWH-11-2-0129. Title: PHIT for Duty, a Personal Health Intervention Tool for Psychological Health and Traumatic Brain Injury. Authors named on the report documentation form include Betty Diamond and Paul N. Kizakevich. The remainder of the record consists of form fragments; the recoverable content notes that PHIT for Duty apps are available on the Apple and Google App stores and mentions ActiSleep, a PHIT-based sleep diary for data collection in an adolescent sleep and marijuana study (National Institute on Drug Abuse).

  15. Benchmarking nuclear models of FLUKA and GEANT4 for carbon ion therapy

    CERN Document Server

    Bohlen, TT; Quesada, J M; Bohlen, T T; Cerutti, F; Gudowska, I; Ferrari, A; Mairani, A

    2010-01-01

    As carbon ions, at therapeutic energies, penetrate tissue, they undergo inelastic nuclear reactions and give rise to significant yields of secondary fragment fluences. Therefore, an accurate prediction of these fluences resulting from the primary carbon interactions is necessary in the patient's body in order to precisely simulate the spatial dose distribution and the resulting biological effect. In this paper, the performance of nuclear fragmentation models of the Monte Carlo transport codes, FLUKA and GEANT4, in tissue-like media and for an energy regime relevant for therapeutic carbon ions is investigated. The ability of these Monte Carlo codes to reproduce experimental data of charge-changing cross sections and integral and differential yields of secondary charged fragments is evaluated. For the fragment yields, the main focus is on the consideration of experimental approximations and uncertainties such as the energy measurement by time-of-flight. For GEANT4, the hadronic models G4BinaryLightIonReaction a...

  16. Applications of the lahet simulation code to relativistic heavy ion detectors

    Energy Technology Data Exchange (ETDEWEB)

    Waters, L.; Gavron, A. [Los Alamos National Lab., NM (United States)

    1991-12-31

    The Los Alamos High Energy Transport (LAHET) simulation code has been applied to test beam data from the lead/scintillator Participant Calorimeter of BNL AGS experiment E814. The LAHET code treats hadronic interactions with the LANL version of the Oak Ridge code HETC. LAHET has now been expanded to handle hadrons with kinetic energies greater than 5 GeV with the FLUKA code, while HETC is used exclusively below 2.0 GeV. FLUKA is phased in linearly between 2.0 and 5.0 GeV. Transport of electrons and photons is done with EGS4, and an interface to the Los Alamos HMCNP3B library based code is provided to analyze neutrons with kinetic energies less than 20 MeV. Excellent agreement is found between the test data and simulation, and results for 2.46 GeV/c protons and pions are illustrated in this article.
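
    The energy-range handoff described above (HETC below 2.0 GeV, FLUKA above 5.0 GeV, a linear phase-in between) can be read as an energy-dependent probability of handing a hadron to FLUKA rather than HETC. A minimal sketch of that selection logic follows; the function names and the stochastic sampling are illustrative, not LAHET's actual interface.

        import random

        def fluka_fraction(kinetic_energy_gev):
            """Probability of treating an interaction with FLUKA rather than HETC,
            phased in linearly between 2.0 and 5.0 GeV as described in the abstract."""
            if kinetic_energy_gev <= 2.0:
                return 0.0
            if kinetic_energy_gev >= 5.0:
                return 1.0
            return (kinetic_energy_gev - 2.0) / (5.0 - 2.0)

        def choose_generator(kinetic_energy_gev, rng=random.random):
            return "FLUKA" if rng() < fluka_fraction(kinetic_energy_gev) else "HETC"

        # at 3.5 GeV roughly half of the interactions would go to each model
        print(fluka_fraction(3.5), choose_generator(3.5))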

  17. Applications of the LAHET simulation code to relativistic heavy ion detectors

    International Nuclear Information System (INIS)

    Waters, L.S.; Gavron, A.

    1991-01-01

    The Los Alamos High Energy Transport (LAHET) simulation code has been applied to test beam data from the lead/scintillator Participant Calorimeter of BNL AGS experiment E814. The LAHET code treats hadronic interactions with the LANL version of the Oak Ridge code HETC. LAHET has now been expanded to handle hadrons with kinetic energies greater than 5 GeV with the FLUKA code, while HETC is used exclusively below 2.0 GeV. FLUKA is phased in linearly between 2.0 and 5.0 GeV. Transport of electrons and photons is done with EGS4, and an interface to the Los Alamos HMCNP3B library based code is provided to analyze neutrons with kinetic energies less than 20 MeV. Excellent agreement is found between the test data and simulation, and results for 2.46 GeV/c protons and pions are illustrated in this article

  18. Development of general-purpose particle and heavy ion transport Monte Carlo code

    International Nuclear Information System (INIS)

    Iwase, Hiroshi; Nakamura, Takashi; Niita, Koji

    2002-01-01

    The high-energy particle transport code NMTC/JAM, which has been developed at JAERI, was improved for high-energy heavy ion transport calculations by incorporating the JQMD code, the SPAR code and the Shen formula. The new NMTC/JAM, named PHITS (Particle and Heavy-Ion Transport code System), is the first general-purpose heavy ion transport Monte Carlo code covering incident energies from several MeV/nucleon to several GeV/nucleon. (author)

  19. The Fluka Linebuilder and Element Database: Tools for Building Complex Models of Accelerators Beam Lines

    CERN Document Server

    Mereghetti, A; Cerutti, F; Versaci, R; Vlachoudis, V

    2012-01-01

    Extended FLUKA models of accelerator beam lines can be extremely complex: cumbersome to manipulate, poorly versatile and prone to mismatched positioning. We developed a framework capable of creating the FLUKA model of an arbitrary portion of a given accelerator, starting from the optics configuration and a few other pieces of information provided by the user. The framework includes a builder (LineBuilder), an element database and a series of configuration and analysis scripts. The LineBuilder is a Python program aimed at dynamically assembling complex FLUKA models of accelerator beam lines: positions, magnetic fields and scorings are automatically set up, and geometry details such as apertures of collimators, tilting and misalignment of elements, beam pipes and tunnel geometries can be entered at the user's will. The element database (FEDB) is a collection of detailed FLUKA geometry models of machine elements. This framework has been widely used for recent LHC and SPS beam-machine interaction studies at CERN, and led to a dra...
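
    As a rough illustration of the dynamic-assembly idea (this is not the actual LineBuilder code, nor the FEDB format), a builder of this kind walks an optics table, looks each element type up in a database of prototype geometries, and emits placed instances with their longitudinal extent and field.

        from dataclasses import dataclass

        @dataclass
        class Prototype:              # entry of a (hypothetical) element database
            kind: str
            length_m: float

        @dataclass
        class PlacedElement:          # one element of the assembled beam-line model
            name: str
            kind: str
            s_start_m: float
            s_end_m: float
            field: float              # illustrative field/strength value

        def build_line(optics_rows, database):
            """Assemble placed elements from (name, type, s position, field) rows."""
            line = []
            for name, kind, s_m, field in optics_rows:
                proto = database[kind]
                line.append(PlacedElement(name, kind, s_m, s_m + proto.length_m, field))
            return line

        db = {"MB": Prototype("dipole", 14.3), "MQ": Prototype("quadrupole", 3.1)}
        optics = [("MB.A1", "MB", 0.0, 8.33), ("MQ.1", "MQ", 20.0, 1.2)]
        for element in build_line(optics, db):
            print(element)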

  20. Design of 6 MeV linear accelerator based pulsed thermal neutron source: FLUKA simulation and experiment

    Energy Technology Data Exchange (ETDEWEB)

    Patil, B.J., E-mail: bjp@physics.unipune.ac.in [Department of Physics, University of Pune, Pune 411 007 (India); Chavan, S.T.; Pethe, S.N.; Krishnan, R. [SAMEER, IIT Powai Campus, Mumbai 400 076 (India); Bhoraskar, V.N. [Department of Physics, University of Pune, Pune 411 007 (India); Dhole, S.D., E-mail: sanjay@physics.unipune.ac.in [Department of Physics, University of Pune, Pune 411 007 (India)

    2012-01-15

    The 6 MeV LINAC based pulsed thermal neutron source has been designed for bulk materials analysis. The design was optimized by varying different parameters of the target and the materials of each region using the FLUKA code. The optimized design of the thermal neutron source gives a flux of 3 × 10^6 n cm^-2 s^-1 with more than 80% thermal neutrons and a neutron-to-gamma ratio of 1 × 10^4 n cm^-2 mR^-1. The results of the prototype experiment and the simulation are found to be in good agreement with each other. - Highlights: The 6 MeV linear accelerator based thermal neutron source was optimized using FLUKA simulation. Beryllium is used as a photonuclear target and reflector, polyethylene as a filter and shield, and graphite as a moderator. The optimized pulsed thermal neutron source gives a neutron flux of 3 × 10^6 n cm^-2 s^-1. Results of the prototype experiment were compared with simulations and are found to be in good agreement. This source can effectively be used for the study of bulk material analysis and activation products.

  1. Procedures used during the verification of shielding and access-ways at CERN's Large Hadron Collider (LHC) using the FLUKA code

    International Nuclear Information System (INIS)

    Ferrari, A.; Huhtinen, M.; Rollet, S.; Stevenson, G.R.

    1997-01-01

    Several examples are given which illustrate the special features of the Monte-Carlo cascade simulation program FLUKA, as used in the verification studies of shielding for the LHC. These include the use of different estimators for dose equivalent; region importance weighting with particle splitting, Russian roulette and weight windows, both at region boundaries and in secondary production at inelastic reactions; and decay-length biasing in order to favour secondary particle production. (author)
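
    The importance-weighting scheme mentioned above splits particles entering regions of higher importance and plays Russian roulette on particles entering regions of lower importance, adjusting statistical weights so that the estimate stays unbiased. A minimal sketch of that bookkeeping is given below; it is illustrative only, not FLUKA's implementation.

        import random

        def cross_boundary(weight, importance_from, importance_to, rng=random.random):
            """Return the weights of the particles continuing after a region crossing.

            ratio > 1: split into roughly `ratio` copies with reduced weight.
            ratio < 1: Russian roulette; survivors carry weight / ratio.
            The expected total weight is conserved, keeping the estimator unbiased."""
            ratio = importance_to / importance_from
            if ratio >= 1.0:
                n = int(ratio)
                if rng() < ratio - n:        # treat the fractional part stochastically
                    n += 1
                return [weight * importance_from / importance_to] * n
            return [weight / ratio] if rng() < ratio else []

        # a particle of weight 1 entering a region four times more important
        print(cross_boundary(1.0, 1.0, 4.0))    # four copies of weight 0.25
        # and one entering a region ten times less important
        print(cross_boundary(1.0, 1.0, 0.1))    # usually killed; survivors weigh 10.0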

  2. FLUKA Calculation of the Neutron Albedo Encountered at Low Earth Orbits

    CERN Document Server

    Claret, Arnaud; Combier, Natacha; Ferrari, Alfredo; Laurent, Philippe

    2014-01-01

    This paper presents Monte-Carlo simulations based on the FLUKA code aiming to calculate the contribution of the neutron albedo at a given date and altitude above the Earth chosen by the user. The main input parameters of our model are the solar modulation affecting the spectra of cosmic rays, and the date of the Earth's geomagnetic field. The results consist in a two-parameter distribution, the neutron energy and the angle to the tangent plane of the sphere containing the orbit of interest, and are provided by geographical position above the Earth at the chosen altitude. This model can be used to predict the temporal variation of the neutron flux encountered along the orbit, and thus constrain the determination of the instrumental background noise of space experiments in low Earth orbit.

  3. Development of software tools for 4-D visualization and quantitative analysis of PHITS simulation results

    International Nuclear Information System (INIS)

    Furutaka, Kazuyoshi

    2015-02-01

    A suite of software tools has been developed to facilitate the development of apparatus using the radiation transport simulation code PHITS, by enabling 4-D visualization (3-D space and time) and quantitative analysis of so-called die-away plots. To deliver usable tools as soon as possible, existing software was utilized as much as possible: ParaView will be used for the 4-D visualization of the results, whereas the analysis of die-away plots will be done with the ROOT toolkit using a tool named “diana”. To enable 4-D visualization using ParaView, a group of tools (angel2vtk, DispDCAS1, CamPos) has been developed to convert the data into a format that can be read by ParaView and to ease the visualization. (author)
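
    For the die-away analysis mentioned above, the quantity of interest is typically the decay constant of the neutron population after a pulse, obtained by fitting an exponential plus a flat background to the time histogram. A minimal sketch of such a fit is shown below, with ROOT replaced by SciPy purely for illustration and with synthetic data standing in for PHITS output.

        import numpy as np
        from scipy.optimize import curve_fit

        def dieaway(t_us, amplitude, decay_per_us, background):
            """Single-exponential die-away curve plus a constant background."""
            return amplitude * np.exp(-decay_per_us * t_us) + background

        # synthetic time histogram (microseconds after the pulse)
        rng = np.random.default_rng(0)
        t = np.linspace(50.0, 1000.0, 96)
        counts = rng.poisson(dieaway(t, 5000.0, 6.5e-3, 40.0)).astype(float)

        popt, pcov = curve_fit(dieaway, t, counts, p0=(4000.0, 5e-3, 10.0),
                               sigma=np.sqrt(np.clip(counts, 1.0, None)))
        print("fitted die-away constant: %.2e per microsecond" % popt[1])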

  4. Radiation transport simulation of the Martian GCR surface flux and dose estimation using spherical geometry in PHITS compared to MSL-RAD measurements.

    Science.gov (United States)

    Flores-McLaughlin, John

    2017-08-01

    Planetary bodies and spacecraft are predominantly exposed to isotropic radiation environments that are subject to transport and interaction in various material compositions and geometries. Specifically, the Martian surface radiation environment is composed of galactic cosmic radiation, secondary particles produced by their interaction with the Martian atmosphere, albedo particles from the Martian regolith and occasional solar particle events. Despite this complex physical environment with potentially significant locational and geometric dependencies, computational resources often limit radiation environment calculations to a one-dimensional or slab geometry specification. To better account for Martian geometry, spherical volumes with respective Martian material densities are adopted in this model. This physical description is modeled with the PHITS radiation transport code and compared to a portion of measurements from the Radiation Assessment Detector of the Mars Science Laboratory. Particle spectra measured between 15 November 2015 and 15 January 2016 and PHITS model results calculated for this time period are compared. Results indicate good agreement between simulated dose rates, proton, neutron and gamma spectra. This work was originally presented at the 1st Mars Space Radiation Modeling Workshop held in 2016 in Boulder, CO. Copyright © 2017. Published by Elsevier Ltd.

  5. An integral test of FLUKA nuclear models with 160 MeV proton beams in multi-layer Faraday cups

    CERN Document Server

    Rinaldi, I; Parodi, K; Ferrari, A; Sala, P; Mairani, A

    2011-01-01

    Monte Carlo (MC) codes are useful tools to simulate the complex processes of proton beam interactions with matter. In proton therapy, nuclear reactions influence the dose distribution. Therefore, the validation of nuclear models adopted in MC codes is a critical requisite for their use in this field. A simple integral test can be performed using a multi-layer Faraday cup (MLFC). This method allows separation of the nuclear and atomic interaction processes, which are responsible for secondary particle emission and the finite primary proton range, respectively. In this work, the propagation of 160 MeV protons stopping in two MLFCs made of polyethylene and copper has been simulated by the FLUKA MC code. The calculations have been performed with and without secondary electron emission and transport, as well as charge sharing in the dielectric layers. Previous results with other codes neglected those two effects. The impact of this approximation has been investigated and found to be relevant only in the proximity ...

  6. FLUKA Monte Carlo simulations and benchmark measurements for the LHC beam loss monitors

    International Nuclear Information System (INIS)

    Sarchiapone, L.; Brugger, M.; Dehning, B.; Kramer, D.; Stockner, M.; Vlachoudis, V.

    2007-01-01

    One of the crucial elements in terms of machine protection for CERN's Large Hadron Collider (LHC) is its beam loss monitoring (BLM) system. On-line loss measurements must prevent the superconducting magnets from quenching and protect the machine components from damages due to unforeseen critical beam losses. In order to ensure the BLM's design quality, in the final design phase of the LHC detailed FLUKA Monte Carlo simulations were performed for the betatron collimation insertion. In addition, benchmark measurements were carried out with LHC type BLMs installed at the CERN-EU high-energy Reference Field facility (CERF). This paper presents results of FLUKA calculations performed for BLMs installed in the collimation region, compares the results of the CERF measurement with FLUKA simulations and evaluates related uncertainties. This, together with the fact that the CERF source spectra at the respective BLM locations are comparable with those at the LHC, allows assessing the sensitivity of the performed LHC design studies

  7. FLUKA Monte Carlo simulations and benchmark measurements for the LHC beam loss monitors

    Science.gov (United States)

    Sarchiapone, L.; Brugger, M.; Dehning, B.; Kramer, D.; Stockner, M.; Vlachoudis, V.

    2007-10-01

    One of the crucial elements in terms of machine protection for CERN's Large Hadron Collider (LHC) is its beam loss monitoring (BLM) system. On-line loss measurements must prevent the superconducting magnets from quenching and protect the machine components from damages due to unforeseen critical beam losses. In order to ensure the BLM's design quality, in the final design phase of the LHC detailed FLUKA Monte Carlo simulations were performed for the betatron collimation insertion. In addition, benchmark measurements were carried out with LHC type BLMs installed at the CERN-EU high-energy Reference Field facility (CERF). This paper presents results of FLUKA calculations performed for BLMs installed in the collimation region, compares the results of the CERF measurement with FLUKA simulations and evaluates related uncertainties. This, together with the fact that the CERF source spectra at the respective BLM locations are comparable with those at the LHC, allows assessing the sensitivity of the performed LHC design studies.

  8. Neutron spectrometry using LNL bonner spheres and FLUKA

    Energy Technology Data Exchange (ETDEWEB)

    Sarchiapone, L.; Zafiropoulos, D. [INFN, Laboratori Nazionali di Legnaro (Italy)

    2013-07-18

    The characterization of neutron fields has been performed with a system based on a scintillation detector and multiple moderating spheres. The system, together with the unfolding procedure, has been tested in quasi-monochromatic neutron fields and in complex, mixed, cyclotron-based environments. FLUKA simulations have been used to produce the response functions and reference energy spectra.
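
    Unfolding a Bonner-sphere measurement amounts to solving M = R·φ for the spectrum φ, given the sphere readings M and a response matrix R computed, as here, with FLUKA. Below is a minimal sketch of a multiplicative iterative update in the spirit of SAND-II/GRAVEL; the response matrix, readings and starting guess are toy assumptions.

        import numpy as np

        def unfold(readings, response, guess, iterations=200):
            """Multiplicative iterative unfolding of phi from readings ~ response @ phi.

            Each energy bin is rescaled by a response-weighted ratio of measured to
            predicted sphere readings, which keeps the spectrum non-negative."""
            phi = guess.astype(float).copy()
            for _ in range(iterations):
                predicted = response @ phi
                ratio = readings / predicted
                weights = response * phi                     # (n_spheres, n_bins)
                correction = (weights * ratio[:, None]).sum(axis=0) / weights.sum(axis=0)
                phi *= correction
            return phi

        # toy problem: 3 spheres, 4 energy bins, known test spectrum
        response = np.array([[0.90, 0.40, 0.10, 0.02],
                             [0.30, 0.80, 0.50, 0.10],
                             [0.05, 0.30, 0.70, 0.60]])
        true_phi = np.array([1.0, 2.0, 0.5, 0.2])
        readings = response @ true_phi
        print(unfold(readings, response, guess=np.ones(4)))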

  9. Verification of Monte Carlo transport codes by activation experiments

    Energy Technology Data Exchange (ETDEWEB)

    Chetvertkova, Vera

    2012-12-18

    With the increasing energies and intensities of heavy-ion accelerator facilities, the problem of excessive activation of accelerator components caused by beam losses becomes more and more important. Numerical experiments using Monte Carlo transport codes are performed in order to assess the levels of activation. The heavy-ion versions of the codes were released approximately a decade ago; therefore, verification is needed to make sure that they give reasonable results. The present work is focused on obtaining experimental data on the activation of targets by heavy-ion beams. Several experiments were performed at GSI Helmholtzzentrum fuer Schwerionenforschung. The interaction of nitrogen, argon and uranium beams with aluminum targets, as well as the interaction of nitrogen and argon beams with copper targets, was studied. After the irradiation of the targets by different ion beams from the SIS18 synchrotron at GSI, γ-spectroscopy analysis was done: the γ-spectra of the residual activity were measured, the radioactive nuclides were identified, and their amounts and depth distributions were determined. The experimental results were compared with the results of Monte Carlo simulations using FLUKA, MARS and SHIELD. The discrepancies and agreements between experiment and simulations are pointed out, and the origin of the discrepancies is discussed. The results obtained allow for a better verification of the Monte Carlo transport codes and also provide information for their further development. The necessity of activation studies for accelerator applications is discussed. The limits of applicability of the heavy-ion beam-loss criteria were studied using the FLUKA code. FLUKA simulations were done to determine which materials are preferable, from the radiation protection point of view, for use in accelerator components.

  10. Measurements and FLUKA Simulations of Bismuth, Aluminium and Indium Activation at the upgraded CERN Shielding Benchmark Facility (CSBF)

    Science.gov (United States)

    Iliopoulou, E.; Bamidis, P.; Brugger, M.; Froeschl, R.; Infantino, A.; Kajimoto, T.; Nakao, N.; Roesler, S.; Sanami, T.; Siountas, A.; Yashima, H.

    2018-06-01

    The CERN High energy AcceleRator Mixed field (CHARM) facility is situated in the CERN Proton Synchrotron (PS) East Experimental Area. The facility receives a pulsed proton beam from the CERN PS with a beam momentum of 24 GeV/c, with 5·10^11 protons per pulse, a pulse length of 350 ms and a maximum average beam intensity of 6.7·10^10 protons per second. The extracted proton beam impacts on a cylindrical copper target. The shielding of the CHARM facility includes the CERN Shielding Benchmark Facility (CSBF), situated laterally above the target, which allows deep shielding penetration benchmark studies of various shielding materials. This facility has been significantly upgraded during the extended technical stop at the beginning of 2016. It now consists of 40 cm of cast iron shielding, a 200 cm long removable sample holder concrete block with 3 inserts for activation samples, and a material test location that is used for the measurement of the attenuation length of different shielding materials as well as for sample activation at different thicknesses of the shielding materials. Activation samples of bismuth, aluminium and indium were placed in the CSBF in September 2016 to characterize the upgraded version of the CSBF. Monte Carlo simulations with the FLUKA code have been performed to estimate the specific production yields of bismuth isotopes (206Bi, 205Bi, 204Bi, 203Bi, 202Bi, 201Bi) from 209Bi, of 24Na from 27Al and of 115mIn from 115In for these samples. The production yields estimated by the FLUKA Monte Carlo simulations are compared to the production yields obtained from γ-spectroscopy measurements of the samples, taking the beam intensity profile into account. The agreement between FLUKA predictions and γ-spectroscopy measurements for the production yields is at the level of a factor of 2.
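
    Relating a γ-spectroscopy measurement to a production yield per primary proton requires correcting for decay during a non-constant beam delivery, which is what taking the beam intensity profile into account implies. A minimal sketch assuming a pulse-by-pulse intensity record is given below; the function, half-life and numbers are illustrative, not the CSBF analysis.

        import numpy as np

        def produced_per_proton(activity_bq, half_life_s, pulse_times_s, pulse_protons,
                                t_measure_s):
            """Production yield per primary proton from the activity measured at
            t_measure_s, correcting each beam pulse for decay until the measurement."""
            lam = np.log(2.0) / half_life_s
            # fraction of the atoms created in each pulse still present when counted
            survival = np.exp(-lam * (t_measure_s - np.asarray(pulse_times_s)))
            # measured activity = lam * yield_per_proton * sum(protons_i * survival_i)
            effective_protons = np.sum(np.asarray(pulse_protons) * survival)
            return activity_bq / (lam * effective_protons)

        # illustrative: pulses of 5e11 protons every 10 s for one hour, a 15 h half-life,
        # and a sample counted two hours after the end of the irradiation
        times = np.arange(0.0, 3600.0, 10.0)
        protons = np.full(times.shape, 5e11)
        print(produced_per_proton(1.2e3, 15 * 3600.0, times, protons, 3600.0 + 7200.0))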

  11. Benchmark of the SixTrack-Fluka Active Coupling Against the SPS Scrapers Burst Test

    CERN Multimedia

    Mereghetti, A; Cerutti, F

    2014-01-01

    The SPS scrapers are a key ingredient for clean injection into the LHC: they cut off halo particles quite close to the beam core (e.g. 3.5 sigma) just before extraction, to minimise the risk of quenches. The improved beam parameters envisaged by the LHC Injectors Upgrade (LIU) Project required a revision of the present system, to assess its suitability and robustness. In particular, a burst (i.e. endurance) test of the scraper blades has been carried out, with the whole bunch train being scraped at the centre (worst working conditions). In order to take into account the effect of betatron and longitudinal beam dynamics on the energy deposition patterns, and of nuclear and Coulomb scattering in the absorbing medium on the loss patterns, the SixTrack and Fluka codes have been coupled, profiting from the best of the refined physical models they respectively embed. The coupling envisages an active exchange of tracked particles between the two codes at each turn, and an on-line aperture check in SixTrack, in order ...

  12. Simulation of Experimental Background using FLUKA

    Energy Technology Data Exchange (ETDEWEB)

    Rokni, Sayed

    1999-05-11

    In November 1997, Experiment T423 began acquiring data with the intention of understanding the energy spectra of high-energy neutrons generated in the interaction of electrons with lead. The following describes a series of FLUKA simulations studying (1) particle yields in the absence of all background; (2) the background caused by scattering in the room; (3) the effects of the thick lead shielding which surrounded the detector; (4) the sources of neutron background created in this lead shielding; and (5) the ratio of the total background to the ideal yield. In each case, particular attention is paid to the neutron yield.

  13. FLUKA Studies of the Asynchronous Beam Dump Effects on LHC Point 6

    CERN Document Server

    Versaci, R; Goddard, B; Schmidt, R; Vlachoudis, V; Mereghetti, A

    2011-01-01

    The LHC is a record-breaking machine in terms of beam energy and intensity. An intense effort has therefore been deployed in simulating critical operational scenarios of energy deposition. FLUKA is the most widely used code for this kind of simulation at CERN because of the high reliability of its results and the ease of customizing detailed simulations all along hundreds of meters of beam line. We have investigated the effects of an asynchronous beam dump at LHC Point 6 where beams, with a stored energy of 360 MJ, can instantaneously release up to a few J cm^-3 in the cryogenic magnets, which have a quench limit of the order of mJ cm^-3. In the present paper we describe the simulation approach and discuss the evaluated maximum energy release onto the superconducting magnets during an asynchronous beam dump. We then analyse the shielding provided by collimators installed in the area and discuss safety limits for the operation of the LHC.

  14. Testing FLUKA on neutron activation of Si and Ge at nuclear research reactor using gamma spectroscopy

    Science.gov (United States)

    Bazo, J.; Rojas, J. M.; Best, S.; Bruna, R.; Endress, E.; Mendoza, P.; Poma, V.; Gago, A. M.

    2018-03-01

    Samples of two characteristic semiconductor sensor materials, silicon and germanium, have been irradiated with neutrons produced at the RP-10 Nuclear Research Reactor at 4.5 MW. Their radionuclide photon spectra have been measured with high resolution gamma spectroscopy, quantifying four radioisotopes (28Al and 29Al for Si; 75Ge and 77Ge for Ge). We have compared the radionuclide production and the emission spectrum data with Monte Carlo simulation results from FLUKA. Thus we have tested FLUKA's low energy neutron library (ENDF/B-VIIR) and decay photon scoring with respect to the activation of these semiconductors. We conclude that FLUKA is capable of predicting relative photon peak amplitudes, with gamma intensities greater than 1%, of the produced radionuclides with an average uncertainty of 13%. This work allows us to estimate the corresponding systematic error of neutron activation simulation studies of these sensor materials.

  15. Simulation of e-γ-n targets by FLUKA and measurement of neutron flux at various angles for accelerator based neutron source

    Energy Technology Data Exchange (ETDEWEB)

    Patil, B.J., E-mail: bjp@physics.unipune.ernet.i [Department of Physics, University of Pune, Pune 411 007 (India); Chavan, S.T.; Pethe, S.N.; Krishnan, R. [SAMEER, IIT Powai Campus, Mumbai 400 076 (India); Bhoraskar, V.N. [Department of Physics, University of Pune, Pune 411 007 (India); Dhole, S.D., E-mail: sanjay@physics.unipune.ernet.i [Department of Physics, University of Pune, Pune 411 007 (India)

    2010-10-15

    A 6 MeV race-track microtron (an electron accelerator) based pulsed neutron source has been designed specifically for the elemental analysis of short-lived activation products, where a low neutron flux is desirable. The bremsstrahlung radiation emitted by impinging 6 MeV electrons on the e-γ primary target was made to fall on the γ-n secondary target to produce neutrons. The optimization of the bremsstrahlung- and neutron-producing targets, along with their spectra, was carried out using the FLUKA code. The neutron flux was measured by activation of vanadium; the measured fluxes were 1.1878 × 10^5, 0.9403 × 10^5, 0.7428 × 10^5, 0.6274 × 10^5, 0.5659 × 10^5 and 0.5210 × 10^5 n/cm^2/s at 0°, 30°, 60°, 90°, 115° and 140°, respectively. The results indicate that the neutron flux decreases with increasing angle and is in good agreement with the FLUKA simulation.
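
    The vanadium activation measurement above rests on the standard activation equation A = φσN(1 - exp(-λ t_irr)), corrected for decay between the end of the irradiation and the counting. A minimal sketch with illustrative numbers follows; the cross section, foil mass and timings are assumptions, not the experimental values.

        import numpy as np

        N_A = 6.022e23   # Avogadro's number

        def flux_from_activation(activity_bq, sigma_cm2, n_atoms, half_life_s,
                                 t_irr_s, t_cool_s):
            """Neutron flux [n/cm^2/s] from the activity of an irradiated foil
            measured after a cooling time t_cool_s."""
            lam = np.log(2.0) / half_life_s
            saturation = 1.0 - np.exp(-lam * t_irr_s)   # build-up during irradiation
            decay = np.exp(-lam * t_cool_s)             # decay before counting
            return activity_bq / (n_atoms * sigma_cm2 * saturation * decay)

        # illustrative: 1 g vanadium foil (51V -> 52V, half-life about 3.74 min),
        # an assumed capture cross section of 4.9 b, 30 min irradiation, 2 min cooling
        n_atoms = 1.0 / 50.94 * N_A
        print(flux_from_activation(2.0e3, 4.9e-24, n_atoms, 3.74 * 60.0,
                                   30 * 60.0, 2 * 60.0))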

  16. Implementing displacement damage calculations for electrons and gamma rays in the Particle and Heavy-Ion Transport code System

    Science.gov (United States)

    Iwamoto, Yosuke

    2018-03-01

    In this study, the Monte Carlo displacement damage calculation method in the Particle and Heavy-Ion Transport code System (PHITS) was improved to calculate displacements per atom (DPA) values due to irradiation by electrons (or positrons) and gamma rays. For the damage due to electrons and gamma rays, PHITS simulates electromagnetic cascades using the Electron Gamma Shower version 5 (EGS5) algorithm and calculates DPA values using the recoil energies and the McKinley-Feshbach cross section. A comparison of DPA values calculated by PHITS and the Monte Carlo assisted Classical Method (MCCM) reveals that they were in good agreement for gamma-ray irradiations of silicon and iron at energies that were less than 10 MeV. Above 10 MeV, PHITS can calculate DPA values not only for electrons but also for charged particles produced by photonuclear reactions. In DPA depth distributions under electron and gamma-ray irradiations, build-up effects can be observed near the target's surface. For irradiation of 90-cm-thick carbon by protons with energies of more than 30 GeV, the ratio of the secondary electron DPA values to the total DPA values is more than 10% and increases with an increase in incident energy. In summary, PHITS can calculate DPA values for all particles and materials over a wide energy range between 1 keV and 1 TeV for electrons, gamma rays, and charged particles and between 10^-5 eV and 1 TeV for neutrons.
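
    One common way of turning a recoil's damage energy into a number of displacements is the NRT formula; a minimal sketch is shown below, where the 0.8 displacement efficiency and the 40 eV threshold (a typical value for iron) are textbook assumptions and not necessarily the exact model implemented in PHITS.

        def nrt_displacements(damage_energy_ev, threshold_ev=40.0):
            """NRT model: number of Frenkel pairs produced by a recoil whose
            non-ionizing (damage) energy is damage_energy_ev."""
            if damage_energy_ev < threshold_ev:
                return 0.0
            if damage_energy_ev < 2.0 * threshold_ev / 0.8:
                return 1.0
            return 0.8 * damage_energy_ev / (2.0 * threshold_ev)

        def dpa(total_displacements, atoms_in_volume):
            """Displacements per atom for a given irradiated volume."""
            return total_displacements / atoms_in_volume

        # a recoil depositing 1 keV of damage energy in iron gives about 10 displacements
        print(nrt_displacements(1.0e3))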

  17. Simulation of equivalent dose due to accidental electron beam loss in Indus-1 and Indus-2 synchrotron radiation sources using FLUKA code

    International Nuclear Information System (INIS)

    Sahani, P.K.; Dev, Vipin; Singh, Gurnam; Haridas, G.; Thakkar, K.K.; Sarkar, P.K.; Sharma, D.N.

    2008-01-01

    Indus-1 and Indus-2 are two synchrotron radiation sources at the Raja Ramanna Centre for Advanced Technology (RRCAT), India. The stored electron energies in Indus-1 and Indus-2 are 450 MeV and 2.5 GeV respectively. During operation of the storage ring, accidental electron beam loss may occur in addition to the normal beam losses. The Bremsstrahlung radiation produced by these beam losses creates a major radiation hazard in such high energy electron accelerators. FLUKA, the Monte Carlo radiation transport code, was used to simulate the accidental beam loss. The simulation was carried out to estimate the equivalent dose likely to be received by a person trapped close to the storage ring. Depth dose profiles in a water phantom for 450 MeV and 2.5 GeV electron beams were generated, from which the percentage of energy absorbed in a 30 cm water phantom (analogous to the human body) was calculated. The simulation showed that the percentage energy deposition in the phantom is about 19% for 450 MeV electrons and 4.3% for 2.5 GeV electrons. The dose build-up factors in the 30 cm water phantom for 450 MeV and 2.5 GeV electron beams are found to be 1.85 and 2.94 respectively. Based on the depth dose profiles, dose equivalent indices of 0.026 Sv and 1.08 Sv are likely to be received by a trapped person near the storage ring in Indus-1 and Indus-2 respectively. (author)

  18. COMPARISON OF COSMIC-RAY ENVIRONMENTS ON EARTH, MOON, MARS AND IN SPACECRAFT USING PHITS.

    Science.gov (United States)

    Sato, Tatsuhiko; Nagamatsu, Aiko; Ueno, Haruka; Kataoka, Ryuho; Miyake, Shoko; Takeda, Kazuo; Niita, Koji

    2017-09-29

    Estimation of cosmic-ray doses is of great importance not only in aircrew and astronaut dosimetry but also in the evaluation of background radiation exposure to the public. We therefore calculated the cosmic-ray doses on Earth, the Moon and Mars as well as inside spacecraft, using the Particle and Heavy Ion Transport code System PHITS. The same cosmic-ray models and dose conversion coefficients were employed in the calculations to allow a proper comparison between the simulation results for the different environments. It is quantitatively confirmed that the thickness of physical shielding, including the atmosphere and soil of the planets, is the most important parameter determining the cosmic-ray doses and their dominant contributors. The comparison also suggests that higher solar activity significantly reduces astronaut doses, particularly for interplanetary missions. The information obtained from this study is useful for the design of future space missions as well as for accelerator-based experiments dedicated to cosmic-ray research. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
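
    Comparing environments with the same dose conversion coefficients, as done above, amounts to folding each particle fluence spectrum with the corresponding fluence-to-dose coefficients over energy. A minimal sketch of that folding is given below; the spectrum and coefficient curves are placeholders, not ICRP values or PHITS output.

        import numpy as np

        def dose_rate(energy_mev, fluence_rate, conversion_psv_cm2):
            """Effective dose rate [pSv/s] from a differential fluence rate
            [1/(cm^2 s MeV)] and fluence-to-dose coefficients [pSv cm^2], both
            tabulated on the same energy grid [MeV] (trapezoidal integration)."""
            integrand = fluence_rate * conversion_psv_cm2
            return np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(energy_mev))

        # placeholder spectrum and coefficients on a logarithmic energy grid
        e = np.logspace(-8, 3, 60)                # MeV
        phi = (1.0 / e) * np.exp(-e / 100.0)      # arbitrary spectral shape
        h = 1.0 + 400.0 * e / (e + 10.0)          # arbitrary coefficient curve
        print("dose rate [pSv/s]:", dose_rate(e, phi, h))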

  19. TU-AB-BRC-02: Accuracy Evaluation of GPU-Based OpenCL Carbon Monte Carlo Package (goCMC) in Biological Dose and Microdosimetry in Comparison to FLUKA Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Taleei, R; Peeler, C; Qin, N; Jiang, S; Jia, X [UT Southwestern Medical Center, Dallas, TX (United States)

    2016-06-15

    Purpose: One of the most accurate methods for radiation transport is Monte Carlo (MC) simulation. Long computation time prevents its wide application in the clinic. We have recently developed a fast MC code for carbon ion therapy called GPU-based OpenCL Carbon Monte Carlo (goCMC), and its accuracy in physical dose has been established. Since radiobiology is an indispensable aspect of carbon ion therapy, this study evaluates the accuracy of goCMC in biological dose and microdosimetry by benchmarking it against FLUKA. Methods: We performed simulations of carbon pencil beams with 150, 300 and 450 MeV/u in a homogeneous water phantom using goCMC and FLUKA. Dose and energy spectra for primary and secondary ions on the central beam axis were recorded. The repair-misrepair-fixation model was employed to calculate the relative biological effectiveness (RBE). The Monte Carlo Damage Simulation (MCDS) tool was used to calculate microdosimetry parameters. Results: Physical dose differences on the central axis were <1.6% of the maximum value. Before the Bragg peak, differences in RBE and RBE-weighted dose were <2% and <1%. At the Bragg peak, the differences were 12.5%, caused by a small range discrepancy and the sensitivity of RBE to the beam spectra. Consequently, the RBE-weighted dose difference was 11%. Beyond the peak, RBE differences were <20% and primarily caused by differences in the Helium-4 spectrum. However, the RBE-weighted dose agreed within 1% due to the low physical dose. Differences in microdosimetric quantities were small except at the Bragg peak. The simulation time per source particle with FLUKA was 0.08 sec, while goCMC was approximately 1000 times faster. Conclusion: Physical doses computed by FLUKA and goCMC were in good agreement. Although relatively large RBE differences were observed at and beyond the Bragg peak, the RBE-weighted dose differences were considered to be acceptable.

  20. Applications of the Los Alamos High Energy Transport code

    International Nuclear Information System (INIS)

    Waters, L.; Gavron, A.; Prael, R.E.

    1992-01-01

    Simulation codes reliable over a large range of energies are essential to analyze the environment of vehicles and habitats proposed for space exploration. The LAHET Monte Carlo code has recently been expanded to track high energy hadrons with FLUKA, while retaining the original Los Alamos version of HETC at lower energies. Electrons and photons are transported with EGS4, and an interface to the MCNP Monte Carlo code is provided to analyze neutrons with kinetic energies less than 20 MeV. These codes are benchmarked by comparison of LAHET/MCNP calculations to data from the Brookhaven experiment E814 participant calorimeter

  1. Research capacity building integrated into PHIT projects: leveraging research and research funding to build national capacity.

    Science.gov (United States)

    Hedt-Gauthier, Bethany L; Chilengi, Roma; Jackson, Elizabeth; Michel, Cathy; Napua, Manuel; Odhiambo, Jackline; Bawah, Ayaga

    2017-12-21

    Inadequate research capacity impedes the development of evidence-based health programming in sub-Saharan Africa. However, funding for research capacity building (RCB) is often insufficient and restricted, limiting institutions' ability to address current RCB needs. The Doris Duke Charitable Foundation's African Health Initiative (AHI) funded Population Health Implementation and Training (PHIT) partnership projects in five African countries (Ghana, Mozambique, Rwanda, Tanzania and Zambia) to implement health systems strengthening initiatives inclusive of RCB. Using Cooke's framework for RCB, RCB activity leaders from each country reported on RCB priorities, activities, program metrics, ongoing challenges and solutions. These were synthesized by the authorship team, identifying common challenges and lessons learned. For most countries, each of the RCB domains from Cooke's framework was a high priority. In about half of the countries, domain specific activities happened prior to PHIT. During PHIT, specific RCB activities varied across countries. However, all five countries used AHI funding to improve research administrative support and infrastructure, implement research trainings and support mentorship activities and research dissemination. While outcomes data were not systematically collected, countries reported holding 54 research trainings, forming 56 mentor-mentee relationships, training 201 individuals and awarding 22 PhD and Masters-level scholarships. Over the 5 years, 116 manuscripts were developed. Of the 59 manuscripts published in peer-reviewed journals, 29 had national first authors and 18 had national senior authors. Trainees participated in 99 conferences and projects held 37 forums with policy makers to facilitate research translation into policy. All five PHIT projects strongly reported an increase in RCB activities and commended the Doris Duke Charitable Foundation for prioritizing RCB, funding RCB at adequate levels and time frames and for allowing

  2. Comprehensive and integrated district health systems strengthening: the Rwanda Population Health Implementation and Training (PHIT) Partnership.

    Science.gov (United States)

    Drobac, Peter C; Basinga, Paulin; Condo, Jeanine; Farmer, Paul E; Finnegan, Karen E; Hamon, Jessie K; Amoroso, Cheryl; Hirschhorn, Lisa R; Kakoma, Jean Baptise; Lu, Chunling; Murangwa, Yusuf; Murray, Megan; Ngabo, Fidele; Rich, Michael; Thomson, Dana; Binagwaho, Agnes

    2013-01-01

    Nationally, health in Rwanda has been improving since 2000, with considerable improvement since 2005. Despite improvements, rural areas continue to lag behind urban sectors with regard to key health outcomes. Partners In Health (PIH) has been supporting the Rwanda Ministry of Health (MOH) in two rural districts in Rwanda since 2005. Since 2009, the MOH and PIH have spearheaded a health systems strengthening (HSS) intervention in these districts as part of the Rwanda Population Health Implementation and Training (PHIT) Partnership. The partnership is guided by the belief that HSS interventions should be comprehensive, integrated, responsive to local conditions, and address health care access, cost, and quality. The PHIT Partnership represents a collaboration between the MOH and PIH, with support from the National University of Rwanda School of Public Health, the National Institute of Statistics, Harvard Medical School, and Brigham and Women's Hospital. The PHIT Partnership's health systems support aligns with the World Health Organization's six health systems building blocks. HSS activities focus across all levels of the health system - community, health center, hospital, and district leadership - to improve health care access, quality, delivery, and health outcomes. Interventions are concentrated on three main areas: targeted support for health facilities, quality improvement initiatives, and a strengthened network of community health workers. The impact of activities will be assessed using population-level outcomes data collected through oversampling of the demographic and health survey (DHS) in the intervention districts. The overall impact evaluation is complemented by an analysis of trends in facility health care utilization. A comprehensive costing project captures the total expenditures and financial inputs of the health care system to determine the cost of systems improvement. Targeted evaluations and operational research pieces focus on specific

  3. Heavy-ion transport codes for radiotherapy and radioprotection in space

    International Nuclear Information System (INIS)

    Mancusi, Davide

    2006-06-01

    Simulation of the transport of heavy ions in matter is a field of nuclear science that has recently received attention in view of its importance for some relevant applications. Accelerated heavy ions can, for example, be used to treat cancers (heavy-ion radiotherapy) and show some superior qualities with respect to more conventional treatment systems, like photons (x-rays) or protons. Furthermore, long-term manned space missions (like a possible future mission to Mars) pose the challenge to protect astronauts and equipment on board against the harmful space radiation environment, where heavy ions can be responsible for a significant share of the exposure risk. The high accuracy expected from a transport algorithm (especially in the case of radiotherapy) and the large amount of semi-empirical knowledge necessary to even state the transport problem properly rule out any analytical approach; the alternative is to resort to numerical simulations in order to build treatment-planning systems for cancer or to aid space engineers in shielding design. This thesis is focused on the description of HIBRAC, a one-dimensional deterministic code optimised for radiotherapy, and PHITS (Particle and Heavy-Ion Transport System), a general-purpose three-dimensional Monte-Carlo code. The structure of both codes is outlined and some relevant results are presented. In the case of PHITS, we also report the first results of an ongoing comprehensive benchmarking program for the main components of the code; we present the comparison of partial charge-changing cross sections for a 400 MeV/n 40Ar beam impinging on carbon, polyethylene, aluminium, copper, tin and lead targets
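
    Partial charge-changing cross sections of the kind benchmarked above are commonly extracted, in a thin-target approximation, from the fraction of incident ions that emerge with a given lower charge behind the target. The sketch below illustrates only that bookkeeping step; the counts, foil thickness and resulting value are hypothetical placeholders, not data from the PHITS benchmark.

        # Thin-target estimate of a partial charge-changing cross section from
        # fragment counts (hypothetical numbers, for illustration only).
        AVOGADRO = 6.022e23  # atoms/mol

        def partial_cc_cross_section(n_frag, n_in, thickness_g_cm2, mass_number):
            """sigma = (N_frag / N_in) / n_t, with n_t the areal atom density (atoms/cm^2)."""
            n_t = thickness_g_cm2 * AVOGADRO / mass_number   # atoms per cm^2
            sigma_cm2 = (n_frag / n_in) / n_t
            return sigma_cm2 * 1.0e27                        # cm^2 -> mb

        # Example with invented counts for a 1 g/cm^2 carbon foil:
        print(partial_cc_cross_section(n_frag=1200, n_in=1.0e6,
                                       thickness_g_cm2=1.0, mass_number=12.0))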

  4. Simulation of electron, positron and Bremsstrahlung spectrum generated due to electromagnetic cascade by 2.5 GeV electron hitting lead target using FLUKA code

    International Nuclear Information System (INIS)

    Sahani, P.K.; Dev, Vipin; Haridas, G.; Thakkar, K.K.; Singh, Gurnam; Sarkar, P.K.; Sharma, D.N.

    2009-01-01

    INDUS-2 is a high energy electron accelerator facility where electrons are accelerated in a circular ring up to a maximum energy of 2.5 GeV to generate synchrotron radiation. During normal operation of the machine a fraction of these electrons is lost; these electrons interact with the accelerator structures and components, like the vacuum chamber and the residual gases in the cavity, and hence generate a significant amount of Bremsstrahlung radiation. The Bremsstrahlung radiation depends strongly on the incident electron energy, the target material and its thickness. The Bremsstrahlung radiation dominates the radiation environment in such electron storage rings. Because of its broad spectrum, extending up to the incident electron energy, and its pulsed nature, it is very difficult to segregate the Bremsstrahlung component from the mixed-field environment in accelerators. With the help of the FLUKA Monte Carlo code, the Bremsstrahlung spectrum generated by 2.5 GeV electrons bombarding a high-Z lead target is simulated. To study the variation of the Bremsstrahlung spectrum with target thickness, lead targets of 3, 6, 9, 12, 15 and 18 mm thickness were used. The energy spectra of the emerging electrons and positrons are also simulated. The study suggests that as the target thickness increases, the emergent Bremsstrahlung photon fluence increases. With increasing target thickness, the low-energy part of the Bremsstrahlung spectrum becomes more dominant while the high-energy part is degraded. The electron and positron spectra also extend up to the incident electron energy. (author)

  5. Heavy-ion transport codes for radiotherapy and radioprotection in space

    Energy Technology Data Exchange (ETDEWEB)

    Mancusi, Davide

    2006-06-15

    Simulation of the transport of heavy ions in matter is a field of nuclear science that has recently received attention in view of its importance for some relevant applications. Accelerated heavy ions can, for example, be used to treat cancers (heavy-ion radiotherapy) and show some superior qualities with respect to more conventional treatment systems, like photons (x-rays) or protons. Furthermore, long-term manned space missions (like a possible future mission to Mars) pose the challenge to protect astronauts and equipment on board against the harmful space radiation environment, where heavy ions can be responsible for a significant share of the exposure risk. The high accuracy expected from a transport algorithm (especially in the case of radiotherapy) and the large amount of semi-empirical knowledge necessary to even state the transport problem properly rule out any analytical approach; the alternative is to resort to numerical simulations in order to build treatment-planning systems for cancer or to aid space engineers in shielding design. This thesis is focused on the description of HIBRAC, a one-dimensional deterministic code optimised for radiotherapy, and PHITS (Particle and Heavy-Ion Transport System), a general-purpose three-dimensional Monte-Carlo code. The structure of both codes is outlined and some relevant results are presented. In the case of PHITS, we also report the first results of an ongoing comprehensive benchmarking program for the main components of the code; we present the comparison of partial charge-changing cross sections for a 400 MeV/n {sup 40}Ar beam impinging on carbon, polyethylene, aluminium, copper, tin and lead targets.

  6. Validation of FLUKA calculated cross-sections for radioisotope production in proton-on-target collisions at proton energies around 1 GeV

    CERN Document Server

    Felcini, M

    2006-01-01

    The production cross-sections of several radioisotopes induced by 1 GeV protons impinging on different target materials have been calculated using the FLUKA Monte Carlo code and compared to measured cross-sections. The emphasis of this study is on the production of alpha and beta/gamma emitters of interest for activation evaluations at a research complex, such as the EURISOL complex, using a multi-MW proton driver at an energy of 1 GeV. The comparisons show that in most of the cases of interest for such evaluations, the FLUKA Monte Carlo reproduces radioisotope production cross-sections within less than a factor of two with respect to the measured values. This result implies that the FLUKA calculations are adequately accurate for proton-induced activation estimates at a 1 GeV high-power proton driver complex.

  7. Optix: A Monte Carlo scintillation light transport code

    Energy Technology Data Exchange (ETDEWEB)

    Safari, M.J., E-mail: mjsafari@aut.ac.ir [Department of Energy Engineering and Physics, Amir Kabir University of Technology, PO Box 15875-4413, Tehran (Iran, Islamic Republic of); Afarideh, H. [Department of Energy Engineering and Physics, Amir Kabir University of Technology, PO Box 15875-4413, Tehran (Iran, Islamic Republic of); Ghal-Eh, N. [School of Physics, Damghan University, PO Box 36716-41167, Damghan (Iran, Islamic Republic of); Davani, F. Abbasi [Nuclear Engineering Department, Shahid Beheshti University, PO Box 1983963113, Tehran (Iran, Islamic Republic of)

    2014-02-11

    The paper reports on the capabilities of the Monte Carlo scintillation light transport code Optix, which is an extended version of the previously introduced code Optics. Optix provides the user with a variety of both numerical and graphical outputs with a very simple and user-friendly input structure. A benchmarking strategy has been adopted based on the comparison with experimental results, semi-analytical solutions, and other Monte Carlo simulation codes to verify various aspects of the developed code. In addition, extensive comparisons have been made with the tracking capabilities of the general-purpose MCNPX and FLUKA codes. The presented benchmark results for the Optix code exhibit promising agreement. -- Highlights: • Monte Carlo simulation of scintillation light transport in 3D geometry. • Evaluation of angular distribution of detected photons. • Benchmark studies to check the accuracy of Monte Carlo simulations.

  8. FLUKA Monte Carlo Modelling of the CHARM Facility’s Test Area: Update of the Radiation Field Assessment

    CERN Document Server

    Infantino, Angelo

    2017-01-01

    The present Accelerator Note is a follow-up of the previous report CERN-ACC-NOTE-2016-12345. In the present work, the FLUKA Monte Carlo model of CERN’s CHARM facility has been updated to the most recent configuration of the facility, including new test positions, a global refinement of the FLUKA geometry and a careful review of the transport and physics parameters. Several configurations of the facility, in terms of target material and movable shielding configuration, have been simulated. The full set of results is reported in the following and can act as a reference guide for any potential user of the facility.

  9. TU-EF-304-10: Efficient Multiscale Simulation of the Proton Relative Biological Effectiveness (RBE) for DNA Double Strand Break (DSB) Induction and Bio-Effective Dose in the FLUKA Monte Carlo Radiation Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Moskvin, V; Tsiamas, P; Axente, M; Farr, J [St. Jude Children’s Research Hospital, Memphis, TN (United States); Stewart, R [University of Washington, Seattle, WA. (United States)

    2015-06-15

    Purpose: One of the more critical initiating events for reproductive cell death is the creation of a DNA double strand break (DSB). In this study, we present a computationally efficient way to determine spatial variations in the relative biological effectiveness (RBE) of proton therapy beams within the FLUKA Monte Carlo (MC) code. Methods: We used the independently tested Monte Carlo Damage Simulation (MCDS) developed by Stewart and colleagues (Radiat. Res. 176, 587–602 2011) to estimate the RBE for DSB induction of monoenergetic protons, tritium, deuterium, helium-3, helium-4 ions and delta-electrons. The dose-weighted RBE coefficients were incorporated into FLUKA to determine the equivalent {sup 60}Co γ-ray dose for representative proton beams incident on cells in aerobic and anoxic environments. Results: We found that the proton beam RBE for DSB induction at the tip of the Bragg peak, including primary and secondary particles, is close to 1.2. Furthermore, the RBE increases laterally away from the beam axis in the region of the Bragg peak. At the distal edge, the RBE is in the range of 1.3–1.4 for cells irradiated under aerobic conditions and may be as large as 1.5–1.8 for cells irradiated under anoxic conditions. Across the plateau region, the recorded RBE for DSB induction is 1.02 for aerobic cells and 1.05 for cells irradiated under anoxic conditions. The contribution to the total effective dose from secondary heavy ions decreases with depth and is higher at shallow depths (e.g., at the surface of the skin). Conclusion: Multiscale simulation of the RBE for DSB induction provides useful insights into spatial variations in proton RBE within pristine Bragg peaks. This methodology is potentially useful for the biological optimization of proton therapy for the treatment of cancer. The study highlights the need to incorporate spatial variations in proton RBE into proton therapy treatment plans.
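
    Per scoring voxel, the dose-weighted combination described above amounts to summing the dose delivered by each particle type multiplied by its RBE for DSB induction. The sketch below shows only that weighting step; the particle list and numerical values are illustrative placeholders, not the MCDS coefficients used in the study.

        # Illustrative dose-weighted RBE combination for one voxel. The RBE values
        # are placeholders of plausible magnitude, not the MCDS coefficients.
        def rbe_weighted_dose(dose_by_particle, rbe_by_particle):
            """Return (RBE-weighted dose, dose-averaged RBE) for one voxel."""
            d_eq = sum(d * rbe_by_particle[p] for p, d in dose_by_particle.items())
            d_tot = sum(dose_by_particle.values())
            return d_eq, (d_eq / d_tot if d_tot > 0.0 else 0.0)

        dose = {"proton": 1.80, "alpha": 0.15, "electron": 0.05}   # Gy, hypothetical
        rbe  = {"proton": 1.10, "alpha": 1.60, "electron": 1.00}   # hypothetical
        print(rbe_weighted_dose(dose, rbe))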

  10. Production of secondary particles and nuclei in cosmic rays collisions with the interstellar gas using the FLUKA code

    CERN Document Server

    Mazziotta, M N; Ferrari, A; Gaggero, D; Loparco, F; Sala, P R

    2016-01-01

    The measured fluxes of secondary particles produced by the interactions of Cosmic Rays (CRs) with the astronomical environment play a crucial role in understanding the physics of CR transport. In this work we present a comprehensive calculation of the secondary hadron, lepton, gamma-ray and neutrino yields produced by the inelastic interactions between several species of stable or long-lived cosmic rays projectiles (p, D, T, 3He, 4He, 6Li, 7Li, 9Be, 10Be, 10B, 11B, 12C, 13C, 14C, 14N, 15N, 16O, 17O, 18O, 20Ne, 24Mg and 28Si) and different target gas nuclei (p, 4He, 12C, 14N, 16O, 20Ne, 24Mg, 28Si and 40Ar). The yields are calculated using FLUKA, a simulation package designed to compute the energy distributions of secondary products with large accuracy in a wide energy range. The present results provide, for the first time, a complete and self-consistent set of all the relevant inclusive cross sections regarding the whole spectrum of secondary products in nuclear collisions. We cover, for the projectiles, a ki...
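
    Conceptually, yields of this kind enter cosmic-ray propagation as a source term obtained by folding each projectile spectrum with the corresponding inclusive differential cross section and the target gas density. The quadrature below is only a schematic of that folding; the spectra and cross-section arrays are invented stand-ins, not the FLUKA tables of the paper.

        import numpy as np

        # Schematic source-term folding q(E_s) = n_gas * integral dE_p Phi_p(E_p) dsigma/dE_s.
        # All arrays below are invented placeholders.
        E_p = np.logspace(0, 4, 200)                      # projectile energies (GeV)
        E_s = np.logspace(-1, 3, 100)                     # secondary energies (GeV)
        phi_p = 1.0e-5 * E_p**-2.7                        # toy cosmic-ray proton spectrum
        dsigma = 1.0e-27 * np.outer(E_s**-2.0, E_p / (E_p + 1.0))  # toy dsigma/dE_s (cm^2/GeV)
        n_gas = 1.0                                       # toy target gas density (cm^-3)

        # Trapezoidal quadrature over projectile energy for every secondary-energy bin
        q = n_gas * np.trapz(phi_p * dsigma, E_p, axis=1)
        print(q.shape, q.max())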

  11. A Simple Ripple Filter for FLUKA

    DEFF Research Database (Denmark)

    Bassler, Niels; Herrmann, Rochus

    In heavy ion radiotherapy, pristine C-12 beams are usually widened by a few mm (FWHM) along the beam axis before the actual spread-out Bragg peak (SOBP) is built. The pristine beam widening is commonly performed with a ripple filter, known from the facility at GSI (Darmstadt) and at HIT (Heidelberg......). The ripple filter at GSI and HIT consists of several wedge-like structures, which widen the Bragg peak up to e.g. 3 mm. For Monte Carlo simulations of C-12 therapy, the exact setup, including the ripple filter, needs to be simulated. In the Monte Carlo particle transport program FLUKA, the ripple filter can....... Since the ripple filter is a periodic geometry, one could use the LATTICE card with advantage, but here we shall take a Monte Carlo based approach instead. The advantage of this method is that our input file merely contains one body as the ripple filter, which can be a flat slab (or any other arbitrary

  12. NASA space radiation transport code development consortium

    International Nuclear Information System (INIS)

    Townsend, L. W.

    2005-01-01

    Recently, NASA established a consortium involving the Univ. of Tennessee (lead institution), the Univ. of Houston, Roanoke College and various government and national laboratories, to accelerate the development of a standard set of radiation transport computer codes for NASA human exploration applications. This effort involves further improvements of the Monte Carlo codes HETC and FLUKA and the deterministic code HZETRN, including developing nuclear reaction databases necessary to extend the Monte Carlo codes to carry out heavy ion transport, and extending HZETRN to three dimensions. The improved codes will be validated by comparing predictions with measured laboratory transport data, provided by an experimental measurements consortium, and measurements in the upper atmosphere on the balloon-borne Deep Space Test Bed (DSTB). In this paper, we present an overview of the consortium members and the current status and future plans of consortium efforts to meet the research goals and objectives of this extensive undertaking. (authors)

  13. PHITS simulations of the Protective curtain experiment onboard the Service module of ISS: Comparison with absorbed doses measured with TLDs

    Czech Academy of Sciences Publication Activity Database

    Ploc, Ondřej; Sihver, L.; Kartashov, D.; Shurshakov, V.; Tolochek, R. V.

    2013-01-01

    Vol. 52, No. 11 (2013), pp. 1911-1918 ISSN 0273-1177 Institutional support: RVO:61389005 Keywords: protective curtain experiment * shielding of cosmic radiation * PHITS simulations * ISS Subject RIV: BG - Nuclear, Atomic and Molecular Physics, Colliders Impact factor: 1.238, year: 2013

  14. Quantification of the validity of simulations based on Geant4 and FLUKA for photo-nuclear interactions in the high energy range

    Science.gov (United States)

    Quintieri, Lina; Pia, Maria Grazia; Augelli, Mauro; Saracco, Paolo; Capogni, Marco; Guarnieri, Guido

    2017-09-01

    Photo-nuclear interactions are relevant in many research fields of both fundamental and applied physics and, for this reason, accurate Monte Carlo simulations of photo-nuclear interactions can provide valuable and indispensable support in a wide range of applications (e.g. from the optimisation of photo-neutron source targets to dosimetric estimations at high-energy accelerators). Unfortunately, few experimental photo-nuclear data are available above 100 MeV, so that, in the high energy range (from hundreds of MeV up to the GeV scale), the code predictions are based on physical models. The aim of this work is to compare the predictions of relevant observables involving photon-nuclear interaction modelling, obtained with GEANT4 and FLUKA, to experimental data (where available), in order to assess the reliability of the code estimations over a wide energy range. In particular, the comparison of the estimated photo-neutron yields and energy spectra with the experimental results of the n@BTF experiment (carried out at the Beam Test Facility of the DaΦne collider in Frascati, Italy) is reported and discussed here. Moreover, preliminary results of the comparison of the cross sections used in the codes with the "evaluated" data recommended by the IAEA are also presented for some selected cases (W, Pb, Zn).

  15. East Area Irradiation Test Facility: Preliminary FLUKA calculations

    CERN Document Server

    Lebbos, E; Calviani, M; Gatignon, L; Glaser, M; Moll, M; CERN. Geneva. ATS Department

    2011-01-01

    In the framework of the Radiation to Electronics (R2E) mitigation project, the testing of electronic equipment in a radiation field similar to the one occurring in the LHC tunnel and shielded areas, to study its sensitivity to single event upsets (SEU), is one of the main topics. Adequate irradiation test facilities are therefore required, and one installation is under consideration in the framework of the PS East Area renovation activity. FLUKA Monte Carlo calculations were performed in order to estimate the radiation field which could be obtained in a mixed-field facility using the slowly extracted 24 GeV/c proton beam from the PS. The prompt ambient dose equivalent as well as the residual dose rate after operation were also studied, and the results of the simulations are presented in this report.

  16. Benchmarking study and its application for shielding analysis of large accelerator facilities

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hee-Seock; Kim, Dong-hyun; Oranj, Leila Mokhtari; Oh, Joo-Hee; Lee, Arim; Jung, Nam-Suk [POSTECH, Pohang (Korea, Republic of)

    2015-10-15

    Shielding analysis is one of the subjects indispensable to the construction of a large accelerator facility. Several methods, such as Monte Carlo, discrete ordinates, and simplified calculations, have been used for this purpose. The statistical precision can be improved by increasing the number of trials (histories); however, the accuracy is still a big issue in shielding analysis. To secure accuracy in Monte Carlo calculations, benchmarking studies using experimental data and code comparisons are fundamental. In this paper, benchmarking results for electrons, protons, and heavy ions are presented and the proper application of the results is discussed. The benchmarking calculations, which are indispensable in shielding analysis, were performed for different particles: protons, heavy ions and electrons. Four different multi-particle Monte Carlo codes, MCNPX, FLUKA, PHITS, and MARS, were examined for the higher energy range relevant to large accelerator facilities. The degree of agreement between the experimental data, including the SINBAD database, and the calculated results was estimated in terms of secondary neutron production and attenuation through the concrete and iron shields. The degree of discrepancy and the features of the Monte Carlo codes were investigated, and the application of the benchmarking results is discussed in view of the safety margin and of selecting the code for the shielding analysis. In most cases, the tested Monte Carlo codes give credible results, except for a few limitations of each code.
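
    A quantity often extracted in attenuation benchmarks of this kind is an effective attenuation length, obtained by fitting an exponential to the dose or reaction-rate profile through the shield, with calculated-to-experimental (C/E) ratios as the per-point figure of merit. The sketch below illustrates only that post-processing step; the depth points and values are invented, not data from this study.

        import numpy as np

        # Effective attenuation length from a depth profile, plus C/E ratios
        # (all depth points and values are invented, for illustration only).
        depth_cm  = np.array([0.0, 50.0, 100.0, 150.0, 200.0])        # shield depth
        dose_calc = np.array([1.0, 3.1e-1, 9.5e-2, 3.0e-2, 9.6e-3])   # simulated (a.u.)
        dose_meas = np.array([1.0, 3.3e-1, 1.0e-1, 3.4e-2, 1.1e-2])   # "measured" (a.u.)

        # Linear fit of ln(dose) vs depth gives the effective attenuation length
        slope, _ = np.polyfit(depth_cm, np.log(dose_calc), 1)
        print("attenuation length ~ %.1f cm" % (-1.0 / slope))

        # Point-wise calculated-to-experimental ratios
        print("C/E:", dose_calc / dose_meas)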

  17. Benchmarking study and its application for shielding analysis of large accelerator facilities

    International Nuclear Information System (INIS)

    Lee, Hee-Seock; Kim, Dong-hyun; Oranj, Leila Mokhtari; Oh, Joo-Hee; Lee, Arim; Jung, Nam-Suk

    2015-01-01

    Shielding analysis is one of the subjects indispensable to the construction of a large accelerator facility. Several methods, such as Monte Carlo, discrete ordinates, and simplified calculations, have been used for this purpose. The statistical precision can be improved by increasing the number of trials (histories); however, the accuracy is still a big issue in shielding analysis. To secure accuracy in Monte Carlo calculations, benchmarking studies using experimental data and code comparisons are fundamental. In this paper, benchmarking results for electrons, protons, and heavy ions are presented and the proper application of the results is discussed. The benchmarking calculations, which are indispensable in shielding analysis, were performed for different particles: protons, heavy ions and electrons. Four different multi-particle Monte Carlo codes, MCNPX, FLUKA, PHITS, and MARS, were examined for the higher energy range relevant to large accelerator facilities. The degree of agreement between the experimental data, including the SINBAD database, and the calculated results was estimated in terms of secondary neutron production and attenuation through the concrete and iron shields. The degree of discrepancy and the features of the Monte Carlo codes were investigated, and the application of the benchmarking results is discussed in view of the safety margin and of selecting the code for the shielding analysis. In most cases, the tested Monte Carlo codes give credible results, except for a few limitations of each code.

  18. Benchmark of the FLUKA model of crystal channeling against the UA9-H8 experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schoofs, P.; Cerutti, F.; Ferrari, A. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Smirnov, G. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Joint Institute for Nuclear Research (JINR), Dubna (Russian Federation)

    2015-07-15

    Channeling in bent crystals is increasingly considered as an option for the collimation of high-energy particle beams. The installation of crystals in the LHC has taken place during the past year and aims at demonstrating the feasibility of crystal collimation and a possible improvement in cleaning efficiency. The performance of the CERN collimation insertions is evaluated with the Monte Carlo code FLUKA, which is capable of simulating energy deposition in collimators as well as beam loss monitor signals. A new model of crystal channeling was developed specifically so that similar simulations can be conducted for crystal-assisted collimation. In this paper, the most recent results of this model are presented in the framework of a joint activity inside the UA9 collaboration to benchmark the different simulation tools available. The performance of crystal STF 45, produced at INFN Ferrara, was measured at the H8 beamline at CERN in 2010 and serves as the basis for the comparison. Distributions of deflected particles are shown to be in very good agreement with experimental data. Calculated dechanneling lengths and the crystal performance in the transition region between the amorphous regime and volume reflection are also close to the measured ones.

  19. Fluka and thermo-mechanical studies for the CLIC main dump

    CERN Document Server

    Mereghetti, Alessio; Vlachoudis, Vasilis

    2011-01-01

    In order to best cope with the challenge of absorbing the multi-MW beam, a water beam dump at the end of the CLIC post-collision line has been proposed. The design of the dump for the Conceptual Design Report (CDR) was checked with a set of FLUKA Monte Carlo simulations, estimating the peak and total power absorbed by the water and the vessel. Fluence spectra of escaping particles and activation rates of radio-nuclides were computed as well. Finally, the thermal transient behavior of the water bath and a thermo-mechanical analysis of the preliminary design of the window were performed.

  20. Comparison of physics models for 600 MeV protons and 290 MeV·n{sup -1} oxygen ions on carbon in MCNPX

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Arim; Kim, Dong Hyun; Jung, Nam Suk; Oh, Joo Hee [Pohang Accelerator Laboratory, POSTECH, Pohang (Korea, Republic of); Oranj, Leila Mokhtari [Pohang University of Science and Technology, Pohang (Korea, Republic of)

    2016-06-15

    With the increase in the number of particle accelerator facilities under operation or construction, accurate calculations using Monte Carlo codes become more important for the shielding design and radiation safety evaluation of accelerator facilities. Calculations with different physics models were performed in two ways: using only physics models, and using the mix-and-match method of the MCNPX code. The cases considered were the interactions of 600 MeV protons and 290 MeV·n{sup -1} oxygen ions with a carbon target. Both cross-section libraries, JENDL High Energy File 2007 (JENDL/HE-2007) and LA150, were tested in this calculation. In the case of oxygen ion interactions, the calculation results using the LAQGSM physics model and the JENDL/HE-2007 library were compared with D. Satoh's experimental data. Other Monte Carlo calculations using the PHITS and FLUKA codes were also carried out for a further benchmarking study. It was clearly found that the physics models, especially the intra-nuclear cascade model, strongly affect the proton-induced secondary neutron spectrum in the MCNPX code. The choice of physics models related to heavy ion interactions did not make a large difference in the secondary particle production. The variations of the secondary neutron spectra and particle transport depending on the various physics models in the MCNPX code were studied, and the results of this study can be used for shielding design and radiation safety evaluation.

  1. Depth profile of production yields of {sup nat}Pb(p, xn) {sup 206,205,204,203,202,201}Bi nuclear reactions

    Energy Technology Data Exchange (ETDEWEB)

    Mokhtari Oranj, Leila [Division of Advanced Nuclear Engineering, POSTECH, Pohang 37673 (Korea, Republic of); Jung, Nam-Suk; Kim, Dong-Hyun; Lee, Arim; Bae, Oryun [Pohang Accelerator Laboratory, POSTECH, Pohang 37673 (Korea, Republic of); Lee, Hee-Seock, E-mail: lee@postech.ac.kr [Pohang Accelerator Laboratory, POSTECH, Pohang 37673 (Korea, Republic of)

    2016-11-01

    Experimental and simulation studies on the depth profiles of the production yields of {sup nat}Pb(p, xn){sup 206,205,204,203,202,201}Bi nuclear reactions were carried out. Irradiation experiments were performed at the high-intensity proton linac facility (KOMAC) in Korea. The targets, irradiated by 100-MeV protons, were arranged in a stack consisting of natural Pb, Al and Au foils and Pb plates. The proton beam intensity was determined by the activation analysis method, using the {sup 27}Al(p, 3p1n){sup 24}Na, {sup 197}Au(p, p1n){sup 196}Au, and {sup 197}Au(p, p3n){sup 194}Au monitor reactions, and also by Gafchromic film dosimetry. The yields of the radionuclides produced in the {sup nat}Pb activation foils and monitor foils were measured with an HPGe spectroscopy system. Monte Carlo simulations were performed with the FLUKA, PHITS/DCHAIN-SP, and MCNPX/FISPACT codes and the calculated data were compared with the experimental results. A satisfactory agreement was observed between the present experimental data and the simulations.
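
    In an activation-analysis determination of the beam intensity like the one referred to above, the measured end-of-bombardment activity of a monitor foil is divided by the areal atom density, the monitor-reaction cross section and the saturation factor of the irradiation. The sketch below uses that standard activation relation with invented numbers; it is not the KOMAC analysis itself, and only decay during a constant irradiation is accounted for.

        import math

        # Beam intensity from monitor-foil activation (sketch, invented inputs):
        #   A_EOB = phi * n_t * sigma * (1 - exp(-lambda * t_irr))
        AVOGADRO = 6.022e23

        def beam_intensity(a_eob_bq, foil_g_cm2, mass_number, sigma_mb, half_life_s, t_irr_s):
            n_t = foil_g_cm2 * AVOGADRO / mass_number        # target atoms per cm^2
            sigma_cm2 = sigma_mb * 1.0e-27                   # mb -> cm^2
            lam = math.log(2.0) / half_life_s
            saturation = 1.0 - math.exp(-lam * t_irr_s)
            return a_eob_bq / (n_t * sigma_cm2 * saturation) # protons per second

        # Example with invented values for a 24Na monitor activity (~15 h half-life):
        print(beam_intensity(a_eob_bq=5.0e3, foil_g_cm2=0.027, mass_number=27.0,
                             sigma_mb=10.0, half_life_s=15.0 * 3600.0, t_irr_s=600.0))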

  2. Measurements of activation reaction rates in transverse shielding concrete exposed to the secondary particle field produced by intermediate energy heavy ions on an iron target

    International Nuclear Information System (INIS)

    Ogawa, T.; Morev, M.N.; Iimoto, T.; Kosako, T.

    2012-01-01

    Reaction rate distributions were measured inside a 60-cm thick concrete pile placed at the lateral position of a thick (stopping length) iron target that was bombarded with heavy ions, 400 MeV/u C and 800 MeV/u Si. Foils of aluminum and gold, as well as gold, tungsten and manganese covered with cadmium were inserted at various locations in the concrete pile to serve as activation detectors. Features of reaction rate distribution, such as the shape of the reaction rate profile, contribution of the neutrons from intra-nuclear cascade and that from evaporation to the activation reactions are determined by the analysis of measured reaction rates. The measured reaction rates were compared with those calculated with radiation transport simulation codes, FLUKA and PHITS, to verify their capability to predict induced activity. The simulated reaction rates agree with the experimental results within a factor of three in general. However, systematic discrepancies between simulated reaction rates and measured reaction rates attributed to the neutron source terms are observed.

  3. Validation of PHITS Spallation Models from the Perspective of the Shielding Design of Transmutation Experimental Facility

    Science.gov (United States)

    Iwamoto, Hiroki; Meigo, Shin-ichiro

    2017-09-01

    The impact of different spallation models implemented in the particle transport code PHITS on the shielding design of the Transmutation Experimental Facility is investigated. For 400-MeV protons incident on a lead-bismuth eutectic target, the effective dose rate at the end of a thick radiation shield (3-m-thick iron and 3-m-thick concrete) calculated by the Liège intranuclear cascade (INC) model version 4.6 coupled with the GEM code (INCL4.6/GEM) is about twice as high as that calculated by the Bertini INC model (Bertini/GEM). A comparison with experimental data for 500-MeV protons incident on a thick lead target suggests that the prediction accuracy of INCL4.6/GEM would be better than that of Bertini/GEM. In contrast, it is found that the dose rates in beam ducts in front of the targets calculated by INCL4.6/GEM are lower than those by Bertini/GEM. Since both models underestimate the experimental results for neutron-production double-differential cross sections at 180° for 140-MeV protons incident on carbon, iron, and gold targets, it is concluded that it is necessary to allow a margin for the uncertainty caused by the spallation models, which is a factor of two, in estimating the dose rate induced by neutron streaming through a beam duct.

  4. Computer codes in particle transport physics

    International Nuclear Information System (INIS)

    Pesic, M.

    2004-01-01

    Simulation of the transport and interactions of various particles in complex media over a wide energy range (from 1 MeV up to 1 TeV) is a very complicated problem that requires a valid model of the real process in nature and appropriate solving tools - computer codes and data libraries. A brief overview of computer codes based on Monte Carlo techniques for the simulation of the transport and interactions of hadrons and ions over a wide energy range in three-dimensional (3D) geometry is given. Firstly, attention is paid to the approach to the solution of the problem - a process in nature - by selection of an appropriate 3D model and the corresponding tools - computer codes and cross-section data libraries. The process of collecting and evaluating data from experimental measurements, and the theoretical approach to establishing reliable libraries of evaluated cross-section data, is a long, difficult and not straightforward activity. For this reason, world reference data centers and specialized ones are acknowledged, together with the currently available, state-of-the-art evaluated nuclear data libraries, such as ENDF/B-VI, JEF, JENDL, CENDL, BROND, etc. Codes for experimental and theoretical data evaluation (e.g., SAMMY and GNASH), together with codes for data processing (e.g., NJOY, PREPRO and GRUCON), are briefly described. Examples of data evaluation and data processing to generate computer-usable data libraries are shown. Among the numerous and various computer codes developed in particle transport physics, only the most general ones are described: MCNPX, FLUKA and SHIELD. A short overview of the basic applications of these codes, the physical models implemented with their limitations, and the energy ranges of particles and types of interactions is given. General information about the codes also covers the programming language, operating system, calculation speed and code availability. An example of increasing the computation speed of the MCNPX code using an MPI cluster compared to the code's sequential option

  5. FLUKA simulations of a moderated reduced weight high energy neutron detection system

    Energy Technology Data Exchange (ETDEWEB)

    Biju, K., E-mail: bijusivolli@gmail.com [Health Physics Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Tripathy, S.P.; Sunil, C.; Sarkar, P.K. [Health Physics Division, Bhabha Atomic Research Centre, Mumbai 400085 (India)

    2012-08-01

    The neutron response of systems containing high-density polyethylene (HDPE) spheres coupled with different external metallic converters has been studied using the FLUKA Monte Carlo simulation code. A diameter of 17.8 cm (7 in.) of the moderating sphere is found to be optimal for obtaining the maximum response when used with neutron converter shells such as W, Pb and Zr. Enhancement ratios of the neutron response due to the induced (n, xn) reactions in the outer converters made of W, Pb and Zr are analyzed. It is observed that the enhancement in the response by a 1 cm thick Zr shell is comparable to that of 1 cm thick Pb in the energy region of 10-50 MeV. An appreciable enhancement is observed in the case of the Zr converter for higher energy neutrons. Thus, by reducing the dimension of the moderating sphere and using a Zr converter shell, the weight of the system is reduced to 10 kg, which is less than that of the presently available extended-range high energy neutron rem meters. The normalized energy-dependent ambient dose equivalent response of the zirconium-based rem counter (ZReC) at high energies is found to be in good agreement with the energy-differential H*(10) values suggested by the International Commission on Radiological Protection (ICRP). Based on this study, it is proposed that a rem meter made of a 17.8 cm diameter HDPE sphere with 1 cm thick Zr can be used effectively and conveniently for routine monitoring in the accelerator environment.

  6. Design and spectrum calculation of 4H-SiC thermal neutron detectors using FLUKA and TCAD

    Science.gov (United States)

    Huang, Haili; Tang, Xiaoyan; Guo, Hui; Zhang, Yimen; Zhang, Yimeng; Zhang, Yuming

    2016-10-01

    SiC is a promising material for neutron detection in harsh environments due to its wide band gap, high displacement threshold energy and high thermal conductivity. To increase the detection efficiency of SiC, a converter such as 6LiF or 10B is introduced. In this paper, pulse-height spectra of a PIN diode with a 6LiF conversion layer exposed to thermal neutrons (0.026 eV) are calculated using TCAD and Monte Carlo simulations. First, the conversion efficiency for thermal neutrons as a function of the 6LiF thickness was calculated using the FLUKA code, and a maximal efficiency of approximately 5% was achieved. Next, the energy distributions of both the 3H and α particles produced by neutron capture in 6LiF are analyzed for different ranges of emission angle. Subsequently, the transient pulses generated by the impact of single 3H or α-particles are calculated. Finally, pulse-height spectra are obtained with a detector efficiency of 4.53%. Comparisons of the simulated result with the experimental data are also presented, and the calculated spectrum shows an acceptable similarity to the experimental data. This work would be useful for radiation-sensing applications, especially for SiC detector design.
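
    The few-percent ceiling on the conversion efficiency reflects the competition between neutron capture in the 6LiF layer and self-absorption of the 3H/alpha products before they reach the SiC diode. The toy model below reproduces only that qualitative trade-off: the ~940 b thermal 6Li(n,t) cross section is a commonly quoted value, while the density, enrichment, effective product range and the 0.5 solid-angle factor are rough assumptions, not parameters taken from the paper.

        import math

        # Toy conversion efficiency vs 6LiF thickness (rough assumptions, see above).
        SIGMA_B   = 940.0e-24                          # 6Li(n,t) thermal cross section, cm^2
        N_6LI     = 0.95 * 2.54 * 6.022e23 / 25.0      # 6Li atoms per cm^3 (assumed enrichment/density)
        SIGMA_MAC = N_6LI * SIGMA_B                    # macroscopic cross section, cm^-1
        R_PRODUCT = 30.0e-4                            # assumed effective product range, cm

        def efficiency(t_cm):
            """Captures within one product range of the SiC interface, times 0.5 for isotropic emission."""
            x0 = max(0.0, t_cm - R_PRODUCT)
            return 0.5 * (math.exp(-SIGMA_MAC * x0) - math.exp(-SIGMA_MAC * t_cm))

        for t_um in (5, 10, 20, 30, 50, 100):
            print(t_um, "um ->", round(100.0 * efficiency(t_um * 1.0e-4), 2), "%")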

  7. Estimation of neutron production from accelerator head assembly of 15 MV medical LINAC using FLUKA simulations

    Energy Technology Data Exchange (ETDEWEB)

    Patil, B.J., E-mail: bjp@physics.unipune.ac.in [Department of Physics, University of Pune, Pune 411 007 (India); Chavan, S.T., E-mail: sharad@sameer.gov.in [SAMEER, IIT Powai Campus, Mumbai 400 076 (India); Pethe, S.N., E-mail: sanjay@sameer.gov.in [SAMEER, IIT Powai Campus, Mumbai 400 076 (India); Krishnan, R., E-mail: krishnan@sameer.gov.in [SAMEER, IIT Powai Campus, Mumbai 400 076 (India); Bhoraskar, V.N., E-mail: vnb@physics.unipune.ac.in [Department of Physics, University of Pune, Pune 411 007 (India); Dhole, S.D., E-mail: sanjay@physics.unipune.ac.in [Department of Physics, University of Pune, Pune 411 007 (India)

    2011-12-15

    For the production of a clinical 15 MeV photon beam, the design of the accelerator head assembly has been optimized using the Monte Carlo based FLUKA code. The accelerator head assembly consists of an e-{gamma} target, a flattening filter, a primary collimator and an adjustable rectangular secondary collimator. The accelerators used for radiation therapy generate continuous-energy gamma rays, called Bremsstrahlung (BR), by impinging high energy electrons on high-Z materials. Electron accelerators operating above 10 MeV can also produce neutrons, mainly through the photo-nuclear reaction ({gamma}, n) induced by high energy photons in the accelerator head materials. These neutrons contaminate the therapeutic beam and give a non-negligible contribution to the patient dose. The gamma dose and neutron dose equivalent at the patient plane (SSD = 100 cm) were obtained for field sizes of 0 × 0, 10 × 10, 20 × 20, 30 × 30 and 40 × 40 cm{sup 2}. The maximum neutron dose equivalent is observed near the central axis for the 30 × 30 cm{sup 2} field size. This is 0.71% of the central-axis photon dose rate of 0.34 Gy/min at 1 {mu}A electron beam current.

  8. Analysis of multi-fragmentation reactions induced by relativistic heavy ions using the statistical multi-fragmentation model

    Energy Technology Data Exchange (ETDEWEB)

    Ogawa, T., E-mail: ogawa.tatsuhiko@jaea.go.jp [Research Group for Radiation Protection, Division of Environment and Radiation Sciences, Nuclear Science and Engineering Directorate, Japan Atomic Energy Agency, Shirakata-Shirane, Tokai, Ibaraki 319-1195 (Japan); Sato, T.; Hashimoto, S. [Research Group for Radiation Protection, Division of Environment and Radiation Sciences, Nuclear Science and Engineering Directorate, Japan Atomic Energy Agency, Shirakata-Shirane, Tokai, Ibaraki 319-1195 (Japan); Niita, K. [Research Organization for Information Science and Technology, Shirakata-shirane, Tokai, Ibaraki 319-1188 (Japan)

    2013-09-21

    The fragmentation cross-sections of relativistic energy nucleus–nucleus collisions were analyzed using the statistical multi-fragmentation model (SMM) incorporated with the Monte-Carlo radiation transport simulation code particle and heavy ion transport code system (PHITS). Comparison with the literature data showed that PHITS-SMM reproduces fragmentation cross-sections of heavy nuclei at relativistic energies better than the original PHITS by up to two orders of magnitude. It was also found that SMM does not degrade the neutron production cross-sections in heavy ion collisions or the fragmentation cross-sections of light nuclei, for which SMM has not been benchmarked. Therefore, SMM is a robust model that can supplement conventional nucleus–nucleus reaction models, enabling more accurate prediction of fragmentation cross-sections.

  9. Analysis of multi-fragmentation reactions induced by relativistic heavy ions using the statistical multi-fragmentation model

    International Nuclear Information System (INIS)

    Ogawa, T.; Sato, T.; Hashimoto, S.; Niita, K.

    2013-01-01

    The fragmentation cross-sections of relativistic energy nucleus–nucleus collisions were analyzed using the statistical multi-fragmentation model (SMM) incorporated with the Monte-Carlo radiation transport simulation code particle and heavy ion transport code system (PHITS). Comparison with the literature data showed that PHITS-SMM reproduces fragmentation cross-sections of heavy nuclei at relativistic energies better than the original PHITS by up to two orders of magnitude. It was also found that SMM does not degrade the neutron production cross-sections in heavy ion collisions or the fragmentation cross-sections of light nuclei, for which SMM has not been benchmarked. Therefore, SMM is a robust model that can supplement conventional nucleus–nucleus reaction models, enabling more accurate prediction of fragmentation cross-sections

  10. Hadronic models and experimental data for the neutrino beam production

    CERN Document Server

    Collazuol, G; Guglielmi, A M; Sala, P R

    2000-01-01

    The predictions of meson production by 450 GeV/c protons on Be using the standalone Monte Carlo code FLUKA, and the GEANT-FLUKA and GEANT-GHEISHA generators within GEANT, are compared with available experimental measurements. The comparison highlights the improvements of the hadronic generator models of the present standalone FLUKA code with respect to the 1992 version which is embedded in GEANT-FLUKA. Worse results were obtained with the GHEISHA package. A complete simulation of the SPS neutrino beam line at CERN showed significant variations in the intensity and composition of the neutrino beam when standalone FLUKA is used instead of the GEANT-FLUKA package to simulate particle production in the Be target.

  11. Hadronic models and experimental data for the neutrino beam production

    International Nuclear Information System (INIS)

    Collazuol, G.; Ferrari, A.; Guglielmi, A.; Sala, P.R.

    2000-01-01

    The predictions of meson production by 450 GeV/c protons on Be using the standalone Monte Carlo code FLUKA, and the GEANT-FLUKA and GEANT-GHEISHA generators within GEANT, are compared with available experimental measurements. The comparison highlights the improvements of the hadronic generator models of the present standalone FLUKA code with respect to the 1992 version which is embedded in GEANT-FLUKA. Worse results were obtained with the GHEISHA package. A complete simulation of the SPS neutrino beam line at CERN showed significant variations in the intensity and composition of the neutrino beam when standalone FLUKA is used instead of the GEANT-FLUKA package to simulate particle production in the Be target.

  12. SU-F-T-193: Evaluation of a GPU-Based Fast Monte Carlo Code for Proton Therapy Biological Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Taleei, R; Qin, N; Jiang, S [UT Southwestern Medical Center, Dallas, TX (United States); Peeler, C [UT MD Anderson Cancer Center, Houston, TX (United States); Jia, X [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)

    2016-06-15

    Purpose: Biological treatment plan optimization is of great interest for proton therapy. It requires extensive Monte Carlo (MC) simulations to compute physical dose and biological quantities. Recently, a gPMC package was developed for rapid MC dose calculations on a GPU platform. This work investigated its suitability for proton therapy biological optimization in terms of accuracy and efficiency. Methods: We performed simulations of a proton pencil beam with energies of 75, 150 and 225 MeV in a homogeneous water phantom using gPMC and FLUKA. The physical dose and the energy spectra for each ion type on the central beam axis were scored. The Relative Biological Effectiveness (RBE) was calculated using the repair-misrepair-fixation model. Microdosimetry calculations were performed using the Monte Carlo Damage Simulation (MCDS). Results: Ranges computed by the two codes agreed within 1 mm. The physical dose difference was less than 2.5% at the Bragg peak. The RBE-weighted dose agreed within 5% at the Bragg peak. Differences in microdosimetric quantities such as the dose-averaged lineal energy transfer and the specific energy were < 10%. The simulation time per source particle with FLUKA was 0.0018 sec, while gPMC was ∼600 times faster. Conclusion: The physical doses computed by FLUKA and gPMC were in good agreement. The RBE differences along the central axis were small, and the RBE-weighted dose difference was found to be acceptable. The combined accuracy and efficiency makes gPMC suitable for proton therapy biological optimization.

  13. Fluka studies of the Asynchronous Beam Dump Effects on LHC Point 6

    CERN Document Server

    Versaci, R; Goddard, B; Mereghetti, A; Schmidt, R; Vlachoudis, V; CERN. Geneva. ATS Department

    2011-01-01

    The LHC is a record-breaking machine in terms of beam energy and intensity. An intense effort has therefore been deployed in simulating critical operational scenarios of energy deposition. Using FLUKA Monte Carlo simulations, we have investigated the effects of an asynchronous beam dump at LHC Point 6, where the beams, with a stored energy of 360 MJ, can instantaneously release up to a few J cm^-3 in the cryogenic magnets, which have a quench limit of the order of mJ cm^-3. In the present paper we describe the simulation approach and discuss the evaluated maximum energy release onto the superconducting magnets during an asynchronous beam dump. We then analyze the shielding provided by collimators installed in the area and discuss safety limits for the operation of the LHC.

  14. The FLUKA study of the secondary particles fluence in the AD-Antiproton Decelerator target area

    CERN Document Server

    Calviani, M

    2014-01-01

    In this paper we present Monte Carlo FLUKA simulations [1, 2] carried out to investigate the secondary particles fluence emerging from the antiproton production target and their spatial distribution in the AD target area. The detailed quantitative analysis has been performed for different positions along the magnet dog-leg as well as after the main collimator. These results allow tuning the position of the new beam current transformers (BCT) in the target area, in order to have a precise pulse-by-pulse evaluation of the intensity of negative particles injected in the AD-ring before the deceleration phase.

  15. Path Toward a Unified Geometry for Radiation Transport

    Science.gov (United States)

    Lee, Kerry

    The Direct Accelerated Geometry for Radiation Analysis and Design (DAGRAD) element of the RadWorks Project under Advanced Exploration Systems (AES) within the Space Technology Mission Directorate (STMD) of NASA will enable new designs and concepts of operation for radiation risk assessment, mitigation and protection. This element is designed to produce a solution that will allow NASA to calculate the transport of space radiation through complex CAD models using the state-of-the-art analytic and Monte Carlo radiation transport codes. Due to the inherent hazard of astronaut and spacecraft exposure to ionizing radiation in low-Earth orbit (LEO) or in deep space, risk analyses must be performed for all crew vehicles and habitats. Incorporating these analyses into the design process can minimize the mass needed solely for radiation protection. Transport of the radiation fields as they pass through shielding and body materials can be simulated using Monte Carlo techniques or described by the Boltzmann equation, which is obtained by balancing changes in particle fluxes as they traverse a small volume of material with the gains and losses caused by atomic and nuclear collisions. Deterministic codes that solve the Boltzmann transport equation, such as HZETRN (high charge and energy transport code developed by NASA LaRC), are generally computationally faster than Monte Carlo codes such as FLUKA, GEANT4, MCNP(X) or PHITS; however, they are currently limited to transport in one dimension, which poorly represents the secondary light ion and neutron radiation fields. NASA currently uses HZETRN space radiation transport software, both because it is computationally efficient and because proven methods have been developed for using this software to analyze complex geometries. Although Monte Carlo codes describe the relevant physics in a fully three-dimensional manner, their computational costs have thus far prevented their widespread use for analysis of complex CAD models, leading
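
    The flux-balance reading of the Boltzmann equation sketched above is easiest to see in a one-dimensional, straight-ahead marching scheme, where the flux at the next depth step equals the current flux minus collisional losses plus a downscatter/production source. The explicit step below is only a schematic of that balance, not the HZETRN algorithm; the grid and cross sections are placeholders.

        import numpy as np

        # Schematic 1D straight-ahead transport step:
        #   phi(x+dx, E) = phi(x, E) - Sigma(E) phi(x, E) dx + dx * sum_E' Sigma(E'->E) phi(x, E')
        # (grid, cross sections and boundary spectrum are placeholders)
        n_e = 50
        E = np.linspace(1.0, 1000.0, n_e)                 # energy grid (MeV), toy
        sigma_tot = np.full(n_e, 0.01)                    # total macroscopic XS (1/cm), toy
        # Toy production kernel: each collision feeds all lower-energy bins uniformly
        sigma_scat = np.triu(np.full((n_e, n_e), 0.01 / n_e), k=1)   # [E_out, E_in], E_out < E_in

        phi = np.exp(-E / 200.0)                          # toy boundary spectrum
        dx = 1.0                                          # step size (cm)
        for _ in range(100):                              # march 100 cm into the shield
            production = sigma_scat @ phi
            phi = phi + dx * (production - sigma_tot * phi)

        print(phi[:5])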

  16. Distributions of neutron yields and doses around a water phantom bombarded with 290-MeV/nucleon and 430-MeV/nucleon carbon ions

    Energy Technology Data Exchange (ETDEWEB)

    Satoh, D., E-mail: satoh.daiki@jaea.go.jp [Japan Atomic Energy Agency, Tokai-mura, Naka-gun, Ibaraki 319-1195 (Japan); Kajimoto, T. [Hiroshima University, Kagamiyama, Higashi-Hiroshima-shi, Hiroshima 739-8527 (Japan); Shigyo, N.; Itashiki, Y.; Imabayashi, Y. [Kyushu University, Motooka, Nishi-ku, Fukuoka 819-0395 (Japan); Koba, Y.; Matsufuji, N. [National Institute of Radiological Sciences, Anagawa, Inage-ku, Chiba 263-8555 (Japan); Sanami, T. [High Energy Accelerator Research Organization, Oho-cho, Tsukuba-shi, Ibaraki 305-0801 (Japan); Nakao, N. [Shimizu Corporation, Etchujima, Koto-ku, Tokyo 135-8530 (Japan); Uozumi, Y. [Kyushu University, Motooka, Nishi-ku, Fukuoka 819-0395 (Japan)

    2016-11-15

    Double-differential neutron yields from a water phantom bombarded with 290-MeV/nucleon and 430-MeV/nucleon carbon ions were measured at emission angles of 15°, 30°, 45°, 60°, 75°, and 90°, and angular distributions of neutron yields and doses around the phantom were obtained. The experimental data were compared with results of the Monte-Carlo simulation code PHITS. The PHITS results showed good agreement with the measured data. On the basis of the PHITS simulation, we estimated the angular distributions of neutron yields and doses from 0° to 180° including thermal neutrons.

  17. Energy deposition profile for modification proposal of ISOLDE’s HRS Beam Dump, from FLUKA simulations

    CERN Document Server

    Vlachoudis, V

    2014-01-01

    The current ISOLDE HRS beam dump has been found to be unsuitable in previous simulations, due to thermomechanical stresses. In this paper a proposal for modifying the HRS dump is studied using FLUKA. The energy deposited in the modified beam dump and the amount of neutrons streaming to the tunnel area are scored and compared with the simulation of the current dump. Two versions of the modification have been assessed, determining which of them is more desirable in terms of the influence of radiation on ISOLDE’s tunnel. Finally, a rough estimate of the temperature rise in the modified dump is given. Further conclusions on the adequacy of these modifications need to include the thermomechanical calculation results, based on those presented here.

  18. Update on the Code Intercomparison and Benchmark for Muon Fluence and Absorbed Dose Induced by an 18 GeV Electron Beam After Massive Iron Shielding

    Energy Technology Data Exchange (ETDEWEB)

    Fasso, A. [SLAC; Ferrari, A. [CERN; Ferrari, A. [HZDR, Dresden; Mokhov, N. V. [Fermilab; Mueller, S. E. [HZDR, Dresden; Nelson, W. R. [SLAC; Roesler, S. [CERN; Sanami, t.; Striganov, S. I. [Fermilab; Versaci, R. [Unlisted, CZ

    2016-12-01

    In 1974, Nelson, Kase and Svensson published an experimental investigation on muon shielding around SLAC high-energy electron accelerators [1]. They measured muon fluence and absorbed dose induced by 14 and 18 GeV electron beams hitting a copper/water beam dump and attenuated in a thick steel shield. In their paper, they compared the results with the theoretical models available at that time. In order to compare their experimental results with present model calculations, we use the modern transport Monte Carlo codes MARS15, FLUKA2011 and GEANT4 to model the experimental setup and run simulations. The results are then compared between the codes, and with the SLAC data.

  19. Simulation codes to evaluate dose conversion coefficients for hadrons over 10 GeV

    International Nuclear Information System (INIS)

    Sato, T.; Tsuda, S.; Sakamoto, Y.; Yamaguchi, Y.; Niita, K.

    2002-01-01

    The conversion coefficients from fluence to effective dose for high energy hadrons are indispensable for various purposes such as accelerator shielding design and dose evaluation in space missions. The Monte Carlo calculation code HETC-3STEP was used to evaluate dose conversion coefficients for neutrons and protons up to 10 GeV with an anthropomorphic model. The scaling model was incorporated in the code for the simulation of high energy nuclear reactions. However, the secondary particle energy spectra predicted by the model were not smooth for nuclear reactions above several GeV. We attempted, therefore, to simulate the transport of such high energy particles with two newly developed Monte Carlo simulation codes: one is HETC-3STEP including the model used in EVENTQ instead of the scaling model, and the other is NMTC/JAM. By comparing the cross sections calculated by these codes with experimental data for high energy nuclear reactions, it was found that NMTC/JAM showed better agreement with the data. We decided, therefore, to adopt NMTC/JAM for the evaluation of dose conversion coefficients for hadrons with energies over 10 GeV. The effective dose conversion coefficients for high energy neutrons and protons evaluated with NMTC/JAM were found to be close to those obtained with the FLUKA code.
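
    Once such conversion coefficients are tabulated, the effective dose for an arbitrary field follows from folding the energy-differential fluence with the coefficients. The sketch below shows that folding step only; the spectrum and coefficient values are invented placeholders, not the NMTC/JAM or FLUKA results.

        import numpy as np

        # Effective dose from a fluence spectrum: E = integral Phi_E(E) * c(E) dE
        # (all numbers are invented placeholders, not evaluated coefficients).
        E_grid  = np.logspace(1, 4, 40)                 # hadron energy grid (MeV)
        fluence = 1.0e2 * E_grid**-1.5                  # differential fluence, cm^-2 MeV^-1 (toy)
        coeff   = 3.0e-4 * np.log10(E_grid)             # conversion coefficient, pSv cm^2 (toy)

        effective_dose_pSv = np.trapz(fluence * coeff, E_grid)
        print(effective_dose_pSv, "pSv")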

  20. SIMULATED 8 MeV NEUTRON RESPONSE FUNCTIONS OF A THIN SILICON NEUTRON SENSOR.

    Science.gov (United States)

    Takada, Masashi; Matsumoto, Tetsuro; Masuda, Akihiko; Nunomiya, Tomoya; Aoyama, Kei; Nakamura, Takashi

    2017-12-22

    Neutron response functions of a thin silicon neutron sensor are simulated using PHITS2 and MCNP6 codes for an 8 MeV neutron beam at angles of incidence of 0°, 30° and 60°. The contributions of alpha particles created from the 28Si(n,α)25Mg reaction and the silicon nuclei scattered elastically by neutrons in the silicon sensor have not been well reproduced using the MCNP6 code. The 8 MeV neutron response functions simulated using the PHITS2 code with an accurate event generator mode are in good agreement with experimental results and include the contributions of the alpha particles and silicon nuclei.

  1. The physics of compensating calorimetry and the new CALOR89 code system

    International Nuclear Information System (INIS)

    Gabriel, T.A.; Brau, J.E.; Bishop, B.L.

    1989-03-01

    Much of the understanding of the physics of calorimetry has come from the use of excellent radiation transport codes. A new understanding of compensating calorimetry was introduced four years ago following detailed studies with a new CALOR system. Now, the CALOR system has again been revised to reflect a better comprehension of high energy nuclear collisions by incorporating a modified high energy fragmentation model from FLUKA87. This revision will allow for the accurate analysis of calorimeters at energies of 100's of GeV. Presented in this paper is a discussion of compensating calorimetry, the new CALOR system, the revisions to HETC, and recently generated calorimeter related data on modes of energy deposition and secondary neutron production (E < 50 MeV) in infinite iron and uranium blocks. 38 refs., 5 figs., 5 tabs

  2. FLUKA studies of hadron-irradiated scintillating crystals for calorimetry at the High-Luminosity LHC

    CERN Document Server

    Quittnat, Milena Eleonore

    2015-01-01

    Calorimetry at the High-Luminosity LHC (HL-LHC) will be performed in a harsh radiation environment with high hadron fluences. The upgraded CMS electromagnetic calorimeter design and suitable scintillating materials are a focus of current research. In this paper, first results using the Monte Carlo simulation program FLUKA are compared to measurements performed with proton-irradiated LYSO, YSO and cerium fluoride crystals. Based on these results, an extrapolation to the behavior of an electromagnetic sampling calorimeter, using one of the inorganic scintillators above as an active medium, is performed for the upgraded CMS experiment at the HL-LHC. Characteristic parameters such as the induced ambient dose, fluence spectra for different particle types and the residual nuclei are studied, and the suitability of these materials for a future calorimeter is surveyed. Particular attention is given to the creation of isotopes in an LYSO-tungsten calorimeter that might contribute a prohibitive background to the measu...

  3. Concrete shielding of neutron radiations of plasma focus and dose examination by FLUKA

    Science.gov (United States)

    Nemati, M. J.; Amrollahi, R.; Habibi, M.

    2013-07-01

    The plasma focus (PF) is a device widely used in plasma research, but each shot produces hazardous radiation that creates a dangerous area around the device; operators should therefore stay as far as practicable from the area where the plasma focus is installed. In this paper the FLUKA Monte Carlo code has been used to calculate the radiation produced by the 4 kJ Amirkabir plasma focus device behind different concrete shielding concepts with various thicknesses (square, labyrinth and cave concepts). The neutron yield of the Amirkabir plasma focus at varying deuterium pressure (3-9 torr) and two charging voltages (11.5 and 13.5 kV) is (2.25 ± 0.2) × 10⁸ neutrons/shot and (2.88 ± 0.29) × 10⁸ neutrons/shot of 2.45 MeV, respectively. The most effective shield for the plasma focus device among these geometries is the labyrinth concept on four sides and the top with 20 cm of concrete.

  4. Radiation Protection Considerations

    Science.gov (United States)

    Adorisio, C.; Roesler, S.; Urscheler, C.; Vincke, H.

    This chapter summarizes the legal Radiation Protection (RP) framework to be considered in the design of HiLumi LHC. It details design limits and constraints, dose objectives and explains how the As Low As Reasonably Achievable (ALARA) approach is formalized at CERN. Furthermore, features of the FLUKA Monte Carlo code are summarized that are of relevance for RP studies. Results of FLUKA simulations for residual dose rates during Long Shutdown 1 (LS1) are compared to measurements demonstrating good agreement and providing proof for the accuracy of FLUKA predictions for future shutdowns. Finally, an outlook for the residual dose rate evolution until LS3 is given.

  5. Radiation protection considerations

    CERN Document Server

    Adorisio, C; Urscheler, C; Vincke, H

    2015-01-01

    This chapter summarizes the legal Radiation Protection (RP) framework to be considered in the design of HiLumi LHC. It details design limits and constraints, dose objectives and explains how the As Low As Reasonably Achievable (ALARA) approach is formalized at CERN. Furthermore, features of the FLUKA Monte Carlo code are summarized that are of relevance for RP studies. Results of FLUKA simulations for residual dose rates during Long Shutdown 1 (LS1) are compared to measurements demonstrating good agreement and providing proof for the accuracy of FLUKA predictions for future shutdowns. Finally, an outlook for the residual dose rate evolution until LS3 is given.

  6. Meeting Radiation Protection Requirements and Reducing Spacecraft Mass - A Multifunctional Materials Approach

    Science.gov (United States)

    Atwell, William; Koontz, Steve; Reddell, Brandon; Rojdev, Kristina; Franklin, Jennifer

    2010-01-01

    Both crew and radiation-sensitive systems, especially electronics, must be protected from the effects of the space radiation environment. One method of mitigating this radiation exposure is to use passive shielding materials. In previous vehicle designs such as the International Space Station (ISS), materials such as aluminum and polyethylene have been used as parasitic shielding to protect crew and electronics from exposure, but these designs add mass and decrease the amount of usable volume inside the vehicle. Thus, it is of interest to understand whether structural materials can also be designed to provide the radiation shielding capability needed for crew and electronics, while still providing weight savings and increased usable volume when compared against previous vehicle shielding designs. In this paper, we present calculations and analysis using the HZETRN (deterministic) and FLUKA (Monte Carlo) codes to investigate the radiation mitigation properties of these structural shielding materials, which include graded-Z and composite materials. This work is also a follow-on to an earlier paper that compared computational results for three radiation transport codes, HZETRN, HETC, and FLUKA, using the February 1956 solar particle event (SPE) spectrum. In the following analysis, we consider the October 1989 ground level enhancement (GLE) SPE as the input source term, based on the Band function fitting method. Using HZETRN and FLUKA, parametric absorbed doses at the center of a hemispherical structure on the lunar surface are calculated for various thicknesses of graded-Z layups and an all-aluminum structure. The HZETRN and FLUKA calculations are compared and are in reasonable (18% to 27%) agreement. Both codes agree with respect to the predicted shielding material performance trends. The results from both HZETRN and FLUKA are analyzed, and the radiation protection properties and potential weight savings of various materials and material lay-ups are compared.

  7. Monte Carlo simulation of secondary neutron dose for scanning proton therapy using FLUKA.

    Directory of Open Access Journals (Sweden)

    Chaeyeong Lee

    Full Text Available Proton therapy is a rapidly progressing field for cancer treatment. Globally, many proton therapy facilities are being commissioned or are under construction. Secondary neutrons are an important issue during the commissioning process of a proton therapy facility. The purpose of this study is to model and validate the scanning nozzles of the proton therapy system at Samsung Medical Center (SMC) by Monte Carlo simulation for beam commissioning. After the commissioning, the secondary neutron ambient dose from a proton scanning nozzle (Gantry 1) was simulated and measured. The simulation was performed to evaluate beam properties such as the percent depth dose curve, Bragg peak, and distal fall-off, so that they could be verified against measured data. Using the validated beam nozzle, the secondary neutron ambient dose was simulated and then compared with the ambient dose measured from Gantry 1. We calculated the secondary neutron dose at several different points. We demonstrated the validity of modeling a proton scanning nozzle system to evaluate various parameters using FLUKA. The measured secondary neutron ambient dose showed a tendency similar to the simulation result. This work will add to the knowledge necessary for the development of radiation safety technology in medical particle accelerators.

  8. Leakage of radioactive materials from particle accelerator facilities by non-radiation disasters like fire and flooding and its environmental impacts

    Science.gov (United States)

    Lee, A.; Jung, N. S.; Mokhtari Oranj, L.; Lee, H. S.

    2018-06-01

    The leakage of radioactive materials generated at particle accelerator facilities is an important issue from the viewpoint of radiation safety. In this study, fire and flooding at particle accelerator facilities were considered as non-radiation disasters that could result in the leakage of radioactive materials. To analyse the expected effects of each disaster, a case study of fire and flooding at particle accelerator facilities was carried out, including an investigation of the properties of the materials of interest present in the accelerator tunnel and an estimation of their activities. Five major materials in the tunnel were investigated: dust, insulators, concrete, metals and paints. The activation levels of the materials of concern were calculated using several Monte Carlo codes (MCNPX 2.7+SP-FISPACT 2007, FLUKA 2011.4c and PHITS 2.64+DCHAIN-SP 2001). The impact weight to the environment was estimated for different beam particles (electron, proton, carbon and uranium) and different beam energies (100, 430, 600 and 1000 MeV/nucleon). Taking into account the leakage paths of radioactive materials due to fire and flooding, the activation levels of the selected materials and their impacts on the environment were evaluated. In the case of flooding, dust, concrete and metal were found to be the main concerns. In the case of a fire event, dust, insulator and paint were the major concerns. As expected, the influence of normal fire and flooding at electron accelerator facilities would be relatively low in both cases.

  9. European inter-comparison of Monte Carlo code users for the uncertainty calculation of the kerma in air beside a caesium-137 source

    Energy Technology Data Exchange (ETDEWEB)

    De Carlan, L.; Bordy, J.M.; Gouriou, J. [CEA Saclay, LIST, Laboratoire National Henri Becquerel, Laboratoire de Metrologie de la Dose 91 - Gif-sur-Yvette (France)

    2010-07-01

    Within the frame of the CONRAD European project (Coordination Network for Radiation Dosimetry), and more precisely within a work group devoted to uncertainty assessment in computational dosimetry and aiming at comparing different approaches, the authors report the simulation of an irradiator containing a caesium-137 source to calculate the kerma in air as well as its uncertainty due to different parameters. They present the problem geometry and recall the issues studied (kerma uncertainty, influence of the source capsule, influence of the collimator, influence of the air volume surrounding the source). They indicate the codes which have been used (MCNP, FLUKA, PENELOPE, etc.) and discuss the results obtained for the first issue

  10. Induced Radioactivity in Lead Shielding at the National Synchrotron Light Source.

    Science.gov (United States)

    Ghosh, Vinita J; Schaefer, Charles; Kahnhauser, Henry

    2017-06-01

    The National Synchrotron Light Source (NSLS) at Brookhaven National Laboratory was shut down in September 2014. Lead bricks used as radiological shadow shielding within the accelerator were exposed to stray radiation fields during normal operations. The FLUKA code, a fully integrated Monte Carlo simulation package for the interaction and transport of particles and nuclei in matter, was used to estimate the radioactivity induced in this shielding and in the stainless steel beam pipe from known beam losses. The FLUKA output was processed using MICROSHIELD® to estimate on-contact exposure rates for individually exposed bricks to help design and optimize the radiological survey process. This entire process can be modeled using FLUKA, but MICROSHIELD® was chosen as a secondary method because of the project's resource constraints. Due to the compressed schedule and lack of shielding configuration data, simple FLUKA models were developed. FLUKA activity estimates for stainless steel were compared with sampling data to validate the results, which show that simple FLUKA models and irradiation geometries can be used to predict radioactivity inventories accurately in exposed materials. During decommissioning, 0.1% of the lead bricks were found to have measurable levels of induced radioactivity. Post-processing with MICROSHIELD® provides an acceptable secondary method of estimating residual exposure rates.

  11. Fluka Studies of the Asynchronous Beam Dump Effects on LHC Point 6 for a 7 TeV beam

    CERN Document Server

    VERSACI, R; GODDARD, B; MEREGHETTI, A; SCHMIDT, R; VLACHOUDIS, V

    2012-01-01

    The LHC is a record-breaking machine in terms of beam energy and intensity. An intense effort has therefore been devoted to simulating critical operational scenarios of energy deposition. Using FLUKA Monte Carlo simulations, we have investigated the effects of an asynchronous beam dump at LHC Point 6, where beams with a stored energy of 360 MJ can instantaneously release up to a few J cm⁻³ in the cryogenic magnets, which have a quench limit of the order of mJ cm⁻³. In the present paper we describe the simulation approach and discuss the evaluated maximum energy release onto the superconducting magnets during an asynchronous beam dump of a 7 TeV beam. We then analyze the shielding provided by the collimators installed in the area and discuss safety limits for the operation of the LHC.
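
    As a quick consistency check of the 360 MJ figure, the stored energy follows from the number of protons per beam and the energy per proton. The fill parameters below are the nominal LHC design values, which are an assumption here since they are not quoted in the record.

```python
# Cross-check of the stored beam energy, assuming nominal LHC fill parameters
# (2808 bunches of 1.15e11 protons, 7 TeV per proton); these numbers are not
# taken from the record itself.
eV_to_J = 1.602e-19
protons_per_beam = 2808 * 1.15e11
stored_energy_J = protons_per_beam * 7.0e12 * eV_to_J
print(f"Stored energy per beam ≈ {stored_energy_J / 1e6:.0f} MJ")   # ~360 MJ
```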

  12. Simulations of muon-induced neutron flux at large depths underground

    International Nuclear Information System (INIS)

    Kudryavtsev, V.A.; Spooner, N.J.C.; McMillan, J.E.

    2003-01-01

    The production of neutrons by cosmic-ray muons at large depths underground is discussed. The most recent versions of the muon propagation code MUSIC, and particle transport code FLUKA are used to evaluate muon and neutron fluxes. The results of simulations are compared with experimental data

  13. Influence of commercial (Fluka) naphthenic acids on acid volatile sulfide (AVS) production and divalent metal precipitation.

    Science.gov (United States)

    McQueen, Andrew D; Kinley, Ciera M; Rodgers, John H; Friesen, Vanessa; Bergsveinson, Jordyn; Haakensen, Monique C

    2016-12-01

    Energy-derived waters containing naphthenic acids (NAs) are complex mixtures often comprising a suite of potentially problematic constituents (e.g. organics, metals, and metalloids) that need treatment prior to beneficial use, including release to receiving aquatic systems. It has previously been suggested that NAs can have biostatic or biocidal properties that could inhibit microbially driven processes (e.g. dissimilatory sulfate reduction) used to transfer or transform metals in passive treatment systems (i.e. constructed wetlands). The overall objective of this study was to measure the effects of a commercially available (Fluka) NA on sulfate-reducing bacteria (SRB), production of sulfides (as acid-volatile sulfides [AVS]), and precipitation of divalent metals (i.e. Cu, Ni, Zn). These endpoints were assessed following 21-d aqueous exposures to NAs using bench-scale reactors. After 21 days, AVS molar concentrations were not statistically different among treatments, and AVS production was sufficient in all NA treatments to achieve ∑SEM:AVS ratios indicating that sulfate reduction (measured as AVS) could be used to treat metals occurring in NA-affected waters.

  14. JASMIN: Japanese-American study of muon interactions and neutron detection

    Energy Technology Data Exchange (ETDEWEB)

    Nakashima, Hiroshi; /JAEA, Ibaraki; Mokhov, N.V.; /Fermilab; Kasugai, Yoshimi; Matsuda, Norihiro; Iwamoto, Yosuke; Sakamoto, Yukio; /JAEA, Ibaraki; Leveling, Anthony F.; Boehnlein, David J.; Vaziri, Kamran; /Fermilab; Matsumura, Hiroshi; Hagiwara, Masayuki; /KEK, Tsukuba /Tohoku U. /Shimizu, Tokyo /Kyushu U. /Kyoto U. /Tsukuba U. /Pohang Accelerator Lab. /Tokai, ROIST

    2010-08-01

    Experimental studies of shielding and radiation effects at Fermi National Accelerator Laboratory (FNAL) have been carried out under collaboration between FNAL and Japan, aiming at benchmarking of simulation codes and study of irradiation effects for upgrade and design of new high-energy accelerator facilities. The purposes of this collaboration are (1) acquisition of shielding data in a proton beam energy domain above 100GeV; (2) further evaluation of predictive accuracy of the PHITS and MARS codes; (3) modification of physics models and data in these codes if needed; (4) establishment of irradiation field for radiation effect tests; and (5) development of a code module for improved description of radiation effects. A series of experiments has been performed at the Pbar target station and NuMI facility, using irradiation of targets with 120 GeV protons for antiproton and neutrino production, as well as the M-test beam line (M-test) for measuring nuclear data and detector responses. Various nuclear and shielding data have been measured by activation methods with chemical separation techniques as well as by other detectors such as a Bonner ball counter. Analyses with the experimental data are in progress for benchmarking the PHITS and MARS15 codes. In this presentation recent activities and results are reviewed.

  15. Proceedings of 5. French speaking scientific days on calculation codes for radioprotection, radio-physics and dosimetry

    International Nuclear Information System (INIS)

    Simon-Cornu, Marie; Mourlon, Christophe; Bordy, J.M.; Daures, J.; Dusiac, D.; Moignau, F.; Gouriou, J.; Million, M.; Moreno, B.; Chabert, I.; Lazaro, D.; Barat, E.; Dautremer, T.; Montagu, T.; Agelou, M.; De Carlan, L.; Patin, D.; Le Loirec, C.; Dupuis, P.; Gassa, F.; Guerin, L.; Batalla, A.; Leni, Pierre-Emmanuel; Laurent, Remy; Gschwind, Regine; Makovicka, Libor; Henriet, Julien; Salomon, Michel; Vivier, Alain; Lopez, Gerald; Dossat, C.; Pourrouquet, P.; Thomas, J.C.; Sarie, I.; Peyrard, P.F.; Chatry, N.; Lavielle, D.; Loze, R.; Brun, E.; Damian, F.; Diop, C.; Dumonteil, E.; Hugot, F.X.; Jouanne, C.; Lee, Y.K.; Malvagi, F.; Mazzolo, A.; Petit, O.; Trama, J.C.; Visonneau, T.; Zoia, A.; Courageot, Estelle; Gaillard-Lecanu, Emmanuelle; Kutschera, Reinald; Le Meur, Gaelle; Uzio, Fabien; De Conto, Celine; Gschwind, Regine; Makovicka, Libor; Farah, Jad; Martinetti, Florent; Sayah, Rima; Donadille, Laurent; Herault, Joel; Delacroix, Sabine; Nauraye, Catherine; Lee, Choonsik; Bolch, Wesley; Clairand, Isabelle; Horodynski, Jean-Michel; Pauwels, Nicolas; Robert, Pierre; VOLLAIRE, Joachim; Nicoletti, C.; Kitsos, S.; Tardy, M.; Marchaud, G.; Stankovskiy, Alexey; Van Den Eynde, Gert; Fiorito, Luca; Malambu, Edouard; Dreuil, Serge; Mougeot, X.; Be, M.M.; Bisch, C.; Villagrasa, C.; Dos Santos, M.; Clairand, I.; Karamitros, M.; Incerti, S.; Petitguillaume, Alice; Franck, Didier; Desbree, Aurelie; Bernardini, Michela; Labriolle-Vaylet, Claire de; Gnesin, Silvano; Leadermann, Jean-Pascal; Paterne, Loic; Bochud, Francois O.; Verdun, Francis R.; Baechler, Sebastien; Prior, John O.; Thomassin, Alain; Arial, Emmanuelle; Laget, Michael; Masse, Veronique; Saldarriaga Vargas, Clarita; Struelens, Lara; Vanhavere, Filip; Perier, Aurelien; Courageot, Estelle; Gaillard-Lecanu, Emmanuelle; Le-Meur, Gaelle; Monier, Catherine; Thers, Dominique; Le-Guen, Bernard; Blond, Serge; Cordier, Gerard; Le Roy, Maiwenn; De Carlan, Loic; Bordy, Jean-Marc; Caccia, Barbara; Andenna, Claudio; Charimadurai, Arun; Selvam, T Palani; Czarnecki, Damian; Zink, Klemens; Gschwind, Regine; Martin, Eric; Huot, Nicolas; Zoubair, Mariam; El Bardouni, Tarek; Lazaro, Delphine; Barat, Eric; Dautremer, Thomas; Montagu, Thierry; Chabert, Isabelle; Guerin, Lucie; Batalla, Alain; Moignier, C.; Huet, C.; Bassinet, C.; Baumann, M.; Barraux, V.; Sebe-Mercier, K.; Loiseau, C.; Batalla, A.; Makovicka, L.; Desnoyers, Yvon; Juhel, Gabriel; Mattera, Christophe; Tempier, Maryline

    2014-03-01

    - Calculation of secondary doses received by healthy tissues of patients treated by proton-therapy for eye or intracranial tumors (J. Farah); 14 - Use of FLUKA at the design stage of the installation of a X-Ray production facility: THOMX project (J.M. Horodynski); 15 - Shielding study using FLUKA for the construction of a new experimental zone for the N-TOF facility at CERN (J. Vollaire); 16 - Radioprotection studies optimisation for the dimensioning of radioactive materials storage and transport packagings (C. Nicoletti); 17 - Source terms calculation using ALEPH2 (A. Stankovskiy); 18 - Organs dose calculation in scanography (S. Dreuil); 19 - Calculation of beta spectra shape (X. Mougeot); 20 - Implementation of a DNA geometry in GEANT4-DNA Monte-Carlo calculations for radio-induced damages analysis (C. Villagrasa); 21 - Treatment planning optimization in nuclear medicine by personalized Monte-Carlo dosimetry: application to selective internal radiotherapy (SIRT) (A. Petitguillaume); 22 - Comparison between two personalized dosimetry approaches at the voxel scale for yttrium-90 radio-embolization (S. Gnesin); 23 - Criticality accident - Codac and dose assessment (A. Thomassin); 24 - Monte-Carlo calculations about the angular and energy dependence of the efficient dose for low energy photon beams and the impact of leaded clothing use (C. Saldarriaga Vargas); 25 - Study of Xenon-133 external exposure for the overall operating NPPs (A. Perier); 26 - Monte-Carlo codes intercomparison exercise for the modelling of a medical linear accelerator (M. Le Roy); 27 - Automatic determination of the primary electron beam parameters in Monte-Carlo simulations of linear accelerators for radiotherapy (D. Lazaro); 28 - Monte-Carlo determination of collimator opening and correction factors for a CyberKnife 1000 UM/min fitted with fixed collimators (C. Moignier, Presentation not available); 29 - Rooms investigation and characterization prior to dismantling - Coupling between a transport code and

  16. New results on the beam-loss criteria for heavy-ion accelerators

    International Nuclear Information System (INIS)

    Katrik, Peter; Hoffmann, Dieter H.H.; Mustafin, Edil; Strasik, Ivan; Pavlovic, Marius

    2015-01-01

    Activation of high-energy heavy-ion accelerators due to beam losses is a serious issue for accelerator parts such as collimators, magnets, beam lines, fragment separator targets, etc. Beam losses below 1 W/m are considered tolerable for 'hands-on' maintenance in proton machines. In our previous studies, the FLUKA2008 code was used to establish a scaling law extending the existing beam-loss tolerance for 1 GeV protons to heavy ions. This scaling law enabled beam-loss criteria to be specified for projectile species from protons up to uranium at energies from 200 MeV/u up to 1 GeV/u. FLUKA2008, however, treated nucleus-nucleus interactions only down to 100 MeV/u. In this work, we review our previous results and extend the activation simulations to lower energies with the help of the new FLUKA version, FLUKA2011, which includes models for nucleus-nucleus interactions below 100 MeV/u. We also attempted to extend the scaling law to lower energies. This, however, needs further study, because the heavy-ion-induced nuclide composition starts deviating from the proton-induced nuclide composition at energies below 150 MeV/u. (authors)

  17. MO-FG-CAMPUS-TeP3-02: Benchmarks of a Proton Relative Biological Effectiveness (RBE) Model for DNA Double Strand Break (DSB) Induction in the FLUKA, MCNP, TOPAS, and RayStation™ Treatment Planning System

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, R [University of Washington, Seattle, WA (United States); Streitmatter, S [University of Utah Hospitals, Salt Lake City, UT (United States); Traneus, E [RAYSEARCH LABORATORIES AB, Stockholm (Sweden); Moskvin, V [St. Jude Children’s Hospital, Memphis, TN (United States); Schuemann, J [Massachusetts General Hospital, Boston, MA (United States)

    2016-06-15

    Purpose: Validate implementation of a published RBE model for DSB induction (RBEDSB) in several general purpose Monte Carlo (MC) code systems and the RayStation™ treatment planning system (TPS). For protons and other light ions, DSB induction is a critical initiating molecular event that correlates well with the RBE for cell survival. Methods: An efficient algorithm to incorporate information on proton and light ion RBEDSB from the independently tested Monte Carlo Damage Simulation (MCDS) has now been integrated into MCNP (Stewart et al. PMB 60, 8249–8274, 2015), FLUKA, TOPAS and a research build of the RayStation™ TPS. To cross-validate the RBEDSB model implementation, LET distributions, depth-dose and lateral (dose and RBEDSB) profiles for monodirectional monoenergetic (100 to 200 MeV) protons incident on a water phantom are compared. The effects of recoil and secondary ion production (²H⁺, ³H⁺, ³He²⁺, ⁴He²⁺), spot size (3 and 10 mm), and transport physics on beam profiles and RBEDSB are examined. Results: Depth-dose and RBEDSB profiles among all of the MC models are in excellent agreement using a 1 mm distance criterion (width of a voxel). For a 100 MeV proton beam (10 mm spot), RBEDSB = 1.2 ± 0.03 (− 2–3%) at the tip of the Bragg peak and increases to 1.59 ± 0.3 two mm distal to the Bragg peak. RBEDSB tends to decrease as the kinetic energy of the incident proton increases. Conclusion: The model for proton RBEDSB has been accurately implemented into FLUKA, MCNP, TOPAS and the RayStation™ TPS. The transport of secondary light ions (Z > 1) has a significant impact on RBEDSB, especially distal to the Bragg peak, although light ions have a small effect on (dose × RBEDSB) profiles. The ability to incorporate spatial variations in proton RBE within a TPS creates new opportunities to individualize treatment plans and increase the therapeutic ratio. Dr. Erik Traneus is employed full-time as a Research Scientist

  18. submitter A model for the accurate computation of the lateral scattering of protons in water

    CERN Document Server

    Bellinzona, EV; Embriaco, A; Ferrari, A; Fontana, A; Mairani, A; Parodi, K; Rotondi, A; Sala, P; Tessonnier, T

    2016-01-01

    A pencil beam model for the calculation of the lateral scattering in water of protons of any therapeutic energy and depth is presented. It is based on the full Molière theory, taking into account the energy loss and the effects of mixtures and compounds. Concerning the electromagnetic part, the model has no free parameters and is in very good agreement with the FLUKA Monte Carlo (MC) code. The effects of the nuclear interactions are parametrized with a two-parameter tail function, adjusted on MC data calculated with FLUKA. The model, after convolution with the beam and the detector response, is in agreement with recent proton data in water from HIT. The model gives results with the same accuracy as MC codes based on Molière theory, with a much shorter computing time.
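
    The core-plus-tail structure of such a pencil-beam parametrization can be sketched as below. This is an illustration only: the paper uses the full Molière theory for the electromagnetic core and a two-parameter tail fitted to FLUKA, whereas the sketch approximates the core by a single Gaussian and the tail by a simple power-law halo with made-up parameters.

```python
# Illustrative core-plus-tail lateral profile (not the paper's actual model):
# Gaussian electromagnetic core plus a power-law nuclear halo, each normalized
# over the transverse plane; all parameter values are placeholders.
import numpy as np

def lateral_profile(r_mm, sigma_mm=5.0, w_tail=0.05, r0_mm=10.0, p=2.5):
    core = np.exp(-r_mm**2 / (2.0 * sigma_mm**2)) / (2.0 * np.pi * sigma_mm**2)
    tail = (p - 1.0) / (np.pi * r0_mm**2) * (1.0 + (r_mm / r0_mm) ** 2) ** (-p)
    return (1.0 - w_tail) * core + w_tail * tail

r = np.linspace(0.0, 50.0, 6)   # radial distances, mm
print(lateral_profile(r))
```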

  19. Monte Carlo method in neutron activation analysis

    International Nuclear Information System (INIS)

    Majerle, M.; Krasa, A.; Svoboda, O.; Wagner, V.; Adam, J.; Peetermans, S.; Slama, O.; Stegajlov, V.I.; Tsupko-Sitnikov, V.M.

    2009-01-01

    Neutron activation detectors are a useful technique for the neutron flux measurements in spallation experiments. The study of the usefulness and the accuracy of this method at similar experiments was performed with the help of Monte Carlo codes MCNPX and FLUKA

  20. Excitation functions of the natCr(p,x)⁴⁴Ti, ⁵⁶Fe(p,x)⁴⁴Ti, natNi(p,x)⁴⁴Ti and ⁹³Nb(p,x)⁴⁴Ti reactions at energies up to 2.6 GeV

    Energy Technology Data Exchange (ETDEWEB)

    Titarenko, Yu. E.; Batyaev, V.F.; Pavlov, K.V.; Titarenko, A. Yu. [National Research Center Kurchatov Institute, Institute for Theoretical and Experimental Physics, Moscow 117218 (Russian Federation); Zhivun, V.M. [National Research Center Kurchatov Institute, Institute for Theoretical and Experimental Physics, Moscow 117218 (Russian Federation); National Research Nuclear University MEPhI (Moscow Engineering Physics Institute), Moscow 115409 (Russian Federation); Chauzova, M.V.; Balyuk, S.A.; Bebenin, P.V. [National Research Center Kurchatov Institute, Institute for Theoretical and Experimental Physics, Moscow 117218 (Russian Federation); Ignatyuk, A.V. [National Research Center Kurchatov Institute, Institute for Theoretical and Experimental Physics, Moscow 117218 (Russian Federation); Institute of Physics and Power Engineering, Obninsk 249033 (Russian Federation); Mashnik, S.G. [Los Alamos National Laboratory (United States); Leray, S.; Boudard, A.; David, J.C.; Mancusi, D. [CEA/Saclay, Irfu/SPhN, 91191 Gif-sur-Yvette, Cedex (France); Cugnon, J. [University of Liege (Belgium); Yariv, Y. [SoreqNRC, Yavne (Israel); Nishihara, K.; Matsuda, N. [JAEA, Tokai (Japan); Kumawat, H. [BARC, Mumbai (India); Stankovskiy, A. Yu. [SCK-CEN (Belgium)

    2016-06-11

    The paper presents the measured cumulative yields of ⁴⁴Ti for natCr, ⁵⁶Fe, natNi and ⁹³Nb samples irradiated by protons in the energy range 0.04–2.6 GeV. The obtained excitation functions are compared with calculations by the well-known codes ISABEL, Bertini, INCL4.2+ABLA, INCL4.5+ABLA07, PHITS, CASCADE07 and CEM03.02. The predictive power of these codes for the studied nuclides is analyzed.

  1. Benchmark studies of induced radioactivity produced in LHC materials, Part I: Specific activities.

    Science.gov (United States)

    Brugger, M; Khater, H; Mayer, S; Prinz, A; Roesler, S; Ulrici, L; Vincke, H

    2005-01-01

    Samples of materials which will be used in the LHC machine for shielding and construction components were irradiated in the stray radiation field of the CERN-EU high-energy reference field facility. After irradiation, the specific activities induced in the various samples were analysed with a high-precision gamma spectrometer at various cooling times, allowing identification of isotopes with a wide range of half-lives. Furthermore, the irradiation experiment was simulated in detail with the FLUKA Monte Carlo code. A comparison of measured and calculated specific activities shows good agreement, supporting the use of FLUKA for estimating the level of induced activity in the LHC.

  2. Benchmark studies of induced radioactivity produced in LHC materials, part I: Specific activities

    International Nuclear Information System (INIS)

    Brugger, M.; Khater, H.; Mayer, S.; Prinz, A.; Roesler, S.; Ulrici, L.; Vincke, H.

    2005-01-01

    Samples of materials which will be used in the LHC machine for shielding and construction components were irradiated in the stray radiation field of the CERN-EU high-energy reference field facility. After irradiation, the specific activities induced in the various samples were analysed with a high-precision gamma spectrometer at various cooling times, allowing identification of isotopes with a wide range of half-lives. Furthermore, the irradiation experiment was simulated in detail with the FLUKA Monte Carlo code. A comparison of measured and calculated specific activities shows good agreement, supporting the use of FLUKA for estimating the level of induced activity in the LHC. (authors)

  3. Measurement of angular distribution of neutron flux for the 6 MeV race-track microtron based pulsed neutron source

    Energy Technology Data Exchange (ETDEWEB)

    Patil, B.J., E-mail: bjp@physics.unipune.ernet.i [Department of Physics, University of Pune, Pune 411 007 (India); Chavan, S.T.; Pethe, S.N.; Krishnan, R. [SAMEER, IIT Powai Campus, Mumbai 400 076 (India); Dhole, S.D., E-mail: sanjay@physics.unipune.ernet.i [Department of Physics, University of Pune, Pune 411 007 (India)

    2010-09-15

    The 6 MeV race-track microtron based pulsed neutron source has been designed specifically for the elemental analysis of short-lived activation products, for which a low neutron flux is desirable. Electrons impinge on an e-γ target to generate bremsstrahlung radiation, which in turn produces neutrons by photonuclear reactions in a γ-n target. The optimisation of these targets, along with their spectra, was carried out using the FLUKA code. The neutron flux was measured by activation of vanadium at different scattering angles. The angular distribution of the neutron flux indicates that the flux decreases with increasing angle, in good agreement with the FLUKA simulation.

  4. Prompt gamma-based neutron dosimetry for Am-Be and other workplace neutron spectra

    International Nuclear Information System (INIS)

    Udupi, Ashwini; Panikkath, Priyada; Sarkar, P.K.

    2016-01-01

    A new field-deployable technique for estimating the neutron ambient dose equivalent H*(10) from the measured prompt gamma intensities emitted by borated high-density polyethylene (BHDPE), and by combinations of normal HDPE and BHDPE in different configurations, has been evaluated in this work. Monte Carlo simulations using the FLUKA code have been employed to calculate the responses from the prompt gammas emitted when monoenergetic neutrons interact with boron, hydrogen, and carbon nuclei. A suitable linear combination of these prompt gamma responses is generated to approximate the dose conversion coefficients (DCCs) recommended by the International Commission on Radiological Protection, using the cross-entropy minimization technique. In addition, the shape and configuration of the combined HDPE and BHDPE system are optimized using FLUKA simulation results. The proposed method is validated experimentally, as well as theoretically, using different workplace neutron spectra with satisfactory outcomes. (author)
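
    The fitting step amounts to finding weights such that the weighted sum of the detector responses reproduces the target DCC curve over energy. The sketch below uses non-negative least squares as a simple stand-in for the cross-entropy minimization described in the record, and every number in it (energy grid, responses, target coefficients) is a placeholder rather than published data.

```python
# Sketch of the weight-fitting idea only: non-negative least squares is used here
# as a stand-in for cross-entropy minimization; all numerical values are placeholders.
import numpy as np
from scipy.optimize import nnls

E_MeV = np.array([2.5e-8, 0.1, 1.0, 5.0, 14.0])           # illustrative neutron energy grid
target_dcc = np.array([10.0, 90.0, 400.0, 410.0, 520.0])  # placeholder target coefficients, pSv*cm^2

# Hypothetical prompt-gamma responses of three detector configurations vs. neutron energy
R = np.array([[5.0, 30.0, 120.0, 150.0, 180.0],   # e.g. boron capture gammas (BHDPE)
              [2.0, 25.0, 200.0, 170.0, 160.0],   # e.g. hydrogen capture gammas (HDPE)
              [0.5, 10.0,  60.0, 110.0, 220.0]])  # e.g. carbon gammas

weights, residual = nnls(R.T, target_dcc)   # solve R.T @ w ≈ target_dcc with w >= 0
print("weights:", weights)
print("fitted DCC:", R.T @ weights)
```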

  5. Modern Housing Retrofit: Assessment of Upgrade Packages to EnerPHit Standard for 1940–1960 State Houses in Auckland

    Directory of Open Access Journals (Sweden)

    Paola Leardini

    2015-03-01

    Full Text Available New Zealand state housing includes a significant portion of problematic buildings constructed after the public housing scheme launched in 1936. Most of these houses are still uninsulated, thus, cold, draughty, mouldy, and progressively decaying; however, as they are fundamental elements of the country’s culture, society, and environment, and are built with good quality materials and sound construction, they are suitable candidates for effective energy upgrades. This paper presents findings of a study on problems and opportunities of retrofitting the state houses built between 1940 and 1960 in the Auckland region. It advocates strategic national policies and initiatives for retrofitting, based on more challenging performance thresholds. The research defines and virtually implements an incremental intervention strategy including different retrofit packages for a typical 1950s stand-alone house. Indoor and outdoor environmental parameters were monitored over a year, and data used to establish a base case for thermal simulation. The upgrade packages were then modelled to assess their impact on the house’s thermal performance, comparing heating requirements and comfort of various insulation and ventilation options. The paper reports on effective ways of preserving the integrity of such a house, while improving its thermal performance to the EnerPHit standard, and discusses the benefits of introducing this holistic approach into New Zealand retrofit practice.

  6. Measurements and simulations of the radiation exposure to aircraft crew workplaces due to cosmic radiation in the atmosphere

    International Nuclear Information System (INIS)

    Beck, P.; Latocha, M.; Dorman, L.; Pelliccioni, M.; Rollet, S.

    2007-01-01

    As required by the European Directive 96/29/Euratom, radiation exposure due to natural ionizing radiation has to be taken into account at workplaces if the effective dose could exceed 1 mSv per year. An example of workers concerned by this directive is aircraft crew, due to cosmic radiation exposure in the atmosphere. Extensive measurement campaigns on board aircraft have been carried out to assess the ambient dose equivalent. A consortium of European dosimetry institutes within EURADOS WG5 summarized experimental data and results of calculations, together with detailed descriptions of the methods for measurements and calculations. The radiation protection quantity of interest is the effective dose, E (ISO). The comparison of measured and calculated results is done in terms of the operational quantity ambient dose equivalent, H*(10). This paper gives an overview of the EURADOS Aircraft Crew In-Flight Database and presents a new empirical model describing fitting functions for these data. Furthermore, it describes numerical simulations performed with the Monte Carlo code FLUKA-2005 using an updated version of the cosmic radiation primary spectra. The ratio between ambient dose equivalent and effective dose at commercial flight altitudes, calculated with FLUKA-2005, is discussed. Finally, it presents the aviation dosimetry model AVIDOS, based on FLUKA-2005 simulations, for routine dose assessment. The code has been developed by Austrian Research Centers (ARC) for public use (http://avidos.healthphysics.at). (authors)

  7. Monte Carlo analysis of accelerator-driven systems: studies on spallation neutron yield and energy gain

    CERN Document Server

    Hashemi-Nezhad, S R; Westmeier, W; Bamblevski, V P; Krivopustov, M I; Kulakov, B A; Sosnin, A N; Wan, J S; Odoj, R

    2001-01-01

    The neutron yield in the interaction of protons with lead and uranium targets has been studied using the LAHET code system. The dependence of the neutron multiplicity on target dimensions and proton energy has been calculated and the dependence of the energy amplification on the proton energy has been investigated in an accelerator-driven system of a given effective multiplication coefficient. Some of the results are compared with experimental findings and with similar calculations by the DCM/CEM code of Dubna and the FLUKA code system used in CERN. (14 refs).

  8. Radiation calculations using LAHET/MCNP/CINDER90

    International Nuclear Information System (INIS)

    Waters, L.

    1994-01-01

    The LAHET Monte Carlo code system has recently been expanded to include high energy hadronic interactions via the FLUKA code, while retaining the original Los Alamos versions of HETC and ISABEL at lower energies. Electrons and photons are transported with EGS4 or ITS, while the MCNP coupled neutron/photon Monte Carlo code provides analysis of neutrons with kinetic energies below 20 MeV. An interface with the CINDER activation code is now in common use. Various other changes have been made to facilitate analysis of high energy accelerator radiation environments and experimental physics apparatus, such as those found at the SSC and RHIC. Current code developments and applications are reviewed.

  9. Estimation of whole-body radiation exposure from brachytherapy for oral cancer using a Monte Carlo simulation

    International Nuclear Information System (INIS)

    Ozaki, Y.; Watanabe, H.; Kaida, A.; Miura, M.; Nakagawa, K.; Toda, K.; Yoshimura, R.; Sumi, Y.; Kurabayashi, T.

    2017-01-01

    Early stage oral cancer can be cured with oral brachytherapy, but the associated whole-body radiation exposure has not been previously studied. Recently, the International Commission on Radiological Protection (ICRP) recommended the use of ICRP phantoms to estimate radiation exposure from external and internal radiation sources. In this study, we used a Monte Carlo simulation with ICRP phantoms to estimate whole-body exposure from oral brachytherapy. We used the Particle and Heavy Ion Transport code System (PHITS) to model oral brachytherapy with ¹⁹²Ir hairpins and ¹⁹⁸Au grains and to perform a Monte Carlo simulation on the ICRP adult reference computational phantoms. To confirm the simulations, we also computed local dose distributions from these small sources and compared them with the results from the Oncentra manual Low Dose Rate Treatment Planning (mLDR) software used in day-to-day clinical practice. We successfully obtained data on the absorbed dose for each organ in males and females. Sex-averaged equivalent doses were 0.547 and 0.710 Sv with ¹⁹²Ir hairpins and ¹⁹⁸Au grains, respectively. The simulation with PHITS was reliable when compared with the alternative computational technique using mLDR software. We conclude that the absorbed dose for each organ and the whole-body exposure from oral brachytherapy can be estimated with Monte Carlo simulation using PHITS on ICRP reference phantoms. Effective doses for patients with oral cancer were obtained.
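
    Going from the per-organ doses that PHITS provides to a single effective dose is a weighted sum over tissues of sex-averaged organ equivalent doses. The sketch below shows that step with the ICRP Publication 103 tissue weighting factors; the organ doses themselves are placeholders, not the values reported in the record.

```python
# Effective dose from organ equivalent doses: E = sum_T w_T * H_T, with H_T the
# sex-averaged organ equivalent dose.  Weighting factors follow ICRP Publication 103;
# the organ doses below are placeholders, not results from the PHITS study.
w_T = {
    "red_bone_marrow": 0.12, "colon": 0.12, "lung": 0.12, "stomach": 0.12,
    "breast": 0.12, "remainder": 0.12, "gonads": 0.08,
    "bladder": 0.04, "oesophagus": 0.04, "liver": 0.04, "thyroid": 0.04,
    "bone_surface": 0.01, "brain": 0.01, "salivary_glands": 0.01, "skin": 0.01,
}
H_male = {tissue: 0.40 for tissue in w_T}     # placeholder organ equivalent doses, Sv
H_female = {tissue: 0.60 for tissue in w_T}   # placeholder organ equivalent doses, Sv

E = sum(w * 0.5 * (H_male[t] + H_female[t]) for t, w in w_T.items())
print(f"Effective dose ≈ {E:.2f} Sv")         # 0.50 Sv with these placeholders
```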

  10. CERN Technical Training 2008: Learning for the LHC!

    CERN Multimedia

    2008-01-01

    FLUKA Workshop 2008: 23-27 June 2008 http://www.cern.ch/Fluka2008 FLUKA is a fully integrated particle physics Monte-Carlo simulation package. It has many applications in high energy experimental physics and engineering, shielding, detector and telescope design, cosmic ray studies, dosimetry, medical physics and radio-biology. More information, as well as related publications can be found on the FLUKA official website (www.fluka.org). This year, the CERN FLUKA Team, in collaboration with INFN and SC/RP, is organizing a FLUKA beginner’s course, held at CERN for the first time. Previous one-week courses were given in Frascati (Italy), twice in Houston (Texas, US), Pavia (Italy), as well as in Legnaro (Italy). At CERN, continuous lectures are provided in the framework of locally scheduled ‘FLUKA User Meetings’ (http://www.cern.ch/info-fluka-discussion). This new dedicated one-week CERN training course will be an opportunity for new users to learn the basics about FLUKA, as well as offer the possibility to...

  11. Large Hadron Collider at CERN: Beams Generating High-Energy-Density Matter

    CERN Document Server

    Tahir, N A; Shutov, A; Lomonosov, IV; Piriz, A R; Hoffmann, D H H; Deutsch, C; Fortov, V E

    2009-01-01

    This paper presents numerical simulations that have been carried out to study the thermodynamic and hydrodynamic response of a solid copper cylindrical target that is facially irradiated along the axis by one of the two Large Hadron Collider (LHC) 7 TeV/c proton beams. The energy deposition by protons in solid copper has been calculated using an established particle interaction and Monte Carlo code, FLUKA, which is capable of simulating all components of the particle cascades in matter, up to multi-TeV energies. These data have been used as input to a sophisticated two-dimensional hydrodynamic computer code, BIG2, which has been employed to study this problem. The prime purpose of these investigations was to assess the damage caused to the equipment if the entire LHC beam is lost at a single place. The FLUKA calculations show that the energy of the protons will be deposited in solid copper within about 1 m, assuming constant material parameters. Nevertheless, our hydrodynamic simulations have shown that the energy de...

  12. Monte Carlo simulations and experimental results on neutron production in the spallation target QUINTA irradiated with 660 MeV protons

    International Nuclear Information System (INIS)

    Khushvaktov, J.H.; Yuldashev, B.S.; Adam, J.; Vrzalova, J.; Baldin, A.A.; Furman, W.I.; Gustov, S.A.; Kish, Yu.V.; Solnyshkin, A.A.; Stegailov, V.I.; Tichy, P.; Tsoupko-Sitnikov, V.M.; Tyutyunnikov, S.I.; Zavorka, L.; Svoboda, J.; Zeman, M.; Vespalec, R.; Wagner, V.

    2017-01-01

    The activation experiment was performed using the accelerated beam of the Phasotron accelerator at the Joint Institute for Nuclear Research (JINR). The natural uranium spallation target QUINTA was irradiated with protons of 660 MeV energy. Monte Carlo simulations were performed using the FLUKA and Geant4 codes. The number of leakage neutrons from the sections of the uranium target surrounded by the lead shielding and the number of leakage neutrons from the lead shield were determined. The total number of fissions in the QUINTA setup was determined. Experimental values of reaction rates for the nuclei produced in the ¹²⁷I sample were obtained, and several of these reaction rates were compared with the results of simulations with the FLUKA and Geant4 codes. The experimentally determined fluence of neutrons in the energy range 10-200 MeV, obtained using the (n, xn) reactions in the ¹²⁷I(NaI) sample, was compared with the results of simulations. The possibility of transmutation of the long-lived radionuclide ¹²⁹I in the QUINTA setup was estimated.
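
    The activation-sample step in such an experiment reduces to relating a measured reaction rate to a fluence through the number of target nuclei and an effective cross section, φ ≈ R / (N·σ_eff). The sketch below shows only that relation; the sample mass, reaction rate and cross section are hypothetical numbers, not values from the QUINTA measurement.

```python
# Activation-based flux estimate (all numbers are hypothetical placeholders):
# flux = reaction_rate / (number_of_target_nuclei * effective_cross_section).
N_A = 6.022e23                      # Avogadro's number, 1/mol
mass_g, molar_mass = 5.0, 127.0     # hypothetical iodine mass in a NaI sample, g and g/mol
n_nuclei = mass_g / molar_mass * N_A

reaction_rate = 3.0e4               # hypothetical (n,xn) reactions per second in the sample
sigma_eff_cm2 = 1.2e-24             # hypothetical effective cross section (1.2 barn)

flux = reaction_rate / (n_nuclei * sigma_eff_cm2)
print(f"Neutron flux (10-200 MeV band) ≈ {flux:.2e} n/cm^2/s")
```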

  13. Volatile element production rates in a 1.4 GeV proton-irradiated molten lead-bismuth target

    CERN Document Server

    Zanini, L; Everaerts, P; Fallot, M; Franberg, H; Gröschel, F; Jost, C; Kirchner, T; Kojima, Y; Köster, U; Lebenhaft, J; Manfrina, E; Pitcher, E J; Ravn, H L; Tall, Y; Wagner, W; Wohlmuther, M

    2005-01-01

    Production rates of volatile elements following spallation reactions of 1.4 GeV protons on a liquid Pb/Bi target have been measured. The experiment was performed at the ISOLDE facility at CERN. These data are of interest for the development of targets for accelerator-driven systems such as MEGAPIE. Additional data were taken on a liquid Pb target. Calculations were performed using the FLUKA and MCNPX Monte Carlo codes coupled with the evolution codes ORIHET3 and FISPACT, using different options for the intra-nuclear cascade and evaporation models. Preliminary results from the data analysis show good agreement with calculations for Hg and for the noble gases. For other elements, such as I, it is apparent that only a fraction of the produced isotopes is released. The agreement with the experimental data varies depending on the model combination used. The best results are obtained using MCNPX with the INCL4/ABLA models and with FLUKA. Discrepancies are found for some isotopes produced by fission using the MCNPX ...

  14. Shock loads induced on metal structures by LHC proton beams: modelling of thermo-mechanical effects

    CERN Document Server

    Peroni, L; Dallocchio, A; Bertarelli, A

    2011-01-01

    In this work, numerical simulations of the LHC high energy particle beam impact on a metal structure are performed using the commercial FEM code LS-DYNA. The thermal loads on the impacted material are evaluated using FLUKA, a Monte Carlo particle transport code, which returns an energy deposition map for a given geometry (taking into account all the particles in the cascade generated by the interaction between the proton beam and the target). The FLUKA results are then used as input for thermo-structural studies. The first step of this work is the validation of the numerical procedure on a simple geometry for two different materials (copper and tungsten) and constitutive material models. In particular, the high energy particle impact is examined on a facially irradiated cylindrical bar: the beam hits the component directly at the centre of the base. The final step is the study of the impact on a real structure with a beam energy of 5 TeV (the next target in the energy val...

  15. Prompt radiation, shielding and induced radioactivity in a high-power 160 MeV proton linac

    Energy Technology Data Exchange (ETDEWEB)

    Magistris, Matteo [CERN, CH-1211 Geneva 23 (Switzerland)]. E-mail: matteo.magistris@cern.ch; Silari, Marco [CERN, CH-1211 Geneva 23 (Switzerland)

    2006-06-23

    CERN is designing a 160 MeV proton linear accelerator, both for a future intensity upgrade of the LHC and as a possible first stage of a 2.2 GeV superconducting proton linac. A first estimate of the required shielding was obtained by means of a simple analytical model. The source terms and the attenuation lengths used in the present study were calculated with the Monte Carlo cascade code FLUKA. Detailed FLUKA simulations were performed to investigate the contribution of neutron skyshine and backscattering to the expected dose rate in the areas around the linac tunnel. An estimate of the induced radioactivity in the magnets, vacuum chamber, the cooling system and the concrete shield was performed. A preliminary thermal study of the beam dump is also discussed.
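
    Simple analytical shielding models of the kind mentioned above are typically of the point-kernel form H ≈ S·H0(θ)·exp(-d/λ)/r², combining a source term with an attenuation length. The sketch below assumes that form (the record does not spell out its exact model), and the numerical values for the source term and attenuation length are placeholders rather than the FLUKA-derived values used in the study.

```python
# Point-kernel shielding sketch: dose rate behind a transverse shield of thickness d
# at distance r from the loss point.  The form and all numbers are assumptions; the
# study itself derived source terms and attenuation lengths from FLUKA.
import math

def dose_rate(H0_Sv_m2, lam_g_cm2, rho_g_cm3, d_cm, r_m, losses_per_s):
    """Ambient dose equivalent rate in Sv/s outside the shield."""
    attenuation = math.exp(-d_cm * rho_g_cm3 / lam_g_cm2)
    return losses_per_s * H0_Sv_m2 * attenuation / r_m**2

rate = dose_rate(H0_Sv_m2=2.0e-15,   # placeholder source term, Sv*m^2 per lost proton
                 lam_g_cm2=117.0,    # placeholder attenuation length in concrete, g/cm^2
                 rho_g_cm3=2.35,     # concrete density, g/cm^3
                 d_cm=200.0, r_m=4.0, losses_per_s=1.0e9)
print(f"Dose rate ≈ {rate * 3.6e9:.1f} µSv/h")
```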

  16. First Investigations on the Energy Deposited in a D0 early separation scheme Dipole for the LHC upgrade

    CERN Document Server

    Hoa, C

    2007-01-01

    This note gives the first results of energy deposition calculations on a simplified model of an early separation scheme dipole D0, located 3.5 m from the IP. The Monte Carlo code FLUKA version 2006.3 has been used for modelling the multi-particle interactions and energy transport. After a short introduction to particle interactions with matter and power deposition processes, the FLUKA modelling is described together with benchmarked power deposition calculations on the TAS, the absorber located in front of the triplet quadrupoles. The power deposition results for the D0 early scheme are then discussed in detail, including the average and peak power density and the variation of the total heat load in the dipole with longitudinal position and with aperture diameter.

  17. Investigation of HZETRN 2010 as a Tool for Single Event Effect Qualification of Avionics Systems

    Science.gov (United States)

    Rojdev, Kristina; Koontz, Steve; Atwell, William; Boeder, Paul

    2014-01-01

    NASA's future missions are focused on long-duration deep space missions for human exploration, which offer no option for a quick emergency return to Earth. The combination of long mission duration with no quick emergency return option leads to unprecedented spacecraft system safety and reliability requirements. It is important that spacecraft avionics systems for human deep space missions are not susceptible to Single Event Effect (SEE) failures caused by space radiation (primarily the continuous galactic cosmic ray background and the occasional solar particle event) interacting with electronic components and systems. SEE effects are typically managed during the design, development, and test (DD&T) phase of spacecraft development by using heritage hardware (if possible) and through extensive component level testing, followed by system level failure analysis tasks that are both time consuming and costly. The ultimate product of the SEE DD&T program is a prediction of spacecraft avionics reliability in the flight environment, produced using various nuclear reaction and transport codes in combination with the component and subsystem level radiation test data. Previous work by Koontz et al. [1] utilized FLUKA, a Monte Carlo nuclear reaction and transport code, to calculate SEE and single event upset (SEU) rates. This code was then validated against in-flight data for a variety of spacecraft and space flight environments. However, FLUKA has a long run-time (on the order of days). CREME96 [2], an easy-to-use deterministic code offering short run times, was also compared with FLUKA predictions and in-flight data. CREME96, though fast and easy to use, has not been updated in several years and underestimates secondary particle shower effects in spacecraft structural shielding mass. Thus, this paper will investigate the use of HZETRN 2010 [3], a fast and easy-to-use deterministic transport code, similar to CREME96, that was developed at NASA Langley Research Center primarily for

  18. Fluence to Effective Dose and Effective Dose Equivalent Conversion Coefficients for Photons from 50 KeV to 10 GeV

    International Nuclear Information System (INIS)

    Ferrari, A.; Pelliccioni, M.; Pillon, M.

    1996-07-01

    Effective dose equivalent and effective dose per unit photon fluence have been calculated by the FLUKA code for various geometrical conditions of irradiation of an anthropomorphic phantom placed in a vacuum. Calculations have been performed for monoenergetic photons of energy ranging from 50 keV to 10 GeV. The agreement with the results of other authors, when existing, is generally very satisfactory

  19. Analytic model of heat deposition in spallation neutron target

    International Nuclear Information System (INIS)

    Findlay, D.J.S.

    2015-01-01

    A simple analytic model for estimating deposition of heat in a spallation neutron target is presented—a model that can readily be realised in an unambitious spreadsheet. The model is based on simple representations of the principal underlying physical processes, and is intended largely as a ‘sanity check’ on results from Monte Carlo codes such as FLUKA or MCNPX.

  20. Analytic model of heat deposition in spallation neutron target

    Energy Technology Data Exchange (ETDEWEB)

    Findlay, D.J.S.

    2015-12-11

    A simple analytic model for estimating deposition of heat in a spallation neutron target is presented—a model that can readily be realised in an unambitious spreadsheet. The model is based on simple representations of the principal underlying physical processes, and is intended largely as a ‘sanity check’ on results from Monte Carlo codes such as FLUKA or MCNPX.
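
    One way a spreadsheet-level estimate of this kind can be organized (an assumption about the approach; the record does not give the model's actual formulas) is to take the fraction of beam power retained in the target as heat and distribute it along depth with a simple exponential profile governed by an assumed interaction length. All numbers below are placeholders.

```python
# Spreadsheet-style sanity check of heat deposition along a spallation target.
# The functional form (exponential longitudinal profile) and every number are
# assumptions for illustration, not the model of the paper.
import math

beam_power_kW = 160.0       # hypothetical proton beam power on target
fraction_deposited = 0.6    # hypothetical fraction of beam power retained as heat
lam_cm = 17.0               # assumed effective interaction/absorption length

total_heat_kW = beam_power_kW * fraction_deposited
for z in range(0, 50, 10):  # 10 cm depth bins
    frac = math.exp(-z / lam_cm) - math.exp(-(z + 10) / lam_cm)
    print(f"{z:2d}-{z + 10:2d} cm : {total_heat_kW * frac:6.1f} kW")
```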

  1. Simulation analysis of radiation fields inside phantoms for neutron irradiation

    International Nuclear Information System (INIS)

    Satoh, Daiki; Takahashi, Fumiaki; Endo, Akira; Ohmachi, Y.; Miyahara, N.

    2007-01-01

    Radiation fields inside phantoms have been calculated for neutron irradiation. The particle and heavy-ion transport code system PHITS was employed for the calculations. The energy and size dependences of the neutron dose were analyzed using tissue-equivalent spheres of different sizes. A voxel phantom of a mouse was developed based on CT images of an 8-week-old male C3H/HeNs mouse. The deposited energy inside the mouse was calculated for 2- and 10-MeV neutron irradiation. (author)

  2. Antiproton Radiation Therapy

    DEFF Research Database (Denmark)

    Bassler, Niels; Holzscheiter, Michael H.; Petersen, Jørgen B.B.

    2007-01-01

    the radiobiological properties using antiprotons at 50 and 125 MeV from the Antiproton Decelerator (AD) at CERN. Dosimetry experiments were carried out with ionization chambers, alanine pellets and radiochromic film. Radiobiological experiments were done with Chinese V79 WNRE hamster cells. Monte Carlo particle...... transport codes were investigated and compared with results obtained from the ionization chambers and alanine pellets. A track structure model has been applied to the calculated particle spectrum and used to predict the LET-dependent response of the alanine pellets. The particle transport program...... FLUKA produced data which were in excellent agreement with our ionization chamber measurements, and in good agreement with our alanine measurements. FLUKA is now being used to generate a wide range of depth dose data at several energies, including secondary particle–energy spectra, which will be used....

  3. Measurement of angular distribution of cosmic-ray muon fluence rate

    International Nuclear Information System (INIS)

    Lin, Jeng-Wei; Chen, Yen-Fu; Sheu, Rong-Jiun; Jiang, Shiang-Huei

    2010-01-01

    In this work a Berkeley Lab cosmic ray detector was used to measure the angular distribution of the cosmic-ray muon fluence rate. Angular response functions of the detector at each measurement orientation were calculated using the FLUKA Monte Carlo code, with no energy attenuation taken into account. Coincidence counting rates were measured at ten orientations with equiangular intervals. The muon angular fluence rate spectrum was unfolded from the measured counting rates, together with the angular response functions, using both the MAXED code and a parameter-adjustment method.

  4. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  5. Minimizing the background radiation in the new neutron time-of-flight facility at CERN: FLUKA Monte Carlo simulations for the optimization of the n_TOF second experimental line

    CERN Document Server

    Bergström, Ida; Elfgren, Erik

    2013-06-11

    At the particle physics laboratory CERN in Geneva, Switzerland, the Neutron Time-of-Flight facility has recently started the construction of a second experimental line. The new neutron beam line will unavoidably induce radiation in both the experimental area and in nearby accessible areas. Computer simulations for the minimization of the background were carried out using the FLUKA Monte Carlo simulation package. The background radiation in the new experimental area needs to be kept to a minimum during measurements. This was studied with focus on the contributions from backscattering in the beam dump. The beam dump was originally designed for shielding the outside area using a block of iron covered in concrete. However, the backscattering was never studied in detail. In this thesis, the fluences (i.e. the flux integrated over time) of neutrons and photons were studied in the experimental area while the beam dump design was modified. An optimized design was obtained by stopping the fast neutrons in a high Z mat...

  6. Study of ²⁴Na activity in concrete using 20-MeV proton beam on Cu

    Energy Technology Data Exchange (ETDEWEB)

    Oranj, Leila Mokhtanri; Jung, Nam Suk; Lee, Arim; Heo, Tae Min; Bakhtian, Mahdi; Lee, Hee Seock [POSTECH, Pohang (Korea, Republic of)

    2017-04-15

    The number of medical cyclotrons capable of accelerating protons to about 20 MeV is increasing in Korea. In such facilities, various radionuclides can be induced in shielding materials such as concrete by secondary neutrons, which causes problems from the viewpoint of radiation safety. Among these radionuclides, the gamma rays from ²⁴Na (T₁/₂ = 15 h) are the most important source of radiation exposure. ²⁴Na can be produced by secondary neutrons on the Na, Al and Mg components present in the concrete: by thermal neutrons on Na and by fast neutrons with energies below 20 MeV on Al and Mg. The interaction of 20 MeV protons with a Cu target produces secondary neutrons with energies below 20 MeV; therefore, among the concrete components, only Na, Al and Mg are relevant for ²⁴Na production. In this work, the ²⁴Na activity induced in concrete and in chemical reagents representing concrete components (NaHCO₃, Al₂O₃ and MgO) was measured. To produce neutrons, a Cu target was irradiated with 20 MeV protons. The measured data were compared with the results of simulations by FLUKA and MARS, as well as with earlier works and theoretical data. In the case of the Mg and Al chemical reagents, the FLUKA code overestimates our measurements by approximately a factor of four, while for the Na sample FLUKA underestimates the experimental data, with a calculated-to-measured ratio of almost 0.5. The FLUKA results and the measurement for the concrete are consistent. The TALYS calculation for Mg overestimates the measured data by a factor of 2.5.

  7. Benchmark calculations with simple phantom for neutron dosimetry (2)

    International Nuclear Information System (INIS)

    Yukio, Sakamoto; Shuichi, Tsuda; Tatsuhiko, Sato; Nobuaki, Yoshizawa; Hideo, Hirayama

    2004-01-01

Benchmark calculations for high-energy neutron dosimetry were undertaken after SATIF-5. Energy deposition in a cylindrical phantom with 100 cm radius and 30 cm depth was calculated for irradiation with neutrons from 100 MeV to 10 GeV. Using the ICRU four-element soft tissue phantom and four single-element (hydrogen, carbon, nitrogen and oxygen) phantoms, the depth distributions of deposited energy and their totals were calculated both in the central region of the phantoms (within a 1 cm radius) and over the whole phantom (within the 100 cm radius). The calculated results of the FLUKA, MCNPX, MARS, HETC-3STEP and NMTC/JAM codes were compared. It was found that FLUKA, MARS and NMTC/JAM gave almost the same results. For high-energy incident neutrons, the MCNPX results were the largest in total deposited energy and the HETC-3STEP results were the smallest. (author)

  8. FRENDY: A new nuclear data processing system being developed at JAEA

    Directory of Open Access Journals (Sweden)

    Tada Kenichi

    2017-01-01

Full Text Available JAEA has provided an evaluated nuclear data library JENDL and nuclear application codes such as MARBLE, SRAC, MVP and PHITS. These domestic codes have been widely used in many universities and industrial companies in Japan. However, we sometimes find problems in imported processing systems and need to revise them when the new JENDL is released. To overcome such problems and immediately process the nuclear data when it is released, JAEA started developing a new nuclear data processing system, FRENDY, in 2013. This paper describes the outline of the development of FRENDY and demonstrates its capabilities and performance through analyses of criticality experiments. The verification results indicate that FRENDY properly generates ACE files.

  9. FRENDY: A new nuclear data processing system being developed at JAEA

    Science.gov (United States)

    Tada, Kenichi; Nagaya, Yasunobu; Kunieda, Satoshi; Suyama, Kenya; Fukahori, Tokio

    2017-09-01

JAEA has provided an evaluated nuclear data library JENDL and nuclear application codes such as MARBLE, SRAC, MVP and PHITS. These domestic codes have been widely used in many universities and industrial companies in Japan. However, we sometimes find problems in imported processing systems and need to revise them when the new JENDL is released. To overcome such problems and immediately process the nuclear data when it is released, JAEA started developing a new nuclear data processing system, FRENDY, in 2013. This paper describes the outline of the development of FRENDY and demonstrates its capabilities and performance through analyses of criticality experiments. The verification results indicate that FRENDY properly generates ACE files.

  10. Neutronics Assessments for a RIA Fragmentation Line Beam Dump Concept

    CERN Document Server

    Boles, Jason; Reyes, Susana; Stein, Werner

    2005-01-01

    Heavy ion and radiation transport calculations are in progress for conceptual beam dump designs for the fragmentation line of the proposed Rare Isotope Accelerator (RIA). Using the computer code PHITS, a preliminary design of a motor-driven rotating wheel beam dump and adjacent downstream multipole has been modeled. Selected results of these calculations are given, including neutron and proton flux in the wheel, absorbed dose and displacements per atom in the hub materials, and heating from prompt radiation and from decay heat in the multipole.

  11. Porous silicon carbide and aluminum oxide with unidirectional open porosity as model target materials for radioisotope beam production

    Science.gov (United States)

    Czapski, M.; Stora, T.; Tardivat, C.; Deville, S.; Santos Augusto, R.; Leloup, J.; Bouville, F.; Fernandes Luis, R.

    2013-12-01

New silicon carbide (SiC) and aluminum oxide (Al2O3) prototypes with a tailor-made microstructure were produced using the ice-templating technique, which permits controlled pore-formation conditions within the material. These prototypes will serve to verify the aging of the new advanced target materials under irradiation with proton beams. Before this, their mechanical integrity was evaluated based on the energy deposition spectra produced with the FLUKA code.

  12. Analysis of the water dynamics for the MSE-COIL and theMST-COIL

    CERN Document Server

    Massidda, L; Kadi, Y; Balhan, B

    2005-01-01

    In this report, we present the technical specification for the numerical model and the study of the acoustic wave propagation in the water tubes of the extraction septum magnet (MSE) and the thin magnetic septum (MST) in the event of an asynchronous firing of the extraction kickers (MKE). The deposited energy densities, estimated by the high-energy particle transport code FLUKA, were converted to internal heat generation rates according to the time dependence of the extracted beam. The transient response to this thermal load was obtained by simulating power deposition and acoustic wave propagation by the spectral-element code ELSE.

  13. Dynamic structural analysis of the TPSG4 & TPSG6 beam diluters

    CERN Document Server

    Massidda, L; Kadi, Y; Balhan, B

    2005-01-01

    In this report we present the technical specification for the numerical model and the study of the dynamic structural behaviour of the beam diluter elements (TPSG4 & 6) protecting the extraction septum magnets (MSE & MST) in the event of an asynchronous firing of the extraction kickers (MKE). The deposited energy densities, estimated by the high-energy particle transport code FLUKA, were converted to internal heat generation rates according to the time dependence of the extracted beam. The transient response to this thermal load was obtained by solving the power deposition and structural deformation problem by the spectral-element code ELSE.

  14. Developing of a New Atmospheric Ionizing Radiation (AIR) Model

    Science.gov (United States)

    Clem, John M.; deAngelis, Giovanni; Goldhagen, Paul; Wilson, John W.

    2003-01-01

    As a result of the research leading to the 1998 AIR workshop and the subsequent analysis, the neutron issues posed by Foelsche et al. and further analyzed by Hajnal have been adequately resolved. We are now engaged in developing a new atmospheric ionizing radiation (AIR) model for use in epidemiological studies and air transportation safety assessment. A team was formed to examine a promising code using the basic FLUKA software but with modifications to allow multiple charged ion breakup effects. A limited dataset of the ER-2 measurements and other cosmic ray data will be used to evaluate the use of this code.

  15. Predicting induced radioactivity for the accelerator operations at the Taiwan Photon Source.

    Science.gov (United States)

    Sheu, R J; Jiang, S H

    2010-12-01

    This study investigates the characteristics of induced radioactivity due to the operations of a 3-GeV electron accelerator at the Taiwan Photon Source (TPS). According to the beam loss analysis, the authors set two representative irradiation conditions for the activation analysis. The FLUKA Monte Carlo code has been used to predict the isotope inventories, residual activities, and remanent dose rates as a function of time. The calculation model itself is simple but conservative for the evaluation of induced radioactivity in a light source facility. This study highlights the importance of beam loss scenarios and demonstrates the great advantage of using FLUKA in comparing the predicted radioactivity with corresponding regulatory limits. The calculated results lead to the conclusion that, due to fairly low electron consumption, the radioactivity induced in the accelerator components and surrounding concrete walls of the TPS is rather moderate and manageable, while the possible activation of air and cooling water in the tunnel and their environmental releases are negligible.

  16. Study for magnets and electronics protection in the LHC Betatron-cleaning insertion

    International Nuclear Information System (INIS)

    Magistris, Matteo; Ferrari, Alfredo; Santana, Mario; Tsoulou, Katerina; Vlachoudis, Vasilis

    2006-01-01

    The collimation system of the future LHC at CERN is a challenging project, since the transverse energy intensities of the LHC beams are three orders of magnitude greater than at other current facilities. The two cleaning insertions (IR3 and IR7) housing the collimators will be among the most radioactive areas of LHC. The 1.5 km long IR7 insertion was fully implemented with the Monte Carlo cascade code FLUKA. Extensive simulations were performed to estimate the radiation level along the tunnel, as well as the energy deposition in the most critical elements. In particular, this paper discusses the latest results of the FLUKA studies, including the design of passive absorbers (to protect warm magnets) and a comparison of W and Cu as material for the active absorber jaws (to protect cold magnets). Any electronic device operating in strong radiation fields such as those expected for the LHC tunnel will undergo degradation. A shielding study was done to reduce radiation damage to the electronics

  17. Modelling the Influence of Shielding on Physical and Biological Organ Doses

    CERN Document Server

    Ballarini, Francesca; Ferrari, Alfredo; Ottolenghi, Andrea; Pelliccioni, Maurizio; Scannicchio, Domenico

    2002-01-01

    Distributions of "physical" and "biological" dose in different organs were calculated by coupling the FLUKA MC transport code with a geometrical human phantom inserted into a shielding box of variable shape, thickness and material. While the expression "physical dose" refers to the amount of deposited energy per unit mass (in Gy), "biological dose" was modelled with "Complex Lesions" (CL), clustered DNA strand breaks calculated in a previous work based on "event-by-event" track-structure simulations. The yields of complex lesions per cell and per unit dose were calculated for different radiation types and energies, and integrated into a version of FLUKA modified for this purpose, allowing us to estimate the effects of mixed fields. As an initial test simulation, the phantom was inserted into an aluminium parallelepiped and was isotropically irradiated with 500 MeV protons. Dose distributions were calculated for different values of the shielding thickness. The results were found to be organ-dependent. In most ...

  18. Hadron cascades produced by electromagnetic cascades

    International Nuclear Information System (INIS)

    Nelson, W.R.; Jenkins, T.M.; Ranft, J.

    1986-12-01

A method for calculating high energy hadron cascades induced by multi-GeV electron and photon beams is described. Using the EGS4 computer program, high energy photons in the EM shower are allowed to interact hadronically according to the vector meson dominance (VMD) model, facilitated by a Monte Carlo version of the dual multistring fragmentation model which is used in the hadron cascade code FLUKA. The results of this calculation compare very favorably with experimental data on hadron production in photon-proton collisions and on the hadron production by electron beams on targets (i.e., yields in secondary particle beam lines). Electron beam induced hadron star density contours are also presented and are compared with those produced by proton beams. This FLUKA-EGS4 coupling technique could find use in the design of secondary beams, in the determination of high energy hadron source terms for shielding purposes, and in the estimation of induced radioactivity in targets, collimators and beam dumps.

  19. SU-F-T-46: The Effect of Inter-Seed Attenuation and Tissue Composition in Prostate 125I Brachytherapy Dose Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Tamura, K; Araki, F; Ohno, T [Kumamoto University, Kumamoto, Kumamoto (Japan)

    2016-06-15

Purpose: To investigate the difference of dose distributions with/without the effect of inter-seed attenuation and tissue compositions in prostate {sup 125}I brachytherapy dose calculations, using Monte Carlo simulations with the Particle and Heavy Ion Transport code System (PHITS). Methods: The dose distributions in {sup 125}I prostate brachytherapy were calculated using PHITS for non-simultaneous and simultaneous alignments of STM1251 sources in water or prostate phantoms for six patients. The PHITS input file was created from the DICOM-RT file, which includes the source coordinates and the structures for the clinical target volume (CTV) and the organs at risk (OARs) of urethra and rectum, using in-house Matlab software. Photon and electron cutoff energies were set to 1 keV and 100 MeV, respectively. The dose distributions were calculated with the kerma approximation and a voxel size of 1 × 1 × 1 mm{sup 3}. The number of incident photons was set so that the statistical uncertainty (1σ) was less than 1%. The effect of inter-seed attenuation and prostate tissue compositions was evaluated from dose volume histograms (DVHs) for each structure, by comparison with the results of the AAPM TG-43 dose calculation (without the effect of inter-seed attenuation and prostate tissue compositions). Results: The dose reduction due to the inter-seed attenuation by source capsules was approximately 2% for the CTV and OARs compared to TG-43. In addition, when the prostate tissue composition was taken into account, the D{sub 90} and V{sub 100} of the CTV were reduced by 6% and 1%, respectively. Conclusion: The dose reduction due to inter-seed attenuation and tissue composition needs to be considered in prostate {sup 125}I brachytherapy dose calculations.
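    The D{sub 90} and V{sub 100} figures quoted above are standard dose-volume-histogram metrics. As a minimal illustration of how such metrics are extracted from per-voxel doses (this is not the in-house Matlab software mentioned in the record; the array contents and the prescription dose below are hypothetical):

        import numpy as np

        def dvh_metrics(doses, prescription_dose):
            """Compute D90 and V100 from the per-voxel doses of one structure.

            doses: 1-D array of absorbed dose (Gy) in the structure's voxels
                   (equal voxel volumes assumed)
            prescription_dose: prescribed dose (Gy) defining V100
            """
            doses = np.asarray(doses, dtype=float)
            # D90: dose received by at least 90% of the volume, i.e. the
            # 10th percentile of the voxel-dose distribution.
            d90 = np.percentile(doses, 10.0)
            # V100: percentage of the volume receiving >= the prescription dose.
            v100 = 100.0 * np.mean(doses >= prescription_dose)
            return d90, v100

        # Hypothetical usage with made-up numbers:
        # d90, v100 = dvh_metrics(np.random.normal(160.0, 15.0, 10000), 145.0)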

  20. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  1. MC 93 - Proceedings of the International Conference on Monte Carlo Simulation in High Energy and Nuclear Physics

    Science.gov (United States)

    Dragovitsch, Peter; Linn, Stephan L.; Burbank, Mimi

    1994-01-01

    The Table of Contents for the book is as follows: * Preface * Heavy Fragment Production for Hadronic Cascade Codes * Monte Carlo Simulations of Space Radiation Environments * Merging Parton Showers with Higher Order QCD Monte Carlos * An Order-αs Two-Photon Background Study for the Intermediate Mass Higgs Boson * GEANT Simulation of Hall C Detector at CEBAF * Monte Carlo Simulations in Radioecology: Chernobyl Experience * UNIMOD2: Monte Carlo Code for Simulation of High Energy Physics Experiments; Some Special Features * Geometrical Efficiency Analysis for the Gamma-Neutron and Gamma-Proton Reactions * GISMO: An Object-Oriented Approach to Particle Transport and Detector Modeling * Role of MPP Granularity in Optimizing Monte Carlo Programming * Status and Future Trends of the GEANT System * The Binary Sectioning Geometry for Monte Carlo Detector Simulation * A Combined HETC-FLUKA Intranuclear Cascade Event Generator * The HARP Nucleon Polarimeter * Simulation and Data Analysis Software for CLAS * TRAP -- An Optical Ray Tracing Program * Solutions of Inverse and Optimization Problems in High Energy and Nuclear Physics Using Inverse Monte Carlo * FLUKA: Hadronic Benchmarks and Applications * Electron-Photon Transport: Always so Good as We Think? Experience with FLUKA * Simulation of Nuclear Effects in High Energy Hadron-Nucleus Collisions * Monte Carlo Simulations of Medium Energy Detectors at COSY Jülich * Complex-Valued Monte Carlo Method and Path Integrals in the Quantum Theory of Localization in Disordered Systems of Scatterers * Radiation Levels at the SSCL Experimental Halls as Obtained Using the CLOR89 Code System * Overview of Matrix Element Methods in Event Generation * Fast Electromagnetic Showers * GEANT Simulation of the RMC Detector at TRIUMF and Neutrino Beams for KAON * Event Display for the CLAS Detector * Monte Carlo Simulation of High Energy Electrons in Toroidal Geometry * GEANT 3.14 vs. EGS4: A Comparison Using the DØ Uranium/Liquid Argon

  2. Toric Varieties and Codes, Error-correcting Codes, Quantum Codes, Secret Sharing and Decoding

    DEFF Research Database (Denmark)

    Hansen, Johan Peder

We present toric varieties and associated toric codes and their decoding. Toric codes are applied to construct Linear Secret Sharing Schemes (LSSS) with strong multiplication by the Massey construction. Asymmetric Quantum Codes are obtained from toric codes by the A.R. Calderbank, P.W. Shor and A.M. Steane construction of stabilizer codes (CSS) from linear codes containing their dual codes ...
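    For reference, the symmetric (textbook) form of the Calderbank-Shor-Steane construction named here is given below; the asymmetric variant used in this record generalizes it to two different classical codes.

        Given a classical linear code $C \subseteq \mathbb{F}_q^{n}$ with $C^{\perp} \subseteq C$ and $\dim C = k$,
        the CSS construction yields a quantum stabilizer code with parameters
        $$[[\,n,\; 2k-n,\; d\,]], \qquad d \;\ge\; \min\{\operatorname{wt}(c) : c \in C \setminus C^{\perp}\}.$$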

  3. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the Foxbase language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary files consist of 11 files, one for the organ code and the others for the pathology code. The organ code was obtained by typing the organ name or the code number itself among the upper- and lower-level codes of the selected one that were simultaneously displayed on the screen. According to the first digit of the selected organ code, the corresponding pathology code file was chosen automatically. In a similar fashion to the organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by the same program, and incorporation of this program into other data-processing programs was possible. This program has the merits of simple operation, accurate and detailed coding, and easy adaptation to other programs. Therefore, this program can be used for automation of routine work in the department of radiology.

  4. Experimental and Monte Carlo studies of fluence corrections for graphite calorimetry in low- and high-energy clinical proton beams

    International Nuclear Information System (INIS)

    Lourenço, Ana; Thomas, Russell; Bouchard, Hugo; Kacperek, Andrzej; Vondracek, Vladimir; Royle, Gary; Palmans, Hugo

    2016-01-01

    Purpose: The aim of this study was to determine fluence corrections necessary to convert absorbed dose to graphite, measured by graphite calorimetry, to absorbed dose to water. Fluence corrections were obtained from experiments and Monte Carlo simulations in low- and high-energy proton beams. Methods: Fluence corrections were calculated to account for the difference in fluence between water and graphite at equivalent depths. Measurements were performed with narrow proton beams. Plane-parallel-plate ionization chambers with a large collecting area compared to the beam diameter were used to intercept the whole beam. High- and low-energy proton beams were provided by a scanning and double scattering delivery system, respectively. A mathematical formalism was established to relate fluence corrections derived from Monte Carlo simulations, using the FLUKA code [A. Ferrari et al., “FLUKA: A multi-particle transport code,” in CERN 2005-10, INFN/TC 05/11, SLAC-R-773 (2005) and T. T. Böhlen et al., “The FLUKA Code: Developments and challenges for high energy and medical applications,” Nucl. Data Sheets 120, 211–214 (2014)], to partial fluence corrections measured experimentally. Results: A good agreement was found between the partial fluence corrections derived by Monte Carlo simulations and those determined experimentally. For a high-energy beam of 180 MeV, the fluence corrections from Monte Carlo simulations were found to increase from 0.99 to 1.04 with depth. In the case of a low-energy beam of 60 MeV, the magnitude of fluence corrections was approximately 0.99 at all depths when calculated in the sensitive area of the chamber used in the experiments. Fluence correction calculations were also performed for a larger area and found to increase from 0.99 at the surface to 1.01 at greater depths. Conclusions: Fluence corrections obtained experimentally are partial fluence corrections because they account for differences in the primary and part of the secondary
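    In the generic formalism behind such conversions (sketched here only in its commonly used form; the paper's own formalism, which distinguishes partial and total fluence corrections, is not reproduced), the dose to water at the water-equivalent depth is obtained from the measured dose to graphite as

        $$ D_{w}(z_{w\text{-}eq}) \;=\; D_{g}(z_{g})\; \bar{s}_{w,g}\; k_{\mathrm{fl}}, $$

    where $\bar{s}_{w,g}$ is the water-to-graphite mass collision stopping-power ratio and $k_{\mathrm{fl}}$ accounts for the difference in primary and secondary particle fluence between water and graphite at equivalent depths.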

  5. Experimental and Monte Carlo studies of fluence corrections for graphite calorimetry in low- and high-energy clinical proton beams

    Energy Technology Data Exchange (ETDEWEB)

    Lourenço, Ana, E-mail: am.lourenco@ucl.ac.uk [Department of Medical Physics and Biomedical Engineering, University College London, London WC1E 6BT, United Kingdom and Division of Acoustics and Ionising Radiation, National Physical Laboratory, Teddington TW11 0LW (United Kingdom); Thomas, Russell; Bouchard, Hugo [Division of Acoustics and Ionising Radiation, National Physical Laboratory, Teddington TW11 0LW (United Kingdom); Kacperek, Andrzej [National Eye Proton Therapy Centre, Clatterbridge Cancer Centre, Wirral CH63 4JY (United Kingdom); Vondracek, Vladimir [Proton Therapy Center, Budinova 1a, Prague 8 CZ-180 00 (Czech Republic); Royle, Gary [Department of Medical Physics and Biomedical Engineering, University College London, London WC1E 6BT (United Kingdom); Palmans, Hugo [Division of Acoustics and Ionising Radiation, National Physical Laboratory, Teddington TW11 0LW, United Kingdom and Medical Physics Group, EBG MedAustron GmbH, A-2700 Wiener Neustadt (Austria)

    2016-07-15

    Purpose: The aim of this study was to determine fluence corrections necessary to convert absorbed dose to graphite, measured by graphite calorimetry, to absorbed dose to water. Fluence corrections were obtained from experiments and Monte Carlo simulations in low- and high-energy proton beams. Methods: Fluence corrections were calculated to account for the difference in fluence between water and graphite at equivalent depths. Measurements were performed with narrow proton beams. Plane-parallel-plate ionization chambers with a large collecting area compared to the beam diameter were used to intercept the whole beam. High- and low-energy proton beams were provided by a scanning and double scattering delivery system, respectively. A mathematical formalism was established to relate fluence corrections derived from Monte Carlo simulations, using the FLUKA code [A. Ferrari et al., “FLUKA: A multi-particle transport code,” in CERN 2005-10, INFN/TC 05/11, SLAC-R-773 (2005) and T. T. Böhlen et al., “The FLUKA Code: Developments and challenges for high energy and medical applications,” Nucl. Data Sheets 120, 211–214 (2014)], to partial fluence corrections measured experimentally. Results: A good agreement was found between the partial fluence corrections derived by Monte Carlo simulations and those determined experimentally. For a high-energy beam of 180 MeV, the fluence corrections from Monte Carlo simulations were found to increase from 0.99 to 1.04 with depth. In the case of a low-energy beam of 60 MeV, the magnitude of fluence corrections was approximately 0.99 at all depths when calculated in the sensitive area of the chamber used in the experiments. Fluence correction calculations were also performed for a larger area and found to increase from 0.99 at the surface to 1.01 at greater depths. Conclusions: Fluence corrections obtained experimentally are partial fluence corrections because they account for differences in the primary and part of the secondary

  6. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    Science.gov (United States)

    Pinto, Carlos; Machado, Armando

    2015-05-01

To investigate the coding strategies that pigeons may use in temporal discrimination tasks, pigeons were trained on a matching-to-sample procedure with three sample durations (2s, 6s and 18s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, the multiple-coding and the single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5s, the geometric mean of 2s and 6s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control. © Society for the Experimental Analysis of Behavior.

  7. Code Cactus; Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

This code handles the following problems: (1) analysis of thermal experiments on a water loop at high or low pressure, in steady-state or transient behavior; (2) analysis of the thermal and hydrodynamic behavior of water-cooled and moderated reactors, at either high or low pressure, with boiling permitted; fuel elements are assumed to be flat plates. The flow rate in parallel channels, coupled or not by conduction across the plates, is given for imposed pressure-drop or flow-rate conditions, variable or not with respect to time; the power can be coupled to a reactor kinetics calculation or supplied by the code user. The code, containing a schematic representation of safety rod behavior, is a one-dimensional, multi-channel code, and has as its complement FLID, a one-channel, two-dimensional code. (authors) [French] This code makes it possible to treat the problems below: 1. Analysis of thermal tests on a water loop, at high or low pressure, in steady-state or transient regime; 2. Thermal and hydraulic studies of plate-type water reactors, at high or low pressure, with boiling permitted: distribution between parallel channels, coupled or not by conduction across the plates, for imposed flow-rate or pressure-drop conditions, variable or not in time; the power can be coupled to the neutronics, and a schematic representation of the safety actions is provided. This code (Cactus), with one space dimension and several channels, has as its complement Flid, which treats the study of a single channel in two dimensions. (authors)

  8. Monte Carlo simulation of neutron detection efficiency for NE213 scintillation detector

    International Nuclear Information System (INIS)

    Xi Yinyin; Song Yushou; Chen Zhiqiang; Yang Kun; Zhangsu Yalatu; Liu Xingquan

    2013-01-01

An NE213 liquid scintillation neutron detector was simulated using the FLUKA code. The light output of the detector was obtained by transforming the secondary particles' energy deposition using Birks' formula. According to the measurement threshold, detection efficiencies can be calculated by integrating the light output. The light output, the central efficiency and the average efficiency as a function of the front-surface radius of the detector were simulated, and the results agreed well with experimental results. (authors)
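    Birks' formula referred to above relates the differential light yield to the stopping power of the secondary charged particle. A minimal sketch follows; the Birks constant used here is a generic placeholder, not necessarily the value adopted in this work:

        def birks_light_yield(dE_dx, S=1.0, kB=0.0126):
            """Differential light yield dL/dx = S*(dE/dx) / (1 + kB*(dE/dx)).

            dE_dx : stopping power of the secondary charged particle
            S     : absolute scintillation efficiency (arbitrary units)
            kB    : Birks constant; 0.0126 g/(MeV cm^2) is a commonly quoted
                    value for organic scintillators, used here only as a placeholder.
            """
            return S * dE_dx / (1.0 + kB * dE_dx)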

  9. Evaluation Codes from an Affine Veriety Code Perspective

    DEFF Research Database (Denmark)

    Geil, Hans Olav

    2008-01-01

Evaluation codes (also called order domain codes) are traditionally introduced as generalized one-point geometric Goppa codes. In the present paper we will give a new point of view on evaluation codes by introducing them instead as particular nice examples of affine variety codes. Our study ... includes a reformulation of the usual methods to estimate the minimum distances of evaluation codes into the setting of affine variety codes. Finally we describe the connection to the theory of one-point geometric Goppa codes. Contents: 4.1 Introduction; 4.9 Codes from order domains; 4.10 One-point geometric Goppa codes; 4.11 Bibliographical Notes; References ...

  10. An Optimal Linear Coding for Index Coding Problem

    OpenAIRE

    Pezeshkpour, Pouya

    2015-01-01

An optimal linear coding solution for the index coding problem is established. Instead of a network coding approach, focusing on graph-theoretic and algebraic methods, a linear coding program for solving both the unicast and the groupcast index coding problem is presented. The coding is proved to be the optimal solution from the linear perspective and can easily be utilized for any number of messages. The importance of this work lies mostly in the usage of the presented coding in the groupcast index coding ...

  11. The GEANT-CALOR interface

    International Nuclear Information System (INIS)

    Zeitnitz, C.; Gabriel, T.A.

    1994-01-01

The simulation of large-scale high-energy physics experiments is based mainly on the GEANT package. In the current version 3.15, the simulation of hadronically interacting particles is based on GHEISHA or FLUKA. Both programs lack an accurate simulation of the interaction of low-energy neutrons (E{sub kin} < 20 MeV) with the materials of the detector. The CALOR89 program package contains a low-energy neutron code. An interface between the CALOR program parts and the GEANT package has been developed.

  12. Estimation of saturation activities for activation experiments in CHARM and CSBF using Fluence Conversion Coefficients

    CERN Document Server

    Guerin, Helene Chloe; Iliopoulou, Elpida; CERN. Geneva. HSE Department

    2017-01-01

As a summer student at CERN, I have been working in the Radiation Protection group for 10 weeks. I worked with the FLUKA Monte Carlo simulation code, using the Fluence Conversion Coefficients method to perform simulations estimating the saturation activities for activation experiments in the CSBF and the CHARM facility in the East Experimental Area. The provided results will be used to plan a Monte Carlo benchmark in the CSBF during a beam period at the end of August 2017.
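    The fluence conversion coefficient method named above amounts to folding a binned particle fluence spectrum with precomputed fluence-to-saturation-activity coefficients. A schematic sketch, assuming matching bin structures and units (the function and variable names are hypothetical):

        import numpy as np

        def saturation_activity(fluence_spectrum, conversion_coefficients):
            """Fold a binned fluence spectrum (per primary particle) with
            fluence-to-saturation-activity conversion coefficients."""
            return float(np.dot(fluence_spectrum, conversion_coefficients))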

  13. TH-A-19A-05: Modeling Physics Properties and Biologic Effects Induced by Proton and Helium Ions

    Energy Technology Data Exchange (ETDEWEB)

    Taleei, R; Titt, U; Peeler, C; Guan, F; Mirkovic, D; Grosshans, D; Mohan, R [UT MD Anderson Cancer Center, Houston, TX (United States)

    2014-06-15

Purpose: Currently, proton and carbon ions are used for cancer treatment. More recently, other light ions including helium ions have shown interesting physical and biological properties. The purpose of this work is to study the biological and physical properties of helium ions (He-3) in comparison to protons. Methods: Monte Carlo simulations with FLUKA, GEANT4 and MCNPX were used to calculate proton and He-3 dose distributions in water phantoms. The energy spectra of proton and He-3 beams were calculated with high resolution for use in biological models. The repair-misrepair-fixation (RMF) model was subsequently used to calculate the RBE. Results: The proton Bragg curve calculations show good agreement between the three general-purpose Monte Carlo codes. In contrast, the He-3 Bragg curve calculations show disagreement (for the magnitude of the Bragg peak) between FLUKA and the other two Monte Carlo codes. The differences in the magnitude of the Bragg peak are mainly due to the discrepancy in the secondary fragmentation cross sections used by the codes. The RBE for V79 cell lines is about 0.96 and 0.98 at the entrance of the proton and He-3 depth-dose curves, respectively. The RBE increases to 1.06 and 1.59 at the Bragg peak of proton and He-3 ions. The results demonstrated that LET, microdosimetric parameters (such as dose-mean lineal energy) and RBE are nearly constant along the plateau region of the Bragg curve, while all parameters increase within the Bragg peak and at the distal edge for both proton and He-3 ions. Conclusion: The Monte Carlo codes should revise the fragmentation cross sections to more accurately simulate the physical properties of He-3 ions. The increase in RBE for He-3 ions is higher than for proton beams at the Bragg peak.

  14. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low...

  15. Self-complementary circular codes in coding theory.

    Science.gov (United States)

    Fimmel, Elena; Michel, Christian J; Starman, Martin; Strüngmann, Lutz

    2018-04-01

    Self-complementary circular codes are involved in pairing genetic processes. A maximal [Formula: see text] self-complementary circular code X of trinucleotides was identified in genes of bacteria, archaea, eukaryotes, plasmids and viruses (Michel in Life 7(20):1-16 2017, J Theor Biol 380:156-177, 2015; Arquès and Michel in J Theor Biol 182:45-58 1996). In this paper, self-complementary circular codes are investigated using the graph theory approach recently formulated in Fimmel et al. (Philos Trans R Soc A 374:20150058, 2016). A directed graph [Formula: see text] associated with any code X mirrors the properties of the code. In the present paper, we demonstrate a necessary condition for the self-complementarity of an arbitrary code X in terms of the graph theory. The same condition has been proven to be sufficient for codes which are circular and of large size [Formula: see text] trinucleotides, in particular for maximal circular codes ([Formula: see text] trinucleotides). For codes of small-size [Formula: see text] trinucleotides, some very rare counterexamples have been constructed. Furthermore, the length and the structure of the longest paths in the graphs associated with the self-complementary circular codes are investigated. It has been proven that the longest paths in such graphs determine the reading frame for the self-complementary circular codes. By applying this result, the reading frame in any arbitrary sequence of trinucleotides is retrieved after at most 15 nucleotides, i.e., 5 consecutive trinucleotides, from the circular code X identified in genes. Thus, an X motif of a length of at least 15 nucleotides in an arbitrary sequence of trinucleotides (not necessarily all of them belonging to X) uniquely defines the reading (correct) frame, an important criterion for analyzing the X motifs in genes in the future.

  16. Diagonal Eigenvalue Unity (DEU) code for spectral amplitude coding-optical code division multiple access

    Science.gov (United States)

    Ahmed, Hassan Yousif; Nisar, K. S.

    2013-08-01

Codes with ideal in-phase cross-correlation (CC) and practical code length to support a high number of users are required in spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. SAC systems are becoming more attractive in the field of OCDMA because of their ability to eliminate the influence of multiple access interference (MAI) and also suppress the effect of phase-induced intensity noise (PIIN). In this paper, we propose new Diagonal Eigenvalue Unity (DEU) code families with ideal in-phase CC based on the Jordan block matrix with simple algebraic methods. Four sets of DEU code families based on the code weight W and number of users N for the combinations (even, even), (even, odd), (odd, odd) and (odd, even) are constructed. This combination gives the DEU code more flexibility in the selection of code weight and number of users. These features make this code a compelling candidate for future optical communication systems. Numerical results show that the proposed DEU system outperforms reported codes. In addition, simulation results taken from a commercial optical systems simulator, Virtual Photonic Instrument (VPI™), show that, using point-to-multipoint transmission in a passive optical network (PON), DEU has better performance and could support long spans at high data rates.

  17. List Decoding of Matrix-Product Codes from nested codes: an application to Quasi-Cyclic codes

    DEFF Research Database (Denmark)

    Hernando, Fernando; Høholdt, Tom; Ruano, Diego

    2012-01-01

    A list decoding algorithm for matrix-product codes is provided when $C_1,..., C_s$ are nested linear codes and $A$ is a non-singular by columns matrix. We estimate the probability of getting more than one codeword as output when the constituent codes are Reed-Solomon codes. We extend this list...... decoding algorithm for matrix-product codes with polynomial units, which are quasi-cyclic codes. Furthermore, it allows us to consider unique decoding for matrix-product codes with polynomial units....

  18. Analytical Model for Estimating the Zenith Angle Dependence of Terrestrial Cosmic Ray Fluxes.

    Directory of Open Access Journals (Sweden)

    Tatsuhiko Sato

    Full Text Available A new model called "PHITS-based Analytical Radiation Model in the Atmosphere (PARMA version 4.0" was developed to facilitate instantaneous estimation of not only omnidirectional but also angular differential energy spectra of cosmic ray fluxes anywhere in Earth's atmosphere at nearly any given time. It consists of its previous version, PARMA3.0, for calculating the omnidirectional fluxes and several mathematical functions proposed in this study for expressing their zenith-angle dependences. The numerical values of the parameters used in these functions were fitted to reproduce the results of the extensive air shower simulation performed by Particle and Heavy Ion Transport code System (PHITS. The angular distributions of ground-level muons at large zenith angles were specially determined by introducing an optional function developed on the basis of experimental data. The accuracy of PARMA4.0 was closely verified using multiple sets of experimental data obtained under various global conditions. This extension enlarges the model's applicability to more areas of research, including design of cosmic-ray detectors, muon radiography, soil moisture monitoring, and cosmic-ray shielding calculation. PARMA4.0 is available freely and is easy to use, as implemented in the open-access EXcel-based Program for Calculating Atmospheric Cosmic-ray Spectrum (EXPACS.

  19. Interfacing MCNPX and McStas for simulation of neutron transport

    DEFF Research Database (Denmark)

    Klinkby, Esben Bryndt; Lauritzen, Bent; Nonbøl, Erik

    2013-01-01

Simulations of the target-moderator-reflector system at spallation sources are conventionally carried out using Monte Carlo codes such as MCNPX[1] or FLUKA[2, 3], whereas simulations of neutron transport from the moderator and of the instrument response are performed by neutron ray-tracing codes such as McStas[4, 5, 6, 7]. The coupling between the two simulation suites typically consists of providing analytical fits of MCNPX neutron spectra to McStas. This method is generally successful but has limitations, as it e.g. does not allow for re-entry of neutrons into the MCNPX regime. Previous work to resolve ... geometries, backgrounds, interference between beam-lines as well as shielding requirements along the neutron guides.
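    The "analytical fits of MCNPX neutron spectra" mentioned above are not specified in this record; one common parameterization for a thermal moderator handed over to ray-tracing codes is a Maxwellian flux density, sketched here with hypothetical variable names and starting values:

        import numpy as np
        from scipy.optimize import curve_fit

        def maxwellian(E, I0, T):
            """Maxwellian flux density phi(E) = I0 * E/T**2 * exp(-E/T);
            an illustrative form only (E and T in the same energy units)."""
            return I0 * E / T**2 * np.exp(-E / T)

        # E_bins, phi = binned neutron spectrum tallied by the Monte Carlo code (hypothetical)
        # popt, _ = curve_fit(maxwellian, E_bins, phi, p0=[phi.max(), 0.025])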

  20. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of UD code and, for codes that are not UD, allows one to recover the "unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads to the introduction of the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case where the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case where the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies such a hypothesis. Finally, we also consider some relationships between coding partitions and varieties of codes.
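    The unique decipherability (UD) property that coding partitions generalize can be tested with the classical Sardinas-Patterson procedure. The sketch below implements that standard test, not the canonical-partition algorithm of the paper:

        def is_uniquely_decipherable(code):
            """Sardinas-Patterson test: a finite code is UD iff no set of
            dangling suffixes ever contains a codeword."""
            code = set(code)

            def residuals(A, B):
                # Suffixes w such that a = b + w or b = a + w with a != b.
                out = set()
                for a in A:
                    for b in B:
                        if a != b:
                            if a.startswith(b):
                                out.add(a[len(b):])
                            elif b.startswith(a):
                                out.add(b[len(a):])
                return out

            seen = set()
            current = residuals(code, code)
            while current:
                if current & code:
                    return False        # a codeword is a dangling suffix: ambiguous
                if frozenset(current) in seen:
                    return True         # cycle without conflict: UD
                seen.add(frozenset(current))
                current = residuals(code, current)
            return True

        # is_uniquely_decipherable({"0", "01", "10"})  -> False ("010" is ambiguous)
        # is_uniquely_decipherable({"0", "10", "110"}) -> True  (prefix-free)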

  1. ActiWiz – optimizing your nuclide inventory at proton accelerators with a computer code

    CERN Document Server

    Vincke, Helmut

    2014-01-01

    When operating an accelerator one always faces unwanted, but inevitable beam losses. These result in activation of adjacent material, which in turn has an obvious impact on safety and handling constraints. One of the key parameters responsible for activation is the chemical composition of the material which often can be optimized in that respect. In order to facilitate this task also for non-expert users the ActiWiz software has been developed at CERN. Based on a large amount of generic FLUKA Monte Carlo simulations the software applies a specifically developed risk assessment model to provide support to decision makers especially during the design phase as well as common operational work in the domain of radiation protection.

  2. Combinatorial neural codes from a mathematical coding theory perspective.

    Science.gov (United States)

    Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L

    2013-07-01

    Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.

  3. LDGM Codes for Channel Coding and Joint Source-Channel Coding of Correlated Sources

    Directory of Open Access Journals (Sweden)

    Javier Garcia-Frias

    2005-05-01

    Full Text Available We propose a coding scheme based on the use of systematic linear codes with low-density generator matrix (LDGM codes for channel coding and joint source-channel coding of multiterminal correlated binary sources. In both cases, the structures of the LDGM encoder and decoder are shown, and a concatenated scheme aimed at reducing the error floor is proposed. Several decoding possibilities are investigated, compared, and evaluated. For different types of noisy channels and correlation models, the resulting performance is very close to the theoretical limits.
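    As a minimal illustration of the systematic LDGM encoding idea described above (toy sizes and density are hypothetical; the concatenated scheme of the paper is not reproduced):

        import numpy as np

        def ldgm_encode(x, P):
            """Systematic LDGM encoding over GF(2): codeword = [x | x P].

            x : information bits, shape (k,)
            P : sparse binary parity part of the generator matrix G = [I | P], shape (k, m)
            """
            parity = np.mod(x @ P, 2)
            return np.concatenate([x, parity])

        # Hypothetical toy usage:
        # k, m = 8, 4
        # P = (np.random.rand(k, m) < 0.25).astype(int)   # low-density parity part
        # codeword = ldgm_encode(np.random.randint(0, 2, k), P)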

  4. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  5. Discussion on LDPC Codes and Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

This slide presentation reviews the progress made by the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  6. Calculation of Bremsstrahlung yield for thin target

    International Nuclear Information System (INIS)

    Demir, N.; Akkurt, I.; Tekin, H. O.; Cakirli, R. B.; Akkus, B.; Kupa, I.

    2010-01-01

Bremsstrahlung photons are created by decelerating an electron beam in the electric field of a material, usually a thin so-called radiator. The obtained Bremsstrahlung yield depends on parameters such as the incoming electron beam energy and the thickness and atomic number Z of the radiator. The main aim of this work is to find the optimum radiator to be used at the Bremsstrahlung photon beam facility at TAC. For this purpose, the Bremsstrahlung photon yield has been obtained using the FLUKA code for different types of materials.

  7. Narrow beam dosimetry for high-energy hadrons and electrons

    CERN Document Server

    Pelliccioni, M; Ulrici, Luisa

    2001-01-01

Organ doses and effective dose were calculated with the latest version of the Monte Carlo transport code FLUKA for an anthropomorphic mathematical model exposed to monoenergetic narrow beams of protons, pions and electrons in the energy range 10-400 GeV. The target organs considered were the right eye, thyroid, thymus, lung and breast. Simple scaling laws for the calculated values are given. The present data and formulae should prove useful for dosimetric estimations in case of accidental exposures to high-energy beams.

  8. MCNPX simulation of proton dose distribution in homogeneous and CT phantoms

    International Nuclear Information System (INIS)

    Lee, C.C.; Lee, Y.J.; Tung, C.J.; Cheng, H.W.; Chao, T.C.

    2014-01-01

A dose simulation system was constructed based on the MCNPX Monte Carlo package to simulate proton dose distribution in homogeneous and CT phantoms. Conversion from the Hounsfield unit of a patient CT image set to the material information necessary for Monte Carlo simulation is based on Schneider's approach. In order to validate this simulation system, an inter-comparison of depth dose distributions obtained from the MCNPX, GEANT4 and FLUKA codes for a 160 MeV monoenergetic proton beam incident normally on the surface of a homogeneous water phantom was performed. For dose validation within the CT phantom, direct comparison with measurement is infeasible. Instead, this study took the approach of indirectly comparing the 50% ranges (R{sub 50%}) along the central axis obtained by our system to the NIST CSDA ranges for beams with 160 and 115 MeV energies. Comparison results within the homogeneous phantom show good agreement. Differences of simulated R{sub 50%} among the three codes are less than 1 mm. For results within the CT phantom, the MCNPX-simulated water-equivalent R{sub eq,50%} are compatible with the CSDA water-equivalent ranges from the NIST database, with differences of 0.7 and 4.1 mm for the 160 and 115 MeV beams, respectively. - Highlights: ► Proton dose simulation based on the MCNPX 2.6.0 in homogeneous and CT phantoms. ► CT number (HU) conversion to electron density based on Schneider's approach. ► Good agreement among MCNPX, GEANT4 and FLUKA codes in a homogeneous water phantom. ► Water-equivalent R{sub 50%} in CT phantoms are compatible with those of the NIST database
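    Schneider-type calibrations map CT numbers to mass density (and material composition) through piecewise-linear segments. The sketch below only illustrates the shape of such a mapping; the breakpoints and slopes are placeholders, not the published calibration used in this work:

        def hu_to_density(hu):
            """Rough piecewise-linear Hounsfield-unit -> mass density (g/cm^3) mapping.
            Illustrative placeholder values only."""
            hu = max(hu, -1000)                    # clamp at air
            if hu < 100:                           # air .. soft tissue
                return 1.0 + hu * 1.0e-3
            return 1.1 + (hu - 100) * 5.0e-4       # increasingly dense bone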

  9. New quantum codes constructed from quaternary BCH codes

    Science.gov (United States)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-10-01

In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes is determined to be much larger than the result given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each different code length. Thus, families of new QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Secondly, we apply a combinatorial construction to the imprimitive BCH codes with their corresponding primitive counterpart and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.

  10. Entanglement-assisted quantum MDS codes from negacyclic codes

    Science.gov (United States)

    Lu, Liangdong; Li, Ruihu; Guo, Luobin; Ma, Yuena; Liu, Yang

    2018-03-01

The entanglement-assisted formalism generalizes the standard stabilizer formalism, which can transform arbitrary classical linear codes into entanglement-assisted quantum error-correcting codes (EAQECCs) by using pre-shared entanglement between the sender and the receiver. In this work, we construct six classes of q-ary entanglement-assisted quantum MDS (EAQMDS) codes based on classical negacyclic MDS codes by exploiting two or more pre-shared maximally entangled states. We show that two of these six classes of q-ary EAQMDS codes have minimum distance larger than q+1. Most of these q-ary EAQMDS codes are new in the sense that their parameters are not covered by the codes available in the literature.

  11. Visualizing code and coverage changes for code review

    NARCIS (Netherlands)

    Oosterwaal, Sebastiaan; van Deursen, A.; De Souza Coelho, R.; Sawant, A.A.; Bacchelli, A.

    2016-01-01

    One of the tasks of reviewers is to verify that code modifications are well tested. However, current tools offer little support in understanding precisely how changes to the code relate to changes to the tests. In particular, it is hard to see whether (modified) test code covers the changed code.

  12. Homological stabilizer codes

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Jonas T., E-mail: jonastyleranderson@gmail.com

    2013-03-15

In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. - Highlights: ► We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. ► We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. ► We find and classify all 2D homological stabilizer codes. ► We find optimal codes among the homological stabilizer codes.

  13. SPECTRAL AMPLITUDE CODING OCDMA SYSTEMS USING ENHANCED DOUBLE WEIGHT CODE

    Directory of Open Access Journals (Sweden)

    F.N. HASOON

    2006-12-01

Full Text Available A new code structure for spectral amplitude coding optical code division multiple access systems based on double-weight (DW) code families is proposed. The DW code has a fixed weight of two. The enhanced double-weight (EDW) code is another variation of the DW code family that can have a variable weight greater than one. The EDW code possesses ideal cross-correlation properties and exists for every natural number n. A much better performance can be provided by using the EDW code compared to existing codes such as the Hadamard and Modified Frequency-Hopping (MFH) codes. Both theoretical analysis and simulation show that EDW achieves much better performance than the Hadamard and MFH codes.

  14. DLLExternalCode

    Energy Technology Data Exchange (ETDEWEB)

    2014-05-14

DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
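    The linking pattern described above (take inputs, write an input file, run the external application, read outputs back) can be sketched as follows. The file names, executable and key=value format are hypothetical; the real DLL exposes this behavior through GoldSim's external-element interface rather than Python:

        import subprocess

        def run_external_code(inputs, exe="external_code.exe",
                              infile="run.inp", outfile="run.out"):
            """Write inputs to a file, run the external code, return parsed outputs."""
            with open(infile, "w") as f:
                for name, value in inputs.items():
                    f.write(f"{name} = {value}\n")
            subprocess.run([exe, infile], check=True)      # run the external application
            outputs = {}
            with open(outfile) as f:
                for line in f:
                    name, value = line.split("=")
                    outputs[name.strip()] = float(value)
            return outputs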

  15. High efficiency video coding coding tools and specification

    CERN Document Server

    Wien, Mathias

    2015-01-01

    The video coding standard High Efficiency Video Coding (HEVC) targets improved compression performance for video resolutions of HD and beyond, providing Ultra HD video at compressed bit rates similar to those of HD video encoded with the well-established video coding standard H.264 | AVC. Based on known concepts, new coding structures and improved coding tools have been developed and specified in HEVC. The standard is expected to be taken up easily by established industry as well as new endeavors, answering the needs of today's connected and ever-evolving online world. This book presents the High Efficiency Video Coding standard and explains it in clear and coherent language. It provides a comprehensive and consistently written description, all of a piece. The book targets both newcomers to video coding and experts in the field. While providing sections with introductory text for the beginner, it also serves as a well-arranged reference book for the expert. The book provides a comprehensive reference for th...

  16. Converter of a continuous code into the Grey code

    International Nuclear Information System (INIS)

    Gonchar, A.I.; Trubnikov, V.R.

    1979-01-01

    Described is a converter of a continuous code into the Grey code, used in a 12-bit precision amplitude-to-digital converter to decrease the digital component of the spectrometer differential nonlinearity to +0.7% over 98% of the measured range. The converter exploits the regular pattern in which ones and zeroes alternate in each bit of the Grey code as the pulse count of the continuous code changes monotonically. The converter is built from series-155 logic elements; the pulse rate of the continuous code at the converter input is 25 MHz
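
    For illustration, the standard binary-reflected Grey (Gray) mapping that such converters realize in hardware can be written compactly in software; the sketch below shows only the mapping itself, not the series-155 logic of the paper.

      def binary_to_gray(n: int) -> int:
          """Binary-reflected Gray code: consecutive values differ in exactly one bit."""
          return n ^ (n >> 1)

      def gray_to_binary(g: int) -> int:
          """Invert the mapping by cumulatively XOR-ing the right-shifted value."""
          n = 0
          while g:
              n ^= g
              g >>= 1
          return n

      # Consecutive counts map to Gray words differing in a single bit, which is
      # what keeps the digital contribution to differential nonlinearity small.
      print([format(binary_to_gray(i), "04b") for i in range(8)])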

  17. Interactions of secondary particles with thorium samples in the setup QUINTA irradiated with 6 GeV deuterons

    Energy Technology Data Exchange (ETDEWEB)

    Khushvaktov, J., E-mail: khushvaktov@jinr.ru [Joint Institute for Nuclear Research, Dubna (Russian Federation); Institute of Nuclear Physics ASRU, Tashkent (Uzbekistan); Adam, J. [Joint Institute for Nuclear Research, Dubna (Russian Federation); Nuclear Physics Institute ASCR PRI (Czech Republic); Baldin, A.A. [Joint Institute for Nuclear Research, Dubna (Russian Federation); Institute for Advanced Studies “OMEGA”, Dubna (Russian Federation); Chilap, V.V. [Center of Physical and Technical Projects “Atomenergomash”, Moscow (Russian Federation); Furman, V.I.; Sagimbaeva, F.; Solnyshkin, A.A.; Stegailov, V.I.; Tichy, P.; Tsoupko-Sitnikov, V.M.; Tyutyunnikov, S.I.; Vespalec, R. [Joint Institute for Nuclear Research, Dubna (Russian Federation); Vrzalova, J. [Joint Institute for Nuclear Research, Dubna (Russian Federation); Nuclear Physics Institute ASCR PRI (Czech Republic); Yuldashev, B.S. [Joint Institute for Nuclear Research, Dubna (Russian Federation); Institute of Nuclear Physics ASRU, Tashkent (Uzbekistan); Wagner, V. [Nuclear Physics Institute ASCR PRI (Czech Republic); Zavorka, L.; Zeman, M. [Joint Institute for Nuclear Research, Dubna (Russian Federation)

    2016-08-15

    The natural uranium assembly, QUINTA, was irradiated with 6 GeV deuterons. The ²³²Th samples were placed at the central axis of the setup QUINTA. The spectra of gamma rays emitted by the activated ²³²Th samples have been analysed and more than one hundred nuclei produced have been identified. For each of those products, reaction rates have been determined. The ratio of the weight of produced ²³³U to ²³²Th is presented. Experimental results were compared with the results of Monte Carlo simulations by FLUKA code.

  18. SEE cross section calibration and application to quasi-monoenergetic and spallation facilities

    Directory of Open Access Journals (Sweden)

    Alía Rubén García

    2017-01-01

    Full Text Available We describe an approach to calibrate SEE-based detectors in monoenergetic fields and apply the resulting semi-empirical responses to more general mixed-field cases in which a broad variety of particle species and energy spectra are involved. The calibration of the response functions is based both on experimental proton and neutron data and on considerations derived from Monte Carlo simulations using the FLUKA code. The application environments include the quasi-monoenergetic neutrons at RCNP, the atmospheric-like VESUVIO spallation spectrum and the CHARM high-energy accelerator test facility.

  19. Calculations of the photon dose behind concrete shielding of high energy proton accelerators

    International Nuclear Information System (INIS)

    Dworak, D.; Tesch, K.; Zazula, J.M.

    1992-02-01

    The photon dose per primary beam proton behind lateral concrete shielding was calculated by using an extension of the Monte Carlo particle shower code FLUKA. The following photon-producing processes were taken into account: capture of thermal neutrons, deexcitation of nuclei after nuclear evaporation, inelastic neutron scattering and nuclear reactions below 140 MeV, as well as photons from electromagnetic cascades. The obtained ratio of the photon dose to the neutron dose equivalent varies from 8% to 20% and compares well with measurements performed recently at DESY, which give a mean ratio of 14%. (orig.)

  20. Study of Energy Deposition and Activation for the LINAC4 Dump

    CERN Document Server

    Cerutti, F; Mauro, E; Mereghetti, A; Silari, M; CERN. Geneva. AB Department

    2008-01-01

    This document provides estimates of energy deposition and activation for the dump of the future LINAC4 accelerator. Detailed maps of the power density deposited in the dump are given, allowing further thermo-mechanical studies to be performed. Residual dose rates at a few cooling times for different irradiation scenarios have been calculated. Moreover, the air activation has been evaluated and doses to the reference population group and to a worker intervening in the cave at shutdown have been predicted. Calculations were performed with the Monte Carlo particle transport and interaction code FLUKA.

  1. Induced radioactivity of materials by stray radiation fields at an electron accelerator

    CERN Document Server

    Rokni, S H; Gwise, T; Liu, J C; Roesler, S

    2002-01-01

    Samples of soil, water, aluminum, copper and iron were irradiated in the stray radiation field generated by the interaction of a 28.5 GeV electron beam in a copper-dump in the Beam Dump East facility at the Stanford Linear Accelerator Center. The specific activity induced in the samples was measured by gamma spectroscopy and other techniques. In addition, the isotope production in the samples was calculated with detailed Monte Carlo simulations using the FLUKA code. The calculated activities are compared to the experimental values and differences are discussed.

  2. Interactions of secondary particles with thorium samples in the setup QUINTA irradiated with 6-GeV deuterons

    International Nuclear Information System (INIS)

    Khushvaktov, J.H.; Yuldashev, B.S.; Adam, J.; Vrzalova, J.; Baldin, A.A.; Chilap, V.V.; Furman, V.I.; Sagimbaeva, F.; Solnyshkin, A.A.; Stegailov, V.I.; Tichy, P.; Tsoupko-Sitnikov, V.M.; Tyutyunnikov, S.I.; Vespalec, R.; Zavorka, L.; Wagner, V.; Zeman, M.

    2016-01-01

    The natural uranium assembly, QUINTA, was irradiated with 6-GeV deuterons. The ²³²Th samples were placed at the central axis of the setup QUINTA. The spectra of gamma rays emitted by the activated ²³²Th samples have been analysed, and more than one hundred nuclei produced have been identified. For each of those products, reaction rates have been determined. The ratio of the weight of produced ²³³U to that of ²³²Th is presented. Experimental results were compared with the results of Monte Carlo simulations by the FLUKA code.

  3. A Monte Carlo transport code study of the space radiation environment using FLUKA and ROOT

    CERN Document Server

    Wilson, T; Carminati, F; Brun, R; Ferrari, A; Sala, P; Empl, A; MacGibbon, J

    2001-01-01

    We report on the progress of a current study aimed at developing a state-of-the-art Monte-Carlo computer simulation of the space radiation environment using advanced computer software techniques recently available at CERN, the European Laboratory for Particle Physics in Geneva, Switzerland. By taking the next-generation computer software appearing at CERN and adapting it to known problems in the implementation of space exploration strategies, this research is identifying changes necessary to bring these two advanced technologies together. The radiation transport tool being developed is tailored to the problem of taking measured space radiation fluxes impinging on the geometry of any particular spacecraft or planetary habitat and simulating the evolution of that flux through an accurate model of the spacecraft material. The simulation uses the latest known results in low-energy and high-energy physics. The output is a prediction of the detailed nature of the radiation environment experienced in space as well a...

  4. A visual user interface program, EGSWIN, for EGS4

    International Nuclear Information System (INIS)

    Qiu Rui; Li Junli; Wu Zhen

    2005-01-01

    To overcome the inconvenience and difficulty in using the EGS4 code by novice users, a visual user interface program, called the EGSWIN system, has been developed by the Monte Carlo Research Center of Tsinghua University in China. EGSWIN allows users to run EGS4 for many applications without any user coding. A mixed-language programming technique with Visual C++ and Visual Fortran is used in order to embed both EGS4 and PEGS4 into EGSWIN. The system has the features of visual geometry input, geometry processing, visual definition of sources, scoring and computing parameters, and display of particle trajectories. A comparison of the results calculated with EGS4 and EGSWIN, as well as with FLUKA and GEANT, has been made to validate EGSWIN. (author)

  5. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  6. Entanglement-assisted quantum MDS codes constructed from negacyclic codes

    Science.gov (United States)

    Chen, Jianzhang; Huang, Yuanyuan; Feng, Chunhui; Chen, Riqing

    2017-12-01

    Recently, entanglement-assisted quantum codes have been constructed from cyclic codes by some scholars. However, determining the number of shared entangled pairs required to construct entanglement-assisted quantum codes is not an easy task. In this paper, we propose a decomposition of the defining set of negacyclic codes. Based on this method, four families of entanglement-assisted quantum codes constructed in this paper satisfy the entanglement-assisted quantum Singleton bound, where the minimum distance satisfies q+1 ≤ d ≤ (n+2)/2. Furthermore, we construct two families of entanglement-assisted quantum codes with maximal entanglement.
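
    For context, the entanglement-assisted quantum Singleton bound invoked above is commonly stated, for an [[n, k, d; c]]_q code with d ≤ (n+2)/2, as follows (standard textbook form, not a restatement of the paper's derivation):

      n + c - k \;\ge\; 2(d - 1), \qquad \text{equivalently} \qquad d \;\le\; \frac{n - k + c}{2} + 1 .

    Codes attaining this bound with equality are the entanglement-assisted quantum MDS codes discussed in the abstract.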

  7. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.
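
    For illustration, the concatenated structure described above (an outer Reed-Solomon code followed by an inner block code) can be sketched as below. The sketch uses the third-party reedsolo package for an RS(255,223)-style outer code and a toy bit-repetition code as a stand-in for the modulation block codes studied in the report; the names, parameters, and the inner code choice are illustrative assumptions, not the report's actual design.

      from reedsolo import RSCodec  # third-party package: pip install reedsolo

      rs_outer = RSCodec(32)  # 32 parity bytes, i.e. RS(255,223) on 223-byte message blocks

      def inner_encode(byte_stream, rep=3):
          """Toy inner code: repeat each bit `rep` times (stand-in for a modulation block code)."""
          bits = []
          for b in byte_stream:
              for i in range(8):
                  bits.extend([(b >> (7 - i)) & 1] * rep)
          return bits

      def concatenated_encode(message: bytes):
          # Outer code first (interleaving would go here), then the inner code on the result.
          outer = rs_outer.encode(bytearray(message))
          return inner_encode(outer)

      channel_bits = concatenated_encode(b"space link test frame")
      print(len(channel_bits), "channel bits")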

  8. Turbo-Gallager Codes: The Emergence of an Intelligent Coding ...

    African Journals Online (AJOL)

    Today, both turbo codes and low-density parity-check codes are largely superior to other code families and are being used in an increasing number of modern communication systems including 3G standards, satellite and deep space communications. However, the two codes have certain distinctive characteristics that ...

  9. TASS code topical report. V.1 TASS code technical manual

    International Nuclear Information System (INIS)

    Sim, Suk K.; Chang, W. P.; Kim, K. D.; Kim, H. C.; Yoon, H. Y.

    1997-02-01

    The TASS 1.0 code has been developed at KAERI for the initial and reload non-LOCA safety analysis of the operating PWRs as well as the PWRs under construction in Korea. The TASS code will replace the various vendors' non-LOCA safety analysis codes currently used for the Westinghouse and ABB-CE type PWRs in Korea. This can be achieved through TASS code input modifications specific to each reactor type. The TASS code can be run interactively through keyboard operation. A semi-modular configuration used in developing the TASS code enables the user to easily implement new models. The TASS code has been programmed in FORTRAN77, which makes it easy to install and port to different computer environments. The TASS code can be utilized for steady state simulation as well as for non-LOCA transient simulations such as power excursions, reactor coolant pump trips, load rejections, loss of feedwater, steam line breaks, steam generator tube ruptures, rod withdrawal and drop, and anticipated transients without scram (ATWS). Malfunctions of the control systems and components, operator actions, and the transients caused by these malfunctions can be easily simulated using the TASS code. This technical report describes the TASS 1.0 code models, including the reactor thermal hydraulic, reactor core and control models. This TASS code technical manual has been prepared as a part of the TASS code manual, which also includes the TASS code user's manual and the TASS code validation report, and will be submitted to the regulatory body as a TASS code topical report for licensing non-LOCA safety analysis for the Westinghouse and ABB-CE type PWRs operating and under construction in Korea. (author). 42 refs., 29 tabs., 32 figs

  10. Decoding of concatenated codes with interleaved outer codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom; Thommesen, Christian

    2004-01-01

    Recently Bleichenbacher et al. proposed a decoding algorithm for interleaved (N, K) Reed-Solomon codes, which allows close to N-K errors to be corrected in many cases. We discuss the application of this decoding algorithm to concatenated codes.

  11. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    OpenAIRE

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In High Efficiency Video Coding (HEVC), the coding tree contributes to excellent compression performance; however, it also brings extremely high computational complexity. This paper presents innovative work on improving the coding tree to further reduce encoding time. A novel low-complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content ...

  12. Codes Over Hyperfields

    Directory of Open Access Journals (Sweden)

    Atamewoue Surdive

    2017-12-01

    Full Text Available In this paper, we define linear codes and cyclic codes over a finite Krasner hyperfield and we characterize these codes by their generator matrices and parity check matrices. We also demonstrate that codes over finite Krasner hyperfields are more interesting for code theory than codes over classical finite fields.

  13. Amino acid codes in mitochondria as possible clues to primitive codes

    Science.gov (United States)

    Jukes, T. H.

    1981-01-01

    Differences between mitochondrial codes and the universal code indicate that an evolutionary simplification has taken place, rather than a return to a more primitive code. However, these differences make it evident that the universal code is not the only code possible, and therefore earlier codes may have differed markedly from the present code. The present universal code is probably a 'frozen accident.' The change in CUN codons from leucine to threonine (Neurospora vs. yeast mitochondria) indicates that neutral or near-neutral changes occurred in the corresponding proteins when this code change took place, caused presumably by a mutation in a tRNA gene.

  14. Analysis of quantum error-correcting codes: Symplectic lattice codes and toric codes

    Science.gov (United States)

    Harrington, James William

    Quantum information theory is concerned with identifying how quantum mechanical resources (such as entangled quantum states) can be utilized for a number of information processing tasks, including data storage, computation, communication, and cryptography. Efficient quantum algorithms and protocols have been developed for performing some tasks (e.g. , factoring large numbers, securely communicating over a public channel, and simulating quantum mechanical systems) that appear to be very difficult with just classical resources. In addition to identifying the separation between classical and quantum computational power, much of the theoretical focus in this field over the last decade has been concerned with finding novel ways of encoding quantum information that are robust against errors, which is an important step toward building practical quantum information processing devices. In this thesis I present some results on the quantum error-correcting properties of oscillator codes (also described as symplectic lattice codes) and toric codes. Any harmonic oscillator system (such as a mode of light) can be encoded with quantum information via symplectic lattice codes that are robust against shifts in the system's continuous quantum variables. I show the existence of lattice codes whose achievable rates match the one-shot coherent information over the Gaussian quantum channel. Also, I construct a family of symplectic self-dual lattices and search for optimal encodings of quantum information distributed between several oscillators. Toric codes provide encodings of quantum information into two-dimensional spin lattices that are robust against local clusters of errors and which require only local quantum operations for error correction. Numerical simulations of this system under various error models provide a calculation of the accuracy threshold for quantum memory using toric codes, which can be related to phase transitions in certain condensed matter models. I also present

  15. Coding for dummies

    CERN Document Server

    Abraham, Nikhil

    2015-01-01

    Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code, this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skill

  16. High and low energy gamma beam dump designs for the gamma beam delivery system at ELI-NP

    International Nuclear Information System (INIS)

    Yasin, Zafar; Matei, Catalin; Ur, Calin A.; Mitu, Iani-Octavian; Udup, Emil; Petcu, Cristian

    2016-01-01

    The Extreme Light Infrastructure - Nuclear Physics (ELI-NP) is under construction in Magurele, Bucharest, Romania. The facility will use two 10 PW lasers and a high intensity, narrow bandwidth gamma beam for stand-alone and combined laser-gamma experiments. The accurate estimation of particle doses and their restriction within the limits for both personnel and the general public is very important in the design phase of any nuclear facility. In the present work, Monte Carlo simulations are performed using FLUKA and MCNPX to design the 19.4 and 4 MeV gamma beam dumps along with the shielding of the experimental areas. Dose rate contour plots from both FLUKA and MCNPX, along with numerical values of doses in experimental area E8 of the facility, are presented. The calculated doses are within the permissible limits. Furthermore, the reasonable agreement between both codes enhances our confidence in using one or both of them for future calculations in beam dump design, radiation shielding, radioactive inventory, and other calculations related to radiation protection. Residual dose rate and residual activity calculations are also performed for the high-energy beam dump, and their effect is negligible in comparison to contributions from prompt radiation.

  17. Use of borated polyethylene to improve low energy response of a prompt gamma based neutron dosimeter

    Energy Technology Data Exchange (ETDEWEB)

    Priyada, P.; Ashwini, U.; Sarkar, P.K., E-mail: pradip.sarkar@manipal.edu

    2016-05-21

    The feasibility of using a combined sample of borated polyethylene and normal polyethylene to estimate the neutron ambient dose equivalent from measured prompt gamma emissions is investigated theoretically, to demonstrate improvements in the low-energy neutron dose response compared to polyethylene alone. Monte Carlo simulations have been carried out using the FLUKA code to calculate the response of the boron, hydrogen and carbon prompt gamma emissions to monoenergetic neutrons. The weighted least-squares method is employed to arrive at the best linear combination of these responses that approximates the ICRP fluence-to-dose conversion coefficients well in the energy range of 10⁻⁸ MeV to 14 MeV. The configuration of the combined system is optimized through FLUKA simulations. The proposed method is validated theoretically with five different workplace neutron spectra with satisfactory outcome. - Highlights: • An improved method is proposed for estimating H⁎(10) using prompt gamma emissions. • A combination of BHDPE and HDPE cylinders is used as a sample. • Linear combination of prompt gamma intensities approximates ICRP-DCC closely. • Feasibility of the method was tested theoretically using workplace neutron spectra.
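
    For illustration, the weighted least-squares step described above amounts to fitting coefficients a so that R·a approximates the ICRP conversion coefficients over the energy grid. The sketch below shows that step with NumPy under assumed array shapes; the response matrix, conversion coefficients and weights are placeholders, not the paper's data.

      import numpy as np

      # Placeholder arrays: rows = neutron energy bins, columns = prompt-gamma
      # responses (e.g. boron, hydrogen, carbon); h holds the ICRP fluence-to-dose
      # conversion coefficients H*(10) on the same energy grid.
      R = np.random.rand(60, 3)            # response functions vs. energy (illustrative)
      h = np.random.rand(60)               # ICRP conversion coefficients (illustrative)
      w = np.linspace(1.0, 2.0, 60)        # per-energy-bin weights for the fit

      # Weighted least squares: minimise || W^(1/2) (R a - h) ||^2 for the coefficients a.
      sw = np.sqrt(w)
      a, *_ = np.linalg.lstsq(sw[:, None] * R, sw * h, rcond=None)

      # The dose estimate is then the same linear combination applied to the
      # measured prompt-gamma intensities.
      print("fitted coefficients:", a)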

  18. High and low energy gamma beam dump designs for the gamma beam delivery system at ELI-NP

    Energy Technology Data Exchange (ETDEWEB)

    Yasin, Zafar, E-mail: zafar.yasin@eli-np.ro; Matei, Catalin; Ur, Calin A.; Mitu, Iani-Octavian; Udup, Emil; Petcu, Cristian [Extreme Light Infrastructure - Nuclear Physics / Horia Hulubei National Institute for R&D in Physics and Nuclear Engineering, Bucharest-Magurele (Romania)]

    2016-03-25

    The Extreme Light Infrastructure - Nuclear Physics (ELI-NP) is under construction in Magurele, Bucharest, Romania. The facility will use two 10 PW lasers and a high intensity, narrow bandwidth gamma beam for stand-alone and combined laser-gamma experiments. The accurate estimation of particle doses and their restriction within the limits for both personnel and the general public is very important in the design phase of any nuclear facility. In the present work, Monte Carlo simulations are performed using FLUKA and MCNPX to design the 19.4 and 4 MeV gamma beam dumps along with the shielding of the experimental areas. Dose rate contour plots from both FLUKA and MCNPX, along with numerical values of doses in experimental area E8 of the facility, are presented. The calculated doses are within the permissible limits. Furthermore, the reasonable agreement between both codes enhances our confidence in using one or both of them for future calculations in beam dump design, radiation shielding, radioactive inventory, and other calculations related to radiation protection. Residual dose rate and residual activity calculations are also performed for the high-energy beam dump, and their effect is negligible in comparison to contributions from prompt radiation.

  19. Analytical Model for Estimating Terrestrial Cosmic Ray Fluxes Nearly Anytime and Anywhere in the World: Extension of PARMA/EXPACS.

    Directory of Open Access Journals (Sweden)

    Tatsuhiko Sato

    Full Text Available By extending our previously established model, here we present a new model called "PHITS-based Analytical Radiation Model in the Atmosphere (PARMA) version 3.0," which can instantaneously estimate terrestrial cosmic ray fluxes of neutrons, protons, ions with charge up to 28 (Ni), muons, electrons, positrons, and photons nearly anytime and anywhere in the Earth's atmosphere. The model comprises numerous analytical functions with parameters whose numerical values were fitted to reproduce the results of the extensive air shower (EAS) simulation performed by the Particle and Heavy Ion Transport code System (PHITS). The accuracy of the EAS simulation was well verified using various experimental data, while that of PARMA3.0 was confirmed by the high R² values of the fit. The models to be used for estimating radiation doses due to cosmic ray exposure, cosmic ray induced ionization rates, and count rates of neutron monitors were validated by investigating their capability to reproduce those quantities measured under various conditions. PARMA3.0 is available freely and is easy to use, as implemented in an open-access software program, the EXcel-based Program for Calculating Atmospheric Cosmic ray Spectrum (EXPACS). Because of these features, the new version of PARMA/EXPACS can be an important tool in various research fields such as geosciences, cosmic ray physics, and radiation research.

  20. Detecting non-coding selective pressure in coding regions

    Directory of Open Access Journals (Sweden)

    Blanchette Mathieu

    2007-02-01

    Full Text Available Abstract Background Comparative genomics approaches, where orthologous DNA regions are compared and inter-species conserved regions are identified, have proven extremely powerful for identifying non-coding regulatory regions located in intergenic or intronic regions. However, non-coding functional elements can also be located within coding regions, as is common for exonic splicing enhancers, some transcription factor binding sites, and RNA secondary structure elements affecting mRNA stability, localization, or translation. Since these functional elements are located in regions that are themselves highly conserved because they are coding for a protein, they have generally escaped detection by comparative genomics approaches. Results We introduce a comparative genomics approach for detecting non-coding functional elements located within coding regions. Codon evolution is modeled as a mixture of codon substitution models, where each component of the mixture describes the evolution of codons under a specific type of coding selective pressure. We show how to compute the posterior distribution of the entropy and parsimony scores under this null model of codon evolution. The method is applied to a set of growth hormone 1 orthologous mRNA sequences and a known exonic splicing enhancer is detected. The analysis of a set of CORTBP2 orthologous genes reveals a region of several hundred base pairs under strong non-coding selective pressure whose function remains unknown. Conclusion Non-coding functional elements, in particular those involved in post-transcriptional regulation, are likely to be much more prevalent than is currently known. With the numerous genome sequencing projects underway, comparative genomics approaches like that proposed here are likely to become increasingly powerful at detecting such elements.

  1. Poster - 40: Treatment Verification of a 3D-printed Eye Phantom for Proton Therapy

    International Nuclear Information System (INIS)

    Dunning, Chelsea; Lindsay, Clay; Unick, Nick; Sossi, Vesna; Martinez, Mark; Hoehr, Cornelia

    2016-01-01

    Purpose: Ocular melanoma is a form of eye cancer which is often treated using proton therapy. The benefit of the steep proton dose gradient can only be leveraged with accurate patient eye alignment. A treatment-planning program was written to plan on a 3D-printed anatomical eye-phantom, which was then irradiated to demonstrate the feasibility of verifying in vivo dosimetry for proton therapy using PET imaging. Methods: A 3D CAD eye model with critical organs was designed and voxelized into the Monte-Carlo transport code FLUKA. Proton dose and PET isotope production were simulated for a treatment plan of a test tumour, generated by a 2D treatment-planning program developed using NumPy and proton range tables. Next, a plastic eye-phantom was 3D-printed from the CAD model, irradiated at the TRIUMF Proton Therapy facility, and imaged using a PET scanner. Results: The treatment-planning program prediction of the range setting and modulator wheel was verified in FLUKA to treat the tumour with at least 90% dose coverage for both tissue and plastic. An axial isotope distribution of the PET isotopes was simulated in FLUKA and converted to PET scan counts. Meanwhile, the 3D-printed eye-phantom successfully yielded a PET signal. Conclusions: The 2D treatment-planning program can predict required parameters to sufficiently treat an eye tumour, which was experimentally verified using commercial 3D-printing hardware to manufacture eye-phantoms. Comparison between the simulated and measured PET isotope distribution could provide a more realistic test of eye alignment, and a variation of the method using radiographic film is being developed.

  2. Poster - 40: Treatment Verification of a 3D-printed Eye Phantom for Proton Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, Chelsea; Lindsay, Clay; Unick, Nick; Sossi, Vesna; Martinez, Mark; Hoehr, Cornelia [University of British Columbia, University of Victoria, University of British Columbia, University of British Columbia, University of British Columbia, TRIUMF (Canada)

    2016-08-15

    Purpose: Ocular melanoma is a form of eye cancer which is often treated using proton therapy. The benefit of the steep proton dose gradient can only be leveraged with accurate patient eye alignment. A treatment-planning program was written to plan on a 3D-printed anatomical eye-phantom, which was then irradiated to demonstrate the feasibility of verifying in vivo dosimetry for proton therapy using PET imaging. Methods: A 3D CAD eye model with critical organs was designed and voxelized into the Monte-Carlo transport code FLUKA. Proton dose and PET isotope production were simulated for a treatment plan of a test tumour, generated by a 2D treatment-planning program developed using NumPy and proton range tables. Next, a plastic eye-phantom was 3D-printed from the CAD model, irradiated at the TRIUMF Proton Therapy facility, and imaged using a PET scanner. Results: The treatment-planning program prediction of the range setting and modulator wheel was verified in FLUKA to treat the tumour with at least 90% dose coverage for both tissue and plastic. An axial isotope distribution of the PET isotopes was simulated in FLUKA and converted to PET scan counts. Meanwhile, the 3D-printed eye-phantom successfully yielded a PET signal. Conclusions: The 2D treatment-planning program can predict required parameters to sufficiently treat an eye tumour, which was experimentally verified using commercial 3D-printing hardware to manufacture eye-phantoms. Comparison between the simulated and measured PET isotope distribution could provide a more realistic test of eye alignment, and a variation of the method using radiographic film is being developed.

  3. Dynamic Shannon Coding

    OpenAIRE

    Gagie, Travis

    2005-01-01

    We present a new algorithm for dynamic prefix-free coding, based on Shannon coding. We give a simple analysis and prove a better upper bound on the length of the encoding produced than the corresponding bound for dynamic Huffman coding. We show how our algorithm can be modified for efficient length-restricted coding, alphabetic coding and coding with unequal letter costs.
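
    For background, classical (static) Shannon coding assigns each symbol a prefix-free codeword of length ⌈log₂(1/p)⌉ read off the binary expansion of the cumulative probability. The sketch below implements only that static scheme; it is not the dynamic algorithm of the paper.

      import math

      def shannon_code(probs):
          """Static Shannon code: codeword lengths ceil(log2(1/p)), prefix-free by construction."""
          symbols = sorted(probs, key=probs.get, reverse=True)
          code, cum = {}, 0.0
          for s in symbols:
              p = probs[s]
              length = math.ceil(-math.log2(p))
              # Take the first `length` bits of the binary expansion of the cumulative probability.
              frac, bits = cum, []
              for _ in range(length):
                  frac *= 2
                  bit = int(frac)
                  bits.append(str(bit))
                  frac -= bit
              code[s] = "".join(bits)
              cum += p
          return code

      print(shannon_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))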

  4. The correspondence between projective codes and 2-weight codes

    NARCIS (Netherlands)

    Brouwer, A.E.; Eupen, van M.J.M.; Tilborg, van H.C.A.; Willems, F.M.J.

    1994-01-01

    The hyperplanes intersecting a 2-weight code in the same number of points obviously form the point set of a projective code. On the other hand, if we have a projective code C, then we can make a 2-weight code by taking the multiset of points ∈ PC with multiplicity γ(w), where w is the weight of

  5. Investigation and quality assurance of numerical methods in radiation protection and dosimetry

    International Nuclear Information System (INIS)

    Chartier, J.L.

    2000-01-01

    This report is primarily intended as a non-exhaustive overview of, and a pointer to, some of the major Monte Carlo and deterministic codes used in radiation transport in general and in radiation protection and dosimetry in particular, with an extended bibliography for those codes. These include MCNP, EGS, LAHET, FLUKA, MARS, MCBEND, TRIPOLI, SCALE and others. Some deterministic codes such as ANISN, TORT, EVENT, etc. will also be described in some detail, as will be, although briefly, BEAM, PEREGRINE and rtt_MC, which are used in medical physics applications. The order in which the codes are described and the amount of space dedicated to each of them were largely dictated by the time at which the sections were written and by their authorship. In this challenging and ambitious exercise, wherever possible (and it has not been easy), the involvement and help of the authors or main developers and users of the codes were sought, at least through their regularly updated web sites. This work, however, stopped short of being either a strict inter-comparison or a benchmarking exercise

  6. Quality Improvement of MARS Code and Establishment of Code Coupling

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Kim, Kyung Doo

    2010-04-01

    The improvement of MARS code quality and its coupling with the regulatory auditing code have been accomplished for the establishment of a self-reliant, technology-based regulatory auditing system. The unified auditing system code was also realized by implementing the CANDU-specific models and correlations. As a part of the quality assurance activities, various QA reports were published through the code assessments. The code manuals were updated, and a new manual describing the new models and correlations was published. The code coupling methods were verified through the exercise of plant applications. The education-training seminar and technology transfer were performed for the code users. The developed MARS-KS is utilized as a reliable auditing tool for resolving safety issues and for other regulatory calculations. The code can be utilized as a base technology for GEN IV reactor applications

  7. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    Science.gov (United States)

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

    A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes, derived from multiple component codes. We then show that several recently proposed classes of LDPC codes such as convolutional and spatially-coupled codes can be described using the concept of GLDPC coding, which indicates that the GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaption, to adjust the error correction strength depending on the optical channel conditions.

  8. Vector Network Coding

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L X L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector co...
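
    For illustration, the node operation described above, multiplying each incoming length-L packet by an L x L coding matrix and summing the results, can be sketched with NumPy over GF(2); the packet length, matrices and field choice here are illustrative assumptions, not the paper's construction.

      import numpy as np

      L = 4  # packet (vector) length

      def combine(packets, matrices):
          """Output of a vector-network-coding node: sum of M_i applied to p_i, over GF(2)."""
          out = np.zeros(L, dtype=np.uint8)
          for p, M in zip(packets, matrices):
              out = (out + M.dot(p)) % 2
          return out

      rng = np.random.default_rng(0)
      p1 = rng.integers(0, 2, L, dtype=np.uint8)        # incoming packet 1
      p2 = rng.integers(0, 2, L, dtype=np.uint8)        # incoming packet 2
      M1 = rng.integers(0, 2, (L, L), dtype=np.uint8)   # coding matrix for packet 1
      M2 = rng.integers(0, 2, (L, L), dtype=np.uint8)   # coding matrix for packet 2
      print(combine([p1, p2], [M1, M2]))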

  9. Rateless feedback codes

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Koike-Akino, Toshiaki; Orlik, Philip

    2012-01-01

    This paper proposes a concept called rateless feedback coding. We redesign the existing LT and Raptor codes, by introducing new degree distributions for the case when a few feedback opportunities are available. We show that incorporating feedback to LT codes can significantly decrease both the coding overhead and the encoding/decoding complexity. Moreover, we show that, at the price of a slight increase in the coding overhead, linear complexity is achieved with Raptor feedback coding.

  10. New quantum codes derived from a family of antiprimitive BCH codes

    Science.gov (United States)

    Liu, Yang; Li, Ruihu; Lü, Liangdong; Guo, Luobin

    The Bose-Chaudhuri-Hocquenghem (BCH) codes have been studied for more than 57 years and have found wide application in classical communication systems and quantum information theory. In this paper, we study the construction of quantum codes from a family of q²-ary BCH codes with length n = q^(2m) + 1 (also called antiprimitive BCH codes in the literature), where q ≥ 4 is a power of 2 and m ≥ 2. By a detailed analysis of some useful properties of q²-ary cyclotomic cosets modulo n, Hermitian dual-containing conditions for a family of non-narrow-sense antiprimitive BCH codes are presented, which are similar to those of q²-ary primitive BCH codes. Consequently, via the Hermitian Construction, a family of new quantum codes can be derived from these dual-containing BCH codes. Some of these new antiprimitive quantum BCH codes are comparable with those derived from primitive BCH codes.

  11. Surface acoustic wave coding for orthogonal frequency coded devices

    Science.gov (United States)

    Malocha, Donald (Inventor); Kozlovski, Nikolai (Inventor)

    2011-01-01

    Methods and systems for coding SAW OFC devices to mitigate code collisions in a wireless multi-tag system. Each device produces plural stepped frequencies as an OFC signal with a chip offset delay to increase code diversity. A method for assigning a different OFC to each device includes using a matrix based on the number of OFCs needed and the number of chips per code, populating each matrix cell with an OFC chip, and assigning the codes from the matrix to the devices. The asynchronous passive multi-tag system includes plural surface acoustic wave devices each producing a different OFC signal having the same number of chips and including a chip offset time delay, an algorithm for assigning OFCs to each device, and a transceiver to transmit an interrogation signal and receive OFC signals in response with minimal code collisions during transmission.

  12. Separate Turbo Code and Single Turbo Code Adaptive OFDM Transmissions

    Directory of Open Access Journals (Sweden)

    Burr Alister

    2009-01-01

    Full Text Available Abstract This paper discusses the application of adaptive modulation and adaptive rate turbo coding to orthogonal frequency-division multiplexing (OFDM), to increase throughput on the time and frequency selective channel. The adaptive turbo code scheme is based on a subband adaptive method, and compares two adaptive systems: a conventional approach where a separate turbo code is used for each subband, and a single turbo code adaptive system which uses a single turbo code over all subbands. Five modulation schemes (BPSK, QPSK, 8AMPM, 16QAM, and 64QAM) are employed and the turbo code rates considered are 1/2 and 1/3. The performances of both systems with high (10⁻²) and low (10⁻⁴) BER targets are compared. Simulation results for throughput and BER show that the single turbo code adaptive system provides a significant improvement.

  13. Codes and curves

    CERN Document Server

    Walker, Judy L

    2000-01-01

    When information is transmitted, errors are likely to occur. Coding theory examines efficient ways of packaging data so that these errors can be detected, or even corrected. The traditional tools of coding theory have come from combinatorics and group theory. Lately, however, coding theorists have added techniques from algebraic geometry to their toolboxes. In particular, by re-interpreting the Reed-Solomon codes, one can see how to define new codes based on divisors on algebraic curves. For instance, using modular curves over finite fields, Tsfasman, Vladut, and Zink showed that one can define a sequence of codes with asymptotically better parameters than any previously known codes. This monograph is based on a series of lectures the author gave as part of the IAS/PCMI program on arithmetic algebraic geometry. Here, the reader is introduced to the exciting field of algebraic geometric coding theory. Presenting the material in the same conversational tone of the lectures, the author covers linear codes, inclu...

  14. Separate Turbo Code and Single Turbo Code Adaptive OFDM Transmissions

    Directory of Open Access Journals (Sweden)

    Lei Ye

    2009-01-01

    Full Text Available This paper discusses the application of adaptive modulation and adaptive rate turbo coding to orthogonal frequency-division multiplexing (OFDM), to increase throughput on the time and frequency selective channel. The adaptive turbo code scheme is based on a subband adaptive method, and compares two adaptive systems: a conventional approach where a separate turbo code is used for each subband, and a single turbo code adaptive system which uses a single turbo code over all subbands. Five modulation schemes (BPSK, QPSK, 8AMPM, 16QAM, and 64QAM) are employed and the turbo code rates considered are 1/2 and 1/3. The performances of both systems with high (10⁻²) and low (10⁻⁴) BER targets are compared. Simulation results for throughput and BER show that the single turbo code adaptive system provides a significant improvement.

  15. Quantum Codes From Cyclic Codes Over The Ring R₂

    International Nuclear Information System (INIS)

    Altinel, Alev; Güzeltepe, Murat

    2016-01-01

    Let R₂ denote the ring F₂ + μF₂ + υF₂ + μυF₂ + wF₂ + μwF₂ + υwF₂ + μυwF₂. In this study, we construct quantum codes from cyclic codes over the ring R₂, for arbitrary length n, with the restrictions μ² = 0, υ² = 0, w² = 0, μυ = υμ, μw = wμ, υw = wυ and μ(υw) = (μυ)w. Also, we give a necessary and sufficient condition for cyclic codes over R₂ to contain their duals. As a final point, we obtain the parameters of quantum error-correcting codes from cyclic codes over R₂, and we give an example of quantum error-correcting codes from cyclic codes over R₂. (paper)

  16. A New Prime Code for Synchronous Optical Code Division Multiple-Access Networks

    Directory of Open Access Journals (Sweden)

    Huda Saleh Abbas

    2018-01-01

    Full Text Available A new spreading code based on a prime code for synchronous optical code-division multiple-access networks that can be used in monitoring applications has been proposed. The new code is referred to as "extended grouped new modified prime code." This new code has the ability to support more terminal devices than other prime codes. In addition, it patches subsequences with "0s", leading to lower power consumption. The proposed code has an improved cross-correlation resulting in enhanced BER performance. The code construction and parameters are provided. The operating performance, using incoherent on-off keying modulation and incoherent pulse position modulation systems, has been analyzed. The performance of the code was compared with other prime codes. The results demonstrate an improved performance, and a BER floor of 10⁻⁹ was achieved.

  17. Understanding Mixed Code and Classroom Code-Switching: Myths and Realities

    Science.gov (United States)

    Li, David C. S.

    2008-01-01

    Background: Cantonese-English mixed code is ubiquitous in Hong Kong society, and yet using mixed code is widely perceived as improper. This paper presents evidence of mixed code being socially constructed as bad language behavior. In the education domain, an EDB guideline bans mixed code in the classroom. Teachers are encouraged to stick to…

  18. Development of a coupled code system based on system transient code, RETRAN, and 3-D neutronics code, MASTER

    International Nuclear Information System (INIS)

    Kim, K. D.; Jung, J. J.; Lee, S. W.; Cho, B. O.; Ji, S. K.; Kim, Y. H.; Seong, C. K.

    2002-01-01

    A coupled code system, RETRAN/MASTER, has been developed for best-estimate simulations of interactions between reactor core neutron kinetics and plant thermal-hydraulics by incorporating the 3-D reactor core kinetics analysis code MASTER into the system transient code RETRAN. The soundness of the consolidated code system is confirmed by simulating the MSLB benchmark problem developed by OECD/NEA to verify the performance of coupled kinetics and system transient codes

  19. QR Codes 101

    Science.gov (United States)

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  20. Generic radiation safety design for SSRL synchrotron radiation beamlines

    Energy Technology Data Exchange (ETDEWEB)

    Liu, James C. [Radiation Protection Department, Stanford Linear Accelerator Center (SLAC), MS 48, P.O. Box 20450, Stanford, CA 94309 (United States)]. E-mail: james@slac.stanford.edu; Fasso, Alberto [Radiation Protection Department, Stanford Linear Accelerator Center (SLAC), MS 48, P.O. Box 20450, Stanford, CA 94309 (United States); Khater, Hesham [Radiation Protection Department, Stanford Linear Accelerator Center (SLAC), MS 48, P.O. Box 20450, Stanford, CA 94309 (United States); Prinz, Alyssa [Radiation Protection Department, Stanford Linear Accelerator Center (SLAC), MS 48, P.O. Box 20450, Stanford, CA 94309 (United States); Rokni, Sayed [Radiation Protection Department, Stanford Linear Accelerator Center (SLAC), MS 48, P.O. Box 20450, Stanford, CA 94309 (United States)

    2006-12-15

    To allow for a conservative, simple, uniform, consistent, efficient radiation safety design for all SSRL beamlines, a generic approach has been developed, considering both synchrotron radiation (SR) and gas bremsstrahlung (GB) hazards. To develop the methodology and rules needed for generic beamline design, analytic models, the STAC8 code, and the FLUKA Monte Carlo code were used to pre-calculate sets of curves and tables that can be looked up for each beamline safety design. Conservative beam parameters and standard targets and geometries were used in the calculations. This paper presents the SPEAR3 beamline parameters that were considered in the design, the safety design considerations, and the main pre-calculated results that are needed for generic shielding design. In the end, the rules and practices for generic SSRL beamline design are summarized.

  1. Some Families of Asymmetric Quantum MDS Codes Constructed from Constacyclic Codes

    Science.gov (United States)

    Huang, Yuanyuan; Chen, Jianzhang; Feng, Chunhui; Chen, Riqing

    2018-02-01

    Quantum maximal-distance-separable (MDS) codes that satisfy quantum Singleton bound with different lengths have been constructed by some researchers. In this paper, seven families of asymmetric quantum MDS codes are constructed by using constacyclic codes. We weaken the case of Hermitian-dual containing codes that can be applied to construct asymmetric quantum MDS codes with parameters [[n,k,dz/dx

  2. Theoretical Atomic Physics code development II: ACE: Another collisional excitation code

    International Nuclear Information System (INIS)

    Clark, R.E.H.; Abdallah, J. Jr.; Csanak, G.; Mann, J.B.; Cowan, R.D.

    1988-12-01

    A new computer code for calculating collisional excitation data (collision strengths or cross sections) using a variety of models is described. The code uses data generated by the Cowan Atomic Structure code or CATS for the atomic structure. Collisional data are placed on a random access file and can be displayed in a variety of formats using the Theoretical Atomic Physics Code or TAPS. All of these codes are part of the Theoretical Atomic Physics code development effort at Los Alamos. 15 refs., 10 figs., 1 tab

  3. Multiple LDPC decoding for distributed source coding and video coding

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Luong, Huynh Van; Huang, Xin

    2011-01-01

    Distributed source coding (DSC) is a coding paradigm for systems which fully or partly exploit the source statistics at the decoder to reduce the computational burden at the encoder. Distributed video coding (DVC) is one example. This paper considers the use of Low Density Parity Check Accumulate (LDPCA) codes in a DSC scheme with feedback. To improve the LDPC coding performance in the context of DSC and DVC, while retaining short encoder blocks, this paper proposes multiple parallel LDPC decoding. The proposed scheme passes soft information between decoders to enhance performance. Experimental...

  4. An open, interoperable, and scalable prehospital information technology network architecture.

    Science.gov (United States)

    Landman, Adam B; Rokos, Ivan C; Burns, Kevin; Van Gelder, Carin M; Fisher, Roger M; Dunford, James V; Cone, David C; Bogucki, Sandy

    2011-01-01

    Some of the most intractable challenges in prehospital medicine include response time optimization, inefficiencies at the emergency medical services (EMS)-emergency department (ED) interface, and the ability to correlate field interventions with patient outcomes. Information technology (IT) can address these and other concerns by ensuring that system and patient information is received when and where it is needed, is fully integrated with prior and subsequent patient information, and is securely archived. Some EMS agencies have begun adopting information technologies, such as wireless transmission of 12-lead electrocardiograms, but few agencies have developed a comprehensive plan for management of their prehospital information and integration with other electronic medical records. This perspective article highlights the challenges and limitations of integrating IT elements without a strategic plan, and proposes an open, interoperable, and scalable prehospital information technology (PHIT) architecture. The two core components of this PHIT architecture are 1) routers with broadband network connectivity to share data between ambulance devices and EMS system information services and 2) an electronic patient care report to organize and archive all electronic prehospital data. To successfully implement this comprehensive PHIT architecture, data and technology requirements must be based on best available evidence, and the system must adhere to health data standards as well as privacy and security regulations. Recent federal legislation prioritizing health information technology may position federal agencies to help design and fund PHIT architectures.

  5. Preliminary Modelling of Radiation Levels at the Fermilab PIP-II Linac

    Energy Technology Data Exchange (ETDEWEB)

    Lari, L. [CERN; Cerutti, F. [CERN; Esposito, L. S. [CERN; Baffes, C. [Fermilab; Dixon, S. J. [Fermilab; Mokhov, N. V. [Fermilab; Rakhno, I. [Fermilab; Tropin, I. S. [Fermilab

    2018-04-01

    PIP-II is Fermilab's flagship project for providing powerful, high-intensity proton beams to the laboratory's experiments. The heart of PIP-II is an 800-MeV superconducting linac accelerator. It will be located in a new tunnel with new service buildings and connected to the present Booster through a new transfer line. To support the design of civil engineering and mechanical integration, this paper provides preliminary estimates of the radiation levels in the gallery at an operational beam loss limit of 0.1 W/m, by means of Monte Carlo calculations with the FLUKA and MARS15 codes.

  6. LHCb: Evaluation of the Radiation Environment of the LHCb Experiment

    CERN Multimedia

    Karacson, M

    2011-01-01

    The characterization of all aspects of the radiation field of the LHCb experiment is needed to understand the impact of the unprecedented radiation levels to which its detector and electronics are exposed. The methodology for how this is done is described. Analysis of the measurements of active and passive sensors of various types, which are distributed in and around the detector, will be carried out. Appropriate cross calibrations will be applied and comparisons between them will be performed. Critical comparisons with simulation results obtained with the FLUKA Monte Carlo code are also an essential element of the study.

  7. The longitudinal development of showers induced by high-energy hadrons in an iron-sampling calorimeter

    CERN Document Server

    Milke, J; Apel, W D; Badea, F; Bekk, K; Bercuci, A; Bertaina, M; Blümer, H; Bozdog, H; Büttner, C; Chiavassa, A; Daumiller, K; Di Pierro, F; Dolla, P; Engel, R; Engler, J; Fessler, F; Ghia, P L; Gils, H J; Glasstetter, R; Haungs, A; Heck, D; Hörandel, J R; Kampert, K H; Klages, H O; Kolotaev, Yu; Maier, G; Mathes, H J; Mayer, H J; Mitrica, B; Morello, C; Müller, M; Navarra, G; Obenland, R; Oehlschläger, J; Ostapchenko, S; Over, S; Petcu, M; Plewnia, S; Rebel, H; Risse, A; Roth, M; Schieler, H; Scholz, J; Stümpert, M; Thouw, T; Toma, G; Trinchero, G C; Ulrich, H; Valchierotti, S; Van Buren, J; Walkowiak, W; Weindl, A; Wochele, J; Zabierowski, J; Zagromski, S; Zimmermann, D

    2005-01-01

    Occasionally cosmic-ray induced air showers result in single, unaccompanied hadrons at ground level. Such events are investigated with the 300 m² hadron calorimeter of the KASCADE-Grande experiment. It is an iron sampling calorimeter with a depth of 11 hadronic interaction lengths, read out by warm-liquid ionization chambers. The longitudinal shower development is discussed as a function of energy up to 30 TeV and the results are compared with simulations using the GEANT/FLUKA code. In addition, results of test measurements at a secondary particle beam of the Super Proton Synchrotron at CERN up to 350 GeV are discussed.

  8. Simulation of Resistive Plate Chamber sensitivity to neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Altieri, S. E-mail: saverio.altieri@pv.infn.it; Belli, G.; Bruno, G.; Merlo, M.; Ratti, S.P.; Riccardi, C.; Torre, P.; Vitulo, P.; Abbrescia, M.; Colaleo, A.; Iaselli, G.; Loddo, F.; Maggi, M.; Marangelli, B.; Natali, S.; Nuzzo, S.; Pugliese, G.; Ranieri, A.; Romano, F

    2001-04-01

    The sensitivity of Resistive Plate Chambers (RPCs) to neutrons has been simulated using the GEANT code with MICAP and FLUKA interfaces. The calculations have been performed as a function of the neutron energy in the range 0.02 eV-1 GeV. To evaluate the response of the detector in the LHC background environment, the neutron energy spectrum expected in the CMS muon barrel has been taken into account; a hit rate due to neutrons of about 0.6 Hz cm⁻² has been estimated for a 250×250 cm² RPC in the RB1 station.

  9. Preliminary Modeling Of Radiation Levels At The Fermilab PIP-II Linac arXiv

    CERN Document Server

    Lari, L.; Esposito, L.S.; Baffes, C.; Dixon, S.J.; Mokhov, N.V.; Rakhno, I.; Tropin, I.S.

    PIP-II is Fermilab's flagship project for providing powerful, high-intensity proton beams to the laboratory's experiments. The heart of PIP-II is an 800-MeV superconducting linac. It will be located in a new tunnel with new service buildings and connected to the present Booster through a new transfer line. To support the design of civil engineering and mechanical integration, this paper provides a preliminary estimate of the radiation levels in the gallery at an operational beam loss limit of 0.1 W/m, by means of Monte Carlo calculations with the FLUKA and MARS15 codes.

  10. Estimate of production of medical isotopes by photo-neutron reaction at the Canadian Light Source

    Science.gov (United States)

    Szpunar, B.; Rangacharyulu, C.; Daté, S.; Ejiri, H.

    2013-11-01

    In contrast to conventional bremsstrahlung photon beam sources, laser backscatter photon sources at electron synchrotrons provide the capability to selectively tune photons to energies of interest. This feature, coupled with the ubiquitous giant dipole resonance excitations of atomic nuclei, promises a fertile method of nuclear isotope production. In this article, we present the results of simulations of the production of the medical/industrial isotopes 196Au, 192Ir and 99Mo by (γ,n) reactions. We employ the FLUKA Monte Carlo code along with the simulated photon flux for a beamline at the Canadian Light Source, in conjunction with a CO2 laser system.

  11. Error-correction coding and decoding bounds, codes, decoders, analysis and applications

    CERN Document Server

    Tomlinson, Martin; Ambroze, Marcel A; Ahmed, Mohammed; Jibril, Mubarak

    2017-01-01

    This book discusses both the theory and practical applications of self-correcting data, commonly known as error-correcting codes. The applications included demonstrate the importance of these codes in a wide range of everyday technologies, from smartphones to secure communications and transactions. Written in a readily understandable style, the book presents the authors’ twenty-five years of research organized into five parts: Part I is concerned with the theoretical performance attainable by using error correcting codes to achieve communications efficiency in digital communications systems. Part II explores the construction of error-correcting codes and explains the different families of codes and how they are designed. Techniques are described for producing the very best codes. Part III addresses the analysis of low-density parity-check (LDPC) codes, primarily to calculate their stopping sets and low-weight codeword spectrum which determines the performance of these codes. Part IV deals with decoders desi...

  12. Error floor behavior study of LDPC codes for concatenated codes design

    Science.gov (United States)

    Chen, Weigang; Yin, Liuguo; Lu, Jianhua

    2007-11-01

    Error floor behavior of low-density parity-check (LDPC) codes using quantized decoding algorithms is statistically studied with experimental results on a hardware evaluation platform. The results present the distribution of the residual errors after decoding failure and reveal that the number of residual error bits in a codeword is usually very small when using the quantized sum-product (SP) algorithm. Therefore, an LDPC code may serve as the inner code in a concatenated coding system with a high-code-rate outer code, and thus an ultra-low error floor can be achieved. This conclusion is also verified by the experimental results.

  13. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    1999-01-01

    The main goal of the present lecture is to show how transport lattice calculations are realised in a standard computer code. This is illustrated on the example of the WIMSD code, one of the most popular tools for reactor calculations. Most of the approaches discussed here can be easily modified to any other lattice code. The description of the code assumes basic knowledge of the reactor lattice, on the level given in the lecture on 'Reactor lattice transport calculations'. For a more advanced explanation of the WIMSD code the reader is directed to the detailed descriptions of the code cited in the References. The discussion of the methods and models included in the code is followed by the generally used homogenisation procedure and several numerical examples of discrepancies in calculated multiplication factors based on different sources of library data. (author)

  14. CodeArmor : Virtualizing the Code Space to Counter Disclosure Attacks

    NARCIS (Netherlands)

    Chen, Xi; Bos, Herbert; Giuffrida, Cristiano

    2017-01-01

    Code diversification is an effective strategy to prevent modern code-reuse exploits. Unfortunately, diversification techniques are inherently vulnerable to information disclosure. Recent diversification-aware ROP exploits have demonstrated that code disclosure attacks are a realistic threat, with an

  15. Ion irradiation studies of construction materials for high-power accelerators

    Science.gov (United States)

    Mustafin, E.; Seidl, T.; Plotnikov, A.; Strašík, I.; Pavlović, M.; Miglierini, M.; Stanćek, S.; Fertman, A.; Lanćok, A.

    The paper reviews the activities and reports the current results of GSI-INTAS projects that are dealing with investigations of construction materials for high-power accelerators and their components. Three types of materials have been investigated, namely metals (stainless steel and copper), metallic glasses (Nanoperm, Finemet and Vitrovac) and organic materials (polyimide insulators and glass fiber reinforced plastics/GFRP). The materials were irradiated by different ion beams with various fluences and energies. The influence of radiation on selected physical properties of these materials has been investigated with the aid of gamma-ray spectroscopy, transmission Mössbauer spectroscopy (TMS), conversion electron Mössbauer spectroscopy (CEMS), optical spectroscopy (IR and UV/VIS) and other analytical methods. Some experiments were accompanied by computer simulations with the FLUKA, SHIELD and SRIM codes. Validity of the codes was verified by comparison of the simulation results with experiments. After the validation, the codes were used to complete the data that could not be obtained experimentally.

  16. Energy spectrum of 208Pb(n,x) reactions

    Science.gov (United States)

    Tel, E.; Kavun, Y.; Özdoǧan, H.; Kaplan, A.

    2018-02-01

    Fission and fusion reactor technologies have been investigated worldwide since the 1950s. Studies of fission and fusion reactions play an important role in developing new-generation reactor technologies, and neutron reaction studies in particular have an important place in the development of nuclear materials. Neutron effects on materials should therefore be studied both theoretically and experimentally to improve reactor design. Nuclear reaction codes are very useful tools when experimental data are unavailable, and many such codes have been developed, including ALICE/ASH, CEM95, PCROSS, TALYS, GEANT and FLUKA. In this study, the ALICE/ASH, PCROSS and CEM95 codes were used to calculate the energy spectra of particles emitted when Pb is bombarded by neutrons. The Weisskopf-Ewing model was used for the equilibrium process, while the full exciton, hybrid and geometry-dependent hybrid nuclear reaction models were used for the pre-equilibrium process. The calculated results are discussed and compared with experimental data taken from EXFOR.

  17. Monte-Carlo simulations of neutron shielding for the ATLAS forward region

    CERN Document Server

    Stekl, I; Kovalenko, V E; Vorobel, V; Leroy, C; Piquemal, F; Eschbach, R; Marquet, C

    2000-01-01

    The effectiveness of different types of neutron shielding for the ATLAS forward region has been studied by means of Monte-Carlo simulations and compared with the results of an experiment performed at the CERN PS. The simulation code is based on GEANT, FLUKA, MICAP and GAMLIB. GAMLIB is a new library including processes with gamma-rays produced in (n, gamma) and (n, n'gamma) neutron reactions and is interfaced to the MICAP code. The effectiveness of different types of shielding against neutrons and gamma-rays, composed of different materials such as pure polyethylene, borated polyethylene, lithium-filled polyethylene, lead and iron, was compared. The results from Monte-Carlo simulations were compared to the results obtained from the experiment. The simulation results reproduce the experimental data well. This agreement supports the correctness of the simulation code used to describe the generation, spreading and absorption of neutrons (up to thermal energies) and gamma-rays in the shielding materials....

  18. An Evaluation of Automated Code Generation with the PetriCode Approach

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Automated code generation is an important element of model driven development methodologies. We have previously proposed an approach for code generation based on Coloured Petri Net models annotated with textual pragmatics for the network protocol domain. In this paper, we present and evaluate three important properties of our approach: platform independence, code integratability, and code readability. The evaluation shows that our approach can generate code for a wide range of platforms which is integratable and readable.

  19. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV

  20. Cracking the code: the accuracy of coding shoulder procedures and the repercussions.

    Science.gov (United States)

    Clement, N D; Murray, I R; Nie, Y X; McBirnie, J M

    2013-05-01

    Coding of patients' diagnoses and surgical procedures is subject to error levels of up to 40%, with consequences for the distribution of resources and financial recompense. Our aim was to explore and address the reasons behind coding errors for shoulder diagnoses and surgical procedures and to evaluate a potential solution. A retrospective review of 100 patients who had undergone surgery was carried out. Coding errors were identified and the reasons explored. A coding proforma was designed to address these errors and was prospectively evaluated for 100 patients. The financial implications were also considered. Retrospective analysis revealed that the correct primary diagnosis was assigned in 54 patients (54%), and only 7 (7%) patients had a correct procedure code assigned. Coders identified indistinct clinical notes and poor clarity of procedure codes as reasons for errors. The proforma was significantly more likely to assign the correct diagnosis (odds ratio 18.2) and the correct procedure code (odds ratio 310.0). High error levels for coding are due to misinterpretation of notes and ambiguity of procedure codes. This can be addressed by allowing surgeons to assign the diagnosis and procedure using a simplified list that is passed directly to coding.

  1. Dose point kernel simulation for monoenergetic electrons and radionuclides using Monte Carlo techniques.

    Science.gov (United States)

    Wu, J; Liu, Y L; Chang, S J; Chao, M M; Tsai, S Y; Huang, D E

    2012-11-01

    Monte Carlo (MC) simulation has been commonly used in the dose evaluation of radiation accidents and for medical purposes. The accuracy of simulated results is affected by the particle-tracking algorithm, cross-section database, random number generator and statistical error. The differences among MC simulation software packages must be validated. This study simulated the dose point kernel (DPK) and the cellular S-values of monoenergetic electrons ranging from 0.01 to 2 MeV and of the radionuclides 90Y, 177Lu and 103mRh, using Fluktuierende Kaskade (FLUKA) and the Monte Carlo N-Particle Transport Code Version 5 (MCNP5). A 6-μm-radius cell model consisting of the cell surface, cytoplasm and cell nucleus was constructed for cellular S-value calculation. The mean absolute percentage errors (MAPEs) of the scaled DPKs, simulated using FLUKA and MCNP5, were 7.92, 9.64, 4.62, 3.71 and 3.84% for 0.01, 0.1, 0.5, 1 and 2 MeV, respectively. For the three radionuclides, the MAPEs of the scaled DPKs were within 5%. The maximum deviations of S(N←N), S(N←Cy) and S(N←CS) for electron energies larger than 10 keV were 6.63, 6.77 and 5.24%, respectively. The deviations for the self-absorbed S-values and cross-dose S-values of the three radionuclides were within 4%. On the basis of the results of this study, it was concluded that the simulation results are consistent between FLUKA and MCNP5. However, there is a minor inconsistency in the low-energy range. The DPK and the cellular S-value should be used as quality assurance tools before MC simulation results are adopted as the gold standard.
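
    The comparison metric used above, the mean absolute percentage error between scaled DPKs from two codes, is simple to reproduce. The following minimal sketch is hedged: the radial bins and kernel values are hypothetical placeholders, not data from the paper.

```python
import numpy as np

def mape(reference, test):
    """Mean absolute percentage error (%) between two dose point kernel arrays.

    Both arrays are assumed to be sampled on the same radial bins and the
    reference is assumed to contain no zeros (otherwise the ratio diverges).
    """
    reference = np.asarray(reference, dtype=float)
    test = np.asarray(test, dtype=float)
    return 100.0 * np.mean(np.abs((test - reference) / reference))

# Hypothetical scaled DPK values on a few radial bins (illustration only).
dpk_code_a = np.array([0.92, 1.35, 1.10, 0.63, 0.21])
dpk_code_b = np.array([0.95, 1.30, 1.05, 0.66, 0.22])

print(f"MAPE = {mape(dpk_code_a, dpk_code_b):.2f} %")
```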

  2. Deuteron cross section evaluation for safety and radioprotection calculations of IFMIF/EVEDA accelerator prototype

    International Nuclear Information System (INIS)

    Blideanu, Valentin; Garcia, Mauricio; Joyer, Philippe; Lopez, Daniel; Mayoral, Alicia; Ogando, Francisco; Ortiz, Felix; Sanz, Javier; Sauvan, Patrick

    2011-01-01

    In the frame of IFMIF/EVEDA activities, a prototype accelerator delivering a high power deuteron beam is under construction in Japan. Interaction of these deuterons with matter will generate high levels of neutrons and induced activation, whose predicted yields depend strongly on the models used to calculate the different cross sections. A benchmark test was performed to validate these data for deuteron energies up to 20 MeV and to define a reasonable methodology for calculating the cross sections needed for EVEDA. Calculations were performed using the nuclear models included in MCNPX and PHITS, and the dedicated nuclear model code TALYS. Although the results obtained using TALYS (global parameters) or Monte Carlo codes disagree with experimental values, a solution is proposed to compute cross sections that are a good fit to experimental data. A consistent computational procedure is also suggested to improve both transport simulations/prompt dose and activation/residual dose calculations required for EVEDA.

  3. Deuteron cross section evaluation for safety and radioprotection calculations of IFMIF/EVEDA accelerator prototype

    Energy Technology Data Exchange (ETDEWEB)

    Blideanu, Valentin [Commissariat a l' energie atomique CEA/IRFU, Centre de Saclay, 91191 Gif sur Yvette cedex (France); Garcia, Mauricio [Universidad Nacional de Educacion a Distancia, UNED, Madrid (Spain); Instituto de Fusion Nuclear, UPM, Madrid (Spain); Joyer, Philippe, E-mail: philippe.joyer@cea.fr [Commissariat a l' energie atomique CEA/IRFU, Centre de Saclay, 91191 Gif sur Yvette cedex (France); Lopez, Daniel; Mayoral, Alicia; Ogando, Francisco [Universidad Nacional de Educacion a Distancia, UNED, Madrid (Spain); Instituto de Fusion Nuclear, UPM, Madrid (Spain); Ortiz, Felix [Universidad Nacional de Educacion a Distancia, UNED, Madrid (Spain); Sanz, Javier; Sauvan, Patrick [Universidad Nacional de Educacion a Distancia, UNED, Madrid (Spain); Instituto de Fusion Nuclear, UPM, Madrid (Spain)

    2011-10-01

    In the frame of IFMIF/EVEDA activities, a prototype accelerator delivering a high power deuteron beam is under construction in Japan. Interaction of these deuterons with matter will generate high levels of neutrons and induced activation, whose predicted yields depend strongly on the models used to calculate the different cross sections. A benchmark test was performed to validate these data for deuteron energies up to 20 MeV and to define a reasonable methodology for calculating the cross sections needed for EVEDA. Calculations were performed using the nuclear models included in MCNPX and PHITS, and the dedicated nuclear model code TALYS. Although the results obtained using TALYS (global parameters) or Monte Carlo codes disagree with experimental values, a solution is proposed to compute cross sections that are a good fit to experimental data. A consistent computational procedure is also suggested to improve both transport simulations/prompt dose and activation/residual dose calculations required for EVEDA.

  4. Nuclear reaction models - source term estimation for safety design in accelerators

    International Nuclear Information System (INIS)

    Nandy, Maitreyee

    2013-01-01

    Accelerator driven subcritical systems (ADSS) employ proton-induced spallation reactions at a few GeV. Safety design of these systems involves source term estimation in two steps - multiple fragmentation of the target and n+γ emission through a fast process, followed by statistical decay of the primary fragments. The prompt radiation field is estimated in the framework of quantum molecular dynamics (QMD) theory, intra-nuclear cascade models or Monte Carlo calculations. A few nuclear reaction model codes used for this purpose are QMD, JQMD, Bertini, INCL4 and PHITS, followed by statistical decay codes like ABLA, GEM, GEMINI, etc. In the case of electron accelerators, photons and photoneutrons dominate the prompt radiation field. The high-energy photon yield through bremsstrahlung is estimated in the framework of the Born approximation, while photoneutron production is calculated using the giant dipole resonance and quasi-deuteron formation cross sections. In this talk, hybrid and exciton PEQ models and the QMD formalism are discussed briefly.

  5. Construction of new quantum MDS codes derived from constacyclic codes

    Science.gov (United States)

    Taneja, Divya; Gupta, Manish; Narula, Rajesh; Bhullar, Jaskaran

    Obtaining quantum maximum distance separable (MDS) codes from dual-containing classical constacyclic codes using the Hermitian construction has paved a path to undertake the challenges related to such constructions. Using the same technique, some new parameters of quantum MDS codes have been constructed here. One set of parameters obtained in this paper achieves a much larger distance than earlier work. The remaining constructed parameters of quantum MDS codes have large minimum distance and had not been explored previously.

  6. Vector Network Coding Algorithms

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role to the coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector coding, our algori...

  7. Code-Mixing and Code Switchingin The Process of Learning

    Directory of Open Access Journals (Sweden)

    Diyah Atiek Mustikawati

    2016-09-01

    This study aimed to describe the specific forms of code switching and code mixing found in teaching and learning activities in the classroom, as well as the factors influencing these forms. The research is a descriptive qualitative case study conducted at Al Mawaddah Boarding School, Ponorogo. The analysis shows that code mixing and code switching in learning activities at Al Mawaddah Boarding School occur among Javanese, Arabic, English and Indonesian, and involve the insertion of words, phrases, idioms, nouns, adjectives, clauses, and sentences. The deciding factors for code mixing in the learning process include identification of the role, the desire to explain and interpret, and sourcing from the original language and its variations or from a foreign language. The deciding factors for code switching include the speaker (O1), the interlocutor (O2), the presence of a third person (O3), the topic of conversation, evoking a sense of humour, and prestige. The significance of this study is to allow readers to see the use of language in a multilingual society, especially the rules and characteristics of language variation in classroom teaching and learning activities at Al Mawaddah Boarding School. Furthermore, the results of this research will provide input to the ustadz/ustadzah and students for developing oral communication skills and effective teaching and learning strategies in boarding schools.

  8. Using Coding Apps to Support Literacy Instruction and Develop Coding Literacy

    Science.gov (United States)

    Hutchison, Amy; Nadolny, Larysa; Estapa, Anne

    2016-01-01

    In this article the authors present the concept of Coding Literacy and describe the ways in which coding apps can support the development of Coding Literacy and disciplinary and digital literacy skills. Through detailed examples, we describe how coding apps can be integrated into literacy instruction to support learning of the Common Core English…

  9. Low Complexity List Decoding for Polar Codes with Multiple CRC Codes

    Directory of Open Access Journals (Sweden)

    Jong-Hwan Kim

    2017-04-01

    Polar codes are the first family of error correcting codes that provably achieve the capacity of symmetric binary-input discrete memoryless channels with low complexity. Since the development of polar codes, there have been many studies to improve their finite-length performance. As a result, polar codes are now adopted as a channel code for the control channel of 5G new radio of the 3rd generation partnership project. However, the decoder implementation is one of the big practical problems, and low-complexity decoding has been studied. This paper addresses a low-complexity successive cancellation list decoding for polar codes utilizing multiple cyclic redundancy check (CRC) codes. While some research uses multiple CRC codes to reduce memory and time complexity, we consider the operational complexity of decoding and reduce it by optimizing CRC positions in combination with a modified decoding operation. As a result, the proposed scheme obtains not only a complexity reduction from early stopping of decoding, but also an additional reduction from the reduced number of decoding paths.
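
    The multiple-CRC idea summarized above can be illustrated without implementing a full polar list decoder: the information bits are split into segments, each segment carries its own CRC, and a decoder can stop as soon as an intermediate CRC check fails. The sketch below shows only this segmentation-and-check step; the CRC-3 polynomial, segment length and bit pattern are arbitrary choices for illustration, not those used in the paper.

```python
def crc_remainder(bits, poly):
    """Remainder of bits * x^(deg poly) divided by poly over GF(2).

    'bits' is a list of 0/1 ints, 'poly' the generator polynomial as a bit
    list with leading 1 (e.g. [1, 0, 1, 1] for x^3 + x + 1).
    """
    padded = list(bits) + [0] * (len(poly) - 1)
    for i in range(len(bits)):
        if padded[i]:
            for j, p in enumerate(poly):
                padded[i + j] ^= p
    return padded[-(len(poly) - 1):]

def attach_segment_crcs(info_bits, segment_len, poly):
    """Split the information bits into segments and append a CRC after each."""
    out = []
    for start in range(0, len(info_bits), segment_len):
        seg = info_bits[start:start + segment_len]
        out += seg + crc_remainder(seg, poly)
    return out

def check_segments(coded_bits, segment_len, poly):
    """Check each segment's CRC in order; stop at the first failure."""
    step = segment_len + len(poly) - 1
    for idx in range(0, len(coded_bits), step):
        seg = coded_bits[idx:idx + segment_len]
        crc = coded_bits[idx + segment_len:idx + step]
        if crc_remainder(seg, poly) != crc:
            return False  # early termination point for a list decoder
    return True

poly = [1, 0, 1, 1]                    # hypothetical CRC-3 polynomial x^3 + x + 1
info = [1, 0, 1, 1, 0, 0, 1, 0]
coded = attach_segment_crcs(info, 4, poly)
print(check_segments(coded, 4, poly))  # True for an error-free sequence
```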

  10. Majorana fermion codes

    International Nuclear Information System (INIS)

    Bravyi, Sergey; Terhal, Barbara M; Leemhuis, Bernhard

    2010-01-01

    We initiate the study of Majorana fermion codes (MFCs). These codes can be viewed as extensions of Kitaev's one-dimensional (1D) model of unpaired Majorana fermions in quantum wires to higher spatial dimensions and interacting fermions. The purpose of MFCs is to protect quantum information against low-weight fermionic errors, that is, operators acting on sufficiently small subsets of fermionic modes. We examine to what extent MFCs can surpass qubit stabilizer codes in terms of their stability properties. A general construction of 2D MFCs is proposed that combines topological protection based on a macroscopic code distance with protection based on fermionic parity conservation. Finally, we use MFCs to show how to transform any qubit stabilizer code to a weakly self-dual CSS code.

  11. DISP1 code

    International Nuclear Information System (INIS)

    Vokac, P.

    1999-12-01

    DISP1 code is a simple tool for assessment of the dispersion of the fission product cloud escaping from a nuclear power plant after an accident. The code makes it possible to tentatively check the feasibility of calculations by more complex PSA3 codes and/or codes for real-time dispersion calculations. The number of input parameters is reasonably low and the user interface is simple enough to allow a rapid processing of sensitivity analyses. All input data entered through the user interface are stored in the text format. Implementation of dispersion model corrections taken from the ARCON96 code enables the DISP1 code to be employed for assessment of the radiation hazard within the NPP area, in the control room for instance. (P.A.)

  12. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

    Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding, offers a clear and comprehensive discussion of the basic principles of this field. It features two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding; Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes; distance properties of convolutional codes; and a downloadable solutions manual.

  13. 78 FR 18321 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2013-03-26

    ... Energy Conservation Code. International Existing Building Code. International Fire Code. International... Code. International Property Maintenance Code. International Residential Code. International Swimming Pool and Spa Code International Wildland-Urban Interface Code. International Zoning Code. ICC Standards...

  14. Two-terminal video coding.

    Science.gov (United States)

    Yang, Yang; Stanković, Vladimir; Xiong, Zixiang; Zhao, Wei

    2009-03-01

    Following recent works on the rate region of the quadratic Gaussian two-terminal source coding problem and limit-approaching code designs, this paper examines multiterminal source coding of two correlated, i.e., stereo, video sequences to save the sum rate over independent coding of both sequences. Two multiterminal video coding schemes are proposed. In the first scheme, the left sequence of the stereo pair is coded by H.264/AVC and used at the joint decoder to facilitate Wyner-Ziv coding of the right video sequence. The first I-frame of the right sequence is successively coded by H.264/AVC Intracoding and Wyner-Ziv coding. An efficient stereo matching algorithm based on loopy belief propagation is then adopted at the decoder to produce pixel-level disparity maps between the corresponding frames of the two decoded video sequences on the fly. Based on the disparity maps, side information for both motion vectors and motion-compensated residual frames of the right sequence are generated at the decoder before Wyner-Ziv encoding. In the second scheme, source splitting is employed on top of classic and Wyner-Ziv coding for compression of both I-frames to allow flexible rate allocation between the two sequences. Experiments with both schemes on stereo video sequences using H.264/AVC, LDPC codes for Slepian-Wolf coding of the motion vectors, and scalar quantization in conjunction with LDPC codes for Wyner-Ziv coding of the residual coefficients give a slightly lower sum rate than separate H.264/AVC coding of both sequences at the same video quality.

  15. The network code

    International Nuclear Information System (INIS)

    1997-01-01

    The Network Code defines the rights and responsibilities of all users of the natural gas transportation system in the liberalised gas industry in the United Kingdom. This report describes the operation of the Code, what it means, how it works and its implications for the various participants in the industry. The topics covered are: development of the competitive gas market in the UK; key points in the Code; gas transportation charging; impact of the Code on producers upstream; impact on shippers; gas storage; supply point administration; impact of the Code on end users; the future. (20 tables; 33 figures) (UK)

  16. Comparison of inclusive particle production in 14.6 GeV/c proton-nucleus collisions with simulation

    International Nuclear Information System (INIS)

    Jaffe, D.E.; Lo, K.H.; Comfort, J.R.; Sivertz, M.

    2006-01-01

    Inclusive charged pion, kaon, proton and deuteron production in 14.6 GeV/c proton-nucleus collisions measured by BNL experiment E802 is compared with results from the GEANT3, GEANT4 and FLUKA simulation packages. The FLUKA package is found to have the best overall agreement

  17. XSOR codes users manual

    International Nuclear Information System (INIS)

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named "XSOR". The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  18. Rate-adaptive BCH coding for Slepian-Wolf coding of highly correlated sources

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Salmistraro, Matteo; Larsen, Knud J.

    2012-01-01

    This paper considers using BCH codes for distributed source coding using feedback. The focus is on coding using short block lengths for a binary source, X, having a high correlation between each symbol to be coded and a side information, Y, such that the marginal probability of each symbol Xi in X, given Y, is highly skewed. In the analysis, noiseless feedback and noiseless communication are assumed. A rate-adaptive BCH code is presented and applied to distributed source coding. Simulation results for a fixed error probability show that rate-adaptive BCH achieves better performance than LDPCA (Low-Density Parity-Check Accumulate) codes for high correlation between source symbols and the side information.

  19. Coding in Muscle Disease.

    Science.gov (United States)

    Jones, Lyell K; Ney, John P

    2016-12-01

    Accurate coding is critically important for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of administrative coding for patients with muscle disease and includes a case-based review of diagnostic and Evaluation and Management (E/M) coding principles in patients with myopathy. Procedural coding for electrodiagnostic studies and neuromuscular ultrasound is also reviewed.

  20. Generalized concatenated quantum codes

    International Nuclear Information System (INIS)

    Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng Bei

    2009-01-01

    We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematical way for constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.

  1. Synthesizing Certified Code

    OpenAIRE

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach for formally demonstrating software quality. Its basic idea is to require code producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates that can be checked independently. Since code certification uses the same underlying technology as program verification, it requires detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding annotations to th...

  2. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  3. Interface requirements to couple thermal-hydraulic codes to 3D neutronic codes

    Energy Technology Data Exchange (ETDEWEB)

    Langenbuch, S.; Austregesilo, H.; Velkov, K. [GRS, Garching (Germany)] [and others

    1997-07-01

    The present situation of thermalhydraulics codes and 3D neutronics codes is briefly described and general considerations for coupling of these codes are discussed. Two different basic approaches of coupling are identified and their relative advantages and disadvantages are discussed. The implementation of the coupling for 3D neutronics codes in the system ATHLET is presented. Meanwhile, this interface is used for coupling three different 3D neutronics codes.

  4. Interface requirements to couple thermal-hydraulic codes to 3D neutronic codes

    International Nuclear Information System (INIS)

    Langenbuch, S.; Austregesilo, H.; Velkov, K.

    1997-01-01

    The present situation of thermalhydraulics codes and 3D neutronics codes is briefly described and general considerations for coupling of these codes are discussed. Two different basic approaches of coupling are identified and their relative advantages and disadvantages are discussed. The implementation of the coupling for 3D neutronics codes in the system ATHLET is presented. Meanwhile, this interface is used for coupling three different 3D neutronics codes

  5. Synthesizing Certified Code

    Science.gov (United States)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  6. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  7. Report number codes

    International Nuclear Information System (INIS)

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name

  8. Performance Analysis of CRC Codes for Systematic and Nonsystematic Polar Codes with List Decoding

    Directory of Open Access Journals (Sweden)

    Takumi Murata

    2018-01-01

    Successive cancellation list (SCL) decoding of polar codes is an effective approach that can significantly outperform the original successive cancellation (SC) decoding, provided that proper cyclic redundancy-check (CRC) codes are employed at the stage of candidate selection. Previous studies on CRC-assisted polar codes mostly focus on improvement of the decoding algorithms as well as their implementation, and little attention has been paid to the CRC code structure itself. For the CRC-concatenated polar codes with CRC code as their outer code, the use of longer CRC code leads to reduction of information rate, whereas the use of shorter CRC code may reduce the error detection probability, thus degrading the frame error rate (FER) performance. Therefore, CRC codes of proper length should be employed in order to optimize the FER performance for a given signal-to-noise ratio (SNR) per information bit. In this paper, we investigate the effect of CRC codes on the FER performance of polar codes with list decoding in terms of the CRC code length as well as its generator polynomials. Both the original nonsystematic and systematic polar codes are considered, and we also demonstrate that different behaviors of CRC codes should be observed depending on whether the inner polar code is systematic or not.

  9. Z₂-double cyclic codes

    OpenAIRE

    Borges, J.

    2014-01-01

    A binary linear code C is a Z2-double cyclic code if the set of coordinates can be partitioned into two subsets such that any cyclic shift of the coordinates of both subsets leaves the code invariant. These codes can be identified as submodules of the Z2[x]-module Z2[x]/(x^r − 1) × Z2[x]/(x^s − 1). We determine the structure of Z2-double cyclic codes by giving the generator polynomials of these codes. The related polynomial representation of Z2-double cyclic codes and its duals, and the relation...

  10. Strict optical orthogonal codes for purely asynchronous code-division multiple-access applications

    Science.gov (United States)

    Zhang, Jian-Guo

    1996-12-01

    Strict optical orthogonal codes are presented for purely asynchronous optical code-division multiple-access (CDMA) applications. The proposed code strictly guarantees that the peaks of its cross-correlation functions and the sidelobes of any of its autocorrelation functions have a value of 1 in purely asynchronous data communications. The basic theory of the proposed codes is given. An experiment on optical CDMA systems is also demonstrated to verify the characteristics of the proposed code.
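
    A hedged illustration of the correlation constraints described above: the sketch checks that a set of (0,1) codewords has autocorrelation sidelobes and cross-correlation values bounded by 1. It uses a textbook (13,3,1) optical orthogonal code pair as a stand-in, not the codes proposed in the paper, and it checks only the periodic correlations rather than the stricter chip-level asynchronous condition the authors address.

```python
import numpy as np

def periodic_correlation(x, y, shift):
    """Periodic correlation of two equal-length (0,1) sequences at a given shift."""
    x = np.asarray(x)
    y = np.roll(np.asarray(y), shift)
    return int(np.sum(x * y))

def satisfies_ooc_bound(codewords, lam=1):
    """Check that autocorrelation sidelobes and all cross-correlations are <= lam."""
    n = len(codewords[0])
    for i, a in enumerate(codewords):
        for s in range(1, n):                     # autocorrelation sidelobes
            if periodic_correlation(a, a, s) > lam:
                return False
        for b in codewords[i + 1:]:               # cross-correlations, all shifts
            for s in range(n):
                if periodic_correlation(a, b, s) > lam:
                    return False
    return True

# Classical (13,3,1) optical orthogonal code pair, used here purely as an example:
# codeword 1 has pulses at chips {0, 1, 4}, codeword 2 at chips {0, 2, 7}.
c1 = [1, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0]
c2 = [1, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0]
print(satisfies_ooc_bound([c1, c2]))  # True
```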

  11. Entropy Coding in HEVC

    OpenAIRE

    Sze, Vivienne; Marpe, Detlev

    2014-01-01

    Context-Based Adaptive Binary Arithmetic Coding (CABAC) is a method of entropy coding first introduced in H.264/AVC and now used in the latest High Efficiency Video Coding (HEVC) standard. While it provides high coding efficiency, the data dependencies in H.264/AVC CABAC make it challenging to parallelize and thus limit its throughput. Accordingly, during the standardization of entropy coding for HEVC, both aspects of coding efficiency and throughput were considered. This chapter describes th...

  12. Quantum Codes From Negacyclic Codes over Group Ring (Fq + υFq)G

    International Nuclear Information System (INIS)

    Koroglu, Mehmet E.; Siap, Irfan

    2016-01-01

    In this paper, we determine self-dual and self-orthogonal codes arising from negacyclic codes over the group ring (Fq + υFq)G. By taking a suitable Gray image of these codes we obtain many quantum error-correcting codes with good parameters over Fq. (paper)

  13. Monte Carlo transport in radiotherapy - current status and prospects, and physical data needs. Report of a consultants' meeting 25-29 September 2000, IAEA, Vienna

    International Nuclear Information System (INIS)

    2000-01-01

    The IAEA has maintained an interest in computerized radiotherapy dose calculations going as far back as the nineteen sixties with several publications in the field. In the meantime, powerful general-purpose Monte Carlo codes applicable to the energy range of interest to radiotherapy (roughly 100 keV to 50 MeV photons, electrons and positrons) have emerged. These codes, ETRAN, the ITS system, the EGS system, MCNP, FLUKA, GEANT and more recently PENELOPE and EGSnrc are general-purpose codes intended to address not only the radiotherapy problem, but also dosimetry, high-energy physics, surface analysis, and a wide variety of challenging applications. As these codes are of a general-purpose nature, and designed to address a very wide variety of applications, they are necessarily complex, and contain algorithms and techniques that are either not required for the radiotherapy applications, or are unnecessarily stringent. Consequently, several new Monte Carlo systems and application codes specifically addressed to radiotherapy treatment planning (RTP); namely, MCDOSE, MMC, PEREGRINE, SMC, VMC, VMC++ and DPM have been developed. The design goal of these systems is to provide sufficiently accurate dose calculation and great increases in speed over their general-purpose brethren

  14. Trellis and turbo coding iterative and graph-based error control coding

    CERN Document Server

    Schlegel, Christian B

    2015-01-01

    This new edition has been extensively revised to reflect the progress in error control coding over the past few years. Over 60% of the material has been completely reworked, and 30% of the material is original. It covers convolutional, turbo, low-density parity-check (LDPC), and polar codes in a unified framework; advanced research-related developments such as spatial coupling; and algorithmic and implementation aspects of error control coding.

  15. Syndrome-source-coding and its universal generalization. [error correcting codes for data compression

    Science.gov (United States)

    Ancheta, T. C., Jr.

    1976-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A 'universal' generalization of syndrome-source-coding is formulated which provides robustly effective distortionless coding of source ensembles. Two examples are given, comparing the performance of noiseless universal syndrome-source-coding to (1) run-length coding and (2) Lynch-Davisson-Schalkwijk-Cover universal coding for an ensemble of binary memoryless sources.
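
    The core mapping described above, treating the source block as an error pattern and keeping only its syndrome, can be shown with a toy example. The sketch below uses the (7,4) Hamming parity-check matrix purely as an illustration; the paper does not prescribe a particular code.

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code; a standard choice used here
# only for illustration.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def compress(source_block):
    """Syndrome-source-coding: a 7-bit source block maps to its 3-bit syndrome."""
    return H.dot(source_block) % 2

# A sparse (low-weight) source block, as assumed for a skewed binary source.
x = np.array([0, 0, 0, 0, 1, 0, 0])
s = compress(x)
print(s)  # 3 compressed bits per 7 source bits

# Decompression looks up the minimum-weight pattern with this syndrome; for the
# Hamming code and a single nonzero source bit, the syndrome identifies its position.
```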

  16. Production yield of produced radioisotopes from 100 MeV proton beam on lead target for shielding analysis of large accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Oranj, Leila Mokhtari; Oh, Joo Hee; Jung, Nam Suk; Bae, O Ryun; Lee, Hee Seock [Div. of Advanced Nuclear Engineering, POSTECH, Pohang (Korea, Republic of)

    2014-11-15

    In this work, the production yield in a major shielding material, lead, was investigated using 100 MeV protons at the KOMAC accelerator facility. For the analysis of the experimental data, the activity has been calculated using the FLUKA Monte Carlo code and analytical methods. In the analytical method, the cross section data and the stopping power in the irradiated assembly were calculated with the TALYS and SRIM codes, respectively. Consequently, the experimental production yields of the produced radioisotopes were compared with data based on Monte Carlo calculations and analytical studies. In this research, the natPb(p,x) reaction was studied using experimental measurements, Monte Carlo simulations and analytical methods. Referring to the experimental measurements, we demonstrate that both Monte Carlo simulation and analytical methods can be useful tools for estimating the production yield of this reaction.

  17. Production yield of produced radioisotopes from 100 MeV proton beam on lead target for shielding analysis of large accelerator

    International Nuclear Information System (INIS)

    Oranj, Leila Mokhtari; Oh, Joo Hee; Jung, Nam Suk; Bae, O Ryun; Lee, Hee Seock

    2014-01-01

    In this work, the production yield in a major shielding material, lead, was investigated using 100 MeV protons at the KOMAC accelerator facility. For the analysis of the experimental data, the activity has been calculated using the FLUKA Monte Carlo code and analytical methods. In the analytical method, the cross section data and the stopping power in the irradiated assembly were calculated with the TALYS and SRIM codes, respectively. Consequently, the experimental production yields of the produced radioisotopes were compared with data based on Monte Carlo calculations and analytical studies. In this research, the natPb(p,x) reaction was studied using experimental measurements, Monte Carlo simulations and analytical methods. Referring to the experimental measurements, we demonstrate that both Monte Carlo simulation and analytical methods can be useful tools for estimating the production yield of this reaction.
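
    The analytical method mentioned in these two records combines reaction cross sections (TALYS) with stopping powers (SRIM). A common way to combine them is the thick-target yield integral Y = (N_A/A) ∫ σ(E)/S_m(E) dE, where S_m is the mass stopping power. The sketch below evaluates this integral numerically; the tabulated cross sections and stopping powers are hypothetical stand-ins, not values from the paper or from TALYS/SRIM.

```python
import numpy as np

AVOGADRO = 6.02214076e23   # 1/mol
A_PB = 207.2               # g/mol, natural lead

def thick_target_yield(energy_mev, sigma_mb, mass_stopping_mev_cm2_g):
    """Atoms produced per incident proton stopped in a thick natPb target.

    energy_mev              : increasing grid of proton energies [MeV]
    sigma_mb                : production cross section on that grid [mb]
    mass_stopping_mev_cm2_g : mass stopping power on that grid [MeV cm^2/g]
    """
    sigma_cm2 = np.asarray(sigma_mb) * 1e-27      # 1 mb = 1e-27 cm^2
    integrand = sigma_cm2 / np.asarray(mass_stopping_mev_cm2_g)
    # Trapezoidal integration of sigma(E)/S_m(E) over the energy grid.
    dE = np.diff(energy_mev)
    avg = 0.5 * (integrand[1:] + integrand[:-1])
    return AVOGADRO / A_PB * np.sum(avg * dE)

# Hypothetical illustrative tables (stand-ins for TALYS and SRIM output).
E = np.array([20.0, 40.0, 60.0, 80.0, 100.0])     # MeV
sigma = np.array([5.0, 60.0, 45.0, 30.0, 20.0])   # mb
s_mass = np.array([22.0, 13.5, 10.2, 8.4, 7.3])   # MeV cm^2 / g

print(f"{thick_target_yield(E, sigma, s_mass):.3e} atoms per proton")
```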

  18. Energy deposited in the high luminosity inner triplets of the LHC by collision debris

    International Nuclear Information System (INIS)

    Wildner, E.; Broggi, F.; Cerutti, F.; Ferrari, A.; Hoa, C.; Koutchouk, J.-P.; Mokhov, N.V.

    2008-01-01

    The 14 TeV center-of-mass proton-proton collisions in the LHC produce not only debris interesting for physics but also showers of particles ending up in the accelerator equipment, in particular in the superconducting magnet coils. Evaluations of this contribution to the heat that has to be transported by the cryogenic system have been made to guarantee that the energy deposition in the superconducting magnets does not exceed the limits for magnet quenching and the capacity of the cryogenic system. The models of the LHC baseline are detailed and include descriptions of elements essential for energy deposition, such as beam pipes and corrector magnets. The evaluations made using the Monte Carlo code FLUKA are compared to previous studies using MARS. To consolidate the calculations, a dedicated comparative study of these two codes was performed for a reduced setup.

  19. ComboCoding: Combined intra-/inter-flow network coding for TCP over disruptive MANETs

    Directory of Open Access Journals (Sweden)

    Chien-Chia Chen

    2011-07-01

    TCP over wireless networks is challenging due to random losses and ACK interference. Although network coding schemes have been proposed to improve TCP robustness against extreme random losses, a critical problem of DATA-ACK interference still remains. To address this issue, we use inter-flow coding between DATA and ACK to reduce the number of transmissions among nodes. In addition, we also utilize a “pipeline” random linear coding scheme with adaptive redundancy to overcome high packet loss over unreliable links. The resulting coding scheme, ComboCoding, combines intra-flow and inter-flow coding to provide robust TCP transmission in disruptive wireless networks. The main contributions of our scheme are twofold: the efficient combination of random linear coding and XOR coding on bi-directional streams (DATA and ACK), and the novel redundancy control scheme that adapts to time-varying and space-varying link loss. The adaptive ComboCoding was tested on a variable-hop string topology with unstable links and on a multipath MANET with dynamic topology. Simulation results show that TCP with ComboCoding delivers higher throughput than with other coding options in high-loss and mobile scenarios, while introducing minimal overhead in normal operation.
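
    The inter-flow part of the scheme described above, XOR-coding a DATA packet and an ACK packet travelling in opposite directions so a relay broadcasts one coded packet instead of two, can be sketched in a few lines. This is only the basic two-way XOR idea under made-up packet contents; the intra-flow random linear coding and adaptive redundancy of ComboCoding are not shown.

```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two byte strings, padding the shorter one with zero bytes."""
    n = max(len(a), len(b))
    a, b = a.ljust(n, b"\x00"), b.ljust(n, b"\x00")
    return bytes(x ^ y for x, y in zip(a, b))

# The relay receives a DATA packet from the sender and an ACK packet from the
# receiver, and broadcasts a single XOR-coded packet instead of two packets.
data_pkt = b"DATA: segment 42 payload"
ack_pkt = b"ACK: 41"
coded_pkt = xor_bytes(data_pkt, ack_pkt)

# Each endpoint decodes by XORing the coded packet with the packet it sent.
recovered_ack = xor_bytes(coded_pkt, data_pkt)[:len(ack_pkt)]
recovered_data = xor_bytes(coded_pkt, ack_pkt)[:len(data_pkt)]
assert recovered_ack == ack_pkt and recovered_data == data_pkt
```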

  20. Performance analysis of WS-EWC coded optical CDMA networks with/without LDPC codes

    Science.gov (United States)

    Huang, Chun-Ming; Huang, Jen-Fa; Yang, Chao-Chin

    2010-10-01

    One extended Welch-Costas (EWC) code family for the wavelength-division-multiplexing/spectral-amplitude coding (WDM/SAC; WS) optical code-division multiple-access (OCDMA) networks is proposed. This system has a superior performance as compared to the previous modified quadratic congruence (MQC) coded OCDMA networks. However, since the performance of such a network is unsatisfactory when the data bit rate is higher, one class of quasi-cyclic low-density parity-check (QC-LDPC) code is adopted to improve that. Simulation results show that the performance of the high-speed WS-EWC coded OCDMA network can be greatly improved by using the LDPC codes.

  1. Semi-supervised sparse coding

    KAUST Repository

    Wang, Jim Jing-Yan; Gao, Xin

    2014-01-01

    Sparse coding approximates the data sample as a sparse linear combination of some basic codewords and uses the sparse codes as new presentations. In this paper, we investigate learning discriminative sparse codes by sparse coding in a semi-supervised manner, where only a few training samples are labeled. By using the manifold structure spanned by the data set of both labeled and unlabeled samples and the constraints provided by the labels of the labeled samples, we learn the variable class labels for all the samples. Furthermore, to improve the discriminative ability of the learned sparse codes, we assume that the class labels could be predicted from the sparse codes directly using a linear classifier. By solving the codebook, sparse codes, class labels and classifier parameters simultaneously in a unified objective function, we develop a semi-supervised sparse coding algorithm. Experiments on two real-world pattern recognition problems demonstrate the advantage of the proposed methods over supervised sparse coding methods on partially labeled data sets.

  2. Semi-supervised sparse coding

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-07-06

    Sparse coding approximates the data sample as a sparse linear combination of some basic codewords and uses the sparse codes as new presentations. In this paper, we investigate learning discriminative sparse codes by sparse coding in a semi-supervised manner, where only a few training samples are labeled. By using the manifold structure spanned by the data set of both labeled and unlabeled samples and the constraints provided by the labels of the labeled samples, we learn the variable class labels for all the samples. Furthermore, to improve the discriminative ability of the learned sparse codes, we assume that the class labels could be predicted from the sparse codes directly using a linear classifier. By solving the codebook, sparse codes, class labels and classifier parameters simultaneously in a unified objective function, we develop a semi-supervised sparse coding algorithm. Experiments on two real-world pattern recognition problems demonstrate the advantage of the proposed methods over supervised sparse coding methods on partially labeled data sets.
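
    As background to the two records above: sparse coding of a sample x over a codebook D means finding a code s with few nonzero entries such that Ds ≈ x. The sketch below solves this with plain ISTA (iterative soft-thresholding); it is a generic unsupervised sparse coding step, not the semi-supervised objective of the paper, whose label and classifier terms are omitted, and the codebook and sample are synthetic.

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sparse_code(x, D, lam=0.1, n_iter=200):
    """ISTA: minimize 0.5*||x - D s||^2 + lam*||s||_1 over the code s."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    s = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T.dot(D.dot(s) - x)
        s = soft_threshold(s - grad / L, lam / L)
    return s

rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))          # codebook with 50 codewords
D /= np.linalg.norm(D, axis=0)             # unit-norm columns
x = D[:, 3] + 0.5 * D[:, 17]               # sample built from two codewords
s = sparse_code(x, D)
print(np.nonzero(np.abs(s) > 1e-3)[0])     # indices of the active codewords
```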

  3. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    Science.gov (United States)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

    In this paper, we investigate joint design of quasi-cyclic low-density-parity-check (QC-LDPC) codes for coded cooperation system with joint iterative decoding in the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles including both type I and type II are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation well combines cooperation gain and channel coding gain, and outperforms the coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than those of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
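
    The base/exponent-matrix description used above can be made concrete: each non-negative exponent entry expands to a circulant permutation matrix (a cyclically shifted identity), while an entry of -1 expands to a zero block. The sketch below performs this expansion for a small, made-up exponent matrix; it is not the jointly designed matrix of the paper.

```python
import numpy as np

def expand_qc_ldpc(exponent_matrix, z):
    """Expand an exponent matrix into a binary QC-LDPC parity-check matrix.

    Each entry e >= 0 becomes the z x z identity cyclically shifted by e
    columns; an entry of -1 becomes the z x z all-zero block.
    """
    E = np.asarray(exponent_matrix)
    rows, cols = E.shape
    H = np.zeros((rows * z, cols * z), dtype=int)
    I = np.eye(z, dtype=int)
    for r in range(rows):
        for c in range(cols):
            if E[r, c] >= 0:
                H[r * z:(r + 1) * z, c * z:(c + 1) * z] = np.roll(I, E[r, c], axis=1)
    return H

# Hypothetical 2 x 4 exponent matrix with circulant size z = 5 (illustration only).
E = [[0, 1, 2, -1],
     [3, -1, 0, 4]]
H = expand_qc_ldpc(E, 5)
print(H.shape)        # (10, 20)
print(H.sum(axis=0))  # column weights of the expanded parity-check matrix
```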

  4. Coupling the severe accident code SCDAP with the system thermal hydraulic code MARS

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Jin; Chung, Bub Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2004-07-01

    MARS is a best-estimate system thermal hydraulics code with multi-dimensional modeling capability. One of the aims in MARS code development is to make it a multi-functional code system with the analysis capability to cover the entire accident spectrum. For this purpose, the MARS code has been coupled with a number of other specialized codes, such as CONTEMPT for containment analysis and MASTER for 3-dimensional kinetics. In this study, the SCDAP code has been coupled with MARS to endow the MARS code system with severe accident analysis capability. With SCDAP, the MARS code system has now acquired the capability to simulate such severe accident related phenomena as cladding oxidation, melting and slumping of fuel and reactor structures.

  5. Coupling the severe accident code SCDAP with the system thermal hydraulic code MARS

    International Nuclear Information System (INIS)

    Lee, Young Jin; Chung, Bub Dong

    2004-01-01

    MARS is a best-estimate system thermal hydraulics code with multi-dimensional modeling capability. One of the aims in MARS code development is to make it a multi-functional code system with the analysis capability to cover the entire accident spectrum. For this purpose, the MARS code has been coupled with a number of other specialized codes, such as CONTEMPT for containment analysis and MASTER for 3-dimensional kinetics. In this study, the SCDAP code has been coupled with MARS to endow the MARS code system with severe accident analysis capability. With SCDAP, the MARS code system has now acquired the capability to simulate such severe accident related phenomena as cladding oxidation, melting and slumping of fuel and reactor structures.

  6. Distributed Video Coding for Multiview and Video-plus-depth Coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo

    The interest in Distributed Video Coding (DVC) systems has grown considerably in the academic world in recent years. With DVC the correlation between frames is exploited at the decoder (joint decoding). The encoder codes the frame independently, performing relatively simple operations. Therefore......, with DVC the complexity is shifted from encoder to decoder, making the coding architecture a viable solution for encoders with limited resources. DVC may empower new applications which can benefit from this reversed coding architecture. Multiview Distributed Video Coding (M-DVC) is the application...... of the to-be-decoded frame. Another key element is the Residual estimation, indicating the reliability of the SI, which is used to calculate the parameters of the correlation noise model between SI and original frame. In this thesis new methods for Inter-camera SI generation are analyzed in the Stereo...

  7. Diagnostic Coding for Epilepsy.

    Science.gov (United States)

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  8. Coding of Neuroinfectious Diseases.

    Science.gov (United States)

    Barkley, Gregory L

    2015-12-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  9. Refactoring test code

    NARCIS (Netherlands)

    A. van Deursen (Arie); L.M.F. Moonen (Leon); A. van den Bergh; G. Kok

    2001-01-01

    textabstractTwo key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from

  10. Gauge color codes

    DEFF Research Database (Denmark)

    Bombin Palomo, Hector

    2015-01-01

    Color codes are topological stabilizer codes with unusual transversality properties. Here I show that their group of transversal gates is optimal and only depends on the spatial dimension, not the local geometry. I also introduce a generalized, subsystem version of color codes. In 3D they allow...

  11. Opening up codings?

    DEFF Research Database (Denmark)

    Steensig, Jakob; Heinemann, Trine

    2015-01-01

    doing formal coding and when doing more “traditional” conversation analysis research based on collections. We are more wary, however, of the implication that coding-based research is the end result of a process that starts with qualitative investigations and ends with categories that can be coded...

  12. Design of auxiliary shield for remote controlled metallographic microscope

    International Nuclear Information System (INIS)

    Matsui, Hiroki; Okamoto, Hisato

    2014-06-01

    The remote controlled optical microscope installed in the lead cell at the Reactor Fuel Examination Facility (RFEF) of the Japan Atomic Energy Agency (JAEA) has been upgraded to a higher performance unit to study the effect of the microstructural evolution in cladding material on high burn-up fuel behavior under accident conditions. The optical path of the new microscope requires a new through hole in the shielding lead wall of the cell. To meet safety regulations, auxiliary lead shielding was designed to cover the lost shielding function of the cell wall. The Particle and Heavy Ion Transport Code System (PHITS) was used to calculate and determine the shape and setting positions of the shielding unit. Seismic assessments of the unit were also performed. (author)

  13. Proceedings of the 2011 symposium on nuclear data

    International Nuclear Information System (INIS)

    Harada, Hideo; Yokoyama, Kenji; Iwamoto, Nobuyuki; Nakamura, Shoji

    2012-12-01

    The 2011 symposium on nuclear data, organized by the Nuclear Data Division of the Atomic Energy Society of Japan (AESJ), was held at Ricotti, Tokai, on Nov. 16 and 17, 2011, in cooperation with the Nuclear Science and Engineering Directorate of JAEA and the North-Kanto Branch of AESJ. The symposium was devoted to discussions and presentations of current topics in the field of nuclear data, such as the nuclear accident and accident analysis codes, innovative methods in nuclear data theory and measurement, and nuclear data applications, and included two tutorial talks on NJOY99 and PHITS. The talks and posters presented at the symposium aroused lively discussions among the 97 participants. This report contains 34 papers submitted by the oral and poster presenters. (author)

  14. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    This essay studies the source code of an artwork from a software studies perspective. By examining code that comes close to the approach of critical code studies (Marino, 2006), I trace the network artwork, Pupufu (Lin, 2009) to understand various real-time approaches to social media platforms (MSN......, Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important...... to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical...

  15. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely on the basis of a binary decision. Hence the end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link. From a transmission point of view, digital transmission has therefore been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term Speech Coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is voice coding. This term is more generic in the sense that the

  16. Tri-code inductance control rod position indicator with several multi-coding-bars

    International Nuclear Information System (INIS)

    Shi Jibin; Jiang Yueyuan; Wang Wenran

    2004-01-01

    A control rod position indicator, named the tri-code inductance control rod position indicator with multi-coding-bars, which possesses a simple structure, reliable operation and high precision, has been developed. The detector of the indicator is composed of K coils, a compensatory coil and K coding bars. Each coding bar consists of several sections of strong magnetic cores, several sections of weak magnetic cores and several non-magnetic sections. As the control rod is withdrawn, the coding bars move in the centers of the coils while a constant alternating current passes through the coils and makes them generate inductive alternating voltage signals. The outputs of the coils are picked up and processed, and the tri-codes indicating the rod position can be obtained. The coding principle of the detector and its related structure are also introduced. The analysis shows that the indicator has advantages over the coil-coding rod position indicator, so it can meet the demands of rod position indication in the nuclear heating reactor (NHR). (authors)

  17. Fracture flow code

    International Nuclear Information System (INIS)

    Dershowitz, W; Herbert, A.; Long, J.

    1989-03-01

    The hydrology of the SCV site will be modelled utilizing discrete fracture flow models. These models are complex and cannot be fully verified by comparison to analytical solutions. The best approach for verification of these codes is therefore cross-verification between different codes. This is complicated by the variation in assumptions and solution techniques utilized in different codes. Cross-verification procedures are defined which allow comparison of the codes developed by Harwell Laboratory, Lawrence Berkeley Laboratory, and Golder Associates Inc. Six cross-verification datasets are defined for deterministic and stochastic verification of geometric and flow features of the codes. Additional datasets for verification of transport features will be documented in a future report. (13 figs., 7 tabs., 10 refs.) (authors)

  18. NAGRADATA. Code key. Geology

    International Nuclear Information System (INIS)

    Mueller, W.H.; Schneider, B.; Staeuble, J.

    1984-01-01

    This reference manual provides users of the NAGRADATA system with comprehensive keys to the coding/decoding of geological and technical information to be stored in or retrieved from the databank. Emphasis has been placed on input data coding. When data is retrieved, the translation into plain language of stored coded information is done automatically by computer. Three keys each list the complete set of currently defined codes for the NAGRADATA system, namely codes with appropriate definitions, arranged: 1. according to subject matter (thematically), 2. with the codes listed alphabetically, and 3. with the definitions listed alphabetically. Additional explanation is provided for the proper application of the codes and the logic behind the creation of new codes to be used within the NAGRADATA system. NAGRADATA makes use of codes instead of plain language for data storage; this offers the following advantages: speed of data processing, mainly data retrieval; economy of storage memory requirements; and the standardisation of terminology. The nature of this thesaurus-type 'key to codes' makes it impossible either to establish a final form or to cover the entire spectrum of requirements. Therefore, this first issue of codes for NAGRADATA must be considered to represent the current state of progress of a living system, and future editions will be issued in a loose-leaf ring binder which can be updated by an organised (updating) service. (author)

  19. RFQ simulation code

    International Nuclear Information System (INIS)

    Lysenko, W.P.

    1984-04-01

    We have developed the RFQLIB simulation system to provide a means to systematically generate the new versions of radio-frequency quadrupole (RFQ) linac simulation codes that are required by the constantly changing needs of a research environment. This integrated system simplifies keeping track of the various versions of the simulation code and makes it practical to maintain complete and up-to-date documentation. In this scheme, there is a certain standard version of the simulation code that forms a library upon which new versions are built. To generate a new version of the simulation code, the routines to be modified or added are appended to a standard command file, which contains the commands to compile the new routines and link them to the routines in the library. The library itself is rarely changed. Whenever the library is modified, however, this modification is seen by all versions of the simulation code, which actually exist as different versions of the command file. All code is written according to the rules of structured programming. Modularity is enforced by not using COMMON statements, simplifying the relation of the data flow to a hierarchy diagram. Simulation results are similar to those of the PARMTEQ code, as expected, because of the similar physical model. Different capabilities, such as those for generating beams matched in detail to the structure, are available in the new code for help in testing new ideas in designing RFQ linacs

  20. The Visual Code Navigator : An Interactive Toolset for Source Code Investigation

    NARCIS (Netherlands)

    Lommerse, Gerard; Nossin, Freek; Voinea, Lucian; Telea, Alexandru

    2005-01-01

    We present the Visual Code Navigator, a set of three interrelated visual tools that we developed for exploring large source code software projects from three different perspectives, or views: The syntactic view shows the syntactic constructs in the source code. The symbol view shows the objects a

  1. Joint source-channel coding using variable length codes

    NARCIS (Netherlands)

    Balakirsky, V.B.

    2001-01-01

    We address the problem of joint source-channel coding when variable-length codes are used for information transmission over a discrete memoryless channel. Data transmitted over the channel are interpreted as pairs (m_k, t_k), where m_k is a message generated by the source and t_k is a time instant

  2. Tokamak Systems Code

    International Nuclear Information System (INIS)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged

  3. Blahut-Arimoto algorithm and code design for action-dependent source coding problems

    DEFF Research Database (Denmark)

    Trillingsgaard, Kasper Fløe; Simeone, Osvaldo; Popovski, Petar

    2013-01-01

    The source coding problem with action-dependent side information at the decoder has recently been introduced to model data acquisition in resource-constrained systems. In this paper, an efficient Blahut-Arimoto-type algorithm for the numerical computation of the rate-distortion-cost function...... for this problem is proposed. Moreover, a simplified two-stage code structure based on multiplexing is put forth, whereby the first stage encodes the actions and the second stage is composed of an array of classical Wyner-Ziv codes, one for each action. Leveraging this structure, specific coding/decoding...... strategies are designed based on LDGM codes and message passing. Through numerical examples, the proposed code design is shown to achieve performance close to the rate-distortion-cost function....
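
    For context, the classical Blahut-Arimoto iteration for the ordinary rate-distortion function (without the action-dependent side information or the LDGM code design treated in the paper) can be sketched as follows; the binary-source example at the end is purely illustrative:

    import numpy as np

    def blahut_arimoto_rd(p_x, dist, s, n_iter=200):
        """Classical Blahut-Arimoto iteration for the rate-distortion function.

        p_x  : source distribution, shape (nx,)
        dist : distortion matrix d(x, xhat), shape (nx, nxhat)
        s    : Lagrange multiplier trading rate against distortion
        Returns (rate_in_bits, distortion).
        """
        nx, nxhat = dist.shape
        q = np.full(nxhat, 1.0 / nxhat)              # reproduction distribution q(xhat)
        for _ in range(n_iter):
            # p(xhat | x) proportional to q(xhat) * exp(-s * d(x, xhat))
            w = q[None, :] * np.exp(-s * dist)
            cond = w / w.sum(axis=1, keepdims=True)
            q = p_x @ cond                           # re-estimate q(xhat)
        D = np.sum(p_x[:, None] * cond * dist)
        R = np.sum(p_x[:, None] * cond * np.log2(cond / q[None, :]))
        return R, D

    # Binary source with Hamming distortion (toy example).
    print(blahut_arimoto_rd(np.array([0.5, 0.5]),
                            np.array([[0.0, 1.0], [1.0, 0.0]]), s=3.0))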

  4. On the Combination of Multi-Layer Source Coding and Network Coding for Wireless Networks

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Fitzek, Frank; Pedersen, Morten Videbæk

    2013-01-01

    quality is developed. A linear coding structure designed to gracefully encapsulate layered source coding provides both low complexity of the utilised linear coding while enabling robust erasure correction in the form of fountain coding capabilities. The proposed linear coding structure advocates efficient...

  5. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    This report contains the evaluation and documentation of the Coding Class project. The Coding Class project was launched in the 2016/2017 school year by IT-Branchen in collaboration with a number of member companies, the City of Copenhagen, Vejle Municipality, the Danish Agency for IT and Learning (STIL) and the volunteer association...... Coding Pirates. The report was written by Mikala Hansbøl, Docent in digital learning resources and research coordinator for the research and development environment Digitalisering i Skolen (DiS) at the Institut for Skole og Læring, Professionshøjskolen Metropol, and Stine Ejsing-Duun, Associate Professor in learning technology, interaction design......, design thinking and design pedagogy at Forskningslab: It og Læringsdesign (ILD-LAB), Institut for kommunikation og psykologi, Aalborg University in Copenhagen. We followed and carried out the evaluation and documentation of the Coding Class project in the period November 2016 to May 2017...

  6. Dark matter search in a Beam-Dump eXperiment (BDX) at Jefferson Lab: an update on PR12-16-001

    Energy Technology Data Exchange (ETDEWEB)

    Battaglieri, M. [Istituto Nazionale di Fisica Nucleare (INFN), Genova (Italy); et. al.

    2017-12-07

    This document is an update to the proposal PR12-16-001, Dark matter search in a Beam-Dump eXperiment (BDX) at Jefferson Lab, submitted to JLab-PAC44 in 2016, reporting progress in addressing questions raised regarding the beam-on backgrounds. The concerns are addressed by adopting a new simulation tool, FLUKA, and by planning measurements of muon fluxes from the dump with its existing shielding. First, we have implemented the detailed BDX experimental geometry into a FLUKA simulation, in consultation with experts from the JLab Radiation Control Group. The FLUKA simulation has been compared directly to our GEANT4 simulations and shown to agree in regions of validity. The FLUKA interaction package, with a tuned set of biasing weights, is naturally able to generate reliable particle distributions with very small probabilities and therefore predict rates at the detector location beyond the planned shielding around the beam dump. Second, we have developed a plan to conduct measurements of the muon flux from the Hall-A dump in its current configuration to validate our simulations.

  7. KENO-V code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The KENO-V code is the current release of the Oak Ridge multigroup Monte Carlo criticality code development. The original KENO, with 16-group Hansen-Roach cross sections and P_1 scattering, was one of the first multigroup Monte Carlo codes, and it and its successors have always been a much-used research tool for criticality studies. KENO-V is able to accept large neutron cross section libraries (a 218-group set is distributed with the code) and has a general P_N scattering capability. A supergroup feature allows execution of large problems on small computers, but at the expense of increased calculation time and system input/output operations. This supergroup feature is activated automatically by the code in a manner which utilizes as much computer memory as is available. The primary purpose of KENO-V is to calculate the system k_eff, from small bare critical assemblies to large reflected arrays of differing fissile and moderator elements. In this respect KENO-V neither has nor requires the many options and sophisticated biasing techniques of general Monte Carlo codes

  8. PEAR code review

    International Nuclear Information System (INIS)

    De Wit, R.; Jamieson, T.; Lord, M.; Lafortune, J.F.

    1997-07-01

    As a necessary component in the continuous improvement and refinement of methodologies employed in the nuclear industry, regulatory agencies need to periodically evaluate these processes to improve confidence in results and ensure appropriate levels of safety are being achieved. The independent and objective review of industry-standard computer codes forms an essential part of this program. To this end, this work undertakes an in-depth review of the computer code PEAR (Public Exposures from Accidental Releases), developed by Atomic Energy of Canada Limited (AECL) to assess accidental releases from CANDU reactors. PEAR is based largely on the models contained in the Canadian Standards Association (CSA) N288.2-M91. This report presents the results of a detailed technical review of the PEAR code to identify any variations from the CSA standard and other supporting documentation, verify the source code, assess the quality of numerical models and results, and identify general strengths and weaknesses of the code. The version of the code employed in this review is the one which AECL intends to use for CANDU 9 safety analyses. (author)

  9. Manually operated coded switch

    International Nuclear Information System (INIS)

    Barnette, J.H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made

  10. Induced radioactivity in a 4 MW target and its surroundings

    CERN Document Server

    Agosteo, Stefano; Otto, Thomas; Silari, Marco

    2003-01-01

    An important aspect of a future CERN Neutrino Factory is the material activation arising from a 2.2 GeV, 4 MW proton beam striking a mercury target. An estimation of the hadronic inelastic interactions and the production of residual nuclei in the target, the magnetic horn, the decay tunnel, the surrounding rock and a downstream dump was performed by the Monte Carlo hadronic cascade code FLUKA. The aim was both to assess the dose equivalent rate to be expected during maintenance work and to evaluate the amount of residual radioactivity, which will have to be disposed of after the facility has ceased operation.

  11. Monte Carlo Calculations of Dose to Medium and Dose to Water for Carbon Ion Beams in Various Media

    DEFF Research Database (Denmark)

    Herrmann, Rochus; Petersen, Jørgen B.B.; Jäkel, Oliver

    treatment plans. Here, we quantify the effect of dose to water vs. dose to medium for a series of typical target materials found in medical physics. Material and Methods: The Monte Carlo code FLUKA [Battistioni et al. 2007] is used to simulate the particle fluence spectrum in a series of target...... for water. This represents the case in which our “detector” is an infinitesimally small non-perturbing entity made of water, where charged particle equilibrium can be assumed following the Bragg-Gray cavity theory. D_w and D_m are calculated for typical materials such as bone, brain, lung and soft tissues using...
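
    The conversion implied by the Bragg-Gray argument above is the standard fluence-weighted mass stopping-power ratio (textbook form, stated here for reference rather than quoted from the abstract):

    D_w \;=\; D_m \cdot s_{w,m}, \qquad s_{w,m} \;=\; \frac{\int \Phi_E\, \big(S(E)/\rho\big)_{w}\, \mathrm{d}E}{\int \Phi_E\, \big(S(E)/\rho\big)_{m}\, \mathrm{d}E},

    where \Phi_E is the charged-particle fluence differential in energy and (S/\rho) the mass stopping power of water (w) or the medium (m).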

  12. A Bonner Sphere Spectrometer with extended response matrix

    Energy Technology Data Exchange (ETDEWEB)

    Birattari, C. [University of Milan, Department of Physics, Via Celoria 16, 20133 Milan (Italy); Dimovasili, E.; Mitaroff, A. [CERN, 1211 Geneva 23 (Switzerland); Silari, M., E-mail: marco.silari@cern.c [CERN, 1211 Geneva 23 (Switzerland)

    2010-08-21

    This paper describes the design, calibration and applications at high-energy accelerators of an extended-range Bonner Sphere neutron Spectrometer (BSS). The BSS was designed by the FLUKA Monte Carlo code, investigating several combinations of materials and diameters of the moderators for the high-energy channels. The system was calibrated at PTB in Braunschweig, Germany, using monoenergetic neutron beams in the energy range 144 keV-19 MeV. It was subsequently tested with Am-Be source neutrons and in the simulated workplace neutron field at CERF (the CERN-EU high-energy reference field facility). Since 2002, it has been employed for neutron spectral measurements around CERN accelerators.
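
    Spectrometry with a Bonner Sphere set reduces to unfolding group-wise fluences from the measured counts through the response matrix; a minimal sketch of that linear model with a non-negativity-constrained least-squares solve is shown below (the 4x3 response matrix and count vector are invented for illustration; real BSS unfolding uses many more groups and dedicated, regularized unfolding codes):

    import numpy as np
    from scipy.optimize import nnls

    # Hypothetical response matrix (counts per unit fluence) for 4 spheres and
    # 3 energy groups, plus a measured count vector; the numbers are invented
    # purely to illustrate the unfolding step, not taken from the paper.
    R = np.array([[2.0, 0.5, 0.1],
                  [1.5, 1.2, 0.3],
                  [0.8, 1.5, 0.9],
                  [0.3, 1.0, 1.6]])
    counts = np.array([10.0, 12.0, 11.0, 9.0])

    # Least-squares unfolding of the group fluences with a non-negativity constraint.
    phi, resid = nnls(R, counts)
    print(phi, resid)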

  13. A Bonner Sphere Spectrometer with extended response matrix

    International Nuclear Information System (INIS)

    Birattari, C.; Dimovasili, E.; Mitaroff, A.; Silari, M.

    2010-01-01

    This paper describes the design, calibration and applications at high-energy accelerators of an extended-range Bonner Sphere neutron Spectrometer (BSS). The BSS was designed by the FLUKA Monte Carlo code, investigating several combinations of materials and diameters of the moderators for the high-energy channels. The system was calibrated at PTB in Braunschweig, Germany, using monoenergetic neutron beams in the energy range 144 keV-19 MeV. It was subsequently tested with Am-Be source neutrons and in the simulated workplace neutron field at CERF (the CERN-EU high-energy reference field facility). Since 2002, it has been employed for neutron spectral measurements around CERN accelerators.

  14. A Bonner Sphere Spectrometer with extended response matrix

    Science.gov (United States)

    Birattari, C.; Dimovasili, E.; Mitaroff, A.; Silari, M.

    2010-08-01

    This paper describes the design, calibration and applications at high-energy accelerators of an extended-range Bonner Sphere neutron Spectrometer (BSS). The BSS was designed by the FLUKA Monte Carlo code, investigating several combinations of materials and diameters of the moderators for the high-energy channels. The system was calibrated at PTB in Braunschweig, Germany, using monoenergetic neutron beams in the energy range 144 keV-19 MeV. It was subsequently tested with Am-Be source neutrons and in the simulated workplace neutron field at CERF (the CERN-EU high-energy reference field facility). Since 2002, it has been employed for neutron spectral measurements around CERN accelerators.

  15. Radiation Dose for Equipment in the LHC Arcs

    CERN Document Server

    Wittenburg, K; Spickermann, T

    1998-01-01

    Collisions of protons with residual gas molecules or the beam screen installed in the vacuum chamber are the main sources for the radiation dose in the LHC arcs. The dose due to proton-gas collisions depends on gas pressure, energy and intensity of the circulating beam. The dose is about equally distributed along the arc and has been calculated in previous papers. Collisions of particles with the beam screen will take place where the beam size is largest - close to focusing quadrupole magnets. For this paper the radiation doses due to particles hitting the beam screen in a quadrupole were calculated with the shower codes GEANT3.21 and FLUKA96.

  16. Phonological coding during reading.

    Science.gov (United States)

    Leinenger, Mallorie

    2014-11-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  17. Investigation of Neutron Attenuation Properties of Materials with Different Densities [Farklı Yoğunluktaki Malzemelerin Nötron Zayıflatma Özelliklerinin İncelenmesi]

    Directory of Open Access Journals (Sweden)

    Demet SARIYER

    2015-07-01

    Full Text Available Shield design is carried out for accelerators, where high-intensity radiation fields arise, in order to attenuate the radiation level to the permissible dose values. In choosing the shield, its thickness, weight, and installation and maintenance costs are taken into account together with its radiation attenuation properties. In proton accelerators the radiation that governs the shielding is neutrons, and the shield design is made for neutrons. Soil, standard concrete and iron are widely used as shield materials. In this study, shield materials of different densities (soil, standard concrete, iron) were selected in order to determine the minimum side-wall shield thicknesses required in the shield design of a proton accelerator. The shield thicknesses were determined with the FLUKA Monte Carlo code. Key words: Proton accelerator, shield design, FLUKA, iron.

  18. Review of the Monte Carlo and deterministic codes in radiation protection and dosimetry

    International Nuclear Information System (INIS)

    Tagziria, H.

    2000-02-01

    purpose. One failure, unfortunately common to many codes (including some leading and generally available codes), is the lack of effort expended in providing a decent statistical and sensitivity analysis package, which would help the user to avoid traps such as false convergence. Another failure, which is this time blameable on us the users, is our failure to grasp the importance of choosing cross section data well and using them sensibly. The impact of such or other incorrect input data on our results is often overlooked. With new developments in computing technology and in variance reduction or acceleration techniques, Monte Carlo calculations can nowadays be performed with very small statistical uncertainties. These are often so low that they become negligible compared to other, sometimes much larger uncertainties such as those due to input data, source definition, geometry response functions, etc. Both code developers and users alike unfortunately often ignore any sensitivity analysis. This report is primarily intended as a non-exhaustive overview of and a pointer to some of the major Monte Carlo and Deterministic codes used in radiation transport in general and radiation protection and dosimetry in particular, with an extended bibliography for those codes. These will include MCNP, EGS, LAHET, FLUKA, MARS, MCBEND, TRIPOLI, SCALES and others. Some deterministic codes such as ANISN, TORT, EVENT, etc. will also be described in some detail, as will, although briefly, BEAM, PEREGRINE and rtt_MC, which are used in medical physics applications. The order in which the codes are described and the amount of space dedicated to each of them has been dictated largely by the time when the sections were written and by their authorship. In this challenging and ambitious exercise, wherever possible (and it has not been easy), we sought the involvement and help of the authors or main developers and users of the codes, at least through their regularly updated web sites

  19. Codes maintained by the LAACG [Los Alamos Accelerator Code Group] at the NMFECC

    International Nuclear Information System (INIS)

    Wallace, R.; Barts, T.

    1990-01-01

    The Los Alamos Accelerator Code Group (LAACG) maintains two groups of design codes at the National Magnetic Fusion Energy Computing Center (NMFECC). These codes, principally electromagnetic field solvers, are used for the analysis and design of electromagnetic components for accelerators, e.g., magnets, rf structures, pickups, etc. In this paper, the status and future of the installed codes will be discussed with emphasis on an experimental version of one set of codes, POISSON/SUPERFISH

  20. Concatenated coding systems employing a unit-memory convolutional code and a byte-oriented decoding algorithm

    Science.gov (United States)

    Lee, L.-N.

    1977-01-01

    Concatenated coding systems utilizing a convolutional code as the inner code and a Reed-Solomon code as the outer code are considered. In order to obtain very reliable communications over a very noisy channel with relatively modest coding complexity, it is proposed to concatenate a byte-oriented unit-memory convolutional code with an RS outer code whose symbol size is one byte. It is further proposed to utilize a real-time minimal-byte-error probability decoding algorithm, together with feedback from the outer decoder, in the decoder for the inner convolutional code. The performance of the proposed concatenated coding system is studied, and the improvement over conventional concatenated systems due to each additional feature is isolated.

  1. Benchmark studies of BOUT++ code and TPSMBI code on neutral transport during SMBI

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Y.H. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); University of Science and Technology of China, Hefei 230026 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China); Wang, Z.H., E-mail: zhwang@swip.ac.cn [Southwestern Institute of Physics, Chengdu 610041 (China); Guo, W., E-mail: wfguo@ipp.ac.cn [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China); Ren, Q.L. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Sun, A.P.; Xu, M.; Wang, A.K. [Southwestern Institute of Physics, Chengdu 610041 (China); Xiang, N. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China)

    2017-06-09

    SMBI (supersonic molecule beam injection) plays an important role in tokamak plasma fuelling, density control and ELM mitigation in magnetic confinement plasma physics, and has been widely used in many tokamaks. The trans-neut module of the BOUT++ code is the only large-scale parallel 3D fluid code used to simulate the SMBI fueling process, while the TPSMBI (transport of supersonic molecule beam injection) code is a recently developed 1D fluid code for SMBI. In order to find a method to increase SMBI fueling efficiency in H-mode plasma, especially for ITER, it is important first to verify the codes. A benchmark study between the trans-neut module of the BOUT++ code and the TPSMBI code on the radial transport dynamics of neutrals during SMBI has been successfully carried out for the first time, in both slab and cylindrical coordinates. The simulation results from the trans-neut module of the BOUT++ code and the TPSMBI code agree very well with each other. Different upwind schemes have been compared for handling the sharp-gradient front region during the inward propagation of SMBI, with a view to code stability. The influence of the WENO3 (weighted essentially non-oscillatory) and the third-order upwind schemes on the benchmark results has also been discussed. - Highlights: • A 1D model of SMBI has been developed. • Benchmarks of the BOUT++ and TPSMBI codes have been completed for the first time. • The influence of the WENO3 and the third-order upwind schemes on the benchmark results has also been discussed.
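
    To illustrate the kind of sharp-front advection these upwind schemes address, here is a minimal first-order upwind sketch (my own toy example; the codes themselves use third-order upwind and WENO3 reconstructions, which are not reproduced here):

    import numpy as np

    def upwind_step(u, c, dx, dt):
        """One explicit first-order upwind step for du/dt + c du/dx = 0 (c > 0).

        First-order upwind stays monotone but smears a sharp front, which is
        why higher-order schemes such as WENO3 or third-order upwind are
        preferred near steep gradients.
        """
        un = u.copy()
        u[1:] = un[1:] - c * dt / dx * (un[1:] - un[:-1])
        u[0] = un[0]  # crude inflow boundary: hold the boundary value fixed
        return u

    # Advect a sharp front (a crude stand-in for the SMBI density front).
    x = np.linspace(0.0, 1.0, 201)
    u = np.where(x < 0.2, 1.0, 0.0)
    for _ in range(100):
        u = upwind_step(u, c=1.0, dx=x[1] - x[0], dt=0.004)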

  2. Algebraic and stochastic coding theory

    CERN Document Server

    Kythe, Dave K

    2012-01-01

    Using a simple yet rigorous approach, Algebraic and Stochastic Coding Theory makes the subject of coding theory easy to understand for readers with a thorough knowledge of digital arithmetic, Boolean and modern algebra, and probability theory. It explains the underlying principles of coding theory and offers a clear, detailed description of each code. More advanced readers will appreciate its coverage of recent developments in coding theory and stochastic processes. After a brief review of coding history and Boolean algebra, the book introduces linear codes, including Hamming and Golay codes.
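
    As a concrete taste of the linear codes the book introduces, here is a minimal, self-contained sketch of Hamming(7,4) encoding and single-error syndrome decoding (the systematic G and H below are one standard choice, not taken from the book):

    import numpy as np

    # Hamming(7,4) in systematic form G = [I | P], H = [P^T | I].
    G = np.array([[1, 0, 0, 0, 1, 1, 0],
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])
    H = np.array([[1, 1, 0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]])

    def encode(u):
        return (np.array(u) @ G) % 2

    def decode(r):
        r = np.array(r).copy()
        s = (H @ r) % 2
        if s.any():
            # for a single bit error the syndrome equals the column of H
            # at the error position, so find that column and flip the bit
            err = int(np.argmax(np.all(H.T == s, axis=1)))
            r[err] ^= 1
        return r[:4]  # systematic code: the first four bits are the message

    c = encode([1, 0, 1, 1])
    c[2] ^= 1                      # introduce a single bit error
    print(decode(c))               # -> [1 0 1 1], the original message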

  3. Optical coding theory with Prime

    CERN Document Server

    Kwong, Wing C

    2013-01-01

    Although several books cover the coding theory of wireless communications and the hardware technologies and coding techniques of optical CDMA, no book has been specifically dedicated to optical coding theory-until now. Written by renowned authorities in the field, Optical Coding Theory with Prime gathers together in one volume the fundamentals and developments of optical coding theory, with a focus on families of prime codes, supplemented with several families of non-prime codes. The book also explores potential applications to coding-based optical systems and networks. Learn How to Construct

  4. Identification of coding and non-coding mutational hotspots in cancer genomes.

    Science.gov (United States)

    Piraino, Scott W; Furney, Simon J

    2017-01-05

    The identification of mutations that play a causal role in tumour development, so called "driver" mutations, is of critical importance for understanding how cancers form and how they might be treated. Several large cancer sequencing projects have identified genes that are recurrently mutated in cancer patients, suggesting a role in tumourigenesis. While the landscape of coding drivers has been extensively studied and many of the most prominent driver genes are well characterised, comparatively less is known about the role of mutations in the non-coding regions of the genome in cancer development. The continuing fall in genome sequencing costs has resulted in a concomitant increase in the number of cancer whole genome sequences being produced, facilitating systematic interrogation of both the coding and non-coding regions of cancer genomes. To examine the mutational landscapes of tumour genomes we have developed a novel method to identify mutational hotspots in tumour genomes using both mutational data and information on evolutionary conservation. We have applied our methodology to over 1300 whole cancer genomes and show that it identifies prominent coding and non-coding regions that are known or highly suspected to play a role in cancer. Importantly, we applied our method to the entire genome, rather than relying on predefined annotations (e.g. promoter regions) and we highlight recurrently mutated regions that may have resulted from increased exposure to mutational processes rather than selection, some of which have been identified previously as targets of selection. Finally, we implicate several pan-cancer and cancer-specific candidate non-coding regions, which could be involved in tumourigenesis. We have developed a framework to identify mutational hotspots in cancer genomes, which is applicable to the entire genome. This framework identifies known and novel coding and non-coding mutational hotspots and can be used to differentiate candidate driver regions from

  5. On the Performance of a Multi-Edge Type LDPC Code for Coded Modulation

    NARCIS (Netherlands)

    Cronie, H.S.

    2005-01-01

    We present a method to combine error-correction coding and spectral-efficient modulation for transmission over the Additive White Gaussian Noise (AWGN) channel. The code employs signal shaping which can provide a so-called shaping gain. The code belongs to the family of sparse graph codes for which

  6. Insurance billing and coding.

    Science.gov (United States)

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, common dental terminology (CDT) codes are most commonly used by dentists to submit claims, whereas current procedural terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD.9.CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD.9.CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD.9.CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms.

  7. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...

  8. Locally orderless registration code

    DEFF Research Database (Denmark)

    2012-01-01

    This is code for the TPAMI paper "Locally Orderless Registration". The code requires Intel Threading Building Blocks to be installed and is provided for 64-bit Mac, Linux and Windows.

  9. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    Shannon limit of the channel. Among the earliest discovered codes that approach the Shannon limit were the low density parity check (LDPC) codes. The term low density arises from the property of the parity check matrix defining the code. We will now define this matrix and the role that it plays in decoding.

  10. Shielding evaluation of neutron generator hall by Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Pujala, U.; Selvakumaran, T.S.; Baskaran, R.; Venkatraman, B. [Radiological Safety Division, Indira Gandhi Center for Atomic Research, Kalpakkam (India); Thilagam, L.; Mohapatra, D.K., E-mail: swathythila2@yahoo.com [Safety Research Institute, Atomic Energy Regulatory Board, Kalpakkam (India)

    2017-04-01

    A shielded hall was constructed for accommodating a D-D, D-T or D-Be based pulsed neutron generator (NG) with a 4π yield of 10^9 n/s. The neutron shield design of the facility was optimized using the NCRP-51 methodology such that the total dose rates outside the hall areas are well below the regulatory limit for the full occupancy criterion (1 μSv/h). However, the total dose rates at the roof top, the cooling room trench exit and the labyrinth exit were found to be above this limit for the optimized design. Hence, additional neutron shielding arrangements were proposed for the cooling room trench and labyrinth exits. The roof top was made inaccessible. The present study is an attempt to evaluate the neutron and associated capture gamma transport through the bulk shields for the complete geometry and materials of the NG hall using the Monte Carlo (MC) codes MCNP and FLUKA. The neutron source terms of the D-D, D-T and D-Be reactions are considered in the simulations. The effect of the proposed additional shielding has been demonstrated through simulations carried out with the additional shielding in place for the D-Be neutron source term. The results of the MC simulations using the two different codes are found to be consistent with each other for the neutron dose rate estimates. However, deviations of up to 28% are noted between the two codes at a few locations for the capture gamma dose rate estimates. Overall, the dose rates estimated by the MC simulations including the additional shields show that all the locations surrounding the hall satisfy the full occupancy criterion for all three types of sources. Additionally, the dose rates due to direct transmission of primary neutrons estimated by FLUKA are compared with the values calculated using the formula given in NCRP-51, which show deviations of up to 50% from each other. The details of the MC simulations and the NCRP-51 methodology for the estimation of the primary neutron dose rate, along with the results, are presented in this paper. (author)
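
    The NCRP-51 hand calculation referred to above is not reproduced in the record; as a generic illustration only (an assumption on my part, not the report's exact formula), line-of-sight estimates of this kind typically take a point-kernel form:

    H(d, t) \;\approx\; \frac{S}{4\pi d^{2}}\; e^{-t/\lambda}\; h_{\Phi},

    where S is the source yield (n/s), d the source-to-point distance, t the shield thickness traversed, \lambda an effective attenuation length of the shield material for the source spectrum, and h_{\Phi} a fluence-to-dose-equivalent conversion coefficient.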

  11. Sub-Transport Layer Coding

    DEFF Research Database (Denmark)

    Hansen, Jonas; Krigslund, Jeppe; Roetter, Daniel Enrique Lucani

    2014-01-01

    Packet losses in wireless networks dramatically curbs the performance of TCP. This paper introduces a simple coding shim that aids IP-layer traffic in lossy environments while being transparent to transport layer protocols. The proposed coding approach enables erasure correction while being...... oblivious to the congestion control algorithms of the utilised transport layer protocol. Although our coding shim is indifferent towards the transport layer protocol, we focus on the performance of TCP when ran on top of our proposed coding mechanism due to its widespread use. The coding shim provides gains...

  12. Utility experience in code updating of equipment built to 1974 code, Section 3, Subsection NF

    International Nuclear Information System (INIS)

    Rao, K.R.; Deshpande, N.

    1990-01-01

    This paper addresses changes to ASME Code Subsection NF and reconciles the differences between the updated codes and the as-built construction code, ASME Section III, 1974, to which several nuclear plants have been built. Since Section III is revised every three years and replacement parts complying with the construction code are invariably not available from the plant stock inventory, parts must be procured from vendors who comply with the requirements of the latest codes. Aspects of the ASME Code which reflect Subsection NF are identified and compared with the later Code editions and addenda, especially up to and including the 1974 ASME Code used as the basis for the plant qualification. The concern of the regulatory agencies is that if later code allowables and provisions are adopted, it is possible to reduce the safety margins of the construction code. Areas of concern are highlighted and the specific changes of the later codes are discerned, adoption of which would not sacrifice the intended safety margins of the codes to which the plants are licensed

  13. Theory of epigenetic coding.

    Science.gov (United States)

    Elder, D

    1984-06-07

    The logic of genetic control of development may be based on a binary epigenetic code. This paper revises the author's previous scheme dealing with the numerology of annelid metamerism in these terms. Certain features of the code had been deduced to be combinatorial, others not. This paradoxical contrast is resolved here by the interpretation that these features relate to different operations of the code; the combinatorial to coding the identity of units, the non-combinatorial to coding the production of units. Consideration of a second paradox in the theory of epigenetic coding leads to a new solution which further provides a basis for epimorphic regeneration, and may in particular throw light on the "regeneration-duplication" phenomenon. A possible test of the model is also put forward.

  14. ETR/ITER systems code

    Energy Technology Data Exchange (ETDEWEB)

    Barr, W.L.; Bathke, C.G.; Brooks, J.N.; Bulmer, R.H.; Busigin, A.; DuBois, P.F.; Fenstermacher, M.E.; Fink, J.; Finn, P.A.; Galambos, J.D.; Gohar, Y.; Gorker, G.E.; Haines, J.R.; Hassanein, A.M.; Hicks, D.R.; Ho, S.K.; Kalsi, S.S.; Kalyanam, K.M.; Kerns, J.A.; Lee, J.D.; Miller, J.R.; Miller, R.L.; Myall, J.O.; Peng, Y-K.M.; Perkins, L.J.; Spampinato, P.T.; Strickler, D.J.; Thomson, S.L.; Wagner, C.E.; Willms, R.S.; Reid, R.L. (ed.)

    1988-04-01

    A tokamak systems code capable of modeling experimental test reactors has been developed and is described in this document. The code, named TETRA (for Tokamak Engineering Test Reactor Analysis), consists of a series of modules, each describing a tokamak system or component, controlled by an optimizer/driver. This code development was a national effort in that the modules were contributed by members of the fusion community and integrated into a code by the Fusion Engineering Design Center. The code has been checked out on the Cray computers at the National Magnetic Fusion Energy Computing Center and has satisfactorily simulated the Tokamak Ignition/Burn Experimental Reactor II (TIBER) design. A feature of this code is the ability to perform optimization studies through the use of a numerical software package, which iterates prescribed variables to satisfy a set of prescribed equations or constraints. This code will be used to perform sensitivity studies for the proposed International Thermonuclear Experimental Reactor (ITER). 22 figs., 29 tabs.
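
    The optimizer/driver pattern described above (iterate prescribed variables until prescribed constraints are satisfied) can be illustrated with a deliberately toy sketch; the variable names, the cost surrogate and the constraint below are placeholders of my own, not TETRA's actual modules:

    import numpy as np
    from scipy.optimize import minimize

    # Toy stand-ins for two component modules: each maps the design variables
    # to a figure of merit or a constraint value (placeholders, not TETRA physics).
    def cost_module(x):             # crude "capital cost" surrogate
        major_radius, field = x
        return major_radius**2 * field

    def performance_constraint(x):  # require a minimum "performance" product
        major_radius, field = x
        return major_radius * field**2 - 10.0   # must be >= 0

    result = minimize(cost_module,
                      x0=np.array([3.0, 5.0]),
                      bounds=[(1.0, 10.0), (1.0, 12.0)],
                      constraints=[{"type": "ineq", "fun": performance_constraint}])
    print(result.x, result.fun)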

  15. ETR/ITER systems code

    International Nuclear Information System (INIS)

    Barr, W.L.; Bathke, C.G.; Brooks, J.N.

    1988-04-01

    A tokamak systems code capable of modeling experimental test reactors has been developed and is described in this document. The code, named TETRA (for Tokamak Engineering Test Reactor Analysis), consists of a series of modules, each describing a tokamak system or component, controlled by an optimizer/driver. This code development was a national effort in that the modules were contributed by members of the fusion community and integrated into a code by the Fusion Engineering Design Center. The code has been checked out on the Cray computers at the National Magnetic Fusion Energy Computing Center and has satisfactorily simulated the Tokamak Ignition/Burn Experimental Reactor II (TIBER) design. A feature of this code is the ability to perform optimization studies through the use of a numerical software package, which iterates prescribed variables to satisfy a set of prescribed equations or constraints. This code will be used to perform sensitivity studies for the proposed International Thermonuclear Experimental Reactor (ITER). 22 figs., 29 tabs

  16. Coding and decoding for code division multiple user communication systems

    Science.gov (United States)

    Healy, T. J.

    1985-01-01

    A new algorithm is introduced which decodes code division multiple user communication signals. The algorithm makes use of the distinctive form or pattern of each signal to separate it from the composite signal created by the multiple users. Although the algorithm is presented in terms of frequency-hopped signals, the actual transmitter modulator can use any of the existing digital modulation techniques. The algorithm is applicable to error-free codes or to codes where controlled interference is permitted. It can be used when block synchronization is assumed, and in some cases when it is not. The paper also discusses briefly some of the codes which can be used in connection with the algorithm, and relates the algorithm to past studies which use other approaches to the same problem.

  17. Hermitian self-dual quasi-abelian codes

    Directory of Open Access Journals (Sweden)

    Herbert S. Palines

    2017-12-01

    Full Text Available Quasi-abelian codes constitute an important class of linear codes containing theoretically and practically interesting codes such as quasi-cyclic codes, abelian codes, and cyclic codes. In particular, the sub-class consisting of 1-generator quasi-abelian codes contains large families of good codes. Based on the well-known decomposition of quasi-abelian codes, the characterization and enumeration of Hermitian self-dual quasi-abelian codes are given. In the case of 1-generator quasi-abelian codes, we offer necessary and sufficient conditions for such codes to be Hermitian self-dual and give a formula for the number of these codes. In the case where the underlying groups are some $p$-groups, the actual number of resulting Hermitian self-dual quasi-abelian codes is determined.
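
    For reference (standard definitions, following common usage rather than the paper's own notation), a code C of length n over GF(q^2) is Hermitian self-dual when it equals its Hermitian dual:

    \langle x, y \rangle_{H} \;=\; \sum_{i=1}^{n} x_i\, y_i^{\,q}, \qquad C^{\perp_H} \;=\; \{\, v \in GF(q^2)^n : \langle v, c \rangle_H = 0 \ \ \forall\, c \in C \,\}, \qquad C = C^{\perp_H}.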

  18. Hominoid-specific de novo protein-coding genes originating from long non-coding RNAs.

    Directory of Open Access Journals (Sweden)

    Chen Xie

    2012-09-01

    Full Text Available Tinkering with pre-existing genes has long been known as a major way to create new genes. Recently, however, motherless protein-coding genes have been found to have emerged de novo from ancestral non-coding DNAs. How these genes originated is not well addressed to date. Here we identified 24 hominoid-specific de novo protein-coding genes with precise origination timing in vertebrate phylogeny. Strand-specific RNA-Seq analyses were performed in five rhesus macaque tissues (liver, prefrontal cortex, skeletal muscle, adipose, and testis), which were then integrated with public transcriptome data from human, chimpanzee, and rhesus macaque. On the basis of comparing the RNA expression profiles in the three species, we found that most of the hominoid-specific de novo protein-coding genes encoded polyadenylated non-coding RNAs in rhesus macaque or chimpanzee with a similar transcript structure and correlated tissue expression profile. According to the rule of parsimony, the majority of these hominoid-specific de novo protein-coding genes appear to have acquired a regulated transcript structure and expression profile before acquiring coding potential. Interestingly, although the expression profile was largely correlated, the coding genes in human often showed higher transcriptional abundance than their non-coding counterparts in rhesus macaque. The major findings we report in this manuscript are robust and insensitive to the parameters used in the identification and analysis of de novo genes. Our results suggest that at least a portion of long non-coding RNAs, especially those with active and regulated transcription, may serve as a birth pool for protein-coding genes, which are then further optimized at the transcriptional level.

  19. An algebraic approach to graph codes

    DEFF Research Database (Denmark)

    Pinero, Fernando

    This thesis consists of six chapters. The first chapter contains a short introduction to coding theory in which we explain the coding theory concepts we use. In the second chapter, we present the required theory for evaluation codes and also give an example of some fundamental codes in coding theory as evaluation codes. Chapter three consists of the introduction to graph based codes, such as Tanner codes and graph codes. In Chapter four, we compute the dimension of some graph based codes with a result combining graph based codes and subfield subcodes. Moreover, some codes in chapter four...

  20. Some new ternary linear codes

    Directory of Open Access Journals (Sweden)

    Rumen Daskalov

    2017-07-01

    Full Text Available Let an $[n,k,d]_q$ code be a linear code of length $n$, dimension $k$ and minimum Hamming distance $d$ over $GF(q)$. One of the most important problems in coding theory is to construct codes with optimal minimum distances. In this paper 22 new ternary linear codes are presented. Two of them are optimal. All new codes improve the respective lower bounds in [11].
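
    As a quick illustration of the $[n,k,d]_q$ notation (a generic sketch, not code or data from the paper), the minimum distance of a small ternary linear code can be checked by brute force from a generator matrix; the matrix below is hypothetical.

        from itertools import product

        def min_distance(G, q=3):
            """Minimum Hamming weight over all nonzero codewords generated by the rows of G over GF(q)."""
            k, n = len(G), len(G[0])
            best = n
            for msg in product(range(q), repeat=k):
                if not any(msg):
                    continue
                word = [sum(m * g for m, g in zip(msg, col)) % q for col in zip(*G)]
                best = min(best, sum(1 for x in word if x != 0))
            return best

        G = [[1, 0, 1, 1],
             [0, 1, 1, 2]]              # hypothetical generator matrix of a [4,2]_3 code
        print(min_distance(G))          # the minimum distance d of this toy code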

  1. CITOPP, CITMOD, CITWI, Processing codes for CITATION Code

    International Nuclear Information System (INIS)

    Albarhoum, M.

    2008-01-01

    Description of program or function: CITOPP processes the output file of the CITATION 3-D diffusion code. The program can plot axial, radial and circumferential flux distributions (in cylindrical geometry) in addition to the multiplication factor convergence. The flux distributions can be drawn for each group specified by the program and visualized on the screen. CITMOD processes both the output and the input files of the CITATION 3-D diffusion code. CITMOD can visualize both axial, and radial-angular models of the reactor described by CITATION input/output files. CITWI processes the input file (CIT.INP) of CITATION 3-D diffusion code. CIT.INP is processed to deduce the dimensions of the cell whose cross sections can be representative of the homonym reactor component in section 008 of CIT.INP

  2. Coding for Electronic Mail

    Science.gov (United States)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.

  3. Channel coding techniques for wireless communications

    CERN Document Server

    Deergha Rao, K

    2015-01-01

    The book discusses modern channel coding techniques for wireless communications such as turbo codes, low-density parity check (LDPC) codes, space–time (ST) coding, RS (or Reed–Solomon) codes and convolutional codes. Many illustrative examples are included in each chapter for easy understanding of the coding techniques. The text is integrated with MATLAB-based programs to enhance the understanding of the subject’s underlying theories. It includes current topics of increasing importance such as turbo codes, LDPC codes, Luby transform (LT) codes, Raptor codes, and ST coding in detail, in addition to the traditional codes such as cyclic codes, BCH (or Bose–Chaudhuri–Hocquenghem) and RS codes and convolutional codes. Multiple-input and multiple-output (MIMO) communications is a multiple antenna technology, which is an effective method for high-speed or high-reliability wireless communications. PC-based MATLAB m-files for the illustrative examples are provided on the book page on Springer.com for free dow...

  4. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    2001-01-01

    The description of reactor lattice codes is carried out on the example of the WIMSD-5B code. The WIMS code in its various versions is the most recognised lattice code. It is used in all parts of the world for calculations of research and power reactors. The version WIMSD-5B is distributed free of charge by the NEA Data Bank. The description of its main features given in the present lecture follows the aspects defined previously for lattice calculations in the lecture on Reactor Lattice Transport Calculations. The spatial models are described, and the approach to the energy treatment is given. Finally the specific algorithm applied in fuel depletion calculations is outlined. (author)

  5. Exploring the concept of QR Code and the benefits of using QR Code for companies

    OpenAIRE

    Ji, Qianyu

    2014-01-01

    This research work concentrates on the concept of QR Code and the benefits of using QR Code for companies. The first objective of this research work is to study the general information of QR Code in order to guide people to understand the QR Code in detail. The second objective of this research work is to explore and analyze the essential and feasible technologies of QR Code for the sake of clarifying the technologies of QR Code. Additionally, this research work through QR Code best practices t...

  6. Coding training for medical students: How good is diagnoses coding with ICD-10 by novices?

    Directory of Open Access Journals (Sweden)

    Stausberg, Jürgen

    2005-04-01

    Full Text Available Teaching of knowledge and competence in documentation and coding is an essential part of medical education. Therefore, coding training had been placed within the course of epidemiology, medical biometry, and medical informatics. From this, we can draw conclusions about the quality of coding by novices. One hundred and eighteen students coded diagnoses from 15 nephrological cases in homework. In addition to interrater reliability, validity was calculated by comparison with a reference coding. On the level of terminal codes, 59.3% of the students' results were correct. The completeness was calculated as 58.0%. The results on the chapter level increased up to 91.5% and 87.7% respectively. For the calculation of reliability a new, simple measure was developed that leads to values of 0.46 on the level of terminal codes and 0.87 on the chapter level for interrater reliability. The figures of concordance with the reference coding are quite similar. In contrast, routine data show considerably lower results with 0.34 and 0.63 respectively. Interrater reliability and validity of coding by novices is as good as coding by experts. The missing advantage of experts could be explained by the workload of documentation and a negative attitude to coding on the one hand. On the other hand, coding in a DRG-system is handicapped by a large number of detailed coding rules, which do not end in uniform results but rather lead to wrong and random codes. Anyway, students left the course well prepared for coding.
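
    The record does not give the new reliability measure in closed form, so the sketch below only illustrates the simpler validity figures (the fraction of codes matching a reference, at terminal-code and chapter level); the codes and the chapter approximation (first letter of the ICD-10 code) are hypothetical.

        def agreement(student, reference, chapter_level=False):
            """Fraction of cases where the student's ICD-10 code matches the reference."""
            trim = (lambda c: c[:1]) if chapter_level else (lambda c: c)
            return sum(trim(s) == trim(r) for s, r in zip(student, reference)) / len(reference)

        reference = ["N18.5", "N04.1", "I10"]       # hypothetical reference coding
        student   = ["N18.4", "N04.1", "I15.0"]     # hypothetical student coding
        print(agreement(student, reference))                       # terminal-code level
        print(agreement(student, reference, chapter_level=True))   # approximate chapter level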

  7. Turbo coding, turbo equalisation and space-time coding for transmission over fading channels

    CERN Document Server

    Hanzo, L; Yeap, B

    2002-01-01

    Against the backdrop of the emerging 3G wireless personal communications standards and broadband access network standard proposals, this volume covers a range of coding and transmission aspects for transmission over fading wireless channels. It presents the most important classic channel coding issues and also the exciting advances of the last decade, such as turbo coding, turbo equalisation and space-time coding. It endeavours to be the first book with explicit emphasis on channel coding for transmission over wireless channels. Divided into 4 parts: Part 1 - explains the necessary background for novices. It aims to be both an easy reading text book and a deep research monograph. Part 2 - provides detailed coverage of turbo conventional and turbo block coding considering the known decoding algorithms and their performance over Gaussian as well as narrowband and wideband fading channels. Part 3 - comprehensively discusses both space-time block and space-time trellis coding for the first time in literature. Par...

  8. Activation of accelerator construction materials by heavy ions

    Energy Technology Data Exchange (ETDEWEB)

    Katrík, P., E-mail: p.katrik@gsi.de [GSI Darmstadt, Planckstrasse 1, D-64291 (Germany); Mustafin, E. [GSI Darmstadt, Planckstrasse 1, D-64291 (Germany); Hoffmann, D.H.H. [TU Darmstadt, Schlossgartenstraße 9, D-64289 (Germany); Pavlovič, M. [FEI STU Bratislava, Ilkovičova 3, SK-81219 (Slovakia); Strašík, I. [GSI Darmstadt, Planckstrasse 1, D-64291 (Germany)

    2015-12-15

    Activation data for an aluminum target irradiated by 200 MeV/u 238U ion beam are presented in the paper. The target was irradiated in the stacked-foil geometry and analyzed using gamma-ray spectroscopy. The purpose of the experiment was to study the role of primary particles, projectile fragments, and target fragments in the activation process using the depth profiling of residual activity. The study brought information on which particles contribute dominantly to the target activation. The experimental data were compared with the Monte Carlo simulations by the FLUKA 2011.2c.0 code. This study is a part of a research program devoted to activation of accelerator construction materials by high-energy (⩾200 MeV/u) heavy ions at GSI Darmstadt. The experimental data are needed to validate the computer codes used for simulation of interaction of swift heavy ions with matter.

  9. Decoding Xing-Ling codes

    DEFF Research Database (Denmark)

    Nielsen, Rasmus Refslund

    2002-01-01

    This paper describes an efficient decoding method for a recent construction of good linear codes as well as an extension to the construction. Furthermore, asymptotic properties and list decoding of the codes are discussed.

  10. 75 FR 19944 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2010-04-16

    ... documents from ICC's Chicago District Office: International Code Council, 4051 W Flossmoor Road, Country... Energy Conservation Code. International Existing Building Code. International Fire Code. International...

  11. Code portability and data management considerations in the SAS3D LMFBR accident-analysis code

    International Nuclear Information System (INIS)

    Dunn, F.E.

    1981-01-01

    The SAS3D code was produced from a predecessor in order to reduce or eliminate interrelated problems in the areas of code portability, the large size of the code, inflexibility in the use of memory and the size of cases that can be run, code maintenance, and running speed. Many conventional solutions, such as variable dimensioning, disk storage, virtual memory, and existing code-maintenance utilities were not feasible or did not help in this case. A new data management scheme was developed, coding standards and procedures were adopted, special machine-dependent routines were written, and a portable source code processing code was written. The resulting code is quite portable, quite flexible in the use of memory and the size of cases that can be run, much easier to maintain, and faster running. SAS3D is still a large, long running code that only runs well if sufficient main memory is available

  12. Error-correction coding for digital communications

    Science.gov (United States)

    Clark, G. C., Jr.; Cain, J. B.

    This book is written for the design engineer who must build the coding and decoding equipment and for the communication system engineer who must incorporate this equipment into a system. It is also suitable as a senior-level or first-year graduate text for an introductory one-semester course in coding theory. Fundamental concepts of coding are discussed along with group codes, taking into account basic principles, practical constraints, performance computations, coding bounds, generalized parity check codes, polynomial codes, and important classes of group codes. Other topics explored are related to simple nonalgebraic decoding techniques for group codes, soft decision decoding of block codes, algebraic techniques for multiple error correction, the convolutional code structure and Viterbi decoding, syndrome decoding techniques, and sequential decoding techniques. System applications are also considered, giving attention to concatenated codes, coding for the white Gaussian noise channel, interleaver structures for coded systems, and coding for burst noise channels.
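
    As a pointer to one of the techniques listed (syndrome decoding), the sketch below decodes a single bit error in the binary (7,4) Hamming code; it is a standard textbook example, not material reproduced from this book.

        import numpy as np

        # Parity-check matrix whose j-th column is the binary representation of j (j = 1..7).
        H = np.array([[1, 0, 1, 0, 1, 0, 1],
                      [0, 1, 1, 0, 0, 1, 1],
                      [0, 0, 0, 1, 1, 1, 1]])

        def decode(received):
            r = np.array(received) % 2
            syndrome = H @ r % 2
            pos = int("".join(map(str, syndrome[::-1])), 2)   # nonzero syndrome = index of the flipped bit
            if pos:
                r[pos - 1] ^= 1
            return r

        print(decode([1, 0, 1, 1, 0, 1, 0]))   # corrects any single bit error in a (7,4) Hamming codeword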

  13. Code Team Training: Demonstrating Adherence to AHA Guidelines During Pediatric Code Blue Activations.

    Science.gov (United States)

    Stewart, Claire; Shoemaker, Jamie; Keller-Smith, Rachel; Edmunds, Katherine; Davis, Andrew; Tegtmeyer, Ken

    2017-10-16

    Pediatric code blue activations are infrequent events with a high mortality rate despite the best effort of code teams. The best method for training these code teams is debatable; however, it is clear that training is needed to assure adherence to American Heart Association (AHA) Resuscitation Guidelines and to prevent the decay that invariably occurs after Pediatric Advanced Life Support training. The objectives of this project were to train a multidisciplinary, multidepartmental code team and to measure this team's adherence to AHA guidelines during code simulation. Multidisciplinary code team training sessions were held using high-fidelity, in situ simulation. Sessions were held several times per month. Each session was filmed and reviewed for adherence to 5 AHA guidelines: chest compression rate, ventilation rate, chest compression fraction, use of a backboard, and use of a team leader. After the first study period, modifications were made to the code team including implementation of just-in-time training and alteration of the compression team. Thirty-eight sessions were completed, with 31 eligible for video analysis. During the first study period, 1 session adhered to all AHA guidelines. During the second study period, after alteration of the code team and implementation of just-in-time training, no sessions adhered to all AHA guidelines; however, there was an improvement in percentage of sessions adhering to ventilation rate and chest compression rate and an improvement in median ventilation rate. We present a method for training a large code team drawn from multiple hospital departments and a method of assessing code team performance. Despite subjective improvement in code team positioning, communication, and role completion and some improvement in ventilation rate and chest compression rate, we failed to consistently demonstrate improvement in adherence to all guidelines.

  14. Summary and presentation of the international workshop on beam induced energy deposition (issues, concerns, solutions)

    International Nuclear Information System (INIS)

    Soundranayagam, R.

    1991-11-01

    This report discusses: energy deposition and radiation shielding in antiproton source at FNAL; radiation issues/problems at RHIC; radiation damage to polymers; radiation effects on optical fibre in the SSC tunnel; capabilities of the Brookhaven Radiation Effects Facility; the SSC interaction region; the FLUKA code system, modifications, recent extension and experimental verification; energy particle transport calculations and comparisons with experimental data; Los Alamos High Energy Transport code system; MCNP features and applications; intercomparison of Monte Carlo codes designed for simulation of high energy hadronic cascades; event generator, DTUJET-90 and DTUNUC; Preliminary hydrodynamic calculations of beam energy deposition; MESA code calculations of material response to explosive energy deposition; Smoothed particle hydrodynamics; hydrodynamic effects and mass depletion phenomena in targets; beam dump: Beam sweeping and spoilers; Design considerations to mitigate effects of accidental beam dump; SSC beam abort and absorbed; beam abort system of SSC options; unconventional scheme for beam spoilers; low β quadrupoles: Energy deposition and radioactivation; beam induced energy deposition in the SSC components; extension of SSC-SR-1033 approach to radioactivation in LHC and SSC detectors; energy deposition in the SSC low-β IR-quads; beam losses and collimation in the LHC; and radiation shielding around scrapers

  15. MARS code manual volume I: code structure, system models, and solution methods

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Kim, Kyung Doo; Bae, Sung Won; Jeong, Jae Jun; Lee, Seung Wook; Hwang, Moon Kyu; Yoon, Churl

    2010-02-01

    Korea Advanced Energy Research Institute (KAERI) conceived and started the development of MARS code with the main objective of producing a state-of-the-art realistic thermal hydraulic systems analysis code with multi-dimensional analysis capability. MARS achieves this objective by very tightly integrating the one dimensional RELAP5/MOD3 with the multi-dimensional COBRA-TF codes. The method of integration of the two codes is based on the dynamic link library techniques, and the system pressure equation matrices of both codes are implicitly integrated and solved simultaneously. In addition, the Equation-Of-State (EOS) for the light water was unified by replacing the EOS of COBRA-TF by that of the RELAP5. This theory manual provides a complete list of overall information of code structure and major function of MARS including code architecture, hydrodynamic model, heat structure, trip / control system and point reactor kinetics model. Therefore, this report would be very useful for the code users. The overall structure of the manual is modeled on the structure of the RELAP5 and as such the layout of the manual is very similar to that of the RELAP. This similitude to RELAP5 input is intentional as this input scheme will allow minimum modification between the inputs of RELAP5 and MARS3.1. MARS3.1 development team would like to express its appreciation to the RELAP5 Development Team and the USNRC for making this manual possible

  16. Deciphering the genetic regulatory code using an inverse error control coding framework.

    Energy Technology Data Exchange (ETDEWEB)

    Rintoul, Mark Daniel; May, Elebeoba Eni; Brown, William Michael; Johnston, Anna Marie; Watson, Jean-Paul

    2005-03-01

    We have found that developing a computational framework for reconstructing error control codes for engineered data and ultimately for deciphering genetic regulatory coding sequences is a challenging and uncharted area that will require advances in computational technology for exact solutions. Although exact solutions are desired, computational approaches that yield plausible solutions would be considered sufficient as a proof of concept to the feasibility of reverse engineering error control codes and the possibility of developing a quantitative model for understanding and engineering genetic regulation. Such evidence would help move the idea of reconstructing error control codes for engineered and biological systems from the high risk high payoff realm into the highly probable high payoff domain. Additionally this work will impact biological sensor development and the ability to model and ultimately develop defense mechanisms against bioagents that can be engineered to cause catastrophic damage. Understanding how biological organisms are able to communicate their genetic message efficiently in the presence of noise can improve our current communication protocols, a continuing research interest. Towards this end, project goals include: (1) Develop parameter estimation methods for n for block codes and for n, k, and m for convolutional codes. Use methods to determine error control (EC) code parameters for gene regulatory sequence. (2) Develop an evolutionary computing computational framework for near-optimal solutions to the algebraic code reconstruction problem. Method will be tested on engineered and biological sequences.
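
    One standard (and here purely illustrative) way to estimate the block length n from a stream of received codewords, assuming block synchronization, is to fold the bits into matrices of candidate widths and look for a rank deficiency over GF(2); the deficiency appears only when the candidate width matches the n of an (n,k) code. This sketch is not necessarily the estimation method the project developed.

        import numpy as np

        def gf2_rank(M):
            """Rank of a 0/1 matrix over GF(2) by Gauss-Jordan elimination."""
            M = M.copy() % 2
            rank = 0
            for col in range(M.shape[1]):
                pivots = np.nonzero(M[rank:, col])[0]
                if pivots.size == 0:
                    continue
                M[[rank, rank + pivots[0]]] = M[[rank + pivots[0], rank]]   # move a pivot row up
                others = (M[:, col] == 1) & (np.arange(M.shape[0]) != rank)
                M[others] ^= M[rank]                                        # clear the rest of the column
                rank += 1
                if rank == M.shape[0]:
                    break
            return rank

        def block_length_scores(bits, n_max=16, rows=40):
            """Rank deficiency of the folded stream for each candidate block length."""
            scores = {}
            for n in range(2, n_max + 1):
                usable = (len(bits) // n) * n
                M = np.array(bits[:usable]).reshape(-1, n)[:rows]
                scores[n] = n - gf2_rank(M)      # deficiency > 0 hints that n is the true block length
            return scores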

  17. Nuclear code abstracts (1975 edition)

    International Nuclear Information System (INIS)

    Akanuma, Makoto; Hirakawa, Takashi

    1976-02-01

    Nuclear Code Abstracts is compiled in the Nuclear Code Committee to exchange information of the nuclear code developments among members of the committee. Enlarging the collection, the present one includes nuclear code abstracts obtained in 1975 through liaison officers of the organizations in Japan participating in the Nuclear Energy Agency's Computer Program Library at Ispra, Italy. The classification of nuclear codes and the format of code abstracts are the same as those in the library. (auth.)

  18. Event Generators for Simulating Heavy Ion Interactions of Interest in Evaluating Risks in Human Spaceflight

    Science.gov (United States)

    Wilson, Thomas L.; Pinsky, Lawrence; Andersen, Victor; Empl, Anton; Lee, Kerry; Smirmov, Georgi; Zapp, Neal; Ferrari, Alfredo; Tsoulou, Katerina; Roesler, Stefan

    2005-01-01

    Simulating the Space Radiation environment with Monte Carlo Codes, such as FLUKA, requires the ability to model the interactions of heavy ions as they penetrate spacecraft and crew member's bodies. Monte-Carlo-type transport codes use total interaction cross sections to determine probabilistically when a particular type of interaction has occurred. Then, at that point, a distinct event generator is employed to determine separately the results of that interaction. The space radiation environment contains a full spectrum of radiation types, including relativistic nuclei, which are the most important component for the evaluation of crew doses. Interactions between incident protons with target nuclei in the spacecraft materials and crew member's bodies are well understood. However, the situation is substantially less comfortable for incident heavier nuclei (heavy ions). We have been engaged in developing several related heavy ion interaction models based on a Quantum Molecular Dynamics-type approach for energies up through about 5 GeV per nucleon (GeV/A) as part of a NASA Consortium that includes a parallel program of cross section measurements to guide and verify this code development.

  19. Geochemical computer codes. A review

    International Nuclear Information System (INIS)

    Andersson, K.

    1987-01-01

    In this report a review of available codes is performed and some code intercomparisons are also discussed. The number of codes treating natural waters (groundwater, lake water, sea water) is large. Most geochemical computer codes treat equilibrium conditions, although some codes with kinetic capability are available. A geochemical equilibrium model consists of a computer code, solving a set of equations by some numerical method, and a data base, consisting of thermodynamic data required for the calculations. There are some codes which treat coupled geochemical and transport modeling. Some of these codes solve the equilibrium and transport equations simultaneously while others solve the equations separately from each other. The coupled codes require a large computer capacity and have thus far had limited use. Three code intercomparisons have been found in the literature. It may be concluded that there are many codes available for geochemical calculations but most of them require a user that is quite familiar with the code. The user also has to know the geochemical system in order to judge the reliability of the results. A high quality data base is necessary to obtain a reliable result. The best results may be expected for the major species of natural waters. For more complicated problems, including trace elements, precipitation/dissolution, adsorption, etc., the results seem to be less reliable. (With 44 refs.) (author)

  20. Bandwidth efficient coding

    CERN Document Server

    Anderson, John B

    2017-01-01

    Bandwidth Efficient Coding addresses the major challenge in communication engineering today: how to communicate more bits of information in the same radio spectrum. Energy and bandwidth are needed to transmit bits, and bandwidth affects capacity the most. Methods have been developed that are ten times as energy efficient at a given bandwidth consumption as simple methods. These employ signals with very complex patterns and are called "coding" solutions. The book begins with classical theory before introducing new techniques that combine older methods of error correction coding and radio transmission in order to create narrowband methods that are as efficient in both spectrum and energy as nature allows. Other topics covered include modulation techniques such as CPM, coded QAM and pulse design.

  1. Coding for urologic office procedures.

    Science.gov (United States)

    Dowling, Robert A; Painter, Mark

    2013-11-01

    This article summarizes current best practices for documenting, coding, and billing common office-based urologic procedures. Topics covered include general principles, basic and advanced urologic coding, creation of medical records that support compliant coding practices, bundled codes and unbundling, global periods, modifiers for procedure codes, when to bill for evaluation and management services during the same visit, coding for supplies, and laboratory and radiology procedures pertinent to urology practice. Detailed information is included for the most common urology office procedures, and suggested resources and references are provided. This information is of value to physicians, office managers, and their coding staff. Copyright © 2013 Elsevier Inc. All rights reserved.

  2. Aztheca Code

    International Nuclear Information System (INIS)

    Quezada G, S.; Espinosa P, G.; Centeno P, J.; Sanchez M, H.

    2017-09-01

    This paper presents the Aztheca code, which is formed by the mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models and control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and the Aztheca model. The results show that both RELAP-5 and the Aztheca code have the ability to adequately predict the behavior of the reactor. (Author)

  3. Critical Care Coding for Neurologists.

    Science.gov (United States)

    Nuwer, Marc R; Vespa, Paul M

    2015-10-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  4. FERRET data analysis code

    International Nuclear Information System (INIS)

    Schmittroth, F.

    1979-09-01

    A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and in providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples
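
    A minimal sketch of the generalized least-squares combination described above (illustrative only, not the FERRET implementation): a prior calculation and a measurement, each with its covariance, are merged into a posterior estimate with a quantitative uncertainty. All numbers are invented.

        import numpy as np

        def gls_update(x_prior, cov_prior, y_meas, cov_meas, G):
            """Update prior parameters x with measurements y = G x + noise."""
            S = G @ cov_prior @ G.T + cov_meas
            K = cov_prior @ G.T @ np.linalg.inv(S)          # gain matrix
            x_post = x_prior + K @ (y_meas - G @ x_prior)
            cov_post = cov_prior - K @ G @ cov_prior
            return x_post, cov_post

        x0 = np.array([1.00, 2.00])            # prior (calculated) values
        P0 = np.diag([0.10**2, 0.20**2])       # prior covariance
        y  = np.array([3.10])                  # measured sum of both parameters
        R  = np.array([[0.05**2]])             # measurement covariance
        G  = np.array([[1.0, 1.0]])            # relates parameters to the measurement

        x1, P1 = gls_update(x0, P0, y, R, G)
        print(x1, np.sqrt(np.diag(P1)))        # posterior values and their uncertainties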

  5. Enhancing QR Code Security

    OpenAIRE

    Zhang, Linfan; Zheng, Shuang

    2015-01-01

    Quick Response code opens possibility to convey data in a unique way yet insufficient prevention and protection might lead into QR code being exploited on behalf of attackers. This thesis starts by presenting a general introduction of background and stating two problems regarding QR code security, which followed by a comprehensive research on both QR code itself and related issues. From the research a solution taking advantages of cloud and cryptography together with an implementation come af...

  6. The Aesthetics of Coding

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated...... code, etc.). The presentation relates this artistic fascination of code to a media critique expressed by Florian Cramer, claiming that the graphical interface represents a media separation (of text/code and image) causing alienation to the computer’s materiality. Cramer is thus the voice of a new ‘code...... avant-garde’. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: “art-oriented programming needs to acknowledge the conditions of its own making – its poesis.” By analysing the Live Coding performances of Slub (where they program computer music live), the presentation...

  7. Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Faber, M.H.; Sørensen, John Dalsgaard

    2003-01-01

    The present paper addresses fundamental concepts of reliability based code calibration. First basic principles of structural reliability theory are introduced and it is shown how the results of FORM based reliability analysis may be related to partial safety factors and characteristic values. Thereafter the code calibration problem is presented in its principal decision theoretical form and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore suggested values for acceptable annual failure probabilities are given for ultimate and serviceability limit states. Finally the paper describes the Joint Committee on Structural Safety (JCSS) recommended procedure - CodeCal - for the practical implementation of reliability based code calibration of LRFD based design codes.
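
    For a concrete and deliberately simplified example of the quantities involved (the numbers are assumed; a real calibration would use FORM with full stochastic models): with a linear limit state g = R - S and independent normal resistance R and load effect S, the reliability index and failure probability follow in closed form.

        from math import sqrt
        from statistics import NormalDist

        mu_R, sigma_R = 5.0, 0.5    # resistance mean and standard deviation (assumed)
        mu_S, sigma_S = 3.0, 0.6    # load effect mean and standard deviation (assumed)

        beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)   # reliability index
        p_f = NormalDist().cdf(-beta)                          # corresponding failure probability

        print(beta, p_f)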

  8. Validation of thermalhydraulic codes

    International Nuclear Information System (INIS)

    Wilkie, D.

    1992-01-01

    Thermalhydraulic codes require to be validated against experimental data collected over a wide range of situations if they are to be relied upon. A good example is provided by the nuclear industry where codes are used for safety studies and for determining operating conditions. Errors in the codes could lead to financial penalties, to the incorrect estimation of the consequences of accidents and even to the accidents themselves. Comparison between prediction and experiment is often described qualitatively or in approximate terms, e.g. "agreement is within 10%". A quantitative method is preferable, especially when several competing codes are available. The codes can then be ranked in order of merit. Such a method is described. (Author)
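
    A sketch of the kind of quantitative ranking the author argues for (the metric and all data below are invented for illustration, not the method of the paper): score each code by its RMS relative deviation from experiment and sort.

        from math import sqrt

        def rms_relative_deviation(predicted, measured):
            terms = [((p - m) / m) ** 2 for p, m in zip(predicted, measured)]
            return sqrt(sum(terms) / len(terms))

        measured = [10.2, 15.4, 20.1, 24.8]                     # hypothetical experimental values
        codes = {"code A": [10.0, 15.9, 19.5, 25.6],
                 "code B": [11.4, 14.0, 21.9, 23.0]}            # hypothetical code predictions

        ranking = sorted(codes, key=lambda c: rms_relative_deviation(codes[c], measured))
        for name in ranking:
            print(name, rms_relative_deviation(codes[name], measured))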

  9. Code-Switching: L1-Coded Mediation in a Kindergarten Foreign Language Classroom

    Science.gov (United States)

    Lin, Zheng

    2012-01-01

    This paper is based on a qualitative inquiry that investigated the role of teachers' mediation in three different modes of coding in a kindergarten foreign language classroom in China (i.e. L2-coded intralinguistic mediation, L1-coded cross-lingual mediation, and L2-and-L1-mixed mediation). Through an exploratory examination of the varying effects…

  10. Introduction of SCIENCE code package

    International Nuclear Information System (INIS)

    Lu Haoliang; Li Jinggang; Zhu Ya'nan; Bai Ning

    2012-01-01

    The SCIENCE code package is a set of neutronics tools based on 2D assembly calculations and 3D core calculations. It is made up of APOLLO2F, SMART and SQUALE and is used to perform the nuclear design and loading pattern analysis for the reactors in operation or under construction of the China Guangdong Nuclear Power Group. The purpose of this paper is to briefly present the physical and numerical models used in each computational code of the SCIENCE code package, including the description of the general structure of the code package, the coupling relationship of the APOLLO2-F transport lattice code and the SMART core nodal code, and the SQUALE code used for processing the core maps. (authors)

  11. Proceedings of the 2007 symposium on nuclear data

    International Nuclear Information System (INIS)

    Hazama, Taira; Fukahori, Tokio

    2008-11-01

    The 2007 Symposium on Nuclear Data was held at RICOTTI in Tokai-mura, Ibaraki-ken, Japan, on 29th and 30th of November 2007, with about 80 participants. The Nuclear Data Division of the Atomic Energy Society of Japan organized this symposium in cooperation with the North Kanto Branch of the society. In the oral sessions, 10 papers were presented on topics of JENDL-4, experiments, evaluations, applications, and research activity in China. In the poster session, presented were 12 papers concerning experiments, evaluations, benchmark tests, and so on. Tutorials on nuclear data, covering the cross section data creation process and the PHITS code, were also given. The major part of the presented papers is compiled in these proceedings. Nineteen of the presented papers are indexed individually. (J.P.N.)

  12. Optimization of a ΔE - E detector for 41Ca AMS

    Science.gov (United States)

    Hosoya, Seiji; Sasa, Kimikazu; Matsunaka, Tetsuya; Takahashi, Tsutomu; Matsumura, Masumi; Matsumura, Hiroshi; Sundquist, Mark; Stodola, Mark; Sueki, Keisuke

    2017-09-01

    A series of nuclides (14C, 26Al, and 36Cl) was measured using the 12UD Pelletron tandem accelerator before replacement by the horizontal 6 MV tandem accelerator at the University of Tsukuba Tandem Accelerator Complex (UTTAC). This paper considers the modification of the accelerator mass spectrometry (AMS) measurement parameters to suit the current 6 MV tandem accelerator setup (e.g., terminal voltage, detected ion charge state, gas pressure, and entrance window material in the detector). The Particle and Heavy Ion Transport code System (PHITS) was also used to simulate the AMS measurement to determine the best conditions to suppress isobaric interference. The spectra of 41Ca and 41K were then successfully separated and identified; the system achieved a background level of 41Ca/40Ca ∼ 6 × 10^-14.

  13. DOSE-Analyzer. A computer program with graphical user interface to analyze absorbed dose inside a body of mouse and human upon external neutron exposure

    International Nuclear Information System (INIS)

    Satoh, Daiki; Takahashi, Fumiaki; Shigemori, Yuji; Sakamoto, Kensaku

    2010-06-01

    DOSE-Analyzer is a computer program to retrieve the dose information from a database and generate a graph through a graphical user interface (GUI). The database is constructed for absorbed dose, fluence, and energy distribution inside the bodies of a mouse and a human exposed to external neutrons, which are calculated by our Monte Carlo simulation method using voxel-based phantoms and the particle transport code PHITS. The input configurations of irradiation geometry, subject, and energy are set via the GUI. The results are tabulated by particle type (electron, proton, deuteron, triton, and alpha particle) and target organ on a Microsoft Office Excel™ data sheet. A simple analysis to compare the output values for two subjects is also performed on DOSE-Analyzer. This report is a user manual of DOSE-Analyzer. (author)

  14. Design of convolutional tornado code

    Science.gov (United States)

    Zhou, Hui; Yang, Yao; Gao, Hongmin; Tan, Lu

    2017-09-01

    As a linear block code, the traditional tornado (tTN) code is inefficient in burst-erasure environment and its multi-level structure may lead to high encoding/decoding complexity. This paper presents a convolutional tornado (cTN) code which is able to improve the burst-erasure protection capability by applying the convolution property to the tTN code, and reduce computational complexity by abrogating the multi-level structure. The simulation results show that cTN code can provide a better packet loss protection performance with lower computation complexity than tTN code.

  15. Activation calculation and radiation analysis for China Fusion Engineering Test Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Zhi, E-mail: zchen@ustc.edu.cn; Qiao, Shiji; Jiang, Shuai; Xu, X. George

    2016-11-01

    Highlights: • Activation calculation was performed using FLUKA for the main components of CFETR. • Radionuclides and radioactive wastes were assessed for CFETR. • The Waste Disposal Ratings (WDR) were assessed for CFETR. - Abstract: The activation calculation and analysis for the China Fusion Engineering Test Reactor (CFETR) will play an important role in its system design, maintenance, inspection and assessment of nuclear waste. Using the multi-particle transport code FLUKA and its associated data library, we calculated the radioactivity, specific activity, waste disposal rating from activation products, and nuclides in the tritium breeding blanket, shielding layer, vacuum vessel and toroidal field coil (TFC) of CFETR. This paper presents the calculation results including neutron flux, activation products and waste disposal rating after one-year full operation of the CFETR. The findings show that, under the assumption of one-year operation at the 200 MW fusion power, the total radioactivity inventory will be 1.05 × 10^19 Bq at shutdown and 1.03 × 10^17 Bq after ten years. The primary residual nuclide is found to be 55Fe ten years after shutdown. The waste disposal rating (WDR) values are very low (<<1); according to Class C limits, CFETR materials are qualified for shallow land burial. It is shown that CFETR has no serious activation safety issue.
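
    For orientation only (a generic relation, not data from the paper), the decline of the inventory between shutdown and a cooling time $t$ is the usual sum over the residual activation products:

        $A(t) = \sum_i A_{0,i}\, e^{-\lambda_i t}, \qquad \lambda_i = \ln 2 / T_{1/2,i}.$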

  16. MARS Code in Linux Environment

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Moon Kyu; Bae, Sung Won; Jung, Jae Joon; Chung, Bub Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2005-07-01

    The two-phase system analysis code MARS has been incorporated into Linux system. The MARS code was originally developed based on the RELAP5/MOD3.2 and COBRA-TF. The 1-D module which evolved from RELAP5 alone could be applied for the whole NSSS system analysis. The 3-D module developed based on the COBRA-TF, however, could be applied for the analysis of the reactor core region where 3-D phenomena would be better treated. The MARS code also has several other code units that could be incorporated for more detailed analysis. The separate code units include containment analysis modules and 3-D kinetics module. These code modules could be optionally invoked to be coupled with the main MARS code. The containment code modules (CONTAIN and CONTEMPT), for example, could be utilized for the analysis of the plant containment phenomena in a coupled manner with the nuclear reactor system. The mass and energy interaction during the hypothetical coolant leakage accident could, thereby, be analyzed in a more realistic manner. In a similar way, 3-D kinetics could be incorporated for simulating the three dimensional reactor kinetic behavior, instead of using the built-in point kinetics model. The MARS code system, developed initially for the MS Windows environment, however, would not be adequate enough for the PC cluster system where multiple CPUs are available. When parallelism is to be eventually incorporated into the MARS code, MS Windows environment is not considered as an optimum platform. Linux environment, on the other hand, is generally being adopted as a preferred platform for the multiple codes executions as well as for the parallel application. In this study, MARS code has been modified for the adaptation of Linux platform. For the initial code modification, the Windows system specific features have been removed from the code. Since the coupling code module CONTAIN is originally in a form of dynamic load library (DLL) in the Windows system, a similar adaptation method

  17. MARS Code in Linux Environment

    International Nuclear Information System (INIS)

    Hwang, Moon Kyu; Bae, Sung Won; Jung, Jae Joon; Chung, Bub Dong

    2005-01-01

    The two-phase system analysis code MARS has been incorporated into Linux system. The MARS code was originally developed based on the RELAP5/MOD3.2 and COBRA-TF. The 1-D module which evolved from RELAP5 alone could be applied for the whole NSSS system analysis. The 3-D module developed based on the COBRA-TF, however, could be applied for the analysis of the reactor core region where 3-D phenomena would be better treated. The MARS code also has several other code units that could be incorporated for more detailed analysis. The separate code units include containment analysis modules and 3-D kinetics module. These code modules could be optionally invoked to be coupled with the main MARS code. The containment code modules (CONTAIN and CONTEMPT), for example, could be utilized for the analysis of the plant containment phenomena in a coupled manner with the nuclear reactor system. The mass and energy interaction during the hypothetical coolant leakage accident could, thereby, be analyzed in a more realistic manner. In a similar way, 3-D kinetics could be incorporated for simulating the three dimensional reactor kinetic behavior, instead of using the built-in point kinetics model. The MARS code system, developed initially for the MS Windows environment, however, would not be adequate enough for the PC cluster system where multiple CPUs are available. When parallelism is to be eventually incorporated into the MARS code, MS Windows environment is not considered as an optimum platform. Linux environment, on the other hand, is generally being adopted as a preferred platform for the multiple codes executions as well as for the parallel application. In this study, MARS code has been modified for the adaptation of Linux platform. For the initial code modification, the Windows system specific features have been removed from the code. Since the coupling code module CONTAIN is originally in a form of dynamic load library (DLL) in the Windows system, a similar adaptation method
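
    A hedged sketch of the platform adaptation described above: a module coupled as a dynamic load library (DLL) on Windows is loaded as a shared object on Linux. The library and symbol names below are hypothetical, not the actual MARS/CONTAIN interface, and the real coupling is done inside the Fortran code rather than from Python.

        import ctypes
        import sys

        # Hypothetical containment module, built as contain.dll on Windows and libcontain.so on Linux.
        lib_name = "contain.dll" if sys.platform == "win32" else "./libcontain.so"
        contain = ctypes.CDLL(lib_name)

        # Hypothetical coupled call: advance the containment model by one time step.
        contain.advance_step.argtypes = [ctypes.c_double]
        contain.advance_step(0.01)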

  18. Stylize Aesthetic QR Code

    OpenAIRE

    Xu, Mingliang; Su, Hao; Li, Yafei; Li, Xi; Liao, Jing; Niu, Jianwei; Lv, Pei; Zhou, Bing

    2018-01-01

    With the continued proliferation of smart mobile devices, Quick Response (QR) code has become one of the most-used types of two-dimensional code in the world. Aiming at beautifying the appearance of QR codes, existing works have developed a series of techniques to make the QR code more visual-pleasant. However, these works still leave much to be desired, such as visual diversity, aesthetic quality, flexibility, universal property, and robustness. To address these issues, in this paper, we pro...

  19. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

    Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase the number of dimensions seen by the network using a linear mapping. Receivers can tradeoff computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof.
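
    A generic illustration (not the Fulcrum encoder itself) of the low-complexity operation that objective (ii) allows at intermediate nodes: coded packets are recombined using only GF(2) arithmetic, i.e. bytewise XOR. Coding coefficient headers are omitted for brevity.

        import random

        def xor_packets(a: bytes, b: bytes) -> bytes:
            return bytes(x ^ y for x, y in zip(a, b))

        def recode(coded_packets):
            """Intermediate node: XOR together a random nonempty subset of the packets it holds."""
            subset = [p for p in coded_packets if random.random() < 0.5] or [coded_packets[0]]
            out = subset[0]
            for p in subset[1:]:
                out = xor_packets(out, p)
            return out

        p1, p2, p3 = b"\x10" * 8, b"\x22" * 8, b"\x34" * 8    # toy coded packets of equal length
        print(recode([p1, p2, p3]).hex())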

  20. The fast code

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, L.N.; Wilson, R.E. [Oregon State Univ., Dept. of Mechanical Engineering, Corvallis, OR (United States)

    1996-09-01

    The FAST Code, which is capable of determining structural loads on a flexible, teetering, horizontal axis wind turbine, is described, and comparisons of calculated loads with test data are given at two wind speeds for the ESI-80. The FAST Code models a two-bladed HAWT with degrees of freedom for blade bending, teeter, drive train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffnesses, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms, and azimuth averaged bin plots. It is concluded that agreement between the FAST Code and test results is good. (au)

  1. Beam-dynamics codes used at DARHT

    Energy Technology Data Exchange (ETDEWEB)

    Ekdahl, Jr., Carl August [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-01

    Several beam simulation codes are used to help gain a better understanding of beam dynamics in the DARHT LIAs. The most notable of these fall into the following categories: for beam production – Tricomp Trak orbit tracking code, LSP Particle in cell (PIC) code; for beam transport and acceleration – XTR static envelope and centroid code, LAMDA time-resolved envelope and centroid code, LSP-Slice PIC code; for coasting-beam transport to target – LAMDA time-resolved envelope code, LSP-Slice PIC code. These codes are also being used to inform the design of Scorpius.

  2. Evaluation of Monte Carlo tools for high energy atmospheric physics

    Directory of Open Access Journals (Sweden)

    C. Rutjes

    2016-11-01

    Full Text Available The emerging field of high energy atmospheric physics (HEAP) includes terrestrial gamma-ray flashes, electron–positron beams and gamma-ray glows from thunderstorms. Similar emissions of high energy particles occur in pulsed high voltage discharges. Understanding these phenomena requires appropriate models for the interaction of electrons, positrons and photons of up to 40 MeV energy with atmospheric air. In this paper, we benchmark the performance of the Monte Carlo codes Geant4, EGS5 and FLUKA developed in other fields of physics and of the custom-made codes GRRR and MC-PEPTITA against each other within the parameter regime relevant for high energy atmospheric physics. We focus on basic tests, namely on the evolution of monoenergetic and directed beams of electrons, positrons and photons with kinetic energies between 100 keV and 40 MeV through homogeneous air in the absence of electric and magnetic fields, using a low energy cutoff of 50 keV. We discuss important differences between the results of the different codes and provide plausible explanations. We also test the computational performance of the codes. The Supplement contains all results, providing a first benchmark for present and future custom-made codes that are more flexible in including electrodynamic interactions.

  3. Strongly-MDS convolutional codes

    NARCIS (Netherlands)

    Gluesing-Luerssen, H; Rosenthal, J; Smarandache, R

    Maximum-distance separable (MDS) convolutional codes have the property that their free distance is maximal among all codes of the same rate and the same degree. In this paper, a class of MDS convolutional codes is introduced whose column distances reach the generalized Singleton bound at the

  4. Interface requirements to couple thermal hydraulics codes to severe accident codes: ICARE/CATHARE

    Energy Technology Data Exchange (ETDEWEB)

    Camous, F.; Jacq, F.; Chatelard, P. [IPSN/DRS/SEMAR CE-Cadarache, St Paul Lez Durance (France)] [and others]

    1997-07-01

    In order to describe with the same code the whole sequence of severe LWR accidents, up to the vessel failure, the Institute of Protection and Nuclear Safety has performed a coupling of the severe accident code ICARE2 to the thermalhydraulics code CATHARE2. The resulting code, ICARE/CATHARE, is designed to be as pertinent as possible in all the phases of the accident. This paper is mainly devoted to the description of the ICARE2-CATHARE2 coupling.

  5. Thickness optimization and activity induction in beam slit monitor for Indus

    International Nuclear Information System (INIS)

    Petwal, V.C.; Pramod, R.; Dwivedi, Jishnu; Senecha, V.K.

    2009-01-01

    A large number of beam slit monitors are planned to be installed in the TL-2 and TL-3 of Indus for probing the 450 MeV and 700 MeV electron beams. The beam slit monitor consists of 2 pairs of metallic blades mounted in orthogonal directions, and shall be installed inside the beam chamber. These shutters provide current signals, on interception with the electron beam, which can be used to determine precisely the beam position, shape and size. The physical dimensions of the shutter blades are of crucial importance due to the requirement of high resolution, accuracy and space constraints. As part of the design study of beam slit monitors, Monte Carlo simulation using the MCNP code has been performed to investigate the radiological characteristics of the suitable blade materials, e.g. Cu, Ta, W, and Inermet. The thickness has been optimised to absorb 90% of the electron beam. The power density profiles along the thickness and radial directions have been simulated to carry out the thermal design. The high energy electron beam, on interception with the shutter blade, develops a cascading shower containing secondary particles such as photons, photoneutrons, pions, muons, etc., which induce radioactivity in the shutter material as well as in the surrounding components. The state-of-the-art Monte Carlo code FLUKA has been used to estimate the amount of the activity induced in the shutter blade. In the first step, the FLUKA calculations are compared with data reported in IAEA TRS 188 for Cu and W targets in the energy range 15-35 MeV, which shows good agreement. In the second step, these calculations are extended to estimate the induced activity in the shutter blade at the actual electron energies of 450 MeV and 700 MeV. (author)

  6. Gravity inversion code

    International Nuclear Information System (INIS)

    Burkhard, N.R.

    1979-01-01

    The gravity inversion code applies stabilized linear inverse theory to determine the topography of a subsurface density anomaly from Bouguer gravity data. The gravity inversion program consists of four source codes: SEARCH, TREND, INVERT, and AVERAGE. TREND and INVERT are used iteratively to converge on a solution. SEARCH forms the input gravity data files for Nevada Test Site data. AVERAGE performs a covariance analysis on the solution. This document describes the necessary input files and the proper operation of the code. 2 figures, 2 tables
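
    A hedged sketch of stabilized linear inversion in the spirit of the INVERT step (the kernel and data below are invented; the actual code inverts Bouguer gravity data for the topography of a density anomaly): solve d = G m with zeroth-order Tikhonov regularization so the solution stays stable against noise.

        import numpy as np

        def stabilized_inverse(G, d, alpha=1e-2):
            """Minimize ||G m - d||^2 + alpha^2 ||m||^2 (zeroth-order Tikhonov)."""
            n = G.shape[1]
            return np.linalg.solve(G.T @ G + alpha**2 * np.eye(n), G.T @ d)

        G = np.array([[1.0, 0.9, 0.5],
                      [0.9, 1.0, 0.9],
                      [0.5, 0.9, 1.0]])          # hypothetical smoothing-type forward kernel
        d = np.array([2.1, 2.7, 2.2])            # hypothetical gravity data
        m = stabilized_inverse(G, d)
        print(m)                                 # regularized model estimate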

  7. Introduction to coding and information theory

    CERN Document Server

    Roman, Steven

    1997-01-01

    This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters deal with coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and such specific codes as Hamming codes, the simplex codes, and many others.
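
    A small worked example of the information-theory quantities mentioned (source entropy and Shannon's noiseless coding bound), with an illustrative source distribution; the bound says the expected length of any uniquely decodable binary code is at least the entropy.

        from math import log2

        probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}        # illustrative source
        entropy = -sum(p * log2(p) for p in probs.values())          # H(X) in bits

        lengths = {"a": 1, "b": 2, "c": 3, "d": 3}                   # lengths of a prefix code (0, 10, 110, 111)
        avg_len = sum(probs[s] * lengths[s] for s in probs)

        print(entropy, avg_len)   # here the code meets the bound: both equal 1.75 bits per symbol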

  8. Random linear codes in steganography

    Directory of Open Access Journals (Sweden)

    Kamil Kaczyński

    2016-12-01

    Full Text Available Syndrome coding using linear codes is a technique that allows improvement of the steganographic algorithm's parameters. The use of random linear codes gives great flexibility in choosing the parameters of the linear code. In parallel, it offers easy generation of the parity check matrix. In this paper, a modification of the LSB algorithm is presented. A random linear code [8, 2] was used as a base for the algorithm modification. The implementation of the proposed algorithm, along with a practical evaluation of the algorithm's parameters based on the test images, was made. Keywords: steganography, random linear codes, RLC, LSB
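
    A hedged sketch of matrix embedding with a random binary linear code in the spirit of the scheme described above; the parity-check matrix, cover bits and message below are illustrative, not the paper's. For an [8, 2] code the parity-check matrix H is 6 x 8, and a 6-bit message is embedded into 8 cover LSBs by flipping as few bits as possible so that the syndrome equals the message.

        import itertools
        import numpy as np

        rng = np.random.default_rng(1)
        A = rng.integers(0, 2, size=(6, 2))
        H = np.hstack([np.eye(6, dtype=int), A])   # full-rank parity-check matrix of a random [8,2] code

        def embed(lsb, msg):
            """Return modified LSBs whose syndrome H x equals msg, changing as few bits as possible."""
            best = None
            for flips in itertools.product([0, 1], repeat=8):     # brute force over the 2^8 flip patterns
                e = np.array(flips)
                if ((H @ ((lsb + e) % 2)) % 2 == msg).all():
                    if best is None or e.sum() < best.sum():
                        best = e
            return (lsb + best) % 2

        cover_lsb = np.array([1, 0, 1, 1, 0, 0, 1, 0])   # hypothetical LSBs of 8 pixels
        message   = np.array([1, 0, 1, 1, 0, 1])         # 6 message bits to hide
        stego_lsb = embed(cover_lsb, message)
        print(stego_lsb, (H @ stego_lsb) % 2)            # the receiver recovers the message as the syndrome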

  9. Status of reactor core design code system in COSINE code package

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y.; Yu, H.; Liu, Z., E-mail: yuhui@snptc.com.cn [State Nuclear Power Software Development Center, SNPTC, National Energy Key Laboratory of Nuclear Power Software (NEKLS), Beijiing (China)

    2014-07-01

    For self-reliance, COre and System INtegrated Engine for design and analysis (COSINE) code package is under development in China. In this paper, recent development status of the reactor core design code system (including the lattice physics code and the core simulator) is presented. The well-established theoretical models have been implemented. The preliminary verification results are illustrated. And some special efforts, such as updated theory models and direct data access application, are also made to achieve better software product. (author)

  10. Status of reactor core design code system in COSINE code package

    International Nuclear Information System (INIS)

    Chen, Y.; Yu, H.; Liu, Z.

    2014-01-01

    For self-reliance, COre and System INtegrated Engine for design and analysis (COSINE) code package is under development in China. In this paper, recent development status of the reactor core design code system (including the lattice physics code and the core simulator) is presented. The well-established theoretical models have been implemented. The preliminary verification results are illustrated. And some special efforts, such as updated theory models and direct data access application, are also made to achieve better software product. (author)

  11. Further Generalisations of Twisted Gabidulin Codes

    DEFF Research Database (Denmark)

    Puchinger, Sven; Rosenkilde, Johan Sebastian Heesemann; Sheekey, John

    2017-01-01

    We present a new family of maximum rank distance (MRD) codes. The new class contains codes that are neither equivalent to a generalised Gabidulin nor to a twisted Gabidulin code, the only two known general constructions of linear MRD codes.

  12. Gap Conductance model Validation in the TASS/SMR-S code using MARS code

    International Nuclear Information System (INIS)

    Ahn, Sang Jun; Yang, Soo Hyung; Chung, Young Jong; Lee, Won Jae

    2010-01-01

    Korea Atomic Energy Research Institute (KAERI) has been developing the TASS/SMR-S (Transient and Setpoint Simulation/Small and Medium Reactor) code, which is a thermal hydraulic code for the safety analysis of the advanced integral reactor. Appropriate work to validate the applicability of the thermal hydraulic models within the code is required. Among the models, the gap conductance model, which describes the thermal conductance of the gap between fuel and cladding, was validated through comparison with the MARS code. The validation of the gap conductance model was performed by evaluating the variation of the gap temperature and gap width as they changed with the power fraction. In this paper, a brief description of the gap conductance model in the TASS/SMR-S code is presented. In addition, calculated results to validate the gap conductance model are demonstrated by comparing with the results of the MARS code for the test case
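
    A rough illustration of the quantity under discussion (all values assumed; radiation and contact contributions that full gap-conductance models include are ignored): the conduction-only gap conductance and the resulting fuel-to-cladding temperature drop at a given heat flux.

        k_gas  = 0.3       # helium thermal conductivity at operating temperature, W/(m*K), assumed
        gap    = 50e-6     # fuel-cladding gap width, m, assumed
        q_flux = 0.6e6     # rod surface heat flux, W/m^2, assumed

        h_gap = k_gas / gap      # conduction-only gap conductance, W/(m^2*K)
        dT    = q_flux / h_gap   # temperature drop across the gap, K

        print(h_gap, dT)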

  13. Impacts of Model Building Energy Codes

    Energy Technology Data Exchange (ETDEWEB)

    Athalye, Rahul A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sivaraman, Deepak [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Elliott, Douglas B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Bing [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bartlett, Rosemarie [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-10-31

    The U.S. Department of Energy (DOE) Building Energy Codes Program (BECP) periodically evaluates national and state-level impacts associated with energy codes in residential and commercial buildings. Pacific Northwest National Laboratory (PNNL), funded by DOE, conducted an assessment of the prospective impacts of national model building energy codes from 2010 through 2040. A previous PNNL study evaluated the impact of the Building Energy Codes Program; this study looked more broadly at overall code impacts. This report describes the methodology used for the assessment and presents the impacts in terms of energy savings, consumer cost savings, and reduced CO2 emissions at the state level and at aggregated levels. This analysis does not represent all potential savings from energy codes in the U.S. because it excludes several states which have codes which are fundamentally different from the national model energy codes or which do not have state-wide codes. Energy codes follow a three-phase cycle that starts with the development of a new model code, proceeds with the adoption of the new code by states and local jurisdictions, and finishes when buildings comply with the code. The development of new model code editions creates the potential for increased energy savings. After a new model code is adopted, potential savings are realized in the field when new buildings (or additions and alterations) are constructed to comply with the new code. Delayed adoption of a model code and incomplete compliance with the code’s requirements erode potential savings. The contributions of all three phases are crucial to the overall impact of codes, and are considered in this assessment.

  14. MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described

  15. Application of Quantum Gauss-Jordan Elimination Code to Quantum Secret Sharing Code

    Science.gov (United States)

    Diep, Do Ngoc; Giang, Do Hoang; Phu, Phan Huy

    2018-03-01

    The QSS codes associated with an MSP code are based on finding an invertible matrix V solving the system vA^T M_B(s a) = s. We propose a quantum Gauss-Jordan elimination procedure to produce such a pivotal matrix V by using the Grover search code. The complexity of solving is of square-root order in the cardinality of the unauthorized set, √(2^|B|).
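    For reference, the classical counterpart of the elimination step is ordinary Gauss-Jordan elimination over GF(2); the sketch below shows only that classical routine (the paper's contribution is a quantum version built on Grover search), and the example matrix is arbitrary:

```python
import numpy as np

def gauss_jordan_gf2(M):
    """Classical Gauss-Jordan elimination over GF(2); returns the reduced matrix and pivot columns."""
    M = M.copy() % 2
    rows, cols = M.shape
    pivots = []
    r = 0
    for c in range(cols):
        if r >= rows:
            break
        nz = np.nonzero(M[r:, c])[0]
        if nz.size == 0:
            continue
        i = r + nz[0]
        M[[r, i]] = M[[i, r]]                 # swap a pivot row into place
        for j in range(rows):                 # eliminate the column everywhere else
            if j != r and M[j, c]:
                M[j] ^= M[r]
        pivots.append(c)
        r += 1
    return M, pivots

M = np.array([[1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=np.uint8)
R, piv = gauss_jordan_gf2(M)
print(R)
print("pivot columns:", piv)
```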

  16. Cryptography cracking codes

    CERN Document Server

    2014-01-01

    While cracking a code might seem like something few of us would encounter in our daily lives, it is actually far more prevalent than we may realize. Anyone who has had personal information taken because of a hacked email account can understand the need for cryptography and the importance of encryption-essentially the need to code information to keep it safe. This detailed volume examines the logic and science behind various ciphers, their real world uses, how codes can be broken, and the use of technology in this oft-overlooked field.

  17. The Coding Causes of Death in HIV (CoDe) Project: initial results and evaluation of methodology

    DEFF Research Database (Denmark)

    Kowalska, Justyna D; Friis-Møller, Nina; Kirk, Ole

    2011-01-01

    The Coding Causes of Death in HIV (CoDe) Project aims to deliver a standardized method for coding the underlying cause of death in HIV-positive persons, suitable for clinical trials and epidemiologic studies.

  18. Allele coding in genomic evaluation

    Directory of Open Access Journals (Sweden)

    Christensen Ole F

    2011-06-01

    Full Text Available Abstract Background Genomic data are used in animal breeding to assist genetic evaluation. Several models to estimate genomic breeding values have been studied. In general, two approaches have been used. One approach estimates the marker effects first and then, genomic breeding values are obtained by summing marker effects. In the second approach, genomic breeding values are estimated directly using an equivalent model with a genomic relationship matrix. Allele coding is the method chosen to assign values to the regression coefficients in the statistical model. A common allele coding is zero for the homozygous genotype of the first allele, one for the heterozygote, and two for the homozygous genotype for the other allele. Another common allele coding changes these regression coefficients by subtracting a value from each marker such that the mean of regression coefficients is zero within each marker. We call this centered allele coding. This study considered effects of different allele coding methods on inference. Both marker-based and equivalent models were considered, and restricted maximum likelihood and Bayesian methods were used in inference. Results Theoretical derivations showed that parameter estimates and estimated marker effects in marker-based models are the same irrespective of the allele coding, provided that the model has a fixed general mean. For the equivalent models, the same results hold, even though different allele coding methods lead to different genomic relationship matrices. Calculated genomic breeding values are independent of allele coding when the estimate of the general mean is included into the values. Reliabilities of estimated genomic breeding values calculated using elements of the inverse of the coefficient matrix depend on the allele coding because different allele coding methods imply different models. Finally, allele coding affects the mixing of Markov chain Monte Carlo algorithms, with the centered coding being
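    A minimal sketch of the centered allele coding described above, applied to a toy genotype matrix; the data and the relationship matrix shown are illustrative, not from the paper:

```python
import numpy as np

# Toy genotype matrix: 4 individuals x 3 markers, coded 0/1/2 copies of the second allele.
G = np.array([[0, 2, 1],
              [1, 1, 0],
              [2, 0, 1],
              [1, 1, 2]], dtype=float)

# Centered allele coding: subtract each marker's mean so the regression coefficients
# average to zero within each marker.
Z = G - G.mean(axis=0)

# A genomic relationship matrix built from the centered codes (a full VanRaden scaling
# would also divide by 2 * sum(p * (1 - p)); omitted here for brevity).
GRM = Z @ Z.T
print(Z)
print(GRM)
```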

  19. Towers of generalized divisible quantum codes

    Science.gov (United States)

    Haah, Jeongwan

    2018-04-01

    A divisible binary classical code is one in which every code word has weight divisible by a fixed integer. If the divisor is 2^ν for a positive integer ν, then one can construct a Calderbank-Shor-Steane (CSS) code, where the X-stabilizer space is the divisible classical code, that admits a transversal gate in the ν-th level of the Clifford hierarchy. We consider a generalization of the divisibility by allowing a coefficient vector of odd integers with which every code word has zero dot product modulo the divisor. In this generalized sense, we construct a CSS code with divisor 2^(ν+1) and code distance d from any CSS code of code distance d and divisor 2^ν where the transversal X is a nontrivial logical operator. The encoding rate of the new code is approximately d times smaller than that of the old code. In particular, for large d and ν ≥ 2, our construction yields a CSS code of parameters [[O(d^(ν−1)), Ω(d), d]] admitting a transversal gate at the ν-th level of the Clifford hierarchy. For our construction we introduce a conversion from magic state distillation protocols based on Clifford measurements to those based on codes with transversal T gates. Our tower contains, as a subclass, generalized triply even CSS codes that have appeared in so-called gauge fixing or code switching methods.
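    A small sketch of the classical divisibility property stated in the first sentence, checked on the [8, 4] extended Hamming code (a standard doubly even example, chosen here purely for illustration and not a code constructed in the paper):

```python
import numpy as np
from itertools import product

# Generator matrix of the [8, 4] extended Hamming code; every nonzero codeword
# has weight 4 or 8, i.e. divisible by 2**2.
G = np.array([[1, 0, 0, 0, 0, 1, 1, 1],
              [0, 1, 0, 0, 1, 0, 1, 1],
              [0, 0, 1, 0, 1, 1, 0, 1],
              [0, 0, 0, 1, 1, 1, 1, 0]], dtype=np.uint8)

def weights(G):
    k = G.shape[0]
    for bits in product([0, 1], repeat=k):            # enumerate all 2**k codewords
        cw = (np.array(bits, dtype=np.uint8) @ G) % 2
        yield int(cw.sum())

divisor = 4                                           # 2**nu with nu = 2
print("all codeword weights divisible by", divisor, ":",
      all(w % divisor == 0 for w in weights(G)))
```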

  20. Essential idempotents and simplex codes

    Directory of Open Access Journals (Sweden)

    Gladys Chalom

    2017-01-01

    Full Text Available We define essential idempotents in group algebras and use them to prove that every minimal abelian non-cyclic code is a repetition code. We also use them to prove that every minimal abelian code is equivalent to a minimal cyclic code of the same length. Finally, we show that a binary cyclic code is simplex if and only if it has length of the form $n=2^k-1$ and is generated by an essential idempotent.
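    For illustration, a minimal sketch of a binary simplex code of length 2^k − 1 built from a generator matrix whose columns are all nonzero k-bit vectors; this is the textbook construction, not the idempotent-based description from the paper:

```python
import numpy as np
from itertools import product

k = 3
# Columns are all nonzero binary k-vectors, so the length is n = 2**k - 1 = 7.
cols = [c for c in product([0, 1], repeat=k) if any(c)]
G = np.array(cols, dtype=np.uint8).T          # k x (2**k - 1) generator matrix

# Every nonzero codeword of a simplex code has the same weight, 2**(k - 1).
wts = set()
for bits in product([0, 1], repeat=k):
    cw = (np.array(bits, dtype=np.uint8) @ G) % 2
    if cw.any():
        wts.add(int(cw.sum()))
print("length:", G.shape[1], "nonzero codeword weights:", wts)   # expect {4}
```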

  1. Advanced thermohydraulic simulation code for transients in LMFBRs (SSC-L code)

    Energy Technology Data Exchange (ETDEWEB)

    Agrawal, A.K.

    1978-02-01

    Physical models for various processes that are encountered in preaccident and transient simulation of thermohydraulic transients in the entire liquid metal fast breeder reactor (LMFBR) plant are described in this report. A computer code, SSC-L, was written as a part of the Super System Code (SSC) development project for the ''loop''-type designs of LMFBRs. This code has the self-starting capability, i.e., preaccident or steady-state calculations are performed internally. These results then serve as the starting point for the transient simulation.

  2. Advanced thermohydraulic simulation code for transients in LMFBRs (SSC-L code)

    International Nuclear Information System (INIS)

    Agrawal, A.K.

    1978-02-01

    Physical models for various processes that are encountered in preaccident and transient simulation of thermohydraulic transients in the entire liquid metal fast breeder reactor (LMFBR) plant are described in this report. A computer code, SSC-L, was written as a part of the Super System Code (SSC) development project for the ''loop''-type designs of LMFBRs. This code has the self-starting capability, i.e., preaccident or steady-state calculations are performed internally. These results then serve as the starting point for the transient simulation

  3. When sparse coding meets ranking: a joint framework for learning sparse codes and ranking scores

    KAUST Repository

    Wang, Jim Jing-Yan

    2017-06-28

    Sparse coding, which represents a data point as a sparse reconstruction code with regard to a dictionary, has been a popular data representation method. Meanwhile, in database retrieval problems, learning the ranking scores from data points plays an important role. Up to now, these two problems have always been considered separately, assuming that data coding and ranking are two independent and irrelevant problems. However, is there any internal relationship between sparse coding and ranking score learning? If yes, how to explore and make use of this internal relationship? In this paper, we try to answer these questions by developing the first joint sparse coding and ranking score learning algorithm. To explore the local distribution in the sparse code space, and also to bridge coding and ranking problems, we assume that in the neighborhood of each data point, the ranking scores can be approximated from the corresponding sparse codes by a local linear function. By considering the local approximation error of ranking scores, the reconstruction error and sparsity of sparse coding, and the query information provided by the user, we construct a unified objective function for learning of sparse codes, the dictionary and ranking scores. We further develop an iterative algorithm to solve this optimization problem.
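    A heavily simplified sketch of the two ingredients being coupled: sparse coding against a dictionary, and a linear map from sparse codes to ranking scores fit to user-provided query labels. The paper learns the codes, the dictionary and local (per-neighborhood) ranking functions jointly with an iterative algorithm; the sketch below instead fixes a random dictionary, uses an ISTA-style solver and a single global ridge fit, and all sizes and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n points in d dimensions, a dictionary with m atoms, and a few
# user-provided relevance scores standing in for the query information.
n, d, m = 30, 10, 15
X = rng.normal(size=(d, n))
D = rng.normal(size=(d, m))
known = {0: 1.0, 1: 1.0, 2: 0.0}

lam_sparse, lam_rank = 0.1, 1.0

def sparse_codes(X, D, lam, iters=100, lr=0.01):
    """ISTA-style sparse coding of all points against a fixed dictionary D."""
    S = np.zeros((D.shape[1], X.shape[1]))
    for _ in range(iters):
        grad = D.T @ (D @ S - X)
        S = np.sign(S - lr * grad) * np.maximum(np.abs(S - lr * grad) - lr * lam, 0.0)
    return S

# Step 1: sparse codes for all points.
S = sparse_codes(X, D, lam_sparse)

# Step 2: a ridge fit from sparse codes to the known ranking scores
# (a global stand-in for the paper's local linear functions).
idx = list(known)
y = np.array([known[i] for i in idx])
A = S[:, idx].T
w = np.linalg.solve(A.T @ A + lam_rank * np.eye(m), A.T @ y)

scores = S.T @ w                      # ranking scores for every point
print("top-ranked points:", np.argsort(-scores)[:5])
```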

  4. Error Correcting Codes

    Indian Academy of Sciences (India)

    Science and Automation at ... the Reed-Solomon code contained 223 bytes of data, (a byte ... then you have a data storage system with error correction, that ..... practical codes, storing such a table is infeasible, as it is generally too large.

  5. Coded diffraction system in X-ray crystallography using a boolean phase coded aperture approximation

    Science.gov (United States)

    Pinilla, Samuel; Poveda, Juan; Arguello, Henry

    2018-03-01

    Phase retrieval is a problem present in many applications such as optics, astronomical imaging, computational biology and X-ray crystallography. Recent work has shown that the phase can be better recovered when the acquisition architecture includes a coded aperture, which modulates the signal before diffraction, such that the underlying signal is recovered from coded diffraction patterns. Moreover, this type of modulation effect, before the diffraction operation, can be obtained using a phase coded aperture placed just after the sample under study. However, a practical implementation of a phase coded aperture in an X-ray application is not feasible, because it is computationally modeled as a matrix with complex entries, which requires changing the phase of the diffracted beams. In fact, changing the phase implies finding a material that allows the direction of an X-ray beam to be deviated, which can considerably increase the implementation costs. Hence, this paper describes a low cost coded X-ray diffraction system based on block-unblock coded apertures that enables phase reconstruction. The proposed system approximates the phase coded aperture with a block-unblock coded aperture by using the detour-phase method. Moreover, the SAXS/WAXS X-ray crystallography software was used to simulate the diffraction patterns of a real crystal structure called Rhombic Dodecahedron. Additionally, several simulations were carried out to analyze the performance of the block-unblock approximations in recovering the phase, using the simulated diffraction patterns. Furthermore, the quality of the reconstructions was measured in terms of the Peak Signal to Noise Ratio (PSNR). Results show that the performance of the block-unblock approximation of the phase coded apertures decreases by at most 12.5% compared with that of the phase coded apertures. Moreover, the quality of the reconstructions using the boolean approximations is at most 2.5 dB of PSNR below that of the phase coded aperture reconstructions.
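    A minimal sketch contrasting the two modulation schemes discussed above, using a 1-D toy signal and the standard coded-diffraction-pattern model (intensity of the Fourier transform of the masked signal); the signal, masks and sizes are illustrative, and the detour-phase construction itself is not implemented here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D complex signal standing in for the scattering density.
n = 64
x = rng.normal(size=n) + 1j * rng.normal(size=n)

# A phase coded aperture multiplies the field by unit-modulus complex entries...
phase_mask = np.exp(2j * np.pi * rng.random(n))

# ...whereas a block-unblock (boolean) aperture only passes or blocks each element,
# which is what the paper proposes as a practical X-ray approximation.
boolean_mask = rng.integers(0, 2, size=n)

# Coded diffraction patterns: intensity of the Fourier transform of the masked signal.
# Phase retrieval algorithms recover x from several such patterns.
y_phase = np.abs(np.fft.fft(phase_mask * x)) ** 2
y_bool = np.abs(np.fft.fft(boolean_mask * x)) ** 2
print(y_phase[:4])
print(y_bool[:4])
```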

  6. Induced radioactivity in the target station and decay tunnel from a 4MW proton beam

    CERN Document Server

    Agosteo, S; Otto, T; Silari, Marco

    2003-01-01

    An important aspect of a future CERN Neutrino Factory is the material activation arising from a 2.2 GeV, 4 MW proton beam striking a mercury target. A first estimation of the hadronic inelastic interactions and the production of residual nuclei in the target, the magnetic horn, the decay tunnel, the surrounding rock and a downstream dump has been performed by the Monte Carlo hadronic cascade code FLUKA. The aim is both to assess the dose equivalent rate to be expected during maintenance work and to evaluate the amount of residual radioactivity, which will have to be disposed of after the facility has ceased operation. This paper discusses the first results of such calculations.

  7. Benchmark Studies of Induced Radioactivity Produced in LHC Materials, Pt II Specific Activities

    International Nuclear Information System (INIS)

    Brugger, M.; Mayer, S.; Roesler, S.; Ulrici, L.; Khater, H.; Prinz, A.; Vincke, H.

    2006-01-01

    A new method to estimate remanent dose rates, to be used with the Monte Carlo code FLUKA, was benchmarked against measurements from an experiment that was performed at the CERN-EU high-energy reference field facility. An extensive collection of samples of different materials were placed downstream of and laterally to a copper target, intercepting a positively charged mixed hadron beam with a momentum of 120 GeV/c. Emphasis was put on the reduction of uncertainties such as careful monitoring of the irradiation parameters, the use of different instruments to measure dose rates, detailed elemental analyses of the irradiated materials and detailed simulations of the irradiation experiment. Measured and calculated dose rates are in good agreement

  8. 99mTc by 99Mo produced at the ENEA-FNG facility of 14 MeV neutrons.

    Science.gov (United States)

    Capogni, M; Pietropaolo, A; Quintieri, L; Fazio, A; De Felice, P; Pillon, M; Pizzuto, A

    2018-04-01

    A severe supply crisis of 99Mo, the precursor of 99mTc, a diagnostic radionuclide largely used in nuclear medicine, occurred in 2008-2009 due to repeated shut-downs of the two main (aged) fission reactors. An alternative route for producing 99Mo by the 100Mo(n,2n)99Mo reaction was investigated at ENEA. The experiment, designed according to Monte Carlo simulations performed with the FLUKA code, produced 99Mo by irradiating a natural molybdenum powder target with 14 MeV neutrons produced at the Frascati Neutron Generator. The 99Mo specific activity was measured at the metrological level by γ-ray spectrometry.

  9. Study on induced radioactivity of China Spallation Neutron Source

    International Nuclear Information System (INIS)

    Wu Qingbiao; Wang Qingbin; Wu Jingmin; Ma Zhongjian

    2011-01-01

    China Spallation Neutron Source (CSNS) is the first High Energy Intense Proton Accelerator planned to be constructed in China during the State Eleventh Five-Year Plan period, whose induced radioactivity is very important for occupational disease hazard assessment and environmental impact assessment. Adopting the FLUKA code, the authors have constructed a cylinder-tunnel geometric model and a line-source sampling physical model, deduced proper formulas to calculate air activation, and analyzed various issues with regard to the activation of different tunnel parts. The results show that the environmental impact resulting from induced activation is negligible, whereas the residual radiation in the tunnels has a great influence on maintenance personnel, so strict measures should be adopted.(authors)

  10. A Bonner Sphere Spectrometer with extended response matrix

    CERN Document Server

    Silari, M; Dimovasili, E; Birattari, C

    2010-01-01

    This paper describes the design, calibration and applications at high-energy accelerators of an extended-range Bonner Sphere neutron Spectrometer (BSS). The BSS was designed with the FLUKA Monte Carlo code by investigating several combinations of moderator materials and diameters for the high-energy channels. The system was calibrated at PTB in Braunschweig, Germany, using monoenergetic neutron beams in the energy range 144 keV-19 MeV. It was subsequently tested with Am-Be source neutrons and in the simulated workplace neutron field at CERF (the CERN-EU high-energy reference field facility). Since 2002, it has been employed for neutron spectral measurements around CERN accelerators.

  11. Visual search asymmetries within color-coded and intensity-coded displays.

    Science.gov (United States)

    Yamani, Yusuke; McCarley, Jason S

    2010-06-01

    Color and intensity coding provide perceptual cues to segregate categories of objects within a visual display, allowing operators to search more efficiently for needed information. Even within a perceptually distinct subset of display elements, however, it may often be useful to prioritize items representing urgent or task-critical information. The design of symbology to produce search asymmetries (Treisman & Souther, 1985) offers a potential technique for doing this, but it is not obvious from existing models of search that an asymmetry observed in the absence of extraneous visual stimuli will persist within a complex color- or intensity-coded display. To address this issue, in the current study we measured the strength of a visual search asymmetry within displays containing color- or intensity-coded extraneous items. The asymmetry persisted strongly in the presence of extraneous items that were drawn in a different color (Experiment 1) or a lower contrast (Experiment 2) than the search-relevant items, with the targets favored by the search asymmetry producing highly efficient search. The asymmetry was attenuated but not eliminated when extraneous items were drawn in a higher contrast than search-relevant items (Experiment 3). Results imply that the coding of symbology to exploit visual search asymmetries can facilitate visual search for high-priority items even within color- or intensity-coded displays.

  12. MCNP code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MCNP code is the major Monte Carlo coupled neutron-photon transport research tool at the Los Alamos National Laboratory, and it represents the most extensive Monte Carlo development program in the United States which is available in the public domain. The present code is the direct descendant of the original Monte Carlo work of Fermi, von Neumann, and Ulam at Los Alamos in the 1940s. Development has continued uninterrupted since that time, and the current version of MCNP (or its predecessors) has always included state-of-the-art methods in the Monte Carlo simulation of radiation transport, basic cross section data, geometry capability, variance reduction, and estimation procedures. The authors of the present code have oriented its development toward general user application. The documentation, though extensive, is presented in a clear and simple manner with many examples, illustrations, and sample problems. In addition to providing the desired results, the output listings give a wealth of detailed information (some optional) concerning each stage of the calculation. The code system is continually updated to take advantage of advances in computer hardware and software, including interactive modes of operation, diagnostic interrupts and restarts, and a variety of graphical and video aids

  13. Under-coding of secondary conditions in coded hospital health data: Impact of co-existing conditions, death status and number of codes in a record.

    Science.gov (United States)

    Peng, Mingkai; Southern, Danielle A; Williamson, Tyler; Quan, Hude

    2017-12-01

    This study examined the coding validity of hypertension, diabetes, obesity and depression in relation to the presence of their co-existing conditions, death status and the number of diagnosis codes in the hospital discharge abstract database. We randomly selected 4007 discharge abstract database records from four teaching hospitals in Alberta, Canada and reviewed their charts to extract 31 conditions listed in the Charlson and Elixhauser comorbidity indices. Conditions associated with the four study conditions were identified through multivariable logistic regression. Coding validity (i.e. sensitivity and positive predictive value) of the four conditions was related to the presence of their associated conditions. Sensitivity increased with an increasing number of diagnosis codes. The impact of death status on coding validity was minimal. The coding validity of a condition is closely related to its clinical importance and the complexity of the patients' case mix. We recommend mandatory coding of certain secondary diagnoses to meet the needs of health research based on administrative health data.
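    A minimal sketch of the two validity measures named above, computed by comparing coded (administrative) data against chart review as the reference standard; the records and values are toy data, not from the study:

```python
def coding_validity(coded, chart):
    """Return (sensitivity, positive predictive value) of coded data vs. chart review."""
    tp = sum(c and r for c, r in zip(coded, chart))
    fp = sum(c and not r for c, r in zip(coded, chart))
    fn = sum((not c) and r for c, r in zip(coded, chart))
    sensitivity = tp / (tp + fn) if tp + fn else float("nan")
    ppv = tp / (tp + fp) if tp + fp else float("nan")
    return sensitivity, ppv

# Toy example: 10 records, condition present per chart review vs. coded in the abstract.
chart = [1, 1, 1, 1, 0, 0, 0, 1, 0, 1]
coded = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]
print(coding_validity(coded, chart))
```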

  14. Governance codes: facts or fictions? a study of governance codes in colombia1,2

    Directory of Open Access Journals (Sweden)

    Julián Benavides Franco

    2010-10-01

    Full Text Available This article studies the effects of issuing a corporate governance code on the accounting performance and financing decisions of Colombian firms. We assemble a database of Colombian issuers and test the hypotheses of improved performance and higher leverage after issuing a code. The results show that the firms' return on assets improves by more than 1% after the code introduction, and the effect is amplified by code quality. Additionally, the firms' leverage increased by more than 5% when code quality was factored into the analysis. These results suggest that the controlling parties' commitment to self-restraint, by reducing their private benefits and/or the expropriation of non-controlling parties through the code introduction, is indeed an effective measure, and that the financial markets agree, increasing the supply of funds to the firms.

  15. Code, standard and specifications

    International Nuclear Information System (INIS)

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

    Radiography, like any other testing technique, needs standards. These standards are widely used, and the methods for applying them are well established. Radiographic testing is therefore practical only when it is carried out according to documented regulations. These regulations and guidelines are documented in codes, standards and specifications. In Malaysia, a level-one or basic radiographer may carry out radiography work based on instructions given by a level-two or level-three radiographer. These instructions are produced according to the guidelines given in such documents, and the level-two radiographer must follow the specifications given in the standard when writing them. This makes clear that radiography is work in which everything must follow the rules. For the code, radiography follows the code of the American Society of Mechanical Engineers (ASME), and the only national code in Malaysia at this time is the rule published by the Atomic Energy Licensing Board (AELB), known as the Practical Code for Radiation Protection in Industrial Radiography. With this code in place, all radiography work must automatically follow the regulated rules and standards.

  16. Purifying selection acts on coding and non-coding sequences of paralogous genes in Arabidopsis thaliana.

    Science.gov (United States)

    Hoffmann, Robert D; Palmgren, Michael

    2016-06-13

    Whole-genome duplications in the ancestors of many diverse species provided the genetic material for evolutionary novelty. Several models explain the retention of paralogous genes. However, how these models are reflected in the evolution of coding and non-coding sequences of paralogous genes is unknown. Here, we analyzed the coding and non-coding sequences of paralogous genes in Arabidopsis thaliana and compared these sequences with those of orthologous genes in Arabidopsis lyrata. Paralogs with lower expression than their duplicate had more nonsynonymous substitutions, were more likely to fractionate, and exhibited less similar expression patterns with their orthologs in the other species. Also, lower-expressed genes had greater tissue specificity. Orthologous conserved non-coding sequences in the promoters, introns, and 3' untranslated regions were less abundant at lower-expressed genes compared to their higher-expressed paralogs. A gene ontology (GO) term enrichment analysis showed that paralogs with similar expression levels were enriched in GO terms related to ribosomes, whereas paralogs with different expression levels were enriched in terms associated with stress responses. Loss of conserved non-coding sequences in one gene of a paralogous gene pair correlates with reduced expression levels that are more tissue specific. Together with increased mutation rates in the coding sequences, this suggests that similar forces of purifying selection act on coding and non-coding sequences. We propose that coding and non-coding sequences evolve concurrently following gene duplication.

  17. Convolutional coding techniques for data protection

    Science.gov (United States)

    Massey, J. L.

    1975-01-01

    Results of research on the use of convolutional codes in data communications are presented. Convolutional coding fundamentals are discussed along with modulation and coding interaction. Concatenated coding systems and data compression with convolutional codes are described.
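    A minimal sketch of a convolutional encoder, using the standard rate-1/2, constraint-length-3 code with generators (7, 5) in octal as a textbook example; it is not stated that this particular code is the one used in the report:

```python
def conv_encode(bits, g1=0b111, g2=0b101):
    """Rate-1/2 convolutional encoder with a 3-bit shift register (generators 7 and 5)."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0b111          # shift the new bit into the register
        out.append(bin(state & g1).count("1") % 2)  # first parity stream
        out.append(bin(state & g2).count("1") % 2)  # second parity stream
    return out

# Input 10110 encodes to 11 10 00 01 01 for this code.
print(conv_encode([1, 0, 1, 1, 0]))
```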

  18. 21 CFR 106.90 - Coding.

    Science.gov (United States)

    2010-04-01

    § 106.90 Coding. The manufacturer shall code all infant formulas in conformity with the coding requirements that are applicable to thermally processed low-acid foods packaged in...

  19. User Instructions for the CiderF Individual Dose Code and Associated Utility Codes

    Energy Technology Data Exchange (ETDEWEB)

    Eslinger, Paul W.; Napier, Bruce A.

    2013-08-30

    Historical activities at facilities producing nuclear materials for weapons released radioactivity into the air and water. Past studies in the United States have evaluated the release, atmospheric transport and environmental accumulation of 131I from the nuclear facilities at Hanford in Washington State and the resulting dose to members of the public (Farris et al. 1994). A multi-year dose reconstruction effort (Mokrov et al. 2004) is also being conducted to produce representative dose estimates for members of the public living near Mayak, Russia, from atmospheric releases of 131I at the facilities of the Mayak Production Association. The approach to calculating individual doses to members of the public from historical releases of airborne 131I has the following general steps: • Construct estimates of releases of 131I to the air from production facilities. • Model the transport of 131I in the air and subsequent deposition on the ground and vegetation. • Model the accumulation of 131I in soil, water and food products (environmental media). • Calculate the dose for an individual by matching the appropriate lifestyle and consumption data for the individual to the concentrations of 131I in environmental media at their residence location. A number of computer codes were developed to facilitate the study of airborne 131I emissions at Hanford. The RATCHET code modeled movement of 131I in the atmosphere (Ramsdell Jr. et al. 1994). The DECARTES code modeled accumulation of 131I in environmental media (Miley et al. 1994). The CIDER computer code estimated annual doses to individuals (Eslinger et al. 1994) using the equations and parameters specific to Hanford (Snyder et al. 1994). Several of the computer codes developed to model 131I releases from Hanford are general enough to be used for other facilities. This document provides user instructions for computer codes calculating doses to members of the public from atmospheric 131I that have two major differences from the
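    A minimal sketch of the final step in the list above, combining media concentrations with individual consumption data and an ingestion dose coefficient; all numbers are illustrative placeholders rather than values from the Hanford or Mayak studies, and the function name is hypothetical:

```python
def annual_ingestion_dose(concentrations, intakes, dose_coefficient):
    """Dose (Sv) = sum over media of concentration (Bq/kg) * annual intake (kg) * coefficient (Sv/Bq)."""
    return sum(concentrations[m] * intakes.get(m, 0.0) for m in concentrations) * dose_coefficient

conc = {"milk": 5.0, "leafy_vegetables": 2.0}       # Bq/kg of 131I, hypothetical
intake = {"milk": 200.0, "leafy_vegetables": 30.0}  # kg consumed per year, hypothetical
dcf_131i = 2.2e-8                                    # Sv/Bq, assumed adult ingestion coefficient for 131I
print(f"annual dose ~ {annual_ingestion_dose(conc, intake, dcf_131i):.2e} Sv")
```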

  20. Coding In-depth Semistructured Interviews

    DEFF Research Database (Denmark)

    Campbell, John L.; Quincy, Charles; Osserman, Jordan

    2013-01-01

    Many social science studies are based on coded in-depth semistructured interview transcripts. But researchers rarely report or discuss coding reliability in this work. Nor is there much literature on the subject for this type of data. This article presents a procedure for developing coding schemes useful for situations where a single knowledgeable coder will code all the transcripts once the coding scheme has been established. This approach can also be used with other types of qualitative data and in other circumstances.