WorldWideScience

Sample records for computational accelerator physics

  1. Computer programs in accelerator physics

    International Nuclear Information System (INIS)

    Keil, E.

    1984-01-01

    Three areas of accelerator physics are discussed in which computer programs have been applied with much success: i) single-particle beam dynamics in circular machines, i.e. the design and matching of machine lattices; ii) computations of electromagnetic fields in RF cavities and similar objects, useful for the design of RF cavities and for the calculation of wake fields; iii) simulation of betatron and synchrotron oscillations in a machine with non-linear elements, e.g. sextupoles, and of bunch lengthening due to longitudinal wake fields. (orig.)
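
    As an illustration of items (i) and (iii), the sketch below tracks a single particle through an idealized circular machine represented by a linear one-turn matrix plus a thin sextupole kick. It is a minimal Python sketch with invented values for the tune, beta function and sextupole strength; it is not taken from the record itself.

      import numpy as np

      # Linear one-turn matrix of a ring with tune Q and beta function beta
      # (alpha = 0 at the observation point), in Courant-Snyder form.
      Q, beta, k2l = 0.31, 10.0, 0.5      # assumed tune, beta [m], integrated sextupole strength [m^-2]
      mu = 2.0 * np.pi * Q
      M = np.array([[np.cos(mu),         beta * np.sin(mu)],
                    [-np.sin(mu) / beta, np.cos(mu)]])

      x, xp = 1e-3, 0.0                   # initial offset [m] and angle [rad]
      amplitudes = []
      for turn in range(1024):
          x, xp = M @ np.array([x, xp])   # linear betatron oscillation over one turn
          xp += -0.5 * k2l * x**2         # thin-lens sextupole kick (horizontal plane, y = 0)
          amplitudes.append(x)

      print("rms horizontal excursion over 1024 turns: %.3e m" % np.std(amplitudes))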

  2. Advanced Computing Tools and Models for Accelerator Physics

    International Nuclear Information System (INIS)

    Ryne, Robert; Ryne, Robert D.

    2008-01-01

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics

  3. US DOE Grand Challenge in Computational Accelerator Physics

    International Nuclear Information System (INIS)

    Ryne, R.; Habib, S.; Qiang, J.; Ko, K.; Li, Z.; McCandless, B.; Mi, W.; Ng, C.; Saparov, M.; Srinivas, V.; Sun, Y.; Zhan, X.; Decyk, V.; Golub, G.

    1998-01-01

    Particle accelerators are playing an increasingly important role in basic and applied science, and are enabling new accelerator-driven technologies. But the design of next-generation accelerators, such as linear colliders and high intensity linacs, will require a major advance in numerical modeling capability due to extremely stringent beam control and beam loss requirements, and the presence of highly complex three-dimensional accelerator components. To address this situation, the U.S. Department of Energy has approved a "Grand Challenge" in Computational Accelerator Physics, whose primary goal is to develop a parallel modeling capability that will enable high performance, large scale simulations for the design, optimization, and numerical validation of next-generation accelerators. In this paper we report on the status of the Grand Challenge

  4. COMPASS, the COMmunity Petascale project for Accelerator Science and Simulation, a broad computational accelerator physics initiative

    International Nuclear Information System (INIS)

    Cary, J.R.; Spentzouris, P.; Amundson, J.; McInnes, L.; Borland, M.; Mustapha, B.; Ostroumov, P.; Wang, Y.; Fischer, W.; Fedotov, A.; Ben-Zvi, I.; Ryne, R.; Esarey, E.; Geddes, C.; Qiang, J.; Ng, E.; Li, S.; Ng, C.; Lee, R.; Merminga, L.; Wang, H.; Bruhwiler, D.L.; Dechow, D.; Mullowney, P.; Messmer, P.; Nieter, C.; Ovtchinnikov, S.; Paul, K.; Stoltz, P.; Wade-Stein, D.; Mori, W.B.; Decyk, V.; Huang, C.K.; Lu, W.; Tzoufras, M.; Tsung, F.; Zhou, M.; Werner, G.R.; Antonsen, T.; Katsouleas, T.; Morris, B.

    2007-01-01

    Accelerators are the largest and most costly scientific instruments of the Department of Energy, with uses across a broad range of science, including colliders for particle physics and nuclear science and light sources and neutron sources for materials studies. COMPASS, the Community Petascale Project for Accelerator Science and Simulation, is a broad, four-office (HEP, NP, BES, ASCR) effort to develop computational tools for the prediction and performance enhancement of accelerators. The tools being developed can be used to predict the dynamics of beams in the presence of optical elements and space-charge forces, to calculate the electromagnetic modes and wake fields of cavities, and to model the cooling induced by comoving beams and the acceleration of beams by intense fields in plasmas generated by beams or lasers. In SciDAC-1, the computational tools had multiple successes in predicting the dynamics of beams and beam generation. In SciDAC-2 these tools will be petascale-enabled to allow the inclusion of an unprecedented level of physics for detailed prediction

  5. Accelerators and Beams, multimedia computer-based training in accelerator physics

    International Nuclear Information System (INIS)

    Silbar, R.R.; Browman, A.A.; Mead, W.C.; Williams, R.A.

    1999-01-01

    We are developing a set of computer-based tutorials on accelerators and charged-particle beams under an SBIR grant from the DOE. These self-paced, interactive tutorials, available for Macintosh and Windows platforms, use multimedia techniques to enhance the user's rate of learning and length of retention of the material. They integrate interactive On-Screen Laboratories, hypertext, line drawings, photographs, two- and three-dimensional animations, video, and sound. They target a broad audience, from undergraduates or technicians to professionals. Presently, three modules have been published (Vectors, Forces, and Motion), a fourth (Dipole Magnets) has been submitted for review, and three more exist in prototype form (Quadrupoles, Matrix Transport, and Properties of Charged-Particle Beams). Participants in the poster session will have the opportunity to try out these modules on a laptop computer. copyright 1999 American Institute of Physics

  6. Lua(Jit) for computing accelerator beam physics

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    As mentioned in the 2nd developers meeting, I would like to open the debate with a special presentation on another language - Lua, and a tremendous technology - LuaJit. Lua is much less known at CERN, but it is very simple, much smaller than Python, and its JIT is extremely performant. The language is a dynamic scripting language that is easy to learn and easy to embed in applications. I will show how we use it in HPC for accelerator beam physics as a replacement for C, C++, Fortran and Python, with some benchmarks versus Python, PyPy4 and C/C++.

  7. "Accelerators and Beams," multimedia computer-based training in accelerator physics

    International Nuclear Information System (INIS)

    Silbar, R. R.; Browman, A. A.; Mead, W. C.; Williams, R. A.

    1999-01-01

    We are developing a set of computer-based tutorials on accelerators and charged-particle beams under an SBIR grant from the DOE. These self-paced, interactive tutorials, available for Macintosh and Windows platforms, use multimedia techniques to enhance the user's rate of learning and length of retention of the material. They integrate interactive "On-Screen Laboratories," hypertext, line drawings, photographs, two- and three-dimensional animations, video, and sound. They target a broad audience, from undergraduates or technicians to professionals. Presently, three modules have been published (Vectors, Forces, and Motion), a fourth (Dipole Magnets) has been submitted for review, and three more exist in prototype form (Quadrupoles, Matrix Transport, and Properties of Charged-Particle Beams). Participants in the poster session will have the opportunity to try out these modules on a laptop computer

  8. CONFERENCE: Computers and accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1984-01-15

    In September of last year a Conference on 'Computers in Accelerator Design and Operation' was held in West Berlin attracting some 160 specialists including many from outside Europe. It was a Europhysics Conference, organized by the Hahn-Meitner Institute with Roman Zelazny as Conference Chairman, postponed from an earlier intended venue in Warsaw. The aim was to bring together specialists in the fields of accelerator design, computer control and accelerator operation.

  9. Future accelerators: physics issues

    International Nuclear Information System (INIS)

    Bjorken, J.D.

    1977-11-01

    The high energy physics that could be pursued at future accelerators is discussed. The proposed machines and instruments, the physics issues and opportunities (including brief sketches of outstanding recent results), and the way the proposed machines address these issues are considered. 42 references

  10. Nuclear physics accelerator facilities

    International Nuclear Information System (INIS)

    1988-12-01

    This paper describes many of the nuclear physics heavy-ion accelerator facilities in the US and the research programs being conducted. The accelerators described are: Argonne National Laboratory--ATLAS; Brookhaven National Laboratory--Tandem/AGS Heavy Ion Facility; Brookhaven National Laboratory--Relativistic Heavy Ion Collider (RHIC) (Proposed); Continuous Electron Beam Accelerator Facility; Lawrence Berkeley Laboratory--Bevalac; Lawrence Berkeley Laboratory--88-Inch Cyclotron; Los Alamos National Laboratory--Clinton P. Anderson Meson Physics Facility (LAMPF); Massachusetts Institute of Technology--Bates Linear Accelerator Center; Oak Ridge National Laboratory--Holifield Heavy Ion Research Facility; Oak Ridge National Laboratory--Oak Ridge Electron Linear Accelerator; Stanford Linear Accelerator Center--Nuclear Physics Injector; Texas A&M University--Texas A&M Cyclotron; Triangle Universities Nuclear Laboratory (TUNL); University of Washington--Tandem/Superconducting Booster; and Yale University--Tandem Van de Graaff

  11. FPGA Compute Acceleration for High-Throughput Data Processing in High-Energy Physics Experiments

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The upgrades of the four large LHC experiments at CERN in the coming years will result in a huge increase in data bandwidth for each experiment, which needs to be processed very efficiently. For example, the LHCb experiment will upgrade its detector in 2019/2020 to a 'triggerless' readout scheme, in which all of the readout electronics and several sub-detector parts will be replaced. The new readout electronics will be able to read out the detector at 40 MHz. This increases the data bandwidth from the detector down to the event filter farm to 40 Tbit/s, which must be processed to select the interesting proton-proton collisions for later storage. Designing the architecture of a computing farm that can process this amount of data as efficiently as possible is a challenging task, and several compute accelerator technologies are being considered.    In the high performance computing sector more and more FPGA compute accelerators are being used to improve the compute performance and reduce the...
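
    As a rough consistency check of the figures quoted above (simple arithmetic, not part of the record), 40 Tbit/s arriving at a 40 MHz readout rate corresponds to about 1 Mbit, i.e. roughly 125 kB, per bunch crossing:

      # Back-of-the-envelope check of the quoted LHCb upgrade figures.
      bandwidth_bit_per_s = 40e12      # 40 Tbit/s out of the detector
      readout_rate_hz = 40e6           # 40 MHz readout rate

      bits_per_event = bandwidth_bit_per_s / readout_rate_hz
      print(f"{bits_per_event:.0f} bit/event (~{bits_per_event / 8 / 1e3:.0f} kB/event)")
      # -> 1000000 bit/event (~125 kB/event)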

  12. VLHC accelerator physics

    Energy Technology Data Exchange (ETDEWEB)

    Michael Blaskiewicz et al.

    2001-11-01

    A six-month design study for a future high energy hadron collider was initiated by the Fermilab director in October 2000. The request was to study a staged approach where a large circumference tunnel is built that initially would house a low field (~2 T) collider with center-of-mass energy greater than 30 TeV and a peak (initial) luminosity of 10^34 cm^-2 s^-1. The tunnel was to be scoped, however, to support a future upgrade to a center-of-mass energy greater than 150 TeV with a peak luminosity of 2 x 10^34 cm^-2 s^-1 using high field (~10 T) superconducting magnet technology. In a collaboration with Brookhaven National Laboratory and Lawrence Berkeley National Laboratory, a report of the Design Study was produced by Fermilab in June 2001 [1]. The Design Study focused on a Stage 1, 20 x 20 TeV collider using a 2-in-1 transmission line magnet and leads to a Stage 2, 87.5 x 87.5 TeV collider using 10 T Nb3Sn magnet technology. The article that follows is a compilation of accelerator physics designs and computational results which contributed to the Design Study. Many of the parameters found in this report evolved during the study, and thus slight differences between this text and the Design Study report can be found. The present text, however, presents the major accelerator physics issues of the Very Large Hadron Collider as examined by the Design Study collaboration and provides a basis for discussion and further studies of VLHC accelerator parameters and design philosophies.
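
    For orientation, the peak luminosity figures quoted above relate to the beam parameters through the standard expression for head-on collisions of Gaussian beams (a textbook relation, not a formula taken from the Design Study):

      \mathcal{L} = \frac{f_{\mathrm{rev}}\, n_b\, N_1 N_2}{4\pi\, \sigma_x^{*} \sigma_y^{*}}

    where f_rev is the revolution frequency, n_b the number of colliding bunches, N_1 and N_2 the bunch populations, and sigma*_x, sigma*_y the transverse rms beam sizes at the interaction point.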

  13. Accelerator and radiation physics

    CERN Document Server

    Basu, Samita; Nandy, Maitreyee

    2013-01-01

    "Accelerator and radiation physics" encompasses radiation shielding design and strategies for hadron therapy accelerators, neutron facilities and laser based accelerators. A fascinating article describes detailed transport theory and its application to radiation transport. Detailed information on planning and design of a very high energy proton accelerator can be obtained from the article on radiological safety of J-PARC. Besides safety for proton accelerators, the book provides information on radiological safety issues for electron synchrotron and prevention and preparedness for radiological emergencies. Different methods for neutron dosimetry including LET based monitoring, time of flight spectrometry, track detectors are documented alongwith newly measured experimental data on radiation interaction with dyes, polymers, bones and other materials. Design of deuteron accelerator, shielding in beam line hutches in synchrotron and 14 MeV neutron generator, various radiation detection methods, their characteriza...

  14. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.

  15. Computational physics

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  16. Computational physics

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  17. Accelerator simulation using computers

    International Nuclear Information System (INIS)

    Lee, M.; Zambre, Y.; Corbett, W.

    1992-01-01

    Every accelerator or storage ring system consists of a charged particle beam propagating through a beam line. Although a number of computer programs exist that simulate the propagation of a beam in a given beam line, only a few provide the capabilities for designing, commissioning and operating the beam line. This paper shows how a "multi-track" simulation and analysis code can be used for these applications

  18. Introduction to Accelerator Physics

    International Nuclear Information System (INIS)

    Variola, A.

    2007-01-01

    This short course aims to give high energy physics students a preliminary introduction to accelerator basics. The arguments and the style were selected in this perspective. Consequently, topics such as the definition of beam parameters and luminosity were preferred to much more technical aspects. Calculation details were neglected to allow stronger emphasis on concepts and definitions. Some examples and exercises are suggested to summarize the different topics of the lessons

  19. SSC accelerator physics

    International Nuclear Information System (INIS)

    Anon.

    1985-01-01

    Accelerator physicists at LBL began intensive work on the SSC in 1983, in support of the proposed 6.5-T magnet design, which, in turn, became reference design A during the Reference Designs Study. In that same study, LBL physicists formed the core of the accelerator physics group led by Fermilab's Don Edwards. In a period of only a few months, that group established preliminary parameters for a near-optimal design, produced conceptual designs based on three magnet types, addressed all significant beam lifetime and stability issues, and identified areas requiring further R and D. Since the conclusion of the Reference Designs Study, work has focused on the key SSC design issue, namely, single-particle stability in an imperfect magnetic field. At the end of fiscal 1984, much of the LBL accelerator physics group took its place in the SSC Central Design Group, whose headquarters at LBL will be the focus of nationwide SSC R and D efforts over the next several years

  20. Theoretical problems in accelerator physics

    International Nuclear Information System (INIS)

    1992-01-01

    This report discusses the following research on accelerators: computational methods; higher order mode suppression in accelerator structures; overmoded waveguide components and application to SLED II and power transport; rf sources; accelerator cavity design for a B factory asymmetric collider; and photonic band gap cavities

  1. Nuclear physics accelerator facilities

    International Nuclear Information System (INIS)

    1985-01-01

    The Department of Energy's Nuclear Physics program is a comprehensive program of interdependent experimental and theoretical investigation of atomic nuclei. Long range goals are an understanding of the interactions, properties, and structures of atomic nuclei and nuclear matter at the most elementary level possible and an understanding of the fundamental forces of nature by using nuclei as a proving ground. Basic ingredients of the program are talented and imaginative scientists and a diversity of facilities to provide the variety of probes, instruments, and computational equipment needed for modern nuclear research. Approximately 80% of the total Federal support of basic nuclear research is provided through the Nuclear Physics program; almost all of the remaining 20% is provided by the National Science Foundation. Thus, the Department of Energy (DOE) has a unique responsibility for this important area of basic science and its role in high technology. Experimental and theoretical investigations are leading us to conclude that a new level of understanding of atomic nuclei is achievable. This optimism arises from evidence that: (1) the mesons, protons, and neutrons which are inside nuclei are themselves composed of quarks and gluons and (2) quantum chromodynamics can be developed into a theory which both describes correctly the interaction among quarks and gluons and is also an exact theory of the strong nuclear force. These concepts are important drivers of the Nuclear Physics program

  2. Computer control applied to accelerators

    CERN Document Server

    Crowley-Milling, Michael C

    1974-01-01

    The differences that exist between control systems for accelerators and other types of control systems are outlined. It is further indicated that earlier accelerators had manual control systems to which computers were added, but that it is essential for the new, large accelerators to include computers in the control systems right from the beginning. Details of the computer control designed for the Super Proton Synchrotron are presented. The method of choosing the computers is described, as well as the reasons for CERN having to design the message transfer system. The items discussed include: CAMAC interface systems, a new multiplex system, operator-to-computer interaction (such as touch screen, computer-controlled knob, and non-linear track-ball), and high-level control languages. Brief mention is made of the contributions of other high-energy research laboratories as well as of some other computer control applications at CERN. (0 refs).

  3. Particle accelerator physics

    CERN Document Server

    Wiedemann, Helmut

    2015-01-01

    This book by Helmut Wiedemann is a well-established, classic text, providing an in-depth and comprehensive introduction to the field of high-energy particle acceleration and beam dynamics. The present 4th edition has been significantly revised, updated and expanded. The newly conceived Part I is an elementary introduction to the subject matter for undergraduate students. Part II gathers the basic tools in preparation of a more advanced treatment, summarizing the essentials of electrostatics and electrodynamics as well as of particle dynamics in electromagnetic fields. Part III is an extensive primer in beam dynamics, followed, in Part IV, by an introduction and description of the main beam parameters and including a new chapter on beam emittance and lattice design. Part V is devoted to the treatment of perturbations in beam dynamics. Part VI then discusses the details of charged particle acceleration. Parts VII and VIII introduce the more advanced topics of coupled beam dynamics and describe very intense bea...

  4. Neutrino physics and accelerators

    International Nuclear Information System (INIS)

    Kaftanov, V.

    1978-01-01

    The history is described of experiments aimed at the study of direct neutrino-matter interactions conducted in the past twenty years. Experiments are outlined carried out with the objective of proving the existence of the intermediate W meson which had been predicted by the weak interaction theory. The methods of obtaining neutrino beams using accelerators and the detectors used are briefly shown. Also described are experiments to be conducted in the near future in different laboratories. (Z.J.)

  5. Neutron physics with accelerators

    Science.gov (United States)

    Colonna, N.; Gunsing, F.; Käppeler, F.

    2018-07-01

    Neutron-induced nuclear reactions are of key importance for a variety of applications in basic and applied science. Apart from nuclear reactors, accelerator-based neutron sources play a major role in experimental studies, especially for the determination of reaction cross sections over a wide energy span from sub-thermal to GeV energies. After an overview of present and upcoming facilities, this article deals with state-of-the-art detectors and equipment, including the often difficult sample problem. These issues are illustrated at selected examples of measurements for nuclear astrophysics and reactor technology with emphasis on their intertwined relations.
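
    As an aside on the time-of-flight technique that underlies many of these cross-section measurements, the sketch below converts a measured flight time over a known path length into a (non-relativistic) neutron kinetic energy. The flight path and time are illustrative values only, not taken from the record.

      M_N_EV = 939.565e6          # neutron rest mass [eV/c^2]
      C = 299_792_458.0           # speed of light [m/s]

      def neutron_energy_ev(flight_path_m: float, flight_time_s: float) -> float:
          """Non-relativistic neutron kinetic energy from its time of flight."""
          v = flight_path_m / flight_time_s        # velocity [m/s]
          return 0.5 * M_N_EV * (v / C) ** 2       # E = 1/2 m v^2, with m expressed in eV/c^2

      # Example: a 185 m flight path and a 13.4 ms flight time give roughly 1 eV.
      print(f"{neutron_energy_ev(185.0, 13.4e-3):.3f} eV")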

  6. Personal computers in accelerator control

    International Nuclear Information System (INIS)

    Anderssen, P.S.

    1988-01-01

    The advent of the personal computer has created a popular movement which has also made a strong impact on science and engineering. Flexible software environments combined with good computational performance and large storage capacities are becoming available at steadily decreasing costs. Of equal importance, however, is the quality of the user interface offered on many of these products. Graphics and screen interaction are available in ways that were only possible on specialized systems before. Accelerator engineers were quick to pick up the new technology. The first applications were probably for controllers and data gatherers for beam measurement equipment. Others followed, and today it is conceivable to make the personal computer a standard component of an accelerator control system. This paper reviews the experience gained at CERN so far and describes the approach taken in the design of the common control center for the SPS and the future LEP accelerators. The design goal has been to be able to integrate personal computers into the accelerator control system and to build the operator's workplace around it. (orig.)

  7. Accelerator physics and modeling: Proceedings

    International Nuclear Information System (INIS)

    Parsa, Z.

    1991-01-01

    This report contains papers on the following topics: Physics of high brightness beams; radio frequency beam conditioner for fast-wave free-electron generators of coherent radiation; wake-field and space-charge effects on high brightness beams. Calculations and measured results for BNL-ATF; non-linear orbit theory and accelerator design; general problems of modeling for accelerators; development and application of dispersive soft ferrite models for time-domain simulation; and bunch lengthening in the SLC damping rings

  8. Symbolic mathematical computing: orbital dynamics and application to accelerators

    International Nuclear Information System (INIS)

    Fateman, R.

    1986-01-01

    Computer-assisted symbolic mathematical computation has become increasingly useful in applied mathematics. A brief introduction to such capabilities and some examples related to orbital dynamics and accelerator physics are presented. (author)

  9. ACCELERATION PHYSICS CODE WEB REPOSITORY.

    Energy Technology Data Exchange (ETDEWEB)

    WEI, J.

    2006-06-26

    In the framework of the CARE HHH European Network, we have developed a web-based dynamic accelerator-physics code repository. We describe the design, structure and contents of this repository, illustrate its usage, and discuss our future plans, with emphasis on code benchmarking.

  10. Chaotic dynamics in accelerator physics

    International Nuclear Information System (INIS)

    Cary, J.R.

    1992-01-01

    Substantial progress was made in several areas of accelerator dynamics. Work on developing an understanding of longitudinal adiabatic dynamics, and on creating efficiency enhancements of recirculating free-electron lasers, was substantially completed. A computer code for analyzing the critical KAM tori that bound the dynamic aperture in circular machines was developed. Studies of modes that arise due to the interaction of coasting beams with a narrow-spectrum impedance have begun. During this research, educational and research ties with the accelerator community at large have been strengthened

  11. Accelerator Physics Section progress report

    International Nuclear Information System (INIS)

    Coote, G.E.

    1986-05-01

    This report summarizes the work of the Accelerator Physics Section of the Institute of Nuclear Sciences during the period January-December 1985. Applications of the EN-tandem accelerator included ¹³N production for tracer experiments in plants and animals, hydrogen profiling with a ¹⁹F beam and direct detection of heavy ions with a surface barrier detector. Preparations for accelerator mass spectrometry continued steadily, with the commissioning of the pulsed EHT supply which selects the isotope to be accelerated, routine detection of ¹⁴C ions, and completion of a sputter ion source with an eight-position target wheel. It was shown that the hydrogen content of a material could be derived from a simultaneous measurement of the transmission of neutrons and gamma rays from a neutron source or accelerator target. The ¹¹CO₂ produced at the 3 MV accelerator was used in two studies of translocation in a large number of plant species: the effects of small quantities of SO₂ in the air, and the effect of cooling a short length of the stem. The nuclear microprobe was applied to studies of carbon pickup during welding of stainless steel, determination of trace elements in soil and vegetation and the measurement of sodium depth profiles in obsidian - in particular the effect of rastering the incident proton beams

  12. Research in accelerator physics (theory)

    International Nuclear Information System (INIS)

    Ohnuma, Shoroku.

    1993-01-01

    The authors discuss the present status, expected effort during the remainder of the project, and some of the results of their activities since the beginning of the project. Some of the areas covered are: (1) effects of helical insertion devices on beam dynamics; (2) coupling impedance of apertures in accelerator beam pipes; (3) new calculation of diffusion rate; (4) integrable polynomial factorization for symplectic map tracking; and (5) physics of magnet sorting in superconducting rings

  13. Compensation Techniques in Accelerator Physics

    Energy Technology Data Exchange (ETDEWEB)

    Sayed, Hisham Kamal [Old Dominion Univ., Norfolk, VA (United States)]

    2011-05-01

    Accelerator physics is one of the most diverse multidisciplinary fields of physics, wherein the dynamics of particle beams is studied. It takes more than the understanding of basic electromagnetic interactions to be able to predict the beam dynamics, and to be able to develop new techniques to produce, maintain, and deliver high quality beams for different applications. In this work, some basic theory regarding particle beam dynamics in accelerators will be presented. This basic theory, along with applying state of the art techniques in beam dynamics will be used in this dissertation to study and solve accelerator physics problems. Two problems involving compensation are studied in the context of the MEIC (Medium Energy Electron Ion Collider) project at Jefferson Laboratory. Several chromaticity (the energy dependence of the particle tune) compensation methods are evaluated numerically and deployed in a figure eight ring designed for the electrons in the collider. Furthermore, transverse coupling optics have been developed to compensate the coupling introduced by the spin rotators in the MEIC electron ring design.
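
    For reference, the chromaticity mentioned above is the linear dependence of the betatron tune on the relative momentum deviation, and sextupoles placed in dispersive regions are the standard correctors. In one common sign convention (a textbook relation, not quoted from the dissertation):

      \xi_x = \frac{\Delta Q_x}{\Delta p / p} \simeq -\frac{1}{4\pi}\oint \beta_x(s)\,\bigl[k_1(s) - m(s)\,D_x(s)\bigr]\,\mathrm{d}s

    where k_1 is the normalized quadrupole gradient, m the normalized sextupole strength and D_x the horizontal dispersion; the sextupole term is adjusted to cancel the natural (quadrupole) contribution.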

  14. Department of Accelerator Physics and Technology: Overview

    International Nuclear Information System (INIS)

    Plawski, E.

    2004-01-01

    Full text: Due to the drastic reduction (in previous years) of the scientific and technical staff of the Department, our basic work in 2003 was limited to the following subjects: - the development of a radiographic 4 MeV electron accelerator, - computational verification of basic parameters of a simplified version of the "6/15 MeV" medical accelerator, - continuation of the study of photon and electron spectra of narrow photon beams with the use of the BEAMnrc Monte Carlo codes, - a study of accelerating and deflecting travelling wave RF structures based on experience already gained. The small 4-6 MeV electron linac was constructed in the Department as a tool for radiographic services which may be offered by our Institute. In 2003, the most important sub-units of the accelerator were constructed and completed. An accelerated electron beam intensity of up to 80 mA was already obtained; for the following year, energy spectrum measurements, energy and intensity optimisation for e-/X-ray conversion and also first exposures are planned. Because in the realisation of the 6/15 MeV Accelerator Project the Department was responsible for calculations of beam guiding and acceleration (accelerating section with triode electron gun, beam focusing, achromatic deviation), some verifying computations were done last year. These concerned mainly the influence of variations of the gun injection energy and of RF frequency shifts on beam dynamics. The computational codes written in the Department are still used and continuously developed for this and similar purposes. The triode gun, originally conceived as a part of the 6/15 MeV medical accelerator, is on long-term testing, showing very good performance; a new pulse modulator for that sub-unit was designed. The Monte Carlo calculations of narrow photon beams are continued. Intensity modulated radiation therapy (IMRT) is expected to play a dominant role in the years to come. Our principal researcher, after receiving a PhD degree, collaborates on IMRT

  15. Analytical tools in accelerator physics

    Energy Technology Data Exchange (ETDEWEB)

    Litvinenko, V.N.

    2010-09-01

    This paper is a sub-set of my lectures presented in the Accelerator Physics course (USPAS, Santa Rosa, California, January 14-25, 2008). It is based on notes I wrote during the period from 1976 to 1979 in Novosibirsk. Only a few copies (in Russian) were distributed to my colleagues in the Novosibirsk Institute of Nuclear Physics. The goal of these notes is a complete description starting from an arbitrary reference orbit, with explicit expressions for the 4-potential and the accelerator Hamiltonian, and finishing with the parameterization in action and angle variables. To a large degree they follow the logic developed in Theory of Cyclic Particle Accelerators by A.A. Kolomensky and A.N. Lebedev [Kolomensky], but go beyond the book in a number of directions. One unusual feature of these notes is the use of matrix functions and the Sylvester formula for calculating the matrices of arbitrary elements. Teaching the USPAS course motivated me to translate a significant part of my notes into English. I also included some introductory material following Classical Theory of Fields by L.D. Landau and E.M. Lifshitz [Landau]. A large number of short notes covering various techniques are placed in the Appendices.
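
    As a small illustration of the Sylvester-formula technique mentioned above (an illustrative Python sketch with invented numbers, not code from the lecture notes): for a 2x2 matrix with distinct eigenvalues, a function of the matrix can be assembled from the function evaluated at its eigenvalues.

      import numpy as np

      def sylvester_2x2(M, f):
          """f(M) for a 2x2 matrix with distinct eigenvalues, via Sylvester's formula."""
          l1, l2 = np.linalg.eigvals(M)
          I = np.eye(2)
          return f(l1) * (M - l2 * I) / (l1 - l2) + f(l2) * (M - l1 * I) / (l2 - l1)

      # One-turn matrix of a linear ring in Courant-Snyder form (assumed numbers).
      mu, beta = 2.0 * np.pi * 0.27, 8.0
      M = np.array([[np.cos(mu),         beta * np.sin(mu)],
                    [-np.sin(mu) / beta, np.cos(mu)]])

      # The 100-turn map from Sylvester's formula agrees with the direct matrix power.
      M100 = sylvester_2x2(M, lambda lam: lam ** 100).real
      print(np.allclose(M100, np.linalg.matrix_power(M, 100)))   # True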

  16. Analytical tools in accelerator physics

    International Nuclear Information System (INIS)

    Litvinenko, V.N.

    2010-01-01

    This paper is a sub-set of my lectures presented in the Accelerator Physics course (USPAS, Santa Rosa, California, January 14-25, 2008). It is based on notes I wrote during the period from 1976 to 1979 in Novosibirsk. Only a few copies (in Russian) were distributed to my colleagues in the Novosibirsk Institute of Nuclear Physics. The goal of these notes is a complete description starting from an arbitrary reference orbit, with explicit expressions for the 4-potential and the accelerator Hamiltonian, and finishing with the parameterization in action and angle variables. To a large degree they follow the logic developed in Theory of Cyclic Particle Accelerators by A.A. Kolomensky and A.N. Lebedev (Kolomensky), but go beyond the book in a number of directions. One unusual feature of these notes is the use of matrix functions and the Sylvester formula for calculating the matrices of arbitrary elements. Teaching the USPAS course motivated me to translate a significant part of my notes into English. I also included some introductory material following Classical Theory of Fields by L.D. Landau and E.M. Lifshitz (Landau). A large number of short notes covering various techniques are placed in the Appendices.

  17. Electron accelerators and nuclear physics

    International Nuclear Information System (INIS)

    Frois, B.

    1989-01-01

    The operating electron accelerators and their importance for developments in nuclear and particle physics are outlined. The principles of probing the nucleus with electron scattering techniques and the main experimental results are summarized. In order to understand hadron interactions and the dynamics of quark confinement in nuclei, high energy electrons must provide quantitative data on the following topics: the structure of the nucleon, the role of non-nucleonic components in nuclei, the nature of short-range nucleon correlations, the origin of the short-range part of nuclear forces, and the effects of the nuclear medium on quark distributions. To progress in the knowledge of nuclear structure it is necessary to build a coherent strategy of accelerator developments in Europe

  18. Health physics practices at research accelerators

    International Nuclear Information System (INIS)

    Thomas, R.H.

    1976-02-01

    A review is given of the uses of particle accelerators in health physics, the text being a short course given at the Health Physics Society Ninth Midyear Topical Symposium in February, 1976. Topics discussed include: (1) the radiation environment of high energy accelerators; (2) dosimetry at research accelerators; (3) shielding; (4) induced activity; (5) environmental impact of high energy accelerators; (6) population dose equivalent calculation; and (7) the application of the "as low as practicable" concept at accelerators

  19. SALOME: An Accelerator for the Practical Course in Accelerator Physics

    OpenAIRE

    Miltchev, Velizar; Riebesehl, Daniel; Roßbach, Jörg; Trunk, Maximilian; Stein, Oliver

    2014-01-01

    SALOME (Simple Accelerator for Learning Optics and the Manipulation of Electrons) is a short low energy linear electron accelerator built by the University of Hamburg. The goal of this project is to give the students the possibility to obtain hands-on experience with the basics of accelerator physics. In this contribution the layout of the device will be presented. The most important components of the accelerator will be discussed and an overview of the planned demonstration experiments will ...

  20. CERN Accelerator School: Registration open for Advanced Accelerator Physics course

    CERN Multimedia

    2015-01-01

    Registration is now open for the CERN Accelerator School’s Advanced Accelerator Physics course to be held in Warsaw, Poland from 27 September to 9 October 2015.   The course will be of interest to physicists and engineers who wish to extend their knowledge of accelerator physics. The programme offers core lectures on accelerator physics in the mornings and a practical course with hands-on tuition in the afternoons.  Further information can be found at: http://cas.web.cern.ch/cas/Poland2015/Warsaw-advert.html http://indico.cern.ch/event/361988/

  1. CERN Accelerator School: Registration open for Advanced Accelerator Physics course

    CERN Multimedia

    2015-01-01

    Registration is now open for the CERN Accelerator School’s Advanced Accelerator Physics course to be held in Warsaw, Poland from 27 September to 9 October 2015.   The course will be of interest to physicists and engineers who wish to extend their knowledge of Accelerator Physics. The programme offers core lectures on accelerator physics in the mornings and a practical course with hands-on tuition in the afternoons.  Further information can be found at: http://cas.web.cern.ch/cas/Poland2015/Warsaw-advert.html http://indico.cern.ch/event/361988/

  2. Accelerating Innovation: How Nuclear Physics Benefits Us All

    Science.gov (United States)

    2011-01-01

    Innovation has been accelerated by nuclear physics in the areas of improving our health; making the world safer; electricity, the environment and archaeology; better computers; contributions to industry; and training the next generation of innovators.

  3. Department of Accelerator Physics and Technology: Overview

    International Nuclear Information System (INIS)

    Pachan, M.

    1999-01-01

    ... laboratory. Additional radiation shielding was constructed and a computer-assisted system for dosimetric monitoring was installed. Three experimental set-ups for electron and photon beam diagnostics are in the course of installation and running, at 4-5 MeV, 10-15 MeV, and 20 MeV. The 20 MeV unit will also be used for the generation and metrology of narrow photon beams applicable in stereotactic radiosurgery. Preliminary design work has been undertaken on an important project - high-power electron accelerators for radiation technology (10 MeV, 20-50 kW). Financial support for this task is still pending. A substantial part of the Department's activity was oriented towards international collaboration with accelerator physics centres. Two works completed in 1997 were extended in 1998: a microwave pulsed generator intended for short beam bunch diagnostics was installed and put into operation at INFN-Frascati, and 27 polarized "door-knob" r.f. couplers for superconducting cavities in the HERA ring were installed and put into operation. In the course of 1998 we received word from DESY that the couplers are working well and have brought a desirable improvement in operational reliability. The new item of collaboration with DESY is the design, construction and r.f. measurement of a copper model of the accelerating "superstructure" for the TESLA collider. If successful, the use of the niobium "superstructure" could shorten the TESLA linear accelerator by a few kilometres. The first four 1 m sections of model structures were sent to DESY at the end of 1998. The next four are in preparation. Some results of work done in 1998 were presented at conferences in Caen, Stockholm and Cracow

  4. Department of Accelerator Physics and Technology - Overview

    International Nuclear Information System (INIS)

    Plawski, E.

    2007-01-01

    ... superconducting cavity RF power couplers. The 18 MeV Electron Accelerator Stand with the linear accelerator Saturn was prepared for experimental work, and can be used in neutron detector investigations and for accelerating structure research. To increase the reliability of operation, upgrading of the computer control system is foreseen next year. The aim of the preliminary study of accelerating structures in C-band is the search for electron accelerator miniaturization. At higher frequencies, much higher accelerating fields can be applied, and as the wavelength becomes shorter, the overall size of the structure and its various components becomes smaller. In 2006 the main physical parameters of 5720 MHz standing-wave side-coupled structures were optimized. For that frequency, suitable high power klystrons and a variety of necessary microwave equipment exist on the market. Monte Carlo simulations using BEAMnrc/EGSnrc were carried out to study the influence of possible errors in the assignment of CT numbers (coefficients of X-ray attenuation in tissue) on the calculated ion range in hadron therapy. This work was done in Heidelberg by A. Wysocka-Rabin in the frame of our collaboration with DKFZ. At ENEA-Frascati a linear accelerator for protons called TOP (Terapia Oncologica con Protoni, Oncological Proton Therapy) is under construction. Basically it is a proton linac of modified Alvarez type operating at 3000 MHz and delivering a beam in the energy range from 65 MeV to 200 MeV. In 2005 a contract was signed between ENEA and IPJ-Swierk on the basis of which the Accelerator Physics Department of IPJ will design, produce and deliver to Frascati the input section of the 65 MeV linac. This section, of SCDTL type, will increase the proton energy from 7 to 17 MeV. The design is almost finished; many elements are manufactured and ready for assembly. This will take place in 2007. (author)

  5. Computational physics an introduction

    CERN Document Server

    Vesely, Franz J

    1994-01-01

    Author Franz J. Vesely offers students an introductory text on computational physics, providing them with the important basic numerical/computational techniques. His unique text sets itself apart from others by focusing on specific problems of computational physics. The author also provides a selection of modern fields of research. Students will benefit from the appendixes which offer a short description of some properties of computing and machines and outline the technique of 'Fast Fourier Transformation.'

  6. Contribution to the algorithmic and efficient programming of new parallel architectures including accelerators for neutron physics and shielding computations

    International Nuclear Information System (INIS)

    Dubois, J.

    2011-01-01

    In science, simulation is a key process for research or validation. Modern computer technology allows faster numerical experiments, which are cheaper than real models. In the field of neutron simulation, the calculation of eigenvalues is one of the key challenges. The complexity of these problems is such that a lot of computing power may be necessary. The work of this thesis is, first, the evaluation of new computing hardware such as graphics cards or massively multi-core chips, and their application to eigenvalue problems for neutron simulation. Then, in order to address the massive parallelism of national supercomputers, we also study the use of asynchronous hybrid methods for solving eigenvalue problems at this very high level of parallelism. We then carry out this work on several national supercomputers, such as the Titane hybrid machine of the Computing Centre for Research and Technology (CCRT), the Curie machine of the Very Large Computing Centre (TGCC), currently being installed, and the Hopper machine at the Lawrence Berkeley National Laboratory (LBNL). We also run our experiments on local workstations to illustrate the relevance of this research for everyday use with local computing resources. (author)
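
    As a minimal illustration of the eigenvalue problems discussed in the thesis, the sketch below applies the classic power iteration to a dense random matrix standing in for a (much larger) neutron transport operator. It is a generic illustration in Python, not the thesis code and not an actual transport solver.

      import numpy as np

      def power_iteration(A, iters=200):
          """Dominant eigenvalue and eigenvector of A by power iteration."""
          x = np.ones(A.shape[0])
          lam = 0.0
          for _ in range(iters):
              y = A @ x
              lam = np.linalg.norm(y)     # eigenvalue estimate (dominant eigenvalue assumed positive)
              x = y / lam                 # re-normalize the iterate
          return lam, x

      rng = np.random.default_rng(0)
      A = rng.random((200, 200))          # stand-in for a discretized transport operator
      lam, _ = power_iteration(A)
      print(abs(lam - np.max(np.abs(np.linalg.eigvals(A)))) < 1e-8)   # True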

  7. Computations in plasma physics

    International Nuclear Information System (INIS)

    Cohen, B.I.; Killeen, J.

    1984-01-01

    A review of computer applications in plasma physics is presented. The contribution of computers to the investigation of magnetic and inertial plasma confinement and of charged particle beam propagation is described. Typical uses of computers for the simulation and control of laboratory and space plasma experiments and for data acquisition in these experiments are considered. Basic computational methods applied in plasma physics are discussed. Future trends in computer utilization in plasma research are considered in terms of the increasing role of microprocessors and high-speed data plotters and the necessity of more powerful computer applications

  8. Multi-scale multi-physics computational chemistry simulation based on ultra-accelerated quantum chemical molecular dynamics method for structural materials in boiling water reactor

    International Nuclear Information System (INIS)

    Miyamoto, Akira; Sato, Etsuko; Sato, Ryo; Inaba, Kenji; Hatakeyama, Nozomu

    2014-01-01

    In collaboration with experimental experts we have reported in the present conference (Hatakeyama, N. et al., “Experiment-integrated multi-scale, multi-physics computational chemistry simulation applied to corrosion behaviour of BWR structural materials”) the results of multi-scale multi-physics computational chemistry simulations applied to the corrosion behaviour of BWR structural materials. In the macro-scale, a macroscopic simulator of the anode polarization curve was developed to solve the spatially one-dimensional electrochemical equations on the material surface at the continuum level, in order to understand the corrosion behaviour of a typical BWR structural material, SUS304. The experimental anode polarization behaviour of each pure metal was reproduced by fitting all the rates of the electrochemical reactions, and the anode polarization curve of SUS304 was then calculated using the same parameters and found to reproduce the experimental behaviour successfully. In the meso-scale, a kinetic Monte Carlo (KMC) simulator was applied to an actual-time simulation of the morphological corrosion behaviour under the influence of an applied voltage. In the micro-scale, an ultra-accelerated quantum chemical molecular dynamics (UA-QCMD) code was applied to various metal oxide surfaces of Fe2O3, Fe3O4 and Cr2O3, modelled together with water molecules and dissolved metallic ions on the surfaces; the dissolution and segregation behaviours were then successfully simulated dynamically using UA-QCMD. In this paper we describe details of the multi-scale, multi-physics computational chemistry method, especially the UA-QCMD method. This method is approximately 10,000,000 times faster than conventional first-principles molecular dynamics methods based on density-functional theory (DFT), and its accuracy was also validated for various metals and metal oxides against DFT results. To assure multi-scale multi-physics computational chemistry simulation based on the UA-QCMD method for
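
    The kinetic Monte Carlo step used at the meso-scale can be sketched generically as follows (a textbook rejection-free KMC update written in Python, with invented event rates; it is not the simulator described in the paper):

      import numpy as np

      rng = np.random.default_rng(1)

      def kmc_step(rates, t):
          """One rejection-free kinetic Monte Carlo step (BKL/Gillespie style)."""
          total = rates.sum()
          # choose an event with probability proportional to its rate
          event = np.searchsorted(np.cumsum(rates), rng.random() * total)
          # advance the clock by an exponentially distributed waiting time
          t += rng.exponential(1.0 / total)
          return event, t

      # Invented example: three surface events (dissolution, adsorption, diffusion).
      rates = np.array([2.0, 0.5, 10.0])   # events per unit time, illustrative only
      t = 0.0
      for _ in range(5):
          event, t = kmc_step(rates, t)
          print(f"t = {t:8.4f}   event = {event}")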

  9. Acceleration parameters for fluid physics with accelerating bodies

    CSIR Research Space (South Africa)

    Gledhill, Irvy MA

    2016-06-01

    ... to an acceleration parameter that appears to be new in fluid physics, but is known in cosmology. A selection of cases for rectilinear acceleration has been chosen to illustrate the point that this parameter alone does not govern regimes of flow about significantly...

  10. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Spentzouris, P.; /Fermilab; Cary, J.; /Tech-X, Boulder; McInnes, L.C.; /Argonne; Mori, W.; /UCLA; Ng, C.; /SLAC; Ng, E.; Ryne, R.; /LBL, Berkeley

    2011-11-14

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization

  11. Computer-based training for particle accelerator personnel

    International Nuclear Information System (INIS)

    Silbar, R.R.

    1999-01-01

    A continuing problem at many laboratories is the training of new operators in the arcane technology of particle accelerators. Presently most of this training occurs on the job, under a mentor. Such training is expensive, and while it provides operational experience, it is frequently lax in providing the physics background needed to truly understand accelerator systems. Using computers in a self-paced, interactive environment can be more effective in meeting this training need. copyright 1999 American Institute of Physics

  12. CAS CERN Accelerator School: Advanced accelerator physics. Proceedings. Vol. 2

    International Nuclear Information System (INIS)

    Turner, S.

    1987-01-01

    This advanced course on general accelerator physics is the second of the biennial series given by the CERN Accelerator School and follows on from the first basic course given at Gif-sur-Yvette, Paris, in 1984. Stress is placed on the mathematical tools of Hamiltonian mechanics and the Vlasov and Fokker-Planck equations, which are widely used in accelerator theory. The main topics treated in this present work include: nonlinear resonances, chromaticity, motion in longitudinal phase space, growth and control of longitudinal and transverse beam emittance, space-charge effects and polarization. The seminar programme treats some specific accelerator techniques, devices, projects and future possibilities. (orig.)
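
    For completeness, the Vlasov equation referred to above governs the evolution of the phase-space density f(x, p, t) of a collisionless beam; the Fokker-Planck equation adds damping and diffusion terms on the right-hand side (standard textbook forms, quoted here only for orientation):

      \frac{\partial f}{\partial t} + \dot{\mathbf{x}}\cdot\frac{\partial f}{\partial \mathbf{x}} + \dot{\mathbf{p}}\cdot\frac{\partial f}{\partial \mathbf{p}} = 0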

  13. Physics vs. computer science

    International Nuclear Information System (INIS)

    Pike, R.

    1982-01-01

    With computers becoming more frequently used in theoretical and experimental physics, physicists can no longer afford to be ignorant of the basic techniques and results of computer science. Computing principles belong in a physicist's tool box, along with experimental methods and applied mathematics, and the easiest way to educate physicists in computing is to provide, as part of the undergraduate curriculum, a computing course designed specifically for physicists. As well, the working physicist should interact with computer scientists, giving them challenging problems in return for their expertise. (orig.)

  14. Computing in high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Watase, Yoshiyuki

    1991-09-15

    The increasingly important role played by computing and computers in high energy physics is displayed in the 'Computing in High Energy Physics' series of conferences, bringing together experts in different aspects of computing - physicists, computer scientists, and vendors.

  15. Applied computational physics

    CERN Document Server

    Boudreau, Joseph F; Bianchi, Riccardo Maria

    2018-01-01

    Applied Computational Physics is a graduate-level text stressing three essential elements: advanced programming techniques, numerical analysis, and physics. The goal of the text is to provide students with essential computational skills that they will need in their careers, and to increase the confidence with which they write computer programs designed for their problem domain. The physics problems give them an opportunity to reinforce their programming skills, while the acquired programming skills augment their ability to solve physics problems. The C++ language is used throughout the text. Physics problems include Hamiltonian systems, chaotic systems, percolation, critical phenomena, few-body and multi-body quantum systems, quantum field theory, simulation of radiation transport, and data modeling. The book, the fruit of a collaboration between a theoretical physicist and an experimental physicist, covers a broad range of topics from both viewpoints. Examples, program libraries, and additional documentatio...

  16. Community petascale project for accelerator science and simulation: Advancing computational science for future accelerators and accelerator technologies

    International Nuclear Information System (INIS)

    Spentzouris, P.; Cary, J.; McInnes, L.C.; Mori, W.; Ng, C.; Ng, E.; Ryne, R.

    2008-01-01

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R and D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  17. Space charge physics for particle accelerators

    CERN Document Server

    Hofmann, Ingo

    2017-01-01

    Understanding and controlling the physics of space charge effects in linear and circular proton and ion accelerators are essential to their operation, and to future high-intensity facilities. This book presents the status quo of this field from a theoretical perspective, compares analytical approaches with multi-particle computer simulations and – where available – with experiments. It discusses fundamental concepts of phase space motion, matched beams and modes of perturbation, along with mathematical models of analysis – from envelope to Vlasov-Poisson equations. The main emphasis is on providing a systematic description of incoherent and coherent resonance phenomena; parametric instabilities and sum modes; mismatch and halo; error driven resonances; and emittance exchange due to anisotropy, as well as the role of Landau damping. Their distinctive features are elaborated in the context of numerous sample simulations, and their potential impacts on beam quality degradation and beam loss are discussed....

  18. Computational Physics' Greatest Hits

    Science.gov (United States)

    Bug, Amy

    2011-03-01

    The digital computer has worked its way so effectively into our profession that now, roughly 65 years after its invention, it is virtually impossible to find a field of experimental or theoretical physics unaided by computational innovation. It is tough to think of another device about which one can make that claim. In the session ``What is computational physics?'' speakers will distinguish computation within the field of computational physics from this ubiquitous importance across all subfields of physics. This talk will recap the invited session ``Great Advances...Past, Present and Future'' in which five dramatic areas of discovery (five of our ``greatest hits'') are chronicled: The physics of many-boson systems via Path Integral Monte Carlo, the thermodynamic behavior of a huge number of diverse systems via Monte Carlo Methods, the discovery of new pharmaceutical agents via molecular dynamics, predictive simulations of global climate change via detailed, cross-disciplinary earth system models, and an understanding of the formation of the first structures in our universe via galaxy formation simulations. The talk will also identify ``greatest hits'' in our field from the teaching and research perspectives of other members of DCOMP, including its Executive Committee.

  19. Accelerating Climate Simulations Through Hybrid Computing

    Science.gov (United States)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

    Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPU) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for connection, we identified two challenges: (1) an identical MPI implementation is required on both systems, and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors, two IBM QS22 Cell blades, connected with Infiniband), allowing for seamlessly offloading compute-intensive functions to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.
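
    The offloading pattern described above can be illustrated with a minimal MPI sketch. The snippet below uses mpi4py rather than the IBM DAV middleware mentioned in the record, and the kernel is a hypothetical stand-in for a compute-intensive routine such as the solar radiation model; it only shows the host/accelerator message pattern, not the actual climate code.

        # Minimal host/accelerator offload sketch using mpi4py (illustrative only,
        # not the IBM DAV middleware). Rank 0 plays the conventional "host" process,
        # rank 1 plays the remote accelerator. Run with: mpiexec -n 2 python offload_sketch.py
        import numpy as np
        from mpi4py import MPI

        def radiation_kernel(columns):
            # Hypothetical stand-in for the compute-intensive physics routine.
            return np.tanh(columns) ** 2

        comm = MPI.COMM_WORLD
        if comm.Get_rank() == 0:
            work = np.random.rand(4, 1024)        # fake input columns
            comm.Send(work, dest=1, tag=0)        # ship input to the accelerator rank
            result = np.empty_like(work)
            comm.Recv(result, source=1, tag=1)    # collect the offloaded result
            print("offloaded result checksum:", result.sum())
        else:
            buf = np.empty((4, 1024))
            comm.Recv(buf, source=0, tag=0)
            comm.Send(radiation_kernel(buf), dest=0, tag=1)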

  20. [Accelerator physics R&D]

    International Nuclear Information System (INIS)

    Krisch, A.D.

    1994-01-01

    This report discusses the NEPTUN-A experiment that will study spin effects in violent proton-proton collisions; the Siberian snake tests at IUCF cooler ring; polarized gas jets; and polarized proton acceleration to 1 TeV at Fermilab

  1. Accelerator physics issues at the SSC

    International Nuclear Information System (INIS)

    Dugan, G.F.

    1993-05-01

    Realization of the design energy and luminosity goals of the Superconducting Super Collider (SSC) will require proper resolutions of a number of challenging problems in accelerator physics. The status of several salient issues in the design of the SSC will be reviewed and updated in this paper. The emphasis will be on the superconducting accelerators

  2. CAS CERN Accelerator School second advanced accelerator physics course

    International Nuclear Information System (INIS)

    Turner, S.

    1989-01-01

    The advanced course on general accelerator physics given in West Berlin closely followed that organised by the CERN Accelerator School at Oxford in September 1985 and whose proceedings were published as CERN Yellow Report 87-03 (1987). However, certain subjects were treated in a different way, improved or extended, while some new ones were introduced and it is all of these which are included in the present proceedings. The lectures include particle-photon interactions, high-brilliance lattices and single/multiple Touschek effect, while the seminars are on the major accelerators presently under construction or proposed for the near future, applications of synchrotron radiation, free-electron lasers, cosmic accelerators and crystal beams. Also included are errata, and addenda to some of the lectures, of CERN 87-03. (orig.)

  3. Computing tools for accelerator design calculations

    International Nuclear Information System (INIS)

    Fischler, M.; Nash, T.

    1984-01-01

    This note is intended as a brief, summary guide for accelerator designers to the new generation of commercial and special processors that allow great increases in computing cost effectiveness. New thinking is required to take best advantage of these computing opportunities, in particular, when moving from analytical approaches to tracking simulations. In this paper, we outline the relevant considerations

  4. CAS CERN Accelerator School. Third advanced accelerator physics course

    International Nuclear Information System (INIS)

    Turner, S.

    1990-01-01

    The third version of the CERN Accelerator School's (CAS) advanced course on General Accelerator Physics was given at Uppsala University from 18-29 September, 1989. Its syllabus was based on the previous courses held in Oxford, 1985 and Berlin, 1987 whose proceedings were published as CERN Yellow Reports 87-03 and 89-01 respectively. However, the opportunity was taken to emphasize the physics of small accelerators and storage rings, to present some topics in new ways, and to introduce new seminars. Thus the lectures contained in the present volume include chromaticity, dynamic aperture, kinetic theory, Landau damping, ion-trapping, Schottky noise, laser cooling and small ring lattice problems while the seminars include interpretation of numerical tracking, internal targets and living with radiation. (orig.)

  5. Analogue computer display of accelerator beam optics

    International Nuclear Information System (INIS)

    Brand, K.

    1984-01-01

    Analogue computers were used years ago by several authors for the design of magnetic beam handling systems. At Bochum a small analogue/hybrid computer was combined with a particular analogue expansion and logic control unit for beam transport work. This apparatus was very successful in the design and setup of the beam handling system of the tandem accelerator. The center of the stripper canal was the object point for the calculations; instead of the high-energy acceleration tube, a drift length was inserted into the program, neglecting the weak focusing action of the tube. In the course of the installation of a second injector for heavy ions it became necessary to do better calculations. A simple method was found to represent accelerating sections on the computer and a particular way to simulate thin lenses was adopted. The analogue computer system proved its usefulness in the design and in studies of the characteristics of different accelerator installations over many years. The results of the calculations are in very good agreement with real accelerator data. The apparatus is the ideal tool to demonstrate beam optics to students and accelerator operators since the effect of a change of any of the parameters is immediately visible on the oscilloscope
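
    The drift sections and thin lenses mentioned above are represented, in first-order optics, by simple transfer matrices. The following sketch (focal length and drift lengths are arbitrary illustrative values, not the Bochum parameters) propagates the transverse phase-space vector (x, x') through a short beamline:

        import numpy as np

        def drift(L):
            # First-order transfer matrix of a field-free drift of length L (m).
            return np.array([[1.0, L], [0.0, 1.0]])

        def thin_lens(f):
            # Thin-lens transfer matrix with focal length f (m); f > 0 focuses.
            return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

        # Illustrative beamline: 1 m drift, lens with f = 0.5 m, then 0.5 m drift.
        # Matrices multiply right to left, so the first element is rightmost.
        M = drift(0.5) @ thin_lens(0.5) @ drift(1.0)
        x0 = np.array([1e-3, 0.0])    # 1 mm offset, zero divergence
        print("final (x, x'):", M @ x0)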

  6. Advances of Accelerator Physics and Technologies

    CERN Document Server

    1993-01-01

    This volume, consisting of articles written by experts with international repute and long experience, reviews the state of the art of accelerator physics and technologies and the use of accelerators in research, industry and medicine. It covers a wide range of topics, from basic problems concerning the performance of circular and linear accelerators to technical issues and related fields. Also discussed are recent achievements that are of particular interest (such as RF quadrupole acceleration, ion sources and storage rings) and new technologies (such as superconductivity for magnets and RF ca

  7. New accelerators in high-energy physics

    International Nuclear Information System (INIS)

    Blewett, J.P.

    1982-01-01

    First, I should like to mention a few new ideas that have appeared during the last few years in the accelerator field. A couple are of importance in the design of injectors, usually linear accelerators, for high-energy machines. Then I shall review some of the somewhat sensational accelerator projects, now in operation, under construction or just being proposed. Finally, I propose to mention a few applications of high-energy accelerators in fields other than high-energy physics. I realize that this is a digression from my title but I hope that you will find it interesting

  8. Accelerator physics experiments at Aladdin

    International Nuclear Information System (INIS)

    Chattopadhyay, S.; Cornacchia, M.; Jackson, A.; Zisman, M.S.

    1985-07-01

    The Aladdin accelerator is a 1 GeV synchrotron light source located at the University of Wisconsin. The results of experimental studies of the Aladdin accelerator are described. The primary purpose of the experiments reported was to investigate reported anomalies in the behavior of the linear lattice, particularly in the vertical plane. A second goal was to estimate the ring broadband impedance. Experimental observations and interpretation of the linear properties of the Aladdin ring are described, including the beta function and dispersion measurements. Two experiments are described to measure the ring impedance, the first a measurement of the parasitic mode loss, and the second a measurement of the beam transfer function. Measurements of the longitudinal and transverse emittance at 100 and 200 MeV are described and compared with predictions. 10 refs., 24 figs., 2 tabs

  9. Neutrino physics and accelerators. [Reviews]

    Energy Technology Data Exchange (ETDEWEB)

    Kaftanov, V

    1978-04-01

    The history of experiments aimed at the study of direct neutrino-matter interactions conducted over the past twenty years is described. Experiments carried out with the objective of proving the existence of the intermediate W meson, which had been predicted by the weak interaction theory, are outlined. The methods of obtaining neutrino beams using accelerators and the detectors used are briefly described. Also described are experiments to be conducted in the near future in different laboratories.

  10. Hamiltonian systems in accelerator physics

    International Nuclear Information System (INIS)

    Laslett, L.J.

    1985-06-01

    General features of the design of annular particle accelerators or storage rings are outlined and the Hamiltonian character of individual-ion motion is indicated. Examples of phase plots are presented, for the motion in one spatial degree of freedom, of an ion subject to a periodic nonlinear focusing force. A canonical transformation describing coupled nonlinear motion is also given, and alternative types of graphical display are suggested for the investigation of long-term stability in such cases. 7 figs
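
    Phase plots of the kind described here can be reproduced qualitatively with a one-turn map consisting of a linear betatron rotation followed by a localized nonlinear kick. The sketch below is an illustrative kicked-oscillator model with arbitrary tune and kick strength, not Laslett's original calculation:

        import numpy as np

        def one_turn(x, xp, tune=0.205, k=1.0):
            # Linear rotation by the betatron phase advance, then a
            # sextupole-like nonlinear kick of strength k.
            mu = 2.0 * np.pi * tune
            x, xp = np.cos(mu) * x + np.sin(mu) * xp, -np.sin(mu) * x + np.cos(mu) * xp
            return x, xp + k * x ** 2

        # Track a few initial amplitudes for 1000 turns; plotting the (x, x') pairs
        # turn by turn gives the phase plot for each amplitude.
        for x0 in (0.05, 0.15, 0.30):
            x, xp, lost = x0, 0.0, False
            for turn in range(1000):
                x, xp = one_turn(x, xp)
                if abs(x) > 10.0:          # treat as lost beyond the dynamic aperture
                    lost = True
                    break
            print(f"x0={x0}: " + ("lost" if lost else f"(x, x') = ({x:.4f}, {xp:.4f})"))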

  11. Quantum effects in accelerator physics

    International Nuclear Information System (INIS)

    Leinaas, J.M.

    1991-08-01

    Quantum effects for electrons in a storage ring are discussed, in particular the polarization effect due to spin flip synchrotron radiation. The electrons are treated as a simple quantum mechanical two-level system coupled to the orbital motion and the radiation field. The excitations of the spin system are then related to the Unruh effect, i.e. the effect that an accelerated radiation detector is thermally excited by vacuum fluctuations. 24 refs., 2 figs

  12. Research in accelerator physics (theory)

    International Nuclear Information System (INIS)

    Ohnuma, Shoroku.

    1991-01-01

    This report discusses the following topics: beam-beam interaction in colliders with momentum oscillation; isolated difference resonance and evolution of the particle distribution; study of magnet sorting for the SSC High Energy Booster; development of a discrete HESQ; beam dynamics in compact synchrotrons; theoretical problems in multi-stage FEL for two-beam acceleration; operation of the Tevatron near integer tunes; detailed examination of the coupling impedance of various devices in storage rings; and the impact on beams from insertion devices

  13. FPGA-accelerated simulation of computer systems

    CERN Document Server

    Angepat, Hari; Chung, Eric S; Hoe, James C; Chung, Eric S

    2014-01-01

    To date, the most common form of simulator of computer systems is software-based, running on standard computers. One promising approach to improving simulation performance is to apply hardware, specifically reconfigurable hardware in the form of field programmable gate arrays (FPGAs). This manuscript describes various approaches to using FPGAs to accelerate software-implemented simulation of computer systems and selected simulators that incorporate those techniques. More precisely, we describe a simulation architecture taxonomy that incorporates a simulation architecture specifically designed f

  14. Accelerator Physics Branch annual technical report, 1989

    International Nuclear Information System (INIS)

    Hulbert, J.A.

    1990-08-01

    The report describes, in a series of separate articles, the achievements of the Accelerator Physics Branch for the calendar year 1989. Work in basic problems of accelerator physics including ion sources, high-duty-factor rf quadrupoles, coupling effects in standing wave linacs and laser acceleration is outlined. A proposal for a synchrotron light source for Canada is described. Other articles cover the principal design features of the IMPELA industrial electron linac prototype, the cavities developed for the HERA complex at DESY, Hamburg, West Germany, and further machine projects that have been completed

  15. A physics computing bureau

    CERN Document Server

    Laurikainen, P

    1975-01-01

    The author first reviews the services offered by the Bureau to the user community scattered over three separate physics departments and a theory research institute. Limited services are offered also to non-physics research in the University, in collaboration with the University Computing Center. The personnel is divided into operations sections responsible for terminal and data archive management, punching and document services, etc., and into analyst sections with half a dozen full-time scientific programmers recruited among promising graduate-level physics students, rather than computer scientists or mathematicians. Analysts are thus able not only to communicate with physicists but also to participate in research to some extent. Only the more demanding program development tasks can be handled by the Bureau; most of the routine data processing is the users' responsibility.

  16. CAS CERN Accelerator School: Fourth general accelerator physics course

    International Nuclear Information System (INIS)

    Turner, S.

    1991-01-01

    The fourth CERN Accelerator School (CAS) basic course on General Accelerator Physics was given at KFA, Juelich, from 17 to 28 September 1990. Its syllabus was based on the previous similar courses held at Gif-sur-Yvette in 1984, Aarhus 1986, and Salamanca 1988, and whose proceedings were published as CERN Reports 85-19, 87-10, and 89-05, respectively. However, certain topics were treated in a different way, improved or extended, while new subjects were introduced. All of these appear in the present proceedings, which include lectures or seminars on the history and applications of accelerators, phase space and emittance, chromaticity, beam-beam effects, synchrotron radiation, radiation damping, tune measurement, transition, electron cooling, the designs of superconducting magnets, ring lattices, conventional RF cavities and ring RF systems, and an introduction to cyclotrons. (orig.)

  17. CAS CERN Accelerator School third general accelerator physics course

    International Nuclear Information System (INIS)

    Turner, S.

    1989-01-01

    The general course on accelerator physics given in Salamanca, Spain, closely followed those organised by the CERN Accelerator School at Gif-sur-Yvette, Paris in 1984, and at Aarhus, Denmark in 1986 and whose proceedings were published as CERN Yellow Reports 85-19 (1985) and 87-10 (1987) respectively. However, certain topics were treated in a different way, improved or extended, while some new ones were introduced and it is all of these which are included in the present proceedings. The lectures include beam-cooling concepts, Liouville's theorem and emittance, emittance dilution in transfer lines, weak-betatron coupling, diagnostics, while the seminars are on positron and electron sources, linac structures and the LEP L3 experiment, together with industrial aspects of particle accelerators. Also included are errata and addenda to the Yellow Reports mentioned above. (orig.)

  18. Computing in plasma physics

    International Nuclear Information System (INIS)

    Nuehrenberg, J.

    1986-01-01

    These proceedings contain the articles presented at the named conference. These concern numerical methods for astrophysical plasmas, the numerical simulation of reversed-field pinch dynamics, methods for numerical simulation of ideal MHD stability of axisymmetric plasmas, calculations of the resistive internal m=1 mode in tokamaks, parallel computing and multitasking, particle simulation methods in plasma physics, 2-D Lagrangian studies of symmetry and stability of laser fusion targets, computing of rf heating and current drive in tokamaks, three-dimensional free boundary calculations using a spectral Green's function method, as well as the calculation of three-dimensional MHD equilibria with islands and stochastic regions. See hints under the relevant topics. (HSI)

  19. Accelerating artificial intelligence with reconfigurable computing

    Science.gov (United States)

    Cieszewski, Radoslaw

    Reconfigurable computing is emerging as an important area of research in computer architectures and software systems. Many algorithms can be greatly accelerated by placing the computationally intense portions of an algorithm into reconfigurable hardware. Reconfigurable computing combines many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible, and can be changed over the lifetime of the system. Similar to an ASIC, reconfigurable systems provide a method to map circuits into hardware. Reconfigurable systems therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. One such field, in which many different algorithms can be accelerated, is artificial intelligence. This paper presents example hardware implementations of Artificial Neural Networks, Genetic Algorithms and Expert Systems.

  20. General accelerator physics. Proceedings. Vol. 2

    International Nuclear Information System (INIS)

    Bryant, P.; Turner, S.

    1985-01-01

    This course on accelerator physics is the first in a series of two, which is planned by the CERN Accelerator School. Starting at the level of a science graduate, this course covers mainly linear theory. The topics include: transverse and longitudinal beam dynamics, insertions, coupling, transition, dynamics of radiating particles, space-charge forces, neutralization, beam profiles, luminosity calculations in colliders, longitudinal phase-space stacking, phase-displacement acceleration, transfer lines, injection and extraction. Some more advanced topics are also introduced: coherent instabilities in coasting beams, general collective phenomena, quantum lifetime, and intra-beam scattering. The seminar programme is based on two themes: firstly, the sub-systems of an accelerator and, secondly, the uses to which accelerators are put. (orig.)
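
    As a quick numerical illustration of one of the course topics, the collider luminosity formula L = f_rev n_b N1 N2 / (4 pi sigma_x sigma_y) can be evaluated directly; the parameters below are round numbers loosely based on LHC design values and are included only to show the order of magnitude, not to reproduce any lecture:

        import math

        # Collider luminosity L = f_rev * n_b * N1 * N2 / (4 * pi * sigma_x * sigma_y),
        # evaluated with illustrative round numbers loosely based on LHC design values.
        f_rev = 11245.0               # revolution frequency (Hz)
        n_b = 2808                    # colliding bunches per beam
        N1 = N2 = 1.15e11             # protons per bunch
        sigma_x = sigma_y = 16.7e-4   # transverse r.m.s. beam sizes at the IP (cm)

        L = f_rev * n_b * N1 * N2 / (4.0 * math.pi * sigma_x * sigma_y)
        print(f"luminosity ~ {L:.2e} cm^-2 s^-1")   # of order 1e34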

  1. Non-accelerator particle physics

    International Nuclear Information System (INIS)

    Steinberg, R.I.; Lane, C.E.

    1991-08-01

    The goals of this research were the experimental testing of fundamental theories of physics such as grand unification and the exploration of cosmic phenomena through the techniques of particle physics. We have worked on the MACRO experiment, which is employing a large area underground detector to search for grand unification magnetic monopoles and dark matter candidates and to study cosmic ray muons as well as low and high energy neutrinos; the νIMB project, which seeks to refurbish and upgrade the IMB water Cerenkov detector to perform an improved proton decay search together with a long baseline reactor neutrino oscillation experiment using a one kiloton liquid scintillator (the Perry experiment); and development of technology for improved liquid scintillators and for very low background materials in support of the MACRO and Perry experiments and for new solar neutrino experiments

  2. Non-accelerator particle physics

    International Nuclear Information System (INIS)

    Steinberg, R.I.

    1990-01-01

    The goals of this research are the experimental testing of fundamental theories of physics such as grand unification and the exploration of cosmic phenomena through the techniques of particle physics. We are currently engaged in construction of the MACRO detector, an Italian-American collaborative research instrument with a total particle acceptance of 10,000 m² sr, which will perform a sensitive search for magnetic monopoles using excitation-ionization methods. Other major objectives of the MACRO experiment are to search for astrophysical high energy neutrinos expected to be emitted by such objects as Vela X-1, LMC X-4 and SN-1987A and to search for low energy neutrino bursts from gravitational stellar collapse. We are also working on BOREX, a liquid scintillation solar neutrino experiment, and GRANDE, a proposed very large area surface detector for astrophysical neutrinos, and on the development of new techniques for liquid scintillation detection

  3. Non-accelerator particle physics

    International Nuclear Information System (INIS)

    Steinberg, R.I.; Lane, C.E.

    1991-09-01

    The goals of this research are the experimental testing of fundamental theories of physics such as grand unification and the exploration of cosmic phenomena through the techniques of particle physics. We are working on the MACRO experiment, which employs a large area underground detector to search for grand unification magnetic monopoles and dark matter candidates and to study cosmic ray muons as well as low and high energy neutrinos; the νIMB project, which seeks to refurbish and upgrade the IMB water Cerenkov detector to perform an improved proton decay search together with a long baseline reactor neutrino oscillation experiment using a kiloton liquid scintillator (the Perry experiment); and development of technology for improved liquid scintillators and for very low background materials in support of the MACRO and Perry experiments and for new solar neutrino experiments. 21 refs., 19 figs., 6 tabs

  4. Applications of Particle Accelerators in Medical Physics

    OpenAIRE

    Cuttone, G

    2008-01-01

    Particle accelerators are often associated with high energy or nuclear physics. As is well pointed out in the literature [1], if we analyse the number of installations worldwide we can easily note that about 50% are devoted mainly to medical applications (radiotherapy, medical radioisotope production, biomedical research). Particle accelerators also play an important indirect role in the improvement of the technical features of medical diagnostics. In fact the use of radionuclide f...

  5. Accelerator Physics for ILC and CLIC

    CERN Document Server

    Zimmermann, F

    2010-01-01

    This paper summarizes the second part of the “accelerator physics lectures” delivered at the Ambleside Linear Collider School 2009. It discusses more specific linear-collider issues: superconducting and room-temperature linear accelerators, particle sources for electrons and positrons, synchrotron radiation and damping, intensity limits, beam stability, and beam delivery system – including final focus, collimation, and beam-beam effects. It also presents an overview of the International Linear Collider (ILC), a description of the two beam acceleration scheme of the Compact Linear Collider (CLIC), and a comparison of the ILC and CLIC parameters.

  6. Computer networks in future accelerator control systems

    International Nuclear Information System (INIS)

    Dimmler, D.G.

    1977-03-01

    Some findings of a study concerning a computer based control and monitoring system for the proposed ISABELLE Intersecting Storage Accelerator are presented. Requirements for development and implementation of such a system are discussed. An architecture is proposed where the system components are partitioned along functional lines. Implementation of some conceptually significant components is reviewed

  7. Computing in high energy physics

    International Nuclear Information System (INIS)

    Watase, Yoshiyuki

    1991-01-01

    The increasingly important role played by computing and computers in high energy physics is displayed in the 'Computing in High Energy Physics' series of conferences, bringing together experts in different aspects of computing - physicists, computer scientists, and vendors

  8. Advanced computations in plasma physics

    International Nuclear Information System (INIS)

    Tang, W.M.

    2002-01-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to

  9. Computer simulation of dynamic processes on accelerators

    International Nuclear Information System (INIS)

    Kol'ga, V.V.

    1979-01-01

    The problems of the numerical investigation of the motion of accelerated particles in accelerators and storage rings, the effect of different accelerator systems on that motion, and the determination of the optimal characteristics of accelerated charged-particle beams are considered. Various simulation representations describing the accelerated particle dynamics are discussed, such as the enlarged particle method, the representation in which a great number of discrete particles is substituted for a field of continuously distributed space charge, and the method based on the determination of averaged beam characteristics. The procedure of numerical studies of the basic problems is described, viz. calculation of closed orbits, establishment of stability regions, investigation of resonance propagation, determination of the phase stability region, evaluation of the space charge effect, and the problem of beam extraction. It is shown that most of these problems are reduced to the solution of a Cauchy problem on a computer. The ballistic method, which is applied to the solution of the boundary value problem of beam extraction, is considered. It is shown that the introduction into the equation under study of additional terms with a small positive regularization parameter is the general idea of the methods for regularization of ill-posed problems
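
    As a minimal sketch of the Cauchy-problem formulation mentioned above, the transverse motion x'' + K(s) x = 0 with a piecewise-constant (and here entirely made-up) focusing function K(s) can be integrated as an initial-value problem; this illustrates only the numerical setting, not the specific methods of the report:

        import numpy as np
        from scipy.integrate import solve_ivp

        def K(s):
            # Hypothetical piecewise-constant focusing function along the machine (1/m^2).
            return 0.5 if (s % 2.0) < 1.0 else -0.3

        def rhs(s, y):
            # y = (x, x'); Hill-type equation x'' + K(s) x = 0 as a first-order system.
            x, xp = y
            return [xp, -K(s) * x]

        # Cauchy problem: given (x, x') at s = 0, integrate along 20 m of beamline.
        sol = solve_ivp(rhs, (0.0, 20.0), [1e-3, 0.0], max_step=0.01)
        print("x at s = 20 m:", sol.y[0, -1])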

  10. Ultimate-gradient accelerators physics and prospects

    CERN Document Server

    Skrinsky, Aleksander Nikolayevich

    1995-01-01

    As an introduction, the needs and ways for ultimate acceleration gradients are discussed briefly. Plasma Wake Field Acceleration is analysed in its most important details. The structure of the specific plasma oscillations and of the "high-energy driver beam - plasma" interaction is presented, including computer simulation of the process. Some practical ways to introduce the necessary mm-scale bunching in the driver beam and to arrange sequential energy multiplication are discussed. The influence of binary collisions between the accelerated beam particles and the plasma is also considered. As applications of PWFA, the use of proton super-collider beams (LHC and Future SC) to drive the "multi particle types" accelerator, and the arrangements for the electron-positron TeV range collider are discussed.

  11. Quantum Accelerators for High-performance Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S. [ORNL; Britt, Keith A. [ORNL; Mohiyaddin, Fahd A. [ORNL

    2017-11-01

    We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenge the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We discuss a novel quantum-accelerator framework that uses specialized kernels to offload select workloads while integrating with existing computing infrastructure. We elaborate on the role of the host operating system to manage these unique accelerator resources, the prospects for deploying quantum modules, and the requirements placed on the language hierarchy connecting these different system components. We draw on recent advances in the modeling and simulation of quantum computing systems with the development of architectures for hybrid high-performance computing systems and the realization of software stacks for controlling quantum devices. Finally, we present simulation results that describe the expected system-level behavior of high-performance computing systems composed from compute nodes with quantum processing units. We describe performance for these hybrid systems in terms of time-to-solution, accuracy, and energy consumption, and we use simple application examples to estimate the performance advantage of quantum acceleration.

  12. Proceedings of CAS - CERN Accelerator School: Advanced Accelerator Physics Course

    International Nuclear Information System (INIS)

    Herr, W

    2014-01-01

    This report presents the proceedings of the Course on Advanced Accelerator Physics organized by the CERN Accelerator School. The course was held in Trondheim, Norway from 18 to 29 August 2013, in collaboration with the Norwegian University of Science and Technology. Its syllabus was based on previous courses and in particular on the course held in Berlin 2003 whose proceedings were published as CERN Yellow Report CERN-2006-002. The field has seen significant advances in recent years and some topics were presented in a new way and other topics were added. The lectures were supplemented with tutorials on key topics and 14 hours of hands-on courses on Optics Design and Corrections, RF Measurement Techniques, and Beam Instrumentation and Diagnostics. These courses are a key element of the Advanced Level Course

  13. Accelerator based atomic physics experiments: an overview

    International Nuclear Information System (INIS)

    Moak, C.D.

    1976-01-01

    Atomic Physics research with beams from accelerators has continued to expand and the number of papers and articles at meetings and in journals reflects a steadily increasing interest and an increasing support from various funding agencies. An attempt will be made to point out where interdisciplinary benefits have occurred, and where applications of the new results to engineering problems are expected. Drawing from material which will be discussed in the conference, a list of the most active areas of research is presented. Accelerator based atomic physics brings together techniques from many areas, including chemistry, astronomy and astrophysics, nuclear physics, solid state physics and engineering. An example is the use of crystal channeling to sort some of the phenomena of ordinary heavy ion stopping powers. This tool has helped us to reach a better understanding of stopping mechanisms with the result that now we have established a better base for predicting energy losses of heavy ions in various materials

  14. Neural computation and particle accelerators research, technology and applications

    CERN Document Server

    D'Arras, Horace

    2010-01-01

    This book discusses neural computation, a network or circuit of biological neurons, and, relatedly, particle accelerators, scientific instruments which accelerate charged particles such as protons, electrons and deuterons. Accelerators have a very broad range of applications in many industrial fields, from high energy physics to medical isotope production. Nuclear technology is one of the fields discussed in this book. The development that has been reached by particle accelerators in energy and particle intensity has opened the possibility to a wide number of new applications in nuclear technology. This book reviews the applications in the nuclear energy field and the design features of high power neutron sources are explained. Surface treatments of niobium flat samples and superconducting radio frequency cavities by a new technique called gas cluster ion beam are also studied in detail, as well as the process of electropolishing. Furthermore, magnetic devices such as solenoids, dipoles and undulators, which ...

  15. Computational needs for the RIA accelerator systems

    International Nuclear Information System (INIS)

    Ostroumov, P.N.; Nolen, J.A.; Mustapha, B.

    2006-01-01

    This paper discusses the computational needs for the full design and simulation of the RIA accelerator systems. Beam dynamics simulations are essential to first define and optimize the architectural design for both the driver linac and the post-accelerator. They are also important to study different design options and various off-normal modes in order to decide on the most-performing and cost-effective design. Due to the high-intensity primary beams, the beam-stripper interaction is a source of both radioactivation and beam contamination and should be carefully investigated and simulated for proper beam collimation and shielding. The target and fragment-separator areas also need very special attention in order to reduce any radiological hazards by careful shielding design. For all these simulations parallel computing is an absolute necessity

  16. Computers in Nuclear Physics Division

    International Nuclear Information System (INIS)

    Kowalczyk, M.; Tarasiuk, J.; Srebrny, J.

    1997-01-01

    The improvement of the computer equipment in the Nuclear Physics Division is described. It includes: new computer equipment and hardware upgrades, software development, new programs for computer booting, and modernization of the data acquisition systems

  17. The CEBAF accelerator and its physics program

    International Nuclear Information System (INIS)

    Cardman, L.S.

    1993-01-01

    The continuous electron beam accelerator facility (CEBAF) consists of a pair of 400 MeV superconducting linacs together with a 5-pass recirculation system and beam switchyard that will permit it to provide three, simultaneous 4 GeV, cw electron beams with a total current of up to 200 μA. The conventional construction for the accelerator and the three experimental end stations is essentially complete. The first linac has been installed in the accelerator tunnel and beam has been accelerated through it; all tests to date have met or exceeded the design specifications. The major components of the experimental equipment for the end stations are under construction. Operation of CEBAF for nuclear physics is scheduled to begin in mid-1994. The facility will support a broad range of nuclear physics research, including topics such as how quarks and gluons are held together in protons and neutrons, the origins of the nuclear force, modifications of nucleons in the nuclear medium, and nuclear structure when nucleons are very close together. The status of the accelerator and its experimental equipment will be presented together with a sampling of experiments planned for the early phases of operation

  18. Present SLAC accelerator computer control system features

    International Nuclear Information System (INIS)

    Davidson, V.; Johnson, R.

    1981-02-01

    The current functional organization and state of software development of the computer control system of the Stanford Linear Accelerator is described. Included is a discussion of the distribution of functions throughout the system, the local controller features, and currently implemented features of the touch panel portion of the system. The functional use of our triplex of PDP11-34 computers sharing common memory is described. Also included is a description of the use of pseudopanel tables as data tables for closed loop control functions

  19. High Energy Density Physics and Exotic Acceleration Schemes

    International Nuclear Information System (INIS)

    Cowan, T.; Colby, E.

    2005-01-01

    The High Energy Density and Exotic Acceleration working group took as our goal to reach beyond the community of plasma accelerator research with its applications to high energy physics, to promote exchange with other disciplines which are challenged by related and demanding beam physics issues. The scope of the group was to cover particle acceleration and beam transport that, unlike other groups at AAC, are not mediated by plasmas or by electromagnetic structures. At this Workshop, we saw an impressive advancement from years past in the area of Vacuum Acceleration, for example with the LEAP experiment at Stanford. And we saw an influx of exciting new beam physics topics involving particle propagation inside of solid-density plasmas or at extremely high charge density, particularly in the areas of laser acceleration of ions, and extreme beams for fusion energy research, including Heavy-ion Inertial Fusion beam physics. One example of the importance and extreme nature of beam physics in HED research is the requirement in the Fast Ignitor scheme of inertial fusion to heat a compressed DT fusion pellet to keV temperatures by injection of laser-driven electron or ion beams of giga-Amp current. Even in modest experiments presently being performed on the laser-acceleration of ions from solids, mega-amp currents of MeV electrons must be transported through solid foils, requiring almost complete return current neutralization, and giving rise to a wide variety of beam-plasma instabilities. As keynote talks our group promoted Ion Acceleration (plenary talk by A. MacKinnon), which historically has grown out of inertial fusion research, and HIF Accelerator Research (invited talk by A. Friedman), which will require impressive advancements in space-charge-limited ion beam physics and in understanding the generation and transport of neutralized ion beams. A unifying aspect of High Energy Density applications was the physics of particle beams inside of solids, which is proving to

  20. Theoretical and Experimental Studies in Accelerator Physics

    Energy Technology Data Exchange (ETDEWEB)

    Rosenzweig, James [Univ. of California, Los Angeles, CA (United States). Dept. of Physics and Astronomy

    2017-03-08

    This report describes research supported by the US Dept. of Energy Office of High Energy Physics (OHEP), performed by the UCLA Particle Beam Physics Laboratory (PBPL). The UCLA PBPL has, over the last two decades-plus, played a critical role in the development of advanced accelerators, fundamental beam physics, and new applications enabled by these thrusts, such as new types of accelerator-based light sources. As the PBPL mission is broad it is natural that it has grown within the context of the accelerator science and technology stewardship of the OHEP. Indeed, steady OHEP support for the program has always been central to the success of the PBPL; it has provided stability, and above all has set the over-arching themes for our research directions, which have produced over 500 publications (>120 in high-level journals). While other agency support has grown notably in recent years, permitting more vigorous pursuit of the program, it is transient by comparison. Beyond permitting program growth in a time of flat OHEP budgets, the influence of other agency missions is found in the push to adapt advanced accelerator methods to applications, in light of the success the field has had in proof-of-principle experiments supported first by the DoE OHEP. This three-pronged PBPL program — advanced accelerators, fundamental beam physics and technology, and revolutionary applications — has produced a generation of students who have had a profound effect on the US accelerator physics community. PBPL graduates, numbering 28 in total, form a significant population group in the accelerator community, playing key roles as university faculty, scientific leaders in national labs (two have been named Panofsky Fellows at SLAC), and vigorous proponents of industrial application of accelerators. Indeed, the development of advanced RF, optical and magnet technology at the PBPL has led directly to the spin-off company, RadiaBeam Technologies, now a leading industrial accelerator firm

  1. Department of Accelerator Physics and Technology: Overview

    International Nuclear Information System (INIS)

    Pachan, M.

    2001-01-01

    Full text: In view of the limited number of scientific and technical staff, it was necessary to focus the activity on the most important subjects and to keep a balance between current duties and the development of future projects. The dominant item was the realisation of research and design work in the Ordered Project for the New Therapeutical Accelerator with two photon beam energies, 6 and 15 MeV. During the reported year, the main efforts were oriented on: - computation and experimental work on the optimization of electron gun parameters and electron optics in the injection system for the accelerating structure, - calculation and modelling of a standing-wave, S-band accelerating structure to achieve a broad range of electron energy variation with good phase acceptance and a narrow energy spectrum of the output beam, - calculation and design of the beam focusing and transport system, with deflection of the output beam through 270° in an achromatic sector magnet, - design and modelling of the microwave power system, with pilot generator, 6 MW klystron amplifier, pulse modulator, waveguide system, four-port circulator and automatic frequency control, - preparative work on metrological procedures and apparatus for accelerated beam diagnostics comprising measurements of the energy spectrum, beam intensity, transmission factor, leakage radiation, and other important beam parameters. Other important subjects worth mentioning are: - Advances in the forming and metrology of narrow X-ray photon beams, dedicated to stereotactic radiosurgery and radiotherapy, - Adaptation of a new version of EGS-4, an MC-type code for computer simulation of dose distribution in therapeutical beams, - Participation in selected items of the TESLA Project in cooperation with DESY - Hamburg, - theory and computer simulation of higher order modes in superconducting accelerating structures, - technological research on methods and apparatus for thin-layer coating of r.f. resonators and subunits in transmission circuits, - Conceptual studies of proposed new

  2. Computational plasma physics

    International Nuclear Information System (INIS)

    Killeen, J.

    1975-08-01

    The behavior of a plasma confined by a magnetic field is simulated by a variety of numerical models. Some models used on a short time scale give detailed knowledge of the plasma on a microscopic scale, while other models used on much longer time scales compute macroscopic properties of the plasma dynamics. In the last two years there has been a substantial increase in the numerical modelling of fusion devices. The status of MHD, transport, equilibrium, stability, Vlasov, Fokker-Planck, and Hybrid codes is reviewed. These codes have already been essential in the design and understanding of low and high beta toroidal experiments and mirror systems. The design of the next generation of fusion experiments and fusion test reactors will require continual development of these numerical models in order to include the best available plasma physics description and also to increase the geometric complexity of the model. (auth)

  3. Applications of Particle Accelerators in Medical Physics

    CERN Document Server

    Cuttone, G

    2008-01-01

    Particle accelerators are often associated with high energy or nuclear physics. As is well pointed out in the literature [1], if we analyse the number of installations worldwide we can easily note that about 50% are devoted mainly to medical applications (radiotherapy, medical radioisotope production, biomedical research). Particle accelerators also play an important indirect role in the improvement of the technical features of medical diagnostics. In fact the use of radionuclides for advanced medical imaging is increasing strongly, both in conventional radiography (CT and MRI) and in nuclear medicine for SPECT and PET imaging. In this paper the role of particle accelerators in medical applications is presented, together with the main solutions applied.

  4. Department of Accelerator Physics and Technology: Overview

    International Nuclear Information System (INIS)

    Plawski, E.

    2003-01-01

    Full text: The main activities of the Accelerator Physics and Technology Department were focused on the following subjects: - contribution to the development and building of the New Therapeutical Electron Accelerator delivering photon beams of 6 and 15 MeV, - study of the photon and electron spectra of narrow photon beams with the use of the BEAM/EGSnrc codes, - design and construction of special RF structures for use in the CLIC Test Facility at CERN, - design and construction of 1:1 copper, room-temperature models of superconducting 1.3 GHz accelerating structures for the TESLA Project at DESY. In spite of a drastic reduction of the scientific and technical staff (from 16 to 10 persons), the planned work was successfully completed, although it required some extraordinary effort. In the realisation of the 6/15 MeV Accelerator Project, the Department was responsible throughout the project for the calculations of all the most important parts (electron gun, accelerating structure, beam focusing, achromatic deviation) and also for the construction and physical modelling of some strategic subassemblies. The scientific and technical achievements of our Department in this work are documented in the Annex to the Final Report on the realisation of KBN Scientific Project No PBZ 009-13 and in the earlier Annual Reports 2000 and 2001. The Monte Carlo calculations of narrow photon beams and their experimental verification using Varian Clinac 2003CD, Siemens Mevatron and CGR MeV Saturn accelerators culminated in a PhD thesis prepared by MSc Anna Wysocka. Her thesis, Collimation and Dosimetry of X-ray Beams for Stereotactic Radiotherapy with Linear Accelerators, was sponsored by KBN Scientific Project Nr T11E 04121. In collaboration with LNF INFN Frascati, electron beam deflectors were designed for the CERN CLIC Test Facility CTF3. These special travelling-wave RF structures were built by our Department and are now operated in the CTF3 experiment. As a result of the collaboration with the TESLA-FEL Project at DESY, the set of RF

  5. CAS Accelerator Physics held in Erice, Italy

    CERN Multimedia

    CERN Accelerator School

    2013-01-01

    The CERN Accelerator School (CAS) recently organised a specialised course on Superconductivity for Accelerators, held at the Ettore Majorana Foundation and Centre for Scientific Culture in Erice, Italy from 24 April-4 May, 2013.   Photo courtesy of Alessandro Noto, Ettore Majorana Foundation and Centre for Scientific Culture. Following a handful of summary lectures on accelerator physics and the fundamental processes of superconductivity, the course covered a wide range of topics related to superconductivity and highlighted the latest developments in the field. Realistic case studies and topical seminars completed the programme. The school was very successful with 94 participants representing 23 nationalities, coming from countries as far away as Belorussia, Canada, China, India, Japan and the United States (for the first time a young Ethiopian lady, studying in Germany, attended this course). The programme comprised 35 lectures, 3 seminars and 7 hours of case study. The case studies were p...

  6. GPU-accelerated computation of electron transfer.

    Science.gov (United States)

    Höfinger, Siegfried; Acocella, Angela; Pop, Sergiu C; Narumi, Tetsu; Yasuoka, Kenji; Beu, Titus; Zerbetto, Francesco

    2012-11-05

    Electron transfer is a fundamental process that can be studied with the help of computer simulation. The underlying quantum mechanical description renders the problem a computationally intensive application. In this study, we probe the graphics processing unit (GPU) for suitability to this type of problem. Time-critical components are identified via profiling of an existing implementation and several different variants are tested involving the GPU at increasing levels of abstraction. A publicly available library supporting basic linear algebra operations on the GPU turns out to accelerate the computation approximately 50-fold with minor dependence on actual problem size. The performance gain does not compromise numerical accuracy and is of significant value for practical purposes. Copyright © 2012 Wiley Periodicals, Inc.
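
    As an illustration of this kind of offload (the specific GPU library used in the study is not named in the record; CuPy is used below purely as a stand-in, and a CUDA-capable GPU is assumed), a dense matrix product is computed on the GPU and checked against NumPy on the CPU:

        import numpy as np
        import cupy as cp   # stand-in GPU linear-algebra library; requires a CUDA GPU

        n = 4096
        a_cpu = np.random.rand(n, n)
        b_cpu = np.random.rand(n, n)

        # CPU reference (BLAS via NumPy).
        c_cpu = a_cpu @ b_cpu

        # Same product on the GPU: transfer to the device, multiply, transfer back.
        a_gpu = cp.asarray(a_cpu)
        b_gpu = cp.asarray(b_cpu)
        c_gpu = cp.asnumpy(a_gpu @ b_gpu)

        print("max difference CPU vs GPU:", np.abs(c_cpu - c_gpu).max())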

  7. CAS Accelerator Physics (Ion Sources) in Slovakia

    CERN Multimedia

    CAS School

    2012-01-01

    The CERN Accelerator School (CAS) and the Slovak University of Technology jointly organised a specialised course on ion sources, held at the Hotel Senec, Senec, Slovakia, from 29 May to 8 June, 2012.   Following some background lectures on accelerator physics and the fundamental processes of atomic and plasma physics, the course covered a wide range of topics related to ion sources and highlighted the latest developments in the field. Realistic case studies and topical seminars completed the programme. The school was very successful, with 69 participants representing 25 nationalities. Feedback from the participants was extremely positive, reflecting the high standard of the lectures. The case studies were performed with great enthusiasm and produced some excellent results. In addition to the academic programme, the participants were able to take part in a one-day excursion consisting of a guided tour of Bratislava and free time. A welcome event was held at the Hotel Senec, with s...

  8. Summary talk - status of accelerator neutrino physics

    International Nuclear Information System (INIS)

    Lee, B.W.

    1977-01-01

    I shall address theoretical questions that are immediately relevant to today's accelerator neutrino physics. The frame of reference I shall dwell in is quantum chromodynamics, in which quarks are assumed to carry both flavors and colors, and confining forces among quarks are transmitted by color gluons. The physical hadrons are color-neutral. Quarks presumably cannot be isolated, at least at the present accelerator energies. For most phenomenological considerations, whether confinement is permanent or temporary does not really matter, but I insist that quarks behave as if they were free at short distances, and a color symmetry is exact. Inasmuch as quarks cannot exist in an isolated state, what one means by a quark mass is a matter of definition. One definition might be superior to others in a given context. (orig.)

  9. Computer codes for designing proton linear accelerators

    International Nuclear Information System (INIS)

    Kato, Takao

    1992-01-01

    Computer codes for designing proton linear accelerators are discussed from the viewpoint of not only designing but also construction and operation of the linac. The codes are divided into three categories according to their purposes: 1) design code, 2) generation and simulation code, and 3) electric and magnetic fields calculation code. The role of each category is discussed on the basis of experience at KEK (the design of the 40-MeV proton linac and its construction and operation, and the design of the 1-GeV proton linac). We introduce our recent work relevant to three-dimensional calculation and supercomputer calculation: 1) tuning of MAFIA (three-dimensional electric and magnetic fields calculation code) for supercomputer, 2) examples of three-dimensional calculation of accelerating structures by MAFIA, 3) development of a beam transport code including space charge effects. (author)

  10. Early history of physics with accelerators

    International Nuclear Information System (INIS)

    Anderson, H.L.

    1982-01-01

    The early history of physics at accelerators is reviewed, with emphasis on three experiments which have had a profound influence on our view of the structure of matter: the Franck and Hertz experiment, opening practical ways of studying nuclear disintegration, and the discovery of the Δ++ isobar of the proton by Fermi and collaborators, revealing structure in the nucleon. Fermi's work is illustrated by pages from his notebooks

  11. Reactor physics computations

    International Nuclear Information System (INIS)

    Shapiro, A.

    1977-01-01

    Those reactor-core calculations which provide the effective multiplication factor (or eigenvalue) and the stationary (or fundamental mode) neutron-flux distribution at selected times during the lifetime of the core are considered. The multiplication factor is required to establish the nuclear composition and configuration which satisfy criticality and control requirements. The steady-state flux distribution must be known to calculate reaction rates and power distributions which are needed for the thermal, mechanical and shielding design of the reactor, as well as for evaluating refueling requirements. The calculational methods and techniques used for evaluating the nuclear design information vary with the type of reactor and with the preferences and prejudices of the reactor-physics group responsible for the calculation. Additionally, new methods and techniques are continually being developed and made operational. This results in a rather large conglomeration of methods and computer codes which are available for reactor analysis. The author provides the basic calculational framework and discusses the more prominent techniques which have evolved. (Auth.)
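
    The eigenvalue calculation described here is commonly performed by power iteration on the discretized neutron balance equations. A minimal sketch for one-group, one-dimensional diffusion with made-up material data (not representative of any real core) is:

        import numpy as np

        # One-group, 1-D slab reactor: -D d2phi/dx2 + Sigma_a phi = (1/k) nuSigma_f phi.
        # All material data below are illustrative only.
        N, width = 100, 200.0            # interior mesh points, slab width (cm)
        h = width / (N + 1)
        D, sigma_a, nu_sigma_f = 1.2, 0.012, 0.015

        # Loss operator (diffusion + absorption) with zero-flux boundaries.
        main = np.full(N, 2.0 * D / h**2 + sigma_a)
        off = np.full(N - 1, -D / h**2)
        A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

        phi, k = np.ones(N), 1.0
        for _ in range(200):                       # power iteration
            source = nu_sigma_f * phi / k
            phi_new = np.linalg.solve(A, source)
            k *= phi_new.sum() / phi.sum()         # update the eigenvalue estimate
            phi = phi_new
        print("k-effective ~", round(k, 4))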

  12. Quantum computing for physics research

    International Nuclear Information System (INIS)

    Georgeot, B.

    2006-01-01

    Quantum computers hold great promise for the future of computation. In this paper, this new kind of computing device is presented, together with a short survey of the status of research in this field. The principal algorithms are introduced, with an emphasis on the applications of quantum computing to physics. Experimental implementations are also briefly discussed.

  13. Main physical problems of superhigh energy accelerators

    International Nuclear Information System (INIS)

    Lapidus, L.I.

    1979-01-01

    A survey is given of the state of and prospects for the scientific research to be carried out at the largest charged-particle accelerators now under construction. The fundamental problems of elementary particle physics are considered which can be solved on the basis of experiments at high-energy accelerators. The problems to be solved involve development of the theory for various numbers of quarks, accurate determination of the charged and neutral intermediate vector boson masses in the Weinberg-Salam theory, the problem of production of the t-quark, W± and Z0 bosons, and Higgs mesons and investigation of their interactions, examination of quark and lepton spectra, and studies of the effects of strong interactions. As a result of the investigations of hadrons at maximum momentum transfers, data on the space-time structure at short distances can be obtained. It is emphasized that there are no engineering barriers to the construction of such accelerators; the main problem lies in financial investment. A conclusion is drawn that the next generation of accelerators will be developed on the basis of cooperation between many countries. [ru]

  14. First accelerator-based physics of 2014

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    Experiments in the East Area received their first beams from the PS this week. Theirs is CERN's first accelerator-based physics since LS1 began last year.   For the East Area, the PS performs a so-called slow extraction, where beam is extracted during many revolution periods (the time it takes for particles to go around the PS, ~2.1 μs). The yellow line shows the circulating beam current in the PS, decreasing slowly during the slow extraction, which lasts 350 ms. The green line is the measured proton intensity in the transfer line toward the East Area target. Although LHC physics is still far away, we can now confirm that the injectors are producing physics! In the East Area - the experimental area behind the PS - the T9 and T10 beam lines are providing beams for physics. These beam lines serve experiments such as AIDA - which looks at new detector solutions for future accelerators - and the ALICE Inner Tracking System - which tests components for the ALICE experiment.

  15. Guide to accelerator physics program SYNCH: VAX version 1987.2

    International Nuclear Information System (INIS)

    Parsa, Z.; Courant, E.

    1987-01-01

    This guide is written to accommodate users of the Accelerator Physics Data Base BNLDAG::DUAO:[PARSA1]. It describes the contents of the on-line Accelerator Physics data base DUAO:[PARSA1.SYNCH]. SYNCH is a computer program used for the design and analysis of synchrotrons, storage rings and beamlines.

  16. Nuclear physics accelerator facilities of the world

    International Nuclear Information System (INIS)

    1991-12-01

    This report is intended to provide a convenient summary of the world's major nuclear physics accelerator facilities, with emphasis on those facilities supported by the US Department of Energy (DOE). Previous editions of this report have contained only DOE facilities. However, as the extent of global collaborations in nuclear physics grows, gathering summary information on the world's nuclear physics accelerator facilities in one place is useful. Therefore, the present report adds facilities operated by the National Science Foundation (NSF) as well as the leading foreign facilities, with emphasis on foreign facilities that have significant outside user programs. The principal motivation for building and operating these facilities is, of course, basic research in nuclear physics. The scientific objectives for this research were recently reviewed by the DOE/NSF Nuclear Science Advisory Committee, who developed a long-range plan, Nuclei, Nucleons, and Quarks -- Nuclear Science in the 1990's. Their report begins as follows: The central thrust of nuclear science is the study of strongly interacting matter and of the forces that govern its structure and dynamics; this agenda ranges from large-scale collective nuclear behavior, through the motions of individual nucleons and mesons within atomic nuclei, to the underlying distribution of quarks and gluons. It extends to conditions at the extremes of temperature and density which are of significance to astrophysics and cosmology and are conducive to the creation of new forms of strongly interacting matter; another important focus is on the study of the electroweak force, which plays an important role in nuclear stability, and on precision tests of fundamental interactions. The present report provides brief descriptions of the accelerator facilities available for carrying out this agenda and their research programs.

  17. Theses of reports 'V Conference of high energy physics, nuclear physics and accelerators'

    International Nuclear Information System (INIS)

    Dovbnya, A.N.

    2007-01-01

    Studies of nuclear structure in reactions with charged particles; application of nuclear-physical methods in adjacent fields of science; study and development of accelerators and storage rings for charged particles; basic research aimed at developing nuclear-physical methods for the needs of nuclear power, medicine and industry; computer engineering in physical studies; basic research on the interaction of ultrarelativistic particles with single crystals and matter; and the physics of detectors are presented in the proceedings of the V Conference on High Energy Physics.

  18. Lecture Notes on Topics in Accelerator Physics

    Energy Technology Data Exchange (ETDEWEB)

    Chao, Alex W.

    2002-11-15

    These are lecture notes that cover a selection of topics, some of them under current research, in accelerator physics. I try to derive the results from first principles, although the students are assumed to have an introductory knowledge of the basics. The topics covered are: (1) Panofsky-Wenzel and Planar Wake Theorems; (2) Echo Effect; (3) Crystalline Beam; (4) Fast Ion Instability; (5) Lawson-Woodward Theorem and Laser Acceleration in Free Space; (6) Spin Dynamics and Siberian Snakes; (7) Symplectic Approximation of Maps; (8) Truncated Power Series Algebra; and (9) Lie Algebra Technique for nonlinear Dynamics. The purpose of these lectures is not to elaborate, but to prepare the students so that they can do their own research. Each topic can be read independently of the others.

  19. Lecture Notes on Topics in Accelerator Physics

    International Nuclear Information System (INIS)

    Chao, Alex W.

    2002-01-01

    These are lecture notes that cover a selection of topics, some of them under current research, in accelerator physics. I try to derive the results from first principles, although the students are assumed to have an introductory knowledge of the basics. The topics covered are: (1) Panofsky-Wenzel and Planar Wake Theorems; (2) Echo Effect; (3) Crystalline Beam; (4) Fast Ion Instability; (5) Lawson-Woodward Theorem and Laser Acceleration in Free Space; (6) Spin Dynamics and Siberian Snakes; (7) Symplectic Approximation of Maps; (8) Truncated Power Series Algebra; and (9) Lie Algebra Technique for nonlinear Dynamics. The purpose of these lectures is not to elaborate, but to prepare the students so that they can do their own research. Each topic can be read independently of the others

  20. CAS Introduction to Accelerator Physics in Spain

    CERN Multimedia

    CERN Bulletin

    2012-01-01

    The CERN Accelerator School (CAS) and the University of Granada jointly organised a course called "Introduction to Accelerator Physics" in Granada, Spain, from 28 October to 9 November, 2012.   The course attracted over 200 applicants, of whom 139 were selected to attend. The students were of 25 different nationalities, coming from countries as far away as Australia, China, Guatemala and India. The intensive programme comprised 38 lectures, 3 seminars, 4 tutorials where the students were split into three groups, a poster session and 7 hours of guided and private study. Feedback from the students was very positive, praising the expertise of the lecturers, as well as the high standard and quality of their lectures. CERN's Director-General, Rolf Heuer, gave a public lecture at the Parque de las Ciencias entitled "The Large Hadron Collider: Unveiling the Universe". In addition to the academic programme, the students had the opportunity to visit the well...

  1. Physics of quantum computation

    International Nuclear Information System (INIS)

    Belokurov, V.V.; Khrustalev, O.A.; Sadovnichij, V.A.; Timofeevskaya, O.D.

    2003-01-01

    In the paper, the modern status of the theory of quantum computation is considered. The fundamental principles of quantum computers and their basic notions such as quantum processors and computational basis states of the quantum Turing machine as well as the quantum Fourier transform are discussed. Some possible experimental realizations on the basis of NMR methods are given

  2. Physical computation and cognitive science

    CERN Document Server

    Fresco, Nir

    2014-01-01

    This book presents a study of digital computation in contemporary cognitive science. Digital computation is a highly ambiguous concept, as there is no common core definition for it in cognitive science. Since this concept plays a central role in cognitive theory, an adequate cognitive explanation requires an explicit account of digital computation. More specifically, it requires an account of how digital computation is implemented in physical systems. The main challenge is to deliver an account encompassing the multiple types of existing models of computation without ending up in pancomputationalism, that is, the view that every physical system is a digital computing system. This book shows that only two accounts, among the ones examined by the author, are adequate for explaining physical computation. One of them is the instructional information processing account, which is developed here for the first time.   “This book provides a thorough and timely analysis of differing accounts of computation while adv...

  3. Physics with 100-1000 TeV accelerators

    International Nuclear Information System (INIS)

    Salam, A.

    1982-10-01

    Some thoughts are presented about high energy physics topics which may be connected with future accelerator physics. Discussed are the physics associated with Higgs in grand unifying theories and the physics associated with supersymmetry and with ideas about preons

  4. Better physical activity classification using smartphone acceleration sensor.

    Science.gov (United States)

    Arif, Muhammad; Bilal, Mohsin; Kattan, Ahmed; Ahamed, S Iqbal

    2014-09-01

    Obesity is becoming one of the serious problems for the health of the worldwide population. Social interaction on mobile phones and computers via the internet through social e-networks is one of the major causes of a lack of physical activity. For the health specialist, it is important to track the physical activities of obese or overweight patients in order to supervise weight-loss control. In this study, the acceleration sensor present in a smartphone is used to monitor the physical activity of the user. Physical activities including Walking, Jogging, Sitting, Standing, Walking upstairs and Walking downstairs are classified. Time-domain features are extracted from the acceleration data recorded by the smartphone during different physical activities. The time and space complexity of the whole framework is reduced by optimal feature subset selection and pruning of instances. Classification results for six physical activities are reported in this paper. Using simple time-domain features, 99% classification accuracy is achieved. Furthermore, attribute subset selection is used to remove redundant features and to minimize the time complexity of the algorithm. A subset of 30 features produced more than 98% classification accuracy for the six physical activities.
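
    A minimal sketch of the kind of pipeline the abstract describes: windowed time-domain features from a three-axis accelerometer fed to a generic classifier. The particular feature list, the random-forest choice and the synthetic data are illustrative assumptions, not the authors' exact method.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        def time_domain_features(window):
            """window: (N, 3) array of x, y, z accelerations for one segment."""
            feats = []
            for axis in range(3):
                a = window[:, axis]
                feats += [a.mean(), a.std(),
                          np.sqrt(np.mean(a ** 2)),          # RMS
                          np.abs(np.diff(a)).mean()]         # mean absolute first difference
            feats.append(np.corrcoef(window[:, 0], window[:, 1])[0, 1])   # x-y correlation
            return np.array(feats)

        rng = np.random.default_rng(0)                       # synthetic demo data only
        windows = rng.normal(size=(120, 128, 3))             # 120 windows of 128 samples
        labels = rng.integers(0, 6, size=120)                # six activity classes

        X = np.array([time_domain_features(w) for w in windows])
        clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)
        print("training accuracy:", clf.score(X, labels))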

  5. Department of Accelerator Physics and Technology: Overview

    International Nuclear Information System (INIS)

    Pachan, M.

    2002-01-01

    features and technology of execution. At the end of the year, the contract was concluded, and in summer 2002 the two ordered sections will be completed. In view of funding shortages, the task for the coming year is to discuss and define the future role of accelerator physics and technology in our Institute. (author)

  6. Computing in high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Sarah; Devenish, Robin [Nuclear Physics Laboratory, Oxford University (United Kingdom)

    1989-07-15

    Computing in high energy physics has changed over the years from being something one did on a slide-rule, through early computers, then a necessary evil to the position today where computers permeate all aspects of the subject from control of the apparatus to theoretical lattice gauge calculations. The state of the art, as well as new trends and hopes, were reflected in this year's 'Computing In High Energy Physics' conference held in the dreamy setting of Oxford's spires. The conference aimed to give a comprehensive overview, entailing a heavy schedule of 35 plenary talks plus 48 contributed papers in two afternoons of parallel sessions. In addition to high energy physics computing, a number of papers were given by experts in computing science, in line with the conference's aim – 'to bring together high energy physicists and computer scientists'.

  7. Computing in high energy physics

    International Nuclear Information System (INIS)

    Smith, Sarah; Devenish, Robin

    1989-01-01

    Computing in high energy physics has changed over the years from being something one did on a slide-rule, through early computers, then a necessary evil to the position today where computers permeate all aspects of the subject from control of the apparatus to theoretical lattice gauge calculations. The state of the art, as well as new trends and hopes, were reflected in this year's 'Computing In High Energy Physics' conference held in the dreamy setting of Oxford's spires. The conference aimed to give a comprehensive overview, entailing a heavy schedule of 35 plenary talks plus 48 contributed papers in two afternoons of parallel sessions. In addition to high energy physics computing, a number of papers were given by experts in computing science, in line with the conference's aim – 'to bring together high energy physicists and computer scientists'

  8. Department of Accelerator Physics and Technology - Overview

    International Nuclear Information System (INIS)

    Plawski, E.

    2006-01-01

    The activities of the P-10 Department in 2005 were devoted to: - development of a radiographic 4 MeV electron accelerator; - development of accelerating and deflecting travelling-wave (TW) and standing-wave (SW) RF structures for electrons and ions; - MC simulations applied to photon and ion radiotherapy. The compact 6 MeV electron linac constructed in Department P-10 was put into experimental operation at the beginning of the reported year. The request for permission to use an ionisation source (the 6 MeV linac) was submitted to the National Atomic Energy Agency. On the basis of all necessary documents, permission for routine use of our linac was granted. The e/X conversion tungsten target has now been moved from vacuum to air. To improve the safety of accelerator operation, a new collimator and some shielding walls were added. Two regimes of operation are now possible: an X-ray output beam or an electron beam, depending on user demand. Some old, unreliable sub-units of the accelerator were replaced, and energy and intensity optimisation for e-/X-ray conversion was carried out. The MC calculations of photon beams produced on the e-/X converter were repeated, taking into account the new collimator and additional shields. The triode gun, originally intended as a part of the 6/15 MeV medical accelerator, is still under long-term tests and shows excellent performance; it was twice opened to air to confirm the possibility of repeated formation of the gun dispenser cathode. The new pulse modulator was routinely used in these tests. The sublimation set-up designed and made in our Department for the TiN coating of accelerator components successfully underwent technological tests, including tests of coating quality on several ceramic RF power vacuum windows. Within the German heavy-ion therapy program, DKFZ Heidelberg is responsible for the medical physics problems of treatment planning and modeling of ion beams for the GSI Radiotherapy Facility. The MC simulations are used to calibrate the X-ray CT scanners to obtain

  9. Book of abstracts of the 9th Conference on High Energy Physics, Nuclear Physics and Accelerators

    International Nuclear Information System (INIS)

    Dovbnya, A.N.

    2011-01-01

    The conference is devoted to fundamental investigations at intermediate and high energies; nuclear structure in reactions with charged particles; application of nuclear-physical methods in associated fields; investigation and development of accelerators and of charged-particle storage rings; fundamental investigation and development of nuclear-physical methods as applied in nuclear power, medicine and industry; application of computer technologies to physical studies; fundamental investigations of the interaction of ultrarelativistic particles with single crystals and matter; and the physics of detectors.

  10. Computers and theoretical physics

    International Nuclear Information System (INIS)

    Terrano, A.E.

    1987-01-01

    The outline of the lectures is as follows: 1) The architecture of conventional computers. 2) The design of special-purpose machines. 3) Elements of modern programming. 4) Algebraic and interactive programs. (orig./BBO)

  11. RAPIDE 0.0 RHIC Accelerator Physics Intrepid Development Environment

    Energy Technology Data Exchange (ETDEWEB)

    Satogata, T. [Brookhaven National Lab. (BNL), Upton, NY (United States); Saltmarsh, C. [Brookhaven National Lab. (BNL), Upton, NY (United States); Peggs, S. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    1993-08-01

    This document is a guide to the common environmental features of computing in (and around) the RHIC Accelerator Physics Section, on the 'zoo' cluster of UNIX workstations, in RAPIDE, the RHIC Accelerator Physics Intrepid Development Environment. It is hoped that later revisions of this document will approach a more professional 'style guide', beyond the convenient collection of pointers and hints presented here. RAP does two kinds of computing, "controls" and "general", addressed in sections 2 and 3 of this document. For general computing, efficient system administration requires cooperation in using a common environment. There is a much stronger need to define - and adhere to - a commonly agreed set of styles (or rules) in developing controls software. Right now, these rules have been set "de facto". Future improvements to the controls environment, particularly in response to the opinions of users, depend on broad knowledge of what the rules are. There are environmental issues that are basic to both controls and general computing, and that are so fundamental that they are (almost) unarguable. They are described immediately below, in the next section.

  12. Accelerators for elementary particle physics in Europe

    International Nuclear Information System (INIS)

    Schopper, H.

    1983-01-01

    The European accelerator programme provides physicists from Europe and other continents with facilities to carry out an exciting physics programme in both the medium- and long-term future. During the last decade a concentration of activities took place. The major high energy physics laboratory in Europe is CERN which, with its 13 Member States, is the only international laboratory in the field of high energy physics. About 2,500 physicists carry out their research there, and they come not only from the Member States but also from the United States, USSR, Japan, China, Israel etc. Its attraction stems from the fact that most of its facilities are unique. The second laboratory for high energy physics is DESY in Hamburg. Although it is a national laboratory, it has always been open to physicists from other countries. In particular, since the operation of PETRA started, it has attracted many physicists from Europe and other regions. All high energy experiments at DESY are carried out in international collaborations: of the roughly 400 physicists involved, some 180 come from foreign universities and research institutes and about 150 from German universities and research laboratories. (author)

  13. Computing in high energy physics

    International Nuclear Information System (INIS)

    Hertzberger, L.O.; Hoogland, W.

    1986-01-01

    This book deals with advanced computing applications in physics, and in particular in high energy physics environments. The main subjects covered are networking; vector and parallel processing; and embedded systems. Also examined are topics such as operating systems, future computer architectures and commercial computer products. The book presents solutions that are foreseen to cope, in the future, with computing problems in experimental and theoretical High Energy Physics. In the experimental environment the large amounts of data to be processed pose special problems on-line as well as off-line. For on-line data reduction, embedded special-purpose computers, which are often used for trigger applications, are applied. For off-line processing, parallel computers such as emulator farms and the cosmic cube may be employed. The analysis of these topics is therefore a main feature of this volume.

  14. Tools for remote computing in accelerator control

    International Nuclear Information System (INIS)

    Anderssen, P.S.; Frammery, V.; Wilcke, R.

    1990-01-01

    In modern accelerator control systems, the intelligence of the equipment is distributed in the geographical and the logical sense. Control processes for a large variety of tasks reside in both the equipment and the control computers. Hence successful operation hinges on the availability and reliability of the communication infrastructure. The computers are interconnected by a communication system and use remote procedure calls and message passing for information exchange. These communication mechanisms need a well-defined convention, i.e. a protocol. They also require flexibility in both the setup and changes to the protocol specification. The network compiler is a tool which provides the programmer with a means of establishing such a protocol for his application. Input to the network compiler is a single interface description file provided by the programmer. This file is written according to a grammar, and completely specifies the interprocess communication interfaces. Passed through the network compiler, the interface description file automatically produces the additional source code needed for the protocol. Hence the programmer does not have to be concerned about the details of the communication calls. Any further additions and modifications are made easy, because all the information about the interface is kept in a single file. (orig.)

  15. Pulsed power accelerator for material physics experiments

    Directory of Open Access Journals (Sweden)

    D. B. Reisman

    2015-09-01

    We have developed the design of Thor: a pulsed power accelerator that delivers a precisely shaped current pulse with a peak value as high as 7 MA to a strip-line load. The peak magnetic pressure achieved within a 1-cm-wide load is as high as 100 GPa. Thor is powered by as many as 288 decoupled and transit-time isolated bricks. Each brick consists of a single switch and two capacitors connected electrically in series. The bricks can be individually triggered to achieve a high degree of current pulse tailoring. Because the accelerator is impedance matched throughout, capacitor energy is delivered to the strip-line load with an efficiency as high as 50%. We used iterative finite element method (FEM), circuit, and magnetohydrodynamic simulations to develop an optimized accelerator design. When powered by 96 bricks, Thor delivers as much as 4.1 MA to a load, and achieves peak magnetic pressures as high as 65 GPa. When powered by 288 bricks, Thor delivers as much as 6.9 MA to a load, and achieves magnetic pressures as high as 170 GPa. We have developed an algebraic calculational procedure that uses the single brick basis function to determine the brick-triggering sequence necessary to generate a highly tailored current pulse time history for shockless loading of samples. Thor will drive a wide variety of magnetically driven shockless ramp compression, shockless flyer plate, shock-ramp, equation of state, material strength, phase transition, and other advanced material physics experiments.
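
    The single-brick basis-function idea lends itself to a simple linear-algebra sketch: represent the load current as a sum of time-shifted copies of one brick waveform and solve a non-negative least-squares problem for how many bricks fire in each trigger slot. The waveform, target pulse and discretization below are invented for illustration; this is a sketch of the idea, not the Thor team's actual procedure.

        import numpy as np
        from scipy.optimize import nnls

        t = np.linspace(0.0, 3.0e-6, 600)                 # time axis [s], made up
        tau = 0.4e-6
        brick = (t / tau) * np.exp(1.0 - t / tau)         # normalized single-brick current

        shifts = np.arange(0, 400, 20)                    # candidate trigger delays [samples]
        A = np.zeros((len(t), len(shifts)))
        for j, s in enumerate(shifts):
            A[s:, j] = brick[: len(t) - s]                # shifted copies of the basis waveform

        target = np.clip(t / 2.0e-6, 0.0, 1.0)            # desired ramp-like pulse shape

        weights, resid = nnls(A, target)                  # bricks per trigger slot (>= 0)
        print("fit residual:", round(resid, 4))
        print("active trigger slots:", np.flatnonzero(weights > 1e-3))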

  16. Accelerated Physical Stability Testing of Amorphous Dispersions.

    Science.gov (United States)

    Mehta, Mehak; Suryanarayanan, Raj

    2016-08-01

    The goal was to develop an accelerated physical stability testing method for amorphous dispersions. Water sorption is known to cause plasticization and may accelerate drug crystallization. In an earlier investigation, it was observed that both the increase in mobility and the decrease in stability of amorphous dispersions were explained by the "plasticization" effect of water (Mehta et al. Mol. Pharmaceutics 2016, 13 (4), 1339-1346). In this work, the influence of water concentration (up to 1.8% w/w) on the correlation between mobility and crystallization in felodipine dispersions was investigated. With an increase in water content, the α-relaxation time as well as the time for 1% w/w felodipine crystallization decreased. The relaxation times of the systems, obtained with different water concentrations, overlapped when the temperature was scaled (Tg/T). The temperature dependencies of the α-relaxation time as well as the crystallization time were unaffected by the water concentration. Thus, the value of the coupling coefficient, up to a water concentration of 1.8% w/w, was approximately constant. Based on these findings, the use of "water sorption" is proposed to build predictive models for crystallization in slow-crystallizing dispersions.
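
    As a hedged numerical illustration of the coupling-coefficient idea mentioned above, an exponent relating crystallization onset time to alpha-relaxation time can be read off as the slope of a log-log fit; the data and the 0.65 exponent below are synthetic, not values from the paper.

        import numpy as np

        rng = np.random.default_rng(2)
        tau_alpha = np.array([10.0, 30.0, 100.0, 300.0, 1000.0])               # relaxation times [s], synthetic
        t_cryst = 5.0 * tau_alpha ** 0.65 * np.exp(rng.normal(0.0, 0.05, 5))   # synthetic onset times

        coupling, _ = np.polyfit(np.log(tau_alpha), np.log(t_cryst), 1)        # slope of the log-log fit
        print("estimated coupling exponent:", round(coupling, 2))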

  17. Computing requirements for S.S.C. accelerator design and studies

    International Nuclear Information System (INIS)

    Dragt, A.; Talman, R.; Siemann, R.; Dell, G.F.; Leemann, B.; Leemann, C.; Nauenberg, U.; Peggs, S.; Douglas, D.

    1984-01-01

    We estimate the computational hardware resources that will be required for accelerator physics studies during the design of the Superconducting Super Collider. It is found that both Class IV and Class VI facilities (1) will be necessary. We describe a user environment for these facilities that is desirable within the context of accelerator studies. An acquisition scenario for these facilities is presented.

  18. Beam tomography or ART in accelerator physics

    International Nuclear Information System (INIS)

    Fraser, J.S.

    1978-11-01

    Projections of charged particle beam current density have been used for many years as a measure of beam position and size. The conventional practice of obtaining only two projections, usually in the horizontal and vertical planes, puts a severe limit on the detail that can be recovered from the projections. A third projection provides sufficient improvement to justify the addition of a wire to the conventional wire scanner in certain cases. A group of programs using algebraic reconstruction techniques was written to reconstruct beam current density from beam projections obtained at three or more specific or arbitrary angles around the beam. A generalized program, which makes use of arbitrary 2 x 2 transfer matrices between projections, can be used to reconstruct transverse or longitudinal emittance from appropriate projections. Reconstruction examples of beam current density and transverse and longitudinal emittance using experimental data from the Clinton P. Anderson Meson Physics Facility (LAMPF) accelerator beam are given
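
    A minimal sketch of the algebraic reconstruction idea (Kaczmarz-type row projections) with a toy 4-pixel "density" and an invented projection geometry; the LAMPF programs themselves are not reproduced here.

        import numpy as np

        def art(A, p, n_sweeps=100, relax=1.0):
            """Kaczmarz-type ART: repeatedly project the estimate onto each
            projection equation A[i] @ x = p[i]."""
            x = np.zeros(A.shape[1])
            for _ in range(n_sweeps):
                for i in range(A.shape[0]):
                    a = A[i]
                    x += relax * (p[i] - a @ x) / (a @ a) * a
            return x

        # Toy geometry: row sums, column sums and one diagonal ray of a 2x2 image.
        x_true = np.array([0.0, 2.0, 3.0, 1.0])
        A = np.array([[1, 1, 0, 0],
                      [0, 0, 1, 1],
                      [1, 0, 1, 0],
                      [0, 1, 0, 1],
                      [1, 0, 0, 1]], dtype=float)
        p = A @ x_true                                     # simulated projection data
        print("reconstructed density:", np.round(art(A, p), 3))

    With only two orthogonal projections the system would be underdetermined, which is exactly the limitation the abstract points out; the extra diagonal ray is what makes this toy problem uniquely solvable.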

  19. Computer Tutorial Programs in Physics.

    Science.gov (United States)

    Faughn, Jerry; Kuhn, Karl

    1979-01-01

    Describes a series of computer tutorial programs which are intended to help college students in introductory physics courses. Information about these programs, which are either calculus or algebra-trig based, is presented. (HM)

  20. Reactor physics and reactor computations

    International Nuclear Information System (INIS)

    Ronen, Y.; Elias, E.

    1994-01-01

    Mathematical methods and computer calculations for nuclear and thermonuclear reactor kinetics, reactor physics, neutron transport theory, core lattice parameters, waste treatment by transmutation, breeding, nuclear and thermonuclear fuels are the main interests of the conference

  1. Computational atomic and nuclear physics

    International Nuclear Information System (INIS)

    Bottcher, C.; Strayer, M.R.; McGrory, J.B.

    1990-01-01

    The evolution of parallel processor supercomputers in recent years provides opportunities to investigate in detail many complex problems, in many branches of physics, which were considered to be intractable only a few years ago. But to take advantage of these new machines, one must have a better understanding of how the computers organize their work than was necessary with previous single processor machines. Equally important, the scientist must have this understanding as well as a good understanding of the structure of the physics problem under study. In brief, a new field of computational physics is evolving, which will be led by investigators who are highly literate both computationally and physically. A Center for Computationally Intensive Problems has been established with the collaboration of the University of Tennessee Science Alliance, Vanderbilt University, and the Oak Ridge National Laboratory. The objective of this Center is to carry out forefront research in computationally intensive areas of atomic, nuclear, particle, and condensed matter physics. An important part of this effort is the appropriate training of students. An early effort of this Center was to conduct a Summer School of Computational Atomic and Nuclear Physics. A distinguished faculty of scientists in atomic, nuclear, and particle physics gave lectures on the status of present understanding of a number of topics at the leading edge in these fields, and emphasized those areas where computational physics was in a position to make a major contribution. In addition, there were lectures on numerical techniques which are particularly appropriate for implementation on parallel processor computers and which are of wide applicability in many branches of science

  2. Computational Methods in Plasma Physics

    CERN Document Server

    Jardin, Stephen

    2010-01-01

    Assuming no prior knowledge of plasma physics or numerical methods, Computational Methods in Plasma Physics covers the computational mathematics and techniques needed to simulate magnetically confined plasmas in modern magnetic fusion experiments and future magnetic fusion reactors. Largely self-contained, the text presents the basic concepts necessary for the numerical solution of partial differential equations. Along with discussing numerical stability and accuracy, the author explores many of the algorithms used today in enough depth so that readers can analyze their stability, efficiency,

  3. The creation and development of new physics of acceleration

    International Nuclear Information System (INIS)

    Bonch-Osmolovskij, A.G.

    1992-01-01

    The creation and stages of development of a new field of physics - the physics of dense charged-particle ensembles - and the new acceleration principles connected with it are considered. The essence of the physics of the new acceleration methods is briefly explained, and their status and the prospects for their development are outlined. 13 refs

  4. CAS Accelerator Physics (RF for Accelerators) in Denmark

    CERN Multimedia

    Barbara Strasser

    2010-01-01

    The CERN Accelerator School (CAS) and Aarhus University jointly organised a specialised course on RF for Accelerators, at the Ebeltoft Strand Hotel, Denmark, from 8 to 17 June 2010.   The challenging programme focused on the introduction of the underlying theory, the study and the performance of the different components involved in RF systems, RF gymnastics, and RF measurements and diagnostics. This academic part was supplemented with three afternoons dedicated to practical hands-on exercises. The school was very successful, with 100 participants representing 25 nationalities. Feedback from the participants was extremely positive, praising the expertise and enthusiasm of the lecturers, as well as the high standard and excellent quality of their lectures. In addition to the academic programme, the participants were able to visit a small industrial exhibition organised by Aarhus University and take part in a one-day excursion consisting of a visit of the accelerators operated ...

  5. Physics and technical development of accelerators

    International Nuclear Information System (INIS)

    2000-03-01

    About 90 registered participants delivered more than 40 scientific papers. A large part of these presentations were of general interest, covering running projects such as the CIME accelerator at Ganil, IPHI (high-intensity proton injector), the ESRF (European Synchrotron Radiation Facility), the LHC (Large Hadron Collider), the ELYSE accelerator at Orsay, AIRIX, and the VIVITRON tandem accelerator. Other presentations highlighted the latest technological developments in accelerator components: superconducting cavities, power klystrons, high-current injectors...

  6. Health physics problems encountered in the Saclay linear accelerator

    International Nuclear Information System (INIS)

    Delsaut, R.

    1979-01-01

    The safety and health physics problems specific to the Saclay linear accelerator are presented: activation (of gases, dust, water, structural materials, targets); individual dosimetry; and safety engineering. [fr]

  7. a Physics of Computation.

    Science.gov (United States)

    Stornetta, Wakefield Scott, Jr.

    The large numbers of elements in distributed computational systems such as systolic arrays, connectionist models, and groups of agents make it difficult to do a "bottom-up" analysis of system performance. This raises the issue of what aspects of such systems can be analyzed without detailed knowledge of the lowest-level interactions. Such an approach is analogous in spirit to seeking a thermodynamic description of a gas, rather than tracking the motions of individual molecules. Four concrete illustrations of this notion are presented as follows: (1) an alternative to the NetTalk approach to temporal pattern processing in connectionist networks, which exhibits simple scaling laws that reduce the system's dependence on the sampling rate, (2) an experimental study of the effect that symmetrizing the operating range of the back-propagation connectionist model has on relaxation rates and capacity, (3) a phenomenological model of a recently introduced fault-stealing mechanism for multi-pipeline systolic arrays, which predicts global failure rates on arrays of arbitrary size based on only a small number of measurements, and (4) the effects that the range of interaction has on specialization and fault tolerance for a group of agents engaged in problem solving.

  8. Department of Accelerator Physics and Technology - Overview

    International Nuclear Information System (INIS)

    Wronka, S.

    2010-01-01

    ) complex permittivity was measured as a function of RF frequency up to 8 GHz before and after vacuum heating to 1100 °C. The design of the absorber vacuum chamber and of the absorbing ring and copper holder removing the heat was finalized. The technological aspects (stainless steel to be used safely at a temperature of 2 K) are still under discussion. The final realization of WP-06 (Work Package 06 as an In-Kind Contribution) consists of the production and delivery to the XFEL site of a total of 1648 HOM transmission lines and 108 BLAs. Installation and technical commissioning should be completed by the end of 2013. 2) TiN coating vacuum stand for RF components. Studies of TiN anti-multipactor film deposition on ceramic and metallic surfaces were continued in 2010, particularly the impact of ionization phenomena on the transport of Ti vapors. Further measurements of discharge plasma parameters were performed using cylindrical Langmuir probes. The development of the discharge was modeled theoretically. Precise formulas were derived for calculating the exposure of the deposited TiN surface films. 3) Participation in the ESS (European Spallation Source) project. In 2010 IPJ continued to participate in the ESS project. Thanks to this cooperation, three theses have been written and defended in the Faculty of Electronics and Information Technology of Warsaw University of Technology. These theses were as follows: "The bead-pull RF measurement system for the Linac 4 prototype". This thesis contains a specification of a bead-pull measurement system for drift tube linear accelerator structures such as Linac 4. It consists of the physical basis for the measurement method and the general concept of such systems, as well as a specification of its complete (both hardware and software) implementation for the Linac 4 prototype. It also contains the results of the measurements obtained using this system. These results confirmed the validity of the system and allowed conclusions regarding the

  9. Some integral formulations occurring in accelerator physics

    Energy Technology Data Exchange (ETDEWEB)

    Miano, G.; Verolino, L. [Naples Univ. (Italy). Dip. di Ingegneria Elettronica; INFN, Naples (Italy)]; Vaccaro, V.G. [Naples Univ. (Italy). Dip. di Scienze Fisiche; INFN, Naples (Italy)]

    1995-10-01

    In this paper a powerful and robust analytical-numerical approach for studying the electromagnetic interaction between a bunch of particles and the discontinuities of the vacuum chamber of a particle accelerator is discussed. In particular, the diffraction of the electromagnetic field created by a bunch of charges travelling through an iris and a drift tube is considered. Choosing in both cases a spectral transform of the current density distribution on the scatterer as the unknowns, an effective numerical model is obtained. These unknowns have to satisfy a system of dual integral equations. A general procedure is described to transform this system into a single Fredholm integral equation of the second kind (in the case of the iris) or into a system of linear algebraic equations by means of a Neumann series (in the case of the drift tube). These models make it possible to compute the longitudinal coupling impedance with good accuracy in either the low-frequency or the high-frequency limit.

  10. Some integral formulations occurring in accelerator physics

    International Nuclear Information System (INIS)

    Miano, G.; Verolino, L.; Vaccaro, V.G.

    1995-10-01

    In this paper a powerful and robust analytical-numerical approach for studying the electromagnetic interaction between a bunch of particles and the discontinuities of the vacuum chamber of a particle accelerator is discussed. In particular, the diffraction of the electromagnetic field created by a bunch of charges travelling through an iris and a drift tube is considered. Choosing in both cases a spectral transform of the current density distribution on the scatterer as the unknowns, an effective numerical model is obtained. These unknowns have to satisfy a system of dual integral equations. A general procedure is described to transform this system into a single Fredholm integral equation of the second kind (in the case of the iris) or into a system of linear algebraic equations by means of a Neumann series (in the case of the drift tube). These models make it possible to compute the longitudinal coupling impedance with good accuracy in either the low-frequency or the high-frequency limit.
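
    For reference, the generic form of a Fredholm integral equation of the second kind and its Neumann-series solution read (in LaTeX; the specific kernels and forcing terms of the cited work are not reproduced here):

        f(x) = g(x) + \lambda \int_a^b K(x,y)\, f(y)\, \mathrm{d}y,
        \qquad
        f = \sum_{n=0}^{\infty} \lambda^{n} \mathcal{K}^{n} g,
        \quad \text{where } (\mathcal{K}g)(x) = \int_a^b K(x,y)\, g(y)\, \mathrm{d}y.

    The series converges for \lambda \|\mathcal{K}\| < 1; truncating it, or discretizing the integral, yields the finite system of linear algebraic equations mentioned in the abstract.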

  11. Basic concepts in computational physics

    CERN Document Server

    Stickler, Benjamin A

    2016-01-01

    This new edition is a concise introduction to the basic methods of computational physics. Readers will discover the benefits of numerical methods for solving complex mathematical problems and for the direct simulation of physical processes. The book is divided into two main parts: deterministic methods and stochastic methods in computational physics. Based on concrete problems, the first part discusses numerical differentiation and integration, as well as the treatment of ordinary differential equations. This is extended by a brief introduction to the numerics of partial differential equations. The second part deals with the generation of random numbers, summarizes the basics of stochastics, and subsequently introduces Monte-Carlo (MC) methods. Specific emphasis is on Markov chain MC algorithms. The final two chapters discuss data analysis and stochastic optimization. All this is again motivated and augmented by applications from physics. In addition, the book offers a number of appendices to provide the read...
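
    As a generic illustration of the Markov chain Monte Carlo class of algorithms the book emphasizes, here is a minimal Metropolis sampler for a one-dimensional standard normal target; it is not an example taken from the book.

        import numpy as np

        rng = np.random.default_rng(1)

        def log_target(x):                      # log-density of a standard normal, up to a constant
            return -0.5 * x * x

        x, samples = 0.0, []
        for _ in range(20000):
            proposal = x + rng.normal(0.0, 1.0)             # symmetric random-walk proposal
            if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
                x = proposal                                # Metropolis accept step
            samples.append(x)

        print("sample mean:", round(float(np.mean(samples)), 3),
              "sample std:", round(float(np.std(samples)), 3))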

  12. Topical problems of accelerator and applied heavy ion physics

    International Nuclear Information System (INIS)

    Becker, R.; Deitinghoff, H.; Junior, P.H.; Schempp, A.

    1990-12-01

    These proceedings contain the articles presented at the named seminar. They deal with high-intensity linacs for heavy ions, the free-electron laser, applications of heavy-ion beams, MEQALAC, the ESR Schottky-diagnosis system, the analysis of GaAs by ion-beam methods, a light-ion synchrotron for cancer therapy, a device for the measurement of the momentum spread of ion beams, the European Hadron facility, the breakdown fields at electrodes in high vacuum, a computer program for the calculation of electric quadrupoles, a focusing electrostatic mirror, storage and cooling of Ar beams, the visualization of heavy ion tracks in photographic films, the motion of ions in magnetic fields, the CERN heavy ion program, linear colliders, the beam injection from a linac into a storage ring, negative-ion sources, wake field acceleration, RFQ's, a dense electron target, the matching of a DC beam into the RFQ, electron emission and breakdown in vacuum, a 1-1.5 GeV, 300 mA linear accelerator, the production of high-current positive-ion beams, high-current beam experiments at GSI, improvement of the Frankfurt EBIS, the physics of the violin, double layers, beam formation with coupled RFQ's, an atomic nitrogen beam for material modification, compact superconducting synchrotron-radiation sources, industrial property rights, an RF ion source for thin film processes, beam-cavity interactions in the RFQ linac, atomic physics with crossed uranium beams, proton linacs, the interdigital H-type structure, injection of H⁻ beams into an RFQ accelerator, the production of MOS devices by ion implantation, the application of RFQ's, the Frankfurt highly-charged ion facility, RF acceleration techniques for beam current drive in tokamaks, space-charge neutralized transport, and storage rings for synchrotron radiation and free electron lasers. (HSI)

  13. Computer codes for beam dynamics analysis of cyclotronlike accelerators

    Science.gov (United States)

    Smirnov, V.

    2017-12-01

    Computer codes suitable for the study of beam dynamics in cyclotronlike (classical and isochronous cyclotrons, synchrocyclotrons, and fixed field alternating gradient) accelerators are reviewed. Computer modeling of cyclotron segments, such as the central zone, acceleration region, and extraction system is considered. The author does not claim to give a full and detailed description of the methods and algorithms used in the codes. Special attention is paid to the codes already proven and confirmed at the existing accelerating facilities. The description of the programs prepared in the worldwide known accelerator centers is provided. The basic features of the programs available to users and limitations of their applicability are described.

  14. Computational physics of the mind

    Science.gov (United States)

    Duch, Włodzisław

    1996-08-01

    In the XIX century and earlier, physicists such as Newton, Mayer, Hooke, Helmholtz and Mach were actively engaged in research on psychophysics, trying to relate psychological sensations to the intensities of physical stimuli. Computational physics makes it possible to simulate complex neural processes, giving a chance not only to answer the original psychophysical questions but also to create models of the mind. In this paper several approaches relevant to modeling of the mind are outlined. Since direct modeling of brain functions is rather limited due to the complexity of such models, a number of approximations are introduced. The path from the brain, or computational neuroscience, to the mind, or cognitive science, is sketched, with emphasis on higher cognitive functions such as memory and consciousness. No fundamental problems in understanding the mind seem to arise. From a computational point of view, realistic models require massively parallel architectures.

  15. Advanced Computing for 21st Century Accelerator Science and Technology

    International Nuclear Information System (INIS)

    Dragt, Alex J.

    2004-01-01

    Dr. Dragt of the University of Maryland is one of the Institutional Principal Investigators for the SciDAC Accelerator Modeling Project Advanced Computing for 21st Century Accelerator Science and Technology whose principal investigators are Dr. Kwok Ko (Stanford Linear Accelerator Center) and Dr. Robert Ryne (Lawrence Berkeley National Laboratory). This report covers the activities of Dr. Dragt while at Berkeley during spring 2002 and at Maryland during fall 2003

  16. Berkeley Lab Computing Sciences: Accelerating Scientific Discovery

    International Nuclear Information System (INIS)

    Hules, John A.

    2008-01-01

    Scientists today rely on advances in computer science, mathematics, and computational science, as well as large-scale computing and networking facilities, to increase our understanding of ourselves, our planet, and our universe. Berkeley Lab's Computing Sciences organization researches, develops, and deploys new tools and technologies to meet these needs and to advance research in such areas as global climate change, combustion, fusion energy, nanotechnology, biology, and astrophysics

  17. Department of Accelerator Physics and Technology: Overview

    International Nuclear Information System (INIS)

    Pachan, M.

    2000-01-01

    Full text: The Department's principal duties in 1999 have not changed and were consistently directed at development in the area of electron and ion accelerators and their applications in science, medicine and technology. Two important events dominated the current and future orientation of R and D activity. The first was the finalization of long-standing efforts to prepare the commissioned research project granted by the State Committee for Scientific Research and devoted to the elaboration and design of a new electron accelerator for radiotherapy with two X-ray photon beam energies. This project was formally approved in March 1999 and, owing to organizational procedures, set in operation a few months later. In the second half of 1999, important progress was made in advancing the project. The second event was the foundation by the government of a Multiyear Research Programme called ''Isotopes and Accelerators''. This programme formulates a broad spectrum of important tasks oriented towards the application of isotopes and accelerator techniques in many branches of science and the national economy. The expected participation of the Department in this programme comprises the following subjects: a medical intraoperative accelerator, a high-power electron accelerator for radiation technology, and the upgrading of a cyclotron for isotope production. In the course of 1999, preparatory studies on these subjects were carried out. Some of the results were presented at conferences and seminars. An interesting experience was the expert assessment of the technical status of the Eindhoven isochronous cyclotron and its possible transfer to Swierk as a professional tool for isotope production. In the group of medical applications, three subjects were continued during 1999 and brought important results: - completion of microwave measurements of the high-gradient accelerating structure for low-energy accelerators; such a structure will be a very useful solution for the Co-Line and the intraoperative accelerator; - evaluation of design data and

  18. Towards C++ object libraries for accelerator physics

    International Nuclear Information System (INIS)

    Michelotti, L.

    1992-01-01

    This paper concerns the creation of libraries of reusable objects in the C++ language for doing accelerator design and analysis. The C++ language possesses features which lend themselves to writing portable, scientific software. The two libraries of C++ classes (objects) which have been under development are (1) MXYZPTLK, which implements automatic differentiation, and (2) BEAMLINE, which provides objects for modeling beam line and accelerator components. A description of the principal classes in these two libraries is presented.
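
    To illustrate the automatic-differentiation idea behind a library like MXYZPTLK, here is a minimal forward-mode sketch in Python using dual numbers. It tracks only first derivatives, whereas MXYZPTLK propagates full truncated power series, and it is not a rendering of the library's actual C++ classes.

        class Dual:
            """Value plus first derivative, propagated through + and *."""
            def __init__(self, val, der=0.0):
                self.val, self.der = val, der

            def _lift(self, other):
                return other if isinstance(other, Dual) else Dual(other)

            def __add__(self, other):
                o = self._lift(other)
                return Dual(self.val + o.val, self.der + o.der)
            __radd__ = __add__

            def __mul__(self, other):
                o = self._lift(other)
                return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
            __rmul__ = __mul__

        def f(x):                               # any composition of + and * works unchanged
            return 3 * x * x + 2 * x + 1

        x = Dual(2.0, 1.0)                      # seed dx/dx = 1
        y = f(x)
        print("f(2) =", y.val, " f'(2) =", y.der)   # 17.0 and 14.0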

  19. High-performance computing in accelerating structure design and analysis

    International Nuclear Information System (INIS)

    Li Zenghai; Folwell, Nathan; Ge Lixin; Guetz, Adam; Ivanov, Valentin; Kowalski, Marc; Lee, Lie-Quan; Ng, Cho-Kuen; Schussman, Greg; Stingelin, Lukas; Uplenchwar, Ravindra; Wolf, Michael; Xiao, Liling; Ko, Kwok

    2006-01-01

    Future high-energy accelerators such as the Next Linear Collider (NLC) will accelerate multi-bunch beams of high current and low emittance to obtain high luminosity, which put stringent requirements on the accelerating structures for efficiency and beam stability. While numerical modeling has been quite standard in accelerator R and D, designing the NLC accelerating structure required a new simulation capability because of the geometric complexity and level of accuracy involved. Under the US DOE Advanced Computing initiatives (first the Grand Challenge and now SciDAC), SLAC has developed a suite of electromagnetic codes based on unstructured grids and utilizing high-performance computing to provide an advanced tool for modeling structures at accuracies and scales previously not possible. This paper will discuss the code development and computational science research (e.g. domain decomposition, scalable eigensolvers, adaptive mesh refinement) that have enabled the large-scale simulations needed for meeting the computational challenges posed by the NLC as well as projects such as the PEP-II and RIA. Numerical results will be presented to show how high-performance computing has made a qualitative improvement in accelerator structure modeling for these accelerators, either at the component level (single cell optimization), or on the scale of an entire structure (beam heating and long-range wakefields)
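
    The cavity-mode part of such simulations ultimately reduces to a large sparse eigenproblem. The toy sketch below, assuming a 2D finite-difference Laplacian as a stand-in for the curl-curl operator (not the SLAC unstructured-grid codes), shows the kind of shift-invert eigensolve involved.

        import numpy as np
        import scipy.sparse as sp
        from scipy.sparse.linalg import eigsh

        # Stand-in operator: 5-point Laplacian on an n x n grid with Dirichlet walls.
        n = 40
        main = 4.0 * np.ones(n * n)
        off1 = -1.0 * np.ones(n * n - 1)
        off1[np.arange(1, n * n) % n == 0] = 0.0     # no coupling across grid rows
        offn = -1.0 * np.ones(n * n - n)
        K = sp.diags([main, off1, off1, offn, offn], [0, 1, -1, n, -n], format="csc")

        # The lowest few eigenvalues play the role of the lowest cavity-mode frequencies.
        vals, _ = eigsh(K, k=5, sigma=0.0, which="LM")
        print("lowest eigenvalues:", np.round(np.sort(vals), 4))

    The production codes replace this stand-in with curl-curl finite elements on unstructured grids and rely on domain decomposition and scalable eigensolvers to reach the problem sizes quoted above.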

  20. The control computer for the Chalk River electron test accelerator

    International Nuclear Information System (INIS)

    McMichael, G.E.; Fraser, J.S.; McKeown, J.

    1978-02-01

    A versatile control and data acquisition system has been developed for a modest-sized linear accelerator using mainly process I/O hardware and software. This report describes the evolution of the present system since 1972, the modifications needed to satisfy the changing requirements of the various accelerator physics experiments and the limitations of such a system in process control. (author)

  1. Information technology and computational physics

    CERN Document Server

    Kóczy, László; Mesiar, Radko; Kacprzyk, Janusz

    2017-01-01

    A broad spectrum of modern Information Technology (IT) tools, techniques, main developments and still open challenges is presented. Emphasis is on new research directions in various fields of science and technology that are related to data analysis, data mining, knowledge discovery, information retrieval, clustering and classification, decision making and decision support, control, computational mathematics and physics, to name a few. Applications in many relevant fields are presented, notably in telecommunication, social networks, recommender systems, fault detection, robotics, image analysis and recognition, electronics, etc. The methods used by the authors range from high level formal mathematical tools and techniques, through algorithmic and computational tools, to modern metaheuristics.

  2. Physical Realizations of Quantum Computing

    CERN Document Server

    Kanemitsu, Shigeru; Salomaa, Martti; Takagi, Shin; Are the DiVincenzo Criteria Fulfilled in 2004 ?

    2006-01-01

    The contributors of this volume are working at the forefront of various realizations of quantum computers. They survey the recent developments in each realization, in the context of the DiVincenzo criteria, including nuclear magnetic resonance, Josephson junctions, quantum dots, and trapped ions. There are also some theoretical contributions which have relevance in the physical realizations of a quantum computer. This book fills the gap between elementary introductions to the subject and highly specialized research papers to allow beginning graduate students to understand the cutting-edge of r

  3. Software Accelerates Computing Time for Complex Math

    Science.gov (United States)

    2014-01-01

    Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphics processing unit (GPU) technology - traditionally used for computer video games - to develop high-performance computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.

  4. Electromagnetic Physics Models for Parallel Computing Architectures

    International Nuclear Information System (INIS)

    Amadio, G; Bianchini, C; Iope, R; Ananya, A; Apostolakis, J; Aurora, A; Bandieramonte, M; Brun, R; Carminati, F; Gheata, A; Gheata, M; Goulas, I; Nikitina, T; Bhattacharyya, A; Mohanty, A; Canal, P; Elvira, D; Jun, S Y; Lima, G; Duhem, L

    2016-01-01

    The recent emergence of hardware architectures characterized by many-core or accelerated processors has opened new opportunities for concurrent programming models taking advantage of both SIMD and SIMT architectures. GeantV, a next generation detector simulation, has been designed to exploit both the vector capability of mainstream CPUs and multi-threading capabilities of coprocessors including NVidia GPUs and Intel Xeon Phi. The characteristics of these architectures are very different in terms of the vectorization depth and type of parallelization needed to achieve optimal performance. In this paper we describe implementation of electromagnetic physics models developed for parallel computing architectures as a part of the GeantV project. Results of preliminary performance evaluation and physics validation are presented as well. (paper)

  5. Electromagnetic Physics Models for Parallel Computing Architectures

    Science.gov (United States)

    Amadio, G.; Ananya, A.; Apostolakis, J.; Aurora, A.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, P.; Carminati, F.; Duhem, L.; Elvira, D.; Gheata, A.; Gheata, M.; Goulas, I.; Iope, R.; Jun, S. Y.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.; Zhang, Y.

    2016-10-01

    The recent emergence of hardware architectures characterized by many-core or accelerated processors has opened new opportunities for concurrent programming models taking advantage of both SIMD and SIMT architectures. GeantV, a next generation detector simulation, has been designed to exploit both the vector capability of mainstream CPUs and multi-threading capabilities of coprocessors including NVidia GPUs and Intel Xeon Phi. The characteristics of these architectures are very different in terms of the vectorization depth and type of parallelization needed to achieve optimal performance. In this paper we describe implementation of electromagnetic physics models developed for parallel computing architectures as a part of the GeantV project. Results of preliminary performance evaluation and physics validation are presented as well.
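
    The SIMD/SIMT strategy described in this record can be illustrated with a small, hedged sketch (this is not GeantV code): the total Klein-Nishina Compton cross section is evaluated for a whole batch of photon energies in one vectorized call, the same "many tracks per call" pattern the GeantV electromagnetic models are built around. The batch size and energy range below are arbitrary illustrative choices.

      # Hedged sketch: batch (SIMD-style) evaluation of the total Klein-Nishina
      # Compton cross section, illustrating the "many tracks per call" pattern
      # discussed above.  This is NOT GeantV code; energies and sizes are made up.
      import numpy as np

      R_E = 2.8179403262e-13   # classical electron radius [cm]
      MEC2 = 0.51099895        # electron rest energy [MeV]

      def klein_nishina_total(e_gamma_mev):
          """Total Klein-Nishina cross section [cm^2] for photon energies in MeV.

          Accepts a scalar or a NumPy array; the array form processes a whole
          batch of tracks in one vectorized call.
          """
          k = np.asarray(e_gamma_mev, dtype=float) / MEC2
          log_term = np.log1p(2.0 * k)
          term1 = (1.0 + k) / k**2 * (2.0 * (1.0 + k) / (1.0 + 2.0 * k) - log_term / k)
          term2 = log_term / (2.0 * k)
          term3 = (1.0 + 3.0 * k) / (1.0 + 2.0 * k) ** 2
          return 2.0 * np.pi * R_E**2 * (term1 + term2 - term3)

      if __name__ == "__main__":
          # One vectorized call over a "basket" of 1024 photon tracks.
          energies = np.random.uniform(0.1, 10.0, size=1024)   # MeV, illustrative
          sigma = klein_nishina_total(energies)
          print(sigma[:5], "cm^2")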

  6. Classical mechanics and electromagnetism in accelerator physics

    CERN Document Server

    Stupakov, Gennady

    2018-01-01

    This self-contained textbook with exercises discusses a broad range of selected topics from classical mechanics and electromagnetic theory that inform key issues related to modern accelerators. Part I presents fundamentals of the Lagrangian and Hamiltonian formalism for mechanical systems, canonical transformations, action-angle variables, and then linear and nonlinear oscillators. The Hamiltonian for a circular accelerator is used to evaluate the equations of motion, the action, and betatron oscillations in an accelerator. From this base, we explore the impact of field errors and nonlinear resonances. This part ends with the concept of the distribution function and an introduction to the kinetic equation to describe large ensembles of charged particles and to supplement the previous single-particle analysis of beam dynamics. Part II focuses on classical electromagnetism and begins with an analysis of the electromagnetic field from relativistic beams, both in vacuum and in a resistive pipe. Plane electromagne...

  7. CAS Introduction to Accelerator Physics in Bulgaria

    CERN Multimedia

    CERN Bulletin

    2010-01-01

    The CERN Accelerator School (CAS) and the Institute for Nuclear Research & Nuclear Energy (INRNE – Bulgarian Academy of Sciences) jointly organised a course on Introduction to Accelerators, at the Grand Hotel Varna, Bulgaria, from 19 September to 1 October, 2010.   CERN Accelerator School group photo. The course was extremely well attended with 109 participants representing 34 different nationalities, coming from countries as far away as Australia, Canada and Vietnam. The intensive programme comprised 39 lectures, 3 seminars, 4 tutorials where the students were split into three groups, a poster session where students could present their own work, and 7 hours of guided and private study. Feedback from the participants was extremely positive, praising the expertise and enthusiasm of the lecturers, as well as the high standard and excellent quality of their lectures. For the first time at CAS, the CERN Director-General, Rolf Heuer, visited the school and presented a seminar entitled...

  8. Department of Accelerator Physics And Technology - Overview

    International Nuclear Information System (INIS)

    Plawski, E.

    2009-01-01

    Full text: The activity of department P-10 is focused on the development of new acceleration techniques and technology, as well as on applications of particle accelerators. In 2008, the following topics were investigated and/or realized: 1. A linear accelerator for protons called TOP (Terapia Oncologica con Protoni, Oncological Proton Therapy). Basically a proton linac of modified Alvarez type working at 3000 MHz frequency and delivering beams in the energy range from 65 MeV to 200 MeV. In 2005, a contract was signed between ENEA and SINS-Swierk for the design, manufacture and delivery to Frascati of the input section of a 65 MeV linac. This section of SCDTL type will increase the proton energy from 7 to 16 MeV. In 2008, the field distribution in the manufactured structure was measured and optimized using available universal test stand. Measurements were also performed in ENEA/Frascati in October; a small difference in results, around 0.25%, is under investigation. Beam dynamics calculations using 3D codes have been started in parallel. 2. Preparation for participation in the international X-FEL project. Calculations of the parasitic Higher Order Modes (HOMs) induced in superconducting accelerating structures by very short electron bunches have been continued. Thanks to the special research grant received by department P-10 the design and completion of the HOM elements has been started for two accelerating modules, where each module consists of eight superconducting accelerating structures and focusing/correcting elements. 3. Superconducting layers; studies in INFN-Roma. Within the European CARE/JRA1/WP4-2 project, serious modification of the Nb-coating stand for the 1.3 GHz single-cell copper resonators using a vacuum arc was performed. Thanks to this stand the internal surface of the resonator was successfully coated. 4. TiN coating vacuum stand for RF components. At this stand the analysis of the TiN layer thickness as a function of reactive atmosphere pressure

  9. Computer-aided engineering in High Energy Physics

    International Nuclear Information System (INIS)

    Bachy, G.; Hauviller, C.; Messerli, R.; Mottier, M.

    1988-01-01

    Computing, standard tool for a long time in the High Energy Physics community, is being slowly introduced at CERN in the mechanical engineering field. The first major application was structural analysis followed by Computer-Aided Design (CAD). Development work is now progressing towards Computer-Aided Engineering around a powerful data base. This paper gives examples of the power of this approach applied to engineering for accelerators and detectors

  10. Handbook of accelerator physics and engineering

    CERN Document Server

    Mess, Karl Hubert; Tigner, Maury; Zimmermann, Frank

    2013-01-01

    Edited by internationally recognized authorities in the field, this expanded and updated new edition of the bestselling Handbook, containing more than 100 new articles, is aimed at the design and operation of modern particle accelerators. It is intended as a vade mecum for professional engineers and physicists engaged in these subjects. With a collection of more than 2000 equations, 300 illustrations and 500 graphs and tables, here one will find, in addition to the common formulae of previous compilations, hard-to-find, specialized formulae, recipes and material data pooled from the lifetime experience of many of the world's most able practitioners of the art and science of accelerators.

  11. Department of Accelerator Physics and Technology: Overview

    International Nuclear Information System (INIS)

    Pachan, M.

    1998-01-01

    (full text) In the context of general discussions concerning the activity of the Institute, it was important to look critically at the current and future directions of the Department's activity. Attention is given to the development of basic accelerator knowledge, pursued both at home and through international collaborations. Also important is a steady improvement of the metrological and experimental basis for accelerator research. Apart from this, some development tendencies oriented towards application fields of accelerators were formulated during 1997. Examples include: - medical applications: a) A serious effort was devoted to the idea of using the existing compact cyclotron C-30 as the source for a diagnostic centre in Swierk. The proposal was formulated in consultation with the Nuclear Medicine Department of the Medical Academy and the ''Brodno'' General Hospital. In spite of declared medical interest in such an installation, the project was not approved due to the lack of proper financial support. b) Model measurements and verification of theoretical assumptions and calculations oriented towards the design of a very short, high-gradient accelerating structure for the low-energy accelerator COLINE/1000 were carried out. This project will enable a source-isocentre distance of 1000 mm instead of the existing 800 mm, which is important for therapy. In 1998, this work will be supported by the State Committee for Scientific Research. c) Preliminary discussions and a design approach were undertaken in collaboration with the Centre of Oncology on a movable low-energy accelerator with electron beam output, suited to intra-operative irradiation during surgical therapy of tumours. - applications in radiation technology: Comparison of isotope and machine radiation sources indicates that, under Polish conditions, it is reasonable to use purpose-oriented high-power accelerators. The working group composed of specialists from IChTJ and IPJ prepared the

  12. Accelerator physics analysis with interactive tools

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.

    1993-05-01

    Work is in progress on interactive tools for linear and nonlinear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries, were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties

  13. Linear collider accelerator physics issues regarding alignment

    International Nuclear Information System (INIS)

    Seeman, J.T.

    1990-01-01

    The next generation of linear colliders will require more stringent alignment tolerances than those for the SLC with regard to the accelerating structures, quadrupoles, and beam position monitors. New techniques must be developed to achieve these tolerances. A combination of mechanical-electrical and beam-based methods will likely be needed

  14. Applications of accelerator mass spectrometry to nuclear physics and astrophysics

    International Nuclear Information System (INIS)

    Guo Zhiyu; Zhang Chuan

    2002-01-01

    As an ultra high sensitive analyzing method, accelerator mass spectrometry is playing an important role in the studies of nuclear physics and astrophysics. The accelerator mass spectrometry (AMS) applications in searching for violation of Pauli exclusion principle and study on supernovae are discussed as examples

  15. Computational needs for modelling accelerator components

    International Nuclear Information System (INIS)

    Hanerfeld, H.

    1985-06-01

    The particle-in-cell code MASK is being used to model several different electron accelerator components. These studies are being used both to design new devices and to understand particle behavior within existing structures. Studies include the injector for the Stanford Linear Collider and the 50 megawatt klystron currently being built at SLAC. MASK is a 2D electromagnetic code which is being used by SLAC both on our own IBM 3081 and on the CRAY X-MP at the NMFECC. Our experience with running MASK illustrates the need for supercomputers to continue work of the kind described. 3 refs., 2 figs
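
    As a hedged illustration of the particle push that sits at the core of particle-in-cell codes of this kind (this sketch is not MASK, and the fields, charge, mass and time step are arbitrary assumptions), the classic non-relativistic Boris mover can be written in a few lines:

      # Hedged sketch of a non-relativistic Boris particle push, the standard
      # mover used inside many particle-in-cell codes.  Not MASK; fields, charge,
      # mass and time step below are arbitrary illustrative values.
      import numpy as np

      def boris_push(x, v, q, m, E, B, dt):
          """Advance position x and velocity v by one time step dt."""
          qmdt2 = q * dt / (2.0 * m)
          v_minus = v + qmdt2 * E                  # first half electric kick
          t = qmdt2 * B                            # magnetic rotation vector
          s = 2.0 * t / (1.0 + np.dot(t, t))
          v_prime = v_minus + np.cross(v_minus, t)
          v_plus = v_minus + np.cross(v_prime, s)  # full magnetic rotation
          v_new = v_plus + qmdt2 * E               # second half electric kick
          x_new = x + v_new * dt
          return x_new, v_new

      if __name__ == "__main__":
          x = np.zeros(3)
          v = np.array([1.0e5, 0.0, 0.0])          # m/s, illustrative
          E = np.array([0.0, 1.0e3, 0.0])          # V/m, illustrative
          B = np.array([0.0, 0.0, 0.1])            # T, illustrative
          q, m, dt = -1.602e-19, 9.109e-31, 1.0e-12
          for _ in range(10):
              x, v = boris_push(x, v, q, m, E, B, dt)
          print("x =", x, "v =", v)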

  16. SYMMETRY, HAMILTONIAN PROBLEMS AND WAVELETS IN ACCELERATOR PHYSICS

    International Nuclear Information System (INIS)

    FEDOROVA, A.; ZEITLIN, M.; PARSA, Z.

    2000-01-01

    In this paper the authors consider applications of methods from wavelet analysis to nonlinear dynamical problems related to accelerator physics. In this approach they take into account underlying algebraical, geometrical and topological structures of corresponding problems

  17. Summary for astrophysics and non-accelerator physics

    International Nuclear Information System (INIS)

    Kahana, S.H.

    1988-01-01

    This paper summarizes the presentations at the astrophysics and non-accelerator physics conference. Discussed in this paper are: supernovae, neutrinos, x-rays, gamma rays, cosmic rays, monopoles and primordial nucleosynthesis. 15 refs

  18. Computer codes used in particle accelerator design: First edition

    International Nuclear Information System (INIS)

    1987-01-01

    This paper contains a listing of more than 150 programs that have been used in the design and analysis of accelerators. Each citation gives the person to contact, the classification of the computer code, publications describing the code, the computer and language the code runs on, and a short description of the code. Codes are indexed by subject, person to contact, and code acronym

  19. Proceedings of B Factories, the state of the art in accelerators, detectors and physics

    International Nuclear Information System (INIS)

    Hitlin, D.

    1992-11-01

    The conference B Factories, The State of the Art in Accelerators, Detectors and Physics was held at Stanford Linear Accelerator Center on April 6-10, 1992. The guiding principle of the conference was to bring together accelerator physicists and high energy experimentalists and theorists at the same time, with the goal of encouraging communication in defining and solving problems in a way which cut across narrow areas of specialization. Thus the conference was, in large measure, two distinct conferences, one involving accelerator specialists, the other theorists and experimentalists. There were initial and closing plenary sessions, and three separate tracks of parallel sessions, called Accelerator, Detector/Physics and Joint Interest sessions. This report contains the papers of this conference, the general topics of these cover: vacuum system, lattice design, beam-beam interactions, rf systems, feedback systems, measuring instrumentation, the interaction region, radiation background, particle detectors, particle tracking and identification, data acquisition, and computing system, and particle theory

  20. Proceedings of B Factories, the state of the art in accelerators, detectors and physics

    Energy Technology Data Exchange (ETDEWEB)

    Hitlin, D. (ed.) (California Inst. of Tech., Pasadena, CA (United States))

    1992-11-01

    The conference B Factories, The State of the Art in Accelerators, Detectors and Physics was held at Stanford Linear Accelerator Center on April 6-10, 1992. The guiding principle of the conference was to bring together accelerator physicists and high energy experimentalists and theorists at the same time, with the goal of encouraging communication in defining and solving problems in a way which cut across narrow areas of specialization. Thus the conference was, in large measure, two distinct conferences, one involving accelerator specialists, the other theorists and experimentalists. There were initial and closing plenary sessions, and three separate tracks of parallel sessions, called Accelerator, Detector/Physics and Joint Interest sessions. This report contains the papers of this conference, the general topics of these cover: vacuum system, lattice design, beam-beam interactions, rf systems, feedback systems, measuring instrumentation, the interaction region, radiation background, particle detectors, particle tracking and identification, data acquisition, and computing system, and particle theory.

  1. The individual health physics at Saclay accelerators

    International Nuclear Information System (INIS)

    Brochen, J.C.; Delsaut, R.; Drouet, J.; Vialettes, H.; Zerbib, J.C.

    1981-11-01

    After giving a brief description of the Saturne synchrotron and the linear accelerator located on the Saclay site, the risks of irradiation in operation and on shut-down are reviewed and a description is given of the arrangements made for protection against radiation such as the shielding, access safety and the central monitoring of radiation. The irradiation statistics for the last few years are given [fr

  2. GPU-accelerated micromagnetic simulations using cloud computing

    International Nuclear Information System (INIS)

    Jermain, C.L.; Rowlands, G.E.; Buhrman, R.A.; Ralph, D.C.

    2016-01-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics. - Highlights: • The benefits of cloud computing for GPU-accelerated micromagnetics are examined. • We present the MuCloud software for running simulations on cloud computing. • Simulation run times are measured to benchmark cloud computing performance. • Comparison benchmarks are analyzed between CPU and GPU based solvers.

  3. GPU-accelerated micromagnetic simulations using cloud computing

    Energy Technology Data Exchange (ETDEWEB)

    Jermain, C.L., E-mail: clj72@cornell.edu [Cornell University, Ithaca, NY 14853 (United States); Rowlands, G.E.; Buhrman, R.A. [Cornell University, Ithaca, NY 14853 (United States); Ralph, D.C. [Cornell University, Ithaca, NY 14853 (United States); Kavli Institute at Cornell, Ithaca, NY 14853 (United States)

    2016-03-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics. - Highlights: • The benefits of cloud computing for GPU-accelerated micromagnetics are examined. • We present the MuCloud software for running simulations on cloud computing. • Simulation run times are measured to benchmark cloud computing performance. • Comparison benchmarks are analyzed between CPU and GPU based solvers.
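
    A minimal sketch of the remote-execution idea in these two records is shown below, assuming an SSH-reachable cloud node with a GPU and a mumax3 binary on its PATH; the host name, remote paths and command line are placeholder assumptions, and this is not the authors' MuCloud package.

      # Hedged sketch: run a MuMax3 script on a GPU cloud node over SSH and fetch
      # the output table.  This is NOT the MuCloud tool described above; the host,
      # remote paths and command line are placeholder assumptions.
      import subprocess
      from pathlib import Path

      HOST = "gpu-node.example.com"        # hypothetical cloud instance
      REMOTE_DIR = "/tmp/mumax_run"        # hypothetical scratch directory

      def run_remote_mumax(script: Path) -> Path:
          """Copy a .mx3 script to the node, run mumax3 there, fetch table.txt."""
          subprocess.run(["ssh", HOST, "mkdir", "-p", REMOTE_DIR], check=True)
          subprocess.run(["scp", str(script), f"{HOST}:{REMOTE_DIR}/"], check=True)
          # mumax3 is assumed to write its results into <script>.out/ next to the input
          subprocess.run(
              ["ssh", HOST, f"cd {REMOTE_DIR} && mumax3 {script.name}"], check=True
          )
          out_table = Path(script.stem + "_table.txt")
          subprocess.run(
              ["scp", f"{HOST}:{REMOTE_DIR}/{script.stem}.out/table.txt", str(out_table)],
              check=True,
          )
          return out_table

      if __name__ == "__main__":
          table = run_remote_mumax(Path("standard_problem4.mx3"))
          print("fetched", table)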

  4. Mathematics, Physics and Computer Sciences The computation of ...

    African Journals Online (AJOL)

    Mathematics, Physics and Computer Sciences: the computation of system matrices for biquadratic square finite elements. Global Journal of Pure and Applied Sciences.

  5. Accelerator physics analysis with an integrated toolkit

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.; Satogata, T.

    1992-08-01

    Work is in progress on an integrated software toolkit for linear and nonlinear accelerator design, analysis, and simulation. As a first application, the ''beamline'' and ''MXYZPTLK'' (differential algebra) class libraries were used with an X Windows graphics library to build a user-friendly, interactive phase-space tracker which, additionally, finds periodic orbits. This program was used to analyse a theoretical lattice containing octupoles and decapoles, to find the 20th-order stable and unstable periodic orbits, and to explore the local phase-space structure
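
    The kind of phase-space study mentioned in this record can be sketched with a toy one-turn map (a linear rotation composed with a single sextupole-like kick) and a small Newton search for periodic orbits. This is only an illustrative stand-in for the beamline/MXYZPTLK toolkit; the tune and kick strength are arbitrary.

      # Hedged sketch of phase-space tracking and periodic-orbit finding with a toy
      # one-turn map (sextupole-like kick followed by a rotation).  This stands in
      # for, and is not, the beamline/MXYZPTLK toolkit; tune and kick are arbitrary.
      import numpy as np

      MU = 2.0 * np.pi * 0.2057   # phase advance per turn (arbitrary tune)
      K2 = 1.0                    # sextupole-like kick strength (arbitrary)

      def one_turn(z):
          """Apply kick then rotate; z = (x, p)."""
          x, p = z
          p = p - K2 * x * x
          c, s = np.cos(MU), np.sin(MU)
          return np.array([c * x + s * p, -s * x + c * p])

      def map_n(z, n):
          for _ in range(n):
              z = one_turn(z)
          return z

      def periodic_orbit(z0, period, iters=50, eps=1.0e-8):
          """Newton search for a fixed point of the period-'period' map.

          Convergence depends on the starting guess; this is only a sketch."""
          z = np.array(z0, dtype=float)
          for _ in range(iters):
              f = map_n(z, period) - z
              J = np.zeros((2, 2))          # numerical Jacobian of map minus identity
              for j in range(2):
                  dz = np.zeros(2); dz[j] = eps
                  J[:, j] = (map_n(z + dz, period) - map_n(z, period)) / eps
              J -= np.eye(2)
              z = z - np.linalg.solve(J, f)
          return z

      if __name__ == "__main__":
          print("after 100 turns from (0.1, 0):", map_n(np.array([0.1, 0.0]), 100))
          z5 = periodic_orbit([0.4, 0.0], 5)
          print("candidate period-5 orbit:", z5,
                "residual:", np.linalg.norm(map_n(z5, 5) - z5))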

  6. High energy physics and grid computing

    International Nuclear Information System (INIS)

    Yu Chuansong

    2004-01-01

    The status of the new generation computing environment of the high energy physics experiments is introduced briefly in this paper. The development of the high energy physics experiments and the new computing requirements by the experiments are presented. The blueprint of the new generation computing environment of the LHC experiments, the history of the Grid computing, the R and D status of the high energy physics grid computing technology, the network bandwidth needed by the high energy physics grid and its development are described. The grid computing research in Chinese high energy physics community is introduced at last. (authors)

  7. Physics of high energy particle accelerators. AIP conference proceedings No. 127

    International Nuclear Information System (INIS)

    Month, M.; Dahl, P.F.; Dienes, M.

    1985-01-01

    Topics covered in this workshop include accelerator physics, particle physics, and new acceleration methods. Eighteen lectures were presented. Individual abstracts were prepared separately for the data base

  8. The HL-LHC accelerator physics challenges

    CERN Document Server

    Fartoukh, S

    2014-01-01

    We review the conceptual baseline of the HL-LHC project, putting into perspective the main beam physics challenges of this new collider in comparison with the existing LHC, and the series of solutions and possible mitigation measures presently envisaged.

  9. The HL-LHC Accelerator Physics Challenges

    Science.gov (United States)

    Fartoukh, S.; Zimmermann, F.

    The conceptual baseline of the HL-LHC project is reviewed, putting into perspective the main beam physics challenges of this new collider in comparison with the existing LHC, and the series of solutions and possible mitigation measures presently envisaged.

  10. The HL-LHC accelerator physics challenges

    CERN Document Server

    Fartoukh, S

    2015-01-01

    The conceptual baseline of the HL-LHC project is reviewed, putting into perspective the main beam physics challenges of this new collider in comparison with the existing LHC, and the series of solutions and possible mitigation measures presently envisaged.

  11. New computing techniques in physics research

    International Nuclear Information System (INIS)

    Becks, Karl-Heinz; Perret-Gallix, Denis

    1994-01-01

    New techniques were highlighted by the ''Third International Workshop on Software Engineering, Artificial Intelligence and Expert Systems for High Energy and Nuclear Physics'' in Oberammergau, Bavaria, Germany, from October 4 to 8. It was the third workshop in the series; the first was held in Lyon in 1990 and the second at France-Telecom site near La Londe les Maures in 1992. This series of workshops covers a broad spectrum of problems. New, highly sophisticated experiments demand new techniques in computing, in hardware as well as in software. Software Engineering Techniques could in principle satisfy the needs for forthcoming accelerator experiments. The growing complexity of detector systems demands new techniques in experimental error diagnosis and repair suggestions; Expert Systems seem to offer a way of assisting the experimental crew during data-taking

  12. The impact of new computer technology on accelerator control

    International Nuclear Information System (INIS)

    Theil, E.; Jacobson, V.; Paxson, V.

    1987-01-01

    This paper describes some recent developments in computing and stresses their application to accelerator control systems. Among the advances that promise to have a significant impact are: i) low cost scientific workstations; ii) the use of ''windows'', pointing devices and menus in a multitasking operating system; iii) high resolution large-screen graphics monitors; iv) new kinds of high bandwidth local area networks. The relevant features are related to a general accelerator control system. For example, the authors examine the implications of a computing environment which permits and encourages graphical manipulation of system components, rather than traditional access through the writing of programs or ''canned'' access via touch panels

  13. The impact of new computer technology on accelerator control

    International Nuclear Information System (INIS)

    Theil, E.; Jacobson, V.; Paxson, V.

    1987-04-01

    This paper describes some recent developments in computing and stresses their application in accelerator control systems. Among the advances that promise to have a significant impact are (1) low cost scientific workstations; (2) the use of ''windows'', pointing devices and menus in a multi-tasking operating system; (3) high resolution large-screen graphics monitors; (4) new kinds of high bandwidth local area networks. The relevant features are related to a general accelerator control system. For example, this paper examines the implications of a computing environment which permits and encourages graphical manipulation of system components, rather than traditional access through the writing of programs or ''canned'' access via touch panels

  14. A survey of computational physics introductory computational science

    CERN Document Server

    Landau, Rubin H; Bordeianu, Cristian C

    2008-01-01

    Computational physics is a rapidly growing subfield of computational science, in large part because computers can solve previously intractable problems or simulate natural processes that do not have analytic solutions. The next step beyond Landau's First Course in Scientific Computing and a follow-up to Landau and Páez's Computational Physics, this text presents a broad survey of key topics in computational physics for advanced undergraduates and beginning graduate students, including new discussions of visualization tools, wavelet analysis, molecular dynamics, and computational fluid dynamics

  15. Distributed computer controls for accelerator systems

    International Nuclear Information System (INIS)

    Moore, T.L.

    1988-09-01

    A distributed control system has been designed and installed at the Lawrence Livermore National Laboratory Multi-user Tandem Facility using an extremely modular approach in hardware and software. The two-tiered, geographically organized design allowed total system implementation within four months with a computer and instrumentation cost of approximately $100K. Since the system structure is modular, application to a variety of facilities is possible. Such a system allows rethinking of the operational style of the facilities, making possible highly reproducible and unattended operation. The impact of industry standards, i.e., UNIX, CAMAC, and IEEE-802.3, and the use of a graphics-oriented controls software suite allowed the efficient implementation of the system. The definition, design, implementation, operation and total system performance will be discussed. 3 refs

  16. Distributed computer controls for accelerator systems

    Science.gov (United States)

    Moore, T. L.

    1989-04-01

    A distributed control system has been designed and installed at the Lawrence Livermore National Laboratory Multiuser Tandem Facility using an extremely modular approach in hardware and software. The two-tiered, geographically organized design allowed total system implementation within four months with a computer and instrumentation cost of approximately $100k. Since the system structure is modular, application to a variety of facilities is possible. Such a system allows rethinking of the operational style of the facilities, making possible highly reproducible and unattended operation. The impact of industry standards, i.e., UNIX, CAMAC, and IEEE-802.3, and the use of a graphics-oriented controls software suite allowed the effective implementation of the system. The definition, design, implementation, operation and total system performance will be discussed.

  17. Distributed computer controls for accelerator systems

    International Nuclear Information System (INIS)

    Moore, T.L.

    1989-01-01

    A distributed control system has been designed and installed at the Lawrence Livermore National Laboratory Multiuser Tandem Facility using an extremely modular approach in hardware and software. The two-tiered, geographically organized design allowed total system implementation within four months with a computer and instrumentation cost of approximately $100k. Since the system structure is modular, application to a variety of facilities is possible. Such a system allows rethinking of the operational style of the facilities, making possible highly reproducible and unattended operation. The impact of industry standards, i.e., UNIX, CAMAC, and IEEE-802.3, and the use of a graphics-oriented controls software suite allowed the effective implementation of the system. The definition, design, implementation, operation and total system performance will be discussed. (orig.)

  18. DEVELOPING THE PHYSICS DESIGN FOR NDCX-II, A UNIQUE PULSE-COMPRESSING ION ACCELERATOR

    International Nuclear Information System (INIS)

    Friedman, A.; Barnard, J.J.; Cohen, R.H.; Grote, D.P.; Lund, S.M.; Sharp, W.M.; Faltens, A.; Henestroza, E.; Jung, J.-Y.; Kwan, J.W.; Lee, E.P.; Leitner, M.A.; Logan, B.G.; Vay, J.-L.; Waldron, W.L.; Davidson, R.C.; Dorf, M.; Gilson, E.P.; Kaganovich, I.

    2009-01-01

    The Heavy Ion Fusion Science Virtual National Laboratory (a collaboration of LBNL, LLNL, and PPPL) is using intense ion beams to heat thin foils to the 'warm dense matter' regime. The NDCX-II machine will compress a pulse of ions to ∼1 ns while accelerating it to 3-4 MeV over ∼15 m. Strong space charge forces are incorporated into the machine design at a fundamental level. We are using analysis, an interactive 1D PIC code (ASP) with optimizing capabilities and centroid tracking, and multi-dimensional Warp-code PIC simulations to develop the NDCX-II accelerator. This paper describes the computational models employed and the resulting physics design for the accelerator.
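
    A hedged, purely kinematic sketch of the drift-compression idea (an imposed head-to-tail velocity tilt followed by a ballistic drift that shortens the bunch) is given below. It is not the ASP or Warp model used for the actual NDCX-II design, and all numbers are illustrative.

      # Hedged sketch of drift-compression kinematics: a bunch is given a
      # head-to-tail velocity tilt and drifts ballistically, so its rms length
      # shrinks to a minimum.  Not the ASP/Warp models; all numbers illustrative.
      import numpy as np

      def compression_history(n=20000, bunch_len=0.5, v0=1.0e6, tilt=0.05, t_max=12e-6):
          rng = np.random.default_rng(0)
          z = rng.uniform(-bunch_len / 2, bunch_len / 2, n)    # m
          # tail (z < 0) moves faster than head (z > 0): linear velocity tilt
          v = v0 * (1.0 - tilt * z / (bunch_len / 2))
          times = np.linspace(0.0, t_max, 200)
          rms = np.array([np.std(z + v * t) for t in times])
          return times, rms

      if __name__ == "__main__":
          t, rms = compression_history()
          i = int(np.argmin(rms))
          print(f"initial rms length {rms[0]*100:.1f} cm -> "
                f"minimum {rms[i]*100:.2f} cm at t = {t[i]*1e6:.2f} us")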

  19. High energy physics and cloud computing

    International Nuclear Information System (INIS)

    Cheng Yaodong; Liu Baoxu; Sun Gongxing; Chen Gang

    2011-01-01

    High Energy Physics (HEP) has been a strong promoter of computing technology, for example the WWW (World Wide Web) and grid computing. In the new era of cloud computing, HEP still has strong demands, and major international high energy physics laboratories have launched a number of projects to research cloud computing technologies and applications. This paper describes the current developments in cloud computing and its applications in high energy physics. Some ongoing projects in the institutes of high energy physics, Chinese Academy of Sciences, including cloud storage, virtual computing clusters, and BESⅢ elastic cloud, are also described briefly in the paper. (authors)

  20. Towards C++ object libraries for accelerator physics

    International Nuclear Information System (INIS)

    Michelotti, L.

    1993-01-01

    We must write robust, flexible, portable software that is easier to understand, maintain, modify, reuse, and extend. These attributes are more than mere buzzwords. They are important goals that computer scientists strive to achieve by refining their programming models and devising languages to support them. An important breakthrough was achieved by the introduction of object oriented programming (OOP) as the computing model behind such languages as Smalltalk, ADA, Eiffel, Objective-C, and C++. OOP is not a ''fad'': it is arguably the most significant development in programming since the invention of FORTRAN, and it is the way that the best software will be written well into the next century. An ''object'' comprises structures of data, the functions that manipulate them, and rules for bringing them into and out of scope. OOP is a methodology for realizing and fully utilizing this abstract concept, an extension to programming of the basic technique that has advanced mathematics for centuries
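
    The object-oriented idea advocated in this record can be illustrated in a few lines of Python (rather than the C++ libraries the author describes): each beamline element is an object that carries its own transfer matrix and knows how to propagate a particle state. The element classes and parameters below are purely illustrative.

      # Hedged illustration of the object-oriented style discussed above, written
      # in Python rather than C++.  Each beamline element is an object carrying
      # its own 2x2 transfer matrix; element parameters are arbitrary examples.
      import numpy as np

      class Element:
          def matrix(self):
              raise NotImplementedError
          def propagate(self, state):
              return self.matrix() @ state

      class Drift(Element):
          def __init__(self, length):
              self.length = length
          def matrix(self):
              return np.array([[1.0, self.length], [0.0, 1.0]])

      class ThinQuad(Element):
          def __init__(self, focal_length):
              self.f = focal_length
          def matrix(self):
              return np.array([[1.0, 0.0], [-1.0 / self.f, 1.0]])

      class Beamline(Element):
          def __init__(self, elements):
              self.elements = list(elements)
          def matrix(self):
              m = np.eye(2)
              for el in self.elements:          # later elements multiply on the left
                  m = el.matrix() @ m
              return m

      if __name__ == "__main__":
          fodo = Beamline([ThinQuad(+2.0), Drift(1.0), ThinQuad(-2.0), Drift(1.0)])
          print("one-cell matrix:\n", fodo.matrix())
          print("(x, x') after cell:", fodo.propagate(np.array([1.0e-3, 0.0])))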

  1. Computer simulations of compact toroid formation and acceleration

    International Nuclear Information System (INIS)

    Peterkin, R.E. Jr.; Sovinec, C.R.

    1990-01-01

    Experiments to form, accelerate, and focus compact toroid plasmas will be performed on the 9.4 MJ SHIVA STAR fast capacitor bank at the Air Force Weapons Laboratory during 1990. The MARAUDER (magnetically accelerated rings to achieve ultrahigh directed energy and radiation) program is a research effort to accelerate magnetized plasma rings with masses between 0.1 and 1.0 mg to velocities above 10^8 cm/sec and energies above 1 MJ. Research on these high-velocity compact toroids may lead to the development of very fast opening switches, high-power microwave sources, and an alternative path to inertial confinement fusion. Design of a compact toroid accelerator experiment on the SHIVA STAR capacitor bank is underway, and computer simulations with the 2 1/2-dimensional magnetohydrodynamics code, MACH2, have been performed to guide this endeavor. The compact toroids are produced in a magnetized coaxial plasma gun, and the acceleration will occur in a configuration similar to a coaxial railgun. Detailed calculations of the formation and equilibration of a low-beta magnetic force-free configuration (curl B = kB) have been performed with MACH2. In this paper, the authors discuss computer simulations of the focusing and acceleration of the toroid

  2. Towards the petascale in electromagnetic modeling of plasma-based accelerators for high-energy physics

    International Nuclear Information System (INIS)

    Bruhwiler, D L; Antonsen, T; Cary, J R; Cooley, J; Decyk, V K; Esarey, E; Geddes, C G R; Huang, C; Hakim, A; Katsouleas, T; Messmer, P; Mori, W B; Tsung, F S; Vieira, J; Zhou, M

    2006-01-01

    Plasma-based lepton acceleration concepts are a key element of the long-term R and D portfolio for the U.S. Office of High Energy Physics. There are many such concepts, but we consider only the laser (LWFA) and plasma (PWFA) wakefield accelerators. We present a summary of electromagnetic particle-in-cell (PIC) simulations for recent LWFA and PWFA experiments. These simulations, including both time explicit algorithms and reduced models, have effectively used terascale computing resources to support and guide experiments in this rapidly developing field. We briefly discuss the challenges and opportunities posed by the near-term availability of petascale computing hardware

  3. The computer-based control system of the NAC accelerator

    International Nuclear Information System (INIS)

    Burdzik, G.F.; Bouckaert, R.F.A.; Cloete, I.; Du Toit, J.S.; Kohler, I.H.; Truter, J.N.J.; Visser, K.

    1982-01-01

    The National Accelerator Centre (NAC) of the CSIR is building a two-stage accelerator which will provide charged-particle beams for the use in medical and research applications. The control system for this accelerator is based on three mini-computers and a CAMAC interfacing network. Closed-loop control is being relegated to the various subsystems of the accelerators, and the computers and CAMAC network will be used in the first instance for data transfer, monitoring and servicing of the control consoles. The processing power of the computers will be utilized for automating start-up and beam-change procedures, for providing flexible and convenient information at the control consoles, for fault diagnosis and for beam-optimizing procedures. Tasks of a localized or dedicated nature are being off-loaded onto microcomputers, which are being used either in front-end devices or as slaves to the mini-computers. On the control consoles only a few instruments for setting and monitoring variables are being provided, but these instruments are universally-linkable to any appropriate machine variable

  4. Applications of the ARGUS code in accelerator physics

    International Nuclear Information System (INIS)

    Petillo, J.J.; Mankofsky, A.; Krueger, W.A.; Kostas, C.; Mondelli, A.A.; Drobot, A.T.

    1993-01-01

    ARGUS is a three-dimensional, electromagnetic, particle-in-cell (PIC) simulation code that is being distributed to U.S. accelerator laboratories in a collaboration between SAIC and the Los Alamos Accelerator Code Group. It uses a modular architecture that allows multiple physics modules to share common utilities for grid and structure input, memory management, disk I/O, and diagnostics. Physics modules are in place for electrostatic and electromagnetic field solutions, frequency-domain (eigenvalue) solutions, time-dependent PIC, and steady-state PIC simulations. All of the modules are implemented with a domain-decomposition architecture that allows large problems to be broken up into pieces that fit in core and that facilitates the adaptation of ARGUS for parallel processing. ARGUS operates on either Cray or workstation platforms, and a MOTIF-based user interface is available for X-windows terminals. Applications of ARGUS in accelerator physics and design are described in this paper

  5. Tracking and vertexing for B physics at hadron accelerators

    International Nuclear Information System (INIS)

    Johnson, R.; Purohit, M.; Weidemann, A.W.

    1993-01-01

    In this note, the authors report on some of the activities of the Tracking and Vertexing Working Group of this Workshop. Track and vertex finding is essential to exploit the high production rate of B-mesons at hadron accelerators, both for triggering and analysis. Here, they review the tracking and vertex-finding systems of some of the major existing and proposed collider and fixed-target experiments at existing and future hadron accelerators, with a view towards their usefulness for B-physics. The capabilities of both general-purpose detectors and those of dedicated B-physics experiments are considered

  6. Accelerating Climate and Weather Simulations through Hybrid Computing

    Science.gov (United States)

    Zhou, Shujia; Cruz, Carlos; Duffy, Daniel; Tucker, Robert; Purcell, Mark

    2011-01-01

    Unconventional multi- and many-core processors (e.g. IBM (R) Cell B.E.(TM) and NVIDIA (R) GPU) have emerged as effective accelerators in trial climate and weather simulations. Yet these climate and weather models typically run on parallel computers with conventional processors (e.g. Intel, AMD, and IBM) using Message Passing Interface. To address challenges involved in efficiently and easily connecting accelerators to parallel computers, we investigated using IBM's Dynamic Application Virtualization (TM) (IBM DAV) software in a prototype hybrid computing system with representative climate and weather model components. The hybrid system comprises two Intel blades and two IBM QS22 Cell B.E. blades, connected with both InfiniBand(R) (IB) and 1-Gigabit Ethernet. The system significantly accelerates a solar radiation model component by offloading compute-intensive calculations to the Cell blades. Systematic tests show that IBM DAV can seamlessly offload compute-intensive calculations from Intel blades to Cell B.E. blades in a scalable, load-balanced manner. However, noticeable communication overhead was observed, mainly due to IP over the IB protocol. Full utilization of IB Sockets Direct Protocol and the lower latency production version of IBM DAV will reduce this overhead.

  7. Workshop on physics at future accelerators

    International Nuclear Information System (INIS)

    1987-01-01

    A workshop took place at La Thuile and at CERN in January 1987 to study the physics potential of three types of particle collider with energies in the TeV region, together with the feasibility of experiments with them. The machines were: a large hadron collider (LHC) placed in the LEP tunnel at CERN, with a total proton-proton centre-of-mass energy of about 16 TeV; an electron-proton collider, using the LHC and LEP, with a centre-of-mass energy in the range 1.3 TeV to 1.8 TeV; and an electron-positron linear collider with centre-of-mass energy about 2 TeV. The summary talks given at CERN by the conveners of the study groups are contained in volume I of the proceedings; the present volume is devoted to the contributions from the participants at La Thuile. These give details of the studies carried out on the new phenomena which might be observed at the new machines and the technical feasibility of possible experiments. (orig.)

  8. Applied Physics Research at the Idaho Accelerator Center

    International Nuclear Information System (INIS)

    Date, D. S.; Hunt, A. W.; Chouffani, K.; Wells, D. P.

    2011-01-01

    The Idaho Accelerator Center, founded in 1996 and based at Idaho State University, supports research, education, and high-technology economic development in the United States. The research center currently has eight electron linear accelerators ranging in energy from 6 to 44 MeV, with the latter capable of picosecond pulses, a 2 MeV positive-ion Van de Graaff, a 4 MV NEC tandem Pelletron, and a pulsed-power 8 kA, 10 MeV electron induction accelerator. Current research emphases include accelerator physics research, accelerator-based medical isotope production, active interrogation techniques for homeland security and nuclear nonproliferation applications, non-destructive testing and materials science studies in support of industry, as well as the development of advanced nuclear fuels, pure and applied radiobiology, and medical physics. This talk will highlight three of these areas, including the production of the isotopes 99Tc and 67Cu for medical diagnostics and therapy, as well as two new technologies currently under development for nuclear safeguards and homeland security - namely laser Compton scattering and the polarized photofission of actinides

  9. Accelerating Neuroimage Registration through Parallel Computation of Similarity Metric.

    Directory of Open Access Journals (Sweden)

    Yun-Gang Luo

    Neuroimage registration is crucial for brain morphometric analysis and treatment efficacy evaluation. However, existing advanced registration algorithms such as FLIRT and ANTs are not efficient enough for clinical use. In this paper, a GPU implementation of FLIRT with the correlation ratio (CR) as the similarity metric and a GPU-accelerated correlation coefficient (CC) calculation for the symmetric diffeomorphic registration of ANTs have been developed. The comparison with their corresponding original tools shows that our accelerated algorithms can greatly outperform the original algorithms in terms of computational efficiency. This paper demonstrates the great potential of applying these registration tools in clinical applications.
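
    A hedged sketch of the similarity-metric computation that dominates registration cost is shown below: the correlation coefficient between a fixed and a moving image written as a handful of vectorized reductions, which is exactly the kind of data-parallel sum that maps well onto a GPU. The images are random placeholders and this is not the authors' CUDA implementation.

      # Hedged sketch: correlation-coefficient (CC) similarity between two images
      # written as a few large reductions, the data-parallel pattern that the GPU
      # version described above exploits.  Not the authors' CUDA code; the images
      # are random placeholders.
      import numpy as np

      def correlation_coefficient(fixed, moving):
          """Pearson correlation between two same-shaped images."""
          f = fixed.astype(np.float64).ravel()
          m = moving.astype(np.float64).ravel()
          f -= f.mean()
          m -= m.mean()
          denom = np.sqrt((f * f).sum() * (m * m).sum())
          return float((f * m).sum() / denom)

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          fixed = rng.normal(size=(128, 128, 96))           # placeholder volume
          moving = 0.8 * fixed + 0.2 * rng.normal(size=fixed.shape)
          print("CC =", correlation_coefficient(fixed, moving))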

  10. Future Accelerator Challenges in Support of High-Energy Physics

    International Nuclear Information System (INIS)

    Zisman, Michael S.; Zisman, M.S.

    2008-01-01

    Historically, progress in high-energy physics has largely been determined by development of more capable particle accelerators. This trend continues today with the imminent commissioning of the Large Hadron Collider at CERN, and the worldwide development effort toward the International Linear Collider. Looking ahead, there are two scientific areas ripe for further exploration--the energy frontier and the precision frontier. To explore the energy frontier, two approaches toward multi-TeV beams are being studied, an electron-positron linear collider based on a novel two-beam powering system (CLIC), and a Muon Collider. Work on the precision frontier involves accelerators with very high intensity, including a Super-BFactory and a muon-based Neutrino Factory. Without question, one of the most promising approaches is the development of muon-beam accelerators. Such machines have very high scientific potential, and would substantially advance the state-of-the-art in accelerator design. The challenges of the new generation of accelerators, and how these can be accommodated in the accelerator design, are described. To reap their scientific benefits, all of these frontier accelerators will require sophisticated instrumentation to characterize the beam and control it with unprecedented precision

  11. Future Accelerator Challenges in Support of High-Energy Physics

    Energy Technology Data Exchange (ETDEWEB)

    Zisman, Michael S.; Zisman, M.S.

    2008-05-03

    Historically, progress in high-energy physics has largely been determined by development of more capable particle accelerators. This trend continues today with the imminent commissioning of the Large Hadron Collider at CERN, and the worldwide development effort toward the International Linear Collider. Looking ahead, there are two scientific areas ripe for further exploration--the energy frontier and the precision frontier. To explore the energy frontier, two approaches toward multi-TeV beams are being studied, an electron-positron linear collider based on a novel two-beam powering system (CLIC), and a Muon Collider. Work on the precision frontier involves accelerators with very high intensity, including a Super-BFactory and a muon-based Neutrino Factory. Without question, one of the most promising approaches is the development of muon-beam accelerators. Such machines have very high scientific potential, and would substantially advance the state-of-the-art in accelerator design. The challenges of the new generation of accelerators, and how these can be accommodated in the accelerator design, are described. To reap their scientific benefits, all of these frontier accelerators will require sophisticated instrumentation to characterize the beam and control it with unprecedented precision.

  12. A physical process of the radial acceleration of disc galaxies

    Science.gov (United States)

    Wilhelm, Klaus; Dwivedi, Bhola N.

    2018-03-01

    An impact model of gravity designed to emulate Newton's law of gravitation is applied to the radial acceleration of disc galaxies. Based on this model (Wilhelm et al. 2013), the rotation velocity curves can be understood without the need to postulate any dark matter contribution. The increased acceleration in the plane of the disc is a consequence of multiple interactions of gravitons (called `quadrupoles' in the original paper) and the subsequent propagation in this plane and not in three-dimensional space. The concept provides a physical process that relates the fit parameter of the acceleration scale defined by McGaugh et al. (2016) to the mean free path length of gravitons in the discs of galaxies. It may also explain the gravitational interaction at low acceleration levels in MOdification of the Newtonian Dynamics (MOND, Milgrom 1983, 1994, 2015, 2016). Three examples are discussed in some detail: the spiral galaxies NGC 7814, NGC 6503 and M 33.
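
    For orientation, the acceleration-scale fit referred to above can be written down in a few lines using the fitting function published by McGaugh et al. (2016) with an acceleration scale of about 1.2e-10 m/s^2; this sketch is not part of the impact-model calculation by Wilhelm and Dwivedi.

      # Hedged sketch of the radial acceleration relation of McGaugh et al. (2016),
      # the fit whose acceleration-scale parameter the impact model above seeks to
      # give a physical meaning.  Not the impact-model calculation itself.
      import numpy as np

      G_DAGGER = 1.2e-10   # acceleration scale from McGaugh et al. (2016), m/s^2

      def g_observed(g_baryonic):
          """Observed radial acceleration predicted from the baryonic one."""
          g_bar = np.asarray(g_baryonic, dtype=float)
          return g_bar / (1.0 - np.exp(-np.sqrt(g_bar / G_DAGGER)))

      if __name__ == "__main__":
          g_bar = np.logspace(-12, -9, 7)          # m/s^2, spanning the galaxy data
          for gb, go in zip(g_bar, g_observed(g_bar)):
              print(f"g_bar = {gb:.2e}  ->  g_obs = {go:.2e}  (ratio {go/gb:.2f})")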

  13. CAS Accelerator Physics (High-Power Hadron Machines) in Spain

    CERN Multimedia

    CAS

    2011-01-01

    The CERN Accelerator School (CAS) and ESS-Bilbao jointly organised a specialised course on High-Power Hadron Machines, held at the Hotel Barceló Nervión in Bilbao, Spain, from 24 May to 2 June, 2011.   CERN Accelerator School students. After recapitulation lectures on the essentials of accelerator physics and review lectures on the different types of accelerators, the programme focussed on the challenges of designing and operating high-power facilities. The particular problems for RF systems, beam instrumentation, vacuum, cryogenics, collimators and beam dumps were examined. Activation of equipment, radioprotection and remote handling issues were also addressed. The school was very successful, with 69 participants of 22 nationalities. Feedback from the participants was extremely positive, praising the expertise and enthusiasm of the lecturers, as well as the high standard and excellent quality of their lectures. In addition to the academic programme, the participants w...

  14. Handling and Transport of Oversized Accelerator Components and Physics Detectors

    CERN Document Server

    Prodon, S; Guinchard, M; Minginette, P

    2006-01-01

    For cost, planning and organisational reasons, it is often decided to install large pre-built accelerator components and physics detectors. As a result, exceptional surface transports are required from the construction sites to the installation sites. Such heavy transports have been numerous during the LHC installation phase. This paper will describe the different types of transport techniques used to fit the particularities of accelerator and detector components (weight, height, acceleration, planarity) as well as the measurement techniques for monitoring and the logistical aspects (organisation with the police, obstacles on the roads, etc). As far as oversized equipment is concerned, the lowering into the pit is challenging, as well as the transport in tunnel galleries in very scarce space and without handling means attached to the structure, such as overhead travelling cranes. From the PS accelerator to the LHC, handling systems have been developed at CERN to fit these particular working conditions. This pap...

  15. CEBAF: A superconducting radio frequency accelerator for nuclear physics

    International Nuclear Information System (INIS)

    Hartline, B.K.

    1988-01-01

    The Continuous Electron Beam Accelerator Facility (CEBAF) will be a 4-GeV, 200-μA superconducting recirculating linear accelerator to provide CW electron beams to simultaneous nuclear physics experiments in three end stations. Funded by the Department of Energy, CEBAF's purpose is basic research on the nuclear many-body system, its quark substructure, and the strong and electroweak interactions governing this form of matter. At the heart of the accelerator are niobium superconducting accelerating cavities designed at Cornell University and successfully prototyped with industry during the past three years. The cavities consistently exceed CEBAF's performance specifications (gradient ≥ 5 MV/m, Q0 ≥ 2.4 × 10^9 at 2 K and 5 MV/m). Construction is under way, and operation is scheduled in 1994. 26 refs., 9 figs., 3 tabs

  16. CAS CERN accelerator school: 5. general accelerator physics course. Vol. 2. Proceedings

    International Nuclear Information System (INIS)

    Turner, S.

    1994-01-01

    The fifth CERN Accelerator School (CAS) basic course on General Accelerator Physics was given at the University of Jyvaeskylae, Finland, from 7 to 18 September 1992. Its syllabus was based on the previous similar courses held at Gif-sur-Yvette in 1984, Aarhus 1986, Salamanca 1988 and Juelich 1990, and whose proceedings were published as CERN Reports 85-19, 87-10, 89-05 and 91-04, respectively. However, certain topics were treated in a different way, improved or extended, while new subjects were introduced. As far as the proceedings of this school are concerned the opportunity was taken not only to include the lectures presented but also to select and revise the most appropriate chapters from the previous similar schools. In this way the present volumes constitute a rather complete introduction to all aspects of the design and construction of particle accelerators, including optics, emittance, luminosity, longitudinal and transverse beam dynamics, insertions, chromaticity, transfer lines, resonances, accelerating structures, tune shifts, coasting beams, lifetime, synchrotron radiation, radiation damping, beam-beam effects, diagnostics, cooling, ion and positron sources, RF and vacuum systems, injection and extraction, conventional, permanent and superconducting magnets, cyclotrons, RF linear accelerators, microtrons, as well as applications of particle accelerators (including therapy) and the history of accelerators. See hints under the relevant topics. (orig.)

  17. Quantum computing accelerator I/O : LDRD 52750 final report

    International Nuclear Information System (INIS)

    Schroeppel, Richard Crabtree; Modine, Normand Arthur; Ganti, Anand; Pierson, Lyndon George; Tigges, Christopher P.

    2003-01-01

    In a superposition of quantum states, a bit can be in both the states '0' and '1' at the same time. This feature of the quantum bit or qubit has no parallel in classical systems. Currently, quantum computers consisting of 4 to 7 qubits in a 'quantum computing register' have been built. Innovative algorithms suited to quantum computing are now beginning to emerge, applicable to sorting and cryptanalysis, and other applications. A framework for overcoming slightly inaccurate quantum gate interactions and for causing quantum states to survive interactions with surrounding environment is emerging, called quantum error correction. Thus there is the potential for rapid advances in this field. Although quantum information processing can be applied to secure communication links (quantum cryptography) and to crack conventional cryptosystems, the first few computing applications will likely involve a 'quantum computing accelerator' similar to a 'floating point arithmetic accelerator' interfaced to a conventional Von Neumann computer architecture. This research is to develop a roadmap for applying Sandia's capabilities to the solution of some of the problems associated with maintaining quantum information, and with getting data into and out of such a 'quantum computing accelerator'. We propose to focus this work on 'quantum I/O technologies' by applying quantum optics on semiconductor nanostructures to leverage Sandia's expertise in semiconductor microelectronic/photonic fabrication techniques, as well as its expertise in information theory, processing, and algorithms. The work will be guided by understanding of practical requirements of computing and communication architectures. This effort will incorporate ongoing collaboration between 9000, 6000 and 1000 and between junior and senior personnel. Follow-on work to fabricate and evaluate appropriate experimental nano/microstructures will be proposed as a result of this work
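
    The superposition property described in this report can be made concrete with a few lines of linear algebra: a Hadamard gate applied to a qubit prepared in |0> yields equal probabilities for the outcomes '0' and '1'. This is only a classical state-vector illustration, not a statement about the quantum I/O hardware discussed above.

      # Hedged illustration of qubit superposition: a Hadamard gate applied to |0>
      # gives measurement probabilities of 1/2 for '0' and 1/2 for '1'.  A classical
      # state-vector toy, unrelated to the quantum I/O hardware discussed above.
      import numpy as np

      ket0 = np.array([1.0, 0.0], dtype=complex)
      H = np.array([[1.0, 1.0], [1.0, -1.0]], dtype=complex) / np.sqrt(2.0)

      state = H @ ket0                      # (|0> + |1>)/sqrt(2)
      probabilities = np.abs(state) ** 2

      print("amplitudes :", state)
      print("P(0), P(1) :", probabilities)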

  18. Accelerator physics and nuclear energy education in INRNE-BAS

    International Nuclear Information System (INIS)

    Tonev, D.; Goutev, N.; Georgiev, L. S.

    2015-01-01

    Presently Bulgaria has no nuclear research facility, neither a research reactor nor an accelerator. With the new cyclotron laboratory in Sofia, the Institute for Nuclear Research and Nuclear Energy at the Bulgarian Academy of Sciences will restart its experimental research program not only in the field of nuclear physics, but also in many interdisciplinary fields related to nuclear physics. The cornerstone of the cyclotron laboratory is a TR24 cyclotron, which provides a proton beam with a variable energy between 15 and 24 MeV and a current of up to 0.4 mA. The TR24 accelerator allows for the production of a large variety of radioisotopes for medical applications and the development of radiopharmaceuticals. The new cyclotron facility will be used for research in radiopharmacy, radiochemistry, radiobiology, nuclear physics, solid state physics, applied research and new materials, and for education in all these fields, especially nuclear energy. Keywords: Cyclotron, PET/CT, radiopharmacy

  19. Acceleration methods for multi-physics compressible flow

    Science.gov (United States)

    Peles, Oren; Turkel, Eli

    2018-04-01

    In this work we investigate the Runge-Kutta (RK)/Implicit smoother scheme as a convergence accelerator for complex multi-physics flow problems including turbulent, reactive and also two-phase flows. The flows considered are subsonic, transonic and supersonic flows in complex geometries, and can be either steady or unsteady. All of these problems are very stiff. We then introduce an acceleration method for the compressible Navier-Stokes equations. We start with the multigrid method for pure subsonic flow, including reactive flows. We then add the Rossow-Swanson-Turkel RK/Implicit smoother that enables performing all these complex flow simulations with a reasonable CFL number. We next discuss the RK/Implicit smoother for time-dependent problems and also for low Mach numbers. The preconditioner includes an intrinsic low Mach number treatment inside the smoother operator. We also develop a modified Roe scheme with a corresponding flux Jacobian matrix. We then give the extension of the method for real gas and reactive flow. Reactive flows are governed by a system of inhomogeneous Navier-Stokes equations with very stiff source terms. The extension of the RK/Implicit smoother requires an approximation of the source term Jacobian. The properties of the Jacobian are very important for the stability of the method. We discuss what the chemical physics theory of chemical kinetics tells us about the mathematical properties of the Jacobian matrix. We focus on the implication of Le Chatelier's principle on the sign of the diagonal entries of the Jacobian. We present the implementation of the method for turbulent flow. We use two RANS turbulence models: a one-equation model (Spalart-Allmaras) and a two-equation model (k-ω SST). The last extension is for two-phase flows with a gas as the main phase and an Eulerian representation of a dispersed particle phase (EDP). We present some examples for such flow computations inside a ballistic evaluation
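
    A heavily simplified sketch of the Runge-Kutta pseudo-time idea (a multistage scheme that drives the residual of a discretized steady problem to zero) is given below for a 1-D upwind advection model problem. The stage coefficients and CFL number are illustrative choices, and this is not the Rossow-Swanson-Turkel RK/Implicit scheme developed in the paper.

      # Hedged sketch of a multistage Runge-Kutta scheme used as a pseudo-time
      # driver toward the steady state of a model problem (steady 1-D linear
      # advection with a source, first-order upwind).  Stage coefficients and CFL
      # are illustrative; this is not the Rossow-Swanson-Turkel RK/Implicit scheme.
      import numpy as np

      def rk_pseudo_time(n=200, sweeps=600, cfl=0.8):
          h = 1.0 / n
          x = (np.arange(n) + 0.5) * h
          s = np.cos(np.pi * x)                    # source term
          u = np.zeros(n)

          def residual(u):
              upwind = np.empty_like(u)
              upwind[0] = (u[0] - 0.0) / h         # inflow boundary u(0) = 0
              upwind[1:] = (u[1:] - u[:-1]) / h
              return s - upwind                    # r = s - du/dx

          dt = cfl * h
          alphas = (0.15, 0.40, 1.00)              # illustrative 3-stage coefficients
          for _ in range(sweeps):
              u0 = u.copy()
              for a in alphas:
                  u = u0 + a * dt * residual(u)
          err = np.max(np.abs(u - np.sin(np.pi * x) / np.pi))
          return np.linalg.norm(residual(u)), err

      if __name__ == "__main__":
          res, err = rk_pseudo_time()
          print(f"residual norm {res:.2e}, error vs exact steady state {err:.2e}")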

  20. Physical Computing and Its Scope--Towards a Constructionist Computer Science Curriculum with Physical Computing

    Science.gov (United States)

    Przybylla, Mareen; Romeike, Ralf

    2014-01-01

    Physical computing covers the design and realization of interactive objects and installations and allows students to develop concrete, tangible products of the real world, which arise from the learners' imagination. This can be used in computer science education to provide students with interesting and motivating access to the different topic…

  1. Computer codes in particle transport physics

    International Nuclear Information System (INIS)

    Pesic, M.

    2004-01-01

    is given. The importance of validation and verification of data and computer codes is underlined briefly. Examples of applications of the MCNPX, FLUKA and SHIELD codes to the simulation of some processes in nature, from reactor physics, ion medical therapy, cross section calculations, and design of accelerator driven sub-critical systems to astrophysics and shielding of spaceships, are shown. More reliable and more frequent cross section data in the intermediate and high-energy range for particle transport and interactions with matter are expected in the near future, as a result of new experimental investigations that are under way with the aim to validate theoretical models applied currently in the codes. These new data libraries are expected to be much larger and more comprehensive than existing ones, requiring more computer memory and faster CPUs. Updated versions of the codes to be developed in the future, besides sequential computation versions, will also include the MPI or PVM options to allow faster running of the code at acceptable cost for an end-user. A new option is expected to be implemented in the codes too - an end-user-written application for a particular problem could be added relatively simply to the general source code script. Initial work on full implementation of a graphical user interface for preparing input and analysing output of the codes, and the ability to interrupt and/or continue code running, should be upgraded to a user-friendly level. (author)

  2. NSC KIPT accelerator on nuclear and high energy physics

    NARCIS (Netherlands)

    Guk, I.S.; Dovbnya, A.N.; Kononenko, S.G.; Tarasenko, A.S.; Botman, J.I.M.; Wiel, van der M.J.

    2004-01-01

    One of the main reasons for the outflow of experts in nuclear physics and adjacent areas of science from Ukraine is the absence of modern accelerating facilities, for conducting research in the present fields of interest worldwide in this area of knowledge. A qualitatively new level of research can

  3. Thirty years of physics at the Bucharest tandem accelerator

    International Nuclear Information System (INIS)

    Dobrescu, S.; Marinescu, L.; Dumitru, G.; Cata-Danil, Gh.

    2003-01-01

    The main parameters of the Bucharest tandem accelerator, as well as the main milestones of its history since March 1973 when it was commissioned, are briefly presented. A general presentation of the main basic and applied physics research so far undertaken at the tandem is given, ending with some ideas related to the future prospects of the tandem. (authors)

  4. GPU acceleration of Dock6's Amber scoring computation.

    Science.gov (United States)

    Yang, Hailong; Zhou, Qiongqiong; Li, Bo; Wang, Yongjian; Luan, Zhongzhi; Qian, Depei; Li, Hanlu

    2010-01-01

    Addressing the problem of virtual screening is a long-term goal in the drug discovery field, which if properly solved, can significantly shorten new drugs' R&D cycle. The scoring functionality that evaluates the fitness of the docking result is one of the major challenges in virtual screening. In general, scoring functionality in docking requires a large amount of floating-point calculations, which usually takes several weeks or even months to be finished. This time-consuming procedure is unacceptable, especially when a highly fatal and infectious virus such as SARS or H1N1 arises, which forces the scoring task to be done in a limited time. This paper presents how to leverage the computational power of the GPU to accelerate Dock6's (http://dock.compbio.ucsf.edu/DOCK_6/) Amber (J. Comput. Chem. 25: 1157-1174, 2004) scoring with the NVIDIA CUDA (NVIDIA Corporation Technical Staff, Compute Unified Device Architecture - Programming Guide, NVIDIA Corporation, 2008) (Compute Unified Device Architecture) platform. We also discuss many factors that greatly influence the performance after porting the Amber scoring to the GPU, including thread management, data transfer, and divergence hiding. Our experiments show that the GPU-accelerated Amber scoring achieves a 6.5× speedup with respect to the original version running on an AMD dual-core CPU for the same problem size. This acceleration makes the Amber scoring more competitive and efficient for large-scale virtual screening problems.
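
    To see why this kind of scoring maps well onto a GPU, consider a heavily simplified, hypothetical stand-in (not DOCK6's actual Amber scoring function): a Lennard-Jones plus Coulomb sum over all ligand-receptor atom pairs. Each pair contributes an independent floating-point term, which is exactly the data-parallel structure the paper exploits; all names and parameter values below are illustrative.

```python
import numpy as np

# Hypothetical, simplified pairwise score: Lennard-Jones + Coulomb terms over
# all ligand-receptor atom pairs, vectorized.  One independent term per pair
# is the pattern that makes such kernels a natural fit for GPU offload.
def pairwise_score(lig_xyz, rec_xyz, lig_q, rec_q, eps=0.1, sigma=3.4):
    d = np.linalg.norm(lig_xyz[:, None, :] - rec_xyz[None, :, :], axis=-1)   # (Nl, Nr) distances
    d = np.maximum(d, 1e-6)                         # guard against division by zero
    lj = 4.0 * eps * ((sigma / d) ** 12 - (sigma / d) ** 6)
    coulomb = 332.0 * np.outer(lig_q, rec_q) / d    # ~kcal/mol for charges in e, distances in Angstrom
    return float(np.sum(lj + coulomb))

rng = np.random.default_rng(0)
lig = rng.normal(size=(50, 3))                      # toy ligand coordinates
rec = 10.0 + rng.normal(size=(500, 3))              # toy receptor coordinates
score = pairwise_score(lig, rec, rng.normal(scale=0.2, size=50), rng.normal(scale=0.2, size=500))
print(score)
```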

  5. 2-D and 3-D computations of curved accelerator magnets

    International Nuclear Information System (INIS)

    Turner, L.R.

    1991-01-01

    In order to save computer memory, a long accelerator magnet may be computed by treating the long central region and the end regions separately. The dipole magnets for the injector synchrotron of the Advanced Photon Source (APS), now under construction at Argonne National Laboratory (ANL), employ magnet iron consisting of parallel laminations, stacked with a uniform radius of curvature of 33.379 m. Laplace's equation for the magnetic scalar potential has a different form for a straight magnet (x-y coordinates), a magnet with surfaces curved about a common center (r-θ coordinates), and a magnet with parallel laminations like the APS injector dipole. Yet pseudo 2-D computations for the three geometries give basically identical results, even for a much more strongly curved magnet. Hence 2-D (x-y) computations of the central region and 3-D computations of the end regions can be combined to determine the overall magnetic behavior of the magnets. 1 ref., 6 figs
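
    For reference (standard textbook forms, not taken from the report itself), the 2-D Laplace equation for the magnetic scalar potential φ in the first two geometries mentioned above reads as follows; the third, parallel-lamination form is specific to the APS injector dipole analysis and is not reproduced here.

```latex
% Straight magnet, Cartesian (x-y) coordinates:
\frac{\partial^2 \phi}{\partial x^2} + \frac{\partial^2 \phi}{\partial y^2} = 0
% Surfaces curved about a common centre, cylindrical (r-\theta) coordinates:
\frac{1}{r}\frac{\partial}{\partial r}\!\left( r \frac{\partial \phi}{\partial r} \right)
  + \frac{1}{r^2}\frac{\partial^2 \phi}{\partial \theta^2} = 0
```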

  6. Quantum Accelerators for High-Performance Computing Systems

    OpenAIRE

    Britt, Keith A.; Mohiyaddin, Fahd A.; Humble, Travis S.

    2017-01-01

    We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenge the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We discuss a novel quantu...

  7. High energy physics computing in Japan

    International Nuclear Information System (INIS)

    Watase, Yoshiyuki

    1989-01-01

    A brief overview of the computing provision for high energy physics in Japan is presented. Most of the computing power for high energy physics is concentrated in KEK. Here there are two large scale systems: one providing a general computing service including vector processing and the other dedicated to TRISTAN experiments. Each university group has a smaller sized mainframe or VAX system to facilitate both their local computing needs and the remote use of the KEK computers through a network. The large computer system for the TRISTAN experiments is described. An overview of a prospective future large facility is also given. (orig.)

  8. Research on acceleration method of reactor physics based on FPGA platforms

    International Nuclear Information System (INIS)

    Li, C.; Yu, G.; Wang, K.

    2013-01-01

    The physical design of new-concept reactors, which have complex structures, various materials and neutron energy spectra, has greatly raised the requirements on calculation methods and the corresponding computing hardware. Along with widely used parallel algorithms, heterogeneous platform architectures have been introduced into numerical computations in reactor physics. Because of its natural parallel characteristics, the CPU-FPGA architecture is often used to accelerate numerical computation. This paper studies the application and features of this kind of heterogeneous platform in numerical calculations of reactor physics through practical examples. The neutron diffusion module designed on the CPU-FPGA architecture achieves a speedup factor of 11.2, showing that it is feasible to apply this kind of heterogeneous platform to reactor physics. (authors)
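
    The kind of kernel typically offloaded in such a CPU-FPGA design is a regular, repeated stencil sweep. The sketch below is a toy one-group, fixed-source neutron diffusion solver on a 1-D slab using Jacobi iterations; it is only an illustration of the computational pattern, not the module described in the paper, and the material values (D, Sigma_a, S) are arbitrary.

```python
import numpy as np

# Toy one-group fixed-source diffusion: -D phi'' + Sigma_a phi = S on a slab,
# zero-flux boundaries, solved with plain Jacobi sweeps (the repetitive stencil
# update is what a CPU-FPGA design would push onto the FPGA).
def solve_diffusion(n=200, length=100.0, D=1.0, sigma_a=0.02, S=1.0, sweeps=5000):
    h = length / (n + 1)
    phi = np.zeros(n + 2)                       # includes boundary nodes (held at 0)
    denom = 2.0 * D / h**2 + sigma_a
    for _ in range(sweeps):
        # Jacobi update: the right-hand side uses only the previous iterate
        phi[1:-1] = (S + (D / h**2) * (phi[:-2] + phi[2:])) / denom
    return phi

phi = solve_diffusion()
print(phi.max())    # peak flux, approaching S/Sigma_a away from the boundaries
```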

  9. Multipactor Physics, Acceleration, and Breakdown in Dielectric-Loaded Accelerating Structures

    International Nuclear Information System (INIS)

    Fischer, Richard P.; Gold, Steven H.

    2016-01-01

    The objective of this 3-year program is to study the physics issues associated with rf acceleration in dielectric-loaded accelerating (DLA) structures, with a focus on the key issue of multipactor loading, which has been found to cause very significant rf power loss in DLA structures whenever the rf pulsewidth exceeds the multipactor risetime (~10 ns). The experiments are carried out in the X-band magnicon laboratory at the Naval Research Laboratory (NRL) in collaboration with Argonne National Laboratory (ANL) and Euclid Techlabs LLC, who develop the test structures with support from the DoE SBIR program. There are two main elements in the research program: (1) high-power tests of DLA structures using the magnicon output (20 MW @11.4 GHz), and (2) tests of electron acceleration in DLA structures using relativistic electrons from a compact X-band accelerator. The work during this period has focused on a study of the use of an axial magnetic field to suppress multipactor in DLA structures, with several new high power tests carried out at NRL, and on preparation of the accelerator for the electron acceleration experiments.

  10. Advances of dense plasma physics with particle accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Hoffmann, D.H.H.; Blazevic, A.; Rosmej, O.N.; Spiller, P.; Tahir, N.A.; Weyrich, K. [Gesellschaft fur Schwerionenforschung, GSI-Darmstadt, Plasmaphysik, Darmstadt (Germany); Hoffmann, D.H.H.; Dafni, T.; Kuster, M.; Roth, M.; Udrea, S.; Varentsov, D. [Darmstadt Technische Univ., Institut fur Kernphysik (Germany); Jacoby, J. [Frankfurt Univ., Institut fur Angewandte Physik (Germany); Zioutas, K. [European Organization for Nuclear Research (CERN), Geneve (Switzerland); Patras Univ., Dept. of Physics (Greece); Sharkov, B.Y. [Institut for Theoretical and Experimental Physics ITEP, Moscow (Russian Federation)

    2006-06-15

    High intensity particle beams from accelerators induce high energy density states in bulk matter. The SIS-18 heavy ion synchrotron at GSI (Darmstadt, Germany) now routinely delivers intense Uranium beams that deposit about 1 kJ/g of specific energy in solid matter, e.g. solid lead. Due to the specific nature of the ion-matter interaction a volume of matter is heated uniformly with low gradients of temperature and pressure in the initial phase, depending on the pulse structure of the beam with respect to space and time. The new accelerator complex FAIR (Facility for Antiproton and Ion Research) at GSI as well as beams from the CERN large hadron collider (LHC) will vastly extend the accessible parameter range for high energy density states. One special piece of accelerator equipment, a superconducting high-field dipole magnet developed for the LHC at CERN, is now serving as a key instrument to diagnose the dense plasma of the solar interior, thus providing an extremely interesting combination of accelerator physics, plasma physics and particle physics. (authors)

  11. Advances of dense plasma physics with particle accelerators

    International Nuclear Information System (INIS)

    Hoffmann, D.H.H.; Blazevic, A.; Rosmej, O.N.; Spiller, P.; Tahir, N.A.; Weyrich, K.; Hoffmann, D.H.H.; Dafni, T.; Kuster, M.; Roth, M.; Udrea, S.; Varentsov, D.; Jacoby, J.; Zioutas, K.; Sharkov, B.Y.

    2006-01-01

    High intensity particle beams from accelerators induce high energy density states in bulk matter. The SIS-18 heavy ion synchrotron at GSI (Darmstadt, Germany) now routinely delivers intense Uranium beams that deposit about 1 kJ/g of specific energy in solid matter, e.g. solid lead. Due to the specific nature of the ion-matter interaction a volume of matter is heated uniformly with low gradients of temperature and pressure in the initial phase, depending on the pulse structure of the beam with respect to space and time. The new accelerator complex FAIR (Facility for Antiproton and Ion Research) at GSI as well as beams from the CERN large hadron collider (LHC) will vastly extend the accessible parameter range for high energy density states. One special piece of accelerator equipment, a superconducting high-field dipole magnet developed for the LHC at CERN, is now serving as a key instrument to diagnose the dense plasma of the solar interior, thus providing an extremely interesting combination of accelerator physics, plasma physics and particle physics. (authors)

  12. Health physics manual of good practices for accelerator facilities

    International Nuclear Information System (INIS)

    Casey, W.R.; Miller, A.J.; McCaslin, J.B.; Coulson, L.V.

    1988-04-01

    It is hoped that this manual will serve both as a teaching aid and as a useful adjunct for program development. In the context of application, this manual addresses good practices that should be observed by management, staff, and designers since the achievement of a good radiation program indeed involves a combined effort. Ultimately, radiation safety and good work practices become the personal responsibility of the individual. The practices presented in this manual are not to be construed as mandatory; rather, they are to be used as appropriate for the specific case in the interest of radiation safety. As experience is accrued and new data obtained in the application of this document, ONS will update the guidance to assure that at any given time the guidance reflects optimum performance consistent with current technology and practice. The intent of this guide therefore is to: define common health physics problems at accelerators; recommend suitable methods of identifying, evaluating, and managing accelerator health physics problems; set out the established safety practices at DOE accelerators that have been arrived at by consensus and, where consensus has not yet been reached, give examples of safe practices; introduce the technical literature in the accelerator health physics field; and supplement the regulatory documents listed in Appendix D. Many accelerator health physics problems are no different than those at other kinds of facilities, e.g., ALARA philosophy, instrument calibration, etc. These problems are touched on very lightly or not at all. Similarly, this document does not cover other hazards such as electrical shock, toxic materials, etc. This does not in any way imply that these problems are not serious. 160 refs

  13. GPU in Physics Computation: Case Geant4 Navigation

    CERN Document Server

    Seiskari, Otto; Niemi, Tapio

    2012-01-01

    General purpose computing on graphic processing units (GPU) is a potential method of speeding up scientific computation with low cost and high energy efficiency. We experimented with the particle physics simulation toolkit Geant4 used at CERN to benchmark its geometry navigation functionality on a GPU. The goal was to find out whether Geant4 physics simulations could benefit from GPU acceleration and how difficult it is to modify Geant4 code to run on a GPU. We ported selected parts of Geant4 code to C99 & CUDA and implemented a simple gamma physics simulation utilizing this code to measure efficiency. The performance of the program was tested by running it on two different platforms: an NVIDIA GeForce 470 GTX GPU and a 12-core AMD CPU system. Our conclusion was that GPUs can be a competitive alternative to multi-core computers, but porting existing software in an efficient way is challenging.
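
    As a rough illustration of why such simulations parallelize well (this is not Geant4 code), the inner loop of a simple gamma-transport kernel reduces largely to many independent samplings of the distance to the next interaction, s = -ln(U)/mu, for a whole batch of photons at once; the attenuation coefficient below is an arbitrary illustrative value.

```python
import numpy as np

# Vectorized sampling of photon free-path lengths for a large batch -- the kind
# of embarrassingly parallel workload that maps naturally onto GPU threads.
rng = np.random.default_rng(1)
mu = 0.07                                              # assumed total attenuation coefficient [1/cm]
u = 1.0 - rng.random(1_000_000)                        # uniform samples in (0, 1]
steps = -np.log(u) / mu                                # distance to next interaction per photon
print(steps.mean(), 1.0 / mu)                          # sample mean approaches the mean free path
```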

  14. Physical-dosimetric enabling a dual linear accelerator 3D planning systems for radiotherapy

    International Nuclear Information System (INIS)

    Alfonso, Rodolfo; Martinez, William; Arelis, Lores; Morales, Jorge

    2009-01-01

    The commissioning of a dual clinical linear accelerator requires a comprehensive study of the parameters of its therapeutic beams, both photon and electron. All the physical and dosimetric information gained by measuring these beams must be analysed, processed and refined for subsequent modelling in the computer-based treatment planning system (RTPS). The accuracy and precision with which the prescribed doses are calculated depend on how rigorously this process is carried out. This paper aims to demonstrate the clinical readiness of the linear accelerator-RTPS system for radiotherapy treatments with shaped photon and electron beams. (author)

  15. Personal computer control system for small size tandem accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Takayama, Hiroshi; Kawano, Kazuhiro; Shinozaki, Masataka [Nissin - High Voltage Co. Ltd., Kyoto (Japan)

    1996-12-01

    Because an analysis apparatus based on a tandem accelerator has many control parameters, the large number of control components on the control panel makes the panel complex and degrades its operability. To remedy these faults, a control system using a personal computer was designed and developed for a control panel previously built mainly from conventional hardware components. Its main characteristics are as follows: (1) The control panel becomes simpler and more compact, because the hardware on the panel surface is reduced to the minimum required by using a personal computer as the man-machine interface. (2) Control becomes faster, because the accelerator system is divided into blocks, a local station of the sequencer network is installed at each block, and sequence control is closed within each block. (3) Expandability becomes greater, because adding a new beamline requires only connecting a sequencer local station to the network and updating the computer's screen image, with little modification of the present hardware. And, (4) the control system becomes cheaper, because of the lower investment and easier programming afforded by the personal computer. (G.K.)

  16. Statistical and thermal physics with computer applications

    CERN Document Server

    Gould, Harvey

    2010-01-01

    This textbook carefully develops the main ideas and techniques of statistical and thermal physics and is intended for upper-level undergraduate courses. The authors each have more than thirty years' experience in teaching, curriculum development, and research in statistical and computational physics. Statistical and Thermal Physics begins with a qualitative discussion of the relation between the macroscopic and microscopic worlds and incorporates computer simulations throughout the book to provide concrete examples of important conceptual ideas. Unlike many contemporary texts on the

  17. Innovative applications of genetic algorithms to problems in accelerator physics

    Directory of Open Access Journals (Sweden)

    Alicia Hofler

    2013-01-01

    Full Text Available The genetic algorithm (GA) is a powerful technique that implements the principles nature uses in biological evolution to optimize a multidimensional nonlinear problem. The GA works especially well for problems with a large number of local extrema, where traditional methods (such as conjugate gradient, steepest descent, and others) fail or, at best, underperform. The field of accelerator physics, among others, abounds with problems which lend themselves to optimization via GAs. In this paper, we report on the successful application of GAs in several problems related to the existing Continuous Electron Beam Accelerator Facility nuclear physics machine, the proposed Medium-energy Electron-Ion Collider at Jefferson Lab, and a radio frequency gun-based injector. These encouraging results are a step forward in optimizing accelerator design and provide an impetus for application of GAs to other problems in the field. To that end, we discuss the details of the GAs used, include a newly devised enhancement which leads to improved convergence to the optimum, and make recommendations for future GA developments and accelerator applications.
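
    To make the selection/crossover/mutation loop concrete, here is a minimal real-coded GA sketch maximizing a one-dimensional multimodal function with many local maxima, the regime where the abstract argues gradient-based optimizers struggle. This is an illustrative toy only, not the authors' code; the fitness function, population size and operator parameters are arbitrary.

```python
import numpy as np

# Minimal real-coded genetic algorithm: tournament selection, blend crossover,
# Gaussian mutation, maximizing a multimodal fitness on [-5, 5].
def fitness(x):
    return np.sin(5 * x) * np.exp(-0.1 * x**2)      # many local maxima

rng = np.random.default_rng(42)
pop = rng.uniform(-5, 5, size=64)                   # initial population
for gen in range(100):
    f = fitness(pop)
    i, j = rng.integers(0, len(pop), size=(2, len(pop)))
    parents = np.where(f[i] > f[j], pop[i], pop[j]) # tournament selection
    partners = np.roll(parents, 1)
    w = rng.random(len(pop))
    children = w * parents + (1 - w) * partners     # blend crossover
    children += rng.normal(scale=0.1, size=len(pop))  # Gaussian mutation
    pop = np.clip(children, -5, 5)

print(pop[np.argmax(fitness(pop))])                 # best individual found
```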

  18. CAS CERN Accelerator School. 5. Advanced accelerator physics course. Proceedings. Vol. 2

    International Nuclear Information System (INIS)

    Turner, S.

    1995-01-01

    The fifth CERN Accelerator School (CAS) advanced course on Accelerator Physics was given at the Paradise Hotel, Rhodes, Greece from 20 September to 1 October 1993. Its syllabus was based on the previous similar courses held at Oxford 1985, Berlin 1987, Uppsala 1989 and Noordwijkerhout 1991, and whose proceedings were published as CERN Reports 87-03, 89-01, 90-04 and 92-01, respectively. The present volumes are intended to replace and to bring up to date all the material in earlier publications. They contain not only all the lectures given in the Rhodes course but a number of important contributions to previous courses which are thought to be essential for a complete understanding of all aspects of the design and construction of particle accelerators at an advanced level. They include sections on Hamiltonian equations and accelerator optics, chromaticity and dynamic beam aperture, particle tracking, the kinetic theory, longitudinal beam optics, coherent instabilities, beam-beam dynamics, intra-beam scattering, beam cooling, Schottky noise, beam radiation, neutralisation, beam polarisation, radio-frequency quadrupoles, as well as chapters on space charge, superconducting magnets, crystal bending, beam-beam measurement and accelerator medical applications. (orig.)

  19. CAS CERN Accelerator School. 5. Advanced accelerator physics course. Proceedings. Vol. 1

    International Nuclear Information System (INIS)

    Turner, S.

    1995-01-01

    The fifth CERN Accelerator School (CAS) advanced course on Accelerator Physics was given at the Paradise Hotel, Rhodes, Greece from 20 September to 1 October 1993. Its syllabus was based on the previous similar courses held at Oxford 1985, Berlin 1987, Uppsala 1989 and Noordwijkerhout 1991, and whose proceedings were published as CERN Reports 87-03, 89-01, 90-04 and 92-01, respectively. The present volumes are intended to replace and to bring up to date all the material in earlier publications. They contain not only all the lectures given in the Rhodes course but a number of important contributions to previous courses which are thought to be essential for a complete understanding of all aspects of the design and construction of particle accelerators at an advanced level. They include sections on Hamiltonian equations and accelerator optics, chromaticity and dynamic beam aperture, particle tracking, the kinetic theory, longitudinal beam optics, coherent instabilities, beam-beam dynamics, intra-beam scattering, beam cooling, Schottky noise, beam radiation, neutralisation, beam polarisation, radio-frequency quadrupoles, as well as chapters on space charge, superconducting magnets, crystal bending, beam-beam measurement and accelerator medical applications. (orig.)

  20. CAS CERN Accelerator School. 5. Advanced accelerator physics course. Proceedings. Vol. 1

    Energy Technology Data Exchange (ETDEWEB)

    Turner, S [ed.

    1995-11-22

    The fifth CERN Accelerator School (CAS) advanced course on Accelerator Physics was given at the Paradise Hotel, Rhodes, Greece from 20 September to 1 October 1993. Its syllabus was based on the previous similar courses held at Oxford 1985, Berlin 1987, Uppsala 1989 and Noordwijkerhout 1991, and whose proceedings were published as CERN Reports 87-03, 89-01, 90-04 and 92-01, respectively. The present volumes are intended to replace and to bring up to date all the material in earlier publications. They contain not only all the lectures given in the Rhodes course but a number of important contributions to previous courses which are thought to be essential for a complete understanding of all aspects of the design and construction of particle accelerators at an advanced level. They include sections on Hamiltonian equations and accelerator optics, chromaticity and dynamic beam aperture, particle tracking, the kinetic theory, longitudinal beam optics, coherent instabilities, beam-beam dynamics, intra-beam scattering, beam cooling, Schottky noise, beam radiation, neutralisation, beam polarisation, radio-frequency quadrupoles, as well as chapters on space charge, superconducting magnets, crystal bending, beam-beam measurement and accelerator medical applications. (orig.).

  1. CAS CERN Accelerator School. 5. Advanced accelerator physics course. Proceedings. Vol. 2

    Energy Technology Data Exchange (ETDEWEB)

    Turner, S [ed.

    1995-11-22

    The fifth CERN Accelerator School (CAS) advanced course on Accelerator Physics was given at the Paradise Hotel, Rhodes, Greece from 20 September to 1 October 1993. Its syllabus was based on the previous similar courses held at Oxford 1985, Berlin 1987, Uppsala 1989 and Noordwijkerhout 1991, and whose proceedings were published as CERN Reports 87-03, 89-01, 90-04 and 92-01, respectively. The present volumes are intended to replace and to bring up to date all the material in earlier publications. They contain not only all the lectures given in the Rhodes course but a number of important contributions to previous courses which are thought to be essential for a complete understanding of all aspects of the design and construction of particle accelerators at an advanced level. They include sections on Hamiltonian equations and accelerator optics, chromaticity and dynamic beam aperture, particle tracking, the kinetic theory, longitudinal beam optics, coherent instabilities, beam-beam dynamics, intra-beam scattering, beam cooling, Schottky noise, beam radiation, neutralisation, beam polarisation, radio-frequency quadrupoles, as well as chapters on space charge, superconducting magnets, crystal bending, beam-beam measurement and accelerator medical applications. (orig.).

  2. Extreme Physics and Informational/Computational Limits

    Energy Technology Data Exchange (ETDEWEB)

    Di Sia, Paolo, E-mail: paolo.disia@univr.it, E-mail: 10alla33@virgilio.it [Department of Computer Science, Faculty of Science, Verona University, Strada Le Grazie 15, I-37134 Verona (Italy) and Faculty of Computer Science, Free University of Bozen, Piazza Domenicani 3, I-39100 Bozen-Bolzano (Italy)

    2011-07-08

    A sector of current theoretical physics, also called 'extreme physics', deals with topics concerning superstring theories, multiverse, quantum teleportation, negative energy, and more, that only a few years ago were considered scientific imagination or purely speculative physics. Present experimental lines of evidence and implications of cosmological observations seem, on the contrary, to support such theories. These new physical developments lead to informational limits, such as the quantity of information that a physical system can record, and computational limits, resulting from considerations regarding black holes and space-time fluctuations. In this paper I consider important limits for information and computation resulting in particular from string theories and their foundations.

  3. Extreme Physics and Informational/Computational Limits

    International Nuclear Information System (INIS)

    Di Sia, Paolo

    2011-01-01

    A sector of current theoretical physics, also called 'extreme physics', deals with topics concerning superstring theories, multiverse, quantum teleportation, negative energy, and more, that only a few years ago were considered scientific imagination or purely speculative physics. Present experimental lines of evidence and implications of cosmological observations seem, on the contrary, to support such theories. These new physical developments lead to informational limits, such as the quantity of information that a physical system can record, and computational limits, resulting from considerations regarding black holes and space-time fluctuations. In this paper I consider important limits for information and computation resulting in particular from string theories and their foundations.

  4. The impact of the ISR on accelerator physics and technology

    International Nuclear Information System (INIS)

    Bryant, P J

    2012-01-01

    The ISR (Intersecting Storage Rings) were two intersecting proton synchrotron rings each with a circumference of 942 m and eight-fold symmetry that were operational for 13 years from 1971 to 1984. The CERN PS injected 26 GeV/c proton beams into the two rings that could accelerate up to 31.4 GeV/c. The ISR worked for physics with beams of 30-40 A over 40-60 hours with luminosities in its superconducting low-β insertion of 10^31-10^32 cm^-2 s^-1. The ISR demonstrated the practicality of collider beam physics while catalysing a rapid advance in accelerator technologies and techniques. (author)

  5. Accelerator Physics Challenges for the NSLS-II Project

    Energy Technology Data Exchange (ETDEWEB)

    Krinsky,S.

    2009-05-04

    The NSLS-II is an ultra-bright synchrotron light source based upon a 3-GeV storage ring with a 30-cell (15 super-period) double-bend-achromat lattice with damping wigglers used to lower the emittance below 1 nm. In this paper, we discuss the accelerator physics challenges for the design including: optimization of dynamic aperture; estimation of Touschek lifetime; achievement of required orbit stability; and analysis of ring impedance and collective effects.

  6. Accelerator physics highlights in the 1997/98 SLC run

    International Nuclear Information System (INIS)

    Assmann, R.W.; Bane, K.L.F.; Barkow, T.

    1998-03-01

    The authors report various accelerator physics studies and improvements from the 1997/98 run at the Stanford Linear Collider (SLC). In particular, the authors discuss damping-ring lattice diagnostics, changes to the linac set up, fast control for linac rf phase stability, new emittance tuning strategies, wakefield reduction, modifications of the final-focus optics, longitudinal bunch shaping, and a novel spot-size control at the interaction point (IP)

  7. Nuclear Physics computer networking: Report of the Nuclear Physics Panel on Computer Networking

    International Nuclear Information System (INIS)

    Bemis, C.; Erskine, J.; Franey, M.; Greiner, D.; Hoehn, M.; Kaletka, M.; LeVine, M.; Roberson, R.; Welch, L.

    1990-05-01

    This paper discusses: the state of computer networking within the nuclear physics program; network requirements for nuclear physics; management structure; and issues of special interest to the nuclear physics program office

  8. Acceleration of Feynman loop integrals in high-energy physics on many core GPUs

    International Nuclear Information System (INIS)

    Yuasa, F; Ishikawa, T; Hamaguchi, N; Koike, T; Nakasato, N

    2013-01-01

    The current and future colliders in high-energy physics require theorists to carry out large scale computations for a precise comparison between experimental results and theoretical ones. In a perturbative approach several methods to evaluate Feynman loop integrals which appear in the theoretical calculation of cross-sections are well established at the one-loop level; however, more studies are necessary for higher-order levels. The Direct Computation Method (DCM) was developed to evaluate multi-loop integrals. DCM is based on a combination of multidimensional numerical integration and extrapolation on a sequence of integrals. It is a fully numerical method and is applicable to a wide class of integrals with various physics parameters. The computation time depends on physics parameters and the topology of loop diagrams and it becomes longer for the two-loop integrals. In this paper we present our approach to the acceleration of the two-loop integrals by DCM on multiple GPU boards
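
    The "integrate a regulated sequence, then extrapolate" idea behind DCM can be sketched on a toy example (this is not the authors' implementation). The integrand 1/(x+y+eps) is singular at the origin for eps = 0, yet its 2-D integral has a finite limit, 2 ln 2; we evaluate a geometric sequence of regulated integrals numerically and extrapolate to eps -> 0 using a least-squares fit in a small basis chosen for this particular toy, whereas DCM itself uses general-purpose sequence-extrapolation algorithms.

```python
import numpy as np
from scipy import integrate

# Regulated 2-D integral I(eps) = int_0^1 int_0^1 dx dy / (x + y + eps)
def I(eps):
    val, _ = integrate.dblquad(lambda y, x: 1.0 / (x + y + eps),
                               0.0, 1.0, lambda x: 0.0, lambda x: 1.0)
    return val

eps_seq = 0.2 * 0.5 ** np.arange(6)              # geometric sequence of regulators
vals = np.array([I(e) for e in eps_seq])

# For this toy integrand the small-eps behaviour is
# a0 + a1*eps*ln(eps) + a2*eps + a3*eps^2 + ..., so a linear least-squares fit
# in that basis recovers the eps -> 0 limit a0.
basis = np.column_stack([np.ones_like(eps_seq),
                         eps_seq * np.log(eps_seq),
                         eps_seq,
                         eps_seq ** 2])
a0 = np.linalg.lstsq(basis, vals, rcond=None)[0][0]
print(a0, 2.0 * np.log(2.0))                     # extrapolated value vs exact limit
```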

  9. Fast acceleration of 2D wave propagation simulations using modern computational accelerators.

    Directory of Open Access Journals (Sweden)

    Wei Wang

    Full Text Available Recent developments in modern computational accelerators like Graphics Processing Units (GPUs and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as Intel Many Integrated Core (MIC coprocessor to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case-study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved more than 150x speedup above the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation provided speedups on GPUs of at least 200x faster than the sequential implementation and 30x faster than a parallelized OpenMP implementation. An implementation of OpenMP on Intel MIC coprocessor provided speedups of 120x with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieve parallelization of 2D cardiac wave simulations on GPUs. Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other
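
    The computational core of such a 2-D wave propagation simulation is a per-grid-point stencil update, which is what OpenACC, OpenCL or OpenMP parallelize. The sketch below uses a schematic FitzHugh-Nagumo-type excitable medium as a stand-in for the cardiac action potential model studied in the paper (it is not their code, and the grid size, time step and parameter values are arbitrary illustrative choices); here the update is simply vectorized with NumPy.

```python
import numpy as np

# Schematic 2-D excitable-medium wave propagation (FitzHugh-Nagumo-type),
# explicit time stepping with a 5-point diffusion stencil.
n, dt, dx, D = 200, 0.05, 1.0, 1.0
a, b, eps = 0.1, 0.5, 0.02
u = np.zeros((n, n))          # fast (voltage-like) variable
v = np.zeros((n, n))          # slow recovery variable
u[:, :5] = 1.0                # stimulate the left edge to launch a wave

def laplacian(f):
    # 5-point stencil with reflecting (no-flux) boundaries
    fp = np.pad(f, 1, mode='edge')
    return (fp[:-2, 1:-1] + fp[2:, 1:-1] + fp[1:-1, :-2] + fp[1:-1, 2:] - 4 * f) / dx**2

for step in range(2000):
    du = D * laplacian(u) + u * (1 - u) * (u - a) - v
    dv = eps * (b * u - v)
    u += dt * du
    v += dt * dv

print(u.mean())               # nonzero mean as the pulse propagates across the domain
```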

  10. International Conference on Theoretical and Computational Physics

    CERN Document Server

    2016-01-01

    Int'l Conference on Theoretical and Computational Physics (TCP 2016) will be held from August 24 to 26, 2016 in Xi'an, China. This Conference will cover issues on Theoretical and Computational Physics. It is dedicated to creating a stage for exchanging the latest research results and sharing the advanced research methods. TCP 2016 will be an important platform for inspiring international and interdisciplinary exchange at the forefront of Theoretical and Computational Physics. The Conference will bring together researchers, engineers, technicians and academicians from all over the world, and we cordially invite you to take this opportunity to join us for academic exchange and visit the ancient city of Xi'an.

  11. Operational health physics at the Los Alamos meson physics proton accelerator

    International Nuclear Information System (INIS)

    Engelke, M.J.

    1975-01-01

    The operational health physics practices and procedures at the Clinton P. Anderson Los Alamos Meson Physics Facility (LAMPF), a medium energy, high intensity proton accelerator are reviewed. The operational philosophy used for the control of personnel exposures and radioactive materials is discussed. A particular operation involving the removal of a radioactive beam stop reading in excess of 1000 R/h is described

  12. Towards full automation of accelerators through computer control

    CERN Document Server

    Gamble, J; Kemp, D; Keyser, R; Koutchouk, Jean-Pierre; Martucci, P P; Tausch, Lothar A; Vos, L

    1980-01-01

    The computer control system of the Intersecting Storage Rings (ISR) at CERN has always laid emphasis on two particular operational aspects, the first being the reproducibility of machine conditions and the second that of giving the operators the possibility to work in terms of machine parameters such as the tune. Already certain phases of the operation are optimized by the control system, whilst others are automated with a minimum of manual intervention. The authors describe this present control system with emphasis on the existing automated facilities and the features of the control system which make it possible. It then discusses the steps needed to completely automate the operational procedure of accelerators. (7 refs).

  13. Towards full automation of accelerators through computer control

    International Nuclear Information System (INIS)

    Gamble, J.; Hemery, J.-Y.; Kemp, D.; Keyser, R.; Koutchouk, J.-P.; Martucci, P.; Tausch, L.; Vos, L.

    1980-01-01

    The computer control system of the Intersecting Storage Rings (ISR) at CERN has always laid emphasis on two particular operational aspects, the first being the reproducibility of machine conditions and the second that of giving the operators the possibility to work in terms of machine parameters such as the tune. Already certain phases of the operation are optimized by the control system, whilst others are automated with a minimum of manual intervention. The paper describes this present control system with emphasis on the existing automated facilities and the features of the control system which make it possible. It then discusses the steps needed to completely automate the operational procedure of accelerators. (Auth.)

  14. Accelerating MATLAB with GPU computing a primer with examples

    CERN Document Server

    Suh, Jung W

    2013-01-01

    Beyond simulation and algorithm development, many developers increasingly use MATLAB even for product deployment in computationally heavy fields. This often demands that MATLAB codes run faster by leveraging the distributed parallelism of Graphics Processing Units (GPUs). While MATLAB successfully provides high-level functions as a simulation tool for rapid prototyping, the underlying details and knowledge needed for utilizing GPUs make MATLAB users hesitate to step into it. Accelerating MATLAB with GPUs offers a primer on bridging this gap. Starting with the basics, setting up MATLAB for

  15. Computational Physics Program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1984-12-01

    The principal objective of the computational physics group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. A summary of the group's activities is presented, including computational studies in MHD equilibria and stability, plasma transport, Fokker-Planck, and efficient numerical and programming algorithms. References are included

  16. Computational plasma physics and supercomputers

    International Nuclear Information System (INIS)

    Killeen, J.; McNamara, B.

    1984-09-01

    The Supercomputers of the 80's are introduced. They are 10 to 100 times more powerful than today's machines. The range of physics modeling in the fusion program is outlined. New machine architecture will influence particular codes, but parallel processing poses new coding difficulties. Increasing realism in simulations will require better numerics and more elaborate mathematics

  17. Computational Physics as a Path for Physics Education

    Science.gov (United States)

    Landau, Rubin H.

    2008-04-01

    Evidence and arguments will be presented that modifications in the undergraduate physics curriculum are necessary to maintain the long-term relevance of physics. Suggested will be a balance of analytic, experimental, computational, and communication skills, that in many cases will require an increased inclusion of computation and its associated skill set into the undergraduate physics curriculum. The general arguments will be followed by a detailed enumeration of suggested subjects and student learning outcomes, many of which have already been adopted or advocated by the computational science community, and which permit high performance computing and communication. Several alternative models for how these computational topics can be incorporated into the undergraduate curriculum will be discussed. This includes enhanced topics in the standard existing courses, as well as stand-alone courses. Applications and demonstrations will be presented throughout the talk, as well as prototype video-based materials and electronic books.

  18. Computing for Heavy Ion Physics

    International Nuclear Information System (INIS)

    Martinez, G.; Schiff, D.; Hristov, P.; Menaud, J.M.; Hrivnacova, I.; Poizat, P.; Chabratova, G.; Albin-Amiot, H.; Carminati, F.; Peters, A.; Schutz, Y.; Safarik, K.; Ollitrault, J.Y.; Hrivnacova, I.; Morsch, A.; Gheata, A.; Morsch, A.; Vande Vyvre, P.; Lauret, J.; Nief, J.Y.; Pereira, H.; Kaczmarek, O.; Conesa Del Valle, Z.; Guernane, R.; Stocco, D.; Gruwe, M.; Betev, L.; Baldisseri, A.; Vilakazi, Z.; Rapp, B.; Masoni, A.; Stoicea, G.; Brun, R.

    2005-01-01

    This workshop was devoted to the computational technologies needed for the study of heavy quarkonia and open flavor production at LHC (large hadron collider) experiments. These requirements are huge: peta-bytes of data will be generated each year. Analysing this will require the equivalent of a few thousand of today's fastest PC processors. The new developments in terms of dedicated software have been addressed. This document gathers the transparencies that were presented at the workshop

  19. Computing for Heavy Ion Physics

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, G.; Schiff, D.; Hristov, P.; Menaud, J.M.; Hrivnacova, I.; Poizat, P.; Chabratova, G.; Albin-Amiot, H.; Carminati, F.; Peters, A.; Schutz, Y.; Safarik, K.; Ollitrault, J.Y.; Hrivnacova, I.; Morsch, A.; Gheata, A.; Morsch, A.; Vande Vyvre, P.; Lauret, J.; Nief, J.Y.; Pereira, H.; Kaczmarek, O.; Conesa Del Valle, Z.; Guernane, R.; Stocco, D.; Gruwe, M.; Betev, L.; Baldisseri, A.; Vilakazi, Z.; Rapp, B.; Masoni, A.; Stoicea, G.; Brun, R

    2005-07-01

    This workshop was devoted to the computational technologies needed for the study of heavy quarkonia and open flavor production at LHC (large hadron collider) experiments. These requirements are huge: peta-bytes of data will be generated each year. Analysing this will require the equivalent of a few thousand of today's fastest PC processors. The new developments in terms of dedicated software have been addressed. This document gathers the transparencies that were presented at the workshop.

  20. Computing for Heavy Ion Physics

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, G; Schiff, D; Hristov, P; Menaud, J M; Hrivnacova, I; Poizat, P; Chabratova, G; Albin-Amiot, H; Carminati, F; Peters, A; Schutz, Y; Safarik, K; Ollitrault, J Y; Hrivnacova, I; Morsch, A; Gheata, A; Morsch, A; Vande Vyvre, P; Lauret, J; Nief, J Y; Pereira, H; Kaczmarek, O; Conesa Del Valle, Z; Guernane, R; Stocco, D; Gruwe, M; Betev, L; Baldisseri, A; Vilakazi, Z; Rapp, B; Masoni, A; Stoicea, G; Brun, R

    2005-07-01

    This workshop was devoted to the computational technologies needed for the study of heavy quarkonia and open flavor production at LHC (large hadron collider) experiments. These requirements are huge: peta-bytes of data will be generated each year. Analysing this will require the equivalent of a few thousand of today's fastest PC processors. The new developments in terms of dedicated software have been addressed. This document gathers the transparencies that were presented at the workshop.

  1. Accelerator-based techniques for the support of senior-level undergraduate physics laboratories

    International Nuclear Information System (INIS)

    Williams, J.R.; Clark, J.C.; Isaacs-Smith, T.

    2001-01-01

    Approximately three years ago, Auburn University replaced its aging Dynamitron accelerator with a new 2MV tandem machine (Pelletron) manufactured by the National Electrostatics Corporation (NEC). This new machine is maintained and operated for the University by Physics Department personnel, and the accelerator supports a wide variety of materials modification/analysis studies. Computer software is available that allows the NEC Pelletron to be operated from a remote location, and an Internet link has been established between the Accelerator Laboratory and the Upper-Level Undergraduate Teaching Laboratory in the Physics Department. Additional software supplied by Canberra Industries has also been used to create a second Internet link that allows live-time data acquisition in the Teaching Laboratory. Our senior-level undergraduates and first-year graduate students perform a number of experiments related to radiation detection and measurement as well as several standard accelerator-based experiments that have been added recently. These laboratory exercises will be described, and the procedures used to establish the Internet links between our Teaching Laboratory and the Accelerator Laboratory will be discussed

  2. Physics in 'Real Life': Accelerator-based Research with Undergraduates

    Science.gov (United States)

    Klay, J. L.

    All undergraduates in physics and astronomy should have access to significant research experiences. When given the opportunity to tackle challenging open-ended problems outside the classroom, students build their problem-solving skills in ways that better prepare them for the workplace or future research in graduate school. Accelerator-based research on fundamental nuclear and particle physics can provide a myriad of opportunities for undergraduate involvement in hardware and software development as well as 'big data' analysis. The collaborative nature of large experiments exposes students to scientists of every culture and helps them begin to build their professional network even before they graduate. This paper presents an overview of my experiences - the good, the bad, and the ugly - engaging undergraduates in particle and nuclear physics research at the CERN Large Hadron Collider and the Los Alamos Neutron Science Center.

  3. Introduction: the changing face of accelerator target physics and chemistry

    International Nuclear Information System (INIS)

    Sunderland, J.J.

    1992-01-01

    The explosive growth of the small accelerator industry, an offshoot of the expansion of both clinical and research PET imaging, is driving a changing perspective in the field of accelerator targetry. To meet the new demands placed on targetry by the increasingly active and demanding PET institutions it has become necessary to design targets capable of producing large amounts of the four common positron-emitting radionuclides (15O, 13N, 11C, 18F) with unfailing reliability and simplicity. The economic clinical and research survival of PET absolutely relies upon these capabilities. In response to this perceived need, the lion's share of the effort in the field of target physics and chemistry is being directed toward the profuse production of these four common radioisotopes. (author)

  4. CAS course on Advanced Accelerator Physics in Warsaw

    CERN Multimedia

    CERN Accelerator School

    2015-01-01

    The CERN Accelerator School (CAS) and the National Centre for Nuclear Research (NCBJ) recently organised a course on Advanced Accelerator Physics. The course was held in Warsaw, Poland from 27 September to 9 October 2015.    The course followed an established format with lectures in the mornings and practical courses in the afternoons. The lecture programme consisted of 34 lectures, supplemented by private study, tutorials and seminars. The practical courses provided ‘hands-on’ experience of three topics: ‘Beam Instrumentation and Diagnostics’, ‘RF Measurement Techniques’ and ‘Optics Design and Corrections’. Participants selected one of the three courses and followed their chosen topic throughout the duration of the school. Sixty-six students representing 18 nationalities attended this course, with most participants coming from European counties, but also from South Korea, Taiwan and Russia. Feedback from th...

  5. CAS course on advanced accelerator physics in Trondheim, Norway

    CERN Multimedia

    CERN Accelerator School

    2013-01-01

    The CERN Accelerator School (CAS) and the Norwegian University of Science and Technology (NTNU) recently organised a course on advanced accelerator physics. The course was held in Trondheim, Norway, from 18 to 29 August 2013. Accommodation and lectures were at the Hotel Britannia and practical courses were held at the university.   The course's format included lectures in the mornings and practical courses in the afternoons. The lecture programme consisted of 32 lectures supplemented by discussion sessions, private study and tutorials. The practical courses provided "hands-on" experience in three topics: RF measurement techniques, beam instrumentation and diagnostics, and optics design and corrections. Participants selected one of the three courses and followed the chosen topic throughout the course. The programme concluded with seminars and a poster session.  70 students representing 21 nationalities were selected from over 90 applicants, with most participa...

  6. The Computational Physics Program of the national MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1989-01-01

    Since June 1974, the MFE Computer Center has been engaged in a significant computational physics effort. The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The Computational Physics Group has been involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to tokamaks and compact toroids. A third area is the investigation of kinetic instabilities using a 3-D particle code; this work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence have been under examination, with the hope of being able to explain anomalous transport. Also, we are collaborating in an international effort to evaluate fully three-dimensional linear stability of toroidal devices. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers. A summary of these programs is included in this paper. 6 tabs

  7. Delivering Insight The History of the Accelerated Strategic Computing Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Larzelere II, A R

    2007-01-03

    The history of the Accelerated Strategic Computing Initiative (ASCI) tells of the development of computational simulation into a third fundamental piece of the scientific method, on a par with theory and experiment. ASCI did not invent the idea, nor was it alone in bringing it to fruition. But ASCI provided the wherewithal - hardware, software, environment, funding, and, most of all, the urgency - that made it happen. On October 1, 2005, the Initiative completed its tenth year of funding. The advances made by ASCI over its first decade are truly incredible. Lawrence Livermore, Los Alamos, and Sandia National Laboratories, along with leadership provided by the Department of Energy's Defense Programs Headquarters, fundamentally changed computational simulation and how it is used to enable scientific insight. To do this, astounding advances were made in simulation applications, computing platforms, and user environments. ASCI dramatically changed existing - and forged new - relationships, both among the Laboratories and with outside partners. By its tenth anniversary, despite daunting challenges, ASCI had accomplished all of the major goals set at its beginning. The history of ASCI is about the vision, leadership, endurance, and partnerships that made these advances possible.

  8. Application of electrostatic accelerators for nuclear physics studies

    International Nuclear Information System (INIS)

    Kuz'minov, B.D.; Romanov, V.A.; Usachev, L.N.

    1983-01-01

    The data are reviewed on the dynamics of the development of single- and two-stage electrostatic accelerators (ESA) used as a tool for nuclear physics studies in the range of low and medium energies. The wide possibilities of ESA are shown through examples of specific studies in the field of nuclear physics, work on the measurement of nuclear constants to satisfy the needs of nuclear power, and applied studies on nuclear microanalysis. It is concluded that the contribution of studies performed using ESA to the development of present-day concepts of nuclear structure and nuclear reaction kinetics is immeasurably greater than that of any other nuclear-physics tool. ESA have also turned out to be exceptionally useful for solving applied problems and for investigations in different fields of knowledge. Carrying over the techniques of investigation using ESA and nuclear physics concepts to atomic and molecular problems has found application in optical spectroscopy, in Lamb shift investigations in strongly ionized heavy ions, in various experiments on atom-atom and atom-molecule scattering, and in studies of collisions and charge exchange. ESA have contributed to progress in such scientific fields as astrophysics, nuclear physics, solid-state physics, materials science and biophysics.

  9. Computational physics problem solving with Python

    CERN Document Server

    Landau, Rubin H; Bordeianu, Cristian C

    2015-01-01

    The use of computation and simulation has become an essential part of the scientific process. Being able to transform a theory into an algorithm requires significant theoretical insight, detailed physical and mathematical understanding, and a working level of competency in programming. This upper-division text provides an unusually broad survey of the topics of modern computational physics from a multidisciplinary, computational science point of view. Its philosophy is rooted in learning by doing (assisted by many model programs), with new scientific materials as well as with the Python progr

  10. Physical Foundations for Acceleration by Traveling Laser Focus

    International Nuclear Information System (INIS)

    Mikhailichenko, A.A.

    2004-01-01

    In this method, called Travelling Laser Focus (TLF), multi-cell microstructures are scaled down to the laser wavelength size. Each cell in these structures has an opening from the side. A special electro-optical device controllably sweeps the focused laser spot along these openings in accordance with the instantaneous position of the accelerated micro-bunch inside the structure. This arrangement reduces the illumination time for every point on the structure's surface and the power required from the laser. Physical limitations are considered for the most important components of the TLF scheme

  11. Accelerator-driven molten-salt blankets: Physics issues

    International Nuclear Information System (INIS)

    Houts, M.G.; Beard, C.A.; Buksa, J.J.; Davidson, J.W.; Durkee, J.W.; Perry, R.T.; Poston, D.I.

    1994-01-01

    A number of nuclear physics issues concerning the Los Alamos molten-salt, accelerator-driven plutonium converter are discussed. General descriptions of several concepts using internal and external moderation are presented. Burnup and salt processing requirement calculations are presented for four concepts, indicating that both the high power density externally moderated concept and an internally moderated concept achieve total plutonium burnups approaching 90% at salt processing rates of less than 2 m^3 per year. Beginning-of-life reactivity temperature coefficients and system kinetic response are also discussed. Future research should investigate the effect of changing blanket composition on operational and safety characteristics

  12. Accelerator-driven molten-salt blankets: Physics issues

    International Nuclear Information System (INIS)

    Houts, M.G.; Beard, C.A.; Buksa, J.J.; Davidson, J.W.; Durkee, J.W.; Perry, R.T.; Poston, D.I.

    1994-01-01

    A number of nuclear physics issues concerning the Los Alamos molten-salt accelerator-driven plutonium converter are discussed. General descriptions of several concepts using internal and external moderation are presented. Burnup and salt processing requirement calculations are presented for four concepts, indicating that both the high power density externally moderated concept and an internally moderated concept achieve total plutonium burnups approaching 90% at salt processing rates of less than 2 m³ per year. Beginning-of-life reactivity temperature coefficients and system kinetic response are also discussed. Future research should investigate the effect of changing blanket composition on operational and safety characteristics

  13. Proceedings of the workshop on B physics at hadron accelerators

    International Nuclear Information System (INIS)

    McBride, P.; Mishra, C.S.

    1993-01-01

    This report contains papers on the following topics: Measurement of Angle α; Measurement of Angle β; Measurement of Angle γ; Other B Physics; Theory of Heavy Flavors; Charged Particle Tracking and Vertexing; e and γ Detection; Muon Detection; Hadron ID; Electronics, DAQ, and Computing; and Machine Detector Interface. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database

  14. Proceedings of the workshop on B physics at hadron accelerators

    Energy Technology Data Exchange (ETDEWEB)

    McBride, P. [Superconducting Super Collider Lab., Dallas, TX (United States)]; Mishra, C.S. [Fermi National Accelerator Lab., Batavia, IL (United States)] [eds.]

    1993-12-31

    This report contains papers on the following topics: Measurement of Angle {alpha}; Measurement of Angle {beta}; Measurement of Angle {gamma}; Other B Physics; Theory of Heavy Flavors; Charged Particle Tracking and Vertexing; e and {gamma} Detection; Muon Detection; Hadron ID; Electronics, DAQ, and Computing; and Machine Detector Interface. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  15. Fermilab | Particle Physics Division

    Science.gov (United States)


  16. Computing in high-energy physics

    International Nuclear Information System (INIS)

    Mount, Richard P.

    2016-01-01

    I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Lastly, I describe recent developments aimed at improving the overall coherence of high-energy physics software

  17. Computing in high-energy physics

    Science.gov (United States)

    Mount, Richard P.

    2016-04-01

    I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Finally, I describe recent developments aimed at improving the overall coherence of high-energy physics software.

  18. ACE3P Computations of Wakefield Coupling in the CLIC Two-Beam Accelerator

    International Nuclear Information System (INIS)

    Candel, Arno

    2010-01-01

    The Compact Linear Collider (CLIC) provides a path to a multi-TeV accelerator to explore the energy frontier of High Energy Physics. Its novel two-beam accelerator concept envisions rf power transfer to the accelerating structures from a separate high-current decelerator beam line consisting of power extraction and transfer structures (PETS). It is critical to numerically verify the fundamental and higher-order mode properties in and between the two beam lines with high accuracy and confidence. To solve these large-scale problems, SLAC's parallel finite element electromagnetic code suite ACE3P is employed. Using curvilinear conformal meshes and higher-order finite element vector basis functions, unprecedented accuracy and computational efficiency are achieved, enabling high-fidelity modeling of complex detuned structures such as the CLIC TD24 accelerating structure. In this paper, time-domain simulations of wakefield coupling effects in the combined system of PETS and the TD24 structures are presented. The results will help to identify potential issues and provide new insights on the design, leading to further improvements on the novel CLIC two-beam accelerator scheme.

  19. Multi-GPU Jacobian accelerated computing for soft-field tomography

    International Nuclear Information System (INIS)

    Borsic, A; Attardo, E A; Halter, R J

    2012-01-01

    Image reconstruction in soft-field tomography is based on an inverse problem formulation, where a forward model is fitted to the data. In medical applications, where the anatomy presents complex shapes, it is common to use finite element models (FEMs) to represent the volume of interest and solve a partial differential equation that models the physics of the system. Over the last decade, there has been a shifting interest from 2D modeling to 3D modeling, as the underlying physics of most problems are 3D. Although the increased computational power of modern computers allows working with much larger FEM models, the computational time required to reconstruct 3D images on a fine 3D FEM model can be significant, on the order of hours. For example, in electrical impedance tomography (EIT) applications using a dense 3D FEM mesh with half a million elements, a single reconstruction iteration takes approximately 15–20 min with optimized routines running on a modern multi-core PC. It is desirable to accelerate image reconstruction to enable researchers to more easily and rapidly explore data and reconstruction parameters. Furthermore, providing high-speed reconstructions is essential for some promising clinical application of EIT. For 3D problems, 70% of the computing time is spent building the Jacobian matrix, and 25% of the time in forward solving. In this work, we focus on accelerating the Jacobian computation by using single and multiple GPUs. First, we discuss an optimized implementation on a modern multi-core PC architecture and show how computing time is bounded by the CPU-to-memory bandwidth; this factor limits the rate at which data can be fetched by the CPU. Gains associated with the use of multiple CPU cores are minimal, since data operands cannot be fetched fast enough to saturate the processing power of even a single CPU core. GPUs have much faster memory bandwidths compared to CPUs and better parallelism. We are able to obtain acceleration factors of 20 times
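
    For piecewise-linear finite elements, the Jacobian that dominates the run time reduces to per-element dot products of forward and adjoint field gradients, which is why the computation is limited by memory bandwidth and maps well onto GPUs. The NumPy sketch below is a schematic illustration of that assembly (array and function names are assumptions, not code from the paper); on a GPU the same expression could be evaluated with, e.g., CuPy arrays:

        import numpy as np

        def eit_jacobian(grad_u_drive, grad_u_meas, elem_volumes):
            """Adjoint-field Jacobian of a soft-field forward model (illustrative sketch).

            grad_u_drive : (n_drive, n_elem, 3) per-element gradients of the forward fields
            grad_u_meas  : (n_meas,  n_elem, 3) per-element gradients of the adjoint fields
            elem_volumes : (n_elem,)            element volumes
            Returns J with one row per drive/measurement pair and one column per element.
            """
            # On piecewise-linear elements the gradients are constant, so the sensitivity
            # integral over element k reduces to J[(i, j), k] = -vol_k * (grad u_i . grad v_j).
            dots = np.einsum('iek,jek->ije', grad_u_drive, grad_u_meas)
            return -(dots * elem_volumes).reshape(-1, elem_volumes.size)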

  20. Multi-GPU Jacobian accelerated computing for soft-field tomography.

    Science.gov (United States)

    Borsic, A; Attardo, E A; Halter, R J

    2012-10-01

    Image reconstruction in soft-field tomography is based on an inverse problem formulation, where a forward model is fitted to the data. In medical applications, where the anatomy presents complex shapes, it is common to use finite element models (FEMs) to represent the volume of interest and solve a partial differential equation that models the physics of the system. Over the last decade, there has been a shifting interest from 2D modeling to 3D modeling, as the underlying physics of most problems are 3D. Although the increased computational power of modern computers allows working with much larger FEM models, the computational time required to reconstruct 3D images on a fine 3D FEM model can be significant, on the order of hours. For example, in electrical impedance tomography (EIT) applications using a dense 3D FEM mesh with half a million elements, a single reconstruction iteration takes approximately 15-20 min with optimized routines running on a modern multi-core PC. It is desirable to accelerate image reconstruction to enable researchers to more easily and rapidly explore data and reconstruction parameters. Furthermore, providing high-speed reconstructions is essential for some promising clinical application of EIT. For 3D problems, 70% of the computing time is spent building the Jacobian matrix, and 25% of the time in forward solving. In this work, we focus on accelerating the Jacobian computation by using single and multiple GPUs. First, we discuss an optimized implementation on a modern multi-core PC architecture and show how computing time is bounded by the CPU-to-memory bandwidth; this factor limits the rate at which data can be fetched by the CPU. Gains associated with the use of multiple CPU cores are minimal, since data operands cannot be fetched fast enough to saturate the processing power of even a single CPU core. GPUs have much faster memory bandwidths compared to CPUs and better parallelism. We are able to obtain acceleration factors of 20

  1. Accelerated ions as a tool in atomic physics

    International Nuclear Information System (INIS)

    Hansteen, J.M.

    1977-01-01

    Some of the aspects of atomic physics which are being brought into focus by the construction and completion of a new generation of heavy-ion accelerators are dealt with. Various types of processes occurring in the overlapping electron clouds are visualised in an elementary way, using, among others, some recent observations on the formation of quasi-molecules and quasi-atoms. Phenomena connected with the inner electron shells in superheavy atoms are touched upon, in particular those processes possibly leading to the production of positrons. In such cases the crucial importance of an atomic Coulomb excitation mechanism is stressed. In conclusion, the view is emphasized that inner-shell ionization phenomena in heavy-ion collisions form a bridge between processes originating respectively from nuclear and atomic physics. (Auth.)

  2. European Strategy for Accelerator-Based Neutrino Physics

    CERN Document Server

    Bertolucci, Sergio; Cervera, Anselmo; Donini, Andrea; Dracos, Marcos; Duchesneau, Dominique; Dufour, Fanny; Edgecock, Rob; Efthymiopoulos, Ilias; Gschwendtner, Edda; Kudenko, Yury; Long, Ken; Maalampi, Jukka; Mezzetto, Mauro; Pascoli, Silvia; Palladino, Vittorio; Rondio, Ewa; Rubbia, Andre; Rubbia, Carlo; Stahl, Achim; Stanco, Luca; Thomas, Jenny; Wark, David; Wildner, Elena; Zito, Marco

    2012-01-01

    Massive neutrinos reveal physics beyond the Standard Model, which could have deep consequences for our understanding of the Universe. Their study should therefore receive the highest level of priority in the European Strategy. The discovery and study of leptonic CP violation and precision studies of the transitions between neutrino flavours require high intensity, high precision, long baseline accelerator neutrino experiments. The community of European neutrino physicists involved in oscillation experiments is strong enough to support a major neutrino long baseline project in Europe, and has an ambitious, competitive and coherent vision to propose. Following the 2006 European Strategy for Particle Physics (ESPP) recommendations, two complementary design studies have been carried out: LAGUNA/LBNO, focused on deep underground detector sites, and EUROnu, focused on high intensity neutrino facilities. LAGUNA LBNO recommends, as first step, a conventional neutrino beam CN2PY from a CERN SPS North Area Neutrino Fac...

  3. Computational-physics program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1982-02-01

    The computational physics group is involved in several areas of fusion research. One main area is the application of multidimensional Fokker-Planck, transport and combined Fokker-Planck/transport codes to both toroidal and mirror devices. Another major area is the investigation of linear and nonlinear resistive magnetohydrodynamics in two and three dimensions, with applications to all types of fusion devices. The MHD work is often coupled with the task of numerically generating equilibria which model experimental devices. In addition to these computational physics studies, investigations of more efficient numerical algorithms are being carried out

  4. Accelerating Development of EV Batteries Through Computer-Aided Engineering (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Pesaran, A.; Kim, G. H.; Smith, K.; Santhanagopalan, S.

    2012-12-01

    The Department of Energy's Vehicle Technology Program has launched the Computer-Aided Engineering for Automotive Batteries (CAEBAT) project to work with national labs, industry and software vendors to develop sophisticated software. As coordinator, NREL has teamed with a number of companies to help improve and accelerate battery design and production. This presentation provides an overview of CAEBAT, including its predictive computer simulation of Li-ion batteries known as the Multi-Scale Multi-Dimensional (MSMD) model framework. MSMD's modular, flexible architecture connects the physics of battery charge/discharge processes, thermal control, safety and reliability in a computationally efficient manner. This allows independent development of submodels at the cell and pack levels.

  5. Computer simulation of 2-D and 3-D ion beam extraction and acceleration

    Energy Technology Data Exchange (ETDEWEB)

    Ido, Shunji; Nakajima, Yuji [Saitama Univ., Urawa (Japan). Faculty of Engineering

    1997-03-01

    The two-dimensional code and the three-dimensional code have been developed to study the physical features of the ion beams in the extraction and acceleration stages. By using the two-dimensional code, the design of the first electrode (plasma grid) is examined with regard to the beam divergence. In the computational studies using the three-dimensional code, an off-axis model of the ion beam is investigated. It is found that the deflection angle of the ion beam is proportional to the gap displacement of the electrodes. (author)

  6. Sophistication of computational science and fundamental physics simulations

    International Nuclear Information System (INIS)

    Ishiguro, Seiji; Ito, Atsushi; Usami, Shunsuke; Ohtani, Hiroaki; Sakagami, Hitoshi; Toida, Mieko; Hasegawa, Hiroki; Horiuchi, Ritoku; Miura, Hideaki

    2016-01-01

    The numerical experimental reactor research project is composed of the following studies: (1) nuclear fusion simulation research with a focus on specific physical phenomena of specific equipment, (2) research on advanced simulation methods to increase predictability or expand the application range of simulation, (3) visualization as the foundation of simulation research, (4) research on advanced computational science such as parallel computing technology, and (5) research aiming at the elucidation of fundamental physical phenomena not limited to specific devices. Specifically, a wide range of research with medium- to long-term perspectives is being developed: (1) virtual reality visualization, (2) upgrading of computational science such as multilayer simulation methods, (3) kinetic behavior of plasma blobs, (4) extended MHD theory and simulation, (5) basic plasma processes such as particle acceleration due to wave-particle interaction, and (6) research related to laser plasma fusion. This paper reviews the following items: (1) simultaneous visualization in virtual reality space, (2) multilayer simulation of collisionless magnetic reconnection, (3) simulation of the microscopic dynamics of plasma coherent structures, (4) Hall MHD simulation of the LHD, (5) numerical analysis for the extension of MHD equilibrium and stability theory, (6) extended MHD simulation of the 2D RT instability, (7) simulation of laser plasma, (8) simulation of shock waves and particle acceleration, and (9) a study on the simulation of homogeneous isotropic MHD turbulent flow. (A.O.)

  7. Nanostructure symmetry: Relevance for physics and computing

    International Nuclear Information System (INIS)

    Dupertuis, Marc-André; Oberli, D. Y.; Karlsson, K. F.; Dalessi, S.; Gallinet, B.; Svendsen, G.

    2014-01-01

    We review the research done in recent years in our group on the effects of nanostructure symmetry, and outline its relevance both for nanostructure physics and for computations of their electronic and optical properties. The examples of C3v and C2v quantum dots are used. A number of surprises and non-trivial aspects are outlined, and a few symmetry-based tools for computing and analysis are briefly presented

  8. Nanostructure symmetry: Relevance for physics and computing

    Energy Technology Data Exchange (ETDEWEB)

    Dupertuis, Marc-André; Oberli, D. Y. [Laboratory for Physics of Nanostructure, EPF Lausanne (Switzerland); Karlsson, K. F. [Department of Physics, Chemistry, and Biology (IFM), Linköping University (Sweden); Dalessi, S. [Computational Biology Group, Department of Medical Genetics, University of Lausanne (Switzerland); Gallinet, B. [Nanophotonics and Metrology Laboratory, EPF Lausanne (Switzerland); Svendsen, G. [Dept. of Electronics and Telecom., Norwegian University of Science and Technology, Trondheim (Norway)

    2014-03-31

    We review the research done in recent years in our group on the effects of nanostructure symmetry, and outline its relevance both for nanostructure physics and for computations of their electronic and optical properties. The examples of C3v and C2v quantum dots are used. A number of surprises and non-trivial aspects are outlined, and a few symmetry-based tools for computing and analysis are briefly presented.

  9. Computational models in physics teaching: a framework

    Directory of Open Access Journals (Sweden)

    Marco Antonio Moreira

    2012-08-01

    Full Text Available The purpose of the present paper is to present a theoretical framework to promote and assist meaningful physics learning through computational models. Our proposal is based on the use of a tool, the AVM diagram, to design educational activities involving modeling and computer simulations. The idea is to provide a starting point for the construction and implementation of didactical approaches grounded in a coherent epistemological view about scientific modeling.

  10. The computational physics program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1988-01-01

    The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The computational physics group is involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to compact toroids. Another major area is the investigation of kinetic instabilities using a 3-D particle code. This work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence are being examined. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers

  11. Computer codes and methods for simulating accelerator driven systems

    International Nuclear Information System (INIS)

    Sartori, E.; Byung Chan Na

    2003-01-01

    A large set of computer codes and associated data libraries have been developed by nuclear research and industry over the past half century. A large number of them are in the public domain and can be obtained under agreed conditions from different Information Centres. The areas covered comprise: basic nuclear data and models, reactor spectra and cell calculations, static and dynamic reactor analysis, criticality, radiation shielding, dosimetry and material damage, fuel behaviour, safety and hazard analysis, heat conduction and fluid flow in reactor systems, spent fuel and waste management (handling, transportation, and storage), economics of fuel cycles, impact on the environment of nuclear activities etc. These codes and models have been developed mostly for critical systems used for research or power generation and other technological applications. Many of them have not been designed for accelerator driven systems (ADS), but with competent use, they can be used for studying such systems or can form the basis for adapting existing methods to the specific needs of ADSs. The present paper describes the types of methods, codes and associated data available and their role in the applications. It provides Web addresses for facilitating searches for such tools. Some indications are given of the effects of inappropriate or 'blind' use of existing tools for ADS. Reference is made to available experimental data that can be used for validating the use of these methods. Finally, some international activities linked to the different computational aspects are described briefly. (author)

  12. Physics, Computer Science and Mathematics Division annual report, 1 January--31 December 1975

    International Nuclear Information System (INIS)

    Lepore, J.L.

    1975-01-01

    This annual report describes the scientific research and other work carried out during the calendar year 1975. The report is nontechnical in nature, with almost no data. A 17-page bibliography lists the technical papers which detail the work. The contents of the report include the following: experimental physics (high-energy physics--SPEAR, PEP, SLAC, FNAL, BNL, Bevatron; particle data group; medium-energy physics; astrophysics, astronomy, and cosmic rays; instrumentation development), theoretical physics (particle theory and accelerator theory and design), computer science and applied mathematics (data management systems, socio-economic environment demographic information system, computer graphics, computer networks, management information systems, computational physics and data analysis, mathematical modeling, programming languages, applied mathematics research), real-time systems (ModComp and PDP networks), and computer center activities (systems programming, user services, hardware development, computer operations). A glossary of computer science and mathematics terms is also included. 32 figures

  13. Physics, Computer Science and Mathematics Division annual report, 1 January--31 December 1975. [LBL

    Energy Technology Data Exchange (ETDEWEB)

    Lepore, J.L. (ed.)

    1975-01-01

    This annual report describes the scientific research and other work carried out during the calendar year 1975. The report is nontechnical in nature, with almost no data. A 17-page bibliography lists the technical papers which detail the work. The contents of the report include the following: experimental physics (high-energy physics--SPEAR, PEP, SLAC, FNAL, BNL, Bevatron; particle data group; medium-energy physics; astrophysics, astronomy, and cosmic rays; instrumentation development), theoretical physics (particle theory and accelerator theory and design), computer science and applied mathematics (data management systems, socio-economic environment demographic information system, computer graphics, computer networks, management information systems, computational physics and data analysis, mathematical modeling, programming languages, applied mathematics research), real-time systems (ModComp and PDP networks), and computer center activities (systems programming, user services, hardware development, computer operations). A glossary of computer science and mathematics terms is also included. 32 figures. (RWR)

  14. ASP2012: Fundamental Physics and Accelerator Sciences in Africa

    Science.gov (United States)

    Darve, Christine

    2012-02-01

    Much remains to be done to improve education and scientific research in Africa. Supported by the international scientific community, our initiative has been to contribute to fostering science in sub-Saharan Africa by establishing a biennial school on fundamental subatomic physics and its applications. The school is based on a close interplay between theoretical, experimental, and applied physics. The lectures are addressed to students or young researchers with a background of at least four years of university education. The aim of the school is to develop the capacity to interpret and capitalize on the results of current and future physics experiments with particle accelerators, thereby spreading education for innovation in related applications and technologies, such as medicine and information science. Following the worldwide success of the first edition of the school, which gathered 65 students for three weeks in Stellenbosch (South Africa) in August 2010, the second edition will be hosted in Ghana from July 15 to August 4, 2012. The school is a non-profit organization, which provides partial or full financial support to 50 of the selected students, with priority given to sub-Saharan African students.

  15. The Physics Perspectives at the Future Accelerator Facility FAIR

    CERN Document Server

    Stroth, J

    2004-01-01

    The physics perspective of the approved future international accelerator Facility for Antiproton and Ion Research (FAIR) near Darmstadt, Germany will be outlined. The physics programme will comprise many-body aspects of matter, ranging from macroscopic systems such as highly correlated plasmas down to the properties of baryons and nuclear matter at high baryon densities. Through fragmentation of intense ion beams, investigations with beams of short-lived radioactive nuclei far from stability will be possible. The physics questions addressed concern nuclear structure at the drip lines, areas of astrophysics and nucleosynthesis in supernovae and other stellar processes, as well as tests of fundamental symmetries. The structure of baryons and the limits of their existence are the focus of the two large experimental set-ups PANDA and CBM. Finally, QED will be studied in extremely strong fields, as will the interaction of ions with matter. The future facility will feature a double-ring synchrotron SIS100/300 a...

  16. Quantum algorithms for computational nuclear physics

    Directory of Open Access Journals (Sweden)

    Višňák Jakub

    2015-01-01

    Full Text Available While quantum algorithms have been studied as an efficient tool for stationary-state energy determination in the case of molecular quantum systems, no similar study of analogous problems in computational nuclear physics (computation of energy levels of nuclei from empirical nucleon-nucleon or quark-quark potentials) has been realized yet. Although the difference between the above-mentioned studies might seem negligible, it will be examined. First steps towards a particular simulation (on a classical computer) of the Iterative Phase Estimation Algorithm for deuterium and tritium nuclei energy level computation will be carried out, with the aim of proving algorithm feasibility (and extensibility to heavier nuclei) for its possible practical realization on a real quantum computer.
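
    The Iterative Phase Estimation Algorithm mentioned above can be simulated classically in a few lines for the idealized, noise-free case; the sketch below (an illustration, not the authors' code) recovers an m-bit eigenphase one binary digit at a time, starting from the least significant digit:

        import numpy as np

        def ipea(phi, m):
            """Noise-free classical simulation of iterative phase estimation for a phase
            phi that is exactly representable with m binary digits."""
            bits = [0] * (m + 1)                  # bits[k] will hold the k-th binary digit of phi
            for k in range(m, 0, -1):             # least-significant digit first
                # phase kicked back onto the ancilla by the controlled-U^(2^(k-1)) gate
                kicked = 2.0 * np.pi * (2 ** (k - 1)) * phi
                # feedback rotation removing the contribution of already-measured digits
                feedback = -2.0 * np.pi * sum(bits[j] / 2 ** (j - k + 1) for j in range(k + 1, m + 1))
                # after the final Hadamard, P(measure 0) = cos^2((kicked + feedback) / 2)
                p0 = np.cos((kicked + feedback) / 2.0) ** 2
                bits[k] = 0 if p0 > 0.5 else 1
            return sum(bits[k] / 2 ** k for k in range(1, m + 1))

        print(ipea(0.625, 3))   # recovers 0.625 (binary 0.101)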

  17. Physical condition for the slowing down of cosmic acceleration

    Directory of Open Access Journals (Sweden)

    Ming-Jian Zhang

    2018-04-01

    Full Text Available The possible slowing down of cosmic acceleration has been widely studied. However, judgment on this effect in different dark energy parameterizations was very ambiguous. Moreover, the reason for these uncertainties was still unknown. In the present paper, we analyze the derivative of the deceleration parameter q′(z) using Gaussian processes. This model-independent reconstruction suggests that no slowing down of acceleration is present within 95% C.L. from the Union2.1 and JLA supernova data. However, q′(z) from the observational H(z) data is a little smaller than zero at 95% C.L., which indicates that future H(z) data may have the potential to test this effect. From the evolution of q′(z), we present an interesting constraint on dark energy and observational data. The physical constraint clearly solves the problem of why some dark energy models cannot produce this effect in previous work. Comparison between the constraint and observational data also shows that most of the current data are not in the allowed regions. This suggests a reason why current data cannot convincingly measure this effect.
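
    A simplified version of such a reconstruction can be sketched as follows: fit a Gaussian process to H(z) data, form q(z) = (1+z)H′(z)/H(z) − 1, and inspect the sign of q′(z). The H(z) points below are placeholders, and the derivative of the GP mean is taken numerically rather than propagated through the GP posterior as in a full analysis:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel

        # Placeholder (z, H, sigma_H) points standing in for a real H(z) compilation.
        z_obs = np.array([0.07, 0.17, 0.27, 0.40, 0.57, 0.78, 1.04, 1.43, 1.75])
        H_obs = np.array([69.0, 83.0, 77.0, 95.0, 92.4, 105.0, 154.0, 177.0, 202.0])
        sig_H = np.array([19.6, 8.0, 14.0, 17.0, 4.5, 12.0, 20.0, 18.0, 40.0])

        gp = GaussianProcessRegressor(kernel=ConstantKernel(1e4) * RBF(length_scale=1.0),
                                      alpha=sig_H**2, normalize_y=True)
        gp.fit(z_obs[:, None], H_obs)

        z = np.linspace(0.0, 1.7, 400)
        H = gp.predict(z[:, None])
        dHdz = np.gradient(H, z)                 # numerical derivative of the GP mean
        q = (1.0 + z) * dHdz / H - 1.0           # deceleration parameter q(z)
        dqdz = np.gradient(q, z)                 # q'(z) > 0 at low z would signal slowing down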

  18. CAS Introduction to Accelerator Physics in the Czech Republic

    CERN Multimedia

    CERN Accelerator School

    2014-01-01

    The CERN Accelerator School (CAS) and the Czech Technical University in Prague jointly organised the Introduction to Accelerator Physics course in Prague, Czech Republic from 31 August to 12 September 2014.   The course was held in the Hotel Don Giovanni on the outskirts of the city, and was attended by 111 participants of 29 nationalities, from countries as far away as Armenia, Argentina, Canada, Iceland, Thailand and Russia. The intensive programme comprised 41 lectures, 3 seminars, 4 tutorials and 6 hours of guided and private study. A poster session and a 1-minute/1-slide session were also included in the programme, where the students were able to present their work. Feedback from the students was very positive, praising the expertise of the lecturers, as well as the high standard and quality of their lectures. During the second week, the afternoon lectures were held in the Czech Technical University in Prague. In addition to the academic programme, the students had the opportunity to vis...

  19. Physical condition for the slowing down of cosmic acceleration

    Science.gov (United States)

    Zhang, Ming-Jian; Xia, Jun-Qing

    2018-04-01

    The possible slowing down of cosmic acceleration has been widely studied. However, judgment on this effect in different dark energy parameterizations was very ambiguous. Moreover, the reason for these uncertainties was still unknown. In the present paper, we analyze the derivative of the deceleration parameter q′(z) using Gaussian processes. This model-independent reconstruction suggests that no slowing down of acceleration is present within 95% C.L. from the Union2.1 and JLA supernova data. However, q′(z) from the observational H(z) data is a little smaller than zero at 95% C.L., which indicates that future H(z) data may have the potential to test this effect. From the evolution of q′(z), we present an interesting constraint on dark energy and observational data. The physical constraint clearly solves the problem of why some dark energy models cannot produce this effect in previous work. Comparison between the constraint and observational data also shows that most of the current data are not in the allowed regions. This suggests a reason why current data cannot convincingly measure this effect.

  20. The Scanning Electron Microscope As An Accelerator For The Undergraduate Advanced Physics Laboratory

    International Nuclear Information System (INIS)

    Peterson, Randolph S.; Berggren, Karl K.; Mondol, Mark

    2011-01-01

    Few universities or colleges have an accelerator for use with advanced physics laboratories, but many of these institutions have a scanning electron microscope (SEM) on site, often in the biology department. As an accelerator for the undergraduate advanced physics laboratory, the SEM is an excellent substitute for an ion accelerator. Although there are no nuclear physics experiments that can be performed with a typical 30 kV SEM, there is an opportunity for experimental work on accelerator physics, atomic physics, electron-solid interactions, and the basics of modern e-beam lithography.

  1. International Linear Collider Accelerator Physics R and D

    International Nuclear Information System (INIS)

    Gollin, George D.; Davidsaver, Michael; Haney, Michael J.; Kasten, Michael; Chang, Jason; Chodash, Perry; Dluger, Will; Lang, Alex; Liu, Yehan

    2008-01-01

    to an event at the beginning of the run. We determined that the device installed in our beam, which was instrumented with an 8-bit 500 MHz ADC, could measure the beam timing to an accuracy of 0.4 picoseconds. Simulations of the device showed that an increase in ADC clock rate to 2 GHz would improve measurement precision by the required factor of four. As a result, we felt that a device of this sort, assuming matters concerning dynamic range and long-term stability can be addressed successfully, would work at the ILC. Cost-effective operation of the ILC will demand highly reliable, fault-tolerant and adaptive solutions for both hardware and software. The large numbers of subsystems and the large multipliers associated with the modules in those subsystems will cause even a strong level of unit reliability to yield an unacceptable level of system availability. An effort is underway to evaluate standards associated with high availability, and to guide ILC development with standard practices and well-supported commercial solutions. One area of evaluation involves the Advanced Telecom Computing Architecture (ATCA) hardware and software. We worked with an ATCA crate, processor monitors, and a small number of ATCA circuit boards in order to develop a backplane 'spy' board that would let us watch the ATCA backplane communications and pursue development of an inexpensive processor monitor that could be used as a physics-driven component of the crate-level controls system. We made good progress, and felt that we had determined a productive direction in which to extend this work. We felt that we had learned enough to begin designing a workable processor monitor chip if there were to be sufficient interest in ATCA shown by the ILC community. Fault recognition is a challenging issue in crafting a high-reliability controls system. With tens of thousands of independent processors running hundreds of thousands of critical processes, how can the system identify that a problem has

  2. Modern computer networks and distributed intelligence in accelerator controls

    International Nuclear Information System (INIS)

    Briegel, C.

    1991-01-01

    Appropriate hardware and software network protocols are surveyed for accelerator control environments. Accelerator controls network topologies are discussed with respect to the following criteria: vertical versus horizontal and distributed versus centralized. Decision-making considerations are provided for accelerator network architecture specification. Current trends and implementations at Fermilab are discussed

  3. Computer Algebra Recipes for Mathematical Physics

    CERN Document Server

    Enns, Richard H

    2005-01-01

    Over two hundred novel and innovative computer algebra worksheets or "recipes" will enable readers in engineering, physics, and mathematics to easily and rapidly solve and explore most problems they encounter in their mathematical physics studies. While the aim of this text is to illustrate applications, a brief synopsis of the fundamentals for each topic is presented, the topics being organized to correlate with those found in traditional mathematical physics texts. The recipes are presented in the form of stories and anecdotes, a pedagogical approach that makes a mathematically challenging subject easier and more fun to learn. Key features: * Uses the MAPLE computer algebra system to allow the reader to easily and quickly change the mathematical models and the parameters and then generate new answers * No prior knowledge of MAPLE is assumed; the relevant MAPLE commands are introduced on a need-to-know basis * All MAPLE commands are indexed for easy reference * A classroom-tested story/anecdote format is use...

  4. Topics in radiation at accelerators: Radiation physics for personnel and environmental protection

    International Nuclear Information System (INIS)

    Cossairt, J.D.

    1996-10-01

    In the first chapter, terminology, physical and radiological quantities, and units of measurement used to describe the properties of accelerator radiation fields are reviewed. The general considerations of primary radiation fields pertinent to accelerators are discussed. The primary radiation fields produced by electron beams are described qualitatively and quantitatively. In the same manner the primary radiation fields produced by proton and ion beams are described. Subsequent chapters describe: shielding of electrons and photons at accelerators; shielding of proton and ion accelerators; low energy prompt radiation phenomena; induced radioactivity at accelerators; topics in radiation protection instrumentation at accelerators; and accelerator radiation protection program elements

  5. Grid computing in high energy physics

    CERN Document Server

    Avery, P

    2004-01-01

    Over the next two decades, major high energy physics (HEP) experiments, particularly at the Large Hadron Collider, will face unprecedented challenges to achieving their scientific potential. These challenges arise primarily from the rapidly increasing size and complexity of HEP datasets that will be collected and the enormous computational, storage and networking resources that will be deployed by global collaborations in order to process, distribute and analyze them. Coupling such vast information technology resources to globally distributed collaborations of several thousand physicists requires extremely capable computing infrastructures supporting several key areas: (1) computing (providing sufficient computational and storage resources for all processing, simulation and analysis tasks undertaken by the collaborations); (2) networking (deploying high speed networks to transport data quickly between institutions around the world); (3) software (supporting simple and transparent access to data and software r...

  6. Neurons compute internal models of the physical laws of motion.

    Science.gov (United States)

    Angelaki, Dora E; Shaikh, Aasef G; Green, Andrea M; Dickman, J David

    2004-07-29

    A critical step in self-motion perception and spatial awareness is the integration of motion cues from multiple sensory organs that individually do not provide an accurate representation of the physical world. One of the best-studied sensory ambiguities is found in visual processing, and arises because of the inherent uncertainty in detecting the motion direction of an untextured contour moving within a small aperture. A similar sensory ambiguity arises in identifying the actual motion associated with linear accelerations sensed by the otolith organs in the inner ear. These internal linear accelerometers respond identically during translational motion (for example, running forward) and gravitational accelerations experienced as we reorient the head relative to gravity (that is, head tilt). Using new stimulus combinations, we identify here cerebellar and brainstem motion-sensitive neurons that compute a solution to the inertial motion detection problem. We show that the firing rates of these populations of neurons reflect the computations necessary to construct an internal model representation of the physical equations of motion.
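
    The internal-model computation described above can be caricatured as follows: track the gravity vector using the rotation signal and subtract it from the otolith signal to recover translation. The sketch below is schematic, under an assumed sign convention (gravito-inertial signal = gravity + linear acceleration, both expressed in head coordinates), and is not a model fitted to the neural data:

        import numpy as np

        def estimate_translation(gia, omega, g0, dt):
            """Infer linear acceleration from otolith-like and canal-like signals.

            gia   : (N, 3) gravito-inertial signal, assumed = gravity + linear acceleration (head frame)
            omega : (N, 3) rotational velocity (head frame, rad/s)
            g0    : (3,)   initial gravity vector in head coordinates
            dt    : sample interval in seconds
            """
            g = np.asarray(g0, dtype=float)
            a_lin = np.empty_like(gia, dtype=float)
            for i in range(len(gia)):
                a_lin[i] = gia[i] - g                   # subtract the internally tracked gravity
                g = g - np.cross(omega[i], g) * dt      # rotate the gravity estimate with the head
            return a_lin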

  7. Particle accelerators and the progress of particle physics

    CERN Document Server

    Mangano, Michelangelo

    2016-01-01

    The following sections are included: • The Standard Model of fundamental interactions • Accelerators, and the experimental path towards the standard model • Complementarity and synergy of different accelerator facilities • The future challenges

  8. Computation of Normal Conducting and Superconducting Linear Accelerator (LINAC) Availabilities

    International Nuclear Information System (INIS)

    Haire, M.J.

    2000-01-01

    A brief study was conducted to roughly estimate the availability of a superconducting (SC) linear accelerator (LINAC) as compared to a normal conducting (NC) one. Potentially, SC radio frequency cavities have substantial reserve capability, which allows them to compensate for failed cavities, thus increasing the availability of the overall LINAC. In the initial SC design, there is a klystron and associated equipment (e.g., power supply) for every cavity of an SC LINAC. On the other hand, a single klystron may service eight cavities in the NC LINAC. This study modeled that portion of the Spallation Neutron Source LINAC (between 200 and 1,000 MeV) that is initially proposed for conversion from NC to SC technology. Equipment common to both designs was not evaluated. Tabular fault-tree calculations and computer event-driven simulation (EDS) computations were performed. The estimated gain in availability when using the SC option ranges from 3 to 13% under certain equipment conditions and spatial separation requirements. The availability of an NC LINAC is estimated to be 83%. Tabular fault-tree calculations and computer EDS modeling gave the same 83% answer to within one-tenth of a percent for the NC case. Tabular fault-tree calculations of the availability of the SC LINAC (where a klystron and associated equipment drive a single cavity) give 97%, whereas EDS computer calculations give 96%, a disagreement of only 1%. This result may be somewhat fortuitous because of limitations of tabular fault-tree calculations. For example, tabular fault-tree calculations cannot handle spatial effects (separation distance between failures), equipment network configurations, and some failure combinations. Various equipment configurations were examined with EDS computer modeling. When there is a klystron and associated equipment for every cavity and adjacent-cavity failure can be tolerated, the SC availability was estimated to be 96%. SC availability decreased as
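
    The reserve-capability argument can be made quantitative with elementary reliability arithmetic: an NC-style system needs every RF station to work, whereas an SC-style system with gradient reserve only needs k of n stations. The sketch below uses purely illustrative unit availabilities and station counts, not numbers from the study:

        from math import comb

        def series_availability(a_unit, n):
            """Availability when all n units must work (no reserve)."""
            return a_unit ** n

        def k_of_n_availability(a_unit, k, n):
            """Availability when any k of n identical units suffice (reserve capability)."""
            return sum(comb(n, m) * a_unit**m * (1 - a_unit)**(n - m) for m in range(k, n + 1))

        a_rf = 0.999                                   # illustrative per-station availability
        print(series_availability(a_rf, 96))           # NC-like: every RF station required (~0.91)
        print(k_of_n_availability(a_rf, 92, 96))       # SC-like: four stations' worth of reserve (~1.0)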

  9. GRID computing for experimental high energy physics

    International Nuclear Information System (INIS)

    Moloney, G.R.; Martin, L.; Seviour, E.; Taylor, G.N.; Moorhead, G.F.

    2002-01-01

    Full text: The Large Hadron Collider (LHC), to be completed at the CERN laboratory in 2006, will generate 11 petabytes of data per year. The processing of this large data stream requires a large, distributed computing infrastructure. A recent innovation in high performance distributed computing, the GRID, has been identified as an important tool in data analysis for the LHC. GRID computing has actual and potential application in many fields which require computationally intensive analysis of large, shared data sets. The Australian experimental High Energy Physics community has formed partnerships with the High Performance Computing community to establish a GRID node at the University of Melbourne. Through Australian membership of the ATLAS experiment at the LHC, Australian researchers have an opportunity to be involved in the European DataGRID project. This presentation will include an introduction to the GRID, and its application to experimental High Energy Physics. We will present the results of our studies, including participation in the first LHC data challenge

  10. Smolyak's algorithm: A powerful black box for the acceleration of scientific computations

    KAUST Repository

    Tempone, Raul

    2017-03-26

    We provide a general discussion of Smolyak's algorithm for the acceleration of scientific computations. The algorithm first appeared in Smolyak's work on multidimensional integration and interpolation. Since then, it has been generalized in multiple directions and has been associated with the keywords: sparse grids, hyperbolic cross approximation, combination technique, and multilevel methods. Variants of Smolyak's algorithm have been employed in the computation of high-dimensional integrals in finance, chemistry, and physics, in the numerical solution of partial and stochastic differential equations, and in uncertainty quantification. Motivated by this broad and ever-increasing range of applications, we describe a general framework that summarizes fundamental results and assumptions in a concise application-independent manner.
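
    The combination technique behind Smolyak's algorithm can be stated compactly: with zero-based level multi-indices l, A(L,d) = Σ (−1)^(L−|l|) C(d−1, L−|l|) Q_{l1} ⊗ … ⊗ Q_{ld}, summed over max(0, L−d+1) ≤ |l| ≤ L. A minimal Python sketch (illustrative only, built on simple trapezoidal rules):

        import numpy as np
        from itertools import product
        from math import comb

        def trap_rule(level):
            """1-D trapezoidal rule on [0, 1] with 2**level + 1 points."""
            n = 2 ** level + 1
            x = np.linspace(0.0, 1.0, n)
            w = np.full(n, 1.0 / (n - 1))
            w[0] *= 0.5
            w[-1] *= 0.5
            return x, w

        def tensor_quad(f, levels):
            """Full tensor-product quadrature at the given per-dimension levels."""
            pts, wts = zip(*(trap_rule(l) for l in levels))
            grid = np.meshgrid(*pts, indexing='ij')
            W = wts[0]
            for w in wts[1:]:
                W = np.multiply.outer(W, w)
            return np.sum(W * f(*grid))

        def smolyak_quad(f, d, L):
            """Combination-technique (Smolyak) quadrature of total level L in d dimensions."""
            total = 0.0
            for q in range(max(L - d + 1, 0), L + 1):
                coeff = (-1) ** (L - q) * comb(d - 1, L - q)
                for levels in product(range(q + 1), repeat=d):
                    if sum(levels) == q:
                        total += coeff * tensor_quad(f, levels)
            return total

        print(smolyak_quad(lambda x, y: np.exp(x + y), 2, 6))   # approx (e - 1)^2 = 2.9525...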

  11. Smolyak's algorithm: A powerful black box for the acceleration of scientific computations

    KAUST Repository

    Tempone, Raul; Wolfers, Soeren

    2017-01-01

    We provide a general discussion of Smolyak's algorithm for the acceleration of scientific computations. The algorithm first appeared in Smolyak's work on multidimensional integration and interpolation. Since then, it has been generalized in multiple directions and has been associated with the keywords: sparse grids, hyperbolic cross approximation, combination technique, and multilevel methods. Variants of Smolyak's algorithm have been employed in the computation of high-dimensional integrals in finance, chemistry, and physics, in the numerical solution of partial and stochastic differential equations, and in uncertainty quantification. Motivated by this broad and ever-increasing range of applications, we describe a general framework that summarizes fundamental results and assumptions in a concise application-independent manner.

  12. NSC KIPT accelerator on nuclear and high energy physics

    NARCIS (Netherlands)

    Dovbnya, A.N.; Guk, I.S.; Kononenko, S.G.; Wiel, van der M.J.; Botman, J.I.M.; Tarasenko, A.S.

    2004-01-01

    A qualitatively new level can be reached by creating an accelerator that will incorporate the latest technological achievements in the field of electron beam acceleration on the basis of a superconducting TESLA accelerating structure. This structure permits the production of both quasi-continuous

  13. Accelerating complex for basic researches in the nuclear physics

    NARCIS (Netherlands)

    Dovbnya, A.N.; Guk, I.S.; Kononenko, S.G.; Peev, F.A.; Tarasenko, A.S.; Botman, J.I.M.

    2009-01-01

    In 2003, work was begun at NSC KIPT on the design of an accelerator, the base facility of IHEPNP NSC KIPT, the electron recirculator SALO. The accelerator will be located in the target hall of the LU 2000 accelerator complex. It is planned first of all as a facility for basic research in the field of

  14. Recent Improvements to CHEF, a Framework for Accelerator Computations

    Energy Technology Data Exchange (ETDEWEB)

    Ostiguy, J.-F.; Michelotti, L.P.; /Fermilab

    2009-05-01

    CHEF is a body of software dedicated to accelerator-related computations. It consists of a hierarchical set of libraries and a stand-alone application based on the latter. The implementation language is C++; the code makes extensive use of templates and modern idioms such as iterators, smart pointers and generalized function objects. CHEF has been described in a few contributions at previous conferences. In this paper, we provide an overview and discuss recent improvements. Formally, CHEF refers to two distinct but related things: (1) a set of class libraries; and (2) a stand-alone application based on these libraries. The application makes use of and exposes a subset of the capabilities provided by the libraries. CHEF has its ancestry in efforts started in the early nineties. At that time, A. Dragt, E. Forest [2] and others showed that ring dynamics can be formulated in a way that puts maps, rather than Hamiltonians, into a central role. Automatic differentiation (AD) techniques, which were just coming of age, were a natural fit in a context where maps are represented by their Taylor approximations. The initial vision, which CHEF carried over, was to develop a code that (1) concurrently supports conventional tracking, linear and non-linear map-based techniques, (2) avoids 'hardwired' approximations that are not under user control, and (3) provides building blocks for applications. C++ was adopted as the implementation language because of its comprehensive support for operator overloading and the equal status it confers to built-in and user-defined data types. It should be mentioned that acceptance of AD techniques in accelerator science owes much to the pioneering work of Berz [1] who implemented--in fortran--the first production quality AD engine (the foundation for the code COSY). Nowadays other engines are available, but few are native C++ implementations. Although AD engines and map based techniques are making their way into more traditional codes e.g. [5
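
    The role automatic differentiation plays in map-based codes can be illustrated with a toy dual-number class: seeding the phase-space coordinates as independent variables and pushing them through element maps yields the first-order transfer map directly. This is a schematic sketch of the idea only, not CHEF code (production AD engines propagate full truncated Taylor series rather than just first derivatives):

        class Dual:
            """A value together with first derivatives w.r.t. the initial coordinates."""
            def __init__(self, val, grad):
                self.val, self.grad = val, tuple(grad)
            def _wrap(self, other):
                return other if isinstance(other, Dual) else Dual(other, (0.0,) * len(self.grad))
            def __add__(self, other):
                o = self._wrap(other)
                return Dual(self.val + o.val, tuple(a + b for a, b in zip(self.grad, o.grad)))
            __radd__ = __add__
            def __mul__(self, other):
                o = self._wrap(other)
                return Dual(self.val * o.val,
                            tuple(self.val * b + o.val * a for a, b in zip(self.grad, o.grad)))
            __rmul__ = __mul__

        def drift(L, x, xp):          # drift space of length L
            return x + L * xp, xp

        def thin_quad(kL, x, xp):     # thin-lens quadrupole kick of integrated strength kL
            return x, xp + (-kL) * x

        # Seed (x, x') as independent variables and push them through a short line.
        x0 = Dual(0.0, (1.0, 0.0))
        xp0 = Dual(0.0, (0.0, 1.0))
        x, xp = drift(1.0, *thin_quad(0.5, *drift(1.0, x0, xp0)))
        M = [list(x.grad), list(xp.grad)]   # first-order transfer map of the line
        print(M)                            # [[0.5, 1.5], [-0.5, 0.5]]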

  15. Recent Improvements to CHEF, a Framework for Accelerator Computations

    International Nuclear Information System (INIS)

    Ostiguy, J.-F.; Michelotti, L.P.

    2009-01-01

    CHEF is a body of software dedicated to accelerator-related computations. It consists of a hierarchical set of libraries and a stand-alone application based on the latter. The implementation language is C++; the code makes extensive use of templates and modern idioms such as iterators, smart pointers and generalized function objects. CHEF has been described in a few contributions at previous conferences. In this paper, we provide an overview and discuss recent improvements. Formally, CHEF refers to two distinct but related things: (1) a set of class libraries; and (2) a stand-alone application based on these libraries. The application makes use of and exposes a subset of the capabilities provided by the libraries. CHEF has its ancestry in efforts started in the early nineties. At that time, A. Dragt, E. Forest [2] and others showed that ring dynamics can be formulated in a way that puts maps, rather than Hamiltonians, into a central role. Automatic differentiation (AD) techniques, which were just coming of age, were a natural fit in a context where maps are represented by their Taylor approximations. The initial vision, which CHEF carried over, was to develop a code that (1) concurrently supports conventional tracking, linear and non-linear map-based techniques, (2) avoids 'hardwired' approximations that are not under user control, and (3) provides building blocks for applications. C++ was adopted as the implementation language because of its comprehensive support for operator overloading and the equal status it confers to built-in and user-defined data types. It should be mentioned that acceptance of AD techniques in accelerator science owes much to the pioneering work of Berz [1] who implemented--in fortran--the first production quality AD engine (the foundation for the code COSY). Nowadays other engines are available, but few are native C++ implementations. Although AD engines and map based techniques are making their way into more traditional codes e.g. [5], it is also

  16. CHEP95: Computing in high energy physics. Abstracts

    International Nuclear Information System (INIS)

    1995-01-01

    These proceedings cover the technical papers on computation in High Energy Physics, including computer codes, computer devices, control systems, simulations, and data acquisition systems. New approaches to computer architectures are also discussed

  17. Grid Computing in High Energy Physics

    International Nuclear Information System (INIS)

    Avery, Paul

    2004-01-01

    Over the next two decades, major high energy physics (HEP) experiments, particularly at the Large Hadron Collider, will face unprecedented challenges to achieving their scientific potential. These challenges arise primarily from the rapidly increasing size and complexity of HEP datasets that will be collected and the enormous computational, storage and networking resources that will be deployed by global collaborations in order to process, distribute and analyze them. Coupling such vast information technology resources to globally distributed collaborations of several thousand physicists requires extremely capable computing infrastructures supporting several key areas: (1) computing (providing sufficient computational and storage resources for all processing, simulation and analysis tasks undertaken by the collaborations); (2) networking (deploying high speed networks to transport data quickly between institutions around the world); (3) software (supporting simple and transparent access to data and software resources, regardless of location); (4) collaboration (providing tools that allow members full and fair access to all collaboration resources and enable distributed teams to work effectively, irrespective of location); and (5) education, training and outreach (providing resources and mechanisms for training students and for communicating important information to the public). It is believed that computing infrastructures based on Data Grids and optical networks can meet these challenges and can offer data intensive enterprises in high energy physics and elsewhere a comprehensive, scalable framework for collaboration and resource sharing. A number of Data Grid projects have been underway since 1999. Interestingly, the most exciting and far ranging of these projects are led by collaborations of high energy physicists, computer scientists and scientists from other disciplines in support of experiments with massive, near-term data needs. I review progress in this

  18. Doing accelerator physics using SDDS, UNIX, and EPICS

    International Nuclear Information System (INIS)

    Borland, M.; Emery, L.; Sereno, N.

    1995-01-01

    The use of the SDDS (Self-Describing Data Sets) file protocol, together with the UNIX operating system and EPICS (Experimental Physics and Industrial Controls System), has proved powerful during the commissioning of the APS (Advanced Photon Source) accelerator complex. The SDDS file protocol has permitted a tool-oriented approach to developing applications, wherein generic programs are written that function as part of multiple applications. While EPICS-specific tools were written for data collection, automated experiment execution, closed-loop control, and so forth, data processing and display are done with the SDDS Toolkit. Experiments and data reduction are implemented as UNIX shell scripts that coordinate the execution of EPICS-specific tools and SDDS tools. Because of the power and generic nature of the individual tools and of the UNIX shell environment, automated experiments can be prepared and executed rapidly in response to unanticipated needs or new ideas. Examples are given of application of this methodology to beam motion characterization, beam-position-monitor offset measurements, and klystron characterization

  19. Accelerator-based atomic and molecular collision physics

    International Nuclear Information System (INIS)

    Datz, S.

    1993-01-01

    Accelerators have been shown to have great utility in addressing a broad range of problems in experimental atomic physics. There are, of course, phenomena such as inner-shell MO promotion which can occur only at high collision energies. At much higher energies, large transient Coulomb fields can be generated which lead to copious production of electron-positron pairs and to capture of electrons from the negative continuum. But in addition, many advantages can be gained by carrying out low-energy (center-of-mass) collisions at high laboratory energies, specifically in a single-pass mode or in multi-pass modes in ion storage rings, in which, e.g., collisions in the milli-electron-volt region can be achieved for electron-molecule reactions. Certain advantages also accrue using "reverse kinematics", in which high-velocity ions collide with almost "stationary" electrons, as in resonant transfer and excitation (RTE) and in collisions of energetic ions in the dense "electron gas" found in crystal channels

  20. Electron accelerators for research at the frontiers of nuclear physics

    International Nuclear Information System (INIS)

    Grunder, H.A.; Hartline, B.K.; Corneliussen, S.T.

    1986-01-01

    Electron accelerators for the frontiers of nuclear physics must provide high duty factor (>80%) for coincidence measurements; few-hundred-MeV through few-GeV energy for work in the nucleonic, hadronic, and confinement regimes; energy resolution of ∼10⁻⁴; and high current (≥ 100 μA). To fulfill these requirements, new machines and upgrades of existing ones are being planned or constructed. Representative microtron-based facilities are the upgrade of MAMI at the University of Mainz (West Germany), the proposed two-stage cascaded microtron at the University of Illinois (USA), and the three-stage Troitsk ''polytron'' (USSR). Representative projects to add pulse stretcher rings to existing linacs are the upgrades at MIT-Bates (USA) and at NIKHEF-K (Netherlands). Recent advances in superconducting rf technology, especially in cavity design and fabrication, have made large superconducting cw linacs feasible. Recirculating superconducting cw linacs are under construction at the University of Darmstadt (West Germany) and at CEBAF (USA), and a proposal is being developed at Saclay (France). 31 refs

  1. Polarized target physics at the Bonn electron accelerators

    International Nuclear Information System (INIS)

    Meyer, W.

    1988-12-01

    At the Bonn 2.5 GeV electron synchrotron, experiments with polarized nucleon targets have a long tradition. Starting with measurements of the target asymmetry in single-pion photoproduction off polarized protons and neutrons, respectively, the experiments have concentrated on photodisintegration measurements of polarized deuterons. In parallel with these activities, considerable progress has been made in target technology, e.g. cryogenics and target materials, from which all the measurements have profited enormously. In particular, the development of the new target material ammonia has allowed the first use of a polarized deuteron (ND₃) target in an intense electron beam. The construction of a frozen-spin target, which will be used in combination with a tagged polarized photon beam, makes a new generation of polarized-target experiments in photon-induced reactions possible. Together with electron scattering off polarized deuterons and neutrons, these will be a main activity in the physics program at the new stretcher accelerator ELSA in Bonn. (orig.)

  2. Particle accelerator physics and technology for high energy density physics research

    Energy Technology Data Exchange (ETDEWEB)

    Hoffmann, D.H.H.; Blazevic, A.; Rosmej, O.N.; Spiller, P.; Tahir, N.A.; Weyrich, K. [Gesellschaft für Schwerionenforschung, GSI-Darmstadt, Plasmaphysik, Darmstadt (Germany); Hoffmann, D.H.H.; Dafni, T.; Kuster, M.; Ni, P.; Roth, M.; Udrea, S.; Varentsov, D. [Technische Univ. Darmstadt, Institut für Kernphysik, Schloßgartenstr. 9 (Germany); Jacoby, J. [Frankfurt Univ., Institut für Angewandte Physik (Germany); Kain, V.; Schmidt, R.; Zioutas, K. [European Organization for Nuclear Research (CERN), Geneve (Switzerland); Zioutas, K. [Patras Univ., Dept. of Physics (Greece); Mintsev, V.; Fortov, V.E. [Russian Academy of Sciences, Institute of Problems of Chemical Physics, Chernogolovka (Russian Federation); Sharkov, B.Y. [Institute for Theoretical and Experimental Physics ITEP, Moscow (Russian Federation)

    2007-08-15

    Interaction phenomena of intense ion and laser radiation with matter have a large range of applications in different fields of science, extending from basic research on plasma properties to applications in energy science, especially inertial fusion. The heavy ion synchrotron at GSI now routinely delivers intense uranium beams that deposit about 1 kJ/g of specific energy in solid matter, e.g. solid lead. Our simulations show that the new accelerator complex FAIR (Facility for Antiproton and Ion Research) at GSI, as well as beams from the CERN Large Hadron Collider (LHC), will vastly extend the accessible parameter range for high energy density states. A natural example of hot dense plasma is provided by our neighbouring star, the Sun, which allows deep insight into the physics of fusion and the properties of matter at high energy density, and is moreover an excellent laboratory for astro-particle physics. As such, the Sun's interior plasma can even be used to probe the existence of novel particles and dark matter candidates. We present an overview of recent results and developments in dense plasma physics addressed with heavy ion and laser beams combined with accelerator and nuclear physics technology. (authors)

  3. Physics, Computer Science and Mathematics Division. Annual report, January 1-December 31, 1980

    International Nuclear Information System (INIS)

    Birge, R.W.

    1981-12-01

    Research in the physics, computer science, and mathematics division is described for the year 1980. While the division's major effort remains in high energy particle physics, there is a continually growing program in computer science and applied mathematics. Experimental programs are reported in e⁺e⁻ annihilation, muon and neutrino reactions at FNAL, search for effects of a right-handed gauge boson, limits on neutrino oscillations from muon-decay neutrinos, strong interaction experiments at FNAL, strong interaction experiments at BNL, particle data center, Barrelet moment analysis of πN scattering data, astrophysics and astronomy, earth sciences, and instrument development and engineering for high energy physics. In theoretical physics research, studies included particle physics and accelerator physics. Computer science and mathematics research included analytical and numerical methods, information analysis techniques, advanced computer concepts, and environmental and epidemiological studies

  4. Physics, Computer Science and Mathematics Division. Annual report, January 1-December 31, 1980

    Energy Technology Data Exchange (ETDEWEB)

    Birge, R.W.

    1981-12-01

    Research in the physics, computer science, and mathematics division is described for the year 1980. While the division's major effort remains in high energy particle physics, there is a continually growing program in computer science and applied mathematics. Experimental programs are reported in e⁺e⁻ annihilation, muon and neutrino reactions at FNAL, search for effects of a right-handed gauge boson, limits on neutrino oscillations from muon-decay neutrinos, strong interaction experiments at FNAL, strong interaction experiments at BNL, particle data center, Barrelet moment analysis of πN scattering data, astrophysics and astronomy, earth sciences, and instrument development and engineering for high energy physics. In theoretical physics research, studies included particle physics and accelerator physics. Computer science and mathematics research included analytical and numerical methods, information analysis techniques, advanced computer concepts, and environmental and epidemiological studies. (GHT)

  5. Proceeding on the scientific meeting and presentation on accelerator technology and its applications: physics, nuclear reactor

    International Nuclear Information System (INIS)

    Pramudita Anggraita; Sudjatmoko; Darsono; Tri Marji Atmono; Tjipto Sujitno; Wahini Nurhayati

    2012-01-01

    The scientific meeting and presentation on accelerator technology and its applications was held by PTAPB BATAN on 13 December 2011. The meeting aimed to promote accelerator technology and its applications to accelerator scientists, academics, researchers and technology users, and to present accelerator-based research conducted by researchers inside and outside BATAN. This proceeding contains 23 papers on physics and nuclear reactors. (PPIKSN)

  6. Computer applications: Automatic control system for high-voltage accelerator

    International Nuclear Information System (INIS)

    Bryukhanov, A.N.; Komissarov, P.Yu.; Lapin, V.V.; Latushkin, S.T.; Fomenko, D.E.; Yudin, L.I.

    1992-01-01

    An automatic control system for a high-voltage electrostatic accelerator with an accelerating potential of up to 500 kV is described. The electronic apparatus on the high-voltage platform is controlled and monitored by means of a fiber-optic data-exchange system. The system is based on CAMAC modules that are controlled by a microprocessor crate controller. Data on accelerator operation are represented and control instructions are issued by means of an alphanumeric terminal. 8 refs., 6 figs

  7. 3rd International Conference on Particle Physics Beyond the Standard Model : Accelerator, Non-Accelerator and Space Approaches

    CERN Document Server

    Beyond The Desert 2002

    2003-01-01

    The third conference on particle physics beyond the Standard Model (BEYOND THE DESERT'02 - Accelerator, Non-accelerator and Space Approaches) was held during 2-7 June 2002 in the Finnish town of Oulu, almost at the Arctic Circle. It was the first of the BEYOND conference series to be held outside Germany (CERN Courier, March 2003, pp. 29-30). Traditionally the scientific programme of the BEYOND conferences, brought to life in 1997 (see CERN Courier, November 1997, pp. 16-18), covers almost all topics of modern particle physics (see contents).

  8. Grid computing in high-energy physics

    International Nuclear Information System (INIS)

    Bischof, R.; Kuhn, D.; Kneringer, E.

    2003-01-01

    Full text: Future high energy physics experiments are characterized by an enormous amount of data delivered by the large detectors presently under construction, e.g. at the Large Hadron Collider, and by a large number of scientists (several thousand) requiring simultaneous access to the resulting experimental data. Since it seems unrealistic to provide the necessary computing and storage resources at one single place (e.g. CERN), the concept of grid computing, i.e. the use of distributed resources, has been chosen. The DataGrid project (under the leadership of CERN) develops, based on the Globus toolkit, the software necessary for computation and analysis of shared large-scale databases in a grid structure. The high energy physics group in Innsbruck participates with several resources in the DataGrid test bed. In this presentation our experience as grid users and resource providers is summarized. In cooperation with the local IT centre (ZID) we installed a flexible grid system which uses PCs (at the moment 162) in students' labs during nights, weekends and holidays, and which is especially used to compare different systems (local resource managers, other grid software, e.g. from the NorduGrid project) and to provide a test bed for the future Austrian Grid (AGrid). (author)

  9. Computational physics: an introduction (second edition)

    International Nuclear Information System (INIS)

    Borcherds, Peter

    2002-01-01

    This book has much in common with many other computational physics texts, some of which are helpfully listed by the author in 'A subjective review on related texts'. The first five chapters are introductory, covering finite differences, linear algebra, stochastics and ordinary and partial differential equations. The final section of chapter 3 is entitled 'Stochastic Optimisation', and covers Simulated Annealing and Genetic Algorithms. Neither topic is adequately covered; an explicit example, with algorithms, in each case would have been helpful. However, few other computational physics texts mention these topics at all. The chapters in the final part of the book are more advanced, and cover comprehensively Simulation and Statistical Mechanics, Quantum Mechanical Simulation and Hydrodynamics. These chapters include specialist material not in other texts, e.g. Alder vortices and the Nosé-Hoover method. There is an extensive coverage of Ewald summation. The author is in the course of augmenting his book with web-resident sample programs, which should enhance its value. This book should appeal to anyone working in the fields covered in the final section. It ought also to be in any physics library. (author)

  10. Genetic algorithms and their applications in accelerator physics

    Energy Technology Data Exchange (ETDEWEB)

    Hofler, Alicia S. [JLAB

    2013-12-01

    Multi-objective optimization techniques are widely used in an extremely broad range of fields. Genetic optimization for multi-objective problems was introduced to the accelerator community relatively recently and quickly spread, becoming a fundamental tool for multi-dimensional optimization problems. This discussion introduces the basics of the technique and reviews applications to accelerator problems.

  11. Accelerator physics and radiometric properties of superconducting wavelength shifters

    International Nuclear Information System (INIS)

    Scheer, Michael

    2008-01-01

    The subject of this thesis is the operation of wavelength shifters at electron storage rings and their use in radiometry. The basic aspects of radiometry, the technical requirements, the influence of wavelength shifters on the storage ring, and results of first measurements are presented for a device installed at BESSY. Most of the calculations are carried out with the program WAVE, which has been developed within this thesis. WAVE allows the synchrotron radiation spectra of wavelength shifters to be calculated with a relative uncertainty of 10⁻⁵. The accelerator-physics properties of wavelength shifters, as well as a generating function for symplectic tracking calculations, can also be computed by WAVE. The latter was implemented in the tracking code BETA to investigate the influence of insertion devices on the dynamic aperture and emittance of the storage ring. These studies led to the concept of alternating low- and high-beta sections at BESSY-II, which allows superconducting insertion devices to be operated without a significant distortion of the magnetic optics. To investigate the experimental aspects of radiometry at wavelength shifters, a program based on the Monte Carlo code GEANT4 has been developed. It allows the radiometric measurements and the absorption properties of detectors to be simulated. With the developed codes, first radiometric measurements by the PTB have been analysed. A comparison of measurements and calculations shows reasonable agreement, with deviations of about five percent in the spectral range of 40-60 keV behind a 1-mm Cu filter. Better agreement was found between 20 keV and 80 keV without the Cu filter; in this case the measured data agreed with the results of the calculations within a systematic uncertainty of two percent. (orig.)

  12. BLUES function method in computational physics

    Science.gov (United States)

    Indekeu, Joseph O.; Müller-Nedebock, Kristian K.

    2018-04-01

    We introduce a computational method in physics that goes ‘beyond linear use of equation superposition’ (BLUES). A BLUES function is defined as a solution of a nonlinear differential equation (DE) with a delta source that is at the same time a Green’s function for a related linear DE. For an arbitrary source, the BLUES function can be used to construct an exact solution to the nonlinear DE with a different, but related source. Alternatively, the BLUES function can be used to construct an approximate piecewise analytical solution to the nonlinear DE with an arbitrary source. For this alternative use the related linear DE need not be known. The method is illustrated in a few examples using analytical calculations and numerical computations. Areas for further applications are suggested.
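
    Read literally, the construction described above can be summarised in two displayed relations; the convolution form below is a hedged reading of "can be used to construct", written out only to make the definitions concrete.

        \[
        \mathcal{L}_x\,B(x) = \delta(x)
        \qquad\text{and}\qquad
        \mathcal{N}_x\,B(x) = \delta(x),
        \]
        i.e. the BLUES function \(B\) is simultaneously the Green's function of the related linear
        DE (operator \(\mathcal{L}_x\)) and a solution of the nonlinear DE (operator \(\mathcal{N}_x\))
        with a delta source.  For the nonlinear problem \(\mathcal{N}_x\,u(x) = \psi(x)\) with an
        arbitrary source \(\psi\), an approximate piecewise analytical solution is then of the
        convolution form
        \[
        u(x) \;\approx\; (B * \psi)(x) \;=\; \int B(x-x')\,\psi(x')\,\mathrm{d}x',
        \]
        while the exact solution obtained from \(B\) corresponds, as stated above, to a different
        but related source.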

  13. Computational and Physical Analysis of Catalytic Compounds

    Science.gov (United States)

    Wu, Richard; Sohn, Jung Jae; Kyung, Richard

    2015-03-01

    Nanoparticles exhibit unique physical and chemical properties depending on their geometrical properties. For this reason, synthesis of nanoparticles with controlled shape and size is important for exploiting their unique properties. Catalyst supports are usually made of high-surface-area porous oxides or carbon nanomaterials. These support materials stabilize metal catalysts against sintering at high reaction temperatures. Many studies have demonstrated large enhancements of catalytic behavior due to the role of the oxide-metal interface. In this paper, the catalytic ability of supported nano metal oxides, such as silicon oxide and titanium oxide compounds, has been analyzed using computational chemistry methods. Computational programs such as GAMESS and Chemcraft have been used to compute the efficiencies of the catalytic compounds and the bonding-energy changes during optimization convergence. The results illustrate how the metal oxides stabilize and the steps involved. The plot of energy (kcal/mol) versus computation step (N) shows that the energy of the titania converges faster, at the 7th iteration, whereas that of the silica converges at the 9th iteration.

  14. 179th International School of Physics "Enrico Fermi" : Laser-Plasma Acceleration

    CERN Document Server

    Gizzi, L A; Faccini, R

    2012-01-01

    Impressive progress has been made in the field of laser-plasma acceleration in the last decade, with outstanding achievements from both experimental and theoretical viewpoints. Closely exploiting the development of ultra-intense, ultrashort pulse lasers, laser-plasma acceleration has developed rapidly, achieving accelerating gradients of the order of tens of GeV/m and making the prospect of miniature accelerators a more realistic possibility. This book presents the lectures delivered at the Enrico Fermi International School of Physics summer school "Laser-Plasma Acceleration", held in Varenna, Italy, in June 2011. The school provided an opportunity for young scientists to experience the best from the worlds of laser-plasma and accelerator physics, with intensive training and hands-on opportunities related to key aspects of laser-plasma acceleration. Subjects covered include: the secrets of lasers; the power of numerical simulations; beam dynamics; and the elusive world of laboratory plasmas. The object...

  15. Computational applications of DNA physical scales

    DEFF Research Database (Denmark)

    Baldi, Pierre; Chauvin, Yves; Brunak, Søren

    1998-01-01

    The authors study from a computational standpoint several different physical scales associated with structural features of DNA sequences, including dinucleotide scales such as base stacking energy and propeller twist, and trinucleotide scales such as bendability and nucleosome positioning. We show that these scales provide an alternative or complementary compact representation of DNA sequences. As an example we construct a strand-invariant representation of DNA sequences. The scales can also be used to analyze and discover new DNA structural patterns, especially in combination with hidden Markov models...

  16. Trends in supercomputers and computational physics

    International Nuclear Information System (INIS)

    Bloch, T.

    1985-01-01

    Today, scientists using numerical models explore the basic mechanisms of semiconductors, apply global circulation models to climatic and oceanographic problems, probe into the behaviour of galaxies and try to verify basic theories of matter, such as quantum chromodynamics, by simulating the constitution of elementary particles. Chemists, crystallographers and molecular dynamics researchers develop models for chemical reactions and the formation of crystals, and try to deduce the chemical properties of molecules as a function of the shapes of their states. Chaotic systems are studied extensively in turbulence (combustion included), and the design of the next generation of controlled fusion devices relies heavily on computational physics. (orig./HSI)

  17. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing

    Science.gov (United States)

    Fang, Ye; Ding, Yun; Feinstein, Wei P.; Koppelman, David M.; Moreno, Juana; Jarrell, Mark; Ramanujam, J.; Brylinski, Michal

    2016-01-01

    Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of the large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Units (GPUs). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding near-ideal scaling. Further, using Xeon Phi gives a 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249. PMID:27420300

  18. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing.

    Directory of Open Access Journals (Sweden)

    Ye Fang

    Full Text Available Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of the large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Units (GPUs). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding near-ideal scaling. Further, using Xeon Phi gives a 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249.
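
    GeauxDock's actual scoring function and accelerator kernels are not reproduced here; the fragment below only illustrates the Monte Carlo core both records describe — perturb a pose, score it, and accept or reject with the Metropolis criterion. The scoring function, step size and temperature are stand-ins, not GeauxDock parameters.

        import math, random

        def metropolis_search(score, pose, steps=10000, step_size=0.5, temperature=1.0):
            """Generic Metropolis Monte Carlo over a pose vector (illustrative only).
            `score` is any callable returning an energy-like value; lower is better."""
            current, e_current = list(pose), score(pose)
            best, e_best = list(current), e_current
            for _ in range(steps):
                trial = [x + random.uniform(-step_size, step_size) for x in current]
                e_trial = score(trial)
                accept = e_trial <= e_current or \
                    random.random() < math.exp(-(e_trial - e_current) / temperature)
                if accept:
                    current, e_current = trial, e_trial
                    if e_current < e_best:
                        best, e_best = list(current), e_current
            return best, e_best

        # Toy "scoring function": squared distance of a 3-D pose from a hypothetical optimum.
        print(metropolis_search(lambda p: sum((x - 1.0) ** 2 for x in p), [0.0, 0.0, 0.0]))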

  19. Experiments Using Cell Phones in Physics Classroom Education: The Computer-Aided "g" Determination

    Science.gov (United States)

    Vogt, Patrik; Kuhn, Jochen; Muller, Sebastian

    2011-01-01

    This paper continues the collection of experiments that describe the use of cell phones as experimental tools in physics classroom education. We describe a computer-aided determination of the free-fall acceleration "g" using the acoustical Doppler effect. The Doppler shift is a function of the speed of the source. Since a free-falling object…
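
    The abstract does not spell out the working relation, but a hedged sketch is straightforward: assume the phone emits a tone of constant frequency \(f_0\) while falling away from a stationary microphone, and let \(c\) be the speed of sound. Then

        \[
        f(t) \;=\; \frac{c}{c + v(t)}\,f_0, \qquad v(t) = g\,t
        \quad\Longrightarrow\quad
        g \;=\; \frac{c}{t}\left(\frac{f_0}{f(t)} - 1\right),
        \]

    so fitting the recorded frequency shift against time yields the free-fall acceleration \(g\).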

  20. Computer Based Dose Control System on Linear Accelerator

    International Nuclear Information System (INIS)

    Taxwim; Djoko-SP; Widi-Setiawan; Agus-Budi Wiyatna

    2000-01-01

    Accelerator technology has been used for radiotherapy. Dokter Karyadi Hospital in Semarang uses an electron or X-ray linear accelerator (linac) for cancer therapy. One of the control parameters of a linear accelerator is the dose rate, i.e. the particle current or photon rate delivered to the target. The dose rate in the linac has been controlled by adjusting the repetition rate of the anode pulse train of the electron source; at present this is still a purely proportional control. To enhance the quality of the control result (minimal stationary error, speed and stability), a dose control system has been designed using the PID (Proportional Integral Differential) control algorithm and the derived transfer function of the controlled object. The PID control algorithm is implemented by taking as input the dose error (the difference between the output dose and the dose-rate set point); the output of the control system is used to correct the repetition-rate set point of the electron source anode pulse train. (author)
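
    The control law described above is easy to sketch. The fragment below is a minimal discrete-time PID loop that turns a dose-rate error into a correction of the repetition-rate set point; the gains, sampling period and numerical values are illustrative placeholders, not the settings of the hospital's linac.

        # Minimal discrete PID controller: dose-rate error -> repetition-rate correction.
        class PID:
            def __init__(self, kp, ki, kd, dt):
                self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
                self.integral = 0.0
                self.prev_error = 0.0

            def update(self, setpoint, measured):
                error = setpoint - measured
                self.integral += error * self.dt
                derivative = (error - self.prev_error) / self.dt
                self.prev_error = error
                return self.kp * error + self.ki * self.integral + self.kd * derivative

        pid = PID(kp=0.8, ki=0.2, kd=0.05, dt=0.1)           # illustrative gains and period
        rep_rate = 100.0                                      # Hz, hypothetical current set point
        dose_setpoint, measured_dose = 2.0, 1.8               # Gy/min, hypothetical readings
        rep_rate += pid.update(dose_setpoint, measured_dose)  # corrected repetition-rate set point
        print(rep_rate)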

  1. Physics and Novel Schemes of Laser Radiation Pressure Acceleration for Quasi-monoenergetic Proton Generation

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Chuan S. [Univ. of Maryland, College Park, MD (United States). Dept. of Physics; Shao, Xi [Univ. of Maryland, College Park, MD (United States)

    2016-06-14

    The main objective of our work is to provide a theoretical basis and modeling support for the design and experimental setup of a compact laser proton accelerator to produce high quality proton beams with energy tunable from 50 to 250 MeV using a short-pulse sub-petawatt laser. We performed theoretical and computational studies of energy scaling and Rayleigh-Taylor instability development in laser radiation pressure acceleration (RPA) and developed novel RPA-based schemes to remedy/suppress instabilities for high-quality quasi-monoenergetic proton beam generation, as proposed. During the project period, we published nine peer-reviewed journal papers and made twenty conference presentations, including six invited talks, on our work. The project supported one graduate student, who received his PhD degree in physics in 2013, and two post-doctoral associates. We also mentored three high school students and one undergraduate physics major, inspiring their interest and involving them in the project.

  2. The physics of accelerator driven sub-critical reactors

    Indian Academy of Sciences (India)

    Accelerator driven systems (ADS) are attracting worldwide attention .... The region of interest (or the entire reactor core) is divided into a suitable number ..... have also presented the status of the theoretical and experimental activities being.

  3. Reactor physics computations for nuclear engineering undergraduates

    International Nuclear Information System (INIS)

    Huria, H.C.

    1989-01-01

    The undergraduate program in nuclear engineering at the University of Cincinnati provides three-quarters of nuclear reactor theory that concentrate on physical principles, with calculations limited to those that can be conveniently completed on programmable calculators. An additional one-quarter course is designed to introduce the student to realistic core physics calculational methods, which necessarily requires a computer. Such calculations can be conveniently demonstrated and completed with the modern microcomputer. The one-quarter reactor computations course includes a one-group, one-dimensional diffusion code to introduce the concepts of inner and outer iterations, a cell spectrum code based on integral transport theory to generate cell-homogenized few-group cross sections, and a multigroup diffusion code to determine multiplication factors and power distributions in one-dimensional systems. Problem assignments include the determination of multiplication factors and flux distributions for typical pressurized water reactor (PWR) cores under various operating conditions, such as cold clean, hot clean, hot clean at full power, hot full power with xenon and samarium, and a boron concentration search. Moderator and Doppler coefficients can also be evaluated and examined
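
    As an illustration of the kind of exercise described — a one-group, one-dimensional diffusion calculation driven by outer (power) iterations — the sketch below solves a bare slab with zero-flux boundaries. The cross sections and slab width are placeholder values, not data from the course.

        import numpy as np

        # One-group, 1-D finite-difference diffusion eigenvalue problem:
        #   -D phi'' + Sigma_a phi = (1/k) nu*Sigma_f phi, with phi = 0 at both edges.
        D, sig_a, nu_sig_f = 1.0, 0.07, 0.08     # cm and 1/cm; placeholder constants
        slab, n = 100.0, 200                     # slab width (cm), interior mesh points
        h = slab / (n + 1)

        # Loss operator (tridiagonal diffusion + absorption); fission production is nu_sig_f * phi.
        main = (2.0 * D / h**2 + sig_a) * np.ones(n)
        off = (-D / h**2) * np.ones(n - 1)
        A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

        phi, k = np.ones(n), 1.0
        for _ in range(500):                     # outer (power) iterations
            src = nu_sig_f * phi / k
            phi_new = np.linalg.solve(A, src)    # the "inner" solve, done directly here
            k_new = k * np.sum(nu_sig_f * phi_new) / np.sum(nu_sig_f * phi)
            converged = abs(k_new - k) < 1e-8
            k, phi = k_new, phi_new / phi_new.max()
            if converged:
                break

        print(f"k_eff ~ {k:.5f}")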

  4. Cloud Computing and Validated Learning for Accelerating Innovation in IoT

    Science.gov (United States)

    Suciu, George; Todoran, Gyorgy; Vulpe, Alexandru; Suciu, Victor; Bulca, Cristina; Cheveresan, Romulus

    2015-01-01

    Innovation in the Internet of Things (IoT) requires more than just the creation of technology and the use of cloud computing or big data platforms. It requires accelerated commercialization, or aptly named go-to-market processes. To accelerate successfully, companies need a new type of product development, the so-called validated learning process.…

  5. submitter LHC@Home: a BOINC-based volunteer computing infrastructure for physics studies at CERN

    CERN Document Server

    Barranco, Javier; Cameron, David; Crouch, Matthew; De Maria, Riccardo; Field, Laurence; Giovannozzi, Massimo; Hermes, Pascal; Høimyr, Nils; Kaltchev, Dobrin; Karastathis, Nikos; Luzzi, Cinzia; Maclean, Ewen; McIntosh, Eric; Mereghetti, Alessio; Molson, James; Nosochkov, Yuri; Pieloni, Tatiana; Reid, Ivan D; Rivkin, Lenny; Segal, Ben; Sjobak, Kyrre; Skands, Peter; Tambasco, Claudia; Van der Veken, Frederik; Zacharov, Igor

    2017-01-01

    The LHC@Home BOINC project has provided computing capacity for numerical simulations to researchers at CERN since 2004, and has since 2011 been expanded with a wider range of applications. The traditional CERN accelerator physics simulation code SixTrack enjoys continuing volunteer support, and thanks to virtualisation a number of applications from the LHC experiment collaborations and particle theory groups have joined the consolidated LHC@Home BOINC project. This paper addresses the challenges related to traditional and virtualized applications in the BOINC environment, and how volunteer computing has been integrated into the overall computing strategy of the laboratory through the consolidated LHC@Home service. Thanks to the computing power provided by volunteers joining LHC@Home, numerous accelerator beam physics studies have been carried out, yielding an improved understanding of charged particle dynamics in the CERN Large Hadron Collider (LHC) and its future upgrades. The main results are highlighted i...

  6. LHC@Home: a BOINC-based volunteer computing infrastructure for physics studies at CERN

    Science.gov (United States)

    Barranco, Javier; Cai, Yunhai; Cameron, David; Crouch, Matthew; Maria, Riccardo De; Field, Laurence; Giovannozzi, Massimo; Hermes, Pascal; Høimyr, Nils; Kaltchev, Dobrin; Karastathis, Nikos; Luzzi, Cinzia; Maclean, Ewen; McIntosh, Eric; Mereghetti, Alessio; Molson, James; Nosochkov, Yuri; Pieloni, Tatiana; Reid, Ivan D.; Rivkin, Lenny; Segal, Ben; Sjobak, Kyrre; Skands, Peter; Tambasco, Claudia; Veken, Frederik Van der; Zacharov, Igor

    2017-12-01

    The LHC@Home BOINC project has provided computing capacity for numerical simulations to researchers at CERN since 2004, and has since 2011 been expanded with a wider range of applications. The traditional CERN accelerator physics simulation code SixTrack enjoys continuing volunteer support, and thanks to virtualisation a number of applications from the LHC experiment collaborations and particle theory groups have joined the consolidated LHC@Home BOINC project. This paper addresses the challenges related to traditional and virtualized applications in the BOINC environment, and how volunteer computing has been integrated into the overall computing strategy of the laboratory through the consolidated LHC@Home service. Thanks to the computing power provided by volunteers joining LHC@Home, numerous accelerator beam physics studies have been carried out, yielding an improved understanding of charged particle dynamics in the CERN Large Hadron Collider (LHC) and its future upgrades. The main results are highlighted in this paper.

  7. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information.
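
    Biocellion itself is a C++ framework and its real interface is not reproduced here. The toy loop below, in Python, only illustrates the division of labour the abstract describes — the engine owns the time loop and neighbourhood queries, while the modeler fills in the bodies of predefined model routines. All names, rules and parameters are hypothetical stand-ins.

        import random

        class ToyModel:
            """User-supplied model routines (hypothetical example rules)."""
            def init_agents(self):
                return [{"pos": (random.random(), random.random()), "age": 0}
                        for _ in range(200)]

            def update_agent(self, agent, n_neighbours):
                agent["age"] += 1
                # divide with a small probability that drops when crowded
                if random.random() < 0.02 / (1 + n_neighbours):
                    return [agent, {"pos": agent["pos"], "age": 0}]
                return [agent]

        def run(model, steps=20, radius=0.05):
            """The 'framework' part: time loop plus a naive neighbour count."""
            agents = model.init_agents()
            for _ in range(steps):
                updated = []
                for a in agents:
                    n = sum((b["pos"][0] - a["pos"][0]) ** 2 +
                            (b["pos"][1] - a["pos"][1]) ** 2 < radius ** 2
                            for b in agents if b is not a)
                    updated.extend(model.update_agent(a, n))
                agents = updated
            return agents

        print(len(run(ToyModel())))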

  8. A programmatic challenge - accelerating, expanding, and innovating physical protection

    International Nuclear Information System (INIS)

    Caravelli, J.

    2002-01-01

    Full text: In the wake of the September 11th terrorist attacks, the Office of International Material Protection and Cooperation is responding to the international community's call to strengthen a global response to the serious challenge of securing nuclear material with the aim of preventing nuclear terrorism. Recent events underline the urgency of proactively addressing the threat posed by insufficiently secured nuclear material. The sobering reality is that, at present, the threat is disproportionate to international efforts to mitigate and stop the proliferation of nuclear materials. The potential consequences of failing to address deficiencies in security systems, or for that matter of aiming at anything below 'comprehensive' nuclear material security, are a horrifying reminder of the incredible challenge that we are facing. Against this backdrop, our Office has undertaken a comprehensive program review and is making all possible efforts to expand, accelerate and innovate our physical protection approach. The presentation that I propose to deliver will provide an overview of our new thinking regarding the vulnerability of nuclear/radioactive material post 9-11, touch on some of the obstacles that we are experiencing, and outline the steps that we are aggressively pursuing with the aim of achieving real threat reduction. My presentation will begin with a look at the success and knowledge gained from the bilateral material protection, control and accounting (MPC and A) cooperation between the United States and the Russian Federation and use this as a platform from which to launch a wider discussion on international efforts to strengthen practices for protecting nuclear material. I will examine lessons learned from our cooperation in relation to their applicability to today's security challenges and will outline how we are expanding on our traditional mission to address emerging threats. I will discuss programmatic efforts to bolster traditional, first line of defense

  9. The Extrapolation-Accelerated Multilevel Aggregation Method in PageRank Computation

    Directory of Open Access Journals (Sweden)

    Bing-Yuan Pu

    2013-01-01

    Full Text Available An accelerated multilevel aggregation method is presented for calculating the stationary probability vector of an irreducible stochastic matrix in PageRank computation, where the vector extrapolation method is its accelerator. We show how to periodically combine the extrapolation method with the multilevel aggregation method on the finest level to speed up the PageRank computation. Detailed numerical results are given to illustrate the behavior of this method, and comparisons with the typical methods are also made.
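
    The multilevel aggregation scheme itself is not reproduced here. The sketch below shows only the baseline being accelerated — the power iteration for the PageRank vector — with a periodic Aitken-style componentwise extrapolation step standing in, very loosely, for the vector extrapolation accelerator; the matrix, damping factor and tolerances are illustrative.

        import numpy as np

        def pagerank_power(P, alpha=0.85, tol=1e-10, max_iter=1000, extrap_every=10):
            """Power iteration x <- alpha*P^T x + (1-alpha)/n with periodic
            Aitken delta-squared extrapolation (illustrative accelerator only)."""
            n = P.shape[0]
            x = np.full(n, 1.0 / n)
            history = []
            for it in range(max_iter):
                x_new = alpha * P.T @ x + (1.0 - alpha) / n
                x_new /= x_new.sum()
                history.append(x_new)
                if (it + 1) % extrap_every == 0 and len(history) >= 3:
                    x0, x1, x2 = history[-3], history[-2], history[-1]
                    denom = x2 - 2.0 * x1 + x0
                    safe = np.abs(denom) > 1e-14
                    x_acc = x2.copy()
                    x_acc[safe] -= (x2[safe] - x1[safe]) ** 2 / denom[safe]
                    x_new = np.clip(x_acc, 0.0, None)
                    x_new /= x_new.sum()
                if np.abs(x_new - x).sum() < tol:
                    return x_new
                x = x_new
            return x

        # Tiny 3-node example; rows of P are the outgoing-link probabilities of each node.
        P = np.array([[0.0, 0.5, 0.5],
                      [1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])
        print(pagerank_power(P))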

  10. Shredder: GPU-Accelerated Incremental Storage and Computation

    OpenAIRE

    Bhatotia, Pramod; Rodrigues, Rodrigo; Verma, Akshat

    2012-01-01

    Redundancy elimination using data deduplication and incremental data processing has emerged as an important technique to minimize storage and computation requirements in data center computing. In this paper, we present the design, implementation and evaluation of Shredder, a high performance content-based chunking framework for supporting incremental storage and computation systems. Shredder exploits the massively parallel processing power of GPUs to overcome the CPU bottlenecks of content-ba...
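
    Shredder's GPU kernels are not shown here; the fragment below is a plain CPU illustration of the operation being offloaded — content-based chunking with a rolling hash over a sliding window, declaring a chunk boundary whenever the hash passes a divisor test. The window size, mask and hash constants are arbitrary choices for the example, not Shredder's parameters.

        import os

        def chunk_boundaries(data: bytes, window=48, mask=(1 << 13) - 1,
                             min_size=2048, max_size=65536):
            """Return end offsets of content-defined chunks (illustrative scheme)."""
            PRIME, MOD = 1000003, 1 << 32
            pow_w = pow(PRIME, window - 1, MOD)
            boundaries, start, h = [], 0, 0
            for i, byte in enumerate(data):
                if i - start >= window:                       # slide the hashing window
                    h = (h - data[i - window] * pow_w) % MOD
                h = (h * PRIME + byte) % MOD
                size = i - start + 1
                if (size >= min_size and (h & mask) == 0) or size >= max_size:
                    boundaries.append(i + 1)                  # chunk ends after this byte
                    start, h = i + 1, 0
            if start < len(data):
                boundaries.append(len(data))
            return boundaries

        print(chunk_boundaries(os.urandom(1 << 20))[:5])      # first few chunk ends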

  11. BaBar computing - From collisions to physics results

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    The BaBar experiment at SLAC studies B-physics at the Upsilon(4S) resonance using the high-luminosity e⁺e⁻ collider PEP-II at the Stanford Linear Accelerator Center (SLAC). Taking, processing and analyzing the very large data samples is a significant computing challenge. This presentation will describe the entire BaBar computing chain and illustrate the solutions chosen as well as their evolution with the ever higher luminosity being delivered by PEP-II. This will include data acquisition and software triggering in a high-availability, low-deadtime online environment; a prompt, automated calibration pass through the data at SLAC; and the full reconstruction of the data, which takes place at INFN-Padova within 24 hours. Monte Carlo production takes place in a highly automated fashion at 25+ sites. The resulting real and simulated data are distributed and made available at SLAC and other computing centers. For analysis a much more sophisticated skimming pass has been introduced in the past year, ...

  12. From experiment to design -- Fault characterization and detection in parallel computer systems using computational accelerators

    Science.gov (United States)

    Yim, Keun Soo

    This dissertation summarizes experimental validation and co-design studies conducted to optimize the fault detection capabilities and overheads in hybrid computer systems (e.g., using CPUs and Graphics Processing Units, or GPUs), and consequently to improve the scalability of parallel computer systems using computational accelerators. The experimental validation studies were conducted to help us understand the failure characteristics of CPU-GPU hybrid computer systems under various types of hardware faults. The main characterization targets were faults that are difficult to detect and/or recover from, e.g., faults that cause long latency failures (Ch. 3), faults in dynamically allocated resources (Ch. 4), faults in GPUs (Ch. 5), faults in MPI programs (Ch. 6), and microarchitecture-level faults with specific timing features (Ch. 7). The co-design studies were based on the characterization results. One of the co-designed systems has a set of source-to-source translators that customize and strategically place error detectors in the source code of target GPU programs (Ch. 5). Another co-designed system uses an extension card to learn the normal behavioral and semantic execution patterns of message-passing processes executing on CPUs, and to detect abnormal behaviors of those parallel processes (Ch. 6). The third co-designed system is a co-processor that has a set of new instructions in order to support software-implemented fault detection techniques (Ch. 7). The work described in this dissertation gains more importance because heterogeneous processors have become an essential component of state-of-the-art supercomputers. GPUs were used in three of the five fastest supercomputers that were operating in 2011. Our work included comprehensive fault characterization studies in CPU-GPU hybrid computers. In CPUs, we monitored the target systems for a long period of time after injecting faults (a temporally comprehensive experiment), and injected faults into various types of

  13. Distribution of computer functionality for accelerator control at the Brookhaven AGS

    International Nuclear Information System (INIS)

    Stevens, A.; Clifford, T.; Frankel, R.

    1985-01-01

    A set of physical and functional system components and their interconnection protocols has been established for all controls work at the AGS. Portions of these designs were tested as part of enhanced operation of the AGS as a source of polarized protons, and additional segments will be implemented during the continuing construction efforts which are adding heavy-ion capability to our facility. Included in our efforts are the following computer and control system elements: a broadband local area network, which embodies modems, transmission systems and branch interface units; a hierarchical layer, which performs certain database and watchdog/alarm functions; a group of workstation processors (Apollos), which perform the function of traditional minicomputer hosts; and a layer which provides both real-time control and standardization functions for accelerator devices and instrumentation. Database and other accelerator functionality is assigned to the most appropriate level within our network for real-time performance, long-term utility, and orderly growth

  14. Linear accelerator for production of tritium: Physics design challenges

    Energy Technology Data Exchange (ETDEWEB)

    Wangler, T.P.; Lawrence, G.P.; Bhatia, T.S.; Billen, J.H.; Chan, K.C.D.; Garnett, R.W.; Guy, F.W.; Liska, D.; Nath, S.; Neuschaefer, G.; Shubaly, M.

    1990-01-01

    In the summer of 1989, a collaboration between Los Alamos National Laboratory and Brookhaven National Laboratory conducted a study to establish a reference design of a facility for accelerator production of tritium (APT). The APT concept is that of a neutron-spallation source, which is based on the use of high-energy protons to bombard lead nuclei, resulting in the production of large quantities of neutrons. Neutrons from the lead are captured by lithium to produce tritium. This paper describes the design of a 1.6-GeV, 250-mA proton cw linear accelerator for APT.
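
    A back-of-envelope check (not a figure quoted in the abstract) shows why such a machine is demanding: the cw beam power implied by the quoted energy and current is

        \[
        P_{\mathrm{beam}} = E\,I
                          = 1.6\ \mathrm{GeV}\times 250\ \mathrm{mA}
                          = 1.6\times 10^{9}\ \mathrm{V}\times 0.25\ \mathrm{A}
                          = 4\times 10^{8}\ \mathrm{W}
                          = 400\ \mathrm{MW\ (cw)}.
        \]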

  15. Mathematical model of accelerator output characteristics and their calculation on a computer

    International Nuclear Information System (INIS)

    Mishulina, O.A.; Ul'yanina, M.N.; Kornilova, T.V.

    1975-01-01

    A mathematical model of the output characteristics of a linear accelerator is described. The model is a system of differential equations. The presence of phase limitations is a specific feature of the problem formulation, which makes it possible to ensure higher simulation accuracy and to determine a capture coefficient. An algorithm for computing the output characteristics is elaborated on the basis of the suggested mathematical model. The output characteristics of the accelerator are the capture coefficient, the coordinate expectation characterizing the average phase of the beam particles, the coordinate expectation characterizing the average reverse relative velocity of the beam particles, and the dispersions of these coordinates. The calculation methods for the accelerator output characteristics are described in detail. The computations have been performed on the BESM-6 computer, the characteristics computing time being 2 min 20 sec. The relative error of the parameter computation averages 10⁻²

  16. Accelerator System Model (ASM) user manual with physics and engineering model documentation. ASM version 1.0

    International Nuclear Information System (INIS)

    1993-07-01

    The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the ''construction'' of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.

  17. Accelerator System Model (ASM) user manual with physics and engineering model documentation. ASM version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-07-01

    The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the ''construction'' of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.

  18. PC as physics computer for LHC?

    International Nuclear Information System (INIS)

    Jarp, Sverre; Simmins, Antony; Tang, Hong

    1996-01-01

    In the last five years, we have seen RISC workstations take over the computing scene that was once controlled by mainframes and supercomputers. In this paper we will argue that the same phenomenon might happen again. A project, active since March this year in the Physics Data Processing group of CERN's CN division is described where ordinary desktop PCs running Windows (NT and 3.11) have been used for creating an environment for running large LHC batch jobs (initially the DICE simulation job of Atlas). The problems encountered in porting both the CERN library and the specific Atlas codes are described together with some encouraging benchmark results when comparing to existing RISC workstations in use by the Atlas collaboration. The issues of establishing the batch environment (Batch monitor, staging software, etc.) are also covered. Finally a quick extrapolation of commodity computing power available in the future is touched upon to indicate what kind of cost envelope could be sufficient for the simulation farms required by the LHC experiments. (author)

  19. Simulation and computation in health physics training

    International Nuclear Information System (INIS)

    Lakey, S.R.A.; Gibbs, D.C.C.; Marchant, C.P.

    1980-01-01

    The Royal Naval College has devised a number of computer-aided learning programmes applicable to health physics, which include radiation shield design and optimisation, environmental impact of a reactor accident, exposure levels produced by an inert radioactive gas cloud, and the prediction of radiation detector response in various radiation field conditions. Analogue computers are used on reduced or fast time scales because time-dependent phenomena are not always easily assimilated in real time. The build-up and decay of fission products, the dynamics of intake of radioactive material and reactor accident dynamics can be effectively simulated. It is essential to relate these simulations to real time, and the College applies a research reactor and an analytical phantom to this end. A special feature of the reactor is a chamber which can be supplied with Argon-41 from reactor exhaust gases to create a realistic gaseous contamination environment. Reactor accident situations are also taught by using role playing sequences carried out in real time in the emergency facilities associated with the research reactor. These facilities are outlined and the training technique illustrated with examples of the calculations and simulations. The training needs of the future are discussed, with emphasis on optimisation and cost-benefit analysis. (H.K.)
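
    As a concrete example of the "build-up and decay" simulations mentioned above, a two-member chain (parent -> daughter -> stable) can be integrated step by step and checked against the analytic Bateman solution. The half-lives and initial inventory below are hypothetical placeholders, not values used in the College's exercises.

        import math

        # Parent -> daughter -> stable, simple Euler integration over one day.
        lam1 = math.log(2) / (8.0 * 3600)     # parent decay constant (8 h half-life, hypothetical)
        lam2 = math.log(2) / (0.5 * 3600)     # daughter decay constant (30 min half-life)
        N1, N2 = 1.0e12, 0.0                  # initial atoms (arbitrary)
        dt, t_end = 1.0, 24 * 3600            # 1 s steps, 24 h of simulated time

        t = 0.0
        while t < t_end:
            dN1 = -lam1 * N1 * dt
            dN2 = (lam1 * N1 - lam2 * N2) * dt
            N1, N2, t = N1 + dN1, N2 + dN2, t + dt

        # Analytic Bateman solution for the daughter, as a cross-check.
        N2_exact = 1.0e12 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t_end) - math.exp(-lam2 * t_end))
        print(N2, N2_exact)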

  20. PC as physics computer for LHC?

    CERN Document Server

    Jarp, S; Simmins, A; Yaari, R; Jarp, Sverre; Tang, Hong; Simmins, Antony; Yaari, Refael

    1995-01-01

    In the last five years, we have seen RISC workstations take over the computing scene that was once controlled by mainframes and supercomputers. In this paper we will argue that the same phenomenon might happen again. A project, active since March this year in the Physics Data Processing group of CERN's CN division is described where ordinary desktop PCs running Windows (NT and 3.11) have been used for creating an environment for running large LHC batch jobs (initially the DICE simulation job of Atlas). The problems encountered in porting both the CERN library and the specific Atlas codes are described together with some encouraging benchmark results when comparing to existing RISC workstations in use by the Atlas collaboration. The issues of establishing the batch environment (Batch monitor, staging software, etc.) are also covered. Finally a quick extrapolation of commodity computing power available in the future is touched upon to indicate what kind of cost envelope could be sufficient for the simulation fa...

  1. Pc as Physics Computer for Lhc ?

    Science.gov (United States)

    Jarp, Sverre; Simmins, Antony; Tang, Hong; Yaari, R.

    In the last five years, we have seen RISC workstations take over the computing scene that was once controlled by mainframes and supercomputers. In this paper we will argue that the same phenomenon might happen again. A project, active since March this year in the Physics Data Processing group of CERN's CN division is described where ordinary desktop PCs running Windows (NT and 3.11) have been used for creating an environment for running large LHC batch jobs (initially the DICE simulation job of Atlas). The problems encountered in porting both the CERN library and the specific Atlas codes are described together with some encouraging benchmark results when comparing to existing RISC workstations in use by the Atlas collaboration. The issues of establishing the batch environment (Batch monitor, staging software, etc.) are also covered. Finally a quick extrapolation of commodity computing power available in the future is touched upon to indicate what kind of cost envelope could be sufficient for the simulation farms required by the LHC experiments.

  2. Proceedings of the Workshop on relativistic heavy ion physics at present and future accelerators

    International Nuclear Information System (INIS)

    Csoergoe, T.; Hegyi, S.; Lukacs, B.; Zimanyi, J.

    1991-09-01

    This volume contains the Proceedings of the Budapest Workshop on relativistic heavy ion physics at present and future accelerators. The topics include experimental heavy ion physics, particle phenomenology, Bose-Einstein correlations, relativistic transport theory, quark-gluon plasma rehadronization, astronuclear physics, lepton-pair production and intermittency. All contributions were indexed separately for the INIS database. (G.P.)

  3. Acceleration of heavy ions to relativistic energies and their use in physics and biomedicine

    International Nuclear Information System (INIS)

    White, M.G.

    1977-01-01

    The uses of accelerated heavy ions in physics and biomedicine are listed. The special properties of high energy heavy ions and their fields of applications, the desirable ions and energies, requirements for a relativistic heavy ion accelerator, and AGS and Bevalac parameters are discussed. 26 references

  4. Cloud computing approaches to accelerate drug discovery value chain.

    Science.gov (United States)

    Garg, Vibhav; Arora, Suchir; Gupta, Chitra

    2011-12-01

    Continued advancements in the area of technology have helped high throughput screening (HTS) evolve from a linear to a parallel approach by performing system level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data of the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze this data to identify informational tags. This need is again posing challenges to computer scientists to offer the matching hardware and software infrastructure, while managing the varying degrees of desired computational power. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SAAS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as Cloud Computing, is now transforming drug discovery research. Also, integration of Cloud computing with parallel computing is certainly expanding its footprint in the life sciences community. The speed, efficiency and cost effectiveness have made cloud computing a 'good to have tool' for researchers, providing them significant flexibility, allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, the Discovery-Cloud would be best suited to managing drug discovery and clinical development data generated using advanced HTS techniques, thus supporting the vision of personalized medicine.

  5. Bookshelf (Advances of Accelerator Physics Technologies, edited by Herwig Schopper)

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    Particle accelerators have always drawn upon the most advanced technologies. For Cockcroft and Walton it was high voltages, while the cyclotrons and synchrotrons that followed depended upon acceleration systems designed in the race to perfect wartime radar. As accelerators became too big for the university workshop to handle, the manufacturers of heavy electrical machinery were brought in to make hundreds of metres of electromagnets. They found the requirements of precision and reliability surpassed the quality of the best of their products and had to develop new methods of insulation and precision assembly. They now readily admit that in meeting our challenge they extended their own grasp of technology to the benefit of their less exotic customers; not to mention their shareholders. The stimulation of industry did not stop there - the physicist, by the nature of his craft, is always the first to know what has just become possible. In their turn many industries, from those which prospect for petrochemicals to others constructing the Channel Tunnel, have become the technological beneficiaries of this big science. The latest of these technologies is of course that of superconductivity, and this is fully covered in this book. But in the many chapters which describe the state of the art of accelerator design, the reader will encounter numerous examples where the possible awaits an everyday application. This excellent compendium of advances in the accelerator field is therefore obligatory reading for anyone in an industry striving to deserve the label of high-tech. Not only does it for the first time draw together authoritative contributions by those who lead these technologies, but it explains how the large majority of today's accelerators are put to work to cure patients in hospital and to provide synchrotron radiation for a rich spectrum of new industrial applications. In addition there is much in the volume that is essential reading for the accelerator

  6. Physics design of an accelerator for an accelerator-driven subcritical system

    Directory of Open Access Journals (Sweden)

    Zhihui Li

    2013-08-01

    An accelerator-driven subcritical system (ADS) program was launched in China in 2011, aiming to design and build an ADS demonstration facility with a thermal power of more than 1000 MW in multiple phases lasting about 20 years. The driver linac is specified to deliver a 1.5 GeV, 10 mA beam in continuous-wave (cw) operation. To meet the extremely demanding reliability and availability requirements, the linac is designed with substantial installed margin and fault tolerance, including hot-spare injectors and local compensation for key element failures. The accelerator complex consists of two parallel 10 MeV injectors, a joint medium-energy beam transport line, a main linac, and a high-energy beam transport line. Superconducting accelerating structures are employed throughout, except for the room-temperature radio-frequency quadrupole (RFQ) accelerators. The general design considerations and the beam dynamics design of the driver linac complex are presented here.
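
    As a quick orientation, the quoted linac parameters fix the driver beam power directly. The short Python sketch below uses only the 1.5 GeV and 10 mA values given in the abstract (the ADS driver is a proton linac; the 1000 MW figure is the target thermal power of the subcritical core, not derived here):

        # Rough cw beam-power estimate for the ADS driver linac described above.
        # For a cw proton beam, beam power P [W] = kinetic energy [eV] * current [A].
        energy_eV = 1.5e9    # 1.5 GeV (from the abstract)
        current_A = 10e-3    # 10 mA cw (from the abstract)

        beam_power_W = energy_eV * current_A
        print(f"cw beam power: {beam_power_W / 1e6:.1f} MW")   # -> 15.0 MW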

  7. Steady-state natural circulation analysis with computational fluid dynamic codes of a liquid metal-cooled accelerator driven system

    International Nuclear Information System (INIS)

    Abanades, A.; Pena, A.

    2009-01-01

    A new and innovative type of nuclear installation is under investigation in the nuclear community for its potential application to nuclear waste management and, above all, for its capability to enhance the sustainability of nuclear energy as a component of a future fuel cycle with improved use of primary uranium ore and reduced radioactive waste generation. These installations, called accelerator-driven systems (ADS), are the result of a profitable symbiosis between accelerator technology, high-energy physics and reactor technology. Many ADS concepts are based on heavy liquid metal (HLM) coolants because of their neutronic and thermo-physical properties. Moreover, such coolants permit operation in natural (free) circulation mode, one of the main aims of passive safety systems. In this paper, this operating regime is analysed for a proposed ADS design using computational fluid dynamics (CFD)

  8. Computer automation of an accelerator mass spectrometry system

    International Nuclear Information System (INIS)

    Gressett, J.D.; Maxson, D.L.; Matteson, S.; McDaniel, F.D.; Duggan, J.L.; Mackey, H.J.; North Texas State Univ., Denton, TX; Anthony, J.M.

    1989-01-01

    The determination of trace impurities in electronic materials using accelerator mass spectrometry (AMS) requires efficient automation of the beam transport and mass discrimination hardware. The ability to choose between a variety of charge states, isotopes and injected molecules is necessary to provide survey capabilities similar to that available on conventional mass spectrometers. This paper will discuss automation hardware and software for flexible, high-sensitivity trace analysis of electronic materials, e.g. Si, GaAs and HgCdTe. Details regarding settling times will be presented, along with proof-of-principle experimental data. Potential and present applications will also be discussed. (orig.)

  9. Overview of Accelerator Physics Studies and High Level Software for the Diamond Light Source

    CERN Document Server

    Bartolini, Riccardo; Belgroune, Mahdia; Christou, Chris; Holder, David J; Jones, James; Kempson, Vince; Martin, Ian; Rowland, James H; Singh, Beni; Smith, Susan L; Varley, Jennifer Anne; Wyles, Naomi

    2005-01-01

    DIAMOND is a 3 GeV synchrotron light source under construction at Rutherford Appleton Laboratory in Oxfordshire (UK). The accelerator complex consists of a 100 MeV LINAC, a full-energy booster and a 3 GeV storage ring with 22 straight sections available for IDs. Installation of all three accelerators has begun, and LINAC commissioning is due to start in Spring 2005. This paper gives an overview of the accelerator physics activity to produce final layouts and to prepare for the commissioning of the accelerator complex. The DIAMOND facility is expected to be operational for users in 2007.

  10. Accelerator facilities and development of physics in Kazakhstan (1992-2002)

    International Nuclear Information System (INIS)

    Shkol'nik, V.S.; Arzumanov, A.A.; Borisenko, A.N.; Gorlachev, I.D.; Kadyrzhanov, K.K.; Kuterbekov, K.A.; Lysukhin, S.N.; Tuleushev, A.Zh.

    2003-01-01

    The monograph is devoted to the use of the isochronous cyclotron U-150M and the heavy-ion accelerator UKP-2-1, the base facilities of the Institute of Nuclear Physics of the National Nuclear Center of the Republic of Kazakhstan (INP NNC RK), for scientific research in low- and medium-energy nuclear physics, radiation solid state physics and applied nuclear physics. The history of the creation of the facilities is recounted and some archival documents are given. The use of the accelerators of INP NNC RK over the last ten years (1992-2002) is described in detail. The parameters of the facilities, photographs of the main functional units of the accelerators, and the nuclear and physical methods realized on these basic facilities are presented. The appendixes contain copies of some important historical documents as well as the following materials: a list of accelerator-based research themes, a list of dissertations, and lists of publications of the Nuclear Physics Department (1972-2002) and the Solid State Department (1995-2002) carried out using the accelerators of INP NNC RK. The book is intended for scientists studying topical problems of low- and medium-energy nuclear physics and radiation solid state physics, as well as for students specializing in these fields (author)

  11. Accelerator Preparations for Muon Physics Experiments at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Syphers, M.J.; /Fermilab

    2009-10-01

    The use of existing Fermilab facilities to provide beams for two muon experiments - the Muon to Electron Conversion Experiment (Mu2e) and the New g-2 Experiment - is under consideration. Plans are being pursued to perform these experiments following the completion of the Tevatron Collider Run II, utilizing the beam lines and storage rings used today for antiproton accumulation without considerable reconfiguration. Operating scenarios being investigated and anticipated accelerator improvements or reconfigurations will be presented.

  12. Computational acceleration for MR image reconstruction in partially parallel imaging.

    Science.gov (United States)

    Ye, Xiaojing; Chen, Yunmei; Huang, Feng

    2011-05-01

    In this paper, we present a fast numerical algorithm for solving total variation and ℓ1 (TVL1) based image reconstruction with application in partially parallel magnetic resonance imaging. Our algorithm uses a variable splitting method to reduce computational cost. Moreover, the Barzilai-Borwein step size selection method is adopted in our algorithm for much faster convergence. Experimental results on clinical partially parallel imaging data demonstrate that the proposed algorithm requires far fewer iterations and/or less computational cost than the recently developed operator splitting and Bregman operator splitting methods, which can deal with a general sensing matrix in the reconstruction framework, to achieve similar or even better quality of reconstructed images.
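
    The Barzilai-Borwein (BB) step-size rule named above can be illustrated on a generic smooth least-squares problem. The sketch below is only an illustration of the BB rule on an invented toy problem, not the authors' TVL1 reconstruction algorithm or their data:

        import numpy as np

        # Barzilai-Borwein gradient descent on a toy least-squares problem,
        # min_x 0.5*||A x - b||^2, standing in for the smooth part of a TVL1 objective.
        rng = np.random.default_rng(0)
        A = rng.standard_normal((40, 20))
        b = rng.standard_normal(40)

        def grad(x):
            return A.T @ (A @ x - b)

        x = np.zeros(20)
        g = grad(x)
        step = 1e-3                        # conservative first step before BB takes over
        for _ in range(200):
            x_new = x - step * g
            g_new = grad(x_new)
            s, y = x_new - x, g_new - g    # last displacement and gradient change
            step = float(s @ s) / max(float(s @ y), 1e-12)   # "BB1" step size
            x, g = x_new, g_new

        print("residual norm:", np.linalg.norm(A @ x - b))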

  13. Sonification of simulations in computational physics

    International Nuclear Information System (INIS)

    Vogt, K.

    2010-01-01

    Sonification is the translation of information for auditory perception, excluding speech itself. The cognitive performance of pattern recognition is striking for sound, and has too long been disregarded by the scientific mainstream. Examples of 'spontaneous sonification' and systematic research for about 20 years have proven that sonification provides a valuable tool for the exploration of scientific data. The data in this thesis stem from computational physics, where numerical simulations are applied to problems in physics. Prominent examples are spin models and lattice quantum field theories. The corresponding data lend themselves very well to innovative display methods: they are structured on discrete lattices, often stochastic, high-dimensional and abstract, and they provide huge amounts of data. Furthermore, they have no inherently perceptual dimension. When designing the sonification of simulation data, one has to make decisions on three levels, both for the data and the sound model: the level of meaning (phenomenological; metaphoric); of structure (in time and space), and of elements ('display units' vs. 'gestalt units'). The design usually proceeds as a bottom-up or top-down process. This thesis provides a 'toolbox' for helping in these decisions. It describes tools that have proven particularly useful in the context of simulation data. An explicit method of top-down sonification design is the metaphoric sonification method, which is based on expert interviews. Furthermore, qualitative and quantitative evaluation methods are presented, on the basis of which a set of evaluation criteria is proposed. The translation between a scientific and the sound synthesis domain is elucidated by a sonification operator. For this formalization, a collection of notation modules is provided. Showcases are discussed in detail that have been developed in the interdisciplinary research projects SonEnvir and QCD-audio, during the second Science By Ear workshop and during a

  14. HEPLIB '91: International users meeting on the support and environments of high energy physics computing

    International Nuclear Information System (INIS)

    Johnstad, H.

    1991-01-01

    The purpose of this meeting is to discuss the current and future HEP computing support and environments from the perspective of new horizons in accelerator, physics, and computing technologies. Topics of interest to the Meeting include (but are not limited to): the forming of the HEPLIB world user group for High Energy Physics computing; mandate, desirables, coordination, organization, funding; user experience, international collaboration; the roles of national labs, universities, and industry; range of software, Monte Carlo, mathematics, physics, interactive analysis, text processors, editors, graphics, data base systems, code management tools; program libraries, frequency of updates, distribution; distributed and interactive computing, data base systems, user interface, UNIX operating systems, networking, compilers, Xlib, X-Graphics; documentation, updates, availability, distribution; code management in large collaborations, keeping track of program versions; and quality assurance, testing, conventions, standards

  15. New computing techniques in physics research

    International Nuclear Information System (INIS)

    Perret-Gallix, D.; Wojcik, W.

    1990-01-01

    These proceedings relate in a pragmatic way the use of methods and techniques of software engineering and artificial intelligence in high energy and nuclear physics. Such fundamental research can only be done through the design, construction and operation of equipment and systems that are among the most complex ever undertaken. The use of these new methods is mandatory in such an environment; however, their proper integration into these real applications raises some unsolved problems, and their solution, beyond the research field, will lead to a better understanding of some fundamental aspects of software engineering and artificial intelligence. Here is a sample of the subjects covered in the proceedings: software engineering in a multi-user, multi-version, multi-system environment; project management; software validation and quality control; data structures and management; object-oriented languages; multi-language applications; interactive data analysis; expert systems for diagnosis; expert systems for real-time applications; neural networks for pattern recognition; and symbolic manipulation for automatic computation of complex processes

  16. Can low energy electrons affect high energy physics accelerators?

    CERN Document Server

    Cimino, R; Furman, M A; Pivi, M; Ruggiero, F; Rumolo, Giovanni; Zimmermann, Frank

    2004-01-01

    The properties of the electrons participating in the build up of an electron cloud (EC) inside the beam-pipe have become an increasingly important issue for present and future accelerators whose performance may be limited by this effect. The EC formation and evolution are determined by the wall-surface properties of the accelerator vacuum chamber. Thus, the accurate modeling of these surface properties is an indispensable input to simulation codes aimed at the correct prediction of build-up thresholds, electron-induced instability or EC heat load. In this letter, we present the results of surface measurements performed on a prototype of the beam screen adopted for the Large Hadron Collider (LHC), which presently is under construction at CERN. We have measured the total secondary electron yield (SEY) as well as the related energy distribution curves (EDC) of the secondary electrons as a function of incident electron energy. Attention has been paid, for the first time in this context, to the probability at whic...

  17. Can Low Energy Electrons Affect High Energy Physics Accelerators?

    International Nuclear Information System (INIS)

    Cimino, Roberto

    2004-01-01

    The properties of the electrons participating in the build up of an electron cloud (EC) inside the beam-pipe have become an increasingly important issue for present and future accelerators whose performance may be limited by this effect. The EC formation and evolution are determined by the wall-surface properties of the accelerator vacuum chamber. Thus, the accurate modeling of these surface properties is an indispensable input to simulation codes aimed at the correct prediction of build-up thresholds, electron-induced instability or EC heat load. In this letter, we present the results of surface measurements performed on a prototype of the beam screen adopted for the Large Hadron Collider (LHC), which presently is under construction at CERN. We have measured the total secondary electron yield (SEY) as well as the related energy distribution curves (EDC) of the secondary electrons as a function of incident electron energy. Attention has been paid, for the first time in this context, to the probability at which low-energy electrons (≲ 20 eV) impacting on the wall create secondaries or are elastically reflected. It is shown that the ratio of reflected to true-secondary electrons increases for decreasing energy and that the SEY approaches unity in the limit of zero primary electron energy
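
    Build-up simulation codes of the kind referred to in these two records typically take the SEY curve as a parametrized input with a separate low-energy elastic component. The sketch below combines a Furman/Pivi-style true-secondary term with a simple reflected term that tends to unity at zero energy, consistent with the conclusion quoted above; all parameter values are illustrative assumptions, not the measured LHC beam-screen data:

        import numpy as np

        def sey_model(E, delta_max=1.8, E_max=250.0, s=1.35, E_r=60.0):
            """Illustrative secondary-electron yield vs primary energy E (eV).

            true-secondary part: Furman/Pivi-style universal curve;
            elastic part: simple exponential approaching 1 as E -> 0, mimicking
            the rise of reflected electrons at low energy reported above.
            Parameter values are placeholders, not fitted to the measurements.
            """
            E = np.asarray(E, dtype=float)
            x = E / E_max
            delta_true = delta_max * s * x / (s - 1.0 + x**s)
            delta_elastic = np.exp(-E / E_r)
            return delta_true + delta_elastic

        for E in (1.0, 20.0, 250.0, 1000.0):
            print(f"E = {E:6.1f} eV   SEY = {float(sey_model(E)):.2f}")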

  18. Accelerators

    CERN Multimedia

    CERN. Geneva

    2001-01-01

    The talk summarizes the principles of particle acceleration and addresses problems related to storage rings like LEP and LHC. Special emphasis will be given to orbit stability, long term stability of the particle motion, collective effects and synchrotron radiation.

  19. Computer control of large accelerators design concepts and methods

    International Nuclear Information System (INIS)

    Beck, F.; Gormley, M.

    1984-05-01

    Unlike most of the specialities treated in this volume, control system design is still an art, not a science. These lectures are an attempt to produce a primer for prospective practitioners of this art. A large modern accelerator requires a comprehensive control system for commissioning, machine studies and day-to-day operation. Faced with the requirement to design a control system for such a machine, the control system architect has a bewildering array of technical devices and techniques at his disposal, and it is our aim in the following chapters to lead him through the characteristics of the problems he will have to face and the practical alternatives available for solving them. We emphasize good system architecture using commercially available hardware and software components, but in addition we discuss the actual control strategies which are to be implemented since it is at the point of deciding what facilities shall be available that the complexity of the control system and its cost are implicitly decided. 19 references

  20. Advanced Computational Models for Accelerator-Driven Systems

    International Nuclear Information System (INIS)

    Talamo, A.; Ravetto, P.; Gudowsk, W.

    2012-01-01

    In the nuclear engineering scientific community, Accelerator Driven Systems (ADSs) have been proposed and investigated for the transmutation of nuclear waste, especially plutonium and minor actinides. These fuels have a quite low effective delayed neutron fraction relative to uranium fuel; therefore, the subcriticality of the core offers a unique safety feature with respect to critical reactors. The intrinsic safety of an ADS allows the elimination of the operational control rods, hence the reactivity excess during burnup can be managed by the intensity of the proton beam, fuel shuffling, and eventually by burnable poisons. However, the intrinsic safety of a subcritical system does not guarantee that ADSs are immune from severe accidents (core melting), since the decay heat of an ADS is very similar to that of a critical system. Normally, ADSs operate with an effective multiplication factor between 0.98 and 0.92, which means that the spallation neutron source contributes little to the neutron population. In addition, for 1 GeV incident protons and a lead-bismuth target, about 50% of the spallation neutrons have energies below 1 MeV and only 15% have energies above 3 MeV. In the light of these remarks, the transmutation performance of an ADS is very close to that of a critical reactor.
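
    The statement that the spallation source "contributes little to the neutron population" follows directly from the quoted k_eff range via the standard source-multiplication relation; a minimal check (only the k_eff values come from the text above):

        # In a source-driven subcritical core, each external (spallation) neutron is
        # multiplied by roughly 1/(1 - k_eff), so the share of the neutron population
        # supplied directly by the source is of order (1 - k_eff).
        for k_eff in (0.98, 0.95, 0.92):
            multiplication = 1.0 / (1.0 - k_eff)
            source_share = 1.0 - k_eff
            print(f"k_eff = {k_eff:.2f}: source multiplication ~ {multiplication:5.1f}, "
                  f"direct source share ~ {source_share:.0%}")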

  1. Computer control of large accelerators design concepts and methods

    Energy Technology Data Exchange (ETDEWEB)

    Beck, F.; Gormley, M.

    1984-05-01

    Unlike most of the specialities treated in this volume, control system design is still an art, not a science. These lectures are an attempt to produce a primer for prospective practitioners of this art. A large modern accelerator requires a comprehensive control system for commissioning, machine studies and day-to-day operation. Faced with the requirement to design a control system for such a machine, the control system architect has a bewildering array of technical devices and techniques at his disposal, and it is our aim in the following chapters to lead him through the characteristics of the problems he will have to face and the practical alternatives available for solving them. We emphasize good system architecture using commercially available hardware and software components, but in addition we discuss the actual control strategies which are to be implemented since it is at the point of deciding what facilities shall be available that the complexity of the control system and its cost are implicitly decided. 19 references.

  2. X-BAND LINEAR COLLIDER R and D IN ACCELERATING STRUCTURES THROUGH ADVANCED COMPUTING

    International Nuclear Information System (INIS)

    Li, Z

    2004-01-01

    This paper describes a major computational effort that addresses key design issues in the high gradient accelerating structures for the proposed X-band linear collider, GLC/NLC. Supported by the US DOE's Accelerator Simulation Project, SLAC is developing a suite of parallel electromagnetic codes based on unstructured grids for modeling RF structures with higher accuracy and on a scale previously not possible. The new simulation tools have played an important role in the R and D of X-Band accelerating structures, in cell design, wakefield analysis and dark current studies

  3. The challenge of quantum computer simulations of physical phenomena

    International Nuclear Information System (INIS)

    Ortiz, G.; Knill, E.; Gubernatis, J.E.

    2002-01-01

    The goal of physics simulation using controllable quantum systems ('physics imitation') is to exploit quantum laws to advantage, and thus accomplish efficient simulation of physical phenomena. In this Note, we discuss the fundamental concepts behind this paradigm of information processing, such as the connection between models of computation and physical systems. The experimental simulation of a toy quantum many-body problem is described

  4. Selected works of basic research on the physics and technology of accelerator driven clean nuclear power system

    International Nuclear Information System (INIS)

    Zhao Zhixiang

    2002-01-01

    38 papers are presented in this selected collection of basic research on the physics and technology of accelerator-driven clean nuclear power systems. It covers reactor physics and experiments, accelerator physics and technology, nuclear physics, materials research and partitioning. 13 abstracts, which have been published in journals at home and abroad, are collected in the appendix

  5. Physics-Based Fragment Acceleration Modeling for Pressurized Tank Burst Risk Assessments

    Science.gov (United States)

    Manning, Ted A.; Lawrence, Scott L.

    2014-01-01

    As part of comprehensive efforts to develop physics-based risk assessment techniques for space systems at NASA, coupled computational fluid and rigid body dynamic simulations were carried out to investigate the flow mechanisms that accelerate tank fragments in bursting pressurized vessels. Simulations of several configurations were compared to analyses based on the industry-standard Baker explosion model, and were used to formulate an improved version of the model. The standard model, which neglects an external fluid, was found to agree best with simulation results only in configurations where the internal-to-external pressure ratio is very high and fragment curvature is small. The improved model introduces terms that accommodate an external fluid and better account for variations based on circumferential fragment count. Physics-based analysis was critical in increasing the model's range of applicability. The improved tank burst model can be used to produce more accurate risk assessments of space vehicle failure modes that involve high-speed debris, such as exploding propellant tanks and bursting rocket engines.

  6. A test harness for accelerating physics parameterization advancements into operations

    Science.gov (United States)

    Firl, G. J.; Bernardet, L.; Harrold, M.; Henderson, J.; Wolff, J.; Zhang, M.

    2017-12-01

    The process of transitioning advances in parameterization of sub-grid scale processes from initial idea to implementation is often much quicker than the transition from implementation to use in an operational setting. After all, considerable work must be undertaken by operational centers to fully test, evaluate, and implement new physics. The process is complicated by the scarcity of like-to-like comparisons, availability of HPC resources, and the "tuning problem" whereby advances in physics schemes are difficult to properly evaluate without first undertaking the expensive and time-consuming process of tuning to other schemes within a suite. To address this process shortcoming, the Global Model TestBed (GMTB), supported by the NWS NGGPS project and undertaken by the Developmental Testbed Center, has developed a physics test harness. It implements the concept of hierarchical testing, where the same code can be tested in model configurations of varying complexity from single column models (SCM) to fully coupled, cycled global simulations. Developers and users may choose at which level of complexity to engage. Several components of the physics test harness have been implemented, including a SCM and an end-to-end workflow that expands upon the one used at NOAA/EMC to run the GFS operationally, although the testbed components will necessarily morph to coincide with changes to the operational configuration (FV3-GFS). A standard, relatively user-friendly interface known as the Interoperable Physics Driver (IPD) is available for physics developers to connect their codes. This prerequisite exercise allows access to the testbed tools and removes a technical hurdle for potential inclusion into the Common Community Physics Package (CCPP). The testbed offers users the opportunity to conduct like-to-like comparisons between the operational physics suite and new development as well as among multiple developments. GMTB staff have demonstrated use of the testbed through a
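
    The "plug the same physics into hosts of different complexity" idea behind such a driver interface can be sketched generically; the class and function names below are invented for illustration and are not the actual IPD or CCPP API:

        from typing import Protocol
        import numpy as np

        class PhysicsScheme(Protocol):
            """Hypothetical driver interface: any scheme that updates a column state
            for one time step can be hosted by a single-column model or a 3-D model."""
            def run(self, temperature: np.ndarray, dt: float) -> np.ndarray: ...

        class ToyRadiation:
            def run(self, temperature, dt):
                # Relax the column toward 250 K on a 10-day time scale (illustrative only).
                return temperature + dt * (250.0 - temperature) / (10 * 86400.0)

        def single_column_host(scheme: PhysicsScheme, steps=48, dt=1800.0):
            """Minimal single-column 'host'; the same scheme object could equally be
            called from inside a full 3-D model's grid loop."""
            column = np.linspace(220.0, 290.0, 20)   # fake temperature profile (K)
            for _ in range(steps):
                column = scheme.run(column, dt)
            return column

        print(single_column_host(ToyRadiation())[:3])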

  7. The physics design of accelerator-driven transmutation systems

    International Nuclear Information System (INIS)

    Venneri, F.

    1995-01-01

    Nuclear systems under study in the Los Alamos Accelerator-Driven Transmutation Technology program (ADTT) will allow the destruction of nuclear spent fuel and weapons-return plutonium, as well as the production of nuclear energy from the thorium cycle, without a long-lived radioactive waste stream. The subcritical systems proposed represent a radical departure from traditional nuclear concepts (reactors), yet the actual implementation of ADTT systems is based on modest extrapolations of existing technology. These systems strive to keep the best that the nuclear technology has developed over the years, within a sensible conservative design envelope and eventually manage to offer a safer, less expensive and more environmentally sound approach to nuclear power

  8. The physics design of accelerator-driven transmutation systems

    Energy Technology Data Exchange (ETDEWEB)

    Venneri, F. [Los Alamos National Laboratory, NM (United States)

    1995-10-01

    Nuclear systems under study in the Los Alamos Accelerator-Driven Transmutation Technology program (ADTT) will allow the destruction of nuclear spent fuel and weapons-return plutonium, as well as the production of nuclear energy from the thorium cycle, without a long-lived radioactive waste stream. The subcritical systems proposed represent a radical departure from traditional nuclear concepts (reactors), yet the actual implementation of ADTT systems is based on modest extrapolations of existing technology. These systems strive to keep the best that the nuclear technology has developed over the years, within a sensible conservative design envelope and eventually manage to offer a safe, less expensive and more environmentally sound approach to nuclear power.

  9. Computing and physical methods to calculate Pu

    International Nuclear Information System (INIS)

    Mohamed, Ashraf Elsayed Mohamed

    2013-01-01

    The main limitations associated with increased plutonium content are related to the coolant void effect: as the spectrum becomes faster, the neutron flux in the thermal region tends towards zero and is concentrated in the region from 10 keV to 1 MeV. Thus, captures by 240Pu and 242Pu in the thermal and epithermal resonances disappear, and the 240Pu and 242Pu contributions to the void effect become positive. The higher the Pu content and the poorer the Pu quality, the larger the void effect. Regarding core control in nominal or transient conditions, Pu enrichment leads to a decrease in beta-eff and in the efficiency of soluble boron and control rods. The Doppler effect also tends to decrease when Pu replaces U, so that in transients the core could diverge again if the control is not effective enough. As for the voiding effect, plutonium degradation and the accumulation of 240Pu and 242Pu after multiple recycling lead to spectrum hardening and to a decrease in control effectiveness. One solution would be to use enriched boron in the soluble boron and shutdown rods. In this paper, I discuss advanced computing and physical methods to calculate Pu inside nuclear reactors and gloveboxes, present the different solutions that can be used to overcome the difficulties affecting safety parameters and reactor performance, and analyse the consequences of plutonium management on the whole fuel cycle, such as raw material savings and the fraction of nuclear electric power involved in Pu management. This is done through two types of scenario, one involving a low fraction of the nuclear park dedicated to plutonium management, the other involving a dilution of the plutonium throughout the whole nuclear park. (author)

  10. Computer codes for particle accelerator design and analysis: A compendium. Second edition

    International Nuclear Information System (INIS)

    Deaven, H.S.; Chan, K.C.D.

    1990-05-01

    The design of the next generation of high-energy accelerators will probably be done as an international collaborative effort, and it would make sense to establish, either formally or informally, an international center for accelerator codes with branches for maintenance, distribution, and consultation at strategically located accelerator centers around the world. This arrangement could have at least three beneficial effects. It would cut down duplication of effort, provide long-term support for the best codes, and provide a stimulating atmosphere for the evolution of new codes. It does not take much foresight to see that the natural evolution of accelerator design codes is toward the development of so-called Expert Systems, systems capable of taking design specifications of future accelerators and producing specifications for optimized magnetic transport and acceleration components, making a layout, and giving a fairly impartial cost estimate. Such an expert program would use present-day programs such as TRANSPORT, POISSON, and SUPERFISH as tools in the optimization process. Such a program would also serve to codify the experience of two generations of accelerator designers before it is lost as these designers reach retirement age. This document describes 203 codes that originate from 10 countries and are currently in use. The authors feel that this compendium will contribute to the dialogue supporting the international collaborative effort that is taking place in the field of accelerator physics today
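
    For readers unfamiliar with what a first-order beam-transport code such as TRANSPORT computes, the sketch below shows the underlying transfer-matrix bookkeeping for a drift/thin-quadrupole cell; the element values are arbitrary and the code is a generic illustration, not taken from any program in the compendium:

        import numpy as np

        def drift(L):
            """2x2 horizontal transfer matrix of a field-free drift of length L (m)."""
            return np.array([[1.0, L], [0.0, 1.0]])

        def thin_quad(f):
            """Thin-lens quadrupole of focal length f (m); f > 0 focuses in this plane."""
            return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

        # A FODO-like cell with arbitrary illustrative values: the particle traverses
        # a focusing quad, a drift, a defocusing quad and a second drift.
        cell = drift(2.0) @ thin_quad(-5.0) @ drift(2.0) @ thin_quad(5.0)

        x0 = np.array([1e-3, 0.0])       # 1 mm initial offset, zero angle
        print("one-cell map:\n", cell)
        print("particle after one cell:", cell @ x0)
        print("cell is stable:", abs(np.trace(cell)) / 2 < 1)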

  11. ISABELLE [Intersecting Storage Accelerators with the adjective belle] physics prospects

    International Nuclear Information System (INIS)

    1972-01-01

    This volume contains a collection of reports on physics prospects at a 200 x 200-GeV proton intersecting storage ring facility (Isabelle or ISA). General topics of papers included are: machine-related topics, general purpose detectors, strong interaction experiments, weak and electromagnetic interaction experiments, and other exotic ideas

  12. research proposal to the national accelerator centre: physical ...

    African Journals Online (AJOL)

    COWLEY


  13. Hadron physics at the new CW electron accelerators

    International Nuclear Information System (INIS)

    Burkert, V.D.

    1990-01-01

    Major trends of the physics program related to the study of hadron structure and hadron spectroscopy at the new high current, high duty cycle electron machines are discussed. It is concluded that planned experiments at these machines may have important impact on our understanding of the strong interaction by studying the internal structure and spectroscopy of the nucleon and lower mass hyperon states

  14. Automated and Assistive Tools for Accelerated Code migration of Scientific Computing on to Heterogeneous MultiCore Systems

    Science.gov (United States)

    2017-04-13

    AFRL-AFOSR-UK-TR-2017-0029: report on "Automated and Assistive Tools for Accelerated Code migration of Scientific Computing on to Heterogeneous MultiCore Systems" (performance period 2012 - 01/25/2015). Only fragments of the report cover sheet are available in this record.

  15. Physical design of 9 MeV travelling wave electron linac accelerating tube

    International Nuclear Information System (INIS)

    Chen Huaibi; Ding Xiaodong; Lin Yuzheng

    2000-01-01

    An accelerating tube is described. It is part of an accelerator used for the inspection of vehicle cargoes in rail cars, trucks, shipping containers, or airplanes at customs. A klystron with a power of 4 MW and a frequency of 2856 MHz supplies the microwave power. The electrons are accelerated by a travelling wave in the accelerating tube, which is about 220 cm long and has a buncher whose capture efficiency is more than 80%. The energy of the electrons after travelling through the tube can reach 9 MeV (pulse current 170 mA) or 6 MeV (pulse current 300 mA). The physical design of the accelerating tube, including the calculation of the longitudinal particle dynamics, the structure parameters and the operating characteristics, is presented
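
    A quick consistency check of the two quoted operating points against the 4 MW klystron (all numbers from the abstract; duty factor and RF-to-beam efficiency are not given, so only in-pulse beam power is computed):

        # Peak (in-pulse) beam power for the two operating points of the 9 MeV tube.
        operating_points = [(9.0e6, 0.170), (6.0e6, 0.300)]   # (energy in eV, pulse current in A)
        klystron_peak_W = 4.0e6                               # 4 MW peak RF power

        for energy_eV, current_A in operating_points:
            beam_W = energy_eV * current_A                    # eV * A = W
            print(f"{energy_eV/1e6:.0f} MeV at {current_A*1e3:.0f} mA -> "
                  f"{beam_W/1e6:.2f} MW peak beam power "
                  f"({beam_W/klystron_peak_W:.0%} of the klystron peak power)")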

  16. Introduction to the study of particle accelerators. Atomic, nuclear and high energy physics for engineers

    International Nuclear Information System (INIS)

    Warnecke, R.R.

    1975-01-01

    This book is intended for engineers taking part in the design, construction and operation of particle accelerators for nuclear physics and high-energy physics. It starts with some notions of the theory of relativity, analytical and statistical mechanics, and quantum mechanics. An outline of the properties of atomic nuclei, collision theory and the elements of gaseous plasma physics is followed by a discussion of elementary particles: characteristic parameters, properties, interactions and classification. (in French)

  17. Golden Jubilee photos: Computers for physics

    CERN Multimedia

    2004-01-01

    CERN's first computer, a huge vacuum-tube Ferranti Mercury, was installed in Building 2 in 1958. With its 60 microsecond clock cycle, it was a million times slower than today's big computers. The Mercury took 3 months to install and filled a huge room; even so, its computational ability didn't quite match that of a modern pocket calculator. "Mass" storage was provided by four magnetic drums, each holding 32K x 20 bits - not enough to hold the data from a single proton-proton collision in the LHC. It was replaced in 1960 by the IBM 709 computer, seen here being unloaded at Cointrin airport. Although it was superseded so quickly by transistor-equipped machines, a small part of the Ferranti Mercury remains: the computer's engineers installed a warning bell to signal computing errors, and it can still be found mounted on the wall in a corridor of Building 2.

  18. Report of the Subpanel on Accelerator Research and Development of the High Energy Physics Advisory Panel

    International Nuclear Information System (INIS)

    1980-06-01

    Accelerator R and D in the US High Energy Physics (HEP) program is reviewed. As a result of this study, some shift in priority, particularly as regards long-range accelerator R and D, is suggested to best serve the future needs of the US HEP program. Some specific new directions for the US R and D effort are set forth. 18 figures, 5 tables

  19. Report of the Subpanel on Theoretical Computing of the High Energy Physics Advisory Panel

    International Nuclear Information System (INIS)

    1984-09-01

    The Subpanel on Theoretical Computing of the High Energy Physics Advisory Panel (HEPAP) was formed in July 1984 to make recommendations concerning the need for state-of-the-art computing for theoretical studies. The specific Charge to the Subpanel is attached as Appendix A, and the full membership is listed in Appendix B. For the purposes of this study, theoretical computing was interpreted as encompassing both investigations in the theory of elementary particles and computation-intensive aspects of accelerator theory and design. Many problems in both areas are suited to realize the advantages of vectorized processing. The body of the Subpanel Report is organized as follows. The Introduction, Section I, explains some of the goals of computational physics as it applies to elementary particle theory and accelerator design. Section II reviews the availability of mainframe supercomputers to researchers in the United States, in Western Europe, and in Japan. Other promising approaches to large-scale computing are summarized in Section III. Section IV details the current computing needs for problems in high energy theory, and for beam dynamics studies. The Subpanel Recommendations appear in Section V. The Appendices attached to this Report give the Charge to the Subpanel, the Subpanel membership, and some background information on the financial implications of establishing a supercomputer center

  20. Supercomputers and the future of computational atomic scattering physics

    International Nuclear Information System (INIS)

    Younger, S.M.

    1989-01-01

    The advent of the supercomputer has opened new vistas for the computational atomic physicist. Problems of hitherto unparalleled complexity are now being examined using these new machines, and important connections with other fields of physics are being established. This talk briefly reviews some of the most important trends in computational scattering physics and suggests some exciting possibilities for the future. 7 refs., 2 figs

  1. Accelerated physical modelling of radioactive waste migration in soil

    International Nuclear Information System (INIS)

    Zimmie, T.F.; De, A.; Mahmud, M.B.

    1994-01-01

    A 100 g-tonne geotechnical centrifuge was used to study the long-term migration of a contaminant and radioactive tracer through a saturated soil medium. The use of the centrifuge accelerates the travel time in the prototype, which is N times larger than the model, by a factor of N^2, where N is the applied g level. For a 5 h run at 60 g, the test modelled a migration time of about 2 years for a prototype 60 times larger than the small-scale model tested. Iodine-131, used as the tracer, was injected onto the surface of the soil and was allowed to migrate with a constant head of water through the saturated soil. End-window Geiger-Mueller (G-M) tubes were used to measure the count rate of the radioactive tracer flowing through the soil. The time from the peak response of one G-M tube to the other gives the travel time between the two points in the flow domain. The results obtained using the radioactive tracer are in good agreement with the test performed on the same model setup using potassium permanganate as tracer and with numerical flow-net modelling. Radioactive tracers can be useful in the study of nonradioactive contaminants as well, offering a nonintrusive (nondestructive) method of measuring contaminant migration. (author). 18 refs., 1 tab., 7 figs
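
    The N^2 time scaling quoted above is easy to verify for the stated test; the short check below uses only the numbers given in the abstract:

        # Centrifuge scaling of advection/seepage time: t_prototype = N**2 * t_model,
        # where N is the g level (and geometric scale factor between model and prototype).
        N = 60                 # test run at 60 g
        t_model_hours = 5.0    # 5 h model run

        t_prototype_hours = N**2 * t_model_hours
        print(f"prototype time modelled: {t_prototype_hours:.0f} h "
              f"= {t_prototype_hours / (24 * 365.25):.1f} years")   # about 2 years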

  2. Personal computers in high energy physics

    International Nuclear Information System (INIS)

    Quarrie, D.R.

    1987-01-01

    The role of personal computers within HEP is expanding as their capabilities increase and their cost decreases. Already they offer greater flexibility than many low-cost graphics terminals for a comparable cost and in addition they can significantly increase the productivity of physicists and programmers. This talk will discuss existing uses for personal computers and explore possible future directions for their integration into the overall computing environment. (orig.)

  3. Gravitational Acceleration Effects on Macrosegregation: Experiment and Computational Modeling

    Science.gov (United States)

    Leon-Torres, J.; Curreri, P. A.; Stefanescu, D. M.; Sen, S.

    1999-01-01

    Experiments were performed under terrestrial gravity (1 g) and during parabolic flights (10^-2 g) to study the solidification and macrosegregation patterns of Al-Cu alloys. Alloys having 2% and 5% Cu were solidified against a chill at two different cooling rates. Microscopic and electron microprobe characterization was used to produce microstructural and macrosegregation maps. In all cases positive segregation occurred next to the chill because of shrinkage flow, as expected. This positive segregation was higher in the low-g samples, apparently because of the higher heat transfer coefficient. A 2-D computational model was used to explain the experimental results. The continuum formulation was employed to describe the macroscopic transport of mass, energy, and momentum associated with the solidification phenomena for a two-phase system. The model considers that liquid flow is driven by thermal and solutal buoyancy, and by solidification shrinkage. The solidification event was divided into two stages. In the first one, the liquid containing freely moving equiaxed grains was described through the relative viscosity concept. In the second stage, when a fixed dendritic network was formed after dendritic coherency, the mushy zone was treated as a porous medium. The macrosegregation maps and the cooling curves obtained during experiments were used for validation of the solidification and segregation model. The model can explain the solidification and macrosegregation patterns and the differences between low- and high-gravity results.
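
    For orientation, the simplest closed-form description of solute redistribution during solidification is the Scheil relation; it ignores the convection, shrinkage flow and grain motion that the continuum model above is built to capture, and the partition coefficient used below is an assumed textbook-style value, but it shows why Cu enriches the last liquid to freeze:

        # Scheil (non-equilibrium) estimate of solute redistribution in an Al-Cu melt:
        # C_liquid = C0 * (1 - fs)**(k - 1),  C_solid = k * C_liquid.
        # Illustrative only; no fluid flow, and the relation breaks down as fs -> 1,
        # where eutectic solidification takes over in a real Al-Cu alloy.
        C0 = 2.0   # nominal composition, wt% Cu (one of the alloys studied above)
        k = 0.17   # assumed solid/liquid partition coefficient for Cu in Al

        for fs in (0.0, 0.5, 0.8, 0.9):
            c_liquid = C0 * (1.0 - fs) ** (k - 1.0)
            c_solid = k * c_liquid
            print(f"f_s = {fs:3.1f}: solid forming at {c_solid:4.2f} wt% Cu, "
                  f"remaining liquid at {c_liquid:5.2f} wt% Cu")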

  4. Accelerator driven systems for energy production and waste incineration: Physics, design and related nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Herman, M; Stanculescu, A [International Atomic Energy Agency, Vienna (Austria); Paver, N [University of Trieste and INFN, Trieste (Italy)

    2003-06-15

    This volume contains the notes of lectures given at the workshops 'Hybrid Nuclear Systems for Energy Production, Utilisation of Actinides and Transmutation of Long-lived Radioactive Waste' and 'Nuclear Data for Science and Technology: Accelerator Driven Waste Incineration', held at the Abdus Salam ICTP in September 2001. The subject of the first workshop was focused on the so-called Accelerator Driven Systems, and covered the most important physics and technological aspects of this innovative field. The second workshop was devoted to an exhaustive survey on the acquisition, evaluation, retrieval and validation of the nuclear data relevant to the design of Accelerator Driven Systems.

  5. Accelerator driven systems for energy production and waste incineration: Physics, design and related nuclear data

    International Nuclear Information System (INIS)

    Herman, M.; Stanculescu, A.; Paver, N.

    2003-01-01

    This volume contains the notes of lectures given at the workshops 'Hybrid Nuclear Systems for Energy Production, Utilisation of Actinides and Transmutation of Long-lived Radioactive Waste' and 'Nuclear Data for Science and Technology: Accelerator Driven Waste Incineration', held at the Abdus Salam ICTP in September 2001. The subject of the first workshop was focused on the so-called Accelerator Driven Systems, and covered the most important physics and technological aspects of this innovative field. The second workshop was devoted to an exhaustive survey on the acquisition, evaluation, retrieval and validation of the nuclear data relevant to the design of Accelerator Driven Systems

  6. Shielding considerations for an electron linear accelerator complex for high energy physics and photonics research

    International Nuclear Information System (INIS)

    Holmes, J.A.; Huntzinger, C.J.

    1987-01-01

    Radiation shielding considerations are discussed for a major high-energy physics and photonics research complex comprising a 50 MeV electron linear accelerator injector, a 1.0 GeV electron linear accelerator and a 1.3 GeV storage ring. The facilities will be unique because of the close proximity of personnel to the accelerator beam lines, the need to adapt existing facilities and shielding materials, and the application of strict ALARA dose guidelines while providing maximum access and flexibility during a phased construction program

  7. Neural chips, neural computers and application in high and superhigh energy physics experiments

    International Nuclear Information System (INIS)

    Nikityuk, N.M.

    2001-01-01

    The architectural peculiarities and characteristics of a series of neural chips and neurocomputers used in scientific instruments are considered. Trends in their development and their use in high-energy and super-high-energy physics experiments are described. Comparative data are given which characterize the efficient use of neural chips for the selection of useful events, the classification of elementary particles, the reconstruction of charged-particle tracks and the search for the hypothetical Higgs particle. The characteristics of domestically produced neural chips and neural accelerator boards are also considered. (in Russian)

  8. Computer methods in physics 250 problems with guided solutions

    CERN Document Server

    Landau, Rubin H

    2018-01-01

    Our future scientists and professionals must be conversant in computational techniques. In order to facilitate integration of computer methods into existing physics courses, this textbook offers a large number of worked examples and problems with fully guided solutions in Python as well as other languages (Mathematica, Java, C, Fortran, and Maple). It’s also intended as a self-study guide for learning how to use computer methods in physics. The authors include an introductory chapter on numerical tools and indication of computational and physics difficulty level for each problem.

  9. Subcritical set coupled to accelerator (ADS) for transmutation of radioactive wastes: an approach of computational modelling

    International Nuclear Information System (INIS)

    Torres, Mirta B.; Dominguez, Dany S.

    2013-01-01

    Nuclear fission devices coupled to particle accelerators (ADS) are being widely studied. These devices have several applications, including nuclear waste transmutation and hydrogen production, both applications with strong social and environmental impact. The essence of this work was to model an ADS geometry composed of small TRISO fuel elements loaded with a mixture of uranium MOX and thorium, with a uranium spallation target, using probabilistic computational modelling methods, in particular the MCNPX 2.6e code, to evaluate the physical characteristics of the device and its transmutation capability. As a result of the characterization of the spallation target, it can be concluded that the production of neutrons per incident proton increases with increasing dimensions of the spallation target (thickness and radius) until it reaches its maximum, the so-called saturation region. The results obtained from modelling the pebble-bed type ADS device with respect to the isotopic evolution of the plutonium isotopes and minor actinides considered in the analysis revealed that the accumulated mass of the plutonium isotopes and minor actinides increases for the subcritical configuration considered. In the particular case of the isotope 239Pu, a reduction of the mass is observed from a burnup time of 99 days. Increasing the core power and considering tungsten and lead spallation targets are among the key future developments of this work

  10. Beam transport physics issues for the recirculating linear accelerator

    International Nuclear Information System (INIS)

    Shokair, I.R.

    1992-11-01

    The Recirculating Linear Accelerator (RLA) utilizes the Ion Focused Regime (IFR) of beam transport plus a ramped bending field to guide the beam around the curved sections. Several issues of beam transport are considered. Beam transverse perturbations that could result in growth of the ion hose instability are analyzed. It is found that transverse kicks due to bending field errors, energy mismatches and fringe fields are the most important. The scaling of these perturbations with beam and channel parameters is derived. The effect of ramping of the bending field on the preformed plasma channel is then considered. For RLA experimental parameters the effect is found to be very small. For high energies however, in addition to axial heating, it is found that ramping the field causes compression of the plasma channel along the radius of curvature. This compression results in a quasi-equilibrium plasma electron temperature along the field lines which leads to collisionless transport towards the walls. The analysis of compression is done in an approximate way using a single particle picture and the channel expansion is analyzed using an envelope solution which gives a simple expression for the expansion time. This solution is then verified by Buckshot simulations. For a bending field of 2 kG ramped in 2 μ-secs and an argon channel (RLA parameters) we estimate that the channel radius doubling time (along field lines) is of the order of 0.5 μ-secs. Finally the effect of electron impact ionization due to axially heated electrons by the action of the inductive field is estimated. It is found that in Argon gas the electron avalanche time could be as low as 0.5 μ-sec which is smaller than the field ramp time

  11. Improving Quality of Service and Reducing Power Consumption with WAN accelerator in Cloud Computing Environments

    OpenAIRE

    Shin-ichi Kuribayashi

    2013-01-01

    The widespread use of cloud computing services is expected to degrade Quality of Service and to increase the power consumption of ICT devices, since the distance to a server becomes longer than before. Migration of virtual machines over a wide area can solve many problems, such as load balancing and power saving, in cloud computing environments. This paper proposes to dynamically apply a WAN accelerator within the network when a virtual machine is moved to a distant center, in order to preve...

  12. 3-D computations and measurements of accelerator magnets for the APS

    International Nuclear Information System (INIS)

    Turner, L.R.; Kim, S.H.; Kim, K.

    1993-01-01

    The Advanced Photon Source (APS), now under construction at Argonne National Laboratory (ANL), requires dipole, quadrupole, sextupole, and corrector magnets for each of its circular accelerator systems. Three-dimensional (3-D) field computations are needed to eliminate unwanted multipole fields from the ends of long quadrupole and dipole magnets and to guarantee that the flux levels in the poles of short magnets will not cause saturation. Measurements of the magnets show good agreement with the computations

  13. Physics design and scaling of recirculating induction accelerators: from benchtop prototypes to drivers

    International Nuclear Information System (INIS)

    Barnard, J.J.; Cable, M.D.; Callahan, D.A.

    1996-01-01

    Recirculating induction accelerators (recirculators) have been investigated as possible drivers for inertial fusion energy production because of their potential cost advantage over linear induction accelerators. Point designs were obtained and many of the critical physics and technology issues that would need to be addressed were detailed. A collaboration involving Lawrence Livermore National Laboratory and Lawrence Berkeley National Laboratory researchers is now developing a small prototype recirculator in order to demonstrate an understanding of nearly all of the critical beam dynamics issues that have been raised. We review the design equations for recirculators and demonstrate how, by keeping crucial dimensionless quantities constant, a small prototype recirculator was designed which will simulate the essential beam physics of a driver. We further show how important physical quantities such as the sensitivity to errors of optical elements (in both field strength and placement), insertion/extraction, vacuum requirements, and emittance growth, scale from small-prototype to driver-size accelerator
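
    One standard example of a dimensionless quantity held fixed in such scale-down arguments is the generalized perveance, which measures the strength of space charge relative to the beam's directed motion. The formula below is textbook beam physics, but its use here is a generic illustration rather than the paper's actual design equations, and the beam parameters are invented:

        import math

        def generalized_perveance(current_A, mass_amu, charge_state, kinetic_energy_eV):
            """Dimensionless generalized perveance K = 2*I / (I0 * (beta*gamma)**3),
            with I0 = 4*pi*eps0*m*c**3/q the characteristic current for the ion."""
            eps0, c, e, amu = 8.854e-12, 2.998e8, 1.602e-19, 1.661e-27
            m, q = mass_amu * amu, charge_state * e
            gamma = 1.0 + kinetic_energy_eV * e / (m * c**2)
            beta = math.sqrt(1.0 - 1.0 / gamma**2)
            I0 = 4.0 * math.pi * eps0 * m * c**3 / q
            return 2.0 * current_A / (I0 * (beta * gamma) ** 3)

        # Hypothetical low-energy prototype beam vs a heavier, higher-energy driver-like beam:
        print(f"prototype-like beam: K = {generalized_perveance(2e-3, 39, 1, 80e3):.2e}")
        print(f"driver-like beam:    K = {generalized_perveance(1.0, 200, 1, 50e6):.2e}")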

  14. submitter Accelerating high-energy physics exploration with deep learning

    CERN Document Server

    Ojika, Dave; Gordon-Ross, Ann; Carnes, Andrew; Gleyzer, Sergei

    2017-01-01

    In this work, we present our approach to using deep learning for identification of rarely produced physics particles (such as the Higgs Boson) out of a majority of uninteresting, background or noise-dominated data. A fast and efficient system to eliminate uninteresting data would result in much less data being stored, thus significantly reducing processing time and storage requirements. In this paper, we present a generalized preliminary version of our approach to motivate research interest in advancing the state-of-the-art in deep learning networks for other applications that can benefit from learning systems.
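
    At its core, the task described here is binary classification of signal against background; a minimal, self-contained sketch on synthetic "event features" (a tiny logistic model, not the authors' deep network or dataset) shows the idea:

        import numpy as np

        # Toy signal/background separation: logistic regression trained by gradient
        # descent on synthetic features. The work above uses deep networks on real
        # collider data; this only illustrates the classification setting.
        rng = np.random.default_rng(1)
        n = 2000
        X_bkg = rng.normal(0.0, 1.0, size=(n, 4))     # background-like events
        X_sig = rng.normal(0.8, 1.0, size=(n, 4))     # signal-like events, shifted features
        X = np.vstack([X_bkg, X_sig])
        y = np.concatenate([np.zeros(n), np.ones(n)])

        w, b = np.zeros(X.shape[1]), 0.0
        for _ in range(500):
            p = 1.0 / (1.0 + np.exp(-(X @ w + b)))    # predicted signal probability
            w -= 0.5 * (X.T @ (p - y)) / len(y)       # logistic-loss gradient steps
            b -= 0.5 * np.mean(p - y)

        pred = (1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5
        print(f"training accuracy: {np.mean(pred == y):.2%}")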

  15. Abstraction/Representation Theory for heterotic physical computing.

    Science.gov (United States)

    Horsman, D C

    2015-07-28

    We give a rigorous framework for the interaction of physical computing devices with abstract computation. Device and program are mediated by the non-logical representation relation; we give the conditions under which representation and device theory give rise to commuting diagrams between logical and physical domains, and the conditions for computation to occur. We give the interface of this new framework with currently existing formal methods, showing in particular its close relationship to refinement theory, and the implications for questions of meaning and reference in theoretical computer science. The case of hybrid computing is considered in detail, addressing in particular the example of an Internet-mediated social machine, and the abstraction/representation framework used to provide a formal distinction between heterotic and hybrid computing. This forms the basis for future use of the framework in formal treatments of non-standard physical computers. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  16. GENIUS - a new facility of non-accelerator particle physics

    International Nuclear Information System (INIS)

    Klapdor-Kleingrothaus, H.V.

    2001-01-01

    The GENIUS (Germanium in Liquid Nitrogen Underground Setup) project was proposed in 1997 [1] as the first third-generation double beta decay project, with a sensitivity aiming down to a level of an effective neutrino mass of ∼ 0.01 - 0.001 eV. Such sensitivity has been shown to be indispensable to solve the question of the structure of the neutrino mass matrix, which cannot be solved by neutrino oscillation experiments alone [2]. It will also allow broad access to many other topics of physics beyond the Standard Model of particle physics at the multi-TeV scale. In the search for cold dark matter, GENIUS will cover almost the full range of the parameter space of SUSY predictions for neutralinos as dark matter [3,4]. Finally, GENIUS has the potential to be the first real-time detector for low-energy (pp and 7Be) solar neutrinos [6,5]. A GENIUS Test Facility has just been funded and will come into operation by the end of 2001

  17. High energy physics advisory panel's composite subpanel for the assessment of the status of accelerator physics and technology

    International Nuclear Information System (INIS)

    1996-05-01

    In November 1994, Dr. Martha Krebs, Director of the US Department of Energy (DOE) Office of Energy Research (OER), initiated a broad assessment of the current status and promise of the field of accelerator physics and technology with respect to five OER programs -- High Energy Physics, Nuclear Physics, Basic Energy Sciences, Fusion Energy, and Health and Environmental Research. Dr. Krebs asked the High Energy Physics Advisory Panel (HEPAP) to establish a composite subpanel with representation from the five OER advisory committees and with a balance of membership drawn broadly from both the accelerator community and from those scientific disciplines associated with the OER programs. The Subpanel was also charged to provide recommendations and guidance on appropriate future research and development needs, management issues, and funding requirements. The Subpanel finds that accelerator science and technology is a vital and intellectually exciting field. It has provided essential capabilities for the DOE/OER research programs with an enormous impact on the nation's scientific research, and it has significantly enhanced the nation's biomedical and industrial capabilities. Further progress in this field promises to open new possibilities for the scientific goals of the OER programs and to further benefit the nation. Sustained support of forefront accelerator research and development by the DOE's OER programs and the DOE's predecessor agencies has been responsible for much of this impact on research. This report documents these contributions to the DOE energy research mission and to the nation

  18. Accelerator-based atomic physics experiments with photon and ion beams

    International Nuclear Information System (INIS)

    Johnson, B.M.; Jones, K.W.; Meron, M.

    1984-01-01

    Accelerator-based atomic physics experiments at Brookhaven presently use heavy-ion beams from the Dual MP Tandem Van de Graaff Accelerator Facility for several types of experiments. Work is in progress to develop experiments which will use the intense photon beams that will become available in the near future from the ultraviolet (uv) and x-ray rings of the National Synchrotron Light Source (NSLS). Plans are described for experiments at the NSLS, and an exciting development in instrumentation for heavy-ion experiments is summarized

  19. The Computer Revolution and Physical Chemistry.

    Science.gov (United States)

    O'Brien, James F.

    1989-01-01

    Describes laboratory-oriented software programs that are short, time-saving, eliminate computational errors, and not found in public domain courseware. Program availability for IBM and Apple microcomputers is included. (RT)

  20. Computer science: Data analysis meets quantum physics

    Science.gov (United States)

    Schramm, Steven

    2017-10-01

    A technique that combines machine learning and quantum computing has been used to identify the particles known as Higgs bosons. The method could find applications in many areas of science. See Letter p.375

  1. What Computational Approaches Should be Taught for Physics?

    Science.gov (United States)

    Landau, Rubin

    2005-03-01

    The standard Computational Physics courses are designed for upper-level physics majors who already have some computational skills. We believe that it is important for first-year physics students to learn modern computing techniques that will be useful throughout their college careers, even before they have learned the math and science required for Computational Physics. To teach such Introductory Scientific Computing courses requires that some choices be made as to what subjects and computer languages will be taught. Our survey of colleagues active in Computational Physics and Physics Education shows no predominant choice, with strong positions taken for the compiled languages Java, C, C++ and Fortran90, as well as for problem-solving environments like Maple and Mathematica. Over the last seven years we have developed an Introductory course and have written up those courses as textbooks for others to use. We will describe our model of using both a problem-solving environment and a compiled language. The developed materials are available in both Maple and Mathematica, and Java and Fortran90 (Princeton University Press, to be published; www.physics.orst.edu/~rubin/IntroBook/).

  2. Proceedings of the workshop on physics at future accelerators

    International Nuclear Information System (INIS)

    1987-01-01

    A workshop took place at La Thuile and at CERN in January 1987 to study the physics potential of three types of particle collider with energies in the TeV region, together with the feasibility of experiments with them. The machines were: A Large Hadron Collider (LHC) placed in the LEP tunnel at CERN, with a total proton-proton centre-of-mass energy of about 16 TeV; an electron-proton collider, using the LHC and LEP, with a centre-of-mass energy in the range 1.3 TeV to 1.8 TeV; and an electron-positron linear collider with centre-of-mass energy about 2 TeV. This volume of the proceedings contains summary talks given at CERN by the conveners of the study groups. They cover the possibilities for discovery of new phenomena anticipated in the energy region up to the order of a TeV in the centre of mass of colliding partons, or of the electron and positron. Also discussed are the limits of current technology in the construction of particle-detector systems suitable for use at these energies, and especially in the high event rates provided by a proton-proton collider of luminosity 10^33 cm^-2 s^-1 or more. (orig.)

  3. Particle accelerators from Big Bang physics to hadron therapy

    CERN Document Server

    Amaldi, Ugo

    2015-01-01

    The theoretical physicist Victor “Viki” Weisskopf, Director-General of CERN from 1961 to 1965, once said: “There are three kinds of physicists, namely the machine builders, the experimental physicists, and the theoretical physicists. […] The machine builders are the most important ones, because if they were not there, we would not get into this small-scale region of space. If we compare this with the discovery of America, the machine builders correspond to captains and ship builders who really developed the techniques at that time. The experimentalists were those fellows on the ships who sailed to the other side of the world and then landed on the new islands and wrote down what they saw. The theoretical physicists are those who stayed behind in Madrid and told Columbus that he was going to land in India.” Rather than focusing on the theoretical physicists, as most popular science books on particle physics do, this beautifully written and also entertaining book is different in that, firstly, the main foc...

  4. The key physics and technology issues in the intense-beam proton accelerators

    International Nuclear Information System (INIS)

    Fu Shinian; Fang Shouxian

    2002-01-01

    Beam power must be raised by an order of magnitude in the next-generation spallation neutron source. There are still some physics and technology difficulties that need to be overcome, even though no fatal obstacle exists thanks to the rapid development of intense-beam accelerator technology in recent years. It is therefore highly desirable to clarify the key issues and to launch an R and D program to break through the technological barriers before we start to build the expensive machine. The new technological challenges arise from the high beam current, the high accelerator power and the high demands on the reliability and stability of accelerator operation. The author discusses these issues and the means to resolve them, as well as the state of the art in a few major technological disciplines. Finally, the choice of the framework of the intense-beam accelerator is discussed

  5. Electromagnetic computer simulations of collective ion acceleration by a relativistic electron beam

    International Nuclear Information System (INIS)

    Galvez, M.; Gisler, G.R.

    1988-01-01

    A 2.5-dimensional electromagnetic particle-in-cell computer code is used to study collective ion acceleration when a relativistic electron beam is injected into a drift tube partially filled with cold neutral plasma. The simulations of this system reveal that the ions are subject to electrostatic acceleration by an electrostatic potential that forms behind the head of the beam. This electrostatic potential develops soon after the beam is injected into the drift tube, drifts with the beam, and eventually settles to a fixed position. At later times, this electrostatic potential becomes a virtual cathode. When the permanent position of the electrostatic potential is at the edge of the plasma or further up, ions are accelerated forward and a unidirectional ion flow is obtained; otherwise a bidirectional ion flow occurs. The ions that achieve higher energy are those which drift with the negative potential. When the plasma density is varied, the simulations show that optimum acceleration occurs when the density ratio between the beam (n_b) and the plasma (n_o) is unity. Simulations were also carried out with different ion masses. The results of these simulations corroborate the hypothesis that the ion acceleration mechanism is purely electrostatic, so that the ion acceleration depends inversely on the charged-particle mass. The simulations also show that the maximum ion energy increases logarithmically with the electron beam energy and proportionally with the beam current
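
    The code described in this record is a full 2.5-dimensional electromagnetic PIC simulation; as a much smaller, hedged illustration of the particle-in-cell cycle it refers to (charge deposition, field solve, gather, push), the following sketch advances a 1D periodic electrostatic system by one step. It is not the authors' code; all names, units and parameters are illustrative.

```python
import numpy as np

def pic_step_1d(x, v, q_over_m, grid_n, box_len, dt):
    """Advance a 1D periodic electrostatic PIC system by one time step.

    x, v     : particle positions and velocities (equal-length arrays)
    q_over_m : charge-to-mass ratio of the species (illustrative units)
    grid_n   : number of grid cells; box_len : periodic box length; dt : time step
    """
    dx = box_len / grid_n

    # Cloud-in-cell (linear) charge deposition onto the grid.
    xg = x / dx
    i0 = np.floor(xg).astype(int) % grid_n
    w1 = xg - np.floor(xg)                  # weight for the right-hand node
    rho = np.zeros(grid_n)
    np.add.at(rho, i0, 1.0 - w1)
    np.add.at(rho, (i0 + 1) % grid_n, w1)
    rho -= rho.mean()                       # neutralizing background

    # Solve Poisson's equation d^2 phi/dx^2 = -rho with FFTs (periodic BCs).
    k = 2.0 * np.pi * np.fft.fftfreq(grid_n, d=dx)
    rho_k = np.fft.fft(rho)
    phi_k = np.zeros_like(rho_k)
    phi_k[1:] = rho_k[1:] / k[1:] ** 2
    E_grid = np.real(np.fft.ifft(-1j * k * phi_k))   # E = -d(phi)/dx

    # Gather the field at particle positions, then push velocities and positions.
    E_part = (1.0 - w1) * E_grid[i0] + w1 * E_grid[(i0 + 1) % grid_n]
    v_new = v + q_over_m * E_part * dt
    x_new = (x + v_new * dt) % box_len
    return x_new, v_new
```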

  6. Effects of different computer typing speeds on acceleration and peak contact pressure of the fingertips during computer typing.

    Science.gov (United States)

    Yoo, Won-Gyu

    2015-01-01

    [Purpose] This study examined the effects of different computer typing speeds on acceleration and peak contact pressure of the fingertips during computer typing. [Subjects] Twenty-one male computer workers voluntarily consented to participate in this study. They consisted of 7 workers who could type 200-300 characters/minute, 7 workers who could type 300-400 characters/minute, and 7 workers who could type 400-500 characters/minute. [Methods] The acceleration and peak contact pressure of the fingertips were measured for the different typing speed groups using an accelerometer and a CONFORMat system. [Results] The fingertip contact pressure was increased in the high typing speed group compared with the low and medium typing speed groups. The fingertip acceleration was increased in the high typing speed group compared with the low and medium typing speed groups. [Conclusion] The results of the present study indicate that a fast typing speed causes continuous pressure stress to be applied to the fingers, thereby creating pain in the fingers.

  7. Computing for particle physics. Report of the HEPAP subpanel on computer needs for the next decade

    International Nuclear Information System (INIS)

    1985-08-01

    The increasing importance of computation to the future progress in high energy physics is documented. Experimental computing demands are analyzed for the near future (four to ten years). The computer industry's plans for the near term and long term are surveyed as they relate to the solution of high energy physics computing problems. This survey includes large processors and the future role of alternatives to commercial mainframes. The needs for low speed and high speed networking are assessed, and the need for an integrated network for high energy physics is evaluated. Software requirements are analyzed. The role to be played by multiple processor systems is examined. The computing needs associated with elementary particle theory are briefly summarized. Computing needs associated with the Superconducting Super Collider are analyzed. Recommendations are offered for expanding computing capabilities in high energy physics and for networking between the laboratories

  8. Physics of x-ray computed tomography

    International Nuclear Information System (INIS)

    Akutagawa, W.M.; Huth, G.C.

    1976-01-01

    Sections are included on theoretical limits of x-ray computed tomography and the relationship of these limits to human organ imaging and specific disease diagnosis; potential of x-ray computed tomography in detection of small calcified particles in early breast cancer detection; early lung cancer measurement and detection; advanced materials for ionizing radiation detection; positron system with circular ring transaxial tomographic camera; contrast mechanism of transmission scanner and algorithms; and status of design on a 200 keV scanning proton microprobe

  9. Acceleration of Radiance for Lighting Simulation by Using Parallel Computing with OpenCL

    Energy Technology Data Exchange (ETDEWEB)

    Zuo, Wangda; McNeil, Andrew; Wetter, Michael; Lee, Eleanor

    2011-09-06

    We report on the acceleration of annual daylighting simulations for fenestration systems in the Radiance ray-tracing program. The algorithm was optimized to reduce both the redundant data input/output operations and the floating-point operations. To further accelerate the simulation speed, the calculation for matrix multiplications was implemented using parallel computing on a graphics processing unit. We used OpenCL, which is a cross-platform parallel programming language. Numerical experiments show that the combination of the above measures can speed up the annual daylighting simulations 101.7 times or 28.6 times when the sky vector has 146 or 2306 elements, respectively.
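
    The record reduces the annual daylighting run to large matrix products between daylight-coefficient matrices and hourly sky vectors, which is the part off-loaded to the GPU via OpenCL. As a hedged, CPU-only sketch of that core operation (not the Radiance implementation; the dimensions and names are illustrative):

```python
import numpy as np

# Hypothetical dimensions: 8760 hourly sky vectors with 146 patches each,
# and a daylight-coefficient matrix mapping sky patches to 1000 sensor points.
n_hours, n_patches, n_sensors = 8760, 146, 1000

rng = np.random.default_rng(0)
dc_matrix = rng.random((n_sensors, n_patches))    # daylight coefficients (illustrative)
sky_vectors = rng.random((n_patches, n_hours))    # one sky vector per hour

# The annual simulation reduces to one large matrix product: each column of
# the result is the sensor illuminance for one hour.  In the paper this
# product is what gets executed in parallel on the GPU through OpenCL.
illuminance = dc_matrix @ sky_vectors             # shape (n_sensors, n_hours)
```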

  10. Accelerator physics and technology limitations to ultimate energy and luminosity in very large hadron colliders

    Energy Technology Data Exchange (ETDEWEB)

    P. Bauer et al.

    2002-12-05

    The following presents a study of the accelerator physics and technology limitations to ultimate energy and luminosity in very large hadron colliders (VLHCs). The main accelerator physics limitations to ultimate energy and luminosity in future energy frontier hadron colliders are synchrotron radiation (SR) power, proton-collision debris power in the interaction regions (IR), number of events-per-crossing, stored energy per beam and beam-stability [1]. Quantitative estimates of these limits were made and translated into scaling laws that could be inscribed into the particle energy versus machine size plane to delimit the boundaries for possible VLHCs. Eventually, accelerator simulations were performed to obtain the maximum achievable luminosities within these boundaries. Although this study aimed at investigating a general VLHC, it was unavoidable to refer in some instances to the recently studied, [2], 200 TeV center-of-mass energy VLHC stage-2 design (VLHC-2). A more thorough rendering of this work can be found in [3].

  11. Accelerator Technology and High Energy Physic Experiments, WILGA 2012; EuCARD Sessions

    CERN Document Server

    Romaniuk, R S

    2012-01-01

    Wilga Sessions on HEP experiments, astroparticle physics and accelerator technology were organized under the umbrella of the EU FP7 Project EuCARD – European Coordination for Accelerator Research and Development. The paper is the second part (out of five) of the research survey of WILGA Symposium work, May 2012 Edition, concerned with accelerator technology and high energy physics experiments. It presents a digest of chosen technical work results shown by young researchers from different technical universities of this country during the XXXth Jubilee SPIE-IEEE Wilga 2012, May Edition, symposium on Photonics and Web Engineering. Topical tracks of the symposium embraced, among others, nanomaterials and nanotechnologies for photonics, sensory and nonlinear optical fibers, object oriented design of hardware, photonic metrology, optoelectronics and photonics applications, photonics-electronics co-design, optoelectronic and electronic systems for astronomy and high energy physics experiments, JET and pi-of-the ...

  12. The use of personal computers in reactor physics

    International Nuclear Information System (INIS)

    Cullen, D.E.

    1988-01-01

    This paper points out that personal computers are now powerful enough (in terms of core size and speed) to allow them to be used for serious reactor physics applications. In addition the low cost of personal computers means that even small institutes can now have access to a significant amount of computer power. At the present time distribution centers, such as RSIC, are beginning to distribute reactor physics codes for use on personal computers; hopefully in the near future more and more of these codes will become available through distribution centers, such as RSIC

  13. Physics with a high-intensity proton accelerator below 30 GeV

    International Nuclear Information System (INIS)

    Hoffman, C.M.

    1982-01-01

    The types of physics that would be pursued at a high-intensity, moderate-energy proton accelerator are discussed. The discussion is drawn from the deliberations of the 30-GeV subgroup of the Fixed-Target Group at this workshop

  14. Neutron physics and nuclear data measurements with accelerators and research reactors

    International Nuclear Information System (INIS)

    1988-08-01

    The report contains a collection of lectures devoted to the latest theoretical and experimental developments in the field of fast neutron physics and nuclear data measurements. The possibilities offered by particle accelerators and research reactors for research and technological applications in these fields are pointed out. Refs, figs and tabs

  15. Computational plasma physics and supercomputers. Revision 1

    International Nuclear Information System (INIS)

    Killeen, J.; McNamara, B.

    1985-01-01

    The Supercomputers of the 80's are introduced. They are 10 to 100 times more powerful than today's machines. The range of physics modeling in the fusion program is outlined. New machine architecture will influence particular models, but parallel processing poses new programming difficulties. Increasing realism in simulations will require better numerics and more elaborate mathematical models

  16. GPU-accelerated Lattice Boltzmann method for anatomical extraction in patient-specific computational hemodynamics

    Science.gov (United States)

    Yu, H.; Wang, Z.; Zhang, C.; Chen, N.; Zhao, Y.; Sawchuk, A. P.; Dalsing, M. C.; Teague, S. D.; Cheng, Y.

    2014-11-01

    Existing research on patient-specific computational hemodynamics (PSCH) relies heavily on software for anatomical extraction of blood arteries. Data reconstruction and mesh generation have to be done using existing commercial software due to the gap between medical image processing and CFD, which increases the computational burden and introduces inaccuracy during data transformation, thus limiting the medical applications of PSCH. We use the lattice Boltzmann method (LBM) to solve the level-set equation over an Eulerian distance field and implicitly and dynamically segment the artery surfaces from radiological CT/MRI imaging data. The segments feed seamlessly into the LBM-based CFD computation of PSCH, so that explicit mesh construction and extra data management are avoided. The LBM is ideally suited for GPU (graphics processing unit)-based parallel computing. The parallel acceleration over GPU achieves excellent performance in PSCH computation. An application study will be presented which segments an aortic artery from a chest CT dataset and models PSCH of the segmented artery.

  17. Methods of physical experiment and installation automation on the base of computers

    International Nuclear Information System (INIS)

    Stupin, Yu.V.

    1983-01-01

    Peculiarities of using computers for the automation of physical experiments and installations are considered. Systems for data acquisition and processing based on microprocessors, micro- and mini-computers, CAMAC equipment and real-time operating systems are described, as well as systems intended for the automation of physical experiments on accelerators, installations for laser thermonuclear fusion and installations for plasma investigation. The problems of multimachine complexes and multi-user systems, the development of automated systems for collective use, the arrangement of intermachine data exchange and the control of experimental databases are discussed. Data on software systems used for complex experimental data processing are presented. It is concluded that the application of new computers, in combination with the new possibilities provided to users by universal operating systems, essentially increases the efficiency of a scientist's work

  18. High-Precision Computation and Mathematical Physics

    International Nuclear Information System (INIS)

    Bailey, David H.; Borwein, Jonathan M.

    2008-01-01

    At the present time, IEEE 64-bit floating-point arithmetic is sufficiently accurate for most scientific applications. However, for a rapidly growing body of important scientific computing applications, a higher level of numeric precision is required. Such calculations are facilitated by high-precision software packages that include high-level language translation modules to minimize the conversion effort. This paper presents a survey of recent applications of these techniques and provides some analysis of their numerical requirements. These applications include supernova simulations, climate modeling, planetary orbit calculations, Coulomb n-body atomic systems, scattering amplitudes of quarks, gluons and bosons, nonlinear oscillator theory, Ising theory, quantum field theory and experimental mathematics. We conclude that high-precision arithmetic facilities are now an indispensable component of a modern large-scale scientific computing environment.
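
    As a hedged illustration of the kind of calculation the record refers to, the following sketch uses the Python mpmath package (an assumption; any arbitrary-precision library would do) to resolve a quantity that IEEE 64-bit arithmetic cannot:

```python
from mpmath import mp, mpf, exp, pi, sqrt, floor

# Work with 50 significant digits instead of the ~16 of IEEE double precision.
mp.dps = 50

# Ramanujan's constant exp(pi*sqrt(163)) is famously close to an integer;
# double precision cannot resolve the tiny gap, but high precision can.
value = exp(pi * sqrt(mpf(163)))
print(value)                          # 262537412640768743.99999999999925...
print(1 - (value - floor(value)))     # ~7.5e-13: the gap a 64-bit float cannot see
```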

  19. Numerical computation of special functions with applications to physics

    CSIR Research Space (South Africa)

    Motsepe, K

    2008-09-01

    Full Text Available Students of mathematical physics, engineering, natural and biological sciences sometimes need to use special functions that are not found in ordinary mathematical software. In this paper a simple universal numerical algorithm is developed to compute...
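
    As a hedged sketch of the record's theme, computing a special function directly from its defining series rather than relying on packaged software, the following evaluates the Bessel function J0 from its power series (illustrative only; the paper's universal algorithm is not reproduced here):

```python
def bessel_j0(x, terms=30):
    """Bessel function J0(x) from its power series:
    J0(x) = sum_{k>=0} (-1)^k (x^2/4)^k / (k!)^2.
    Adequate for moderate |x|; large arguments would need an asymptotic form."""
    total, term = 0.0, 1.0
    for k in range(terms):
        total += term
        # Ratio of consecutive series terms avoids computing factorials directly.
        term *= -(x * x / 4.0) / ((k + 1) ** 2)
    return total

print(bessel_j0(2.0))   # ~0.223891, matching tabulated values of J0(2)
```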

  20. Future of computing technology in physics - the potentials and pitfalls

    International Nuclear Information System (INIS)

    Brenner, A.E.

    1984-02-01

    The impact of the developments of modern digital computers is discussed, especially with respect to physics research in the future. The effects of large data processing capability and increasing rates at which data can be acquired and processed are considered

  1. Physics and instrumentation of emission computed tomography

    International Nuclear Information System (INIS)

    Links, J.M.

    1986-01-01

    Transverse emission computed tomography can be divided into two distinct classes: single photon emission computed tomography (SPECT) and positron emission tomography (PET). SPECT is usually accomplished with specially-adapted scintillation cameras, although dedicated SPECT scanners are available. The special SPECT cameras are standard cameras which are mounted on gantries that allow 360 degree rotation around the long axis of the head or body. The camera stops at a number of angles around the body (usually 64-128), acquiring a ''projection'' image at each stop. The data from these projections are used to reconstruct transverse images with a standard ''filtered back-projection'' algorithm, identical to that used in transmission CT. Because the scintillation camera acquires two-dimensional images, a simple 360 degree rotation around the patient results in the acquisition of data for a number of contiguous transverse slices. These slices, once reconstructed, can be ''stacked'' in computer memory, and orthogonal coronal and sagittal slices produced. Additionally, reorienting algorithms allow the generation of slices that are oblique to the long axis of the body
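
    As a hedged illustration of the ''filtered back-projection'' step mentioned in the record (not the code of any particular scanner; array shapes and the rotation convention are illustrative), a minimal parallel-beam reconstruction can be sketched as follows:

```python
import numpy as np
from scipy.ndimage import rotate

def filtered_back_projection(sinogram, angles_deg):
    """Reconstruct a transverse slice from parallel-beam projections.

    sinogram   : array of shape (n_angles, n_detector_bins)
    angles_deg : projection angle in degrees for each sinogram row
    """
    n_angles, n_bins = sinogram.shape

    # Ramp-filter each 1D projection in the Fourier domain.
    ramp = np.abs(np.fft.fftfreq(n_bins))
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))

    # Back-project: smear each filtered profile across the image plane
    # along its viewing direction and accumulate.
    image = np.zeros((n_bins, n_bins))
    for profile, angle in zip(filtered, angles_deg):
        smear = np.tile(profile, (n_bins, 1))
        image += rotate(smear, angle, reshape=False, order=1)

    return image * (np.pi / (2 * n_angles))
```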

  2. Information-preserving models of physics and computation: Final report

    International Nuclear Information System (INIS)

    1986-01-01

    This research pertains to discrete dynamical systems, as embodied by cellular automata, reversible finite-difference equations, and reversible computation. The research has strengthened the cross-fertilization between physics, computer science and discrete mathematics. It has shown that methods and concepts of physics can be exported to computation. Conversely, fully discrete dynamical systems have been shown to be fruitful for representing physical phenomena usually described with differential equations - cellular automata for fluid dynamics has been the most noted example of such a representation. At the practical level, the fully discrete representation approach suggests innovative uses of computers for scientific computing. The originality of these uses lies in their non-numerical nature: they avoid the inaccuracies of floating-point arithmetic and bypass the need for numerical analysis. 38 refs

  3. A contribution to the computation of the impedance in acceleration resonators

    International Nuclear Information System (INIS)

    Liu, Cong

    2016-05-01

    This thesis focuses on the numerical computation of the impedance in acceleration resonators and corresponding components. For this purpose, a dedicated solver based on the Finite Element Method (FEM) has been developed to compute the broadband impedance in accelerating components. In addition, various numerical approaches have been used to calculate the narrow-band impedance in superconducting radio frequency (RF) cavities. From that an overview of the calculated results as well as the comparisons between the applied numerical approaches is provided. During the design phase of superconducting RF accelerating cavities and components, a challenging and difficult task is the determination of the impedance inside the accelerators with the help of proper computer simulations. Impedance describes the electromagnetic interaction between the particle beam and the accelerators. It can affect the stability of the particle beam. For a superconducting RF accelerating cavity with waveguides (beam pipes and couplers), the narrow-band impedance, which is also called shunt impedance, corresponds to the eigenmodes of the cavity. It depends on the eigenfrequencies and the electromagnetic field distributions of the eigenmodes inside the cavity. On the other hand, the broadband impedance describes the interaction of the particle beam in the waveguides with its environment at arbitrary frequency and beam velocity. With the narrow-band and broadband impedance, a detailed knowledge of the impedance of the accelerators can be given completely. In order to calculate the broadband longitudinal space charge impedance for acceleration components, a three-dimensional (3D) solver based on the FEM in frequency domain has been developed. To calculate the narrow-band impedance for superconducting RF cavities, we used various numerical approaches. Firstly, the eigenmode solver based on Finite Integration Technique (FIT) and a parallel real-valued FEM (CEM3Dr) eigenmode solver based on

  4. A contribution to the computation of the impedance in acceleration resonators

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Cong

    2016-05-15

    This thesis focuses on the numerical computation of the impedance in acceleration resonators and corresponding components. For this purpose, a dedicated solver based on the Finite Element Method (FEM) has been developed to compute the broadband impedance in accelerating components. In addition, various numerical approaches have been used to calculate the narrow-band impedance in superconducting radio frequency (RF) cavities. From that an overview of the calculated results as well as the comparisons between the applied numerical approaches is provided. During the design phase of superconducting RF accelerating cavities and components, a challenging and difficult task is the determination of the impedance inside the accelerators with the help of proper computer simulations. Impedance describes the electromagnetic interaction between the particle beam and the accelerators. It can affect the stability of the particle beam. For a superconducting RF accelerating cavity with waveguides (beam pipes and couplers), the narrow-band impedance, which is also called shunt impedance, corresponds to the eigenmodes of the cavity. It depends on the eigenfrequencies and the electromagnetic field distributions of the eigenmodes inside the cavity. On the other hand, the broadband impedance describes the interaction of the particle beam in the waveguides with its environment at arbitrary frequency and beam velocity. With the narrow-band and broadband impedance, a detailed knowledge of the impedance of the accelerators can be given completely. In order to calculate the broadband longitudinal space charge impedance for acceleration components, a three-dimensional (3D) solver based on the FEM in frequency domain has been developed. To calculate the narrow-band impedance for superconducting RF cavities, we used various numerical approaches. Firstly, the eigenmode solver based on Finite Integration Technique (FIT) and a parallel real-valued FEM (CEM3Dr) eigenmode solver based on

  5. Future accelerators (?)

    Energy Technology Data Exchange (ETDEWEB)

    John Womersley

    2003-08-21

    I describe the future accelerator facilities that are currently foreseen for electroweak scale physics, neutrino physics, and nuclear structure. I will explore the physics justification for these machines, and suggest how the case for future accelerators can be made.

  6. Accelerating Astronomy & Astrophysics in the New Era of Parallel Computing: GPUs, Phi and Cloud Computing

    Science.gov (United States)

    Ford, Eric B.; Dindar, Saleh; Peters, Jorg

    2015-08-01

    The realism of astrophysical simulations and statistical analyses of astronomical data are set by the available computational resources. Thus, astronomers and astrophysicists are constantly pushing the limits of computational capabilities. For decades, astronomers benefited from massive improvements in computational power that were driven primarily by increasing clock speeds and required relatively little attention to the details of the computational hardware. For nearly a decade, increases in computational capabilities have come primarily from increasing the degree of parallelism, rather than increasing clock speeds. Further increases in computational capabilities will likely be led by many-core architectures such as Graphical Processing Units (GPUs) and Intel Xeon Phi. Successfully harnessing these new architectures requires significantly more understanding of the hardware architecture, cache hierarchy, compiler capabilities and network characteristics. I will provide an astronomer's overview of the opportunities and challenges provided by modern many-core architectures and elastic cloud computing. The primary goal is to help an astronomical audience understand what types of problems are likely to yield more than an order of magnitude speed-up and which problems are unlikely to parallelize sufficiently efficiently to be worth the development time and/or costs. I will draw on my experience leading a team in developing the Swarm-NG library for parallel integration of large ensembles of small n-body systems on GPUs, as well as several smaller software projects. I will share lessons learned from collaborating with computer scientists, including both technical and soft skills. Finally, I will discuss the challenges of training the next generation of astronomers to be proficient in this new era of high-performance computing, drawing on experience teaching a graduate class on High-Performance Scientific Computing for Astrophysics and organizing a 2014 advanced summer

  7. Complexity vs energy: theory of computation and theoretical physics

    International Nuclear Information System (INIS)

    Manin, Y I

    2014-01-01

    This paper is a survey based upon the talk at the satellite QQQ conference to ECM6, 3Quantum: Algebra Geometry Information, Tallinn, July 2012. It is dedicated to the analogy between the notions of complexity in theoretical computer science and energy in physics. This analogy is not metaphorical: I describe three precise mathematical contexts, suggested recently, in which mathematics related to (un)computability is inspired by and to a degree reproduces formalisms of statistical physics and quantum field theory.

  8. Computing in radiation protection and health physics - 10 years further

    International Nuclear Information System (INIS)

    Behrens, R.; Greif, N.; Struwe, H.; Wissmann, F.

    2008-01-01

    Computing influences radiation protection and health physics more extensively than ever before. The good old data processing and mainframe computing have changed towards information technology in a wider sense. Technologies and operating systems from workplace computing have complemented microprocessor technology in measuring devices. The boundaries between them are constantly in a state of flux. The use of the world wide web has become indispensable. No radiation protection expert could still manage without a workplace computer. Measuring networks, radiation protection information systems, databases, computer simulation and other challenging applications form the image of today. (orig.)

  9. FINAL REPORT DE-FG02-04ER41317 Advanced Computation and Chaotic Dynamics for Beams and Accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Cary, John R [U. Colorado

    2014-09-08

    During the year ending in August 2013, we continued to investigate the potential of photonic crystal (PhC) materials for acceleration purposes. We worked to characterize the acceleration ability of simple PhC accelerator structures, as well as to characterize PhC materials to determine whether current fabrication techniques can meet the needs of future accelerating structures. We have also continued to design and optimize PhC accelerator structures, with the ultimate goal of finding a new kind of accelerator structure that could offer significant advantages over current RF acceleration technology. The design and optimization of these structures requires high-performance computation, and we continue to work on methods to make such computation faster and more efficient.

  10. XXVII IUPAP Conference on Computational Physics (CCP2015)

    International Nuclear Information System (INIS)

    Santra, Sitangshu Bikas; Ray, Purusattam

    2016-01-01

    The 27th IUPAP Conference on Computational Physics, CCP2015, was held in the heritage city of Guwahati, in the eastern part of India, next to the mighty river Brahmaputra, during December 2-5, 2015. The Conference on Computational Physics is organized annually under the auspices of Commission 20 (C20) of IUPAP (the International Union of Pure and Applied Physics). This is the first time it has been held in India. Almost 300 participants from 25 countries convened at the auditorium and lecture halls of the Indian Institute of Technology Guwahati for four days. Thirteen plenary speakers, fifty-six invited speakers, three presenters from the computer industry and two hundred and eight contributory participants covered a broad range of topics in computational physics and related areas. Thirty-eight women participated in CCP2015 and seven of them presented invited talks. This volume of Journal of Physics: Conference Series contains the proceedings of the scientific contributions presented at the Conference. The main purpose of the meeting was to discuss the progress, opportunities and challenges of common interest to physicists engaged in computational research. Computational physics has taken giant leaps during the last few years, not only because of the enormous increases in computer power but especially because of the development of new methods and algorithms. Computational physics now represents a third leg of research alongside analytical theory and experiments. A meeting such as CCP must have sufficient depth in different areas and at the same time should be broad and accessible. The topics covered in this conference were: Materials/Condensed Matter Theory and Nanoscience, Strongly Correlated Systems and Quantum Phase Transitions, Quantum Chemistry and Atomic Physics, Quantum Chromodynamics, Astrophysics, Plasma Physics, Nuclear and High Energy Physics, Complex Systems: Chaos and Statistical Physics, Macroscopic Transport and Mesoscopic Methods, Biological Physics

  11. State-Transition Structures in Physics and in Computation

    Science.gov (United States)

    Petri, C. A.

    1982-12-01

    In order to establish close connections between physical and computational processes, it is assumed that the concepts of “state” and of “transition” are acceptable both to physicists and to computer scientists, at least in an informal way. The aim of this paper is to propose formal definitions of state and transition elements on the basis of very low level physical concepts in such a way that (1) all physically possible computations can be described as embedded in physical processes; (2) the computational aspects of physical processes can be described on a well-defined level of abstraction; (3) the gulf between the continuous models of physics and the discrete models of computer science can be bridged by simple mathematical constructs which may be given a physical interpretation; (4) a combinatorial, nonstatistical definition of “information” can be given on low levels of abstraction which may serve as a basis to derive higher-level concepts of information, e.g., by a statistical or probabilistic approach. Conceivable practical consequences are discussed.

  12. Hybrid computer modelling in plasma physics

    International Nuclear Information System (INIS)

    Hromadka, J; Ibehej, T; Hrach, R

    2016-01-01

    Our contribution is devoted to development of hybrid modelling techniques. We investigate sheath structures in the vicinity of solids immersed in low temperature argon plasma of different pressures by means of particle and fluid computer models. We discuss the differences in results obtained by these methods and try to propose a way to improve the results of fluid models in the low pressure area. There is a possibility to employ Chapman-Enskog method to find appropriate closure relations of fluid equations in a case when particle distribution function is not Maxwellian. We try to follow this way to enhance fluid model and to use it in hybrid plasma model further. (paper)

  13. Introduction to Computational Physics for Undergraduates

    Science.gov (United States)

    Zubairi, Omair; Weber, Fridolin

    2018-03-01

    This is an introductory textbook on computational methods and techniques intended for undergraduates at the sophomore or junior level in the fields of science, mathematics, and engineering. It provides an introduction to programming languages such as FORTRAN 90/95/2000 and covers numerical techniques such as differentiation, integration, root finding, and data fitting. The textbook also entails the use of the Linux/Unix operating system and other relevant software such as plotting programs, text editors, and mark up languages such as LaTeX. It includes multiple homework assignments.

  14. Computer Self-Efficacy, Computer Anxiety, Performance and Personal Outcomes of Turkish Physical Education Teachers

    Science.gov (United States)

    Aktag, Isil

    2015-01-01

    The purpose of this study is to determine the computer self-efficacy, performance outcome, personal outcome, and affect and anxiety level of physical education teachers. The influence of teaching experience, computer usage and participation in seminars or in-service programs on computer self-efficacy level was determined. The subjects of this study…

  15. Proceedings of the conference on computer codes and the linear accelerator community

    International Nuclear Information System (INIS)

    Cooper, R.K.

    1990-07-01

    The conference whose proceedings you are reading was envisioned as the second in a series, the first having been held in San Diego in January 1988. The intended participants were those people who are actively involved in writing and applying computer codes for the solution of problems related to the design and construction of linear accelerators. The first conference reviewed many of the codes both extant and under development. This second conference provided an opportunity to update the status of those codes, and to provide a forum in which emerging new 3D codes could be described and discussed. The afternoon poster session on the second day of the conference provided an opportunity for extended discussion. All in all, this conference was felt to be quite a useful interchange of ideas and developments in the field of 3D calculations, parallel computation, higher-order optics calculations, and code documentation and maintenance for the linear accelerator community. A third conference is planned

  16. Proceedings of the conference on computer codes and the linear accelerator community

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, R.K. (comp.)

    1990-07-01

    The conference whose proceedings you are reading was envisioned as the second in a series, the first having been held in San Diego in January 1988. The intended participants were those people who are actively involved in writing and applying computer codes for the solution of problems related to the design and construction of linear accelerators. The first conference reviewed many of the codes both extant and under development. This second conference provided an opportunity to update the status of those codes, and to provide a forum in which emerging new 3D codes could be described and discussed. The afternoon poster session on the second day of the conference provided an opportunity for extended discussion. All in all, this conference was felt to be quite a useful interchange of ideas and developments in the field of 3D calculations, parallel computation, higher-order optics calculations, and code documentation and maintenance for the linear accelerator community. A third conference is planned.

  17. Beam polarization at the ILC. The physics impact and the accelerator solutions

    Energy Technology Data Exchange (ETDEWEB)

    Aurand, B. [Bonn Univ. (Germany). Phys. Inst.]; Bailey, I. [Liverpool Univ. (United Kingdom). Cockcroft Inst.]; Bartels, C. [DESY, Hamburg (Germany); DESY, Zeuthen (DE)] (and others)

    2009-03-15

    In this contribution, accelerator solutions for polarized beams and their impact on physics measurements are discussed. The focus is on the physics requirements for precision polarimetry near the interaction point and their realization with polarized sources. Based on the ILC baseline programme as described in the Reference Design Report (RDR), recent developments are discussed and evaluated taking into account physics runs at beam energies between 100 GeV and 250 GeV, as well as calibration runs on the Z-pole and options such as the 1 TeV upgrade and GigaZ. (orig.)

  18. Accelerator physics and technology challenges of very high energy hadron colliders

    Science.gov (United States)

    Shiltsev, Vladimir D.

    2015-08-01

    High energy hadron colliders have been at the forefront of particle physics for more than three decades. At present, the international particle physics community is considering several options for a 100 TeV proton-proton collider as a possible post-LHC energy frontier facility. The method of colliding beams has not fully exhausted its potential but has slowed down considerably in its progress. This paper briefly reviews the accelerator physics and technology challenges of future very high energy colliders and outlines the areas of research and development required for their technical and financial feasibility.

  19. An Introduction to Quantum Computing, Without the Physics

    OpenAIRE

    Nannicini, Giacomo

    2017-01-01

    This paper is a gentle but rigorous introduction to quantum computing intended for discrete mathematicians. Starting from a small set of assumptions on the behavior of quantum computing devices, we analyze their main characteristics, stressing the differences with classical computers, and finally describe two well-known algorithms (Simon's algorithm and Grover's algorithm) using the formalism developed in previous sections. This paper does not touch on the physics of the devices, and therefor...

  20. Local computer network of the JINR Neutron Physics Laboratory

    International Nuclear Information System (INIS)

    Alfimenkov, A.V.; Vagov, V.A.; Vajdkhadze, F.

    1988-01-01

    A new high-speed local computer network, in which an intelligent network adapter (NA) is used as the hardware base, has been developed in the JINR Neutron Physics Laboratory to increase operating efficiency and data transfer rate. The NA consists of a computer bus interface, a cable former, and a microcomputer segment designed both for program realization of the channel-level protocol and for organization of bidirectional transfer of information through a direct access channel between the monochannel and computer memory, with or without buffering in the NA's operating memory device

  1. In situ particle acceleration and physical conditions in radio tail galaxies

    International Nuclear Information System (INIS)

    Pacholczyk, A.G.; Scott, J.S.

    1976-01-01

    A model for the objects known as radio tail galaxies is presented. Independent plasmons emerging from an active radio galaxy into an intracluster medium become turbulent due to Rayleigh-Taylor and Kelvin-Helmholtz instabilities. The turbulence produces both in situ betatron and second order Fermi acceleration. Predictions of the dependence of spectral index and flux on distance along the tail match observations well. Fitting provides values of physical parameters in the tail. The relevance of this method of particle acceleration for the problem of the origin of X-ray emission in clusters of galaxies is discussed

  2. Induction-accelerator heavy-ion fusion: Status and beam physics issues

    International Nuclear Information System (INIS)

    Friedman, A.

    1996-01-01

    Inertial confinement fusion driven by beams of heavy ions is an attractive route to controlled fusion. In the U.S., induction accelerators are being developed as ''drivers'' for this process. This paper is divided into two main sections. In the first section, the concept of induction-accelerator driven heavy-ion fusion is briefly reviewed, and the U.S. program of experiments and theoretical investigations is described. In the second, a ''taxonomy'' of space-charge-dominated beam physics issues is presented, accompanied by a brief discussion of each area

  3. Elementary computer physics, a concentrated one-week course

    DEFF Research Database (Denmark)

    Christiansen, Gunnar Dan

    1978-01-01

    A concentrated one-week course (8 hours per day in 5 days) in elementary computer physics for students in their freshman university year is described. The aim of the course is to remove the constraints on traditional physics courses imposed by the necessity of only dealing with problems that have...... fields, and a lunar space vehicle are used as examples....

  4. From the Web to the Grid and beyond computing paradigms driven by high-energy physics

    CERN Document Server

    Carminati, Federico; Galli-Carminati, Giuliana

    2012-01-01

    Born after World War II, large-scale experimental high-energy physics (HEP) has found itself limited ever since by available accelerator, detector and computing technologies. Accordingly, HEP has made significant contributions to the development of these fields, more often than not driving their innovations. The invention of the World Wide Web at CERN is merely the best-known example out of many. This book is the first comprehensive account to trace the history of this pioneering spirit in the field of computing technologies. It covers everything up to and including the present-day handling of the huge demands imposed upon grid and distributed computing by full-scale LHC operations - operations which have for years involved many thousands of collaborating members worldwide and accordingly provide the original and natural testbed for grid computing concepts. This book takes the reader on a guided tour encompassing all relevant topics, including programming languages, software engineering, large databases, the ...

  5. Combining Acceleration Techniques for Low-Dose X-Ray Cone Beam Computed Tomography Image Reconstruction.

    Science.gov (United States)

    Huang, Hsuan-Ming; Hsiao, Ing-Tsung

    2017-01-01

    Over the past decade, image quality in low-dose computed tomography has been greatly improved by various compressive sensing- (CS-) based reconstruction methods. However, these methods have some disadvantages including high computational cost and slow convergence rate. Many different speed-up techniques for CS-based reconstruction algorithms have been developed. The purpose of this paper is to propose a fast reconstruction framework that combines a CS-based reconstruction algorithm with several speed-up techniques. First, total difference minimization (TDM) was implemented using the soft-threshold filtering (STF). Second, we combined TDM-STF with the ordered subsets transmission (OSTR) algorithm for accelerating the convergence. To further speed up the convergence of the proposed method, we applied the power factor and the fast iterative shrinkage thresholding algorithm to OSTR and TDM-STF, respectively. Results obtained from simulation and phantom studies showed that many speed-up techniques could be combined to greatly improve the convergence speed of a CS-based reconstruction algorithm. More importantly, the increased computation time (≤10%) was minor as compared to the acceleration provided by the proposed method. In this paper, we have presented a CS-based reconstruction framework that combines several acceleration techniques. Both simulation and phantom studies provide evidence that the proposed method has the potential to satisfy the requirement of fast image reconstruction in practical CT.
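
    As a hedged sketch of two of the ingredients named in the record, soft-threshold filtering and FISTA-style momentum, the following applies them to a toy least-squares-plus-L1 problem rather than to CT projection data; the operators, step size and names are illustrative, not the authors' implementation:

```python
import numpy as np

def soft_threshold(x, t):
    """Soft-threshold operator: shrink each coefficient toward zero by t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista(grad_f, prox, x0, step, n_iter=100):
    """Generic FISTA loop: gradient step on the smooth data term, proximal
    (here soft-threshold) step on the regularizer, plus Nesterov momentum."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iter):
        x_new = prox(y - step * grad_f(y), step)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum extrapolation
        x, t = x_new, t_new
    return x

# Toy least-squares + L1 problem (illustrative stand-in for the CT data term).
rng = np.random.default_rng(1)
A, b = rng.random((50, 100)), rng.random(50)
lam = 0.1
step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1 / Lipschitz constant of the gradient
x_hat = fista(lambda x: A.T @ (A @ x - b),
              lambda z, s: soft_threshold(z, lam * s),
              np.zeros(100), step)
```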

  6. Advances in Reactor Physics, Mathematics and Computation. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    1987-01-01

    These proceedings of the international topical meeting on advances in reactor physics, mathematics and computation, volume one, are divided into 6 sessions bearing on: session 1, advances in computational methods including utilization of parallel processing and vectorization (7 conferences); session 2, fast and epithermal reactor physics, calculations versus measurements (9 conferences); session 3, new fast and thermal reactor designs (9 conferences); session 4, thermal radiation and charged particle transport (7 conferences); session 5, supercomputers (7 conferences); session 6, thermal reactor design, validation and operating experience (8 conferences).

  7. Performance analysis and acceleration of explicit integration for large kinetic networks using batched GPU computations

    Energy Technology Data Exchange (ETDEWEB)

    Shyles, Daniel [University of Tennessee (UT)]; Dongarra, Jack J. [University of Tennessee, Knoxville (UTK)]; Guidry, Mike W. [ORNL]; Tomov, Stanimire Z. [ORNL]; Billings, Jay Jay [ORNL]; Brock, Benjamin A. [ORNL]; Haidar Ahmad, Azzam A. [ORNL]

    2016-09-01

    We demonstrate the systematic implementation of recently-developed fast explicit kinetic integration algorithms that efficiently solve N coupled ordinary differential equations (subject to initial conditions) on modern GPUs. We take representative test cases (Type Ia supernova explosions) and demonstrate a two or more orders of magnitude increase in efficiency for solving such systems (of realistic thermonuclear networks coupled to fluid dynamics). This implies that important coupled, multiphysics problems in various scientific and technical disciplines that were intractable, or could be simulated only with highly schematic kinetic networks, are now computationally feasible. As examples of such applications we present the computational techniques developed for our ongoing deployment of these new methods on modern GPU accelerators. We show that, similarly to many other scientific applications ranging from national security to medical advances, the computation can be split into many independent computational tasks, each of relatively small size. As the size of each individual task does not provide sufficient parallelism for the underlying hardware, especially for accelerators, these tasks must be computed concurrently in a single routine, which we call a batched routine, in order to saturate the hardware with enough work.
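
    As a hedged, CPU-only analogue of the batching idea described above (not the authors' GPU routines, and using plain forward Euler rather than their fast explicit algorithms), many small independent kinetic systems can be advanced together by vectorizing over the batch dimension:

```python
import numpy as np

def batched_explicit_euler(rhs, y0, dt, n_steps):
    """Advance many independent small ODE systems at once.

    rhs : callable mapping an array of shape (n_systems, n_species) to dy/dt
    y0  : initial conditions, shape (n_systems, n_species)

    Vectorizing over the batch dimension mirrors the idea of grouping many
    small kinetic networks into one batched routine so that the hardware is
    saturated with work; a GPU version would map the batch to threads.
    """
    y = y0.copy()
    for _ in range(n_steps):
        y = y + dt * rhs(y)          # explicit (forward Euler) update
    return y

# Toy batch: 10,000 independent two-species decay chains A -> B (illustrative).
def decay_rhs(y, k=1.0):
    dydt = np.empty_like(y)
    dydt[:, 0] = -k * y[:, 0]
    dydt[:, 1] = k * y[:, 0]
    return dydt

y0 = np.tile([1.0, 0.0], (10_000, 1))
y_final = batched_explicit_euler(decay_rhs, y0, dt=1e-3, n_steps=1000)
```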

  8. Acceleration of FDTD mode solver by high-performance computing techniques.

    Science.gov (United States)

    Han, Lin; Xi, Yanping; Huang, Wei-Ping

    2010-06-21

    A two-dimensional (2D) compact finite-difference time-domain (FDTD) mode solver is developed based on a wave equation formalism in combination with the matrix pencil method (MPM). The method is validated for calculation of both real guided and complex leaky modes of typical optical waveguides against the benchmark finite-difference (FD) eigenmode solver. By taking advantage of the inherent parallel nature of the FDTD algorithm, the mode solver is implemented on graphics processing units (GPUs) using the compute unified device architecture (CUDA). It is demonstrated that the high-performance computing technique leads to significant acceleration of the FDTD mode solver, with more than 30 times improvement in computational efficiency in comparison with the conventional FDTD mode solver running on the CPU of a standard desktop computer. The computational efficiency of the accelerated FDTD method is of the same order of magnitude as that of the standard finite-difference eigenmode solver and yet requires much less memory (e.g., less than 10%). Therefore, the new method may serve as an efficient, accurate and robust tool for mode calculation of optical waveguides even when the conventional eigenvalue mode solvers are no longer applicable due to memory limitation.
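
    As a hedged illustration of why FDTD parallelizes so well (not the 2D compact mode solver of the record), the following minimal 1D FDTD loop shows that each cell update depends only on neighbouring cells; the names and normalized units are illustrative:

```python
import numpy as np

def fdtd_1d(n_cells=400, n_steps=1000, src_cell=50):
    """Minimal 1D FDTD loop in normalized units (Courant number = 1).

    Each cell's update depends only on its neighbours, which is the property
    a GPU implementation exploits: every cell can be updated by an
    independent thread at each time step.
    """
    ez = np.zeros(n_cells)           # electric field
    hy = np.zeros(n_cells)           # magnetic field
    for n in range(n_steps):
        # Leapfrog: update H from the curl of E, then E from the curl of H.
        hy[:-1] += ez[1:] - ez[:-1]
        ez[1:] += hy[1:] - hy[:-1]
        # Soft Gaussian source injected at one cell.
        ez[src_cell] += np.exp(-((n - 30.0) / 10.0) ** 2)
    return ez

fields = fdtd_1d()
```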

  9. Accelerator shielding benchmark problems

    International Nuclear Information System (INIS)

    Hirayama, H.; Ban, S.; Nakamura, T.

    1993-01-01

    Accelerator shielding benchmark problems prepared by Working Group of Accelerator Shielding in the Research Committee on Radiation Behavior in the Atomic Energy Society of Japan were compiled by Radiation Safety Control Center of National Laboratory for High Energy Physics. Twenty-five accelerator shielding benchmark problems are presented for evaluating the calculational algorithm, the accuracy of computer codes and the nuclear data used in codes. (author)

  10. Computational physics simulation of classical and quantum systems

    CERN Document Server

    Scherer, Philipp O J

    2013-01-01

    This textbook presents basic and advanced computational physics in a very didactic style. It contains very-well-presented and simple mathematical descriptions of many of the most important algorithms used in computational physics. Many clear mathematical descriptions of important techniques in computational physics are given. The first part of the book discusses the basic numerical methods. A large number of exercises and computer experiments allows to study the properties of these methods. The second part concentrates on simulation of classical and quantum systems. It uses a rather general concept for the equation of motion which can be applied to ordinary and partial differential equations. Several classes of integration methods are discussed including not only the standard Euler and Runge Kutta method but also multistep methods and the class of Verlet methods which is introduced by studying the motion in Liouville space. Besides the classical methods, inverse interpolation is discussed, together with the p...

  11. Computational intelligence for decision support in cyber-physical systems

    CERN Document Server

    Ali, A; Riaz, Zahid

    2014-01-01

    This book is dedicated to applied computational intelligence and soft computing techniques with special reference to decision support in Cyber Physical Systems (CPS), where the physical as well as the communication segment of the networked entities interact with each other. The joint dynamics of such systems result in a complex combination of computers, software, networks and physical processes all combined to establish a process flow at system level. This volume provides the audience with an in-depth vision about how to ensure dependability, safety, security and efficiency in real time by making use of computational intelligence in various CPS applications ranging from the nano-world to large scale wide area systems of systems. Key application areas include healthcare, transportation, energy, process control and robotics where intelligent decision support has key significance in establishing dynamic, ever-changing and high confidence future technologies. A recommended text for graduate students and researche...

  12. Using a 400 kV Van de Graaff accelerator to teach physics at West Point

    Science.gov (United States)

    Marble, D. K.; Bruch, S. E.; Lainis, T.

    1997-02-01

    A small accelerator visitation laboratory is being built at the United States Military Academy using two 400 kV Van de Graaff accelerators. This laboratory will provide quality teaching experiments and increased research opportunities for both faculty and cadets, as well as enhancing the department's ability to teach across the curriculum by using nuclear techniques to solve problems in environmental engineering, material science, archeology, art, etc. This training enhances a student's ability to enter non-traditional fields that are becoming a large part of the physics job market. Furthermore, a small accelerator visitation laboratory for high school students can stimulate student interest in science and provide an effective means of communicating the scientific method to a general audience. A discussion of the USMA facility, class experiments and student research projects will be presented.

  13. Small-sized cyclotron for studies of physical processes in accelerators

    International Nuclear Information System (INIS)

    Arzumanov, A.A.; Voronin, A.M.; Gerasimov, V.I.; Gor'kovets, M.S.; Gromov, D.D.; Zavezionov, V.P.; Kruglov, V.G.

    1979-01-01

    A description is given of a cyclotron intended for studying physical processes taking place in the accelerator's central region, for investigating various ion sources, and for optimizing the elements and systems of the U-150M isochronous cyclotron. The accelerator uses a hot-cathode slit ion source. The resonance system is a quarter-wave non-axial resonator excited at a frequency of 11.2 MHz. Investigations of the beam time characteristics showed that the beam's axial size was 11 mm and its radial size 5 mm. Displacement of the beam with respect to the median plane does not exceed 2 mm. In the cyclotron, H+ ions have been accelerated to an energy of 1 MeV. The integrated beam current was 250 μA

  14. On the Computational Capabilities of Physical Systems. Part 1; The Impossibility of Infallible Computation

    Science.gov (United States)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In this first of two papers, strong limits on the accuracy of physical computation are established. First it is proven that there cannot be a physical computer C to which one can pose any and all computational tasks concerning the physical universe. Next it is proven that no physical computer C can correctly carry out any computational task in the subset of such tasks that can be posed to C. This result holds whether the computational tasks concern a system that is physically isolated from C, or instead concern a system that is coupled to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly 'processing information faster than the universe does'. The results also mean that there cannot exist an infallible, general-purpose observation apparatus, and that there cannot be an infallible, general-purpose control apparatus. These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - a definition of 'physical computation' - is needed to address the issues considered in these papers. While this definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. The second in this pair of papers presents a preliminary exploration of some of this mathematical structure, including in particular that of prediction complexity, which is a 'physical computation

  15. Particle modeling of plasmas computational plasma physics

    International Nuclear Information System (INIS)

    Dawson, J.M.

    1991-01-01

    Recently, through the development of supercomputers, a powerful new method for exploring plasmas has emerged: computer modeling. Such modeling can duplicate many of the complex processes that go on in a plasma and allow scientists to understand what the important processes are. It helps scientists gain an intuition about this complex state of matter. It allows scientists and engineers to explore new ideas on how to use plasmas before building costly experiments, and to determine whether they are on the right track. It can duplicate the operation of devices and thus reduce the need to build complex and expensive devices for research and development. This is an exciting new endeavor that is in its infancy, but one which can play an important role in the scientific and technological competitiveness of the US. A wide range of plasma models is in use: particle models, fluid models, and hybrid particle-fluid models. These come in many forms, such as explicit models, implicit models, reduced-dimensional models, electrostatic models, magnetostatic models, electromagnetic models, and an almost endless variety of others. Here the author discusses only particle models and gives a few examples of their use; these are taken from work done by the Plasma Modeling Group at UCLA because he is most familiar with that work. However, this gives only a small view of the wide range of work being done around the US, or for that matter around the world

  16. From the web to the grid and beyond. Computing paradigms driven by high energy physics

    International Nuclear Information System (INIS)

    Brun, Rene; Carminati, Federico; Galli Carminati, Giuliana

    2012-01-01

    Born after World War II, large-scale experimental high-energy physics (HEP) has found itself limited ever since by available accelerator, detector and computing technologies. Accordingly, HEP has made significant contributions to the development of these fields, more often than not driving their innovations. The invention of the World Wide Web at CERN is merely the best-known example out of many. This book is the first comprehensive account to trace the history of this pioneering spirit in the field of computing technologies. It covers everything up to and including the present-day handling of the huge demands imposed upon grid and distributed computing by full-scale LHC operations - operations which have for years involved many thousands of collaborating members worldwide and accordingly provide the original and natural testbed for grid computing concepts. This book takes the reader on a guided tour encompassing all relevant topics, including programming languages, software engineering, large databases, the Web, and grid- and cloud computing. The important issue of intellectual property regulations for distributed software engineering and computing is also addressed. Aptly, the book closes with a visionary chapter of what may lie ahead. Approachable and requiring only basic understanding of physics and computer sciences, this book is intended for both education and research. (orig.)

  17. From the web to the grid and beyond. Computing paradigms driven by high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Brun, Rene; Carminati, Federico [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Galli Carminati, Giuliana (eds.) [Hopitaux Universitaire de Geneve, Chene-Bourg (Switzerland). Unite de la Psychiatrie du Developpement Mental

    2012-07-01

    Born after World War II, large-scale experimental high-energy physics (HEP) has found itself limited ever since by available accelerator, detector and computing technologies. Accordingly, HEP has made significant contributions to the development of these fields, more often than not driving their innovations. The invention of the World Wide Web at CERN is merely the best-known example out of many. This book is the first comprehensive account to trace the history of this pioneering spirit in the field of computing technologies. It covers everything up to and including the present-day handling of the huge demands imposed upon grid and distributed computing by full-scale LHC operations - operations which have for years involved many thousands of collaborating members worldwide and accordingly provide the original and natural testbed for grid computing concepts. This book takes the reader on a guided tour encompassing all relevant topics, including programming languages, software engineering, large databases, the Web, and grid- and cloud computing. The important issue of intellectual property regulations for distributed software engineering and computing is also addressed. Aptly, the book closes with a visionary chapter of what may lie ahead. Approachable and requiring only basic understanding of physics and computer sciences, this book is intended for both education and research. (orig.)

  18. Physical-resource requirements and the power of quantum computation

    International Nuclear Information System (INIS)

    Caves, Carlton M; Deutsch, Ivan H; Blume-Kohout, Robin

    2004-01-01

    The primary resource for quantum computation is the Hilbert-space dimension. Whereas Hilbert space itself is an abstract construction, the number of dimensions available to a system is a physical quantity that requires physical resources. Avoiding a demand for an exponential amount of these resources places a fundamental constraint on the systems that are suitable for scalable quantum computation. To be scalable, the number of degrees of freedom in the computer must grow nearly linearly with the number of qubits in an equivalent qubit-based quantum computer. These considerations rule out quantum computers based on a single particle, a single atom, or a single molecule consisting of a fixed number of atoms or on classical waves manipulated using the transformations of linear optics
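
    One concrete reading of the scaling argument (added here for illustration, with δE standing for the smallest level spacing a device can resolve): n qubits span a Hilbert space of dimension 2^n, so

        \dim\mathcal{H} = 2^{n}, \qquad
        E_{\text{single system}} \gtrsim (2^{n}-1)\,\delta E, \qquad
        E_{\text{qubit array}} \sim n\,\delta E ,

    i.e. packing the whole space into one degree of freedom demands exponentially growing physical resources, whereas an array of two-level systems needs resources growing only roughly linearly with n.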

  19. Parallel Computing:. Some Activities in High Energy Physics

    Science.gov (United States)

    Willers, Ian

    This paper examines some activities in High Energy Physics that utilise parallel computing. The topic includes all computing, from the proposed SIMD front-end detectors and the farming applications to high-powered RISC processors and the large machines in the computer centers. We start by looking at the motivation behind using parallelism for general-purpose computing. The developments around farming are then described, from its simplest form to the more complex system at Fermilab. Finally, there is a list of some developments that are happening close to the experiments.

  20. On The Computational Capabilities of Physical Systems. Part 2; Relationship With Conventional Computer Science

    Science.gov (United States)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In the first of this pair of papers, it was proven that there cannot be a physical computer to which one can properly pose any and all computational tasks concerning the physical universe. It was then further proven that no physical computer C can correctly carry out all computational tasks that can be posed to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly "processing information faster than the universe does". These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - "physical computation" - is needed to address the issues considered in these papers, which concern real physical computers. While this novel definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. This second paper of the pair presents a preliminary exploration of some of this mathematical structure. Analogues of Chomskian results concerning universal Turing Machines and the Halting theorem are derived, as are results concerning the (im)possibility of certain kinds of error-correcting codes. In addition, an analogue of algorithmic information complexity, "prediction complexity", is elaborated. A task-independent bound is derived on how much the prediction complexity of a computational task can differ for two different reference universal physical computers used to solve that task

  1. Accelerating Spaceborne SAR Imaging Using Multiple CPU/GPU Deep Collaborative Computing

    Directory of Open Access Journals (Sweden)

    Fan Zhang

    2016-04-01

    Full Text Available With the development of synthetic aperture radar (SAR) technologies in recent years, the huge amount of remote sensing data brings challenges for real-time imaging processing. High-performance computing (HPC) methods have therefore been proposed to accelerate SAR imaging, especially GPU-based methods. In the classical GPU-based imaging algorithm, the GPU is employed to accelerate image formation by massively parallel computing, while the CPU only performs auxiliary work such as data input/output (IO); the computing capability of the CPU is ignored and underestimated. In this work, a new deep collaborative SAR imaging method based on multiple CPUs/GPUs is proposed to achieve real-time SAR imaging. Through the proposed task partitioning and scheduling strategy, the whole image can be generated with deep collaborative multi-CPU/GPU computing. In the CPU parallel imaging part, the advanced vector extension (AVX) method is introduced into the multi-core CPU parallel method for higher efficiency. In the GPU parallel imaging part, not only are the bottlenecks of memory limitation and frequent data transfers removed, but optimization strategies such as streaming and parallel pipelining are also applied. Experimental results demonstrate that the deep CPU/GPU collaborative imaging method improves the efficiency of SAR imaging by a factor of 270 over a single-core CPU and achieves real-time imaging, in that the imaging rate exceeds the raw data generation rate.

  2. Accelerating Spaceborne SAR Imaging Using Multiple CPU/GPU Deep Collaborative Computing.

    Science.gov (United States)

    Zhang, Fan; Li, Guojun; Li, Wei; Hu, Wei; Hu, Yuxin

    2016-04-07

    With the development of synthetic aperture radar (SAR) technologies in recent years, the huge amount of remote sensing data brings challenges for real-time imaging processing. High-performance computing (HPC) methods have therefore been proposed to accelerate SAR imaging, especially GPU-based methods. In the classical GPU-based imaging algorithm, the GPU is employed to accelerate image formation by massively parallel computing, while the CPU only performs auxiliary work such as data input/output (IO); the computing capability of the CPU is ignored and underestimated. In this work, a new deep collaborative SAR imaging method based on multiple CPUs/GPUs is proposed to achieve real-time SAR imaging. Through the proposed task partitioning and scheduling strategy, the whole image can be generated with deep collaborative multi-CPU/GPU computing. In the CPU parallel imaging part, the advanced vector extension (AVX) method is introduced into the multi-core CPU parallel method for higher efficiency. In the GPU parallel imaging part, not only are the bottlenecks of memory limitation and frequent data transfers removed, but optimization strategies such as streaming and parallel pipelining are also applied. Experimental results demonstrate that the deep CPU/GPU collaborative imaging method improves the efficiency of SAR imaging by a factor of 270 over a single-core CPU and achieves real-time imaging, in that the imaging rate exceeds the raw data generation rate.
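
    The partition-and-schedule idea in the two records above can be caricatured in a few lines of Python. The sketch below is purely illustrative and is not the authors' code: block_kernel stands in for the per-block imaging step, the "GPU" share is simply a thread pool fed a larger fraction of blocks, and the speedup ratio is an assumed number rather than a measurement.

        import numpy as np
        from concurrent.futures import ThreadPoolExecutor

        def block_kernel(block):
            # Stand-in for the per-block imaging step (a 2-D FFT here;
            # the real range-Doppler kernel is far more involved).
            return np.fft.fft2(block)

        def partition(n_blocks, gpu_speedup):
            # Static split proportional to the assumed relative throughput.
            n_gpu = int(round(n_blocks * gpu_speedup / (gpu_speedup + 1.0)))
            return n_gpu, n_blocks - n_gpu

        raw_blocks = [np.random.randn(512, 512) + 1j * np.random.randn(512, 512)
                      for _ in range(16)]
        n_gpu, n_cpu = partition(len(raw_blocks), gpu_speedup=8.0)  # assumed ratio

        with ThreadPoolExecutor() as pool:
            # In the paper's scheme the first chunk would be streamed to the GPU;
            # here both chunks run on CPU threads purely to show the scheduling.
            gpu_part = list(pool.map(block_kernel, raw_blocks[:n_gpu]))
            cpu_part = list(pool.map(block_kernel, raw_blocks[n_gpu:]))

        image_blocks = gpu_part + cpu_part
        print(len(image_blocks), image_blocks[0].shape)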

  3. Development and application of a computer model for large-scale flame acceleration experiments

    International Nuclear Information System (INIS)

    Marx, K.D.

    1987-07-01

    A new computational model for large-scale premixed flames is developed and applied to the simulation of flame acceleration experiments. The primary objective is to circumvent the necessity for resolving turbulent flame fronts; this is imperative because of the relatively coarse computational grids which must be used in engineering calculations. The essence of the model is to artificially thicken the flame by increasing the appropriate diffusivities and decreasing the combustion rate, but to do this in such a way that the burn velocity varies with pressure, temperature, and turbulence intensity according to prespecified phenomenological characteristics. The model is particularly aimed at implementation in computer codes which simulate compressible flows. To this end, it is applied to the two-dimensional simulation of hydrogen-air flame acceleration experiments in which the flame speeds and gas flow velocities attain or exceed the speed of sound in the gas. It is shown that many of the features of the flame trajectories and pressure histories in the experiments are simulated quite well by the model. Using the comparison of experimental and computational results as a guide, some insight is developed into the processes which occur in such experiments. 34 refs., 25 figs., 4 tabs
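
    The thickening argument can be made explicit with the usual laminar-flame scaling (a textbook relation, not quoted from the report): if the diffusivity D is multiplied by a factor F and the reaction rate ω̇ divided by the same factor,

        s_L \sim \sqrt{D\,\dot\omega} \;\longrightarrow\; s_L, \qquad
        \delta_L \sim \sqrt{D/\dot\omega} \;\longrightarrow\; F\,\delta_L ,

    so the burning velocity is preserved while the flame front becomes thick enough to be resolved on a coarse engineering grid, which is exactly the trade-off the abstract describes.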

  4. Applications of FLUKA Monte Carlo code for nuclear and accelerator physics

    CERN Document Server

    Battistoni, Giuseppe; Brugger, Markus; Campanella, Mauro; Carboni, Massimo; Empl, Anton; Fasso, Alberto; Gadioli, Ettore; Cerutti, Francesco; Ferrari, Alfredo; Ferrari, Anna; Lantz, Matthias; Mairani, Andrea; Margiotta, M; Morone, Christina; Muraro, Silvia; Parodi, Katerina; Patera, Vincenzo; Pelliccioni, Maurizio; Pinsky, Lawrence; Ranft, Johannes; Roesler, Stefan; Rollet, Sofia; Sala, Paola R; Santana, Mario; Sarchiapone, Lucia; Sioli, Maximiliano; Smirnov, George; Sommerer, Florian; Theis, Christian; Trovati, Stefania; Villari, R; Vincke, Heinz; Vincke, Helmut; Vlachoudis, Vasilis; Vollaire, Joachim; Zapp, Neil

    2011-01-01

    FLUKA is a general-purpose Monte Carlo code capable of handling all radiation components from thermal energies (for neutrons) or 1 keV (for all other particles) up to cosmic-ray energies, and can be applied in many different fields. Presently the code is maintained on Linux. The validity of the physical models implemented in FLUKA has been benchmarked against a variety of experimental data over a wide energy range, from accelerator data to cosmic-ray showers in the Earth's atmosphere. FLUKA is widely used for studies related both to basic research and to applications in particle accelerators, radiation protection and dosimetry, including the specific issue of radiation damage in space missions, radiobiology (including radiotherapy) and cosmic-ray calculations. After a short description of the main features that make FLUKA valuable for these topics, the present paper summarizes some of the recent applications of the FLUKA Monte Carlo code in nuclear as well as high-energy physics. In particular it addresses such top...

  5. Computational and experimental investigation of plasma deflagration jets and detonation shocks in coaxial plasma accelerators

    Science.gov (United States)

    Subramaniam, Vivek; Underwood, Thomas C.; Raja, Laxminarayan L.; Cappelli, Mark A.

    2018-02-01

    We present a magnetohydrodynamic (MHD) numerical simulation to study the physical mechanisms underlying plasma acceleration in a coaxial plasma gun. Coaxial plasma accelerators are known to exhibit two distinct modes of operation depending on the delay between gas loading and capacitor discharging. Shorter delays lead to a high velocity plasma deflagration jet and longer delays produce detonation shocks. During a single operational cycle that typically consists of two discharge events, the plasma acceleration exhibits a behavior characterized by a mode transition from deflagration to detonation. The first of the discharge events, a deflagration that occurs when the discharge expands into an initially evacuated domain, requires a modification of the standard MHD algorithm to account for rarefied regions of the simulation domain. The conventional approach of using a low background density gas to mimic the vacuum background results in the formation of an artificial shock, inconsistent with the physics of free expansion. To this end, we present a plasma-vacuum interface tracking framework with the objective of predicting a physically consistent free expansion, devoid of the spurious shock obtained with the low background density approach. The interface tracking formulation is integrated within the MHD framework to simulate the plasma deflagration and the second discharge event, a plasma detonation, formed due to its initiation in a background prefilled with gas remnant from the deflagration. The mode transition behavior obtained in the simulations is qualitatively compared to that observed in the experiments using high framing rate Schlieren videography. The deflagration mode is further investigated to understand the jet formation process and the axial velocities obtained are compared against experimentally obtained deflagration plasma front velocities. The simulations are also used to provide insight into the conditions responsible for the generation and sustenance of

  6. Computational physics of electric discharges in gas flows

    CERN Document Server

    Surzhikov, Sergey T

    2012-01-01

    Gas discharges are of interest for many processes in mechanics, manufacturing, materials science and aerophysics. To understand the physics behind the phenomena is of key importance for the effective use and development of gas discharge devices. This work treats methods of computational modeling of electrodischarge processes and dynamics of partially ionized gases. These methods are necessary to tackle problems of physical mechanics, physics of gas discharges and aerophysics. Particular attention is given to a solution of two-dimensional problems of physical mechanics of glow discharges. The use o...

  7. Dynamics of dissipative systems and computational physics

    International Nuclear Information System (INIS)

    Adam, Gh.; Scutaru, H.; Ixaru, L.; Adam, S.; Rizea, M.; Stefanescu, E.; Mihalache, D.; Mazilu, D.; Crasovan, L.

    2002-01-01

    given. These coefficients describe correlated transitions of the system and environment particles, depending on the dissipative two-body potential V, the populations f(ε_α), f(ε_β) and the densities g(ε_α), g(ε_β) of the environment states. From this we infer that for a normal Fermi-Dirac distribution of the environment particles the decay processes are favored over the excitation ones, while for a reversed distribution of the environment populations the excitations are favored. Concerning the second topic approached within this project, one starts from the fact that the topological charge of a soliton is an integer 's' which arises in the axially symmetric solution for the local amplitude of the electromagnetic wave, A(z, x, y) = U(z, r) exp(isθ), of the (2+1)-dimensional Ginzburg-Landau equation. The parameter 's' is also called 'spin' or 'vorticity'. The investigation conducted within this topic has been directed along two main lines: (i) the study of fundamental phenomena concerning vortex solitons in dissipative (open) systems, and (ii) comparison of the specific properties of vortex-type solitons in Hamiltonian (conservative) systems and in dissipative systems. The following fundamental results have been obtained: 1. formulation of the relevant physical model and identification of the values of its physical parameters; 2. systematic analysis of the stable localized solutions of the (2+1)-dimensional Ginzburg-Landau equation in media characterized by cubic saturable nonlinearities; 3. extensive numerical simulations of the (2+1)-dimensional Ginzburg-Landau equation in polar coordinates, demonstrating the occurrence of stable two-dimensional solutions with axial symmetry both for non-vanishing 'spin' (annular, vortex-type solitons) and for vanishing 'spin' (fundamental solitons). The study of the propagation of these solitons under azimuthal perturbations demonstrates soliton
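
    For readability, the vortex ansatz quoted in the abstract can be written out with the standard definition of the winding number (added here for clarity, not part of the original record): in transverse polar coordinates (r, θ),

        A(z, x, y) = U(z, r)\, e^{i s \theta}, \qquad
        s = \frac{1}{2\pi} \oint_{C} \nabla(\arg A)\cdot d\boldsymbol{\ell} \in \mathbb{Z} ,

    so the 'spin' s is the integer number of times the phase of the field wraps around the vortex core; s = 0 gives the fundamental soliton and s ≠ 0 the annular vortex solitons discussed above.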

  8. 15th International Conference on Accelerator and Large Experimental Physics Control Systems

    CERN Document Server

    2015-01-01

    ICALEPCS is a biennial series of conferences that is intended to: * Provide a forum for the interchange of ideas and information between control system specialists working on large experimental physics facilities around the world (accelerators, particle detectors, fusion reactors, telescopes, etc.); * Create an archival literature of developments and progress in this rapidly changing discipline; * Promote, where practical, standardization in both hardware and software; * Promote collaboration between laboratories, institutes and industry.

  9. Variation of beam characteristics for physical and enhanced dynamic wedge from a dual energy accelerator

    International Nuclear Information System (INIS)

    Varatharaj, C.; Ravikumar, M.; Sathiyan, S.; Supe, Sanjay S.

    2008-01-01

    The use of megavoltage X-ray sources, with their skin-sparing qualities, has been a boon in radiation therapy, relieving patient discomfort and allowing higher tumor doses to be given with fewer restrictions due to radiation effects in the skin. The aim of this study was to compare a few of the dosimetric characteristics of a physical and an enhanced dynamic wedge from a dual-energy (6-18 MV) linear accelerator

  10. Survey of physics research with a high duty cycle electron accelerator

    International Nuclear Information System (INIS)

    Bartholomew, G.A.; Earle, E.D.; Knowles, J.W.; Lone, M.A.

    1981-02-01

    The opportunities for nuclear physics research afforded by a CW electron linac with nominal energy 100 MeV and beam current >= 100 μA equipped with a bremsstrahlung monochromator and reaction product coincidence facilities are outlined. It is proposed that a program toward realization of an accelerator meeting these requirements and with provision for eventual extension to higher energies be undertaken at the Chalk River Nuclear Laboratories. (author)

  11. Signal of Acceleration and Physical Mechanism of Water Cycle in Xinjiang, China

    OpenAIRE

    Feng, Guo-Lin; Wu, Yong-Ping

    2016-01-01

    Global warming accelerates the water cycle, with features of regional difference. However, little is known about the physical mechanism behind this phenomenon. To reveal the links between the water cycle and the climatic environment, we analyzed the changes in water cycle elements and their relationships with climatic and environmental factors. We found that when global warming was significant during the period 1986-2003, the precipitation in the Tarim mountains as well as in Xinjiang increased rapidly except ...

  12. Program for computing inhomogeneous coaxial resonators and accelerating systems of the U-400 and ITs-100 cyclotrons

    International Nuclear Information System (INIS)

    Gul'bekyan, G.G.; Ivanov, Eh.L.

    1987-01-01

    The ''Line'' computer code for computing inhomogeneous coaxial resonators is described. The results obtained for the resonators of the U-400 cyclotron made it possible to increase the energy of accelerated ions up to 27 MeV/nucl. The computations for the ITs-100 cyclic implantator allowed a compact design to be built with low RF power consumption
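
    Codes of this type usually treat an inhomogeneous coaxial resonator as a cascade of uniform transmission-line sections and search for the frequency at which the impedance seen from the open (accelerating) end diverges. The Python sketch below shows that generic approach only; it is not the ''Line'' code, and the section dimensions are invented for the example.

        import numpy as np

        C = 299_792_458.0  # speed of light, m/s

        def z0_coax(outer_d, inner_d):
            # Characteristic impedance of an air-filled coaxial section (ohms).
            return 59.96 * np.log(outer_d / inner_d)

        def input_impedance(sections, f):
            # Transform a short (Z = 0) through lossless sections (Z0, length),
            # listed starting from the shorted end of the resonator.
            z, beta = 0.0 + 0.0j, 2.0 * np.pi * f / C
            for z0, length in sections:
                t = np.tan(beta * length)
                z = z0 * (z + 1j * z0 * t) / (z0 + 1j * z * t)
            return z

        def fundamental_frequency(sections, f_lo=1e6, f_hi=100e6, n=20001):
            # Lowest frequency at which the open-end impedance diverges,
            # detected as a sign change of the (purely imaginary) admittance.
            fs = np.linspace(f_lo, f_hi, n)
            y_im = np.array([(1.0 / input_impedance(sections, f)).imag for f in fs])
            idx = np.where(np.diff(np.sign(y_im)) != 0)[0]
            return fs[idx[0]] if idx.size else None

        # Two-step resonator with made-up diameters (m) and lengths (m).
        sections = [(z0_coax(0.30, 0.10), 1.5), (z0_coax(0.30, 0.05), 1.0)]
        print("fundamental mode near %.2f MHz" % (fundamental_frequency(sections) / 1e6))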

  13. Advanced quadrature sets and acceleration and preconditioning techniques for the discrete ordinates method in parallel computing environments

    Science.gov (United States)

    Longoni, Gianluca

    In the nuclear science and engineering field, radiation transport calculations play a key role in the design and optimization of nuclear devices. The linear Boltzmann equation describes the angular, energy and spatial variations of the particle or radiation distribution. The discrete ordinates method (SN) is the most widely used technique for solving the linear Boltzmann equation. However, for realistic problems, the memory and computing-time requirements necessitate the use of supercomputers. This research is devoted to the development of new formulations of the SN method, especially for highly angle-dependent problems, in parallel environments. The present work addresses two main issues affecting the accuracy and performance of SN transport methods: quadrature sets and acceleration techniques. New advanced quadrature techniques, which allow for large numbers of angles with a capability for local angular refinement, have been developed. These techniques have been integrated into the 3-D SN PENTRAN (Parallel Environment Neutral-particle TRANsport) code and applied to highly angle-dependent problems, such as CT-scan devices, which are widely used to obtain detailed 3-D images for industrial and medical applications. In addition, the accurate simulation of core physics and shielding problems with strong heterogeneities and transport effects requires the numerical solution of the transport equation. In general, the convergence rate of the solution methods for the transport equation is reduced for large problems with optically thick regions and scattering ratios approaching unity. To remedy this situation, new acceleration algorithms based on the Even-Parity Simplified SN (EP-SSN) method have been developed. A new stand-alone code system, PENSSn (Parallel Environment Neutral-particle Simplified SN), has been developed based on the EP-SSN method. The code is designed for parallel computing environments with spatial, angular and hybrid (spatial/angular) domain
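
    For readers unfamiliar with the method, a minimal 1-D discrete ordinates (SN) solver with diamond-difference sweeps and plain source iteration is sketched below in Python. It illustrates only the basic idea that PENTRAN and PENSSn build on; the cross sections, source and slab width are arbitrary, and none of the advanced quadrature or EP-SSN acceleration techniques of the thesis appear here.

        import numpy as np

        def sn_slab(width=10.0, n_cells=200, n_angles=8, sigma_t=1.0,
                    sigma_s=0.5, q=1.0, tol=1e-8, max_iters=500):
            # Source iteration for the 1-D slab transport equation with isotropic
            # scattering, diamond-difference cells and Gauss-Legendre angles
            # (the quadrature set).
            dx = width / n_cells
            mu, w = np.polynomial.legendre.leggauss(n_angles)
            phi = np.zeros(n_cells)
            for _ in range(max_iters):
                src = 0.5 * (sigma_s * phi + q)           # isotropic emission density
                phi_new = np.zeros(n_cells)
                for m in range(n_angles):
                    psi_in = 0.0                           # vacuum boundary
                    cells = range(n_cells) if mu[m] > 0 else range(n_cells - 1, -1, -1)
                    a = abs(mu[m]) / dx
                    for i in cells:
                        psi_c = (src[i] + 2.0 * a * psi_in) / (sigma_t + 2.0 * a)
                        psi_in = 2.0 * psi_c - psi_in      # diamond-difference closure
                        phi_new[i] += w[m] * psi_c
                if np.max(np.abs(phi_new - phi)) < tol * np.max(np.abs(phi_new)):
                    return phi_new
                phi = phi_new
            return phi

        flux = sn_slab()
        print("mid-slab scalar flux ~ %.4f (infinite-medium value is 2.0)" % flux[len(flux) // 2])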

  14. Open-Source Java for Teaching Computational Physics

    Science.gov (United States)

    Wolfgang, Christian; Gould, Harvey; Gould, Joshua; Tobochnik, Jan

    2001-11-01

    The switch from procedural to object-oriented (OO) programming has produced dramatic changes in professional software design. OO techniques have not, however, been widely adopted in computational physics. Although most physicists are familiar with procedural languages such as Fortran, few physicists have formal training in computer science and few therefore have made the switch to OO programming. The continued use of procedural languages in education is due, in part, to the lack of up-to-date curricular materials that combine current computational physics research topics with an OO framework. This talk describes an Open-Source curriculum development project to produce such material. Examples will be presented that show how OO techniques can be used to encapsulate the relevant Physics, the analysis, and the associated numerical methods.
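
    The separation the talk advocates, with the physics, the numerics and the analysis encapsulated in separate objects, is language-independent; a compact Python analogue, written here for illustration rather than drawn from the project's Java materials, might look like:

        import numpy as np

        class Oscillator:
            # Physics: the model only knows its state and its rate equation.
            def __init__(self, x=1.0, v=0.0, omega=1.0):
                self.state = np.array([x, v])
                self.omega = omega
            def rate(self, state):
                x, v = state
                return np.array([v, -self.omega ** 2 * x])

        class RK4:
            # Numerics: works with any model exposing .state and .rate().
            def step(self, model, dt):
                s, f = model.state, model.rate
                k1 = f(s)
                k2 = f(s + 0.5 * dt * k1)
                k3 = f(s + 0.5 * dt * k2)
                k4 = f(s + dt * k3)
                model.state = s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

        model, solver = Oscillator(), RK4()
        for _ in range(1000):
            solver.step(model, dt=0.01)
        print(model.state)  # analysis/plotting would live in yet another class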

  15. Use of new computer technologies in elementary particle physics

    International Nuclear Information System (INIS)

    Gaines, I.; Nash, T.

    1987-01-01

    Elementary particle physics and computers have progressed together for as long as anyone can remember. The symbiosis is surprising considering the dissimilar objectives of these fields, but physics understanding cannot be had simply by detecting the passage of particles. It requires the selection of interesting events and their analysis in comparison with quantitative theoretical predictions. The extraordinary reach made by experimentalists into realms ever further removed from everyday observation has frequently encountered technology constraints. Pushing away such barriers has been an essential activity of the physicist since long before Rossi developed the first practical electronic AND gates as coincidence circuits in 1930. This article describes the latest episode of this history: the development of new computer technologies to meet the varied and increasing appetite for computing of experimental (and theoretical) high energy physics

  16. Computations, Complexity, Experiments, and the World Outside Physics

    International Nuclear Information System (INIS)

    Kadanoff, L.P

    2009-01-01

    Computer Models in the Sciences and Social Sciences. 1. Simulation and Prediction in Complex Systems: the Good, the Bad and the Awful. This lecture deals with the history of large-scale computer modeling, mostly in the context of the U.S. Department of Energy's sponsorship of modeling for weapons development and innovation in energy sources. 2. Complexity: Making a Splash - Breaking a Neck. The Making of Complexity in Physical Systems. For ages thinkers have been asking how complexity arises. The laws of physics are very simple. How come we are so complex? This lecture tries to approach this question by asking how complexity arises in physical fluids. 3. Forrester et al.: Social and Biological Model-Making. The partial collapse of the world's economy has raised the question of whether we could improve the performance of economic and social systems by a major effort on creating understanding via large-scale computer models. (author)

  17. Acceleration of Cherenkov angle reconstruction with the new Intel Xeon/FPGA compute platform for the particle identification in the LHCb Upgrade

    Science.gov (United States)

    Faerber, Christian

    2017-10-01

    The LHCb experiment at the LHC will upgrade its detector by 2018/2019 to a 'triggerless' readout scheme, in which all the readout electronics and several sub-detector parts will be replaced. The new readout electronics will be able to read out the detector at 40 MHz. This increases the data bandwidth from the detector down to the Event Filter farm to 40 TBit/s, which also has to be processed to select the interesting proton-proton collisions for later storage. Designing the architecture of a computing farm that can process this amount of data as efficiently as possible is a challenging task, and several compute accelerator technologies are being considered for use inside the new Event Filter farm. In the high-performance computing sector, more and more FPGA compute accelerators are used to improve compute performance and reduce power consumption (e.g. in the Microsoft Catapult project and the Bing search engine). For the LHCb upgrade, the use of an experimental FPGA-accelerated computing platform in the Event Building or in the Event Filter farm is likewise being considered and therefore tested. This platform from Intel hosts a general-purpose CPU and a high-performance FPGA linked via a high-speed link, which for this platform is a QPI link. An accelerator is implemented on the FPGA. The system used is a two-socket platform from Intel with a Xeon CPU and an FPGA. The FPGA has cache-coherent access to the main memory of the server and can collaborate with the CPU. As a first step, a computing-intensive algorithm to reconstruct Cherenkov angles for the LHCb RICH particle identification was successfully ported in Verilog to the Intel Xeon/FPGA platform and accelerated by a factor of 35. The same algorithm was then ported to the Intel Xeon/FPGA platform with OpenCL; the implementation work and the performance are compared. Another FPGA accelerator, the Nallatech 385 PCIe accelerator with the same Stratix V FPGA, was also tested for performance. The results show that the Intel
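
    The offloaded kernel is, at its heart, a repeated evaluation of the Cherenkov relation cos θ_c = 1/(nβ) for each track/photon hypothesis. A minimal Python rendering of that relation is given below as an illustration of the arithmetic involved; it is standard physics, not the LHCb RICH reconstruction code, and the refractive index used is only an assumed, typical gas-radiator value.

        import math

        def cherenkov_angle(p_gev, mass_gev, n_refr):
            # Expected Cherenkov angle (rad) for a track of momentum p and mass m
            # in a radiator of refractive index n; None if below threshold.
            beta = p_gev / math.hypot(p_gev, mass_gev)   # beta = p / E in natural units
            cos_theta = 1.0 / (n_refr * beta)
            return math.acos(cos_theta) if cos_theta <= 1.0 else None

        # A 10 GeV/c pion in a gas radiator with an assumed n of about 1.0014.
        print(cherenkov_angle(10.0, 0.13957, 1.0014))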

  18. Concepts and techniques: Active electronics and computers in safety-critical accelerator operation

    International Nuclear Information System (INIS)

    Frankel, R.S.

    1995-01-01

    The Relativistic Heavy Ion Collider (RHIC), under construction at Brookhaven National Laboratory, requires an extensive Access Control System to protect personnel from Radiation, Oxygen Deficiency and Electrical hazards. In addition, the complicated nature of operation of the Collider as part of a complex of other Accelerators necessitates the use of active electronic measurement circuitry to ensure compliance with established Operational Safety Limits. Solutions were devised which permit the use of modern computer and interconnection technology for Safety-Critical applications while preserving and enhancing tried and proven protection methods. In addition, a set of Guidelines regarding required performance for Accelerator Safety Systems and a Handbook of design criteria and rules were developed to assist future system designers and to provide a framework for internal review and regulation

  19. Concepts and techniques: Active electronics and computers in safety-critical accelerator operation

    Energy Technology Data Exchange (ETDEWEB)

    Frankel, R.S.

    1995-12-31

    The Relativistic Heavy Ion Collider (RHIC), under construction at Brookhaven National Laboratory, requires an extensive Access Control System to protect personnel from Radiation, Oxygen Deficiency and Electrical hazards. In addition, the complicated nature of operation of the Collider as part of a complex of other Accelerators necessitates the use of active electronic measurement circuitry to ensure compliance with established Operational Safety Limits. Solutions were devised which permit the use of modern computer and interconnection technology for Safety-Critical applications while preserving and enhancing tried and proven protection methods. In addition, a set of Guidelines regarding required performance for Accelerator Safety Systems and a Handbook of design criteria and rules were developed to assist future system designers and to provide a framework for internal review and regulation.

  20. A review of accelerator and particle physics at the CERN intersecting storage rings

    International Nuclear Information System (INIS)

    Jacob, M.; Johnsen, K.

    1984-01-01

    The last meeting of the CERN Intersecting Storage Rings Committee (ISRC) was held on 27 January 1984, following the closing of the ISR for colliding-beam physics in December 1983. This report consists of the written versions of the two review talks presented at that meeting. K. Johnsen describes the history and importance of the ISR for accelerator physics, from the first ideas on colliding-beam devices to the final operation. M. Jacob gives his view of the role of the ISR physics programme in the development of particle physics up to and including the latest available results. The preface is by G. Bellettini, the last chairman of the ISR Committee. (orig.)