WorldWideScience

Sample records for accelerated strategic computing

  1. Delivering Insight: The History of the Accelerated Strategic Computing Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Larzelere II, A R

    2007-01-03

    The history of the Accelerated Strategic Computing Initiative (ASCI) tells of the development of computational simulation into a third fundamental piece of the scientific method, on a par with theory and experiment. ASCI did not invent the idea, nor was it alone in bringing it to fruition. But ASCI provided the wherewithal - hardware, software, environment, funding, and, most of all, the urgency - that made it happen. On October 1, 2005, the Initiative completed its tenth year of funding. The advances made by ASCI over its first decade are truly incredible. Lawrence Livermore, Los Alamos, and Sandia National Laboratories, along with leadership provided by the Department of Energy's Defense Programs Headquarters, fundamentally changed computational simulation and how it is used to enable scientific insight. To do this, astounding advances were made in simulation applications, computing platforms, and user environments. ASCI dramatically changed existing - and forged new - relationships, both among the Laboratories and with outside partners. By its tenth anniversary, despite daunting challenges, ASCI had accomplished all of the major goals set at its beginning. The history of ASCI is about the vision, leadership, endurance, and partnerships that made these advances possible.

  2. Accelerator requirements for strategic defense

    International Nuclear Information System (INIS)

    Gullickson, R.L.

    1987-01-01

    The authors discuss how directed energy applications require accelerators with high brightness and large gradients to minimize the size and weight of space systems. Several major directed energy applications are based upon accelerator technology. The radio-frequency linear accelerator is the basis for both space-based neutral particle beam (NPB) and free electron laser (FEL) devices. The high peak current of the induction linac has made it a leading candidate for ground-based free electron laser applications.

  3. CONFERENCE: Computers and accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1984-01-15

    In September of last year a Conference on 'Computers in Accelerator Design and Operation' was held in West Berlin, attracting some 160 specialists, including many from outside Europe. It was a Europhysics Conference, organized by the Hahn-Meitner Institute with Roman Zelazny as Conference Chairman, postponed from an earlier intended venue in Warsaw. The aim was to bring together specialists in the fields of accelerator design, computer control and accelerator operation.

  4. Accelerating Strategic Change Through Action Learning

    DEFF Research Database (Denmark)

    Younger, Jon; Sørensen, René; Cleemann, Christine

    2013-01-01

    Purpose – The purpose of this paper is to describe how a leading global company used action-learning based leadership development to accelerate strategic culture change. Design/methodology/approach – It describes the need for change, and the methodology and approach by which the initiative, Impact, generated significant benefits. Findings – The initiative led to financial benefit, as well as measurable gains in customer centricity, collaboration, and innovation. It was also a powerful experience for participants in their journey as commercial leaders. Originality/value – Impact was created using...

  5. Accelerator simulation using computers

    International Nuclear Information System (INIS)

    Lee, M.; Zambre, Y.; Corbett, W.

    1992-01-01

    Every accelerator or storage ring system consists of a charged particle beam propagating through a beam line. Although a number of computer programs exist that simulate the propagation of a beam in a given beam line, only a few provide the capabilities for designing, commissioning and operating the beam line. This paper shows how a "multi-track" simulation and analysis code can be used for these applications.

  6. Computer control applied to accelerators

    CERN Document Server

    Crowley-Milling, Michael C

    1974-01-01

    The differences that exist between control systems for accelerators and other types of control systems are outlined. It is further indicated that earlier accelerators had manual control systems to which computers were added, but that it is essential for the new, large accelerators to include computers in the control systems right from the beginning. Details of the computer control designed for the Super Proton Synchrotron are presented. The method of choosing the computers is described, as well as the reasons for CERN having to design the message transfer system. The items discussed include: CAMAC interface systems, a new multiplex system, operator-to-computer interaction (such as touch screen, computer-controlled knob, and non-linear track-ball), and high-level control languages. Brief mention is made of the contributions of other high-energy research laboratories as well as of some other computer control applications at CERN. (0 refs).

  7. Personal computers in accelerator control

    International Nuclear Information System (INIS)

    Anderssen, P.S.

    1988-01-01

    The advent of the personal computer has created a popular movement which has also made a strong impact on science and engineering. Flexible software environments combined with good computational performance and large storage capacities are becoming available at steadily decreasing costs. Of equal importance, however, is the quality of the user interface offered on many of these products. Graphics and screen interaction are available in ways that were previously possible only on specialized systems. Accelerator engineers were quick to pick up the new technology. The first applications were probably as controllers and data gatherers for beam measurement equipment. Others followed, and today it is conceivable to make the personal computer a standard component of an accelerator control system. This paper reviews the experience gained at CERN so far and describes the approach taken in the design of the common control center for the SPS and the future LEP accelerators. The design goal has been to be able to integrate personal computers into the accelerator control system and to build the operator's workplace around them. (orig.)

  8. Computer programs in accelerator physics

    International Nuclear Information System (INIS)

    Keil, E.

    1984-01-01

    Three areas of accelerator physics are discussed in which computer programs have been applied with much success: i) single-particle beam dynamics in circular machines, i.e. the design and matching of machine lattices; ii) computations of electromagnetic fields in RF cavities and similar objects, useful for the design of RF cavities and for the calculation of wake fields; iii) simulation of betatron and synchrotron oscillations in a machine with non-linear elements, e.g. sextupoles, and of bunch lengthening due to longitudinal wake fields. (orig.)
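
    As a concrete illustration of category (iii), the toy sketch below tracks a single particle through a linear one-turn map followed by a thin sextupole kick, the basic loop that such simulation codes iterate over many turns. This is a minimal Python sketch, not one of the programs surveyed here; the tune and sextupole strength are invented for illustration.

        import math

        # Toy tracking: linear one-turn rotation plus a thin sextupole kick.
        # Q (fractional tune) and k2 (integrated sextupole strength) are made up.
        Q = 0.31
        k2 = 1.2
        mu = 2 * math.pi * Q
        c, s = math.cos(mu), math.sin(mu)

        def one_turn(x, xp):
            # Linear rotation in normalized phase space (one revolution).
            x, xp = c * x + s * xp, -s * x + c * xp
            # Thin sextupole: kick proportional to x^2 (the non-linear element).
            xp -= 0.5 * k2 * x * x
            return x, xp

        x, xp = 1e-3, 0.0  # start 1 mm off-axis
        for turn in range(1000):
            x, xp = one_turn(x, xp)
        print(f"after 1000 turns: x = {x:.6e} m, x' = {xp:.6e}")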

  9. Accelerating Clean Energy Commercialization. A Strategic Partnership Approach

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Richard [National Renewable Energy Lab. (NREL), Golden, CO (United States); Pless, Jacquelyn [Joint Institute for Strategic Energy Analysis, Golden, CO (United States); Arent, Douglas J. [Joint Institute for Strategic Energy Analysis, Golden, CO (United States); Locklin, Ken [Impax Asset Management Group (United Kingdom)

    2016-04-01

    Technology development in the clean energy and broader clean tech space has proven to be challenging. Long-standing methods for advancing clean energy technologies from science to commercialization are best known for relatively slow, linear progression through research and development, demonstration, and deployment (RDD&D), and are characterized by well-known 'valleys of death' for financing. Investment returns expected by traditional venture capital investors have been difficult to achieve, particularly for hardware-centric innovations and companies that are subject to project finance risks. Commercialization support from incubators and accelerators has helped address these challenges by offering more support services to start-ups; however, more effort is needed to fulfill the desired clean energy future. The emergence of new strategic investors and partners in recent years has opened up innovative opportunities for clean tech entrepreneurs, and novel commercialization models are emerging that involve new alliances among clean energy companies, RDD&D, support systems, and strategic customers. For instance, Wells Fargo and Company (WFC) and the National Renewable Energy Laboratory (NREL) have launched a new technology incubator that supports faster commercialization through a focus on technology development. The incubator combines strategic financing, technology and technical assistance, strategic customer site validation, and ongoing financial support.

  10. Strategic directions of computing at Fermilab

    Science.gov (United States)

    Wolbers, Stephen

    1998-05-01

    Fermilab computing has changed a great deal over the years, driven by the demands of the Fermilab experimental community to record and analyze larger and larger datasets, by the desire to take advantage of advances in computing hardware and software, and by the advances coming from the R&D efforts of the Fermilab Computing Division. The strategic directions of Fermilab Computing continue to be driven by the needs of the experimental program. The current fixed-target run will produce over 100 TBytes of raw data and systems must be in place to allow the timely analysis of the data. The collider run II, beginning in 1999, is projected to produce of order 1 PByte of data per year. There will be a major change in methodology and software language as the experiments move away from FORTRAN and into object-oriented languages. Increased use of automation and the reduction of operator-assisted tape mounts will be required to meet the needs of the large experiments and large data sets. Work will continue on higher-rate data acquisition systems for future experiments and projects. R&D projects will be pursued as necessary to provide software, tools, or systems which cannot be purchased or acquired elsewhere. A closer working relation with other high energy laboratories will be pursued to reduce duplication of effort and to allow effective collaboration on many aspects of HEP computing.

  11. Strategic directions of computing at Fermilab

    International Nuclear Information System (INIS)

    Wolbers, S.

    1997-04-01

    Fermilab computing has changed a great deal over the years, driven by the demands of the Fermilab experimental community to record and analyze larger and larger datasets, by the desire to take advantage of advances in computing hardware and software, and by the advances coming from the R&D efforts of the Fermilab Computing Division. The strategic directions of Fermilab Computing continue to be driven by the needs of the experimental program. The current fixed-target run will produce over 100 TBytes of raw data and systems must be in place to allow the timely analysis of the data. The collider run II, beginning in 1999, is projected to produce of order 1 PByte of data per year. There will be a major change in methodology and software language as the experiments move away from FORTRAN and into object-oriented languages. Increased use of automation and the reduction of operator-assisted tape mounts will be required to meet the needs of the large experiments and large data sets. Work will continue on higher-rate data acquisition systems for future experiments and projects. R&D projects will be pursued as necessary to provide software, tools, or systems which cannot be purchased or acquired elsewhere. A closer working relation with other high energy laboratories will be pursued to reduce duplication of effort and to allow effective collaboration on many aspects of HEP computing.

  12. Accelerating Climate Simulations Through Hybrid Computing

    Science.gov (United States)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

    Unconventional multi-core processors (e.g., IBM Cell B.E. and NVIDIA GPU) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for the connection, we identified two challenges: (1) an identical MPI implementation is required on both systems; and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors and two IBM QS22 Cell blades, connected with InfiniBand), allowing compute-intensive functions to be seamlessly offloaded to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.
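
    The pattern at issue, a host model handing compute-intensive work to accelerator ranks over MPI, can be sketched as below. This is a generic illustration of the MPI-based coupling the authors contrast with IBM DAV, not DAV itself; it assumes mpi4py and a single MPI implementation shared by all nodes.

        # Rank 0 plays the host climate model; other ranks play accelerator workers.
        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()

        if rank == 0:
            # Fake "radiation columns" standing in for real model state.
            columns = np.random.rand(comm.Get_size() - 1, 64)
            for worker in range(1, comm.Get_size()):
                comm.Send(columns[worker - 1], dest=worker, tag=0)
            results = [comm.recv(source=w, tag=1) for w in range(1, comm.Get_size())]
            print("host collected", len(results), "offloaded results")
        else:
            data = np.empty(64)
            comm.Recv(data, source=0, tag=0)
            # Stand-in for the compute-intensive solar radiation calculation.
            comm.send(float(data.sum()), dest=0, tag=1)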

  13. Computing tools for accelerator design calculations

    International Nuclear Information System (INIS)

    Fischler, M.; Nash, T.

    1984-01-01

    This note is intended as a brief summary guide for accelerator designers to the new generation of commercial and special processors that allow great increases in computing cost-effectiveness. New thinking is required to take best advantage of these computing opportunities, particularly when moving from analytical approaches to tracking simulations. In this paper, we outline the relevant considerations.

  14. Analogue computer display of accelerator beam optics

    International Nuclear Information System (INIS)

    Brand, K.

    1984-01-01

    Analogue computers were used years ago by several authors for the design of magnetic beam handling systems. At Bochum, a small analogue/hybrid computer was combined with a particular analogue expansion and logic control unit for beam transport work. This apparatus was very successful in the design and setup of the beam handling system of the tandem accelerator. The center of the stripper canal was the object point for the calculations; instead of the high-energy acceleration tube, a drift length was inserted into the program, neglecting the weak focusing action of the tube. In the course of the installation of a second injector for heavy ions, it became necessary to do better calculations. A simple method was found to represent accelerating sections on the computer, and a particular way to simulate thin lenses was adopted. The analogue computer system proved its usefulness in the design and in studies of the characteristics of different accelerator installations over many years. The results of the calculations are in very good agreement with real accelerator data. The apparatus is an ideal tool for demonstrating beam optics to students and accelerator operators, since the effect of a change in any of the parameters is immediately visible on the oscilloscope.

  15. FPGA-accelerated simulation of computer systems

    CERN Document Server

    Angepat, Hari; Chung, Eric S.; Hoe, James C.

    2014-01-01

    To date, the most common simulators of computer systems are software-based, running on standard computers. One promising approach to improving simulation performance is to apply hardware, specifically reconfigurable hardware in the form of field programmable gate arrays (FPGAs). This manuscript describes various approaches to using FPGAs to accelerate software-implemented simulation of computer systems, together with selected simulators that incorporate those techniques. More precisely, we describe a simulation architecture taxonomy that incorporates a simulation architecture specifically designed f...

  16. Accelerating artificial intelligence with reconfigurable computing

    Science.gov (United States)

    Cieszewski, Radoslaw

    Reconfigurable computing is emerging as an important area of research in computer architectures and software systems. Many algorithms can be greatly accelerated by placing their computationally intensive portions into reconfigurable hardware. Reconfigurable computing combines many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible and can be changed over the lifetime of the system. Similar to an ASIC, reconfigurable systems provide a method to map circuits into hardware. Reconfigurable systems therefore have the potential to achieve far greater performance than software, as a result of bypassing the fetch-decode-execute cycle of traditional processors and possibly exploiting a greater level of parallelism. Artificial intelligence is one such field, with many different algorithms that can be accelerated. This paper presents example hardware implementations of Artificial Neural Networks, Genetic Algorithms and Expert Systems.

  17. Computer networks in future accelerator control systems

    International Nuclear Information System (INIS)

    Dimmler, D.G.

    1977-03-01

    Some findings of a study concerning a computer-based control and monitoring system for the proposed ISABELLE Intersecting Storage Accelerator are presented. Requirements for the development and implementation of such a system are discussed. An architecture is proposed in which the system components are partitioned along functional lines. Implementation of some conceptually significant components is reviewed.

  18. Computer simulation of dynamic processes on accelerators

    International Nuclear Information System (INIS)

    Kol'ga, V.V.

    1979-01-01

    The problems of computer-based numerical investigation of the motion of accelerated particles in accelerators and storage rings, the effect of different accelerator systems on that motion, and the determination of optimal characteristics of accelerated charged-particle beams are considered. Various simulation representations describing accelerated-particle dynamics are discussed, such as the enlarged-particle method, the representation in which a great number of discrete particles is substituted for a field of continuously distributed space charge, and the method based on determination of averaged beam characteristics. The procedure of numerical study is described for the basic problems, viz. calculation of closed orbits, establishment of stability regions, investigation of resonance propagation, determination of the phase-stability region, evaluation of the space-charge effect, and the problem of beam extraction. It is shown that most of these problems reduce to solution of the Cauchy problem on a computer. The ballistic method, which is applied to the solution of the boundary-value problem of beam extraction, is considered. It is shown that a general idea of the methods for regularization of ill-posed problems is the introduction into the equation under study of additional terms with a small positive regularization parameter.

  19. Computational needs for the RIA accelerator systems

    International Nuclear Information System (INIS)

    Ostroumov, P.N.; Nolen, J.A.; Mustapha, B.

    2006-01-01

    This paper discusses the computational needs for the full design and simulation of the RIA accelerator systems. Beam dynamics simulations are essential, first to define and optimize the architectural design of both the driver linac and the post-accelerator. They are also important for studying different design options and various off-normal modes in order to decide on the best-performing and most cost-effective design. Due to the high-intensity primary beams, the beam-stripper interaction is a source of both radioactivation and beam contamination, and should be carefully investigated and simulated for proper beam collimation and shielding. The target and fragment-separator areas also need very special attention in order to reduce any radiological hazards through careful shielding design. For all of these simulations, parallel computing is an absolute necessity.

  20. Applications of the Strategic Defense Initiative's Compact Accelerators

    National Research Council Canada - National Science Library

    Montanarelli, Nick

    1992-01-01

    ...) was recently incorporated into the design of a cancer therapy unit at the Loma Linda University Medical Center, an SDI sponsored compact induction linear accelerator may replace Cobalt 60 radiation...

  1. Cloud computing strategic framework (FY13 - FY15).

    Energy Technology Data Exchange (ETDEWEB)

    Arellano, Lawrence R.; Arroyo, Steven C.; Giese, Gerald J.; Cox, Philip M.; Rogers, G. Kelly

    2012-11-01

    This document presents an architectural framework (plan) and roadmap for the implementation of a robust Cloud Computing capability at Sandia National Laboratories. It is intended to be a living document and serve as the basis for detailed implementation plans, project proposals and strategic investment requests.

  2. Present SLAC accelerator computer control system features

    International Nuclear Information System (INIS)

    Davidson, V.; Johnson, R.

    1981-02-01

    The current functional organization and state of software development of the computer control system of the Stanford Linear Accelerator are described. Included is a discussion of the distribution of functions throughout the system, the local controller features, and currently implemented features of the touch panel portion of the system. The functional use of our triplex of PDP11-34 computers sharing common memory is described. Also included is a description of the use of pseudopanel tables as data tables for closed-loop control functions.

  3. GPU-accelerated computation of electron transfer.

    Science.gov (United States)

    Höfinger, Siegfried; Acocella, Angela; Pop, Sergiu C; Narumi, Tetsu; Yasuoka, Kenji; Beu, Titus; Zerbetto, Francesco

    2012-11-05

    Electron transfer is a fundamental process that can be studied with the help of computer simulation. The underlying quantum mechanical description renders the problem a computationally intensive application. In this study, we probe the graphics processing unit (GPU) for suitability to this type of problem. Time-critical components are identified via profiling of an existing implementation and several different variants are tested involving the GPU at increasing levels of abstraction. A publicly available library supporting basic linear algebra operations on the GPU turns out to accelerate the computation approximately 50-fold with minor dependence on actual problem size. The performance gain does not compromise numerical accuracy and is of significant value for practical purposes. Copyright © 2012 Wiley Periodicals, Inc.
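
    The record does not name the GPU linear-algebra library that was benchmarked, so the sketch below uses CuPy purely as a stand-in to show the offload pattern: move the operands to device memory, run the dense operation through GPU BLAS, and copy the result back. It assumes a CUDA-capable GPU with CuPy installed.

        import numpy as np
        import cupy as cp

        n = 2048
        a = np.random.rand(n, n)
        b = np.random.rand(n, n)

        c_gpu = cp.asarray(a) @ cp.asarray(b)    # GEMM runs on the GPU (cuBLAS)
        c = cp.asnumpy(c_gpu)                    # copy the result back to the host

        assert np.allclose(c, a @ b, atol=1e-6)  # same numerics as the CPU path

    As in the paper, the win comes from routing an identified hotspot through the GPU library while leaving the rest of the program untouched.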

  4. Computer codes for designing proton linear accelerators

    International Nuclear Information System (INIS)

    Kato, Takao

    1992-01-01

    Computer codes for designing proton linear accelerators are discussed from the viewpoint of not only designing but also construction and operation of the linac. The codes are divided into three categories according to their purposes: 1) design code, 2) generation and simulation code, and 3) electric and magnetic fields calculation code. The role of each category is discussed on the basis of experience at KEK (the design of the 40-MeV proton linac and its construction and operation, and the design of the 1-GeV proton linac). We introduce our recent work relevant to three-dimensional calculation and supercomputer calculation: 1) tuning of MAFIA (three-dimensional electric and magnetic fields calculation code) for supercomputer, 2) examples of three-dimensional calculation of accelerating structures by MAFIA, 3) development of a beam transport code including space charge effects. (author)

  5. Strategic engineering for cloud computing and big data analytics

    CERN Document Server

    Ramachandran, Muthu; Sarwar, Dilshad

    2017-01-01

    This book demonstrates the use of a wide range of strategic engineering concepts, theories and applied case studies to improve the safety, security and sustainability of complex and large-scale engineering and computer systems. It first details the concepts of system design, life cycle, impact assessment and security to show how these ideas can be brought to bear on the modeling, analysis and design of information systems, with a focused view on cloud-computing systems and big data analytics. This informative book is a valuable resource for graduate students, researchers and industry-based practitioners working in engineering, information and business systems as well as strategy.

  6. Tools for remote computing in accelerator control

    International Nuclear Information System (INIS)

    Anderssen, P.S.; Frammery, V.; Wilcke, R.

    1990-01-01

    In modern accelerator control systems, the intelligence of the equipment is distributed in the geographical and the logical sense. Control processes for a large variety of tasks reside in both the equipment and the control computers. Hence successful operation hinges on the availability and reliability of the communication infrastructure. The computers are interconnected by a communication system and use remote procedure calls and message passing for information exchange. These communication mechanisms need a well-defined convention, i.e. a protocol. They also require flexibility in both the setup and changes to the protocol specification. The network compiler is a tool which provides the programmer with a means of establishing such a protocol for his application. Input to the network compiler is a single interface description file provided by the programmer. This file is written according to a grammar, and completely specifies the interprocess communication interfaces. Passed through the network compiler, the interface description file automatically produces the additional source code needed for the protocol. Hence the programmer does not have to be concerned about the details of the communication calls. Any further additions and modifications are made easy, because all the information about the interface is kept in a single file. (orig.)

  7. Strategic Plan for a Scientific Cloud Computing infrastructure for Europe

    CERN Document Server

    Lengert, Maryline

    2011-01-01

    Here we present the vision, concept and direction for forming a European Industrial Strategy for a Scientific Cloud Computing Infrastructure to be implemented by 2020. This will be the framework for decisions and for securing support and approval in establishing, initially, an R&D European Cloud Computing Infrastructure that serves the needs of the European Research Area (ERA) and Space Agencies. This Cloud Infrastructure will have the potential, beyond this initial user base, to evolve to provide similar services to a broad range of customers including government and SMEs. We explain how this plan aims to support the broader strategic goals of our organisations and identify the benefits to be realised by adopting an industrial Cloud Computing model. We also outline the prerequisites and commitment needed to achieve these objectives.

  8. Symbolic mathematical computing: orbital dynamics and application to accelerators

    International Nuclear Information System (INIS)

    Fateman, R.

    1986-01-01

    Computer-assisted symbolic mathematical computation has become increasingly useful in applied mathematics. A brief introduction to such capabilities and some examples related to orbital dynamics and accelerator physics are presented. (author)

  9. Advanced Computing Tools and Models for Accelerator Physics

    International Nuclear Information System (INIS)

    Ryne, Robert; Ryne, Robert D.

    2008-01-01

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction, I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large-scale computing in accelerator physics.

  10. Strategic Cognitive Sequencing: A Computational Cognitive Neuroscience Approach

    Directory of Open Access Journals (Sweden)

    Seth A. Herd

    2013-01-01

    We address strategic cognitive sequencing, the “outer loop” of human cognition: how the brain decides what cognitive process to apply at a given moment to solve complex, multistep cognitive tasks. We argue that this topic has been neglected, for systematic reasons, relative to its importance, but that recent work on how individual brain systems accomplish their computations has set the stage for productively addressing how brain regions coordinate over time to accomplish our most impressive thinking. We present four preliminary neural network models. The first addresses how the prefrontal cortex (PFC) and basal ganglia (BG) cooperate to perform trial-and-error learning of short sequences; the second, how several areas of PFC learn to make predictions of likely reward, and how this contributes to the BG making decisions at the level of strategies. The third model addresses how PFC, BG, parietal cortex, and hippocampus can work together to memorize sequences of cognitive actions from instruction (or “self-instruction”). The last shows how a constraint-satisfaction process can find useful plans. The PFC maintains current and goal states and associates from both of these to find a “bridging” state, an abstract plan. We discuss how these processes could work together to produce strategic cognitive sequencing and discuss future directions in this area.

  11. Computer codes for beam dynamics analysis of cyclotronlike accelerators

    Science.gov (United States)

    Smirnov, V.

    2017-12-01

    Computer codes suitable for the study of beam dynamics in cyclotronlike accelerators (classical and isochronous cyclotrons, synchrocyclotrons, and fixed-field alternating-gradient machines) are reviewed. Computer modeling of cyclotron segments, such as the central zone, acceleration region, and extraction system, is considered. The author does not claim to give a full and detailed description of the methods and algorithms used in the codes. Special attention is paid to codes already proven and confirmed at existing accelerator facilities. Descriptions of programs prepared in well-known accelerator centers worldwide are provided, along with the basic features of the programs available to users and the limitations of their applicability.

  12. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Spentzouris, P.; /Fermilab; Cary, J.; /Tech-X, Boulder; McInnes, L.C.; /Argonne; Mori, W.; /UCLA; Ng, C.; /SLAC; Ng, E.; Ryne, R.; /LBL, Berkeley

    2011-11-14

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization...

  13. Advanced Computing for 21st Century Accelerator Science and Technology

    International Nuclear Information System (INIS)

    Dragt, Alex J.

    2004-01-01

    Dr. Dragt of the University of Maryland is one of the Institutional Principal Investigators for the SciDAC Accelerator Modeling Project 'Advanced Computing for 21st Century Accelerator Science and Technology', whose principal investigators are Dr. Kwok Ko (Stanford Linear Accelerator Center) and Dr. Robert Ryne (Lawrence Berkeley National Laboratory). This report covers the activities of Dr. Dragt while at Berkeley during spring 2002 and at Maryland during fall 2003.

  14. Berkeley Lab Computing Sciences: Accelerating Scientific Discovery

    International Nuclear Information System (INIS)

    Hules, John A.

    2008-01-01

    Scientists today rely on advances in computer science, mathematics, and computational science, as well as large-scale computing and networking facilities, to increase our understanding of ourselves, our planet, and our universe. Berkeley Lab's Computing Sciences organization researches, develops, and deploys new tools and technologies to meet these needs and to advance research in such areas as global climate change, combustion, fusion energy, nanotechnology, biology, and astrophysics

  15. High-performance computing in accelerating structure design and analysis

    International Nuclear Information System (INIS)

    Li Zenghai; Folwell, Nathan; Ge Lixin; Guetz, Adam; Ivanov, Valentin; Kowalski, Marc; Lee, Lie-Quan; Ng, Cho-Kuen; Schussman, Greg; Stingelin, Lukas; Uplenchwar, Ravindra; Wolf, Michael; Xiao, Liling; Ko, Kwok

    2006-01-01

    Future high-energy accelerators such as the Next Linear Collider (NLC) will accelerate multi-bunch beams of high current and low emittance to obtain high luminosity, which places stringent efficiency and beam-stability requirements on the accelerating structures. While numerical modeling has been quite standard in accelerator R&D, designing the NLC accelerating structure required a new simulation capability because of the geometric complexity and level of accuracy involved. Under the US DOE Advanced Computing initiatives (first the Grand Challenge and now SciDAC), SLAC has developed a suite of electromagnetic codes based on unstructured grids and utilizing high-performance computing to provide an advanced tool for modeling structures at accuracies and scales previously not possible. This paper will discuss the code development and computational science research (e.g. domain decomposition, scalable eigensolvers, adaptive mesh refinement) that have enabled the large-scale simulations needed for meeting the computational challenges posed by the NLC as well as projects such as the PEP-II and RIA. Numerical results will be presented to show how high-performance computing has made a qualitative improvement in accelerator structure modeling for these accelerators, either at the component level (single cell optimization) or on the scale of an entire structure (beam heating and long-range wakefields).
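
    As a toy stand-in for the eigenmode computations mentioned above (the SLAC codes solve far larger problems on unstructured grids), the sketch below assembles a 2-D Dirichlet Laplacian on a square "cavity" and extracts its lowest modes with a sparse shift-invert eigensolver; the grid size is arbitrary and the physics is heavily simplified.

        import numpy as np
        import scipy.sparse as sp
        from scipy.sparse.linalg import eigsh

        n = 60                                   # grid points per side (made up)
        I = sp.identity(n)
        T = sp.diags([2.0 * np.ones(n), -np.ones(n - 1), -np.ones(n - 1)],
                     [0, -1, 1])
        A = sp.kron(I, T) + sp.kron(T, I)        # 2-D Dirichlet Laplacian stencil

        # Shift-invert about zero finds the smallest eigenvalues (lowest "modes").
        vals, vecs = eigsh(A.tocsc(), k=4, sigma=0.0)
        print("lowest eigenvalues:", np.sort(vals))

    Real cavity solvers face the same core task, a large sparse eigenproblem, which is why scalable eigensolvers feature in the research program described above.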

  16. Software Accelerates Computing Time for Complex Math

    Science.gov (United States)

    2014-01-01

    Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphics processing unit (GPU) technology, traditionally used for computer video games, to develop high-performance computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.

  17. Community petascale project for accelerator science and simulation: Advancing computational science for future accelerators and accelerator technologies

    International Nuclear Information System (INIS)

    Spentzouris, P.; Cary, J.; McInnes, L.C.; Mori, W.; Ng, C.; Ng, E.; Ryne, R.

    2008-01-01

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  18. US DOE Grand Challenge in Computational Accelerator Physics

    International Nuclear Information System (INIS)

    Ryne, R.; Habib, S.; Qiang, J.; Ko, K.; Li, Z.; McCandless, B.; Mi, W.; Ng, C.; Saparov, M.; Srinivas, V.; Sun, Y.; Zhan, X.; Decyk, V.; Golub, G.

    1998-01-01

    Particle accelerators are playing an increasingly important role in basic and applied science, and are enabling new accelerator-driven technologies. But the design of next-generation accelerators, such as linear colliders and high intensity linacs, will require a major advance in numerical modeling capability due to extremely stringent beam control and beam loss requirements, and the presence of highly complex three-dimensional accelerator components. To address this situation, the U.S. Department of Energy has approved a 'Grand Challenge' in Computational Accelerator Physics, whose primary goal is to develop a parallel modeling capability that will enable high performance, large scale simulations for the design, optimization, and numerical validation of next-generation accelerators. In this paper we report on the status of the Grand Challenge.

  19. Computer-aided waste management strategic planning and analysis

    International Nuclear Information System (INIS)

    Avci, H.I.; Kotek, T.J.; Koebnick, B.L.

    1995-01-01

    A computational model called WASTE-MGMT has been developed to assist in the evaluation of alternative waste management approaches in a complex setting involving multiple sites, waste streams, and processing options. The model provides the quantities and characteristics of wastes processed at any facility or shipped between any two sites, as well as environmental emissions at any facility within the waste management system. The model input is defined by three types of fundamental waste management data: (1) waste inventories and characteristics at the point of generation; (2) treatment, storage, and disposal facility characteristics; and (3) definitions of alternative management approaches. The model has been successfully used in the preparation of the US Department of Energy (DOE) Environmental Management Programmatic Environmental Impact Statement (EM PEIS). Certain improvements are either being implemented or planned that would extend the usefulness and applicability of the WASTE-MGMT model beyond the EM PEIS and into strategic planning for the management of wastes under the responsibility of DOE or other agencies.

  20. Computational needs for modelling accelerator components

    International Nuclear Information System (INIS)

    Hanerfeld, H.

    1985-06-01

    The particle-in-cell code MASK is being used to model several different electron accelerator components. These studies are being used both to design new devices and to understand particle behavior within existing structures. Studies include the injector for the Stanford Linear Collider and the 50 megawatt klystron currently being built at SLAC. MASK is a 2D electromagnetic code which is being used by SLAC both on our own IBM 3081 and on the CRAY X-MP at the NMFECC. Our experience with running MASK illustrates the need for supercomputers to continue work of the kind described. 3 refs., 2 figs.

  1. Computer codes used in particle accelerator design: First edition

    International Nuclear Information System (INIS)

    1987-01-01

    This paper contains a listing of more than 150 programs that have been used in the design and analysis of accelerators. Given for each citation are the person to contact, the classification of the computer code, publications describing the code, the computer and language it runs on, and a short description of the code. Codes are indexed by subject, person to contact, and code acronym.

  2. GPU-accelerated micromagnetic simulations using cloud computing

    International Nuclear Information System (INIS)

    Jermain, C.L.; Rowlands, G.E.; Buhrman, R.A.; Ralph, D.C.

    2016-01-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services, with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics.

    Highlights:
    • The benefits of cloud computing for GPU-accelerated micromagnetics are examined.
    • We present the MuCloud software for running simulations on cloud computing.
    • Simulation run times are measured to benchmark cloud computing performance.
    • Comparison benchmarks are analyzed between CPU- and GPU-based solvers.
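
    A minimal sketch of the remote-execution idea, assuming SSH access to a cloud GPU instance with mumax3 installed; this is not the authors' MuCloud program, and the host name and paths below are hypothetical.

        import subprocess

        HOST = "ubuntu@gpu.example.com"          # hypothetical cloud GPU instance
        SCRIPT = "standard_problem_4.mx3"        # local MuMax3 input script

        # Copy the script up, run it on the remote GPU, and pull back the output
        # directory that MuMax3 writes next to the script.
        subprocess.run(["scp", SCRIPT, f"{HOST}:/tmp/"], check=True)
        subprocess.run(["ssh", HOST, f"mumax3 /tmp/{SCRIPT}"], check=True)
        out_dir = SCRIPT.replace(".mx3", ".out")
        subprocess.run(["scp", "-r", f"{HOST}:/tmp/{out_dir}", "."], check=True)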

  3. GPU-accelerated micromagnetic simulations using cloud computing

    Energy Technology Data Exchange (ETDEWEB)

    Jermain, C.L., E-mail: clj72@cornell.edu [Cornell University, Ithaca, NY 14853 (United States); Rowlands, G.E.; Buhrman, R.A. [Cornell University, Ithaca, NY 14853 (United States); Ralph, D.C. [Cornell University, Ithaca, NY 14853 (United States); Kavli Institute at Cornell, Ithaca, NY 14853 (United States)

    2016-03-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services, with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics.

    Highlights:
    • The benefits of cloud computing for GPU-accelerated micromagnetics are examined.
    • We present the MuCloud software for running simulations on cloud computing.
    • Simulation run times are measured to benchmark cloud computing performance.
    • Comparison benchmarks are analyzed between CPU- and GPU-based solvers.

  4. Computer-based training for particle accelerator personnel

    International Nuclear Information System (INIS)

    Silbar, R.R.

    1999-01-01

    A continuing problem at many laboratories is the training of new operators in the arcane technology of particle accelerators. Presently most of this training occurs on the job, under a mentor. Such training is expensive, and while it provides operational experience, it is frequently lax in providing the physics background needed to truly understand accelerator systems. Using computers in a self-paced, interactive environment can be more effective in meeting this training need. copyright 1999 American Institute of Physics

  5. Quantum Accelerators for High-performance Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S. [ORNL]; Britt, Keith A. [ORNL]; Mohiyaddin, Fahd A. [ORNL]

    2017-11-01

    We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenge the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We discuss a novel quantum-accelerator framework that uses specialized kernels to offload select workloads while integrating with existing computing infrastructure. We elaborate on the role of the host operating system in managing these unique accelerator resources, the prospects for deploying quantum modules, and the requirements placed on the language hierarchy connecting these different system components. We draw on recent advances in the modeling and simulation of quantum computing systems with the development of architectures for hybrid high-performance computing systems and the realization of software stacks for controlling quantum devices. Finally, we present simulation results that describe the expected system-level behavior of high-performance computing systems composed from compute nodes with quantum processing units. We describe performance for these hybrid systems in terms of time-to-solution, accuracy, and energy consumption, and we use simple application examples to estimate the performance advantage of quantum acceleration.
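
    The kernel-offload model described above can be caricatured as follows: the host program packages a self-contained quantum kernel and blocks on results from an accelerator backend. The backend here is a trivial classical stub standing in for a QPU interface; every name in this sketch is hypothetical, not an API from the paper.

        def quantum_backend(kernel):
            # Stand-in for a quantum processing unit: returns the counts a Bell
            # circuit would ideally produce instead of executing the circuit.
            return {bits: kernel["shots"] // 2 for bits in ("00", "11")}

        # The host side: declare a kernel, offload it, and wait for the result.
        bell_kernel = {"circuit": ["H 0", "CNOT 0 1"], "shots": 1024}
        counts = quantum_backend(bell_kernel)
        print("offloaded kernel returned:", counts)   # ~50/50 mix of 00 and 11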

  6. From experiment to design -- Fault characterization and detection in parallel computer systems using computational accelerators

    Science.gov (United States)

    Yim, Keun Soo

    This dissertation summarizes experimental validation and co-design studies conducted to optimize the fault detection capabilities and overheads in hybrid computer systems (e.g., using CPUs and Graphics Processing Units, or GPUs), and consequently to improve the scalability of parallel computer systems using computational accelerators. The experimental validation studies were conducted to help us understand the failure characteristics of CPU-GPU hybrid computer systems under various types of hardware faults. The main characterization targets were faults that are difficult to detect and/or recover from, e.g., faults that cause long latency failures (Ch. 3), faults in dynamically allocated resources (Ch. 4), faults in GPUs (Ch. 5), faults in MPI programs (Ch. 6), and microarchitecture-level faults with specific timing features (Ch. 7). The co-design studies were based on the characterization results. One of the co-designed systems has a set of source-to-source translators that customize and strategically place error detectors in the source code of target GPU programs (Ch. 5). Another co-designed system uses an extension card to learn the normal behavioral and semantic execution patterns of message-passing processes executing on CPUs, and to detect abnormal behaviors of those parallel processes (Ch. 6). The third co-designed system is a co-processor that has a set of new instructions in order to support software-implemented fault detection techniques (Ch. 7). The work described in this dissertation gains more importance because heterogeneous processors have become an essential component of state-of-the-art supercomputers. GPUs were used in three of the five fastest supercomputers that were operating in 2011. Our work included comprehensive fault characterization studies in CPU-GPU hybrid computers. In CPUs, we monitored the target systems for a long period of time after injecting faults (a temporally comprehensive experiment), and injected faults into various types of
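
    A much-simplified analogue of the first co-designed system's "strategically placed error detectors", written here as a Python decorator that range-checks a kernel's output. The bounds and kernel are invented for illustration; the dissertation's detectors are generated by source-to-source translation of GPU program code.

        def detect_range(lo, hi):
            # Wrap a numerical kernel with a detector derived from its expected
            # output bounds; out-of-range values signal a (possibly silent) fault.
            def wrap(fn):
                def checked(*args, **kwargs):
                    out = fn(*args, **kwargs)
                    if not (lo <= out <= hi):
                        raise RuntimeError(
                            f"{fn.__name__}: value {out} outside [{lo}, {hi}]")
                    return out
                return checked
            return wrap

        @detect_range(0.0, 1.0)
        def normalized_dot(xs, ys):
            s = sum(x * y for x, y in zip(xs, ys))
            n = (sum(x * x for x in xs) * sum(y * y for y in ys)) ** 0.5
            return s / n if n else 0.0

        print(normalized_dot([1.0, 2.0], [2.0, 4.0]))  # 1.0, passes the detector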

  7. The impact of new computer technology on accelerator control

    International Nuclear Information System (INIS)

    Theil, E.; Jacobson, V.; Paxson, V.

    1987-01-01

    This paper describes some recent developments in computing and stresses their application to accelerator control systems. Among the advances that promise to have a significant impact are: i) low cost scientific workstations; ii) the use of "windows", pointing devices and menus in a multitasking operating system; iii) high resolution large-screen graphics monitors; iv) new kinds of high bandwidth local area networks. The relevant features are related to a general accelerator control system. For example, the authors examine the implications of a computing environment which permits and encourages graphical manipulation of system components, rather than traditional access through the writing of programs or "canned" access via touch panels.

  8. The impact of new computer technology on accelerator control

    International Nuclear Information System (INIS)

    Theil, E.; Jacobson, V.; Paxson, V.

    1987-04-01

    This paper describes some recent developments in computing and stresses their application in accelerator control systems. Among the advances that promise to have a significant impact are (1) low cost scientific workstations; (2) the use of "windows", pointing devices and menus in a multi-tasking operating system; (3) high resolution large-screen graphics monitors; (4) new kinds of high bandwidth local area networks. The relevant features are related to a general accelerator control system. For example, this paper examines the implications of a computing environment which permits and encourages graphical manipulation of system components, rather than traditional access through the writing of programs or "canned" access via touch panels.

  9. Distributed computer controls for accelerator systems

    International Nuclear Information System (INIS)

    Moore, T.L.

    1988-09-01

    A distributed control system has been designed and installed at the Lawrence Livermore National Laboratory Multi-user Tandem Facility using an extremely modular approach in hardware and software. The two-tiered, geographically organized design allowed total system implementation within four months, with a computer and instrumentation cost of approximately $100K. Since the system structure is modular, application to a variety of facilities is possible. Such a system allows rethinking of the operational style of the facilities, making possible highly reproducible and unattended operation. The impact of industry standards, i.e., UNIX, CAMAC, and IEEE-802.3, and the use of a graphics-oriented controls software suite allowed the efficient implementation of the system. The definition, design, implementation, operation and total system performance will be discussed. 3 refs

  10. Distributed computer controls for accelerator systems

    Science.gov (United States)

    Moore, T. L.

    1989-04-01

    A distributed control system has been designed and installed at the Lawrence Livermore National Laboratory Multiuser Tandem Facility using an extremely modular approach in hardware and software. The two-tiered, geographically organized design allowed total system implementation within four months, with a computer and instrumentation cost of approximately $100k. Since the system structure is modular, application to a variety of facilities is possible. Such a system allows rethinking of the operational style of the facilities, making possible highly reproducible and unattended operation. The impact of industry standards, i.e., UNIX, CAMAC, and IEEE-802.3, and the use of a graphics-oriented controls software suite allowed the effective implementation of the system. The definition, design, implementation, operation and total system performance will be discussed.

  11. Distributed computer controls for accelerator systems

    International Nuclear Information System (INIS)

    Moore, T.L.

    1989-01-01

    A distributed control system has been designed and installed at the Lawrence Livermore National Laboratory Multiuser Tandem Facility using an extremely modular approach in hardware and software. The two-tiered, geographically organized design allowed total system implementation within four months, with a computer and instrumentation cost of approximately $100k. Since the system structure is modular, application to a variety of facilities is possible. Such a system allows rethinking of the operational style of the facilities, making possible highly reproducible and unattended operation. The impact of industry standards, i.e., UNIX, CAMAC, and IEEE-802.3, and the use of a graphics-oriented controls software suite allowed the effective implementation of the system. The definition, design, implementation, operation and total system performance will be discussed. (orig.)

  12. Computer simulations of compact toroid formation and acceleration

    International Nuclear Information System (INIS)

    Peterkin, R.E. Jr.; Sovinec, C.R.

    1990-01-01

    Experiments to form, accelerate, and focus compact toroid plasmas will be performed on the 9.4 MJ SHIVA STAR fast capacitor bank at the Air Force Weapons Laboratory during 1990. The MARAUDER (magnetically accelerated rings to achieve ultrahigh directed energy and radiation) program is a research effort to accelerate magnetized plasma rings with masses between 0.1 and 1.0 mg to velocities above 10^8 cm/sec and energies above 1 MJ. Research on these high-velocity compact toroids may lead to the development of very fast opening switches, high-power microwave sources, and an alternative path to inertial confinement fusion. Design of a compact toroid accelerator experiment on the SHIVA STAR capacitor bank is underway, and computer simulations with the 2-1/2-dimensional magnetohydrodynamics code MACH2 have been performed to guide this endeavor. The compact toroids are produced in a magnetized coaxial plasma gun, and the acceleration will occur in a configuration similar to a coaxial railgun. Detailed calculations of the formation and equilibration of a low-beta, force-free magnetic configuration (curl B = kB) have been performed with MACH2. In this paper, the authors discuss computer simulations of the focusing and acceleration of the toroid.
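
    For reference, the force-free condition quoted in the abstract implies a vanishing Lorentz force: with curl B = kB (constant k, the linear force-free case), in LaTeX notation,

        \nabla \times \mathbf{B} = k\,\mathbf{B}
        \quad\Rightarrow\quad
        \mathbf{J} \times \mathbf{B}
        = \frac{1}{\mu_0}\,(\nabla \times \mathbf{B}) \times \mathbf{B}
        = \frac{k}{\mu_0}\,\mathbf{B} \times \mathbf{B} = 0,

    so the magnetic field exerts no net force on the plasma current, which is why such configurations are natural candidates for rings that must hold together while being accelerated.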

  13. Neural computation and particle accelerators research, technology and applications

    CERN Document Server

    D'Arras, Horace

    2010-01-01

    This book discusses neural computation, networks or circuits of biological neurons, and, relatedly, particle accelerators, scientific instruments which accelerate charged particles such as protons, electrons and deuterons. Accelerators have a very broad range of applications in many industrial fields, from high energy physics to medical isotope production. Nuclear technology is one of the fields discussed in this book. The development that has been reached by particle accelerators in energy and particle intensity has opened the possibility of a wide number of new applications in nuclear technology. This book reviews the applications in the nuclear energy field, and the design features of high power neutron sources are explained. Surface treatments of niobium flat samples and superconducting radio frequency cavities by a new technique called gas cluster ion beam are also studied in detail, as well as the process of electropolishing. Furthermore, magnetic devices such as solenoids, dipoles and undulators, which ...

  14. The computer-based control system of the NAC accelerator

    International Nuclear Information System (INIS)

    Burdzik, G.F.; Bouckaert, R.F.A.; Cloete, I.; Du Toit, J.S.; Kohler, I.H.; Truter, J.N.J.; Visser, K.

    1982-01-01

    The National Accelerator Centre (NAC) of the CSIR is building a two-stage accelerator which will provide charged-particle beams for use in medical and research applications. The control system for this accelerator is based on three minicomputers and a CAMAC interfacing network. Closed-loop control is being relegated to the various subsystems of the accelerator, and the computers and CAMAC network will be used in the first instance for data transfer, monitoring and servicing of the control consoles. The processing power of the computers will be utilized for automating start-up and beam-change procedures, for providing flexible and convenient information at the control consoles, for fault diagnosis and for beam-optimizing procedures. Tasks of a localized or dedicated nature are being off-loaded onto microcomputers, which are used either in front-end devices or as slaves to the minicomputers. On the control consoles only a few instruments for setting and monitoring variables are provided, but these instruments are universally linkable to any appropriate machine variable.

  15. Accelerating Climate and Weather Simulations through Hybrid Computing

    Science.gov (United States)

    Zhou, Shujia; Cruz, Carlos; Duffy, Daniel; Tucker, Robert; Purcell, Mark

    2011-01-01

    Unconventional multi- and many-core processors (e.g. IBM (R) Cell B.E.(TM) and NVIDIA (R) GPU) have emerged as effective accelerators in trial climate and weather simulations. Yet these climate and weather models typically run on parallel computers with conventional processors (e.g. Intel, AMD, and IBM) using Message Passing Interface. To address challenges involved in efficiently and easily connecting accelerators to parallel computers, we investigated using IBM's Dynamic Application Virtualization (TM) (IBM DAV) software in a prototype hybrid computing system with representative climate and weather model components. The hybrid system comprises two Intel blades and two IBM QS22 Cell B.E. blades, connected with both InfiniBand(R) (IB) and 1-Gigabit Ethernet. The system significantly accelerates a solar radiation model component by offloading compute-intensive calculations to the Cell blades. Systematic tests show that IBM DAV can seamlessly offload compute-intensive calculations from Intel blades to Cell B.E. blades in a scalable, load-balanced manner. However, noticeable communication overhead was observed, mainly due to IP over the IB protocol. Full utilization of IB Sockets Direct Protocol and the lower latency production version of IBM DAV will reduce this overhead.

  16. Accelerating Neuroimage Registration through Parallel Computation of Similarity Metric.

    Directory of Open Access Journals (Sweden)

    Yun-Gang Luo

    Full Text Available Neuroimage registration is crucial for brain morphometric analysis and treatment efficacy evaluation. However, existing advanced registration algorithms such as FLIRT and ANTs are not efficient enough for clinical use. In this paper, a GPU implementation of FLIRT with the correlation ratio (CR) as the similarity metric and a GPU-accelerated correlation coefficient (CC) calculation for the symmetric diffeomorphic registration of ANTs have been developed. The comparison with their corresponding original tools shows that our accelerated algorithms can greatly outperform the originals in terms of computational efficiency. This paper demonstrates the great potential of applying these registration tools in clinical applications.
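
    As an illustration of the kind of similarity metric being ported to the GPU, the sketch below computes a correlation coefficient between two images stored as flat voxel arrays; the function name and layout are hypothetical, and a GPU version would split the accumulation loop across threads and combine partial sums with a parallel reduction.

```c
#include <math.h>
#include <stddef.h>

/* Pearson correlation coefficient between two images given as flat
 * voxel arrays of length n. On a GPU, the accumulation loop is the
 * part distributed across threads, finished with a parallel
 * reduction; the sequential version below shows the arithmetic. */
double correlation_coefficient(const float *a, const float *b, size_t n)
{
    double sa = 0.0, sb = 0.0, saa = 0.0, sbb = 0.0, sab = 0.0;
    for (size_t i = 0; i < n; ++i) {
        sa  += a[i];
        sb  += b[i];
        saa += (double)a[i] * a[i];
        sbb += (double)b[i] * b[i];
        sab += (double)a[i] * b[i];
    }
    double cov = sab - sa * sb / (double)n;   /* n times covariance  */
    double va  = saa - sa * sa / (double)n;   /* n times variance(a) */
    double vb  = sbb - sb * sb / (double)n;   /* n times variance(b) */
    return cov / sqrt(va * vb);
}
```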

  17. Quantum computing accelerator I/O : LDRD 52750 final report

    International Nuclear Information System (INIS)

    Schroeppel, Richard Crabtree; Modine, Normand Arthur; Ganti, Anand; Pierson, Lyndon George; Tigges, Christopher P.

    2003-01-01

    In a superposition of quantum states, a bit can be in both the states '0' and '1' at the same time. This feature of the quantum bit or qubit has no parallel in classical systems. Currently, quantum computers consisting of 4 to 7 qubits in a 'quantum computing register' have been built. Innovative algorithms suited to quantum computing are now beginning to emerge, applicable to sorting, cryptanalysis, and other applications. A framework for overcoming slightly inaccurate quantum gate interactions and for causing quantum states to survive interactions with the surrounding environment is emerging, called quantum error correction. Thus there is the potential for rapid advances in this field. Although quantum information processing can be applied to secure communication links (quantum cryptography) and to crack conventional cryptosystems, the first few computing applications will likely involve a 'quantum computing accelerator' similar to a 'floating point arithmetic accelerator' interfaced to a conventional von Neumann computer architecture. This research develops a roadmap for applying Sandia's capabilities to the solution of some of the problems associated with maintaining quantum information, and with getting data into and out of such a 'quantum computing accelerator'. We propose to focus this work on 'quantum I/O technologies' by applying quantum optics on semiconductor nanostructures to leverage Sandia's expertise in semiconductor microelectronic/photonic fabrication techniques, as well as its expertise in information theory, processing, and algorithms. The work will be guided by an understanding of the practical requirements of computing and communication architectures. This effort will incorporate ongoing collaboration between 9000, 6000 and 1000 and between junior and senior personnel. Follow-on work to fabricate and evaluate appropriate experimental nano/microstructures will be proposed as a result of this work.

  18. GPU acceleration of Dock6's Amber scoring computation.

    Science.gov (United States)

    Yang, Hailong; Zhou, Qiongqiong; Li, Bo; Wang, Yongjian; Luan, Zhongzhi; Qian, Depei; Li, Hanlu

    2010-01-01

    Addressing the problem of virtual screening is a long-term goal in the drug discovery field, which, if properly solved, can significantly shorten new drugs' R&D cycle. The scoring functionality that evaluates the fitness of the docking result is one of the major challenges in virtual screening. In general, scoring functionality in docking requires a large amount of floating-point calculations, which usually takes several weeks or even months to finish. This time-consuming procedure is unacceptable, especially when a highly fatal and infectious virus such as SARS or H1N1 arises, which forces the scoring task to be done in a limited time. This paper presents how to leverage the computational power of the GPU to accelerate Dock6's (http://dock.compbio.ucsf.edu/DOCK_6/) Amber (J. Comput. Chem. 25: 1157-1174, 2004) scoring with the NVIDIA CUDA (NVIDIA Corporation Technical Staff, Compute Unified Device Architecture - Programming Guide, NVIDIA Corporation, 2008) (Compute Unified Device Architecture) platform. We also discuss many factors that greatly influence the performance after porting the Amber scoring to the GPU, including thread management, data transfer, and divergence hiding. Our experiments show that the GPU-accelerated Amber scoring achieves a 6.5× speedup with respect to the original version running on an AMD dual-core CPU for the same problem size. This acceleration makes the Amber scoring more competitive and efficient for large-scale virtual screening problems.

  19. 2-D and 3-D computations of curved accelerator magnets

    International Nuclear Information System (INIS)

    Turner, L.R.

    1991-01-01

    In order to save computer memory, a long accelerator magnet may be computed by treating the long central region and the end regions separately. The dipole magnets for the injector synchrotron of the Advanced Photon Source (APS), now under construction at Argonne National Laboratory (ANL), employ magnet iron consisting of parallel laminations, stacked with a uniform radius of curvature of 33.379 m. Laplace's equation for the magnetic scalar potential has a different form for a straight magnet (x-y coordinates), a magnet with surfaces curved about a common center (r-θ coordinates), and a magnet with parallel laminations like the APS injector dipole. Yet pseudo 2-D computations for the three geometries give basically identical results, even for a much more strongly curved magnet. Hence 2-D (x-y) computations of the central region and 3-D computations of the end regions can be combined to determine the overall magnetic behavior of the magnets. 1 ref., 6 figs
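
    For reference, the two coordinate forms of Laplace's equation contrasted in the abstract are the standard 2-D ones (textbook forms, with phi the magnetic scalar potential):

```latex
% Straight magnet (x-y coordinates)
\frac{\partial^{2}\phi}{\partial x^{2}} + \frac{\partial^{2}\phi}{\partial y^{2}} = 0
\qquad
% Magnet curved about a common center (r-\theta coordinates)
\frac{1}{r}\frac{\partial}{\partial r}\!\left(r\,\frac{\partial\phi}{\partial r}\right)
+ \frac{1}{r^{2}}\frac{\partial^{2}\phi}{\partial\theta^{2}} = 0
```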

  20. COMPASS, the COMmunity Petascale project for Accelerator Science and Simulation, a broad computational accelerator physics initiative

    International Nuclear Information System (INIS)

    Cary, J.R.; Spentzouris, P.; Amundson, J.; McInnes, L.; Borland, M.; Mustapha, B.; Ostroumov, P.; Wang, Y.; Fischer, W.; Fedotov, A.; Ben-Zvi, I.; Ryne, R.; Esarey, E.; Geddes, C.; Qiang, J.; Ng, E.; Li, S.; Ng, C.; Lee, R.; Merminga, L.; Wang, H.; Bruhwiler, D.L.; Dechow, D.; Mullowney, P.; Messmer, P.; Nieter, C.; Ovtchinnikov, S.; Paul, K.; Stoltz, P.; Wade-Stein, D.; Mori, W.B.; Decyk, V.; Huang, C.K.; Lu, W.; Tzoufras, M.; Tsung, F.; Zhou, M.; Werner, G.R.; Antonsen, T.; Katsouleas, T.; Morris, B.

    2007-01-01

    Accelerators are the largest and most costly scientific instruments of the Department of Energy, with uses across a broad range of science, including colliders for particle physics and nuclear science and light sources and neutron sources for materials studies. COMPASS, the Community Petascale Project for Accelerator Science and Simulation, is a broad, four-office (HEP, NP, BES, ASCR) effort to develop computational tools for the prediction and performance enhancement of accelerators. The tools being developed can be used to predict the dynamics of beams in the presence of optical elements and space charge forces, to calculate the electromagnetic modes and wake fields of cavities and the cooling induced by comoving beams, and to model the acceleration of beams by intense fields in plasmas generated by beams or lasers. In SciDAC-1, the computational tools had multiple successes in predicting the dynamics of beams and beam generation. In SciDAC-2 these tools will be petascale-enabled to allow the inclusion of an unprecedented level of physics for detailed prediction.

  1. Scientific Computing Strategic Plan for the Idaho National Laboratory

    International Nuclear Information System (INIS)

    Whiting, Eric Todd

    2015-01-01

    Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory's (INL's) challenge and charge, and is central to INL's ongoing success. Computing is an essential part of INL's future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.

  2. Personal computer control system for small size tandem accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Takayama, Hiroshi; Kawano, Kazuhiro; Shinozaki, Masataka [Nissin - High Voltage Co. Ltd., Kyoto (Japan)]

    1996-12-01

    Because an analysis apparatus based on a tandem accelerator has many control parameters, a conventional control panel carries numerous control elements, which makes the panel complex and degrades its operability. To remedy these faults, a control system using a personal computer was designed and developed for the control panel, which had previously been built mainly from conventional hardware parts. Its principal characteristics are as follows: (1) The control panel construction is simpler and more compact, because using a personal computer as the man-machine interface reduces the hardware on the panel surface to the minimum required. (2) Control is faster, because the accelerator system is divided into blocks, a local station of the sequencer network is installed at each block, and sequence control is closed within each block. (3) Expandability is greater, because a new beamline can be added by connecting another sequencer local station to the network and updating the computer's display, with little modification of the existing hardware. And (4) the control system is cheaper, because the personal computer lowers the investment and makes programming easier. (G.K.)

  3. Strategic Planning for Computer-Based Educational Technology.

    Science.gov (United States)

    Bozeman, William C.

    1984-01-01

    Offers educational practitioners direction for the development of a master plan for the implementation and application of computer-based educational technology by briefly examining computers in education, discussing organizational change from a theoretical perspective, and presenting an overview of the planning strategy known as the planning and…

  4. Fast acceleration of 2D wave propagation simulations using modern computational accelerators.

    Directory of Open Access Journals (Sweden)

    Wei Wang

    Full Text Available Recent developments in modern computational accelerators like Graphics Processing Units (GPUs) and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as the Intel Many Integrated Core (MIC) coprocessor to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved more than a 150x speedup over the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation provided speedups on GPUs of at least 200x over the sequential implementation and 30x over a parallelized OpenMP implementation. An implementation of OpenMP on the Intel MIC coprocessor provided speedups of 120x with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieve parallelization of 2D cardiac wave simulations on GPUs. Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other
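
    To give a flavor of what "only a few OpenACC pragmas" means in practice, here is a minimal, hypothetical sketch of an explicit 2D stencil update of the kind that dominates wave-propagation solvers; the grid, the names, and the simple 5-point Laplacian are illustrative assumptions, not the cardiac action potential model used in the paper.

```c
#define NX 512
#define NY 512

/* Global grids: current field u and updated field unew. */
static float u[NX][NY], unew[NX][NY];

/* One explicit time step of a diffusion-like 2D update. The single
 * OpenACC directive asks the compiler to build a GPU kernel for the
 * nested loop and to manage data movement; everything else is plain
 * C. Boundary cells are assumed to be handled elsewhere. */
void step(float dt, float h)
{
    float c = dt / (h * h);
    #pragma acc parallel loop collapse(2) copyin(u) copyout(unew)
    for (int i = 1; i < NX - 1; ++i) {
        for (int j = 1; j < NY - 1; ++j) {
            /* 5-point Laplacian stencil */
            float lap = u[i-1][j] + u[i+1][j] + u[i][j-1] + u[i][j+1]
                      - 4.0f * u[i][j];
            unew[i][j] = u[i][j] + c * lap;
        }
    }
}
```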

  5. Strategic Analysis of Autodesk and the Move to Cloud Computing

    OpenAIRE

    Kewley, Kathleen

    2012-01-01

    This paper provides an analysis of the opportunity for Autodesk to move its core technology to a cloud delivery model. Cloud computing offers clients a number of advantages, such as lower costs for computer hardware, increased access to technology and greater flexibility. With the IT industry embracing this transition, software companies need to plan for future change and lead with innovative solutions. Autodesk is in a unique position to capitalize on this market shift, as it is the leader i...

  6. A Quantitative Study of the Relationship between Leadership Practice and Strategic Intentions to Use Cloud Computing

    Science.gov (United States)

    Castillo, Alan F.

    2014-01-01

    The purpose of this quantitative correlational cross-sectional research study was to examine a theoretical model consisting of leadership practice, attitudes of business process outsourcing, and strategic intentions of leaders to use cloud computing and to examine the relationships between each of the variables respectively. This study…

  7. Towards full automation of accelerators through computer control

    CERN Document Server

    Gamble, J; Kemp, D; Keyser, R; Koutchouk, Jean-Pierre; Martucci, P P; Tausch, Lothar A; Vos, L

    1980-01-01

    The computer control system of the Intersecting Storage Rings (ISR) at CERN has always laid emphasis on two particular operational aspects, the first being the reproducibility of machine conditions and the second that of giving the operators the possibility to work in terms of machine parameters such as the tune. Already certain phases of the operation are optimized by the control system, whilst others are automated with a minimum of manual intervention. The authors describe this present control system with emphasis on the existing automated facilities and the features of the control system which make it possible. It then discusses the steps needed to completely automate the operational procedure of accelerators. (7 refs).

  8. Towards full automation of accelerators through computer control

    International Nuclear Information System (INIS)

    Gamble, J.; Hemery, J.-Y.; Kemp, D.; Keyser, R.; Koutchouk, J.-P.; Martucci, P.; Tausch, L.; Vos, L.

    1980-01-01

    The computer control system of the Intersecting Storage Rings (ISR) at CERN has always laid emphasis on two particular operational aspects, the first being the reproducibility of machine conditions and the second that of giving the operators the possibility to work in terms of machine parameters such as the tune. Already certain phases of the operation are optimized by the control system, whilst others are automated with a minimum of manual intervention. The paper describes this present control system with emphasis on the existing automated facilities and the features of the control system which make it possible. It then discusses the steps needed to completely automate the operational procedure of accelerators. (Auth.)

  9. Accelerating MATLAB with GPU computing a primer with examples

    CERN Document Server

    Suh, Jung W

    2013-01-01

    Beyond simulation and algorithm development, many developers increasingly use MATLAB even for product deployment in computationally heavy fields. This often demands that MATLAB codes run faster by leveraging the distributed parallelism of Graphics Processing Units (GPUs). While MATLAB successfully provides high-level functions as a simulation tool for rapid prototyping, the underlying details and knowledge needed for utilizing GPUs make MATLAB users hesitate to step into it. Accelerating MATLAB with GPUs offers a primer on bridging this gap. Starting with the basics, setting up MATLAB for

  10. Strategic Implications of Cloud Computing for Modeling and Simulation (Briefing)

    Science.gov (United States)

    2016-04-01

    Briefing highlights (slide excerpts): quick deployment; easier scaling of services; scalability; rapid development, deployment, and change management; agility; efficiency; high…; resource pooling (outsourcing computing infrastructure, reducing or eliminating the need for organizations to offer extensive IT services); measured…

  11. A Study on Strategic Provisioning of Cloud Computing Services

    Directory of Open Access Journals (Sweden)

    Md Whaiduzzaman

    2014-01-01

    Full Text Available Cloud computing is currently emerging as an ever-changing, growing paradigm that models “everything-as-a-service.” Virtualised physical resources, infrastructure, and applications are supplied by service provisioning in the cloud. The evolution in the adoption of cloud computing is driven by clear and distinct promising features for both cloud users and cloud providers. However, the increasing number of cloud providers and the variety of service offerings have made it difficult for the customers to choose the best services. By employing successful service provisioning, the essential services required by customers, such as agility and availability, pricing, security and trust, and user metrics can be guaranteed by service provisioning. Hence, continuous service provisioning that satisfies the user requirements is a mandatory feature for the cloud user and vitally important in cloud computing service offerings. Therefore, we aim to review the state-of-the-art service provisioning objectives, essential services, topologies, user requirements, necessary metrics, and pricing mechanisms. We synthesize and summarize different provision techniques, approaches, and models through a comprehensive literature review. A thematic taxonomy of cloud service provisioning is presented after the systematic review. Finally, future research directions and open research issues are identified.

  12. A study on strategic provisioning of cloud computing services.

    Science.gov (United States)

    Whaiduzzaman, Md; Haque, Mohammad Nazmul; Rejaul Karim Chowdhury, Md; Gani, Abdullah

    2014-01-01

    Cloud computing is currently emerging as an ever-changing, growing paradigm that models "everything-as-a-service." Virtualised physical resources, infrastructure, and applications are supplied by service provisioning in the cloud. The evolution in the adoption of cloud computing is driven by clear and distinct promising features for both cloud users and cloud providers. However, the increasing number of cloud providers and the variety of service offerings have made it difficult for the customers to choose the best services. By employing successful service provisioning, the essential services required by customers, such as agility and availability, pricing, security and trust, and user metrics can be guaranteed by service provisioning. Hence, continuous service provisioning that satisfies the user requirements is a mandatory feature for the cloud user and vitally important in cloud computing service offerings. Therefore, we aim to review the state-of-the-art service provisioning objectives, essential services, topologies, user requirements, necessary metrics, and pricing mechanisms. We synthesize and summarize different provision techniques, approaches, and models through a comprehensive literature review. A thematic taxonomy of cloud service provisioning is presented after the systematic review. Finally, future research directions and open research issues are identified.

  13. Accelerators and Beams, multimedia computer-based training in accelerator physics

    International Nuclear Information System (INIS)

    Silbar, R.R.; Browman, A.A.; Mead, W.C.; Williams, R.A.

    1999-01-01

    We are developing a set of computer-based tutorials on accelerators and charged-particle beams under an SBIR grant from the DOE. These self-paced, interactive tutorials, available for Macintosh and Windows platforms, use multimedia techniques to enhance the user's rate of learning and length of retention of the material. They integrate interactive On-Screen Laboratories, hypertext, line drawings, photographs, two- and three-dimensional animations, video, and sound. They target a broad audience, from undergraduates or technicians to professionals. Presently, three modules have been published (Vectors, Forces, and Motion), a fourth (Dipole Magnets) has been submitted for review, and three more exist in prototype form (Quadrupoles, Matrix Transport, and Properties of Charged-Particle Beams). Participants in the poster session will have the opportunity to try out these modules on a laptop computer. copyright 1999 American Institute of Physics

  14. ''Accelerators and Beams,'' multimedia computer-based training in accelerator physics

    International Nuclear Information System (INIS)

    Silbar, R. R.; Browman, A. A.; Mead, W. C.; Williams, R. A.

    1999-01-01

    We are developing a set of computer-based tutorials on accelerators and charged-particle beams under an SBIR grant from the DOE. These self-paced, interactive tutorials, available for Macintosh and Windows platforms, use multimedia techniques to enhance the user's rate of learning and length of retention of the material. They integrate interactive ''On-Screen Laboratories,'' hypertext, line drawings, photographs, two- and three-dimensional animations, video, and sound. They target a broad audience, from undergraduates or technicians to professionals. Presently, three modules have been published (Vectors, Forces, and Motion), a fourth (Dipole Magnets) has been submitted for review, and three more exist in prototype form (Quadrupoles, Matrix Transport, and Properties of Charged-Particle Beams). Participants in the poster session will have the opportunity to try out these modules on a laptop computer

  15. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    International Nuclear Information System (INIS)

    Smith, P.R.; Sarfaty, R.

    1993-01-01

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan are the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used worldwide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper will describe and define CM elements, and discuss how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of the CASE tool provides a methodology for consistency in approach, graphics, and database capability combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than those stated above. Some examples are supporting a joint application development group (JAD) to prepare a software functional specification document and, if necessary, providing the capability to automatically generate software application code. This paper briefly discusses characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables.

  16. Strategic flexibility in computational estimation for Chinese- and Canadian-educated adults.

    Science.gov (United States)

    Xu, Chang; Wells, Emma; LeFevre, Jo-Anne; Imbo, Ineke

    2014-09-01

    The purpose of the present study was to examine factors that influence strategic flexibility in computational estimation for Chinese- and Canadian-educated adults. Strategic flexibility was operationalized as the percentage of trials on which participants chose the problem-based procedure that best balanced proximity to the correct answer with simplification of the required calculation. For example, on 42 × 57, the optimal problem-based solution is 40 × 60 because 2,400 is closer to the exact answer 2,394 than is 40 × 50 or 50 × 60. In Experiment 1 (n = 50), where participants had free choice of estimation procedures, Chinese-educated participants were more likely to choose the optimal problem-based procedure (80% of trials) than Canadian-educated participants (50%). In Experiment 2 (n = 48), participants had to choose 1 of 3 solution procedures. They showed moderate strategic flexibility that was equal across groups (60%). In Experiment 3 (n = 50), participants were given the same 3 procedure choices as in Experiment 2 but different instructions and explicit feedback. When instructed to respond quickly, both groups showed moderate strategic flexibility as in Experiment 2 (60%). When instructed to respond as accurately as possible or to balance speed and accuracy, they showed very high strategic flexibility (greater than 90%). These findings suggest that solvers will show very different levels of strategic flexibility in response to instructions, feedback, and problem characteristics and that these factors interact with individual differences (e.g., arithmetic skills, nationality) to produce variable response patterns.

  17. Computer codes and methods for simulating accelerator driven systems

    International Nuclear Information System (INIS)

    Sartori, E.; Byung Chan Na

    2003-01-01

    A large set of computer codes and associated data libraries have been developed by nuclear research and industry over the past half century. A large number of them are in the public domain and can be obtained under agreed conditions from different Information Centres. The areas covered comprise: basic nuclear data and models, reactor spectra and cell calculations, static and dynamic reactor analysis, criticality, radiation shielding, dosimetry and material damage, fuel behaviour, safety and hazard analysis, heat conduction and fluid flow in reactor systems, spent fuel and waste management (handling, transportation, and storage), economics of fuel cycles, impact on the environment of nuclear activities etc. These codes and models have been developed mostly for critical systems used for research or power generation and other technological applications. Many of them have not been designed for accelerator driven systems (ADS), but with competent use, they can be used for studying such systems or can form the basis for adapting existing methods to the specific needs of ADSs. The present paper describes the types of methods, codes and associated data available and their role in the applications. It provides Web addresses for facilitating searches for such tools. Some indications are given of the effects of inappropriate or 'blind' application of existing tools to ADS. Reference is made to available experimental data that can be used for validating the methods' use. Finally, some international activities linked to the different computational aspects are described briefly. (author)

  18. Modern computer networks and distributed intelligence in accelerator controls

    International Nuclear Information System (INIS)

    Briegel, C.

    1991-01-01

    Appropriate hardware and software network protocols are surveyed for accelerator control environments. Accelerator controls network topologies are discussed with respect to the following criteria: vertical versus horizontal and distributed versus centralized. Decision-making considerations are provided for accelerator network architecture specification. Current trends and implementations at Fermilab are discussed

  19. Computation of Normal Conducting and Superconducting Linear Accelerator (LINAC) Availabilities

    International Nuclear Information System (INIS)

    Haire, M.J.

    2000-01-01

    A brief study was conducted to roughly estimate the availability of a superconducting (SC) linear accelerator (LINAC) as compared to a normal conducting (NC) one. Potentially, SC radio frequency cavities have substantial reserve capability, which allows them to compensate for failed cavities, thus increasing the availability of the overall LINAC. In the initial SC design, there is a klystron and associated equipment (e.g., power supply) for every cavity of an SC LINAC. On the other hand, a single klystron may service eight cavities in the NC LINAC. This study modeled that portion of the Spallation Neutron Source LINAC (between 200 and 1,000 MeV) that is initially proposed for conversion from NC to SC technology. Equipment common to both designs was not evaluated. Tabular fault-tree calculations and computer event-driven simulation (EDS) computations were performed. The estimated gain in availability when using the SC option ranges from 3 to 13%, depending on equipment configuration and spatial separation requirements. The availability of an NC LINAC is estimated to be 83%. Tabular fault-tree calculations and computer EDS modeling gave the same 83% answer to within one-tenth of a percent for the NC case. Tabular fault-tree calculations of the availability of the SC LINAC (where a klystron and associated equipment drive a single cavity) give 97%, whereas EDS computer calculations give 96%, a disagreement of only 1%. This result may be somewhat fortuitous because of limitations of tabular fault-tree calculations. For example, tabular fault-tree calculations cannot handle spatial effects (separation distance between failures), equipment network configurations, and some failure combinations. EDS computer modeling of various equipment configurations was examined. When there is a klystron and associated equipment for every cavity, and an adjacent-cavity failure can be tolerated, the SC availability was estimated to be 96%. SC availability decreased as
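
    The arithmetic behind such comparisons can be sketched with standard availability formulas (textbook background, not the report's actual fault-tree model): components that must all work multiply their availabilities, while a redundant group in which any k of n identical components suffice does much better, which is the reserve-capability effect described above.

```latex
% Series system of n required components, each with availability A_i
A_{\text{series}} = \prod_{i=1}^{n} A_i ,
\qquad
% Redundant group: any k of n identical components (availability A) suffice
A_{k\text{-of-}n} = \sum_{j=k}^{n} \binom{n}{j}\, A^{j} (1-A)^{\,n-j}
```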

  20. Strengthening Deterrence for 21st Century Strategic Conflicts and Competition: Accelerating Adaptation and Integration - Annotated Bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Durkalec, J. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2017-11-01

    This was the fourth in a series of annual events convened at Livermore to explore the emerging place of the “new domains” in U.S. deterrence strategies. The purposes of the series are to facilitate the emergence of a community of interest that cuts across the policy, military, and technical communities and to inform laboratory strategic planning. U.S. allies have also been drawn into the conversation, as U.S. deterrence strategies are in part about their protection. Discussion in these workshops is on a not-for-attribution basis. It also makes no use of classified information. On this occasion, there were nearly 100 participants from a dozen countries.

  1. Recent Improvements to CHEF, a Framework for Accelerator Computations

    Energy Technology Data Exchange (ETDEWEB)

    Ostiguy, J.-F.; Michelotti, L.P.; /Fermilab

    2009-05-01

    CHEF is a body of software dedicated to accelerator-related computations. It consists of a hierarchical set of libraries and a stand-alone application based on the latter. The implementation language is C++; the code makes extensive use of templates and modern idioms such as iterators, smart pointers and generalized function objects. CHEF has been described in a few contributions at previous conferences. In this paper, we provide an overview and discuss recent improvements. Formally, CHEF refers to two distinct but related things: (1) a set of class libraries; and (2) a stand-alone application based on these libraries. The application makes use of and exposes a subset of the capabilities provided by the libraries. CHEF has its ancestry in efforts started in the early nineties. At that time, A. Dragt, E. Forest [2] and others showed that ring dynamics can be formulated in a way that puts maps, rather than Hamiltonians, into a central role. Automatic differentiation (AD) techniques, which were just coming of age, were a natural fit in a context where maps are represented by their Taylor approximations. The initial vision, which CHEF carried over, was to develop a code that (1) concurrently supports conventional tracking and linear and non-linear map-based techniques, (2) avoids 'hardwired' approximations that are not under user control, and (3) provides building blocks for applications. C++ was adopted as the implementation language because of its comprehensive support for operator overloading and the equal status it confers to built-in and user-defined data types. It should be mentioned that acceptance of AD techniques in accelerator science owes much to the pioneering work of Berz [1], who implemented, in Fortran, the first production-quality AD engine (the foundation for the code COSY). Nowadays other engines are available, but few are native C++ implementations. Although AD engines and map-based techniques are making their way into more traditional codes, e.g. [5

  2. Recent Improvements to CHEF, a Framework for Accelerator Computations

    International Nuclear Information System (INIS)

    Ostiguy, J.-F.; Michelotti, L.P.

    2009-01-01

    CHEF is a body of software dedicated to accelerator-related computations. It consists of a hierarchical set of libraries and a stand-alone application based on the latter. The implementation language is C++; the code makes extensive use of templates and modern idioms such as iterators, smart pointers and generalized function objects. CHEF has been described in a few contributions at previous conferences. In this paper, we provide an overview and discuss recent improvements. Formally, CHEF refers to two distinct but related things: (1) a set of class libraries; and (2) a stand-alone application based on these libraries. The application makes use of and exposes a subset of the capabilities provided by the libraries. CHEF has its ancestry in efforts started in the early nineties. At that time, A. Dragt, E. Forest [2] and others showed that ring dynamics can be formulated in a way that puts maps, rather than Hamiltonians, into a central role. Automatic differentiation (AD) techniques, which were just coming of age, were a natural fit in a context where maps are represented by their Taylor approximations. The initial vision, which CHEF carried over, was to develop a code that (1) concurrently supports conventional tracking and linear and non-linear map-based techniques, (2) avoids 'hardwired' approximations that are not under user control, and (3) provides building blocks for applications. C++ was adopted as the implementation language because of its comprehensive support for operator overloading and the equal status it confers to built-in and user-defined data types. It should be mentioned that acceptance of AD techniques in accelerator science owes much to the pioneering work of Berz [1], who implemented, in Fortran, the first production-quality AD engine (the foundation for the code COSY). Nowadays other engines are available, but few are native C++ implementations. Although AD engines and map-based techniques are making their way into more traditional codes, e.g. [5], it is also

  3. Cloud Computing (SaaS) Adoption as a Strategic Technology: Results of an Empirical Study

    Directory of Open Access Journals (Sweden)

    Pedro R. Palos-Sanchez

    2017-01-01

    Full Text Available The present study empirically analyzes the factors that determine the adoption of the cloud computing (SaaS) model in firms where its adoption is considered strategic for executing their activity. A research model has been developed to evaluate the factors that influence the intention to use cloud computing; it combines the variables found in the technology acceptance model (TAM) with other external variables such as top management support, training, communication, organization size, and technological complexity. Data compiled from 150 companies in Andalusia (Spain) are used to test the formulated hypotheses. The results of this study reflect what critical factors should be considered and how they are interrelated. They also show the organizational demands that must be considered by those companies wishing to implement a real management model adapted to the digital economy, especially those related to cloud computing.

  4. The control computer for the Chalk River electron test accelerator

    International Nuclear Information System (INIS)

    McMichael, G.E.; Fraser, J.S.; McKeown, J.

    1978-02-01

    A versatile control and data acquisition system has been developed for a modest-sized linear accelerator using mainly process I/O hardware and software. This report describes the evolution of the present system since 1972, the modifications needed to satisfy the changing requirements of the various accelerator physics experiments and the limitations of such a system in process control. (author)

  5. Computer applications: Automatic control system for high-voltage accelerator

    International Nuclear Information System (INIS)

    Bryukhanov, A.N.; Komissarov, P.Yu.; Lapin, V.V.; Latushkin, S.T.; Fomenko, D.E.; Yudin, L.I.

    1992-01-01

    An automatic control system for a high-voltage electrostatic accelerator with an accelerating potential of up to 500 kV is described. The electronic apparatus on the high-voltage platform is controlled and monitored by means of a fiber-optic data-exchange system. The system is based on CAMAC modules that are controlled by a microprocessor crate controller. Data on accelerator operation are represented and control instructions are issued by means of an alphanumeric terminal. 8 refs., 6 figs

  6. Computer codes for particle accelerator design and analysis: A compendium. Second edition

    International Nuclear Information System (INIS)

    Deaven, H.S.; Chan, K.C.D.

    1990-05-01

    The design of the next generation of high-energy accelerators will probably be done as an international collaborative effort, and it would make sense to establish, either formally or informally, an international center for accelerator codes with branches for maintenance, distribution, and consultation at strategically located accelerator centers around the world. This arrangement could have at least three beneficial effects. It would cut down duplication of effort, provide long-term support for the best codes, and provide a stimulating atmosphere for the evolution of new codes. It does not take much foresight to see that the natural evolution of accelerator design codes is toward the development of so-called Expert Systems, systems capable of taking design specifications of future accelerators and producing specifications for optimized magnetic transport and acceleration components, making a layout, and giving a fairly impartial cost estimate. Such an expert program would use present-day programs such as TRANSPORT, POISSON, and SUPERFISH as tools in the optimization process. Such a program would also serve to codify the experience of two generations of accelerator designers before it is lost as these designers reach retirement age. This document describes 203 codes that originate from 10 countries and are currently in use. The authors feel that this compendium will contribute to the dialogue supporting the international collaborative effort that is taking place in the field of accelerator physics today.

  7. Computer Based Dose Control System on Linear Accelerator

    International Nuclear Information System (INIS)

    Taxwim; Djoko-SP; Widi-Setiawan; Agus-Budi Wiyatna

    2000-01-01

    The accelerator technology has been used for radiotherapy. Dokter Karyadi Hospital in Semarang uses an electron or X-ray linear accelerator (Linac) for cancer therapy. One of the control parameters of a linear accelerator is the dose rate, i.e., the particle current or photon rate delivered to the target. The dose rate in the Linac has been controlled by adjusting the repetition rate of the anode pulse train of the electron source. Presently the control is still proportional control. To enhance the quality of the control result (minimal stationary error, speed and stability), a dose control system has been designed using the PID (Proportional Integral Differential) control algorithm and a derivation of the transfer function of the controlled object. The PID control system is implemented by giving it the dose error (the difference between the output dose and the dose rate set point) as input. The output of the control system is used to correct the repetition rate set point of the electron source anode pulse train. (author)
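
    A minimal sketch of the discrete PID loop described above, under stated assumptions: the gains, the sample period, and all names are hypothetical, and the real controller is of course tuned from the derived transfer function of the Linac hardware.

```c
/* Discrete PID controller. Input: dose error (set point minus
 * measured dose rate). Output: correction to the repetition-rate
 * set point of the electron-source anode pulse train. */
typedef struct {
    double kp, ki, kd;   /* proportional, integral, derivative gains */
    double integral;     /* accumulated error                        */
    double prev_error;   /* error at the previous sample             */
} pid_state;

double pid_update(pid_state *c, double error, double dt)
{
    c->integral += error * dt;
    double derivative = (error - c->prev_error) / dt;
    c->prev_error = error;
    return c->kp * error + c->ki * c->integral + c->kd * derivative;
}

/* Hypothetical control cycle:
 *   error = dose_rate_setpoint - measured_dose_rate;
 *   rep_rate_setpoint += pid_update(&ctrl, error, dt);  */
```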

  8. Cloud Computing and Validated Learning for Accelerating Innovation in IoT

    Science.gov (United States)

    Suciu, George; Todoran, Gyorgy; Vulpe, Alexandru; Suciu, Victor; Bulca, Cristina; Cheveresan, Romulus

    2015-01-01

    Innovation in Internet of Things (IoT) requires more than just creation of technology and use of cloud computing or big data platforms. It requires accelerated commercialization or aptly called go-to-market processes. To successfully accelerate, companies need a new type of product development, the so-called validated learning process.…

  9. Computing requirements for S.S.C. accelerator design and studies

    International Nuclear Information System (INIS)

    Dragt, A.; Talman, R.; Siemann, R.; Dell, G.F.; Leemann, B.; Leemann, C.; Nauenberg, U.; Peggs, S.; Douglas, D.

    1984-01-01

    We estimate the computational hardware resources that will be required for accelerator physics studies during the design of the Superconducting SuperCollider. It is found that both Class IV and Class VI facilities (1) will be necessary. We describe a user environment for these facilities that is desirable within the context of accelerator studies. An acquisition scenario for these facilities is presented

  10. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  11. The Extrapolation-Accelerated Multilevel Aggregation Method in PageRank Computation

    Directory of Open Access Journals (Sweden)

    Bing-Yuan Pu

    2013-01-01

    Full Text Available An accelerated multilevel aggregation method is presented for calculating the stationary probability vector of an irreducible stochastic matrix in PageRank computation, with a vector extrapolation method as its accelerator. We show how to periodically combine the extrapolation method with the multilevel aggregation method on the finest level to speed up the PageRank computation. Detailed numerical results are given to illustrate the behavior of this method, and comparisons with the typical methods are also made.
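
    To make the idea of periodically combining power iteration with vector extrapolation concrete, here is a hedged toy sketch: a dense PageRank iteration step plus a componentwise Aitken delta-squared extrapolation applied every few steps. This illustrates generic vector extrapolation only; it is not the paper's multilevel aggregation scheme, and all names are hypothetical.

```c
#include <math.h>
#include <stddef.h>

/* One power-iteration step for PageRank on a dense row-stochastic
 * link matrix P (n x n, row-major): x_new = d * P^T x + (1 - d)/n,
 * where d is the damping factor. */
void pagerank_step(size_t n, const double *P, const double *x,
                   double *x_new, double d)
{
    for (size_t j = 0; j < n; ++j) {
        double s = 0.0;
        for (size_t i = 0; i < n; ++i)
            s += P[i * n + j] * x[i];
        x_new[j] = d * s + (1.0 - d) / (double)n;
    }
}

/* Componentwise Aitken delta-squared extrapolation from three
 * successive iterates x0, x1, x2. The result should be renormalized
 * to sum to 1 before iterating further. */
void aitken_extrapolate(size_t n, const double *x0, const double *x1,
                        const double *x2, double *x_acc)
{
    for (size_t i = 0; i < n; ++i) {
        double d1 = x1[i] - x0[i];
        double d2 = x2[i] - 2.0 * x1[i] + x0[i];
        x_acc[i] = (fabs(d2) > 1e-14) ? x0[i] - d1 * d1 / d2 : x2[i];
    }
}
```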

  12. Biomaterials and computation: a strategic alliance to investigate emergent responses of neural cells.

    Science.gov (United States)

    Sergi, Pier Nicola; Cavalcanti-Adam, Elisabetta Ada

    2017-03-28

    Topographical and chemical cues drive migration, outgrowth and regeneration of neurons in different and crucial biological conditions. In the natural extracellular matrix, their influences are so closely coupled that they result in complex cellular responses. As a consequence, engineered biomaterials are widely used to simplify in vitro conditions, disentangling intricate in vivo behaviours, and narrowing the investigation on particular emergent responses. Nevertheless, how topographical and chemical cues affect the emergent response of neural cells is still unclear, thus in silico models are used as additional tools to reproduce and investigate the interactions between cells and engineered biomaterials. This work aims at presenting the synergistic use of biomaterials-based experiments and computation as a strategic way to promote the discovering of complex neural responses as well as to allow the interactions between cells and biomaterials to be quantitatively investigated, fostering a rational design of experiments.

  13. Shredder: GPU-Accelerated Incremental Storage and Computation

    OpenAIRE

    Bhatotia, Pramod; Rodrigues, Rodrigo; Verma, Akshat

    2012-01-01

    Redundancy elimination using data deduplication and incremental data processing has emerged as an important technique to minimize storage and computation requirements in data center computing. In this paper, we present the design, implementation and evaluation of Shredder, a high performance content-based chunking framework for supporting incremental storage and computation systems. Shredder exploits the massively parallel processing power of GPUs to overcome the CPU bottlenecks of content-ba...

  14. Quantum Accelerators for High-Performance Computing Systems

    OpenAIRE

    Britt, Keith A.; Mohiyaddin, Fahd A.; Humble, Travis S.

    2017-01-01

    We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenge the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We discuss a novel quantu...

  15. Mathematical model of accelerator output characteristics and their calculation on a computer

    International Nuclear Information System (INIS)

    Mishulina, O.A.; Ul'yanina, M.N.; Kornilova, T.V.

    1975-01-01

    A mathematical model of the output characteristics of a linear accelerator is described. The model is a system of differential equations. A specific feature of the problem formulation is the presence of phase limitations, which makes it possible to ensure higher simulation accuracy and to determine a capture coefficient. An algorithm for computing the output characteristics is elaborated on the basis of the suggested mathematical model. The output characteristics of the accelerator are the capture coefficient, the coordinate expectation characterizing the average phase of the beam particles, the coordinate expectation characterizing the average reverse relative velocity of the beam particles, and the dispersions of these coordinates. The methods for calculating the accelerator output characteristics are described in detail. The computations have been performed on the BESM-6 computer, with a computing time of 2 min 20 sec per set of characteristics. The relative error of the parameter computation averages 10^-2.

  16. Reactor and /or accelerator: general remarks on strategic considerations in sourcing/producing radiopharmaceuticals and radiotracer for the Philippines

    International Nuclear Information System (INIS)

    Nazarea, A.D.

    1996-01-01

    The most important sources of radionuclides in the world are particle accelerators and nuclear reactors. Since the late 1940's, many radiotracers and radiopharmaceuticals have been conceived, designed, produced and applied in important industrial and clinical/biomedical settings. For example, in the health area, reactor-produced radionuclides have become indispensable for diagnostic imaging involving, in its most recent and advanced development, radioimmunoscintigraphy, which exploits the exquisite ligand-specificity of monoclonal antibodies, reagents which in turn are the products of advances in biotechnology. Thus far, one of the most indispensable radiopharmaceuticals has been 99mTc, which is usually obtained as a daughter decay product of 99Mo. In January 1991, some questions about the stability of the worldwide commercial supply of 99Mo became highlighted when the major commercial world producer of 99Mo, Nordion International, shut down its facilities temporarily in Canada due to contamination in its main reactor building (see for instance the relevant newsbrief in J. Nuclear Medicine (1991): 'Industry agrees to join DOE study of domestic moly-99 production'). With the above background, my remarks will attempt to open discussions on strategic considerations relevant to questions of 'self-reliance' in radiotracer/radiopharmaceutical production in the Philippines: for instance, the question of sourcing local radionuclide needs from a fully functioning multipurpose cyclotron facility within the country that would then supply the needs of the local industrial, biomedical (including research) and health sectors, and possibly, eventually, acquiring the capability to export longer-lived radiotracers and radiopharmaceuticals to nearby countries.

  17. Cloud computing approaches to accelerate drug discovery value chain.

    Science.gov (United States)

    Garg, Vibhav; Arora, Suchir; Gupta, Chitra

    2011-12-01

    Continued advancements in the area of technology have helped high throughput screening (HTS) evolve from a linear to a parallel approach by performing system-level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data of the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze this data to identify informational tags. This need again poses challenges to computer scientists to offer matching hardware and software infrastructure, while managing the varying degree of desired computational power. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SaaS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as Cloud Computing, is now transforming drug discovery research. Also, integration of Cloud computing with parallel computing is certainly expanding its footprint in the life sciences community. The speed, efficiency and cost effectiveness have made cloud computing a 'good to have' tool for researchers, providing them significant flexibility and allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, a Discovery-Cloud would fit best to manage drug discovery and clinical development data generated using advanced HTS techniques, hence supporting the vision of personalized medicine.

  18. Lua(Jit) for computing accelerator beam physics

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    As mentioned in the 2nd developers meeting, I would like to open the debate with a special presentation on another language - Lua - and a tremendous technology - LuaJIT. Lua is much less known at CERN, but it is very simple, much smaller than Python, and its JIT is extremely performant. The language is a dynamic scripting language, easy to learn and easy to embed in applications. I will show how we use it in HPC for accelerator beam physics as a replacement for C, C++, Fortran and Python, with some benchmarks versus Python, PyPy and C/C++.

  19. Computer automation of an accelerator mass spectrometry system

    International Nuclear Information System (INIS)

    Gressett, J.D.; Maxson, D.L.; Matteson, S.; McDaniel, F.D.; Duggan, J.L.; Mackey, H.J.; North Texas State Univ., Denton, TX; Anthony, J.M.

    1989-01-01

    The determination of trace impurities in electronic materials using accelerator mass spectrometry (AMS) requires efficient automation of the beam transport and mass discrimination hardware. The ability to choose between a variety of charge states, isotopes and injected molecules is necessary to provide survey capabilities similar to that available on conventional mass spectrometers. This paper will discuss automation hardware and software for flexible, high-sensitivity trace analysis of electronic materials, e.g. Si, GaAs and HgCdTe. Details regarding settling times will be presented, along with proof-of-principle experimental data. Potential and present applications will also be discussed. (orig.)
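
A survey loop of the kind described, stepping through charge states and isotopes while honoring magnet settling times, might look like the following sketch; set_magnet, count_ions and the settling constant are hypothetical placeholders, not the paper's actual control interface.

```python
import time

# Hypothetical sketch of an automated AMS survey loop. set_magnet(),
# count_ions() and SETTLE_S stand in for the real control interface.
SETTLE_S = 0.5  # magnet settling time in seconds (instrument dependent)

def set_magnet(field_tesla: float) -> None:
    pass  # placeholder: write the setpoint to the analyzing-magnet supply

def count_ions(dwell_s: float) -> int:
    time.sleep(dwell_s)
    return 0   # placeholder: read the detector for dwell_s seconds

def survey(settings, dwell_s=1.0):
    """Step through (species, field) pairs, honoring the settling time."""
    results = {}
    for species, field in settings:
        set_magnet(field)
        time.sleep(SETTLE_S)               # wait for the field to settle
        results[species] = count_ions(dwell_s)
    return results

print(survey([("28Si+", 0.52), ("69Ga+", 0.81)], dwell_s=0.1))
```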

  20. Computational acceleration for MR image reconstruction in partially parallel imaging.

    Science.gov (United States)

    Ye, Xiaojing; Chen, Yunmei; Huang, Feng

    2011-05-01

    In this paper, we present a fast numerical algorithm for solving total variation and l1 (TVL1) based image reconstruction with application to partially parallel magnetic resonance imaging. Our algorithm uses a variable splitting method to reduce computational cost. Moreover, the Barzilai-Borwein step size selection method is adopted in our algorithm for much faster convergence. Experimental results on clinical partially parallel imaging data demonstrate that the proposed algorithm requires far fewer iterations and/or less computational cost than the recently developed operator splitting and Bregman operator splitting methods, which can deal with a general sensing matrix in the reconstruction framework, to achieve similar or even better quality of reconstructed images.
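
For readers unfamiliar with the Barzilai-Borwein (BB) rule mentioned above, the sketch below shows the BB1 step size inside a plain gradient descent on a toy quadratic; the paper applies the same rule inside its TVL1 splitting scheme, which is not reproduced here.

```python
import numpy as np

# Gradient descent with Barzilai-Borwein (BB1) step sizes on a toy quadratic.
def bb_gradient_descent(grad, x0, iters=100):
    x_prev, g_prev = x0, grad(x0)
    x = x0 - 1e-3 * g_prev            # small fixed first step
    for _ in range(iters):
        g = grad(x)
        s, y = x - x_prev, g - g_prev # iterate and gradient differences
        sy = s @ y
        if sy == 0:                   # converged (or degenerate) -> stop
            break
        alpha = (s @ s) / sy          # BB1 step size
        x_prev, g_prev = x, g
        x = x - alpha * g
    return x

A = np.diag([1.0, 10.0, 100.0])       # ill-conditioned test problem
b = np.ones(3)
x_star = bb_gradient_descent(lambda x: A @ x - b, np.zeros(3))
print(np.round(x_star, 6))            # expected: [1.0, 0.1, 0.01]
```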

  1. Accelerators

    CERN Multimedia

    CERN. Geneva

    2001-01-01

    The talk summarizes the principles of particle acceleration and addresses problems related to storage rings like LEP and LHC. Special emphasis will be given to orbit stability, long term stability of the particle motion, collective effects and synchrotron radiation.

  2. Computer control of large accelerators design concepts and methods

    International Nuclear Information System (INIS)

    Beck, F.; Gormley, M.

    1984-05-01

    Unlike most of the specialities treated in this volume, control system design is still an art, not a science. These lectures are an attempt to produce a primer for prospective practitioners of this art. A large modern accelerator requires a comprehensive control system for commissioning, machine studies and day-to-day operation. Faced with the requirement to design a control system for such a machine, the control system architect has a bewildering array of technical devices and techniques at his disposal, and it is our aim in the following chapters to lead him through the characteristics of the problems he will have to face and the practical alternatives available for solving them. We emphasize good system architecture using commercially available hardware and software components, but in addition we discuss the actual control strategies which are to be implemented since it is at the point of deciding what facilities shall be available that the complexity of the control system and its cost are implicitly decided. 19 references

  3. Advanced Computational Models for Accelerator-Driven Systems

    International Nuclear Information System (INIS)

    Talamo, A.; Ravetto, P.; Gudowski, W.

    2012-01-01

    In the nuclear engineering scientific community, Accelerator Driven Systems (ADSs) have been proposed and investigated for the transmutation of nuclear waste, especially plutonium and minor actinides. These fuels have a quite low effective delayed neutron fraction relative to uranium fuel; therefore, the subcriticality of the core offers a unique safety feature with respect to critical reactors. The intrinsic safety of an ADS allows the elimination of operational control rods; the reactivity excess during burnup can instead be managed by the intensity of the proton beam, fuel shuffling, and possibly by burnable poisons. However, the intrinsic safety of a subcritical system does not guarantee that ADSs are immune from severe accidents (core melting), since the decay heat of an ADS is very similar to that of a critical system. Normally, ADSs operate with an effective multiplication factor between 0.98 and 0.92, which means that the spallation neutron source contributes little to the neutron population. In addition, for 1 GeV incident protons and a lead-bismuth target, about 50% of the spallation neutrons have energies below 1 MeV and only 15% have energies above 3 MeV. In light of these remarks, the transmutation performance of an ADS is very close to that of a critical reactor.
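
The quoted k_eff range implies how strongly the external spallation source is amplified. A minimal point estimate uses the standard source multiplication factor M = 1/(1 - k_eff); the values below are illustrative only.

```python
# Point estimate of source multiplication in a subcritical core.
def source_multiplication(k_eff: float) -> float:
    """Average number of fission-chain neutrons per external source neutron."""
    assert k_eff < 1.0, "formula applies to subcritical systems only"
    return 1.0 / (1.0 - k_eff)

# The abstract quotes k_eff between 0.92 and 0.98 for typical ADS operation.
for k in (0.92, 0.95, 0.98):
    print(f"k_eff = {k:.2f}  ->  M = {source_multiplication(k):5.1f}")
```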

  4. Computer control of large accelerators design concepts and methods

    Energy Technology Data Exchange (ETDEWEB)

    Beck, F.; Gormley, M.

    1984-05-01

    Unlike most of the specialities treated in this volume, control system design is still an art, not a science. These lectures are an attempt to produce a primer for prospective practitioners of this art. A large modern accelerator requires a comprehensive control system for commissioning, machine studies and day-to-day operation. Faced with the requirement to design a control system for such a machine, the control system architect has a bewildering array of technical devices and techniques at his disposal, and it is our aim in the following chapters to lead him through the characteristics of the problems he will have to face and the practical alternatives available for solving them. We emphasize good system architecture using commercially available hardware and software components, but in addition we discuss the actual control strategies which are to be implemented since it is at the point of deciding what facilities shall be available that the complexity of the control system and its cost are implicitly decided. 19 references.

  5. X-BAND LINEAR COLLIDER R and D IN ACCELERATING STRUCTURES THROUGH ADVANCED COMPUTING

    International Nuclear Information System (INIS)

    Li, Z

    2004-01-01

    This paper describes a major computational effort that addresses key design issues in the high gradient accelerating structures for the proposed X-band linear collider, GLC/NLC. Supported by the US DOE's Accelerator Simulation Project, SLAC is developing a suite of parallel electromagnetic codes based on unstructured grids for modeling RF structures with higher accuracy and on a scale previously not possible. The new simulation tools have played an important role in the R and D of X-Band accelerating structures, in cell design, wakefield analysis and dark current studies

  6. Automated and Assistive Tools for Accelerated Code migration of Scientific Computing on to Heterogeneous MultiCore Systems

    Science.gov (United States)

    2017-04-13

    AFRL-AFOSR-UK-TR-2017-0029: final report on the effort 'Automated and Assistive Tools for Accelerated Code migration of Scientific Computing on to Heterogeneous MultiCore Systems', awarded in 2012, performance period through 01/25/2015.

  7. Gravitational Acceleration Effects on Macrosegregation: Experiment and Computational Modeling

    Science.gov (United States)

    Leon-Torres, J.; Curreri, P. A.; Stefanescu, D. M.; Sen, S.

    1999-01-01

    Experiments were performed under terrestrial gravity (1 g) and during parabolic flights (10^-2 g) to study the solidification and macrosegregation patterns of Al-Cu alloys. Alloys having 2% and 5% Cu were solidified against a chill at two different cooling rates. Microscopic and electron microprobe characterization was used to produce microstructural and macrosegregation maps. In all cases, positive segregation occurred next to the chill because of shrinkage flow, as expected. This positive segregation was higher in the low-g samples, apparently because of the higher heat transfer coefficient. A 2-D computational model was used to explain the experimental results. The continuum formulation was employed to describe the macroscopic transport of mass, energy, and momentum associated with the solidification phenomena for a two-phase system. The model considers that liquid flow is driven by thermal and solutal buoyancy and by solidification shrinkage. The solidification event was divided into two stages. In the first, the liquid containing freely moving equiaxed grains was described through the relative viscosity concept. In the second, when a fixed dendritic network had formed after dendritic coherency, the mushy zone was treated as a porous medium. The macrosegregation maps and cooling curves obtained during the experiments were used for validation of the solidification and segregation model. The model can explain the solidification and macrosegregation patterns and the differences between low- and high-gravity results.

  8. Improving Quality of Service and Reducing Power Consumption with WAN accelerator in Cloud Computing Environments

    OpenAIRE

    Shin-ichi Kuribayashi

    2013-01-01

    The widespread use of cloud computing services is expected to degrade Quality of Service and to increase the power consumption of ICT devices, since the distance to a server becomes longer than before. Migration of virtual machines over a wide area can solve many problems such as load balancing and power saving in cloud computing environments. This paper proposes to dynamically apply a WAN accelerator within the network when a virtual machine is moved to a distant center, in order to preve...

  9. 3-D computations and measurements of accelerator magnets for the APS

    International Nuclear Information System (INIS)

    Turner, L.R.; Kim, S.H.; Kim, K.

    1993-01-01

    The Advanced Photon Source (APS), now under construction at Argonne National Laboratory (ANL), requires dipole, quadrupole, sextupole, and corrector magnets for each of its circular accelerator systems. Three-dimensional (3-D) field computations are needed to eliminate unwanted multipole fields from the ends of long quadrupole and dipole magnets and to guarantee that the flux levels in the poles of short magnets will not cause saturation. Measurements of the magnets show good agreement with the computations

  10. Command, Control, Communication, Computers and Information Technology (C4&IT). Strategic Plan, FY2008 - 2012

    National Research Council Canada - National Science Library

    2008-01-01

    ...&IT)/CG-6, Chief Information Officer (CIO), for the Coast Guard publishes this C4&IT Strategic Plan. The purpose of this plan is to provide a unifying strategy to better integrate and synchronize Coast Guard C4...

  11. Modeling Strategic Use of Human Computer Interfaces with Novel Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Laura Jane Mariano

    2015-07-01

    Full Text Available Immersive software tools are virtual environments designed to give their users an augmented view of real-world data and ways of manipulating that data. As virtual environments, every action users make while interacting with these tools can be carefully logged, as can the state of the software and the information it presents to the user, giving these actions context. This data provides a high-resolution lens through which dynamic cognitive and behavioral processes can be viewed. In this report, we describe new methods for the analysis and interpretation of such data, utilizing a novel implementation of the Beta Process Hidden Markov Model (BP-HMM) for analysis of software activity logs. We further report the results of a preliminary study designed to establish the validity of our modeling approach. A group of 20 participants were asked to play a simple computer game, instrumented to log every interaction with the interface. Participants had no previous experience with the game’s functionality or rules, so the activity logs collected during their naïve interactions capture patterns of exploratory behavior and skill acquisition as they attempted to learn the rules of the game. Pre- and post-task questionnaires probed for self-reported styles of problem solving, as well as task engagement, difficulty, and workload. We jointly modeled the activity log sequences collected from all participants using the BP-HMM approach, identifying a global library of activity patterns representative of the collective behavior of all the participants. Analyses show systematic relationships between both pre- and post-task questionnaires, self-reported approaches to analytic problem solving, and metrics extracted from the BP-HMM decomposition. Overall, we find that this novel approach to decomposing unstructured behavioral data within software environments provides a sensible means for understanding how users learn to integrate software functionality for strategic
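
The BP-HMM itself is a research method without a standard library implementation; as a simplified stand-in, the sketch below fits an ordinary Gaussian HMM to concatenated activity-log feature sequences with the hmmlearn package and decodes latent "activity pattern" states. All data and dimensions are fabricated for illustration.

```python
import numpy as np
from hmmlearn import hmm  # provides standard HMMs only, not the BP-HMM

# Fabricated per-user activity-log feature sequences (20 users, 4 features).
rng = np.random.default_rng(0)
sessions = [rng.normal(size=(200, 4)) for _ in range(20)]
X = np.concatenate(sessions)          # hmmlearn expects concatenated data
lengths = [len(s) for s in sessions]  # ...plus per-sequence lengths

# Jointly fit one model across all users, loosely mirroring the idea of a
# shared library of activity patterns (here, 5 hidden states).
model = hmm.GaussianHMM(n_components=5, covariance_type="diag", n_iter=50)
model.fit(X, lengths)

states = model.predict(sessions[0])   # per-timestep latent pattern labels
print(states[:20])
```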

  12. Accelerate!

    Science.gov (United States)

    Kotter, John P

    2012-11-01

    The old ways of setting and implementing strategy are failing us, writes the author of Leading Change, in part because we can no longer keep up with the pace of change. Organizational leaders are torn between trying to stay ahead of increasingly fierce competition and needing to deliver this year's results. Although traditional hierarchies and managerial processes--the components of a company's "operating system"--can meet the daily demands of running an enterprise, they are rarely equipped to identify important hazards quickly, formulate creative strategic initiatives nimbly, and implement them speedily. The solution Kotter offers is a second system--an agile, networklike structure--that operates in concert with the first to create a dual operating system. In such a system the hierarchy can hand off the pursuit of big strategic initiatives to the strategy network, freeing itself to focus on incremental changes to improve efficiency. The network is populated by employees from all levels of the organization, giving it organizational knowledge, relationships, credibility, and influence. It can liberate information from silos with ease. It has a dynamic structure free of bureaucratic layers, permitting a level of individualism, creativity, and innovation beyond the reach of any hierarchy. The network's core is a guiding coalition that represents each level and department in the hierarchy, with a broad range of skills. Its drivers are members of a "volunteer army" who are energized by and committed to the coalition's vividly formulated, high-stakes vision and strategy. Kotter has helped eight organizations, public and private, build dual operating systems over the past three years. He predicts that such systems will lead to long-term success in the 21st century--for shareholders, customers, employees, and companies themselves.

  13. Electromagnetic computer simulations of collective ion acceleration by a relativistic electron beam

    International Nuclear Information System (INIS)

    Galvez, M.; Gisler, G.R.

    1988-01-01

    A 2.5-dimensional electromagnetic particle-in-cell computer code is used to study collective ion acceleration when a relativistic electron beam is injected into a drift tube partially filled with cold neutral plasma. Simulations of this system reveal that the ions are subject to electrostatic acceleration by an electrostatic potential that forms behind the head of the beam. This electrostatic potential develops soon after the beam is injected into the drift tube, drifts with the beam, and eventually settles to a fixed position; at later times, it becomes a virtual cathode. When the permanent position of the electrostatic potential is at the edge of the plasma or further up, ions are accelerated forward and a unidirectional ion flow is obtained; otherwise, a bidirectional ion flow occurs. The ions that achieve higher energy are those which drift with the negative potential. When the plasma density is varied, the simulations show that optimum acceleration occurs when the density ratio between the beam (n_b) and the plasma (n_o) is unity. Simulations were also carried out with different ion masses; the results corroborate the hypothesis that the ion acceleration mechanism is purely electrostatic, so that the ion acceleration depends inversely on the charged particle mass. The simulations also show that the maximum ion energy increases logarithmically with the electron beam energy and proportionally with the beam current.
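
The record's code is 2.5-D and electromagnetic, which is too large to sketch here, but the core PIC cycle (deposit charge, solve for the field, gather, push) can be shown in a 1D electrostatic toy with periodic boundaries and normalized units; everything below is illustrative, not the authors' model.

```python
import numpy as np

# Toy 1D electrostatic PIC loop: nearest-grid-point charge deposition,
# FFT Poisson solve, field gather, leapfrog push. Normalized units with a
# fixed neutralizing ion background; purely illustrative.
NG, NP, L, DT, STEPS = 64, 10_000, 2 * np.pi, 0.1, 200
dx = L / NG
rng = np.random.default_rng(1)

x = rng.uniform(0, L, NP)                     # electron positions
v = rng.normal(0, 0.5, NP) + 0.1 * np.sin(x)  # velocities with seed perturbation
q = -L / NP                                   # electron macro-charge

k = 2 * np.pi * np.fft.fftfreq(NG, d=dx)      # angular wavenumbers
k[0] = 1.0                                    # avoid division by zero (zeroed below)

for _ in range(STEPS):
    idx = (x / dx).astype(int) % NG
    # charge density: electrons plus +1 uniform ion background
    rho = q * np.bincount(idx, minlength=NG) / dx + 1.0
    # Poisson solve in k-space: E_k = -i * rho_k / k
    E_k = -1j * np.fft.fft(rho) / k
    E_k[0] = 0.0
    E = np.real(np.fft.ifft(E_k))
    # gather field at particles and push (electron charge/mass = -1)
    v += -E[idx] * DT
    x = (x + v * DT) % L

print(f"final RMS velocity: {np.sqrt(np.mean(v**2)):.3f}")
```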

  14. Effects of different computer typing speeds on acceleration and peak contact pressure of the fingertips during computer typing.

    Science.gov (United States)

    Yoo, Won-Gyu

    2015-01-01

    [Purpose] This study showed the effects of different computer typing speeds on acceleration and peak contact pressure of the fingertips during computer typing. [Subjects] Twenty-one male computer workers voluntarily consented to participate in this study. They consisted of 7 workers who could type 200-300 characters/minute, 7 who could type 300-400 characters/minute, and 7 who could type 400-500 characters/minute. [Methods] The acceleration and peak contact pressure of the fingertips were measured for the different typing-speed groups using an accelerometer and a CONFORMat system. [Results] Fingertip contact pressure was higher in the high typing speed group than in the low and medium typing speed groups. Fingertip acceleration was likewise higher in the high typing speed group than in the low and medium typing speed groups. [Conclusion] The results of the present study indicate that a fast typing speed causes continuous pressure stress on the fingers, thereby creating pain in the fingers.

  15. Acceleration of Radiance for Lighting Simulation by Using Parallel Computing with OpenCL

    Energy Technology Data Exchange (ETDEWEB)

    Zuo, Wangda; McNeil, Andrew; Wetter, Michael; Lee, Eleanor

    2011-09-06

    We report on the acceleration of annual daylighting simulations for fenestration systems in the Radiance ray-tracing program. The algorithm was optimized to reduce both the redundant data input/output operations and the floating-point operations. To further accelerate the simulation speed, the calculation for matrix multiplications was implemented using parallel computing on a graphics processing unit. We used OpenCL, which is a cross-platform parallel programming language. Numerical experiments show that the combination of the above measures can speed up the annual daylighting simulations 101.7 times or 28.6 times when the sky vector has 146 or 2306 elements, respectively.
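
The annual calculation being accelerated reduces to a chain of matrix multiplications (in the three-phase formulation, sensor illuminance = view x transmission x daylight matrices x sky vectors); this is the workload offloaded to the GPU. The shapes below are illustrative, though the 2306-patch sky matches one of the quoted cases.

```python
import numpy as np

# Shape-only sketch of the matrix chain behind annual daylight simulation.
# Sizes are illustrative (Klems 145-patch window basis, 2306-patch sky,
# 8760 hourly sky vectors); this is not Radiance's implementation.
n_sensors, n_window, n_sky, n_hours = 100, 145, 2306, 8760
V = np.random.rand(n_sensors, n_window)  # view matrix
T = np.random.rand(n_window, n_window)   # BSDF (transmission) matrix
D = np.random.rand(n_window, n_sky)      # daylight matrix
S = np.random.rand(n_sky, n_hours)       # sky vectors, one column per hour

E = V @ T @ D @ S                        # sensor illuminance time series
print(E.shape)                           # (100, 8760)
```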

  16. GPU-accelerated Lattice Boltzmann method for anatomical extraction in patient-specific computational hemodynamics

    Science.gov (United States)

    Yu, H.; Wang, Z.; Zhang, C.; Chen, N.; Zhao, Y.; Sawchuk, A. P.; Dalsing, M. C.; Teague, S. D.; Cheng, Y.

    2014-11-01

    Existing research on patient-specific computational hemodynamics (PSCH) relies heavily on software for anatomical extraction of blood arteries. Data reconstruction and mesh generation have to be done using existing commercial software due to the gap between medical image processing and CFD, which increases the computational burden and introduces inaccuracy during data transformation, thus limiting the medical applications of PSCH. We use the lattice Boltzmann method (LBM) to solve the level-set equation over an Eulerian distance field and implicitly and dynamically segment the artery surfaces from radiological CT/MRI imaging data. The segments feed seamlessly into the LBM based CFD computation of PSCH, so explicit mesh construction and extra data management are avoided. The LBM is ideally suited for GPU (graphics processing unit) based parallel computing. The parallel acceleration on the GPU achieves excellent performance in PSCH computation. An application study will be presented which segments an aortic artery from a chest CT dataset and models the PSCH of the segmented artery.

  17. A contribution to the computation of the impedance in acceleration resonators

    International Nuclear Information System (INIS)

    Liu, Cong

    2016-05-01

    This thesis focuses on the numerical computation of the impedance in acceleration resonators and corresponding components. For this purpose, a dedicated solver based on the Finite Element Method (FEM) has been developed to compute the broadband impedance of accelerating components. In addition, various numerical approaches have been used to calculate the narrow-band impedance of superconducting radio frequency (RF) cavities. An overview of the calculated results as well as comparisons between the applied numerical approaches is provided. During the design phase of superconducting RF accelerating cavities and components, a challenging and difficult task is the determination of the impedance inside the accelerators with the help of proper computer simulations. Impedance describes the electromagnetic interaction between the particle beam and the accelerator, and it can affect the stability of the particle beam. For a superconducting RF accelerating cavity with waveguides (beam pipes and couplers), the narrow-band impedance, also called shunt impedance, corresponds to the eigenmodes of the cavity; it depends on the eigenfrequencies and the electromagnetic field distributions of the eigenmodes inside the cavity. The broadband impedance, on the other hand, describes the interaction of the particle beam in the waveguides with its environment at arbitrary frequency and beam velocity. Together, the narrow-band and broadband impedance give a complete description of the impedance of an accelerator. In order to calculate the broadband longitudinal space charge impedance of acceleration components, a three-dimensional (3D) solver based on the FEM in the frequency domain has been developed. To calculate the narrow-band impedance of superconducting RF cavities, we used various numerical approaches: firstly, the eigenmode solver based on the Finite Integration Technique (FIT) and a parallel real-valued FEM (CEM3Dr) eigenmode solver based on

  18. A contribution to the computation of the impedance in acceleration resonators

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Cong

    2016-05-15

    This thesis focuses on the numerical computation of the impedance in acceleration resonators and corresponding components. For this purpose, a dedicated solver based on the Finite Element Method (FEM) has been developed to compute the broadband impedance of accelerating components. In addition, various numerical approaches have been used to calculate the narrow-band impedance of superconducting radio frequency (RF) cavities. An overview of the calculated results as well as comparisons between the applied numerical approaches is provided. During the design phase of superconducting RF accelerating cavities and components, a challenging and difficult task is the determination of the impedance inside the accelerators with the help of proper computer simulations. Impedance describes the electromagnetic interaction between the particle beam and the accelerator, and it can affect the stability of the particle beam. For a superconducting RF accelerating cavity with waveguides (beam pipes and couplers), the narrow-band impedance, also called shunt impedance, corresponds to the eigenmodes of the cavity; it depends on the eigenfrequencies and the electromagnetic field distributions of the eigenmodes inside the cavity. The broadband impedance, on the other hand, describes the interaction of the particle beam in the waveguides with its environment at arbitrary frequency and beam velocity. Together, the narrow-band and broadband impedance give a complete description of the impedance of an accelerator. In order to calculate the broadband longitudinal space charge impedance of acceleration components, a three-dimensional (3D) solver based on the FEM in the frequency domain has been developed. To calculate the narrow-band impedance of superconducting RF cavities, we used various numerical approaches: firstly, the eigenmode solver based on the Finite Integration Technique (FIT) and a parallel real-valued FEM (CEM3Dr) eigenmode solver based on

  19. Accelerating Astronomy & Astrophysics in the New Era of Parallel Computing: GPUs, Phi and Cloud Computing

    Science.gov (United States)

    Ford, Eric B.; Dindar, Saleh; Peters, Jorg

    2015-08-01

    The realism of astrophysical simulations and statistical analyses of astronomical data are set by the available computational resources. Thus, astronomers and astrophysicists are constantly pushing the limits of computational capabilities. For decades, astronomers benefited from massive improvements in computational power that were driven primarily by increasing clock speeds and required relatively little attention to the details of the computational hardware. For nearly a decade, increases in computational capabilities have come primarily from increasing the degree of parallelism rather than increasing clock speeds. Further increases in computational capabilities will likely be led by many-core architectures such as Graphical Processing Units (GPUs) and the Intel Xeon Phi. Successfully harnessing these new architectures requires significantly more understanding of the hardware architecture, cache hierarchy, compiler capabilities and network characteristics. I will provide an astronomer's overview of the opportunities and challenges provided by modern many-core architectures and elastic cloud computing. The primary goal is to help an astronomical audience understand what types of problems are likely to yield more than order-of-magnitude speed-ups and which problems are unlikely to parallelize sufficiently efficiently to be worth the development time and/or costs. I will draw on my experience leading a team in developing the Swarm-NG library for parallel integration of large ensembles of small n-body systems on GPUs, as well as several smaller software projects. I will share lessons learned from collaborating with computer scientists, including both technical and soft skills. Finally, I will discuss the challenges of training the next generation of astronomers to be proficient in this new era of high-performance computing, drawing on experience teaching a graduate class on High-Performance Scientific Computing for Astrophysics and organizing a 2014 advanced summer

  20. Report of Investigation Committee on Programs for Research and Development of Strategic Software for Advanced Computing; Kodo computing yo senryakuteki software no kenkyu kaihatsu program kento iinkai hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-12-26

    The committee met on December 26, 2000, with 32 people in attendance. Discussion covered the results of surveys conducted for the development of strategic software for advanced computing and candidate projects for strategic software development. Eight subjects were taken up at the meeting: the interim report on the survey results, a semiconductor TCAD (technology computer-aided design) system, a nanodevice surface analysis system, a network distributed parallel processing platform (tentative name), a fatigue simulation system, a chemical reaction simulator, a protein structure analysis system, and a next-generation fluid analysis system. In this report, the author arranges the discussion results into four categories: (1) a strategic software development system, (2) popularization methods and a maintenance system, (3) handling of the results, and (4) evaluation of the program for research and development. In relation to category (1), it is stated that software grows with the passage of time, that the software is a commercial program, and that in the development of commercial software the process of basic study up to the preparation of a prototype should be completely separated from the process of its completion. (NEDO)

  1. FINAL REPORT DE-FG02-04ER41317 Advanced Computation and Chaotic Dynamics for Beams and Accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Cary, John R [U. Colorado

    2014-09-08

    During the year ending in August 2013, we continued to investigate the potential of photonic crystal (PhC) materials for acceleration purposes. We worked to characterize the acceleration ability of simple PhC accelerator structures, as well as to characterize PhC materials to determine whether current fabrication techniques can meet the needs of future accelerating structures. We have also continued to design and optimize PhC accelerator structures, with the ultimate goal of finding a new kind of accelerator structure that could offer significant advantages over current RF acceleration technology. The design and optimization of these structures require high-performance computation, and we continue to work on methods to make such computation faster and more efficient.

  2. ACE3P Computations of Wakefield Coupling in the CLIC Two-Beam Accelerator

    International Nuclear Information System (INIS)

    Candel, Arno

    2010-01-01

    The Compact Linear Collider (CLIC) provides a path to a multi-TeV accelerator to explore the energy frontier of High Energy Physics. Its novel two-beam accelerator concept envisions rf power transfer to the accelerating structures from a separate high-current decelerator beam line consisting of power extraction and transfer structures (PETS). It is critical to numerically verify the fundamental and higher-order mode properties in and between the two beam lines with high accuracy and confidence. To solve these large-scale problems, SLAC's parallel finite element electromagnetic code suite ACE3P is employed. Using curvilinear conformal meshes and higher-order finite element vector basis functions, unprecedented accuracy and computational efficiency are achieved, enabling high-fidelity modeling of complex detuned structures such as the CLIC TD24 accelerating structure. In this paper, time-domain simulations of wakefield coupling effects in the combined system of PETS and the TD24 structures are presented. The results will help to identify potential issues and provide new insights on the design, leading to further improvements on the novel CLIC two-beam accelerator scheme.

  3. Proceedings of the conference on computer codes and the linear accelerator community

    International Nuclear Information System (INIS)

    Cooper, R.K.

    1990-07-01

    The conference whose proceedings you are reading was envisioned as the second in a series, the first having been held in San Diego in January 1988. The intended participants were those people who are actively involved in writing and applying computer codes for the solution of problems related to the design and construction of linear accelerators. The first conference reviewed many of the codes both extant and under development. This second conference provided an opportunity to update the status of those codes, and to provide a forum in which emerging new 3D codes could be described and discussed. The afternoon poster session on the second day of the conference provided an opportunity for extended discussion. All in all, this conference was felt to be quite a useful interchange of ideas and developments in the field of 3D calculations, parallel computation, higher-order optics calculations, and code documentation and maintenance for the linear accelerator community. A third conference is planned

  4. Proceedings of the conference on computer codes and the linear accelerator community

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, R.K. (comp.)

    1990-07-01

    The conference whose proceedings you are reading was envisioned as the second in a series, the first having been held in San Diego in January 1988. The intended participants were those people who are actively involved in writing and applying computer codes for the solution of problems related to the design and construction of linear accelerators. The first conference reviewed many of the codes both extant and under development. This second conference provided an opportunity to update the status of those codes, and to provide a forum in which emerging new 3D codes could be described and discussed. The afternoon poster session on the second day of the conference provided an opportunity for extended discussion. All in all, this conference was felt to be quite a useful interchange of ideas and developments in the field of 3D calculations, parallel computation, higher-order optics calculations, and code documentation and maintenance for the linear accelerator community. A third conference is planned.

  5. Multi-GPU Jacobian accelerated computing for soft-field tomography

    International Nuclear Information System (INIS)

    Borsic, A; Attardo, E A; Halter, R J

    2012-01-01

    Image reconstruction in soft-field tomography is based on an inverse problem formulation, where a forward model is fitted to the data. In medical applications, where the anatomy presents complex shapes, it is common to use finite element models (FEMs) to represent the volume of interest and solve a partial differential equation that models the physics of the system. Over the last decade, there has been a shifting interest from 2D modeling to 3D modeling, as the underlying physics of most problems are 3D. Although the increased computational power of modern computers allows working with much larger FEM models, the computational time required to reconstruct 3D images on a fine 3D FEM model can be significant, on the order of hours. For example, in electrical impedance tomography (EIT) applications using a dense 3D FEM mesh with half a million elements, a single reconstruction iteration takes approximately 15–20 min with optimized routines running on a modern multi-core PC. It is desirable to accelerate image reconstruction to enable researchers to more easily and rapidly explore data and reconstruction parameters. Furthermore, providing high-speed reconstructions is essential for some promising clinical applications of EIT. For 3D problems, 70% of the computing time is spent building the Jacobian matrix, and 25% of the time in forward solving. In this work, we focus on accelerating the Jacobian computation by using single and multiple GPUs. First, we discuss an optimized implementation on a modern multi-core PC architecture and show how computing time is bounded by the CPU-to-memory bandwidth; this factor limits the rate at which data can be fetched by the CPU. Gains associated with the use of multiple CPU cores are minimal, since data operands cannot be fetched fast enough to saturate the processing power of even a single CPU core. GPUs have much faster memory bandwidths compared to CPUs and better parallelism. We are able to obtain acceleration factors of 20 times

  6. Multi-GPU Jacobian accelerated computing for soft-field tomography.

    Science.gov (United States)

    Borsic, A; Attardo, E A; Halter, R J

    2012-10-01

    Image reconstruction in soft-field tomography is based on an inverse problem formulation, where a forward model is fitted to the data. In medical applications, where the anatomy presents complex shapes, it is common to use finite element models (FEMs) to represent the volume of interest and solve a partial differential equation that models the physics of the system. Over the last decade, there has been a shifting interest from 2D modeling to 3D modeling, as the underlying physics of most problems are 3D. Although the increased computational power of modern computers allows working with much larger FEM models, the computational time required to reconstruct 3D images on a fine 3D FEM model can be significant, on the order of hours. For example, in electrical impedance tomography (EIT) applications using a dense 3D FEM mesh with half a million elements, a single reconstruction iteration takes approximately 15-20 min with optimized routines running on a modern multi-core PC. It is desirable to accelerate image reconstruction to enable researchers to more easily and rapidly explore data and reconstruction parameters. Furthermore, providing high-speed reconstructions is essential for some promising clinical applications of EIT. For 3D problems, 70% of the computing time is spent building the Jacobian matrix, and 25% of the time in forward solving. In this work, we focus on accelerating the Jacobian computation by using single and multiple GPUs. First, we discuss an optimized implementation on a modern multi-core PC architecture and show how computing time is bounded by the CPU-to-memory bandwidth; this factor limits the rate at which data can be fetched by the CPU. Gains associated with the use of multiple CPU cores are minimal, since data operands cannot be fetched fast enough to saturate the processing power of even a single CPU core. GPUs have much faster memory bandwidths compared to CPUs and better parallelism. We are able to obtain acceleration factors of 20
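
On the GPU side, a bandwidth-bound kernel of this kind can be sketched with CuPy, which mirrors the NumPy API on CUDA devices. The arrays and the reduction below are hypothetical stand-ins for the forward/adjoint field products that dominate Jacobian assembly; this is not the authors' code.

```python
import cupy as cp  # assumption: a CUDA GPU and the CuPy package are available

n_meas, n_elem = 64, 500_000  # ~half a million elements, as in the abstract
fields = cp.random.rand(n_meas, n_elem, dtype=cp.float32)   # forward fields
adjoint = cp.random.rand(n_meas, n_elem, dtype=cp.float32)  # adjoint fields

# One Jacobian-like block: elementwise field products reduced over
# measurements -- a memory-bandwidth-bound pattern that favors the GPU.
J_block = (fields * adjoint).sum(axis=0)
cp.cuda.Stream.null.synchronize()  # ensure the kernel has finished
print(J_block.shape)               # (500000,)
```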

  7. Combining Acceleration Techniques for Low-Dose X-Ray Cone Beam Computed Tomography Image Reconstruction.

    Science.gov (United States)

    Huang, Hsuan-Ming; Hsiao, Ing-Tsung

    2017-01-01

    Over the past decade, image quality in low-dose computed tomography has been greatly improved by various compressive sensing- (CS-) based reconstruction methods. However, these methods have some disadvantages including high computational cost and slow convergence rate. Many different speed-up techniques for CS-based reconstruction algorithms have been developed. The purpose of this paper is to propose a fast reconstruction framework that combines a CS-based reconstruction algorithm with several speed-up techniques. First, total difference minimization (TDM) was implemented using the soft-threshold filtering (STF). Second, we combined TDM-STF with the ordered subsets transmission (OSTR) algorithm for accelerating the convergence. To further speed up the convergence of the proposed method, we applied the power factor and the fast iterative shrinkage thresholding algorithm to OSTR and TDM-STF, respectively. Results obtained from simulation and phantom studies showed that many speed-up techniques could be combined to greatly improve the convergence speed of a CS-based reconstruction algorithm. More importantly, the increased computation time (≤10%) was minor as compared to the acceleration provided by the proposed method. In this paper, we have presented a CS-based reconstruction framework that combines several acceleration techniques. Both simulation and phantom studies provide evidence that the proposed method has the potential to satisfy the requirement of fast image reconstruction in practical CT.
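
Two of the named ingredients, soft-threshold filtering and FISTA-style momentum, are easy to show in isolation. The sketch below solves a generic l1-regularized least-squares problem as a stand-in for the CT system model; it is illustrative, not the paper's TDM-OSTR pipeline.

```python
import numpy as np

# Soft-thresholding plus FISTA momentum on min_x 0.5*||Ax-b||^2 + lam*||x||_1.
def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista(A, b, lam, iters=200):
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = z = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(iters):
        g = A.T @ (A @ z - b)            # gradient at the momentum point
        x_new = soft_threshold(z - g / L, lam / L)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        z = x_new + (t - 1) / t_new * (x_new - x)   # momentum step
        x, t = x_new, t_new
    return x

A = np.random.rand(64, 128)
b = A @ np.r_[np.ones(5), np.zeros(123)]    # 5-sparse ground truth
print(np.round(fista(A, b, lam=0.1)[:8], 2))  # should recover ~[1 1 1 1 1 0 0 0]
```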

  8. Performance analysis and acceleration of explicit integration for large kinetic networks using batched GPU computations

    Energy Technology Data Exchange (ETDEWEB)

    Shyles, Daniel [University of Tennessee (UT); Dongarra, Jack J. [University of Tennessee, Knoxville (UTK); Guidry, Mike W. [ORNL; Tomov, Stanimire Z. [ORNL; Billings, Jay Jay [ORNL; Brock, Benjamin A. [ORNL; Haidar Ahmad, Azzam A. [ORNL

    2016-09-01

    We demonstrate the systematic implementation of recently developed fast explicit kinetic integration algorithms that efficiently solve N coupled ordinary differential equations (subject to initial conditions) on modern GPUs. We take representative test cases (Type Ia supernova explosions) and demonstrate two or more orders of magnitude increase in efficiency for solving such systems (of realistic thermonuclear networks coupled to fluid dynamics). This implies that important coupled, multiphysics problems in various scientific and technical disciplines that were intractable, or could be simulated only with highly schematic kinetic networks, are now computationally feasible. As examples of such applications we present the computational techniques developed for our ongoing deployment of these new methods on modern GPU accelerators. We show that, similarly to many other scientific applications ranging from national security to medical advances, the computation can be split into many independent computational tasks, each of relatively small size. As the size of each individual task does not provide sufficient parallelism for the underlying hardware, especially for accelerators, these tasks must be computed concurrently as a single routine, which we call a batched routine, in order to saturate the hardware with enough work.
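
The batching idea is independent of the specific integrator: advance many small, independent systems in lockstep so the accelerator sees one large kernel. The sketch below uses vectorized explicit Euler on a batch of fabricated linear systems purely to illustrate the data layout; the paper's fast explicit kinetic integrators are more sophisticated.

```python
import numpy as np

# Vectorized explicit Euler over a batch of N independent small linear
# systems dy/dt = A_i y_i. The leading batch dimension turns many tiny
# tasks into one large kernel -- the layout idea behind batched routines.
N, n, dt, steps = 4096, 16, 1e-3, 1000
rng = np.random.default_rng(0)
A = -(np.eye(n) + 0.1 * rng.random((N, n, n)))  # fabricated stable systems
y = rng.random((N, n))                          # one state vector per system

for _ in range(steps):
    y += dt * np.einsum("bij,bj->bi", A, y)     # batched mat-vec products

print(y.shape, float(np.abs(y).max()))
```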

  9. FPGA Compute Acceleration for High-Throughput Data Processing in High-Energy Physics Experiments

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The upgrades of the four large experiments of the LHC at CERN in the coming years will result in a huge increase of data bandwidth for each experiment, which needs to be processed very efficiently. For example, the LHCb experiment will upgrade its detector in 2019/2020 to a 'triggerless' readout scheme, in which all of the readout electronics and several sub-detector parts will be replaced. The new readout electronics will be able to read out the detector at 40 MHz. This increases the data bandwidth from the detector down to the event filter farm to 40 TBit/s, which must be processed to select the interesting proton-proton collisions for later storage. Designing the architecture of a computing farm that can process this amount of data as efficiently as possible is a challenging task, and several compute accelerator technologies are being considered.    In the high-performance computing sector, more and more FPGA compute accelerators are being used to improve compute performance and reduce the...

  10. Acceleration of FDTD mode solver by high-performance computing techniques.

    Science.gov (United States)

    Han, Lin; Xi, Yanping; Huang, Wei-Ping

    2010-06-21

    A two-dimensional (2D) compact finite-difference time-domain (FDTD) mode solver is developed based on a wave equation formalism in combination with the matrix pencil method (MPM). The method is validated for calculation of both real guided and complex leaky modes of typical optical waveguides against the benchmark finite-difference (FD) eigenmode solver. By taking advantage of the inherent parallel nature of the FDTD algorithm, the mode solver is implemented on graphics processing units (GPUs) using the compute unified device architecture (CUDA). It is demonstrated that the high-performance computing technique leads to significant acceleration of the FDTD mode solver, with more than a 30-fold improvement in computational efficiency in comparison with the conventional FDTD mode solver running on the CPU of a standard desktop computer. The computational efficiency of the accelerated FDTD method is of the same order of magnitude as that of the standard finite-difference eigenmode solver, yet it requires much less memory (e.g., less than 10%). Therefore, the new method may serve as an efficient, accurate and robust tool for mode calculation of optical waveguides, even when conventional eigenvalue mode solvers are no longer applicable due to memory limitations.
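
What makes FDTD such a good fit for GPUs is its local, uniform update stencil. The minimal 1D Yee loop below (normalized units, fabricated source) shows the pattern; the paper's solver is a 2D compact formulation combined with the matrix pencil method, which is not reproduced here.

```python
import numpy as np

# Minimal 1D FDTD (Yee) loop in normalized units with a soft Gaussian
# source. Every grid point updates independently from its neighbors'
# previous values -- the data parallelism that GPU offloading exploits.
nz, steps, c = 400, 600, 0.5        # grid size, time steps, Courant number
Ex, Hy = np.zeros(nz), np.zeros(nz - 1)

for n in range(steps):
    Hy += c * (Ex[1:] - Ex[:-1])                    # update H from curl E
    Ex[1:-1] += c * (Hy[1:] - Hy[:-1])              # update E from curl H
    Ex[nz // 2] += np.exp(-((n - 30) / 10.0) ** 2)  # fabricated excitation

print(f"field energy ~ {np.sum(Ex**2) + np.sum(Hy**2):.3f}")
```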

  11. Department of Defense Strategic and Business Case Analyses for Commercial Products in Secure Mobile Computing

    Science.gov (United States)

    2011-06-01

    Results indicate growing strategic opportunities for the DoD to acquire more economical commercial handsets and more flexible network services for current smartphone implementations. The business cases may potentially save

  12. Accelerating Spaceborne SAR Imaging Using Multiple CPU/GPU Deep Collaborative Computing

    Directory of Open Access Journals (Sweden)

    Fan Zhang

    2016-04-01

    Full Text Available With the development of synthetic aperture radar (SAR) technologies in recent years, the huge amount of remote sensing data brings challenges for real-time image processing. Therefore, high performance computing (HPC) methods have been presented to accelerate SAR imaging, especially GPU based methods. In the classical GPU based imaging algorithm, the GPU is employed to accelerate image processing by massively parallel computing, and the CPU is used only to perform auxiliary work such as data input/output (IO). However, the computing capability of the CPU is ignored and underestimated. In this work, a new deep collaborative SAR imaging method based on multiple CPUs/GPUs is proposed to achieve real-time SAR imaging. Through the proposed task partitioning and scheduling strategy, the whole image can be generated with deep collaborative multiple CPU/GPU computing. In the CPU parallel imaging part, the advanced vector extension (AVX) method is introduced into the multi-core CPU parallel method for higher efficiency. As for GPU parallel imaging, not only are the bottlenecks of limited memory and frequent data transfers overcome, but various optimization strategies, such as streaming and parallel pipelining, are also applied. Experimental results demonstrate that the deep CPU/GPU collaborative imaging method enhances the efficiency of SAR imaging by 270 times over a single-core CPU and achieves real-time imaging, in that the imaging rate outperforms the raw data generation rate.

  13. Accelerating Spaceborne SAR Imaging Using Multiple CPU/GPU Deep Collaborative Computing.

    Science.gov (United States)

    Zhang, Fan; Li, Guojun; Li, Wei; Hu, Wei; Hu, Yuxin

    2016-04-07

    With the development of synthetic aperture radar (SAR) technologies in recent years, the huge amount of remote sensing data brings challenges for real-time image processing. Therefore, high performance computing (HPC) methods have been presented to accelerate SAR imaging, especially GPU based methods. In the classical GPU based imaging algorithm, the GPU is employed to accelerate image processing by massively parallel computing, and the CPU is used only to perform auxiliary work such as data input/output (IO). However, the computing capability of the CPU is ignored and underestimated. In this work, a new deep collaborative SAR imaging method based on multiple CPUs/GPUs is proposed to achieve real-time SAR imaging. Through the proposed task partitioning and scheduling strategy, the whole image can be generated with deep collaborative multiple CPU/GPU computing. In the CPU parallel imaging part, the advanced vector extension (AVX) method is introduced into the multi-core CPU parallel method for higher efficiency. As for GPU parallel imaging, not only are the bottlenecks of limited memory and frequent data transfers overcome, but various optimization strategies, such as streaming and parallel pipelining, are also applied. Experimental results demonstrate that the deep CPU/GPU collaborative imaging method enhances the efficiency of SAR imaging by 270 times over a single-core CPU and achieves real-time imaging, in that the imaging rate outperforms the raw data generation rate.

  14. Development and application of a computer model for large-scale flame acceleration experiments

    International Nuclear Information System (INIS)

    Marx, K.D.

    1987-07-01

    A new computational model for large-scale premixed flames is developed and applied to the simulation of flame acceleration experiments. The primary objective is to circumvent the necessity for resolving turbulent flame fronts; this is imperative because of the relatively coarse computational grids which must be used in engineering calculations. The essence of the model is to artificially thicken the flame by increasing the appropriate diffusivities and decreasing the combustion rate, but to do this in such a way that the burn velocity varies with pressure, temperature, and turbulence intensity according to prespecified phenomenological characteristics. The model is particularly aimed at implementation in computer codes which simulate compressible flows. To this end, it is applied to the two-dimensional simulation of hydrogen-air flame acceleration experiments in which the flame speeds and gas flow velocities attain or exceed the speed of sound in the gas. It is shown that many of the features of the flame trajectories and pressure histories in the experiments are simulated quite well by the model. Using the comparison of experimental and computational results as a guide, some insight is developed into the processes which occur in such experiments. 34 refs., 25 figs., 4 tabs
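
The trade the model exploits can be made explicit with standard laminar-flame scaling arguments (not taken from the report): if the burn velocity scales as the square root of diffusivity times reaction rate, then boosting the diffusivity by a factor F while cutting the reaction rate by F preserves the burn velocity and thickens the flame by F.

```latex
% Standard laminar-flame scaling (illustrative):
s_L \propto \sqrt{D\,\dot\omega}, \qquad \delta \propto \frac{D}{s_L}.
% Rescaling the model inputs
D \to F D, \qquad \dot\omega \to \frac{\dot\omega}{F}
\quad\Longrightarrow\quad
s_L \to \sqrt{(F D)\,(\dot\omega/F)} = s_L, \qquad \delta \to F\,\delta,
% i.e. the burn velocity is preserved while the flame front is thickened
% by F, making it resolvable on coarse engineering grids.
```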

  15. Program for computing inhomogeneous coaxial resonators and accelerating systems of the U-400 and ITs-100 cyclotrons

    International Nuclear Information System (INIS)

    Gul'bekyan, G.G.; Ivanov, Eh.L.

    1987-01-01

    The 'Line' computer code for computing inhomogeneous coaxial resonators is described. The results obtained for the resonators of the U-400 cyclotron made it possible to increase the energy of accelerated ions up to 27 MeV/nucl. The computations for the ITs-100 cyclic implanter made it possible to build a compact design with a low value of consumed RF power

  16. Accelerating Development of EV Batteries Through Computer-Aided Engineering (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Pesaran, A.; Kim, G. H.; Smith, K.; Santhanagopalan, S.

    2012-12-01

    The Department of Energy's Vehicle Technology Program has launched the Computer-Aided Engineering for Automotive Batteries (CAEBAT) project to work with national labs, industry and software vendors to develop sophisticated software. As coordinator, NREL has teamed with a number of companies to help improve and accelerate battery design and production. This presentation provides an overview of CAEBAT, including its predictive computer simulation of Li-ion batteries known as the Multi-Scale Multi-Dimensional (MSMD) model framework. MSMD's modular, flexible architecture connects the physics of battery charge/discharge processes, thermal control, safety and reliability in a computationally efficient manner. This allows independent development of submodels at the cell and pack levels.

  17. Concepts and techniques: Active electronics and computers in safety-critical accelerator operation

    International Nuclear Information System (INIS)

    Frankel, R.S.

    1995-01-01

    The Relativistic Heavy Ion Collider (RHIC) under construction at Brookhaven National Laboratory, requires an extensive Access Control System to protect personnel from Radiation, Oxygen Deficiency and Electrical hazards. In addition, the complicated nature of operation of the Collider as part of a complex of other Accelerators necessitates the use of active electronic measurement circuitry to ensure compliance with established Operational Safety Limits. Solutions were devised which permit the use of modern computer and interconnections technology for Safety-Critical applications, while preserving and enhancing, tried and proven protection methods. In addition a set of Guidelines, regarding required performance for Accelerator Safety Systems and a Handbook of design criteria and rules were developed to assist future system designers and to provide a framework for internal review and regulation

  18. Concepts and techniques: Active electronics and computers in safety-critical accelerator operation

    Energy Technology Data Exchange (ETDEWEB)

    Frankel, R.S.

    1995-12-31

    The Relativistic Heavy Ion Collider (RHIC) under construction at Brookhaven National Laboratory, requires an extensive Access Control System to protect personnel from Radiation, Oxygen Deficiency and Electrical hazards. In addition, the complicated nature of operation of the Collider as part of a complex of other Accelerators necessitates the use of active electronic measurement circuitry to ensure compliance with established Operational Safety Limits. Solutions were devised which permit the use of modern computer and interconnections technology for Safety-Critical applications, while preserving and enhancing, tried and proven protection methods. In addition a set of Guidelines, regarding required performance for Accelerator Safety Systems and a Handbook of design criteria and rules were developed to assist future system designers and to provide a framework for internal review and regulation.

  19. Micro-computer based control system and its software for carrying out the sequential acceleration on SMCAMS

    International Nuclear Information System (INIS)

    Li Deming

    2001-01-01

    The micro-computer based control system and its software for carrying out sequential acceleration on SMCAMS are described, along with the establishment of the 14C particle measuring device and the improvement of the original power supply system

  20. Accelerating phylogenetics computing on the desktop: experiments with executing UPGMA in programmable logic.

    Science.gov (United States)

    Davis, J P; Akella, S; Waddell, P H

    2004-01-01

    Having greater computational power on the desktop for processing taxa data sets has been a dream of biologists/statisticians involved in phylogenetics data analysis. Many existing algorithms have been highly optimized, one example being Felsenstein's PHYLIP code, written in C, for the UPGMA and neighbor-joining algorithms. However, the ability to process more than a few tens of taxa in a reasonable amount of time using conventional computers has not yielded a satisfactory speedup in data processing, making it difficult for phylogenetics practitioners to quickly explore data sets, such as might be done from a laptop computer. We discuss the application of custom computing techniques to phylogenetics. In particular, we apply this technology to speed up UPGMA algorithm execution by a factor of one hundred, against that of the PHYLIP code running on the same PC. We report on these experiments and discuss how custom computing techniques can be used to accelerate phylogenetics algorithm performance not only on the desktop, but also on larger, high-performance computing engines, thus enabling the high-speed processing of data sets involving thousands of taxa.
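
For reference, UPGMA itself is compact enough to sketch: repeatedly merge the closest pair of clusters and update distances with a size-weighted average. The O(n^3) loop below is the kernel the paper moves into programmable logic; the example matrix is fabricated.

```python
import numpy as np

# Compact UPGMA: merge the closest active pair, average distances weighted
# by cluster size, and build a Newick-like tree string.
def upgma(D):
    D = D.astype(float)
    n = D.shape[0]
    active = list(range(n))
    sizes = {i: 1 for i in range(n)}
    trees = {i: str(i) for i in range(n)}
    np.fill_diagonal(D, np.inf)
    while len(active) > 1:
        # find the closest pair among active clusters
        _, i, j = min((D[a, b], a, b) for ai, a in enumerate(active)
                      for b in active[ai + 1:])
        # merge j into i, averaging distances weighted by cluster sizes
        for k in active:
            if k not in (i, j):
                d = (sizes[i] * D[i, k] + sizes[j] * D[j, k]) / (sizes[i] + sizes[j])
                D[i, k] = D[k, i] = d
        trees[i] = f"({trees[i]},{trees[j]})"
        sizes[i] += sizes[j]
        active.remove(j)
    return trees[active[0]]

D = np.array([[0, 2, 6, 6], [2, 0, 6, 6], [6, 6, 0, 4], [6, 6, 4, 0]])
print(upgma(D))  # ((0,1),(2,3))
```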

  1. FPGA hardware acceleration for high performance neutron transport computation based on agent methodology - 318

    International Nuclear Information System (INIS)

    Shanjie, Xiao; Tatjana, Jevremovic

    2010-01-01

    Accurate, detailed 3D neutron transport analysis for Gen-IV reactors is still time-consuming, regardless of the advanced computational hardware available in developed countries. This paper introduces a new concept for addressing the computational time while preserving detailed and accurate modeling: a specifically designed FPGA co-processor accelerates the robust AGENT methodology for complex reactor geometries. For the first time this approach is applied to accelerate neutronics analysis. The AGENT methodology solves the neutron transport equation using the method of characteristics. The performance of the AGENT methodology was carefully analyzed before the hardware design based on the FPGA co-processor was adopted. The most time-consuming kernel was then transplanted into the FPGA co-processor. The FPGA co-processor is designed with a dataflow-driven, non-von-Neumann architecture and has much higher efficiency than a conventional computer architecture. Details of the FPGA co-processor design are introduced, and the design is benchmarked using two different examples. The advanced chip architecture helps the FPGA co-processor obtain a more than 20-fold speedup with its working frequency much lower than the CPU frequency. (authors)

  2. A Fast GPU-accelerated Mixed-precision Strategy for Fully Nonlinear Water Wave Computations

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig; Engsig-Karup, Allan Peter; Madsen, Morten G.

    2011-01-01

    We present performance results of a mixed-precision strategy developed to improve a recently developed massively parallel GPU-accelerated tool for fast and scalable simulation of unsteady fully nonlinear free surface water waves over uneven depths (Engsig-Karup et al. 2011). The underlying wave...-preconditioned defect correction method. The strategy improves performance by exploiting architectural features of modern GPUs for mixed-precision computations and is tested in a recently developed generic library for fast prototyping of PDE solvers. The new wave tool is applicable to solve and analyze...
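
    The core idea of a mixed-precision defect correction iteration can be sketched independently of the wave model: evaluate residuals (defects) in double precision, but perform the expensive approximate solves in single precision. Below is a minimal NumPy sketch under that assumption; the paper's inner solver is a preconditioner of multigrid type, whereas a direct single-precision solve is used here purely for brevity.

      import numpy as np

      def defect_correction(A, b, iters=50, tol=1e-12):
          """Mixed-precision defect correction for A x = b (toy direct inner solve)."""
          A32 = A.astype(np.float32)          # low-precision operator, fast on GPUs
          x = np.zeros_like(b)
          for _ in range(iters):
              r = b - A @ x                   # defect evaluated in double precision
              if np.linalg.norm(r) <= tol * np.linalg.norm(b):
                  break
              dx = np.linalg.solve(A32, r.astype(np.float32))  # cheap inexact solve
              x += dx.astype(np.float64)      # accumulate correction in double
          return x

    Because the correction equation only needs to be solved approximately, the fast low-precision arithmetic does the bulk of the work while the double-precision defect keeps the final answer accurate.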

  3. Computer simulation of 2-D and 3-D ion beam extraction and acceleration

    Energy Technology Data Exchange (ETDEWEB)

    Ido, Shunji; Nakajima, Yuji [Saitama Univ., Urawa (Japan). Faculty of Engineering

    1997-03-01

    Two-dimensional and three-dimensional codes have been developed to study the physical features of ion beams in the extraction and acceleration stages. Using the two-dimensional code, the design of the first electrode (plasma grid) is examined with regard to beam divergence. In computational studies using the three-dimensional code, an off-axis model of the ion beam is investigated. It is found that the deflection angle of the ion beam is proportional to the gap displacement of the electrodes. (author)

  4. Computational Materials Science and Chemistry: Accelerating Discovery and Innovation through Simulation-Based Engineering and Science

    Energy Technology Data Exchange (ETDEWEB)

    Crabtree, George [Argonne National Lab. (ANL), Argonne, IL (United States); Glotzer, Sharon [University of Michigan; McCurdy, Bill [University of California Davis; Roberto, Jim [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2010-07-26

    This report is based on an SC Workshop on Computational Materials Science and Chemistry for Innovation, held on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software. This rate of improvement, which shows no sign of...

  5. Smolyak's algorithm: A powerful black box for the acceleration of scientific computations

    KAUST Repository

    Tempone, Raul

    2017-03-26

    We provide a general discussion of Smolyak's algorithm for the acceleration of scientific computations. The algorithm first appeared in Smolyak's work on multidimensional integration and interpolation. Since then, it has been generalized in multiple directions and has been associated with the keywords: sparse grids, hyperbolic cross approximation, combination technique, and multilevel methods. Variants of Smolyak's algorithm have been employed in the computation of high-dimensional integrals in finance, chemistry, and physics, in the numerical solution of partial and stochastic differential equations, and in uncertainty quantification. Motivated by this broad and ever-increasing range of applications, we describe a general framework that summarizes fundamental results and assumptions in a concise application-independent manner.

  6. Smolyak's algorithm: A powerful black box for the acceleration of scientific computations

    KAUST Repository

    Tempone, Raul; Wolfers, Soeren

    2017-01-01

    We provide a general discussion of Smolyak's algorithm for the acceleration of scientific computations. The algorithm first appeared in Smolyak's work on multidimensional integration and interpolation. Since then, it has been generalized in multiple directions and has been associated with the keywords: sparse grids, hyperbolic cross approximation, combination technique, and multilevel methods. Variants of Smolyak's algorithm have been employed in the computation of high-dimensional integrals in finance, chemistry, and physics, in the numerical solution of partial and stochastic differential equations, and in uncertainty quantification. Motivated by this broad and ever-increasing range of applications, we describe a general framework that summarizes fundamental results and assumptions in a concise application-independent manner.
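
    To make the combination-technique flavor of Smolyak's algorithm concrete, the sketch below assembles a d-dimensional sparse-grid quadrature as a signed sum of small anisotropic tensor-product rules. This is a generic textbook construction in Python, not code from the paper; the trapezoidal basis rule on the unit cube is an arbitrary choice for illustration.

      import numpy as np
      from itertools import product
      from math import comb

      def tensor_quad(f, levels):
          """Tensor-product trapezoidal rule with 2**l + 1 points per dimension."""
          grids = [np.linspace(0.0, 1.0, 2**l + 1) for l in levels]
          weights = []
          for l, g in zip(levels, grids):
              w = np.full(g.size, 1.0 / 2**l)   # interior weight h = 1 / 2**l
              w[0] *= 0.5
              w[-1] *= 0.5
              weights.append(w)
          total = 0.0
          for idx in product(*(range(g.size) for g in grids)):
              x = np.array([g[i] for g, i in zip(grids, idx)])
              w = np.prod([wt[i] for wt, i in zip(weights, idx)])
              total += w * f(x)
          return total

      def smolyak_quad(f, q, d):
          """Combination technique: signed sum of small anisotropic tensor rules."""
          total = 0.0
          for levels in product(range(q + 1), repeat=d):
              s = sum(levels)
              if q - d + 1 <= s <= q:
                  total += (-1)**(q - s) * comb(d - 1, q - s) * tensor_quad(f, levels)
          return total

    For example, smolyak_quad(lambda x: np.exp(x.sum()), q=6, d=2) approximates the integral of e^(x+y) over the unit square, (e - 1)^2, using far fewer points than the full tensor grid of the same resolution; the advantage grows rapidly with dimension.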

  7. Semiempirical Quantum Chemical Calculations Accelerated on a Hybrid Multicore CPU-GPU Computing Platform.

    Science.gov (United States)

    Wu, Xin; Koslowski, Axel; Thiel, Walter

    2012-07-10

    In this work, we demonstrate that semiempirical quantum chemical calculations can be accelerated significantly by leveraging the graphics processing unit (GPU) as a coprocessor on a hybrid multicore CPU-GPU computing platform. Semiempirical calculations using the MNDO, AM1, PM3, OM1, OM2, and OM3 model Hamiltonians were systematically profiled for three types of test systems (fullerenes, water clusters, and solvated crambin) to identify the most time-consuming sections of the code. The corresponding routines were ported to the GPU and optimized employing both existing library functions and a GPU kernel that carries out a sequence of noniterative Jacobi transformations during pseudodiagonalization. The overall computation times for single-point energy calculations and geometry optimizations of large molecules were reduced by one order of magnitude for all methods, as compared to runs on a single CPU core.

  8. A new 3-D integral code for computation of accelerator magnets

    International Nuclear Information System (INIS)

    Turner, L.R.; Kettunen, L.

    1991-01-01

    For computing accelerator magnets, integral codes have several advantages over finite element codes; far-field boundaries are treated automatically, and the computed fields in the bore region satisfy Maxwell's equations exactly. A new integral code employing edge elements rather than nodal elements has overcome the difficulties associated with earlier integral codes. By the use of field integrals (potential differences) as solution variables, the number of unknowns is reduced to one less than the number of nodes. Two examples, a hollow iron sphere and the dipole magnet of the Advanced Photon Source injector synchrotron, show the capability of the code. The CPU time requirements are comparable to those of three-dimensional (3-D) finite-element codes. Experiments show that in practice it can realize much of the potential CPU time saving that parallel processing makes possible. 8 refs., 4 figs., 1 tab

  9. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing

    Science.gov (United States)

    Fang, Ye; Ding, Yun; Feinstein, Wei P.; Koppelman, David M.; Moreno, Juana; Jarrell, Mark; Ramanujam, J.; Brylinski, Michal

    2016-01-01

    Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249. PMID:27420300

  10. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing.

    Directory of Open Access Journals (Sweden)

    Ye Fang

    Full Text Available Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249.
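
    GeauxDock's sampling engine is described as Monte Carlo; a generic Metropolis acceptance loop over ligand poses looks like the Python sketch below. The energy function and the pose representation here are placeholders standing in for the program's physics- and knowledge-based scoring; nothing below is taken from the GeauxDock source.

      import numpy as np

      def metropolis_dock(energy, pose0, steps=10000, T=1.0, sigma=0.5, seed=0):
          """Generic Metropolis Monte Carlo search over ligand pose parameters."""
          rng = np.random.default_rng(seed)
          pose, e = pose0, energy(pose0)
          best_pose, best_e = pose, e
          for _ in range(steps):
              trial = pose + rng.normal(scale=sigma, size=pose.shape)  # perturb pose
              e_trial = energy(trial)
              # downhill moves always accepted; uphill with Boltzmann probability
              if e_trial <= e or rng.random() < np.exp(-(e_trial - e) / T):
                  pose, e = trial, e_trial
                  if e < best_e:
                      best_pose, best_e = pose, e
          return best_pose, best_e

    Because each energy evaluation is independent of the others within a step, loops of this shape parallelize naturally across CPU cores, Xeon Phi and GPUs.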

  11. Plasma accelerators

    International Nuclear Information System (INIS)

    Bingham, R.; Angelis, U. de; Johnston, T.W.

    1991-01-01

    Recently, attention has focused on charged particle acceleration in a plasma by a fast, large-amplitude, longitudinal electron plasma wave. The plasma beat wave and plasma wakefield accelerators are two efficient ways of producing ultra-high accelerating gradients. Starting with the plasma beat wave accelerator (PBWA) and laser wakefield accelerator (LWFA) schemes and the plasma wakefield accelerator (PWFA), steady progress has been made in theory, simulations and experiments. Computations are presented for the study of the LWFA. (author)

  12. Computing Principal Eigenvectors of Large Web Graphs: Algorithms and Accelerations Related to PageRank and HITS

    Science.gov (United States)

    Nagasinghe, Iranga

    2010-01-01

    This thesis investigates and develops a few acceleration techniques for the search engine algorithms used in PageRank and HITS computations. PageRank and HITS are two highly successful applications of modern linear algebra in computer science and engineering. They constitute the essential technologies behind the immense growth and…
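
    The baseline computation that such techniques accelerate is the power method: repeatedly apply the damped transition operator until the rank vector converges. A minimal dense NumPy version for small, unweighted graphs is sketched below; web-scale implementations use the same iteration over sparse matrix-vector products.

      import numpy as np

      def pagerank(adj, damping=0.85, tol=1e-10, max_iter=200):
          """Power iteration for PageRank on a small unweighted adjacency matrix."""
          n = adj.shape[0]
          out_deg = adj.sum(axis=1)
          # row-normalize; dangling nodes distribute their rank uniformly
          P = np.where(out_deg[:, None] > 0,
                       adj / np.maximum(out_deg, 1)[:, None],
                       1.0 / n).T                      # transpose: column-stochastic
          r = np.full(n, 1.0 / n)
          for _ in range(max_iter):
              r_next = damping * (P @ r) + (1.0 - damping) / n
              done = np.abs(r_next - r).sum() < tol    # L1 convergence test
              r = r_next
              if done:
                  break
          return r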

  13. TU-FG-201-04: Computer Vision in Autonomous Quality Assurance of Linear Accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Yu, H; Jenkins, C; Yu, S; Yang, Y; Xing, L [Stanford University, Stanford, CA (United States)

    2016-06-15

    Purpose: Routine quality assurance (QA) of linear accelerators represents a critical and costly element of a radiation oncology center. Recently, a system was developed to autonomously perform routine quality assurance on linear accelerators. The purpose of this work is to extend this system and contribute computer vision techniques for obtaining quantitative measurements for a monthly multi-leaf collimator (MLC) QA test specified by TG-142, namely leaf position accuracy, and demonstrate extensibility for additional routines. Methods: Grayscale images of a picket fence delivery on a radioluminescent phosphor coated phantom are captured using a CMOS camera. Collected images are processed to correct for camera distortions, rotation and alignment, reduce noise, and enhance contrast. The location of each MLC leaf is determined through logistic fitting and a priori modeling based on knowledge of the delivered beams. Using the data collected and the criteria from TG-142, a decision is made on whether or not the leaf position accuracy of the MLC passes or fails. Results: The locations of all MLC leaf edges are found for three different picket fence images in a picket fence routine to 0.1 mm (1 pixel) precision. The program to correct for image alignment and determine leaf positions requires a runtime of 21–25 seconds for a single picket, and 44–46 seconds for a group of three pickets, on a standard workstation CPU, a 2.2 GHz Intel Core i7. Conclusion: MLC leaf edges were successfully found using techniques in computer vision. With the addition of computer vision techniques to the previously described autonomous QA system, the system is able to quickly perform complete QA routines with minimal human contribution.
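
    The leaf-edge localization step can be illustrated with a small SciPy sketch: fit a logistic (sigmoid) model to the 1-D intensity profile across an edge and take the 50% crossing as the sub-pixel edge position. This is an assumed, simplified stand-in for the paper's logistic fitting with a priori beam models; names are illustrative.

      import numpy as np
      from scipy.optimize import curve_fit

      def logistic(x, lo, hi, x0, k):
          """Sigmoid model of the intensity step across a collimator leaf edge."""
          return lo + (hi - lo) / (1.0 + np.exp(-k * (x - x0)))

      def leaf_edge_position(pixels, intensity):
          """Sub-pixel edge location from a 1-D profile sampled across the edge."""
          p0 = (intensity.min(), intensity.max(), pixels[pixels.size // 2], 1.0)
          popt, _ = curve_fit(logistic, pixels, intensity, p0=p0)
          return popt[2]          # x0, the 50% crossing, taken as the edge

    The fitted positions can then be compared against the planned picket positions and the TG-142 tolerance to produce the pass/fail decision.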

  14. Accelerating Computation of DCM for ERP in MATLAB by External Function Calls to the GPU

    Science.gov (United States)

    Wang, Wei-Jen; Hsieh, I-Fan; Chen, Chun-Chuan

    2013-01-01

    This study aims to improve the performance of Dynamic Causal Modelling for Event Related Potentials (DCM for ERP) in MATLAB by using external function calls to a graphics processing unit (GPU). DCM for ERP is an advanced method for studying neuronal effective connectivity. DCM utilizes an iterative procedure, the expectation maximization (EM) algorithm, to find the optimal parameters given a set of observations and the underlying probability model. As the EM algorithm is computationally demanding and the analysis faces possible combinatorial explosion of models to be tested, we propose a parallel computing scheme using the GPU to achieve a fast estimation of DCM for ERP. The computation of DCM for ERP is dynamically partitioned and distributed to threads for parallel processing, according to the DCM model complexity and the hardware constraints. The performance efficiency of this hardware-dependent thread arrangement strategy was evaluated using the synthetic data. The experimental data were used to validate the accuracy of the proposed computing scheme and quantify the time saving in practice. The simulation results show that the proposed scheme can accelerate the computation by a factor of 155 for the parallel part. For experimental data, the speedup factor is about 7 per model on average, depending on the model complexity and the data. This GPU-based implementation of DCM for ERP gives qualitatively the same results as the original MATLAB implementation does at the group level analysis. In conclusion, we believe that the proposed GPU-based implementation is very useful for users as a fast screening tool to select the most likely model and may provide implementation guidance for possible future clinical applications such as online diagnosis. PMID:23840507

  15. Using Equation-Free Computation to Accelerate Network-Free Stochastic Simulation of Chemical Kinetics.

    Science.gov (United States)

    Lin, Yen Ting; Chylek, Lily A; Lemons, Nathan W; Hlavacek, William S

    2018-06-21

    The chemical kinetics of many complex systems can be concisely represented by reaction rules, which can be used to generate reaction events via a kinetic Monte Carlo method that has been termed network-free simulation. Here, we demonstrate accelerated network-free simulation through a novel approach to equation-free computation. In this process, variables are introduced that approximately capture system state. Derivatives of these variables are estimated using short bursts of exact stochastic simulation and finite differencing. The variables are then projected forward in time via a numerical integration scheme, after which a new exact stochastic simulation is initialized and the whole process repeats. The projection step increases efficiency by bypassing the firing of numerous individual reaction events. As we show, the projected variables may be defined as populations of building blocks of chemical species. The maximal number of connected molecules included in these building blocks determines the degree of approximation. Equation-free acceleration of network-free simulation is found to be both accurate and efficient.
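
    The projection step described above can be sketched generically: run a short burst of exact simulation, estimate coarse derivatives by finite differences, then leap forward in time. In the Python fragment below, burst() is a hypothetical stand-in for the exact network-free stochastic simulator, and the coarse variables are assumed to be the building-block populations mentioned in the abstract.

      import numpy as np

      def projective_step(state, burst, dt_micro, n_micro, dt_macro):
          """One coarse step of equation-free projective integration."""
          traj = burst(state, dt_micro, n_micro)     # short exact stochastic burst
          dsdt = (traj[-1] - traj[-2]) / dt_micro    # finite-difference derivative
          return traj[-1] + dt_macro * dsdt          # leap over many reaction events
          # caller then re-initializes an exact simulation from the projected state

    Each projective leap bypasses the firing of the many individual reaction events that an exact simulation would otherwise have to execute one by one.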

  16. X-ray beam hardening correction for measuring density in linear accelerator industrial computed tomography

    International Nuclear Information System (INIS)

    Zhou Rifeng; Wang Jue; Chen Weimin

    2009-01-01

    Since X-ray attenuation is approximately proportional to material density, it is possible to measure inner density accurately from Industrial Computed Tomography (ICT) images. In practice, however, a number of factors, including the non-linear effects of beam hardening and diffuse scattered radiation, complicate the quantitative measurement of density variations in materials. This paper builds on the linearization method of beam hardening correction and uses polynomial fitting coefficients, obtained from the curvature of polychromatic beam data for iron, to fit other materials. Through theoretical deduction, the paper shows that the density measurement error is less than 2% if pre-filters are used to confine the linear accelerator spectrum mainly to the range 0.3 MeV to 3 MeV. An experiment was set up on an ICT system with a 9 MeV electron linear accelerator, and the result is satisfactory. This technique makes beam hardening correction easy and simple, and it is valuable for ICT density measurement and for using CT images to recognize materials. (authors)
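
    The linearization idea can be illustrated in a few lines: from a calibration scan of known thicknesses, fit a polynomial mapping the measured polychromatic projection values onto the ideal straight line of a monochromatic beam, then apply that mapping to all projections before reconstruction. The sketch below is a generic NumPy version under those assumptions, not the authors' code.

      import numpy as np

      def fit_linearization(p_poly, thickness, mu_eff, order=3):
          """Map measured polychromatic projections onto an ideal linear response.

          p_poly: -log attenuation measured for calibration steps of known
          thickness; mu_eff: target effective (monochromatic) coefficient.
          """
          p_mono = mu_eff * np.asarray(thickness)    # ideal beam: exactly linear
          return np.poly1d(np.polyfit(p_poly, p_mono, order))

      # correct = fit_linearization(calib_p, calib_t, mu_eff)   # from a step wedge
      # projections_lin = correct(projections_raw)              # then reconstruct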

  17. Continuous Analog of Accelerated OS-EM Algorithm for Computed Tomography

    Directory of Open Access Journals (Sweden)

    Kiyoko Tateishi

    2017-01-01

    Full Text Available The maximum-likelihood expectation-maximization (ML-EM) algorithm is used for iterative image reconstruction (IIR) and performs well with respect to the inverse problem as cross-entropy minimization in computed tomography. To accelerate the convergence rate of ML-EM, the ordered-subsets expectation-maximization (OS-EM) algorithm with a power factor is effective. In this paper, we propose a continuous analog of the power-based accelerated OS-EM algorithm. The continuous-time image reconstruction (CIR) system is described by nonlinear differential equations with piecewise smooth vector fields generated by a cyclic switching process. A numerical discretization of the differential equation using the geometric multiplicative first-order expansion of the nonlinear vector field leads to an exactly equivalent iterative formula of the power-based OS-EM. The convergence of nonnegatively constrained solutions to a globally stable equilibrium is guaranteed by the Lyapunov theorem for consistent inverse problems. We illustrate through numerical experiments that the convergence characteristics of the continuous system are of the highest quality compared with those of the discretization methods. We clarify how closely the discretization method must approximate the solution of the CIR in order to design a better IIR method.
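
    For orientation, the discrete iteration that the continuous system reproduces is the multiplicative ML-EM update applied subset-by-subset, optionally raised to a power factor to accelerate convergence. The following NumPy sketch shows that baseline update for a dense toy system matrix; it is a generic illustration, not the paper's implementation.

      import numpy as np

      def os_em(A, y, subsets, n_iter=10, power=1.0):
          """Ordered-subsets EM for y ~ Poisson(A x) with an optional power factor."""
          x = np.ones(A.shape[1])
          for _ in range(n_iter):
              for rows in subsets:                   # one cyclic pass per iteration
                  As = A[rows]
                  ratio = y[rows] / np.maximum(As @ x, 1e-12)
                  update = (As.T @ ratio) / np.maximum(As.sum(axis=0), 1e-12)
                  x *= update**power                 # power > 1 accelerates ML-EM
          return x

    Setting power = 1 and a single subset containing all rows recovers plain ML-EM; the continuous analog in the paper corresponds to letting the step of this multiplicative update become infinitesimal.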

  18. Accelerating Approximate Bayesian Computation with Quantile Regression: application to cosmological redshift distributions

    Science.gov (United States)

    Kacprzak, T.; Herbel, J.; Amara, A.; Réfrégier, A.

    2018-02-01

    Approximate Bayesian Computation (ABC) is a method to obtain a posterior distribution without a likelihood function, using simulations and a set of distance metrics. For that reason, it has recently been gaining popularity as an analysis tool in cosmology and astrophysics. Its drawback, however, is a slow convergence rate. We propose a novel method, which we call qABC, to accelerate ABC with Quantile Regression. In this method, we create a model of quantiles of the distance measure as a function of input parameters. This model is trained on a small number of simulations and estimates which regions of the prior space are likely to be accepted into the posterior. Other regions are then immediately rejected. This procedure is then repeated as more simulations become available. We apply it to the practical problem of estimating the redshift distribution of cosmological samples, using forward modelling developed in previous work. The qABC method converges to nearly the same posterior as the basic ABC. It uses, however, only 20% of the number of simulations compared to basic ABC, achieving a fivefold gain in execution time for our problem. For other problems the acceleration rate may vary; it depends on how close the prior is to the final posterior. We discuss possible improvements and extensions to this method.
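
    The qABC idea of pre-screening the prior with a quantile model can be sketched with an off-the-shelf quantile regressor. Below, scikit-learn's gradient boosting with a quantile loss stands in for whatever regression model the authors use (an assumption); candidates whose predicted lower distance quantile already exceeds the ABC acceptance threshold are discarded without running a simulation.

      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor

      def qabc_screen(theta_train, dist_train, theta_cand, eps, alpha=0.1):
          """Keep only candidates that plausibly pass the ABC distance threshold."""
          model = GradientBoostingRegressor(loss="quantile", alpha=alpha)
          model.fit(theta_train, dist_train)         # quantile of distance vs. theta
          q_low = model.predict(theta_cand)          # predicted alpha-quantile
          keep = q_low <= eps                        # others rejected, no simulation
          return theta_cand[keep]

    The surviving candidates are then simulated exactly as in basic ABC, and the quantile model is retrained as those new simulations accumulate.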

  19. Distribution of computer functionality for accelerator control at the Brookhaven AGS

    International Nuclear Information System (INIS)

    Stevens, A.; Clifford, T.; Frankel, R.

    1985-01-01

    A set of physical and functional system components and their interconnection protocols have been established for all controls work at the AGS. Portions of these designs were tested as part of enhanced operation of the AGS as a source of polarized protons, and additional segments will be implemented during the continuing construction efforts which are adding heavy-ion capability to our facility. Included in our efforts are the following computer and control system elements: a broadband local area network, which embodies modems, transmission systems and branch interface units; a hierarchical layer, which performs certain database and watchdog/alarm functions; a group of workstation processors (Apollos), which perform the function of traditional minicomputer hosts; and a layer which provides both real-time control and standardization functions for accelerator devices and instrumentation. Database and other accelerator functionality is assigned to the most appropriate level within our network for real-time performance, long-term utility, and orderly growth.

  20. Computer-mediated communication as a channel for social resistance: The strategic side of SIDE

    NARCIS (Netherlands)

    Spears, R; Lea, M; Corneliussen, RA; Postmes, T; Ter Haar, W

    2002-01-01

    In two studies, the authors tested predictions derived from the social identity model of deindividuation effects (SIDE) concerning the potential of computer-mediated communication (CMC) to serve as a means to resist powerful out-groups. Earlier research using the SIDE model indicates that the...

  1. Flexusi Interface Builder For Computer Based Accelerator Monitoring And Control System

    CERN Document Server

    Kurakin, V G; Kurakin, P V

    2004-01-01

    We have developed computer code for designing any desired graphical user interface for a monitoring and control system at the executable level. This means that an operator can build up a measurement console consisting of virtual devices before, or even during, a real experiment without recompiling source files. Such functionality offers a number of advantages compared with traditional programming. First of all, there is no risk of introducing bugs into the source code. Another important point is that program developers and operator staff do not interfere with each other in developing the ultimate product (the measurement console). Thus, a small team without a detailed project plan can design even very complicated monitoring and control systems. For these reasons, the suggested approach is especially helpful for large complexes to be monitored and controlled, accelerators being among them. The program code consists of several modules responsible for data acquisition, control and representation. Borland C++ Builder technologies based on VCL...

  2. Accelerated Synchrotron X-ray Diffraction Data Analysis on a Heterogeneous High Performance Computing System

    Energy Technology Data Exchange (ETDEWEB)

    Qin, J; Bauer, M A, E-mail: qin.jinhui@gmail.com, E-mail: bauer@uwo.ca [Computer Science Department, University of Western Ontario, London, ON N6A 5B7 (Canada)

    2010-11-01

    The analysis of synchrotron X-ray Diffraction (XRD) data has been used by scientists and engineers to understand and predict properties of materials. However, the large volume of XRD image data and the intensive computations involved in the data analysis make it hard for researchers to quickly reach any conclusions about the images from an experiment when using conventional XRD data analysis software. Synchrotron time is valuable, and delays in XRD data analysis can impact decisions about subsequent experiments or about the materials being investigated. In order to improve the data analysis performance, ideally to achieve near real time data analysis during an XRD experiment, we designed and implemented software for accelerated XRD data analysis. The software has been developed for a heterogeneous high performance computing (HPC) system, comprised of IBM PowerXCell 8i processors and Intel quad-core Xeon processors. This paper describes the software and reports on the improved performance. The results indicate that it is possible for XRD data to be analyzed at the rate it is being produced.

  3. Accelerated Synchrotron X-ray Diffraction Data Analysis on a Heterogeneous High Performance Computing System

    International Nuclear Information System (INIS)

    Qin, J; Bauer, M A

    2010-01-01

    The analysis of synchrotron X-ray Diffraction (XRD) data has been used by scientists and engineers to understand and predict properties of materials. However, the large volume of XRD image data and the intensive computations involved in the data analysis make it hard for researchers to quickly reach any conclusions about the images from an experiment when using conventional XRD data analysis software. Synchrotron time is valuable, and delays in XRD data analysis can impact decisions about subsequent experiments or about the materials being investigated. In order to improve the data analysis performance, ideally to achieve near real time data analysis during an XRD experiment, we designed and implemented software for accelerated XRD data analysis. The software has been developed for a heterogeneous high performance computing (HPC) system, comprised of IBM PowerXCell 8i processors and Intel quad-core Xeon processors. This paper describes the software and reports on the improved performance. The results indicate that it is possible for XRD data to be analyzed at the rate it is being produced.

  4. SOLVING BY PARALLEL COMPUTATION THE POISSON PROBLEM FOR HIGH INTENSITY BEAMS IN CIRCULAR ACCELERATORS

    International Nuclear Information System (INIS)

    LUCCIO, A.U.; D'IMPERIO, N.L.; SAMULYAK, R.; BEEBE-WANG, J.

    2001-01-01

    Simulation of high intensity accelerators leads to the solution of the Poisson equation, to calculate space charge forces in the presence of acceleration chamber walls. We reduced the problem to "two-and-a-half" dimensions for long particle bunches, characteristic of large circular accelerators, and applied the results to the tracking code Orbit.
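
    A space-charge Poisson solve on a 2-D transverse slice (the "two-and-a-half dimensions" picture: a stack of 2-D slices along the long bunch) can be sketched with an FFT-based solver. The toy below assumes periodic boundaries for brevity; the paper's point is precisely that the conducting chamber walls must be honored, so a production solver would replace this boundary treatment.

      import numpy as np

      def poisson_fft_2d(rho, dx, dy):
          """Solve laplacian(phi) = -rho/eps0 on a periodic 2-D slice via FFT."""
          eps0 = 8.8541878128e-12
          ny, nx = rho.shape
          kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=dx)
          ky = 2.0 * np.pi * np.fft.fftfreq(ny, d=dy)
          k2 = kx[None, :]**2 + ky[:, None]**2
          k2[0, 0] = 1.0                       # dodge division by zero (mean mode)
          phi_hat = np.fft.fft2(rho) / (eps0 * k2)
          phi_hat[0, 0] = 0.0                  # fix the arbitrary potential offset
          return np.real(np.fft.ifft2(phi_hat))

    In a parallel code, each longitudinal slice can be solved independently on its own processor, which is what makes the two-and-a-half-dimensional reduction attractive.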

  5. Convergence acceleration of two-phase flow calculations in FLICA-4. A thermal-hydraulic 3D computer code

    International Nuclear Information System (INIS)

    Toumi, I.

    1995-01-01

    Time requirements for 3D two-phase flow steady-state calculations are generally long. Usually, numerical methods for steady-state problems are iterative, consisting of time-like methods that are marched to a steady state. Based on the eigenvalue spectrum of the iteration matrix for various flow configurations, two convergence acceleration techniques are discussed: over-relaxation and eigenvalue annihilation. These methods were applied to accelerate the convergence of three-dimensional steady-state two-phase flow calculations within the FLICA-4 computer code. The acceleration methods are easy to implement, and no extra computer memory is required. Successful results are presented for various test problems, and a saving of 30 to 50% in CPU time has been achieved. (author). 10 refs., 4 figs
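
    Of the two techniques, over-relaxation is the simplest to illustrate: extrapolate each fixed-point (time-marching) update along its own direction. The Python sketch below shows the generic scheme; G here is an abstract stand-in for one pseudo-time step of a code like FLICA-4, and omega = 1.5 is an arbitrary illustrative choice.

      import numpy as np

      def over_relaxed_fixed_point(G, x0, omega=1.5, tol=1e-8, max_iter=10000):
          """Accelerate the fixed-point iteration x = G(x) by over-relaxation."""
          x = np.asarray(x0, dtype=float)
          for _ in range(max_iter):
              x_next = x + omega * (G(x) - x)   # omega = 1 is the plain iteration
              if np.linalg.norm(x_next - x) < tol:
                  return x_next
              x = x_next
          return x

    The optimal omega depends on the eigenvalue spectrum of the iteration, which is exactly why the paper analyzes that spectrum for various flow configurations before choosing an acceleration strategy.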

  6. Subcritical set coupled to accelerator (ADS) for transmutation of radioactive wastes: an approach of computational modelling

    International Nuclear Information System (INIS)

    Torres, Mirta B.; Dominguez, Dany S.

    2013-01-01

    Nuclear fission devices coupled to particle accelerators (ADS) are being widely studied. These devices have several applications, including nuclear waste transmutation and hydrogen production, both applications with strong social and environmental impact. The essence of this work was to model an ADS geometry composed of small TRISO fuel particles loaded with a mixture of MOX (uranium and thorium) and a uranium spallation target, using probabilistic computational modeling methods, in particular the MCNPX 2.6e program, to evaluate the physical characteristics of the device and its transmutation capability. From the characterization of the spallation target, it can be concluded that the production of neutrons per incident proton increases with the dimensions of the spallation target (thickness and radius) until the maximum production of neutrons per incident proton is reached, the so-called saturation region. The results obtained in modeling the pebble-bed ADS device with respect to the isotopic variation of the plutonium isotopes and minor actinides considered in the analysis revealed that the accumulated masses of the plutonium isotopes and minor actinides increase for the subcritical configuration considered. In the particular case of the isotope 239Pu, a reduction of the mass is observed from a burnup time of 99 days onward. Increasing the power of the core, considering tungsten and lead spallation targets, is among the key future developments of this work.

  7. QALMA: A computational toolkit for the analysis of quality protocols for medical linear accelerators in radiation therapy

    Science.gov (United States)

    Rahman, Md Mushfiqur; Lei, Yu; Kalantzis, Georgios

    2018-01-01

    Quality assurance (QA) for medical linear accelerators (linacs) is one of the primary concerns in external beam radiation therapy. Continued advancements in clinical accelerators and computer control technology make the QA procedures more complex and time-consuming, often requiring dedicated software together with specific phantoms. To ameliorate that matter, we introduce QALMA (Quality Assurance for Linac with MATLAB), a MATLAB toolkit which aims to simplify the quantitative analysis of QA for linacs, including Star-Shot analysis, the Picket Fence test, the Winston-Lutz test, Multileaf Collimator (MLC) log file analysis and verification of the light and radiation field coincidence test.

  8. Strategic Adaptation

    DEFF Research Database (Denmark)

    Andersen, Torben Juul

    2015-01-01

    This article provides an overview of theoretical contributions that have influenced the discourse around strategic adaptation, including contingency perspectives, strategic fit reasoning, decision structure, information processing, corporate entrepreneurship, and strategy process. The related concepts of strategic renewal, dynamic managerial capabilities, dynamic capabilities, and strategic response capabilities are discussed and contextualized against strategic responsiveness. The insights derived from this article are used to outline the contours of a dynamic process of strategic adaptation. This model incorporates elements of central strategizing, autonomous entrepreneurial behavior, interactive information processing, and open communication systems that enhance the organization's ability to observe exogenous changes and respond effectively to them.

  9. The Computer Program LIAR for Beam Dynamics Calculations in Linear Accelerators

    International Nuclear Information System (INIS)

    Assmann, R.W.; Adolphsen, C.; Bane, K.; Raubenheimer, T.O.; Siemann, R.H.; Thompson, K.

    2011-01-01

    Linear accelerators are the central components of the proposed next generation of linear colliders. They need to provide acceleration of up to 750 GeV per beam while maintaining very small normalized emittances. Standard simulation programs, mainly developed for storage rings, do not meet the specific requirements for high energy linear accelerators. We present a new program LIAR ('LInear Accelerator Research code') that includes wakefield effects, a 6D coupled beam description, specific optimization algorithms and other advanced features. Its modular structure allows it to be used and extended easily for different purposes. The program is available for UNIX workstations and Windows PCs. It can be applied to a broad range of accelerators. We present examples of simulations for the SLC and NLC.

  10. Rigorous bounds on survival times in circular accelerators and efficient computation of fringe-field transfer maps

    International Nuclear Information System (INIS)

    Hoffstaetter, G.H.

    1994-12-01

    Analyzing the stability of particle motion in storage rings contributes to the general field of stability analysis in weakly nonlinear motion. A method which we call pseudo invariant estimation (PIE) is used to compute lower bounds on the survival time in circular accelerators. The pseudo invariants needed for this approach are computed via nonlinear perturbative normal form theory, and the required global maxima of the highly complicated multivariate functions could only be rigorously bounded with an extension of interval arithmetic. The bounds on the survival times are large enough to be relevant; the same is true for the lower bounds on dynamic apertures, which can also be computed. The PIE method can lead to novel design criteria with the objective of maximizing the survival time. A major effort in the direction of rigorous predictions only makes sense if accurate models of accelerators are available. Fringe fields often have a significant influence on optical properties, but the computation of fringe-field maps by DA-based integration is slower by several orders of magnitude than DA evaluation of the propagator for main-field maps. A novel computation of fringe-field effects called symplectic scaling (SYSCA) is introduced. It exploits the advantages of Lie transformations, generating functions, and scaling properties and is extremely accurate. The computation of fringe-field maps is typically made nearly two orders of magnitude faster. (orig.)

  11. ISLAM PROJECT: Interface between the signals from various experiments of a Van de Graaff accelerator and a PDP 11/44 computer

    International Nuclear Information System (INIS)

    Martinez Piquer, T. A.; Yuste Santos, C.

    1986-01-01

    This paper describes an interface between the signals from an in-beam experiment at a Van de Graaff accelerator and a PDP 11/44 computer. The information corresponding to one spectrum is taken from a digital voltammeter and is processed by means of equipment controlled by an M6809 microprocessor. The software package has been developed in assembly language and has a size of 1/2 K. (Author) 12 refs.

  12. Investigation of acceleration effects on missile aerodynamics using computational fluid dynamics

    CSIR Research Space (South Africa)

    Gledhill, Irvy MA

    2009-01-01

    Full Text Available In this paper the authors describe the implementation and validation of arbitrarily moving reference frames in the block-structured CFD code EURANUS. The paper also presents results from calculations on two applications involving accelerating...

  13. Approach to the open advanced facilities initiative for innovation (strategic use by industry) at the University of Tsukuba, Tandem Accelerator Complex

    International Nuclear Information System (INIS)

    Sasa, K.; Tagishi, Y.; Naramoto, H.; Kudo, H.; Kita, E.

    2010-01-01

    The University of Tsukuba Tandem Accelerator Complex (UTTAC) operates the 12UD Pelletron tandem accelerator and the 1 MV Tandetron accelerator for the University's inter-departmental education and research. We have actively advanced collaborative research with other research institutes and industrial users. Since the Open Advanced Facilities Initiative for Innovation by the Ministry of Education, Culture, Sports, Science and Technology started in 2007, 12 industrial experiments have been carried out at the UTTAC. This report describes the efforts of the University's accelerator facility to attract industrial users. (author)

  14. Analysis of Movement Acceleration of Down's Syndrome Teenagers Playing Computer Games.

    Science.gov (United States)

    Carrogi-Vianna, Daniela; Lopes, Paulo Batista; Cymrot, Raquel; Hengles Almeida, Jefferson Jesus; Yazaki, Marcos Lomonaco; Blascovi-Assis, Silvana Maria

    2017-12-01

    This study aimed to evaluate movement acceleration characteristics in adolescents with Down syndrome (DS) and typical development (TD) while playing bowling and golf videogames on the Nintendo® Wii™. The sample comprised 21 adolescents diagnosed with DS and 33 with TD, of both sexes, between 10 and 14 years of age. The arm swing accelerations of the dominant upper limb were collected as measures during the bowling and the golf games. The first valid measurement, verified by the software readings, recorded at the start of each of the games, was used in the analysis. In the bowling game, the groups presented significant statistical differences, with the maximum (M) peaks of acceleration for the Male Control Group (MCG) (M = 70.37) and Female Control Group (FCG) (M = 70.51) when compared with the Male Down Syndrome Group (MDSG) (M = 45.33) and Female Down Syndrome Group (FDSG) (M = 37.24). In the golf game the groups also presented significant statistical differences, the only difference being that the maximum peaks of acceleration for both male groups were superior compared with the female groups, MCG (M = 74.80) and FCG (M = 56.80), as well as in MDSG (M = 45.12) and in FDSG (M = 30.52). It was possible to use accelerometry to evaluate the movement acceleration characteristics of teenagers diagnosed with DS during virtual bowling and golf games played on the Nintendo Wii console.

  15. Steady-state natural circulation analysis with computational fluid dynamic codes of a liquid metal-cooled accelerator driven system

    International Nuclear Information System (INIS)

    Abanades, A.; Pena, A.

    2009-01-01

    A new innovative nuclear installation is under research in the nuclear community for its potential application to nuclear waste management and, above all, for its capability to enhance the sustainability of nuclear energy in the future as a component of a new nuclear fuel cycle, in which its efficiency in terms of primary uranium ore utilization and radioactive waste generation will be improved. Such new nuclear installations are called accelerator driven systems (ADS) and are the result of a profitable symbiosis between accelerator technology, high-energy physics and reactor technology. Many ADS concepts are based on the utilization of heavy liquid metal (HLM) coolants due to their neutronic and thermo-physical properties. Moreover, such coolants permit operation in free circulation mode, one of the main aims of passive systems. In this paper, such an operating regime is analysed for a proposed ADS design using computational fluid dynamics (CFD).

  16. Strategic Entrepreneurship

    DEFF Research Database (Denmark)

    Klein, Peter G.; Barney, Jay B.; Foss, Nicolai Juul

    Strategic entrepreneurship is a newly recognized field that draws, not surprisingly, from the fields of strategic management and entrepreneurship. The field emerged officially with the 2001 special issue of the Strategic Management Journal on "strategic entrepreneurship"; the first dedicated periodical, the Strategic Entrepreneurship Journal, appeared in 2007. Strategic entrepreneurship is built around two core ideas. (1) Strategy formulation and execution involve attributes that are fundamentally entrepreneurial, such as alertness, creativity, and judgment, and entrepreneurs try to create and capture value through resource acquisition and competitive positioning. (2) Opportunity-seeking and advantage-seeking, the former the central subject of the entrepreneurship field, the latter the central subject of the strategic management field, are processes that should be considered jointly. This entry...

  17. The new generation of PowerPC VMEbus front end computers for the CERN SPS and LEP accelerators control system

    OpenAIRE

    Van den Eynden, M

    1995-01-01

    The CERN SPS and LEP PowerPC project is aimed at introducing a new generation of PowerPC VMEbus processor modules running the LynxOS real-time operating system. This new generation of front end computers using the state-of-the-art microprocessor technology will first replace the obsolete Xenix PC based systems (about 140 installations) successfully used since 1988 to control the LEP accelerator. The major issues addressed in the scope of this large scale project are the technical specificatio...

  18. The new generation of PowerPC VMEbus front end computers for the CERN SPS and LEP accelerators control system

    OpenAIRE

    Charrue, P; Bland, A; Ghinet, F; Ribeiro, P

    1995-01-01

    The CERN SPS and LEP PowerPC project is aimed at introducing a new generation of PowerPC VMEbus processor modules running the LynxOS real-time operating system. This new generation of front end computers using the state-of-the-art microprocessor technology will first replace the obsolete XENIX PC based systems (about 140 installations) successfully used since 1988 to control the LEP accelerator. The major issues addressed in the scope of this large scale project are the technical specificatio...

  19. Electron Fermi acceleration in collapsing magnetic traps: Computational and analytical models

    International Nuclear Information System (INIS)

    Gisler, G.; Lemons, D.

    1990-01-01

    The authors consider the heating and acceleration of electrons trapped on magnetic field lines between approaching magnetic mirrors. Such a collapsing magnetic trap and consequent electron energization can occur whenever a curved (or straight) flux tube drifts into a relatively straight (or curved) perpendicular shock. The relativistic, three-dimensional, collisionless test particle simulations show that an initial thermal electron distribution is bulk heated while a few individual electrons are accelerated to many times their original energy before they escape the trap. Upstream field-aligned beams and downstream pancake distributions perpendicular to the field are predicted. In the appropriate limit the simulation results agree well with a nonrelativistic analytic model of the distribution of escaping electrons which is based on the first adiabatic invariant and energy conservation between collisions with the mirrors. Space science and astrophysical applications are discussed
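
    A one-dimensional caricature of the energization mechanism is easy to simulate: a particle bouncing between two mirrors that approach each other gains speed, Fermi-style, with every head-on reflection, until it leaves the loss cone. The nonrelativistic Python sketch below is a toy stand-in for the paper's relativistic 3-D test-particle runs; the escape threshold v_esc is an assumed proxy for the magnetic-mirror loss cone.

      def collapsing_trap(v0, L0, u, v_esc, max_bounces=1_000_000):
          """1-D Fermi heating between two mirrors that each approach at speed u."""
          v, L, t, bounces = abs(v0), L0, 0.0, 0
          while v < v_esc and L > 0.0 and bounces < max_bounces:
              dt = L / (v + u)        # time to reach the oncoming mirror
              t += dt
              L -= 2.0 * u * dt       # both mirrors close in during the transit
              v += 2.0 * u            # elastic reflection off a moving wall
              bounces += 1
          return v, t, bounces        # final speed, escape time, bounce count

    Even this toy reproduces the qualitative result: slow particles bounce many times and are bulk heated, while a few that start fast escape early with large energy gains.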

  20. Computer study of isotope production for medical and industrial applications in high power accelerators

    Science.gov (United States)

    Mashnik, S. G.; Wilson, W. B.; Van Riper, K. A.

    2001-07-01

    Methods for radionuclide production calculation in a high power proton accelerator have been developed and applied to study production of 22 isotopes. These methods are readily applicable both to accelerator and reactor environments and to the production of other radioactive and stable isotopes. We have also developed methods for evaluating cross sections from a wide variety of sources into a single cross section set and have produced an evaluated library covering about a third of all natural elements that may be expanded to other reactions. A 684 page detailed report on this study, with 37 tables and 264 color figures, is available on the Web at http://t2.lanl.gov/publications/.

  1. A computer study of radionuclide production in high power accelerators for medical and industrial applications

    Science.gov (United States)

    Van Riper, K. A.; Mashnik, S. G.; Wilson, W. B.

    2001-05-01

    Methods for radionuclide production calculation in a high power proton accelerator have been developed and applied to study the production of 22 isotopes by high-energy protons and neutrons. These methods are readily applicable to accelerator and reactor environments other than the particular model we considered, and to the production of other radioactive and stable isotopes. We have also developed methods for evaluating cross sections from a wide variety of sources into a single cross section set and have produced an evaluated library covering about a third of all natural elements. These methods are also applicable to an expanded set of reactions. A 684 page detailed report on this study, with 37 tables and 264 color figures, is available on the Web at http://t2.lanl.gov/publications/publications.html, or, if not accessible, in hard copy from the authors.

  2. Computational and experimental investigation of plasma deflagration jets and detonation shocks in coaxial plasma accelerators

    Science.gov (United States)

    Subramaniam, Vivek; Underwood, Thomas C.; Raja, Laxminarayan L.; Cappelli, Mark A.

    2018-02-01

    We present a magnetohydrodynamic (MHD) numerical simulation to study the physical mechanisms underlying plasma acceleration in a coaxial plasma gun. Coaxial plasma accelerators are known to exhibit two distinct modes of operation depending on the delay between gas loading and capacitor discharging. Shorter delays lead to a high velocity plasma deflagration jet and longer delays produce detonation shocks. During a single operational cycle that typically consists of two discharge events, the plasma acceleration exhibits a behavior characterized by a mode transition from deflagration to detonation. The first of the discharge events, a deflagration that occurs when the discharge expands into an initially evacuated domain, requires a modification of the standard MHD algorithm to account for rarefied regions of the simulation domain. The conventional approach of using a low background density gas to mimic the vacuum background results in the formation of an artificial shock, inconsistent with the physics of free expansion. To this end, we present a plasma-vacuum interface tracking framework with the objective of predicting a physically consistent free expansion, devoid of the spurious shock obtained with the low background density approach. The interface tracking formulation is integrated within the MHD framework to simulate the plasma deflagration and the second discharge event, a plasma detonation, formed due to its initiation in a background prefilled with gas remnant from the deflagration. The mode transition behavior obtained in the simulations is qualitatively compared to that observed in the experiments using high framing rate Schlieren videography. The deflagration mode is further investigated to understand the jet formation process and the axial velocities obtained are compared against experimentally obtained deflagration plasma front velocities. The simulations are also used to provide insight into the conditions responsible for the generation and sustenance of

  3. Computer simulations of a single-laser double-gas-jet wakefield accelerator concept

    Directory of Open Access Journals (Sweden)

    R. G. Hemker

    2002-04-01

    Full Text Available We report in this paper on full scale 2D particle-in-cell simulations investigating laser wakefield acceleration. First, we describe our findings of electron beam generation by a laser propagating through a single gas jet. Using realistic parameters which are relevant for the experimental setup in our laboratory, we find that the electron beam resulting after the propagation of a 0.8 μm, 50 fs laser through a 1.5 mm gas jet has properties that would make it useful for further acceleration. Our simulations show that the electron beam is generated when the laser exits the gas jet, and the properties of the generated beam, especially its energy, depend only weakly on most properties of the gas jet. We therefore propose to use the first gas jet as a plasma cathode and then use a second gas jet placed immediately behind the first to provide additional acceleration. Our simulations of this proposed setup indicate the feasibility of this idea and also suggest ways to optimize the quality of the resulting beam.

  4. Accelerating statistical image reconstruction algorithms for fan-beam x-ray CT using cloud computing

    Science.gov (United States)

    Srivastava, Somesh; Rao, A. Ravishankar; Sheinin, Vadim

    2011-03-01

    Statistical image reconstruction algorithms potentially offer many advantages to x-ray computed tomography (CT), e.g. lower radiation dose. But, their adoption in practical CT scanners requires extra computation power, which is traditionally provided by incorporating additional computing hardware (e.g. CPU-clusters, GPUs, FPGAs etc.) into a scanner. An alternative solution is to access the required computation power over the internet from a cloud computing service, which is orders-of-magnitude more cost-effective. This is because users only pay a small pay-as-you-go fee for the computation resources used (i.e. CPU time, storage etc.), and completely avoid purchase, maintenance and upgrade costs. In this paper, we investigate the benefits and shortcomings of using cloud computing for statistical image reconstruction. We parallelized the most time-consuming parts of our application, the forward and back projectors, using MapReduce, the standard parallelization library on clouds. From preliminary investigations, we found that a large speedup is possible at a very low cost. But, communication overheads inside MapReduce can limit the maximum speedup, and a better MapReduce implementation might become necessary in the future. All the experiments for this paper, including development and testing, were completed on the Amazon Elastic Compute Cloud (EC2) for less than $20.

  5. Strategizing Communication

    DEFF Research Database (Denmark)

    Gulbrandsen, Ib Tunby; Just, Sine Nørholm

    beyond, but not past instrumental, rational plans in order to become better able to understand and manage the concrete, incremental practices and contexts in which communication becomes strategic. Thus, we argue that although strategic communicators do (and should) make plans, a plan in itself does... of the specific communicative disciplines and practices employed by the organization and/or its individual members, be they marketing, public relations, corporate communication, branding, public affairs or social advocacy. In all cases, strategic communicators do well to focus more on the process of communicating... for understanding and managing strategic communication processes...

  6. Computer-controlled backscattering and sputtering experiment using a heavy-ion accelerator

    International Nuclear Information System (INIS)

    Becker, H.; Birnbaum, M.; Degenhardt, K.H.; Mertens, P.; Tschammer, V.

    1978-12-01

    Control and data acquisition by a PDP 11/40 computer and CAMAC instrumentation are reported for an experiment developed to measure sputtering yields and energy losses for heavy 100-300 keV ions in thin metal foils. Besides a quadrupole mass filter or a bending magnet, a multichannel analyser is coupled to the computer, so that pulse-height analysis can also be performed under computer control. The CAMAC instrumentation and measuring programs are built in modular form to enable easy application to other experimental problems. (orig.)

  7. Advanced quadrature sets and acceleration and preconditioning techniques for the discrete ordinates method in parallel computing environments

    Science.gov (United States)

    Longoni, Gianluca

    In the nuclear science and engineering field, radiation transport calculations play a key role in the design and optimization of nuclear devices. The linear Boltzmann equation describes the angular, energy and spatial variations of the particle or radiation distribution. The discrete ordinates method (SN) is the most widely used technique for solving the linear Boltzmann equation. However, for realistic problems, the memory and computing time required necessitate the use of supercomputers. This research is devoted to the development of new formulations for the SN method, especially for highly angular dependent problems, in parallel environments. The present research work addresses two main issues affecting the accuracy and performance of SN transport theory methods: quadrature sets and acceleration techniques. New advanced quadrature techniques which allow for large numbers of angles with a capability for local angular refinement have been developed. These techniques have been integrated into the 3-D SN PENTRAN (Parallel Environment Neutral-particle TRANsport) code and applied to highly angular dependent problems, such as CT-Scan devices, that are widely used to obtain detailed 3-D images for industrial/medical applications. In addition, the accurate simulation of core physics and shielding problems with strong heterogeneities and transport effects requires the numerical solution of the transport equation. In general, the convergence rate of the solution methods for the transport equation is reduced for large problems with optically thick regions and scattering ratios approaching unity. To remedy this situation, new acceleration algorithms based on the Even-Parity Simplified SN (EP-SSN) method have been developed. A new stand-alone code system, PENSSn (Parallel Environment Neutral-particle Simplified SN), has been developed based on the EP-SSN method. The code is designed for parallel computing environments with spatial, angular and hybrid (spatial/angular) domain...

  8. Accelerated time-of-flight (TOF) PET image reconstruction using TOF bin subsetization and TOF weighting matrix pre-computation

    International Nuclear Information System (INIS)

    Mehranian, Abolfazl; Kotasidis, Fotis; Zaidi, Habib

    2016-01-01

    An FDG-PET study also revealed that, for the same noise level, a higher contrast recovery can be obtained by increasing the number of TOF subsets. It can be concluded that the proposed TOF weighting matrix pre-computation and subsetization approaches make it possible to further accelerate and improve the convergence properties of the OSEM and MLEM algorithms, thus opening new avenues for accelerated TOF PET image reconstruction. (paper)
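
    For readers unfamiliar with the baseline being accelerated: a generic (non-TOF) OSEM update shows where subsetization enters. The sketch below is not the paper's TOF-bin scheme; A and y are placeholder dense arrays standing in for a real system matrix and measured counts.

```python
import numpy as np

def osem(A, y, n_subsets=4, n_iters=10, eps=1e-12):
    """A: (n_bins, n_voxels) system matrix; y: measured Poisson counts."""
    n_bins, n_voxels = A.shape
    x = np.ones(n_voxels)                      # flat initial image
    subsets = np.array_split(np.arange(n_bins), n_subsets)
    for _ in range(n_iters):
        for s in subsets:                      # one EM update per subset
            As = A[s]
            sens = As.sum(axis=0) + eps        # subset sensitivity image
            ratio = y[s] / (As @ x + eps)      # measured / modelled counts
            x *= (As.T @ ratio) / sens         # multiplicative EM update
    return x
```

    Each pass over the data applies n_subsets updates instead of one, which is the source of the acceleration; increasing the number of (TOF) subsets speeds early convergence, as the abstract reports.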

  9. Accelerating Dust Storm Simulation by Balancing Task Allocation in Parallel Computing Environment

    Science.gov (United States)

    Gui, Z.; Yang, C.; XIA, J.; Huang, Q.; YU, M.

    2013-12-01

    Dust storms have serious negative impacts on the environment, human health, and assets. Continuing global climate change has increased the frequency and intensity of dust storms in the past decades. To better understand and predict the distribution, intensity and structure of dust storms, a series of dust storm models have been developed, such as the Dust Regional Atmospheric Model (DREAM), the NMM meteorological module (NMM-dust) and the Chinese Unified Atmospheric Chemistry Environment for Dust (CUACE/Dust). The development and application of these models have contributed significantly to both scientific research and our daily life. However, dust storm simulation is a data- and computing-intensive process: a simulation of a single dust storm event may take hours or even days to run, which seriously impacts the timeliness of prediction and potential applications. To speed up the process, high-performance computing is widely adopted. By partitioning a large study area into small subdomains according to their geographic location and executing them on different computing nodes in parallel, the computing performance can be significantly improved. Since spatiotemporal correlations exist in the geophysical process of dust storm simulation, each subdomain allocated to a node needs to communicate with geographically adjacent subdomains to exchange data. Inappropriate allocations may introduce imbalanced task loads and unnecessary communications among computing nodes. The task allocation method is therefore a key factor for the feasibility of the parallelization: the allocation algorithm needs to carefully balance the computing cost and communication cost for each node to minimize total execution time and reduce the overall communication cost for the entire system. This presentation introduces two algorithms for such allocation and compares them with an evenly distributed allocation method. Specifically, 1) in order to get optimized solutions, a ...
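
    The trade-off described (balance per-node compute load while avoiding cutting adjacent subdomains apart) can be made concrete with a toy greedy allocator. This is an illustration of the problem only, not either of the presentation's two algorithms; costs and the penalty weight are invented.

```python
def allocate(subdomains, adjacency, n_nodes, comm_penalty=0.2):
    """subdomains: {id: compute_cost}; adjacency: set of (id, id) pairs."""
    loads = [0.0] * n_nodes
    placement = {}
    # place heaviest subdomains first
    for sd in sorted(subdomains, key=subdomains.get, reverse=True):
        best_node, best_score = None, None
        for node in range(n_nodes):
            # communication cost: neighbours already placed on other nodes
            cut = sum(1 for a, b in adjacency
                      if sd in (a, b)
                      and placement.get(b if a == sd else a, node) != node)
            score = loads[node] + subdomains[sd] + comm_penalty * cut
            if best_score is None or score < best_score:
                best_node, best_score = node, score
        placement[sd] = best_node
        loads[best_node] += subdomains[sd]
    return placement, loads

subdomains = {0: 4.0, 1: 1.0, 2: 2.5, 3: 2.5}   # id -> compute cost
adjacency = {(0, 1), (1, 2), (2, 3)}            # geographically adjacent pairs
print(allocate(subdomains, adjacency, n_nodes=2))
```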

  10. National Strategic Computing Initiative Strategic Plan

    Science.gov (United States)

    2016-07-01

    [Only fragments of this record survived extraction: an appendix entry ("A.6 National Nanotechnology Initiative"), a warning that future systems may "lack the memory capacity to perform current and anticipated new classes of scientific and engineering applications", and pointers to related programs: Big Data: https://www.nitrd.gov/nitrdgroups/index.php?title=Big_Data_(BD_SSG); National Nanotechnology Initiative: http://www.nano.gov; Precision ...]

  11. The Role of Implicit Motives in Strategic Decision-Making: Computational Models of Motivated Learning and the Evolution of Motivated Agents

    Directory of Open Access Journals (Sweden)

    Kathryn Merrick

    2015-11-01

    Full Text Available Individual behavioral differences in humans have been linked to measurable differences in their mental activities, including differences in their implicit motives. In humans, individual differences in the strength of motives such as power, achievement and affiliation have been shown to have a significant impact on behavior in social dilemma games and during other kinds of strategic interactions. This paper presents agent-based computational models of power-, achievement- and affiliation-motivated individuals engaged in game-play. The first model captures learning by motivated agents during strategic interactions. The second model captures the evolution of a society of motivated agents. It is demonstrated that misperception, when it is a result of motivation, causes agents with different motives to play a given game differently. When motivated agents who misperceive a game are present in a population, a higher explicit payoff can result for the population as a whole. The implications of these results are discussed, both for modeling human behavior and for designing artificial agents with certain salient behavioral characteristics.
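
    A minimal way to encode "motivation-induced misperception" is to let each agent transform the payoff matrix it reasons about before best-responding. The sketch below is our own illustration of that idea in a prisoner's dilemma, not the paper's model; all payoffs and motive weights are arbitrary.

```python
import numpy as np

# Row player's prisoner's-dilemma payoffs: rows/cols = (cooperate, defect)
PAYOFF = np.array([[3.0, 0.0],
                   [5.0, 1.0]])

def perceived(payoff, power=0.0, affiliation=0.0):
    """Power inflates relative advantage; affiliation inflates joint outcomes."""
    opp = payoff.T                           # symmetric game: opponent's payoffs
    return (payoff
            + power * (payoff - opp)         # value doing better than the other
            + affiliation * (payoff + opp))  # value mutual gain

def best_reply(payoff, opp_action):
    return int(np.argmax(payoff[:, opp_action]))

print(best_reply(perceived(PAYOFF, power=1.0), opp_action=0))        # 1: defects
print(best_reply(perceived(PAYOFF, affiliation=3.0), opp_action=0))  # 0: cooperates
```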

  12. Numerical Nuclear Second Derivatives on a Computing Grid: Enabling and Accelerating Frequency Calculations on Complex Molecular Systems.

    Science.gov (United States)

    Yang, Tzuhsiung; Berry, John F

    2018-06-04

    The computation of nuclear second derivatives of energy, or the nuclear Hessian, is an essential routine in quantum chemical investigations of ground and transition states, thermodynamic calculations, and molecular vibrations. Analytic nuclear Hessian computations require the resolution of costly coupled-perturbed self-consistent field (CP-SCF) equations, while numerical differentiation of analytic first derivatives has an unfavorable 6N (N = number of atoms) prefactor. Herein, we present a new method in which grid computing is used to accelerate and/or enable the evaluation of the nuclear Hessian via numerical differentiation: NUMFREQ@Grid. Nuclear Hessians were successfully evaluated by NUMFREQ@Grid at the DFT level as well as using RIJCOSX-ZORA-MP2 or RIJCOSX-ZORA-B2PLYP for a set of linear polyacenes with systematically increasing size. For the larger members of this group, NUMFREQ@Grid was found to outperform the wall clock time of analytic Hessian evaluation; at the MP2 or B2PLYP levels, these Hessians cannot even be evaluated analytically. We also evaluated a 156-atom catalytically relevant open-shell transition metal complex and found that NUMFREQ@Grid is faster (7.7 times shorter wall clock time) and less demanding (4.4 times less memory requirement) than an analytic Hessian. Capitalizing on the capabilities of parallel grid computing, NUMFREQ@Grid can outperform analytic methods in terms of wall time, memory requirements, and treatable system size. The NUMFREQ@Grid method presented herein demonstrates how grid computing can be used to facilitate embarrassingly parallel computational procedures and is a pioneer for future implementations.
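
    The 6N prefactor comes from displacing each of the 3N coordinates in both directions and evaluating an analytic gradient at every displaced geometry; those evaluations are independent, so they parallelize trivially. Below is a local-process sketch of that structure (a stand-in for a grid back end, not NUMFREQ@Grid itself); the toy quadratic "energy" is invented, and grad must be a module-level function so it pickles, so run this as a script.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def numerical_hessian(grad, x0, h=1e-4, workers=4):
    """Central-difference Hessian from analytic gradients (x0 has n = 3N entries)."""
    n = x0.size
    eye = np.eye(n)
    # 2n displaced geometries -> 2n independent gradient jobs (the 6N prefactor)
    displaced = [x0 + s * h * eye[i] for i in range(n) for s in (+1.0, -1.0)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        grads = list(pool.map(grad, displaced))
    rows = [(grads[2 * i] - grads[2 * i + 1]) / (2 * h) for i in range(n)]
    H = np.array(rows)
    return 0.5 * (H + H.T)          # symmetrize residual numerical noise

A = np.diag([1.0, 2.0, 3.0])

def grad(x):                        # gradient of the toy energy 0.5 x^T A x
    return A @ x

if __name__ == "__main__":
    H = numerical_hessian(grad, np.zeros(3))
    print(np.allclose(H, A, atol=1e-6))   # recovered Hessian equals A
```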

  13. Image processing and computer controls for video profile diagnostic system in the ground test accelerator (GTA)

    International Nuclear Information System (INIS)

    Wright, R.; Zander, M.; Brown, S.; Sandoval, D.; Gilpatrick, D.; Gibson, H.

    1992-01-01

    This paper describes the application of video image processing to beam profile measurements on the Ground Test Accelerator (GTA). A diagnostic was needed to measure beam profiles in the intermediate matching section (IMS) between the radio-frequency quadrupole (RFQ) and the drift tube linac (DTL). Beam profiles are measured by injecting puffs of gas into the beam. The light emitted from the beam-gas interaction is captured and processed by a video image processing system, generating the beam profile data. A general purpose, modular and flexible video image processing system, imagetool, was used for the GTA image profile measurement. The development of both software and hardware for imagetool and its integration with the GTA control system (GTACS) is discussed. The software includes specialized algorithms for analyzing data and calibrating the system. The underlying design philosophy of imagetool was tested by the experience of building and using the system, pointing the way for future improvements. (Author) (3 figs., 4 refs.)

  14. ActiWiz – optimizing your nuclide inventory at proton accelerators with a computer code

    CERN Document Server

    Vincke, Helmut

    2014-01-01

    When operating an accelerator, one always faces unwanted but inevitable beam losses. These result in activation of adjacent material, which in turn has an obvious impact on safety and handling constraints. One of the key parameters responsible for activation is the chemical composition of the material, which can often be optimized in that respect. To make this task accessible to non-expert users as well, the ActiWiz software has been developed at CERN. Based on a large set of generic FLUKA Monte Carlo simulations, the software applies a specifically developed risk assessment model to support decision makers, especially during the design phase as well as in common operational work in the domain of radiation protection.

  15. Cone Beam Computed Tomography Guidance for Setup of Patients Receiving Accelerated Partial Breast Irradiation

    International Nuclear Information System (INIS)

    White, Elizabeth A.; Cho, John; Vallis, Katherine A.; Sharpe, Michael B.; Lee, Grace B.Sc.; Blackburn, Helen; Nageeti, Tahani; McGibney, Carol; Jaffray, David A.

    2007-01-01

    Purpose: To evaluate the role of cone-beam CT (CBCT) guidance for setup error reduction and soft tissue visualization in accelerated partial breast irradiation (APBI). Methods and Materials: Twenty patients were recruited for the delivery of radiotherapy to the postoperative cavity (3850 cGy in 10 fractions over 5 days) using an APBI technique. Cone-beam CT data sets were acquired after an initial skin-mark setup and before treatment delivery. These were registered online using the ipsilateral lung and external contours. Corrections were executed for translations exceeding 3 mm. The random and systematic errors associated with setup using skin marks and setup using CBCT guidance were calculated and compared. Results: A total of 315 CBCT data sets were analyzed. The systematic errors for the skin-mark setup were 2.7, 1.7, and 2.4 mm in the right-left, anterior-posterior, and superior-inferior directions, respectively. These were reduced to 0.8, 0.7, and 0.8 mm when CBCT guidance was used. The random errors were reduced from 2.4, 2.2, and 2.9 mm for skin marks to 1.5, 1.5, and 1.6 mm for CBCT guidance in the right-left, anterior-posterior, and superior-inferior directions, respectively. Conclusion: A skin-mark setup for APBI patients is sufficient for current planning target volume margins for the population of patients studied here. Online CBCT guidance minimizes the occurrence of large random deviations, which may have a greater impact for the accelerated fractionation schedule used in APBI. It is also likely to permit a reduction in planning target volume margins and to provide skin-line visualization and dosimetric evaluation of cardiac and lung volumes.

  16. More power: Accelerating sequential Computer Vision algorithms using commodity parallel hardware

    NARCIS (Netherlands)

    Jaap van de Loosdrecht; K. Dijkstra

    2014-01-01

    The last decade has seen increasing demand from industry for computerized visual inspection. Applications are rapidly becoming more complex, often with more demanding real-time constraints. However, from 2004 onwards the clock frequency of CPUs has not increased significantly. Computer ...

  17. A new strategic neurosurgical planning tool for brainstem cavernous malformations using interactive computer graphics with multimodal fusion images.

    Science.gov (United States)

    Kin, Taichi; Nakatomi, Hirofumi; Shojima, Masaaki; Tanaka, Minoru; Ino, Kenji; Mori, Harushi; Kunimatsu, Akira; Oyama, Hiroshi; Saito, Nobuhito

    2012-07-01

    In this study, the authors used preoperative simulation employing 3D computer graphics (interactive computer graphics) to fuse all imaging data for brainstem cavernous malformations. The authors evaluated whether interactive computer graphics or 2D imaging correlated better with the actual operative field, particularly in identifying a developmental venous anomaly (DVA). The study population consisted of 10 patients scheduled for surgical treatment of brainstem cavernous malformations. Data from preoperative imaging (MRI, CT, and 3D rotational angiography) were automatically fused using a normalized mutual information method, and then reconstructed by a hybrid method combining surface rendering and volume rendering. With surface rendering, multimodality and multithreshold techniques for a single tissue were applied. The completed interactive computer graphics were used for simulation of surgical approaches and assumed surgical fields. Preoperative diagnostic rates for a DVA associated with brainstem cavernous malformation were compared between conventional 2D imaging and interactive computer graphics employing receiver operating characteristic (ROC) analysis. The time required for reconstruction of 3D images was 3-6 hours for interactive computer graphics; observation in interactive mode required approximately 15 minutes. Detailed anatomical information for operative procedures, from the craniotomy to microsurgical operations, could be visualized and simulated three-dimensionally as one computer graphic using interactive computer graphics. Virtual surgical views were consistent with actual operative views. This technique was very useful for examining various surgical approaches. The mean (±SEM) area under the ROC curve for the rate of DVA diagnosis was significantly better for interactive computer graphics (1.000 ± 0.000) than for 2D imaging (0.766 ± 0.091; p < 0.05); DVAs were thus identified more reliably with interactive computer graphics than with 2D images. Interactive computer graphics was also useful in helping to plan the surgical approach.

  18. Computer programme for control and maintenance and object oriented database: application to the realisation of an particle accelerator, the VIVITRON

    International Nuclear Information System (INIS)

    Diaz, A.

    1996-01-01

    The command and control system for the Vivitron, a new-generation electrostatic particle accelerator, has been implemented using workstations and VME-standard front-end computers within a UNIX/VxWorks environment. This architecture is distributed over an Ethernet network. Measurements and commands for the different sensors and actuators are concentrated in the front-end computers. The development of a second version of the software, giving better performance and more functionality, is described. X11-based communication is used to transmit all the information needed to display parameters from the front-end computers on the graphics screens. All other communication between processes uses the Remote Procedure Call (RPC) method. The design of the system is based largely on the object-oriented database O 2, which integrates a full description of the equipment and the code necessary to manage it; this code is generated by the database. This innovation permits easy maintenance of the system and removes the need for a specialist when adding new equipment. The new version of the command and control system has been progressively installed since August 1995. (author)

  19. Open source acceleration of wave optics simulations on energy efficient high-performance computing platforms

    Science.gov (United States)

    Beck, Jeffrey; Bos, Jeremy P.

    2017-05-01

    We compare several modifications of the open-source wave optics package WavePy intended to improve execution time. Specifically, we compare the relative performance of the Intel MKL, a CPU-based OpenCV distribution, and a GPU-based version. Performance is compared between distributions both on the same compute platform and between a fully featured computing workstation and the NVIDIA Jetson TX1 platform. Comparisons are drawn in terms of both execution time and power consumption. We have found that substituting OpenCV's Fast Fourier Transform operation provides a marked improvement on all platforms. In addition, we show that embedded platforms offer the possibility of substantial efficiency gains compared with a fully featured workstation.
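
    The kind of substitution being tested is easy to reproduce: time one split-step-style propagation with two FFT back ends. The sketch below is not WavePy code; the phase kernel is a physically meaningless toy, and scipy.fft's multi-threaded workers option merely stands in for the MKL/OpenCV/GPU back ends compared in the paper.

```python
import time
import numpy as np
import scipy.fft

def propagate(field, fft2, ifft2):
    # toy transfer-function step: FFT, multiply by a phase screen, inverse FFT
    H = np.exp(1j * 0.01 * np.hypot(*np.indices(field.shape)) ** 2)
    return ifft2(fft2(field) * H)

field = np.random.randn(2048, 2048) + 1j * np.random.randn(2048, 2048)

for name, fft2, ifft2 in [
    ("numpy", np.fft.fft2, np.fft.ifft2),
    ("scipy, 4 workers",
     lambda a: scipy.fft.fft2(a, workers=4),
     lambda a: scipy.fft.ifft2(a, workers=4)),
]:
    t0 = time.perf_counter()
    propagate(field, fft2, ifft2)
    print(name, round(time.perf_counter() - t0, 3), "s")
```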

  20. Accelerating inference for diffusions observed with measurement error and large sample sizes using approximate Bayesian computation

    DEFF Research Database (Denmark)

    Picchini, Umberto; Forman, Julie Lyng

    2016-01-01

    In recent years, dynamical modelling has been provided with a range of breakthrough methods to perform exact Bayesian inference. However, it is often computationally unfeasible to apply exact statistical methodologies in the context of large data sets and complex models. This paper considers a nonlinear stochastic differential equation model observed with correlated measurement errors, with an application to protein folding modelling. An approximate Bayesian computation (ABC)-MCMC algorithm is suggested to allow inference for model parameters within reasonable time constraints. The ABC algorithm ... applications. A simulation study is conducted to compare our strategy with exact Bayesian inference, the latter proving two orders of magnitude slower than ABC-MCMC for the considered set-up. Finally, the ABC algorithm is applied to a large protein data set. The suggested methodology is fairly general ...
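
    The acceptance logic that makes ABC-MCMC cheap is compact enough to sketch generically. This is not the paper's SDE algorithm; with a flat prior and a symmetric random-walk proposal, the Metropolis-Hastings ratio reduces to the tolerance check shown, and all tuning constants below are illustrative.

```python
import numpy as np

def abc_mcmc(simulate, summary, y_obs, theta0, n_steps=5000,
             step=0.1, eps=0.5, rng=np.random.default_rng(0)):
    s_obs = summary(y_obs)
    theta = np.atleast_1d(np.asarray(theta0, dtype=float))
    chain = [theta.copy()]
    for _ in range(n_steps):
        prop = theta + step * rng.standard_normal(theta.shape)
        s_sim = summary(simulate(prop, rng))
        # flat prior + symmetric proposal: accept iff summaries are close
        if np.linalg.norm(s_sim - s_obs) < eps:
            theta = prop
        chain.append(theta.copy())
    return np.array(chain)

# toy usage: infer the mean of Gaussian data from its sample mean
sim = lambda th, rng: th[0] + rng.standard_normal(200)
chain = abc_mcmc(sim, lambda y: np.array([y.mean()]),
                 y_obs=np.full(200, 1.7), theta0=[0.0], eps=0.1)
print(chain[1000:].mean())        # close to 1.7
```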

  1. Design Tools for Accelerating Development and Usage of Multi-Core Computing Platforms

    Science.gov (United States)

    2014-04-01

    [Only fragments of this record survived extraction: distribution-statement boilerplate, plus notes that the GPU-based capabilities of TDIF are currently oriented towards NVIDIA GPUs, based on the Compute Unified Device Architecture (CUDA) programming language [NVIDIA 2007], which can be viewed as an extension of C, and that the multicore PDSP capabilities currently in TDIF are oriented ...]

  2. GPU-based acceleration of computations in nonlinear finite element deformation analysis.

    Science.gov (United States)

    Mafi, Ramin; Sirouspour, Shahin

    2014-03-01

    The physics of deformation for biological soft tissue is best described by nonlinear continuum mechanics-based models, which can then be discretized by the FEM for a numerical solution. However, the computational complexity of such models has limited their use in applications requiring real-time or fast response. In this work, we propose a graphics processing unit (GPU)-based implementation of the FEM using implicit time integration for dynamic nonlinear deformation analysis. This is the most general formulation of deformation analysis: it is valid for large deformations and strains and can account for material nonlinearities. The data-parallel nature and intense arithmetic computations of nonlinear FEM equations make them particularly suitable for implementation on a parallel computing platform such as a GPU. In this work, we present and compare two different designs, based on the matrix-free and conventional preconditioned conjugate gradients algorithms, for solving the FEM equations arising in deformation analysis. The speedup achieved with the proposed parallel implementations of the algorithms will be instrumental in the development of advanced surgical simulators and medical image registration methods involving soft-tissue deformation. Copyright © 2013 John Wiley & Sons, Ltd.
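
    "Matrix-free" here means the conjugate gradient solver only ever needs the product K·v, which can be accumulated element by element on the GPU instead of assembling and storing the global stiffness matrix. A generic CPU sketch of the solver side (the FEM-specific matvec and any preconditioning are deliberately left out):

```python
import numpy as np

def conjugate_gradient(matvec, b, tol=1e-8, max_iter=1000):
    """Solve K x = b given only a routine computing K @ v."""
    x = np.zeros_like(b)
    r = b - matvec(x)                 # initial residual
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p     # new conjugate search direction
        rs = rs_new
    return x

# toy usage: matvec here wraps a dense SPD matrix, but in a matrix-free FEM
# code it would sum per-element contributions on the fly instead
K = np.array([[4.0, 1.0], [1.0, 3.0]])
print(conjugate_gradient(lambda v: K @ v, np.array([1.0, 2.0])))
```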

  3. Strategic decision making

    OpenAIRE

    Stokman, Frans N.; Assen, Marcel A.L.M. van; Knoop, Jelle van der; Oosten, Reinier C.H. van

    2000-01-01

    This paper introduces a methodology for strategic intervention in collective decision making. The methodology is based on (1) a decomposition of the problem into a few main controversial issues; (2) systematic interviews of subject area specialists to obtain a specification of the decision setting, consisting of a list of stakeholders with their capabilities, positions, and salience on each of the issues; and (3) computer simulation. The computer simulation models incorporate only the main processes ...

  4. Strategic serendipity

    DEFF Research Database (Denmark)

    Knudsen, Gry Høngsmark; Lemmergaard, Jeanette

    2014-01-01

    This paper contributes to critical voices on the issue of strategic communication. It does so by exploring how an organisation can seize the moment of serendipity based on careful preparation of its issues management and communication channels. The focus of the study is the media coverage ... Based on state-of-the-art knowledge and an in-depth understanding of the affordances of different communication channels, we discuss the importance of establishing opportunities for serendipity in strategic communication planning. The contribution of the paper is to develop the concept of strategic serendipity and show how ...

  5. The new generation of PowerPC VMEbus front end computers for the CERN SPS and LEP accelerators control system

    CERN Document Server

    Van den Eynden, M

    1995-01-01

    The CERN SPS and LEP PowerPC project is aimed at introducing a new generation of PowerPC VMEbus processor modules running the LynxOS real-time operating system. This new generation of front-end computers, using state-of-the-art microprocessor technology, will first replace the obsolete Xenix PC-based systems (about 140 installations) successfully used since 1988 to control the LEP accelerator. The major issues addressed in the scope of this large-scale project are the technical specification for the new PowerPC technology, the re-engineering aspects, the interfaces with other CERN-wide projects, and the setting up of a development environment. The project also offers support to other major SPS and LEP projects interested in the PowerPC microprocessor technology.

  6. The new generation of PowerPC VMEbus front end computers for the CERN SPS and LEP accelerators system

    CERN Document Server

    Charrue, P; Ghinet, F; Ribeiro, P

    1995-01-01

    The CERN SPS and LEP PowerPC project is aimed at introducing a new generation of PowerPC VMEbus processor modules running the LynxOS real-time operating system. This new generation of front-end computers, using state-of-the-art microprocessor technology, will first replace the obsolete XENIX PC-based systems (about 140 installations) successfully used since 1988 to control the LEP accelerator. The major issues addressed in the scope of this large-scale project are the technical specification for the new PowerPC technology, the re-engineering aspects, the interfaces with other CERN-wide projects, and the setting up of a development environment. The project also offers support to other major SPS and LEP projects interested in the PowerPC microprocessor technology.

  7. Accelerated Computing in Magnetic Resonance Imaging: Real-Time Imaging Using Nonlinear Inverse Reconstruction

    Directory of Open Access Journals (Sweden)

    Sebastian Schaetz

    2017-01-01

    Full Text Available Purpose. To develop generic optimization strategies for image reconstruction using graphics processing units (GPUs) in magnetic resonance imaging (MRI) and, as an example, to report on our experience with a highly accelerated implementation of the nonlinear inversion (NLINV) algorithm for dynamic MRI with high frame rates. Methods. The NLINV algorithm is optimized and ported to run on a multi-GPU single-node server. The algorithm is mapped to multiple GPUs by decomposing the data domain along the channel dimension. Furthermore, the algorithm is decomposed along the temporal domain by relaxing a temporal regularization constraint, allowing the algorithm to work on multiple frames in parallel. Finally, an autotuning method is presented that is capable of combining different decomposition variants to achieve optimal algorithm performance in different imaging scenarios. Results. The algorithm is successfully ported to a multi-GPU system and allows online image reconstruction with high frame rates. Real-time reconstruction with low latency and frame rates up to 30 frames per second is demonstrated. Conclusion. Novel parallel decomposition methods are presented which are applicable to many iterative algorithms for dynamic MRI. Using these methods to parallelize the NLINV algorithm on multiple GPUs, it is possible to achieve online image reconstruction with high frame rates.

  8. Combining computation and experiment to accelerate the discovery of new hydrogen storage materials

    Science.gov (United States)

    Siegel, Donald

    2009-03-01

    The potential of emerging technologies such as fuel cells (FCs) and photovoltaics for environmentally-benign power generation has sparked renewed interest in the development of novel materials for high density energy storage. For applications in the transportation sector, the demands placed upon energy storage media are especially stringent, as a potential replacement for fossil-fuel-powered internal combustion engines -- namely, the proton exchange membrane FC -- utilizes hydrogen as a fuel. Although hydrogen has about three times the energy density of gasoline by weight, its volumetric energy density (even at 700 bar) is roughly a factor of six smaller. Consequently, the safe and efficient storage of hydrogen has been identified as one of the key materials-based challenges to realizing a transition to FC vehicles. This talk will present an overview of recent efforts at Ford aimed at developing new materials for reversible, solid state hydrogen storage. A tight coupling between first-principles modeling and experiments has greatly accelerated our efforts, and several examples illustrating the benefits of this approach will be presented.

  9. A computer code 'BEAM' for the ion optics calculation of the JAERI tandem accelerator system

    International Nuclear Information System (INIS)

    Kikuchi, Shiroh; Takeuchi, Suehiro

    1987-11-01

    The computer code BEAM is described, together with an outline of the formalism used for the ion optics calculation. The purpose of the code is to obtain the optimum parameters of devices, with which the ion beam is transported through the system without losses. The procedures of the calculation, especially those of searching for the parameters of quadrupole lenses, are discussed in detail. The flow of the code is illustrated as a whole and its constituent subroutines are explained individually. A few resultant beam trajectories and the parameters used to obtain them are shown as examples. (author)

  10. Strategic Supply

    National Research Council Canada - National Science Library

    Alexander, Kelly; Cole, Heather; Cural, Ahmet; Daugherty, Darryl; Howard, Russell; Keane, Thomas; Louie, K. Y; McNeely, Rosa; Mordente, Patrick; Petrillo, Robert

    2006-01-01

    ...; but rather, as an enabler across all industries. Therefore, this industry study looked at Strategic Supply as an integrated process performed by industries to obtain comparative and competitive advantage in the global marketplace...

  11. Strategic Forecasting

    DEFF Research Database (Denmark)

    Duus, Henrik Johannsen

    2016-01-01

    Purpose: The purpose of this article is to present an overview of the area of strategic forecasting and its research directions, and to put forward some ideas for improving management decisions. Design/methodology/approach: This article is conceptual but also informed by the author's long contact and collaboration with various business firms. It starts by presenting an overview of the area and argues that the area is as much a way of thinking as a toolbox of theories and methodologies. It then spells out a number of research directions and ideas for management. Findings: Strategic forecasting is seen as a rebirth of long-range planning, albeit with new methods and theories. Firms should make the building of strategic forecasting capability a priority. Research limitations/implications: The article subdivides strategic forecasting into three research avenues and suggests avenues for further research efforts ...

  12. Strategic Responsiveness

    DEFF Research Database (Denmark)

    Pedersen, Carsten; Juul Andersen, Torben

    The analysis of major resource-committing decisions is a central focus in the strategy field, but despite decades of rich conceptual and empirical research we still seem distant from a level of understanding that can guide corporate practices under dynamic and unpredictable conditions. Strategic decision making is often conceived as 'standing on the two feet' of deliberate or intended strategic decisions by top management and emergent strategic decisions pursued by lower-level managers and employees. In this view, the paper proposes that bottom-up initiatives have a hard time surfacing in hierarchical organizations, and that lower-level managers and employees therefore pursue various strategies to bypass the official strategy processes in order to act on emerging strategic issues and adapt to changing environmental conditions.

  13. Utilizing the Double-Precision Floating-Point Computing Power of GPUs for RSA Acceleration

    Directory of Open Access Journals (Sweden)

    Jiankuo Dong

    2017-01-01

    Full Text Available Implementations of asymmetric cryptographic algorithms (e.g., RSA and Elliptic Curve Cryptography) on Graphics Processing Units (GPUs) have been researched for over a decade. The basic idea of most previous contributions is to exploit the highly parallel GPU architecture and port the integer-based algorithms from general-purpose CPUs to GPUs, to offer high performance. However, the great potential cryptographic computing power of GPUs, especially in the more powerful floating-point instructions, has not in fact been comprehensively investigated. In this paper, we fully exploit the floating-point computing power of GPUs through various designs, including a floating-point-based Montgomery multiplication/exponentiation algorithm and a Chinese Remainder Theorem (CRT) implementation on the GPU. For practical use of the proposed algorithm, a new method is introduced to convert the input/output between octet strings and floating-point numbers, fully utilizing the GPU and further improving overall performance by about 5%. The performance of RSA-2048/3072/4096 decryption on an NVIDIA GeForce GTX TITAN reaches 42,211/12,151/5,790 operations per second, respectively, which is 13 times the performance of the previous fastest floating-point-based implementation (published at Eurocrypt 2009). The RSA-4096 decryption result exceeds the existing fastest integer-based result by 23%.
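
    Independent of the floating-point limb representation the paper develops, the CRT decomposition it implements is standard: two half-size modular exponentiations plus Garner recombination replace one full-size exponentiation. A plain-integer sketch with toy parameters (never use key sizes like this in practice; requires Python 3.8+ for pow(e, -1, m)):

```python
p, q = 61, 53                        # toy primes, illustrative only
n = p * q
e = 17
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent

def decrypt_crt(c):
    dp, dq = d % (p - 1), d % (q - 1)
    q_inv = pow(q, -1, p)
    m_p = pow(c % p, dp, p)          # half-size exponentiation mod p
    m_q = pow(c % q, dq, q)          # half-size exponentiation mod q
    h = (q_inv * (m_p - m_q)) % p    # Garner recombination
    return m_q + h * q

m = 42
c = pow(m, e, n)                     # encrypt
assert decrypt_crt(c) == m           # CRT decryption recovers the message
```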

  14. FMIT accelerator

    International Nuclear Information System (INIS)

    Armstrong, D.D.

    1983-01-01

    A 35-MeV 100-mA cw linear accelerator is being designed by Los Alamos for use in the Fusion Materials Irradiation Test (FMIT) Facility. Essential to this program is the design, construction, and evaluation of performance of the accelerator's injector, low-energy beam transport, and radio-frequency quadrupole sections before they are shipped to the facility site. The installation and testing of some of these sections have begun as well as the testing of the rf, noninterceptive beam diagnostics, computer control, dc power, and vacuum systems. An overview of the accelerator systems and the performance to date is given

  15. An accelerated conjugate gradient algorithm to compute low-lying eigenvalues - a study for the Dirac operator in SU(2) lattice QCD

    International Nuclear Information System (INIS)

    Kalkreuter, T.; Simma, H.

    1995-07-01

    The low-lying eigenvalues of a (sparse) Hermitian matrix can be computed with controlled numerical errors by a conjugate gradient (CG) method. This CG algorithm is accelerated by alternating it with exact diagonalizations in the subspace spanned by the numerically computed eigenvectors. We study this combined algorithm for the Dirac operator with (dynamical) Wilson fermions in four-dimensional SU(2) gauge fields. The algorithm is numerically very stable and can be parallelized in an efficient way. On lattices of sizes 4^4 - 16^4, an acceleration of the pure CG method by a factor of 4 - 8 is found. (orig.)
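
    The combination described, iterative minimization alternated with exact diagonalization over the computed vectors, can be sketched with plain gradient steps on the Rayleigh quotient standing in for CG. This is our illustration, not the authors' algorithm; step sizes and iteration counts are arbitrary.

```python
import numpy as np

def lowest_eigenpair(A, n_outer=10, n_inner=20):
    n = A.shape[0]
    rng = np.random.default_rng(0)
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)
    step = 1.0 / np.linalg.norm(A, ord=2)     # crude safe step size
    for _ in range(n_outer):
        snapshots = [v.copy()]
        for _ in range(n_inner):               # gradient steps (CG stand-in)
            lam = v @ A @ v
            v = v - step * (A @ v - lam * v)   # Rayleigh-quotient gradient
            v /= np.linalg.norm(v)
            snapshots.append(v.copy())
        # exact diagonalization in the span of the stored iterates
        Q, _ = np.linalg.qr(np.column_stack(snapshots))
        evals, evecs = np.linalg.eigh(Q.T @ A @ Q)
        v = Q @ evecs[:, 0]                    # Ritz vector for smallest value
    return v @ A @ v, v

A = np.diag(np.linspace(1.0, 100.0, 200))
A[0, 1] = A[1, 0] = 0.3
print(lowest_eigenpair(A)[0])                  # close to the smallest eigenvalue
```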

  16. Uncertainty quantification: an accelerated course with advanced applications in computational engineering

    CERN Document Server

    Soize, Christian

    2017-01-01

    This book presents the fundamental notions and advanced mathematical tools for the stochastic modeling of uncertainties and their quantification in large-scale computational models in science and engineering. In particular, it focuses on parametric and non-parametric uncertainties, with applications drawn from the structural dynamics and vibroacoustics of complex mechanical systems and from the micromechanics and multiscale mechanics of heterogeneous materials. Resulting from a course developed by the author, the book begins with a description of the fundamental mathematical tools of probability and statistics that are directly useful for uncertainty quantification. It proceeds with a careful description of basic and advanced methods for constructing stochastic models of uncertainties, paying particular attention to the problem of calibrating and identifying a stochastic model of uncertainty when experimental data are available. This book is intended to be a graduate-level textbook for stu...

  17. TE/TM scheme for computation of electromagnetic fields in accelerators

    International Nuclear Information System (INIS)

    Zagorodnov, Igor; Weiland, Thomas

    2005-01-01

    We propose a new two-level, economical, conservative scheme for short-range wake field calculation in three dimensions. The scheme has no dispersion in the longitudinal direction and is staircase-free (second-order convergent). Unlike the finite-difference time domain (FDTD) method, it is based on a TE/TM-like splitting of the field components in time. Additionally, it uses an enhanced alternating-direction splitting of the transverse space operator that makes the scheme computationally as effective as the conventional FDTD method. Unlike the FDTD ADI and low-order Strang methods, the splitting error in our scheme is only of fourth order. As numerical examples show, the new scheme is much more accurate on long time scales than the conventional FDTD approach.

  18. From variability tolerance to approximate computing in parallel integrated architectures and accelerators

    CERN Document Server

    Rahimi, Abbas; Gupta, Rajesh K

    2017-01-01

    This book focuses on computing devices and their design at various levels to combat variability. The authors provide a review of key concepts with particular emphasis on timing errors caused by various variability sources. They discuss methods to predict and prevent, detect and correct such errors, and the conditions under which they can be accepted; they also consider the implications for cost, performance and quality. Coverage includes a comparative evaluation of methods for deployment across various layers of the system, from circuits and architecture to application software. These can be combined in various ways to achieve specific goals related to observability and controllability of the variability effects, providing means to achieve cross-layer or hybrid resilience. · Covers challenges and opportunities in identifying microelectronic variability and the resulting errors at various layers in the system abstraction; · Enables readers to assess how various levels of circuit and system design can mitigate t...

  19. Toward real-time diffuse optical tomography: accelerating light propagation modeling employing parallel computing on GPU and CPU.

    Science.gov (United States)

    Doulgerakis, Matthaios; Eggebrecht, Adam; Wojtkiewicz, Stanislaw; Culver, Joseph; Dehghani, Hamid

    2017-12-01

    Parameter recovery in diffuse optical tomography is a computationally expensive algorithm, especially when used for large and complex volumes, as in the case of human brain functional imaging. The modeling of light propagation, also known as the forward problem, is the computational bottleneck of the recovery algorithm, whereby the lack of a real-time solution is impeding practical and clinical applications. The objective of this work is the acceleration of the forward model, within a diffusion approximation-based finite-element modeling framework, employing parallelization to expedite the calculation of light propagation in realistic adult head models. The proposed methodology is applicable for modeling both continuous wave and frequency-domain systems with the results demonstrating a 10-fold speed increase when GPU architectures are available, while maintaining high accuracy. It is shown that, for a very high-resolution finite-element model of the adult human head with ∼600,000 nodes, consisting of heterogeneous layers, light propagation can be calculated at ∼0.25  s/excitation source. (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  1. CudaPre3D: An Alternative Preprocessing Algorithm for Accelerating 3D Convex Hull Computation on the GPU

    Directory of Open Access Journals (Sweden)

    MEI, G.

    2015-05-01

    Full Text Available In computing convex hulls of point sets, a preprocessing step that filters the input points by discarding non-extreme points is commonly used to improve computational efficiency. We previously proposed a quite straightforward preprocessing approach for accelerating 2D convex hull computation on the GPU. In this paper, we extend that algorithm to 3D. The basic ideas behind the two preprocessing algorithms are similar: first, several groups of extreme points are found according to the original set of input points and several rotated versions of the input set; then, a convex polyhedron is created using the found extreme points; and finally, those interior points lying inside the formed convex polyhedron are discarded. Experimental results show that the proposed preprocessing algorithm achieves speedups of about 4x on average, and 5x to 6x in the best cases, over the cases where the preprocessing is not used. In addition, more than 95 percent of the input points can be discarded in most experimental tests.
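
    The two-stage idea (find directional extremes, then cull everything inside the polyhedron they span) is easy to state on the CPU with NumPy/SciPy; the paper's contribution is running this stage on the GPU. Because any point inside the hull of a subset of points is inside the full hull, the filter never changes the result; the sketch assumes the extreme set is non-degenerate and keeps boundary ties via extreme_idx.

```python
import numpy as np
from scipy.spatial import ConvexHull, Delaunay

def prefilter(points, n_dirs=32, rng=np.random.default_rng(0)):
    dirs = rng.standard_normal((n_dirs, 3))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    # one extreme point per direction (projections of all points onto dirs)
    extreme_idx = np.unique(np.argmax(points @ dirs.T, axis=0))
    tri = Delaunay(points[extreme_idx])        # polyhedron of the extremes
    inside = tri.find_simplex(points) >= 0     # interior points to discard
    keep = ~inside
    keep[extreme_idx] = True                   # extreme points must survive
    return points[keep]

pts = np.random.default_rng(1).standard_normal((100000, 3))
filtered = prefilter(pts)
hull = ConvexHull(filtered)                    # same hull, far fewer inputs
print(len(pts), "->", len(filtered))
```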

  2. Machine learning in computational biology to accelerate high-throughput protein expression.

    Science.gov (United States)

    Sastry, Anand; Monk, Jonathan; Tegel, Hanna; Uhlen, Mathias; Palsson, Bernhard O; Rockberg, Johan; Brunk, Elizabeth

    2017-08-15

    The Human Protein Atlas (HPA) enables the simultaneous characterization of thousands of proteins across various tissues to pinpoint their spatial location in the human body. This has been achieved through transcriptomics and high-throughput immunohistochemistry-based approaches, where over 40 000 unique human protein fragments have been expressed in E. coli. These datasets enable quantitative tracking of entire cellular proteomes and present new avenues for understanding molecular-level properties influencing expression and solubility. Combining computational biology and machine learning identifies protein properties that hinder the HPA high-throughput antibody production pipeline. We predict protein expression and solubility with accuracies of 70% and 80%, respectively, based on a subset of key properties (aromaticity, hydropathy and isoelectric point). We guide the selection of protein fragments based on these characteristics to optimize high-throughput experimentation. We present the machine learning workflow as a series of IPython notebooks hosted on GitHub (https://github.com/SBRG/Protein_ML). The workflow can be used as a template for analysis of further expression and solubility datasets. ebrunk@ucsd.edu or johanr@biotech.kth.se. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
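
    The modelling step itself is small once the features are computed; the sketch below uses synthetic stand-in data, since the real features and labels come from the HPA pipeline and the authors' notebooks at https://github.com/SBRG/Protein_ML.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.uniform(0.0, 0.3, n),     # aromaticity (synthetic range)
    rng.uniform(-2.0, 2.0, n),    # hydropathy (synthetic range)
    rng.uniform(4.0, 11.0, n),    # isoelectric point (synthetic range)
])
# invented rule standing in for the learned expression/solubility relationship
y = (X[:, 1] + 0.3 * rng.standard_normal(n) < 0).astype(int)

clf = LogisticRegression()
print(cross_val_score(clf, X, y, cv=5).mean())   # held-out accuracy
```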

  3. The stability of mechanical calibration for a kV cone beam computed tomography system integrated with linear accelerator

    International Nuclear Information System (INIS)

    Sharpe, Michael B.; Moseley, Douglas J.; Purdie, Thomas G.

    2006-01-01

    The geometric accuracy and precision of an image-guided treatment system were assessed. Image guidance is performed using an x-ray volume imaging (XVI) system integrated with a linear accelerator and treatment planning system. Using an amorphous silicon detector and x-ray tube, volumetric computed tomography images are reconstructed from kilovoltage radiographs by filtered backprojection. Image fusion and assessment of geometric targeting are supported by the treatment planning system. To assess the limiting accuracy and precision of image-guided treatment delivery, a rigid spherical target embedded in an opaque phantom was subjected to 21 treatment sessions over a three-month period. For each session, a volumetric data set was acquired and loaded directly into an active treatment planning session. Image fusion was used to ascertain the couch correction required to position the target at the prescribed isocenter. Corrections were validated independently using megavoltage electronic portal imaging to record the target position with respect to symmetric treatment beam apertures. An initial calibration cycle followed by repeated image-guidance sessions demonstrated that the XVI system could be used to relocate an unambiguous object to within less than 1 mm of the prescribed location. Treatment could then proceed within the mechanical accuracy and precision of the delivery system. The calibration procedure maintained excellent spatial resolution and delivery precision over the duration of this study, while the linear accelerator was in routine clinical use. Based on these results, the mechanical accuracy and precision of the system are ideal for supporting high-precision localization and treatment of soft-tissue targets.

  4. Acceleration of Cherenkov angle reconstruction with the new Intel Xeon/FPGA compute platform for the particle identification in the LHCb Upgrade

    Science.gov (United States)

    Faerber, Christian

    2017-10-01

    The LHCb experiment at the LHC will upgrade its detector by 2018/2019 to a 'triggerless' readout scheme, in which all the readout electronics and several sub-detector parts will be replaced. The new readout electronics will be able to read out the detector at 40 MHz. This increases the data bandwidth from the detector down to the Event Filter farm to 40 Tbit/s, which also has to be processed to select the interesting proton-proton collisions for later storage. The architecture of a computing farm that can process this amount of data as efficiently as possible is a challenging task, and several compute accelerator technologies are being considered for use inside the new Event Filter farm. In the high-performance computing sector, more and more FPGA compute accelerators are used to improve compute performance and reduce power consumption (e.g., in the Microsoft Catapult project and the Bing search engine). For the LHCb upgrade, the use of an experimental FPGA-accelerated computing platform in the Event Building or the Event Filter farm is likewise being considered and therefore tested. This platform from Intel hosts a general-purpose CPU and a high-performance FPGA linked via a high-speed link, which on this platform is a QPI link; an accelerator is implemented on the FPGA. The system used is a two-socket Intel platform with a Xeon CPU and an FPGA. The FPGA has cache-coherent access to the main memory of the server and can collaborate with the CPU. As a first step, a computing-intensive algorithm to reconstruct Cherenkov angles for the LHCb RICH particle identification was successfully ported in Verilog to the Intel Xeon/FPGA platform and accelerated by a factor of 35. The same algorithm was then ported to the Intel Xeon/FPGA platform with OpenCL, and the implementation work and performance are compared. Another FPGA accelerator, the Nallatech 385 PCIe accelerator with the same Stratix V FPGA, was also tested for performance. The results show that the Intel ...

  5. Burnup calculations for KIPT accelerator driven subcritical facility using Monte Carlo computer codes-MCB and MCNPX

    International Nuclear Information System (INIS)

    Gohar, Y.; Zhong, Z.; Talamo, A.

    2009-01-01

    Argonne National Laboratory (ANL) of the USA and the Kharkov Institute of Physics and Technology (KIPT) of Ukraine have been collaborating on the conceptual design development of an electron accelerator driven subcritical (ADS) facility, using the KIPT electron accelerator. The neutron source of the subcritical assembly is generated by the interaction of a 100 kW electron beam with a natural uranium target. The electron beam has a uniform spatial distribution and electron energy in the range of 100 to 200 MeV. The main functions of the subcritical assembly are the production of medical isotopes and the support of the Ukrainian nuclear power industry. Neutron physics experiments and material structure analyses are planned using this facility. With the 100 kW electron beam power, the total thermal power of the facility is ∼375 kW, including a fission power of ∼260 kW. The burnup of the fissile materials and the buildup of fission products continuously reduce the reactivity during operation, which reduces the neutron flux level and consequently the facility performance. To preserve the neutron flux level during operation, fuel assemblies should be added after long operating periods to compensate for the lost reactivity. This process requires accurate prediction of the fuel burnup, the decay behavior of the fission products, and the reactivity introduced by adding fresh fuel assemblies. Recent developments in Monte Carlo computer codes, the high speed of computer processors, and parallel computation techniques have made it possible to perform detailed three-dimensional burnup simulations. A fully detailed three-dimensional geometrical model is used for the burnup simulations, with continuous-energy nuclear data libraries for the transport calculations and 63-group or one-group cross-section libraries for the depletion calculations. The Monte Carlo computer codes MCNPX and MCB are utilized for this study. MCNPX transports the electrons and the ...

  6. Strategic Aspirations

    DEFF Research Database (Denmark)

    Christensen, Lars Thøger; Morsing, Mette; Thyssen, Ole

    2016-01-01

    Strategic aspirations are public announcements designed to inspire, motivate, and create expectations about the future. Vision statements or value declarations are examples of such talk, through which organizations announce their ideal selves and declare what they (intend to) do. While aspirations are often encouraged by social norms, regulations, and institutions (for example, institutionalized standards for corporate social responsibility (CSR) reporting), they live through local articulations and enactments that allow organizations to discover who they are and who they might become. Strategic aspirations, in other words, have exploratory and inspirational potential, two features that are highly essential in complex areas such as sustainability and CSR. This entry takes a communicative focus on strategic aspirations, highlighting the value of aspirational talk, understood as ideals and intentions ...

  7. An "Elective Replacement" Approach to Providing Extra Help in Math: The Talent Development Middle Schools' Computer- and Team-Assisted Mathematics Acceleration (CATAMA) Program.

    Science.gov (United States)

    Mac Iver, Douglas J.; Balfanz, Robert; Plank, Stephan B.

    1999-01-01

    Two studies evaluated the Computer- and Team-Assisted Mathematics Acceleration course (CATAMA) in Talent Development Middle Schools. The first study compared growth in math achievement for 96 seventh-graders (48 of whom participated in CATAMA and 48 of whom did not); the second study gathered data from interviews with, and observations of, CATAMA…

  8. Computational Approach for Securing Radiology-Diagnostic Data in Connected Health Network using High-Performance GPU-Accelerated AES.

    Science.gov (United States)

    Adeshina, A M; Hashim, R

    2017-03-01

    Diagnostic radiology is a core and integral part of modern medicine, paving the way for primary care physicians in disease diagnosis, treatment and therapy management. Modern healthcare procedures have benefitted immensely from advances in information technology, which have transformed the way diagnostic data are acquired, stored and shared for efficient and timely diagnosis of diseases. The connected health network was introduced as an alternative to the ageing traditional concept of the healthcare system, improving hospital-physician connectivity and clinical collaboration. This modern approach to medicine has drastically improved healthcare, but at the expense of high computational cost and possible breaches of diagnostic privacy. Consequently, a number of cryptographic techniques have recently been applied to clinical applications, but the challenge of successfully encrypting both image and textual data persists. Furthermore, keeping the encryption-decryption time for medical datasets within a considerably lower computational cost, without jeopardizing the required security strength of the encryption algorithm, remains an outstanding issue. This study proposes a secured radiology-diagnostic data framework for connected health networks using high-performance GPU-accelerated Advanced Encryption Standard. The study was evaluated with radiology image datasets consisting of brain MR and CT datasets obtained from the Department of Surgery, University of North Carolina, USA, and the Swedish National Infrastructure for Computing. Sample patients' notes from the University of North Carolina School of Medicine at Chapel Hill were also used to evaluate the framework's strength in encrypting and decrypting textual data in the form of medical reports. Significantly, the framework is not only able to accurately encrypt and decrypt medical image datasets, but it also ...
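
    Framework aside, the core operation (AES encryption of both an image volume and its textual report under one key) can be shown with a CPU library; the paper's GPU-accelerated AES is not reproduced here, and AES-GCM is our choice of a standard authenticated mode, not necessarily the mode the authors used. The fake CT volume and note are invented stand-ins.

```python
import os
import numpy as np
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
aes = AESGCM(key)

volume = np.random.randint(0, 4096, (64, 256, 256), dtype=np.uint16)  # fake CT
report = "Illustrative radiology note.".encode()

nonce_img, nonce_txt = os.urandom(12), os.urandom(12)   # never reuse a nonce
ct_img = aes.encrypt(nonce_img, volume.tobytes(), None)
ct_txt = aes.encrypt(nonce_txt, report, None)

# decryption authenticates and restores both payloads exactly
restored = np.frombuffer(aes.decrypt(nonce_img, ct_img, None),
                         dtype=np.uint16).reshape(volume.shape)
assert (restored == volume).all()
assert aes.decrypt(nonce_txt, ct_txt, None) == report
```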

  9. OpenVX-based Python Framework for real-time cross platform acceleration of embedded computer vision applications

    Directory of Open Access Journals (Sweden)

    Ori Heimlich

    2016-11-01

    Full Text Available Embedded real-time vision applications are being rapidly deployed across a large realm of consumer electronics, ranging from automotive safety to surveillance systems. However, the relatively limited computational power of embedded platforms is a bottleneck for many vision applications, necessitating optimization. OpenVX is a standardized interface, released in late 2014, that attempts to provide both system-level and kernel-level optimization to vision applications. With OpenVX, vision processing is modeled with coarse-grained data-flow graphs, which can be optimized and accelerated by the platform implementer. Current full implementations of OpenVX are given in the programming language C, which does not support advanced programming paradigms such as object-oriented and functional programming, nor does it have run-time type checking. Here we present a Python-based full implementation of OpenVX, which eliminates much of the discrepancy between the object-oriented paradigm used by many modern applications and the native C implementations. Our open-source implementation can be used for rapid development of OpenVX applications on embedded platforms. Demonstrations include static and real-time image acquisition and processing using a Raspberry Pi and a GoPro camera. Code is given as supplementary information. The code project and a linked deployable virtual machine are located on GitHub: https://github.com/NBEL-lab/PythonOpenVX.

  10. Strategic Staffing

    Science.gov (United States)

    Clark, Ann B.

    2012-01-01

    Business and industry leaders do not flinch at the idea of placing top talent in struggling departments and divisions. This is not always the case in public education. The Charlotte-Mecklenburg Schools made a bold statement to its community in its strategic plan by identifying two key reform levers--(1) an effective principal leading each school;…

  11. Strategic Equilibrium

    NARCIS (Netherlands)

    van Damme, E.E.C.

    2000-01-01

    An outcome in a noncooperative game is said to be self-enforcing, or a strategic equilibrium, if, whenever it is recommended to the players, no player has an incentive to deviate from it.This paper gives an overview of the concepts that have been proposed as formalizations of this requirement and of

  12. Accelerating an Ordered-Subset Low-Dose X-Ray Cone Beam Computed Tomography Image Reconstruction with a Power Factor and Total Variation Minimization.

    Science.gov (United States)

    Huang, Hsuan-Ming; Hsiao, Ing-Tsung

    2016-01-01

    In recent years, there has been increased interest in low-dose X-ray cone beam computed tomography (CBCT) in many fields, including dentistry, image-guided radiotherapy and small-animal imaging. Despite the reduced radiation dose, low-dose CBCT has not gained widespread acceptance in routine clinical practice. In addition to more evaluation studies, a fast, high-quality reconstruction algorithm is required. In this work, we propose an iterative reconstruction method that accelerates ordered-subsets (OS) reconstruction using a power factor, and we combine it with the total-variation (TV) minimization method. Both simulation and phantom studies were conducted to evaluate the performance of the proposed method. Results show that the proposed method accelerates conventional OS methods, greatly increasing convergence speed in early iterations. Moreover, applying TV minimization to the power-acceleration scheme can further improve image quality while preserving the fast convergence rate.
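
    One plausible reading of the two ingredients named in the abstract, a power factor p > 1 applied to a multiplicative ordered-subsets update plus interleaved TV gradient steps, is sketched below. The authors' exact update formula is in the paper, so treat every detail here (the EM-style update, the TV step sizes, the clipping) as an assumption.

```python
import numpy as np

def tv_gradient(img, eps=1e-8):
    """Subgradient of isotropic TV via forward differences."""
    gx = np.diff(img, axis=0, append=img[-1:])
    gy = np.diff(img, axis=1, append=img[:, -1:])
    mag = np.sqrt(gx**2 + gy**2 + eps)
    div_x = np.diff(gx / mag, axis=0, prepend=np.zeros((1, img.shape[1])))
    div_y = np.diff(gy / mag, axis=1, prepend=np.zeros((img.shape[0], 1)))
    return -(div_x + div_y)

def os_power_tv(A, y, shape, n_iters=10, n_subsets=8, power=1.5,
                tv_weight=0.02, tv_steps=3, eps=1e-12):
    """A: (n_rays, n_pixels) system matrix; y: measured projections."""
    x = np.ones(shape).ravel()
    subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
    for _ in range(n_iters):
        for s in subsets:
            As = A[s]
            ratio = (As.T @ (y[s] / (As @ x + eps))) / (As.sum(axis=0) + eps)
            x *= ratio ** power            # power factor > 1 speeds convergence
        img = x.reshape(shape)
        for _ in range(tv_steps):          # TV regularization step
            img = img - tv_weight * tv_gradient(img)
        x = np.clip(img, eps, None).ravel()   # keep the image positive
    return x.reshape(shape)
```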

  13. Strategic Supply

    Science.gov (United States)

    2006-01-01

    [Only fragments of this record survived extraction: a list of Manugistics customers ("... leaders as Sears, Limited Brands, DHL, Circuit City, Cingular, Nestle and IKEA (Manugistics, 2006)"), a reference to the Strategic Supply Chain Industry Study Group, and a note that IKEA, Sweden's retail icon, turned to Manugistics after a mid-1990s ERP implementation failed to fix its forecasting problems, which had given way to fluctuating inventory levels.]

  14. Strategic Stillness

    DEFF Research Database (Denmark)

    Hupalo, Mariia

    2017-01-01

    Throughout the world, we can observe the visible complexities, ambiguities and activities of continuously overlapping strategic pursuits of different interest groups. Seen this way, the materialities of parking systems can stage and determine contemporary mobilities in two ways: through decisions taken "from above" (design and planning regulations) and "from below" (by humans who choose modes of transport, ways of interacting and times of travel). These entanglements of technology and culture are manifested in parking infrastructures.

  15. A dual computed tomography linear accelerator unit for stereotactic radiation therapy: a new approach without cranially fixated stereotactic frames

    International Nuclear Information System (INIS)

    Uematsu, Minoru; Fukui, Toshiharu; Shioda, Akira; Tokumitsu, Hideyuki; Takai, Kenji; Kojima, Tadaharu; Asai, Yoshiko; Kusano, Shoichi

    1996-01-01

    Purpose: To perform stereotactic radiation therapy (SRT) without cranially fixated stereotactic frames, we developed a dual computed tomography (CT) linear accelerator (linac) treatment unit. Methods and Materials: This unit is composed of a linac, CT, and motorized table. The linac and CT are set up at opposite ends of the table, which is suitable for both machines. The gantry axis of the linac is coaxial with that of the CT scanner. Thus, the center of the target detected with the CT can be matched easily with the gantry axis of the linac by rotating the table. Positioning is confirmed with the CT for each treatment session. Positioning and treatment errors with this unit were examined by phantom studies. Between August and December 1994, 8 patients with 11 lesions of primary or metastatic brain tumors received SRT with this unit. All lesions were treated with 24 Gy in three fractions to 30 Gy in 10 fractions to the 80% isodose line, with or without conventional external beam radiation therapy. Results: Phantom studies revealed that treatment errors with this unit were within 1 mm after careful positioning. The position was easily maintained using two tiny metallic balls as vertical and horizontal marks. Motion of patients was negligible using a conventional heat-flexible head mold and dental impression. The overall time for a multiple noncoplanar arcs treatment for a single isocenter was less than 1 h on the initial treatment day and usually less than 20 min on subsequent days. Treatment was outpatient-based and well tolerated with no acute toxicities. Satisfactory responses have been documented. Conclusion: Using this treatment unit, multiple fractionated SRT is performed easily and precisely without cranially fixated stereotactic frames

  16. Materials and Life Science Experimental Facility at the Japan Proton Accelerator Research Complex III: Neutron Devices and Computational and Sample Environments

    Directory of Open Access Journals (Sweden)

    Kaoru Sakasai

    2017-08-01

    Full Text Available Neutron devices such as neutron detectors, optical devices including supermirror devices and 3He neutron spin filters, and choppers have been successfully developed and installed at the Materials and Life Science Experimental Facility (MLF) of the Japan Proton Accelerator Research Complex (J-PARC), Tokai, Japan. Four software components of the MLF computational environment - instrument control, data acquisition, data analysis, and a database - have been developed and deployed at MLF. MLF also provides a wide variety of sample environment options, including high and low temperatures, high magnetic fields, and high pressures. This paper describes the current status of neutron devices and of the computational and sample environments at MLF.

  17. RF accelerators for fusion and strategic defense

    International Nuclear Information System (INIS)

    Jameson, R.A.

    1985-01-01

    RF linacs have a place in fusion, either in an auxiliary role for materials testing or as direct drivers in heavy-ion fusion. For SDI, particle-beam technology is an attractive candidate for discrimination missions and also for lethality missions. The free-electron laser is also a forerunner among the laser candidates. In many ways, there is less physics development required for these devices, and there is an existing high-power technology. But in all of these technologies, in order to scale them up and then space-base them, there is an enormous amount of work yet to be done.

  18. Strategic Windows

    DEFF Research Database (Denmark)

    Risberg, Annette; King, David R.; Meglio, Olimpia

    We examine the importance of speed and timing in acquisitions with a framework that identifies management considerations for three interrelated acquisition phases (selection, deal closure and integration) from an acquiring firm’s perspective. Using a process perspective, we pinpoint items within acquisition phases that relate to speed. In particular, we present the idea of time-bounded strategic windows in acquisitions, consistent with the notion of kairòs, where opportunities appear and must be pursued at the right time for success to occur.

  19. Strategic Leadership

    Directory of Open Access Journals (Sweden)

    Mohammad Jaradat

    2017-01-01

    Full Text Available Leadership as a concept has been very useful in the last decades, but when it comes to defining and especially to applying strategic leadership theories in the day-to-day life of organizations, things become much more complicated. It is imperative that managers select their basic theoretical need in order to assess an organization's leadership. The following article aims to prove that it is necessary to choose more than one theoretical instrument before applying them to a specific plan, which combines more than one theoretical approach for evaluating and improving strategic leadership in an organization.

  20. Strategic Management

    CERN Document Server

    Jeffs, Chris

    2008-01-01

    The Sage Course Companion on Strategic Management is an accessible introduction to the subject that avoids lengthy debate in order to focus on the core concepts. It will help the reader to develop their understanding of the key theories, whilst enabling them to bring diverse topics together in line with course requirements. The Sage Course Companion also provides advice on getting the most from your course work; help with analysing case studies and tips on how to prepare for examinations. Designed to complement existing strategy textbooks, the Companion provides: -Quick and easy access to the

  1. Replayability in Strategic Computer Games

    OpenAIRE

    Pedersen, Kasper Allan

    2012-01-01

    The empirical material is not included in the file. The idea for this master thesis originates in a curiosity concerning why so many of the strategy games from the 1990's are still being played today. Many of the games still have very active fans that modify their old favourite games for online play just so that they can share their passion with other fans around the world. What is it about these strategy games that promotes the level of replayability that has kept them fresh while s...

  2. Lasers and particle beam for fusion and strategic defense

    International Nuclear Information System (INIS)

    Anon.

    1986-01-01

    This special issue of the Journal of Fusion Energy consists of the edited transcripts of a symposium on the applications of laser and particle beams to fusion and strategic defense. Its eleven papers discuss these topics: the Strategic Defense Initiative; accelerators for heavy ion fusion; rf accelerators for fusion and strategic defense; pulsed power, ICF, and the Strategic Defense Initiative; chemical lasers; the feasibility of KrF lasers for fusion; the damage resistance of coated optics; liquid crystal devices for laser systems; fusion neutral-particle beam research and its contribution to the Star Wars program; and induction linacs and free electron laser amplifiers for ICF devices and directed-energy weapons

  3. Evaluation, using Monte Carlo simulations, of the effect of a shielding, called external shielding, for photoneutrons generated in linear accelerators, using the computational model of the Varian accelerator 2300 C/D operating at eight rotation angles of the gantry

    International Nuclear Information System (INIS)

    Silva, Hugo R.; Silva, Ademir X.; Rebello, Wilson F.; Silva, Maria G.

    2011-01-01

    This paper aims to present the results obtained by Monte Carlo simulation of the effect of a shielding against neutrons, called External Shielding, to be placed on the heads of linear accelerators used in radiotherapy. For this, the Monte Carlo N-Particle radiation transport code MCNPX was used, in which a computational model of the head of the Varian 2300 C/D linear accelerator was developed. The equipment was simulated within a bunker, operating at energies of 10, 15 and 18 MV, considering the rotation of the gantry at eight different angles (0 deg, 45 deg, 90 deg, 135 deg, 180 deg, 225 deg, 270 deg and 315 deg); in all cases, the equipment was modeled with and without the shielding attached to the bottom of the accelerator head. In each of these settings, the ambient dose equivalent due to neutrons, H*(10)n, was calculated at points situated in the region of the patient (region of interest for evaluating undesirable neutron doses to the patient) and in the maze of the radiotherapy room (region of interest for shielding the access door to the bunker). For all operating energies and all gantry angles, a significant reduction in the values of H*(10)n was observed when the equipment operated with the external shielding, both in the region of the patient and in the region of the maze. (author)

  4. Strategic plan

    International Nuclear Information System (INIS)

    1993-01-01

    In November 1989, the Office of Environmental Restoration and Waste Management (EM) was formed within the US Department of Energy (DOE). The EM Program was born of the recognition that a significant national effort was necessary to clean up over 45 years' worth of environmental pollution from DOE operations, including the design and manufacture of nuclear materials and weapons. Within EM, the Deputy Assistant Secretary for Environmental Restoration (EM-40) has been assigned responsibility for the assessment and cleanup of areas and facilities that are no longer a part of active DOE operations, but may be contaminated with varying levels and quantities of hazardous, radioactive, and mixed waste. Decontamination and decommissioning (D&D) activities are managed as an integral part of Environmental Restoration cleanup efforts. The Office of Environmental Restoration ensures that risks to the environment and to human health and safety are either eliminated or reduced to prescribed, acceptable levels. This Strategic Plan has been developed to articulate the vision of the Deputy Assistant Secretary for Environmental Restoration and to crystallize the specific objectives of the Environmental Restoration Program. The document summarizes the key planning assumptions that guide or constrain the strategic planning effort, outlines the Environmental Restoration Program's specific objectives, and identifies barriers that could limit the Program's success

  5. Strategic Planning: What's so Strategic about It?

    Science.gov (United States)

    Strong, Bart

    2005-01-01

    The words "strategic" and "planning" used together can lead to confusion unless one spent the early years of his career in never-ending, team-oriented, corporate training sessions. Doesn't "strategic" have something to do with extremely accurate bombing or a defensive missile system or Star Wars or something? Don't "strategic" and "planning" both…

  6. 7 March 2013 - Stanford University Professor N. McKeown FREng, Electrical Engineering and Computer Science, and B. Leslie, Creative Labs, visiting the CERN Control Centre and the LHC tunnel with Director for Accelerators and Technology S. Myers.

    CERN Multimedia

    Anna Pantelia

    2013-01-01

    7 March 2013 - Stanford University Professor N. McKeown FREng, Electrical Engineering and Computer Science, and B. Leslie, Creative Labs, visiting the CERN Control Centre and the LHC tunnel with Director for Accelerators and Technology S. Myers.

  7. Strategic Leadership Primer (Third Edition)

    Science.gov (United States)

    2010-01-01

    Strategic Decision Making; Strategic Change: there are several strategic decisions involved... The Ontology of Strategic Decision Making: strategic decisions are non-routine and involve both the art of leadership and the science of management... "building consensus" implicitly requires the capacity for strategic decision making. The Complexity of Strategic Decision Making...

  8. A heterogeneous computing accelerated SCE-UA global optimization method using OpenMP, OpenCL, CUDA, and OpenACC.

    Science.gov (United States)

    Kan, Guangyuan; He, Xiaoyan; Ding, Liuqian; Li, Jiren; Liang, Ke; Hong, Yang

    2017-10-01

    The shuffled complex evolution optimization developed at the University of Arizona (SCE-UA) has been successfully applied for many years in various kinds of scientific and engineering optimization applications, such as hydrological model parameter calibration. The algorithm possesses good global optimality, convergence stability and robustness. However, benchmark and real-world applications reveal the poor computational efficiency of the SCE-UA. This research aims at the parallelization and acceleration of the SCE-UA method based on powerful heterogeneous computing technology. The parallel SCE-UA is implemented on an Intel Xeon multi-core CPU (by using OpenMP and OpenCL) and an NVIDIA Tesla many-core GPU (by using OpenCL, CUDA, and OpenACC). The serial and parallel SCE-UA were tested on the Griewank benchmark function. Comparison results indicate that the parallel SCE-UA significantly improves computational efficiency compared to the original serial version. The OpenCL implementation obtains the best overall acceleration results, albeit with the most complex source code. The parallel SCE-UA has bright prospects for application to real-world problems.
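
    The costly inner step of SCE-UA is the batch of objective-function evaluations. A minimal Python analogue of the multi-core parallelization, using the Griewank benchmark named in the abstract (a sketch, not the authors' OpenMP/OpenCL/CUDA code):

        import numpy as np
        from multiprocessing import Pool

        def griewank(x):
            # Griewank benchmark: global minimum 0 at x = 0
            i = np.arange(1, len(x) + 1)
            return np.sum(x**2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i))) + 1.0

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            # one "complex" of candidate parameter sets, evaluated in parallel
            population = [rng.uniform(-600, 600, size=30) for _ in range(1024)]
            with Pool() as pool:          # CPU worker pool as an OpenMP analogue
                fitness = pool.map(griewank, population)
            print(min(fitness))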

  9. Implementation Of Strategic Management

    African Journals Online (AJOL)

    Administrator

    Creativity and innovation is the new game plan inherent in strategic .... The diagram below is a simplified operational model of strategic management, ..... Bryson (1995) outlines four benefits of strategic (planning) Management in his ... champions, good strategic planning teams, enough slack to handle potentially disruptive.

  10. Strategic Polarization.

    Science.gov (United States)

    Kalai, Adam; Kalai, Ehud

    2001-08-01

    In joint decision making, similarly minded people may take opposite positions. Consider the example of a marriage in which one spouse gives generously to charity while the other donates nothing. Such "polarization" may misrepresent what is, in actuality, a small discrepancy in preferences. It may be that the donating spouse would like to see 10% of their combined income go to charity each year, while the apparently frugal spouse would like to see 8% donated. A simple game-theoretic analysis suggests that the spouses will end up donating 10% and 0%, respectively. By generalizing this argument to a larger class of games, we provide a strategic justification for polarization in many situations such as debates, shared living accommodations, and disciplining children. In some of these examples, an arbitrarily small disagreement in preferences leads to an arbitrarily large loss in utility for all participants. Such small disagreements may also destabilize what, from a game-theoretic point of view, is a very stable equilibrium. Copyright 2001 Academic Press.
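
    The spouses example can be reproduced with a toy best-response iteration (a simplified reading of the game, with hypothetical starting donations):

        def best_response(target, other_donation):
            # donate whatever is still missing to reach your preferred total, never negative
            return max(target - other_donation, 0.0)

        a, b = 0.05, 0.05              # initial donations as fractions of combined income
        for _ in range(20):            # iterate best responses until they settle
            a = best_response(0.10, b)     # spouse preferring 10% total donated
            b = best_response(0.08, a)     # spouse preferring 8% total donated
        print(a, b)                    # -> 0.10 and 0.0: full polarization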

  11. Monte Carlo computation of Bremsstrahlung intensity and energy spectrum from a 15 MV linear electron accelerator tungsten target to optimise LINAC head shielding

    International Nuclear Information System (INIS)

    Biju, K.; Sharma, Amiya; Yadav, R.K.; Kannan, R.; Bhatt, B.C.

    2003-01-01

    Knowledge of the exact photon intensity and energy distributions from the target of an electron accelerator is necessary when designing the shielding for the accelerator head from a radiation safety point of view. Computations were carried out for the intensity and energy distribution of the photon spectrum from a 0.4 cm thick tungsten target in different angular directions for 15 MeV electrons using the validated Monte Carlo code MCNP4A. Similar results were computed for 30 MeV electrons and found to agree with the data available in the literature. These graphs and the TVT values in lead help to suggest an optimum shielding thickness for a 15 MV linac head. (author)
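
    Tenth-value thickness (TVT) data translate into shielding estimates through a simple base-10 attenuation relation; a sketch with illustrative numbers only (real TVT values come from the cited graphs and literature):

        import math

        def shield_thickness(attenuation_needed, tvt_cm):
            """Thickness giving a required attenuation factor, from the tenth-value
            thickness: each TVT of material reduces intensity by a factor of 10."""
            return tvt_cm * math.log10(attenuation_needed)

        # hypothetical numbers for illustration: a factor-1000 reduction needs 3 TVTs
        print(shield_thickness(attenuation_needed=1000, tvt_cm=5.7))   # -> 17.1 cm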

  12. Accelerating population balance-Monte Carlo simulation for coagulation dynamics from the Markov jump model, stochastic algorithm and GPU parallel computing

    International Nuclear Information System (INIS)

    Xu, Zuwei; Zhao, Haibo; Zheng, Chuguang

    2015-01-01

    This paper proposes a comprehensive framework for accelerating population balance-Monte Carlo (PBMC) simulation of particle coagulation dynamics. By combining a Markov jump model, a weighted majorant kernel and GPU (graphics processing unit) parallel computing, a significant gain in computational efficiency is achieved. The Markov jump model constructs a coagulation-rule matrix of differentially-weighted simulation particles, so as to capture the time evolution of the particle size distribution with low statistical noise over the full size range and, as far as possible, to reduce the number of time loops. Here three coagulation rules are highlighted, and it is found that constructing an appropriate coagulation rule provides a route to a compromise between the accuracy and cost of PBMC methods. Further, in order to avoid double looping over all simulation particles when considering two-particle events (typically, particle coagulation), the weighted majorant kernel is introduced to estimate the maximum coagulation rates used for acceptance-rejection processes by single-looping over all particles, and meanwhile the mean time-step of a coagulation event is estimated by summing the coagulation kernels of rejected and accepted particle pairs. The computational load of these fast differentially-weighted PBMC simulations (based on the Markov jump model) is reduced greatly, becoming proportional to the number of simulation particles in a zero-dimensional system (single cell). Finally, for a spatially inhomogeneous multi-dimensional (multi-cell) simulation, the proposed fast PBMC is performed in each cell, and multiple cells are processed in parallel by the many cores of a GPU, which can execute massively threaded data-parallel tasks to obtain a remarkable speedup (compared with CPU computation, the speedup of GPU parallel computing is as high as 200 in a case of 100 cells with 10 000 simulation particles per cell). These accelerating approaches of PBMC are
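
    The acceptance-rejection step against a majorant kernel can be sketched generically (the kernel and majorant below are illustrative, not the paper's weighted scheme; what makes the accept probability valid is the bound K_hat >= K everywhere):

        import numpy as np

        rng = np.random.default_rng(1)
        v = rng.uniform(1.0, 10.0, 1000)             # particle volumes, arbitrary units

        def kernel(vi, vj):
            # example Brownian-like coagulation kernel
            return (vi**(1/3) + vj**(1/3))**2 * np.sqrt(1/vi + 1/vj)

        def majorant(vi, vj):
            # cheap upper bound: (a+b)^2 <= 2(a^2+b^2) and sqrt(x+y) <= sqrt(x)+sqrt(y)
            return 4.0 * (vi**(2/3) + vj**(2/3)) * (np.sqrt(1/vi) + np.sqrt(1/vj))

        # acceptance-rejection: sample pairs against the majorant, accept with K/K_hat
        while True:
            i, j = rng.integers(0, len(v), 2)
            if i == j:
                continue
            if rng.random() < kernel(v[i], v[j]) / majorant(v[i], v[j]):
                break                                # pair (i, j) chosen to coagulate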

  13. Contribution to the algorithmic and efficient programming of new parallel architectures including accelerators for neutron physics and shielding computations

    International Nuclear Information System (INIS)

    Dubois, J.

    2011-01-01

    In science, simulation is a key process for research and validation. Modern computer technology allows faster numerical experiments, which are cheaper than real models. In the field of neutron simulation, the calculation of eigenvalues is one of the key challenges. The complexity of these problems is such that a lot of computing power may be necessary. The work of this thesis is, first, the evaluation of new computing hardware such as graphics cards or massively multi-core chips, and their application to eigenvalue problems for neutron simulation. Then, in order to address the massive parallelism of national supercomputers, we also study the use of asynchronous hybrid methods for solving eigenvalue problems at this very high level of parallelism. We then run the experiments of this research on several national supercomputers, such as the Titane hybrid machine of the Computing Centre for Research and Technology (CCRT), the Curie machine of the Very Large Computing Centre (TGCC), currently being installed, and the Hopper machine at the Lawrence Berkeley National Laboratory (LBNL). We also run our experiments on local workstations to illustrate the interest of this research for everyday use with local computing resources. (author) [fr]
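
    The dominant-eigenpair computation at the heart of such neutron criticality solvers is classically the power iteration; a minimal serial sketch (the thesis's asynchronous hybrid methods are not reproduced here):

        import numpy as np

        def power_iteration(A, tol=1e-10, max_iter=10_000):
            """Dominant eigenpair by power iteration - the serial building block
            that large-scale eigenvalue solvers parallelize and accelerate."""
            x = np.ones(A.shape[0])
            lam = 0.0
            for _ in range(max_iter):
                y = A @ x
                lam_new = np.linalg.norm(y)
                x = y / lam_new              # renormalize the iterate
                if abs(lam_new - lam) < tol:
                    break
                lam = lam_new
            return lam, x

        A = np.array([[4.0, 1.0], [2.0, 3.0]])
        lam, vec = power_iteration(A)        # dominant eigenvalue of this matrix is 5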

  14. Accelerator shielding benchmark problems

    International Nuclear Information System (INIS)

    Hirayama, H.; Ban, S.; Nakamura, T.

    1993-01-01

    Accelerator shielding benchmark problems prepared by Working Group of Accelerator Shielding in the Research Committee on Radiation Behavior in the Atomic Energy Society of Japan were compiled by Radiation Safety Control Center of National Laboratory for High Energy Physics. Twenty-five accelerator shielding benchmark problems are presented for evaluating the calculational algorithm, the accuracy of computer codes and the nuclear data used in codes. (author)

  15. Final Report on Institutional Computing Project s15_hilaserion, “Kinetic Modeling of Next-Generation High-Energy, High-Intensity Laser-Ion Accelerators as an Enabling Capability”

    Energy Technology Data Exchange (ETDEWEB)

    Albright, Brian James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Yin, Lin [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Stark, David James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-06

    This proposal sought on the order of 1M core-hours of Institutional Computing time, intended to enable computing by a new LANL postdoc (David Stark) working under LDRD ER project 20160472ER (PI: Lin Yin) on laser-ion acceleration. The project was “off-cycle,” initiating in June of 2016 with a postdoc hire.

  16. The Talent Development Middle School. An Elective Replacement Approach to Providing Extra Help in Math--The CATAMA Program (Computer- and Team-Assisted Mathematics Acceleration). Report No. 21.

    Science.gov (United States)

    Mac Iver, Douglas J.; Balfanz, Robert; Plank, Stephen B.

    In Talent Development Middle Schools, students needing extra help in mathematics participate in the Computer- and Team-Assisted Mathematics Acceleration (CATAMA) course. CATAMA is an innovative combination of computer-assisted instruction and structured cooperative learning that students receive in addition to their regular math course for about…

  17. Can Accelerators Accelerate Learning?

    International Nuclear Information System (INIS)

    Santos, A. C. F.; Fonseca, P.; Coelho, L. F. S.

    2009-01-01

    The 'Young Talented' education program developed by the Brazilian State Funding Agency (FAPERJ)[1] makes it possible for high-school students from public high schools to perform activities in scientific laboratories. In the Atomic and Molecular Physics Laboratory at the Federal University of Rio de Janeiro (UFRJ), the students are confronted with modern research tools like the 1.7 MV ion accelerator. Being a user-friendly machine, the accelerator is easily manageable by the students, who can perform simple hands-on activities, stimulating interest in physics and getting the students close to modern laboratory techniques.

  18. Can Accelerators Accelerate Learning?

    Science.gov (United States)

    Santos, A. C. F.; Fonseca, P.; Coelho, L. F. S.

    2009-03-01

    The 'Young Talented' education program developed by the Brazilian State Funding Agency (FAPERJ) [1] makes it possible for high-school students from public high schools to perform activities in scientific laboratories. In the Atomic and Molecular Physics Laboratory at the Federal University of Rio de Janeiro (UFRJ), the students are confronted with modern research tools like the 1.7 MV ion accelerator. Being a user-friendly machine, the accelerator is easily manageable by the students, who can perform simple hands-on activities, stimulating interest in physics and getting the students close to modern laboratory techniques.

  19. Microprocessor based beam intensity and efficiency display system for the Fermilab accelerator

    International Nuclear Information System (INIS)

    Biwer, R.

    1979-01-01

    The Main Accelerator display system for the Fermilab accelerator gathers charge data and displays it, including the processed transfer efficiencies of each of the accelerators. To accomplish this, strategically located charge converters monitor the circulating internal beam of each of the Fermilab accelerators. Their outputs are processed via an asynchronously triggered, multiplexed analog-to-digital converter. The data is converted into a digital byte containing an address code and data, then stored in two 16-bit memories. One memory outputs the interleaved data as a data pulse train while the other interfaces directly to a local host computer for further analysis. The microprocessor-based display unit synchronizes displayed data during normal operation as well as in special storage modes. The display unit outputs data to the front panel in the form of a numeric value and also makes digital-to-analog conversions of displayed data for external peripheral devices. 5 refs
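
    The transfer efficiencies mentioned are simply ratios of charge-converter readings between successive machines; a sketch with hypothetical numbers:

        def transfer_efficiency(charge_upstream, charge_downstream):
            """Fraction of beam charge surviving transfer between two accelerators."""
            return charge_downstream / charge_upstream if charge_upstream else 0.0

        # hypothetical charge-converter readings along an accelerator chain
        readings = {"linac": 1.00e13, "booster": 8.7e12, "main_ring": 7.9e12}
        print(transfer_efficiency(readings["linac"], readings["booster"]))      # ~0.87
        print(transfer_efficiency(readings["booster"], readings["main_ring"]))  # ~0.91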

  20. New challenges for HEP computing: RHIC [Relativistic Heavy Ion Collider] and CEBAF [Continuous Electron Beam Accelerator Facility]

    International Nuclear Information System (INIS)

    LeVine, M.J.

    1990-01-01

    We will look at two facilities: RHIC and CEBAF. CEBAF is in the construction phase; RHIC is about to begin construction. For each of them, we examine the kinds of physics measurements that motivated their construction, and the implications of these experiments for computing. Emphasis will be on on-line requirements, driven by the data rates produced by these experiments

  1. Application of the multicriterion optimization techniques and hierarchy of computational models to the research of ion acceleration due to laser-plasma interaction

    Science.gov (United States)

    Inovenkov, I. N.; Echkina, E. Yu.; Nefedov, V. V.; Ponomarenko, L. S.

    2017-12-01

    In this paper we discuss how a particle-in-cell computation code can be combined with methods of multicriterion optimization (in particular the Pareto optimal solutions of the multicriterion optimization problem) and a hierarchy-of-computational-models approach to create an efficient tool for solving a wide array of problems related to laser-plasma interaction. In a computational experiment, multicriterion optimization can be applied as follows: the researcher defines the objectives of the experiment as computable scalar values (e.g., high kinetic energy of the ions leaving the domain, or the least possible number of electrons leaving the domain in a given direction). After that, the parameters of the experiment that can be varied to achieve these objectives, and the constraints on these parameters, are chosen (e.g., amplitude and wavelength of the laser radiation, dimensions of the plasma slab(s)). The Pareto optimality of a parameter vector can be stated as follows: x0 is Pareto optimal if there exists no vector which would improve some criterion without causing a simultaneous degradation in at least one other criterion. The efficient set of parameters and constraints can be selected based on preliminary calculations in simplified (one- or two-dimensional) models, either analytical or numerical. The multistage computation of the Pareto set radically reduces the number of variants which have to be evaluated to achieve a given accuracy. During the final stage we further improve the results by recomputing some of the optimal variants on finer grids, with more particles, and/or in the frame of a more detailed model. As an example we have considered the ion acceleration caused by the interaction of very intense and ultra-short laser pulses with plasmas, and have calculated the optimal set of experiment parameters for optimizing the number and average energy of high-energy ions leaving the domain in a given direction and minimizing the expulsion
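
    The Pareto filtering described can be sketched directly from the definition quoted above (criteria to be minimized can be negated first; the example scores are made up):

        import numpy as np

        def pareto_optimal(scores):
            """Boolean mask of Pareto-optimal rows; every criterion is maximized.
            A row is dominated if some other row is >= everywhere and > somewhere."""
            n = scores.shape[0]
            optimal = np.ones(n, dtype=bool)
            for i in range(n):
                dominated = (np.all(scores >= scores[i], axis=1)
                             & np.any(scores > scores[i], axis=1))
                if np.any(dominated):
                    optimal[i] = False
            return optimal

        # e.g., criteria per simulated variant: (mean ion energy, number of fast ions)
        scores = np.array([[3.0, 40], [2.5, 55], [3.2, 30], [2.0, 20]])
        print(pareto_optimal(scores))   # -> [ True  True  True False ]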

  2. Jesus the Strategic Leader

    National Research Council Canada - National Science Library

    Martin, Gregg

    2000-01-01

    Jesus was a great strategic leader who changed the world in many ways. Close study of what he did and how he did it reveals a pattern of behavior that is extremely useful and relevant to the modern strategic leader...

  3. Learning without experience: Understanding the strategic implications of deregulation and competition in the electricity industry

    Energy Technology Data Exchange (ETDEWEB)

    Lomi, A. [School of Economics, University of Bologna, Bologna (Italy); Larsen, E.R. [Dept. of Managements Systems and Information, City University Business School, London (United Kingdom)

    1998-11-01

    As deregulation of the electricity industry continues to gain momentum around the world, electricity companies face unprecedented challenges. Competitive complexity and intensity will increase substantially as deregulated companies find themselves competing in new industries, with new rules, against unfamiliar competitors - and without any history to learn from. We describe the different kinds of strategic issues that newly deregulated utility companies are facing, and the risks that these strategic issues entail. We identify a number of problems induced by experiential learning under conditions of competence-destroying change, and we illustrate ways in which companies can activate history-independent learning processes. We suggest that microworlds - a new generation of computer-based learning environments made possible by conceptual and technological progress in the fields of system dynamics and systems thinking - are particularly appropriate tools to accelerate and enhance organizational and managerial learning under conditions of increased competitive complexity. (au)

  4. Measurements and computer modeling of fast ion emission from plasma accelerators of the rod plasma injector type

    International Nuclear Information System (INIS)

    Malinowski, Karol; Sadowski, Marek J; Skladnik-Sadowska, Elzbieta

    2014-01-01

    This paper reports on the results of experimental studies and computer simulations of the emission of fast ion streams from so-called rod plasma injectors (RPI). Various RPI facilities have been used at the National Centre for Nuclear Research (NCBJ) for basic plasma studies as well as for material engineering. In fact, the RPI facilities have been studied experimentally for many years, particularly at the Institute for Nuclear Sciences (now the NCBJ), and numerous experimental data have been collected. Unfortunately, the ion emission characteristics have so far not been explained theoretically in a satisfactory way. In this paper, in order to explain these characteristics, use was made of a single-particle model. Taking into account the stochastic character of the ion emission, we applied a Monte Carlo method. The performed computer simulations of a pinhole image and energy spectrum of deuterons emitted from RPI-IBIS, which were computed on the basis of the applied model, appeared to be in reasonable agreement with the experimental data. (paper)

  5. Linear Accelerators

    International Nuclear Information System (INIS)

    Vretenar, M

    2014-01-01

    The main features of radio-frequency linear accelerators are introduced, reviewing the different types of accelerating structures and presenting the main characteristics of linac beam dynamics.

  6. Interacting with accelerators

    International Nuclear Information System (INIS)

    Dasgupta, S.

    1994-01-01

    Accelerators are research machines which produce energetic particle beams for use as projectiles to effect nuclear reactions. These machines, along with their services and facilities, may occupy very large areas. The man-machine interface of accelerators has evolved with technological changes in the computer industry and may be partitioned into three phases. The present paper traces the evolution of the man-machine interface from the earliest accelerators to the present computerized systems incorporated in modern accelerators. It also discusses the advantages of incorporating expert system technology for assisting operators. (author). 8 ref

  7. Strategic information security

    CERN Document Server

    Wylder, John

    2003-01-01

    Introduction to Strategic Information Security: What Does It Mean to Be Strategic?; Information Security Defined; The Security Professional's View of Information Security; The Business View of Information Security; Changes Affecting Business and Risk Management; Strategic Security; Strategic Security or Security Strategy?; Monitoring and Measurement; Moving Forward. Organizational Issues: The Life Cycles of Security Managers; Introduction; The Information Security Manager's Responsibilities; The Evolution of Data Security to Information Security; The Repository Concept; Changing Job Requirements; Business Life Cycles

  8. Mapping the Information Trace in Local Field Potentials by a Computational Method of Two-Dimensional Time-Shifting Synchronization Likelihood Based on Graphic Processing Unit Acceleration.

    Science.gov (United States)

    Zhao, Zi-Fang; Li, Xue-Zhu; Wan, You

    2017-12-01

    The local field potential (LFP) is a signal reflecting the electrical activity of neurons surrounding the electrode tip. Synchronization between LFP signals provides important details about how neural networks are organized. Synchronization between two distant brain regions is hard to detect using linear synchronization algorithms like correlation and coherence. Synchronization likelihood (SL) is a non-linear synchronization-detecting algorithm widely used in studies of neural signals from two distant brain areas. One drawback of non-linear algorithms is the heavy computational burden. In the present study, we proposed a graphic processing unit (GPU)-accelerated implementation of an SL algorithm with optional 2-dimensional time-shifting. We tested the algorithm with both artificial data and raw LFP data. The results showed that this method revealed detailed information from original data with the synchronization values of two temporal axes, delay time and onset time, and thus can be used to reconstruct the temporal structure of a neural network. Our results suggest that this GPU-accelerated method can be extended to other algorithms for processing time-series signals (like EEG and fMRI) using similar recording techniques.
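
    As a much simpler linear stand-in for the two temporal axes - delay time and onset time - one can compute a windowed correlation map over both; this is not the SL algorithm itself, just an illustration of the 2-D time-shifting idea:

        import numpy as np

        def delay_onset_map(x, y, win=200, max_delay=50, step=100):
            """Windowed correlation of two signals over a grid of onset times and
            delays; a linear stand-in for the 2-D time-shifting synchronization map."""
            onsets = np.arange(max_delay, len(x) - win - max_delay, step)
            delays = np.arange(-max_delay, max_delay + 1)
            out = np.zeros((len(onsets), len(delays)))
            for i, t in enumerate(onsets):
                a = x[t:t + win]
                for j, d in enumerate(delays):
                    b = y[t + d:t + d + win]
                    out[i, j] = np.corrcoef(a, b)[0, 1]   # one (onset, delay) cell
            return out, onsets, delays

        rng = np.random.default_rng(0)
        sig = rng.standard_normal(5000)
        lagged = np.roll(sig, 7) + 0.5 * rng.standard_normal(5000)
        m, onsets, delays = delay_onset_map(sig, lagged)
        print(delays[m.mean(axis=0).argmax()])            # recovers the 7-sample delay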

  9. Vol. 34 - Optimization of quench protection heater performance in high-field accelerator magnets through computational and experimental analysis

    CERN Document Server

    Salmi, Tiina

    2016-01-01

    Superconducting accelerator magnets with increasingly high magnetic fields are being designed to improve the performance of the Large Hadron Collider (LHC) at CERN. One of the technical challenges is the magnet quench protection, i.e., preventing damage in the case of an unexpected loss of superconductivity and the heat generation related to that. Traditionally this is done by disconnecting the magnet current supply and using so-called protection heaters. The heaters suppress the superconducting state across a large fraction of the winding, thus leading to a uniform dissipation of the stored energy. Preliminary studies suggested that the high-field Nb3Sn magnets under development for the LHC luminosity upgrade (HiLumi) could not be reliably protected using the existing heaters. In this thesis work I analyzed in detail the present state-of-the-art protection heater technology, aiming to optimize its performance and evaluate the prospects in high-field magnet protection. The heater efficiency analyses ...

  10. Strategic growth options

    NARCIS (Netherlands)

    Kulatilaka, N.; Perotti, E.C.

    1998-01-01

    We provide a strategic rationale for growth options under uncertainty and imperfect competition. In a market with strategic competition, investment confers a greater capability to take advantage of future growth opportunities. This strategic advantage leads to the capture of a greater share of the

  11. Strategic marketing research

    NARCIS (Netherlands)

    Bijmolt, Tammo H.A.; Frambach, Ruud T.; Verhallen, Theo M.M.

    1996-01-01

    This article introduces the term “strategic marketing research” for the collection and analysis of data in support of strategic marketing management. In particular, strategic marketing research plays an important role in defining the market, analysis of the environment, and the formulation of

  12. Compiling for Application Specific Computational Acceleration in Reconfigurable Architectures Final Report CRADA No. TSB-2033-01

    Energy Technology Data Exchange (ETDEWEB)

    De Supinski, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Caliga, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-28

    The primary objective of this project was to develop memory optimization technology to efficiently deliver data to, and distribute data within, the SRC-6's Field Programmable Gate Array ("FPGA")-based Multi-Adaptive Processors (MAPs). The hardware/software approach was to explore efficient MAP configurations and generate the compiler technology to exploit those configurations. This memory accessing technology represents an important step towards making reconfigurable symmetric multi-processor (SMP) architectures a cost-effective solution for large-scale scientific computing.

  13. Computer-Aided Detection with a Portable Electrocardiographic Recorder and Acceleration Sensors for Monitoring Obstructive Sleep Apnea

    Directory of Open Access Journals (Sweden)

    Ji-Won Baek

    2014-03-01

    Full Text Available Obstructive sleep apnea syndrome is a sleep-related breathing disorder that is caused by obstruction of the upper airway. This condition may be related to many clinical sequelae such as cardiovascular disease, high blood pressure, stroke, diabetes, and clinical depression. To diagnose obstructive sleep apnea, in-laboratory full polysomnography is considered the standard test to determine the severity of respiratory disturbance. However, polysomnography is expensive and complicated to perform. In this research, we explore a computer-aided diagnosis system with portable ECG equipment and a tri-axial accelerometer (x, y, and z axes) that can automatically analyze biosignals and test for OSA. Traditional approaches to sleep apnea data analysis have been criticized; however, few suggestions to resolve the existing problems have been made. As an effort to resolve this issue, we developed an approach to record ECG signals and the abdominal movements induced by breathing by affixing ECG-enabled electrodes onto a triaxial accelerometer. With the two signals measured simultaneously, the apnea data obtained would be more accurate relative to cases where a single signal is measured, which would be helpful in diagnosing OSA. Moreover, a useful feature point can be extracted from the two signals after applying a signal processing algorithm, and the extracted feature point can be applied in designing a computer-aided diagnosis algorithm using a machine learning technique.
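
    A toy sketch of the two-channel feature extraction (synthetic signals and an assumed sampling rate; not the paper's algorithm):

        import numpy as np
        from scipy.signal import find_peaks

        fs = 100                                   # assumed sampling rate (Hz)
        t = np.arange(0, 60, 1 / fs)
        ecg = np.sin(2 * np.pi * 1.2 * t) ** 21    # crude ECG-like spikes at ~72 bpm
        accel_z = np.sin(2 * np.pi * 0.25 * t)     # abdominal motion, ~15 breaths/min

        # R-peak intervals from the ECG channel
        peaks, _ = find_peaks(ecg, height=0.5, distance=fs // 2)
        rr = np.diff(peaks) / fs                   # seconds between beats

        # breathing amplitude from the accelerometer channel (10 s windows)
        win = 10 * fs
        amp = np.array([np.ptp(accel_z[i:i + win])
                        for i in range(0, len(accel_z) - win, win)])

        features = [rr.mean(), rr.std(), amp.mean()]   # toy vector for a classifier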

  14. Learning to think strategically.

    Science.gov (United States)

    1994-01-01

    Strategic thinking focuses on issues that directly affect the ability of a family planning program to attract and retain clients. This issue of "The Family Planning Manager" outlines the five steps of strategic thinking in family planning administration: 1) define the organization's mission and strategic goals; 2) identify opportunities for improving quality, expanding access, and increasing demand; 3) evaluate each option in terms of its compatibility with the organization's goals; 4) select an option; and 5) transform strategies into action. Also included in this issue is a 20-question test designed to permit readers to assess their "strategic thinking quotient" and a list of sample questions to guide a strategic analysis.

  15. Accelerator Service

    International Nuclear Information System (INIS)

    Champelovier, Y.; Ferrari, M.; Gardon, A.; Hadinger, G.; Martin, J.; Plantier, A.

    1998-01-01

    Since the hydrogen cluster accelerator ceased operation in July 1996, four electrostatic accelerators have been in operation, used by the peri-nuclear teams working in multidisciplinary collaborations. These are the 4 MV Van de Graaff accelerator, the 2.5 MV Van de Graaff accelerator, the 400 kV ion implanter, and the 120 kV isotope separator

  16. Theoretical problems in accelerator physics

    International Nuclear Information System (INIS)

    1992-01-01

    This report discusses the following research on accelerators: computational methods; higher-order mode suppression in accelerator structures; overmoded waveguide components and application to SLED II and power transport; rf sources; accelerator cavity design for a B-factory asymmetric collider; and photonic band gap cavities

  17. GPU-based implementation of an accelerated SR-NLUT based on N-point one-dimensional sub-principal fringe patterns in computer-generated holograms

    Directory of Open Access Journals (Sweden)

    Hee-Min Choi

    2015-06-01

    Full Text Available An accelerated spatial redundancy-based novel-look-up-table (A-SR-NLUT) method based on a new concept of the N-point one-dimensional sub-principal fringe pattern (N-point 1-D sub-PFP) is implemented on a graphics processing unit (GPU) for fast calculation of computer-generated holograms (CGHs) of three-dimensional (3-D) objects. Since the proposed method can generate the N-point two-dimensional (2-D) PFPs for CGH calculation from the pre-stored N-point 1-D PFPs, the loading time of the N-point PFPs on the GPU can be dramatically reduced, which results in a great increase of the computational speed of the proposed method. Experimental results confirm that the average calculation time for one object point has been reduced by 49.6% and 55.4% compared to those of the conventional 2-D SR-NLUT methods for the cases of the 2-point and 3-point SR maps, respectively.
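
    The reason 1-D sub-PFPs suffice is that the Fresnel fringe exp(i*pi*(x^2+y^2)/(lambda*z)) factorizes into an outer product of two 1-D complex exponentials; a sketch with assumed optical parameters:

        import numpy as np

        wavelength = 532e-9              # assumed laser wavelength (m)
        z = 0.5                          # assumed object-point depth (m)
        pitch = 8e-6                     # assumed hologram pixel pitch (m)
        N = 512

        x = (np.arange(N) - N / 2) * pitch
        phase_1d = np.exp(1j * np.pi * x**2 / (wavelength * z))   # 1-D sub-PFP

        # 2-D PFP from the stored 1-D pattern: exp(i*a*(x^2 + y^2)) separates into
        # exp(i*a*x^2) * exp(i*a*y^2), i.e., an outer product of the 1-D patterns
        pfp_2d = np.real(np.outer(phase_1d, phase_1d))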

  18. Computationally efficient dynamic modeling of robot manipulators with multiple flexible-links using acceleration-based discrete time transfer matrix method

    DEFF Research Database (Denmark)

    Zhang, Xuping; Sørensen, Rasmus; RahbekIversen, Mathias

    2018-01-01

    This paper presents a novel and computationally efficient modeling method for the dynamics of flexible-link robot manipulators. In this method, a robot manipulator is decomposed into components/elements. The component/element dynamics is established using Newton–Euler equations, and then is linearized based on the acceleration-based state vector. The transfer matrices for each type of component/element are developed, and used to establish the system equations of a flexible robot manipulator by concatenating the state vector from the base to the end-effector. With this strategy, the size ... manipulators, and only involves calculating and transferring component/element dynamic equations that have small size. The numerical simulations and experimental testing of flexible-link manipulators are conducted to validate the proposed methodologies.

  19. Cultivating strategic thinking skills.

    Science.gov (United States)

    Shirey, Maria R

    2012-06-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches helpful to nurse leaders advancing organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the author presents an overview of strategic leadership and offers approaches for cultivating strategic thinking skills.

  20. Commissioning the GTA accelerator

    International Nuclear Information System (INIS)

    Sander, O.R.; Atkins, W.H.; Bolme, G.O.; Bowling, S.; Brown, S.; Cole, R.; Gilpatrick, J.D.; Garnett, R.; Guy, F.W.; Ingalls, W.B.; Johnson, K.F.; Kerstiens, D.; Little, C.; Lohsen, R.A.; Lloyd, S.; Lysenko, W.P.; Mottershead, C.T.; Neuschaefer, G.; Power, J.; Rusthoi, D.P.; Sandoval, D.P.; Stevens, R.R. Jr.; Vaughn, G.; Wadlinger, E.A.; Yuan, V.; Connolly, R.; Weiss, R.; Saadatmand, K.

    1992-01-01

    The Ground Test Accelerator (GTA) is supported by the Strategic Defense Command as part of their Neutral Particle Beam (NPB) program. Neutral particles have the advantage that in space they are unaffected by the earth's magnetic field and travel in straight lines unless they enter the earth's atmosphere and become charged by stripping. Heavy particles are difficult to stop and can probe the interior of space vehicles; hence, NPB can function as a discriminator between warheads and decoys. We are using GTA to resolve the physics and engineering issues related to accelerating, focusing, and steering a high-brightness, high-current H- beam and then neutralizing it. Our immediate goal is to produce a 24-MeV, 50-mA device with a 2% duty factor

  1. Future accelerators (?)

    Energy Technology Data Exchange (ETDEWEB)

    John Womersley

    2003-08-21

    I describe the future accelerator facilities that are currently foreseen for electroweak scale physics, neutrino physics, and nuclear structure. I will explore the physics justification for these machines, and suggest how the case for future accelerators can be made.

  2. Strategic planning for neuroradiologists.

    Science.gov (United States)

    Berlin, Jonathan W; Lexa, Frank J

    2012-08-01

    Strategic planning is becoming essential to neuroradiology as the health care environment continues to emphasize cost efficiency, teamwork and collaboration. A strategic plan begins with a mission statement and a vision of where the neuroradiology division would like to be in the near future. Formalized strategic planning frameworks, such as the strengths, weaknesses, opportunities and threats (SWOT) and the Balanced Scorecard frameworks, can help neuroradiology divisions determine their current position in the marketplace. Communication, delegation, and accountability in neuroradiology are essential in executing an effective strategic plan. Copyright © 2012 Elsevier Inc. All rights reserved.

  3. Strategic Belief Management

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul

    While (managerial) beliefs are central to many aspects of strategic organization, interactive beliefs are almost entirely neglected, save for some game theory treatments. In an increasingly connected and networked economy, firms confront coordination problems that arise because of network effects. The capability to manage beliefs will increasingly be a strategic one, a key source of wealth creation, and a key research area for strategic organization scholars.

  4. Multi-scale multi-physics computational chemistry simulation based on ultra-accelerated quantum chemical molecular dynamics method for structural materials in boiling water reactor

    International Nuclear Information System (INIS)

    Miyamoto, Akira; Sato, Etsuko; Sato, Ryo; Inaba, Kenji; Hatakeyama, Nozomu

    2014-01-01

    In collaboration with experimental experts we have reported in the present conference (Hatakeyama, N. et al., “Experiment-integrated multi-scale, multi-physics computational chemistry simulation applied to corrosion behaviour of BWR structural materials”) the results of multi-scale multi-physics computational chemistry simulations applied to the corrosion behaviour of BWR structural materials. At the macro scale, a macroscopic simulator of the anode polarization curve was developed to solve the spatially one-dimensional electrochemical equations on the material surface at the continuum level, in order to understand the corrosion behaviour of a typical BWR structural material, SUS304. The experimental anode polarization behaviours of each pure metal were reproduced by fitting all the rates of the electrochemical reactions; the anode polarization curve of SUS304 was then calculated using the same parameters and found to reproduce the experimental behaviour successfully. At the meso scale, a kinetic Monte Carlo (KMC) simulator was applied to an actual-time simulation of the morphological corrosion behaviour under the influence of an applied voltage. At the micro scale, an ultra-accelerated quantum chemical molecular dynamics (UA-QCMD) code was applied to various metallic oxide surfaces of Fe2O3, Fe3O4 and Cr2O3, modelled together with water molecules and dissolved metallic ions on the surfaces; the dissolution and segregation behaviours were then successfully simulated dynamically using UA-QCMD. In this paper we describe details of the multi-scale, multi-physics computational chemistry method, especially the UA-QCMD method. This method is approximately 10,000,000 times faster than conventional first-principles molecular dynamics methods based on density-functional theory (DFT), and its accuracy has also been validated for various metals and metal oxides against DFT results. To assure multi-scale multi-physics computational chemistry simulation based on the UA-QCMD method for

  5. How Strategic are Strategic Information Systems?

    Directory of Open Access Journals (Sweden)

    Alan Eardley

    1996-11-01

    Full Text Available There are many examples of information systems which are claimed to have created and sustained competitive advantage, allowed beneficial collaboration or simply ensured the continued survival of the organisations which used them. These systems are often referred to as being 'strategic'. This paper argues that many of the examples of strategic information systems as reported in the literature are not sufficiently critical in determining whether the systems meet the generally accepted definition of the term 'strategic' - that of achieving sustainable competitive advantage. Eight of the information systems considered to be strategic are examined here from the standpoint of one widely accepted 'competition' framework - Porter's model of industry competition. The framework is then used to question the linkage between the information systems and the mechanisms which are required for the enactment of strategic business objectives based on competition. Conclusions indicate that the systems are compatible with Porter's framework. Finally, some limitations of the framework are discussed and aspects of the systems which extend beyond the framework are highlighted

  6. The auroral electron accelerator

    International Nuclear Information System (INIS)

    Bryant, D.A.; Hall, D.S.

    1989-01-01

    A model of the auroral electron acceleration process is presented in which the electrons are accelerated resonantly by lower-hybrid waves. The essentially stochastic acceleration process is approximated for the purposes of computation by a deterministic model involving an empirically derived energy transfer function. The empirical function, which is consistent with all that is known of electron energization by lower-hybrid waves, allows many, possibly all, observed features of the electron distribution to be reproduced. It is suggested that the process occurs widely in both space and laboratory plasmas. (author)

  7. Accelerator Toolbox for MATLAB

    International Nuclear Information System (INIS)

    Terebilo, Andrei

    2001-01-01

    This paper introduces the Accelerator Toolbox (AT)--a collection of tools to model particle accelerators and beam transport lines in the MATLAB environment. At SSRL, it has become the modeling code of choice for the ongoing design and future operation of the SPEAR 3 synchrotron light source. AT was designed to take advantage of the power and simplicity of MATLAB--a commercially developed environment for technical computing and visualization. Many examples in this paper illustrate the advantages of the AT approach and contrast it with existing accelerator code frameworks

  8. Collective ion acceleration

    International Nuclear Information System (INIS)

    Godfrey, B.B.; Faehl, R.J.; Newberger, B.S.; Shanahan, W.R.; Thode, L.E.

    1977-01-01

    Progress achieved in the understanding and development of collective ion acceleration is presented. Extensive analytic and computational studies of slow cyclotron wave growth on an electron beam in a helix amplifier were performed. Research included precise determination of linear coupling between beam and helix, suppression of undesired transients and end effects, and two-dimensional simulations of wave growth in physically realizable systems. Electrostatic well depths produced exceed requirements for the Autoresonant Ion Acceleration feasibility experiment. Acceleration of test ions to modest energies in the troughs of such waves was also demonstrated. Smaller efforts were devoted to alternative acceleration mechanisms. Langmuir wave phase velocity in Converging Guide Acceleration was calculated as a function of the ratio of electron beam current to space-charge limiting current. A new collective acceleration approach, in which cyclotron wave phase velocity is varied by modulation of electron beam voltage, is proposed. Acceleration by traveling Virtual Cathode or Localized Pinch was considered, but appears less promising. In support of this research, fundamental investigations of beam propagation in evacuated waveguides, of nonneutral beam linear eigenmodes, and of beam stability were carried out. Several computer programs were developed or enhanced. Plans for future work are discussed

  9. Identification and control of factors influencing flow-accelerated corrosion in HRSG units using computational fluid dynamics modeling, full-scale air flow testing, and risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pietrowski, Ronald L. [The Consolidated Edison Company of New York, Inc., New York, NY (United States)

    2010-11-15

    In 2009, Consolidated Edison's East River heat recovery steam generator units 10 and 20 both experienced economizer tube failures which forced each unit offline. Extensive inspections indicated that the primary failure mechanism was flow-accelerated corrosion (FAC). The inspections revealed evidence of active FAC in all 7 of the economizer modules, with the most advanced stages of degradation being noted in center modules. Analysis determined that various factors were influencing and enabling this corrosion mechanism. Computational fluid dynamics and full-scale air flow testing showed very turbulent feedwater flow prevalent in areas of the modules corresponding with the pattern of FAC damage observed through inspection. It also identified preferential flow paths, with higher flow velocities, in certain tubes directly under the inlet nozzles. A FAC risk analysis identified more general susceptibility to FAC in the areas experiencing damage due to feedwater pH, operating temperatures, local shear fluid forces, and the chemical composition of the original materials of construction. These, in combination, were the primary root causes of the failures. Corrective actions were identified, analyzed, and implemented, resulting in equipment replacements and repairs. (orig.)

  10. Preliminary study on X-ray fluorescence computed tomography imaging of gold nanoparticles: Acceleration of data acquisition by multiple pinholes scheme

    Science.gov (United States)

    Sasaya, Tenta; Sunaguchi, Naoki; Seo, Seung-Jum; Hyodo, Kazuyuki; Zeniya, Tsutomu; Kim, Jong-Ki; Yuasa, Tetsuya

    2018-04-01

    Gold nanoparticles (GNPs) have recently attracted attention in nanomedicine as novel contrast agents for cancer imaging. A decisive tomographic imaging technique has not yet been established to depict the 3-D distribution of GNPs in an object. An imaging technique known as pinhole-based X-ray fluorescence computed tomography (XFCT) is a promising method that can be used to reconstruct the distribution of GNPs from the X-ray fluorescence emitted by GNPs. We address the acceleration of data acquisition in pinhole-based XFCT for preclinical use using a multiple-pinhole scheme. In this scheme, multiple projections are simultaneously acquired through a multi-pinhole collimator with a 2-D detector and a full-field volumetric beam to enhance the signal-to-noise ratio of the projections; this enables fast data acquisition. To demonstrate the efficacy of this method, we performed an imaging experiment using a physical phantom with an actual multi-pinhole XFCT system that was constructed using the beamline AR-NE7A at KEK. The preliminary study showed that the multi-pinhole XFCT achieved a data acquisition time of 20 min at a theoretical detection limit of approximately 0.1 mg Au/ml and at a spatial resolution of 0.4 mm.

  11. Quantum optical device accelerating dynamic programming

    OpenAIRE

    Grigoriev, D.; Kazakov, A.; Vakulenko, S.

    2005-01-01

    In this paper we discuss analogue computers based on quantum optical systems that accelerate dynamic programming for some computational problems. These computers can, at least in principle, be realized with existing devices. We estimate the acceleration in solving some NP-hard problems that can be obtained in this way compared with deterministic computers.

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  13. Lo Strategic Management Accounting

    OpenAIRE

    G. INVERNIZZI

    2005-01-01

    The essay investigates the information aggregates and the elements that make up strategic management accounting. It then analyses the functions performed at the different stages of the strategic management process, observing its role within management accounting. Finally, the relationships between the levels of strategic management and strategic management accounting are examined.

  14. Crowdnursing - Strategizing Shitstorms

    DEFF Research Database (Denmark)

    Christensen, Lars Holmgaard

    2018-01-01

    This paper introduces a framework for distinguishing between shitstorm types and social media crises. Addressing the need for strategies to handle social media crowds, the paper suggests a strategic approach that focuses on the cultivation of social media crowds and offers a valuable conceptual understanding of crowdnursing as a strategic tool.

  15. Strategic Risk Assessment

    Science.gov (United States)

    Derleth, Jason; Lobia, Marcus

    2009-01-01

    This slide presentation provides an overview of the attempt to develop and demonstrate a methodology for the comparative assessment of risks across the entire portfolio of NASA projects and assets. It includes information about strategic risk identification, normalizing strategic risks, calculation of relative risk score, and implementation options.

  16. 11. Strategic planning.

    Science.gov (United States)

    2014-05-01

    There are several types of planning processes and plans, including strategic, operational, tactical, and contingency. For this document, operational planning includes tactical planning. This chapter examines the strategic planning process and includes an introduction into disaster response plans. "A strategic plan is an outline of steps designed with the goals of the entire organisation as a whole in mind, rather than with the goals of specific divisions or departments". Strategic planning includes all measures taken to provide a broad picture of what must be achieved and in which order, including how to organise a system capable of achieving the overall goals. Strategic planning often is done pre-event, based on previous experience and expertise. The strategic planning for disasters converts needs into a strategic plan of action. Strategic plans detail the goals that must be achieved. The process of converting needs into plans has been deconstructed into its components and includes consideration of: (1) disaster response plans; (2) interventions underway or planned; (3) available resources; (4) current status vs. pre-event status; (5) history and experience of the planners; and (6) access to the affected population. These factors are tempered by the local: (a) geography; (b) climate; (c) culture; (d) safety; and (e) practicality. The planning process consumes resources (costs). All plans must be adapted to the actual conditions--things never happen exactly as planned.

  17. Manage "Human Capital" Strategically

    Science.gov (United States)

    Odden, Allan

    2011-01-01

    To strategically manage human capital in education means restructuring the entire human resource system so that schools not only recruit and retain smart and capable individuals, but also manage them in ways that support the strategic directions of the organization. These management practices must be aligned with a district's education improvement…

  18. Software for virtual accelerator designing

    International Nuclear Information System (INIS)

    Kulabukhova, N.; Ivanov, A.; Korkhov, V.; Lazarev, A.

    2012-01-01

    The article discusses appropriate technologies for a software implementation of the Virtual Accelerator. The Virtual Accelerator is considered as a set of services and tools enabling transparent execution of computational software for modeling beam dynamics in accelerators on distributed computing resources. The distributed storage and information-processing facilities utilized by the Virtual Accelerator follow a Service-Oriented Architecture (SOA) according to the cloud-computing paradigm. Control-system toolkits (such as EPICS and TANGO), computing modules (including high-performance computing), realization of the GUI with existing frameworks, and visualization of the data are discussed in the paper. The presented research consists of a software analysis for realizing the interaction between all levels of the Virtual Accelerator, together with some samples of middleware implementation. A set of servers and clusters at St.-Petersburg State University forms the infrastructure of the computing environment for Virtual Accelerator design. The use of component-oriented technology to realize the interaction between the Virtual Accelerator levels is proposed. The article concludes with an overview and justification of the technologies chosen for the design and implementation of the Virtual Accelerator. (authors)
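
    The paper itself gives no code; as a minimal sketch of the service-oriented idea, the snippet below models a job-submission facade for a beam-dynamics run. The class and method names (BeamDynamicsService, submit, status) are illustrative assumptions, not part of EPICS, TANGO, or the Virtual Accelerator software.

      # Minimal sketch of an SOA-style facade for beam-dynamics jobs.
      # All names are hypothetical illustrations of the Virtual Accelerator
      # concept, not an actual EPICS/TANGO API.
      import uuid

      class BeamDynamicsService:
          """Facade hiding where and how a modeling code actually runs."""
          def __init__(self):
              self._jobs = {}

          def submit(self, lattice_file, particles, turns):
              job_id = str(uuid.uuid4())
              # A real service would dispatch to distributed/cloud resources;
              # here the request is only recorded and marked as finished.
              self._jobs[job_id] = {"request": (lattice_file, particles, turns),
                                    "state": "DONE"}
              return job_id

          def status(self, job_id):
              return self._jobs[job_id]["state"]

      service = BeamDynamicsService()
      jid = service.submit("fodo.lat", particles=10_000, turns=500)
      print(jid, service.status(jid))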

  19. Strategic Talk in Film.

    Science.gov (United States)

    Payr, Sabine; Skowron, Marcin; Dobrosovestnova, Anna; Trapp, Martin; Trappl, Robert

    2017-01-01

    Conversational robots and agents are being designed for educational and/or persuasive tasks, e.g., health or fitness coaching. To pursue such tasks over a long time, they will need a complex model of the strategic goal, a variety of strategies to implement it in interaction, and the capability of strategic talk. Strategic talk is incipient ongoing conversation in which at least one participant has the objective of changing the other participant's attitudes or goals. The paper is based on the observation that strategic talk can stretch over considerable periods of time and a number of conversational segments. Film dialogues are taken as a source to develop a model of the strategic talk of mentor characters. A corpus of film mentor utterances is annotated on the basis of the model, and the data are interpreted to arrive at insights into mentor behavior, especially into the realization and sequencing of strategies.

  20. Strategic planning in transition

    DEFF Research Database (Denmark)

    Olesen, Kristian; Richardson, Tim

    2012-01-01

    In this paper, we analyse how contested transitions in planning rationalities and spatial logics have shaped the processes and outputs of recent episodes of Danish ‘strategic spatial planning’. The practice of ‘strategic spatial planning’ in Denmark has undergone a concerted reorientation … style of ‘strategic spatial planning’ with its associated spatial logics is continuously challenged by a persistent regulatory, top-down rationality of ‘strategic spatial planning’, rooted in spatial Keynesianism, which has long characterised the Danish approach. The findings reveal the emergence … of a particularly Danish approach, retaining strong regulatory aspects. However, this approach does not sit easily within the current neoliberal political climate, raising concerns of an emerging crisis of ‘strategic spatial planning’ …

  1. On strategic spatial planning

    Directory of Open Access Journals (Sweden)

    Tošić Branka

    2014-01-01

    The goal of this paper is to explain the origin and development of strategic spatial planning, to show its complex features, and to highlight its differences from and/or advantages over traditional, physical spatial planning. Strategic spatial planning is seen as one of the approaches in legally defined planning documents, and is traced through the properties of sectoral national strategies as well as issues of strategic planning at the local level in Serbia. The strategic approach is clearly recognized at the national and sub-national levels of spatial planning in European countries and in our country. It has been confirmed by the goals outlined in documents of the European Union and Serbia that promote the grounds of territorial cohesion and strategic integrated planning, emphasizing cooperation and the principles of sustainable spatial development. [Projekat Ministarstva nauke Republike Srbije, br. 176017]

  2. FY17 Strategic Themes.

    Energy Technology Data Exchange (ETDEWEB)

    Leland, Robert W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-03-01

    I am pleased to present this summary of the FY17 Division 1000 Science and Technology Strategic Plan. As this plan represents a continuation of the work we started last year, the four strategic themes (Mission Engagement, Bold Outcomes, Collaborative Environment, and Safety Imperative) remain the same, along with many of the goals. You will see most of the changes in the actions listed for each goal: We completed some actions, modified others, and added a few new ones. As I’ve stated previously, this is not a strategy to be pursued in tension with the Laboratory strategic plan. The Division 1000 strategic plan is intended to chart our course as we strive to contribute our very best in service of the greater Laboratory strategy. I welcome your feedback and look forward to our dialogue about these strategic themes. Please join me as we move forward to implement the plan in the coming months.

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load are close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  4. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and for addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  6. Accelerated Metals Development by Computation

    Science.gov (United States)

    2008-02-01

    … number of highly sophisticated pieces of equipment. A comprehensive list of the equipment acquired is given below. Emphasis should be made regarding … revealed no difference in breakdown potentials (Ecrit) as shown in Figure 4. Tests were completed in triplicate on a 6 mm polished finish for each … potential (Ecrit) and repassivation potential (Erepass) are given in Table 4 along with characteristic current densities (corrosion current …)

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  8. Electrostatic accelerators

    OpenAIRE

    Hinterberger, F

    2006-01-01

    The principle of electrostatic accelerators is presented. We consider Cockcroft–Walton, Van de Graaff and Tandem Van de Graaff accelerators. We review high-voltage generators such as cascade generators, Van de Graaff band generators, Pelletron generators, Laddertron generators and Dynamitron generators. The specific features of accelerating tubes, ion optics and methods of voltage stabilization are described. We discuss the characteristic beam properties and the variety of possible beams. We ...

  9. Electrostatic accelerators

    CERN Document Server

    Hinterberger, F

    2006-01-01

    The principle of electrostatic accelerators is presented. We consider Cockcroft–Walton, Van de Graaff and Tandem Van de Graaff accelerators. We review high-voltage generators such as cascade generators, Van de Graaff band generators, Pelletron generators, Laddertron generators and Dynamitron generators. The specific features of accelerating tubes, ion optics and methods of voltage stabilization are described. We discuss the characteristic beam properties and the variety of possible beams. We sketch possible applications and the progress in the development of electrostatic accelerators.
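
    To make the cascade-generator idea concrete: under no load, an n-stage full-wave Cockcroft–Walton multiplier ideally delivers about 2n times the peak AC input voltage; real machines deliver less because ripple and droop grow rapidly with stage count and load current. A minimal sketch of the ideal case:

      # Ideal (no-load) output of an n-stage Cockcroft-Walton multiplier.
      # Real generators fall short of this because load-dependent ripple and
      # droop grow quickly with the number of stages.
      def cw_ideal_output_kv(v_peak_kv, n_stages):
          """Ideal DC output of a full-wave cascade, in kV."""
          return 2 * n_stages * v_peak_kv

      for n in (1, 4, 10):
          print(f"{n:2d} stages, 100 kV peak drive -> {cw_ideal_output_kv(100, n)} kV")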

  10. Accelerator development

    International Nuclear Information System (INIS)

    Anon.

    1975-01-01

    Because the use of accelerated heavy ions would provide many opportunities for new and important studies in nuclear physics and nuclear chemistry, as well as other disciplines, both the Chemistry and Physics Divisions are supporting the development of a heavy-ion accelerator. The design of greatest current interest includes a tandem accelerator with a terminal voltage of approximately 25 MV injecting into a linear accelerator with rf superconducting resonators. This combined accelerator facility would be capable of accelerating ions of masses ranging over the entire periodic table to an energy corresponding to approximately 10 MeV/nucleon. This approach, as compared to other concepts, has the advantages of lower construction costs, lower operating power, 100 percent duty factor, and high beam quality (good energy resolution, good timing resolution, small beam size, and small beam divergence). The included sections describe the concept of the proposed heavy-ion accelerator, and the development program aiming at: (1) investigation of the individual questions concerning the superconducting accelerating resonators; (2) construction and testing of prototype accelerator systems; and (3) search for economical solutions to engineering problems. (U.S.)

  11. The preparation of computer-supported methods for the analysis of radiation situation at particle accelerators and their exemplary application to the cooler synchrotron COSY

    International Nuclear Information System (INIS)

    Moll, J.

    1991-01-01

    In this thesis the applicability of modern particle transport programs to the radiation protection of the new cooler synchrotron COSY has been investigated. Monte Carlo codes such as the program system HERMES offer great flexibility in geometry simulation and, because of their implemented physical models, wide applicability up to the GeV energy region. These codes are well suited for detailed analysis of radiation sources within the accelerator and for dose-rate estimations behind radiation shields. Detailed calculations of double-differential flux spectra and surface dose-rate distributions at a cylindrical standard target were performed for different materials (iron, copper and lead) with 2.5 GeV protons incident. For deep-penetration problems, however, it turns out that the CPU-time requirement of the Monte Carlo codes increases rapidly. The Monte Carlo codes were therefore coupled with a one-dimensional Sn transport code to treat deep-penetration problems in reasonable computing time. For this purpose a new high-energy neutron-γ transport library was evaluated, taking into account the upper energy limit of COSY of 2.8 GeV. The needed cross sections were calculated using the physical models of one of the HERMES codes. Several Fortran routines were developed to obtain an automatic procedure that evaluates the results and determines Legendre polynomials up to order P3, which are inserted as coefficients into the library. The coupling procedure between the programs was automated by means of several Fortran routines that perform transformation, normalisation and formatting of the flux data. (orig./HST) [de]
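
    The Legendre coefficients mentioned above can be illustrated with a short sketch: the moments f_l = (2l+1)/2 ∫ f(μ) P_l(μ) dμ of an angular distribution are computed by Gauss–Legendre quadrature up to order P3. The distribution f(μ) below is invented for illustration and is not the actual COSY library data.

      # Expand an angular distribution f(mu) in Legendre polynomials up to P3,
      # as done when preparing coefficients for a multigroup transport library.
      import numpy as np
      from numpy.polynomial.legendre import leggauss, Legendre

      def legendre_moments(f, order):
          nodes, weights = leggauss(order + 4)  # quadrature nodes and weights
          return [(2 * l + 1) / 2.0
                  * np.sum(weights * f(nodes) * Legendre.basis(l)(nodes))
                  for l in range(order + 1)]

      f = lambda mu: np.exp(2.0 * mu)  # hypothetical forward-peaked distribution
      for l, c in enumerate(legendre_moments(f, 3)):
          print(f"P{l} coefficient: {c:.4f}")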

  12. Electromagnetic modeling in accelerator designs

    International Nuclear Information System (INIS)

    Cooper, R.K.; Chan, K.C.D.

    1990-01-01

    Through the years, electromagnetic modeling using computers has proved to be a cost-effective tool for accelerator design. Traditionally, electromagnetic modeling of accelerators was limited to resonator and magnet designs in two dimensions. In recent years, with the availability of powerful computers, electromagnetic modeling of accelerators has advanced significantly. It is apparent that breakthroughs have been made during the last decade in two important areas: three-dimensional modeling and time-domain simulation. Success in both areas has been made possible by the increasing size and speed of computers. In this paper, the advances in these two areas are described.
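
    As a toy illustration of the time-domain side of this progress (not any particular accelerator code), a one-dimensional FDTD update of Maxwell's equations on a Yee grid fits in a few lines; the grid size, step count and source below are arbitrary choices.

      # Minimal 1-D FDTD (Yee) update in normalized units, illustrating
      # time-domain electromagnetic modeling. Courant number is 1, so the
      # update reduces to simple nearest-neighbour differences.
      import numpy as np

      nz, nt = 200, 300
      ez = np.zeros(nz)        # electric field at integer grid points
      hy = np.zeros(nz - 1)    # magnetic field, staggered half a cell

      for t in range(nt):
          hy += ez[1:] - ez[:-1]        # update H from the curl of E
          ez[1:-1] += hy[1:] - hy[:-1]  # update E from the curl of H
          ez[nz // 2] += np.exp(-((t - 30) / 10.0) ** 2)  # soft Gaussian source

      print("peak |Ez| after", nt, "steps:", np.abs(ez).max())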

  13. Shady strategic behavior : Recognizing strategic behavior of Dark Triad followers

    NARCIS (Netherlands)

    Schyns, Birgit; Wisse, Barbara; Sanders, Stacey

    2018-01-01

    The importance of strategic behavior in organizations has long been recognized. However, so far the literature has primarily focused on leaders’ strategic behavior, largely ignoring followers’ strategic behavior. In the present paper, we take a follower trait perspective to strategic follower

  14. Is strategic stockpiling essential?

    International Nuclear Information System (INIS)

    Anon.

    2007-01-01

    As mentioned by the European Commission, a consultant has surveyed stakeholders on the concept of setting up strategic stockpiles of natural gas, namely to boost the security of Europe's supply, much like the strategic stockpiling for petroleum products the OECD member countries carried out after the petroleum crisis. If strategic stockpiling consists in blocking off a quantity of gas in addition to the usable stockpile, the AFG believes it is necessary to assess the implications of such a measure and to examine the security gain it would actually offer compared to the measures that have already been implemented to secure supplies. (author)

  15. Computational Science: Ensuring America's Competitiveness

    National Research Council Canada - National Science Library

    Reed, Daniel A; Bajcsy, Ruzena; Fernandez, Manuel A; Griffiths, Jose-Marie; Mott, Randall D; Dongarra, J. J; Johnson, Chris R; Inouye, Alan S; Miner, William; Matzke, Martha K; Ponick, Terry L

    2005-01-01

    ... previously deemed intractable. Yet, despite the great opportunities and needs, universities and the Federal government have not effectively recognized the strategic significance of computational science in either...

  16. Restriction of the use of hazardous substances (RoHS in the personal computer segment: analysis of the strategic adoption by the manufacturers settled in Brazil

    Directory of Open Access Journals (Sweden)

    Ademir Brescansin

    2015-09-01

    The enactment of the RoHS Directive (Restriction of Hazardous Substances) in 2003, limiting the use of certain hazardous substances in electronic equipment, has forced companies to adjust their products to comply with this legislation. Even in the absence of similar legislation in Brazil, manufacturers of personal computers located in this country have adopted RoHS for products sold in the domestic market and abroad. The purpose of this study is to analyze whether these manufacturers have really adopted RoHS, focusing on their motivations, concerns, and benefits. This is an exploratory study based on a literature review and interviews with HP, Dell, Sony, Lenovo, Samsung, LG, Itautec, and Positivo, using summative content analysis. The results showed that global companies initially adopted RoHS to market products in Europe, and later expanded this practice to all products. Brazilian companies, however, adopted RoHS to participate in the government’s sustainable-procurement bidding processes. It is expected that this study can assist manufacturers in developing strategies for reducing or eliminating hazardous substances in their products and processes, as well as help the government formulate public policies on reducing the risk of environmental contamination.

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann and P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  19. Computational investigation of 99Mo, 89Sr, and 131I production rates in a subcritical UO2(NO32 aqueous solution reactor driven by a 30-MeV proton accelerator

    Directory of Open Access Journals (Sweden)

    Z. Gholamzadeh

    2015-12-01

    The use of subcritical aqueous homogeneous reactors driven by accelerators presents an attractive alternative for producing 99Mo. In this method, the medical-isotope production system itself is used to extract 99Mo or other radioisotopes, so there is no need to irradiate conventional targets. In addition, it can operate at much lower power than a traditional reactor producing the same amount of 99Mo by target irradiation. In this study, the neutronic performance and the 99Mo, 89Sr, and 131I production capacity of a subcritical aqueous homogeneous reactor fueled with low-enriched uranyl nitrate were evaluated using the MCNPX code. A proton accelerator with a maximum energy of 30 MeV was used to drive the subcritical core. The computational results indicate a good potential for the modeled system to produce the radioisotopes under completely safe conditions because of the high negative reactivity coefficients of the modeled core. The results show that application of an optimized beam-window material can increase the fission power of the aqueous nitrate fuel by up to 80%. This accelerator-based procedure using low-enriched uranium nitrate fuel presents a potentially competitive alternative to reactor-based or other accelerator-based methods. The system produces ∼1,500 Ci/wk (∼325 6-day Ci) of 99Mo at the end of a cycle.
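
    The buildup of a fission-product isotope toward saturation follows the standard activation relation A(t) = R·Y·(1 − e^(−λt)), where R is the fission rate and Y the cumulative yield. The sketch below evaluates it with a hypothetical fission rate (not taken from the paper); the ~66 h half-life and ~6.1% cumulative fission yield of 99Mo are textbook values.

      # Activity buildup of 99Mo: A(t) = R * Y * (1 - exp(-lambda * t)).
      import math

      HALF_LIFE_H = 66.0     # 99Mo half-life, ~66 hours (textbook value)
      YIELD = 0.061          # cumulative 99Mo yield per 235U fission, ~6.1%
      BQ_PER_CI = 3.7e10

      def mo99_activity_ci(fission_rate_per_s, hours):
          lam = math.log(2) / (HALF_LIFE_H * 3600.0)   # decay constant, 1/s
          saturation_bq = fission_rate_per_s * YIELD   # A_inf = R * Y
          return saturation_bq * (1.0 - math.exp(-lam * hours * 3600.0)) / BQ_PER_CI

      R = 3.1e15  # hypothetical fission rate, roughly 100 kW of fission power
      for t in (24, 72, 168):
          print(f"{t:4d} h irradiation: {mo99_activity_ci(R, t):8.1f} Ci")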

  20. Acceleration of the FERM nodal program

    International Nuclear Information System (INIS)

    Nakata, H.

    1985-01-01

    Three acceleration methods were tested in an attempt to reduce the number of outer iterations in the FERM nodal program. The results indicated that the Chebyshev polynomial acceleration method with variable degree yields a saving of 50% in computer time. The acceleration methods based on asymptotic source extrapolation and on zonal rebalance, on the other hand, did not reduce the overall computing time, although some acceleration of the outer iterations was observed. (M.C.K.) [pt]
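
    For a concrete sense of what Chebyshev acceleration does, the sketch below applies the standard two-term Chebyshev semi-iteration (in the Hageman–Young form) to a generic fixed-point iteration x = Bx + c and counts iterations against the plain scheme. The test matrix is invented; this is not FERM's actual implementation.

      # Chebyshev acceleration of a fixed-point (source) iteration x = B x + c.
      # B is a toy symmetric iteration matrix with spectral radius 0.95.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 50
      Q = np.linalg.qr(rng.standard_normal((n, n)))[0]
      B = Q @ np.diag(np.linspace(-0.95, 0.95, n)) @ Q.T
      c = rng.standard_normal(n)
      x_true = np.linalg.solve(np.eye(n) - B, c)

      def plain(tol=1e-8):
          x, k = np.zeros(n), 0
          while np.linalg.norm(x - x_true) > tol:
              x = B @ x + c
              k += 1
          return k

      def chebyshev(rho=0.95, tol=1e-8):
          x_prev = np.zeros(n)
          x = B @ x_prev + c              # first sweep is unaccelerated
          omega, k = 1.0, 1
          while np.linalg.norm(x - x_true) > tol:
              # omega_2 = 1/(1 - rho^2/2); omega_{k+1} = 1/(1 - rho^2/4 * omega_k)
              omega = (1.0 / (1.0 - 0.5 * rho * rho) if k == 1
                       else 1.0 / (1.0 - 0.25 * rho * rho * omega))
              x, x_prev = omega * (B @ x + c - x_prev) + x_prev, x
              k += 1
          return k

      print("plain iterations:    ", plain())
      print("Chebyshev iterations:", chebyshev())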

  1. Acceleration of the nodal program FERM

    International Nuclear Information System (INIS)

    Nakata, H.

    1985-01-01

    Acceleration of the nodal program FERM was attempted with three acceleration schemes. The calculations showed the best acceleration with the Chebyshev method, where the savings in computing time were of the order of 50%. Acceleration with the Asymptotic Source Extrapolation Method and with the Coarse-Mesh Rebalancing Method did not improve the global computational time, although a reduction in the number of outer iterations was observed. (Author) [pt]

  2. STRATEGIC PARTNERSHIP OF UKRAINE: DECLARATIONS AND REALITIES

    Directory of Open Access Journals (Sweden)

    Nataliya Demchenko

    2015-11-01

    A strategic partnership is a higher level of cooperation than conventional relationships. Conditioned by the specific interests of the parties, such cooperation is possible between partners who have no unresolved territorial claims against each other and who share a commitment to territorial integrity. Ukraine has declared strategic partnerships with many states (about 20), yet with several of them it lacks even simple partnership and cooperation; many received the status of "strategic partner" although they are not states whose national interests in strategic areas correspond to the current interests of Ukraine. It should be noted that not all of the countries declared strategic partners of Ukraine support its national interests at present. Ukraine, having emerged as an independent state, began to use new methods of international cooperation without an adequately developed strategy for their use. Some problems facing the country can be solved; others must be taken into account in determining its development strategy. The subject of the research is therefore the global and specific problems concerning economic security and partnership of Ukraine in modern conditions. The objective of the paper is to study options for a strategic partnership of Ukraine by improving the institutional mechanism for coordinating integration processes. The article is based on studies by foreign and domestic scientists. Practical implications: the formation of effective international cooperation of Ukraine in the context of globalization, and the choice of strategic partners on the basis of mutually beneficial cooperation. Results: an analysis of Ukraine’s cooperation with Russia; the features of the largest modern regional associations; the objective need for Ukraine’s integration into regional associations; and recommendations on the measures needed to accelerate the deepening of Ukraine’s integration with the EU.

  3. RECIRCULATING ACCELERATION

    International Nuclear Information System (INIS)

    BERG, J.S.; GARREN, A.A.; JOHNSTONE, C.

    2000-01-01

    This paper compares various types of recirculating accelerators, outlining the advantages and disadvantages of each approach. The accelerators are characterized according to the types of arcs they use: whether there is a single arc for the entire recirculator or there are multiple arcs, and whether the arc(s) are isochronous or non-isochronous.

  4. LIBO accelerates

    CERN Multimedia

    2002-01-01

    The prototype module of LIBO, a linear accelerator project designed for cancer therapy, has passed its first proton-beam acceleration test. In parallel a new version - LIBO-30 - is being developed, which promises to open up even more interesting avenues.

  5. Strategic agility for nursing leadership.

    Science.gov (United States)

    Shirey, Maria R

    2015-06-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change. In this article, the author discusses strategic agility as an important leadership competency and offers approaches for incorporating strategic agility in healthcare systems. A strategic agility checklist and infrastructure-building approach are presented.

  6. Strategic Issues for Training.

    Science.gov (United States)

    Pollitt, David, Ed.

    1999-01-01

    Contains precis of 18 articles on strategic management issues, including management development, on-the-job training, corporate scholarship, educational technology, coaching, investing in intellectual capital, and knowledge management. (SK)

  7. Full closure strategic analysis.

    Science.gov (United States)

    2014-07-01

    The full closure strategic analysis was conducted to create a decision process whereby full roadway closures for construction and maintenance activities can be evaluated and approved or denied by CDOT Traffic personnel. The study reviewed current...

  8. The strategic security officer.

    Science.gov (United States)

    Hodges, Charles

    2014-01-01

    This article discusses the concept of the strategic security officer, and the potential that it brings to the healthcare security operational environment. The author believes that training and development, along with strict hiring practices, can enable a security department to reach a new level of professionalism, proficiency and efficiency. The strategic officer for healthcare security is adapted from the "strategic corporal" concept of US Marine Corps General Charles C. Krulak which focuses on understanding the total force implications of the decisions made by the lowest level leaders within the Corps (Krulak, 1999). This article focuses on the strategic organizational implications of every security officer's decisions in the constantly changing and increasingly volatile operational environment of healthcare security.

  9. Strategic Communication Institutionalized

    DEFF Research Database (Denmark)

    Kjeldsen, Anna Karina

    2013-01-01

    … of institutionalization when strategic communication is not yet visible as organizational practice, and how can such detections provide explanation for the later outcome of the process? (2) How can studies of strategic communication benefit from an institutional perspective? How can the virus metaphor generate a deeper … understanding of the mechanisms that interact from the time an organization is exposed to a new organizational idea such as strategic communication until it surfaces in the form of symptoms such as mission and vision statements, communication manuals and communication positions? The first part of the article … focuses on a discussion of the virus metaphor as an alternative to the widespread fashion metaphor for processes of institutionalization. The second part of the article provides empirical examples of the virus metaphor employed, examples that are drawn from a study of the institutionalization of strategic …

  10. Strategic ecosystems of Colombia

    International Nuclear Information System (INIS)

    Marquez Calle German

    2002-01-01

    The author surveys the ecosystems of Colombia, relating ecosystems to population, the utility of ecosystems, their transformation, and poverty, and presents a methodology for identifying strategic ecosystems.

  11. Strategic Defense Initiative Overview

    National Research Council Canada - National Science Library

    1990-01-01

    ... to Third World and other nations. I will then discuss the scope of the SDI effort, the evolving strategic defense system architectures and theater defense, our compliance with the ABM Treaty, technology spinoffs resulting from SDI...

  12. Value oriented strategic marketing

    Directory of Open Access Journals (Sweden)

    Milisavljević Momčilo

    2013-01-01

    Changes in today's business environment require companies to orient themselves to strategic marketing. A company that accepts strategic marketing takes a proactive approach and focuses on continuous review and reappraisal of existing strategic business areas while seeking new ones. Difficulties in achieving target profit and growth require turning marketing from the dominant viewpoint of the tangible product to creating superior value and developing relationships with customers. Value orientation implies gaining competitive advantage through continuous research and understanding of what value represents to consumers, and discovering new ways to meet their required values. Strategic marketing requires that investment in the creation of value be regularly reviewed in order to ensure a focus on customers with high profit potential and environmental value. This increases customer satisfaction and retention and the long-term return on companies' investment.

  13. Leading Strategic Leader Teams

    National Research Council Canada - National Science Library

    Burleson, Willard M

    2008-01-01

    .... Although only 1 to 2 percent of the Army's senior leaders will attain a command position of strategic leadership, they are assisted by others, not only by teams specifically designed and structured...

  14. High-brightness H⁻ accelerators

    International Nuclear Information System (INIS)

    Jameson, R.A.

    1987-01-01

    Neutral particle beam (NPB) devices based on high-brightness H⁻ accelerators are an important component of proposed strategic defense systems. The basic rationale and R&D program are outlined, and examples are given of the underlying technology thrusts toward advanced systems. Much of the research accomplished in the past year is applicable to accelerator systems in general; some of these activities are discussed

  15. Accelerating Inspire

    CERN Document Server

    AUTHOR|(CDS)2266999

    2017-01-01

    CERN has been involved in the dissemination of scientific results since its early days and has continuously updated the distribution channels. Currently, Inspire hosts catalogues of articles, authors, institutions, conferences, jobs, experiments, journals and more. Successful orientation among this amount of data requires comprehensive linking between the content. Inspire has lacked a system for linking experiments and articles together based on which accelerator they were conducted at. The purpose of this project has been to create such a system. Records for 156 accelerators were created and all 2913 experiments on Inspire were given corresponding MARC tags. Records of 18404 accelerator physics related bibliographic entries were also tagged with corresponding accelerator tags. Finally, as a part of the endeavour to broaden CERN's presence on Wikipedia, existing Wikipedia articles of accelerators were updated with short descriptions and links to Inspire. In total, 86 Wikipedia articles were updated. This repo...

  16. Installation Strategic Planning Guidebook

    Science.gov (United States)

    2012-05-01

    Installation natural resource concerns (for example, wetlands, number of endangered species, water use restrictions, encroachment on training lands) … Koehler Publishing, 1994. 7. Strategy Safari – A Guided Tour Through the Wilds of Strategic Management by Henry Mintzberg, Bruce Ahlstrand, and … T. (1987). NY: Knopf. 36. Shaping Strategic Planning: Frogs, Dragons, Bees and Turkey Tails. Pfeiffer, J. W., Goodstein, L. D. & Nolan, T. M. (1989)

  17. A National Strategic Narrative

    Science.gov (United States)

    2011-01-01

    … strategic ecosystem. In other words, the U.S. should stop trying to dominate and direct global events. The best we can do is to build our capital so … prosperity and security – within a “strategic ecosystem,” at home and abroad; that in complexity and uncertainty, there are opportunities and hope, as well … law; sovereignty without tyranny, with assured freedom of expression; and an environment for entrepreneurial freedom and global prosperity, with access …

  18. Alibaba's strategic drift

    OpenAIRE

    Kim, Young-Chan; Chen, Pi-Chi

    2016-01-01

    It is fundamental, in both a theoretical and a practical sense, to analyse the strategies of successful e-businesses that were formulated and operated alongside incumbent competitors. There has accordingly been an array of strategic arguments concerning the rapidly burgeoning virtual powerhouse Alibaba, which, amidst a sea of fortified competitors, found its ground to become one of the most prominent e-businesses of the decade. At the commencing stages, Alibaba lacked a specific strategic goal, aside f...

  19. Processes of Strategic Renewal,

    OpenAIRE

    Harald Aadne, John; Mahnke, Volker

    2010-01-01

    We discuss strategic renewal from a competence perspective. We argue that the management of speed and timing in this process is viewed distinctively when perceived through a cognitive lens. Managers need a more firmly grounded understanding of the process. The key idea of this paper is to dynamically conceptualize the key activities of strategic renewal, and the possible sources of breakdown, as they relate to the management of speed and timing. Based on a case from the media industry, we identi...

  20. 2015 Enterprise Strategic Vision

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-08-01

    This document aligns with the Department of Energy Strategic Plan for 2014-2018 and provides a framework for integrating our missions and direction for pursuing DOE’s strategic goals. The vision is a guide to advancing world-class science and engineering, supporting our people, modernizing our infrastructure, and developing a management culture that operates a safe and secure enterprise in an efficient manner.

  1. Strategic market segmentation

    Directory of Open Access Journals (Sweden)

    Maričić Branko R.

    2015-01-01

    Strategic planning of marketing activities is the basis of business success in the modern business environment. Customers are not homogeneous in their preferences and expectations. Formulating an adequate marketing strategy, focused on realizing the company's strategic objectives, requires a segmented approach to the market that appreciates differences in customers' expectations and preferences. One significant activity in the strategic planning of marketing activities is market segmentation. Strategic planning imposes a need to plan marketing activities according to strategically important segments on a long-term basis. At the same time, there is a need to revise and adapt marketing activities on a short-term basis. There are a number of criteria on which market segmentation can be based. The paper considers the effectiveness and efficiency of different market-segmentation criteria, based on empirical research into customer expectations and preferences. The analysis includes traditional criteria and criteria based on a behavioral model. The research implications are analyzed from the perspective of selecting the most adequate market-segmentation criteria in the strategic planning of marketing activities.

  2. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful tests prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  9. Spacetime transformations from a uniformly accelerated frame

    International Nuclear Information System (INIS)

    Friedman, Yaakov; Scarr, Tzvi

    2013-01-01

    We use the generalized Fermi–Walker transport to construct a one-parameter family of inertial frames which are instantaneously comoving to a uniformly accelerated observer. We explain the connection between our approach and that of Mashhoon. We show that our solutions of uniformly accelerated motion have constant acceleration in the comoving frame. Assuming the weak hypothesis of locality, we obtain local spacetime transformations from a uniformly accelerated frame K′ to an inertial frame K. The spacetime transformations between two uniformly accelerated frames with the same acceleration are Lorentz. We compute the metric at an arbitrary point of a uniformly accelerated frame. (paper)
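
    For reference, the textbook special-relativistic worldline of one-dimensional hyperbolic motion, which the constant-comoving-acceleration solutions above generalize, can be written as follows (standard form, not the authors' notation):

      % Hyperbolic motion: constant proper acceleration a along x,
      % parameterized by proper time tau; c is the speed of light.
      t(\tau) = \frac{c}{a}\,\sinh\!\left(\frac{a\tau}{c}\right), \qquad
      x(\tau) = \frac{c^{2}}{a}\left[\cosh\!\left(\frac{a\tau}{c}\right) - 1\right],
      \qquad x(0) = 0 .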

  10. Electron accelerator

    International Nuclear Information System (INIS)

    Abramyan.

    1981-01-01

    The USSR produces an electron accelerator family of a simple design powered straight from the mains. The specifications are given of accelerators ELITA-400, ELITA-3, ELT-2, TEUS-3 and RIUS-5 with maximum electron energies of 0.3 to 5 MeV, a mean power of 10 to 70 kW operating in both the pulsed and the continuous (TEUS-3) modes. Pulsed accelerators ELITA-400 and ELITA-3 and RIUS-5 in which TESLA resonance transformers are used are characterized by their compact size. (Ha)

  11. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  12. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identifying possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  14. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  15. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization and conversion to RAW format; the samples were then run through the High Level Trigger (HLT). The data were then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...
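
    The pre-challenge production chain described above is a linear pipeline. The following schematic sketch is hypothetical (stage names and event counts invented, not CMS software); it only illustrates that each step consumes the previous step's output before the samples are grouped into the three named "Soups".

      # Illustrative only: a toy model of the pre-challenge MC production chain.
      from typing import List

      def generate(n: int) -> List[dict]:
          """Physics generation: produce n generator-level events."""
          return [{"id": i, "stage": "GEN"} for i in range(n)]

      def apply_stage(events: List[dict], stage: str) -> List[dict]:
          """Run one processing step over all events."""
          return [{**e, "stage": stage} for e in events]

      # simulation -> digitization -> RAW conversion -> High Level Trigger
      STAGES = ["SIM", "DIGI", "RAW", "HLT"]

      def produce(n_events: int) -> List[dict]:
          events = generate(n_events)
          for stage in STAGES:
              events = apply_stage(events, stage)
          return events

      soups = {
          "Chowder": produce(1000),  # ALPGEN sample
          "Stew": produce(1000),     # filtered Pythia sample
          "Gumbo": produce(1000),    # Pythia sample
      }
      print({name: len(events) for name, events in soups.items()})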

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are being completed and smaller samples for upgrades and preparations ramp up. Much of the computing activity is focused on preparations for Run 2 and on improvements in data access and in the flexibility of using resources.
    Operations Office
    Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites.
    Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month.
    Figure 2: The transfer system worked reliably and efficiently, transferring on average close to 520 TB per week with peaks close to 1.2 PB.
    Figure 3: The volume of data moved between CMS sites in the last six months.
    Tape utilisation was a focus for the operations teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users; we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.
    Operations Office
    Figure 2: Number of events per month, for 2012
    Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in the data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  18. Centralized digital control of accelerators

    International Nuclear Information System (INIS)

    Melen, R.E.

    1983-09-01

    In contrasting the title of this paper with a second paper to be presented at this conference entitled Distributed Digital Control of Accelerators, a potential reader might be led to believe that this paper will focus on systems whose computing intelligence is centered in one or more computers in a centralized location. Instead, this paper will describe the architectural evolution of SLAC's computer-based accelerator control systems with respect to the distribution of their intelligence. However, the use of the word centralized in the title is appropriate because these systems are based on the use of centralized large and computationally powerful processors that are typically supported by networks of smaller distributed processors

  19. Horizontal Accelerator

    Data.gov (United States)

    Federal Laboratory Consortium — The Horizontal Accelerator (HA) Facility is a versatile research tool available for use on projects requiring simulation of the crash environment. The HA Facility is...

  20. Acceleration theorems

    International Nuclear Information System (INIS)

    Palmer, R.

    1994-06-01

    Electromagnetic fields can be separated into near and far components. Near fields are extensions of static fields. They do not radiate, and they fall off more rapidly from a source than far fields. Near fields can accelerate particles, but the ratio of acceleration to source fields at a distance R is always less than R/λ or 1, whichever is smaller. Far fields can be represented as sums of plane parallel, transversely polarized waves that travel at the velocity of light. A single such wave in a vacuum cannot give continuous acceleration, and it is shown that no sums of such waves can give net first-order acceleration. This theorem is proven in three different ways, each method showing a different aspect of the situation
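
    Stated compactly, the near-field bound quoted above can be written as follows (the symbols E_acc, for the accelerating field available at distance R, and E_src, for the source field, are our notation, not the paper's):

      \[
        \frac{E_{\mathrm{acc}}(R)}{E_{\mathrm{src}}} \,<\, \min\!\left(\frac{R}{\lambda},\; 1\right)
      \]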

  1. LINEAR ACCELERATOR

    Science.gov (United States)

    Christofilos, N.C.; Polk, I.J.

    1959-02-17

    Improvements in linear particle accelerators are described. A drift tube system for a linear ion accelerator reduces gap capacity between adjacent drift tube ends. This is accomplished by reducing the ratio of the diameter of the drift tube to the diameter of the resonant cavity. Concentration of magnetic field intensity at the longitudinal midpoint of the external surface of each drift tube is reduced by increasing the external drift tube diameter at the longitudinal center region.

  2. Strategic management for university hospitals

    Directory of Open Access Journals (Sweden)

    Martha Isabel Riaño-Casallas

    2016-10-01

    Introduction: There are several approaches and schools that support strategic management processes. University hospitals require the implementation of a strategic approach to their management, since they are a particular type of organization with the triple mission of providing health care, education and research. Objective: To propose a strategic profile for a university hospital. Materials and methods: The theoretical framework of strategic management was analyzed and some particular components of hospital management were studied; based on these criteria, the strategic management process in three high complexity hospitals of Bogotá, D.C. was examined and a profile of both the objectives and the functional strategies for the hospital was proposed. Results: The main strategic thinking schools are presented; the processes and components of strategic management are described, and a strategic management profile for a university hospital is proposed. Conclusion: The strategic orientation of management for an institution with the characteristics of a university hospital facilitates achieving organizational objectives.

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS components are now also deployed at CERN, adding to the GlideInWMS factory in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  4. FY16 Strategic Themes.

    Energy Technology Data Exchange (ETDEWEB)

    Leland, Robert W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-03-01

    I am pleased to present this summary of the Division 1000 Science and Technology Strategic Plan. This plan was created with considerable participation from all levels of management in Division 1000, and is intended to chart our course as we strive to contribute our very best in service of the greater Laboratory strategy. The plan is characterized by four strategic themes: Mission Engagement, Bold Outcomes, Collaborative Environment, and the Safety Imperative. Each theme is accompanied by a brief vision statement, several goals, and planned actions to support those goals throughout FY16. I want to be clear that this is not a strategy to be pursued in tension with the Laboratory strategic plan. Rather, it is intended to describe “how” we intend to show up for the “what” described in Sandia’s Strategic Plan. I welcome your feedback and look forward to our dialogue about these strategic themes. Please join me as we move forward to implement the plan in the coming year.

  5. Strategic Control in Decision Making under Uncertainty

    Science.gov (United States)

    Venkatraman, Vinod; Huettel, Scott

    2012-01-01

    Complex economic decisions – whether investing money for retirement or purchasing some new electronic gadget – often involve uncertainty about the likely consequences of our choices. Critical for resolving that uncertainty are strategic meta-decision processes, which allow people to simplify complex decision problems, to evaluate outcomes against a variety of contexts, and to flexibly match behavior to changes in the environment. In recent years, substantial research implicates the dorsomedial prefrontal cortex (dmPFC) in the flexible control of behavior. However, nearly all such evidence comes from paradigms involving executive function or response selection, not complex decision making. Here, we review evidence that demonstrates that the dmPFC contributes to strategic control in complex decision making. This region contains a functional topography such that the posterior dmPFC supports response-related control while the anterior dmPFC supports strategic control. Activation in the anterior dmPFC signals changes in how a decision problem is represented, which in turn can shape computational processes elsewhere in the brain. Based on these findings, we argue both for generalized contributions of the dmPFC to cognitive control, and for specific computational roles for its subregions depending upon the task demands and context. We also contend that these strategic considerations are likely to be critical for decision making in other domains, including interpersonal interactions in social settings. PMID:22487037

  6. Strategizing in multiple ways

    DEFF Research Database (Denmark)

    Larsen, Mette Vinther; Madsen, Charlotte Øland; Rasmussen, Jørgen Gulddahl

    2013-01-01

    Strategy processes are kinds of wayfaring where different actors interpret a formally defined strategy differently. In the everyday practice of organizations strategizing takes place in multiple ways through narratives and sensible actions. This forms a meshwork of polyphonic ways to enact one and the same strategy. The paper focusses on such processes as they develop in a Danish service company. It is done on the basis of an empirical and longitudinal study of a strategy process in the Service Company where the strategic purpose was to implement value-based management. The theme to be developed based on this development paper is whether one can understand these divergent strategic wayfaring processes as constructive for organizations.

  7. Thinking strategically about capitation.

    Science.gov (United States)

    Boland, P

    1997-05-01

    All managed care stakeholders--health plan members, employers, providers, community organizations, and government entities--share a common interest in reducing healthcare costs while improving the quality of care health plan members receive. Although capitation is usually thought of primarily as a payment mechanism, it can be a powerful tool providers and health plans can use to accomplish these strategic objectives and others, such as restoring and maintaining the health of plan members or improving a community's health status. For capitation to work effectively as a strategic tool, its use must be tied to a corporate agenda of partnering with stakeholders to achieve broader strategic goals. Health plans and providers must develop a partnership strategy in which each stakeholder has well-defined roles and responsibilities. The capitation structure must reinforce interdependence, shift focus from meeting organizational needs to meeting customer needs, and develop risk-driven care strategies.

  8. Tourism and Strategic Planning

    DEFF Research Database (Denmark)

    Pasgaard, Jens Christian

    2012-01-01

    The main purpose of this report is to explore and unfold the complexity of the tourism phenomenon in order to qualify the general discussion of tourism-related planning challenges. Throughout the report I aim to demonstrate the strategic potential of tourism in a wider sense and, more specifically, the potential of ‘the extraordinary’ tourism-dominated space. As highlighted in the introduction, this report does not present any systematic analysis of strategic planning processes; neither does it provide any unequivocal conclusions. Rather, the report presents a collection of so-called ‘detours’ – a collection of theoretical discussions and case studies with the aim to inspire future strategic planning. Due to the complexity and heterogeneity of the phenomenon I use a non-linear and non-chronological report format with the ambition to create a new type of overview. In this regard the report is intended...

  9. NASA strategic plan

    Science.gov (United States)

    1994-01-01

    The NASA Strategic Plan is a living document. It provides far-reaching goals and objectives to create stability for NASA's efforts. The Plan presents NASA's top-level strategy: it articulates what NASA does and for whom; it differentiates between ends and means; it states where NASA is going and what NASA intends to do to get there. This Plan is not a budget document, nor does it present priorities for current or future programs. Rather, it establishes a framework for shaping NASA's activities and developing a balanced set of priorities across the Agency. Such priorities will then be reflected in the NASA budget. The document includes vision, mission, and goals; external environment; conceptual framework; strategic enterprises (Mission to Planet Earth, aeronautics, human exploration and development of space, scientific research, space technology, and synergy); strategic functions (transportation to space, space communications, human resources, and physical resources); values and operating principles; implementing strategy; and senior management team concurrence.

  10. Strategic Self-Ignorance

    DEFF Research Database (Denmark)

    Thunström, Linda; Nordström, Leif Jonas; Shogren, Jason F.

    We examine strategic self-ignorance—the use of ignorance as an excuse to overindulge in pleasurable activities that may be harmful to one’s future self. Our model shows that guilt aversion provides a behavioral rationale for present-biased agents to avoid information about negative future impacts of such activities. We then confront our model with data from an experiment using prepared, restaurant-style meals — a good that is transparent in immediate pleasure (taste) but non-transparent in future harm (calories). Our results support the notion that strategic self-ignorance matters: nearly three of five subjects (58 percent) chose to ignore free information on calorie content, leading at-risk subjects to consume significantly more calories. We also find evidence consistent with our model on the determinants of strategic self-ignorance.

  11. Strategic self-ignorance

    DEFF Research Database (Denmark)

    Thunström, Linda; Nordström, Leif Jonas; Shogren, Jason F.

    2016-01-01

    We examine strategic self-ignorance—the use of ignorance as an excuse to over-indulge in pleasurable activities that may be harmful to one’s future self. Our model shows that guilt aversion provides a behavioral rationale for present-biased agents to avoid information about negative future impacts of such activities. We then confront our model with data from an experiment using prepared, restaurant-style meals—a good that is transparent in immediate pleasure (taste) but non-transparent in future harm (calories). Our results support the notion that strategic self-ignorance matters: nearly three of five subjects (58%) chose to ignore free information on calorie content, leading at-risk subjects to consume significantly more calories. We also find evidence consistent with our model on the determinants of strategic self-ignorance.

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been instrumental in site commissioning, increasing the number of sites available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  13. Strategic environmental assessment

    DEFF Research Database (Denmark)

    Kørnøv, Lone

    1997-01-01

    The integration of environmental considerations into strategic decision making is recognized as a key to achieving sustainability. In the European Union a draft directive on Strategic Environmental Assessment (SEA) is currently being reviewed by the member states. The nature of the proposed SEA...... that the SEA directive will influence the decision-making process positively and will help to promote improved environmental decisions. However, the guidelines for public participation are not sufficient and the democratic element is strongly limited. On the basis of these findings, recommendations relating...

  14. Complex Strategic Choices

    DEFF Research Database (Denmark)

    Leleur, Steen

    Addressing strategic decision making, Complex Strategic Choices presents a methodology which is further illustrated by a number of case studies and example applications. Dr. Techn. Steen Leleur has adapted previously established research based on feedback and input from various conferences, journals and students, resulting in new material stemming from and focusing on practical application of a systemic approach. The outcome is a coherent and flexible approach named systemic planning. The inclusion of both the theoretical and practical aspects of systemic planning makes this book a key resource for researchers...

  15. Strategic financial analysis: the CFO's role in strategic planning.

    Science.gov (United States)

    Litos, D M

    1985-03-01

    Strategic financial analysis, the financial information support system for the strategic planning process, provides information vital to maintaining a healthy bottom line. This article, the third in HCSM's series on the organizational components of strategic planning, reviews the role of the chief financial officer in determining which programs and services will best meet the future needs of the institution.

  16. Mapping strategic diversity: strategic thinking from a variety of perspectives

    NARCIS (Netherlands)

    Jacobs, D.

    2010-01-01

    In his influential work, Strategy Safari, Henry Mintzberg and his colleagues presented ten schools of strategic thought. In this impressive book, Dany Jacobs demonstrates that the real world of strategic management is much wider and richer. In Mapping Strategic Diversity, Jacobs distinguishes

  17. Strategic planning and managerial control

    OpenAIRE

    Mihaela Ghicajanu

    2004-01-01

    In this paper I present the relationship between strategic planning and the managerial control process. To begin, I present a few elements of strategic planning and managerial control in order to identify the links between these elements.

  18. New Military Strategic Communications System

    National Research Council Canada - National Science Library

    Baldwin, Robert F

    2007-01-01

    ... audience through unified action. The Quadrennial Defense Review Roadmap for Strategic Communications and the Department of Defense, Report of the Defense Science Board Task Force on Strategic Communication both concluded that the US...

  19. Accelerator microanalysis

    International Nuclear Information System (INIS)

    Tuniz, C.

    1997-01-01

    Particle accelerators were developed more than sixty years ago to investigate nuclear and atomic phenomena. A major shift toward applications of accelerators in the study of materials structure and composition in inter-disciplinary projects has been witnessed in the last two decades. The Australian Nuclear Science and Technology Organisation (ANSTO) has developed advanced research programs based on the use of particle and photon beams. Atmospheric pollution problems are investigated at the 3 MV Van de Graaff accelerator using ion beam analysis techniques to detect toxic elements in aerosol particles. High temperature superconductor and semiconductor materials are characterised using the recoil of iodine and other heavy ions produced at ANTARES, the 10-MV Tandem accelerator. A heavy-ion microprobe is presently being developed at ANTARES to map concentrations of specific elements with micron-size resolution. An Accelerator Mass Spectrometry (AMS) system has been developed at ANSTO for the ultra-sensitive detection of Carbon-14, Iodine-129 and other long-lived radioisotopes. This AMS spectrometer is a key instrument for climate change studies and international safeguards. ANSTO is also managing the Australian Synchrotron Research Program based on facilities developed at the Photon Factory (Japan) and at the Advanced Photon Source (USA). Advanced projects in biology, materials chemistry, structural condensed matter and other disciplines are being promoted by a consortium involving Australian universities and research institutions. This paper will review recent advances in the use of particle accelerators, with a particular emphasis on applications developed at ANSTO and related to problems of international concern, such as global environmental change, public health and nuclear proliferation

  20. High performance proton accelerators

    International Nuclear Information System (INIS)

    Favale, A.J.

    1989-01-01

    In concert with this theme this paper briefly outlines how Grumman, over the past 4 years, has evolved from a company that designed and fabricated a Radio Frequency Quadrupole (RFQ) accelerator from Los Alamos National Laboratory (LANL) physics and specifications into a company that, as prime contractor, is designing, fabricating, assembling and commissioning the US Army Strategic Defense Command's (USA SDC) Continuous Wave Deuterium Demonstrator (CWDD) accelerator as a turn-key operation. In the case of the RFQ, LANL scientists performed the physics analysis, established the specifications, supported Grumman on the mechanical design, conducted the RFQ tuning and tested the RFQ at their laboratory. For the CWDD Program Grumman has the responsibility for the physics and engineering designs, assembly, testing and commissioning, albeit with the support of consultants from LANL, Lawrence Berkeley Laboratory (LBL) and Brookhaven National Laboratory. In addition, Culham Laboratory and LANL are team members on CWDD. The physics design has been reviewed by LANL scientists as well as by a USA SDC review board. 9 figs

  1. The strategic entrepreneurial thinking imperative

    OpenAIRE

    S. Dhliwayo; J. J. Van Vuuren

    2007-01-01

    Purpose: The aim of this paper is to demonstrate that strategic entrepreneurial thinking is a unitary concept which should be viewed as a standalone construct. Design/Methodology/Approach: The concept strategic entrepreneurial thinking is modelled from an analysis of strategic thinking and entrepreneurial thinking from available literature. The strategic entrepreneurial mindset imperative is then emphasised and confirmed. Findings: This paper's finding is that there is no diff...

  2. A Handbook for Strategic Planning

    Science.gov (United States)

    1994-01-01

    Total Quality Leadership, strategic direction, strategic intent, organizational planning, strategic planning, mission, systems thinking, gap analysis ... Department of the Navy vision, guiding principles, and strategic goals. Washington, DC: Author. Hamel, G., & Prahalad, C. K. (May-June 1989). Strategic intent. ... professional organizations, strategic planning. Advice may also take the form of recommendations on ...

  3. Scenario-based strategizing

    DEFF Research Database (Denmark)

    Lehr, Thomas; Lorenz, Ullrich; Willert, Markus

    2017-01-01

    -based efficacy and robustness. To facilitate the collaborative strategizing in teams, we propose a matrix with robustness and efficacy as the two axes, which we call the Parmenides Matrix. We assess the impact of the novel approach by applying it in two cases, at a governmental agency (German Environmental...

  4. Strategic decision making

    NARCIS (Netherlands)

    Stokman, Frans N.; Assen, Marcel A.L.M. van; Knoop, Jelle van der; Oosten, Reinier C.H. van

    2000-01-01

    This paper introduces a methodology for strategic intervention in collective decision making. The methodology is based on (1) a decomposition of the problem into a few main controversial issues, (2) systematic interviews of subject area specialists to obtain a specification of the decision

  5. Strategic Marketing for Agribusiness.

    Science.gov (United States)

    Welch, Mary A., Ed.

    1993-01-01

    The steps for strategic market planning are discussed including: (1) assessing the situation with market conditions, customers, competitors, and your firm; and (2) crafting a strategy to prioritize target markets, develop a core strategy, and create a marketing mix. Examples of agribusiness successes are presented. The booklet concludes with a…

  6. Strategic planning for marketers.

    Science.gov (United States)

    Wilson, I

    1978-12-01

    The merits of strategic planning as a marketing tool are discussed in this article, which takes the view that although marketers claim to be future-oriented, they focus too little attention on long-term planning and forecasting. Strategic planning, as defined by these authors, usually encompasses periods of between five and twenty-five years and places less emphasis on the past as an absolute predictor of the future. It takes a more probabilistic view of the future than conventional marketing strategy and looks at the corporation as but one component interacting with the total environment. Inputs are examined in terms of environmental, social, political, technological and economic importance. Because of its futuristic orientation, an important tenet of strategic planning is the preparation of several alternative scenarios ranging from most to least likely. By planning for a wide range of future market conditions, a corporation is more able to be flexible by anticipating the course of future events, and is less likely to become a captive reactor--as the authors believe is now the case. An example of strategic planning at General Electric is cited.

  7. Strategic port development

    DEFF Research Database (Denmark)

    Olesen, Peter Bjerg; Dukovska-Popovska, Iskra; Steger-Jensen, Kenn

    2012-01-01

    This paper proposes a framework for strategic development of a port’s collaboration with its hinterland. The framework is based on literature relevant to port development and takes a market perspective by considering import/export data relevant for the region of interest. The series of steps...

  8. Naming as Strategic Communication

    DEFF Research Database (Denmark)

    Schmeltz, Line; Kjeldsen, Anna Karina

    2016-01-01

    This article presents a framework for understanding corporate name change as strategic communication. From a corporate branding perspective, the choice of a new name can be seen as a wish to stand out from a group of similar organizations. Conversely, from an institutional perspective, name change...

  9. Strategic self-management

    DEFF Research Database (Denmark)

    Mørch, Sven; Pultz, Sabina; Strøbæk, Pernille Solveig

    2017-01-01

    perspective. Based on our data analysis, this study contributes theoretically by further developing the concept of ‘strategic self-management’ in an educational context. We conclude that this concept is suitable for encapsulating how young people make sense of, and deal with, their educational biographies...

  10. Capabilities for Strategic Adaptation

    DEFF Research Database (Denmark)

    Distel, Andreas Philipp

    This dissertation explores capabilities that enable firms to strategically adapt to environmental changes and preserve competitiveness over time – often referred to as dynamic capabilities. While dynamic capabilities are a popular research domain, too little is known about what these capabiliti...

  11. ACNW - 1998 strategic plan

    International Nuclear Information System (INIS)

    1998-01-01

    This plan provides strategic direction to The Advisory Committee on Nuclear Waste (ACNW) in 1998 and beyond for focusing on issues most important to the NRC in carrying out its mission of protecting public health and safety, promoting the common defense and security, and protecting the environment

  12. Strategic Targeted Advertising

    NARCIS (Netherlands)

    A. Galeotti; J.L. Moraga-Gonzalez (José Luis)

    2003-01-01

    We present a strategic game of pricing and targeted-advertising. Firms can simultaneously target price advertisements to different groups of customers, or to the entire market. Pure strategy equilibria do not exist and thus market segmentation cannot occur surely. Equilibria exhibit

  13. What is strategic management?

    Science.gov (United States)

    Jasper, Melanie; Crossan, Frank

    2012-10-01

    To discuss the theoretical concept of strategic management and explore its relevance for healthcare organisations and nursing management. Despite being a relatively new approach, the growth of strategic management within organisations has been consistently and increasingly promoted. However, comprehensive definitions are scarce and commonalities of interpretation are limited. This paper presents an exploratory discussion of the construct of strategic management, drawing on the literature and questioning its relevance within health-care organisations. Literature relating to strategic management across a number of fields was accessed, drawing primarily on meta-studies within management literature, to identify key concepts and attempt to present a consistent definition. The concept within health care is explored in relation to nursing management. Inconsistency in definitions and utilisation of key concepts within this management approach results in the term being loosely applied in health-care organisations without recourse to foundational principles and a deep understanding of the approach as a theory as opposed to an applied term. Nurse managers are increasingly asked to adopt the 'next-best-thing' in managerial theories, yet caution needs to be taken in nurses agreeing to use systems that lack an evidence base in terms of both efficacy and relevance of context. © 2012 Blackwell Publishing Ltd.

  14. Scenario-based strategizing

    DEFF Research Database (Denmark)

    Lehr, Thomas; Lorenz, Ullrich; Willert, Markus

    2017-01-01

    For over 40 years, scenarios have been promoted as a key technique for forming strategies in uncertain environments. However, many challenges remain. In this article, we discuss a novel approach designed to increase the applicability of scenario-based strategizing in top management teams. Drawi... Ministry) and a firm affected by disruptive change (Bosch, leading global supplier of technology and solutions).

  15. Measuring strategic success.

    Science.gov (United States)

    Gish, Ryan

    2002-08-01

    Strategic triggers and metrics help healthcare providers achieve financial success. Metrics help assess progress toward long-term goals. Triggers signal market changes requiring a change in strategy. All metrics may not move in concert. Organizations need to identify indicators, monitor performance.

  16. AECB strategic plan 1999

    International Nuclear Information System (INIS)

    1999-03-01

    This strategic plan provides the direction and focus required to successfully carry out our mandate in an efficient and effective manner over the next two to three years. It gives broad corporate direction by identifying where efforts need to be focussed, and therefore provides guidance for setting priorities and allocating resources. While we cannot ignore any aspect of our mandate, we must recognize that we will always have more work to do than can be accomplished within the resources available to us. Therefore we must set priorities and develop appropriate management systems to ensure that our major efforts and our resources are being directed towards those priorities. Our strategic plan is not a static document. We will always be faced with new challenges, and our strategies for meeting those challenges will also have to change. Therefore our strategic plan must be seen as a guide that reflects both the ever-changing environment and our ability to deal with new or evolving changes effectively. This plan is not intended to be a detailed operational plan. Each directorate must develop its own operational plans and procedures based on the directions in this strategic plan, and on corporate priorities and policies. (author)

  17. Essays on Strategic Communication

    NARCIS (Netherlands)

    Z. Sharif (Zara)

    2016-01-01

    Three essays on strategic communication are discussed in this dissertation. These essays consider different settings in which a decision maker has to rely on another agent for information. In each essay, we analyze how much information the sender is able to credibly communicate to

  18. Status report on the accelerators operation

    International Nuclear Information System (INIS)

    Biri, S.; Kormany, Z.; Berzi, I.; Racz, R.

    2011-01-01

    During 2011 our particle accelerators operated as scheduled, safely and without major or long breakdowns. The utilization rates of the accelerators were similar to the preceding year. The cyclotron delivered 1735 hours and the 40-year-old 5 MeV Van de Graaff generator supplied more than 1900 hours. The 1 MeV Van de Graaff accelerator was also operated for several short basic physics experiments last year (84 hours), with requests for much more beamtime in 2012. The plasma and beam-on-target time at the ECR ion source was less than in previous years (322 hours) due to several time-consuming technical developments in this laboratory. The isotope separator, as an ion beam accelerator, was utilized only for a few hours in 2011, since the research and development in this lab focused on other fields. Nevertheless it is continuously available for research requiring special deposition techniques and for isotope tracing studies. In recognition of the importance and quality of our accelerators and staff, the title 'Strategic Research Infrastructure' was awarded to the Accelerator Center by the Hungarian authorities. In order to get access to the accelerators a new system was introduced. Beamtime (within the limits of the capacities) is available to everyone with an equal chance if an acceptable scientific program is provided together with the request. Users have to contact our Program Advisory Committee (PAC). Since last year the requests - both from external and local users - must be submitted by filling out and sending an on-line form available on the homepage of the institute. In the next sub-chapters the 2011 operation and development details at the cyclotron, VdG-5 and ECR accelerators are summarized. Cyclotron operation. The operation of the cyclotron in 2011 was concentrated in the usual 9 months; January, July and August were reserved for maintenance and holidays. The overall working time of the accelerator was 2603 hours; the time used for systematic

  19. Strategic Culture: the Concept and the Directions of Research

    Directory of Open Access Journals (Sweden)

    Эдуард Николаевич Ожиганов

    2012-06-01

    The definition and estimation of the political qualification of ruling groups, and the long-term prognosis of their activities, is a paramount task of strategic analysis. Ruling groups have their own interests, and strategic manipulations of them (both successful and poor) constitute an important part of their game behavior, whose effectiveness in given periods is more or less computable. The game behavior of ruling groups in turn depends on the characteristics of strategic culture. This link becomes evident under comparative analysis.

  20. Strategic Management of Large Projects

    Institute of Scientific and Technical Information of China (English)

    Wang Yingluo; Liu Yi; Li Yuan

    2004-01-01

    The strategic management of large projects is both theoretically and practically important. Some scholars have advanced flexible strategy theory in China. The difference between strategic flexibility and flexible strategy is pointed out. The supporting system and characteristics of flexible strategy are analyzed. The changes of flexible strategy and the integration of strategic management are discussed.

  1. Peaceful Development and Strategic Opportunity

    Institute of Scientific and Technical Information of China (English)

    Yang Yi

    2006-01-01

    The international strategic situation and environment China faces have changed dramatically since September 11. China has closely followed and adapted itself to the ever-changing situation, seized strategic opportunity, adjusted its global strategy, adhered to peaceful development and displayed diplomacy and strategic flexibility. These are manifested in the following four aspects:

  2. Strategic Planning and Financial Management

    Science.gov (United States)

    Conneely, James F.

    2010-01-01

    Strong financial management is a strategy for strategic planning success in student affairs. It is crucial that student affairs professionals understand the necessity of linking their strategic planning with their financial management processes. An effective strategic planner needs strong financial management skills to implement the plan over…

  3. Being Strategic in HE Management

    Science.gov (United States)

    West, Andrew

    2008-01-01

    The call to be strategic--and with it the concept of strategic management--can bring to mind a wide range of definitions, and there is now a huge array of academic literature supporting the different schools of thought. At a basic level, however, strategic thinking is probably most simply about focusing on the whole, rather than the part. In…

  4. Strategic Partnerships in Higher Education

    Science.gov (United States)

    Ortega, Janet L.

    2013-01-01

    The purpose of this study was to investigate the impacts of strategic partnerships between community colleges and key stakeholders; to specifically examine strategic partnerships; leadership decision-making; criteria to evaluate strategic partnerships that added value to the institution, value to the students, faculty, staff, and the local…

  5. The Possibilities of Strategic Finance

    Science.gov (United States)

    Chaffee, Ellen

    2010-01-01

    Strategic finance is aligning financial decisions--regarding revenues, creating and maintaining institutional assets, and using those assets--with the institution's mission and strategic plan. The concept known as "strategic finance" increasingly is being seen as a useful perspective for helping boards and presidents develop a sustainable…

  6. Microfoundations of strategic decision effectiveness

    NARCIS (Netherlands)

    Jansen, R.J.G.; Van Santen, Sarah

    2017-01-01

    How do organizations make effective strategic decisions? In this study we build on research on the microfoundations of strategy and strategic decision-making to study the underpinnings of strategic decision effectiveness. We argue that the process-effectiveness link can be more fully understood if

  7. Operation of the accelerator

    International Nuclear Information System (INIS)

    GANIL Team

    1992-01-01

    The operation of the GANIL accelerator during 1991 and the first half of 1992 is reported. Results obtained with new beams, metallic beams and the first tests with the new injector system using an ECR source installed on a 100 kV platform are also given. Statistics of operation and beam characteristics are presented. The computer control system is also discussed. (K.A.) 7 refs.; 3 figs.; 8 tabs

  8. Strategic arms limitation

    Science.gov (United States)

    Allen Greb, G.; Johnson, Gerald W.

    1983-10-01

    Following World War II, American scientists and politicians proposed in the Baruch plan a radical solution to the problem of nuclear weapons: to eliminate them forever under the auspices of an international nuclear development authority. The Soviets, who as yet did not possess the bomb, rejected this plan. Another approach suggested by Secretary of War Henry Stimson to negotiate directly with the Soviet Union was not accepted by the American leadership. These initial arms limitation failures both reflected and exacerbated the hostile political relationship of the superpowers in the 1950s and 1960s. Since 1969, the more modest focus of the Soviet-American arms control process has been on limiting the numbers and sizes of both defensive and offensive strategic systems. The format for this effort has been the Strategic Arms Limitation Talks (SALT) and more recently the Strategic Arms Reduction Talks (START). Both sides came to these negotiations convinced that nuclear arsenals had grown so large that some form of mutual restraint was needed. Although the SALT/START process has been slow and ponderous, it has produced several concrete agreements and collateral benefits. The 1972 ABM Treaty restricts the deployment of ballistic missile defense systems, the 1972 Interim Agreement places a quantitative freeze on each side's land-based and sea-based strategic launchers, and the as yet unratified 1979 SALT II Treaty sets numerical limits on all offensive strategic systems and sublimits on MIRVed systems. Collateral benefits include improved verification procedures, working definitions and counting rules, and permanent bureaucratic apparatus which enhance stability and increase the chances for achieving additional agreements.

  9. Accelerator operations

    International Nuclear Information System (INIS)

    Anon.

    1980-01-01

    This section is concerned with the operation of both the tandem-linac system and the Dynamitron, two accelerators that are used for entirely different research. Developmental activities associated with the tandem and the Dynamitron are also treated here, but developmental activities associated with the superconducting linac are covered separately because this work is a program of technology development in its own right

  10. CNSTN Accelerator

    International Nuclear Information System (INIS)

    Habbassi, Afifa; Trabelsi, Adel

    2010-01-01

    This project gives an overview of measurements on the linear accelerator at the CNSTN. During this work we monitored the dose distribution for different products. For this characterisation we carried out installation qualification, operational qualification and performance qualification, and at every step we monitored the temperature and the dose, as well as the distribution of the latter.

  11. Accelerators course

    CERN Multimedia

    CERN. Geneva HR-RFA; Métral, E

    2006-01-01

    1a) Introduction and motivation 1b) History and accelerator types 2) Transverse beam dynamics 3a) Longitudinal beam dynamics 3b) Figure of merit of a synchrotron/collider 3c) Beam control 4) Main limiting factors 5) Technical challenges

  12. Accelerator operations

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    Operations of the SuperHILAC, the Bevatron/Bevalac, and the 184-inch Synchrocyclotron during the period from October 1977 to September 1978 are discussed. These include ion source development, accelerator facilities, the Heavy Ion Spectrometer System, and Bevelac biomedical operations

  13. Accelerator update

    International Nuclear Information System (INIS)

    Anon.

    1995-01-01

    When the Accelerator Conference, combined International High Energy and US Particle versions, held in Dallas in May, was initially scheduled, progress nearby for the US Superconducting Supercollider was high on the preliminary agenda. With the SSC voted down by Congress in October 1993, this was no longer the case. However the content of the meeting, in terms of both its deep implications for ambitious new projects and the breadth of its scope, showed that the worldwide particle accelerator field is far from being moribund. A traditional feature of such accelerator conferences is the multiplicity of parallel sessions. No one person can attend all sessions, so that delegates can follow completely different paths and emerge with totally different impressions. Despite this overload, and despite the SSC cancellation, the general picture is one of encouraging progress over a wide range of major new projects throughout the world. At the same time, spinoff from, and applications of, accelerators and accelerator technology are becoming increasingly important. Centre stage is now CERN's LHC proton-proton collider, where a test string of superconducting magnets is operating over long periods at the nominal LHC field of 8.36 tesla or more. The assignment of the underground areas in the existing 27-kilometre LEP tunnel is now quasi-definitive (see page 3). For CERN's existing big machine, the LEP electron-positron collider, ongoing work concentrates on boosting performance using improved optics and bunch trains. But the main objective is the LEP2 scheme using superconducting accelerating cavities to boost the beam energy (see page 6). After some initial teething problems, production and operation of these cavities appears to have been mastered, at least under test conditions. A highlight at CERN last year was the first run with lead ions (December 1994, page 15). Handling these heavy particles with systems originally designed for protons calls for ingenuity. The SPS

  14. Accelerator update

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1995-09-15

    When the Accelerator Conference, combined International High Energy and US Particle versions, held in Dallas in May, was initially scheduled, progress nearby for the US Superconducting Supercollider was high on the preliminary agenda. With the SSC voted down by Congress in October 1993, this was no longer the case. However the content of the meeting, in terms of both its deep implications for ambitious new projects and the breadth of its scope, showed that the worldwide particle accelerator field is far from being moribund. A traditional feature of such accelerator conferences is the multiplicity of parallel sessions. No one person can attend all sessions, so that delegates can follow completely different paths and emerge with totally different impressions. Despite this overload, and despite the SSC cancellation, the general picture is one of encouraging progress over a wide range of major new projects throughout the world. At the same time, spinoff from, and applications of, accelerators and accelerator technology are becoming increasingly important. Centre stage is now CERN's LHC proton-proton collider, where a test string of superconducting magnets is operating over long periods at the nominal LHC field of 8.36 tesla or more. The assignment of the underground areas in the existing 27-kilometre LEP tunnel is now quasi-definitive (see page 3). For CERN's existing big machine, the LEP electron-positron collider, ongoing work concentrates on boosting performance using improved optics and bunch trains. But the main objective is the LEP2 scheme using superconducting accelerating cavities to boost the beam energy (see page 6). After some initial teething problems, production and operation of these cavities appears to have been mastered, at least under test conditions. A highlight at CERN last year was the first run with lead ions (December 1994, page 15). Handling these heavy particles with systems originally designed for protons calls for ingenuity. The SPS has managed

  15. Accelerators for heavy ion fusion

    International Nuclear Information System (INIS)

    Bangerter, R.O.

    1985-10-01

    Large fusion devices will almost certainly produce net energy. However, a successful commercial fusion energy system must also satisfy important engineering and economic constraints. Inertial confinement fusion power plants driven by multi-stage, heavy-ion accelerators appear capable of meeting these constraints. The reasons behind this promising outlook for heavy-ion fusion are given in this report. This report is based on the transcript of a talk presented at the Symposium on Lasers and Particle Beams for Fusion and Strategic Defense at the University of Rochester on April 17-19, 1985

  16. Cognitive Characteristics of Strategic and Non-strategic Gamblers.

    Science.gov (United States)

    Mouneyrac, Aurélie; Lemercier, Céline; Le Floch, Valérie; Challet-Bouju, Gaëlle; Moreau, Axelle; Jacques, Christian; Giroux, Isabelle

    2018-03-01

    Participation in strategic and non-strategic games is mostly explained in the literature by gender: men gamble on strategic games, while women gamble on non-strategic games. However, little is known about the underlying cognitive factors that could also distinguish strategic and non-strategic gamblers. We suggest that cognitive style and need for cognition also explain participation in gambling subtypes. From a dual-process perspective, cognitive style is the tendency to reject or accept the fast, automatic answer that comes immediately in response to a problem. Individuals who preferentially reject the automatic response use an analytic style, which suggests processing information in a slow way, with deep treatment. The intuitive style supposes a reliance on fast, automatic answers. The need for cognition provides a motivation to engage in effortful activities. One hundred and forty-nine gamblers (53 strategic and 96 non-strategic) answered the Cognitive Reflection Test, the Need For Cognition Scale, and socio-demographic questions. A logistic regression was conducted to evaluate the influence of gender, cognitive style and need for cognition on participation in strategic and non-strategic games. Our results show that a model with both gender and cognitive variables is more accurate than a model with gender alone. An analytic (vs. intuitive) style, a high (vs. low) need for cognition and being male (vs. female) are characteristics of strategic gamblers (vs. non-strategic gamblers). This study highlights the importance of considering the cognitive characteristics of strategic and non-strategic gamblers in order to develop preventive campaigns and treatments that best fit gamblers' profiles.
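
    The analysis described above is a standard binary logistic regression. The sketch below shows the general form of such a model on synthetic data; the variable names, coding and simulated effect sizes are our assumptions for illustration, not the study's data or results.

      # Illustrative only: logistic regression of game-type choice on gender,
      # cognitive style (CRT score) and need for cognition, with synthetic data.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n = 149                                # matches the reported sample size
      df = pd.DataFrame({
          "male": rng.integers(0, 2, n),     # 1 = male, 0 = female
          "crt": rng.integers(0, 4, n),      # 0-3 items correct on the CRT
          "nfc": rng.normal(0.0, 1.0, n),    # standardized need-for-cognition score
      })
      # Simulate the hypothesized direction of effects (positive for all three).
      logit_p = -1.0 + 0.8 * df["male"] + 0.5 * df["crt"] + 0.4 * df["nfc"]
      df["strategic"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

      model = smf.logit("strategic ~ male + crt + nfc", data=df).fit(disp=False)
      print(model.summary())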

  17. Strategic Planning in U.S. Municipalities

    Directory of Open Access Journals (Sweden)

    James VAN RAVENSWAY

    2015-12-01

    Strategic planning started in the U.S. as a corporate planning endeavor. By the 1960's, it had become a major corporate management tool in the Fortune 500. At first, it was seen as a way of interweaving policies, values and purposes with management, resources and market information in a way that held the organization together. By the 1950's, the concept was simplified somewhat to focus on SWOT as a way of keeping the corporation afloat in a more turbulent world. The public sector has been under pressure for a long time to become more efficient, effective and responsive. Many have felt that the adoption of business practices would help to accomplish that. One tool borrowed from business has been strategic planning. At the local government level, strategic planning became popular starting in the 1980's, and the community's planning office was called on to lead the endeavor. The planning office was often the advocate of the process. Urban planning offices had been doing long-range plans for decades, but with accelerating urban change a more rapid action-oriented response was desired. The paper describes this history and process in East Lansing, Michigan, U.S., where comprehensive community plans are the result of a multi-year visioning process and call for action-oriented strategies for targeted parts of the community.

  18. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations the relations between the seismic strength of the earthquake, focal depth, distance and ground acceleration are calculated. We found that most Swedish earthquakes are shallow; the largest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site. For consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with the existing distant instrumental data. (author)
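
    As a rough illustration of how such a model ties earthquake strength, distance and focal depth to ground acceleration, here is a generic attenuation-relation sketch; the functional form and the coefficients are placeholders, not the report's actual wave-propagation computation.

      # Illustrative only: a generic attenuation relation for peak ground
      # acceleration (PGA) at a bedrock site; k, a and b are placeholder values.
      import math

      def pga_fraction_of_g(magnitude: float, epicentral_km: float,
                            focal_depth_km: float,
                            k: float = 0.001, a: float = 0.5, b: float = 1.3) -> float:
          """PGA as a fraction of g, decaying with hypocentral distance."""
          hypocentral = math.hypot(epicentral_km, focal_depth_km)
          return k * 10.0 ** (a * magnitude) * hypocentral ** (-b)

      # Example: a magnitude-5 event at 20 km depth, observed 30 km away.
      print(f"PGA ~ {pga_fraction_of_g(5.0, 30.0, 20.0):.3f} g")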

  19. Accelerating Value Creation with Accelerators

    DEFF Research Database (Denmark)

    Jonsson, Eythor Ivar

    2015-01-01

    and developing the best business ideas and support the due diligence process. Even universities are noticing that the learning experience of the action learning approach is an effective way to develop capabilities and change cultures. Accelerators related to what has historically been associated...

  20. Innovation and strategic competitiveness

    Directory of Open Access Journals (Sweden)

    Jović Mile B.

    2003-01-01

    The paper discusses the relationship of innovation to achieving strategic competitiveness in today's globalized economic environment. Special attention is devoted to the nature of competitive advantages at the level of global industries as well as at the national level. Competitive advantage is a firm's ability to transform inputs into goods and services at a profit on a sustained basis, better than competitors. Comparative advantage resides in the factor endowments and created endowments of particular regions. Besides the traditional endowment approach (land, natural resources, labor and the size of the local population), the importance of created endowments such as skilled labor, the technology and knowledge base, government support and culture is emphasized. In creating a corporate or country competitiveness roadmap there is no substantial difference - an innovative as well as a strategic approach is essential.

  1. Strategic innovation portfolio management

    Directory of Open Access Journals (Sweden)

    Stanković Ljiljana

    2015-01-01

    Full Text Available In a knowledge-based economy, strategic innovation portfolio management becomes an increasingly important and critical factor of an enterprise's success. Value creation for all the participants in the value chain is more successful if it is based on efficient resource allocation and improvement of innovation performance. Numerous studies have shown that companies with the best market position found their competitiveness on efficient development and exploitation of innovations. In the decision-making process, an enterprise's management is constantly faced with the challenge of allocating resources and capabilities as efficiently as possible, in both the short and the long term. In this paper the authors present preliminary results of empirical research on strategic innovation portfolio management in ten chosen enterprises in Serbia. The structure of the paper includes the following parts: theoretical background, explanation of the research purpose and methodology, discussion of the results, and concluding remarks, including limitations and directions for further research.

  2. Scope of strategic marketing

    Directory of Open Access Journals (Sweden)

    Bradley Frank

    2004-01-01

    Full Text Available Marketing is a philosophy that leads to the process by which organizations, groups and individuals obtain what they need and want by identifying value, providing it, communicating it and delivering it to others. The core concepts of marketing are customers' needs, wants and values; products, exchange, communications and relationships. Marketing is strategically concerned with the direction and scope of the long-term activities performed by the organization to obtain a competitive advantage. The organization applies its resources within a changing environment to satisfy customer needs while meeting stakeholder expectations. Implied in this view of strategic marketing is the requirement to develop a strategy to cope with competitors, identify market opportunities, develop and commercialize new products and services, allocate resources among marketing activities and design an appropriate organizational structure to ensure the performance desired is achieved.

  3. The Strategic Design Perspective

    DEFF Research Database (Denmark)

    Rasmussen, Jørgen

    2006-01-01

    In this article I argue, and exemplify, that designers working in close relation with industry can have significant influence on the nature and the quality of the products (or services) produced by the companies. In order to achieve this, the designer must become one of the decisive factors...... in the strategic framework of the company. This is done by taking the design process to the business floor of the company and using the design competence to innovate, not only products, but the fundamental concepts for product selection. Using design as a strategic tool will cause entirely new products to emerge...... and will make entire groups of products change into services. In this way new markets will appear to benefit the innovative companies, and if designers do their job well, the focus will be on the users, thereby benefiting them as well. An example of this process is Novo Nordisk. This pharmaceutical company...

  4. Interoperability Strategic Vision

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Knight, Mark R.; Melton, Ronald B.; Narang, David; Martin, Maurice; Nordman, Bruce; Khandekar, Aditya; Hardy, Keith S.

    2018-02-28

    The Interoperability Strategic Vision whitepaper aims to promote a common understanding of the meaning and characteristics of interoperability and to provide a strategy to advance the state of interoperability as applied to integration challenges facing grid modernization. This includes addressing the quality of integrating devices and systems and the discipline to improve the process of successfully integrating these components as business models and information technology improve over time. The strategic vision for interoperability described in this document applies throughout the electric energy generation, delivery, and end-use supply chain. Its scope includes interactive technologies and business processes from bulk energy levels to lower voltage level equipment and the millions of appliances that are becoming equipped with processing power and communication interfaces. A transformational aspect of a vision for interoperability in the future electric system is the coordinated operation of intelligent devices and systems at the edges of grid infrastructure. This challenge offers an example for addressing interoperability concerns throughout the electric system.

  5. Dynamic Strategic Information Transmission

    OpenAIRE

    Mikhail Golosov; Vasiliki Skreta; Aleh Tsyvinski; Andrea Wilson

    2011-01-01

    This paper studies strategic information transmission in a dynamic environment where, each period, a privately informed expert sends a message and a decision maker takes an action. Our main result is that, in contrast to a static environment, full information revelation is possible. The gradual revelation of information and the eventual full revelation is supported by the dynamic rewards and punishments. The construction of a fully revealing equilibrium relies on two key features. The first f...

  6. Strategic Targeted Advertising

    OpenAIRE

    Andrea Galeotti; Jose Luis Moraga

    2003-01-01

    We present a strategic game of pricing and targeted advertising. Firms can simultaneously target price advertisements to different groups of customers, or to the entire market. Pure strategy equilibria do not exist and thus market segmentation cannot occur surely. Equilibria exhibit random advertising - to induce an unequal distribution of information in the market - and random pricing - to obtain profits from badly informed buyers. We characterize a positive profits equilibrium...

  7. Strategic Marketing in Tourism

    OpenAIRE

    Silvia Muhcina; Brailoiu Liviu

    2012-01-01

    Tourism is a very dynamic economic sector because it is very dependent on environmental changes, especially now, when the global economy passes through successive crises. For competitive organizations, success means transforming their specific activity into a more market-oriented business. The objectives of any organization must be set starting from a better understanding of the markets. Strategic marketing means knowing and analyzing the consumers' needs and the market to which the organization refers...

  8. Strategic Corporate Social Responsibility

    OpenAIRE

    Planer-Friedrich, Lisa; Sahm, Marco

    2017-01-01

    We examine the strategic use of Corporate Social Responsibility (CSR) in imperfectly competitive markets. The level of CSR determines the weight a firm puts on consumer surplus in its objective function before it decides upon supply. First, we consider symmetric Cournot competition and show that the endogenous level of CSR is positive for any given number of firms. However, positive CSR levels imply smaller equilibrium profits. Second, we find that an incumbent monopolist can use CSR as an en...

  9. The strategic research positioning:

    DEFF Research Database (Denmark)

    Viala, Eva Silberschmidt

    to provide new insights into ‘immigrant’ parents’ perspective on home/school partnership in Denmark. The majority of the immigrant parents came from non-Western countries, and they had already been ‘labelled’ difficult in terms of home/school partnership. This calls for what I call ‘strategic research...... positioning’, meaning critical reflections about the relationship and power balance between the researcher and the researched. The paper focuses on challenges and dilemmas linked to this position....

  10. Making Risk Management Strategic

    DEFF Research Database (Denmark)

    Sax, Johanna; Andersen, Torben Juul

    2018-01-01

    Enterprise risk management (ERM) is an established management practice and is increasing in prominence as more firms spend substantial resources implementing ERM frameworks, partially induced by regulatory requirements. Yet, there is a lack of knowledge as to whether such frameworks add value and...... outcomes. The study develops a new multidimensional measure of adherence to ERM practices where earlier studies typically have relied on dichotomous proxies. We discuss the implications of these findings for ERM practice and strategic management in general....

  11. Laser acceleration

    Science.gov (United States)

    Tajima, T.; Nakajima, K.; Mourou, G.

    2017-02-01

    The fundamental idea of Laser Wakefield Acceleration (LWFA) is reviewed. An ultrafast intense laser pulse drives coherent wakefield with a relativistic amplitude robustly supported by the plasma. While the large amplitude of wakefields involves collective resonant oscillations of the eigenmode of the entire plasma electrons, the wake phase velocity ~ c and ultrafastness of the laser pulse introduce the wake stability and rigidity. A large number of worldwide experiments show a rapid progress of this concept realization toward both the high-energy accelerator prospect and broad applications. The strong interest in this has been spurring and stimulating novel laser technologies, including the Chirped Pulse Amplification, the Thin Film Compression, the Coherent Amplification Network, and the Relativistic Mirror Compression. These in turn have created a conglomerate of novel science and technology with LWFA to form a new genre of high field science with many parameters of merit in this field increasing exponentially lately. This science has triggered a number of worldwide research centers and initiatives. Associated physics of ion acceleration, X-ray generation, and astrophysical processes of ultrahigh energy cosmic rays are reviewed. Applications such as X-ray free electron laser, cancer therapy, and radioisotope production etc. are considered. A new avenue of LWFA using nanomaterials is also emerging.

  12. Laser acceleration

    International Nuclear Information System (INIS)

    Tajima, T.; Nakajima, K.; Mourou, G.

    2017-01-01

    The fundamental idea of Laser Wakefield Acceleration (LWFA) is reviewed. An ultrafast intense laser pulse drives coherent wakefield with a relativistic amplitude robustly supported by the plasma. While the large amplitude of wakefields involves collective resonant oscillations of the eigenmode of the entire plasma electrons, the wake phase velocity ~ c and ultrafastness of the laser pulse introduce the wake stability and rigidity. A large number of worldwide experiments show a rapid progress of this concept realization toward both the high-energy accelerator prospect and broad applications. The strong interest in this has been spurring and stimulating novel laser technologies, including the Chirped Pulse Amplification, the Thin Film Compression, the Coherent Amplification Network, and the Relativistic Mirror Compression. These in turn have created a conglomerate of novel science and technology with LWFA to form a new genre of high field science with many parameters of merit in this field increasing exponentially lately. This science has triggered a number of worldwide research centers and initiatives. Associated physics of ion acceleration, X-ray generation, and astrophysical processes of ultrahigh energy cosmic rays are reviewed. Applications such as X-ray free electron laser, cancer therapy, and radioisotope production etc. are considered. A new avenue of LWFA using nanomaterials is also emerging.

  13. Accelerating networks

    International Nuclear Information System (INIS)

    Smith, David M D; Onnela, Jukka-Pekka; Johnson, Neil F

    2007-01-01

    Evolving out-of-equilibrium networks have been under intense scrutiny recently. In many real-world settings the number of links added per new node is not constant but depends on the time at which the node is introduced in the system. This simple idea gives rise to the concept of accelerating networks, for which we review an existing definition and, after finding it somewhat constrictive, offer a new definition. The new definition provided here views network acceleration as a time-dependent property of a given system, as opposed to being a property of the specific algorithm applied to grow the network. The definition also covers both unweighted and weighted networks. As time-stamped network data becomes increasingly available, the proposed measures may be easily applied to such empirical datasets. As a simple case study we apply the concepts to study the evolution of three different instances of Wikipedia, namely those in English, German, and Japanese, and find that the networks undergo different acceleration regimes in their evolution.
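
    As a sketch of how such time-dependent acceleration measures could be applied to time-stamped edge lists, the following toy function (its binning scheme and names are illustrative, not the paper's definitions) tracks links added per newly arriving node in successive time windows; a rising ratio signals acceleration:

    ```python
    from collections import defaultdict

    def links_per_new_node(edges, window):
        # edges: iterable of (t, u, v) time-stamped links; window: bin width.
        seen = set()
        new_nodes, new_links = defaultdict(int), defaultdict(int)
        for t, u, v in sorted(edges, key=lambda e: e[0]):
            b = int(t // window)
            new_links[b] += 1
            for n in (u, v):
                if n not in seen:
                    seen.add(n)
                    new_nodes[b] += 1
        # links added per new node in each time bin
        return {b: new_links[b] / max(new_nodes[b], 1) for b in sorted(new_links)}
    ```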

  14. Computing Nash equilibria through computational intelligence methods

    Science.gov (United States)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, and differential evolution, to compute Nash equilibria of finite strategic games, as global minima of a real-valued, nonnegative function. An issue of particular interest is to detect more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
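
    The paper's framing - Nash equilibria as global minima of a real-valued, nonnegative function - can be illustrated with one of the three methods it names. Below is a minimal sketch using SciPy's differential evolution on a standard pure-strategy 'regret' gap function for a 2x2 bimatrix game; the game, the simplex parameterization, and the constants are illustrative, not the authors' exact formulation:

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    # Matching pennies: row player's payoffs A, column player's payoffs B.
    A = np.array([[1.0, -1.0], [-1.0, 1.0]])
    B = -A

    def to_simplex(v):
        # Map free variables to a mixed strategy (nonnegative, sums to 1).
        v = np.abs(v) + 1e-12
        return v / v.sum()

    def nash_gap(z):
        # Nonnegative; zero exactly when no pure-strategy deviation gains.
        x, y = to_simplex(z[:2]), to_simplex(z[2:])
        rx = A @ y - x @ A @ y   # row player's gain from each pure strategy
        ry = x @ B - x @ B @ y   # column player's gain from each pure strategy
        return np.sum(np.maximum(rx, 0.0) ** 2) + np.sum(np.maximum(ry, 0.0) ** 2)

    res = differential_evolution(nash_gap, bounds=[(0, 1)] * 4, seed=1)
    x, y = to_simplex(res.x[:2]), to_simplex(res.x[2:])
    print(x, y, res.fun)  # expect x ~ y ~ [0.5, 0.5] and a gap near zero
    ```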

  15. Interactive Design of Accelerators (IDA)

    International Nuclear Information System (INIS)

    Barton, M.Q.

    1987-01-01

    IDA is a beam transport line calculation program which runs interactively on an IBM PC computer. It can be used for a large fraction of the usual calculations done for beam transport systems or periods of accelerators or storage rings. Because of the interactive screen editor nature of the data input, this program permits one to rather quickly arrive at general properties of a beam line or an accelerator period
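
    IDA itself is an interactive IBM PC program, but the arithmetic underneath any such beam line calculation is first-order transfer-matrix multiplication. A minimal sketch of that idea in one transverse plane (the element lengths and focal length are arbitrary; this is not IDA's code):

    ```python
    import numpy as np

    def drift(L):
        # 2x2 transfer matrix of a field-free drift of length L [m].
        return np.array([[1.0, L], [0.0, 1.0]])

    def thin_quad(f):
        # Thin-lens quadrupole of focal length f [m] (focusing for f > 0).
        return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

    # Elements listed in traversal order; the total matrix is the product
    # of the element matrices taken in reverse order.
    elements = [drift(1.0), thin_quad(2.0), drift(1.0)]
    M = np.linalg.multi_dot(list(reversed(elements)))

    x0 = np.array([0.005, 0.001])  # initial (x [m], x' [rad])
    print(M @ x0)                  # transported coordinates at the exit
    ```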

  16. Computationally efficient dynamic modeling of robot manipulators with multiple flexible-links using acceleration-based discrete time transfer matrix method

    DEFF Research Database (Denmark)

    Zhang, Xuping; Sørensen, Rasmus; RahbekIversen, Mathias

    2018-01-01

    , and then is linearized based on the acceleration-based state vector. The transfer matrices for each type of components/elements are developed, and used to establish the system equations of a flexible robot manipulator by concatenating the state vector from the base to the end-effector. With this strategy, the size...... manipulators, and only involves calculating and transferring component/element dynamic equations that have small size. The numerical simulations and experimental testing of flexible-link manipulators are conducted to validate the proposed methodologies....

  17. Advanced concepts for acceleration

    International Nuclear Information System (INIS)

    Keefe, D.

    1986-07-01

    Selected examples of advanced accelerator concepts are reviewed. Such plasma accelerators as plasma beat wave accelerator, plasma wake field accelerator, and plasma grating accelerator are discussed particularly as examples of concepts for accelerating relativistic electrons or positrons. Also covered are the pulsed electron-beam, pulsed laser accelerator, inverse Cherenkov accelerator, inverse free-electron laser, switched radial-line accelerators, and two-beam accelerator. Advanced concepts for ion acceleration discussed include the electron ring accelerator, excitation of waves on intense electron beams, and two-wave combinations

  18. Equipartitioning in linear accelerators

    International Nuclear Information System (INIS)

    Jameson, R.A.

    1981-01-01

    Emittance growth has long been a concern in linear accelerators, as has the idea that some kind of energy balance, or equipartitioning, between the degrees of freedom, would ameliorate the growth. M. Prome observed that the average transverse and longitudinal velocity spreads tend to equalize as current in the channel is increased, while the sum of the energy in the system stays nearly constant. However, only recently have we shown that an equipartitioning requirement on a bunched injected beam can indeed produce remarkably small emittance growth. The simple set of equations leading to this condition are outlined below. At the same time, Hofmann, using powerful analytical and computational methods, has investigated collective instabilities in transported beams and has identified thresholds and regions in parameter space where instabilities occur. This is an important generalization. Work that he will present at this conference shows that the results are essentially the same in r-z coordinates for transport systems, and evidence is presented that shows transport system boundaries to be quite accurate in computer simulations of accelerating systems also. Discussed are preliminary results of efforts to design accelerators that avoid parameter regions where emittance is affected by the instabilities identified by Hofmann. These efforts suggest that other mechanisms are present. The complicated behavior of the RFQ linac in this framework also is shown
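
    In symbols, the energy-balance idea can be sketched as follows, where epsilon denotes rms emittance and a, b the rms transverse and longitudinal beam sizes; normalizations vary by author, so this is a schematic of the condition, not Jameson's exact equations:

    ```latex
    % Beam "temperatures" as squared velocity spreads per degree of freedom
    T_t \propto \left(\frac{\varepsilon_t}{a}\right)^{2}, \qquad
    T_l \propto \left(\frac{\varepsilon_l}{b}\right)^{2}
    % Equipartitioning: the transverse and longitudinal spreads balance
    \frac{\varepsilon_t}{a} = \frac{\varepsilon_l}{b}
    ```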

  19. Accelerated Profile HMM Searches.

    Directory of Open Access Journals (Sweden)

    Sean R Eddy

    2011-10-01

    Full Text Available Profile hidden Markov models (profile HMMs and probabilistic inference methods have made important contributions to the theory of sequence database homology search. However, practical use of profile HMM methods has been hindered by the computational expense of existing software implementations. Here I describe an acceleration heuristic for profile HMMs, the "multiple segment Viterbi" (MSV algorithm. The MSV algorithm computes an optimal sum of multiple ungapped local alignment segments using a striped vector-parallel approach previously described for fast Smith/Waterman alignment. MSV scores follow the same statistical distribution as gapped optimal local alignment scores, allowing rapid evaluation of significance of an MSV score and thus facilitating its use as a heuristic filter. I also describe a 20-fold acceleration of the standard profile HMM Forward/Backward algorithms using a method I call "sparse rescaling". These methods are assembled in a pipeline in which high-scoring MSV hits are passed on for reanalysis with the full HMM Forward/Backward algorithm. This accelerated pipeline is implemented in the freely available HMMER3 software package. Performance benchmarks show that the use of the heuristic MSV filter sacrifices negligible sensitivity compared to unaccelerated profile HMM searches. HMMER3 is substantially more sensitive and 100- to 1000-fold faster than HMMER2. HMMER3 is now about as fast as BLAST for protein searches.
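
    The pipeline idea - a cheap ungapped score that decides which sequences get the expensive full pass - can be shown with a toy filter. This is a plain Python sketch of the concept only, not HMMER's striped, vector-parallel MSV or its Forward/Backward implementation:

    ```python
    def ungapped_filter_score(query, target, match=2, mismatch=-1):
        # Best single ungapped diagonal segment (local alignment, no gaps).
        best = 0
        for offset in range(-len(target) + 1, len(query)):
            run = 0
            for i in range(len(query)):
                j = i - offset
                if 0 <= j < len(target):
                    run = max(0, run + (match if query[i] == target[j] else mismatch))
                    best = max(best, run)
        return best

    def search(query, database, full_scorer, threshold=6):
        # Only targets passing the cheap filter reach the expensive scorer.
        return [(t, full_scorer(query, t)) for t in database
                if ungapped_filter_score(query, t) >= threshold]
    ```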

  20. To the problem of software development for two-sided communication system between an operator and computers in the automatic control system of elementary particle accelerators

    International Nuclear Information System (INIS)

    Asyaev, Yu.V.; Burtsev, V.L.; Makarov, V.V.; Nikitin, V.D.; Solov'ev, G.N.

    1975-01-01

    Principles, composition and basic characteristics of the software for a system of operator communication with minicomputers within the automatic accelerator control system are described. It is indicated that the external language of communication proposed in the system should meet the requirements for provision of a dialogue between the operator and the minicomputer operating in real-time mode with an accelerator. There is a set of standard procedures allowing the operator to interact with the system in a sufficiently complete and flexible manner. These standard procedures are provided in the external operator's language in the form of a set of control statements. The basic programs of the software system for communication are the MONITOR, INTERPRETER and DRIVER programs. The MONITOR program algorithm employs the following principle: service of the input/output requests in the system is made on a relative priority basis, where the maximum priority is assigned to operator's requests and the requests of the dispatcher and application programs are at lower priority levels.
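
    The relative-priority servicing described above maps naturally onto a priority queue. A minimal sketch with invented priority levels and request names standing in for the original MONITOR's request tables:

    ```python
    import heapq
    import itertools

    OPERATOR, DISPATCHER, APPLICATION = 0, 1, 2  # lower value = higher priority
    _order = itertools.count()                   # FIFO tie-break within a level
    _queue = []

    def submit(priority, request):
        heapq.heappush(_queue, (priority, next(_order), request))

    def serve_next():
        # Highest-priority (lowest-valued) pending request, FIFO within level.
        return heapq.heappop(_queue)[2] if _queue else None

    submit(APPLICATION, "log beam data")
    submit(OPERATOR, "display magnet current")
    print(serve_next())  # -> "display magnet current": operator served first
    ```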

  1. Application of local area networks to accelerator control systems at the Stanford Linear Accelerator

    International Nuclear Information System (INIS)

    Fox, J.D.; Linstadt, E.; Melen, R.

    1983-03-01

    The history and current status of SLAC's SDLC networks for distributed accelerator control systems are discussed. These local area networks have been used for instrumentation and control of the linear accelerator. Network topologies, protocols, physical links, and logical interconnections are discussed for specific applications in distributed data acquisition and control system, computer networks and accelerator operations

  2. Accelerators and the Accelerator Community

    Energy Technology Data Exchange (ETDEWEB)

    Malamud, Ernest; Sessler, Andrew

    2008-06-01

    In this paper, standing back--looking from afar--and adopting a historical perspective, the field of accelerator science is examined. How it grew, what are the forces that made it what it is, where it is now, and what it is likely to be in the future are the subjects explored. Clearly, a great deal of personal opinion is invoked in this process.

  3. Accelerators and the Accelerator Community

    International Nuclear Information System (INIS)

    Malamud, Ernest; Sessler, Andrew

    2008-01-01

    In this paper, standing back--looking from afar--and adopting a historical perspective, the field of accelerator science is examined. How it grew, what are the forces that made it what it is, where it is now, and what it is likely to be in the future are the subjects explored. Clearly, a great deal of personal opinion is invoked in this process

  4. The Strategic Communication Plan: Effective Communication for Strategic Leaders

    National Research Council Canada - National Science Library

    Reeder, Melanie

    1998-01-01

    .... It addresses the purpose, developmental process, content, and implementation of a strategic communication plan offering specific recommendations for the creation and effective use of a successful plan...

  5. Computers in engineering. 1988

    International Nuclear Information System (INIS)

    Tipnis, V.A.; Patton, E.M.

    1988-01-01

    These proceedings discuss the following subjects: knowledge-based systems; computers in design; uses of artificial intelligence; engineering optimization and expert systems for accelerators; and parallel processing in design.

  6. Strategic Team AI Path Plans: Probabilistic Pathfinding

    Directory of Open Access Journals (Sweden)

    Tng C. H. John

    2008-01-01

    Full Text Available This paper proposes a novel method to generate strategic team AI pathfinding plans for computer games and simulations using probabilistic pathfinding. This method is inspired by genetic algorithms (Russell and Norvig, 2002), in that a fitness function is used to test the quality of the path plans. The method generates high-quality path plans by eliminating the low-quality ones. The path plans are generated by probabilistic pathfinding, and the elimination is done by a fitness test of the path plans. This path plan generation method has the ability to generate varied, high-quality paths, which is desired for games to increase replay value. This work is an extension of our earlier work on team AI: probabilistic pathfinding (John et al., 2006). We explore ways to combine probabilistic pathfinding and genetic algorithms to create a new method to generate strategic team AI pathfinding plans.
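
    A toy version of the generate-and-eliminate loop reads as follows; the grid moves, the step budget, and the fitness weights are illustrative stand-ins, not the authors' functions:

    ```python
    import random

    def random_path(start, goal, steps=40):
        # Probabilistic pathfinding: a random walk that stops at the goal.
        path, (x, y) = [start], start
        for _ in range(steps):
            dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            x, y = x + dx, y + dy
            path.append((x, y))
            if (x, y) == goal:
                break
        return path

    def fitness(path, goal):
        # Paths ending near the goal score higher; length is penalized.
        (gx, gy), (x, y) = goal, path[-1]
        return -(abs(gx - x) + abs(gy - y)) - 0.1 * len(path)

    def best_plans(start, goal, n_samples=500, keep=5):
        # Generate many candidate paths, eliminate all but the fittest few.
        plans = [random_path(start, goal) for _ in range(n_samples)]
        return sorted(plans, key=lambda p: fitness(p, goal), reverse=True)[:keep]

    team_plans = best_plans((0, 0), (5, 3))  # several varied high-quality plans
    ```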

  7. Ring accelerators

    International Nuclear Information System (INIS)

    Gisler, G.; Faehl, R.

    1983-01-01

    We present two-dimensional simulations in (r-z) and (r-theta) cylindrical geometries of imploding-liner-driven accelerators of rings of charged particles. We address issues of azimuthal and longitudinal stability of the rings. We discuss self-trapping designs in which beam injection and extraction is aided by means of external cusp fields. Our simulations are done with the 2-1/2-D particle-in-cell plasma simulation code CLINER, which combines collisionless, electromagnetic PIC capabilities with a quasi-MHD finite element package.

  8. accelerating cavity

    CERN Multimedia

    On the inside of the cavity there is a layer of niobium. Operating at 4.2 degrees above absolute zero, the niobium is superconducting and carries an accelerating field of 6 million volts per metre with negligible losses. Each cavity has a surface of 6 m2. The niobium layer is only 1.2 microns thick, ten times thinner than a hair. Such a large area had never been coated to such a high accuracy. A speck of dust could ruin the performance of the whole cavity so the work had to be done in an extremely clean environment.

  9. Strategic sizing of energy storage facilities in electricity markets

    DEFF Research Database (Denmark)

    Nasrolahpour, Ehsan; Kazempour, Seyyedjalal; Zareipour, Hamidreza

    2016-01-01

    This paper proposes a model to determine the optimal size of an energy storage facility from a strategic investor’s perspective. This investor seeks to maximize its profit through making strategic planning (i.e., storage sizing) and strategic operational (i.e., offering and bidding) decisions. We...... consider the uncertainties associated with rival generators’ offering strategies and future load levels in the proposed model. The strategic investment decisions include the sizes of the charging device, discharging device and energy reservoir. The proposed model is a stochastic bi-level optimization problem......; the planning and operation decisions are made in the upper level, and market clearing is modeled in the lower level under different operating scenarios. To make the proposed model computationally tractable, an iterative solution technique based on Benders’ decomposition is implemented. This provides a master...
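
    The bi-level structure - an upper level choosing the size, a lower level valuing its operation in the market - can be caricatured with a grid search standing in for the paper's stochastic program and Benders' decomposition. Every number, the price series, and the dispatch rule below are invented for illustration:

    ```python
    def operating_profit(size_mwh, prices, power_mw=10.0, eff=0.9):
        # Lower level (toy): charge when cheap, discharge when expensive.
        energy, profit = 0.0, 0.0
        lo, hi = min(prices), max(prices)
        for p in prices:  # one price per hour
            if p <= 1.2 * lo and energy < size_mwh:      # charge
                e = min(power_mw, size_mwh - energy)
                energy += e * eff
                profit -= e * p
            elif p >= 0.8 * hi and energy > 0.0:         # discharge
                e = min(power_mw, energy)
                energy -= e
                profit += e * p
        return profit

    prices = [20, 18, 22, 60, 75, 30, 15, 80, 85, 25] * 365  # toy year of prices
    capital_cost = lambda size_mwh: 8.0 * size_mwh           # toy annualized cost

    # Upper level (toy): pick the reservoir size with the best net profit.
    best_size = max(range(0, 201, 10),
                    key=lambda s: operating_profit(s, prices) - capital_cost(s))
    print(best_size)
    ```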

  10. Modern control techniques for accelerators

    International Nuclear Information System (INIS)

    Goodwin, R.W.; Shea, M.F.

    1984-01-01

    Beginning in the mid to late sixties, most new accelerators were designed to include computer based control systems. Although each installation differed in detail, the technology of the sixties and early to mid seventies dictated an architecture that was essentially the same for the control systems of that era. A mini-computer was connected to the hardware and to a console. Two developments have changed the architecture of modern systems: the microprocessor and local area networks. This paper discusses these two developments and demonstrates their impact on control system design and implementation by way of describing a possible architecture for any size of accelerator. Both hardware and software aspects are included

  11. THE MODELS OF STRATEGIC MANAGEMENT OF INFOCOMM BUSINESS

    Directory of Open Access Journals (Sweden)

    M. A. Lyashenko

    2015-01-01

    and communication business made in the article, one general idea of forming a strategy for managing infocommunication business was selected, which consists in full recognition of the inevitability of globalization processes in the modern world under the accelerated development of information technologies. In these conditions, the companies use such strategic means of competition as: increase of productivity, mastering of new markets, creation of new business models and attraction of talents on a global scale.

  12. Strategic management of population programs

    OpenAIRE

    Bernhart, Michael H.

    1992-01-01

    Formal strategic planning and management appear to contribute to organizational effectiveness. The author surveys the literature on strategic management in private/for-profit organizations and applies lessons from that literature to population programs. Few would argue that population programs would not benefit from strategic planning and management, but it would be inadvisable to initiate the process when the organization is faced with a short-term crisis; during or immediately before a chan...

  13. Strategic planning: today's hot buttons.

    Science.gov (United States)

    Bohlmann, R C

    1998-01-01

    The first generation of mergers and managed care hasn't slowed down group practices' need for strategic planning. Even groups that have already been through one merger are asking about new mergers or ownership possibilities, the future of managed care, performance standards and physician unhappiness. Strategic planning, including consideration of benchmarking, production of ancillary services and physician involvement, can help. Even if it is only a short, general look at the future, strategic planning shows the proactive leadership needed in today's environment.

  14. Cosmic ray acceleration mechanisms

    International Nuclear Information System (INIS)

    Cesarsky, C.J.

    1982-09-01

    We present a brief summary of some of the most popular theories of cosmic ray acceleration: Fermi acceleration, its application to acceleration by shocks in a scattering medium, and impulsive acceleration by relativistic shocks

  15. Strategic Human Resources Management

    Directory of Open Access Journals (Sweden)

    Marta Muqaj

    2016-07-01

    Full Text Available Strategic Human Resources Management (SHRM) represents an important and sensitive aspect of the functioning and development of a company, business, institution, state, or public or private agency of a country. SHRM is based on psychological practices, especially investing in empowerment, broad training and teamwork. This way it remains the primary resource to maintain stability and competitiveness. SHRM has lately evolved in fast and secure steps, and the transformation from Human Resources Management to SHRM is becoming popular, but it still remains impossible to estimate exactly how far SHRM has gone in updating the practices of HRM in organizations and institutions in general. This manuscript aims to reflect on strategic management and the factors that influence its practices in some organizations. The researchers aim to identify the influential factors that play key roles in SHRM, and to determine its challenges and priorities, in order to select the most appropriate model for achieving a desirable performance. SHRM is a key factor in the achievement of the objectives of the organization, based on HR through continuous performance growth. It is a complex process, unpredictable and influenced by many outside and inside factors, which aims to find the shortest way to achieve strategic competitive advantages by creating structure, planning, organizing, thinking, values, culture, communication, perspectives and the image of the organization. While traditional HR management is focused on the individual performance of employees, the strategic one is based on organizational performance and the role of the HRM system as a main factor in solving business issues and achieving competitive advantage within its kind.

  16. Strategic management process in hospitals.

    Science.gov (United States)

    Zovko, V

    2001-01-01

    Strategic management is concerned with strategic choices and strategic implementation; it provides the means by which organizations meet their objectives. In the case of hospitals it helps executives and all employees to understand the real purpose and long term goals of the hospital. Also, it helps the hospital find its place in the health care service provision chain, and enables the hospital to coordinate its activities with other organizations in the health care system. Strategic management is a tool, rather than a solution, that helps executives to identify root causes of major problems in the hospital.

  17. Strategic thinking for radiology

    OpenAIRE

    Schilling, Ronald B.

    1998-01-01

    We have now analyzed the use and benefits of four Strategic Thinking Tools for Radiology: the Vision Statement, the High Five, the Two-by-Two, and Real-Win-Worth. Additional tools will be provided during the tutorial. The tools provided above should be considered as examples. They all contain the 10 benefits outlined earlier to varying degrees. It is extremely important that the tools be used in a manner consistent with the Vision Statement of the organization. The specific situation, the eff...

  18. Strategic Aspects of Bundling

    International Nuclear Information System (INIS)

    Podesta, Marion

    2008-01-01

    The increase of bundle supply has become widespread in several sectors (for instance in the telecommunications and energy fields). This paper reviews strategic aspects of bundling. The main purpose of this paper is to analyze the profitability of bundling strategies according to the degree of competition and the characteristics of goods. Moreover, bundling can be used as a price discrimination tool, screening device or entry barrier. In the monopoly case, a bundling strategy is efficient for sorting consumers into different categories in order to capture a maximum of surplus. However, when competition increases, the profitability of bundling strategies depends on the correlation of consumers' reservation values. (author)

  19. Guam Strategic Energy Plan

    Energy Technology Data Exchange (ETDEWEB)

    Conrad, M. D.

    2013-07-01

    Describes various energy strategies available to Guam to meet the territory's goal of diversifying fuel sources and reducing fossil energy consumption 20% by 2020. The information presented in this strategic energy plan will be used by the Guam Energy Task Force to develop an energy action plan. Available energy strategies include policy changes, education and outreach, reducing energy consumption at federal facilities, and expanding the use of a range of energy technologies, including buildings energy efficiency and conservation, renewable electricity production, and alternative transportation. The strategies are categorized based on the time required to implement them.

  20. Strategic CSR in Afghanistan

    DEFF Research Database (Denmark)

    Azizi, Sameer

    . Based on an analysis of five CSR projects, it can be assessed that Roshan enhances its competitive advantage through CSR at the internal, external, and wider-society levels. It is analyzed that Roshan influences its competitive context in both inside-out and outside-in dimensions, and that the CSR......’, but is based on a ‘license to operate’ motivation, where businesses have free room for maneuvering CSR towards their strategic priorities and business goals. Whether this creates ‘shared value’ for both business and in particular for society is however still questionable....

  1. Strategic Global Climate Command?

    Science.gov (United States)

    Long, J. C. S.

    2016-12-01

    Researchers have been exploring geoengineering because anthropogenic GHG emissions could drive the globe towards uninhabitability for people, wildlife and vegetation. Potential global deployment of these technologies is inherently strategic. For example, solar radiation management to reflect more sunlight might be strategically useful during a period of time where the population completes an effort to cease emissions, and carbon removal technologies might then be strategically deployed to move the atmospheric concentrations back to a safer level. Consequently, deployment of these global technologies requires the ability to think and act strategically on the part of the planet's governments. Such capacity most definitely does not exist today, but it behooves scientists and engineers to be involved in thinking through how global command might develop, because the way they do the research could support the development of a capacity to deploy intervention rationally - or irrationally. Internationalizing research would get countries used to working together. Organizing the research in a step-wise manner, where at each step scientists become skilled at explaining what they have learned, the quality of the information they have, what they don't know, and what more they can do to reduce or handle uncertainty, would also help. Such a process can increase societal confidence in being able to make wise decisions about deployment. Global capacity will also be enhanced if the scientific establishment reinvents mission-driven research so that programs will identify the systemic issues involved in any proposed technology and systematically address them with research while still encouraging individual creativity. Geoengineering will diverge from climate science in that geoengineering research needs to design interventions for some publicly desirable goal and investigate whether a proposed intervention will achieve desired outcomes. The effort must be a systems-engineering design problem

  2. Encouraging environmentally strategic technologies

    International Nuclear Information System (INIS)

    Heaton, G.R.

    1994-01-01

    Having moved beyond its initial absorption with controlling new technology, environmental policy today must focus more strongly on promoting the development and adoption of new technologies. World Resource Institute's (WRI) ongoing study of 'environmentally strategic technology' is addressed to this fundamental policy issue. The study proposes criteria for identifying such technology, offers a specific list, suggests the kinds of public policy changes necessary to encourage their development and finally presents a comparison of critical technology lists (from the White House, the European Community, Japan and the US Department of Defense). (TEC)

  3. Strategic Urban Governance

    DEFF Research Database (Denmark)

    Pagh, Jesper

    2014-01-01

    The days of long-term predict-and-provide planning that saw its heydays in the post-war decades are long gone. As our late-modern time presents us with an evermore complex and contrasting view of the world, planning has become a much more fragmented and ambivalent affair. That a country or a city...... should be run like a private corporation has increasingly become common sense, and thus the competition among entities – be it countries, regions or cities – to a greater and greater extent defines success and the means to achieve it. What has been collected under the umbrella term Strategic Urban...

  4. Neurocognitive dysfunction in strategic and non-strategic gamblers.

    Science.gov (United States)

    Grant, Jon E; Odlaug, Brian L; Chamberlain, Samuel R; Schreiber, Liana R N

    2012-08-07

    It has been theorized that there may be subtypes of pathological gambling, particularly in relation to the main type of gambling activities undertaken. Whether or not putative pathological gambling subtypes differ in terms of their clinical and cognitive profiles has received little attention. Subjects meeting DSM-IV criteria for pathological gambling were grouped into two categories of preferred forms of gambling - strategic (e.g., cards, dice, sports betting, stock market) and non-strategic (e.g., slots, video poker, pull tabs). Groups were compared on clinical characteristics (gambling severity, and time and money spent gambling), psychiatric comorbidity, and neurocognitive tests assessing motor impulsivity and cognitive flexibility. Seventy-seven subjects were included in this sample (45.5% females; mean age: 42.7±14.9) which consisted of the following groups: strategic (n=22; 28.6%) and non-strategic (n=55; 71.4%). Non-strategic gamblers were significantly more likely to be older, female, and divorced. Money spent gambling did not differ significantly between groups although one measure of gambling severity reflected more severe problems for strategic gamblers. Strategic and non-strategic gamblers did not differ in terms of cognitive function; both groups showed impairments in cognitive flexibility and inhibitory control relative to matched healthy volunteers. These preliminary results suggest that preferred form of gambling may be associated with specific clinical characteristics but are not dissociable in terms of cognitive inflexibility and motor impulsivity. Copyright © 2012 Elsevier Inc. All rights reserved.

  5. Energy Innovation Acceleration Program

    Energy Technology Data Exchange (ETDEWEB)

    Wolfson, Johanna [Fraunhofer USA Inc., Center for Sustainable Energy Systems, Boston, MA (United States)

    2015-06-15

    The Energy Innovation Acceleration Program (IAP) – also called U-Launch – has had a significant impact on early stage clean energy companies in the Northeast and on the clean energy economy in the Northeast, not only during program execution (2010-2014), but continuing into the future. Key results include: leverage ratio of 105:1; $105M in follow-on funding (upon $1M investment by EERE); at least 19 commercial products launched; at least 17 new industry partnerships formed; at least $6.5M in revenue generated; >140 jobs created; 60% of assisted companies received follow-on funding within 1 year of program completion. In addition to the direct measurable program results summarized above, two primary lessons emerged from our work executing Energy IAP: validation and demonstration awards have an outsized, ‘tipping-point’ effect for startups looking to secure investments and strategic partnerships; and an ecosystem approach is valuable, but an approach that evaluates the needs of individual companies and then draws from diverse ecosystem resources to fill them is most valuable of all.

  6. Operationalizing strategic marketing.

    Science.gov (United States)

    Chambers, S B

    1989-05-01

    The strategic marketing process, like any administrative practice, is far simpler to conceptualize than operationalize within an organization. It is for this reason that this chapter focused on providing practical techniques and strategies for implementing the strategic marketing process. First and foremost, the marketing effort needs to be marketed to the various publics of the organization. This chapter advocated the need to organize the marketing analysis into organizational, competitive, and market phases, and it provided examples of possible designs of the phases. The importance and techniques for exhausting secondary data sources and conducting efficient primary data collection methods were explained and illustrated. Strategies for determining marketing opportunities and threats, as well as segmenting markets, were detailed. The chapter provided techniques for developing marketing strategies, including considering the five patterns of coverage available; determining competitor's position and the marketing mix; examining the stage of the product life cycle; and employing a consumer decision model. The importance of developing explicit objectives, goals, and detailed action plans was emphasized. Finally, helpful hints for operationalizing the communication variable and evaluating marketing programs were provided.

  7. A control and data processing system for neutron time-of-flight experiments at the Harwell linear accelerator based on a PDP-11/45 mini-computer

    International Nuclear Information System (INIS)

    Chapman, W.S.; Boyce, D.A.; Brisland, J.B.; Langman, A.E.; Morris, D.V.; Schomberg, M.G.; Webb, D.A.

    1977-05-01

    The subject is treated in sections, entitled: introduction (experimental method, need for the PDP-11/45 based system); features required in the control and data processing system; description of the selected system configuration (PDP-11/45 mini-computer and RSX-11D operating system, the single parameter experimental stations (the CAMAC units, the time-of-flight scaler)); description of the applications software; system performance. (U.K.)

  8. FAMOUS, faster: using parallel computing techniques to accelerate the FAMOUS/HadCM3 climate model with a focus on the radiative transfer algorithm

    Directory of Open Access Journals (Sweden)

    P. Hanappe

    2011-09-01

    Full Text Available We have optimised the atmospheric radiation algorithm of the FAMOUS climate model on several hardware platforms. The optimisation involved translating the Fortran code to C and restructuring the algorithm around the computation of a single air column. Instead of the existing MPI-based domain decomposition, we used a task queue and a thread pool to schedule the computation of individual columns on the available processors. Finally, four air columns are packed together in a single data structure and computed simultaneously using Single Instruction Multiple Data operations.

    The modified algorithm runs more than 50 times faster on the CELL's Synergistic Processing Element than on its main PowerPC processing element. On Intel-compatible processors, the new radiation code runs 4 times faster. On the tested graphics processor, using OpenCL, we find a speed-up of more than 2.5 times as compared to the original code on the main CPU. Because the radiation code takes more than 60 % of the total CPU time, FAMOUS executes more than twice as fast. Our version of the algorithm returns bit-wise identical results, which demonstrates the robustness of our approach. We estimate that this project required around two and a half man-years of work.
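
    The scheduling idea - independent air columns pulled from a task queue by a pool of workers - can be sketched as below. The per-column kernel is a placeholder; in CPython a real speedup requires the kernel to release the GIL (e.g., native code, as in the paper's C translation) or a process pool:

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def radiative_transfer(column):
        # Placeholder for the per-column radiation computation.
        return sum(column) / len(column)

    def compute_all(columns, workers=4):
        # Thread pool: workers pull columns from an internal task queue.
        with ThreadPoolExecutor(max_workers=workers) as pool:
            return list(pool.map(radiative_transfer, columns))

    columns = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]] * 1000
    fluxes = compute_all(columns)
    ```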

  9. Automatic performance tuning of parallel and accelerated seismic imaging kernels

    KAUST Repository

    Haberdar, Hakan; Siddiqui, Shahzeb; Feki, Saber

    2014-01-01

    the performance of the MPI communications as well as developer productivity by providing a higher level of abstraction. Keeping productivity in mind, we opted toward pragma based programming for accelerated computation on latest accelerated architectures

  10. Trends in accelerator control systems

    International Nuclear Information System (INIS)

    Crowley-Milling, M.C.

    1984-04-01

    Over the years, we have seen a revolution in control systems that has followed the ever decreasing cost of computer power and memory. It started with data gathering, when people distrusted the computer to perform control actions correctly, through the stage of using a computer system to provide a convenient remote look-and-adjust facility, to the present day, when more and more emphasis is being placed on using a computer system to simulate or model all or parts of the accelerator, feed in the required performance and call for the computers to set the various parameters and then measure the actual performance, with iteration if necessary. The progress that has been made in the fields of architecture, communications, computers, interfaces, software design and operator interfaces is reviewed.

  11. Inverse Free Electron Laser accelerator

    International Nuclear Information System (INIS)

    Fisher, A.; Gallardo, J.; van Steenbergen, A.; Sandweiss, J.

    1992-09-01

    The study of the INVERSE FREE ELECTRON LASER, as a potential mode of electron acceleration, is being pursued at Brookhaven National Laboratory. Recent studies have focussed on the development of a low energy, high gradient, multi-stage linear accelerator. The elementary ingredients for the IFEL interaction are the 50 MeV linac e- beam and the 10^11 W CO2 laser beam of BNL's Accelerator Test Facility (ATF), Center for Accelerator Physics (CAP), and a wiggler. The latter element is designed as a fast excitation unit making use of alternating stacks of Vanadium Permendur (VaP) ferromagnetic laminations, periodically interspersed with conductive, nonmagnetic laminations, which act as eddy-current-induced field reflectors. Wiggler parameters and field distribution data will be presented for a prototype wiggler in a constant period and in a ~1.5 %/cm tapered period configuration. The CO2 laser beam will be transported through the IFEL interaction region by means of a low loss, dielectric coated, rectangular waveguide. Short waveguide test sections have been constructed and have been tested using a low power cw CO2 laser. Preliminary results of guide attenuation and mode selectivity will be given, together with a discussion of the optical issues for the IFEL accelerator. The IFEL design is supported by the development and use of 1D and 3D simulation programs. The results of simulation computations, including also wiggler errors, for a single module accelerator and for a multi-module accelerator will be presented.

  12. Strategic Planning for Higher Education.

    Science.gov (United States)

    Kotler, Philip; Murphy, Patrick E.

    1981-01-01

    The framework necessary for achieving a strategic planning posture in higher education is outlined. The most important benefit of strategic planning for higher education decision makers is that it forces them to undertake a more market-oriented and systematic approach to long-range planning. (Author/MLW)

  13. Macrofoundation for Strategic Technology Management

    DEFF Research Database (Denmark)

    Pedersen, Jørgen Lindgaard

    1995-01-01

    Neoclassical mainstream economics has no perspective on strategic technology management issues. Market failure economics (externalities etc.) can be of some use to analyze problems of relevance in strategic management problems with technology as a part. Environment, inequality and democratic...... deficits are important problems today...

  14. Strategic Interactions in Franchise Relationships

    NARCIS (Netherlands)

    Croonen, Evelien Petronella Maria

    2006-01-01

    This dissertation deals with understanding strategic interactions between franchisors and franchisees. The empirical part of this study consists of in-depth case studies in four franchise systems in the Dutch drugstore industry. The case studies focus on a total of eight strategic change processes

  15. The Science of Strategic Communication

    Science.gov (United States)

    The field of Strategic Communication involves a focused effort to identify, develop, and present multiple types of communication media on a given subject. A Strategic Communication program recognizes the limitations of the most common communication models (primarily “one s...

  16. Strategic delegation improves cartel stability

    NARCIS (Netherlands)

    Han, M.A.

    2010-01-01

    Fershtman and Judd (1987) and Sklivas (1987) show how strategic delegation in the one-shot Cournot game reduces firm profits. However, with infinitely repeated interaction, strategic delegation allows for an improvement in cartel stability compared to the infinitely repeated standard Cournot game,

  17. Strategic Partnerships in International Development

    Science.gov (United States)

    Treat, Tod; Hartenstine, Mary Beth

    2013-01-01

    This chapter provides a framework and recommendations for development of strategic partnerships in a variety of cultural contexts. Additionally, this study elucidates barriers and possibilities in interagency collaborations. Without careful consideration regarding strategic partnerships' approaches, functions, and goals, the ability to…

  18. The Ethics of Strategic Ambiguity.

    Science.gov (United States)

    Paul, Jim; Strbiak, Christy A.

    1997-01-01

    Examines the concept of strategic ambiguity in communication, and addresses the ethics of strategic ambiguity from an intrapersonal perspective that considers the congruity of communicators' espoused-ethics, ethics-in-use, and behavior, where ethical judgements are based on the congruity between espoused-ethics and actual behavior. Poses questions…

  19. Strategic Planning Is an Oxymoron

    Science.gov (United States)

    Bassett, Patrick F.

    2012-01-01

    The thinking on "strategic thinking" has evolved significantly over the years. In the previous century, the independent school strategy was to focus on long-range planning, blithely projecting 10 years into the future. For decades this worked well enough, but in the late 20th century, independent schools shifted to "strategic planning," with its…

  20. Strategic Planning and Online Learning

    Science.gov (United States)

    McLaughlin-Graham, Karen; Berge, Zane L.

    2005-01-01

    Strategic planning is a critical part of sustaining distance education. Through such planning, the organization can solve business problems that involve training and education in an effective and often cost-saving manner compared to in-person training efforts. This paper examines the strategic planning process as it relates to sustaining distance…

  1. Strategic Aspects of Cost Management

    Directory of Open Access Journals (Sweden)

    Angelika I. Petrova

    2013-01-01

    Full Text Available This report is a summary of research done in the area of Strategic Cost Management (SCM). This report includes a detailed discussion and application of Life Cycle Costing (LCC), which a company can use to achieve its strategic objectives in today's dynamic business environment. Hence, the main focus of this report is on LCC, as mentioned

  2. Equipartitioning in linear accelerators

    International Nuclear Information System (INIS)

    Jameson, R.A.

    1982-01-01

    Emittance growth has long been a concern in linear accelerators, as has the idea that some kind of energy balance, or equipartitioning, between the degrees of freedom, would ameliorate the growth. M. Prome observed that the average transverse and longitudinal velocity spreads tend to equalize as current in the channel is increased, while the sum of the energy in the system stays nearly constant. However, only recently have we shown that an equipartitioning requirement on a bunched injected beam can indeed produce remarkably small emittance growth. The simple set of equations leading to this condition are outlined. At the same time, Hofmann has investigated collective instabilities in transported beams and has identified thresholds and regions in parameter space where instabilities occur. Evidence is presented that shows transport system boundaries to be quite accurate in computer simulations of accelerating systems. Discussed are preliminary results of efforts to design accelerators that avoid parameter regions where emittance is affected by the instabilities identified by Hofmann. These efforts suggest that other mechanisms are present. The complicated behavior of the RFQ linac in this framework also is shown

  3. Particle Accelerator Focus Automation

    Science.gov (United States)

    Lopes, José; Rocha, Jorge; Redondo, Luís; Cruz, João

    2017-08-01

    The Laboratório de Aceleradores e Tecnologias de Radiação (LATR) at the Campus Tecnológico e Nuclear, of Instituto Superior Técnico (IST), has a horizontal electrostatic particle accelerator based on the Van de Graaff machine which is used for research in the area of material characterization. This machine produces alpha (He+) and proton (H+) beams of some μA currents up to 2 MeV/q energies. Beam focusing is obtained using a cylindrical lens of the Einzel type, assembled near the high voltage terminal. This paper describes the developed system that automatically focuses the ion beam, using a personal computer running the LabVIEW software, a multifunction input/output board and signal conditioning circuits. The focusing procedure consists of a scanning method to find the lens bias voltage which maximizes the beam current measured on a beam stopper target, which is used as feedback for the scanning cycle. This system, as part of a wider start up and shut down automation system built for this particle accelerator, brings great advantages to the operation of the accelerator by turning it faster and easier to operate, requiring less human presence, and adding the possibility of total remote control in safe conditions.

  4. Particle Accelerator Focus Automation

    Directory of Open Access Journals (Sweden)

    Lopes José

    2017-08-01

    Full Text Available The Laboratório de Aceleradores e Tecnologias de Radiação (LATR) at the Campus Tecnológico e Nuclear, of Instituto Superior Técnico (IST), has a horizontal electrostatic particle accelerator based on the Van de Graaff machine which is used for research in the area of material characterization. This machine produces alpha (He+) and proton (H+) beams of some μA currents up to 2 MeV/q energies. Beam focusing is obtained using a cylindrical lens of the Einzel type, assembled near the high voltage terminal. This paper describes the developed system that automatically focuses the ion beam, using a personal computer running the LabVIEW software, a multifunction input/output board and signal conditioning circuits. The focusing procedure consists of a scanning method to find the lens bias voltage which maximizes the beam current measured on a beam stopper target, which is used as feedback for the scanning cycle. This system, as part of a wider start up and shut down automation system built for this particle accelerator, brings great advantages to the operation of the accelerator by turning it faster and easier to operate, requiring less human presence, and adding the possibility of total remote control in safe conditions.
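
    The scanning procedure both records describe reduces to a sweep-and-argmax loop over the lens bias voltage. A hedged sketch, in which set_lens_voltage and read_target_current are hypothetical stand-ins for the LabVIEW/I-O board calls:

    ```python
    def autofocus(set_lens_voltage, read_target_current,
                  v_min=0.0, v_max=20000.0, steps=100):
        # Sweep the Einzel-lens bias and keep the voltage that maximizes
        # the beam current measured on the beam stopper target.
        best_v, best_i = v_min, float("-inf")
        dv = (v_max - v_min) / steps
        for k in range(steps + 1):
            v = v_min + k * dv
            set_lens_voltage(v)          # drive the lens supply (hypothetical)
            i = read_target_current()    # beam-stopper feedback (hypothetical)
            if i > best_i:
                best_v, best_i = v, i
        set_lens_voltage(best_v)         # leave the lens at the optimum
        return best_v, best_i
    ```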

  5. Using Model to Plan of Strategic Objectives

    OpenAIRE

    Terezie Bartusková; Jitka Baňařová; Zuzana Kusněřová

    2012-01-01

    The importance of strategic planning is unquestionable. However, the practical implementation of a strategic plan faces too many obstacles. The aim of the article is to explain the importance of strategic planning, to find out how companies in the Moravian-Silesian Region deal with strategic planning, and to introduce a model that helps to set strategic goals in the area of financial indicators. This model should be part of the whole process of strategic planning and can be used to predict the future value...

  6. Multipactoring studies in accelerating structures

    International Nuclear Information System (INIS)

    Kravachuk, L.V.; Puntus, V.A.; Romanov, G.V.; Tarsov, S.G.

    1992-01-01

    A multipactor discharge takes place in the accelerating tanks of the Moscow meson factory linac. The RF power level, the location, and the characteristics of the discharge were determined from experimental results and computer simulation. The results of the investigation are given. (Author) 5 refs
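
    The record does not reproduce the model, but the classic two-surface multipactor resonance condition (a textbook relation, assumed here rather than quoted from the paper) requires the electron transit time across the gap to equal an odd number of RF half-periods, so that an electron always leaves a surface into an accelerating field:

        % Two-surface multipactor resonance: gap transit in odd half-periods
        t_{\text{transit}} \;=\; (2n-1)\,\frac{T_{\mathrm{rf}}}{2}, \qquad n = 1, 2, 3, \dots

    Sustained discharge additionally requires a secondary-emission yield above unity at the corresponding impact energy, which is why multipactor appears only within particular RF power bands.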

  7. Standard test method for accelerated leach test for diffusive releases from solidified waste and a computer program to model diffusive, fractional leaching from cylindrical waste forms

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This test method provides procedures for measuring the leach rates of elements from a solidified matrix material, determining if the releases are controlled by mass diffusion, computing values of diffusion constants based on models, and verifying projected long-term diffusive releases. This test method is applicable to any material that does not degrade or deform during the test. 1.1.1 If mass diffusion is the dominant step in the leaching mechanism, then the results of this test can be used to calculate diffusion coefficients using mathematical diffusion models. A computer program developed for that purpose is available as a companion to this test method (Note 1). 1.1.2 It should be verified that leaching is controlled by diffusion by a means other than analysis of the leach test solution data. Analysis of concentration profiles of species of interest near the surface of the solid waste form after the test is recommended for this purpose. 1.1.3 Potential effects of partitioning on the test results can...
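
    As an illustration of the kind of model the companion computer program implements (a standard semi-infinite-medium diffusion solution, assumed here rather than quoted from the test method), the cumulative fraction leached under diffusion control grows with the square root of time:

        % Cumulative fraction leached from a semi-infinite medium
        \frac{\sum_n a_n}{A_0} \;=\; 2\,\frac{S}{V}\,\sqrt{\frac{D_e\,t}{\pi}}

    Here a_n is the amount released in leaching interval n, A_0 the initial inventory, S the specimen surface area, V its volume, and D_e the effective diffusion coefficient, which follows from the slope of the cumulative fraction plotted against the square root of time.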

  8. Healthcare's Future: Strategic Investment in Technology.

    Science.gov (United States)

    Franklin, Michael A

    2018-01-01

    Recent and rapid advances in the implementation of technology have greatly affected the quality and efficiency of healthcare delivery in the United States. Simultaneously, diverse generational pressures, including the consumerism of millennials and unsustainable growth in the costs of care for baby boomers, have accelerated a revolution in healthcare delivery that was marked in 2010 by the passage of the Affordable Care Act. Against this backdrop, Maryland and the Centers for Medicare & Medicaid Services entered into a partnership in 2014 to modernize the Maryland All-Payer Model. Under this architecture, each Maryland hospital negotiates a global budget revenue agreement with the state's rate-setting agency, limiting the hospital's annual revenue to the budgetary cap established by the state. At Atlantic General Hospital (AGH), leaders had established a disciplined strategic planning process in which the board of trustees, medical staff, and administration annually agree on goals and initiatives to achieve the objectives set forth in its five-year strategic plans. This article describes two initiatives to improve care using technology. In 2006, AGH introduced a service guarantee in the emergency room (ER); the ER 30-Minute Promise assures patients that they will be placed in a bed or receive care within 30 minutes of arrival in the ER. In 2007, several independent hospitals in the state formed Maryland eCare to jointly contract for intensive care unit (ICU) physician coverage via telemedicine. This technology allows clinical staff to continuously monitor ICU patients remotely. The positive results of the ER 30-Minute Promise and the Maryland eCare program show that technological advances in an independent, small, rural hospital can make a significant impact on its ability to maintain independence. AGH's strategic investments prepared the organization well for the transition in 2014 to a value-based payment system.

  9. The Los Alamos accelerator code group

    Energy Technology Data Exchange (ETDEWEB)

    Krawczyk, F.L.; Billen, J.H.; Ryne, R.D.; Takeda, Harunori; Young, L.M.

    1995-05-01

    The Los Alamos Accelerator Code Group (LAACG) is a national resource for members of the accelerator community who use and/or develop software for the design and analysis of particle accelerators, beam transport systems, light sources, storage rings, and components of these systems. Below the authors describe the LAACG's activities in high performance computing, maintenance and enhancement of POISSON/SUPERFISH and related codes, and the dissemination of information on the INTERNET.

  10. The Los Alamos accelerator code group

    International Nuclear Information System (INIS)

    Krawczyk, F.L.; Billen, J.H.; Ryne, R.D.; Takeda, Harunori; Young, L.M.

    1995-01-01

    The Los Alamos Accelerator Code Group (LAACG) is a national resource for members of the accelerator community who use and/or develop software for the design and analysis of particle accelerators, beam transport systems, light sources, storage rings, and components of these systems. Below the authors describe the LAACG's activities in high performance computing, maintenance and enhancement of POISSON/SUPERFISH and related codes and the dissemination of information on the INTERNET

  11. Multinational Corporation and International Strategic Alliance

    Institute of Scientific and Technical Information of China (English)

    陆兮

    2015-01-01

    The world is now deep into the second great wave of globalization, in which products, capital, and markets are becoming more and more integrated across countries. Multinational corporations are growing rapidly around the globe and playing a significant role in the world economy. Meanwhile, the accelerated rate of globalization has also imposed pressures on MNCs, leaving them desperately seeking overseas alliances in order to remain competitive. International strategic alliances, which bring together large and commonly competitive firms for specific purposes, have gradually shown their importance in the world market, and the form of the international joint venture is now widely adopted. Then, after the formation of alliances, selecting the right partner, formulating the right strategies, and establishing a harmonious and effective partnership are generally the keys to success.

  12. Strategic thinking on oil development in China

    International Nuclear Information System (INIS)

    Liu Keyu; Shan Weiguo

    2005-01-01

    It is expected that crude oil production in China will maintain its current level until 2020. Driven by higher living standards and the rapid development of energy-intensive industries, China's oil demand will increase rapidly and might lead to heavier import dependency. Three cases of demand forecasts are presented, but for the sake of sustainable economic and social development, neither the high nor the middle case is favourable for China. Thus, China must seek a path of oil-saving economic development and limit oil consumption to no more than 350 MT in 2010 and 450 MT in 2020. Meanwhile, in order to secure the oil supply, the following strategies should be adopted: save oil and develop alternative energies; stabilise domestic oil production and diversify oil imports and overseas oil exploration and development; accelerate development of the gas industry and introduce strategic petroleum reserves. (author)

  13. The Advanced Test Reactor Strategic Evaluation Program

    International Nuclear Information System (INIS)

    Buescher, B.J.

    1990-01-01

    A systematic evaluation of safety, environmental, and operational issues has been initiated at the Advanced Test Reactor (ATR). This program, the Strategic Evaluation Program (STEP), provides an integrated review of safety and operational issues against the standards applied to licensed commercial facilities. In the review of safety issues, 18 deviations were identified which required prompt attention. Resolution of these items has been accelerated in the program. An integrated living schedule is being developed to address the remaining findings. A risk evaluation is being performed on the proposed corrective actions and these actions will then be formally ranked in order of priority based on considerations of safety and operational significance. Once the final ranking is completed, an integrated schedule will be developed, which will include considerations of availability of funding and operating schedule. 3 refs., 2 figs

  14. Strategic cycling: shaking complacency in healthcare strategic planning.

    Science.gov (United States)

    Begun, J; Heatwole, K B

    1999-01-01

    As the conditions affecting business and healthcare organizations in the United States have become more turbulent and uncertain, strategic planning has decreased in popularity. Strategic planning is criticized for stifling creative responses to the new marketplace and for fostering compartmentalized organizations, adherence to outmoded strategies, tunnel vision in strategy formulation, and overemphasis on planning to the detriment of implementation. However, effective strategic planning can be a force for mobilizing all the constituents of an organization, creating discipline in pursuit of a goal, broadening an organization's perspective, improving communication among disciplines, and motivating the organization's workforce. It is worthwhile for healthcare organizations to preserve these benefits of strategic planning while at the same time recognizing the many sources of turbulence and uncertainty in the healthcare environment. A model of "strategic cycling" is presented to address the perceived shortcomings of traditional strategic planning in a dynamic environment. The cycling model facilitates continuous assessment of the organization's mission/values/vision and primary strategies based on feedback from benchmark analysis, shareholder impact, and progress in strategy implementation. Multiple scenarios and contingency plans are developed in recognition of the uncertain future. The model represents a compromise between abandoning strategic planning and the traditional, linear model of planning based on progress through predetermined stages to a masterpiece plan.

  15. ABSTRACTS Preliminary Study of Strategic Inner Cores

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    When a strategic entity attempts to make a decision, the project must first be in accordance with its strategic framework as well as make the strategic inner cores prominent. The existing theories of development strategy indicate that the formation of the framework can be divided into the following parts: inside and outside environments, purpose, goal, key points, and countermeasures. The strategic inner cores put forward by this paper are an intensification and advancement of the theory of the strategic framework; strategic orientation, strategic vision, and main line are included. The appearance of these ideas has improved the theory and enhanced strategic practice.

  16. Strategic thinking for radiology.

    Science.gov (United States)

    Schilling, R B

    1997-08-01

    We have now analyzed the use and benefits of four Strategic Thinking Tools for Radiology: the Vision Statement, the High Five, the Two-by-Two, and Real-Win-Worth. Additional tools will be provided during the tutorial. The tools provided above should be considered as examples. They all contain the 10 benefits outlined earlier to varying degrees. It is extremely important that the tools be used in a manner consistent with the Vision Statement of the organization. The specific situation, the effectiveness of the team, and the experience developed with the tools over time will determine the true benefits of the process. It has also been shown that with active use of the types of tools provided above, teams have learned to modify the tools for increased effectiveness and have created additional tools for specific purposes. Once individuals in the organization become committed to improving communication and to using tools/frameworks for solving problems as a team, effectiveness becomes boundless.

  17. Trust in Strategic Alliances

    DEFF Research Database (Denmark)

    Nielsen, Bo

    2011-01-01

    This article examines the dynamic and multi-dimensional nature of trust in strategic alliances. Adopting a co-evolutionary approach, I developed a framework to show how trust, conceptualised in different forms, plays distinct roles at various evolutionary stages of the alliance relationship. Emphasising the multi-dimensional and dynamic role of trust, the framework illustrates how initial levels of a particular type of trust may co-evolve with the alliance and influence subsequent phases of the relationship, either on its own or in combination with other types or dimensions of trust. The theoretical distinction between trust as antecedent, moderator, and outcome during the evolution of the alliance relationship leads to research questions that may guide future empirical research.

  18. Strategizing on innovation systems

    DEFF Research Database (Denmark)

    Jofre, Sergio

    This paper explores the strategic context of the implementation of the European Institute of Technology (EIT) from the perspective of National Innovation Systems (NIS) and the Triple Helix of the University-Government-Industry relationship. The analytical framework is given by a comparative study… The concept of innovation systems assumes that flows of technology and information among people, companies, and institutions are crucial to the innovative process. At the national level, innovation and technical development are the result of a complex set of interactions, and an understanding of these developments enables proper policy actions. Although Europe has implemented several action plans and programmes aiming at improving its technological and non-technological innovation capability, its performance in the global context is yet weak, particularly if compared to rival economies such as Japan and the US (EC, 2008a). A recent initiative to foster Europe…

  19. Strategic research on materials

    International Nuclear Information System (INIS)

    Williams, J.

    1987-01-01

    Strategic research is defined as that which is necessary to support not only an understanding of the phenomenon on which a new technology is based, but also the raft of other technologies needed to exploit the new phenomenon. The theme is illustrated by reference to the development of ceramics of importance to the nuclear industry, particularly in relation to the AGR. Starting from natural uranium, the underlying and wide-ranging research effort devoted to the technology of isotopic enrichment, the investigation of the uranium-oxygen binary system, fabrication of uranium dioxide fuel, interactions between the fuel and stainless steel cans, between the cans and the CO2 coolant, and between the coolant and the graphite moderator is outlined. The role of ceramics in stable radioactive waste containment is also briefly mentioned. (author)

  20. Laser-driven acceleration with Bessel beam

    International Nuclear Information System (INIS)

    Imasaki, Kazuo; Li, Dazhi

    2005-01-01

    A new approach to laser-driven acceleration with a Bessel beam is described. A Bessel beam, in contrast to a Gaussian beam, shows "diffraction-free" characteristics in its propagation, which implies potential for laser-driven acceleration. But a conventional laser beam, even a Bessel beam, cannot accelerate charged particles efficiently, because the velocity difference between the particle and the photons produces alternating acceleration and deceleration phases. We propose a Bessel beam truncated by a set of annular slits that create several special regions along its travelling path, where the laser field becomes very weak and the accelerated particles can avoid deceleration as they pass through the decelerating phase. Thus, multistage acceleration with a high gradient is realizable. In a numerical computation, we have shown the potential of multistage acceleration based on a three-stage model. (author)
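
    As a sketch of why the slits help (standard idealized forms, assumed here rather than taken from the paper), the on-axis accelerating field of a Bessel beam and the distance over which a particle of velocity v slips from the accelerating to the decelerating phase can be written as:

        % Longitudinal field of an ideal Bessel beam and its dispersion relation
        E_z(r,z,t) \;=\; E_0\, J_0(k_r r)\,\cos(k_z z - \omega t),
        \qquad k_r^2 + k_z^2 = (\omega/c)^2

        % Phase-slip length for a particle of velocity v (phase velocity = omega/k_z > c)
        L_{\text{slip}} \;=\; \frac{\pi}{\omega/v - k_z}

    Because the phase velocity ω/k_z exceeds c while the particle velocity does not, the particle inevitably slips backward in phase; spacing the annular slits so that the field is suppressed over each decelerating stretch of length L_slip is what permits a net energy gain over many stages.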