WorldWideScience

Sample records for accelerated strategic computing

  1. Delivering Insight The History of the Accelerated Strategic Computing Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Larzelere II, A R

    2007-01-03

    The history of the Accelerated Strategic Computing Initiative (ASCI) tells of the development of computational simulation into a third fundamental piece of the scientific method, on a par with theory and experiment. ASCI did not invent the idea, nor was it alone in bringing it to fruition. But ASCI provided the wherewithal - hardware, software, environment, funding, and, most of all, the urgency - that made it happen. On October 1, 2005, the Initiative completed its tenth year of funding. The advances made by ASCI over its first decade are truly incredible. Lawrence Livermore, Los Alamos, and Sandia National Laboratories, along with leadership provided by the Department of Energy's Defense Programs Headquarters, fundamentally changed computational simulation and how it is used to enable scientific insight. To do this, astounding advances were made in simulation applications, computing platforms, and user environments. ASCI dramatically changed existing - and forged new - relationships, both among the Laboratories and with outside partners. By its tenth anniversary, despite daunting challenges, ASCI had accomplished all of the major goals set at its beginning. The history of ASCI is about the vision, leadership, endurance, and partnerships that made these advances possible.

  2. Accelerating Strategic Change Through Action Learning

    DEFF Research Database (Denmark)

    Younger, Jon; Sørensen, René; Cleemann, Christine

    2013-01-01

    Purpose – The purpose of this paper is to describe how a leading global company used action-learning based leadership development to accelerate strategic culture change. Design/methodology/approach – It describes the need for change, and the methodology and approach by which the initiative, Impact, generated significant benefits. Findings – The initiative led to financial benefit, as well as measurable gains in customer centricity, collaboration, and innovation. It was also a powerful experience for participants in their journey as commercial leaders. Originality/value – Impact was created using...

  3. Accelerating Clean Energy Commercialization. A Strategic Partnership Approach

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Richard [National Renewable Energy Lab. (NREL), Golden, CO (United States); Pless, Jacquelyn [Joint Institute for Strategic Energy Analysis, Golden, CO (United States); Arent, Douglas J. [Joint Institute for Strategic Energy Analysis, Golden, CO (United States); Locklin, Ken [Impax Asset Management Group (United Kingdom)

    2016-04-01

    Technology development in the clean energy and broader clean tech space has proven to be challenging. Long-standing methods for advancing clean energy technologies from science to commercialization are best known for relatively slow, linear progression through research and development, demonstration, and deployment (RDD&D); and characterized by well-known valleys of death for financing. Investment returns expected by traditional venture capital investors have been difficult to achieve, particularly for hardware-centric innovations and for companies subject to project finance risks. Commercialization support from incubators and accelerators has helped address these challenges by offering more support services to start-ups; however, more effort is needed to realize the desired clean energy future. The emergence of new strategic investors and partners in recent years has opened up innovative opportunities for clean tech entrepreneurs, and novel commercialization models are emerging that involve new alliances among clean energy companies, RDD&D, support systems, and strategic customers. For instance, Wells Fargo and Company (WFC) and the National Renewable Energy Laboratory (NREL) have launched a new technology incubator that supports faster commercialization through a focus on technology development. The incubator combines strategic financing, technology and technical assistance, strategic customer site validation, and ongoing financial support.

  4. CONFERENCE: Computers and accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1984-01-15

    In September of last year a Conference on 'Computers in Accelerator Design and Operation' was held in West Berlin attracting some 160 specialists including many from outside Europe. It was a Europhysics Conference, organized by the Hahn-Meitner Institute with Roman Zelazny as Conference Chairman, postponed from an earlier intended venue in Warsaw. The aim was to bring together specialists in the fields of accelerator design, computer control and accelerator operation.

  5. Computer control applied to accelerators

    CERN Document Server

    Crowley-Milling, Michael C

    1974-01-01

    The differences that exist between control systems for accelerators and other types of control systems are outlined. It is further indicated that earlier accelerators had manual control systems to which computers were added, but that it is essential for the new, large accelerators to include computers in the control systems right from the beginning. Details of the computer control designed for the Super Proton Synchrotron are presented. The method of choosing the computers is described, as well as the reasons for CERN having to design the message transfer system. The items discussed include: CAMAC interface systems, a new multiplex system, operator-to-computer interaction (such as touch screen, computer-controlled knob, and non-linear track-ball), and high-level control languages. Brief mention is made of the contributions of other high-energy research laboratories as well as of some other computer control applications at CERN.

  6. Personal computers in accelerator control

    International Nuclear Information System (INIS)

    Anderssen, P.S.

    1988-01-01

    The advent of the personal computer has created a popular movement which has also made a strong impact on science and engineering. Flexible software environments combined with good computational performance and large storage capacities are becoming available at steadily decreasing costs. Of equal importance, however, is the quality of the user interface offered on many of these products. Graphics and screen interaction are available in ways that were only possible on specialized systems before. Accelerator engineers were quick to pick up the new technology. The first applications were probably for controllers and data gatherers for beam measurement equipment. Others followed, and today it is conceivable to make the personal computer a standard component of an accelerator control system. This paper reviews the experience gained at CERN so far and describes the approach taken in the design of the common control center for the SPS and the future LEP accelerators. The design goal has been to be able to integrate personal computers into the accelerator control system and to build the operator's workplace around them. (orig.)

  7. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Spentzouris, P.; /Fermilab; Cary, J.; /Tech-X, Boulder; McInnes, L.C.; /Argonne; Mori, W.; /UCLA; Ng, C.; /SLAC; Ng, E.; Ryne, R.; /LBL, Berkeley

    2011-11-14

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single-physics-process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization

  8. Accelerating Climate Simulations Through Hybrid Computing

    Science.gov (United States)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

    Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPU) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for the connection, we identified two challenges: (1) an identical MPI implementation is required in both systems, and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors and two IBM QS22 Cell blades, connected with InfiniBand), allowing compute-intensive functions to be seamlessly offloaded to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.
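
    The offload pattern this prototype uses can be caricatured in a few lines of Python: a host dispatches a compute-intensive kernel to a pool of workers and collects results in a load-balanced way. This is an illustrative stand-in using a thread pool, not IBM DAV's actual API; the names radiation_kernel and offload are hypothetical.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def radiation_kernel(column):
        # Stand-in for a compute-intensive solar-radiation calculation
        # on one atmospheric column.
        return sum(x * x for x in column)

    def offload(columns, workers=4):
        # The pool distributes columns across workers, loosely analogous
        # to DAV spreading kernel calls across accelerator blades.
        with ThreadPoolExecutor(max_workers=workers) as pool:
            return list(pool.map(radiation_kernel, columns))

    results = offload([[float(i + j) for j in range(4)] for i in range(8)])
    ```

    In the real system the "workers" are remote Cell blades reached over InfiniBand, which is where the reported ~10% network overhead comes from.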

  9. Analogue computer display of accelerator beam optics

    International Nuclear Information System (INIS)

    Brand, K.

    1984-01-01

    Analogue computers were used years ago by several authors for the design of magnetic beam-handling systems. At Bochum, a small analogue/hybrid computer was combined with a particular analogue expansion and logic control unit for beam-transport work. This apparatus was very successful in the design and setup of the beam-handling system of the tandem accelerator. The center of the stripper canal was the object point for the calculations; instead of the high-energy acceleration tube, a drift length was inserted into the program, neglecting the weak focusing action of the tube. In the course of the installation of a second injector for heavy ions it became necessary to do better calculations. A simple method was found to represent accelerating sections on the computer, and a particular way to simulate thin lenses was adopted. The analogue computer system proved its usefulness in the design and in studies of the characteristics of different accelerator installations over many years. The results of the calculations are in very good agreement with real accelerator data. The apparatus is the ideal tool to demonstrate beam optics to students and accelerator operators, since the effect of a change of any of the parameters is immediately visible on the oscilloscope.
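
    The drift lengths and thin lenses mentioned above are the elements of standard 2x2 transfer-matrix optics, where each element acts on the ray vector (x, x'). A minimal digital sketch of that formalism (illustrative, not the original analogue program):

    ```python
    # Each beam-line element is a 2x2 matrix acting on the ray (x, x').
    def drift(L):
        return [[1.0, L], [0.0, 1.0]]

    def thin_lens(f):
        # Thin lens of focal length f: kicks the slope by -x/f.
        return [[1.0, 0.0], [-1.0 / f, 1.0]]

    def apply(M, ray):
        x, xp = ray
        return (M[0][0] * x + M[0][1] * xp, M[1][0] * x + M[1][1] * xp)

    def track(elements, ray):
        # Apply elements in beam order.
        for M in elements:
            ray = apply(M, ray)
        return ray

    # A ray parallel to the axis crosses it one focal length after the lens:
    x_out, xp_out = track([thin_lens(2.0), drift(2.0)], (0.01, 0.0))
    ```

    On the analogue machine the same linear maps were wired as integrator and summer networks, with the oscilloscope showing the ray envelope along the line.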

  10. Community petascale project for accelerator science and simulation: Advancing computational science for future accelerators and accelerator technologies

    International Nuclear Information System (INIS)

    Spentzouris, P.; Cary, J.; McInnes, L.C.; Mori, W.; Ng, C.; Ng, E.; Ryne, R.

    2008-01-01

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single-physics-process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  11. Cloud computing strategic framework (FY13 - FY15).

    Energy Technology Data Exchange (ETDEWEB)

    Arellano, Lawrence R.; Arroyo, Steven C.; Giese, Gerald J.; Cox, Philip M.; Rogers, G. Kelly

    2012-11-01

    This document presents an architectural framework (plan) and roadmap for the implementation of a robust Cloud Computing capability at Sandia National Laboratories. It is intended to be a living document and serve as the basis for detailed implementation plans, project proposals and strategic investment requests.

  12. Quantum Accelerators for High-performance Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S. [ORNL; Britt, Keith A. [ORNL; Mohiyaddin, Fahd A. [ORNL

    2017-11-01

    We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenge the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We discuss a novel quantum-accelerator framework that uses specialized kernels to offload select workloads while integrating with existing computing infrastructure. We elaborate on the role of the host operating system in managing these unique accelerator resources, the prospects for deploying quantum modules, and the requirements placed on the language hierarchy connecting these different system components. We draw on recent advances in the modeling and simulation of quantum computing systems, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for controlling quantum devices. Finally, we present simulation results that describe the expected system-level behavior of high-performance computing systems composed from compute nodes with quantum processing units. We describe performance for these hybrid systems in terms of time-to-solution, accuracy, and energy consumption, and we use simple application examples to estimate the performance advantage of quantum acceleration.
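
    The kernel-offload idea can be illustrated with a toy: the host hands a small quantum circuit to an accelerator, which here is simulated classically with a two-amplitude statevector. The function name offload_to_qpu_simulator is hypothetical; real frameworks dispatch to actual quantum hardware or vendor simulators.

    ```python
    import math

    def hadamard(state):
        # One-qubit Hadamard gate on the amplitude pair (a, b).
        a, b = state
        s = 1.0 / math.sqrt(2.0)
        return (s * (a + b), s * (a - b))

    def probabilities(state):
        # Born rule: measurement probabilities are squared amplitude moduli.
        return tuple(abs(amp) ** 2 for amp in state)

    def offload_to_qpu_simulator(state, gates):
        # Host-side stand-in for dispatching a kernel to a quantum accelerator.
        for gate in gates:
            state = gate(state)
        return probabilities(state)

    # |0> through a Hadamard gives a uniform superposition.
    probs = offload_to_qpu_simulator((1.0, 0.0), [hadamard])
    ```

    In the paper's framework the host OS would schedule such kernels onto quantum processing units alongside conventional compute nodes.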

  13. GPU-accelerated micromagnetic simulations using cloud computing

    Energy Technology Data Exchange (ETDEWEB)

    Jermain, C.L., E-mail: clj72@cornell.edu [Cornell University, Ithaca, NY 14853 (United States); Rowlands, G.E.; Buhrman, R.A. [Cornell University, Ithaca, NY 14853 (United States); Ralph, D.C. [Cornell University, Ithaca, NY 14853 (United States); Kavli Institute at Cornell, Ithaca, NY 14853 (United States)

    2016-03-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics. - Highlights: • The benefits of cloud computing for GPU-accelerated micromagnetics are examined. • We present the MuCloud software for running simulations on cloud computing. • Simulation run times are measured to benchmark cloud computing performance. • Comparison benchmarks are analyzed between CPU and GPU based solvers.
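
    The scaling-versus-cost tradeoff analyzed in the paper amounts to a simple comparison of instance price times runtime. A back-of-envelope sketch with hypothetical prices and runtimes (not the paper's benchmarks):

    ```python
    def cost_per_run(price_per_hour, runtime_hours):
        # Cloud cost of one simulation = hourly instance price x wall time.
        return price_per_hour * runtime_hours

    def speedup(cpu_hours, gpu_hours):
        # Wall-time speedup of the GPU solver over the CPU solver.
        return cpu_hours / gpu_hours

    # Hypothetical numbers for one micromagnetic simulation:
    cpu_hours, gpu_hours = 10.0, 0.5     # runtimes on CPU vs GPU instance
    cpu_price, gpu_price = 0.10, 0.90    # $/hour, illustrative cloud pricing

    # The GPU instance is pricier per hour but can still be cheaper per run
    # when the speedup exceeds the price ratio.
    gpu_wins = cost_per_run(gpu_price, gpu_hours) < cost_per_run(cpu_price, cpu_hours)
    ```

    This is the core of the cost argument: a 20x speedup beats a 9x price premium, so renting GPU time on demand can undercut owning or renting CPU capacity.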

  14. GPU-accelerated micromagnetic simulations using cloud computing

    International Nuclear Information System (INIS)

    Jermain, C.L.; Rowlands, G.E.; Buhrman, R.A.; Ralph, D.C.

    2016-01-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics. - Highlights: • The benefits of cloud computing for GPU-accelerated micromagnetics are examined. • We present the MuCloud software for running simulations on cloud computing. • Simulation run times are measured to benchmark cloud computing performance. • Comparison benchmarks are analyzed between CPU and GPU based solvers.

  15. High-performance computing in accelerating structure design and analysis

    International Nuclear Information System (INIS)

    Li Zenghai; Folwell, Nathan; Ge Lixin; Guetz, Adam; Ivanov, Valentin; Kowalski, Marc; Lee, Lie-Quan; Ng, Cho-Kuen; Schussman, Greg; Stingelin, Lukas; Uplenchwar, Ravindra; Wolf, Michael; Xiao, Liling; Ko, Kwok

    2006-01-01

    Future high-energy accelerators such as the Next Linear Collider (NLC) will accelerate multi-bunch beams of high current and low emittance to obtain high luminosity, which put stringent requirements on the accelerating structures for efficiency and beam stability. While numerical modeling has been quite standard in accelerator R&D, designing the NLC accelerating structure required a new simulation capability because of the geometric complexity and level of accuracy involved. Under the US DOE Advanced Computing initiatives (first the Grand Challenge and now SciDAC), SLAC has developed a suite of electromagnetic codes based on unstructured grids and utilizing high-performance computing to provide an advanced tool for modeling structures at accuracies and scales previously not possible. This paper will discuss the code development and computational science research (e.g. domain decomposition, scalable eigensolvers, adaptive mesh refinement) that have enabled the large-scale simulations needed for meeting the computational challenges posed by the NLC as well as projects such as the PEP-II and RIA. Numerical results will be presented to show how high-performance computing has made a qualitative improvement in accelerator structure modeling for these accelerators, either at the component level (single-cell optimization) or on the scale of an entire structure (beam heating and long-range wakefields).
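
    The "scalable eigensolvers" mentioned above compute cavity resonant modes as eigenpairs of a large discretized operator. As a toy building block only, power iteration finds a dominant eigenpair given nothing but a matrix-vector product; production codes like SLAC's use far more sophisticated Krylov-subspace methods.

    ```python
    def power_iteration(matvec, n, iters=200):
        # Repeatedly apply the operator and renormalize; the iterate
        # converges to the eigenvector of largest |eigenvalue|.
        v = [1.0] * n
        lam = 0.0
        for _ in range(iters):
            w = matvec(v)
            lam = max(abs(x) for x in w)   # infinity-norm eigenvalue estimate
            v = [x / lam for x in w]
        return lam, v

    # Diagonal test operator with dominant eigenvalue 3 (illustrative only):
    lam, v = power_iteration(lambda v: [3.0 * v[0], 1.0 * v[1]], 2)
    ```

    Needing only a matvec is the key property for unstructured-grid codes: the matrix never has to be formed densely, and the product parallelizes across domain-decomposed mesh partitions.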

  16. Accelerating artificial intelligence with reconfigurable computing

    Science.gov (United States)

    Cieszewski, Radoslaw

    Reconfigurable computing is emerging as an important area of research in computer architectures and software systems. Many algorithms can be greatly accelerated by placing the computationally intense portions of an algorithm into reconfigurable hardware. Reconfigurable computing combines many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible and can be changed over the lifetime of the system. Similar to an ASIC, reconfigurable systems provide a method to map circuits into hardware. Reconfigurable systems therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute cycle of traditional processors, and possibly exploiting a greater level of parallelism. One such field, with many different algorithms that can be accelerated, is artificial intelligence. This paper presents example hardware implementations of Artificial Neural Networks, Genetic Algorithms and Expert Systems.

  17. Advanced Computing Tools and Models for Accelerator Physics

    International Nuclear Information System (INIS)

    Ryne, Robert; Ryne, Robert D.

    2008-01-01

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction, I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large-scale computing in accelerator physics.

  18. Computer simulation of dynamic processes on accelerators

    International Nuclear Information System (INIS)

    Kol'ga, V.V.

    1979-01-01

    The problems considered are the computer-based numerical investigation of the motion of accelerated particles in accelerators and storage rings, the effect of different accelerator systems on that motion, and the determination of optimal characteristics of accelerated charged-particle beams. Various simulation representations describing accelerated-particle dynamics are discussed, such as the enlarged-particle method, the representation in which a large number of discrete particles is substituted for a field of continuously distributed space charge, and the method based on the determination of averaged beam characteristics. The procedure of numerical studies is described for the basic problems: calculation of closed orbits, establishment of stability regions, investigation of resonance propagation, determination of the phase-stability region, evaluation of the space-charge effect, and the problem of beam extraction. It is shown that most of these problems reduce to the solution of a Cauchy problem on a computer. The ballistic method, applied to the solution of the boundary-value problem of beam extraction, is considered. It is shown that the general idea behind methods for the regularization of ill-posed problems is the introduction into the equation under study of additional terms with a small positive regularization parameter. (in Russian)
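
    The Cauchy (initial-value) problem the abstract refers to is, in its simplest form, integrating the linear betatron equation x'' + k x = 0 from given initial conditions. A standard fourth-order Runge-Kutta integrator (illustrative, not the paper's code) does the job:

    ```python
    import math

    def rk4(f, y, t, h):
        # One classical fourth-order Runge-Kutta step for y' = f(t, y).
        k1 = f(t, y)
        k2 = f(t + h / 2, [a + h / 2 * b for a, b in zip(y, k1)])
        k3 = f(t + h / 2, [a + h / 2 * b for a, b in zip(y, k2)])
        k4 = f(t + h, [a + h * b for a, b in zip(y, k3)])
        return [a + h / 6 * (b + 2 * c + 2 * d + e)
                for a, b, c, d, e in zip(y, k1, k2, k3, k4)]

    def betatron(k):
        # State y = [x, x']; linear focusing gives x'' = -k x.
        return lambda t, y: [y[1], -k * y[0]]

    def integrate(k, x0, steps=1000, s_max=2 * math.pi):
        h, y, s = s_max / steps, [x0, 0.0], 0.0
        for _ in range(steps):
            y = rk4(betatron(k), y, s, h)
            s += h
        return y
    ```

    For k = 1 the exact solution is x(s) = x0 cos(s), so after one full period s = 2*pi the integrator should return to the initial displacement, a quick self-check of the scheme.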

  19. Computing tools for accelerator design calculations

    International Nuclear Information System (INIS)

    Fischler, M.; Nash, T.

    1984-01-01

    This note is intended as a brief summary guide for accelerator designers to the new generation of commercial and special processors that allow great increases in computing cost-effectiveness. New thinking is required to take best advantage of these computing opportunities, in particular when moving from analytical approaches to tracking simulations. In this paper, we outline the relevant considerations.

  20. Computer codes for beam dynamics analysis of cyclotronlike accelerators

    Science.gov (United States)

    Smirnov, V.

    2017-12-01

    Computer codes suitable for the study of beam dynamics in cyclotronlike (classical and isochronous cyclotrons, synchrocyclotrons, and fixed field alternating gradient) accelerators are reviewed. Computer modeling of cyclotron segments, such as the central zone, acceleration region, and extraction system is considered. The author does not claim to give a full and detailed description of the methods and algorithms used in the codes. Special attention is paid to the codes already proven and confirmed at the existing accelerating facilities. The description of the programs prepared in the worldwide known accelerator centers is provided. The basic features of the programs available to users and limitations of their applicability are described.

  1. Accelerator simulation using computers

    International Nuclear Information System (INIS)

    Lee, M.; Zambre, Y.; Corbett, W.

    1992-01-01

    Every accelerator or storage ring system consists of a charged particle beam propagating through a beam line. Although a number of computer programs exist that simulate the propagation of a beam in a given beam line, only a few provide the capabilities for designing, commissioning and operating the beam line. This paper shows how a "multi-track" simulation and analysis code can be used for these applications.

  2. The computer-based control system of the NAC accelerator

    International Nuclear Information System (INIS)

    Burdzik, G.F.; Bouckaert, R.F.A.; Cloete, I.; Du Toit, J.S.; Kohler, I.H.; Truter, J.N.J.; Visser, K.

    1982-01-01

    The National Accelerator Centre (NAC) of the CSIR is building a two-stage accelerator which will provide charged-particle beams for use in medical and research applications. The control system for this accelerator is based on three minicomputers and a CAMAC interfacing network. Closed-loop control is being relegated to the various subsystems of the accelerators, and the computers and CAMAC network will be used in the first instance for data transfer, monitoring and servicing of the control consoles. The processing power of the computers will be utilized for automating start-up and beam-change procedures, for providing flexible and convenient information at the control consoles, for fault diagnosis and for beam-optimizing procedures. Tasks of a localized or dedicated nature are being off-loaded onto microcomputers, which are being used either in front-end devices or as slaves to the minicomputers. On the control consoles only a few instruments for setting and monitoring variables are provided, but these instruments are universally linkable to any appropriate machine variable.

  3. Accelerating Climate and Weather Simulations through Hybrid Computing

    Science.gov (United States)

    Zhou, Shujia; Cruz, Carlos; Duffy, Daniel; Tucker, Robert; Purcell, Mark

    2011-01-01

    Unconventional multi- and many-core processors (e.g. IBM (R) Cell B.E.(TM) and NVIDIA (R) GPU) have emerged as effective accelerators in trial climate and weather simulations. Yet these climate and weather models typically run on parallel computers with conventional processors (e.g. Intel, AMD, and IBM) using Message Passing Interface. To address challenges involved in efficiently and easily connecting accelerators to parallel computers, we investigated using IBM's Dynamic Application Virtualization (TM) (IBM DAV) software in a prototype hybrid computing system with representative climate and weather model components. The hybrid system comprises two Intel blades and two IBM QS22 Cell B.E. blades, connected with both InfiniBand(R) (IB) and 1-Gigabit Ethernet. The system significantly accelerates a solar radiation model component by offloading compute-intensive calculations to the Cell blades. Systematic tests show that IBM DAV can seamlessly offload compute-intensive calculations from Intel blades to Cell B.E. blades in a scalable, load-balanced manner. However, noticeable communication overhead was observed, mainly due to IP over the IB protocol. Full utilization of IB Sockets Direct Protocol and the lower latency production version of IBM DAV will reduce this overhead.

  4. Strategic directions of computing at Fermilab

    Science.gov (United States)

    Wolbers, Stephen

    1998-05-01

    Fermilab computing has changed a great deal over the years, driven by the demands of the Fermilab experimental community to record and analyze larger and larger datasets, by the desire to take advantage of advances in computing hardware and software, and by the advances coming from the R&D efforts of the Fermilab Computing Division. The strategic directions of Fermilab Computing continue to be driven by the needs of the experimental program. The current fixed-target run will produce over 100 TBytes of raw data and systems must be in place to allow the timely analysis of the data. The collider run II, beginning in 1999, is projected to produce of order 1 PByte of data per year. There will be a major change in methodology and software language as the experiments move away from FORTRAN and into object-oriented languages. Increased use of automation and the reduction of operator-assisted tape mounts will be required to meet the needs of the large experiments and large data sets. Work will continue on higher-rate data acquisition systems for future experiments and projects. R&D projects will be pursued as necessary to provide software, tools, or systems which cannot be purchased or acquired elsewhere. A closer working relation with other high energy laboratories will be pursued to reduce duplication of effort and to allow effective collaboration on many aspects of HEP computing.

  5. FPGA-accelerated simulation of computer systems

    CERN Document Server

    Angepat, Hari; Chung, Eric S; Hoe, James C; Chung, Eric S

    2014-01-01

    To date, the most common form of simulators of computer systems are software-based running on standard computers. One promising approach to improve simulation performance is to apply hardware, specifically reconfigurable hardware in the form of field programmable gate arrays (FPGAs). This manuscript describes various approaches of using FPGAs to accelerate software-implemented simulation of computer systems and selected simulators that incorporate those techniques. More precisely, we describe a simulation architecture taxonomy that incorporates a simulation architecture specifically designed f

  6. Computer programs in accelerator physics

    International Nuclear Information System (INIS)

    Keil, E.

    1984-01-01

    Three areas of accelerator physics are discussed in which computer programs have been applied with much success: i) single-particle beam dynamics in circular machines, i.e. the design and matching of machine lattices; ii) computations of electromagnetic fields in RF cavities and similar objects, useful for the design of RF cavities and for the calculation of wake fields; iii) simulation of betatron and synchrotron oscillations in a machine with non-linear elements, e.g. sextupoles, and of bunch lengthening due to longitudinal wake fields. (orig.)
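
    Area (iii) above, tracking through non-linear elements such as sextupoles, is commonly done with a one-turn map: a linear rotation for the betatron phase advance followed by a thin sextupole kick. A minimal sketch under those standard conventions (illustrative; names are hypothetical):

    ```python
    import math

    def turn(x, xp, mu, k2):
        # Linear one-turn map: rotation in (x, x') by phase advance mu.
        c, s = math.cos(mu), math.sin(mu)
        x, xp = c * x + s * xp, -s * x + c * xp
        # Thin sextupole kick: x' receives -(1/2) k2 x^2.
        xp -= 0.5 * k2 * x * x
        return x, xp

    def track(x, xp, mu, k2, turns):
        # Iterate the one-turn map; with k2 = 0 the motion is a pure
        # rotation and the amplitude x^2 + x'^2 is conserved.
        for _ in range(turns):
            x, xp = turn(x, xp, mu, k2)
        return x, xp
    ```

    Turning on k2 makes the tune amplitude-dependent and, for large amplitudes, drives the chaotic motion that such simulation programs are built to study.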

  7. Strategic directions of computing at Fermilab

    International Nuclear Information System (INIS)

    Wolbers, S.

    1997-04-01

    Fermilab computing has changed a great deal over the years, driven by the demands of the Fermilab experimental community to record and analyze larger and larger datasets, by the desire to take advantage of advances in computing hardware and software, and by the advances coming from the R&D efforts of the Fermilab Computing Division. The strategic directions of Fermilab Computing continue to be driven by the needs of the experimental program. The current fixed-target run will produce over 100 TBytes of raw data and systems must be in place to allow the timely analysis of the data. The collider run II, beginning in 1999, is projected to produce of order 1 PByte of data per year. There will be a major change in methodology and software language as the experiments move away from FORTRAN and into object-oriented languages. Increased use of automation and the reduction of operator-assisted tape mounts will be required to meet the needs of the large experiments and large data sets. Work will continue on higher-rate data acquisition systems for future experiments and projects. R&D projects will be pursued as necessary to provide software, tools, or systems which cannot be purchased or acquired elsewhere. A closer working relation with other high energy laboratories will be pursued to reduce duplication of effort and to allow effective collaboration on many aspects of HEP computing.

  8. Computer simulations of compact toroid formation and acceleration

    International Nuclear Information System (INIS)

    Peterkin, R.E. Jr.; Sovinec, C.R.

    1990-01-01

    Experiments to form, accelerate, and focus compact toroid plasmas will be performed on the 9.4 MJ SHIVA STAR fast capacitor bank at the Air Force Weapons Laboratory during the 1990s. The MARAUDER (magnetically accelerated rings to achieve ultrahigh directed energy and radiation) program is a research effort to accelerate magnetized plasma rings with masses between 0.1 and 1.0 mg to velocities above 10⁸ cm/sec and energies above 1 MJ. Research on these high-velocity compact toroids may lead to the development of very fast opening switches, high-power microwave sources, and an alternative path to inertial confinement fusion. Design of a compact toroid accelerator experiment on the SHIVA STAR capacitor bank is underway, and computer simulations with the 2 1/2-dimensional magnetohydrodynamics code MACH2 have been performed to guide this endeavor. The compact toroids are produced in a magnetized coaxial plasma gun, and the acceleration will occur in a configuration similar to a coaxial railgun. Detailed calculations of the formation and equilibration of a low-beta, force-free magnetic configuration (curl B = kB) have been performed with MACH2. In this paper, the authors discuss computer simulations of the focusing and acceleration of the toroid.

  9. Accelerator requirements for strategic defense

    International Nuclear Information System (INIS)

    Gullickson, R.L.

    1987-01-01

    The authors discuss how directed-energy applications require accelerators with high brightness and large gradients to minimize size and weight for space systems. Several major directed-energy applications are based upon accelerator technology. The radio-frequency linear accelerator is the basis for both space-based neutral particle beam (NPB) and free electron laser (FEL) devices. The high peak current of the induction linac has made it a leading candidate for ground-based free electron laser applications.

  10. US DOE Grand Challenge in Computational Accelerator Physics

    International Nuclear Information System (INIS)

    Ryne, R.; Habib, S.; Qiang, J.; Ko, K.; Li, Z.; McCandless, B.; Mi, W.; Ng, C.; Saparov, M.; Srinivas, V.; Sun, Y.; Zhan, X.; Decyk, V.; Golub, G.

    1998-01-01

    Particle accelerators are playing an increasingly important role in basic and applied science, and are enabling new accelerator-driven technologies. But the design of next-generation accelerators, such as linear colliders and high-intensity linacs, will require a major advance in numerical modeling capability due to extremely stringent beam control and beam loss requirements, and the presence of highly complex three-dimensional accelerator components. To address this situation, the U.S. Department of Energy has approved a ''Grand Challenge'' in Computational Accelerator Physics, whose primary goal is to develop a parallel modeling capability that will enable high-performance, large-scale simulations for the design, optimization, and numerical validation of next-generation accelerators. In this paper we report on the status of the Grand Challenge.

  11. Symbolic mathematical computing: orbital dynamics and application to accelerators

    International Nuclear Information System (INIS)

    Fateman, R.

    1986-01-01

    Computer-assisted symbolic mathematical computation has become increasingly useful in applied mathematics. A brief introduction to such capabilities and some examples related to orbital dynamics and accelerator physics are presented. (author)

  12. Strategic engineering for cloud computing and big data analytics

    CERN Document Server

    Ramachandran, Muthu; Sarwar, Dilshad

    2017-01-01

    This book demonstrates the use of a wide range of strategic engineering concepts, theories and applied case studies to improve the safety, security and sustainability of complex and large-scale engineering and computer systems. It first details the concepts of system design, life cycle, impact assessment and security to show how these ideas can be brought to bear on the modeling, analysis and design of information systems with a focused view on cloud-computing systems and big data analytics. This informative book is a valuable resource for graduate students, researchers and industry-based practitioners working in engineering, information and business systems as well as strategy.

  13. Computer-based training for particle accelerator personnel

    International Nuclear Information System (INIS)

    Silbar, R.R.

    1999-01-01

    A continuing problem at many laboratories is the training of new operators in the arcane technology of particle accelerators. Presently most of this training occurs on the job, under a mentor. Such training is expensive, and while it provides operational experience, it is frequently lax in providing the physics background needed to truly understand accelerator systems. Using computers in a self-paced, interactive environment can be more effective in meeting this training need. copyright 1999 American Institute of Physics

  14. Computational needs for the RIA accelerator systems

    International Nuclear Information System (INIS)

    Ostroumov, P.N.; Nolen, J.A.; Mustapha, B.

    2006-01-01

    This paper discusses the computational needs for the full design and simulation of the RIA accelerator systems. Beam dynamics simulations are essential to first define and optimize the architectural design for both the driver linac and the post-accelerator. They are also important for studying different design options and various off-normal modes in order to decide on the best-performing and most cost-effective design. Due to the high-intensity primary beams, the beam-stripper interaction is a source of both radioactivation and beam contamination and should be carefully investigated and simulated for proper beam collimation and shielding. The target and fragment-separator areas also need very special attention in order to reduce any radiological hazards through careful shielding design. For all these simulations, parallel computing is an absolute necessity.

  15. Strategic flexibility in computational estimation for Chinese- and Canadian-educated adults.

    Science.gov (United States)

    Xu, Chang; Wells, Emma; LeFevre, Jo-Anne; Imbo, Ineke

    2014-09-01

    The purpose of the present study was to examine factors that influence strategic flexibility in computational estimation for Chinese- and Canadian-educated adults. Strategic flexibility was operationalized as the percentage of trials on which participants chose the problem-based procedure that best balanced proximity to the correct answer with simplification of the required calculation. For example, on 42 × 57, the optimal problem-based solution is 40 × 60 because 2,400 is closer to the exact answer 2,394 than is 40 × 50 or 50 × 60. In Experiment 1 (n = 50), where participants had free choice of estimation procedures, Chinese-educated participants were more likely to choose the optimal problem-based procedure (80% of trials) than Canadian-educated participants (50%). In Experiment 2 (n = 48), participants had to choose 1 of 3 solution procedures. They showed moderate strategic flexibility that was equal across groups (60%). In Experiment 3 (n = 50), participants were given the same 3 procedure choices as in Experiment 2 but different instructions and explicit feedback. When instructed to respond quickly, both groups showed moderate strategic flexibility as in Experiment 2 (60%). When instructed to respond as accurately as possible or to balance speed and accuracy, they showed very high strategic flexibility (greater than 90%). These findings suggest that solvers will show very different levels of strategic flexibility in response to instructions, feedback, and problem characteristics and that these factors interact with individual differences (e.g., arithmetic skills, nationality) to produce variable response patterns.
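
    The optimal problem-based procedure described above can be made concrete: round each operand up or down to the nearest ten, and keep the candidate whose product is closest to the exact answer. A minimal sketch (the function name and structure are ours, for illustration only):

```python
def best_estimate(a, b):
    """Return ((ra, rb), ra*rb): the rounding of each operand to a
    neighbouring multiple of ten whose product is closest to a*b."""
    def down(n):
        return (n // 10) * 10
    def up(n):
        return down(n) + 10 if n % 10 else n
    exact = a * b
    pairs = [(ra, rb) for ra in {down(a), up(a)} for rb in {down(b), up(b)}]
    best = min(pairs, key=lambda p: abs(p[0] * p[1] - exact))
    return best, best[0] * best[1]
```

    On 42 × 57 this returns ((40, 60), 2400), the choice counted as optimal in the study, since 2,400 is nearer to 2,394 than 2,000, 2,500, or 3,000.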

  16. Computer codes for particle accelerator design and analysis: A compendium. Second edition

    International Nuclear Information System (INIS)

    Deaven, H.S.; Chan, K.C.D.

    1990-05-01

    The design of the next generation of high-energy accelerators will probably be done as an international collaborative effort, and it would make sense to establish, either formally or informally, an international center for accelerator codes with branches for maintenance, distribution, and consultation at strategically located accelerator centers around the world. This arrangement could have at least three beneficial effects: it would cut down duplication of effort, provide long-term support for the best codes, and provide a stimulating atmosphere for the evolution of new codes. It does not take much foresight to see that the natural evolution of accelerator design codes is toward the development of so-called expert systems, systems capable of taking design specifications of future accelerators and producing specifications for optimized magnetic transport and acceleration components, making a layout, and giving a fairly impartial cost estimate. Such an expert program would use present-day programs such as TRANSPORT, POISSON, and SUPERFISH as tools in the optimization process. Such a program would also serve to codify the experience of two generations of accelerator designers before it is lost as these designers reach retirement age. This document describes 203 codes that originate from 10 countries and are currently in use. The authors feel that this compendium will contribute to the dialogue supporting the international collaborative effort that is taking place in the field of accelerator physics today.
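
    Codes such as TRANSPORT model a beamline by multiplying per-element transfer matrices. A minimal sketch of that idea in the horizontal plane, using the thin-lens approximation (illustrative only, not taken from any code in the compendium):

```python
import numpy as np

def drift(L):
    """Transfer matrix of a field-free drift of length L, acting on (x, x')."""
    return np.array([[1.0, L], [0.0, 1.0]])

def thin_quad(f):
    """Thin-lens quadrupole of focal length f; f > 0 focuses horizontally."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# A beamline is the product of its element matrices, last element leftmost.
line = drift(2.0) @ thin_quad(2.0)

# A ray entering parallel to the axis at x = 1 crosses the axis at the focus.
x_out, xp_out = line @ np.array([1.0, 0.0])
```

    Each matrix has unit determinant, so any product of them does too; this symplectic property is one of the standard sanity checks such codes rely on.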

  17. COMPASS, the COMmunity Petascale project for Accelerator Science and Simulation, a broad computational accelerator physics initiative

    International Nuclear Information System (INIS)

    Cary, J.R.; Spentzouris, P.; Amundson, J.; McInnes, L.; Borland, M.; Mustapha, B.; Ostroumov, P.; Wang, Y.; Fischer, W.; Fedotov, A.; Ben-Zvi, I.; Ryne, R.; Esarey, E.; Geddes, C.; Qiang, J.; Ng, E.; Li, S.; Ng, C.; Lee, R.; Merminga, L.; Wang, H.; Bruhwiler, D.L.; Dechow, D.; Mullowney, P.; Messmer, P.; Nieter, C.; Ovtchinnikov, S.; Paul, K.; Stoltz, P.; Wade-Stein, D.; Mori, W.B.; Decyk, V.; Huang, C.K.; Lu, W.; Tzoufras, M.; Tsung, F.; Zhou, M.; Werner, G.R.; Antonsen, T.; Katsouleas, T.; Morris, B.

    2007-01-01

    Accelerators are the largest and most costly scientific instruments of the Department of Energy, with uses across a broad range of science, including colliders for particle physics and nuclear science and light sources and neutron sources for materials studies. COMPASS, the Community Petascale Project for Accelerator Science and Simulation, is a broad, four-office (HEP, NP, BES, ASCR) effort to develop computational tools for the prediction and performance enhancement of accelerators. The tools being developed can be used to predict the dynamics of beams in the presence of optical elements and space-charge forces, to calculate the electromagnetic modes and wake fields of cavities, and to model the cooling induced by comoving beams and the acceleration of beams by intense fields in plasmas generated by beams or lasers. In SciDAC-1, the computational tools had multiple successes in predicting the dynamics of beams and beam generation. In SciDAC-2 these tools will be petascale-enabled to allow the inclusion of an unprecedented level of physics for detailed prediction.

  18. GPU-accelerated computation of electron transfer.

    Science.gov (United States)

    Höfinger, Siegfried; Acocella, Angela; Pop, Sergiu C; Narumi, Tetsu; Yasuoka, Kenji; Beu, Titus; Zerbetto, Francesco

    2012-11-05

    Electron transfer is a fundamental process that can be studied with the help of computer simulation. The underlying quantum mechanical description renders the problem a computationally intensive application. In this study, we probe the graphics processing unit (GPU) for suitability to this type of problem. Time-critical components are identified via profiling of an existing implementation and several different variants are tested involving the GPU at increasing levels of abstraction. A publicly available library supporting basic linear algebra operations on the GPU turns out to accelerate the computation approximately 50-fold with minor dependence on actual problem size. The performance gain does not compromise numerical accuracy and is of significant value for practical purposes. Copyright © 2012 Wiley Periodicals, Inc.

  19. Quantum computing accelerator I/O : LDRD 52750 final report

    International Nuclear Information System (INIS)

    Schroeppel, Richard Crabtree; Modine, Normand Arthur; Ganti, Anand; Pierson, Lyndon George; Tigges, Christopher P.

    2003-01-01

    In a superposition of quantum states, a bit can be in both the states '0' and '1' at the same time. This feature of the quantum bit, or qubit, has no parallel in classical systems. Currently, quantum computers consisting of 4 to 7 qubits in a 'quantum computing register' have been built. Innovative algorithms suited to quantum computing are now beginning to emerge, applicable to sorting, cryptanalysis, and other applications. A framework for overcoming slightly inaccurate quantum gate interactions and for causing quantum states to survive interactions with the surrounding environment is emerging, called quantum error correction. Thus there is the potential for rapid advances in this field. Although quantum information processing can be applied to secure communication links (quantum cryptography) and to crack conventional cryptosystems, the first few computing applications will likely involve a 'quantum computing accelerator' similar to a 'floating point arithmetic accelerator' interfaced to a conventional von Neumann computer architecture. This research is to develop a roadmap for applying Sandia's capabilities to the solution of some of the problems associated with maintaining quantum information, and with getting data into and out of such a 'quantum computing accelerator'. We propose to focus this work on 'quantum I/O technologies' by applying quantum optics on semiconductor nanostructures to leverage Sandia's expertise in semiconductor microelectronic/photonic fabrication techniques, as well as its expertise in information theory, processing, and algorithms. The work will be guided by an understanding of the practical requirements of computing and communication architectures. This effort will incorporate ongoing collaboration between organizations 9000, 6000, and 1000 and between junior and senior personnel. Follow-on work to fabricate and evaluate appropriate experimental nano/microstructures will be proposed as a result of this work.
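
    The superposition property described above is easy to state numerically: a qubit is a normalized complex 2-vector, and a Hadamard gate sends |0⟩ to an equal superposition of |0⟩ and |1⟩. A small illustrative sketch (a textbook state-vector calculation, not connected to the hardware proposals in the report):

```python
import numpy as np

ket0 = np.array([1.0, 0.0], dtype=complex)   # the basis state |0>

# Hadamard gate: maps |0> to (|0> + |1>) / sqrt(2).
H = np.array([[1.0, 1.0],
              [1.0, -1.0]], dtype=complex) / np.sqrt(2.0)

state = H @ ket0                  # an equal superposition of |0> and |1>
probs = np.abs(state) ** 2        # measurement probabilities for 0 and 1
```

    Measuring this state yields 0 or 1 with probability 1/2 each, and applying H a second time returns the qubit deterministically to |0⟩, an interference effect with no classical counterpart.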

  20. Mathematical model of accelerator output characteristics and their calculation on a computer

    International Nuclear Information System (INIS)

    Mishulina, O.A.; Ul'yanina, M.N.; Kornilova, T.V.

    1975-01-01

    A mathematical model is described of the output characteristics of a linear accelerator. The model is a system of differential equations. The presence of phase limitations is a specific feature of the problem setting which makes it possible to ensure higher simulation accuracy and to determine a capture coefficient. An algorithm is elaborated for computing the output characteristics based upon the suggested mathematical model. The output characteristics of the accelerator are the capture coefficient, the coordinate expectation characterizing the average phase of the beam particles, the coordinate expectation characterizing the average inverse relative velocity of the beam particles, and the dispersions of these coordinates. Calculation methods for the accelerator output characteristics are described in detail. The computations have been performed on the BESM-6 computer, the characteristics computing time being 2 min 20 sec. The relative error of parameter computation averages 10⁻².

  1. The Extrapolation-Accelerated Multilevel Aggregation Method in PageRank Computation

    Directory of Open Access Journals (Sweden)

    Bing-Yuan Pu

    2013-01-01

    An accelerated multilevel aggregation method is presented for calculating the stationary probability vector of an irreducible stochastic matrix in PageRank computation, where the vector extrapolation method is its accelerator. We show how to periodically combine the extrapolation method with the multilevel aggregation method on the finest level to speed up the PageRank computation. Detailed numerical results are given to illustrate the behavior of this method, and comparisons with typical methods are also made.
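
    The baseline that such multilevel and extrapolation methods accelerate is plain power iteration on the Google matrix. A minimal sketch of that baseline (damping factor and tolerance are illustrative defaults; the paper's aggregation scheme itself is not reproduced here):

```python
import numpy as np

def pagerank(A, alpha=0.85, tol=1e-12, max_iter=1000):
    """Power iteration for the stationary vector of the Google matrix
    G = alpha*A + (1 - alpha)/n * ones, where A is column-stochastic."""
    n = A.shape[0]
    x = np.full(n, 1.0 / n)                    # start from the uniform vector
    for _ in range(max_iter):
        x_new = alpha * (A @ x) + (1.0 - alpha) / n
        if np.abs(x_new - x).sum() < tol:      # L1 convergence test
            return x_new
        x = x_new
    return x
```

    Convergence of this iteration slows as alpha approaches 1, which is precisely the regime where aggregation and extrapolation accelerators pay off.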

  2. Strategic Plan for a Scientific Cloud Computing infrastructure for Europe

    CERN Document Server

    Lengert, Maryline

    2011-01-01

    Here we present the vision, concept, and direction for forming a European Industrial Strategy for a Scientific Cloud Computing Infrastructure to be implemented by 2020. This will be the framework for decisions and for securing support and approval in establishing, initially, an R&D European Cloud Computing Infrastructure that serves the needs of the European Research Area (ERA) and space agencies. This Cloud Infrastructure will have the potential, beyond this initial user base, to evolve to provide similar services to a broad range of customers, including government and SMEs. We explain how this plan aims to support the broader strategic goals of our organisations and identify the benefits to be realised by adopting an industrial Cloud Computing model. We also outline the prerequisites and commitment needed to achieve these objectives.

  3. Fast acceleration of 2D wave propagation simulations using modern computational accelerators.

    Directory of Open Access Journals (Sweden)

    Wei Wang

    Recent developments in modern computational accelerators like graphics processing units (GPUs) and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as an Intel Many Integrated Core (MIC) coprocessor to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved more than a 150x speedup over the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation provided speedups on GPUs of at least 200x over the sequential implementation and 30x over a parallelized OpenMP implementation. An implementation of OpenMP on the Intel MIC coprocessor provided speedups of 120x with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieving parallelization of 2D cardiac wave simulations on GPUs. Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other
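
    The computational kernel in such wave-propagation codes is a stencil update applied to every grid point each time step; it is exactly this loop nest that the OpenACC pragmas target. A minimal sketch of the stencil for the plain 2D wave equation, vectorized in NumPy rather than OpenACC (the cardiac model adds reaction terms not shown here; the step sizes are illustrative):

```python
import numpy as np

def step(u, u_prev, c=1.0, dt=0.1, dx=1.0):
    """One leapfrog step of u_tt = c^2 (u_xx + u_yy) with fixed zero
    boundaries, using the 5-point Laplacian stencil on interior points."""
    lap = np.zeros_like(u)
    lap[1:-1, 1:-1] = (u[2:, 1:-1] + u[:-2, 1:-1] +
                       u[1:-1, 2:] + u[1:-1, :-2] -
                       4.0 * u[1:-1, 1:-1]) / dx**2
    return 2.0 * u - u_prev + (c * dt) ** 2 * lap
```

    Every grid point is updated independently from the previous time level, which is why the kernel parallelizes so well under OpenACC, OpenCL, and OpenMP alike.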

  4. The impact of new computer technology on accelerator control

    International Nuclear Information System (INIS)

    Theil, E.; Jacobson, V.; Paxson, V.

    1987-01-01

    This paper describes some recent developments in computing and stresses their application to accelerator control systems. Among the advances that promise to have a significant impact are: i) low-cost scientific workstations; ii) the use of ''windows'', pointing devices, and menus in a multitasking operating system; iii) high-resolution large-screen graphics monitors; iv) new kinds of high-bandwidth local area networks. The relevant features are related to a general accelerator control system. For example, the authors examine the implications of a computing environment which permits and encourages graphical manipulation of system components, rather than traditional access through the writing of programs or ''canned'' access via touch panels.

  5. The impact of new computer technology on accelerator control

    International Nuclear Information System (INIS)

    Theil, E.; Jacobson, V.; Paxson, V.

    1987-04-01

    This paper describes some recent developments in computing and stresses their application in accelerator control systems. Among the advances that promise to have a significant impact are (1) low-cost scientific workstations; (2) the use of ''windows'', pointing devices, and menus in a multi-tasking operating system; (3) high-resolution large-screen graphics monitors; (4) new kinds of high-bandwidth local area networks. The relevant features are related to a general accelerator control system. For example, this paper examines the implications of a computing environment which permits and encourages graphical manipulation of system components, rather than traditional access through the writing of programs or ''canned'' access via touch panels.

  6. Advanced Computing for 21st Century Accelerator Science and Technology

    International Nuclear Information System (INIS)

    Dragt, Alex J.

    2004-01-01

    Dr. Dragt of the University of Maryland is one of the Institutional Principal Investigators for the SciDAC Accelerator Modeling Project Advanced Computing for 21st Century Accelerator Science and Technology whose principal investigators are Dr. Kwok Ko (Stanford Linear Accelerator Center) and Dr. Robert Ryne (Lawrence Berkeley National Laboratory). This report covers the activities of Dr. Dragt while at Berkeley during spring 2002 and at Maryland during fall 2003

  7. Lasers and particle beam for fusion and strategic defense

    International Nuclear Information System (INIS)

    Anon.

    1986-01-01

    This special issue of the Journal of Fusion Energy consists of the edited transcripts of a symposium on the applications of laser and particle beams to fusion and strategic defense. Its eleven papers discuss these topics: the Strategic Defense Initiative; accelerators for heavy ion fusion; rf accelerators for fusion and strategic defense; pulsed power, ICF, and the Strategic Defense Initiative; chemical lasers; the feasibility of KrF lasers for fusion; the damage resistance of coated optics; liquid crystal devices for laser systems; fusion neutral-particle beam research and its contribution to the Star Wars program; and induction linacs and free electron laser amplifiers for ICF devices and directed-energy weapons.

  8. Present SLAC accelerator computer control system features

    International Nuclear Information System (INIS)

    Davidson, V.; Johnson, R.

    1981-02-01

    The current functional organization and state of software development of the computer control system of the Stanford Linear Accelerator is described. Included is a discussion of the distribution of functions throughout the system, the local controller features, and currently implemented features of the touch panel portion of the system. The functional use of our triplex of PDP11-34 computers sharing common memory is described. Also included is a description of the use of pseudopanel tables as data tables for closed-loop control functions.

  9. GPU acceleration of Dock6's Amber scoring computation.

    Science.gov (United States)

    Yang, Hailong; Zhou, Qiongqiong; Li, Bo; Wang, Yongjian; Luan, Zhongzhi; Qian, Depei; Li, Hanlu

    2010-01-01

    Addressing the problem of virtual screening is a long-term goal in the drug discovery field which, if properly solved, can significantly shorten new drugs' R&D cycle. The scoring functionality that evaluates the fitness of the docking result is one of the major challenges in virtual screening. In general, scoring functionality in docking requires a large amount of floating-point calculation, which usually takes several weeks or even months to finish. This time-consuming procedure is unacceptable, especially when a highly fatal and infectious virus such as SARS or H1N1 arises, which forces the scoring task to be done in a limited time. This paper presents how to leverage the computational power of the GPU to accelerate Dock6's (http://dock.compbio.ucsf.edu/DOCK_6/) Amber (J. Comput. Chem. 25: 1157-1174, 2004) scoring with the NVIDIA CUDA (NVIDIA Corporation Technical Staff, Compute Unified Device Architecture - Programming Guide, NVIDIA Corporation, 2008) platform. We also discuss many factors that greatly influence the performance after porting the Amber scoring to the GPU, including thread management, data transfer, and divergence hiding. Our experiments show that the GPU-accelerated Amber scoring achieves a 6.5× speedup with respect to the original version running on an AMD dual-core CPU for the same problem size. This acceleration makes the Amber scoring more competitive and efficient for large-scale virtual screening problems.

  10. Computer networks in future accelerator control systems

    International Nuclear Information System (INIS)

    Dimmler, D.G.

    1977-03-01

    Some findings of a study concerning a computer-based control and monitoring system for the proposed ISABELLE Intersecting Storage Accelerator are presented. Requirements for the development and implementation of such a system are discussed. An architecture is proposed in which the system components are partitioned along functional lines. The implementation of some conceptually significant components is reviewed.

  11. ''Accelerators and Beams,'' multimedia computer-based training in accelerator physics

    International Nuclear Information System (INIS)

    Silbar, R. R.; Browman, A. A.; Mead, W. C.; Williams, R. A.

    1999-01-01

    We are developing a set of computer-based tutorials on accelerators and charged-particle beams under an SBIR grant from the DOE. These self-paced, interactive tutorials, available for Macintosh and Windows platforms, use multimedia techniques to enhance the user's rate of learning and length of retention of the material. They integrate interactive ''On-Screen Laboratories,'' hypertext, line drawings, photographs, two- and three-dimensional animations, video, and sound. They target a broad audience, from undergraduates or technicians to professionals. Presently, three modules have been published (Vectors, Forces, and Motion), a fourth (Dipole Magnets) has been submitted for review, and three more exist in prototype form (Quadrupoles, Matrix Transport, and Properties of Charged-Particle Beams). Participants in the poster session will have the opportunity to try out these modules on a laptop computer.

  12. Multi-GPU Jacobian accelerated computing for soft-field tomography

    International Nuclear Information System (INIS)

    Borsic, A; Attardo, E A; Halter, R J

    2012-01-01

    Image reconstruction in soft-field tomography is based on an inverse problem formulation, where a forward model is fitted to the data. In medical applications, where the anatomy presents complex shapes, it is common to use finite element models (FEMs) to represent the volume of interest and solve a partial differential equation that models the physics of the system. Over the last decade, there has been a shifting interest from 2D modeling to 3D modeling, as the underlying physics of most problems are 3D. Although the increased computational power of modern computers allows working with much larger FEM models, the computational time required to reconstruct 3D images on a fine 3D FEM model can be significant, on the order of hours. For example, in electrical impedance tomography (EIT) applications using a dense 3D FEM mesh with half a million elements, a single reconstruction iteration takes approximately 15–20 min with optimized routines running on a modern multi-core PC. It is desirable to accelerate image reconstruction to enable researchers to more easily and rapidly explore data and reconstruction parameters. Furthermore, providing high-speed reconstructions is essential for some promising clinical applications of EIT. For 3D problems, 70% of the computing time is spent building the Jacobian matrix, and 25% of the time in forward solving. In this work, we focus on accelerating the Jacobian computation by using single and multiple GPUs. First, we discuss an optimized implementation on a modern multi-core PC architecture and show how computing time is bounded by the CPU-to-memory bandwidth; this factor limits the rate at which data can be fetched by the CPU. Gains associated with the use of multiple CPU cores are minimal, since data operands cannot be fetched fast enough to saturate the processing power of even a single CPU core. GPUs have much faster memory bandwidths compared to CPUs and better parallelism. We are able to obtain acceleration factors of 20 times

  13. Multi-GPU Jacobian accelerated computing for soft-field tomography.

    Science.gov (United States)

    Borsic, A; Attardo, E A; Halter, R J

    2012-10-01

    Image reconstruction in soft-field tomography is based on an inverse problem formulation, where a forward model is fitted to the data. In medical applications, where the anatomy presents complex shapes, it is common to use finite element models (FEMs) to represent the volume of interest and solve a partial differential equation that models the physics of the system. Over the last decade, there has been a shifting interest from 2D modeling to 3D modeling, as the underlying physics of most problems are 3D. Although the increased computational power of modern computers allows working with much larger FEM models, the computational time required to reconstruct 3D images on a fine 3D FEM model can be significant, on the order of hours. For example, in electrical impedance tomography (EIT) applications using a dense 3D FEM mesh with half a million elements, a single reconstruction iteration takes approximately 15-20 min with optimized routines running on a modern multi-core PC. It is desirable to accelerate image reconstruction to enable researchers to more easily and rapidly explore data and reconstruction parameters. Furthermore, providing high-speed reconstructions is essential for some promising clinical applications of EIT. For 3D problems, 70% of the computing time is spent building the Jacobian matrix, and 25% of the time in forward solving. In this work, we focus on accelerating the Jacobian computation by using single and multiple GPUs. First, we discuss an optimized implementation on a modern multi-core PC architecture and show how computing time is bounded by the CPU-to-memory bandwidth; this factor limits the rate at which data can be fetched by the CPU. Gains associated with the use of multiple CPU cores are minimal, since data operands cannot be fetched fast enough to saturate the processing power of even a single CPU core. GPUs have much faster memory bandwidths compared to CPUs and better parallelism. We are able to obtain acceleration factors of 20

  14. Computer codes used in particle accelerator design: First edition

    International Nuclear Information System (INIS)

    1987-01-01

    This paper contains a listing of more than 150 programs that have been used in the design and analysis of accelerators. Given in each citation are the person to contact, the classification of the computer code, publications describing the code, the computer and language the code runs on, and a short description of the code. Codes are indexed by subject, person to contact, and code acronym.

  15. Accelerators and Beams, multimedia computer-based training in accelerator physics

    International Nuclear Information System (INIS)

    Silbar, R.R.; Browman, A.A.; Mead, W.C.; Williams, R.A.

    1999-01-01

    We are developing a set of computer-based tutorials on accelerators and charged-particle beams under an SBIR grant from the DOE. These self-paced, interactive tutorials, available for Macintosh and Windows platforms, use multimedia techniques to enhance the user's rate of learning and length of retention of the material. They integrate interactive On-Screen Laboratories, hypertext, line drawings, photographs, two- and three-dimensional animations, video, and sound. They target a broad audience, from undergraduates or technicians to professionals. Presently, three modules have been published (Vectors, Forces, and Motion), a fourth (Dipole Magnets) has been submitted for review, and three more exist in prototype form (Quadrupoles, Matrix Transport, and Properties of Charged-Particle Beams). Participants in the poster session will have the opportunity to try out these modules on a laptop computer. copyright 1999 American Institute of Physics

  16. Computing requirements for S.S.C. accelerator design and studies

    International Nuclear Information System (INIS)

    Dragt, A.; Talman, R.; Siemann, R.; Dell, G.F.; Leemann, B.; Leemann, C.; Nauenberg, U.; Peggs, S.; Douglas, D.

    1984-01-01

    We estimate the computational hardware resources that will be required for accelerator physics studies during the design of the Superconducting SuperCollider. It is found that both Class IV and Class VI facilities (1) will be necessary. We describe a user environment for these facilities that is desirable within the context of accelerator studies. An acquisition scenario for these facilities is presented

  17. Personal computer control system for small size tandem accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Takayama, Hiroshi; Kawano, Kazuhiro; Shinozaki, Masataka [Nissin - High Voltage Co. Ltd., Kyoto (Japan)

    1996-12-01

    Because an analysis apparatus built around a tandem accelerator has many control parameters, a conventional control panel carries so many controls that it becomes complex and hard to operate. To remedy these faults, a control system based on a personal computer was designed and developed to replace the control panel previously built from conventional hardware parts. Its main advantages are as follows: (1) the control panel is simpler and more compact, because using a personal computer as the man-machine interface reduces the hardware on the panel surface to a minimum; (2) control is faster, because the accelerator system is divided into blocks, a local station of the sequencer network is installed at each block, and sequence control is closed within each block; (3) expandability is greater, because a new beamline can be added by connecting another sequencer local station to the network and updating the computer's display image, with little change to the existing hardware; and (4) the control system is cheaper, because a personal computer means a smaller investment and easier programming. (G.K.)

  18. Strategic Cognitive Sequencing: A Computational Cognitive Neuroscience Approach

    Directory of Open Access Journals (Sweden)

    Seth A. Herd

    2013-01-01

    We address strategic cognitive sequencing, the “outer loop” of human cognition: how the brain decides what cognitive process to apply at a given moment to solve complex, multistep cognitive tasks. We argue that this topic has been neglected relative to its importance for systematic reasons but that recent work on how individual brain systems accomplish their computations has set the stage for productively addressing how brain regions coordinate over time to accomplish our most impressive thinking. We present four preliminary neural network models. The first addresses how the prefrontal cortex (PFC) and basal ganglia (BG) cooperate to perform trial-and-error learning of short sequences; the next, how several areas of PFC learn to make predictions of likely reward, and how this contributes to the BG making decisions at the level of strategies. The third model addresses how PFC, BG, parietal cortex, and hippocampus can work together to memorize sequences of cognitive actions from instruction (or “self-instruction”). The last shows how a constraint satisfaction process can find useful plans. The PFC maintains current and goal states and associates from both of these to find a “bridging” state, an abstract plan. We discuss how these processes could work together to produce strategic cognitive sequencing and discuss future directions in this area.

  19. A contribution to the computation of the impedance in acceleration resonators

    International Nuclear Information System (INIS)

    Liu, Cong

    2016-05-01

    This thesis focuses on the numerical computation of the impedance in acceleration resonators and corresponding components. For this purpose, a dedicated solver based on the Finite Element Method (FEM) has been developed to compute the broadband impedance of accelerating components. In addition, various numerical approaches have been used to calculate the narrow-band impedance of superconducting radio frequency (RF) cavities. An overview of the calculated results and a comparison of the applied numerical approaches are provided. During the design phase of superconducting RF accelerating cavities and components, a challenging and difficult task is the determination of the impedance inside the accelerator with the help of proper computer simulations. Impedance describes the electromagnetic interaction between the particle beam and the accelerator; it can affect the stability of the particle beam. For a superconducting RF accelerating cavity with waveguides (beam pipes and couplers), the narrow-band impedance, also called the shunt impedance, corresponds to the eigenmodes of the cavity. It depends on the eigenfrequencies and the electromagnetic field distributions of the eigenmodes inside the cavity. On the other hand, the broadband impedance describes the interaction of the particle beam in the waveguides with its environment at arbitrary frequency and beam velocity. Together, the narrow-band and broadband impedance provide a complete description of the impedance of the accelerator. In order to calculate the broadband longitudinal space charge impedance of acceleration components, a three-dimensional (3D) solver based on the FEM in frequency domain has been developed. To calculate the narrow-band impedance of superconducting RF cavities, we used various numerical approaches. Firstly, the eigenmode solver based on Finite Integration Technique (FIT) and a parallel real-valued FEM (CEM3Dr) eigenmode solver based on
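
    For reference, the narrow-band (shunt) impedance of the n-th eigenmode discussed above is conventionally expressed through the accelerating voltage, the dissipated power, and the stored energy. Sign and factor-of-two conventions differ between the "circuit" and "linac" definitions; the forms below are one common ("circuit") choice, not necessarily the convention used in the thesis:

```latex
R_{\mathrm{sh},n} = \frac{\lvert V_{\mathrm{acc},n}\rvert^{2}}{2\,P_{\mathrm{loss},n}},
\qquad
\left(\frac{R}{Q}\right)_{n} = \frac{\lvert V_{\mathrm{acc},n}\rvert^{2}}{2\,\omega_{n}\,U_{n}},
\qquad
V_{\mathrm{acc},n} = \int E_{z,n}(z)\, e^{\,i\,\omega_{n} z/(\beta c)}\,\mathrm{d}z ,
```

where the dependence on the eigenfrequency \(\omega_{n}\) and on the on-axis field distribution \(E_{z,n}(z)\) is exactly the dependence on eigenfrequencies and eigenmode field distributions stated in the abstract.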

  20. A contribution to the computation of the impedance in acceleration resonators

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Cong

    2016-05-15

    This thesis focuses on the numerical computation of the impedance in acceleration resonators and corresponding components. For this purpose, a dedicated solver based on the Finite Element Method (FEM) has been developed to compute the broadband impedance of accelerating components. In addition, various numerical approaches have been used to calculate the narrow-band impedance of superconducting radio frequency (RF) cavities. An overview of the calculated results and a comparison of the applied numerical approaches are provided. During the design phase of superconducting RF accelerating cavities and components, a challenging and difficult task is the determination of the impedance inside the accelerator with the help of proper computer simulations. Impedance describes the electromagnetic interaction between the particle beam and the accelerator; it can affect the stability of the particle beam. For a superconducting RF accelerating cavity with waveguides (beam pipes and couplers), the narrow-band impedance, also called the shunt impedance, corresponds to the eigenmodes of the cavity. It depends on the eigenfrequencies and the electromagnetic field distributions of the eigenmodes inside the cavity. On the other hand, the broadband impedance describes the interaction of the particle beam in the waveguides with its environment at arbitrary frequency and beam velocity. Together, the narrow-band and broadband impedance provide a complete description of the impedance of the accelerator. In order to calculate the broadband longitudinal space charge impedance of acceleration components, a three-dimensional (3D) solver based on the FEM in frequency domain has been developed. To calculate the narrow-band impedance of superconducting RF cavities, we used various numerical approaches. Firstly, the eigenmode solver based on Finite Integration Technique (FIT) and a parallel real-valued FEM (CEM3Dr) eigenmode solver based on

  1. Cloud Computing and Validated Learning for Accelerating Innovation in IoT

    Science.gov (United States)

    Suciu, George; Todoran, Gyorgy; Vulpe, Alexandru; Suciu, Victor; Bulca, Cristina; Cheveresan, Romulus

    2015-01-01

    Innovation in Internet of Things (IoT) requires more than just creation of technology and use of cloud computing or big data platforms. It requires accelerated commercialization, aptly called the go-to-market process. To accelerate successfully, companies need a new type of product development, the so-called validated learning process.…

  2. Acceleration of FDTD mode solver by high-performance computing techniques.

    Science.gov (United States)

    Han, Lin; Xi, Yanping; Huang, Wei-Ping

    2010-06-21

    A two-dimensional (2D) compact finite-difference time-domain (FDTD) mode solver is developed based on wave equation formalism in combination with the matrix pencil method (MPM). The method is validated for calculation of both real guided and complex leaky modes of typical optical waveguides against a benchmark finite-difference (FD) eigenmode solver. By taking advantage of the inherent parallel nature of the FDTD algorithm, the mode solver is implemented on graphics processing units (GPUs) using the compute unified device architecture (CUDA). It is demonstrated that the high-performance computing technique leads to significant acceleration of the FDTD mode solver, with more than a 30-fold improvement in computational efficiency in comparison with the conventional FDTD mode solver running on the CPU of a standard desktop computer. The computational efficiency of the accelerated FDTD method is of the same order of magnitude as the standard finite-difference eigenmode solver, yet it requires much less memory (e.g., less than 10%). Therefore, the new method may serve as an efficient, accurate and robust tool for mode calculation of optical waveguides even when conventional eigenvalue mode solvers are no longer applicable due to memory limitations.
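
    The "inherent parallel nature" of FDTD comes from its nearest-neighbor update stencil: every grid point is updated from its immediate neighbors only, so each time step is an embarrassingly parallel sweep that maps directly onto GPU threads. A minimal 1D Yee-scheme sketch in numpy (normalized units and a hypothetical Gaussian source; the paper's solver is 2D and wave-equation based, so this illustrates only the stencil structure):

```python
import numpy as np

# 1D vacuum FDTD (Yee scheme), normalized units: c = 1, dx = 1, dt = courant * dx.
nx, nt, courant = 200, 400, 0.5
Ez = np.zeros(nx)       # electric field on integer grid points
Hy = np.zeros(nx - 1)   # magnetic field on half-integer grid points

for n in range(nt):
    # Each update reads only adjacent cells, so every point can be updated
    # by an independent thread; this is what a GPU kernel parallelizes.
    Hy += courant * (Ez[1:] - Ez[:-1])
    Ez[1:-1] += courant * (Hy[1:] - Hy[:-1])
    Ez[nx // 2] += np.exp(-((n - 30) / 10.0) ** 2)  # soft Gaussian source

assert np.all(np.isfinite(Ez))
```

With a Courant number of 0.5 the scheme is stable in 1D; the mode-solver application then extracts resonant frequencies from such time traces (in the paper, via the matrix pencil method).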

  3. Neural computation and particle accelerators research, technology and applications

    CERN Document Server

    D'Arras, Horace

    2010-01-01

    This book discusses neural computation, a network or circuit of biological neurons, and, relatedly, particle accelerators, scientific instruments that accelerate charged particles such as protons, electrons and deuterons. Accelerators have a very broad range of applications in many industrial fields, from high energy physics to medical isotope production. Nuclear technology is one of the fields discussed in this book. The development that has been reached by particle accelerators in energy and particle intensity has opened the possibility of a wide number of new applications in nuclear technology. This book reviews the applications in the nuclear energy field, and the design features of high power neutron sources are explained. Surface treatments of niobium flat samples and superconducting radio frequency cavities by a new technique called gas cluster ion beam are also studied in detail, as well as the process of electropolishing. Furthermore, magnetic devices such as solenoids, dipoles and undulators, which ...

  4. From experiment to design -- Fault characterization and detection in parallel computer systems using computational accelerators

    Science.gov (United States)

    Yim, Keun Soo

    This dissertation summarizes experimental validation and co-design studies conducted to optimize the fault detection capabilities and overheads in hybrid computer systems (e.g., using CPUs and Graphics Processing Units, or GPUs), and consequently to improve the scalability of parallel computer systems using computational accelerators. The experimental validation studies were conducted to help us understand the failure characteristics of CPU-GPU hybrid computer systems under various types of hardware faults. The main characterization targets were faults that are difficult to detect and/or recover from, e.g., faults that cause long latency failures (Ch. 3), faults in dynamically allocated resources (Ch. 4), faults in GPUs (Ch. 5), faults in MPI programs (Ch. 6), and microarchitecture-level faults with specific timing features (Ch. 7). The co-design studies were based on the characterization results. One of the co-designed systems has a set of source-to-source translators that customize and strategically place error detectors in the source code of target GPU programs (Ch. 5). Another co-designed system uses an extension card to learn the normal behavioral and semantic execution patterns of message-passing processes executing on CPUs, and to detect abnormal behaviors of those parallel processes (Ch. 6). The third co-designed system is a co-processor that has a set of new instructions in order to support software-implemented fault detection techniques (Ch. 7). The work described in this dissertation gains more importance because heterogeneous processors have become an essential component of state-of-the-art supercomputers. GPUs were used in three of the five fastest supercomputers that were operating in 2011. Our work included comprehensive fault characterization studies in CPU-GPU hybrid computers. In CPUs, we monitored the target systems for a long period of time after injecting faults (a temporally comprehensive experiment), and injected faults into various types of

  5. Effects of different computer typing speeds on acceleration and peak contact pressure of the fingertips during computer typing.

    Science.gov (United States)

    Yoo, Won-Gyu

    2015-01-01

    [Purpose] This study examined the effects of different computer typing speeds on the acceleration and peak contact pressure of the fingertips during computer typing. [Subjects] Twenty-one male computer workers voluntarily consented to participate in this study. They consisted of 7 workers who could type 200-300 characters/minute, 7 workers who could type 300-400 characters/minute, and 7 workers who could type 400-500 characters/minute. [Methods] The acceleration and peak contact pressure of the fingertips were measured for the different typing-speed groups using an accelerometer and a CONFORMat system. [Results] Fingertip contact pressure was higher in the high typing speed group than in the low and medium typing speed groups. Fingertip acceleration was likewise higher in the high typing speed group than in the low and medium typing speed groups. [Conclusion] The results of the present study indicate that a fast typing speed applies continuous pressure stress to the fingers, thereby creating pain in the fingers.

  6. Computer codes for designing proton linear accelerators

    International Nuclear Information System (INIS)

    Kato, Takao

    1992-01-01

    Computer codes for designing proton linear accelerators are discussed from the viewpoint of not only designing but also construction and operation of the linac. The codes are divided into three categories according to their purposes: 1) design code, 2) generation and simulation code, and 3) electric and magnetic fields calculation code. The role of each category is discussed on the basis of experience at KEK (the design of the 40-MeV proton linac and its construction and operation, and the design of the 1-GeV proton linac). We introduce our recent work relevant to three-dimensional calculation and supercomputer calculation: 1) tuning of MAFIA (three-dimensional electric and magnetic fields calculation code) for supercomputer, 2) examples of three-dimensional calculation of accelerating structures by MAFIA, 3) development of a beam transport code including space charge effects. (author)

  7. Automated and Assistive Tools for Accelerated Code migration of Scientific Computing on to Heterogeneous MultiCore Systems

    Science.gov (United States)

    2017-04-13

    AFRL-AFOSR-UK-TR-2017-0029: Automated and Assistive Tools for Accelerated Code migration of Scientific Computing on to Heterogeneous MultiCore Systems. Final report, performance period 2012 - 01/25/2015.

  8. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    International Nuclear Information System (INIS)

    Smith, P.R.; Sarfaty, R.

    1993-01-01

    This paper provides guidance on the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan is the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used worldwide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper will describe and define CM elements, and discuss how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of a CASE tool provides a methodology for consistency in approach, graphics, and database capability, combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than stated above. Examples include supporting a joint application development group (JAD) to prepare a software functional specification document and, if necessary, providing the capability to automatically generate software application code. This paper briefly discusses the characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables

  9. FINAL REPORT DE-FG02-04ER41317 Advanced Computation and Chaotic Dynamics for Beams and Accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Cary, John R [U. Colorado

    2014-09-08

    During the year ending in August 2013, we continued to investigate the potential of photonic crystal (PhC) materials for acceleration purposes. We worked to characterize the acceleration ability of simple PhC accelerator structures, as well as to characterize PhC materials to determine whether current fabrication techniques can meet the needs of future accelerating structures. We have also continued to design and optimize PhC accelerator structures, with the ultimate goal of finding a new kind of accelerator structure that could offer significant advantages over current RF acceleration technology. The design and optimization of these structures require high-performance computation, and we continue to work on methods to make such computation faster and more efficient.

  10. Accelerating Neuroimage Registration through Parallel Computation of Similarity Metric.

    Directory of Open Access Journals (Sweden)

    Yun-Gang Luo

    Neuroimage registration is crucial for brain morphometric analysis and treatment efficacy evaluation. However, existing advanced registration algorithms such as FLIRT and ANTs are not efficient enough for clinical use. In this paper, a GPU implementation of FLIRT with the correlation ratio (CR) as the similarity metric and a GPU accelerated correlation coefficient (CC) calculation for the symmetric diffeomorphic registration of ANTs have been developed. The comparison with their corresponding original tools shows that our accelerated algorithms can greatly outperform the original algorithms in terms of computational efficiency. This paper demonstrates the great potential of applying these registration tools in clinical applications.
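
    A similarity metric such as the correlation coefficient is a voxel-wise reduction over the whole image pair, which is precisely the kind of computation that parallelizes well on a GPU. A plain numpy sketch of the CC (toy arrays; this is a generic illustration, not the actual FLIRT/ANTs implementation):

```python
import numpy as np

def correlation_coefficient(a: np.ndarray, b: np.ndarray) -> float:
    """Pearson correlation over all voxels of two images of equal shape."""
    a = a.ravel().astype(float)
    b = b.ravel().astype(float)
    a -= a.mean()
    b -= b.mean()
    return float((a @ b) / np.sqrt((a @ a) * (b @ b)))

fixed = np.arange(27.0).reshape(3, 3, 3)
moving = 2.0 * fixed + 5.0  # pure linear intensity change
print(correlation_coefficient(fixed, moving))  # → 1.0
```

Evaluating this metric at every optimizer step dominates registration time, so offloading the means, dot products, and sums to the GPU is where the reported speedups come from.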

  11. Microprocessor based beam intensity and efficiency display system for the Fermilab accelerator

    International Nuclear Information System (INIS)

    Biwer, R.

    1979-01-01

    The Main Accelerator display system for the Fermilab accelerator gathers charge data and displays it, including the computed transfer efficiencies between the accelerators. To accomplish this, strategically located charge converters monitor the circulating internal beam of each of the Fermilab accelerators. Their outputs are processed via an asynchronously triggered, multiplexed analog-to-digital converter. The data is converted into a digital byte containing an address code and data, then stored in two 16-bit memories. One memory outputs the interleaved data as a data pulse train while the other interfaces directly to a local host computer for further analysis. The microprocessor based display unit synchronizes displayed data during normal operation as well as in special storage modes. The display unit outputs data to the front panel in the form of a numeric value and also makes digital-to-analog conversions of displayed data for external peripheral devices. 5 refs
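
    The processed transfer efficiencies are simply ratios of the charge readings from successive machines. A sketch with hypothetical machine names and charge values (illustrative only, not actual Fermilab readings):

```python
# Transfer efficiency between successive machines, computed from the
# charge-converter readings. All names and numbers are hypothetical.
readings = {"Linac": 5.0e13, "Booster": 4.6e13, "Main Ring": 4.2e13}  # particles

names = list(readings)
for upstream, downstream in zip(names, names[1:]):
    eff = 100.0 * readings[downstream] / readings[upstream]
    print(f"{upstream} -> {downstream}: {eff:.1f}%")
# prints:
# Linac -> Booster: 92.0%
# Booster -> Main Ring: 91.3%
```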

  12. A Quantitative Study of the Relationship between Leadership Practice and Strategic Intentions to Use Cloud Computing

    Science.gov (United States)

    Castillo, Alan F.

    2014-01-01

    The purpose of this quantitative correlational cross-sectional research study was to examine a theoretical model consisting of leadership practice, attitudes of business process outsourcing, and strategic intentions of leaders to use cloud computing and to examine the relationships between each of the variables respectively. This study…

  13. FPGA Compute Acceleration for High-Throughput Data Processing in High-Energy Physics Experiments

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The upgrades of the four large experiments of the LHC at CERN in the coming years will result in a huge increase of data bandwidth for each experiment, which needs to be processed very efficiently. For example, the LHCb experiment will upgrade its detector in 2019/2020 to a 'triggerless' readout scheme, in which all of the readout electronics and several sub-detector parts will be replaced. The new readout electronics will be able to read out the detector at 40 MHz. This increases the data bandwidth from the detector down to the event filter farm to 40 Tbit/s, which must be processed to select the interesting proton-proton collisions for later storage. Designing the architecture of a computing farm that can process this amount of data as efficiently as possible is a challenging task, and several compute accelerator technologies are being considered.    In the high performance computing sector more and more FPGA compute accelerators are being used to improve the compute performance and reduce the...

  14. Computer-aided waste management strategic planning and analysis

    International Nuclear Information System (INIS)

    Avci, H.I.; Kotek, T.J.; Koebnick, B.L.

    1995-01-01

    A computational model called WASTE-MGMT has been developed to assist in the evaluation of alternative waste management approaches in a complex setting involving multiple sites, waste streams, and processing options. The model provides the quantities and characteristics of wastes processed at any facility or shipped between any two sites, as well as environmental emissions at any facility within the waste management system. The model input is defined by three types of fundamental waste management data: (1) waste inventories and characteristics at the point of generation; (2) treatment, storage, and disposal facility characteristics; and (3) definitions of alternative management approaches. The model has been successfully used in the preparation of the US Department of Energy (DOE) Environmental Management Programmatic Environmental Impact Statement (EM PEIS). Certain improvements are either being implemented or planned that would extend the usefulness and applicability of the WASTE-MGMT model beyond the EM PEIS and into the strategic planning for management of wastes under the responsibility of DOE or other agencies

  15. X-BAND LINEAR COLLIDER R and D IN ACCELERATING STRUCTURES THROUGH ADVANCED COMPUTING

    International Nuclear Information System (INIS)

    Li, Z

    2004-01-01

    This paper describes a major computational effort that addresses key design issues in the high gradient accelerating structures for the proposed X-band linear collider, GLC/NLC. Supported by the US DOE's Accelerator Simulation Project, SLAC is developing a suite of parallel electromagnetic codes based on unstructured grids for modeling RF structures with higher accuracy and on a scale previously not possible. The new simulation tools have played an important role in the R and D of X-Band accelerating structures, in cell design, wakefield analysis and dark current studies

  16. Accelerating Development of EV Batteries Through Computer-Aided Engineering (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Pesaran, A.; Kim, G. H.; Smith, K.; Santhanagopalan, S.

    2012-12-01

    The Department of Energy's Vehicle Technology Program has launched the Computer-Aided Engineering for Automotive Batteries (CAEBAT) project to work with national labs, industry and software vendors to develop sophisticated software. As coordinator, NREL has teamed with a number of companies to help improve and accelerate battery design and production. This presentation provides an overview of CAEBAT, including its predictive computer simulation of Li-ion batteries known as the Multi-Scale Multi-Dimensional (MSMD) model framework. MSMD's modular, flexible architecture connects the physics of battery charge/discharge processes, thermal control, safety and reliability in a computationally efficient manner. This allows independent development of submodels at the cell and pack levels.

  17. 3-D computations and measurements of accelerator magnets for the APS

    International Nuclear Information System (INIS)

    Turner, L.R.; Kim, S.H.; Kim, K.

    1993-01-01

    The Advanced Photon Source (APS), now under construction at Argonne National Laboratory (ANL), requires dipole, quadrupole, sextupole, and corrector magnets for each of its circular accelerator systems. Three-dimensional (3-D) field computations are needed to eliminate unwanted multipole fields from the ends of long quadrupole and dipole magnets and to guarantee that the flux levels in the poles of short magnets will not cause saturation. Measurements of the magnets show good agreement with the computations

  18. Cloud Computing (SaaS) Adoption as a Strategic Technology: Results of an Empirical Study

    Directory of Open Access Journals (Sweden)

    Pedro R. Palos-Sanchez

    2017-01-01

    The present study empirically analyzes the factors that determine the adoption of the cloud computing (SaaS) model in firms where this strategy is considered strategic for executing their activity. A research model has been developed to evaluate the factors that influence the intention of using cloud computing, combining the variables found in the technology acceptance model (TAM) with other external variables such as top management support, training, communication, organization size, and technological complexity. Data compiled from 150 companies in Andalusia (Spain) are used to test the formulated hypotheses. The results of this study reflect what critical factors should be considered and how they are interrelated. They also show the organizational demands that must be considered by those companies wishing to implement a real management model adapted to the digital economy, especially those related to cloud computing.

  19. High-brightness H/sup -/ accelerators

    International Nuclear Information System (INIS)

    Jameson, R.A.

    1987-01-01

    Neutral particle beam (NPB) devices based on high-brightness H/sup -/ accelerators are an important component of proposed strategic defense systems. The basic rationale and the R and D program are outlined, and examples are given of the underlying technology thrusts toward advanced systems. Much of the research accomplished in the past year is applicable to accelerator systems in general; some of these activities are discussed

  20. 2-D and 3-D computations of curved accelerator magnets

    International Nuclear Information System (INIS)

    Turner, L.R.

    1991-01-01

    In order to save computer memory, a long accelerator magnet may be computed by treating the long central region and the end regions separately. The dipole magnets for the injector synchrotron of the Advanced Photon Source (APS), now under construction at Argonne National Laboratory (ANL), employ magnet iron consisting of parallel laminations, stacked with a uniform radius of curvature of 33.379 m. Laplace's equation for the magnetic scalar potential has a different form for a straight magnet (x-y coordinates), a magnet with surfaces curved about a common center (r-θ coordinates), and a magnet with parallel laminations like the APS injector dipole. Yet pseudo 2-D computations for the three geometries give basically identical results, even for a much more strongly curved magnet. Hence 2-D (x-y) computations of the central region and 3-D computations of the end regions can be combined to determine the overall magnetic behavior of the magnets. 1 ref., 6 figs
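
    The 2-D central-region computation described above amounts to solving Laplace's equation for the magnetic scalar potential on a grid. A minimal Jacobi-relaxation sketch in x-y coordinates (grid size and boundary values are arbitrary, chosen only to illustrate the numerical scheme):

```python
import numpy as np

# Jacobi relaxation for Laplace's equation on an x-y grid; one boundary is
# held at unit scalar potential, standing in for a pole-face condition.
n = 32
phi = np.zeros((n, n))
phi[0, :] = 1.0

for _ in range(2000):
    # New interior value = average of the four neighbors (5-point stencil).
    phi[1:-1, 1:-1] = 0.25 * (phi[:-2, 1:-1] + phi[2:, 1:-1] +
                              phi[1:-1, :-2] + phi[1:-1, 2:])

# A discrete harmonic function obeys the maximum principle:
# interior values stay between the boundary extremes.
assert phi[1:-1, 1:-1].max() <= 1.0 and phi[1:-1, 1:-1].min() >= 0.0
```

In r-theta coordinates the stencil picks up metric factors from the curved Laplacian, which is the coordinate-system difference the abstract refers to; the finding is that for these magnets the straight-geometry solution is an excellent approximation anyway.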

  1. Report of Investigation Committee on Programs for Research and Development of Strategic Software for Advanced Computing; Kodo computing yo senryakuteki software no kenkyu kaihatsu program kento iinkai hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-12-26

    The committee met on December 26, 2000, with 32 people in attendance. Discussion covered the results of surveys conducted for the development of strategic software for advanced computing and candidate projects for strategic software development. Eight subjects were taken up at the meeting: the interim report on the survey results, a semiconductor TCAD (technology computer-aided design) system, a nanodevice surface analysis system, a network distributed parallel processing platform (tentative name), a fatigue simulation system, a chemical reaction simulator, a protein structure analysis system, and a next-generation fluid analysis system. In this report, the author arranges the discussion results into four categories: (1) a strategic software development system, (2) popularization method and maintenance system, (3) handling of the results, and (4) the evaluation of the program for research and development. In relation to category (1), it is stated that software matures with the passage of time, that the software is a commercial program, and that in the development of a commercial software program the process of basic study up to the preparation of a prototype should be completely separated from the process for its completion. (NEDO)

  2. Accelerating Spaceborne SAR Imaging Using Multiple CPU/GPU Deep Collaborative Computing

    Directory of Open Access Journals (Sweden)

    Fan Zhang

    2016-04-01

    With the development of synthetic aperture radar (SAR) technologies in recent years, the huge amount of remote sensing data brings challenges for real-time imaging processing. Therefore, high performance computing (HPC) methods have been presented to accelerate SAR imaging, especially GPU-based methods. In the classical GPU-based imaging algorithm, the GPU is employed to accelerate image processing by massively parallel computing, while the CPU performs only auxiliary work such as data input/output (IO); the computing capability of the CPU is thus ignored and underestimated. In this work, a new deep collaborative SAR imaging method based on multiple CPUs/GPUs is proposed to achieve real-time SAR imaging. Through the proposed task partitioning and scheduling strategy, the whole image can be generated by deep collaborative multiple CPU/GPU computing. For the CPU parallel imaging part, the advanced vector extension (AVX) method is introduced into the multi-core CPU parallel method for higher efficiency. For the GPU parallel imaging part, the bottlenecks of memory limitation and frequent data transfers are removed, and several optimization strategies, such as streaming and parallel pipelining, are applied. Experimental results demonstrate that the deep CPU/GPU collaborative imaging method improves SAR imaging efficiency by 270 times over a single-core CPU and achieves real-time imaging, in that the imaging rate outperforms the raw data generation rate.
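
The collaborative CPU/GPU split can be illustrated with a toy throughput-proportional partitioning rule (all names and rates here are invented for illustration, not taken from the paper):

```python
# Hypothetical sketch of the task-partitioning idea: split the range lines
# of a SAR scene between CPU and GPU workers in proportion to their
# measured throughput, so both finish their share at roughly the same time.
def partition(n_lines, cpu_rate, gpu_rate):
    """Return (cpu_share, gpu_share) proportional to relative throughput."""
    total = cpu_rate + gpu_rate
    cpu_share = round(n_lines * cpu_rate / total)
    return cpu_share, n_lines - cpu_share

# e.g. a GPU path seven times faster than the multi-core CPU path:
cpu_lines, gpu_lines = partition(32768, cpu_rate=1.0, gpu_rate=7.0)
```

A real scheduler would re-measure the rates online and rebalance, but the proportional split is the core of the idea.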

  3. Accelerating Spaceborne SAR Imaging Using Multiple CPU/GPU Deep Collaborative Computing.

    Science.gov (United States)

    Zhang, Fan; Li, Guojun; Li, Wei; Hu, Wei; Hu, Yuxin

    2016-04-07

    With the development of synthetic aperture radar (SAR) technologies in recent years, the huge amount of remote sensing data brings challenges for real-time imaging processing. Therefore, high performance computing (HPC) methods have been presented to accelerate SAR imaging, especially GPU-based methods. In the classical GPU-based imaging algorithm, the GPU is employed to accelerate image processing by massively parallel computing, while the CPU performs only auxiliary work such as data input/output (IO); the computing capability of the CPU is thus ignored and underestimated. In this work, a new deep collaborative SAR imaging method based on multiple CPUs/GPUs is proposed to achieve real-time SAR imaging. Through the proposed task partitioning and scheduling strategy, the whole image can be generated by deep collaborative multiple CPU/GPU computing. For the CPU parallel imaging part, the advanced vector extension (AVX) method is introduced into the multi-core CPU parallel method for higher efficiency. For the GPU parallel imaging part, the bottlenecks of memory limitation and frequent data transfers are removed, and several optimization strategies, such as streaming and parallel pipelining, are applied. Experimental results demonstrate that the deep CPU/GPU collaborative imaging method improves SAR imaging efficiency by 270 times over a single-core CPU and achieves real-time imaging, in that the imaging rate outperforms the raw data generation rate.

  4. Development and application of a computer model for large-scale flame acceleration experiments

    International Nuclear Information System (INIS)

    Marx, K.D.

    1987-07-01

    A new computational model for large-scale premixed flames is developed and applied to the simulation of flame acceleration experiments. The primary objective is to circumvent the need to resolve turbulent flame fronts; this is imperative because of the relatively coarse computational grids which must be used in engineering calculations. The essence of the model is to artificially thicken the flame by increasing the appropriate diffusivities and decreasing the combustion rate, but to do this in such a way that the burn velocity varies with pressure, temperature, and turbulence intensity according to prespecified phenomenological characteristics. The model is particularly aimed at implementation in computer codes which simulate compressible flows. To this end, it is applied to the two-dimensional simulation of hydrogen-air flame acceleration experiments in which the flame speeds and gas flow velocities attain or exceed the speed of sound in the gas. It is shown that many features of the flame trajectories and pressure histories in the experiments are simulated quite well by the model. Using the comparison of experimental and computational results as a guide, some insight is developed into the processes which occur in such experiments. 34 refs., 25 figs., 4 tabs.
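
The thickening trick can be checked against the standard laminar-flame scalings s_L ∝ √(Dω) and δ ∝ √(D/ω) (assumed here for illustration; the abstract itself gives no formulas):

```python
import math

# Illustrative check of artificial flame thickening: multiplying the
# diffusivity D by a factor F and dividing the combustion rate w by F
# thickens the flame front F times while leaving the burn velocity
# unchanged, under the assumed scalings s_L ~ sqrt(D*w), delta ~ sqrt(D/w).
def flame_speed_and_thickness(D, w):
    return math.sqrt(D * w), math.sqrt(D / w)  # up to model constants

F = 10.0
s0, d0 = flame_speed_and_thickness(1.0, 1.0)          # original flame
s1, d1 = flame_speed_and_thickness(F * 1.0, 1.0 / F)  # thickened flame
```

This is why the model can use coarse grids: the front is made F cells wide without changing how fast it propagates.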

  5. Learning without experience: Understanding the strategic implications of deregulation and competition in the electricity industry

    Energy Technology Data Exchange (ETDEWEB)

    Lomi, A. [School of Economics, University of Bologna, Bologna (Italy); Larsen, E.R. [Dept. of Managements Systems and Information, City University Business School, London (United Kingdom)

    1998-11-01

    As deregulation of the electricity industry continues to gain momentum around the world, electricity companies face unprecedented challenges. Competitive complexity and intensity will increase substantially as deregulated companies find themselves competing in new industries, with new rules, against unfamiliar competitors - and without any history to learn from. We describe the different kinds of strategic issues that newly deregulated utility companies are facing, and the risks those issues entail. We identify a number of problems induced by experiential learning under conditions of competence-destroying change, and we illustrate ways in which companies can activate history-independent learning processes. We suggest that microworlds - a new generation of computer-based learning environments made possible by conceptual and technological progress in the fields of system dynamics and systems thinking - are particularly appropriate tools to accelerate and enhance organizational and managerial learning under conditions of increased competitive complexity. (au)

  6. Accelerating MATLAB with GPU computing a primer with examples

    CERN Document Server

    Suh, Jung W

    2013-01-01

    Beyond simulation and algorithm development, many developers increasingly use MATLAB even for product deployment in computationally heavy fields. This often demands that MATLAB codes run faster by leveraging the distributed parallelism of Graphics Processing Units (GPUs). While MATLAB successfully provides high-level functions as a simulation tool for rapid prototyping, the underlying details and knowledge needed for utilizing GPUs make MATLAB users hesitate to step into it. Accelerating MATLAB with GPUs offers a primer on bridging this gap. Starting with the basics, setting up MATLAB for

  7. Proceedings of the conference on computer codes and the linear accelerator community

    International Nuclear Information System (INIS)

    Cooper, R.K.

    1990-07-01

    The conference whose proceedings you are reading was envisioned as the second in a series, the first having been held in San Diego in January 1988. The intended participants were those people who are actively involved in writing and applying computer codes for the solution of problems related to the design and construction of linear accelerators. The first conference reviewed many of the codes both extant and under development. This second conference provided an opportunity to update the status of those codes, and to provide a forum in which emerging new 3D codes could be described and discussed. The afternoon poster session on the second day of the conference provided an opportunity for extended discussion. All in all, this conference was felt to be quite a useful interchange of ideas and developments in the field of 3D calculations, parallel computation, higher-order optics calculations, and code documentation and maintenance for the linear accelerator community. A third conference is planned.

  8. Proceedings of the conference on computer codes and the linear accelerator community

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, R.K. (comp.)

    1990-07-01

    The conference whose proceedings you are reading was envisioned as the second in a series, the first having been held in San Diego in January 1988. The intended participants were those people who are actively involved in writing and applying computer codes for the solution of problems related to the design and construction of linear accelerators. The first conference reviewed many of the codes both extant and under development. This second conference provided an opportunity to update the status of those codes, and to provide a forum in which emerging new 3D codes could be described and discussed. The afternoon poster session on the second day of the conference provided an opportunity for extended discussion. All in all, this conference was felt to be quite a useful interchange of ideas and developments in the field of 3D calculations, parallel computation, higher-order optics calculations, and code documentation and maintenance for the linear accelerator community. A third conference is planned.

  9. ACE3P Computations of Wakefield Coupling in the CLIC Two-Beam Accelerator

    International Nuclear Information System (INIS)

    Candel, Arno

    2010-01-01

    The Compact Linear Collider (CLIC) provides a path to a multi-TeV accelerator to explore the energy frontier of High Energy Physics. Its novel two-beam accelerator concept envisions rf power transfer to the accelerating structures from a separate high-current decelerator beam line consisting of power extraction and transfer structures (PETS). It is critical to numerically verify the fundamental and higher-order mode properties in and between the two beam lines with high accuracy and confidence. To solve these large-scale problems, SLAC's parallel finite element electromagnetic code suite ACE3P is employed. Using curvilinear conformal meshes and higher-order finite element vector basis functions, unprecedented accuracy and computational efficiency are achieved, enabling high-fidelity modeling of complex detuned structures such as the CLIC TD24 accelerating structure. In this paper, time-domain simulations of wakefield coupling effects in the combined system of PETS and the TD24 structures are presented. The results will help to identify potential issues and provide new insights on the design, leading to further improvements on the novel CLIC two-beam accelerator scheme.

  10. Strategic decision making

    OpenAIRE

    Stokman, Frans N.; Assen, Marcel A.L.M. van; Knoop, Jelle van der; Oosten, Reinier C.H. van

    2000-01-01

    This paper introduces a methodology for strategic intervention in collective decision making. The methodology is based on (1) a decomposition of the problem into a few main controversial issues, (2) systematic interviews of subject area specialists to obtain a specification of the decision setting, consisting of a list of stakeholders with their capabilities, positions, and salience on each of the issues; (3) computer simulation. The computer simulation models incorporate only the main processe...

  11. Software for virtual accelerator designing

    International Nuclear Information System (INIS)

    Kulabukhova, N.; Ivanov, A.; Korkhov, V.; Lazarev, A.

    2012-01-01

    The article discusses appropriate technologies for software implementation of the Virtual Accelerator. The Virtual Accelerator is considered as a set of services and tools enabling transparent execution of computational software for modeling beam dynamics in accelerators on distributed computing resources. Distributed storage and information processing facilities utilized by the Virtual Accelerator make use of the Service-Oriented Architecture (SOA) according to a cloud computing paradigm. Control system tool-kits (such as EPICS and TANGO), computing modules (including high-performance computing), realization of the GUI with existing frameworks, and visualization of the data are discussed in the paper. The presented research consists of a software analysis for the realization of interaction between all levels of the Virtual Accelerator, together with some samples of middleware implementation. A set of servers and clusters at St. Petersburg State University forms the infrastructure of the computing environment for Virtual Accelerator design. The use of component-oriented technology to realize the interaction between Virtual Accelerator levels is proposed. The article concludes with an overview and substantiation of the choice of technologies that will be used for the design and implementation of the Virtual Accelerator. (authors)

  12. Applications of the Strategic Defense Initiative's Compact Accelerators

    National Research Council Canada - National Science Library

    Montanarelli, Nick

    1992-01-01

    ...) was recently incorporated into the design of a cancer therapy unit at the Loma Linda University Medical Center, an SDI sponsored compact induction linear accelerator may replace Cobalt 60 radiation...

  13. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing

    Science.gov (United States)

    Fang, Ye; Ding, Yun; Feinstein, Wei P.; Koppelman, David M.; Moreno, Juana; Jarrell, Mark; Ramanujam, J.; Brylinski, Michal

    2016-01-01

    Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249. PMID:27420300

  14. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing.

    Directory of Open Access Journals (Sweden)

    Ye Fang

    Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249.

  15. Combining Acceleration Techniques for Low-Dose X-Ray Cone Beam Computed Tomography Image Reconstruction.

    Science.gov (United States)

    Huang, Hsuan-Ming; Hsiao, Ing-Tsung

    2017-01-01

    Over the past decade, image quality in low-dose computed tomography has been greatly improved by various compressive sensing- (CS-) based reconstruction methods. However, these methods have some disadvantages including high computational cost and slow convergence rate. Many different speed-up techniques for CS-based reconstruction algorithms have been developed. The purpose of this paper is to propose a fast reconstruction framework that combines a CS-based reconstruction algorithm with several speed-up techniques. First, total difference minimization (TDM) was implemented using the soft-threshold filtering (STF). Second, we combined TDM-STF with the ordered subsets transmission (OSTR) algorithm for accelerating the convergence. To further speed up the convergence of the proposed method, we applied the power factor and the fast iterative shrinkage thresholding algorithm to OSTR and TDM-STF, respectively. Results obtained from simulation and phantom studies showed that many speed-up techniques could be combined to greatly improve the convergence speed of a CS-based reconstruction algorithm. More importantly, the increased computation time (≤10%) was minor as compared to the acceleration provided by the proposed method. In this paper, we have presented a CS-based reconstruction framework that combines several acceleration techniques. Both simulation and phantom studies provide evidence that the proposed method has the potential to satisfy the requirement of fast image reconstruction in practical CT.
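
Two of the building blocks named above can be sketched compactly (a sketch only; the function names and parameter choices are mine, not the authors'):

```python
import numpy as np

# Illustrative versions of two ingredients from the abstract: the
# soft-threshold filter used for total-difference minimization, and the
# FISTA momentum update used to accelerate convergence.
def soft_threshold(x, t):
    """Shrink each coefficient toward zero by t (the l1 proximal operator)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista_step(x_new, x_old, t_old):
    """Return the extrapolated iterate and the updated momentum parameter."""
    t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t_old ** 2))
    y = x_new + ((t_old - 1.0) / t_new) * (x_new - x_old)
    return y, t_new
```

In the proposed framework these steps are interleaved with ordered-subsets transmission updates; only the two generic operators are shown here.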

  16. Convergence acceleration of two-phase flow calculations in FLICA-4. A thermal-hydraulic 3D computer code

    International Nuclear Information System (INIS)

    Toumi, I.

    1995-01-01

    Time requirements for 3D two-phase flow steady state calculations are generally long. Usually, numerical methods for steady state problems are iterative, consisting of time-like methods that are marched to a steady state. Based on the eigenvalue spectrum of the iteration matrix for various flow configurations, two convergence acceleration techniques are discussed: over-relaxation and eigenvalue annihilation. These methods were applied to accelerate the convergence of three-dimensional steady state two-phase flow calculations within the FLICA-4 computer code. The acceleration methods are easy to implement, and no extra computer memory is required. Successful results are presented for various test problems, and savings of 30 to 50% in CPU time have been achieved. (author). 10 refs., 4 figs.
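
Over-relaxation, one of the two techniques discussed, can be illustrated on a scalar fixed-point iteration (an invented toy problem, not a FLICA-4 calculation):

```python
# Over-relaxation of a fixed-point iteration x <- G(x): step further in the
# update direction than the plain iteration (omega = 1) would.
def iterate(G, x, omega, tol=1e-10, max_it=100_000):
    for k in range(1, max_it + 1):
        x_new = x + omega * (G(x) - x)
        if abs(x_new - x) < tol:
            return x_new, k
        x = x_new
    return x, max_it

G = lambda x: 0.9 * x + 1.0                     # fixed point x* = 10
x_plain, n_plain = iterate(G, 0.0, omega=1.0)   # plain iteration
x_relax, n_relax = iterate(G, 0.0, omega=1.8)   # over-relaxed iteration
```

For this contraction the over-relaxed error decays by 0.82 per step instead of 0.9, so convergence takes noticeably fewer iterations; in the code the optimal omega would be chosen from the eigenvalue spectrum of the iteration matrix.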

  17. Towards full automation of accelerators through computer control

    CERN Document Server

    Gamble, J; Kemp, D; Keyser, R; Koutchouk, Jean-Pierre; Martucci, P P; Tausch, Lothar A; Vos, L

    1980-01-01

    The computer control system of the Intersecting Storage Rings (ISR) at CERN has always laid emphasis on two particular operational aspects, the first being the reproducibility of machine conditions and the second that of giving the operators the possibility to work in terms of machine parameters such as the tune. Already certain phases of the operation are optimized by the control system, whilst others are automated with a minimum of manual intervention. The authors describe the present control system, with emphasis on the existing automated facilities and the features of the control system which make them possible. They then discuss the steps needed to completely automate the operational procedure of accelerators. (7 refs).

  18. Towards full automation of accelerators through computer control

    International Nuclear Information System (INIS)

    Gamble, J.; Hemery, J.-Y.; Kemp, D.; Keyser, R.; Koutchouk, J.-P.; Martucci, P.; Tausch, L.; Vos, L.

    1980-01-01

    The computer control system of the Intersecting Storage Rings (ISR) at CERN has always laid emphasis on two particular operational aspects, the first being the reproducibility of machine conditions and the second that of giving the operators the possibility to work in terms of machine parameters such as the tune. Already certain phases of the operation are optimized by the control system, whilst others are automated with a minimum of manual intervention. The paper describes the present control system, with emphasis on the existing automated facilities and the features of the control system which make them possible. It then discusses the steps needed to completely automate the operational procedure of accelerators. (Auth.)

  19. Performance analysis and acceleration of explicit integration for large kinetic networks using batched GPU computations

    Energy Technology Data Exchange (ETDEWEB)

    Shyles, Daniel [University of Tennessee (UT); Dongarra, Jack J. [University of Tennessee, Knoxville (UTK); Guidry, Mike W. [ORNL; Tomov, Stanimire Z. [ORNL; Billings, Jay Jay [ORNL; Brock, Benjamin A. [ORNL; Haidar Ahmad, Azzam A. [ORNL

    2016-09-01

    We demonstrate the systematic implementation of recently-developed fast explicit kinetic integration algorithms that efficiently solve N coupled ordinary differential equations (subject to initial conditions) on modern GPUs. We take representative test cases (Type Ia supernova explosions) and demonstrate two or more orders of magnitude increase in efficiency for solving such systems (of realistic thermonuclear networks coupled to fluid dynamics). This implies that important coupled, multiphysics problems in various scientific and technical disciplines that were intractable, or could be simulated only with highly schematic kinetic networks, are now computationally feasible. As examples of such applications we present the computational techniques developed for our ongoing deployment of these new methods on modern GPU accelerators. We show that, similarly to many other scientific applications ranging from national security to medical advances, the computation can be split into many independent computational tasks, each of relatively small size. As the size of each individual task does not provide sufficient parallelism for the underlying hardware, especially for accelerators, these tasks must be computed concurrently as a single routine, which we call a batched routine, in order to saturate the hardware with enough work.
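
The batching idea can be illustrated with a vectorized stand-in (NumPy here plays the role of the GPU batch; the model ODE, sizes, and step counts are invented, and forward Euler stands in for the fast explicit integrators of the paper):

```python
import numpy as np

# Advance many small, independent kinetic systems in lockstep with one
# fused (batched) update per time step, instead of looping over systems
# one at a time. Model problem: dy/dt = -k*y for every species.
def batched_euler(y0, k, dt, n_steps):
    y = y0.copy()
    for _ in range(n_steps):
        y += dt * (-k * y)        # one vectorized update for the whole batch
    return y

batch, species = 1024, 16         # 1024 networks of 16 species each
y0 = np.ones((batch, species))
k = np.full((batch, species), 2.0)
y = batched_euler(y0, k, dt=1e-3, n_steps=1000)
# each entry approximates exp(-2.0)
```

On an accelerator the same structure lets one kernel launch saturate the hardware with batch × species independent updates.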

  20. Electromagnetic computer simulations of collective ion acceleration by a relativistic electron beam

    International Nuclear Information System (INIS)

    Galvez, M.; Gisler, G.R.

    1988-01-01

    A 2.5-D electromagnetic particle-in-cell computer code is used to study collective ion acceleration when a relativistic electron beam is injected into a drift tube partially filled with cold neutral plasma. The simulations of this system reveal that the ions are subject to electrostatic acceleration by an electrostatic potential that forms behind the head of the beam. This potential develops soon after the beam is injected into the drift tube, drifts with the beam, and eventually settles at a fixed position. At later times, this electrostatic potential becomes a virtual cathode. When the permanent position of the potential is at the edge of the plasma or beyond, ions are accelerated forward and a unidirectional ion flow is obtained; otherwise, a bidirectional ion flow occurs. The ions that achieve the highest energy are those which drift with the negative potential. When the plasma density is varied, the simulations show that optimum acceleration occurs when the density ratio between the beam (n_b) and the plasma (n_o) is unity. Simulations were also carried out with different ion masses. The results corroborate the hypothesis that the ion acceleration mechanism is purely electrostatic, so that the ion acceleration depends inversely on the charged particle mass. The simulations also show that the maximum ion energy increases logarithmically with the electron beam energy and proportionally with the beam current.
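
The inverse-mass dependence noted above follows directly from the electrostatic equation of motion (a standard result, stated here for completeness):

```latex
m \frac{d\mathbf{v}}{dt} = q\mathbf{E}
\quad\Longrightarrow\quad
a = \frac{qE}{m}
```

For a fixed field configuration the acceleration scales as 1/m at fixed charge, which is exactly the behavior the mass-variation runs were designed to test.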

  1. Tools for remote computing in accelerator control

    International Nuclear Information System (INIS)

    Anderssen, P.S.; Frammery, V.; Wilcke, R.

    1990-01-01

    In modern accelerator control systems, the intelligence of the equipment is distributed in the geographical and the logical sense. Control processes for a large variety of tasks reside in both the equipment and the control computers. Hence successful operation hinges on the availability and reliability of the communication infrastructure. The computers are interconnected by a communication system and use remote procedure calls and message passing for information exchange. These communication mechanisms need a well-defined convention, i.e. a protocol. They also require flexibility in both the setup and changes to the protocol specification. The network compiler is a tool which provides the programmer with a means of establishing such a protocol for his application. Input to the network compiler is a single interface description file provided by the programmer. This file is written according to a grammar, and completely specifies the interprocess communication interfaces. Passed through the network compiler, the interface description file automatically produces the additional source code needed for the protocol. Hence the programmer does not have to be concerned about the details of the communication calls. Any further additions and modifications are made easy, because all the information about the interface is kept in a single file. (orig.)
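
The workflow above - one interface description file in, protocol glue code out - can be sketched with a toy generator. The description grammar, the `magnet_control` service, and the `send_rpc` helper are all invented for illustration; the real network compiler's syntax is not shown in the abstract:

```python
# Toy stub generator: parse a minimal interface description and emit proxy
# code, so the programmer never writes the communication calls by hand.
SPEC = """
service magnet_control
  rpc set_current(float) -> bool
  rpc read_field() -> float
"""

def generate_stubs(spec):
    lines = [l.strip() for l in spec.strip().splitlines()]
    service = lines[0].split()[1]                      # "service <name>"
    out = [f"class {service}_proxy:"]
    for l in lines[1:]:                                # "rpc <name>(...) -> ..."
        name = l.split()[1].split("(")[0]
        out.append(f"    def {name}(self, *args):")
        out.append(f"        return send_rpc('{service}', '{name}', args)")
    return "\n".join(out)

stubs = generate_stubs(SPEC)
```

Because all interface information lives in the single description file, regenerating the stubs after a change keeps both sides of the protocol consistent, which is the point made in the abstract.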

  2. Acceleration of Radiance for Lighting Simulation by Using Parallel Computing with OpenCL

    Energy Technology Data Exchange (ETDEWEB)

    Zuo, Wangda; McNeil, Andrew; Wetter, Michael; Lee, Eleanor

    2011-09-06

    We report on the acceleration of annual daylighting simulations for fenestration systems in the Radiance ray-tracing program. The algorithm was optimized to reduce both the redundant data input/output operations and the floating-point operations. To further accelerate the simulation speed, the calculation for matrix multiplications was implemented using parallel computing on a graphics processing unit. We used OpenCL, which is a cross-platform parallel programming language. Numerical experiments show that the combination of the above measures can speed up the annual daylighting simulations 101.7 times or 28.6 times when the sky vector has 146 or 2306 elements, respectively.
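
The accelerated step reduces to a matrix multiplication; a shape-level sketch follows (only the 146-patch sky dimension comes from the abstract; the sensor count, hour count, and values are random stand-ins):

```python
import numpy as np

# Annual daylighting as one batched matrix multiply: a (sensors x patches)
# daylight-coefficient matrix times a (patches x hours) stack of sky vectors
# yields all hourly illuminances at once - the operation offloaded to the
# GPU via OpenCL in the paper.
rng = np.random.default_rng(0)
n_sensors, n_patches, n_hours = 100, 146, 8760
T = rng.random((n_sensors, n_patches))   # daylight coefficient matrix
S = rng.random((n_patches, n_hours))     # one sky vector per hour of the year
E = T @ S                                # hourly illuminance at every sensor
```

Stacking all 8760 sky vectors into one multiply is what makes the workload large, regular, and GPU-friendly.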

  3. Plasma accelerators

    International Nuclear Information System (INIS)

    Bingham, R.; Angelis, U. de; Johnston, T.W.

    1991-01-01

    Recently attention has focused on charged particle acceleration in a plasma by a fast, large amplitude, longitudinal electron plasma wave. The plasma beat wave and plasma wakefield accelerators are two efficient ways of producing ultra-high accelerating gradients. Starting with the plasma beat wave accelerator (PBWA), laser wakefield accelerator (LWFA), and plasma wakefield accelerator (PWFA) schemes, steady progress has been made in theory, simulations, and experiments. Computations are presented for the study of the LWFA. (author)

  4. Acceleration of Cherenkov angle reconstruction with the new Intel Xeon/FPGA compute platform for the particle identification in the LHCb Upgrade

    Science.gov (United States)

    Faerber, Christian

    2017-10-01

    The LHCb experiment at the LHC will upgrade its detector by 2018/2019 to a ‘triggerless’ readout scheme, in which all the readout electronics and several sub-detector parts will be replaced. The new readout electronics will be able to read out the detector at 40 MHz. This increases the data bandwidth from the detector down to the Event Filter farm to 40 TBit/s, which must also be processed to select the interesting proton-proton collisions for later storage. The architecture of a computing farm that can process this amount of data as efficiently as possible is a challenging task, and several compute accelerator technologies are being considered for use inside the new Event Filter farm. In the high performance computing sector, more and more FPGA compute accelerators are used to improve compute performance and reduce power consumption (e.g. in the Microsoft Catapult project and the Bing search engine). For the LHCb upgrade, the use of an experimental FPGA-accelerated computing platform in the Event Building or in the Event Filter farm is likewise being considered and therefore tested. This platform from Intel hosts a general CPU and a high performance FPGA linked via a high speed link, which for this platform is a QPI link; an accelerator is implemented on the FPGA. The system used is a two-socket platform from Intel with a Xeon CPU and an FPGA. The FPGA has cache-coherent memory access to the main memory of the server and can collaborate with the CPU. As a first step, a computing-intensive algorithm to reconstruct Cherenkov angles for the LHCb RICH particle identification was successfully ported in Verilog to the Intel Xeon/FPGA platform and accelerated by a factor of 35. The same algorithm was ported to the Intel Xeon/FPGA platform with OpenCL, and the implementation work and the performance will be compared. Another FPGA accelerator, the Nallatech 385 PCIe accelerator with the same Stratix V FPGA, was also tested for performance. The results show that the Intel

  5. Computing Principal Eigenvectors of Large Web Graphs: Algorithms and Accelerations Related to PageRank and HITS

    Science.gov (United States)

    Nagasinghe, Iranga

    2010-01-01

    This thesis investigates and develops a few acceleration techniques for the search engine algorithms used in PageRank and HITS computations. PageRank and HITS methods are two highly successful applications of modern Linear Algebra in computer science and engineering. They constitute the essential technologies accounted for the immense growth and…
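
As background, the baseline PageRank power iteration that such acceleration techniques target can be sketched as follows (the 4-node link matrix is an invented example, not from the thesis):

```python
import numpy as np

# Minimal PageRank power iteration. A[i, j] = 1 means page j links to
# page i; columns are normalized to make the link matrix stochastic.
def pagerank(A, d=0.85, tol=1e-12):
    n = A.shape[0]
    M = A / A.sum(axis=0)                    # column-stochastic link matrix
    r = np.full(n, 1.0 / n)                  # uniform starting vector
    while True:
        r_new = (1.0 - d) / n + d * (M @ r)  # damped power-iteration step
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

A = np.array([[0, 0, 1, 0],
              [1, 0, 0, 1],
              [1, 1, 0, 1],
              [0, 1, 0, 0]], dtype=float)
r = pagerank(A)
```

The convergence rate of this loop is governed by the second eigenvalue of the damped matrix, which is precisely what extrapolation-style accelerations exploit.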

  6. Concepts and techniques: Active electronics and computers in safety-critical accelerator operation

    International Nuclear Information System (INIS)

    Frankel, R.S.

    1995-01-01

    The Relativistic Heavy Ion Collider (RHIC), under construction at Brookhaven National Laboratory, requires an extensive Access Control System to protect personnel from radiation, oxygen deficiency and electrical hazards. In addition, the complicated nature of operation of the Collider as part of a complex of other accelerators necessitates the use of active electronic measurement circuitry to ensure compliance with established Operational Safety Limits. Solutions were devised which permit the use of modern computer and interconnection technology for safety-critical applications, while preserving and enhancing tried and proven protection methods. In addition, a set of Guidelines regarding required performance for Accelerator Safety Systems and a Handbook of design criteria and rules were developed to assist future system designers and to provide a framework for internal review and regulation.

  7. Concepts and techniques: Active electronics and computers in safety-critical accelerator operation

    Energy Technology Data Exchange (ETDEWEB)

    Frankel, R.S.

    1995-12-31

    The Relativistic Heavy Ion Collider (RHIC), under construction at Brookhaven National Laboratory, requires an extensive Access Control System to protect personnel from radiation, oxygen-deficiency, and electrical hazards. In addition, the complicated nature of operating the Collider as part of a complex of other accelerators necessitates the use of active electronic measurement circuitry to ensure compliance with established Operational Safety Limits. Solutions were devised which permit the use of modern computer and interconnection technology for safety-critical applications, while preserving and enhancing tried-and-proven protection methods. In addition, a set of Guidelines regarding required performance for Accelerator Safety Systems and a Handbook of design criteria and rules were developed to assist future system designers and to provide a framework for internal review and regulation.

  8. Program for computing inhomogeneous coaxial resonators and accelerating systems of the U-400 and ITs-100 cyclotrons

    International Nuclear Information System (INIS)

    Gul'bekyan, G.G.; Ivanov, Eh.L.

    1987-01-01

    The ''Line'' computer code for computing inhomogeneous coaxial resonators is described. The results obtained for the resonators of the U-400 cyclotron made it possible to increase the energy of accelerated ions up to 27 MeV/nucl. The computations for the ITs-100 cyclic implanter gave the opportunity to build a compact design with a low value of consumed RF power.

  9. FPGA hardware acceleration for high performance neutron transport computation based on agent methodology - 318

    International Nuclear Information System (INIS)

    Shanjie, Xiao; Tatjana, Jevremovic

    2010-01-01

    The accurate, detailed, 3D neutron transport analysis for Gen-IV reactors is still time-consuming regardless of the advanced computational hardware available in developed countries. This paper introduces a new concept for addressing the computational time while preserving detailed and accurate modeling: a specifically designed FPGA co-processor accelerates the robust AGENT methodology for complex reactor geometries. For the first time this approach is applied to accelerate neutronics analysis. The AGENT methodology solves the neutron transport equation using the method of characteristics. The performance of the AGENT methodology was carefully analyzed before the hardware design based on the FPGA co-processor was adopted. The most time-consuming kernel is then moved into the FPGA co-processor. The FPGA co-processor is designed with a data-flow-driven, non-von-Neumann architecture and has much higher efficiency than the conventional computer architecture. Details of the FPGA co-processor design are introduced, and the design is benchmarked using two different examples. The advanced chip architecture helps the FPGA co-processor achieve a more-than-20-times speedup even though its working frequency is much lower than the CPU frequency. (authors)

  10. Computing Nash equilibria through computational intelligence methods

    Science.gov (United States)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, and differential evolution, in computing Nash equilibria of finite strategic games as global minima of a real-valued, nonnegative function. An issue of particular interest is detecting more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
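To make the "Nash equilibria as global minima of a nonnegative function" formulation concrete, here is a minimal sketch using a hand-rolled differential evolution on matching pennies. The regret-based objective, the game, and all optimizer parameters are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Matching pennies: unique mixed equilibrium at p = q = 0.5.
A = np.array([[1.0, -1.0], [-1.0, 1.0]])   # row player's payoffs
B = -A                                      # column player's payoffs

def nash_error(v):
    """Nonnegative function whose global minima (value 0) are Nash equilibria."""
    p, q = v
    x = np.array([p, 1 - p])
    y = np.array([q, 1 - q])
    u1, u2 = x @ A @ y, x @ B @ y
    r1 = np.maximum(0.0, A @ y - u1)   # row player's regret per pure strategy
    r2 = np.maximum(0.0, x @ B - u2)   # column player's regret per pure strategy
    return np.sum(r1 ** 2) + np.sum(r2 ** 2)

def differential_evolution(f, bounds, pop_size=20, gens=150, F=0.8, CR=0.9):
    """Basic DE/rand/1/bin; one of the three CI methods the paper considers."""
    dim = len(bounds)
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([f(v) for v in pop])
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True    # guarantee at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            f_trial = f(trial)
            if f_trial <= fit[i]:
                pop[i], fit[i] = trial, f_trial
    best = np.argmin(fit)
    return pop[best], fit[best]
```

Detecting further equilibria would then proceed, as in the paper, by multistart or by deflecting the objective around minima already found.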

  11. GPU-accelerated Lattice Boltzmann method for anatomical extraction in patient-specific computational hemodynamics

    Science.gov (United States)

    Yu, H.; Wang, Z.; Zhang, C.; Chen, N.; Zhao, Y.; Sawchuk, A. P.; Dalsing, M. C.; Teague, S. D.; Cheng, Y.

    2014-11-01

    Existing research on patient-specific computational hemodynamics (PSCH) relies heavily on software for the anatomical extraction of blood arteries. Data reconstruction and mesh generation have to be done with existing commercial software because of the gap between medical image processing and CFD; this increases the computational burden and introduces inaccuracy during data transformation, thus limiting the medical applications of PSCH. We use the lattice Boltzmann method (LBM) to solve the level-set equation over an Eulerian distance field and implicitly and dynamically segment the artery surfaces from radiological CT/MRI imaging data. The segments feed seamlessly into the LBM-based CFD computation of PSCH, so explicit mesh construction and extra data management are avoided. The LBM is ideally suited for GPU (graphics processing unit)-based parallel computing, and the parallel acceleration over GPU achieves excellent performance in PSCH computation. An application study will be presented which segments an aortic artery from a chest CT dataset and models the PSCH of the segmented artery.

  12. Rigorous bounds on survival times in circular accelerators and efficient computation of fringe-field transfer maps

    International Nuclear Information System (INIS)

    Hoffstaetter, G.H.

    1994-12-01

    Analyzing the stability of particle motion in storage rings contributes to the general field of stability analysis in weakly nonlinear motion. A method which we call pseudo-invariant estimation (PIE) is used to compute lower bounds on the survival time in circular accelerators. The pseudo-invariants needed for this approach are computed via nonlinear perturbative normal form theory, and the required global maxima of the highly complicated multivariate functions could only be rigorously bounded with an extension of interval arithmetic. The bounds on the survival times are large enough to be relevant; the same is true for the lower bounds on dynamical apertures, which can also be computed. The PIE method can lead to novel design criteria with the objective of maximizing the survival time. A major effort in the direction of rigorous predictions only makes sense if accurate models of accelerators are available. Fringe fields often have a significant influence on optical properties, but the computation of fringe-field maps by DA-based integration is slower by several orders of magnitude than DA evaluation of the propagator for main-field maps. A novel computation of fringe-field effects called symplectic scaling (SYSCA) is introduced. It exploits the advantages of Lie transformations, generating functions, and scaling properties and is extremely accurate. The computation of fringe-field maps is typically made nearly two orders of magnitude faster. (orig.)

  13. Smolyak's algorithm: A powerful black box for the acceleration of scientific computations

    KAUST Repository

    Tempone, Raul; Wolfers, Soeren

    2017-01-01

    We provide a general discussion of Smolyak's algorithm for the acceleration of scientific computations. The algorithm first appeared in Smolyak's work on multidimensional integration and interpolation. Since then, it has been generalized in multiple directions and has been associated with the keywords: sparse grids, hyperbolic cross approximation, combination technique, and multilevel methods. Variants of Smolyak's algorithm have been employed in the computation of high-dimensional integrals in finance, chemistry, and physics, in the numerical solution of partial and stochastic differential equations, and in uncertainty quantification. Motivated by this broad and ever-increasing range of applications, we describe a general framework that summarizes fundamental results and assumptions in a concise application-independent manner.
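One of the keywords above, the combination technique, admits a compact sketch for 2-D integration. The following is a minimal illustration under stated assumptions (trapezoidal component rules on [0,1] and a fixed level; neither is taken from the paper, which treats the algorithm in full generality).

```python
import numpy as np

def trap_rule(level):
    """1-D trapezoidal rule on [0, 1] with 2**level + 1 points."""
    n = 2 ** level + 1
    x = np.linspace(0.0, 1.0, n)
    w = np.full(n, 1.0 / 2 ** level)
    w[0] *= 0.5
    w[-1] *= 0.5
    return x, w

def smolyak_2d(f, L):
    """Level-L Smolyak quadrature of f on [0,1]^2 via the combination technique.

    In two dimensions the combination coefficients reduce to +1 on the
    diagonal |l| = L of level multi-indices and -1 on |l| = L - 1.
    """
    total = 0.0
    for levels, coeff in [(L, 1.0), (L - 1, -1.0)]:
        for l1 in range(levels + 1):
            l2 = levels - l1
            x1, w1 = trap_rule(l1)
            x2, w2 = trap_rule(l2)
            X1, X2 = np.meshgrid(x1, x2, indexing="ij")
            total += coeff * np.sum(np.outer(w1, w2) * f(X1, X2))
    return total
```

Only anisotropic grids with l1 + l2 ≤ L are ever evaluated, which is the source of the cost reduction relative to the full tensor grid of the same resolution.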

  14. Smolyak's algorithm: A powerful black box for the acceleration of scientific computations

    KAUST Repository

    Tempone, Raul

    2017-03-26

    We provide a general discussion of Smolyak's algorithm for the acceleration of scientific computations. The algorithm first appeared in Smolyak's work on multidimensional integration and interpolation. Since then, it has been generalized in multiple directions and has been associated with the keywords: sparse grids, hyperbolic cross approximation, combination technique, and multilevel methods. Variants of Smolyak's algorithm have been employed in the computation of high-dimensional integrals in finance, chemistry, and physics, in the numerical solution of partial and stochastic differential equations, and in uncertainty quantification. Motivated by this broad and ever-increasing range of applications, we describe a general framework that summarizes fundamental results and assumptions in a concise application-independent manner.

  15. Distribution of computer functionality for accelerator control at the Brookhaven AGS

    International Nuclear Information System (INIS)

    Stevens, A.; Clifford, T.; Frankel, R.

    1985-01-01

    A set of physical and functional system components and their interconnection protocols have been established for all controls work at the AGS. Portions of these designs were tested as part of enhanced operation of the AGS as a source of polarized protons, and additional segments will be implemented during the continuing construction efforts which are adding heavy-ion capability to our facility. Our efforts include the following computer and control system elements: a broadband local area network, which embodies modems, transmission systems, and branch interface units; a hierarchical layer, which performs certain database and watchdog/alarm functions; a group of workstation processors (Apollos), which perform the function of traditional minicomputer host(s); and a layer which provides both real-time control and standardization functions for accelerator devices and instrumentation. Database and other accelerator functionality is assigned to the most appropriate level within our network for real-time performance, long-term utility, and orderly growth.

  16. Product-market differentiation: a strategic planning model for community hospitals.

    Science.gov (United States)

    Milch, R A

    1980-01-01

    Community hospitals would seem to have every reason to identify and capitalize on their product-market strengths. The strategic marketing/planning model provides a framework for rational analysis of the community hospital dilemma and for developing sensible solutions to the complex problems of accelerating hospital price-inflation.

  17. Accelerating phylogenetics computing on the desktop: experiments with executing UPGMA in programmable logic.

    Science.gov (United States)

    Davis, J P; Akella, S; Waddell, P H

    2004-01-01

    Having greater computational power on the desktop for processing taxa data sets has been a dream of biologists and statisticians involved in phylogenetics data analysis. Many existing algorithms have been highly optimized; one example is Felsenstein's PHYLIP code, written in C, for the UPGMA and neighbor-joining algorithms. However, processing more than a few tens of taxa in a reasonable amount of time using conventional computers has not yielded a satisfactory speedup, making it difficult for phylogenetics practitioners to quickly explore data sets, such as might be done from a laptop computer. We discuss the application of custom computing techniques to phylogenetics. In particular, we apply this technology to speed up UPGMA algorithm execution by a factor of a hundred over PHYLIP code running on the same PC. We report on these experiments and discuss how custom computing techniques can be used to accelerate phylogenetics algorithm performance not only on the desktop but also on larger, high-performance computing engines, thus enabling the high-speed processing of data sets involving thousands of taxa.
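For reference, the UPGMA algorithm that the custom hardware accelerates can be sketched in a few lines of plain Python. This is a textbook size-weighted average-linkage implementation, not the PHYLIP or FPGA code.

```python
def upgma(D, names):
    """UPGMA: repeatedly merge the two closest clusters.

    Distances to the merged cluster are averaged, weighted by cluster size;
    each merge is recorded as (left_label, right_label, ultrametric_height),
    where the height is half the inter-cluster distance at the merge.
    """
    clusters = {i: (names[i], 1) for i in range(len(names))}   # id -> (label, size)
    d = {(i, j): float(D[i][j])
         for i in range(len(names)) for j in range(i + 1, len(names))}
    merges, nxt = [], len(names)
    while len(clusters) > 1:
        a, b = min(d, key=d.get)                 # closest pair of cluster ids
        height = d.pop((a, b)) / 2.0
        (la, na), (lb, nb) = clusters.pop(a), clusters.pop(b)
        merges.append((la, lb, height))
        for c in clusters:                       # size-weighted average distance
            dac = d.pop((min(a, c), max(a, c)))
            dbc = d.pop((min(b, c), max(b, c)))
            d[(min(nxt, c), max(nxt, c))] = (na * dac + nb * dbc) / (na + nb)
        clusters[nxt] = (f"({la},{lb})", na + nb)
        nxt += 1
    return merges
```

The O(n^3) scan for the closest pair dominates the running time, which is precisely the kernel that profits from parallel hardware.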

  18. A new 3-D integral code for computation of accelerator magnets

    International Nuclear Information System (INIS)

    Turner, L.R.; Kettunen, L.

    1991-01-01

    For computing accelerator magnets, integral codes have several advantages over finite-element codes: far-field boundaries are treated automatically, and computed fields in the bore region satisfy Maxwell's equations exactly. A new integral code employing edge elements rather than nodal elements has overcome the difficulties associated with earlier integral codes. By the use of field integrals (potential differences) as solution variables, the number of unknowns is reduced to one less than the number of nodes. Two examples, a hollow iron sphere and the dipole magnet of the Advanced Photon Source injector synchrotron, show the capability of the code. The CPU time requirements are comparable to those of three-dimensional (3-D) finite-element codes. Experiments show that in practice it can realize much of the potential CPU time saving that parallel processing makes possible. 8 refs., 4 figs., 1 tab.

  19. Strategic Culture: the Concept and the Directions of Research

    Directory of Open Access Journals (Sweden)

    Эдуард Николаевич Ожиганов

    2012-06-01

    Full Text Available Defining and assessing the political qualification of ruling groups, and producing long-term prognoses of their activities, is a paramount task of strategic analysis. Ruling groups have their own interests, and their strategic manipulation of those interests (whether successful or poor) constitutes an important part of their game behavior, whose effectiveness over defined periods is more or less computable. The game behavior of ruling groups in turn depends on the characteristics of strategic culture. This link becomes evident under comparative analysis.

  20. Continuous Analog of Accelerated OS-EM Algorithm for Computed Tomography

    Directory of Open Access Journals (Sweden)

    Kiyoko Tateishi

    2017-01-01

    Full Text Available The maximum-likelihood expectation-maximization (ML-EM) algorithm is used for iterative image reconstruction (IIR) and performs well with respect to the inverse problem as cross-entropy minimization in computed tomography. For accelerating the convergence rate of the ML-EM, the ordered-subsets expectation-maximization (OS-EM) with a power factor is effective. In this paper, we propose a continuous analog of the power-based accelerated OS-EM algorithm. The continuous-time image reconstruction (CIR) system is described by nonlinear differential equations with piecewise smooth vector fields defined by a cyclic switching process. A numerical discretization of the differential equation using the geometric multiplicative first-order expansion of the nonlinear vector field leads to an iterative formula exactly equivalent to the power-based OS-EM. The convergence of nonnegatively constrained solutions to a globally stable equilibrium is guaranteed by the Lyapunov theorem for consistent inverse problems. We illustrate through numerical experiments that the convergence characteristics of the continuous system are the best among those of the discretization methods. We clarify how closely the discretization method must approximate the solution of the CIR in order to design a better IIR method.
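The underlying ML-EM update that OS-EM accelerates can be sketched as follows. This is a textbook implementation on a toy system, not the paper's CIR formulation; OS-EM would apply the same multiplicative update cyclically to row-subsets of the system, and the power factor would be applied as an exponent on the update ratio.

```python
import numpy as np

def ml_em(A, y, iters=100):
    """ML-EM iterations for the nonnegative linear model y = A x.

    Multiplicative update: x <- x * A^T(y / Ax) / A^T 1, which preserves
    nonnegativity and, for consistent data, converges to a solution of y = A x.
    """
    x = np.ones(A.shape[1])                # strictly positive starting image
    sens = A.T @ np.ones(A.shape[0])       # sensitivity image A^T 1
    for _ in range(iters):
        x = x * (A.T @ (y / (A @ x))) / sens
    return x

# Toy system: three measurements of a two-pixel image.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
x_rec = ml_em(A, A @ np.array([2.0, 3.0]))
```

On this consistent toy problem the iteration converges geometrically to the exact image, which is the discrete behavior whose continuous-time analog the paper studies.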

  1. Improving Quality of Service and Reducing Power Consumption with WAN accelerator in Cloud Computing Environments

    OpenAIRE

    Shin-ichi Kuribayashi

    2013-01-01

    The widespread use of cloud computing services is expected to degrade the Quality of Service and to increase the power consumption of ICT devices, since the distance to a server becomes longer than before. Migration of virtual machines over a wide area can solve many problems such as load balancing and power saving in cloud computing environments. This paper proposes to dynamically apply a WAN accelerator within the network when a virtual machine is moved to a distant center, in order to preve...

  2. Quantum optical device accelerating dynamic programming

    OpenAIRE

    Grigoriev, D.; Kazakov, A.; Vakulenko, S.

    2005-01-01

    In this paper we discuss analogue computers based on quantum optical systems that accelerate dynamic programming for some computational problems. These computers, at least in principle, can be realized by actually existing devices. We estimate the acceleration in solving some NP-hard problems that can be obtained in this way versus deterministic computers.

  3. Micro-computer based control system and its software for carrying out the sequential acceleration on SMCAMS

    International Nuclear Information System (INIS)

    Li Deming

    2001-01-01

    The micro-computer based control system and its software for carrying out sequential acceleration on SMCAMS are described. The establishment of the 14C particle measuring device and the improvement of the original power supply system are also described.

  4. Commissioning the GTA accelerator

    International Nuclear Information System (INIS)

    Sander, O.R.; Atkins, W.H.; Bolme, G.O.; Bowling, S.; Brown, S.; Cole, R.; Gilpatrick, J.D.; Garnett, R.; Guy, F.W.; Ingalls, W.B.; Johnson, K.F.; Kerstiens, D.; Little, C.; Lohsen, R.A.; Lloyd, S.; Lysenko, W.P.; Mottershead, C.T.; Neuschaefer, G.; Power, J.; Rusthoi, D.P.; Sandoval, D.P. Stevens, R.R. Jr.; Vaughn, G.; Wadlinger, E.A.; Yuan, V.; Connolly, R.; Weiss, R.; Saadatmand, K.

    1992-01-01

    The Ground Test Accelerator (GTA) is supported by the Strategic Defense Command as part of its Neutral Particle Beam (NPB) program. Neutral particles have the advantage that in space they are unaffected by the earth's magnetic field and travel in straight lines unless they enter the earth's atmosphere and become charged by stripping. Heavy particles are difficult to stop and can probe the interior of space vehicles; hence, NPB can function as a discriminator between warheads and decoys. We are using GTA to resolve the physics and engineering issues related to accelerating, focusing, and steering a high-brightness, high-current H- beam and then neutralizing it. Our immediate goal is to produce a 24-MeV, 50-mA device with a 2% duty factor.

  5. Strategic Control in Decision Making under Uncertainty

    Science.gov (United States)

    Venkatraman, Vinod; Huettel, Scott

    2012-01-01

    Complex economic decisions – whether investing money for retirement or purchasing some new electronic gadget – often involve uncertainty about the likely consequences of our choices. Critical for resolving that uncertainty are strategic meta-decision processes, which allow people to simplify complex decision problems, to evaluate outcomes against a variety of contexts, and to flexibly match behavior to changes in the environment. In recent years, substantial research implicates the dorsomedial prefrontal cortex (dmPFC) in the flexible control of behavior. However, nearly all such evidence comes from paradigms involving executive function or response selection, not complex decision making. Here, we review evidence that demonstrates that the dmPFC contributes to strategic control in complex decision making. This region contains a functional topography such that the posterior dmPFC supports response-related control while the anterior dmPFC supports strategic control. Activation in the anterior dmPFC signals changes in how a decision problem is represented, which in turn can shape computational processes elsewhere in the brain. Based on these findings, we argue both for generalized contributions of the dmPFC to cognitive control, and for specific computational roles for its subregions depending upon the task demands and context. We also contend that these strategic considerations are also likely to be critical for decision making in other domains, including interpersonal interactions in social settings. PMID:22487037

  6. Collective ion acceleration

    International Nuclear Information System (INIS)

    Godfrey, B.B.; Faehl, R.J.; Newberger, B.S.; Shanahan, W.R.; Thode, L.E.

    1977-01-01

    Progress achieved in the understanding and development of collective ion acceleration is presented. Extensive analytic and computational studies of slow cyclotron wave growth on an electron beam in a helix amplifier were performed. Research included precise determination of linear coupling between beam and helix, suppression of undesired transients and end effects, and two-dimensional simulations of wave growth in physically realizable systems. Electrostatic well depths produced exceed requirements for the Autoresonant Ion Acceleration feasibility experiment. Acceleration of test ions to modest energies in the troughs of such waves was also demonstrated. Smaller efforts were devoted to alternative acceleration mechanisms. Langmuir wave phase velocity in Converging Guide Acceleration was calculated as a function of the ratio of electron beam current to space-charge limiting current. A new collective acceleration approach, in which cyclotron wave phase velocity is varied by modulation of electron beam voltage, is proposed. Acceleration by traveling Virtual Cathode or Localized Pinch was considered, but appears less promising. In support of this research, fundamental investigations of beam propagation in evacuated waveguides, of nonneutral beam linear eigenmodes, and of beam stability were carried out. Several computer programs were developed or enhanced. Plans for future work are discussed

  7. Acceleration of the nodal program FERM

    International Nuclear Information System (INIS)

    Nakata, H.

    1985-01-01

    Acceleration of the nodal program FERM was attempted with three acceleration schemes. The results of the calculations showed the best acceleration with the Tchebyshev method, where the savings in computing time were of the order of 50%. Acceleration with the Asymptotic Source Extrapolation Method and with the Coarse-Mesh Rebalancing Method did not improve the overall computing time, although a reduction in the number of outer iterations was observed. (Author) [pt

  8. Strategic Reading, Ontologies, and the Future of Scientific Publishing

    Science.gov (United States)

    Renear, Allen H.; Palmer, Carole L.

    2009-08-01

    The revolution in scientific publishing that has been promised since the 1980s is about to take place. Scientists have always read strategically, working with many articles simultaneously to search, filter, scan, link, annotate, and analyze fragments of content. An observed recent increase in strategic reading in the online environment will soon be further intensified by two current trends: (i) the widespread use of digital indexing, retrieval, and navigation resources and (ii) the emergence within many scientific disciplines of interoperable ontologies. Accelerated and enhanced by reading tools that take advantage of ontologies, reading practices will become even more rapid and indirect, transforming the ways in which scientists engage the literature and shaping the evolution of scientific publishing.

  9. Biomaterials and computation: a strategic alliance to investigate emergent responses of neural cells.

    Science.gov (United States)

    Sergi, Pier Nicola; Cavalcanti-Adam, Elisabetta Ada

    2017-03-28

    Topographical and chemical cues drive migration, outgrowth and regeneration of neurons in different and crucial biological conditions. In the natural extracellular matrix, their influences are so closely coupled that they result in complex cellular responses. As a consequence, engineered biomaterials are widely used to simplify in vitro conditions, disentangling intricate in vivo behaviours, and narrowing the investigation on particular emergent responses. Nevertheless, how topographical and chemical cues affect the emergent response of neural cells is still unclear, thus in silico models are used as additional tools to reproduce and investigate the interactions between cells and engineered biomaterials. This work aims at presenting the synergistic use of biomaterials-based experiments and computation as a strategic way to promote the discovery of complex neural responses as well as to allow the interactions between cells and biomaterials to be quantitatively investigated, fostering a rational design of experiments.

  10. Cognitive Characteristics of Strategic and Non-strategic Gamblers.

    Science.gov (United States)

    Mouneyrac, Aurélie; Lemercier, Céline; Le Floch, Valérie; Challet-Bouju, Gaëlle; Moreau, Axelle; Jacques, Christian; Giroux, Isabelle

    2018-03-01

    Participation in strategic and non-strategic games is mostly explained in the literature by gender: men gamble on strategic games, while women gamble on non-strategic games. However, little is known about the underlying cognitive factors that could also distinguish strategic and non-strategic gamblers. We suggest that cognitive style and need for cognition also explain participation in gambling subtypes. From a dual-process perspective, cognitive style is the tendency to reject or accept the fast, automatic answer that comes immediately in response to a problem. Individuals that preferentially reject the automatic response use an analytic style, which suggest processing information in a slow way, with deep treatment. The intuitive style supposes a reliance on fast, automatic answers. The need for cognition provides a motivation to engage in effortful activities. One hundred and forty-nine gamblers (53 strategic and 96 non-strategic) answered the Cognitive Reflection Test, Need For Cognition Scale, and socio-demographic questions. A logistic regression was conducted to evaluate the influence of gender, cognitive style and need for cognition on participation in strategic and non-strategic games. Our results show that a model with both gender and cognitive variables is more accurate than a model with gender alone. Analytic (vs. intuitive) style, high (vs. low) need for cognition and being male (vs. female) are characteristics of strategic gamblers (vs. non-strategic gamblers). This study highlights the importance of considering the cognitive characteristics of strategic and non-strategic gamblers in order to develop preventive campaigns and treatments that fit the best profiles for gamblers.

  11. Acceleration of the FERM nodal program

    International Nuclear Information System (INIS)

    Nakata, H.

    1985-01-01

    Three acceleration methods were tested in an attempt to reduce the number of outer iterations in the FERM nodal program. The results obtained indicated that the Chebyshev polynomial acceleration method with variable degree yields a saving of 50% in computer time. The acceleration methods based on asymptotic source extrapolation and on zonal rebalancing did not reduce the overall computer time, although some acceleration of the outer iterations was observed. (M.C.K.) [pt
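The Chebyshev idea can be illustrated with the classical Chebyshev semi-iterative method for a linear fixed-point iteration x_{k+1} = M x_k + c. This is a textbook sketch under stated assumptions (M symmetric with real spectrum bounded by rho), not the FERM implementation, and the variable-degree aspect is omitted.

```python
import numpy as np

def chebyshev_accelerated(M, c, rho, iters):
    """Chebyshev semi-iterative acceleration of x_{k+1} = M x_k + c.

    rho is (an upper bound on) the spectral radius of M.  The error after k
    steps is damped by 1/T_k(1/rho) instead of rho**k for the plain iteration.
    """
    x_prev = np.zeros_like(c)
    x = M @ x_prev + c                 # first step is a plain iteration
    omega = 2.0                        # omega_1; thereafter the recurrence below
    for _ in range(iters - 1):
        omega = 1.0 / (1.0 - 0.25 * rho * rho * omega)
        x, x_prev = omega * (M @ x + c) + (1.0 - omega) * x_prev, x
    return x
```

For the Jacobi splitting of A = [[2, 1], [1, 2]] with b = [3, 3] (iteration matrix spectral radius 0.5), twenty accelerated steps land several orders of magnitude closer to the solution [1, 1] than twenty plain steps, mirroring the outer-iteration savings reported above.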

  12. STRATEGIC PARTNERSHIP OF UKRAINE: DECLARATIONS AND REALITIES

    Directory of Open Access Journals (Sweden)

    Nataliya Demchenko

    2015-11-01

    Full Text Available Strategic partnership is a higher level of cooperation than conventional relationships. Conditioned by the specific interests of the parties, such cooperation is possible between partners who have no mutual territorial claims and share a commitment to territorial integrity. Although Ukraine has declared about 20 states to be its "strategic partners", with many of them it does not even have a simple partnership and cooperation, and they are often not states whose national interests in strategic areas correspond to the current interests of Ukraine. It should be noted that today, among the countries declared strategic partners of Ukraine, not all support its national interests. Ukraine, having emerged as an independent state, began to use new methods of international cooperation without an adequately developed strategy for their use. Some problems facing the country can be solved; others must be taken into account in determining its development strategy. The subject of the research is therefore the global and specific problems of economic security and partnership of Ukraine in modern conditions. The objective of the paper is to study options for strategic partnership of Ukraine by improving the institutional mechanism for coordinating integration processes. The article is based on studies by foreign and domestic scientists. Practical implications: the formation of effective international cooperation of Ukraine in the context of globalization and the choice of strategic partners on the basis of mutually beneficial cooperation. Results: an analysis of Ukraine's cooperation with Russia; the features of the largest modern regional associations; a reasoned case for the objective need for Ukraine's integration into regional associations; and recommendations on the measures needed to accelerate the deepening of Ukraine's integration with the EU.

  13. Electromagnetic modeling in accelerator designs

    International Nuclear Information System (INIS)

    Cooper, R.K.; Chan, K.C.D.

    1990-01-01

    Through the years, electromagnetic modeling using computers has proved to be a cost-effective tool for accelerator design. Traditionally, electromagnetic modeling of accelerators was limited to resonator and magnet designs in two dimensions. In recent years, with the availability of powerful computers, electromagnetic modeling of accelerators has advanced significantly. Through the above conferences, it is apparent that breakthroughs have been made during the last decade in two important areas: three-dimensional modeling and time-domain simulation. Success in both of these areas has been made possible by the increasing size and speed of computers. In this paper, the advances in these two areas will be described.

  14. Strategic control in decision-making under uncertainty.

    Science.gov (United States)

    Venkatraman, Vinod; Huettel, Scott A

    2012-04-01

    Complex economic decisions - whether investing money for retirement or purchasing some new electronic gadget - often involve uncertainty about the likely consequences of our choices. Critical for resolving that uncertainty are strategic meta-decision processes, which allow people to simplify complex decision problems, evaluate outcomes against a variety of contexts, and flexibly match behavior to changes in the environment. In recent years, substantial research has implicated the dorsomedial prefrontal cortex (dmPFC) in the flexible control of behavior. However, nearly all such evidence comes from paradigms involving executive function or response selection, not complex decision-making. Here, we review evidence that demonstrates that the dmPFC contributes to strategic control in complex decision-making. This region contains a functional topography such that the posterior dmPFC supports response-related control, whereas the anterior dmPFC supports strategic control. Activation in the anterior dmPFC signals changes in how a decision problem is represented, which in turn can shape computational processes elsewhere in the brain. Based on these findings, we argue for both generalized contributions of the dmPFC to cognitive control, and specific computational roles for its subregions depending upon the task demands and context. We also contend that these strategic considerations are likely to be critical for decision-making in other domains, including interpersonal interactions in social settings. © 2012 The Authors. European Journal of Neuroscience © 2012 Federation of European Neuroscience Societies and Blackwell Publishing Ltd.

  15. ARCHER, a new Monte Carlo software tool for emerging heterogeneous computing environments

    International Nuclear Information System (INIS)

    Xu, X. George; Liu, Tianyu; Su, Lin; Du, Xining; Riblett, Matthew; Ji, Wei; Gu, Deyang; Carothers, Christopher D.; Shephard, Mark S.; Brown, Forrest B.; Kalra, Mannudeep K.; Liu, Bob

    2015-01-01

    Highlights: • A fast Monte Carlo based radiation transport code, ARCHER, was developed. • ARCHER supports different hardware including CPUs, GPUs and the Intel Xeon Phi coprocessor. • The code is benchmarked against MCNP for medical applications. • A typical CT scan dose simulation takes only 6.8 s on an NVIDIA M2090 GPU. • GPU and coprocessor-based codes are 5–8 times faster than the CPU-based codes. - Abstract: The Monte Carlo radiation transport community faces a number of challenges associated with peta- and exa-scale computing systems that rely increasingly on heterogeneous architectures involving hardware accelerators such as GPUs and Xeon Phi coprocessors. Existing Monte Carlo codes and methods must be strategically upgraded to meet emerging hardware and software needs. In this paper, we describe the development of a software package called ARCHER (Accelerated Radiation-transport Computations in Heterogeneous EnviRonments), which is designed as a versatile testbed for future Monte Carlo codes. Preliminary results from five projects in nuclear engineering and medical physics are presented.

  16. Strategic Adaptation

    DEFF Research Database (Denmark)

    Andersen, Torben Juul

    2015-01-01

    This article provides an overview of theoretical contributions that have influenced the discourse around strategic adaptation including contingency perspectives, strategic fit reasoning, decision structure, information processing, corporate entrepreneurship, and strategy process. The related...... concepts of strategic renewal, dynamic managerial capabilities, dynamic capabilities, and strategic response capabilities are discussed and contextualized against strategic responsiveness. The insights derived from this article are used to outline the contours of a dynamic process of strategic adaptation....... This model incorporates elements of central strategizing, autonomous entrepreneurial behavior, interactive information processing, and open communication systems that enhance the organization's ability to observe exogenous changes and respond effectively to them....

  17. TU-FG-201-04: Computer Vision in Autonomous Quality Assurance of Linear Accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Yu, H; Jenkins, C; Yu, S; Yang, Y; Xing, L [Stanford University, Stanford, CA (United States)

    2016-06-15

    Purpose: Routine quality assurance (QA) of linear accelerators represents a critical and costly element of a radiation oncology center. Recently, a system was developed to autonomously perform routine quality assurance on linear accelerators. The purpose of this work is to extend this system and contribute computer vision techniques for obtaining quantitative measurements for a monthly multi-leaf collimator (MLC) QA test specified by TG-142, namely leaf position accuracy, and demonstrate extensibility for additional routines. Methods: Grayscale images of a picket fence delivery on a radioluminescent phosphor coated phantom are captured using a CMOS camera. Collected images are processed to correct for camera distortions, rotation and alignment, reduce noise, and enhance contrast. The location of each MLC leaf is determined through logistic fitting and a priori modeling based on knowledge of the delivered beams. Using the data collected and the criteria from TG-142, a decision is made on whether the leaf position accuracy of the MLC passes or fails. Results: The locations of all MLC leaf edges are found for three different picket fence images in a picket fence routine to 0.1 mm (1 pixel) precision. The program to correct for image alignment and determine leaf positions requires a runtime of 21–25 seconds for a single picket, and 44–46 seconds for a group of three pickets, on a standard workstation CPU (2.2 GHz Intel Core i7). Conclusion: MLC leaf edges were successfully found using techniques in computer vision. With the addition of computer vision techniques to the previously described autonomous QA system, the system is able to quickly perform complete QA routines with minimal human contribution.
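    The core measurement task here is locating a leaf edge in an intensity profile to subpixel precision. The abstract's logistic-fitting approach is not spelled out, so the following is a simplified illustration (our own, not the authors' code) using the related half-maximum-crossing technique on a hypothetical 1-D profile:

```python
# Hypothetical illustration: locate an MLC leaf edge in a 1-D intensity
# profile to subpixel precision by linear interpolation at half-maximum.
def find_edge(profile):
    """Return the subpixel index where intensity first crosses half-maximum."""
    half = (max(profile) + min(profile)) / 2.0
    for i in range(1, len(profile)):
        lo, hi = profile[i - 1], profile[i]
        if (lo - half) * (hi - half) <= 0 and lo != hi:
            # Linear interpolation between the two bracketing pixels.
            return (i - 1) + (half - lo) / (hi - lo)
    return None

profile = [0, 0, 1, 9, 10, 10, 10]  # dark-to-bright step across a leaf edge
edge = find_edge(profile)            # subpixel edge position
```

    At a 0.1 mm/pixel scale, such interpolation is what allows leaf positions to be reported below the native camera resolution.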

  18. THE MODELS OF STRATEGIC MANAGEMENT OF INFOCOMM BUSINESS

    Directory of Open Access Journals (Sweden)

    M. A. Lyashenko

    2015-01-01

    From the analysis of infocommunication business made in the article, one general idea of forming a strategy for managing infocommunication business was selected: full recognition of the inevitability of globalization processes in the modern world, accompanied by the accelerated development of information technologies. In these conditions, companies use such strategic means of competition as: increased productivity, mastering of new markets, creation of new business models, and attraction of talent on a global scale.

  19. Strategic sizing of energy storage facilities in electricity markets

    DEFF Research Database (Denmark)

    Nasrolahpour, Ehsan; Kazempour, Seyyedjalal; Zareipour, Hamidreza

    2016-01-01

    This paper proposes a model to determine the optimal size of an energy storage facility from a strategic investor’s perspective. This investor seeks to maximize its profit through strategic planning (i.e., storage sizing) and strategic operational (i.e., offering and bidding) decisions. We...... consider the uncertainties associated with rival generators’ offering strategies and future load levels in the proposed model. The strategic investment decisions include the sizes of the charging device, discharging device and energy reservoir. The proposed model is a stochastic bi-level optimization problem......; the planning and operation decisions are made in the upper level, and market clearing is modeled in the lower level under different operating scenarios. To make the proposed model computationally tractable, an iterative solution technique based on Benders’ decomposition is implemented. This provides a master...

  20. Modern computer networks and distributed intelligence in accelerator controls

    International Nuclear Information System (INIS)

    Briegel, C.

    1991-01-01

    Appropriate hardware and software network protocols are surveyed for accelerator control environments. Accelerator controls network topologies are discussed with respect to the following criteria: vertical versus horizontal and distributed versus centralized. Decision-making considerations are provided for accelerator network architecture specification. Current trends and implementations at Fermilab are discussed

  1. Interacting with accelerators

    International Nuclear Information System (INIS)

    Dasgupta, S.

    1994-01-01

    Accelerators are research machines which produce energetic particle beams for use as projectiles to effect nuclear reactions. These machines, along with their services and facilities, may occupy very large areas. The man-machine interface of accelerators has evolved with technological changes in the computer industry and may be partitioned into three phases. The present paper traces the evolution of the man-machine interface from the earliest accelerators to the present computerized systems incorporated in modern accelerators. It also discusses the advantages of incorporating expert system technology for assisting operators. (author). 8 ref

  2. An accelerated conjugate gradient algorithm to compute low-lying eigenvalues - a study for the Dirac operator in SU(2) lattice QCD

    International Nuclear Information System (INIS)

    Kalkreuter, T.; Simma, H.

    1995-07-01

    The low-lying eigenvalues of a (sparse) hermitian matrix can be computed with controlled numerical errors by a conjugate gradient (CG) method. This CG algorithm is accelerated by alternating it with exact diagonalizations in the subspace spanned by the numerically computed eigenvectors. We study this combined algorithm in the case of the Dirac operator with (dynamical) Wilson fermions in four-dimensional SU(2) gauge fields. The algorithm is numerically very stable and can be parallelized in an efficient way. On lattices of sizes 4^4 - 16^4 an acceleration of the pure CG method by a factor of 4 - 8 is found. (orig.)
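    The alternation described above — iterative minimization of the Rayleigh quotient, punctuated by exact diagonalization in the subspace of accumulated iterates — can be sketched on a small dense symmetric matrix. This is a hedged toy illustration, not the authors' lattice code; the step size and restart interval are arbitrary choices:

```python
import numpy as np

# Toy sketch: gradient-based eigensolver accelerated by periodic exact
# Rayleigh-Ritz diagonalization in the subspace spanned by the iterates.
rng = np.random.default_rng(0)
n = 20
A = rng.standard_normal((n, n)); A = (A + A.T) / 2  # stand-in for a sparse operator

v = rng.standard_normal(n); v /= np.linalg.norm(v)
subspace = [v]
for it in range(50):
    lam = v @ A @ v                      # Rayleigh quotient
    g = A @ v - lam * v                  # residual = gradient direction
    v = v - 0.1 * g
    v /= np.linalg.norm(v)
    subspace.append(v)
    if (it + 1) % 10 == 0:               # periodic exact subspace diagonalization
        Q, _ = np.linalg.qr(np.column_stack(subspace))
        H = Q.T @ A @ Q                  # project A onto the subspace
        w, U = np.linalg.eigh(H)
        v = Q @ U[:, 0]                  # Ritz vector for the lowest eigenvalue
        subspace = [v]                   # restart from the improved vector

lowest = v @ A @ v
exact = np.linalg.eigvalsh(A)[0]
```

    The subspace diagonalization is cheap (the projected matrix is tiny) yet extracts the optimal combination of all iterates at once, which is the source of the reported factor 4-8 speedup over pure CG.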

  3. Theoretical problems in accelerator physics

    International Nuclear Information System (INIS)

    1992-01-01

    This report discusses the following research on accelerators: computational methods; higher order mode suppression in accelerators structures; overmoded waveguide components and application to SLED II and power transport; rf sources; accelerator cavity design for a B factory asymmetric collider; and photonic band gap cavities

  4. A heterogeneous computing accelerated SCE-UA global optimization method using OpenMP, OpenCL, CUDA, and OpenACC.

    Science.gov (United States)

    Kan, Guangyuan; He, Xiaoyan; Ding, Liuqian; Li, Jiren; Liang, Ke; Hong, Yang

    2017-10-01

    The shuffled complex evolution optimization developed at the University of Arizona (SCE-UA) has been successfully applied in various kinds of scientific and engineering optimization applications, such as hydrological model parameter calibration, for many years. The algorithm possesses good global optimality, convergence stability and robustness. However, benchmark and real-world applications reveal the poor computational efficiency of the SCE-UA. This research aims at the parallelization and acceleration of the SCE-UA method based on powerful heterogeneous computing technology. The parallel SCE-UA is implemented on Intel Xeon multi-core CPU (by using OpenMP and OpenCL) and NVIDIA Tesla many-core GPU (by using OpenCL, CUDA, and OpenACC). The serial and parallel SCE-UA were tested based on the Griewank benchmark function. Comparison results indicate the parallel SCE-UA significantly improves computational efficiency compared to the original serial version. The OpenCL implementation obtains the best overall acceleration results, albeit with the most complex source code. The parallel SCE-UA has bright prospects to be applied in real-world applications.
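    The dominant cost in SCE-UA is evaluating the objective over many candidate parameter sets, which is embarrassingly parallel — the pattern the paper maps onto OpenMP/OpenCL/CUDA/OpenACC. A minimal sketch (ours, not the paper's code) of that pattern using the Griewank benchmark mentioned above:

```python
import math
from concurrent.futures import ThreadPoolExecutor

# Griewank benchmark: global minimum 0 at the origin.
def griewank(x):
    s = sum(xi * xi for xi in x) / 4000.0
    p = math.prod(math.cos(xi / math.sqrt(i)) for i, xi in enumerate(x, 1))
    return 1.0 + s - p

# Data-parallel fitness evaluation over a candidate population; a GPU
# version would launch one thread per candidate instead.
population = [[0.0] * 10, [1.0] * 10, [100.0] * 10]
with ThreadPoolExecutor(max_workers=4) as pool:
    fitness = list(pool.map(griewank, population))
best = min(fitness)
```

    In a real SCE-UA run the population is shuffled into complexes between evaluation rounds; only the evaluation step shown here is parallelized.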

  5. Computational Materials Science and Chemistry: Accelerating Discovery and Innovation through Simulation-Based Engineering and Science

    Energy Technology Data Exchange (ETDEWEB)

    Crabtree, George [Argonne National Lab. (ANL), Argonne, IL (United States); Glotzer, Sharon [University of Michigan; McCurdy, Bill [University of California Davis; Roberto, Jim [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2010-07-26

    This report is based on a SC Workshop on Computational Materials Science and Chemistry for Innovation on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software. This rate of improvement, which shows no sign of

  6. Strategic Entrepreneurship

    DEFF Research Database (Denmark)

    Klein, Peter G.; Barney, Jay B.; Foss, Nicolai Juul

    Strategic entrepreneurship is a newly recognized field that draws, not surprisingly, from the fields of strategic management and entrepreneurship. The field emerged officially with the 2001 special issue of the Strategic Management Journal on “strategic entrepreneurship”; the first dedicated...... periodical, the Strategic Entrepreneurship Journal, appeared in 2007. Strategic entrepreneurship is built around two core ideas. (1) Strategy formulation and execution involves attributes that are fundamentally entrepreneurial, such as alertness, creativity, and judgment, and entrepreneurs try to create...... and capture value through resource acquisition and competitive positioning. (2) Opportunity-seeking and advantage-seeking—the former the central subject of the entrepreneurship field, the latter the central subject of the strategic management field—are processes that should be considered jointly. This entry...

  7. Using Equation-Free Computation to Accelerate Network-Free Stochastic Simulation of Chemical Kinetics.

    Science.gov (United States)

    Lin, Yen Ting; Chylek, Lily A; Lemons, Nathan W; Hlavacek, William S

    2018-06-21

    The chemical kinetics of many complex systems can be concisely represented by reaction rules, which can be used to generate reaction events via a kinetic Monte Carlo method that has been termed network-free simulation. Here, we demonstrate accelerated network-free simulation through a novel approach to equation-free computation. In this process, variables are introduced that approximately capture system state. Derivatives of these variables are estimated using short bursts of exact stochastic simulation and finite differencing. The variables are then projected forward in time via a numerical integration scheme, after which a new exact stochastic simulation is initialized and the whole process repeats. The projection step increases efficiency by bypassing the firing of numerous individual reaction events. As we show, the projected variables may be defined as populations of building blocks of chemical species. The maximal number of connected molecules included in these building blocks determines the degree of approximation. Equation-free acceleration of network-free simulation is found to be both accurate and efficient.
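    The projection loop described above — short exact-simulation bursts, finite-difference derivative estimates, then a numerical-integration jump that skips individual reaction events — can be illustrated on a toy birth-death process. This is a hedged sketch of the general equation-free pattern, not the authors' network-free simulator, and the burst lengths and projection horizon are arbitrary:

```python
import random

# Toy equation-free projective integration for a birth-death process
# (constant birth rate b, per-capita death rate d; steady state b/d).
random.seed(1)
b, d = 10.0, 0.1

def ssa_burst(n, t_end):
    """Exact stochastic (Gillespie) simulation for a short burst."""
    t = 0.0
    while True:
        rate = b + d * n
        t += random.expovariate(rate)
        if t > t_end:
            return n
        n = n + 1 if random.random() < b / rate else n - 1

n, t = 20, 0.0
for _ in range(30):
    n0 = ssa_burst(n, 1.0)               # burst 1: sample the coarse variable
    n1 = ssa_burst(n0, 1.0)              # burst 2: sample it again
    slope = n1 - n0                      # finite-difference derivative estimate
    n = max(0, round(n1 + 3.0 * slope))  # project forward, bypassing events
    t += 5.0                             # 2 simulated + 3 projected time units
```

    Each cycle covers five time units while firing reaction events for only two of them; that skipped fraction is where the reported efficiency gain comes from, at the price of the projection approximation.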

  8. Computer simulation of 2-D and 3-D ion beam extraction and acceleration

    Energy Technology Data Exchange (ETDEWEB)

    Ido, Shunji; Nakajima, Yuji [Saitama Univ., Urawa (Japan). Faculty of Engineering

    1997-03-01

    The two-dimensional code and the three-dimensional code have been developed to study the physical features of ion beams in the extraction and acceleration stages. Using the two-dimensional code, the design of the first electrode (plasma grid) is examined with regard to beam divergence. In the computational studies using the three-dimensional code, an off-axis model of the ion beam is investigated. It is found that the deflection angle of the ion beam is proportional to the gap displacement of the electrodes. (author)

  9. Proposal for an Accelerator R&D User Facility at Fermilab's Advanced Superconducting Test Accelerator (ASTA)

    Energy Technology Data Exchange (ETDEWEB)

    Church, M. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Edwards, H. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Harms, E. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Henderson, S. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Holmes, S. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Lumpkin, A. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Kephart, R. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Levedev, V. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Leibfritz, J. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Nagaitsev, S. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Piot, P. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Northern Illinois Univ., DeKalb, IL (United States); Prokop, C. [Northern Illinois Univ., DeKalb, IL (United States); Shiltsev, V. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Sun, Y. E. [Argonne National Lab. (ANL), Argonne, IL (United States); Valishev, A. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2013-10-01

    Fermilab is the nation’s particle physics laboratory, supported by the DOE Office of High Energy Physics (OHEP). Fermilab is a world leader in accelerators, with a demonstrated track record—spanning four decades—of excellence in accelerator science and technology. We describe the significant opportunity to complete, in a highly leveraged manner, a unique accelerator research facility that supports the broad strategic goals in accelerator science and technology within the OHEP. While the US accelerator-based HEP program is oriented toward the Intensity Frontier, which requires modern superconducting linear accelerators and advanced high-intensity storage rings, there are no accelerator test facilities that support the accelerator science of the Intensity Frontier. Further, nearly all proposed future accelerators for Discovery Science will rely on superconducting radiofrequency (SRF) acceleration, yet there are no dedicated test facilities to study SRF capabilities for beam acceleration and manipulation in prototypic conditions. Finally, there are a wide range of experiments and research programs beyond particle physics that require the unique beam parameters that will only be available at Fermilab’s Advanced Superconducting Test Accelerator (ASTA). To address these needs, we submit this proposal for an Accelerator R&D User Facility at ASTA. The ASTA program is based on the capability provided by an SRF linac (which provides electron beams from 50 MeV to nearly 1 GeV) and a small storage ring (with the ability to store either electrons or protons) to enable a broad range of beam-based experiments to study fundamental limitations to beam intensity and to develop transformative approaches to particle-beam generation, acceleration and manipulation which cannot be done elsewhere. It will also establish a unique resource for R&D towards Energy Frontier facilities and a test-bed for SRF accelerators and high brightness beam applications in support of the OHEP.

  10. QALMA: A computational toolkit for the analysis of quality protocols for medical linear accelerators in radiation therapy

    Science.gov (United States)

    Rahman, Md Mushfiqur; Lei, Yu; Kalantzis, Georgios

    2018-01-01

    Quality Assurance (QA) for medical linear accelerators (linacs) is one of the primary concerns in external beam radiation therapy. Continued advancements in clinical accelerators and computer control technology make the QA procedures more complex and time consuming, often requiring dedicated software accompanied by specific phantoms. To address this, we introduce QALMA (Quality Assurance for Linac with MATLAB), a MATLAB toolkit which aims to simplify the quantitative analysis of linac QA, including Star-Shot analysis, the Picket Fence test, the Winston-Lutz test, Multileaf Collimator (MLC) log file analysis, and verification of the light & radiation field coincidence test.

  11. FMIT accelerator

    International Nuclear Information System (INIS)

    Armstrong, D.D.

    1983-01-01

    A 35-MeV 100-mA cw linear accelerator is being designed by Los Alamos for use in the Fusion Materials Irradiation Test (FMIT) Facility. Essential to this program is the design, construction, and evaluation of performance of the accelerator's injector, low-energy beam transport, and radio-frequency quadrupole sections before they are shipped to the facility site. The installation and testing of some of these sections have begun as well as the testing of the rf, noninterceptive beam diagnostics, computer control, dc power, and vacuum systems. An overview of the accelerator systems and the performance to date is given

  12. Strategic Planning: What's so Strategic about It?

    Science.gov (United States)

    Strong, Bart

    2005-01-01

    The words "strategic" and "planning" used together can lead to confusion unless one spent the early years of his career in never-ending, team-oriented, corporate training sessions. Doesn't "strategic" have something to do with extremely accurate bombing or a defensive missile system or Star Wars or something? Don't "strategic" and "planning" both…

  13. A Fast GPU-accelerated Mixed-precision Strategy for Fully NonlinearWater Wave Computations

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig; Engsig-Karup, Allan Peter; Madsen, Morten G.

    2011-01-01

    We present performance results of a mixed-precision strategy developed to improve a recently developed massively parallel GPU-accelerated tool for fast and scalable simulation of unsteady fully nonlinear free surface water waves over uneven depths (Engsig-Karup et al. 2011). The underlying wave......-preconditioned defect correction method. The strategy improves performance by exploiting architectural features of modern GPUs for mixed-precision computations and is tested in a recently developed generic library for fast prototyping of PDE solvers. The new wave tool is applicable to solve and analyze...
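    The essence of a mixed-precision defect correction is to do the expensive (preconditioned) solve in low precision while computing residuals, the defects, in full precision, so the cheap arithmetic does not limit the final accuracy. A minimal dense-matrix sketch of the idea (our illustration under assumed data, not the GPU wave solver itself):

```python
import numpy as np

# Mixed-precision defect correction on a well-conditioned linear system:
# float32 solves play the role of the cheap GPU preconditioner, while
# defects are accumulated in float64.
rng = np.random.default_rng(2)
n = 50
A = rng.standard_normal((n, n)) + n * np.eye(n)   # diagonally dominant test system
b = rng.standard_normal(n)

A32 = A.astype(np.float32)
x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
for _ in range(5):
    r = b - A @ x                                    # defect in full precision
    dx = np.linalg.solve(A32, r.astype(np.float32))  # cheap low-precision correction
    x = x + dx.astype(np.float64)

err = np.linalg.norm(A @ x - b) / np.linalg.norm(b)  # reaches double-precision levels
```

    On GPUs the payoff is substantial because single-precision throughput and memory bandwidth are typically a large multiple of double precision, yet the corrected solution matches a full double-precision solve.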

  14. Semiempirical Quantum Chemical Calculations Accelerated on a Hybrid Multicore CPU-GPU Computing Platform.

    Science.gov (United States)

    Wu, Xin; Koslowski, Axel; Thiel, Walter

    2012-07-10

    In this work, we demonstrate that semiempirical quantum chemical calculations can be accelerated significantly by leveraging the graphics processing unit (GPU) as a coprocessor on a hybrid multicore CPU-GPU computing platform. Semiempirical calculations using the MNDO, AM1, PM3, OM1, OM2, and OM3 model Hamiltonians were systematically profiled for three types of test systems (fullerenes, water clusters, and solvated crambin) to identify the most time-consuming sections of the code. The corresponding routines were ported to the GPU and optimized employing both existing library functions and a GPU kernel that carries out a sequence of noniterative Jacobi transformations during pseudodiagonalization. The overall computation times for single-point energy calculations and geometry optimizations of large molecules were reduced by one order of magnitude for all methods, as compared to runs on a single CPU core.
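    The GPU kernel mentioned above applies sequences of 2x2 Jacobi rotations. As a hedged, self-contained illustration of that primitive (not the authors' pseudodiagonalization code), a cyclic Jacobi sweep diagonalizes a small symmetric matrix, each rotation annihilating one off-diagonal pair:

```python
import math

# Cyclic Jacobi sweeps on a symmetric matrix (nested-list form).
# Each rotation G A G^T zeroes the (p, q) off-diagonal element.
def jacobi_sweeps(A, sweeps=10):
    n = len(A)
    for _ in range(sweeps):
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(A[p][q]) < 1e-15:
                    continue
                theta = 0.5 * math.atan2(2 * A[p][q], A[q][q] - A[p][p])
                c, s = math.cos(theta), math.sin(theta)
                for k in range(n):  # apply G on the left (rows p, q)
                    A[p][k], A[q][k] = c * A[p][k] - s * A[q][k], s * A[p][k] + c * A[q][k]
                for k in range(n):  # apply G^T on the right (columns p, q)
                    A[k][p], A[k][q] = c * A[k][p] - s * A[k][q], s * A[k][p] + c * A[k][q]
    return [A[i][i] for i in range(n)]  # diagonal holds eigenvalue approximations

A = [[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]
eigs = sorted(jacobi_sweeps(A))  # exact eigenvalues: 3 - sqrt(3), 3, 3 + sqrt(3)
```

    The rotations within a sweep touch disjoint 2x2 blocks in well-chosen orderings, which is what makes the operation amenable to the GPU parallelization reported in the paper.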

  15. Accelerator shielding benchmark problems

    International Nuclear Information System (INIS)

    Hirayama, H.; Ban, S.; Nakamura, T.

    1993-01-01

    Accelerator shielding benchmark problems prepared by Working Group of Accelerator Shielding in the Research Committee on Radiation Behavior in the Atomic Energy Society of Japan were compiled by Radiation Safety Control Center of National Laboratory for High Energy Physics. Twenty-five accelerator shielding benchmark problems are presented for evaluating the calculational algorithm, the accuracy of computer codes and the nuclear data used in codes. (author)

  16. Analysis of optoelectronic strategic planning in Taiwan by artificial intelligence portfolio tool

    Science.gov (United States)

    Chang, Rang-Seng

    1992-05-01

    Taiwan ROC has achieved significant advances in the optoelectronic industry, with some Taiwan products ranked high in the world market and technology. Six segments of the optoelectronic industry were planned, each divided into several strategic items, and an artificial intelligence portfolio tool (AIPT) was designed to analyze optoelectronic strategic planning in Taiwan. The portfolio is designed to provoke strategic thinking intelligently. The computer-generated strategy should be selected and modified by the individual. Some strategies for the development of the Taiwan optoelectronic industry are also discussed in this paper.

  17. Application of local area networks to accelerator control systems at the Stanford Linear Accelerator

    International Nuclear Information System (INIS)

    Fox, J.D.; Linstadt, E.; Melen, R.

    1983-03-01

    The history and current status of SLAC's SDLC networks for distributed accelerator control systems are discussed. These local area networks have been used for instrumentation and control of the linear accelerator. Network topologies, protocols, physical links, and logical interconnections are discussed for specific applications in distributed data acquisition and control system, computer networks and accelerator operations

  18. How Strategic are Strategic Information Systems?

    Directory of Open Access Journals (Sweden)

    Alan Eardley

    1996-11-01

    There are many examples of information systems which are claimed to have created and sustained competitive advantage, allowed beneficial collaboration, or simply ensured the continued survival of the organisations which used them. These systems are often referred to as being 'strategic'. This paper argues that many of the examples of strategic information systems as reported in the literature are not sufficiently critical in determining whether the systems meet the generally accepted definition of the term 'strategic' - that of achieving sustainable competitive advantage. Eight of the information systems considered to be strategic are examined here from the standpoint of one widely-accepted 'competition' framework: Porter's model of industry competition. The framework is then used to question the linkage between the information systems and the mechanisms which are required for the enactment of strategic business objectives based on competition. Conclusions indicate that the systems are compatible with Porter's framework. Finally, some limitations of the framework are discussed and aspects of the systems which extend beyond the framework are highlighted.

  19. Computer codes and methods for simulating accelerator driven systems

    International Nuclear Information System (INIS)

    Sartori, E.; Byung Chan Na

    2003-01-01

    A large set of computer codes and associated data libraries have been developed by nuclear research and industry over the past half century. A large number of them are in the public domain and can be obtained under agreed conditions from different Information Centres. The areas covered comprise: basic nuclear data and models, reactor spectra and cell calculations, static and dynamic reactor analysis, criticality, radiation shielding, dosimetry and material damage, fuel behaviour, safety and hazard analysis, heat conduction and fluid flow in reactor systems, spent fuel and waste management (handling, transportation, and storage), economics of fuel cycles, impact on the environment of nuclear activities, etc. These codes and models have been developed mostly for critical systems used for research or power generation and other technological applications. Many of them have not been designed for accelerator driven systems (ADS), but with competent use they can be used for studying such systems or can form the basis for adapting existing methods to the specific needs of ADSs. The present paper describes the types of methods, codes and associated data available and their role in the applications. It provides Web addresses to facilitate searches for such tools. Some indications are given of the effects of inappropriate or 'blind' use of existing tools for ADS. Reference is made to available experimental data that can be used for validating the methods used. Finally, some international activities linked to the different computational aspects are described briefly. (author)

  20. The Role of Implicit Motives in Strategic Decision-Making: Computational Models of Motivated Learning and the Evolution of Motivated Agents

    Directory of Open Access Journals (Sweden)

    Kathryn Merrick

    2015-11-01

    Individual behavioral differences in humans have been linked to measurable differences in their mental activities, including differences in their implicit motives. In humans, individual differences in the strength of motives such as power, achievement and affiliation have been shown to have a significant impact on behavior in social dilemma games and during other kinds of strategic interactions. This paper presents agent-based computational models of power-, achievement- and affiliation-motivated individuals engaged in game-play. The first model captures learning by motivated agents during strategic interactions. The second model captures the evolution of a society of motivated agents. It is demonstrated that misperception, when it is a result of motivation, causes agents with different motives to play a given game differently. When motivated agents who misperceive a game are present in a population, higher explicit payoff can result for the population as a whole. The implications of these results are discussed, both for modeling human behavior and for designing artificial agents with certain salient behavioral characteristics.
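    The motivated-misperception effect can be sketched in a few lines: an agent whose implicit motive distorts the payoffs it perceives will choose a different best response than an unbiased agent facing the same true game. This is a stylized construction of ours (names, payoffs, and the "affiliation" distortion are all illustrative), not the paper's agent model:

```python
# Two players in a prisoner's dilemma; actions: C(ooperate)=0, D(efect)=1.
C, D = 0, 1
true_payoff = {(C, C): (3, 3), (C, D): (0, 5), (D, C): (5, 0), (D, D): (1, 1)}

def best_response(payoff, me, other_action):
    """Action maximizing player `me`'s perceived payoff vs. a fixed opponent move."""
    key = lambda a: payoff[(a, other_action) if me == 0 else (other_action, a)][me]
    return max((C, D), key=key)

# Player 0's implicit motive inflates the perceived value of mutual
# cooperation (a stylized affiliation distortion); player 1 sees the true game.
perceived = dict(true_payoff)
perceived[(C, C)] = (6, 3)

a0 = best_response(perceived, 0, C)    # motivated, misperceiving agent
a1 = best_response(true_payoff, 1, C)  # unbiased agent
```

    Against a cooperating opponent, the unbiased player defects (the dominant action in the true game) while the motivated player cooperates, so the same game is played differently, which is the mechanism behind the paper's population-level payoff result.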

  1. Research of Virtual Accelerator Control System

    Institute of Scientific and Technical Information of China (English)

    Dong Jinmei; Yuan Youjin; Zheng Jianhua

    2003-01-01

    A Virtual Accelerator is a computer process which simulates the behavior of beam in an accelerator and responds to the accelerator control program under development in the same way as an actual accelerator. To realize a Virtual Accelerator, the control system should provide the same program interface to the top-layer Application Control Program; this lets the 'Real Accelerator' and the 'Virtual Accelerator' use the same GUI. The control system should therefore have a layer that hides hardware details, so that the Application Control Program accesses control devices through logical names rather than coded hardware addresses. Without this layer, it is difficult to develop application programs which can access both 'Virtual' and 'Real' Accelerators through the same program interfaces. For this reason, we create a CSR Runtime Database which allows application programs to access hardware devices and data on a simulation process in a unified way. A device is represented as a collection of records in the CSR Runtime Database. A control program on a host computer can access devices in the system only through the names of record fields, called channels.
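    The layering described above can be sketched as a name-to-record lookup: application code reads and writes channels (record.field names) without knowing whether the backing record maps to real hardware or to a simulation. All names and methods below are illustrative assumptions, not the CSR system's actual API:

```python
# Minimal sketch of a runtime database exposing device state through
# logical channel names ("RECORD.field"), hiding hardware details.
class RuntimeDatabase:
    def __init__(self):
        self._records = {}

    def add_record(self, name, fields):
        """Register a device as a record: a named collection of fields."""
        self._records[name] = dict(fields)

    def caget(self, channel):
        """Read a channel value; application code never sees addresses."""
        record, field = channel.split(".")
        return self._records[record][field]

    def caput(self, channel, value):
        """Write a channel value; the backing record may be real or simulated."""
        record, field = channel.split(".")
        self._records[record][field] = value

# The same application code works against a real or a virtual accelerator;
# only the database contents differ.
db = RuntimeDatabase()
db.add_record("QUAD1", {"current": 0.0, "status": "ok"})
db.caput("QUAD1.current", 12.5)
```

    Swapping the real machine for the simulator then amounts to pointing the records at a different backend, with no change to the Application Control Program.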

  2. Computer applications: Automatic control system for high-voltage accelerator

    International Nuclear Information System (INIS)

    Bryukhanov, A.N.; Komissarov, P.Yu.; Lapin, V.V.; Latushkin, S.T.; Fomenko, D.E.; Yudin, L.I.

    1992-01-01

    An automatic control system for a high-voltage electrostatic accelerator with an accelerating potential of up to 500 kV is described. The electronic apparatus on the high-voltage platform is controlled and monitored by means of a fiber-optic data-exchange system. The system is based on CAMAC modules that are controlled by a microprocessor crate controller. Data on accelerator operation are represented and control instructions are issued by means of an alphanumeric terminal. 8 refs., 6 figs

  3. Shady strategic behavior : Recognizing strategic behavior of Dark Triad followers

    NARCIS (Netherlands)

    Schyns, Birgit; Wisse, Barbara; Sanders, Stacey

    2018-01-01

    The importance of strategic behavior in organizations has long been recognized. However, so far the literature has primarily focused on leaders’ strategic behavior, largely ignoring followers’ strategic behavior. In the present paper, we take a follower trait perspective to strategic follower

  4. Strategic Planning in U.S. Municipalities

    Directory of Open Access Journals (Sweden)

    James VAN RAVENSWAY

    2015-12-01

Full Text Available Strategic planning started in the U.S. as a corporate planning endeavor. By the 1960’s, it had become a major corporate management tool in the Fortune 500. At first, it was seen as a way of interweaving policies, values and purposes with management, resources and market information in a way that held the organization together. By the 1950’s, the concept was simplified somewhat to focus on SWOT as a way of keeping the corporation afloat in a more turbulent world. The public sector has been under pressure for a long time to become more efficient, effective and responsive. Many have felt that the adoption of business practices would help to accomplish that. One tool borrowed from business has been strategic planning. At the local government level, strategic planning became popular starting in the 1980’s, and the community’s planning office was called on to lead the endeavor. The planning office was often the advocate of the process. Urban planning offices had been doing long-range plans for decades, but with accelerating urban change a more rapid, action-oriented response was desired. The paper describes this history and process in East Lansing, Michigan, U.S., where comprehensive community plans are the result of a multi-year visioning process and call for action-oriented strategies for targeted parts of the community.

  5. Fiscal years 1994--1998 Information Technology Strategic Plan

    International Nuclear Information System (INIS)

    1993-11-01

    A team of senior managers from across the US Nuclear Regulatory Commission (NRC), working with the Office of Information Resources Management (IRM), has completed an NRC Strategic Information Technology (IT) Plan. The Plan addresses three major areas: (1) IT Program Management, (2) IT Infrastructure, and (3) Information and Applications Management. Key recommendations call for accelerating the replacement of Agency workstations, implementing a new document management system, applying business process reengineering to selected Agency work processes, and establishing an Information Technology Council to advise the Director of IRM

  6. Neurocognitive dysfunction in strategic and non-strategic gamblers.

    Science.gov (United States)

    Grant, Jon E; Odlaug, Brian L; Chamberlain, Samuel R; Schreiber, Liana R N

    2012-08-07

    It has been theorized that there may be subtypes of pathological gambling, particularly in relation to the main type of gambling activities undertaken. Whether or not putative pathological gambling subtypes differ in terms of their clinical and cognitive profiles has received little attention. Subjects meeting DSM-IV criteria for pathological gambling were grouped into two categories of preferred forms of gambling - strategic (e.g., cards, dice, sports betting, stock market) and non-strategic (e.g., slots, video poker, pull tabs). Groups were compared on clinical characteristics (gambling severity, and time and money spent gambling), psychiatric comorbidity, and neurocognitive tests assessing motor impulsivity and cognitive flexibility. Seventy-seven subjects were included in this sample (45.5% females; mean age: 42.7±14.9) which consisted of the following groups: strategic (n=22; 28.6%) and non-strategic (n=55; 71.4%). Non-strategic gamblers were significantly more likely to be older, female, and divorced. Money spent gambling did not differ significantly between groups although one measure of gambling severity reflected more severe problems for strategic gamblers. Strategic and non-strategic gamblers did not differ in terms of cognitive function; both groups showed impairments in cognitive flexibility and inhibitory control relative to matched healthy volunteers. These preliminary results suggest that preferred form of gambling may be associated with specific clinical characteristics but are not dissociable in terms of cognitive inflexibility and motor impulsivity. Copyright © 2012 Elsevier Inc. All rights reserved.

  7. Centralized digital control of accelerators

    International Nuclear Information System (INIS)

    Melen, R.E.

    1983-09-01

In contrasting the title of this paper with that of a second paper to be presented at this conference, 'Distributed Digital Control of Accelerators', a potential reader might be led to believe that this paper will focus on systems whose computing intelligence is centered in one or more computers in a centralized location. Instead, this paper describes the architectural evolution of SLAC's computer-based accelerator control systems with respect to the distribution of their intelligence. However, the use of the word 'centralized' in the title is appropriate because these systems are based on the use of centralized, large and computationally powerful processors that are typically supported by networks of smaller distributed processors

  8. An Examination of Resonance, Acceleration, and Particle Dynamics in the Micro-Accelerator Platform

    International Nuclear Information System (INIS)

    McNeur, Josh; Rosenzweig, J. B.; Travish, G.; Zhou, J.; Yoder, R.

    2010-01-01

An effort to build a micron-scale dielectric-based slab-symmetric accelerator is underway at UCLA. The structure achieves acceleration via a resonant accelerating mode that is excited in an approximately 800 nm wide vacuum gap by a side-coupled 800 nm laser. Detailed simulation results on structure fields and particle dynamics, using HFSS and VORPAL, are presented. We examine the quality factors of the accelerating modes for various structures and the excitation of non-accelerating destructive modes. Additionally, the results of an analytic and computational study of focusing, longitudinal dynamics and acceleration are described. Methods for achieving simultaneous transverse and longitudinal focusing are discussed, including modification of structure dimensions and slow variation of the coupling periodicity.

  9. Strategic cycling: shaking complacency in healthcare strategic planning.

    Science.gov (United States)

    Begun, J; Heatwole, K B

    1999-01-01

As the conditions affecting business and healthcare organizations in the United States have become more turbulent and uncertain, strategic planning has decreased in popularity. Strategic planning is criticized for stifling creative responses to the new marketplace and for fostering compartmentalized organizations, adherence to outmoded strategies, tunnel vision in strategy formulation, and overemphasis on planning to the detriment of implementation. However, effective strategic planning can be a force for mobilizing all the constituents of an organization, creating discipline in pursuit of a goal, broadening an organization's perspective, improving communication among disciplines, and motivating the organization's workforce. It is worthwhile for healthcare organizations to preserve these benefits of strategic planning while recognizing the many sources of turbulence and uncertainty in the healthcare environment. A model of "strategic cycling" is presented to address the perceived shortcomings of traditional strategic planning in a dynamic environment. The cycling model facilitates continuous assessment of the organization's mission/values/vision and primary strategies based on feedback from benchmark analysis, shareholder impact, and progress in strategy implementation. Multiple scenarios and contingency plans are developed in recognition of the uncertain future. The model represents a compromise between abandoning strategic planning and the traditional, linear model of planning based on progress through predetermined stages to a masterpiece plan.

  10. Strategic Leadership as Determinant of Strategic Change: A Theoretical Review and Propositions

    OpenAIRE

    Ahadiat, Ayi

    2009-01-01

Strategic change is an issue closely related to strategic leadership. As this paper elaborates how strategic leadership determines strategic change, both concepts and their relationship are presented through propositions developed from a modified version of Hambrick’s model. Strategic leadership that causes strategic change, in terms of strategic process and content within environmental and organizational context, will lead to organizational performance as an ulti...

  11. Advanced quadrature sets and acceleration and preconditioning techniques for the discrete ordinates method in parallel computing environments

    Science.gov (United States)

    Longoni, Gianluca

In the nuclear science and engineering field, radiation transport calculations play a key role in the design and optimization of nuclear devices. The linear Boltzmann equation describes the angular, energy and spatial variations of the particle or radiation distribution. The discrete ordinates method (SN) is the most widely used technique for solving the linear Boltzmann equation. However, for realistic problems, the memory and computing time requirements demand the use of supercomputers. This research is devoted to the development of new formulations of the SN method, especially for highly angle-dependent problems, in parallel environments. The present work addresses two main issues affecting the accuracy and performance of SN transport theory methods: quadrature sets and acceleration techniques. New advanced quadrature techniques, which allow for large numbers of angles with a capability for local angular refinement, have been developed. These techniques have been integrated into the 3-D SN PENTRAN (Parallel Environment Neutral-particle TRANsport) code and applied to highly angle-dependent problems, such as CT-scan devices, which are widely used to obtain detailed 3-D images for industrial/medical applications. In addition, the accurate simulation of core physics and shielding problems with strong heterogeneities and transport effects requires the numerical solution of the transport equation. In general, the convergence rate of solution methods for the transport equation degrades for large problems with optically thick regions and scattering ratios approaching unity. To remedy this situation, new acceleration algorithms based on the Even-Parity Simplified SN (EP-SSN) method have been developed. A new stand-alone code system, PENSSn (Parallel Environment Neutral-particle Simplified SN), has been developed based on the EP-SSN method. The code is designed for parallel computing environments with spatial, angular and hybrid (spatial/angular) domain

  12. High performance proton accelerators

    International Nuclear Information System (INIS)

    Favale, A.J.

    1989-01-01

In concert with this theme, this paper briefly outlines how Grumman, over the past 4 years, has evolved from a company that designed and fabricated a Radio Frequency Quadrupole (RFQ) accelerator from Los Alamos National Laboratory (LANL) physics and specifications into a company that, as prime contractor, is designing, fabricating, assembling and commissioning the US Army Strategic Defense Command's (USA SDC) Continuous Wave Deuterium Demonstrator (CWDD) accelerator as a turn-key operation. In the case of the RFQ, LANL scientists performed the physics analysis, established the specifications, supported Grumman on the mechanical design, conducted the RFQ tuning and tested the RFQ at their laboratory. For the CWDD program, Grumman has responsibility for the physics and engineering designs, assembly, testing and commissioning, albeit with the support of consultants from LANL, Lawrence Berkeley Laboratory (LBL) and Brookhaven National Laboratory. In addition, Culham Laboratory and LANL are team members on CWDD. The physics design has been reviewed by LANL scientists as well as by a USA SDC review board. 9 figs

  13. Implementation of Hardware Accelerators on Zynq

    DEFF Research Database (Denmark)

    Toft, Jakob Kenn

In recent years it has become obvious that the performance of general-purpose processors is having trouble meeting the requirements of today's high-performance computing applications. This is partly due to the relatively high power consumption, compared to the performance, of general-purpose processors, which has made hardware accelerators an essential part of several datacentres and the world's fastest supercomputers. In this work, two different hardware accelerators were implemented on a Xilinx Zynq SoC mounted on the ZedBoard platform. The two accelerators are based on two different... of the ARM Cortex-A9 processor featured on the Zynq SoC, with regard to execution time, power dissipation and energy consumption. The implementation of the hardware accelerators was successful. Use of the Monte Carlo processor resulted in a significant increase in performance. The Telco hardware accelerator...

  14. Accelerators for heavy ion fusion

    International Nuclear Information System (INIS)

    Bangerter, R.O.

    1985-10-01

    Large fusion devices will almost certainly produce net energy. However, a successful commercial fusion energy system must also satisfy important engineering and economic constraints. Inertial confinement fusion power plants driven by multi-stage, heavy-ion accelerators appear capable of meeting these constraints. The reasons behind this promising outlook for heavy-ion fusion are given in this report. This report is based on the transcript of a talk presented at the Symposium on Lasers and Particle Beams for Fusion and Strategic Defense at the University of Rochester on April 17-19, 1985

  15. Accelerator R and D: Research for Science - Science for Society

    International Nuclear Information System (INIS)

    Holtkamp, N.R.; Biedron, S.; Milton, S.V.; Boeh, L.; Clayton, J.E.; Zdasiuk, G.; Gourlay, S.A.; Zisman, M.S.; Hamm, R.W.; Henderson, S.; Hoffstaetter, G.H.; Merminga, L.; Ozaki, S.; Pilat, F.C.; White, M.

    2012-01-01

    In September 2011 the US Senate Appropriations Committee requested a ten-year strategic plan from the Department of Energy (DOE) that would describe how accelerator R and D today could advance applications directly relevant to society. Based on the 2009 workshop 'Accelerators for America's Future' an assessment was made on how accelerator technology developed by the nation's laboratories and universities could directly translate into a competitive strength for industrial partners and a variety of government agencies in the research, defense and national security sectors. The Office of High Energy Physics, traditionally the steward for advanced accelerator R and D within DOE, commissioned a task force under its auspices to generate and compile ideas on how best to implement strategies that would help fulfill the needs of industry and other agencies, while maintaining focus on its core mission of fundamental science investigation.

  16. XACC - eXtreme-scale Accelerator Programming Framework

    Energy Technology Data Exchange (ETDEWEB)

    2016-11-18

    Hybrid programming models for beyond-CMOS technologies will prove critical for integrating new computing technologies alongside our existing infrastructure. Unfortunately the software infrastructure required to enable this is lacking or not available. XACC is a programming framework for extreme-scale, post-exascale accelerator architectures that integrates alongside existing conventional applications. It is a pluggable framework for programming languages developed for next-gen computing hardware architectures like quantum and neuromorphic computing. It lets computational scientists efficiently off-load classically intractable work to attached accelerators through user-friendly Kernel definitions. XACC makes post-exascale hybrid programming approachable for domain computational scientists.

  17. STRATEGIC ALLIANCES – VIABLE ALTERNATIVE TO CREATE A COMPETITIVE ADVANTAGE IN A GLOBAL MARKET

    Directory of Open Access Journals (Sweden)

    Irina NICOLAU

    2010-12-01

Full Text Available In the past years, in the light of economic turbulence all around the world, one of the most important ways to secure a competitive advantage is to create a strategic alliance. Such collaborative ventures between firms developed in response to changes in the world economy, such as increased competition, higher costs of developing new products, accelerated technological change and, perhaps most important, the recent world economic crisis. Being part of a strategic alliance creates competitive advantage for the companies by establishing their presence worldwide, by building up operating experience in overseas markets and by gaining access to national markets that were inaccessible before. At the same time, a strategic alliance demands management commitment, special skills and forward planning from each company that takes part in the alliance.

  18. Accelerating Approximate Bayesian Computation with Quantile Regression: application to cosmological redshift distributions

    Science.gov (United States)

    Kacprzak, T.; Herbel, J.; Amara, A.; Réfrégier, A.

    2018-02-01

Approximate Bayesian Computation (ABC) is a method to obtain a posterior distribution without a likelihood function, using simulations and a set of distance metrics. For that reason, it has recently been gaining popularity as an analysis tool in cosmology and astrophysics. Its drawback, however, is a slow convergence rate. We propose a novel method, which we call qABC, to accelerate ABC with quantile regression. In this method, we create a model of quantiles of the distance measure as a function of the input parameters. This model is trained on a small number of simulations and estimates which regions of the prior space are likely to be accepted into the posterior; other regions are then immediately rejected. This procedure is repeated as more simulations become available. We apply it to the practical problem of estimating the redshift distribution of cosmological samples, using forward modelling developed in previous work. The qABC method converges to nearly the same posterior as basic ABC while using only 20% of the number of simulations, achieving a fivefold gain in execution time for our problem. For other problems the acceleration rate may vary; it depends on how close the prior is to the final posterior. We discuss possible improvements and extensions to this method.
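The core qABC idea, modelling a low quantile of the distance as a function of the input parameters and rejecting prior regions where acceptance is hopeless, can be illustrated with a toy example. Here a binned empirical quantile stands in for the paper's quantile-regression model, and all numbers (forward model, prior, threshold) are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
data_mean = 3.0                     # "observed" summary statistic

def simulate(theta, n=50):
    """Toy forward model: sample mean of n draws from N(theta, 1)."""
    return rng.normal(theta, 1.0, size=n).mean()

# Pilot stage: a small number of simulations across the prior U(0, 6).
theta_pilot = rng.uniform(0.0, 6.0, size=400)
dist_pilot = np.abs(np.array([simulate(t) for t in theta_pilot]) - data_mean)

# Model the 10% quantile of the distance per parameter bin.
bins = np.linspace(0.0, 6.0, 13)
idx = np.digitize(theta_pilot, bins) - 1
q10 = np.array([np.quantile(dist_pilot[idx == b], 0.10) for b in range(12)])

# Prune regions whose optimistic (10%) distance quantile already exceeds the
# acceptance threshold: no further simulations are spent there.
epsilon = 0.3
kept = bins[:-1][q10 <= epsilon]
print("bins kept for further simulation:", kept)
```

In the paper the quantile model is refit as new simulations arrive; this single pilot stage only illustrates the pruning step that yields the reported simulation savings.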

  19. A LEGO paradigm for virtual accelerator concept

    International Nuclear Information System (INIS)

    Andrianov, S.; Ivanov, A.; Podzyvalov, E.

    2012-01-01

    The paper considers basic features of a Virtual Accelerator concept based on LEGO paradigm. This concept involves three types of components: different mathematical models for accelerator design problems, integrated beam simulation packages (i. e. COSY, MAD, OptiM and others), and a special class of virtual feedback instruments similar to real control systems (EPICS). All of these components should inter-operate for more complete analysis of control systems and increased fault tolerance. The Virtual Accelerator is an information and computing environment which provides a framework for analysis based on these components that can be combined in different ways. Corresponding distributed computing services establish interaction between mathematical models and low level control system. The general idea of the software implementation is based on the Service-Oriented Architecture (SOA) that allows using cloud computing technology and enables remote access to the information and computing resources. The Virtual Accelerator allows a designer to combine powerful instruments for modeling beam dynamics in a friendly way including both self-developed and well-known packages. In the scope of this concept the following is also proposed: the control system identification, analysis and result verification, visualization as well as virtual feedback for beam line operation. The architecture of the Virtual Accelerator system itself and results of beam dynamics studies are presented. (authors)

  20. Cyber Terrorism demands a Global Risks and Threats Strategic Management

    International Nuclear Information System (INIS)

    Gareva, R.

    2007-01-01

The world is in the third wave of development, which is digitally managed and networked. Information, which creates knowledge, is transferred through the Internet at an exponentially growing rate. The rapid advancement of computer technology has a great influence on the development of the critical information infrastructure, thus changing the safety environment and national values and interests. This advancement produces threats and risks from the computing perspective, which are sublimated in different forms of international terrorism and particularly in cyber terrorism. The main aim of this paper is a thorough analysis of what is scientifically known and practiced now that critical information infrastructure is in the focus of cyber terrorism. Rapid IT development demands changes in the strategic management focus. As a result of time-consuming theoretical and empirical research, this paper suggests a methodology for the strategic management of threats, risks and vulnerabilities. The proposed methodology is seen as a means to increase human security consciousness in every sense of the word, and to promote the need to establish rules, procedures and standards from the aspect of strategic management in the new information epoch. In addition, through a scientific discourse, a short attempt is made to relate the Macedonian reality to the phenomenon mentioned above. The most fundamental point is that the efficiency and promptness of decisions made during strategic planning are a projection of the systematic organization of functions and models for managing the risks and threats to the critical information infrastructure. Hence, this paper could be seen as a perspective on regional strategic management and the vital functioning of cyber space. (author)

  1. Computer Based Dose Control System on Linear Accelerator

    International Nuclear Information System (INIS)

    Taxwim; Djoko-SP; Widi-Setiawan; Agus-Budi Wiyatna

    2000-01-01

Accelerator technology has been used for radiotherapy. Dokter Karyadi Hospital in Semarang uses an electron/X-ray linear accelerator (Linac) for cancer therapy. One of the control parameters of a linear accelerator is the dose rate, i.e. the particle current or photon rate delivered to the target. Control of the dose rate in the Linac has been done by adjusting the repetition rate of the anode pulse train of the electron source. Presently the control is still proportional control. To enhance the quality of the control result (minimal stationary error, speed and stability), a dose control system has been designed using the PID (Proportional Integral Differential) control algorithm and the derived transfer function of the controlled object. The PID control system is implemented by taking as input the dose error (the difference between the output dose and the dose-rate set point). The output of the control system is used to correct the repetition-rate set point of the electron-source anode pulse train. (author)
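The scheme described above can be sketched as a discrete PID loop. This is a minimal illustration: the gains, the sampling interval and the linear dose-rate versus repetition-rate plant model are invented, not taken from the paper:

```python
# Discrete PID controller turning the dose error into a correction of the
# electron-source repetition rate.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy plant: dose rate proportional to repetition rate (gain 0.02 Gy/min/Hz).
setpoint = 2.0              # desired dose rate, Gy/min
rep_rate = 50.0             # pulse repetition rate, Hz
pid = PID(kp=25.0, ki=25.0, kd=0.1, dt=0.1)

for _ in range(200):        # 20 s of simulated control at 10 Hz sampling
    dose_rate = 0.02 * rep_rate
    rep_rate += pid.update(setpoint - dose_rate)  # correct the rep-rate set point

print(round(dose_rate, 3))  # settles at the 2.0 Gy/min set point
```

The integral term is what removes the stationary error that a purely proportional controller, as used previously, would leave.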

  2. Charting the expansion of strategic exploratory behavior during adolescence.

    Science.gov (United States)

    Somerville, Leah H; Sasse, Stephanie F; Garrad, Megan C; Drysdale, Andrew T; Abi Akar, Nadine; Insel, Catherine; Wilson, Robert C

    2017-02-01

    Although models of exploratory decision making implicate a suite of strategies that guide the pursuit of information, the developmental emergence of these strategies remains poorly understood. This study takes an interdisciplinary perspective, merging computational decision making and developmental approaches to characterize age-related shifts in exploratory strategy from adolescence to young adulthood. Participants were 149 12-28-year-olds who completed a computational explore-exploit paradigm that manipulated reward value, information value, and decision horizon (i.e., the utility that information holds for future choices). Strategic directed exploration, defined as information seeking selective for long time horizons, emerged during adolescence and maintained its level through early adulthood. This age difference was partially driven by adolescents valuing immediate reward over new information. Strategic random exploration, defined as stochastic choice behavior selective for long time horizons, was invoked at comparable levels over the age range, and predicted individual differences in attitudes toward risk taking in daily life within the adolescent portion of the sample. Collectively, these findings reveal an expansion of the diversity of strategic exploration over development, implicate distinct mechanisms for directed and random exploratory strategies, and suggest novel mechanisms for adolescent-typical shifts in decision making. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  3. Strategizing Communication

    DEFF Research Database (Denmark)

    Gulbrandsen, Ib Tunby; Just, Sine Nørholm

beyond, but not past, instrumental, rational plans in order to become better able to understand and manage the concrete, incremental practices and contexts in which communication becomes strategic. Thus, we argue that although strategic communicators do (and should) make plans, a plan in itself does... ...of the specific communicative disciplines and practices employed by the organization and/or its individual members, be they marketing, public relations, corporate communication, branding, public affairs or social advocacy. In all cases, strategic communicators do well to focus more on the process of communicating... ...for understanding and managing strategic communication processes.

  4. Dissociable neural representations of reinforcement and belief prediction errors underlie strategic learning.

    Science.gov (United States)

    Zhu, Lusha; Mathewson, Kyle E; Hsu, Ming

    2012-01-31

    Decision-making in the presence of other competitive intelligent agents is fundamental for social and economic behavior. Such decisions require agents to behave strategically, where in addition to learning about the rewards and punishments available in the environment, they also need to anticipate and respond to actions of others competing for the same rewards. However, whereas we know much about strategic learning at both theoretical and behavioral levels, we know relatively little about the underlying neural mechanisms. Here, we show using a multi-strategy competitive learning paradigm that strategic choices can be characterized by extending the reinforcement learning (RL) framework to incorporate agents' beliefs about the actions of their opponents. Furthermore, using this characterization to generate putative internal values, we used model-based functional magnetic resonance imaging to investigate neural computations underlying strategic learning. We found that the distinct notions of prediction errors derived from our computational model are processed in a partially overlapping but distinct set of brain regions. Specifically, we found that the RL prediction error was correlated with activity in the ventral striatum. In contrast, activity in the ventral striatum, as well as the rostral anterior cingulate (rACC), was correlated with a previously uncharacterized belief-based prediction error. Furthermore, activity in rACC reflected individual differences in degree of engagement in belief learning. These results suggest a model of strategic behavior where learning arises from interaction of dissociable reinforcement and belief-based inputs.
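The extension of reinforcement learning with beliefs about an opponent can be sketched in a toy matching-pennies game. This is not the paper's exact model: the learning rates, the opponent's mixed strategy and the way the two value signals are blended are all assumptions made for illustration. It only shows the two distinct prediction errors the study dissociates:

```python
import random

random.seed(1)
n_actions = 2
q = [0.0, 0.0]        # action values (reinforcement component)
belief = [0.5, 0.5]   # estimated probability of each opponent action
alpha, eta = 0.2, 0.05  # learning rates for reward and belief updates

def payoff(a, o):
    """Matching pennies: the player wins (+1) on a match, loses (-1) otherwise."""
    return 1.0 if a == o else -1.0

for _ in range(2000):
    a = 0 if q[0] >= q[1] else 1            # greedy choice, for simplicity
    o = 0 if random.random() < 0.7 else 1   # opponent plays 0 with p = 0.7

    # Reinforcement prediction error: obtained minus expected reward.
    rl_pe = payoff(a, o) - q[a]
    q[a] += alpha * rl_pe

    # Belief prediction error: observed opponent action minus current belief.
    for k in range(n_actions):
        belief_pe = (1.0 if o == k else 0.0) - belief[k]
        belief[k] += eta * belief_pe

    # Expected payoffs under the belief feed back into the action values.
    for k in range(n_actions):
        q[k] = 0.5 * q[k] + 0.5 * sum(belief[j] * payoff(k, j)
                                      for j in range(n_actions))

print([round(b, 2) for b in belief])  # beliefs approach the opponent's 0.7/0.3 mix
```

In the study, the two error signals generated this way served as regressors for the fMRI analysis, with the belief error explaining additional rACC activity.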

  5. Accelerator Toolbox for MATLAB

    International Nuclear Information System (INIS)

    Terebilo, Andrei

    2001-01-01

This paper introduces the Accelerator Toolbox (AT)--a collection of tools to model particle accelerators and beam transport lines in the MATLAB environment. At SSRL, it has become the modeling code of choice for the ongoing design and future operation of the SPEAR 3 synchrotron light source. AT was designed to take advantage of the power and simplicity of MATLAB, a commercially developed environment for technical computing and visualization. Many examples in this paper illustrate the advantages of the AT approach and contrast it with existing accelerator code frameworks
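The element-by-element modeling style that AT embodies can be illustrated with a small example. AT itself is a MATLAB toolbox, so this is only a Python analogue with invented element parameters, showing the idea of representing a beamline as composable transfer maps:

```python
import numpy as np

def drift(L):
    """2x2 horizontal transfer matrix of a drift of length L (m)."""
    return np.array([[1.0, L], [0.0, 1.0]])

def thin_quad(f):
    """Thin-lens quadrupole with focal length f (m)."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# A FODO-like line as an ordered list of element maps (parameters invented).
line = [drift(1.0), thin_quad(2.0), drift(2.0), thin_quad(-2.0), drift(1.0)]

# The one-pass map of the line is the composition of the element maps.
m = np.eye(2)
for element in line:
    m = element @ m

x0 = np.array([1e-3, 0.0])   # initial (x, x') in m, rad
print(m @ x0)                # transported coordinates
```

Tracking a particle is then a single matrix-vector product, and the determinant of the composed map stays 1, reflecting symplecticity in this linear 1-D case.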

  6. X-ray beam hardening correction for measuring density in linear accelerator industrial computed tomography

    International Nuclear Information System (INIS)

    Zhou Rifeng; Wang Jue; Chen Weimin

    2009-01-01

Because X-ray attenuation is approximately proportional to material density, inner density can be measured accurately from Industrial Computed Tomography (ICT) images. In practice, however, a number of factors, including the non-linear effects of beam hardening and diffuse scattered radiation, complicate the quantitative measurement of density variations in materials. This paper builds on the linearization method of beam-hardening correction, using polynomial fitting coefficients obtained from the curvature of polychromatic beam data for iron to fit other materials. Through theoretical deduction, the paper proves that the density measurement error is less than 2% if pre-filters are used to confine the spectrum of the linear accelerator mainly to the range 0.3 MeV to 3 MeV. An experiment was set up on an ICT system with a 9 MeV electron linear accelerator, with satisfactory results. This technique makes the beam-hardening correction easy and simple, and it is valuable for measuring density with ICT and for using the CT images to identify materials. (authors)
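The linearization correction can be sketched as follows. The spectral weights and attenuation coefficients below are invented for illustration; they are not the paper's iron calibration data:

```python
import numpy as np

# Toy spectrum after pre-filtering: four energies in the 0.5-3 MeV range.
weights = np.array([0.2, 0.4, 0.3, 0.1])   # normalized spectral weights
mu = np.array([0.65, 0.45, 0.33, 0.28])    # attenuation per energy, 1/cm

def poly_projection(t):
    """Polychromatic projection -ln(I/I0) of a thickness t (cm)."""
    return -np.log(np.sum(weights * np.exp(-mu * t)))

# Calibration on known thicknesses of a step wedge.
t_cal = np.linspace(0.0, 20.0, 40)
p_cal = np.array([poly_projection(t) for t in t_cal])

# The ideal (monochromatic-like) projection uses an effective coefficient.
mu_eff = np.sum(weights * mu)
p_lin = mu_eff * t_cal

# Correction polynomial mapping measured projections onto the linear ones.
coeffs = np.polyfit(p_cal, p_lin, deg=4)

# Applying the correction to a measured projection of a 10 cm part:
p_meas = poly_projection(10.0)
p_corr = np.polyval(coeffs, p_meas)
print(p_corr / mu_eff)   # recovered thickness, close to 10 cm
```

After this mapping, projections are again proportional to thickness times attenuation, so reconstructed CT values can be related to density; the paper's contribution is showing that coefficients fitted on iron transfer to other materials within 2% when the spectrum is pre-filtered.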

  7. Stratway: A Modular Approach to Strategic Conflict Resolution

    Science.gov (United States)

    Hagen, George E.; Butler, Ricky W.; Maddalon, Jeffrey M.

    2011-01-01

In this paper we introduce Stratway, a modular approach to finding long-term strategic resolutions to conflicts between aircraft. The modular approach provides both advantages and disadvantages. Our primary concern is to investigate the implications for the verification of safety-critical properties of a strategic resolution algorithm. By partitioning the problem into verifiable modules, much stronger verification claims can be established. Since strategic resolution involves searching for solutions over an enormous state space, Stratway, like most similar algorithms, searches these spaces by applying heuristics, which present especially difficult verification challenges. An advantage of a modular approach is that it makes a clear distinction between the resolution function and the trajectory generation function, which allows the resolution computation to be independent of any particular vehicle. The Stratway algorithm was developed in both Java and C++ and is available under an open-source license. Additionally, there is a visualization application that is helpful for analyzing and quickly creating conflict scenarios.

  8. Strategic financial analysis: the CFO's role in strategic planning.

    Science.gov (United States)

    Litos, D M

    1985-03-01

    Strategic financial analysis, the financial information support system for the strategic planning process, provides information vital to maintaining a healthy bottom line. This article, the third in HCSM's series on the organizational components of strategic planning, reviews the role of the chief financial officer in determining which programs and services will best meet the future needs of the institution.

  9. Mapping strategic diversity: strategic thinking from a variety of perspectives

    NARCIS (Netherlands)

    Jacobs, D.

    2010-01-01

    In the influential work Strategy Safari, Henry Mintzberg and his colleagues presented ten schools of strategic thought. In this impressive book, Dany Jacobs demonstrates that the real world of strategic management is much wider and richer. In Mapping Strategic Diversity, Jacobs distinguishes

  10. Checkpointing for a hybrid computing node

    Science.gov (United States)

    Cher, Chen-Yong

    2016-03-08

    According to an aspect, a method for checkpointing in a hybrid computing node includes executing a task in a processing accelerator of the hybrid computing node. A checkpoint is created in a local memory of the processing accelerator. The checkpoint includes state data to restart execution of the task in the processing accelerator upon a restart operation. Execution of the task is resumed in the processing accelerator after creating the checkpoint. The state data of the checkpoint are transferred from the processing accelerator to a main processor of the hybrid computing node while the processing accelerator is executing the task.
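The checkpointing scheme above (snapshot into accelerator-local memory, resume immediately, drain to the main processor in the background) can be sketched generically. The class and field names here are invented for illustration; the deep copy and thread stand in for the local-memory snapshot and the accelerator-to-host transfer.

```python
import copy
import threading

class AcceleratorTask:
    """Toy stand-in for a task running on a processing accelerator."""
    def __init__(self):
        self.state = {"iteration": 0, "partial_sum": 0}

    def step(self):
        self.state["iteration"] += 1
        self.state["partial_sum"] += self.state["iteration"]

class HybridNode:
    def __init__(self):
        self.host_checkpoint = None  # main-processor copy of the checkpoint

    def checkpoint(self, task):
        # 1. Fast snapshot into "local memory" (modeled as a deep copy).
        local_snapshot = copy.deepcopy(task.state)
        # 2. The task resumes immediately; the host transfer proceeds
        #    concurrently with continued execution.
        t = threading.Thread(target=self._transfer, args=(local_snapshot,))
        t.start()
        return t

    def _transfer(self, snapshot):
        # Models the accelerator-to-host copy that overlaps execution.
        self.host_checkpoint = snapshot

node = HybridNode()
task = AcceleratorTask()
for _ in range(3):
    task.step()
xfer = node.checkpoint(task)   # checkpoint taken at iteration 3
task.step()                    # execution overlaps the transfer
xfer.join()
assert node.host_checkpoint["iteration"] == 3
assert task.state["iteration"] == 4
```

The point of the overlap is that a restart would resume from the host-side copy (iteration 3) even though the accelerator has already moved on.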

  11. Strategic Leadership Primer (Third Edition)

    Science.gov (United States)

    2010-01-01

    ...decision making ... STRATEGIC DECISION MAKING ... Strategic Change: There are several strategic decisions that involved... The Ontology of Strategic Decision Making: Strategic decisions are non-routine and involve both the art of leadership and the science of management... building consensus,”5 implicitly requires the capacity for strategic decision making ... The Complexity of Strategic Decision Making: Strategic...

  12. Recent Improvements to CHEF, a Framework for Accelerator Computations

    Energy Technology Data Exchange (ETDEWEB)

    Ostiguy, J.-F.; Michelotti, L.P.; /Fermilab

    2009-05-01

    CHEF is a body of software dedicated to accelerator-related computations. It consists of a hierarchical set of libraries and a stand-alone application based on the latter. The implementation language is C++; the code makes extensive use of templates and modern idioms such as iterators, smart pointers, and generalized function objects. CHEF has been described in a few contributions at previous conferences. In this paper, we provide an overview and discuss recent improvements. Formally, CHEF refers to two distinct but related things: (1) a set of class libraries; and (2) a stand-alone application based on these libraries. The application makes use of and exposes a subset of the capabilities provided by the libraries. CHEF has its ancestry in efforts started in the early nineties. At that time, A. Dragt, E. Forest [2], and others showed that ring dynamics can be formulated in a way that puts maps, rather than Hamiltonians, into a central role. Automatic differentiation (AD) techniques, which were just coming of age, were a natural fit in a context where maps are represented by their Taylor approximations. The initial vision, which CHEF carried over, was to develop a code that (1) concurrently supports conventional tracking and linear and non-linear map-based techniques; (2) avoids 'hardwired' approximations that are not under user control; and (3) provides building blocks for applications. C++ was adopted as the implementation language because of its comprehensive support for operator overloading and the equal status it confers to built-in and user-defined data types. It should be mentioned that the acceptance of AD techniques in accelerator science owes much to the pioneering work of Berz [1], who implemented, in Fortran, the first production-quality AD engine (the foundation for the code COSY). Nowadays other engines are available, but few are native C++ implementations. Although AD engines and map-based techniques are making their way into more traditional codes, e.g. [5
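The operator-overloading style of automatic differentiation that CHEF's C++ design exploits can be illustrated with a minimal forward-mode dual-number sketch. This is an illustrative toy, not CHEF's actual interface; it shows how overloaded arithmetic propagates derivatives (first-order Taylor data) through ordinary expressions.

```python
import math

class Dual:
    """A value together with its derivative, propagated by overloading."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule, applied automatically at every multiplication.
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def sin(x):
    # Chain rule for an elementary function.
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

# Seed x = 0.3 with unit derivative and evaluate f(x) = x*sin(x) + 2x:
x = Dual(0.3, 1.0)
y = x * sin(x) + 2 * x
# y.der is f'(0.3) = sin(0.3) + 0.3*cos(0.3) + 2, with no symbolic algebra.
assert abs(y.der - (math.sin(0.3) + 0.3 * math.cos(0.3) + 2)) < 1e-12
```

Extending the derivative slot to a truncated multivariate Taylor series is what turns this idea into the map-based machinery the abstract describes.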

  13. Recent Improvements to CHEF, a Framework for Accelerator Computations

    International Nuclear Information System (INIS)

    Ostiguy, J.-F.; Michelotti, L.P.

    2009-01-01

    CHEF is a body of software dedicated to accelerator-related computations. It consists of a hierarchical set of libraries and a stand-alone application based on the latter. The implementation language is C++; the code makes extensive use of templates and modern idioms such as iterators, smart pointers, and generalized function objects. CHEF has been described in a few contributions at previous conferences. In this paper, we provide an overview and discuss recent improvements. Formally, CHEF refers to two distinct but related things: (1) a set of class libraries; and (2) a stand-alone application based on these libraries. The application makes use of and exposes a subset of the capabilities provided by the libraries. CHEF has its ancestry in efforts started in the early nineties. At that time, A. Dragt, E. Forest [2], and others showed that ring dynamics can be formulated in a way that puts maps, rather than Hamiltonians, into a central role. Automatic differentiation (AD) techniques, which were just coming of age, were a natural fit in a context where maps are represented by their Taylor approximations. The initial vision, which CHEF carried over, was to develop a code that (1) concurrently supports conventional tracking and linear and non-linear map-based techniques; (2) avoids 'hardwired' approximations that are not under user control; and (3) provides building blocks for applications. C++ was adopted as the implementation language because of its comprehensive support for operator overloading and the equal status it confers to built-in and user-defined data types. It should be mentioned that the acceptance of AD techniques in accelerator science owes much to the pioneering work of Berz [1], who implemented, in Fortran, the first production-quality AD engine (the foundation for the code COSY). Nowadays other engines are available, but few are native C++ implementations. Although AD engines and map-based techniques are making their way into more traditional codes, e.g. [5], it is also

  14. The auroral electron accelerator

    International Nuclear Information System (INIS)

    Bryant, D.A.; Hall, D.S.

    1989-01-01

    A model of the auroral electron acceleration process is presented in which the electrons are accelerated resonantly by lower-hybrid waves. The essentially stochastic acceleration process is approximated for the purposes of computation by a deterministic model involving an empirically derived energy transfer function. The empirical function, which is consistent with all that is known of electron energization by lower-hybrid waves, allows many, possibly all, observed features of the electron distribution to be reproduced. It is suggested that the process occurs widely in both space and laboratory plasmas. (author)

  15. Hacking control systems, switching… accelerators off?

    CERN Multimedia

    Computer Security Team

    2013-01-01

    In response to our article in the last Bulletin, we received the following comment: “Wasn’t Stuxnet designed to stop the Iranian nuclear programme? Why then all this noise with regard to CERN accelerators? Don’t you realize that ‘computer security’ is not the raison d'être of CERN?”. Thank you for this golden opportunity to delve into this issue.   Given the sophistication of Stuxnet, it might have been hard to detect such a targeted attack against CERN, if at all. But this is not the point. There are much simpler risks for our accelerator complex and infrastructure. And, while “‘computer security’ is [indeed] not the raison d' être”, it is our collective responsibility to keep this risk at bay.   Examples? Just think of a simple computer virus infecting Windows-based control PCs connected to the accelerator network (the Technical Network, &ld...

  16. Flexusi Interface Builder For Computer Based Accelerator Monitoring And Control System

    CERN Document Server

    Kurakin, V G; Kurakin, P V

    2004-01-01

    We have developed computer code for designing any desired graphical user interface for a monitoring and control system at the executable level. This means that an operator can build up a measurement console consisting of virtual devices before or even during a real experiment, without recompiling the source file. Such functionality results in a number of advantages compared with traditional programming. First of all, any risk of introducing bugs into the source code disappears. Another important point is that program developers and operator staff do not interfere with each other in developing the ultimate product (the measurement console). Thus, a small team without a detailed project plan can design even a very complicated monitoring and control system. For the reasons mentioned above, the suggested approach is especially helpful for large complexes to be monitored and controlled, accelerators being among them. The program code consists of several modules responsible for data acquisition, control, and representation. Borland C++ Builder technologies based on VCL...
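The central idea above, assembling a measurement console from a data description at run time rather than recompiling code, can be sketched generically. The spec format, widget classes, and channel names here are invented for illustration (the original work uses Borland C++ Builder/VCL):

```python
# Hypothetical run-time console builder: the console is described by data,
# so an operator can rearrange virtual devices without recompiling.
WIDGETS = {}

def register(kind):
    def deco(cls):
        WIDGETS[kind] = cls
        return cls
    return deco

@register("meter")
class Meter:
    def __init__(self, channel):
        self.channel = channel
    def render(self, readings):
        return f"[meter {self.channel}: {readings[self.channel]:.2f}]"

@register("status")
class Status:
    def __init__(self, channel):
        self.channel = channel
    def render(self, readings):
        state = "OK" if readings[self.channel] > 0 else "FAULT"
        return f"[status {self.channel}: {state}]"

def build_console(spec):
    """Instantiate virtual devices from a declarative spec (list of dicts)."""
    return [WIDGETS[item["kind"]](item["channel"]) for item in spec]

# Operator-editable description, loadable even mid-experiment:
spec = [{"kind": "meter", "channel": "beam_current"},
        {"kind": "status", "channel": "vacuum"}]
console = build_console(spec)
readings = {"beam_current": 12.34, "vacuum": 1.0}
line = " ".join(w.render(readings) for w in console)
assert line == "[meter beam_current: 12.34] [status vacuum: OK]"
```

Because the console is just interpreted data, editing `spec` changes the interface with no rebuild, which is the property the abstract emphasizes.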

  17. Automatic generation of application specific FPGA multicore accelerators

    DEFF Research Database (Denmark)

    Hindborg, Andreas Erik; Schleuniger, Pascal; Jensen, Nicklas Bo

    2014-01-01

    High performance computing systems make increasing use of hardware accelerators to improve performance and power properties. For large high-performance FPGAs to be successfully integrated in such computing systems, methods to raise the abstraction level of FPGA programming are required...... to identify optimal performance energy trade-offs points for a multicore based FPGA accelerator....

  18. Assessing the Relationships among Cloud Adoption, Strategic Alignment and Information Technology Effectiveness

    Science.gov (United States)

    Chebrolu, Shankar Babu

    2010-01-01

    Against the backdrop of new economic realities, one of the larger forces that is affecting businesses worldwide is cloud computing, whose benefits include agility, time to market, time to capability, reduced cost, renewed focus on the core and strategic partnership with the business. Cloud computing can potentially transform a majority of the…

  19. ORNL 25 MV tandem accelerator control system

    International Nuclear Information System (INIS)

    Juras, R.C.; Biggerstaff, J.A.; Hoglund, D.E.

    1985-01-01

    The CAMAC-based control system for the 25 MV tandem electrostatic accelerator of the Holifield Heavy Ion Research Facility at Oak Ridge National Laboratory (ORNL) was specified by ORNL and built by the National Electrostatics Corporation. Two Perkin-Elmer 32-bit minicomputers are used in the system, a message switching computer and a supervisory computer. The message switching computer transmits and receives control information on six serial highways. This computer shares memory with the supervisory computer. Operator consoles are located on a serial highway; control is by means of a console CRT, trackball, and assignable shaft encoders and meters. Two identical consoles operate simultaneously: one is located in the tandem control room; the other is located in the cyclotron control room to facilitate operation during injection of tandem beams into the cyclotron or when beam lines under control of the cyclotron control system are used. The supervisory computer is used for accelerator parameter setup calculations, actual accelerator setup for new beams based on scaled, recorded parameters from previously run beams, and various other functions. Nearly seven years of control system operation and improvements will be discussed

  20. Strategic Planning for Computer-Based Educational Technology.

    Science.gov (United States)

    Bozeman, William C.

    1984-01-01

    Offers educational practitioners direction for the development of a master plan for the implementation and application of computer-based educational technology by briefly examining computers in education, discussing organizational change from a theoretical perspective, and presenting an overview of the planning strategy known as the planning and…

  1. The Computer Program LIAR for Beam Dynamics Calculations in Linear Accelerators

    International Nuclear Information System (INIS)

    Assmann, R.W.; Adolphsen, C.; Bane, K.; Raubenheimer, T.O.; Siemann, R.H.; Thompson, K.

    2011-01-01

    Linear accelerators are the central components of the proposed next generation of linear colliders. They need to provide acceleration of up to 750 GeV per beam while maintaining very small normalized emittances. Standard simulation programs, mainly developed for storage rings, do not meet the specific requirements of high-energy linear accelerators. We present a new program, LIAR ('LInear Accelerator Research code'), that includes wakefield effects, a 6D coupled beam description, specific optimization algorithms, and other advanced features. Its modular structure allows it to be used and extended easily for different purposes. The program is available for UNIX workstations and Windows PCs. It can be applied to a broad range of accelerators. We present examples of simulations for the SLC and NLC.

  2. Strategic cost management as the main component of strategic management accounting

    OpenAIRE

    Ходзицька, Валентина Василівна

    2013-01-01

    The paper analyzes the influence of cost management on management decision making and on the functioning of the system of strategic management accounting. The main aspects of the influence of strategic management accounting on making effective management decisions in the system of integrated management of business entities are highlighted, and the scope of organizational activity covered by strategic management accounting is described. The paper shows the orientation of strategic manag...

  3. Spacetime transformations from a uniformly accelerated frame

    International Nuclear Information System (INIS)

    Friedman, Yaakov; Scarr, Tzvi

    2013-01-01

    We use the generalized Fermi–Walker transport to construct a one-parameter family of inertial frames which are instantaneously comoving to a uniformly accelerated observer. We explain the connection between our approach and that of Mashhoon. We show that our solutions of uniformly accelerated motion have constant acceleration in the comoving frame. Assuming the weak hypothesis of locality, we obtain local spacetime transformations from a uniformly accelerated frame K′ to an inertial frame K. The spacetime transformations between two uniformly accelerated frames with the same acceleration are Lorentz. We compute the metric at an arbitrary point of a uniformly accelerated frame. (paper)
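For the simplest special case, the familiar textbook worldline of one-dimensional uniform acceleration $a$ (a standard result, not the paper's general Fermi–Walker construction) already shows the hyperbolic structure involved:

```latex
x(\tau) = \frac{c^2}{a}\left(\cosh\frac{a\tau}{c} - 1\right), \qquad
c\,t(\tau) = \frac{c^2}{a}\sinh\frac{a\tau}{c},
```

where $\tau$ is the observer's proper time. The instantaneous comoving frame at $\tau$ is related to the inertial frame $K$ by a Lorentz boost of rapidity $a\tau/c$, which is consistent with the paper's finding that transformations between two uniformly accelerated frames with the same acceleration are Lorentz.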

  4. Strategic marketing research

    NARCIS (Netherlands)

    Bijmolt, Tammo H.A.; Frambach, Ruud T.; Verhallen, Theo M.M.

    1996-01-01

    This article introduces the term “strategic marketing research” for the collection and analysis of data in support of strategic marketing management. In particular, strategic marketing research plays an important role in defining the market, analysis of the environment, and the formulation of

  5. Strategic information security

    CERN Document Server

    Wylder, John

    2003-01-01

    Introduction to Strategic Information Security: What Does It Mean to Be Strategic?; Information Security Defined; The Security Professional's View of Information Security; The Business View of Information Security; Changes Affecting Business and Risk Management; Strategic Security; Strategic Security or Security Strategy?; Monitoring and Measurement; Moving Forward. ORGANIZATIONAL ISSUES: The Life Cycles of Security Managers; Introduction; The Information Security Manager's Responsibilities; The Evolution of Data Security to Information Security; The Repository Concept; Changing Job Requirements; Business Life Cycles

  6. Economic Modeling as a Component of Academic Strategic Planning.

    Science.gov (United States)

    MacKinnon, Joyce; Sothmann, Mark; Johnson, James

    2001-01-01

    Computer-based economic modeling was used to enable a school of allied health to define outcomes, identify associated costs, develop cost and revenue models, and create a financial planning system. As a strategic planning tool, it assisted realistic budgeting and improved efficiency and effectiveness. (Contains 18 references.) (SK)

  7. Strategic Responsiveness

    DEFF Research Database (Denmark)

    Pedersen, Carsten; Juul Andersen, Torben

    The analysis of major resource committing decisions is a central focus in the strategy field, but despite decades of rich conceptual and empirical research we still seem distant from a level of understanding that can guide corporate practices under dynamic and unpredictable conditions. Strategic decision making is often conceived as 'standing on the two feet' of deliberate or intended strategic decisions by top management and emergent strategic decisions pursued by lower-level managers and employees. In this view, the paper proposes that bottom-up initiatives have a hard time surfacing in hierarchical organizations and that lower-level managers and employees, therefore, pursue various strategies to bypass the official strategy processes to act on emerging strategic issues and adapt to changing environmental conditions.

  8. “Change is constant in today’s business for competitive advantage. Strategic leadership is vital for effective strategic change management - roles & responsibilities and strategic capability of strategic leadership.”

    OpenAIRE

    Chia, Grace Hui Yen

    2009-01-01

    The aim of this paper is to seek to understand the reasons why change is constant in today’s business for competitive advantage. To make strategic change happen and achieve the desired outcome, what will be the right strategic process flow? What are the key challenges that will be encountered throughout the process of strategic change management? This paper will also examine whether strategic leadership is vital to making strategic change happen in an effective way, since many literatu...

  9. Strategic growth options

    NARCIS (Netherlands)

    Kulatilaka, N.; Perotti, E.C.

    1998-01-01

    We provide a strategic rationale for growth options under uncertainty and imperfect competition. In a market with strategic competition, investment confers a greater capability to take advantage of future growth opportunities. This strategic advantage leads to the capture of a greater share of the

  10. 11. Strategic planning.

    Science.gov (United States)

    2014-05-01

    There are several types of planning processes and plans, including strategic, operational, tactical, and contingency. For this document, operational planning includes tactical planning. This chapter examines the strategic planning process and includes an introduction into disaster response plans. "A strategic plan is an outline of steps designed with the goals of the entire organisation as a whole in mind, rather than with the goals of specific divisions or departments". Strategic planning includes all measures taken to provide a broad picture of what must be achieved and in which order, including how to organise a system capable of achieving the overall goals. Strategic planning often is done pre-event, based on previous experience and expertise. The strategic planning for disasters converts needs into a strategic plan of action. Strategic plans detail the goals that must be achieved. The process of converting needs into plans has been deconstructed into its components and includes consideration of: (1) disaster response plans; (2) interventions underway or planned; (3) available resources; (4) current status vs. pre-event status; (5) history and experience of the planners; and (6) access to the affected population. These factors are tempered by the local: (a) geography; (b) climate; (c) culture; (d) safety; and (e) practicality. The planning process consumes resources (costs). All plans must be adapted to the actual conditions--things never happen exactly as planned.

  11. Implementation Of Strategic Management

    African Journals Online (AJOL)

    Administrator

    Creativity and innovation is the new game plan inherent in strategic .... The diagram below is a simplified operational model of strategic management, ..... Bryson (1995) outlines four benefits of strategic (planning) Management in his ... champions, good strategic planning teams, enough slack to handle potentially disruptive.

  12. Multinational Corporation and International Strategic Alliance

    Institute of Scientific and Technical Information of China (English)

    陆兮

    2015-01-01

    The world is now deep into the second great wave of globalization, in which products, capital, and markets are becoming more and more integrated across countries. Multinational corporations are experiencing rapid growth around the globe and playing a significant role in the world economy. Meanwhile, the accelerated rate of globalization has also imposed pressures on MNCs, leaving them desperately seeking overseas alliances in order to remain competitive. International strategic alliances, which bring together large and commonly competitive firms for specific purposes, have gradually shown their importance in the world market, and the form of the international joint venture is now widely adopted. After the formation of an alliance, selecting the right partner, formulating the right strategies, and establishing a harmonious and effective partnership are generally the keys to success.

  13. The Los Alamos accelerator code group

    International Nuclear Information System (INIS)

    Krawczyk, F.L.; Billen, J.H.; Ryne, R.D.; Takeda, Harunori; Young, L.M.

    1995-01-01

    The Los Alamos Accelerator Code Group (LAACG) is a national resource for members of the accelerator community who use and/or develop software for the design and analysis of particle accelerators, beam transport systems, light sources, storage rings, and components of these systems. Below the authors describe the LAACG's activities in high performance computing, maintenance and enhancement of POISSON/SUPERFISH and related codes and the dissemination of information on the INTERNET

  14. Alliance Coordination, Dysfunctions, and the Protection of Idiosyncratic Knowledge in Strategic Learning Alliances

    OpenAIRE

    Müller, Dirk

    2010-01-01

    In high-technology industries, firms use strategic learning alliances to create value that cannot be created alone. While they open their interorganizational membrane to gain new skills and competences, generate new products and services, accelerate development speed, and enter new markets, their idiosyncratic knowledge base may be impaired when knowledge-related dysfunctions such as unintended knowledge transfer, asymmetric learning speed, or premature closing occur. Within a value approac...

  15. Cyber Conflict Between Taiwan and China; Strategic Insights, Spring 2011

    OpenAIRE

    Chang, Yao-chung

    2011-01-01

    This article appeared in Strategic Insights, Spring 2011 The Republic of China (Taiwan hereafter) and the People’s Republic of China (China hereafter) are two particularly attractive targets for internet hackers. Reports have found that, compared to other countries in the Asia and Pacific regions, China and Taiwan rank as the top two countries in terms of malicious computer activity. Reports have also shown that most hacking into Taiwanese computer systems is initiated from wit...

  16. Reactor and /or accelerator: general remarks on strategic considerations in sourcing/producing radiopharmaceuticals and radiotracer for the Philippines

    International Nuclear Information System (INIS)

    Nazarea, A.D.

    1996-01-01

    The most important sources of radionuclides in the world are particle accelerators and nuclear reactors. Since the late 1940s, many radiotracers and radiopharmaceuticals have been conceived, designed, produced, and applied in important industrial and clinical/biomedical settings. For example, in the health area, reactor-produced radionuclides have become indispensable for diagnostic imaging involving, in its most recent and advanced development, radioimmunoscintigraphy, which exploits the exquisite ligand-specificity of monoclonal antibodies, reagents which in turn are the products of advances in biotechnology. Thus far, one of the most indispensable radiopharmaceuticals has been 99mTc, which is usually obtained as a daughter decay product of 99Mo. In January 1991, questions about the stability of the worldwide commercial supply of 99Mo were highlighted when the major commercial world producer of 99Mo, Nordion International, temporarily shut down its facilities in Canada due to contamination in its main reactor building (see for instance the relevant newsbrief in J. Nuclear Medicine (1991): 'Industry agrees to join DOE study of domestic moly-99 production'). With the above background, my remarks will attempt to open discussions on strategic considerations relevant to questions of 'self-reliance' in radiotracer/radiopharmaceutical production in the Philippines: for instance, sourcing local radionuclide needs from a fully functioning multipurpose cyclotron facility within the country that would supply the needs of the local industrial, biomedical (including research), and health sectors, and possibly, eventually, acquiring the capability to export longer-lived radiotracers and radiopharmaceuticals to nearby countries

  17. Accelerated time-of-flight (TOF) PET image reconstruction using TOF bin subsetization and TOF weighting matrix pre-computation

    International Nuclear Information System (INIS)

    Mehranian, Abolfazl; Kotasidis, Fotis; Zaidi, Habib

    2016-01-01

    FDG-PET study also revealed that, for the same noise level, a higher contrast recovery can be obtained by increasing the number of TOF subsets. It can be concluded that the proposed TOF weighting matrix pre-computation and subsetization approaches make it possible to further accelerate and improve the convergence properties of the OSEM and MLEM algorithms, thus opening new avenues for accelerated TOF PET image reconstruction. (paper)
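The subset-based acceleration behind OSEM can be sketched on a toy problem. This is a generic, non-TOF illustration (the paper subsetizes TOF bins of real scanner data); the system matrix, sizes, and noiseless data are all invented for the example, and MLEM appears as the one-subset special case.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.uniform(0.1, 1.0, size=(16, 8))  # toy system matrix: 16 LORs, 8 voxels
x_true = rng.uniform(1.0, 5.0, size=8)
y = A @ x_true                           # noiseless projection data

def osem(A, y, n_subsets, n_iter):
    """OSEM: one multiplicative EM update per data subset per iteration."""
    x = np.ones(A.shape[1])
    subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
    for _ in range(n_iter):
        for s in subsets:
            As = A[s]
            # EM update restricted to this subset's rows:
            x *= (As.T @ (y[s] / (As @ x))) / As.sum(axis=0)
    return x

mlem = osem(A, y, n_subsets=1, n_iter=20)  # MLEM = OSEM with a single subset
fast = osem(A, y, n_subsets=4, n_iter=20)  # 4 subsets: more updates per pass
# With the same number of full passes over the data, the subsetized run
# is closer to the true image, which is the acceleration being exploited.
assert np.linalg.norm(fast - x_true) < np.linalg.norm(mlem - x_true)
```

The paper's contribution is orthogonal to this sketch: it additionally pre-computes the TOF weighting matrix so that each subset update is cheaper as well as more frequent.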

  18. Strategic Forecasting

    DEFF Research Database (Denmark)

    Duus, Henrik Johannsen

    2016-01-01

    Purpose: The purpose of this article is to present an overview of the area of strategic forecasting and its research directions and to put forward some ideas for improving management decisions. Design/methodology/approach: This article is conceptual but also informed by the author’s long contact...... and collaboration with various business firms. It starts by presenting an overview of the area and argues that the area is as much a way of thinking as a toolbox of theories and methodologies. It then spells out a number of research directions and ideas for management. Findings: Strategic forecasting is seen...... as a rebirth of long range planning, albeit with new methods and theories. Firms should make the building of strategic forecasting capability a priority. Research limitations/implications: The article subdivides strategic forecasting into three research avenues and suggests avenues for further research efforts...

  19. Strategic market segmentation

    Directory of Open Access Journals (Sweden)

    Maričić Branko R.

    2015-01-01

    Strategic planning of marketing activities is the basis of business success in the modern business environment. Customers are not homogeneous in their preferences and expectations. Formulating an adequate marketing strategy, focused on the realization of the company's strategic objectives, requires a segmented approach to the market that appreciates differences in the expectations and preferences of customers. One of the significant activities in the strategic planning of marketing is market segmentation. Strategic planning imposes a need to plan marketing activities according to strategically important segments on a long-term basis. At the same time, there is a need to revise and adapt marketing activities on a short-term basis. There are a number of criteria on which market segmentation can be based. The paper considers the effectiveness and efficiency of different market segmentation criteria based on empirical research into customer expectations and preferences. The analysis includes traditional criteria and criteria based on a behavioral model. The research implications are analyzed from the perspective of selecting the most adequate market segmentation criteria in the strategic planning of marketing activities.

  20. Strategic serendipity

    DEFF Research Database (Denmark)

    Knudsen, Gry Høngsmark; Lemmergaard, Jeanette

    2014-01-01

    This paper contributes to critical voices on the issue of strategic communication. It does so by exploring how an organisation can seize the moment of serendipity based on careful preparation of its issues management and communication channels. The focus of the study is the media coverage......-of-the-art knowledge and in-depth understanding of the affordances of different communication channels, we discuss the importance of establishing opportunities for serendipity in strategic communication planning. The contribution of the paper is to develop the concept of strategic serendipity and show how...

  1. The Los Alamos accelerator code group

    Energy Technology Data Exchange (ETDEWEB)

    Krawczyk, F.L.; Billen, J.H.; Ryne, R.D.; Takeda, Harunori; Young, L.M.

    1995-05-01

    The Los Alamos Accelerator Code Group (LAACG) is a national resource for members of the accelerator community who use and/or develop software for the design and analysis of particle accelerators, beam transport systems, light sources, storage rings, and components of these systems. Below the authors describe the LAACG's activities in high performance computing, maintenance and enhancement of POISSON/SUPERFISH and related codes and the dissemination of information on the INTERNET.

  2. Superconductor Requirements and Characterization for High Field Accelerator Magnets

    Energy Technology Data Exchange (ETDEWEB)

    Barzi, E.; Zlobin, A. V.

    2015-05-01

    The 2014 Particle Physics Project Prioritization Panel (P5) strategic plan for U.S. High Energy Physics (HEP) endorses a continued world leadership role in superconducting magnet technology for future Energy Frontier programs. This includes 10 to 15 T Nb3Sn accelerator magnets for LHC upgrades and a future 100 TeV scale pp collider and, as an ultimate goal, the development of magnet technologies above 20 T, based on both High Temperature Superconductors (HTS) and Low Temperature Superconductors (LTS), for accelerator magnets. To achieve these objectives, a sound conductor development and characterization program is needed and is herein described. This program is intended to be conducted in close collaboration with U.S. and international labs, universities, and industry.

  3. Modern control techniques for accelerators

    International Nuclear Information System (INIS)

    Goodwin, R.W.; Shea, M.F.

    1984-01-01

    Beginning in the mid to late sixties, most new accelerators were designed to include computer based control systems. Although each installation differed in detail, the technology of the sixties and early to mid seventies dictated an architecture that was essentially the same for the control systems of that era. A mini-computer was connected to the hardware and to a console. Two developments have changed the architecture of modern systems: the microprocessor and local area networks. This paper discusses these two developments and demonstrates their impact on control system design and implementation by way of describing a possible architecture for any size of accelerator. Both hardware and software aspects are included

  4. Healthcare's Future: Strategic Investment in Technology.

    Science.gov (United States)

    Franklin, Michael A

    2018-01-01

    Recent and rapid advances in the implementation of technology have greatly affected the quality and efficiency of healthcare delivery in the United States. Simultaneously, diverse generational pressures, including the consumerism of millennials and unsustainable growth in the costs of care for baby boomers, have accelerated a revolution in healthcare delivery that was marked in 2010 by the passage of the Affordable Care Act. Against this backdrop, Maryland and the Centers for Medicare & Medicaid Services entered into a partnership in 2014 to modernize the Maryland All-Payer Model. Under this architecture, each Maryland hospital negotiates a global budget revenue agreement with the state's rate-setting agency, limiting the hospital's annual revenue to the budgetary cap established by the state. At Atlantic General Hospital (AGH), leaders had established a disciplined strategic planning process in which the board of trustees, medical staff, and administration annually agree on goals and initiatives to achieve the objectives set forth in its five-year strategic plans. This article describes two initiatives to improve care using technology. In 2006, AGH introduced a service guarantee in the emergency room (ER); the ER 30-Minute Promise assures patients that they will be placed in a bed or receive care within 30 minutes of arrival in the ER. In 2007, several independent hospitals in the state formed Maryland eCare to jointly contract for intensive care unit (ICU) physician coverage via telemedicine. This technology allows clinical staff to continuously monitor ICU patients remotely. The positive results of the ER 30-Minute Promise and the Maryland eCare program show that technological advances in a small, independent, rural hospital can make a significant impact on its ability to maintain independence. AGH's strategic investments prepared the organization well for the transition in 2014 to a value-based payment system.

  5. Strategic Innovation Capacity: A Mixed Method Study on Deliberate Strategic Learning Mechanisms

    OpenAIRE

    Berghman, Liselore

    2006-01-01

    textabstractSeveral management scholars have come to propound strategic innovation as an effective means to create new and substantially superior customer value, and to combat firms’ inclination towards strategic convergence. Research on strategic innovation is however still in its infancy, tends to lack scientific rigor and has so far proven unable to provide managers with well-founded insights into the specifics of strategic innovation creation. This research therefore aims to study mechani...

  6. Accelerated Adaptive MGS Phase Retrieval

    Science.gov (United States)

    Lam, Raymond K.; Ohara, Catherine M.; Green, Joseph J.; Bikkannavar, Siddarayappa A.; Basinger, Scott A.; Redding, David C.; Shi, Fang

    2011-01-01

    The Modified Gerchberg-Saxton (MGS) algorithm is an image-based wavefront-sensing method that can turn any science instrument focal plane into a wavefront sensor. MGS characterizes optical systems by estimating the wavefront errors in the exit pupil using only intensity images of a star or other point source of light. This innovative implementation of MGS significantly accelerates the MGS phase retrieval algorithm by using stream-processing hardware on conventional graphics cards. Stream processing is a relatively new, yet powerful, paradigm that allows parallel processing of applications that apply single instructions to multiple data (SIMD). These stream processors are designed specifically to support large-scale parallel computing on a single graphics chip. Computationally intensive algorithms, such as the Fast Fourier Transform (FFT), are particularly well suited to this computing environment. This high-speed version of MGS exploits commercially available hardware to accomplish the same objective in a fraction of the original time, by performing matrix calculations on NVIDIA graphics cards. The graphics processing unit (GPU) is hardware specialized for computationally intensive, highly parallel computation. From the software perspective, a parallel programming model called CUDA is used to transparently scale multicore parallelism in hardware. This technology gives computationally intensive applications access to the processing power of NVIDIA GPUs through a C/C++ programming interface. The AAMGS (Accelerated Adaptive MGS) software takes advantage of these advanced technologies to accelerate the optical phase error characterization. With a single PC that contains four NVIDIA GTX-280 graphics cards, the new implementation can process four images simultaneously to produce a JWST (James Webb Space Telescope) wavefront measurement 60 times faster than the previous code.
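
    As a hedged illustration of the batched-FFT pattern this record describes (NumPy stands in for the CUDA/GPU backend here, and the function name is hypothetical, not part of the AAMGS code), processing a stack of images with one vectorized FFT call mirrors the SIMD idea of applying a single instruction to multiple data:

```python
import numpy as np

def pupil_spectra(image_stack):
    """Propagate a stack of focal-plane intensity images with a single
    batched 2-D FFT call, the SIMD-style pattern a GPU backend would
    exploit (NumPy standing in for CUDA)."""
    return np.fft.fft2(np.asarray(image_stack), axes=(-2, -1))

# Four "star images" processed simultaneously, as in the four-card
# configuration described in the record.
images = np.random.default_rng(0).random((4, 64, 64))
spectra = pupil_spectra(images)
```

    On a GPU the same batched call maps naturally onto cuFFT's batch mode; the structure of the computation is unchanged.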

  7. Scientific Computing Strategic Plan for the Idaho National Laboratory

    International Nuclear Information System (INIS)

    Whiting, Eric Todd

    2015-01-01

    Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory's (INL's) challenge and charge, and is central to INL's ongoing success. Computing is an essential part of INL's future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.

  8. Research and simulation of intense pulsed beam transfer in electrostatic accelerate tube

    International Nuclear Information System (INIS)

    Li Chaolong; Shi Haiquan; Lu Jianqin

    2012-01-01

    To study intense pulsed beam transfer in the electrostatic accelerate tube, the matrix method was applied to analyze the transport matrices of the electrostatic accelerate tube for both non-intense and intense pulsed beams, and a computer code was written for intense pulsed beam transport in the electrostatic accelerate tube. In this code, optimization techniques are used to attain the given optical conditions, and iteration procedures are adopted to compute the intense pulsed beam and obtain self-consistent solutions. Calculations were carried out using ACCT, TRACE-3D and TRANSPORT for different beam currents. The simulation results show that increasing the accelerating voltage ratio can enhance the focusing power of the electrostatic accelerate tube, reduce beam loss and increase the transfer efficiency. (authors)
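
    The matrix method mentioned above can be sketched in a few lines. The element matrices below are generic first-order beam optics (a field-free drift and a thin-lens model of an accelerating gap), chosen for illustration; they are not the actual ACCT formulation:

```python
import numpy as np

# First-order transfer matrices in one transverse plane;
# the state vector is (x, x').
def drift(L):
    # Field-free drift of length L.
    return np.array([[1.0, L],
                     [0.0, 1.0]])

def accel_gap(f):
    # To first order an accelerating gap focuses like a thin lens of
    # focal length f; a higher accelerating voltage ratio corresponds
    # to a shorter f, i.e. stronger focusing.
    return np.array([[1.0, 0.0],
                     [-1.0 / f, 1.0]])

# A short tube section: drift, gap, drift. The total transfer matrix
# is the right-to-left product of the element matrices.
M = drift(0.1) @ accel_gap(0.5) @ drift(0.1)
x_out = M @ np.array([0.01, 0.0])   # ray entering 10 mm off axis
```

    Chaining such matrices along the tube, and iterating them against the space-charge forces of the beam itself, is the self-consistent loop the abstract describes.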

  9. Strategic management thinking and practice in the public sector: A strategic planning for all seasons?

    OpenAIRE

    Johnsen, Åge

    2014-01-01

    This paper explores how strategic management thinking manifests itself in strategic management practice in the public sector. Mintzberg’s framework of 10 strategic management schools of thought is chosen for mapping strategic management thinking. The paper analyses a convenience sample of 35 strategic management processes, observation of an agency’s strategy reformulation process and interviews of managers in the public sector in Norway for informing the discussion. Strategic planning is heav...

  10. Accelerated Synchrotron X-ray Diffraction Data Analysis on a Heterogeneous High Performance Computing System

    Energy Technology Data Exchange (ETDEWEB)

    Qin, J; Bauer, M A, E-mail: qin.jinhui@gmail.com, E-mail: bauer@uwo.ca [Computer Science Department, University of Western Ontario, London, ON N6A 5B7 (Canada)

    2010-11-01

    The analysis of synchrotron X-ray Diffraction (XRD) data has been used by scientists and engineers to understand and predict properties of materials. However, the large volume of XRD image data and the intensive computations involved in the data analysis make it hard for researchers to quickly reach any conclusions about the images from an experiment when using conventional XRD data analysis software. Synchrotron time is valuable, and delays in XRD data analysis can impact decisions about subsequent experiments or about the materials being investigated. In order to improve data analysis performance, ideally to achieve near real-time data analysis during an XRD experiment, we designed and implemented software for accelerated XRD data analysis. The software has been developed for a heterogeneous high performance computing (HPC) system, comprised of IBM PowerXCell 8i processors and Intel quad-core Xeon processors. This paper describes the software and reports on the improved performance. The results indicate that it is possible for XRD data to be analyzed at the rate it is being produced.

  11. Accelerated Synchrotron X-ray Diffraction Data Analysis on a Heterogeneous High Performance Computing System

    International Nuclear Information System (INIS)

    Qin, J; Bauer, M A

    2010-01-01

    The analysis of synchrotron X-ray Diffraction (XRD) data has been used by scientists and engineers to understand and predict properties of materials. However, the large volume of XRD image data and the intensive computations involved in the data analysis make it hard for researchers to quickly reach any conclusions about the images from an experiment when using conventional XRD data analysis software. Synchrotron time is valuable, and delays in XRD data analysis can impact decisions about subsequent experiments or about the materials being investigated. In order to improve data analysis performance, ideally to achieve near real-time data analysis during an XRD experiment, we designed and implemented software for accelerated XRD data analysis. The software has been developed for a heterogeneous high performance computing (HPC) system, comprised of IBM PowerXCell 8i processors and Intel quad-core Xeon processors. This paper describes the software and reports on the improved performance. The results indicate that it is possible for XRD data to be analyzed at the rate it is being produced.

  12. Strategic Innovation Capacity: A Mixed Method Study on Deliberate Strategic Learning Mechanisms

    NARCIS (Netherlands)

    L.A. Berghman (Liselore)

    2006-01-01

    textabstractSeveral management scholars have come to propound strategic innovation as an effective means to create new and substantially superior customer value, and to combat firms’ inclination towards strategic convergence. Research on strategic innovation is however still in its infancy, tends to

  13. Berkeley Lab Computing Sciences: Accelerating Scientific Discovery

    International Nuclear Information System (INIS)

    Hules, John A.

    2008-01-01

    Scientists today rely on advances in computer science, mathematics, and computational science, as well as large-scale computing and networking facilities, to increase our understanding of ourselves, our planet, and our universe. Berkeley Lab's Computing Sciences organization researches, develops, and deploys new tools and technologies to meet these needs and to advance research in such areas as global climate change, combustion, fusion energy, nanotechnology, biology, and astrophysics

  14. A theoretical investigation of the collective acceleration of cluster ions with accelerated potential waves

    International Nuclear Information System (INIS)

    Suzuki, Hiroshi; Enjoji, Hiroshi; Kawaguchi, Motoichi; Noritake, Toshiya

    1984-01-01

    A theoretical treatment of the acceleration of cluster ions for additional heating of fusion plasma, using the trapping effect in an accelerated potential wave, is described. The conceptual design of the accelerator is the same as that by Enjoji, and the potential wave used is sinusoidal. For simplicity, collisions among cluster ions and the resulting breakups are neglected. The masses of the cluster ions are specified to range from 100 m_D to 1000 m_D (m_D: mass of a deuterium atom). The theoretical treatment is carried out only for an injection velocity which coincides with the phase velocity of the applied wave at the entrance of the accelerator. An equation describing the rate of successful acceleration of ions with a certain mass is deduced for continuous injection of cluster ions. Computation for a typical mass distribution shows that more than 70% of the injected particles are effectively accelerated. (author)
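
    A minimal single-particle sketch of the trapping condition, assuming a textbook pendulum-like bucket whose depth is proportional to the wave amplitude (an illustrative approximation, not the authors' derivation; all parameter values are made up):

```python
# Single-particle trapping estimate in the wave frame.
def is_trapped(m, q, V0, v_particle, v_phase):
    # Kinetic energy of the particle relative to the wave...
    dE = 0.5 * m * (v_particle - v_phase) ** 2
    # ...must stay below the separatrix energy of the sinusoidal
    # bucket, taken here as 2*q*V0.
    return dE < 2.0 * q * V0

# Injection exactly at the phase velocity (the case treated in the
# record above) is trapped regardless of cluster mass.
trapped_all = all(is_trapped(m, 1.0, 0.1, 1.0, 1.0)
                  for m in (100.0, 500.0, 1000.0))
```

    A particle injected off the phase velocity must satisfy the inequality, which is harder for heavier clusters; this is why the mass distribution matters for the computed acceleration fraction.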

  15. Nonlinear theory of diffusive acceleration of particles by shock waves

    Energy Technology Data Exchange (ETDEWEB)

    Malkov, M.A. [University of California at San Diego, La Jolla, CA (United States)]. E-mail: mmalkov@ucsd.edu; Drury, L. O' C. [Dublin Institute for Advanced Studies, 5 Merrion Square, Dublin 2 (Ireland)

    2001-04-01

    Among the various acceleration mechanisms which have been suggested as responsible for the nonthermal particle spectra and associated radiation observed in many astrophysical and space physics environments, diffusive shock acceleration appears to be the most successful. We review the current theoretical understanding of this process, from the basic ideas of how a shock energizes a few reactionless particles to the advanced nonlinear approaches treating the shock and accelerated particles as a symbiotic self-organizing system. By means of direct solution of the nonlinear problem we set the limit to the test-particle approximation and demonstrate the fundamental role of nonlinearity in shocks of astrophysical size and lifetime. We study the bifurcation of this system, proceeding from the hydrodynamic to kinetic description under a realistic condition of Bohm diffusivity. We emphasize the importance of collective plasma phenomena for the global flow structure and acceleration efficiency by considering the injection process, an initial stage of acceleration and, the related aspects of the physics of collisionless shocks. We calculate the injection rate for different shock parameters and different species. This, together with differential acceleration resulting from nonlinear large-scale modification, determines the chemical composition of accelerated particles. The review concentrates on theoretical and analytical aspects but our strategic goal is to link the fundamental theoretical ideas with the rapidly growing wealth of observational data. (author)
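
    For context, the test-particle limit that this review takes as its baseline has a simple closed-form spectral index. The one-liner below encodes that standard textbook relation; it is quoted for orientation and is not taken from the paper's nonlinear solution:

```python
# Test-particle baseline of diffusive shock acceleration. The
# nonlinear theory reviewed above describes how the accelerated
# particles modify the compression ratio r itself.
def dsa_spectral_index(r):
    """Downstream momentum distribution f(p) ~ p**(-q) with
    q = 3r/(r - 1) for shock compression ratio r."""
    return 3.0 * r / (r - 1.0)

q_strong = dsa_spectral_index(4.0)   # strong shock in a gamma = 5/3 gas
```

    A strong adiabatic shock (r = 4) gives q = 4, the universal test-particle spectrum whose breakdown at astrophysical shock sizes is the subject of the review.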

  16. The control computer for the Chalk River electron test accelerator

    International Nuclear Information System (INIS)

    McMichael, G.E.; Fraser, J.S.; McKeown, J.

    1978-02-01

    A versatile control and data acquisition system has been developed for a modest-sized linear accelerator using mainly process I/O hardware and software. This report describes the evolution of the present system since 1972, the modifications needed to satisfy the changing requirements of the various accelerator physics experiments and the limitations of such a system in process control. (author)

  17. Strategic Belief Management

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul

    While (managerial) beliefs are central to many aspects of strategic organization, interactive beliefs are almost entirely neglected, save for some game theory treatments. In an increasingly connected and networked economy, firms confront coordination problems that arise because of network effects. The capability to manage beliefs will increasingly be a strategic one, a key source of wealth creation, and a key research area for strategic organization scholars.

  18. Determining success factors for effective strategic change: Role of middle managers' strategic involvement

    Directory of Open Access Journals (Sweden)

    Minhajul Islam Ukil

    2017-05-01

    Full Text Available Middle managers are believed to play the most crucial part in strategic change that, in consequence, leads to organizational success. The present study seeks to identify the underlying success factors for effective strategic change and to investigate the relationship between middle management strategic involvement and effective strategic change. Data were collected in a survey administered to a group of mid-level managers (N=144) serving in twenty different private commercial banks in Bangladesh, and analyzed using various statistical tests, including descriptive analysis, Pearson correlation, and simple and multiple regression in STATA. Results uncover that factors such as relations with top management, strategy, roles and skills are essential for effective strategic change. This study also reveals a significant relationship between middle management strategic involvement and effective strategic change. Findings of this research suggest that organizations should involve mid-level managers in formulating and implementing strategy, since middle managers work as a bridge between top management and ground-level workers.
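
    The statistical pipeline the study describes (Pearson correlation followed by multiple regression) can be sketched on synthetic data. The variable names and effect sizes below are illustrative assumptions, not the study's survey data, and NumPy stands in for STATA:

```python
import numpy as np

# Synthetic data mimicking the pipeline only.
rng = np.random.default_rng(1)
n = 144                                   # matches the sample size N=144
involvement = rng.normal(size=n)          # strategic involvement (assumed scale)
relation_top = rng.normal(size=n)         # relations with top management
skills = rng.normal(size=n)
effectiveness = (0.6 * involvement + 0.3 * relation_top + 0.2 * skills
                 + rng.normal(scale=0.5, size=n))

# Pearson correlation between involvement and effective change.
r = np.corrcoef(involvement, effectiveness)[0, 1]

# Multiple regression (ordinary least squares) on the success factors.
X = np.column_stack([np.ones(n), involvement, relation_top, skills])
beta, *_ = np.linalg.lstsq(X, effectiveness, rcond=None)
```

    With enough data, the fitted coefficients recover the assumed effect sizes, which is exactly the inference the regression in the study relies on.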

  19. Laser-driven acceleration with Bessel beam

    International Nuclear Information System (INIS)

    Imasaki, Kazuo; Li, Dazhi

    2005-01-01

    A new approach to laser-driven acceleration with a Bessel beam is described. A Bessel beam, in contrast to a Gaussian beam, shows "diffraction-free" characteristics in its propagation, which implies potential for laser-driven acceleration. But a normal laser, even a Bessel beam, cannot accelerate charged particles efficiently, because the velocity difference between the particle and the photon produces cyclic acceleration and deceleration phases. We proposed a Bessel beam truncated by a set of annular slits that creates several special regions along its travelling path, where the laser field becomes very weak and the accelerated particles can avoid deceleration as they pass through the decelerating phase. Thus, multistage acceleration with high gradient is realizable. In a numerical computation, we have shown the potential of multistage acceleration based on a three-stage model. (author)

  20. Strategic management for university hospitals

    Directory of Open Access Journals (Sweden)

    Martha Isabel Riaño-Casallas

    2016-10-01

    Full Text Available Introduction: There are several approaches and schools that support strategic management processes. University hospitals require the implementation of a strategic approach to their management, since they are a particular type of organization with the triple mission of providing health care, education and research. Objective: To propose a strategic profile for a university hospital. Materials and methods: The theoretical framework of strategic management was analyzed and some particular components of hospital management were studied; based on these criteria, the strategic management process in three high complexity hospitals of Bogotá, D.C. was examined and a profile of both the objectives and the functional strategies for the hospital was proposed. Results: The main strategic thinking schools are presented; the processes and components of strategic management are described, and a strategic management profile for a university hospital is proposed. Conclusion: The strategic orientation of management for an institution with the characteristics of a university hospital facilitates achieving organizational objectives.

  1. Neutron induced activation in the EVEDA accelerator materials: Implications for the accelerator maintenance

    International Nuclear Information System (INIS)

    Sanz, J.; Garcia, M.; Sauvan, P.; Lopez, D.; Moreno, C.; Ibarra, A.; Sedano, L.

    2009-01-01

    The Engineering Validation and Engineering Design Activities (EVEDA) phase of the International Fusion Materials Irradiation Facility project should result in an accelerator prototype for which the analysis of the dose rates evolution during the beam-off phase is a necessary task for radioprotection and maintenance feasibility purposes. Important aspects of the computational methodology to address this problem are discussed, and dose rates for workers inside the accelerator vault are assessed and found to be not negligible.

  2. Neutron induced activation in the EVEDA accelerator materials: Implications for the accelerator maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Sanz, J. [Department of Power Engineering, Universidad Nacional de Educacion a Distancia (UNED), C/Juan del Rosal 12, 28040 Madrid (Spain); Institute of Nuclear Fusion, UPM, 28006 Madrid (Spain)], E-mail: jsanz@ind.uned.es; Garcia, M.; Sauvan, P.; Lopez, D. [Department of Power Engineering, Universidad Nacional de Educacion a Distancia (UNED), C/Juan del Rosal 12, 28040 Madrid (Spain); Institute of Nuclear Fusion, UPM, 28006 Madrid (Spain); Moreno, C.; Ibarra, A.; Sedano, L. [CIEMAT, 28040 Madrid (Spain)

    2009-04-30

    The Engineering Validation and Engineering Design Activities (EVEDA) phase of the International Fusion Materials Irradiation Facility project should result in an accelerator prototype for which the analysis of the dose rates evolution during the beam-off phase is a necessary task for radioprotection and maintenance feasibility purposes. Important aspects of the computational methodology to address this problem are discussed, and dose rates for workers inside the accelerator vault are assessed and found to be not negligible.

  3. The NSC 16 MV tandem accelerator control system

    International Nuclear Information System (INIS)

    Ajith Kumar, B.P.; Kannaiyan, J.; Sugathan, P.; Bhowmik, R.K.

    1994-01-01

    The computerized control system for the 16 MV Pelletron accelerator at the Nuclear Science Centre runs on a PC-AT 386 computer. Devices in the accelerator are interfaced to the computer by using a CAMAC Serial Highway. The software, written in C, is Database oriented and supports many features useful for the accelerator operation. The control console consists of an EGA monitor, keyboard, assignable control knobs and meters, a diagrammatic display showing the overall status of the machine and a similar panel for showing the status of radiation safety interlocks. The system has been operational for the past three years and is discussed below. (orig.)

  4. Accelerating VASP electronic structure calculations using graphic processing units

    KAUST Repository

    Hacene, Mohamed

    2012-08-20

    We present a way to improve the performance of the electronic structure Vienna Ab initio Simulation Package (VASP) program. We show that high-performance computers equipped with graphics processing units (GPUs) as accelerators may reduce drastically the computation time when offloading these sections to the graphic chips. The procedure consists of (i) profiling the performance of the code to isolate the time-consuming parts, (ii) rewriting these so that the algorithms become better-suited for the chosen graphic accelerator, and (iii) optimizing memory traffic between the host computer and the GPU accelerator. We chose to accelerate VASP with NVIDIA GPU using CUDA. We compare the GPU and original versions of VASP by evaluating the Davidson and RMM-DIIS algorithms on chemical systems of up to 1100 atoms. In these tests, the total time is reduced by a factor between 3 and 8 when running on n (CPU core + GPU) compared to n CPU cores only, without any accuracy loss. © 2012 Wiley Periodicals, Inc.

  5. Accelerating VASP electronic structure calculations using graphic processing units

    KAUST Repository

    Hacene, Mohamed; Anciaux-Sedrakian, Ani; Rozanska, Xavier; Klahr, Diego; Guignon, Thomas; Fleurat-Lessard, Paul

    2012-01-01

    We present a way to improve the performance of the electronic structure Vienna Ab initio Simulation Package (VASP) program. We show that high-performance computers equipped with graphics processing units (GPUs) as accelerators may reduce drastically the computation time when offloading these sections to the graphic chips. The procedure consists of (i) profiling the performance of the code to isolate the time-consuming parts, (ii) rewriting these so that the algorithms become better-suited for the chosen graphic accelerator, and (iii) optimizing memory traffic between the host computer and the GPU accelerator. We chose to accelerate VASP with NVIDIA GPU using CUDA. We compare the GPU and original versions of VASP by evaluating the Davidson and RMM-DIIS algorithms on chemical systems of up to 1100 atoms. In these tests, the total time is reduced by a factor between 3 and 8 when running on n (CPU core + GPU) compared to n CPU cores only, without any accuracy loss. © 2012 Wiley Periodicals, Inc.

  6. Vacuum system for Advanced Test Accelerator

    International Nuclear Information System (INIS)

    Denhoy, B.S.

    1981-01-01

    The Advanced Test Accelerator (ATA) is a pulsed linear electron beam accelerator designed to study charged particle beam propagation. ATA is designed to produce a 10,000 amp, 50 MeV, 70 ns electron beam. The electron beam acceleration is accomplished in ferrite-loaded cells. Each cell is capable of maintaining a 70 ns, 250 kV voltage pulse across a 1 inch gap. The electron beam is contained in a 5 inch diameter, 300 foot long tube. Cryopumps, turbomolecular pumps, and mechanical pumps are used to maintain a base pressure of 2 x 10^-6 torr in the beam tube. The accelerator will be installed in an underground tunnel. Due to the radiation environment in the tunnel, the controlling and monitoring of the vacuum equipment, pressures and temperatures will be done from the control room through a computer interface. This paper describes the vacuum system design, the type of vacuum pumps specified, the reasons behind the selection of the pumps and the techniques used for computer interfacing.

  7. Vacuum system for Advanced Test Accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Denhoy, B.S.

    1981-09-03

    The Advanced Test Accelerator (ATA) is a pulsed linear electron beam accelerator designed to study charged particle beam propagation. ATA is designed to produce a 10,000 amp, 50 MeV, 70 ns electron beam. The electron beam acceleration is accomplished in ferrite-loaded cells. Each cell is capable of maintaining a 70 ns, 250 kV voltage pulse across a 1 inch gap. The electron beam is contained in a 5 inch diameter, 300 foot long tube. Cryopumps, turbomolecular pumps, and mechanical pumps are used to maintain a base pressure of 2 x 10^-6 torr in the beam tube. The accelerator will be installed in an underground tunnel. Due to the radiation environment in the tunnel, the controlling and monitoring of the vacuum equipment, pressures and temperatures will be done from the control room through a computer interface. This paper describes the vacuum system design, the type of vacuum pumps specified, the reasons behind the selection of the pumps and the techniques used for computer interfacing.

  8. Strategic Team AI Path Plans: Probabilistic Pathfinding

    Directory of Open Access Journals (Sweden)

    Tng C. H. John

    2008-01-01

    Full Text Available This paper proposes a novel method to generate strategic team AI pathfinding plans for computer games and simulations using probabilistic pathfinding. The method is inspired by genetic algorithms (Russell and Norvig, 2002), in that a fitness function is used to test the quality of the path plans. The method generates high-quality path plans by eliminating the low-quality ones. The path plans are generated by probabilistic pathfinding, and the elimination is done by a fitness test of the path plans. This path plan generation method has the ability to generate varied, high-quality paths, which is desired in games to increase replay value. This work is an extension of our earlier work on team AI: probabilistic pathfinding (John et al., 2006). We explore ways to combine probabilistic pathfinding and genetic algorithms to create a new method for generating strategic team AI pathfinding plans.
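
    A minimal sketch of the generate-and-filter idea (helper names are hypothetical, not the authors' implementation): sample many biased random-walk paths on a grid, score each with a fitness function, and keep the best:

```python
import random

def random_path(start, goal, steps=20, seed=None):
    """Probabilistic path generation: a random walk biased toward
    the goal, so repeated runs yield varied candidate plans."""
    rng = random.Random(seed)
    path = [start]
    x, y = start
    gx, gy = goal
    for _ in range(steps):
        # Usually step toward the goal, sometimes wander; the wandering
        # is what produces path variation (and replay value).
        dx = (gx > x) - (gx < x) if rng.random() < 0.8 else rng.choice([-1, 0, 1])
        dy = (gy > y) - (gy < y) if rng.random() < 0.8 else rng.choice([-1, 0, 1])
        x, y = x + dx, y + dy
        path.append((x, y))
        if (x, y) == goal:
            break
    return path

def fitness(path, goal):
    # Shorter plans that end nearer the goal score higher.
    gx, gy = goal
    x, y = path[-1]
    return -(len(path) + 10 * (abs(gx - x) + abs(gy - y)))

goal = (5, 5)
plans = [random_path((0, 0), goal, seed=i) for i in range(50)]
best = max(plans, key=lambda p: fitness(p, goal))
```

    In a game setting, keeping several of the top-scoring plans rather than a single best one preserves the variation between play sessions that the paper emphasizes.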

  9. Modern control techniques for accelerators

    International Nuclear Information System (INIS)

    Goodwin, R.W.; Shea, M.F.

    1984-05-01

    Beginning in the mid-to-late sixties, most new accelerators were designed to include computer-based control systems. Although each installation differed in detail, the technology of the sixties and the early-to-mid seventies dictated an architecture that was essentially the same for the control systems of that era: a minicomputer was connected to the hardware and to a console. Two developments have changed the architecture of modern systems: (a) the microprocessor and (b) local area networks. This paper discusses these two developments and demonstrates their impact on control system design and implementation by describing a possible architecture for any size of accelerator. Both hardware and software aspects are included.

  10. Accelerating population balance-Monte Carlo simulation for coagulation dynamics from the Markov jump model, stochastic algorithm and GPU parallel computing

    International Nuclear Information System (INIS)

    Xu, Zuwei; Zhao, Haibo; Zheng, Chuguang

    2015-01-01

    This paper proposes a comprehensive framework for accelerating population balance-Monte Carlo (PBMC) simulation of particle coagulation dynamics. By combining Markov jump model, weighted majorant kernel and GPU (graphics processing unit) parallel computing, a significant gain in computational efficiency is achieved. The Markov jump model constructs a coagulation-rule matrix of differentially-weighted simulation particles, so as to capture the time evolution of particle size distribution with low statistical noise over the full size range and as far as possible to reduce the number of time loopings. Here three coagulation rules are highlighted and it is found that constructing appropriate coagulation rule provides a route to attain the compromise between accuracy and cost of PBMC methods. Further, in order to avoid double looping over all simulation particles when considering the two-particle events (typically, particle coagulation), the weighted majorant kernel is introduced to estimate the maximum coagulation rates being used for acceptance–rejection processes by single-looping over all particles, and meanwhile the mean time-step of coagulation event is estimated by summing the coagulation kernels of rejected and accepted particle pairs. The computational load of these fast differentially-weighted PBMC simulations (based on the Markov jump model) is reduced greatly to be proportional to the number of simulation particles in a zero-dimensional system (single cell). Finally, for a spatially inhomogeneous multi-dimensional (multi-cell) simulation, the proposed fast PBMC is performed in each cell, and multiple cells are parallel processed by multi-cores on a GPU that can implement the massively threaded data-parallel tasks to obtain remarkable speedup ratio (comparing with CPU computation, the speedup ratio of GPU parallel computing is as high as 200 in a case of 100 cells with 10 000 simulation particles per cell). These accelerating approaches of PBMC are
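
    The majorant-kernel acceptance-rejection step can be sketched as follows, assuming a simple additive coagulation kernel (an illustrative choice; the paper's differentially-weighted scheme is more elaborate):

```python
import random

def coag_kernel(v1, v2):
    # Additive coagulation kernel K(v1, v2) = v1 + v2 (illustrative).
    return v1 + v2

def majorant(vmax):
    # Majorant bound: K(v1, v2) <= 2 * vmax for every particle pair,
    # found with a single pass over the particles rather than a
    # double loop over all pairs.
    return 2.0 * vmax

def select_pair(volumes, rng):
    """Pick a random pair and accept it with probability K/K_max,
    so accepted pairs occur at the correct relative rates."""
    kmax = majorant(max(volumes))
    while True:
        i, j = rng.sample(range(len(volumes)), 2)
        if rng.random() < coag_kernel(volumes[i], volumes[j]) / kmax:
            return i, j

rng = random.Random(42)
vols = [rng.uniform(0.1, 1.0) for _ in range(100)]
i, j = select_pair(vols, rng)
```

    The tighter the majorant hugs the true maximum kernel value, the fewer rejections occur, which is the cost/accuracy trade-off the paper tunes.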

  11. 76 FR 14950 - Closed Meeting of the U.S. Strategic Command Strategic Advisory Group

    Science.gov (United States)

    2011-03-18

    ... DEPARTMENT OF DEFENSE Office of the Secretary Closed Meeting of the U.S. Strategic Command Strategic Advisory Group AGENCY: Department of Defense. ACTION: Notice of advisory committee closed meeting.... Strategic Command Strategic Advisory Group. DATES: April 7, 2011, from 8 a.m. to 5 p.m. and April 8, 2011...

  12. Learning to think strategically.

    Science.gov (United States)

    1994-01-01

    Strategic thinking focuses on issues that directly affect the ability of a family planning program to attract and retain clients. This issue of "The Family Planning Manager" outlines the five steps of strategic thinking in family planning administration: 1) define the organization's mission and strategic goals; 2) identify opportunities for improving quality, expanding access, and increasing demand; 3) evaluate each option in terms of its compatibility with the organization's goals; 4) select an option; and 5) transform strategies into action. Also included in this issue is a 20-question test designed to permit readers to assess their "strategic thinking quotient" and a list of sample questions to guide a strategic analysis.

  13. Computational needs for modelling accelerator components

    International Nuclear Information System (INIS)

    Hanerfeld, H.

    1985-06-01

    The particle-in-cell code MASK is being used to model several different electron accelerator components. These studies are being used both to design new devices and to understand particle behavior within existing structures. Studies include the injector for the Stanford Linear Collider and the 50 megawatt klystron currently being built at SLAC. MASK is a 2D electromagnetic code which is being used by SLAC both on our own IBM 3081 and on the CRAY X-MP at the NMFECC. Our experience with running MASK illustrates the need for supercomputers to continue work of the kind described. 3 refs., 2 figs

  14. An Adiabatic Phase-Matching Accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Lemery, Francois [DESY; Floettmann, Klaus [DESY; Piot, Philippe [Northern Illinois U.; Kaertner, Franz X. [Hamburg U.; Assmann, Ralph [DESY

    2017-12-22

    We present a general concept to accelerate non-relativistic charged particles. Our concept employs an adiabatically-tapered dielectric-lined waveguide which supports accelerating phase velocities for synchronous acceleration. We propose an ansatz for the transient field equations, show it satisfies Maxwell's equations under an adiabatic approximation and find excellent agreement with a finite-difference time-domain computer simulation. The fields were implemented into the particle-tracking program ASTRA and we present beam dynamics results for an accelerating field with a 1-mm wavelength and peak electric field of 100 MV/m. The numerical simulations indicate that a ~200-keV electron beam can be accelerated to an energy of ~10 MeV over ~10 cm. The novel scheme is also found to form electron beams with parameters of interest to a wide range of applications including, e.g., future advanced accelerators, and ultra-fast electron diffraction.

  15. Processing of intended and unintended strategic issues and integration into the strategic agenda.

    Science.gov (United States)

    Ridder, Hans-Gerd; Schrader, Jan Simon

    2017-11-01

    Strategic change is needed in hospitals due to external and internal pressures. However, research on strategic change, as a combination of management and medical expertise in hospitals, remains scarce. We analyze how intended strategic issues are processed into deliberate strategies and how unintended strategic issues are processed into emergent strategies in the management of strategy formation in hospitals. This study empirically investigates the integration of medical and management expertise in strategy formation. The longitudinal character of the case study enabled us to track patterns of intended and unintended strategic issues over 2 years. We triangulated data from interviews, observations, and documents. In accordance with the quality standards of qualitative research procedures, we analyzed the data by pattern matching and provided analytical generalization regarding strategy formation in hospitals. Our findings suggest that strategic issues are particularly successful within the strategy formation process if interest groups are concerned with the strategic issue, prospective profits are estimated, and relevant decision makers are involved early on. Structure and interaction processes require clear criteria and transparent procedures for effective strategy formation. There is systematic neglect of medical expertise in processes of generating strategies. Our study reveals that the decentralized structure of medical centers is an adequate template for both the operationalization of intended strategic issues and the development of unintended strategic issues. However, tasks, roles, responsibility, resources, and administrative support are necessary for effective management of strategy formation. Similarly, criteria, procedures, and decision-making are prerequisites for effective strategy formation.

  16. Aerodynamics in arbitrarily accelerating frames: application to high-g turns

    CSIR Research Space (South Africa)

    Gledhill, Irvy MA

    2010-09-01

    Full Text Available Fifth-generation missiles accelerate up to 100 g in turns, and higher accelerations are expected as agility increases. The authors have developed the theory of aerodynamics for arbitrary accelerations, and have validated modelling in a Computational...

  17. Interactive Design of Accelerators (IDA)

    International Nuclear Information System (INIS)

    Barton, M.Q.

    1987-01-01

    IDA is a beam transport line calculation program which runs interactively on an IBM PC computer. It can be used for a large fraction of the usual calculations done for beam transport systems or periods of accelerators or storage rings. Because of the interactive screen-editor nature of the data input, this program permits one to arrive rather quickly at the general properties of a beam line or an accelerator period.

  18. Strategic Air Traffic Planning Using Eulerian Route Based Modeling and Optimization

    Science.gov (United States)

    Bombelli, Alessandro

    Due to soaring air travel growth in recent decades, air traffic management has become increasingly challenging. As a consequence, planning tools are being devised to help human decision-makers achieve a better management of air traffic. Planning tools are divided into two categories, strategic and tactical. Strategic planning generally addresses a larger planning domain and is performed days to hours in advance. Tactical planning is more localized and is performed hours to minutes in advance. An aggregate route model for strategic air traffic flow management is presented. It is an Eulerian model, describing the flow between cells of unidirectional point-to-point routes. Aggregate routes are created from flight trajectory data based on similarity measures. Spatial similarity is determined using the Frechet distance. The aggregate routes approximate actual well-traveled traffic patterns. By specifying the model resolution, an appropriate balance between model accuracy and model dimension can be achieved. For a particular planning horizon, during which weather is expected to restrict the flow, a procedure for designing airborne reroutes and augmenting the traffic flow model is developed. The dynamics of the traffic flow on the resulting network take the form of a discrete-time, linear time-invariant system. The traffic flow controls are ground holding, pre-departure rerouting and airborne rerouting. Strategic planning--determining how the controls should be used to modify the future traffic flow when local capacity violations are anticipated--is posed as an integer programming problem of minimizing a weighted sum of flight delays subject to control and capacity constraints. Several tests indicate the effectiveness of the modeling and strategic planning approach. In the final, most challenging, test, strategic planning is demonstrated for the six western-most Centers of the 22-Center national airspace. The planning time horizon is four hours long, and there is
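    The spatial-similarity measure mentioned above, the Frechet distance, is commonly computed in its discrete form by dynamic programming over the two polylines. A minimal sketch is shown below; the function name and the 2-D point representation are illustrative assumptions, not taken from the dissertation.

```python
from math import hypot
from functools import lru_cache

def discrete_frechet(p, q):
    """Discrete Frechet distance between polylines p and q,
    each a list of (x, y) points."""
    @lru_cache(maxsize=None)
    def c(i, j):
        # Distance between the current pair of points
        d = hypot(p[i][0] - q[j][0], p[i][1] - q[j][1])
        if i == 0 and j == 0:
            return d
        if i == 0:
            return max(c(0, j - 1), d)
        if j == 0:
            return max(c(i - 1, 0), d)
        # Advance along p, along q, or along both; keep the best coupling
        return max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d)
    return c(len(p) - 1, len(q) - 1)
```

Trajectories whose discrete Frechet distance falls below a chosen threshold can then be grouped into one aggregate route.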

  19. Strategic planning in transition

    DEFF Research Database (Denmark)

    Olesen, Kristian; Richardson, Tim

    2012-01-01

    In this paper, we analyse how contested transitions in planning rationalities and spatial logics have shaped the processes and outputs of recent episodes of Danish ‘strategic spatial planning’. The practice of ‘strategic spatial planning’ in Denmark has undergone a concerted reorientation… style of ‘strategic spatial planning’ with its associated spatial logics is continuously challenged by a persistent regulatory, top-down rationality of ‘strategic spatial planning’, rooted in spatial Keynesianism, which has long characterised the Danish approach. The findings reveal the emergence… of a particularly Danish approach, retaining strong regulatory aspects. However this approach does not sit easily within the current neoliberal political climate, raising concerns of an emerging crisis of ‘strategic spatial planning’…

  20. The Talent Development Middle School. An Elective Replacement Approach to Providing Extra Help in Math--The CATAMA Program (Computer- and Team-Assisted Mathematics Acceleration). Report No. 21.

    Science.gov (United States)

    Mac Iver, Douglas J.; Balfanz, Robert; Plank, Stephen B.

    In Talent Development Middle Schools, students needing extra help in mathematics participate in the Computer- and Team-Assisted Mathematics Acceleration (CATAMA) course. CATAMA is an innovative combination of computer-assisted instruction and structured cooperative learning that students receive in addition to their regular math course for about…

  1. A Conceptual Model of a Research Design about Congruence between Environmental Turbulence, Strategic Aggressiveness, and General Management Capability in Community Colleges

    Science.gov (United States)

    Lewis, Alfred

    2013-01-01

    Numerous studies have examined the determinant strategic elements that affect the performance of organizations. These studies have increasing relevance to community colleges because of the accelerating pace of change in enrollment, resource availability, leadership turnover, and demand for service that these institutions are experiencing. The…

  2. Technology transfer from accelerator laboratories (challenges and opportunities)

    International Nuclear Information System (INIS)

    Verma, V.K.; Gardner, P.L.

    1994-06-01

    It is becoming increasingly evident that technology transfer from research laboratories must be a key element of their comprehensive strategic plans. Technology transfer involves using verified and organized knowledge and research to develop commercially viable products. Management of technology transfer is the art of organizing and motivating a team of scientists, engineers and manufacturers and dealing intelligently with uncertainties. Concurrent engineering is one of the most effective approaches to optimizing the process of technology transfer. The challenges, importance, opportunities and techniques of transferring technology from accelerator laboratories are discussed. (author)

  3. Computational Science: Ensuring America's Competitiveness

    National Research Council Canada - National Science Library

    Reed, Daniel A; Bajcsy, Ruzena; Fernandez, Manuel A; Griffiths, Jose-Marie; Mott, Randall D; Dongarra, J. J; Johnson, Chris R; Inouye, Alan S; Miner, William; Matzke, Martha K; Ponick, Terry L

    2005-01-01

    ... previously deemed intractable. Yet, despite the great opportunities and needs, universities and the Federal government have not effectively recognized the strategic significance of computational science in either...

  4. Strategic Alliance Poker: Demonstrating the Importance of Complementary Resources and Trust in Strategic Alliance Management

    Science.gov (United States)

    Reutzel, Christopher R.; Worthington, William J.; Collins, Jamie D.

    2012-01-01

    Strategic Alliance Poker (SAP) provides instructors with an opportunity to integrate the resource based view with their discussion of strategic alliances in undergraduate Strategic Management courses. Specifically, SAP provides Strategic Management instructors with an experiential exercise that can be used to illustrate the value creation…

  5. Multi-Mode Cavity Accelerator Structure

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Yong [Yale Univ., New Haven, CT (United States); Hirshfield, Jay Leonard [Omega-P R& D, Inc., New Haven, CT (United States)

    2016-11-10

    This project aimed to develop a prototype for a novel accelerator structure comprising coupled cavities that are tuned to support modes with harmonically-related eigenfrequencies, with the goal of reaching an acceleration gradient >200 MeV/m and a breakdown rate <10⁻⁷/pulse/meter. Phase I involved computations, design, and preliminary engineering of a prototype multi-harmonic cavity accelerator structure; plus tests of a bimodal cavity. A computational procedure was used to design an optimized profile for a bimodal cavity with high shunt impedance and low surface fields to maximize the reduction in temperature rise ΔT. This cavity supports the TM010 mode and its 2nd harmonic TM011 mode. Its fundamental frequency is at 12 GHz, to benchmark against the empirical criteria proposed within the worldwide High Gradient collaboration for X-band copper structures; namely, a surface electric field Esur,max < 260 MV/m and pulsed surface heating ΔTmax < 56 K. With optimized geometry, amplitude and relative phase of the two modes, reductions are found in surface pulsed heating, modified Poynting vector, and total RF power, as compared with operation at the same acceleration gradient using only the fundamental mode.

  6. Multi-Mode Cavity Accelerator Structure

    International Nuclear Information System (INIS)

    Jiang, Yong; Hirshfield, Jay Leonard

    2016-01-01

    This project aimed to develop a prototype for a novel accelerator structure comprising coupled cavities that are tuned to support modes with harmonically-related eigenfrequencies, with the goal of reaching an acceleration gradient >200 MeV/m and a breakdown rate <10⁻⁷/pulse/meter. Phase I involved computations, design, and preliminary engineering of a prototype multi-harmonic cavity accelerator structure; plus tests of a bimodal cavity. A computational procedure was used to design an optimized profile for a bimodal cavity with high shunt impedance and low surface fields to maximize the reduction in temperature rise ΔT. This cavity supports the TM010 mode and its 2nd harmonic TM011 mode. Its fundamental frequency is at 12 GHz, to benchmark against the empirical criteria proposed within the worldwide High Gradient collaboration for X-band copper structures; namely, a surface electric field Esur,max < 260 MV/m and pulsed surface heating ΔTmax < 56 K. With optimized geometry, amplitude and relative phase of the two modes, reductions are found in surface pulsed heating, modified Poynting vector, and total RF power, as compared with operation at the same acceleration gradient using only the fundamental mode.

  7. ISLAM PROJECT: Interface between the signals from various experiments of a Van de Graaff accelerator and a PDP 11/44 computer

    International Nuclear Information System (INIS)

    Martinez Piquer, T. A.; Yuste Santos, C.

    1986-01-01

    This paper describes an interface between the signals from an in-beam experiment at a Van de Graaff accelerator and a PDP 11/44 computer. The information corresponding to one spectrum is taken from a digital voltammeter and is processed by means of equipment controlled by an M6809 microprocessor. The software package has been developed in assembly language and has a size of 1/2 K. (Author) 12 refs

  8. Radiation safety training for accelerator facilities

    International Nuclear Information System (INIS)

    Trinoskey, P.A.

    1997-02-01

    In November 1992, a working group was formed within the U.S. Department of Energy's (DOE's) accelerator facilities to develop a generic safety training program to meet the basic requirements for individuals working in accelerator facilities. This training, by necessity, includes sections for inserting facility-specific information. The resulting course materials were issued by DOE as a handbook under its technical standards in 1996. Because experimenters may be at a facility for only a short time and often at odd times during the day, the working group felt that computer-based training would be useful. To that end, Lawrence Livermore National Laboratory (LLNL) and Argonne National Laboratory (ANL) together have developed a computer-based safety training program for accelerator facilities. This interactive course not only enables trainees to receive facility-specific information, but also lets them time the training to their schedule and tailor it to their level of expertise.

  9. Recent activities in accelerator code development

    International Nuclear Information System (INIS)

    Copper, R.K.; Ryne, R.D.

    1992-01-01

    In this paper we will review recent activities in the area of code development as it affects the accelerator community. We will first discuss the changing computing environment. We will review how the computing environment has changed in the last 10 years, with emphasis on computing power, operating systems, computer languages, graphics standards, and massively parallel processing. Then we will discuss recent code development activities in the areas of electromagnetics codes and beam dynamics codes

  10. Computer control system of TRISTAN

    International Nuclear Information System (INIS)

    Kurokawa, Shin-ichi; Shinomoto, Manabu; Kurihara, Michio; Sakai, Hiroshi.

    1984-01-01

    For the operation of a large accelerator, it is necessary to connect an enormous quantity of electromagnets, power sources, vacuum equipment, high-frequency accelerating equipment and so on, and to control them harmoniously. For this purpose, a number of computers are adopted and connected with a network; in this way, a large computer system for laboratory automation, which integrates and controls the whole system, is constructed. As a large-scale distributed system, functions such as electromagnet control, file processing and operation control are assigned to respective computers, and total control is made feasible by network connection; at the same time, as the interface with controlled equipment, CAMAC (computer-aided measurement and control) is adopted to ensure the flexibility and expandability of the system. Moreover, the language ''NODAL'', which has network support functions, was developed to make it easy to write software without considering the composition of the more complex distributed system. The accelerator in the TRISTAN project is composed of an electron linear accelerator, an accumulation ring of 6 GeV and a main ring of 30 GeV. The two ring accelerators must be synchronously operated as one body, and are controlled with one computer system. The hardware and software are outlined. (Kako, I.)

  11. RAMSES stands guard over the accelerator chain

    CERN Multimedia

    CERN Bulletin

    2010-01-01

    RAMSES, the system that is used to monitor radiation at the LHC, CNGS, CTF3 and n-TOF facilities, will soon be installed at strategic points in the accelerator chain, replacing the older monitoring system ARCON. The replacement programme has already begun.   RAMSES (which stands for “Radiation Monitoring System for the Environment and Safety”) is designed to protect workers, the general public and the environment, both on the Organization’s site and in the surrounding areas. It is currently operational on all the LHC sites and at CTF3, CNGS and n-TOF, while the remaining sites are still equipped with the ARCON (Area CONtroller) system. Daniel Perrin, head of the Instrumentation and Logistics Section of the HSE Unit's Radiation Protection Group, explains: “ARCON was designed for the old LEP accelerator and dates back to the early 1980s, while RAMSES is a much more recent design intended specifically for the LHC. With 389 detectors distributed across 124 mea...

  12. Strategic agility for nursing leadership.

    Science.gov (United States)

    Shirey, Maria R

    2015-06-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change. In this article, the author discusses strategic agility as an important leadership competency and offers approaches for incorporating strategic agility in healthcare systems. A strategic agility checklist and infrastructure-building approach are presented.

  13. Tactical and Strategic Sales Management for Intelligent Agents Guided By Economic Regimes

    NARCIS (Netherlands)

    W. Ketter (Wolfgang); J. Collins (John); M. Gini (Maria); A. Gupta (Alok); P. Schrater (Paul)

    2008-01-01

    textabstractWe present a computational approach that autonomous software agents can adopt to make tactical decisions, such as product pricing, and strategic decisions, such as product mix and production planning, to maximize profit in markets with supply and demand uncertainties. Using a combination

  14. Deflation acceleration of lattice QCD simulations

    International Nuclear Information System (INIS)

    Luescher, Martin

    2007-01-01

    Close to the chiral limit, many calculations in numerical lattice QCD can potentially be accelerated using low-mode deflation techniques. In this paper it is shown that the recently introduced domain-decomposed deflation subspaces can be propagated along the field trajectories generated by the Hybrid Monte Carlo (HMC) algorithm with a modest effort. The quark forces that drive the simulation may then be computed using a deflation-accelerated solver for the lattice Dirac equation. As a consequence, the computer time required for the simulations is significantly reduced and an improved scaling behaviour of the simulation algorithm with respect to the quark mass is achieved

  15. Deflation acceleration of lattice QCD simulations

    CERN Document Server

    Lüscher, Martin

    2007-01-01

    Close to the chiral limit, many calculations in numerical lattice QCD can potentially be accelerated using low-mode deflation techniques. In this paper it is shown that the recently introduced domain-decomposed deflation subspaces can be propagated along the field trajectories generated by the Hybrid Monte Carlo (HMC) algorithm with a modest effort. The quark forces that drive the simulation may then be computed using a deflation-accelerated solver for the lattice Dirac equation. As a consequence, the computer time required for the simulations is significantly reduced and an improved scaling behaviour of the simulation algorithm with respect to the quark mass is achieved.
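    As a toy illustration of why deflation reduces solver effort, the sketch below handles a low-mode subspace exactly before running conjugate gradients on the remainder. This is a generic initial-guess (Galerkin) deflation on a small symmetric positive-definite system, not the domain-decomposed subspace propagation of the paper; all names are illustrative.

```python
import numpy as np

def cg(A, b, x0, tol=1e-10, maxit=500):
    """Plain conjugate gradients; returns (solution, iteration count)."""
    x = x0.copy()
    r = b - A @ x
    p = r.copy()
    rs = float(r @ r)
    for k in range(maxit):
        if rs ** 0.5 < tol:
            return x, k
        Ap = A @ p
        alpha = rs / float(p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = float(r @ r)
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x, maxit

def deflated_cg(A, b, W, tol=1e-10):
    """CG started from a Galerkin solve over the deflation subspace W
    (columns approximating the low modes), so the slowly-converging
    components are removed before CG begins."""
    x0 = W @ np.linalg.solve(W.T @ A @ W, W.T @ b)
    return cg(A, b, x0, tol)
```

When the columns of W span the near-null modes, the deflated solve converges in markedly fewer iterations, mirroring the improved quark-mass scaling reported above.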

  16. Final Report on Institutional Computing Project s15_hilaserion, “Kinetic Modeling of Next-Generation High-Energy, High-Intensity Laser-Ion Accelerators as an Enabling Capability”

    Energy Technology Data Exchange (ETDEWEB)

    Albright, Brian James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Yin, Lin [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Stark, David James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-06

    This proposal sought of order 1M core-hours of Institutional Computing time intended to enable computing by a new LANL Postdoc (David Stark) working under LDRD ER project 20160472ER (PI: Lin Yin) on laser-ion acceleration. The project was “off-cycle,” initiating in June of 2016 with a postdoc hire.

  17. GPU based acceleration of first principles calculation

    International Nuclear Information System (INIS)

    Tomono, H; Tsumuraya, K; Aoki, M; Iitaka, T

    2010-01-01

    We present Graphics Processing Unit (GPU) accelerated simulations of first principles electronic structure calculations. The FFT, which is the most time-consuming part, is accelerated by about a factor of 10. As a result, the total computation time of a first principles calculation is reduced to 15 percent of that of the CPU.
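    The FFT-dominated step in a plane-wave calculation is typically the application of an operator in reciprocal space, e.g. the kinetic-energy operator. A minimal CPU sketch using NumPy is shown below; on a GPU the same pattern applies with a drop-in FFT library (for example, replacing `numpy` with `cupy`). The function name, grid, and atomic units are illustrative assumptions, not the authors' code.

```python
import numpy as np

def apply_kinetic(psi, box_len):
    """Apply the kinetic-energy operator -1/2 * laplacian to a periodic
    wavefunction via forward/inverse FFTs -- the step that dominates the
    cost of plane-wave electronic-structure codes."""
    n = psi.shape[0]
    # Angular wavevectors for the periodic grid
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=box_len / n)
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    # Multiply by k^2/2 in reciprocal space, then transform back
    return np.fft.ifftn(0.5 * k2 * np.fft.fftn(psi))
```

Because each call is two 3-D FFTs plus an elementwise multiply, offloading the FFTs to a GPU accelerates almost the whole step.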

  18. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relations between the seismic strength of the earthquake, focal depth, distance and ground acceleration are calculated. We found that most Swedish earthquakes are shallow; the largest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10⁻⁵ has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site. For consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)
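    The quoted annual probability can be related to an acceleration level through a Poisson exceedance model. The sketch below inverts an assumed power-law exceedance-rate curve purely for illustration; the study's actual hazard curve comes from its macroseismic model, and the rate function here is hypothetical.

```python
import math

def annual_exceedance_prob(rate):
    """Poisson model: probability of at least one exceedance per year,
    given the annual rate of events exceeding the acceleration level."""
    return 1.0 - math.exp(-rate)

def acceleration_for_prob(p_target, rate_fn, a_lo, a_hi, iters=60):
    """Bisect for the acceleration whose annual exceedance probability
    equals p_target, assuming rate_fn decreases with acceleration."""
    for _ in range(iters):
        mid = 0.5 * (a_lo + a_hi)
        if annual_exceedance_prob(rate_fn(mid)) > p_target:
            a_lo = mid  # still too likely: move to higher accelerations
        else:
            a_hi = mid
    return 0.5 * (a_lo + a_hi)
```

For a target probability of 10⁻⁵ per year the Poisson probability is essentially equal to the annual rate, so the inversion amounts to reading the hazard curve at that rate.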

  19. A new approach to modeling linear accelerator systems

    International Nuclear Information System (INIS)

    Gillespie, G.H.; Hill, B.W.; Jameson, R.A.

    1994-01-01

    A novel computer code is being developed to generate system-level designs of radiofrequency ion accelerators, with specific application to machines of interest to Accelerator Driven Transmutation Technologies (ADTT). The goal of the Accelerator System Model (ASM) code is to create a modeling and analysis tool that is easy to use, automates many of the initial design calculations, supports trade studies used in assessing alternative designs, and yet is flexible enough to incorporate new technology concepts as they emerge. Hardware engineering parameters and beam dynamics are modeled at comparable levels of fidelity. Existing scaling models of accelerator subsystems were used to produce a prototype of ASM (version 1.0) working within the Shell for Particle Accelerator Related Code (SPARC) graphical user interface. A small user group has been testing and evaluating the prototype for about a year. Several enhancements and improvements are now being developed. The current version of ASM is described, and examples of the modeling and analysis capabilities are illustrated. The results of an example study, for an accelerator concept typical of ADTT applications, are presented, and sample displays from the computer interface are shown.

  20. A Study on Strategic Planning and Procurement of Medicals in Uganda's Regional Referral Hospitals.

    Science.gov (United States)

    Masembe, Ishak Kamaradi

    2016-12-31

    This study was an analysis of the effect of strategic planning on procurement of medicals in Uganda's regional referral hospitals (RRH's). Medicals were defined as essential medicines, medical devices and medical equipment. The Ministry of Health (MOH) has been carrying out strategic planning for the last 15 years via the Health Sector Strategic Plans. Their assumption was that strategic planning would translate to strategic procurement and, consequently, availability of medicals in the RRH's. However, despite the existence of these plans, there have been many complaints about expired drugs and shortages in RRH's. For this reason, a third variable, funding, was important because it served a mediating role. A questionnaire was used to obtain the perceptions of 206 respondents who were selected using simple random sampling. 8 key informant interviews were held, 2 in each RRH. 4 Focus Group Discussions were held, 1 for each RRH, and between 5 and 8 staff took part as discussants for approximately three hours. The findings suggested that strategic planning was affected by funding to approximately 34%, while the relationship between funding and procurement was 35%. The direct relationship between strategic planning and procurement was 18%. However, when the total causal effect was computed, it turned out that strategic planning and the related variable of funding contributed 77% to procurement of medicals under the current hierarchical model, where MOH is charged with development of strategic plans for the entire health sector. Since complaints persisted even with this contribution, the study proposed a new model called CALF under which, according to a simulation, strategic planning would contribute 87% to effectiveness in procurement of medicals if adopted by MOH.

  1. Steady-state natural circulation analysis with computational fluid dynamic codes of a liquid metal-cooled accelerator driven system

    International Nuclear Information System (INIS)

    Abanades, A.; Pena, A.

    2009-01-01

    A new innovative nuclear installation is under research in the nuclear community for its potential application to nuclear waste management and, above all, for its capability to enhance the sustainability of nuclear energy in the future as a component of a new nuclear fuel cycle, in which efficiency in terms of primary uranium ore utilization and radioactive waste generation will be improved. Such new nuclear installations are called accelerator driven systems (ADS) and are the result of a profitable symbiosis between accelerator technology, high-energy physics and reactor technology. Many ADS concepts are based on the utilization of heavy liquid metal (HLM) coolants due to their neutronic and thermo-physical properties. Moreover, such coolants permit operation in free circulation mode, one of the main aims of passive systems. In this paper, such an operating regime is analysed in a proposed ADS design by applying computational fluid dynamics (CFD).

  2. Software Accelerates Computing Time for Complex Math

    Science.gov (United States)

    2014-01-01

    Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphics processing unit (GPU) technology, traditionally used for computer video games, to develop high-performance computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.

  3. Complex Strategic Choices Applying Systemic Planning for Strategic Decision Making

    CERN Document Server

    Leleur, Steen

    2012-01-01

    Effective decision making requires a clear methodology, particularly in a complex world of globalisation. Institutions and companies in all disciplines and sectors are faced with increasingly multi-faceted areas of uncertainty which cannot always be effectively handled by traditional strategies. Complex Strategic Choices provides clear principles and methods which can guide and support strategic decision making to face the many current challenges. By considering ways in which planning practices can be renewed and exploring the possibilities for acquiring awareness and tools to add value to strategic decision making, Complex Strategic Choices presents a methodology which is further illustrated by a number of case studies and example applications. Dr. Techn. Steen Leleur has adapted previously established research based on feedback and input from various conferences, journals and students resulting in new material stemming from and focusing on practical application of a systemic approach. The outcome is a coher...

  4. Strategic planning for neuroradiologists.

    Science.gov (United States)

    Berlin, Jonathan W; Lexa, Frank J

    2012-08-01

    Strategic planning is becoming essential to neuroradiology as the health care environment continues to emphasize cost efficiency, teamwork and collaboration. A strategic plan begins with a mission statement and a vision of where the neuroradiology division would like to be in the near future. Formalized strategic planning frameworks, such as the strengths, weaknesses, opportunities and threats (SWOT) and Balanced Scorecard frameworks, can help neuroradiology divisions determine their current position in the marketplace. Communication, delegation, and accountability are essential in executing an effective strategic plan. Copyright © 2012 Elsevier Inc. All rights reserved.

  5. Two-dimensional photonic crystal accelerator structures

    Directory of Open Access Journals (Sweden)

    Benjamin M. Cowan

    2003-10-01

    Full Text Available Photonic crystals provide a method of confining a synchronous speed-of-light mode in an all-dielectric structure, likely a necessary feature in any optical accelerator. We explore computationally a class of photonic crystal structures with translational symmetry in a direction transverse to the electron beam. We demonstrate synchronous waveguide modes and discuss relevant parameters of such modes. We then explore how accelerator parameters vary as the geometry of the structure is changed and consider trade-offs inherent in the design of an accelerator of this type.

  6. STRATEGIC MANAGEMENT ACCOUNTING: DEFINITION AND TOOLS

    Directory of Open Access Journals (Sweden)

    Nadiia Pylypiv

    2017-08-01

    Full Text Available The article examines the essence of the definition of “strategic management accounting” in domestic and foreign literature. Strategic management accounting tools have been studied, and the constraints that affect their choice identified. As a result of the study, the authors formed an understanding of strategic management accounting. The tools common to both traditional managerial accounting and strategic management accounting, as well as the specific tools necessary for efficient implementation of strategic management accounting, have been defined. Keywords: strategic management accounting, definition, tools, strategic management decisions.

  7. Quantum Accelerators for High-Performance Computing Systems

    OpenAIRE

    Britt, Keith A.; Mohiyaddin, Fahd A.; Humble, Travis S.

    2017-01-01

    We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenge the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We discuss a novel quantu...

  8. Shredder: GPU-Accelerated Incremental Storage and Computation

    OpenAIRE

    Bhatotia, Pramod; Rodrigues, Rodrigo; Verma, Akshat

    2012-01-01

    Redundancy elimination using data deduplication and incremental data processing has emerged as an important technique to minimize storage and computation requirements in data center computing. In this paper, we present the design, implementation and evaluation of Shredder, a high performance content-based chunking framework for supporting incremental storage and computation systems. Shredder exploits the massively parallel processing power of GPUs to overcome the CPU bottlenecks of content-ba...

  9. Accelerating Computation of DCM for ERP in MATLAB by External Function Calls to the GPU

    Science.gov (United States)

    Wang, Wei-Jen; Hsieh, I-Fan; Chen, Chun-Chuan

    2013-01-01

    This study aims to improve the performance of Dynamic Causal Modelling for Event Related Potentials (DCM for ERP) in MATLAB by using external function calls to a graphics processing unit (GPU). DCM for ERP is an advanced method for studying neuronal effective connectivity. DCM utilizes an iterative procedure, the expectation maximization (EM) algorithm, to find the optimal parameters given a set of observations and the underlying probability model. As the EM algorithm is computationally demanding and the analysis faces possible combinatorial explosion of models to be tested, we propose a parallel computing scheme using the GPU to achieve a fast estimation of DCM for ERP. The computation of DCM for ERP is dynamically partitioned and distributed to threads for parallel processing, according to the DCM model complexity and the hardware constraints. The performance efficiency of this hardware-dependent thread arrangement strategy was evaluated using the synthetic data. The experimental data were used to validate the accuracy of the proposed computing scheme and quantify the time saving in practice. The simulation results show that the proposed scheme can accelerate the computation by a factor of 155 for the parallel part. For experimental data, the speedup factor is about 7 per model on average, depending on the model complexity and the data. This GPU-based implementation of DCM for ERP gives qualitatively the same results as the original MATLAB implementation does at the group level analysis. In conclusion, we believe that the proposed GPU-based implementation is very useful for users as a fast screen tool to select the most likely model and may provide implementation guidance for possible future clinical applications such as online diagnosis. PMID:23840507

  10. Strategic Aspirations

    DEFF Research Database (Denmark)

    Christensen, Lars Thøger; Morsing, Mette; Thyssen, Ole

    2016-01-01

    Strategic aspirations are public announcements designed to inspire, motivate, and create expectations about the future. Vision statements or value declarations are examples of such talk, through which organizations announce their ideal selves and declare what they (intend to) do. While aspirations are often encouraged by social norms, regulations, and institutions (for example, institutionalized standards for corporate social responsibility (CSR) reporting), they live through local articulations and enactments that allow organizations to discover who they are and who they might become. Strategic aspirations, in other words, have exploratory and inspirational potential, two features that are highly essential in complex areas such as sustainability and CSR. This entry takes a communicative focus on strategic aspirations, highlighting the value of aspirational talk, understood as ideals and intentions...

  11. OpenMP for Accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Beyer, J C; Stotzer, E J; Hart, A; de Supinski, B R

    2011-03-15

    OpenMP [13] is the dominant programming model for shared-memory parallelism in C, C++ and Fortran due to its easy-to-use directive-based style, portability and broad support by compiler vendors. Similar characteristics are needed for a programming model for devices such as GPUs and DSPs that are gaining popularity to accelerate compute-intensive application regions. This paper presents extensions to OpenMP that provide that programming model. Our results demonstrate that a high-level programming model can provide accelerated performance comparable to hand-coded implementations in CUDA.

  12. The new generation of PowerPC VMEbus front end computers for the CERN SPS and LEP accelerators system

    OpenAIRE

    Charrue, P; Bland, A; Ghinet, F; Ribeiro, P

    1995-01-01

    The CERN SPS and LEP PowerPC project is aimed at introducing a new generation of PowerPC VMEbus processor modules running the LynxOS real-time operating system. This new generation of front end computers using the state-of-the-art microprocessor technology will first replace the obsolete XENIX PC based systems (about 140 installations) successfully used since 1988 to control the LEP accelerator. The major issues addressed in the scope of this large scale project are the technical specificatio...

  13. An Innovative Method for Evaluating Strategic Goals in a Public Agency: Conservation Leadership in the U.S. Forest Service

    Science.gov (United States)

    David N. Bengston; David P. Fan

    1999-01-01

    This article presents an innovative methodology for evaluating strategic planning goals in a public agency. Computer-coded content analysis was used to evaluate attitudes expressed in about 28,000 on-line news media stories about the U.S. Department of Agriculture Forest Service and its strategic goal of conservation leadership. Three dimensions of conservation...

  14. Utilizing GPUs to Accelerate Turbomachinery CFD Codes

    Science.gov (United States)

    MacCalla, Weylin; Kulkarni, Sameer

    2016-01-01

    GPU computing has established itself as a way to accelerate parallel codes in the high performance computing world. This work focuses on speeding up APNASA, a legacy CFD code used at NASA Glenn Research Center, while also drawing conclusions about the nature of GPU computing and the requirements to make GPGPU worthwhile on legacy codes. Rewriting and restructuring of the source code was avoided to limit the introduction of new bugs. The code was profiled and investigated for parallelization potential, then OpenACC directives were used to indicate parallel parts of the code. The use of OpenACC directives was not able to reduce the runtime of APNASA on either the NVIDIA Tesla discrete graphics card, or the AMD accelerated processing unit. Additionally, it was found that in order to justify the use of GPGPU, the amount of parallel work being done within a kernel would have to greatly exceed the work being done by any one portion of the APNASA code. It was determined that in order for an application like APNASA to be accelerated on the GPU, it should not be modular in nature, and the parallel portions of the code must contain a large portion of the code's computation time.

  15. The neoliberalisation of strategic spatial planning

    DEFF Research Database (Denmark)

    Olesen, Kristian

    2014-01-01

    Strategic spatial planning practices have recently taken a neoliberal turn in many northwestern European countries. This neoliberalisation of strategic spatial planning has materialised partly in governance reforms aiming to reduce or abolish strategic spatial planning at national and regional scales, and partly through the normalisation of neoliberal discourses in strategic spatial planning processes. This paper analyses the complex relationship, partly of unease and partly of coevolution, between neoliberalism and strategic spatial planning. Furthermore, the paper discusses the key challenges for strategic spatial planning in the face of neoliberalism and argues for a need to strengthen strategic spatial planning’s critical dimension.

  16. Ultimate-gradient accelerators physics and prospects

    CERN Document Server

    Skrinsky, Aleksander Nikolayevich

    1995-01-01

    As an introduction, the needs and ways for ultimate acceleration gradients are discussed briefly. Plasma Wake Field Acceleration (PWFA) is analysed in its most important details. The structure of the specific plasma oscillations and the interaction between the high-energy driver beam and the plasma is presented, including computer simulation of the process. Some practical ways to introduce the necessary mm-scale bunching in the driver beam and to arrange sequential energy multiplication are discussed. The influence of binary collisions between accelerated beam particles and the plasma is also considered. As applications of PWFA, the use of proton super-collider beams (LHC and a future SC) to drive the "multi particle types" accelerator, and the arrangements for an electron-positron TeV-range collider are discussed.

  17. Strategic Communication Institutionalized

    DEFF Research Database (Denmark)

    Kjeldsen, Anna Karina

    2013-01-01

    of institutionalization when strategic communication is not yet visible as organizational practice, and how can such detections provide explanation for the later outcome of the process? (2) How can studies of strategic communication benefit from an institutional perspective? How can the virus metaphor generate a deeper understanding of the mechanisms that interact from the time an organization is exposed to a new organizational idea such as strategic communication until it surfaces in the form of symptoms such as mission and vision statements, communication manuals and communication positions? The first part of the article focuses on a discussion of the virus metaphor as an alternative to the widespread fashion metaphor for processes of institutionalization. The second part of the article provides empirical examples of the virus metaphor employed, examples that are drawn from a study of the institutionalization of strategic...

  18. On strategic spatial planning

    Directory of Open Access Journals (Sweden)

    Tošić Branka

    2014-01-01

    Full Text Available The goal of this paper is to explain the origin and development of strategic spatial planning, to show its complex features, and to highlight its differences from, and advantages over, traditional physical spatial planning. Strategic spatial planning is seen as one of the approaches in legally defined planning documents, illustrated through the properties of sectoral national strategies as well as issues of strategic planning at the local level in Serbia. The strategic approach is clearly recognized at the national and sub-national levels of spatial planning in European countries and in our country. This is confirmed by the goals outlined in documents of the European Union and Serbia that promote territorial cohesion and strategic integrated planning, emphasizing cooperation and the principles of sustainable spatial development. [Projekat Ministarstva nauke Republike Srbije, br. 176017

  19. GPU Accelerated Vector Median Filter

    Science.gov (United States)

    Aras, Rifat; Shen, Yuzhong

    2011-01-01

    Noise reduction is an important step for most image processing tasks. For three-channel color images, a widely used technique is the vector median filter, in which the color values of pixels are treated as 3-component vectors. Vector median filters are computationally expensive: for a window size of n x n, each of the n^2 vectors has to be compared with the other n^2 - 1 vectors in terms of distance. General-purpose computation on graphics processing units (GPUs) is the paradigm of utilizing high-performance many-core GPU architectures for computation tasks that are normally handled by CPUs. In this work, NVIDIA's Compute Unified Device Architecture (CUDA) paradigm is used to accelerate vector median filtering, which to the best of our knowledge has never been done before. The performance of the GPU-accelerated vector median filter is compared to that of the CPU and MPI-based versions for different image and window sizes. Initial findings of the study showed a 100x performance improvement of the vector median filter implementation on GPUs over CPU implementations, and further speed-up is expected after more extensive optimization of the GPU algorithm.
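    The filtering rule described above can be sketched on the CPU; the following is a minimal NumPy illustration of a vector median filter, not the paper's CUDA implementation, and the edge-replication padding is an assumption:

```python
import numpy as np

def vector_median_filter(img, w=3):
    """Vector median filter for a color image of shape (H, W, 3).

    For each w x w window, the output pixel is the window vector that
    minimizes the sum of Euclidean distances to all other vectors in
    the window. CPU reference sketch only; the paper offloads the
    per-pixel distance computation to CUDA.
    """
    r = w // 2
    padded = np.pad(img, ((r, r), (r, r), (0, 0)), mode="edge")
    out = np.empty_like(img)
    H, W = img.shape[:2]
    for y in range(H):
        for x in range(W):
            win = padded[y:y + w, x:x + w].reshape(-1, 3).astype(float)
            # pairwise distances: each of the w^2 vectors vs. the others
            d = np.linalg.norm(win[:, None, :] - win[None, :, :], axis=2)
            out[y, x] = win[d.sum(axis=1).argmin()]
    return out
```

    Replacing the two outer loops with one GPU thread per pixel is the natural CUDA mapping of this computation.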

  20. Strategic consequences of emotional misrepresentation in negotiation: The blowback effect.

    Science.gov (United States)

    Campagna, Rachel L; Mislin, Alexandra A; Kong, Dejun Tony; Bottom, William P

    2016-05-01

    Recent research indicates that expressing anger elicits concession making from negotiating counterparts. When emotions are conveyed either by a computer program or by a confederate, results appear to affirm a long-standing notion that feigning anger is an effective bargaining tactic. We hypothesize that this tactic actually jeopardizes postnegotiation deal implementation and subsequent exchange. Four studies directly test both the tactical and strategic consequences of emotional misrepresentation. False representations of anger generated little tactical benefit but produced considerable and persistent strategic disadvantage. This disadvantage is due to an effect we call "blowback": a negotiator's misrepresented anger creates an action-reaction cycle that results in genuine anger and diminishes trust in both the negotiator and the counterpart. Our findings highlight the importance of considering the strategic implications of emotional misrepresentation for negotiators interested in claiming value. We discuss the benefits of researching reciprocal interdependence between 2 or more negotiating parties and of modeling value creation beyond deal construction to include implementation of terms. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  1. Analysis of Movement Acceleration of Down's Syndrome Teenagers Playing Computer Games.

    Science.gov (United States)

    Carrogi-Vianna, Daniela; Lopes, Paulo Batista; Cymrot, Raquel; Hengles Almeida, Jefferson Jesus; Yazaki, Marcos Lomonaco; Blascovi-Assis, Silvana Maria

    2017-12-01

    This study aimed to evaluate movement acceleration characteristics in adolescents with Down syndrome (DS) and typical development (TD) while playing bowling and golf videogames on the Nintendo® Wii™. The sample comprised 21 adolescents diagnosed with DS and 33 with TD, of both sexes, between 10 and 14 years of age. The arm-swing accelerations of the dominant upper limb were recorded during the bowling and golf games. The first valid measurement, verified by the software readings, recorded at the start of each game, was used in the analysis. In the bowling game, the groups presented statistically significant differences, with maximum (M) acceleration peaks for the Male Control Group (MCG) (M = 70.37) and Female Control Group (FCG) (M = 70.51) compared with the Male Down Syndrome Group (MDSG) (M = 45.33) and Female Down Syndrome Group (FDSG) (M = 37.24). In the golf game the groups also presented statistically significant differences, the only difference being that the maximum acceleration peaks for both male groups were higher than those of the female groups: MCG (M = 74.80) and FCG (M = 56.80), as well as MDSG (M = 45.12) and FDSG (M = 30.52). It was thus possible to use accelerometry to evaluate the movement acceleration characteristics of teenagers diagnosed with DS during virtual bowling and golf games played on the Nintendo Wii console.
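    As an illustration of the kind of measure reported above, an acceleration peak can be extracted from raw 3-axis accelerometer samples as the maximum resultant magnitude; the helper below is hypothetical, not the study's software:

```python
import numpy as np

def peak_acceleration(samples):
    """Peak magnitude (resultant) of a 3-axis accelerometer trace.

    `samples` is an (N, 3) array of x/y/z acceleration samples. This is
    a hypothetical helper showing how a maximum arm-swing acceleration
    peak, like those reported for the bowling and golf games, can be
    extracted from raw sensor data.
    """
    mags = np.linalg.norm(np.asarray(samples, dtype=float), axis=1)
    return float(mags.max())
```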

  2. The efficiency and the effectiveness of strategic management: from strategic planning to organizational change

    Directory of Open Access Journals (Sweden)

    Rolando Juan Soliz Estrada

    2007-09-01

    Full Text Available Strategic management is a technique whose structuring basis is strategic administration and strategic planning, adding to their improvement the administrative perspective of organizational change. However, the organizational change models developed in recent years have been elaborated with the aim of managing general organizational change and do not take a specific approach to managing and improving strategic planning and the changes it causes; that is, they are not models that focus directly on strategic management. The objectives of this work were to develop a model of strategic administration and a model of organizational change which, taken together, make the administration of organizations efficient and effective. Concepts and approaches advocated by qualitative research were used to develop this work. As results, the two models are presented, as well as their validation in a for-profit organization.

  3. The strategic entrepreneurial thinking imperative

    OpenAIRE

    S. Dhliwayo; J. J. Van Vuuren

    2007-01-01

    Purpose: The aim of this paper is to demonstrate that strategic entrepreneurial thinking is a unitary concept which should be viewed as a standalone construct. Design/Methodology/Approach: The concept strategic entrepreneurial thinking is modelled from an analysis of strategic thinking and entrepreneurial thinking from available literature. The strategic entrepreneurial mindset imperative is then emphasised and confirmed. Findings: This paper's finding is that there is no diff...

  4. Accelerating an Ordered-Subset Low-Dose X-Ray Cone Beam Computed Tomography Image Reconstruction with a Power Factor and Total Variation Minimization.

    Science.gov (United States)

    Huang, Hsuan-Ming; Hsiao, Ing-Tsung

    2016-01-01

    In recent years, there has been increased interest in low-dose X-ray cone beam computed tomography (CBCT) in many fields, including dentistry, guided radiotherapy and small animal imaging. Despite reducing the radiation dose, low-dose CBCT has not gained widespread acceptance in routine clinical practice. In addition to performing more evaluation studies, a fast and high-quality reconstruction algorithm is required. In this work, we propose an iterative reconstruction method that accelerates ordered-subsets (OS) reconstruction using a power factor, and we combine it with total-variation (TV) minimization. Both simulation and phantom studies were conducted to evaluate the performance of the proposed method. Results show that the proposed method can accelerate conventional OS methods and greatly increase the convergence speed in early iterations. Moreover, applying TV minimization to the power acceleration scheme can further improve image quality while preserving the fast convergence rate.
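    As a minimal sketch of the TV-minimization ingredient only (generic gradient descent on a smoothed total-variation functional, not the authors' power-factor OS algorithm; the step size and smoothing parameter are illustrative assumptions):

```python
import numpy as np

def tv_denoise(f, lam=0.3, step=0.2, iters=100, eps=1e-6):
    """Gradient descent on 0.5 * ||u - f||^2 + lam * TV_eps(u).

    TV_eps is the smoothed isotropic total variation. This is a generic
    image-domain illustration of TV minimization, the term that
    suppresses noise while preserving edges in low-dose CT updates;
    lam, step and eps are assumed values.
    """
    u = f.astype(float).copy()
    for _ in range(iters):
        # forward differences (last row/column padded by replication)
        ux = np.diff(u, axis=1, append=u[:, -1:])
        uy = np.diff(u, axis=0, append=u[-1:, :])
        mag = np.sqrt(ux**2 + uy**2 + eps)
        px, py = ux / mag, uy / mag
        # divergence of the normalized gradient field (backward differences)
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        u -= step * ((u - f) - lam * div)
    return u
```

    In an iterative CT scheme, a step of this kind is typically interleaved with the data-fidelity (OS) updates.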

  5. Think Local-Act Local: Is It Time to Slow Down the Accelerated Move to Global Marketing?

    OpenAIRE

    Schuiling, Isabelle

    2001-01-01

    In view of the accelerated move of great corporations towards global marketing, the strategic changes of such companies raise interesting questions. Is marketing globalization reaching its limits after years of implementation? Is it time for companies to rethink their strategies and move back, like Coca-Cola, to a multi-domestic marketing approach?

  6. Hippocampal-cortical contributions to strategic exploration during perceptual discrimination.

    Science.gov (United States)

    Voss, Joel L; Cohen, Neal J

    2017-06-01

    The hippocampus is crucial for long-term memory; its involvement in short-term or immediate expressions of memory is more controversial. Rodent hippocampus has been implicated in an expression of memory that occurs on-line during exploration termed "vicarious trial-and-error" (VTE) behavior. VTE occurs when rodents iteratively explore options during perceptual discrimination or at choice points. It is strategic in that it accelerates learning and improves later memory. VTE has been associated with activity of rodent hippocampal neurons, and lesions of hippocampus disrupt VTE and associated learning and memory advantages. Analogous findings of VTE in humans would support the role of hippocampus in active use of short-term memory to guide strategic behavior. We therefore measured VTE using eye-movement tracking during perceptual discrimination and identified relevant neural correlates with functional magnetic resonance imaging. A difficult perceptual-discrimination task was used that required visual information to be maintained during a several second trial, but with no long-term memory component. VTE accelerated discrimination. Neural correlates of VTE included robust activity of hippocampus and activity of a network of medial prefrontal and lateral parietal regions involved in memory-guided behavior. This VTE-related activity was distinct from activity associated with simply viewing visual stimuli and making eye movements during the discrimination task, which occurred in regions frequently associated with visual processing and eye-movement control. Subjects were mostly unaware of performing VTE, thus further distancing VTE from explicit long-term memory processing. These findings bridge the rodent and human literatures on neural substrates of memory-guided behavior, and provide further support for the role of hippocampus and a hippocampal-centered network of cortical regions in the immediate use of memory in on-line processing and the guidance of behavior. 

  7. 76 FR 52642 - Notice of Advisory Committee Closed Meeting; U.S. Strategic Command Strategic Advisory Group

    Science.gov (United States)

    2011-08-23

    ... DEPARTMENT OF DEFENSE Notice of Advisory Committee Closed Meeting; U.S. Strategic Command Strategic Advisory Group AGENCY: Department of Defense. ACTION: Notice of Advisory Committee closed meeting.... Strategic Command Strategic Advisory Group. DATES: November 1, 2011, from 8 a.m. to 5 p.m. and November 2...

  8. Fast computation of the characteristics method on vector computers

    International Nuclear Information System (INIS)

    Kugo, Teruhiko

    2001-11-01

    Fast computation of the characteristics method for solving the neutron transport equation in a heterogeneous geometry has been studied. Two vector computation algorithms, an odd-even sweep (OES) method and an independent sequential sweep (ISS) method, have been developed, and their efficiency has been investigated for a typical fuel assembly calculation. For both methods, the vector computation is 15 times faster than a scalar computation. Comparing the OES and ISS methods, the following is found: 1) there is a small difference in computation speed, 2) the ISS method shows faster convergence and 3) the ISS method saves about 80% of computer memory size compared with the OES method. It is therefore concluded that the ISS method is superior to the OES method as a vectorization method. In the vector computation, a table-look-up method to reduce the computation time of an exponential function saves only 20% of the whole computation time. Both the coarse mesh rebalance method and the Aitken acceleration method are effective as acceleration methods for the characteristics method; a combination of them saves 70-80% of outer iterations compared with a free iteration. (author)
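    The table-look-up idea mentioned above can be sketched as follows; the table range and size here are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Precomputed table for exp(-x) on [0, X_MAX]; linear interpolation
# between entries replaces repeated exp() calls, e.g. for attenuation
# factors in a transport sweep. X_MAX and N are assumed values.
X_MAX, N = 20.0, 4096
_step = X_MAX / (N - 1)
_table = np.exp(-np.linspace(0.0, X_MAX, N))

def exp_neg(x):
    """Approximate exp(-x) for 0 <= x <= X_MAX by table look-up."""
    i = min(int(x / _step), N - 2)
    frac = x / _step - i
    return _table[i] * (1.0 - frac) + _table[i + 1] * frac
```

    With 4096 entries over [0, 20], the linear-interpolation error is far below typical transport-solver tolerances, which is why the trade-off in the abstract is measured purely in run time.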

  9. Accelerator system model (ASM): A unique tool in exploring accelerator driven transmutation technologies (ADTT) system trade space

    Energy Technology Data Exchange (ETDEWEB)

    Myers, T.J.; Favale, A.J.; Berwald, D.H.; Burger, E.C.; Paulson, C.C.; Peacock, M.A.; Piaszczyk, C.M.; Piechowiak, E.M.; Rathke, J.W. [Northrop Grumman Corp., Bethpage, NY (United States). Advanced Technology and Development Center

    1997-09-01

    To aid in the development and optimization of emerging Accelerator Driven Transmutation Technology (ADTT) concepts, the Northrop Grumman Corporation, working together with G.H. Gillespie Associates and Los Alamos National Laboratory has developed a computational tool which combines both accelerator physics layout/analysis capabilities with engineering analysis capabilities to create a standardized platform to compare and contrast accelerator system configurations. In this context, the accelerator system configuration includes not only the accelerating structures, but also the major support systems such as the vacuum, thermal control, RF power, and cryogenic subsystem (if superconducting accelerator operation is investigated) as well as estimates of the costs for enclosures (accelerating tunnel and RF halls). This paper presents an overview of the Accelerator System Model (ASM) code flow, as well as a discussion of the data and analysis upon which it is based. Also presented is material which addresses the development of the evaluation criteria employed by this code including a presentation of the economic analysis methods, and a discussion of the cost database employed. The paper concludes with examples depicting completed and planned trade studies for both normal and superconducting accelerator applications. 8 figs.

  10. A Handbook for Strategic Planning

    Science.gov (United States)

    1994-01-01

    Total Quality Leadership, strategic direction, strategic intent, organizational planning, mission, systems thinking, gap analysis. Department of the Navy vision, guiding principles, and strategic goals. Washington, DC: Author. Hamel, G., & Prahalad, C. K. (May-June 1989). Strategic ...

  11. Strategic Talk in Film.

    Science.gov (United States)

    Payr, Sabine; Skowron, Marcin; Dobrosovestnova, Anna; Trapp, Martin; Trappl, Robert

    2017-01-01

    Conversational robots and agents are being designed for educational and/or persuasive tasks, e.g., health or fitness coaching. To pursue such tasks over a long time, they will need a complex model of the strategic goal, a variety of strategies to implement it in interaction, and the capability of strategic talk. Strategic talk is incipient ongoing conversation in which at least one participant has the objective of changing the other participant's attitudes or goals. The paper is based on the observation that strategic talk can stretch over considerable periods of time and a number of conversational segments. Film dialogues are taken as a source to develop a model of the strategic talk of mentor characters. A corpus of film mentor utterances is annotated on the basis of the model, and the data are interpreted to arrive at insights into mentor behavior, especially into the realization and sequencing of strategies.

  12. A Study on Strategic Planning and Procurement of Medicals in Uganda’s Regional Referral Hospitals

    Science.gov (United States)

    2016-01-01

    This study was an analysis of the effect of strategic planning on procurement of medicals in Uganda’s regional referral hospitals (RRH’s). Medicals were defined as essential medicines, medical devices and medical equipment. The Ministry of Health (MOH) has been carrying out strategic planning for the last 15 years via the Health Sector Strategic Plans. Their assumption was that strategic planning would translate to strategic procurement and consequently, availability of medicals in the RRH’s. However, despite the existence of these plans, there have been many complaints about expired drugs and shortages in RRH’s. For this purpose, a third variable was important because it served the role of mediation. A questionnaire was used to obtain information on perceptions of 206 respondents who were selected using simple random sampling. 8 key informant interviews were held, 2 in each RRH. 4 Focus Group Discussions were held, 1 for each RRH, and between 5 and 8 staff took part as discussants for approximately three hours. The findings suggested that strategic planning was affected by funding to approximately 34% while the relationship between funding and procurement was 35%. The direct relationship between strategic planning and procurement was 18%. However when the total causal effect was computed it turned out that strategic planning and the related variable of funding contributed 77% to procurement of medicals under the current hierarchical model where MOH is charged with development of strategic plans for the entire health sector. Since even with this contribution there were complaints, the study proposed a new model called CALF which according to a simulation, if adopted by MOH, strategic planning would contribute 87% to effectiveness in procurement of medicals. PMID:28299158

  13. Cultivating strategic thinking skills.

    Science.gov (United States)

    Shirey, Maria R

    2012-06-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches helpful to nurse leaders advancing organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the author presents an overview of strategic leadership and offers approaches for cultivating strategic thinking skills.

  14. Chaotic dynamics in accelerator physics

    International Nuclear Information System (INIS)

    Cary, J.R.

    1992-01-01

    Substantial progress was made in several areas of accelerator dynamics. Work on developing an understanding of longitudinal adiabatic dynamics, and on creating efficiency enhancements for recirculating free-electron lasers, was substantially completed. A computer code for analyzing the critical KAM tori that bound the dynamic aperture in circular machines was developed. Studies of modes that arise due to the interaction of coasting beams with a narrow-spectrum impedance have begun. During this research, educational and research ties with the accelerator community at large have been strengthened.

  15. Department of Accelerator Physics and Technology: Overview

    International Nuclear Information System (INIS)

    Pachan, M.

    2001-01-01

    Full text: In view of the limited number of scientific and technical staff, it was necessary to focus activity on the most important subjects and to keep a balance between current duties and the development of future projects. The dominant item was the realisation of research and design work in the Ordered Project for a New Therapeutical Accelerator with two photon beam energies, 6 and 15 MeV. During the reported year, the main efforts were oriented on: - computational and experimental work on optimization of electron gun parameters and electron optics in the injection system for the accelerating structure, - calculation and modelling of a standing-wave, S-band accelerating structure to achieve a broad range of electron energy variation with good phase acceptance and a narrow energy spectrum of the output beam, - calculation and design of the beam focusing and transport system, with deflection of the output beam by 270° in an achromatic sector magnet, - design and modelling of the microwave power system, with pilot generator, 6 MW klystron amplifier, pulse modulator, waveguide system, four-port circulator and automatic frequency control, - preparatory work on metrological procedures and apparatus for accelerated-beam diagnostics comprising measurements of energy spectrum, beam intensity, transmission factor, leakage radiation, and other important beam parameters. Other important subjects worth mentioning are: - advances in forming and metrology of narrow X-ray photon beams dedicated to stereotactic radiosurgery and radiotherapy, - adaptation of a new version of EGS-4, an MC-type code for computer simulation of dose distribution in therapeutical beams, - participation in selected items of the TESLA Project in cooperation with DESY - Hamburg, - theory and computer simulation of higher-order modes in superconducting accelerating structures, - technological research on methods and apparatus for thin-layer coating of r.f. resonators and subunits in transmission circuits, - conceptual studies of proposed new

  16. Numerical Nuclear Second Derivatives on a Computing Grid: Enabling and Accelerating Frequency Calculations on Complex Molecular Systems.

    Science.gov (United States)

    Yang, Tzuhsiung; Berry, John F

    2018-06-04

    The computation of nuclear second derivatives of the energy, or the nuclear Hessian, is an essential routine in quantum chemical investigations of ground and transition states, thermodynamic calculations, and molecular vibrations. Analytic nuclear Hessian computations require the resolution of costly coupled-perturbed self-consistent field (CP-SCF) equations, while numerical differentiation of analytic first derivatives has an unfavorable 6N (N = number of atoms) prefactor. Herein, we present a new method in which grid computing is used to accelerate and/or enable the evaluation of the nuclear Hessian via numerical differentiation: NUMFREQ@Grid. Nuclear Hessians were successfully evaluated by NUMFREQ@Grid at the DFT level as well as using RIJCOSX-ZORA-MP2 or RIJCOSX-ZORA-B2PLYP for a set of linear polyacenes with systematically increasing size. For the larger members of this group, NUMFREQ@Grid was found to outperform the wall clock time of analytic Hessian evaluation; at the MP2 or B2PLYP levels, these Hessians cannot even be evaluated analytically. We also evaluated a 156-atom catalytically relevant open-shell transition metal complex and found that NUMFREQ@Grid is faster (7.7 times shorter wall clock time) and less demanding (4.4 times less memory) than an analytic Hessian. Capitalizing on the capabilities of parallel grid computing, NUMFREQ@Grid can outperform analytic methods in terms of wall time, memory requirements, and treatable system size. The NUMFREQ@Grid method presented herein demonstrates how grid computing can be used to facilitate embarrassingly parallel computational procedures and is a pioneer for future implementations.
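    The core of such a scheme is easy to sketch: each displaced-geometry gradient is an independent job, so the 6N evaluations can be farmed out to workers. The following Python fragment illustrates the central-difference construction under the assumption of a user-supplied `gradient_fn`; a thread pool stands in for the grid, and all names are illustrative rather than NUMFREQ@Grid's actual implementation.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def numerical_hessian(gradient_fn, coords, step=1e-3, workers=4):
        """Central-difference nuclear Hessian from analytic gradients.

        Builds the 2 * 3N displaced geometries, evaluates the gradient of
        each one independently (here on a thread pool; on a grid these
        would be separate jobs), then differences them column by column.
        """
        n = len(coords)  # number of Cartesian coordinates (3N)
        displaced = []
        for i in range(n):
            for sign in (+1.0, -1.0):
                geom = list(coords)
                geom[i] += sign * step
                displaced.append(geom)
        with ThreadPoolExecutor(max_workers=workers) as pool:
            grads = list(pool.map(gradient_fn, displaced))
        hessian = [[0.0] * n for _ in range(n)]
        for i in range(n):
            g_plus, g_minus = grads[2 * i], grads[2 * i + 1]
            for j in range(n):
                hessian[i][j] = (g_plus[j] - g_minus[j]) / (2.0 * step)
        return hessian
    ```

    For a quadratic surface the finite-difference result is exact up to round-off, which makes a convenient sanity check before spending grid hours on a real 156-atom complex.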

  17. Inverse Cerenkov laser acceleration experiment at ATF

    International Nuclear Information System (INIS)

    Wang, X.J.; Pogorelsky, I.; Fernow, R.; Kusche, K.P.; Liu, Y.; Kimura, W.D.; Kim, G.H.; Romea, R.D.; Steinhauer, L.C.

    1994-01-01

    Inverse Cerenkov laser acceleration was demonstrated using an axicon optical system at the Brookhaven Accelerator Test Facility (ATF). The ATF S-band linac and a high power 10.6 μm CO2 laser were used for the experiment. The experimental arrangement and the synchronization of the laser and electron beams are discussed. The electrons were accelerated by more than 0.7 MeV at 34 MW of CO2 laser power. More than 3.7 MeV of acceleration was measured at 0.7 GW of CO2 laser power, which is more than 20 times that of the previous ICA experiment. The experimental results are compared with simulations using the computer program TRANSPORT.

  18. The CARE project (Coordinated Accelerator Research in Europe)

    International Nuclear Information System (INIS)

    Napoly, Olivier

    2006-01-01

    CARE, an ambitious and coordinated project of accelerator research and development oriented towards High Energy Physics projects, was launched in January 2004 by the main European laboratories and the European Commission within the 6th Framework Programme. This project aims at improving existing infrastructures dedicated to future projects such as linear colliders, upgrades of hadron colliders and high intensity proton drivers. An important part of this programme is devoted to advancing the performance of superconducting technology, both in the fields of RF cavities for electron and proton acceleration and of high field magnets, as well as to developing high intensity electron and proton injectors. We describe the plans of the four main Joint Research Activities and report on the results and progress obtained so far. The CARE project also includes three adjacent Networking Activities whose main goal is to organize a forum of discussions and to provide strategic plans in the fields of the Linear Collider, intense Neutrino Beams, and future Hadron Colliders.

  19. FY17 Strategic Themes.

    Energy Technology Data Exchange (ETDEWEB)

    Leland, Robert W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-03-01

    I am pleased to present this summary of the FY17 Division 1000 Science and Technology Strategic Plan. As this plan represents a continuation of the work we started last year, the four strategic themes (Mission Engagement, Bold Outcomes, Collaborative Environment, and Safety Imperative) remain the same, along with many of the goals. You will see most of the changes in the actions listed for each goal: We completed some actions, modified others, and added a few new ones. As I’ve stated previously, this is not a strategy to be pursued in tension with the Laboratory strategic plan. The Division 1000 strategic plan is intended to chart our course as we strive to contribute our very best in service of the greater Laboratory strategy. I welcome your feedback and look forward to our dialogue about these strategic themes. Please join me as we move forward to implement the plan in the coming months.

  20. FY16 Strategic Themes.

    Energy Technology Data Exchange (ETDEWEB)

    Leland, Robert W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-03-01

    I am pleased to present this summary of the Division 1000 Science and Technology Strategic Plan. This plan was created with considerable participation from all levels of management in Division 1000, and is intended to chart our course as we strive to contribute our very best in service of the greater Laboratory strategy. The plan is characterized by four strategic themes: Mission Engagement, Bold Outcomes, Collaborative Environment, and the Safety Imperative. Each theme is accompanied by a brief vision statement, several goals, and planned actions to support those goals throughout FY16. I want to be clear that this is not a strategy to be pursued in tension with the Laboratory strategic plan. Rather, it is intended to describe “how” we intend to show up for the “what” described in Sandia’s Strategic Plan. I welcome your feedback and look forward to our dialogue about these strategic themes. Please join me as we move forward to implement the plan in the coming year.

  1. Modeling Strategic Use of Human Computer Interfaces with Novel Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Laura Jane Mariano

    2015-07-01

    Full Text Available Immersive software tools are virtual environments designed to give their users an augmented view of real-world data and ways of manipulating that data. As virtual environments, every action users make while interacting with these tools can be carefully logged, as can the state of the software and the information it presents to the user, giving these actions context. This data provides a high-resolution lens through which dynamic cognitive and behavioral processes can be viewed. In this report, we describe new methods for the analysis and interpretation of such data, utilizing a novel implementation of the Beta Process Hidden Markov Model (BP-HMM) for analysis of software activity logs. We further report the results of a preliminary study designed to establish the validity of our modeling approach. A group of 20 participants were asked to play a simple computer game, instrumented to log every interaction with the interface. Participants had no previous experience with the game’s functionality or rules, so the activity logs collected during their naïve interactions capture patterns of exploratory behavior and skill acquisition as they attempted to learn the rules of the game. Pre- and post-task questionnaires probed for self-reported styles of problem solving, as well as task engagement, difficulty, and workload. We jointly modeled the activity log sequences collected from all participants using the BP-HMM approach, identifying a global library of activity patterns representative of the collective behavior of all the participants. Analyses show systematic relationships between both pre- and post-task questionnaires, self-reported approaches to analytic problem solving, and metrics extracted from the BP-HMM decomposition. Overall, we find that this novel approach to decomposing unstructured behavioral data within software environments provides a sensible means for understanding how users learn to integrate software functionality for strategic

  2. Antecedents to strategic flexibility : Management cognition, firm resources and strategic options

    NARCIS (Netherlands)

    Combe, I.; Rudd, J.M.; Leeflang, P.S.H.; Greenley, G.E.

    2012-01-01

    Purpose - Current conceptualisations of strategic flexibility and its antecedents are theory-driven, which has resulted in a lack of consensus. To summarise this domain the paper aims to develop and present an a priori conceptual model of the antecedents and outcomes of strategic flexibility.

  3. Accelerator control using RSX-11M and CAMAC

    International Nuclear Information System (INIS)

    Kulaga, J.E.

    1978-01-01

    This paper describes a computer-control system for a superconducting linear accelerator currently under development at Argonne National Laboratory. RSX-11M V3.1 running on a PDP 11/34 is used with CAMAC hardware to fully control 22 active beam-line elements and monitor critical accelerator conditions such as temperature, vacuum, and beam characteristics. This paper contrasts the use of an RSX compatible CAMAC driver for most CAMAC I/O operations and the use of the Connect-to-Interrupt Vector directive for fast ADC operation. The usage of table-driven software to achieve hardware configuration independence is discussed, along with the design considerations of the software interface between a human operator and a computer-control system featuring multi-function computer-readable control knobs and computer-writable displays which make up the operator's control console

  4. Command, Control, Communication, Computers and Information Technology (C4&IT). Strategic Plan, FY2008 - 2012

    National Research Council Canada - National Science Library

    2008-01-01

    ...&IT)/CG-6, Chief Information Officer (CIO), for the Coast Guard publishes this C4&IT Strategic Plan. The purpose of this plan is to provide a unifying strategy to better integrate and synchronize Coast Guard C4...

  5. Ground test accelerator control system software

    International Nuclear Information System (INIS)

    Burczyk, L.; Dalesio, R.; Dingler, R.; Hill, J.; Howell, J.A.; Kerstiens, D.; King, R.; Kozubal, A.; Little, C.; Martz, V.; Rothrock, R.; Sutton, J.

    1988-01-01

    This paper reports on the GTA control system, which provides an environment in which the automation of a state-of-the-art accelerator can be developed. It makes use of commercially available computers, workstations, computer networks, industrial I/O equipment, and software. This system has built-in supervisory control (like most accelerator control systems), tools to support continuous control (like the process control industry), and sequential control for automatic start-up and fault recovery (like few other accelerator control systems). Several software tools support these levels of control: a real-time operating system (VxWorks) with a real-time kernel (VRTX), a configuration database, a sequencer, and a graphics editor. VxWorks supports multitasking, fast context-switching, and preemptive scheduling. VxWorks/VRTX is a network-based development environment specifically designed to work in partnership with the UNIX operating system. A database provides the interface to the accelerator components. It consists of a run-time library and a database configuration and editing tool. A sequencer initiates and controls the operation of all sequence programs (expressed as state programs). A graphics editor gives the user the ability to create color graphic displays showing the state of the machine in either text or graphics form.
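    The sequential-control layer described above, with sequences expressed as state programs, can be pictured with a small sketch. The following Python fragment is not the GTA sequencer; it is an illustrative state-program interpreter in which a start-up sequence retries a failed step through a recovery state (all state and signal names are made up).

    ```python
    # Minimal state-program sequencer: each state names an action and a
    # transition rule; the sequencer steps until a terminal state.

    def run_sequence(states, start, context):
        """Run a state program: states maps name -> (action, transition).

        action(context) performs the step; transition(context) returns
        the next state name, or None to stop. Returns the visited trace.
        """
        state = start
        trace = []
        while state is not None:
            trace.append(state)
            action, transition = states[state]
            action(context)
            state = transition(context)
        return trace

    # Example: automatic start-up with fault recovery (invented signals).
    def pump_down(c):
        c["attempts"] += 1
        c["vacuum_ok"] = c["attempts"] >= 2  # pretend it succeeds on retry

    states = {
        "PUMP_DOWN": (pump_down,
                      lambda c: "RF_ON" if c["vacuum_ok"] else "RECOVER"),
        "RECOVER":   (lambda c: None, lambda c: "PUMP_DOWN"),
        "RF_ON":     (lambda c: None, lambda c: None),
    }

    ctx = {"vacuum_ok": False, "attempts": 0}
    print(run_sequence(states, "PUMP_DOWN", ctx))
    # -> ['PUMP_DOWN', 'RECOVER', 'PUMP_DOWN', 'RF_ON']
    ```

    The table-driven shape is the point: changing the start-up or recovery logic means editing the `states` table, not the interpreter.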

  6. The strategic security officer.

    Science.gov (United States)

    Hodges, Charles

    2014-01-01

    This article discusses the concept of the strategic security officer, and the potential that it brings to the healthcare security operational environment. The author believes that training and development, along with strict hiring practices, can enable a security department to reach a new level of professionalism, proficiency and efficiency. The strategic officer for healthcare security is adapted from the "strategic corporal" concept of US Marine Corps General Charles C. Krulak which focuses on understanding the total force implications of the decisions made by the lowest level leaders within the Corps (Krulak, 1999). This article focuses on the strategic organizational implications of every security officer's decisions in the constantly changing and increasingly volatile operational environment of healthcare security.

  7. ABSTRACTS Preliminary Study of Strategic Inner Cores

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    When a strategic entity attempts to make a decision, the project must first be in accordance with its strategic framework and must make the strategic inner cores prominent. The existing theories of development strategy indicate that the formation of the framework can be divided into the following parts: inside and outside environments, purpose, goal, key points, and countermeasures. The strategic inner cores put forward by this paper are an intensification and advancement of the theory of the strategic framework; strategic orientation, strategic vision and main line are included. The appearance of these ideas has improved the theory and enhanced strategic practice.

  8. Energy Efficient FPGA based Hardware Accelerators for Financial Applications

    DEFF Research Database (Denmark)

    Kenn Toft, Jakob; Nannarelli, Alberto

    2014-01-01

    Field Programmable Gate Array (FPGA) based accelerators are very suitable for implementing application-specific processors using uncommon operations or number systems. In this work, we design FPGA-based accelerators for two financial computations with different characteristics, and we compare the accelerator performance and energy consumption to a software execution of the application. The experimental results show that significant speed-up and energy savings can be obtained for large data sets by using the accelerator, at the expense of a longer development time.

  9. Strategic Risk Assessment

    Science.gov (United States)

    Derleth, Jason; Lobia, Marcus

    2009-01-01

    This slide presentation provides an overview of the attempt to develop and demonstrate a methodology for the comparative assessment of risks across the entire portfolio of NASA projects and assets. It includes information about strategic risk identification, normalizing strategic risks, calculation of relative risk score, and implementation options.

  10. Anderson Acceleration for Fixed-Point Iterations

    Energy Technology Data Exchange (ETDEWEB)

    Walker, Homer F. [Worcester Polytechnic Institute, MA (United States)

    2015-08-31

    The purpose of this grant was to support research on acceleration methods for fixed-point iterations, with applications to computational frameworks and simulation problems that are of interest to DOE.
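    For the simplest non-trivial case (window m = 1, scalar iterate), Anderson acceleration mixes the last two fixed-point evaluations so that the extrapolated residual is minimized, which reduces to a secant-like update. The Python sketch below is a minimal illustration of that idea, not code from the grant.

    ```python
    import math

    def anderson_m1(g, x0, tol=1e-12, max_iter=100):
        """Anderson acceleration, window m = 1, for a scalar map g.

        With residual f(x) = g(x) - x, the mixing weight theta minimizes
        |f_k - theta*(f_k - f_{k-1})|, giving theta = f_k / (f_k - f_{k-1});
        the next iterate is the correspondingly mixed g-value.
        """
        x_prev, x = x0, g(x0)
        for _ in range(max_iter):
            gx, gx_prev = g(x), g(x_prev)
            f, f_prev = gx - x, gx_prev - x_prev
            if abs(f) < tol:
                return x
            denom = f - f_prev
            theta = f / denom if denom != 0.0 else 0.0
            x_prev, x = x, gx - theta * (gx - gx_prev)
        return x
    ```

    On the classic test map g(x) = cos(x), plain fixed-point iteration needs dozens of iterations for twelve digits, while this accelerated version reaches the fixed point x = cos(x) ≈ 0.739085 in a handful of steps.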

  11. Microfoundations of strategic decision effectiveness

    NARCIS (Netherlands)

    Jansen, R.J.G.; Van Santen, Sarah

    2017-01-01

    How do organizations make effective strategic decisions? In this study we build on research on the microfoundations of strategy and strategic decision-making to study the underpinnings of strategic decision effectiveness. We argue that the process-effectiveness link can be more fully understood if

  12. Automatic performance tuning of parallel and accelerated seismic imaging kernels

    KAUST Repository

    Haberdar, Hakan; Siddiqui, Shahzeb; Feki, Saber

    2014-01-01

    the performance of the MPI communications as well as developer productivity by providing a higher level of abstraction. Keeping productivity in mind, we opted toward pragma based programming for accelerated computation on latest accelerated architectures

  13. Evaluation of a server-client architecture for accelerator modeling and simulation

    International Nuclear Information System (INIS)

    Bowling, B.A.; Akers, W.; Shoaee, H.; Watson, W.; Zeijts, J. van; Witherspoon, S.

    1997-01-01

    Traditional approaches to computational modeling and simulation often utilize a batch method for code execution using file-formatted input/output. This method of code implementation was generally chosen for several factors, including CPU throughput and availability, complexity of the required modeling problem, and presentation of computation results. With the advent of faster computer hardware and the advances in networking and software techniques, other program architectures for accelerator modeling have recently been employed. Jefferson Laboratory has implemented a client/server solution for accelerator beam transport modeling utilizing a query-based I/O. The goal of this code is to provide modeling information for control system applications and to serve as a computation engine for general modeling tasks, such as machine studies. This paper performs a comparison between the batch execution and server/client architectures, focusing on design and implementation issues, performance, and general utility towards accelerator modeling demands
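    The architectural difference is easy to caricature: in a batch design every modeling request pays the model's setup cost and round-trips through files, while in the server design the model stays resident and answers queries. The toy Python sketch below makes the point with a thread and two queues standing in for the network transport; none of this is Jefferson Laboratory's actual code, and all model values are invented.

    ```python
    import threading
    import queue

    # Toy resident "model server": the (expensive-to-build) transfer-map
    # model stays in memory and answers repeated queries, instead of a
    # batch run that rebuilds state and round-trips through files.

    def model_server(requests, responses):
        transfer_map = {"quad1": 1.2, "quad2": 0.8}  # pretend setup cost is high
        while True:
            query = requests.get()
            if query is None:          # shutdown sentinel
                break
            responses.put(transfer_map.get(query, 0.0))

    requests, responses = queue.Queue(), queue.Queue()
    server = threading.Thread(target=model_server, args=(requests, responses))
    server.start()

    requests.put("quad1")
    print(responses.get())   # prints 1.2
    requests.put(None)
    server.join()
    ```

    A control-system client can then issue many small queries per second against the warm model, which is exactly what a file-formatted batch interface cannot provide.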

  14. Strategic thinking on oil development in China

    International Nuclear Information System (INIS)

    Liu Keyu; Shan Weiguo

    2005-01-01

    It is expected that crude oil production in China will maintain its current level until 2020. Driven by higher living standards and the rapid development of energy intensive industries, China's oil demand will increase rapidly and might lead to heavier import dependency. Three cases of demand forecasts are presented, but for the sake of sustainable economic and social development, neither the high nor the middle case is favourable for China. Thus, China must seek a path of oil-saving economic development and limit oil consumption to no more than 350 MT in 2010 and 450 MT in 2020. Meanwhile, in order to secure the oil supply, the following strategies should be adopted: save oil and develop alternative energies; stabilise domestic oil production; diversify oil imports and overseas oil exploration and development; accelerate the gas industry; and introduce strategic petroleum reserves. (author)

  15. The Advanced Test Reactor Strategic Evaluation Program

    International Nuclear Information System (INIS)

    Buescher, B.J.

    1990-01-01

    A systematic evaluation of safety, environmental, and operational issues has been initiated at the Advanced Test Reactor (ATR). This program, the Strategic Evaluation Program (STEP), provides an integrated review of safety and operational issues against the standards applied to licensed commercial facilities. In the review of safety issues, 18 deviations were identified which required prompt attention. Resolution of these items has been accelerated in the program. An integrated living schedule is being developed to address the remaining findings. A risk evaluation is being performed on the proposed corrective actions and these actions will then be formally ranked in order of priority based on considerations of safety and operational significance. Once the final ranking is completed, an integrated schedule will be developed, which will include considerations of availability of funding and operating schedule. 3 refs., 2 figs

  16. The new generation of PowerPC VMEbus front end computers for the CERN SPS and LEP accelerators control system

    OpenAIRE

    Van den Eynden, M

    1995-01-01

    The CERN SPS and LEP PowerPC project is aimed at introducing a new generation of PowerPC VMEbus processor modules running the LynxOS real-time operating system. This new generation of front end computers using the state-of-the-art microprocessor technology will first replace the obsolete Xenix PC based systems (about 140 installations) successfully used since 1988 to control the LEP accelerator. The major issues addressed in the scope of this large scale project are the technical specificatio...

  17. Materials and Life Science Experimental Facility at the Japan Proton Accelerator Research Complex III: Neutron Devices and Computational and Sample Environments

    Directory of Open Access Journals (Sweden)

    Kaoru Sakasai

    2017-08-01

    Full Text Available Neutron devices such as neutron detectors, optical devices including supermirror devices and 3He neutron spin filters, and choppers are successfully developed and installed at the Materials Life Science Facility (MLF of the Japan Proton Accelerator Research Complex (J-PARC, Tokai, Japan. Four software components of MLF computational environment, instrument control, data acquisition, data analysis, and a database, have been developed and equipped at MLF. MLF also provides a wide variety of sample environment options including high and low temperatures, high magnetic fields, and high pressures. This paper describes the current status of neutron devices, computational and sample environments at MLF.

  18. Crowdnursing - Strategizing Shitstorms

    DEFF Research Database (Denmark)

    Christensen, Lars Holmgaard

    2018-01-01

    This paper introduces a framework for distinguishing between shitstorm types and social media crises. In need of strategies for handling social media crowds, the paper suggests a strategic approach that focuses on the cultivation of social media crowds and offers a valuable conceptual understanding of crowdnursing as a strategic tool.

  19. Strategic Planning and Financial Management

    Science.gov (United States)

    Conneely, James F.

    2010-01-01

    Strong financial management is a strategy for strategic planning success in student affairs. It is crucial that student affairs professionals understand the necessity of linking their strategic planning with their financial management processes. An effective strategic planner needs strong financial management skills to implement the plan over…

  20. Strategic Partnerships in Higher Education

    Science.gov (United States)

    Ortega, Janet L.

    2013-01-01

    The purpose of this study was to investigate the impacts of strategic partnerships between community colleges and key stakeholders; to specifically examine strategic partnerships; leadership decision-making; criteria to evaluate strategic partnerships that added value to the institution, value to the students, faculty, staff, and the local…

  1. The Possibilities of Strategic Finance

    Science.gov (United States)

    Chaffee, Ellen

    2010-01-01

    Strategic finance is aligning financial decisions--regarding revenues, creating and maintaining institutional assets, and using those assets--with the institution's mission and strategic plan. The concept known as "strategic finance" increasingly is being seen as a useful perspective for helping boards and presidents develop a sustainable…

  2. An "Elective Replacement" Approach to Providing Extra Help in Math: The Talent Development Middle Schools' Computer- and Team-Assisted Mathematics Acceleration (CATAMA) Program.

    Science.gov (United States)

    Mac Iver, Douglas J.; Balfanz, Robert; Plank, Stephan B.

    1999-01-01

    Two studies evaluated the Computer- and Team-Assisted Mathematics Acceleration course (CATAMA) in Talent Development Middle Schools. The first study compared growth in math achievement for 96 seventh-graders (48 of whom participated in CATAMA and 48 of whom did not); the second study gathered data from interviews with, and observations of, CATAMA…

  3. Strategic and non-strategic problem gamblers differ on decision-making under risk and ambiguity.

    Science.gov (United States)

    Lorains, Felicity K; Dowling, Nicki A; Enticott, Peter G; Bradshaw, John L; Trueblood, Jennifer S; Stout, Julie C

    2014-07-01

    To analyse problem gamblers' decision-making under conditions of risk and ambiguity, investigate underlying psychological factors associated with their choice behaviour and examine whether decision-making differed in strategic (e.g., sports betting) and non-strategic (e.g., electronic gaming machine) problem gamblers. Cross-sectional study. Out-patient treatment centres and university testing facilities in Victoria, Australia. Thirty-nine problem gamblers and 41 age, gender and estimated IQ-matched controls. Decision-making tasks included the Iowa Gambling Task (IGT) and a loss aversion task. The Prospect Valence Learning (PVL) model was used to provide an explanation of cognitive, motivational and response style factors involved in IGT performance. Overall, problem gamblers performed more poorly than controls on both the IGT (P = 0.04) and the loss aversion task (P = 0.01), and their IGT decisions were associated with heightened attention to gains (P = 0.003) and less consistency (P = 0.002). Strategic problem gamblers did not differ from matched controls on either decision-making task, but non-strategic problem gamblers performed worse on both the IGT (P = 0.006) and the loss aversion task (P = 0.02). Furthermore, we found differences in the PVL model parameters underlying strategic and non-strategic problem gamblers' choices on the IGT. Problem gamblers demonstrated poor decision-making under conditions of risk and ambiguity. Strategic (e.g. sports betting, poker) and non-strategic (e.g. electronic gaming machines) problem gamblers differed in decision-making and the underlying psychological processes associated with their decisions. © 2014 Society for the Study of Addiction.
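    The PVL model referenced above combines a prospect-theory utility transform, a delta learning rule, and a softmax choice rule. The Python sketch below shows one common formulation (utility exponent alpha, loss aversion lambda, learning rate A, and sensitivity theta = 3^c - 1 derived from the consistency parameter c); it is a paraphrase for illustration, not the authors' fitted implementation.

    ```python
    import math

    def pvl_update(expectancy, outcome, alpha, loss_aversion, learning_rate):
        """Prospect-utility transform plus delta-rule update of one deck's
        expectancy after observing a monetary outcome."""
        if outcome >= 0:
            utility = outcome ** alpha
        else:
            utility = -loss_aversion * (-outcome) ** alpha
        return expectancy + learning_rate * (utility - expectancy)

    def pvl_choice_probs(expectancies, consistency):
        """Softmax choice rule: response consistency c sets the
        sensitivity theta = 3**c - 1 (one common parameterization)."""
        theta = 3.0 ** consistency - 1.0
        exps = [math.exp(theta * e) for e in expectancies]
        total = sum(exps)
        return [e / total for e in exps]
    ```

    In this parameterization, "heightened attention to gains" corresponds to a larger utility exponent for positive outcomes, and "less consistency" to a smaller c, flattening the choice probabilities toward random selection.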

  4. Toward real-time diffuse optical tomography: accelerating light propagation modeling employing parallel computing on GPU and CPU

    Science.gov (United States)

    Doulgerakis, Matthaios; Eggebrecht, Adam; Wojtkiewicz, Stanislaw; Culver, Joseph; Dehghani, Hamid

    2017-12-01

    Parameter recovery in diffuse optical tomography is a computationally expensive algorithm, especially when used for large and complex volumes, as in the case of human brain functional imaging. The modeling of light propagation, also known as the forward problem, is the computational bottleneck of the recovery algorithm, whereby the lack of a real-time solution is impeding practical and clinical applications. The objective of this work is the acceleration of the forward model, within a diffusion approximation-based finite-element modeling framework, employing parallelization to expedite the calculation of light propagation in realistic adult head models. The proposed methodology is applicable for modeling both continuous wave and frequency-domain systems, with the results demonstrating a 10-fold speed increase when GPU architectures are available, while maintaining high accuracy. It is shown that, for a very high-resolution finite-element model of the adult human head with ∼600,000 nodes, consisting of heterogeneous layers, light propagation can be calculated at ∼0.25 s per excitation source.
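    One structural property that makes the multi-source forward problem parallel-friendly is that every excitation source shares the same system matrix, so the matrix can be factored once and only the cheap triangular solves repeated per source. The 1-D toy below (pure Python, Thomas algorithm for a tridiagonal matrix) illustrates that split; a real DOT code would factor a large sparse FEM matrix, possibly on a GPU, and the coefficient values here are invented.

    ```python
    def factor_tridiag(a, b, c):
        """One-time LU-style elimination for a tridiagonal matrix with
        sub-diagonal a, diagonal b, super-diagonal c (a[0], c[-1] unused)."""
        n = len(b)
        bp, cp = list(b), [0.0] * n
        for i in range(1, n):
            m = a[i] / bp[i - 1]
            bp[i] = b[i] - m * c[i - 1]
            cp[i] = m
        return bp, cp

    def solve_tridiag(factors, c, d):
        """Cheap per-source solve reusing the shared factorization."""
        bp, cp = factors
        n = len(d)
        y = list(d)
        for i in range(1, n):          # forward substitution
            y[i] -= cp[i] * y[i - 1]
        x = [0.0] * n
        x[-1] = y[-1] / bp[-1]
        for i in range(n - 2, -1, -1):  # back substitution
            x[i] = (y[i] - c[i] * x[i + 1]) / bp[i]
        return x

    # Factor once, then solve for every source (right-hand side) in turn;
    # the per-source solves are independent and can run in parallel.
    a, b, c = [0.0, -1.0, -1.0], [2.0, 2.0, 2.0], [-1.0, -1.0, 0.0]
    factors = factor_tridiag(a, b, c)
    sources = [[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]]
    fields = [solve_tridiag(factors, c, s) for s in sources]
    ```

    The factorization cost is paid once per head model, while each additional source or modulation frequency adds only a linear-time solve, which is why per-source timing (∼0.25 s/source in the abstract) is the natural metric.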

  5. Design and Optimization of Large Accelerator Systems through High-Fidelity Electromagnetic Simulations

    International Nuclear Information System (INIS)

    Ng, Cho; Akcelik, Volkan; Candel, Arno; Chen, Sheng; Ge, Lixin; Kabel, Andreas; Lee, Lie-Quan; Li, Zenghai; Prudencio, Ernesto; Schussman, Greg; Uplenchwar, Ravi; Xiao, Liling; Ko, Kwok; Austin, T.; Cary, J.R.; Ovtchinnikov, S.; Smith, D.N.; Werner, G.R.; Bellantoni, L.; TechX Corp.; Fermilab

    2008-01-01

    SciDAC1, with its support for the 'Advanced Computing for 21st Century Accelerator Science and Technology' (AST) project, witnessed dramatic advances in electromagnetic (EM) simulations for the design and optimization of important accelerators across the Office of Science. In SciDAC2, EM simulations continue to play an important role in the 'Community Petascale Project for Accelerator Science and Simulation' (ComPASS), through close collaborations with SciDAC CETs/Institutes in computational science. Existing codes will be improved and new multi-physics tools will be developed to model large accelerator systems with unprecedented realism and high accuracy using computing resources at petascale. These tools aim at targeting the most challenging problems facing the ComPASS project. Supported by advances in computational science research, they have been successfully applied to the International Linear Collider (ILC) and the Large Hadron Collider (LHC) in High Energy Physics (HEP), the JLab 12-GeV Upgrade in Nuclear Physics (NP), as well as the Spallation Neutron Source (SNS) and the Linac Coherent Light Source (LCLS) in Basic Energy Sciences (BES)

  7. 7 CFR 25.202 - Strategic plan.

    Science.gov (United States)

    2010-01-01

    ... contributed to the planning process; (3) Identify the amount of State, local, and private resources that will... 7 Agriculture 1 2010-01-01 2010-01-01 false Strategic plan. 25.202 Section 25.202 Agriculture... Procedure § 25.202 Strategic plan. (a) Principles of strategic plan. The strategic plan included in the...

  8. Burnup calculations for KIPT accelerator driven subcritical facility using Monte Carlo computer codes-MCB and MCNPX

    International Nuclear Information System (INIS)

    Gohar, Y.; Zhong, Z.; Talamo, A.

    2009-01-01

    Argonne National Laboratory (ANL) of USA and Kharkov Institute of Physics and Technology (KIPT) of Ukraine have been collaborating on the conceptual design development of an electron accelerator driven subcritical (ADS) facility, using the KIPT electron accelerator. The neutron source of the subcritical assembly is generated from the interaction of a 100 kW electron beam with a natural uranium target. The electron beam has a uniform spatial distribution and electron energy in the range of 100 to 200 MeV. The main functions of the subcritical assembly are the production of medical isotopes and the support of the Ukraine nuclear power industry. Neutron physics experiments and material structure analyses are planned using this facility. With 100 kW of electron beam power, the total thermal power of the facility is ∼375 kW, including the fission power of ∼260 kW. The burnup of the fissile materials and the buildup of fission products continuously reduce the reactivity during operation, which lowers the neutron flux level and consequently the facility performance. To preserve the neutron flux level during operation, fuel assemblies should be added after long operating periods to compensate for the lost reactivity. This process requires accurate prediction of the fuel burnup, the decay behavior of the fission products, and the reactivity introduced by adding fresh fuel assemblies. The recent developments of Monte Carlo computer codes, the high speed of computer processors, and parallel computation techniques have made it possible to perform detailed three-dimensional burnup simulations. A fully detailed three-dimensional geometrical model is used for the burnup simulations, with continuous-energy nuclear data libraries for the transport calculations and 63-multigroup or one-group cross-section libraries for the depletion calculations. The Monte Carlo computer codes MCNPX and MCB are utilized for this study. MCNPX transports the electrons and the
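The bookkeeping that a depletion code performs can be illustrated, in drastically simplified form, by a two-nuclide chain integrated at constant flux: the fissile inventory burns away while a fission product builds up and decays. All rates, step sizes and names below are illustrative placeholders, not MCB/MCNPX data, and real codes solve thousands of coupled nuclides with far better integrators.

```python
import math

# Toy depletion chain: fissile nuclide n1 burns at rate sigma_phi,
# feeding a fission product n2 that decays with constant lam.
sigma_phi = 1.0e-9        # effective reaction rate per nuclide, 1/s (assumed)
lam = 3.0e-8              # fission-product decay constant, 1/s (assumed)
dt, steps = 3600.0, 24 * 365   # one-hour explicit-Euler steps over one year

n1, n2 = 1.0, 0.0         # normalized inventories
for _ in range(steps):
    burn = sigma_phi * n1 * dt
    n1 -= burn                      # fissile depletion
    n2 += burn - lam * n2 * dt      # fission-product buildup and decay

analytic_n1 = math.exp(-sigma_phi * dt * steps)   # closed form for n1
```

The loop reproduces the exponential depletion of n1 to high accuracy at this step size, which is the sanity check one would apply before trusting a multigroup depletion run.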

  9. Cloud Computing in KAUST Library: Beyond Remote Hosting

    KAUST Repository

    Yu, Yi

    2013-12-01

    Enterprise computing is the key strategic approach for KAUST to build its modern IT landscape. In this strategic direction and technical environment, the library tries to establish its technology base by catching new trends that help make the library more efficient and effective. This paper focuses on cloud computing development in the KAUST library, using real-world scenarios and first-hand experiences to describe what cloud computing means for the KAUST library. It addresses the difficulties the library met during the implementation process, how cloud computing affects the functional performance and work procedures of the library, how it impacts the style and model of the library’s technical services and systems administration, how it changes the relationships and cooperation among the involved players (the library, campus IT and vendors), and what the benefits and disadvantages are. The story of cloud computing at KAUST shares the knowledge and lessons that the KAUST library learnt during this development, and also points out the future direction of cloud computing at KAUST.

  10. Peaceful Development and Strategic Opportunity

    Institute of Scientific and Technical Information of China (English)

    Yang Yi

    2006-01-01

    The international strategic situation and environment China faces have changed dramatically since September 11. China has closely followed and adapted itself to the ever-changing situation, seized strategic opportunity, adjusted its global strategy, adhered to peaceful development and displayed diplomatic and strategic flexibility. These are manifested in the following four aspects:

  11. Equipartitioning in linear accelerators

    International Nuclear Information System (INIS)

    Jameson, R.A.

    1981-01-01

    Emittance growth has long been a concern in linear accelerators, as has the idea that some kind of energy balance, or equipartitioning, between the degrees of freedom would ameliorate the growth. M. Prome observed that the average transverse and longitudinal velocity spreads tend to equalize as current in the channel is increased, while the total energy in the system stays nearly constant. However, only recently have we shown that an equipartitioning requirement on a bunched injected beam can indeed produce remarkably small emittance growth. The simple set of equations leading to this condition is outlined below. At the same time, Hofmann, using powerful analytical and computational methods, has investigated collective instabilities in transported beams and has identified thresholds and regions in parameter space where instabilities occur. This is an important generalization. Work that he will present at this conference shows that the results are essentially the same in r-z coordinates for transport systems, and evidence is presented that shows transport-system boundaries to be quite accurate in computer simulations of accelerating systems also. Preliminary results are discussed of efforts to design accelerators that avoid parameter regions where emittance is affected by the instabilities identified by Hofmann. These efforts suggest that other mechanisms are present. The complicated behavior of the RFQ linac in this framework is also shown.
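The energy-balance condition the abstract refers to can be stated numerically: in a smooth-focusing picture the transverse and longitudinal "temperatures" scale roughly as emittance times focusing wavenumber, so equipartition corresponds to a unit ratio. The helper below is an illustrative sketch of that check, not Jameson's full set of equations, and the numbers are assumed.

```python
def equipartition_ratio(eps_t, k_t, eps_l, k_l):
    """Approximate T_t/T_l as (eps_t*k_t)/(eps_l*k_l); 1.0 means the
    bunch is equipartitioned (smooth-focusing approximation)."""
    return (eps_t * k_t) / (eps_l * k_l)

# illustrative rms emittances (m*rad) and focusing wavenumbers (1/m)
r = equipartition_ratio(eps_t=2.0e-7, k_t=5.0, eps_l=4.0e-7, k_l=2.5)
```

A designer would tune the focusing strengths until this ratio approaches one for the injected bunch.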

  12. Accelerated Profile HMM Searches.

    Directory of Open Access Journals (Sweden)

    Sean R Eddy

    2011-10-01

    Profile hidden Markov models (profile HMMs) and probabilistic inference methods have made important contributions to the theory of sequence database homology search. However, practical use of profile HMM methods has been hindered by the computational expense of existing software implementations. Here I describe an acceleration heuristic for profile HMMs, the "multiple segment Viterbi" (MSV) algorithm. The MSV algorithm computes an optimal sum of multiple ungapped local alignment segments using a striped vector-parallel approach previously described for fast Smith/Waterman alignment. MSV scores follow the same statistical distribution as gapped optimal local alignment scores, allowing rapid evaluation of significance of an MSV score and thus facilitating its use as a heuristic filter. I also describe a 20-fold acceleration of the standard profile HMM Forward/Backward algorithms using a method I call "sparse rescaling". These methods are assembled in a pipeline in which high-scoring MSV hits are passed on for reanalysis with the full HMM Forward/Backward algorithm. This accelerated pipeline is implemented in the freely available HMMER3 software package. Performance benchmarks show that the use of the heuristic MSV filter sacrifices negligible sensitivity compared to unaccelerated profile HMM searches. HMMER3 is substantially more sensitive and 100- to 1000-fold faster than HMMER2. HMMER3 is now about as fast as BLAST for protein searches.
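The core of the MSV idea, stripped of the profile HMM machinery and the striped SIMD vectorization, is scoring ungapped local segments. A scalar Kadane-style scan along each alignment diagonal, sketched below with a trivial match/mismatch score instead of profile scores, captures that inner computation; HMMER3 itself operates on profile scores packed into vector registers.

```python
def best_ungapped_segment(a, b, match=1, mismatch=-1):
    # Best-scoring single ungapped local segment between sequences a and b:
    # a maximum-subarray (Kadane) scan along every diagonal j - i = d.
    best = 0
    for d in range(-(len(a) - 1), len(b)):
        run = 0
        for i in range(len(a)):
            j = i + d
            if 0 <= j < len(b):
                run = max(0, run + (match if a[i] == b[j] else mismatch))
                best = max(best, run)
    return best
```

MSV generalizes this to an optimal *sum* of multiple such segments under the model's length distribution, which is what makes its scores statistically comparable to gapped local alignment scores.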

  13. Lua(Jit) for computing accelerator beam physics

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    As mentioned in the 2nd developers meeting, I would like to open the debate with a special presentation on another language - Lua, and a tremendous technology - LuaJit. Lua is much less known at CERN, but it is very simple, much smaller than Python and its JIT is extremely performant. The language is a dynamic scripting language easy to learn and easy to embedded in applications. I will show how we use it in HPC for accelerator beam physics as a replacement for C, C++, Fortran and Python, with some benchmarks versus Python, PyPy4 and C/C++.

  14. Computational Science and Innovation

    International Nuclear Information System (INIS)

    Dean, David Jarvis

    2011-01-01

    Simulations - utilizing computers to solve complicated science and engineering problems - are a key ingredient of modern science. The U.S. Department of Energy (DOE) is a world leader in the development of high-performance computing (HPC), the development of applied math and algorithms that utilize the full potential of HPC platforms, and the application of computing to science and engineering problems. An interesting general question is whether the DOE can strategically utilize its capability in simulations to advance innovation more broadly. In this article, I will argue that this is certainly possible.

  15. Strategic Context of Project Portfolio Management

    Directory of Open Access Journals (Sweden)

    Nedka Nikolova

    2016-06-01

    In 2014 Bulgaria entered its second programming period (2014-2020), which opened a new stage in the development of project management in our country. Project-oriented companies are entering a new stage in which, based on experience and increased project capacity, they will develop their potential and accelerate growth. This poses new challenges for science and business to identify strategic opportunities and to formulate project objectives, programs and portfolios of projects that will increase the competitive potential of companies and the economy as a whole. This article is an expression of the shared responsibility of science to advance the research front by solving methodologically difficult and practically new tasks derived from the need to increase the competitive potential of business through the project approach. The main objective of this study is, based on a systematization of the results of theoretical research and development of Project Portfolio Management methodology, to explore the opportunities for its application in Bulgarian industrial companies.

  16. Strategic management in company information centre

    International Nuclear Information System (INIS)

    Judita Kopacikova, J.

    2004-01-01

    The presentation deals with the necessity of strategic management in libraries and information centres, and with the process of creating, implementing and monitoring strategic objectives and plans. It analyzes two levels of strategic management: information support of strategic management towards the superior body, provider and top management of the enterprise or organisation, and strategic management of the information workplace itself. It also touches on the problems of the so-called functional strategies: personnel, technical provision and marketing. The current political, economic, social and, for librarians and information workers, even information environments are subject to continual change, and they determine how organisations, institutions, enterprises and libraries can compete successfully. Changes typical of the current period will continue; consequently we must try to keep them under control, respond to them flexibly, be ready for them, and expect and predict them. To manage them we have modern management tools and methods at our disposal: strategic management, TQM, knowledge management, management of human resources, etc. The increasing intensity and range of change in the surrounding environment raises the demand for strategic management and strategy. The higher the uncertainty of future development and the more solution alternatives there are, the more important strategic thinking and strategic action become. In strategic management, strategic thinking is the precondition of success, and the target is increased effectiveness, performance and quality of products and services. The final outcome is a satisfied customer, reader and user, and the purpose is long-term success of the activity or business. (author)

  18. Using Model to Plan of Strategic Objectives

    OpenAIRE

    Terezie Bartusková; Jitka Baňařová; Zuzana Kusněřová

    2012-01-01

    The importance of strategic planning is unquestionable. However, the practical implementation of a strategic plan faces many obstacles. The aim of the article is to explain the importance of strategic planning, to find out how companies in the Moravian-Silesian Region deal with strategic planning, and to introduce a model that helps to set strategic goals in the area of financial indicators. This model should be part of the whole process of strategic planning and can be used to predict the future value...

  19. Accelerated H-LBP-based edge extraction method for digital radiography

    Energy Technology Data Exchange (ETDEWEB)

    Qiao, Shuang; Zhao, Chen-yi; Huang, Ji-peng [School of Physics, Northeast Normal University, Changchun 130024 (China); Sun, Jia-ning, E-mail: sunjn118@nenu.edu.cn [School of Mathematics and Statistics, Northeast Normal University, Changchun 130024 (China)

    2015-01-11

    With the goal of achieving real-time and efficient edge extraction for digital radiography, an accelerated H-LBP-based edge extraction method (AH-LBP) is presented in this paper by improving the existing framework of local binary pattern with the H function (H-LBP). Since the proposed method avoids computationally expensive operations with no loss of quality, it has much lower computational complexity than H-LBP. Experimental results on real radiographs show the desirable performance of our method. - Highlights: • An accelerated H-LBP method for edge extraction in digital radiography is proposed. • The novel AH-LBP relies on numerical analysis of the existing H-LBP method. • To accelerate it, H-LBP is reformulated as direct binary processing. • AH-LBP provides the same edge extraction result as H-LBP does. • AH-LBP has low computational complexity, satisfying real-time requirements.
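The local-binary-pattern operator at the heart of the method can be sketched in a few lines: each interior pixel receives an 8-bit code from comparisons against its neighbours, and flat regions yield a uniform code while edges break it. This is a generic LBP, not the paper's specific H-function variant or its accelerated reformulation; names and the threshold convention are assumptions.

```python
import numpy as np

def lbp8(img):
    # Plain 8-neighbour local binary pattern over interior pixels;
    # a bit is set when the neighbour is >= the centre pixel.
    h, w = img.shape
    c = img[1:-1, 1:-1]
    code = np.zeros(c.shape, dtype=np.int32)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        code += (nb >= c).astype(np.int32) << bit
    return code
```

On a constant image every code is 255 (all neighbours compare equal or greater), while pixels adjacent to an intensity step receive mixed codes, which is what an LBP-based edge extractor thresholds.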

  20. Strategic versus financial investors: The role of strategic objectives in financial contracting

    NARCIS (Netherlands)

    Arping, S.; Falconieri, S.

    2009-01-01

    Strategic investors, such as corporate venture capitalists, engage in the financing of start-up firms to complement their core businesses and to facilitate the internalization of externalities. We argue that while strategic objectives make it more worthwhile for an investor to elicit high

  1. 2016 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, Jim [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-01-01

    The Argonne Leadership Computing Facility (ALCF) helps researchers solve some of the world’s largest and most complex problems, while also advancing the nation’s efforts to develop future exascale computing systems. This report presents some of the ALCF’s notable achievements in key strategic areas over the past year.

  2. Strategic Management of Large Projects

    Institute of Scientific and Technical Information of China (English)

    WangYingluo; LiuYi; LiYuan

    2004-01-01

    The strategic management of large projects is both theoretically and practically important. Some scholars have advanced flexible strategy theory in China. The difference of strategic flexibility and flexible strategy is pointed out. The supporting system and characteristics of flexible strategy are analyzed. The changes of flexible strategy and integration of strategic management are discussed.

  3. Applying Strategic Visualization(Registered Trademark) to Lunar and Planetary Mission Design

    Science.gov (United States)

    Frassanito, John R.; Cooke, D. R.

    2002-01-01

    NASA teams, such as the NASA Exploration Team (NEXT), utilize advanced computational visualization processes to develop mission designs and architectures for lunar and planetary missions. One such process, Strategic Visualization (trademark), is a tool used extensively to help mission designers visualize various design alternatives and present them to other participants of their team. The participants, which may include NASA, industry, and the academic community, are distributed within a virtual network. Consequently, computer animation and other digital techniques provide an efficient means to communicate top-level technical information among team members. Today, Strategic Visualization (trademark) is used extensively both in the mission design process within the technical community and to communicate the value of space exploration to the general public. Movies and digital images have been generated and shown on nationally broadcast television and the Internet, as well as in magazines and digital media. Our presentation will show excerpts of a computer-generated animation depicting the reference Earth/Moon L1 Libration Point Gateway architecture. The Gateway serves as a staging corridor for human expeditions to the lunar poles and other surface locations. Also shown are crew transfer systems and current reference lunar excursion vehicles, as well as the human and robotic construction of an inflatable telescope array for deployment to the Sun/Earth Libration Point.

  4. Trends in accelerator control systems

    International Nuclear Information System (INIS)

    Crowley-Milling, M.C.

    1984-04-01

    Over the years, we have seen a revolution in control systems that has followed the ever decreasing cost of computer power and memory. It started with data gathering, when people distrusted the computer to perform control actions correctly, through the stage of using a computer system to provide a convenient remote look-and-adjust facility, to the present day, when more and more emphasis is being placed on using a computer system to simulate or model all or parts of the accelerator, feed in the required performance, and call on the computers to set the various parameters and then measure the actual performance, with iteration if necessary. The progress that has been made in the fields of architecture, communications, computers, interfaces, software design and operator interfaces is reviewed.

  5. Achievements in ISICs/SAPP collaborations for electromagnetic modeling of accelerators

    International Nuclear Information System (INIS)

    Lee Liequan; Ge Lixin; Li Zenghai; Ng, Cho; Schussman, Greg; Ko, Kwok

    2005-01-01

    SciDAC provides the unique opportunity and the resources for the Electromagnetic System Simulations (ESS) component of High Energy Physics (HEP)'s Accelerator Science and Technology (AST) project to work with researchers in the Integrated Software Infrastructure Centres (ISICs) and Scientific Application Pilot Program (SAPP) to overcome challenging barriers in computer science and applied mathematics in order to perform the large-scale simulations required to support the ongoing R and D efforts on accelerators across the Office of Science. This paper presents the resultant achievements made under SciDAC in important areas of computational science relevant to electromagnetic modelling of accelerators which include nonlinear eigensolvers, shape optimization, adaptive mesh refinement, parallel meshing, and visualization

  6. Computation of Normal Conducting and Superconducting Linear Accelerator (LINAC) Availabilities

    International Nuclear Information System (INIS)

    Haire, M.J.

    2000-01-01

    A brief study was conducted to roughly estimate the availability of a superconducting (SC) linear accelerator (LINAC) as compared to a normal conducting (NC) one. Potentially, SC radio frequency cavities have substantial reserve capability, which allows them to compensate for failed cavities, thus increasing the availability of the overall LINAC. In the initial SC design, there is a klystron and associated equipment (e.g., power supply) for every cavity of an SC LINAC. On the other hand, a single klystron may service eight cavities in the NC LINAC. This study modeled that portion of the Spallation Neutron Source LINAC (between 200 and 1,000 MeV) that is initially proposed for conversion from NC to SC technology. Equipment common to both designs was not evaluated. Tabular fault-tree calculations and event-driven simulation (EDS) computer calculations were performed. The estimated gain in availability when using the SC option ranges from 3 to 13%, depending on the assumed equipment conditions and spatial separation requirements. The availability of an NC LINAC is estimated to be 83%. Tabular fault-tree calculations and computer EDS modeling gave the same 83% answer to within one-tenth of a percent for the NC case. Tabular fault-tree calculations of the availability of the SC LINAC (where a klystron and associated equipment drive a single cavity) give 97%, whereas EDS computer calculations give 96%, a disagreement of only 1%. This result may be somewhat fortuitous because of the limitations of tabular fault-tree calculations. For example, tabular fault-tree calculations cannot handle spatial effects (separation distance between failures), equipment network configurations, and some failure combinations. Various equipment configurations were examined with EDS computer modeling. When there is a klystron and associated equipment for every cavity, and adjacent-cavity failures can be tolerated, the SC availability was estimated to be 96%. SC availability decreased as
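The structural difference between the two designs can be sketched with elementary reliability arithmetic: in the NC layout each klystron feeds eight cavities in series (any klystron failure downs the linac), while the SC layout's reserve capability behaves like a k-out-of-n system that tolerates a few failed cavities. The component availability and counts below are assumed for illustration, not the study's actual data.

```python
from math import comb

a_klystron = 0.99      # assumed availability of one klystron + support gear
n_cavities = 96        # assumed cavity count in the modeled LINAC section

# NC-style: one klystron per 8 cavities; every group must be up (series).
nc_avail = a_klystron ** (n_cavities // 8)

# SC-style: one klystron per cavity; reserve capability tolerates up to
# k failed cavities before the beam specification is lost (k-out-of-n).
def k_out_of_n_avail(n, k, a):
    return sum(comb(n, i) * (1 - a) ** i * a ** (n - i) for i in range(k + 1))

sc_avail = k_out_of_n_avail(n_cavities, 4, a_klystron)
```

Even with many more klystrons, the SC configuration's tolerance of a few failures gives it the higher system availability, which is the qualitative effect the fault-tree and EDS models quantify.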

  7. IKONET: distributed accelerator and experiment control

    International Nuclear Information System (INIS)

    Koldewijn, P.

    1986-01-01

    IKONET is a network consisting of some 35 computers used to control the 500 MeV Medium Energy Amsterdam electron accelerator (MEA) and its various experiments. The control system is distributed over a whole variety of machines, which are combined in a transparent, centrally oriented network. The local hardware is switched and tuned via CAMAC by a series of mini-computers running a real-time multitask operating system. Larger systems provide central intelligence for the higher-level control layers. An image of the complete accelerator settings is maintained by central database administrators. Different operator facilities handle touch panels, multi-purpose knobs and graphical displays. The network provides remote login facilities and file servers. On the basis of the present layout, an overview is given of future developments for subsystems of the network. (Auth.)

  8. Being Strategic in HE Management

    Science.gov (United States)

    West, Andrew

    2008-01-01

    The call to be strategic--and with it the concept of strategic management--can bring to mind a wide range of definitions, and there is now a huge array of academic literature supporting the different schools of thought. At a basic level, however, strategic thinking is probably most simply about focusing on the whole, rather than the part. In…

  9. 2011 Computation Directorate Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, D L

    2012-04-11

    From its founding in 1952 until today, Lawrence Livermore National Laboratory (LLNL) has made significant strategic investments to develop high performance computing (HPC) and its application to national security and basic science. Now, 60 years later, the Computation Directorate and its myriad resources and capabilities have become a key enabler for LLNL programs and an integral part of the effort to support our nation's nuclear deterrent and, more broadly, national security. In addition, the technological innovation HPC makes possible is seen as vital to the nation's economic vitality. LLNL, along with other national laboratories, is working to make supercomputing capabilities and expertise available to industry to boost the nation's global competitiveness. LLNL is on the brink of an exciting milestone with the 2012 deployment of Sequoia, the National Nuclear Security Administration's (NNSA's) 20-petaFLOP/s resource that will apply uncertainty quantification to weapons science. Sequoia will bring LLNL's total computing power to more than 23 petaFLOP/s, all brought to bear on basic science and national security needs. The computing systems at LLNL provide game-changing capabilities. Sequoia and other next-generation platforms will enable predictive simulation in the coming decade and leverage industry trends, such as massively parallel and multicore processors, to run petascale applications. Efficient petascale computing necessitates refining accuracy in materials property data, improving models for known physical processes, identifying and then modeling missing physics, quantifying uncertainty, and enhancing the performance of complex models and algorithms in macroscale simulation codes. Nearly 15 years ago, NNSA's Accelerated Strategic Computing Initiative (ASCI), now called the Advanced Simulation and Computing (ASC) Program, was the critical element needed to shift from test-based confidence to science-based confidence.

  10. Status report on the accelerators operation

    International Nuclear Information System (INIS)

    Biri, S.; Kormany, Z.; Berzi, I.; Racz, R.

    2011-01-01

    During 2011 our particle accelerators operated as scheduled, safely and without major or long breakdowns. The utilization rates of the accelerators were similar to the preceding year. The cyclotron delivered 1735 hours and the 40-year-old 5 MeV Van de Graaff generator supplied more than 1900 hours. The 1 MeV Van de Graaff accelerator was also operated for several short basic physics experiments last year (84 hours), with requests for much more beamtime in 2012. The plasma and beam-on-target time at the ECR ion source was less than in previous years (322 hours) due to several time-consuming technical developments in this laboratory. The isotope separator, as an ion beam accelerator, was utilized for only a few hours in 2011, since the research and development in this lab focused on other fields. Nevertheless it is continuously available for research requiring special deposition techniques and for isotope tracing studies. After the importance and quality of our accelerators and staff were assessed and recognized, the title 'Strategic Research Infrastructure' was awarded to the Accelerator Center by the Hungarian authorities. To regulate access to the accelerators a new system was introduced. Beamtime (within the limits of capacity) is available to everyone with an equal chance, provided an acceptable scientific program is submitted together with the request. Users have to contact our Program Advisory Committee (PAC). Since last year, requests - both from external and local users - must be submitted by filling out an online form available on the homepage of the institute. In the following sub-chapters the 2011 operation and development details of the cyclotron, VdG-5 and ECR accelerators are summarized. Cyclotron operation. The operation of the cyclotron in 2011 was concentrated in the usual 9 months; January, July and August were reserved for maintenance and holidays. The overall working time of the accelerator was 2603 hours; the time used for systematic

  11. Strategic management process in hospitals.

    Science.gov (United States)

    Zovko, V

    2001-01-01

    Strategic management is concerned with strategic choices and strategic implementation; it provides the means by which organizations meet their objectives. In the case of hospitals it helps executives and all employees to understand the real purpose and long term goals of the hospital. Also, it helps the hospital find its place in the health care service provision chain, and enables the hospital to coordinate its activities with other organizations in the health care system. Strategic management is a tool, rather than a solution, that helps executives to identify root causes of major problems in the hospital.

  12. Numerical Nudging: Using an Accelerating Score to Enhance Performance.

    Science.gov (United States)

    Shen, Luxi; Hsee, Christopher K

    2017-08-01

    People often encounter inherently meaningless numbers, such as scores in health apps or video games, that increase as they take actions. This research explored how the pattern of change in such numbers influences performance. We found that the key factor is acceleration-namely, whether the number increases at an increasing velocity. Six experiments in both the lab and the field showed that people performed better on an ongoing task if they were presented with a number that increased at an increasing velocity than if they were not presented with such a number or if they were presented with a number that increased at a decreasing or constant velocity. This acceleration effect occurred regardless of the absolute magnitude or the absolute velocity of the number, and even when the number was not tied to any specific rewards. This research shows the potential of numerical nudging-using inherently meaningless numbers to strategically alter behaviors-and is especially relevant in the present age of digital devices.
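
    The score patterns compared in these experiments can be sketched numerically. The following is a minimal illustration (hypothetical, not the authors' actual stimuli): the same total score is delivered with an accelerating, constant, or decelerating velocity, normalized so that only the pattern of change differs.

    ```python
    # Hypothetical illustration of the score patterns the paper compares:
    # the same total score awarded with accelerating, constant, or
    # decelerating velocity over n steps.

    def score_sequence(n_steps, total, pattern):
        """Return cumulative scores after each step for a given velocity pattern."""
        if pattern == "accelerating":      # per-step gain grows: 1, 2, 3, ...
            gains = [k + 1 for k in range(n_steps)]
        elif pattern == "constant":        # identical gain every step
            gains = [1] * n_steps
        elif pattern == "decelerating":    # per-step gain shrinks: n, n-1, ...
            gains = list(range(n_steps, 0, -1))
        else:
            raise ValueError(pattern)
        scale = total / sum(gains)         # normalize so every pattern ends at `total`
        scores, running = [], 0.0
        for g in gains:
            running += g * scale
            scores.append(round(running, 2))
        return scores

    # All three sequences end at 100; only the velocity profile differs.
    accelerating = score_sequence(5, 100, "accelerating")
    constant = score_sequence(5, 100, "constant")
    ```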

  13. Value oriented strategic marketing

    Directory of Open Access Journals (Sweden)

    Milisavljević Momčilo

    2013-01-01

    Full Text Available Changes in today's business environment require companies to orient themselves to strategic marketing. A company that accepts strategic marketing takes a proactive approach and focuses on continuous review and reappraisal of existing strategic business areas and the search for new ones. Difficulties in achieving target profit and growth require turning marketing from the dominant viewpoint of the tangible product to creating superior value and developing relationships with customers. Value orientation implies gaining competitive advantage through continuous research and understanding of what value represents to consumers and discovering new ways to meet their required values. Strategic marketing requires that investment in the creation of value be regularly reviewed in order to ensure a focus on customers with high profit potential and environmental value. This increases customer satisfaction and retention, and the long-term return on investment of companies.

  14. Acceleration of Meshfree Radial Point Interpolation Method on Graphics Hardware

    International Nuclear Information System (INIS)

    Nakata, Susumu

    2008-01-01

    This article describes a parallel computational technique for accelerating the radial point interpolation method (RPIM), a meshfree method, using graphics hardware. RPIM is one of the meshfree partial differential equation solvers that do not require a mesh structure for the analysis targets. In the presented technique, the computation process is divided into small processes suitable for the parallel architecture of the graphics hardware, executed in a single-instruction, multiple-data manner.

  15. Strategic analysis of PKM Duda SA. on Polish meat market with the application of BCG growth-share matrix

    Directory of Open Access Journals (Sweden)

    Anna Zielińska-Chmielewska

    2012-01-01

    Full Text Available The main goal of this paper was to examine the market position of one leading meat-processing enterprise, PKM Duda SA., on the domestic meat market. The assessment of the activity portfolio of its three strategic units was undertaken with the use of the BCG matrix. PKM Duda SA. was chosen for the study because it: (a) processes more than 20 tons of slaughter per week, (b) is located in the country of origin, (c) is listed on the Warsaw Stock Exchange, (d) preserves continuity of its database in Monitor Polski „B”. The analysis proved that the three examined strategic units have different market shares and operate in markets growing at different rates. The meat-processing unit (B) brings the highest income rate, the slaughter unit (A) the lowest. The market position of PKM Duda SA. can be improved when the retail trade unit (B) moves away from question marks into stars. Although the BCG matrix gives a fast yet comprehensive picture of the strategic situation, it is not free from disadvantages. That is the reason why further analyses, including other portfolio methods, should be implemented.

  16. A new approach to developing and optimizing organization strategy based on stochastic quantitative model of strategic performance

    Directory of Open Access Journals (Sweden)

    Marko Hell

    2014-03-01

    Full Text Available This paper presents a highly formalized approach to strategy formulation and optimization of strategic performance through proper resource allocation. A stochastic quantitative model of strategic performance (SQMSP) is used to evaluate the efficiency of the strategy developed. The SQMSP follows the theoretical notions of the balanced scorecard (BSC) and strategy-map methodologies, initially developed by Kaplan and Norton. Parameters of the SQMSP are suggested to be random variables, evaluated by experts who give two-point (optimistic and pessimistic) and three-point (optimistic, most probable and pessimistic) evaluations. The Monte-Carlo method is used to simulate strategic performance. Having been implemented within a computer application and applied to solve a real problem (planning of an IT-strategy at the Faculty of Economics, University of Split), the proposed approach demonstrated its high potential as a basis for the development of decision-support tools related to strategic planning.
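
    The core mechanism described in the abstract (expert three-point estimates fed into a Monte-Carlo simulation) can be sketched as follows. This is an illustrative reconstruction, not the authors' SQMSP: each objective's contribution is modeled as a triangular random variable built from its (pessimistic, most probable, optimistic) estimates, and overall performance is a weighted sum. The objectives, weights and estimate values are invented.

    ```python
    import random

    # Illustrative sketch (not the authors' actual SQMSP): each strategic
    # objective gets an expert three-point estimate modeled as a triangular
    # random variable; overall strategic performance is a weighted sum,
    # estimated by Monte-Carlo simulation.

    def simulate_performance(objectives, n_trials=20_000, seed=1):
        """objectives: list of (weight, (pessimistic, most_probable, optimistic))."""
        rng = random.Random(seed)
        totals = []
        for _ in range(n_trials):
            total = sum(w * rng.triangular(lo, hi, mode)
                        for w, (lo, mode, hi) in objectives)
            totals.append(total)
        totals.sort()
        return {
            "mean": sum(totals) / n_trials,
            "p5": totals[int(0.05 * n_trials)],   # pessimistic tail
            "p95": totals[int(0.95 * n_trials)],  # optimistic tail
        }

    # Hypothetical IT-strategy objectives, scored by experts on a 0-100 scale.
    result = simulate_performance([
        (0.5, (40, 60, 90)),   # e.g. service quality
        (0.3, (20, 50, 70)),   # e.g. process efficiency
        (0.2, (30, 55, 95)),   # e.g. innovation capability
    ])
    # result["mean"] lands near the weighted sum of the triangular means
    # (roughly 58 for these invented inputs).
    ```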

  17. SOLVING BY PARALLEL COMPUTATION THE POISSON PROBLEM FOR HIGH INTENSITY BEAMS IN CIRCULAR ACCELERATORS

    International Nuclear Information System (INIS)

    LUCCIO, A.U.; DIMPERIO, N.L.; SAMULYAK, R.; BEEB-WANG, J.

    2001-01-01

    Simulation of high-intensity accelerators leads to the solution of the Poisson equation, to calculate space-charge forces in the presence of acceleration-chamber walls. We reduced the problem to "two-and-a-half" dimensions for long particle bunches, characteristic of large circular accelerators, and applied the results to the tracking code Orbit
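
    In such a "two-and-a-half-dimensional" space-charge model, each longitudinal slice of the bunch reduces to a 2D Poisson solve with the chamber wall as a grounded boundary. The sketch below is illustrative only (a simple Jacobi relaxation in Python, not the solver used in Orbit):

    ```python
    import numpy as np

    # Illustrative per-slice problem (not the Orbit implementation): solve
    # laplacian(phi) = -rho on a transverse grid with phi = 0 on the boundary,
    # representing a grounded conducting chamber wall.

    def solve_poisson_2d(rho, h, n_iter=5000):
        """Jacobi relaxation of laplacian(phi) = -rho with grid spacing h
        and Dirichlet phi = 0 on the boundary (conducting wall)."""
        phi = np.zeros_like(rho)
        for _ in range(n_iter):
            phi[1:-1, 1:-1] = 0.25 * (phi[2:, 1:-1] + phi[:-2, 1:-1] +
                                      phi[1:-1, 2:] + phi[1:-1, :-2] +
                                      h * h * rho[1:-1, 1:-1])
        return phi

    # Point-like charge slice at the grid centre.
    n = 33
    rho = np.zeros((n, n))
    rho[n // 2, n // 2] = 1.0
    phi = solve_poisson_2d(rho, h=1.0 / (n - 1))
    # The potential peaks at the charge and vanishes at the walls; the
    # transverse space-charge force is then -grad(phi) on each slice.
    ```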

  18. Jesus the Strategic Leader

    National Research Council Canada - National Science Library

    Martin, Gregg

    2000-01-01

    Jesus was a great strategic leader who changed the world in many ways. Close study of what he did and how he did it reveals a pattern of behavior that is extremely useful and relevant to the modern strategic leader...

  19. Software tools for the particle accelerator designs

    International Nuclear Information System (INIS)

    Sugimoto, Masayoshi

    1988-01-01

    The software tools used for the designs of the particle accelerators are going to be implemented on the small computer systems, such as the personal computers or the work stations. These are called from the interactive environment like a window application program. The environment contains the small expert system to make easy to select the design parameters. (author)

  20. Strategic Alliance Development - A Process Model A Case Study Integrating Elements of Strategic Alliances

    OpenAIRE

    Mohd Yunos, Mohd Bulkiah

    2007-01-01

    There has been an enormous increase in the formation of strategic alliances, and in the research effort devoted to understanding the alliance development process, over the last few decades. However, the critical elements that influence each stage of alliance development are as yet unexplored. This dissertation aims to fill this gap by introducing an integrated process model of strategic alliance development and its critical elements. The process model for strategic alliance developm...

  1. The paradox of strategic environmental assessment

    Energy Technology Data Exchange (ETDEWEB)

    Bidstrup, Morten, E-mail: bidstrup@plan.aau.dk; Hansen, Anne Merrild, E-mail: merrild@plan.aau.dk

    2014-07-01

    Strategic Environmental Assessment (SEA) is a tool that can facilitate sustainable development and improve decision-making by introducing environmental concern early in planning processes. However, various international studies conclude that current planning practice is not taking full advantage of the tool, and we therefore define the paradox of SEA as the methodological ambiguity of non-strategic SEA. This article explores causality through a three-step case study on aggregates extraction planning in Denmark, which consists of a document analysis, a questionnaire survey, and follow-up communication with key planners. Though the environmental reports largely lack strategic considerations, practitioners express an inherent will for strategy and reveal that their SEAs have in fact been an integrated part of the planning process. Institutional context is found to be the most significant barrier to strategy, and this suggests that non-strategic planning setups can prove more important than non-strategic planning in SEA practice. Planners may try to execute strategy within the confinements of SEA-restricted planning contexts; however, such efforts can be overlooked if evaluated by a narrow criterion for strategy formation. Consequently, the paradox may also spark from challenged documentation. These findings contribute to the common understanding of SEA quality; however, further research is needed on how to communicate and influence the strategic options which arguably remain inside non-strategic planning realities. - Highlights: • International studies conclude that SEAs are not strategic. = The paradox of SEA. • Even on the highest managerial level, some contexts do not leave room for strategy. • Non-strategic SEA can derive from challenged documentation. • Descriptive and emergent strategy formation can, in practice, be deemed non-strategic.

  2. The paradox of strategic environmental assessment

    International Nuclear Information System (INIS)

    Bidstrup, Morten; Hansen, Anne Merrild

    2014-01-01

    Strategic Environmental Assessment (SEA) is a tool that can facilitate sustainable development and improve decision-making by introducing environmental concern early in planning processes. However, various international studies conclude that current planning practice is not taking full advantage of the tool, and we therefore define the paradox of SEA as the methodological ambiguity of non-strategic SEA. This article explores causality through a three-step case study on aggregates extraction planning in Denmark, which consists of a document analysis, a questionnaire survey, and follow-up communication with key planners. Though the environmental reports largely lack strategic considerations, practitioners express an inherent will for strategy and reveal that their SEAs have in fact been an integrated part of the planning process. Institutional context is found to be the most significant barrier to strategy, and this suggests that non-strategic planning setups can prove more important than non-strategic planning in SEA practice. Planners may try to execute strategy within the confinements of SEA-restricted planning contexts; however, such efforts can be overlooked if evaluated by a narrow criterion for strategy formation. Consequently, the paradox may also spark from challenged documentation. These findings contribute to the common understanding of SEA quality; however, further research is needed on how to communicate and influence the strategic options which arguably remain inside non-strategic planning realities. - Highlights: • International studies conclude that SEAs are not strategic. = The paradox of SEA. • Even on the highest managerial level, some contexts do not leave room for strategy. • Non-strategic SEA can derive from challenged documentation. • Descriptive and emergent strategy formation can, in practice, be deemed non-strategic

  3. Strategic aspects of innovation management

    Directory of Open Access Journals (Sweden)

    Baruk Jerzy

    2017-12-01

    Full Text Available Innovations are regarded as the main factor in the development of organizations, regions and whole economies. In practice the innovativeness of economic entities is limited by many factors of internal and external origin. Among the internal factors are those associated with management itself: the attention of managers focused on current problems, and the limited use of modern methods of management, especially strategic management and innovation management. In this publication the emphasis is put on discussing the essence of a strategic approach to innovation management, and the essence of strategic innovations and their role in the development of organizations. Three model solutions are proposed; they facilitate: rationalization of decision-making processes for the selection of the strategy of innovative activity; making rational decisions with regard to the moments for the implementation of strategic and facilitating innovations; and making rational decisions based on the cycle of strategic innovation in the horizontal and vertical system. Thus, the goal of this publication is to propose a strategic approach to innovation management based not on an intuitive approach but on a rational one using the chosen model solutions.

  4. Crisis - Strategic Management in Public Relation

    OpenAIRE

    Saari Ahmad

    2012-01-01

    This is a concept paper exploring strategic management approaches in public relations during a crisis. The main objective of this article is to identify the most effective action plan for public relations. A review of the strategic management in public relations literature reveals that the relationship between strategic management and public relations is still vague. Four stages were identified in the process of establishing the action plan for public relations and eleven strategic action...

  5. Strategic Management in Times of Crisis

    OpenAIRE

    Groh, Maximilian

    2014-01-01

    The aim of this article is to identify unusual strategic-management matters in times of crisis. The research scope is strategic management processes, the characteristics of these processes and the methods of strategic crisis management. The study reports research on the contemporary state of strategic crisis-management problems and provides an analysis of some theoretical and methodological principles. The analysis includes a classification of the main problems which must be solved for efficient, ...

  6. STRATEGIC MANAGEMENT ACCOUNTING: DEFINITION AND TOOLS

    OpenAIRE

    Pylypiv, Nadiia; Pіatnychuk, Iryna

    2017-01-01

    The article is dedicated to examining the essence of the definition of “strategic management accounting” in domestic and foreign literature. Strategic management accounting tools have been studied, and the constraints that affect their choice identified. As a result of the study, the authors formed an understanding of strategic management accounting. The tools which are common to both traditional managerial accounting and strategic, and the specific tools necessary for efficient implementati...

  7. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
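
    The programming model described here, in which the framework owns the (parallel) simulation loop and the modeler only fills the bodies of pre-defined routines, can be illustrated with a toy sketch. The API below is hypothetical (Biocellion's real interface is a C++ framework), and the drift rule is an invented stand-in for a cell-sorting-style update:

    ```python
    # Hypothetical sketch of the "fill in pre-defined model routines" pattern
    # (not Biocellion's actual API): the framework drives the loop; the user
    # supplies only the per-agent update rule.

    class AgentModel:
        def update_agent(self, agent, neighbors):
            raise NotImplementedError  # model-specific body supplied by the user

    def run(model, agents, neighbor_fn, n_steps):
        """Framework-side driver. In a real framework each step would be
        distributed over parallel workers; here it is a plain serial loop
        with synchronous updates (all agents read the previous state)."""
        for _ in range(n_steps):
            agents = [model.update_agent(a, neighbor_fn(a, agents)) for a in agents]
        return agents

    # A toy cell-sorting-flavoured model: 1-D agents drift toward the mean
    # position of their neighbours (including themselves).
    class DriftModel(AgentModel):
        def update_agent(self, agent, neighbors):
            if not neighbors:
                return agent
            mean = sum(neighbors) / len(neighbors)
            return agent + 0.5 * (mean - agent)

    cells = [0.0, 1.0, 10.0, 11.0]
    final = run(DriftModel(), cells,
                lambda a, pop: [b for b in pop if abs(b - a) < 3], 20)
    # Nearby cells cluster: {0, 1} collapses toward 0.5, {10, 11} toward 10.5.
    ```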

  8. Multipactoring studies in accelerating structures

    International Nuclear Information System (INIS)

    Kravachuk, L.V.; Puntus, V.A.; Romanov, G.V.; Tarsov, S.G.

    1992-01-01

    A multipactor discharge takes place in the accelerating tanks of the Moscow meson factory linac. The RF power level, the place and the characteristics of the discharge were determined based on experimental results and the computer simulation. The results of the investigation are given. (Author) 5 refs

  9. Strategic Self-Ignorance

    DEFF Research Database (Denmark)

    Thunström, Linda; Nordström, Leif Jonas; Shogren, Jason F.

    We examine strategic self-ignorance—the use of ignorance as an excuse to overindulge in pleasurable activities that may be harmful to one’s future self. Our model shows that guilt aversion provides a behavioral rationale for present-biased agents to avoid information about negative future impacts of such activities. We then confront our model with data from an experiment using prepared, restaurant-style meals — a good that is transparent in immediate pleasure (taste) but non-transparent in future harm (calories). Our results support the notion that strategic self-ignorance matters: nearly three of five subjects (58 percent) chose to ignore free information on calorie content, leading at-risk subjects to consume significantly more calories. We also find evidence consistent with our model on the determinants of strategic self-ignorance.

  10. Strategic self-ignorance

    DEFF Research Database (Denmark)

    Thunström, Linda; Nordström, Leif Jonas; Shogren, Jason F.

    2016-01-01

    We examine strategic self-ignorance—the use of ignorance as an excuse to over-indulge in pleasurable activities that may be harmful to one’s future self. Our model shows that guilt aversion provides a behavioral rationale for present-biased agents to avoid information about negative future impacts of such activities. We then confront our model with data from an experiment using prepared, restaurant-style meals—a good that is transparent in immediate pleasure (taste) but non-transparent in future harm (calories). Our results support the notion that strategic self-ignorance matters: nearly three of five subjects (58%) chose to ignore free information on calorie content, leading at-risk subjects to consume significantly more calories. We also find evidence consistent with our model on the determinants of strategic self-ignorance.

  11. Acceleration techniques for the discrete ordinate method

    International Nuclear Information System (INIS)

    Efremenko, Dmitry; Doicu, Adrian; Loyola, Diego; Trautmann, Thomas

    2013-01-01

    In this paper we analyze several acceleration techniques for the discrete ordinate method with matrix exponential and the small-angle modification of the radiative transfer equation. These techniques include the left-eigenvector-matrix approach for computing the inverse of the right-eigenvector matrix, the telescoping technique, and the method of false discrete ordinate. The numerical simulations have shown that, on average, the relative speedups of the left-eigenvector-matrix approach and the telescoping technique are about 15% and 30%, respectively. -- Highlights: ► We presented the left eigenvector matrix approach. ► We analyzed the method of false discrete ordinate. ► The telescoping technique is applied for the matrix operator method. ► The considered techniques accelerate the computations by 20% on average.
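
    The left-eigenvector idea rests on a standard linear-algebra identity: if V holds the right eigenvectors of a diagonalizable matrix A (as columns) and W the left ones (eigenvectors of A.T), then W.T @ V is diagonal, so inverting V reduces to a cheap diagonal rescaling of W.T instead of a full matrix inversion. A small numerical check (illustrative only, not the authors' radiative-transfer code):

    ```python
    import numpy as np

    # Build a matrix with a known, well-separated spectrum so the two
    # eigendecompositions can be aligned by sorting eigenvalues.
    rng = np.random.default_rng(0)
    B = rng.standard_normal((5, 5))
    A = B @ np.diag([1.0, 2.0, 3.0, 4.0, 5.0]) @ np.linalg.inv(B)

    lam_r, V = np.linalg.eig(A)     # right eigenvectors (columns of V)
    lam_l, W = np.linalg.eig(A.T)   # left eigenvectors (columns of W)
    # Align the two decompositions by sorting eigenvalues consistently.
    V = V[:, np.argsort(lam_r.real)]
    W = W[:, np.argsort(lam_l.real)]

    D = W.T @ V                     # diagonal up to round-off (biorthogonality)
    V_inv = np.diag(1.0 / np.diag(D)) @ W.T   # inverse of V via diagonal scaling
    ```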

  12. STRATEGIC BUSINESS UNIT – THE CENTRAL ELEMENT OF THE BUSINESS PORTFOLIO STRATEGIC PLANNING PROCESS

    OpenAIRE

    FLORIN TUDOR IONESCU

    2011-01-01

    Over time, due to changes in the marketing environment generated by tightening competition and technological, social and political pressures, companies have adopted a new approach by which potential businesses began to be treated as strategic business units. A strategic business unit can be considered a part of a company, a product line within a division, and sometimes a single product or brand. From a strategic perspective, the diversified companies represent a collection of busine...

  13. Ice-sheet modelling accelerated by graphics cards

    Science.gov (United States)

    Brædstrup, Christian Fredborg; Damsgaard, Anders; Egholm, David Lundbek

    2014-11-01

    Studies of glaciers and ice sheets have increased the demand for high performance numerical ice flow models over the past decades. When exploring the highly non-linear dynamics of fast flowing glaciers and ice streams, or when coupling multiple flow processes for ice, water, and sediment, researchers are often forced to use super-computing clusters. As an alternative to conventional high-performance computing hardware, the Graphical Processing Unit (GPU) is capable of massively parallel computing while retaining a compact design and low cost. In this study, we present a strategy for accelerating a higher-order ice flow model using a GPU. By applying the newest GPU hardware, we achieve up to 180× speedup compared to a similar but serial CPU implementation. Our results suggest that GPU acceleration is a competitive option for ice-flow modelling when compared to CPU-optimised algorithms parallelised by the OpenMP or Message Passing Interface (MPI) protocols.

  14. Strategic R&D transactions in personalized drug development.

    Science.gov (United States)

    Makino, Tomohiro; Lim, Yeongjoo; Kodama, Kota

    2018-03-21

    Although external collaboration capability influences the development of personalized medicine, key transactions in the pharmaceutical industry have not been addressed. To explore specific trends in interorganizational transactions and key players, we longitudinally surveyed strategic transactions, comparing them with other advanced medical developments, such as antibody therapy, as controls. We found that the financing deals of start-ups have surged over the past decade, accelerating intellectual property (IP) creation. Our correlation and regression analyses identified determinants of financing deals among alliance deals, acquisition deals, patents, research and development (R&D) licenses, market licenses, and scientific papers. They showed that patents positively correlated with transactions, and that the number of R&D licenses significantly predicted financing deals. This indicates, for the first time, that start-ups and investors lead progress in personalized medicine. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Is strategic stockpiling essential?

    International Nuclear Information System (INIS)

    Anon.

    2007-01-01

    As mentioned by the European Commission, a consultant has surveyed stakeholders on the concept of setting up strategic stockpiles of natural gas, namely to boost the security of Europe's supply, much like the strategic stockpiling for petroleum products the OECD member countries carried out after the petroleum crisis. If strategic stockpiling consists in blocking off a quantity of gas in addition to the usable stockpile, the AFG believes it is necessary to assess the implications of such a measure and to examine the security gain it would actually offer compared to the measures that have already been implemented to secure supplies. (author)

  16. Toward real-time diffuse optical tomography: accelerating light propagation modeling employing parallel computing on GPU and CPU.

    Science.gov (United States)

    Doulgerakis, Matthaios; Eggebrecht, Adam; Wojtkiewicz, Stanislaw; Culver, Joseph; Dehghani, Hamid

    2017-12-01

    Parameter recovery in diffuse optical tomography is a computationally expensive algorithm, especially when used for large and complex volumes, as in the case of human brain functional imaging. The modeling of light propagation, also known as the forward problem, is the computational bottleneck of the recovery algorithm, whereby the lack of a real-time solution is impeding practical and clinical applications. The objective of this work is the acceleration of the forward model, within a diffusion approximation-based finite-element modeling framework, employing parallelization to expedite the calculation of light propagation in realistic adult head models. The proposed methodology is applicable for modeling both continuous wave and frequency-domain systems with the results demonstrating a 10-fold speed increase when GPU architectures are available, while maintaining high accuracy. It is shown that, for a very high-resolution finite-element model of the adult human head with ∼600,000 nodes, consisting of heterogeneous layers, light propagation can be calculated at ∼0.25 s/excitation source. (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  17. Transverse particle acceleration techniques using lasers and masers

    International Nuclear Information System (INIS)

    Schoen, N.C.

    1983-01-01

    The concept discussed herein uses an intense traveling electromagnetic wave, produced by a laser or maser source, to accelerate electrons in the Rayleigh region of a focused beam. Although the possibility of non-synchronous acceleration has been considered, very little analysis of potential device configurations has been reported. Computer simulations of the acceleration process indicate practical figure-of-merit values in the range of 100 MeV/m for electric field strengths achievable with current technology. The development of compact, high-energy electron accelerators will provide an essential component for many new technologies, such as high-power free electron lasers, X-ray and VUV sources, and high-power millimeter and microwave devices. Considerable effort has been directed toward studies of new concepts for electron acceleration, including inverse free electron lasers, GYRACs, and modified betatrons

  18. S sub(N) transport and diffusion acceleration

    International Nuclear Information System (INIS)

    Gho, C.J.

    1986-01-01

    After a brief description of the characteristics and history of the S sub(N) transport method and the present state of development of transport codes, the technique of diffusion acceleration and the characteristics of its implementation in the BISTRO computer code are presented. It is shown, using some simple calculations that allow the performance of the BISTRO code to be compared with different versions of the DOT code, that the chosen discretization of the algorithms leads to strong results. (M.C.K.) [pt

  19. Factors Influencing the Adoption of Cloud Computing by Decision Making Managers

    Science.gov (United States)

    Ross, Virginia Watson

    2010-01-01

    Cloud computing is a growing field, addressing the market need for access to computing resources to meet organizational computing requirements. The purpose of this research is to evaluate the factors that influence an organization in their decision whether to adopt cloud computing as a part of their strategic information technology planning.…

  20. KEYNOTE: Simulation, computation, and the Global Nuclear Energy Partnership

    Science.gov (United States)

    Reis, Victor, Dr.

    2006-01-01

    Dr. Victor Reis delivered the keynote talk at the closing session of the conference. The talk was forward looking and focused on the importance of advanced computing for large-scale nuclear energy goals such as Global Nuclear Energy Partnership (GNEP). Dr. Reis discussed the important connections of GNEP to the Scientific Discovery through Advanced Computing (SciDAC) program and the SciDAC research portfolio. In the context of GNEP, Dr. Reis talked about possible fuel leasing configurations, strategies for their implementation, and typical fuel cycle flow sheets. A major portion of the talk addressed lessons learnt from ‘Science Based Stockpile Stewardship’ and the Accelerated Strategic Computing Initiative (ASCI) initiative and how they can provide guidance for advancing GNEP and SciDAC goals. Dr. Reis’s colorful and informative presentation included international proverbs, quotes and comments, in tune with the international flavor that is part of the GNEP philosophy and plan. He concluded with a positive and motivating outlook for peaceful nuclear energy and its potential to solve global problems. An interview with Dr. Reis, addressing some of the above issues, is the cover story of Issue 2 of the SciDAC Review and available at http://www.scidacreview.org This summary of Dr. Reis’s PowerPoint presentation was prepared by Institute of Physics Publishing, the complete PowerPoint version of Dr. Reis’s talk at SciDAC 2006 is given as a multimedia attachment to this summary.

  1. Sensation seeking in a community sample of French gamblers: Comparison between strategic and non-strategic gamblers.

    Science.gov (United States)

    Bonnaire, Céline; Bungener, Catherine; Varescon, Isabelle

    2017-04-01

    The purpose of this research is to examine the relationship between sensation seeking and gambling disorder (GD) in a community sample of gamblers (when controlling for the effect of substance use, gender and age) and to see whether sensation seeking scores depend on the gambling activity when comparing strategic and non-strategic gamblers. A total of 380 gamblers was recruited. First, pathological gamblers (PGs) (n = 143) were compared to non-pathological gamblers (NPGs) (n = 237). Second, strategic gamblers (n = 93) were compared to non-strategic gamblers (n = 110). Sociodemographic data, gambling behavior (SOGS, DSM-IV), tobacco and alcohol use (CAGE), and sensation seeking (SSS) were evaluated. PGs have higher boredom susceptibility scores than NPGs, and this factor is associated with GD. Nevertheless, the relationship between sensation seeking and GD depends on the gambling activity. In fact, sensation seeking is associated with GD in strategic gamblers only. PGs playing strategic games display different profiles from non-strategic PGs. Thus, factors associated with GD differ when the gambling activity is taken into account. These findings are consistent with the idea that it is essential to identify clinically distinct subgroups of PGs in the treatment of GD. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  2. Inverse Free Electron Laser accelerator

    International Nuclear Information System (INIS)

    Fisher, A.; Gallardo, J.; van Steenbergen, A.; Sandweiss, J.

    1992-09-01

    The study of the INVERSE FREE ELECTRON LASER, as a potential mode of electron acceleration, is being pursued at Brookhaven National Laboratory. Recent studies have focused on the development of a low-energy, high-gradient, multi-stage linear accelerator. The elementary ingredients for the IFEL interaction are the 50 MeV Linac e⁻ beam and the 10¹¹ Watt CO₂ laser beam of BNL's Accelerator Test Facility (ATF), Center for Accelerator Physics (CAP), and a wiggler. The latter element is designed as a fast-excitation unit making use of alternating stacks of Vanadium Permendur (VaP) ferromagnetic laminations, periodically interspersed with conductive, nonmagnetic laminations, which act as eddy-current-induced field reflectors. Wiggler parameters and field-distribution data will be presented for a prototype wiggler in a constant-period and in a ∼ 1.5 %/cm tapered-period configuration. The CO₂ laser beam will be transported through the IFEL interaction region by means of a low-loss, dielectric-coated, rectangular waveguide. Short waveguide test sections have been constructed and have been tested using a low-power cw CO₂ laser. Preliminary results of guide attenuation and mode selectivity will be given, together with a discussion of the optical issues for the IFEL accelerator. The IFEL design is supported by the development and use of 1D and 3D simulation programs. The results of simulation computations, including also wiggler errors, for a single-module accelerator and for a multi-module accelerator will be presented

  3. The analysis of strategic planning in transport

    OpenAIRE

    Išoraitė, Margarita

    2006-01-01

    Strategic planning is a process which brings to life the mission and vision of an enterprise. The article analyses the following issues: 1. Concepts of strategy. 2. Components of strategic planning. 3. The basis of strategic planning. 4. Formal strategic planning. 5. Tools used in strategy development. 6. Problems of strategic planning. Strategic planning is a process through which the goals of an enterprise are implemented. This article examines: the concept of strategy; strategic planning compo...

  4. OpenCL-Based FPGA Accelerator for 3D FDTD with Periodic and Absorbing Boundary Conditions

    Directory of Open Access Journals (Sweden)

    Hasitha Muthumala Waidyasooriya

    2017-01-01

    Full Text Available The finite-difference time-domain (FDTD) method is a very popular way of numerically solving partial differential equations. FDTD has a low operational intensity, so performance on CPUs and GPUs is often restricted by the memory bandwidth. Recently, deeply pipelined FPGA accelerators have shown a lot of success by exploiting streaming data flows in FDTD computation. In spite of this success, many FPGA accelerators are not suitable for real-world applications that contain complex boundary conditions. Boundary conditions break the regularity of the data flow, so performance is significantly reduced. This paper proposes an FPGA accelerator that computes the absorbing and periodic boundary conditions commonly used in many 3D FDTD applications. The accelerator is designed using a “C-like” programming language called OpenCL (open computing language). As a result, the proposed accelerator can be customized easily by changing the software code. According to the experimental results, we achieved over 3.3 times and 1.5 times higher processing speed compared to CPUs and GPUs, respectively. Moreover, the proposed accelerator is more than 14 times faster than recently proposed FPGA accelerators that are capable of handling complex boundary conditions.
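The regular FDTD update stencil, and the boundary handling that breaks its streaming regularity, can be sketched in software. The following is a minimal 1D illustration with a periodic boundary; the field names, grid size and Courant factor are illustrative and not taken from the paper, which targets full 3D grids with absorbing boundaries as well.

```python
# Minimal 1D FDTD (Yee leapfrog) sketch with a periodic boundary.
# All parameters are illustrative placeholders.
N = 200
ez = [0.0] * N   # electric field samples
hy = [0.0] * N   # magnetic field samples
c = 0.5          # normalized Courant factor (stable for c <= 1)

ez[N // 2] = 1.0  # initial impulse in the middle of the grid

for step in range(100):
    # H update: regular stencil; the (i + 1) % N wrap is the periodic boundary
    for i in range(N):
        hy[i] += c * (ez[(i + 1) % N] - ez[i])
    # E update: periodic wrap at the left edge
    for i in range(N):
        ez[i] += c * (hy[i] - hy[(i - 1) % N])
```

The `% N` index wrap is exactly the kind of irregular access that stalls a purely streaming data path, which is why boundary conditions are the hard part of a pipelined FPGA design.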

  5. Strategic arms limitation

    Science.gov (United States)

    Allen Greb, G.; Johnson, Gerald W.

    1983-10-01

    Following World War II, American scientists and politicians proposed in the Baruch plan a radical solution to the problem of nuclear weapons: to eliminate them forever under the auspices of an international nuclear development authority. The Soviets, who as yet did not possess the bomb, rejected this plan. Another approach, suggested by Secretary of War Henry Stimson, to negotiate directly with the Soviet Union, was not accepted by the American leadership. These initial arms limitation failures both reflected and exacerbated the hostile political relationship of the superpowers in the 1950s and 1960s. Since 1969, the more modest focus of the Soviet-American arms control process has been on limiting the numbers and sizes of both defensive and offensive strategic systems. The format for this effort has been the Strategic Arms Limitation Talks (SALT) and more recently the Strategic Arms Reduction Talks (START). Both sides came to these negotiations convinced that nuclear arsenals had grown so large that some form of mutual restraint was needed. Although the SALT/START process has been slow and ponderous, it has produced several concrete agreements and collateral benefits. The 1972 ABM Treaty restricts the deployment of ballistic missile defense systems, the 1972 Interim Agreement places a quantitative freeze on each side's land-based and sea-based strategic launchers, and the as yet unratified 1979 SALT II Treaty sets numerical limits on all offensive strategic systems and sublimits on MIRVed systems. Collateral benefits include improved verification procedures, working definitions and counting rules, and a permanent bureaucratic apparatus, which enhance stability and increase the chances for achieving additional agreements.

  6. Strategic decision quality in Flemish municipalities

    NARCIS (Netherlands)

    B.R.J. George (Bert); S. Desmidt (Sebastian); J. De Moyer (Julie)

    2016-01-01

    Strategic planning (SP) has taken the public sector by storm because it is widely believed that SP’s approach to strategic decision-making strengthens strategic decision quality (SDQ) in public organizations. However, if or how SP relates to SDQ seems to lack empirical evidence. Drawing

  7. Strategic Decision Making Paradigms: A Primer for Senior Leaders

    Science.gov (United States)

    2009-07-01

    decision making. STRATEGIC DECISION MAKING. Strategic Change: There are several strategic... influenced by stakeholders outside of the organization. The Ontology of Strategic Decision Making: Strategic decisions are non-routine and involve... Coates, USAWC, July 2009. The Complexity of Strategic Decision Making: Strategic decisions entail “ill-structured,” “messy” or

  8. A Neural Mechanism of Strategic Social Choice under Sanction-Induced Norm Compliance

    Science.gov (United States)

    Makwana, Aidan; Grön, Georg; Fehr, Ernst; Hare, Todd A

    2015-01-01

    In recent years, much has been learned about the representation of subjective value in simple, nonstrategic choices. However, a large fraction of our daily decisions are embedded in social interactions in which value-guided decisions require balancing benefits for self against consequences imposed by others in response to our choices. Yet, despite their ubiquity, much less is known about how value computation takes place in strategic social contexts that include the possibility of retribution for norm violations. Here, we used functional magnetic resonance imaging (fMRI) to show that when human subjects face such a context, connectivity increases between the temporoparietal junction (TPJ), implicated in the representation of other people's thoughts and intentions, and regions of ventromedial prefrontal cortex (vmPFC) that are associated with value computation. In contrast, we find no increase in connectivity between these regions in social nonstrategic cases where decision-makers are immune from retributive monetary punishments from a human partner. Moreover, there was also no increase in TPJ-vmPFC connectivity when the potential punishment was performed by a computer programmed to punish fairness norm violations in the same manner as a human would. Thus, TPJ-vmPFC connectivity is not simply a function of the social or norm-enforcing nature of the decision, but rather occurs specifically in situations where subjects make decisions in a social context and strategically consider putative consequences imposed by others.

  9. Strategic management of population programs

    OpenAIRE

    Bernhart, Michael H.

    1992-01-01

    Formal strategic planning and management appear to contribute to organizational effectiveness. The author surveys the literature on strategic management in private/for-profit organizations and applies lessons from that literature to population programs. Few would argue that population programs would not benefit from strategic planning and management, but it would be inadvisable to initiate the process when the organization is faced with a short-term crisis; during or immediately before a chan...

  10. Enhancing the Strategic Capability of the Army: An Investigation of Strategic Thinking Tasks, Skills, and Development

    Science.gov (United States)

    2016-02-01

    Army assignments. Teaching can also help develop visualization skills and innovative thinking through the use of certain teaching methods... required. Some of the specific strategic thinking KSAs built through exposure to complex problems that were mentioned in the interviews were visualization... Research Report 1995: Enhancing the Strategic Capability of the Army: An Investigation of Strategic Thinking Tasks, Skills

  11. Children's strategic theory of mind.

    Science.gov (United States)

    Sher, Itai; Koenig, Melissa; Rustichini, Aldo

    2014-09-16

    Human strategic interaction requires reasoning about other people's behavior and mental states, combined with an understanding of their incentives. However, the ontogenic development of strategic reasoning is not well understood: At what age do we show a capacity for sophisticated play in social interactions? Several lines of inquiry suggest an important role for recursive thinking (RT) and theory of mind (ToM), but these capacities leave out the strategic element. We posit a strategic theory of mind (SToM) integrating ToM and RT with reasoning about incentives of all players. We investigated SToM in 3- to 9-y-old children and adults in two games that represent prevalent aspects of social interaction. Children anticipate deceptive and competitive moves from the other player and play both games in a strategically sophisticated manner by 7 y of age. One game has a pure strategy Nash equilibrium: In this game, children achieve equilibrium play by the age of 7 y on the first move. In the other game, with a single mixed-strategy equilibrium, children's behavior moved toward the equilibrium with experience. These two results also correspond to two ways in which children's behavior resembles adult behavior in the same games. In both games, children's behavior becomes more strategically sophisticated with age on the first move. Beyond the age of 7 y, children begin to think about strategic interaction not myopically, but in a farsighted way, possibly with a view to cooperating and capitalizing on mutual gains in long-run relationships.

  12. Guide to accelerator physics program SYNCH: VAX version 1987.2

    International Nuclear Information System (INIS)

    Parsa, Z.; Courant, E.

    1987-01-01

    This guide is written to accommodate users of the Accelerator Physics Data Base BNLDAG::DUAO:[PARSA1]. It describes the contents of the online Accelerator Physics database DUAO:[PARSA1.SYNCH]. SYNCH is a computer program used for the design and analysis of synchrotrons, storage rings and beamlines.

  13. The contribution of high-performance computing and modelling for industrial development

    CSIR Research Space (South Africa)

    Sithole, Happy

    2017-10-01

    Full Text Available High-Performance Computing and Modelling for Industrial Development, Dr Happy Sithole and Dr Onno Ubbink. Strategic context: high-performance computing (HPC) combined with machine learning and artificial intelligence present opportunities to non...

  14. Collaborative Strategic Planning: Myth or Reality?

    Science.gov (United States)

    Mbugua, Flora; Rarieya, Jane F. A.

    2014-01-01

    The concept and practice of strategic planning, while entrenched in educational institutions in the West, is just catching on in Kenya. While literature emphasizes the importance of collaborative strategic planning, it does not indicate the challenges presented by collaboratively engaging in strategic planning. This article reports on findings of…

  15. Feasibility and advantages of commercial process I/O systems for accelerator control

    International Nuclear Information System (INIS)

    Belshe, R.A.; Elischer, V.P.; Jacobson, V.

    1975-03-01

    Control systems for large particle accelerators must be able to handle analog and digital signals and timing coordination for devices which are spread over a large physical area. Many signals must be converted and transmitted to and from a central control area during each accelerator cycle. Digital transmission is often used to combat common-mode and RF interference. Most accelerators in use today have met these requirements with custom process I/O hardware, data transmission systems, and computer interfaces. In-house development of hardware and software has been a very costly and time-consuming process, but due to the lack of available commercial equipment, there was often no other alternative. Today, a large portion of these development costs can be avoided. Small control computers are now available off the shelf which have extensive process control I/O hardware and software capability. Computer control should be designed into accelerator systems from the beginning, using operating systems available from the manufacturer. With most of the systems programming done, the designers can begin immediately on the applications software. (U.S.)

  16. Building a cluster computer for the computing grid of tomorrow

    International Nuclear Information System (INIS)

    Wezel, J. van; Marten, H.

    2004-01-01

    The Grid Computing Centre Karlsruhe takes part in the development, test and deployment of hardware and cluster infrastructure, grid computing middleware, and applications for particle physics. The construction of a large cluster computer with thousands of nodes and several PB of data storage capacity is a major task and focus of research. CERN-based accelerator experiments will use GridKa, one of only 8 worldwide Tier-1 computing centers, to meet their huge computing demands. Computing and storage are already provided for several other running physics experiments on the exponentially expanding cluster. (orig.)

  17. Recent trends in particle accelerator radiation safety

    International Nuclear Information System (INIS)

    Ohnesorge, W.F.; Butler, H.M.

    1974-01-01

    The use of particle accelerators in applied and research activities continues to expand, bringing new machines with higher energy and current capabilities which create radiation safety problems not commonly encountered before. An overview is given of these increased ionizing radiation hazards, along with a discussion of some of the new techniques required in evaluating and controlling them. A computer search of the literature provided a relatively comprehensive list of publications describing accelerator radiation safety problems and related subjects

  18. Framework for utilizing computational devices within simulation

    Directory of Open Access Journals (Sweden)

    Miroslav Mintál

    2013-12-01

    Full Text Available Nowadays there exist several frameworks for utilizing the computational power of graphics cards and other computational devices such as FPGAs, ARM and multi-core processors. The best-known ones are either low-level and need a lot of controlling code, or are bound only to specific graphics cards. Furthermore, more specialized frameworks exist, aimed mainly at the mathematical field. The framework described here is tailored for use in multi-agent simulations. It provides an option to accelerate computations when preparing a simulation and, mainly, to accelerate the computation of the simulation itself.

  19. ASCI's Vision for supercomputing future

    International Nuclear Information System (INIS)

    Nowak, N.D.

    2003-01-01

    The full text of publication follows. Advanced Simulation and Computing (ASC, formerly the Accelerated Strategic Computing Initiative [ASCI]) was established in 1995 to help Defense Programs shift from test-based confidence to simulation-based confidence. Specifically, ASC is a focused and balanced program that is accelerating the development of simulation capabilities needed to analyze and predict the performance, safety, and reliability of nuclear weapons and certify their functionality - far exceeding what might have been achieved in the absence of a focused initiative. To realize its vision, ASC is creating simulation and prototyping capabilities, based on advanced weapon codes and high-performance computing.

  20. Hardware Accelerated Sequence Alignment with Traceback

    Directory of Open Access Journals (Sweden)

    Scott Lloyd

    2009-01-01

    in a timely manner. Known methods to accelerate alignment on reconfigurable hardware only address sequence comparison, limit the sequence length, or exhibit memory and I/O bottlenecks. A space-efficient, global sequence alignment algorithm and architecture is presented that accelerates the forward scan and traceback in hardware without memory and I/O limitations. With 256 processing elements in FPGA technology, a performance gain over 300 times that of a desktop computer is demonstrated on sequence lengths of 16000. For greater performance, the architecture is scalable to more processing elements.
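The forward scan and traceback that the architecture accelerates can be sketched in software as classic Needleman-Wunsch global alignment. The scoring values below are illustrative placeholders, not parameters from the paper.

```python
# Software sketch of global (Needleman-Wunsch) alignment with traceback.
# Scoring scheme is an illustrative placeholder.
def align(a, b, match=1, mismatch=-1, gap=-1):
    n, m = len(a), len(b)
    # forward scan: fill the (n+1) x (m+1) score matrix
    S = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        S[i][0] = i * gap
    for j in range(1, m + 1):
        S[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = match if a[i - 1] == b[j - 1] else mismatch
            S[i][j] = max(S[i - 1][j - 1] + sub,
                          S[i - 1][j] + gap,
                          S[i][j - 1] + gap)
    # traceback from the bottom-right corner to recover the alignment
    out_a, out_b = [], []
    i, j = n, m
    while i > 0 or j > 0:
        sub = match if i > 0 and j > 0 and a[i - 1] == b[j - 1] else mismatch
        if i > 0 and j > 0 and S[i][j] == S[i - 1][j - 1] + sub:
            out_a.append(a[i - 1]); out_b.append(b[j - 1]); i -= 1; j -= 1
        elif i > 0 and S[i][j] == S[i - 1][j] + gap:
            out_a.append(a[i - 1]); out_b.append('-'); i -= 1
        else:
            out_a.append('-'); out_b.append(b[j - 1]); j -= 1
    return ''.join(reversed(out_a)), ''.join(reversed(out_b)), S[n][m]
```

A direct software version keeps the full O(nm) score matrix alive for the traceback; that memory footprint is exactly what a space-efficient hardware design must avoid at sequence lengths in the tens of thousands.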

  1. Neural mechanisms mediating degrees of strategic uncertainty.

    Science.gov (United States)

    Nagel, Rosemarie; Brovelli, Andrea; Heinemann, Frank; Coricelli, Giorgio

    2018-01-01

    In social interactions, strategic uncertainty arises when the outcome of one's choice depends on the choices of others. An important question is whether strategic uncertainty can be resolved by assessing subjective probabilities to the counterparts' behavior, as if playing against nature, and thus transforming the strategic interaction into a risky (individual) situation. By means of functional magnetic resonance imaging with human participants we tested the hypothesis that choices under strategic uncertainty are supported by the neural circuits mediating choices under individual risk and deliberation in social settings (i.e. strategic thinking). Participants were confronted with risky lotteries and two types of coordination games requiring different degrees of strategic thinking of the kind 'I think that you think that I think etc.' We found that the brain network mediating risk during lotteries (anterior insula, dorsomedial prefrontal cortex and parietal cortex) is also engaged in the processing of strategic uncertainty in games. In social settings, activity in this network is modulated by the level of strategic thinking that is reflected in the activity of the dorsomedial and dorsolateral prefrontal cortex. These results suggest that strategic uncertainty is resolved by the interplay between the neural circuits mediating risk and higher order beliefs (i.e. beliefs about others' beliefs). © The Author(s) (2017). Published by Oxford University Press.

  2. Manage "Human Capital" Strategically

    Science.gov (United States)

    Odden, Allan

    2011-01-01

    To strategically manage human capital in education means restructuring the entire human resource system so that schools not only recruit and retain smart and capable individuals, but also manage them in ways that support the strategic directions of the organization. These management practices must be aligned with a district's education improvement…

  3. Managing transdisciplinarity in strategic foresight

    DEFF Research Database (Denmark)

    Rasmussen, Birgitte; Andersen, Per Dannemand; Borch, Kristian

    2010-01-01

    Strategic foresight deals with the long term future and is a transdisciplinary exercise which, among other aims, addresses the prioritization of science and other decision making in science and innovation advisory and funding bodies. This article discusses challenges in strategic foresight...... in relation to transdisciplinarity based on empirical as well as theoretical work in technological domains. By strategic foresight is meant future oriented, participatory consultation of actors and stakeholders, both within and outside a scientific community. It therefore allows multiple stakeholders...... strategic foresight has now been widely accepted for strategy-making and priority-setting in science and innovation policy, the methodologies underpinning it still need further development. Key findings are the identification of challenges, aspects and issues related to management and facilitation...

  4. Peran Strategic Entrepreneurship Dalam Membangun Sustainable Competitive Advantage

    OpenAIRE

    Handrimurtjahjo, Agustinus Dedy

    2014-01-01

    Strategic entrepreneurship has emerged as a new concept in examining convergence in entrepreneurship studies (opportunity-seeking behavior) and strategic management (advantage-seeking behavior). Studies in the area of strategic management have gradually exposed the relationship between strategic management and entrepreneurship: entrepreneurial strategy making; intrapreneurship; entrepreneurial strategic posture within organizations; entrepreneurial orientation; strategic management integration ...

  5. Monte Carlo computation of Bremsstrahlung intensity and energy spectrum from a 15 MV linear electron accelerator tungsten target to optimise LINAC head shielding

    International Nuclear Information System (INIS)

    Biju, K.; Sharma, Amiya; Yadav, R.K.; Kannan, R.; Bhatt, B.C.

    2003-01-01

    Knowledge of the exact photon intensity and energy distributions from the target of an electron accelerator is necessary when designing the shielding for the accelerator head from a radiation safety point of view. The computations were carried out for the intensity and energy distribution of the photon spectrum from a 0.4 cm thick tungsten target in different angular directions for 15 MeV electrons using a validated Monte Carlo code, MCNP4A. Similar results were computed for 30 MeV electrons and found to agree with the data available in the literature. These graphs and the TVT values in lead help to suggest an optimum shielding thickness for a 15 MV Linac head. (author)
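The final shielding estimate from TVT (tenth-value thickness) data follows a simple rule: each TVT of material attenuates the photon intensity by a factor of ten. A minimal sketch, with an assumed TVT value that is a placeholder rather than the paper's computed result:

```python
# Shielding thickness from tenth-value thickness (TVT) data.
# The TVT value used in the example call is an assumed placeholder.
import math

def shield_thickness(attenuation_factor, tvt_cm):
    """Thickness (cm) needed to reduce intensity by attenuation_factor."""
    return tvt_cm * math.log10(attenuation_factor)

# e.g. reduce the photon intensity by a factor of 1e5
# with an assumed lead TVT of 5.5 cm
t = shield_thickness(1e5, 5.5)
```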

  6. GAO Strategic Plan 2007-2012

    National Research Council Canada - National Science Library

    2007-01-01

    In keeping with GAO's commitment to update its strategic plan at least once every 3 years, consistent with the Government Performance and Results Act, this strategic plan describes our proposed goals...

  7. Approach to the open advanced facilities initiative for innovation (strategic use by industry) at the University of Tsukuba, Tandem Accelerator Complex

    International Nuclear Information System (INIS)

    Sasa, K.; Tagishi, Y.; Naramoto, H.; Kudo, H.; Kita, E.

    2010-01-01

    The University of Tsukuba, Tandem Accelerator Complex (UTTAC) possesses the 12UD Pelletron tandem accelerator and the 1 MV Tandetron accelerator for the University's inter-departmental education and research. We have actively advanced collaborative research with other research institutes and industrial users. Since the Open Advanced Facilities Initiative for Innovation by the Ministry of Education, Culture, Sports, Science and Technology started in 2007, 12 industrial experiments have been carried out at the UTTAC. This report describes the efforts by the University's accelerator facility to attract industrial users. (author)

  8. Acceleration of saddle-point searches with machine learning

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, Andrew A., E-mail: andrew-peterson@brown.edu [School of Engineering, Brown University, Providence, Rhode Island 02912 (United States)

    2016-08-21

    In atomistic simulations, the location of the saddle point on the potential-energy surface (PES) gives important information on transitions between local minima, for example, via transition-state theory. However, the search for saddle points often involves hundreds or thousands of ab initio force calls, which are typically all done at full accuracy. This results in the vast majority of the computational effort being spent calculating the electronic structure of states not important to the researcher, and very little time performing the calculation of the saddle point state itself. In this work, we describe how machine learning (ML) can reduce the number of intermediate ab initio calculations needed to locate saddle points. Since machine-learning models can learn from, and thus mimic, atomistic simulations, the saddle-point search can be conducted rapidly in the machine-learning representation. The saddle-point prediction can then be verified by an ab initio calculation; if it is incorrect, this strategically has identified regions of the PES where the machine-learning representation has insufficient training data. When these training data are used to improve the machine-learning model, the estimates greatly improve. This approach can be systematized, and in two simple example problems we demonstrate a dramatic reduction in the number of ab initio force calls. We expect that this approach and future refinements will greatly accelerate searches for saddle points, as well as other searches on the potential energy surface, as machine-learning methods see greater adoption by the atomistics community.

  9. Acceleration of saddle-point searches with machine learning

    International Nuclear Information System (INIS)

    Peterson, Andrew A.

    2016-01-01

    In atomistic simulations, the location of the saddle point on the potential-energy surface (PES) gives important information on transitions between local minima, for example, via transition-state theory. However, the search for saddle points often involves hundreds or thousands of ab initio force calls, which are typically all done at full accuracy. This results in the vast majority of the computational effort being spent calculating the electronic structure of states not important to the researcher, and very little time performing the calculation of the saddle point state itself. In this work, we describe how machine learning (ML) can reduce the number of intermediate ab initio calculations needed to locate saddle points. Since machine-learning models can learn from, and thus mimic, atomistic simulations, the saddle-point search can be conducted rapidly in the machine-learning representation. The saddle-point prediction can then be verified by an ab initio calculation; if it is incorrect, this strategically has identified regions of the PES where the machine-learning representation has insufficient training data. When these training data are used to improve the machine-learning model, the estimates greatly improve. This approach can be systematized, and in two simple example problems we demonstrate a dramatic reduction in the number of ab initio force calls. We expect that this approach and future refinements will greatly accelerate searches for saddle points, as well as other searches on the potential energy surface, as machine-learning methods see greater adoption by the atomistics community.

  10. Acceleration of saddle-point searches with machine learning.

    Science.gov (United States)

    Peterson, Andrew A

    2016-08-21

    In atomistic simulations, the location of the saddle point on the potential-energy surface (PES) gives important information on transitions between local minima, for example, via transition-state theory. However, the search for saddle points often involves hundreds or thousands of ab initio force calls, which are typically all done at full accuracy. This results in the vast majority of the computational effort being spent calculating the electronic structure of states not important to the researcher, and very little time performing the calculation of the saddle point state itself. In this work, we describe how machine learning (ML) can reduce the number of intermediate ab initio calculations needed to locate saddle points. Since machine-learning models can learn from, and thus mimic, atomistic simulations, the saddle-point search can be conducted rapidly in the machine-learning representation. The saddle-point prediction can then be verified by an ab initio calculation; if it is incorrect, this strategically has identified regions of the PES where the machine-learning representation has insufficient training data. When these training data are used to improve the machine-learning model, the estimates greatly improve. This approach can be systematized, and in two simple example problems we demonstrate a dramatic reduction in the number of ab initio force calls. We expect that this approach and future refinements will greatly accelerate searches for saddle points, as well as other searches on the potential energy surface, as machine-learning methods see greater adoption by the atomistics community.
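The verify-and-retrain loop described in these abstracts can be caricatured in one dimension: a cheap surrogate fit to a few true-energy samples proposes the barrier top, each proposal is verified with a true-energy call, and the verified point is added to the training data. Here the "model" is just a quadratic through three samples and the potential is a toy double well; the actual method trains machine-learning potentials on atomistic potential-energy surfaces and searches for true saddle points.

```python
# Toy 1D caricature of surrogate-accelerated barrier-top search.
# The potential E and all starting points are illustrative placeholders.

def E(x):
    """True potential: double well with a barrier (height 1) at x = 0."""
    return (x * x - 1.0) ** 2

# initial "training set": three true-energy evaluations
pts = [(-0.6, E(-0.6)), (0.3, E(0.3)), (0.8, E(0.8))]

x_best = 0.3
for _ in range(25):
    # fit the quadratic surrogate through the three highest-energy samples
    pts.sort(key=lambda p: -p[1])
    (x1, y1), (x0, y0), (x2, y2) = pts[:3]
    num = (x1 - x0) ** 2 * (y1 - y2) - (x1 - x2) ** 2 * (y1 - y0)
    den = (x1 - x0) * (y1 - y2) - (x1 - x2) * (y1 - y0)
    if abs(den) < 1e-14:
        break                          # surrogate degenerate: stop refining
    x_new = x1 - 0.5 * num / den       # vertex of the quadratic surrogate
    y_new = E(x_new)                   # verify with a true-energy call
    pts.append((x_new, y_new))         # grow the training set and refit
    x_best = x_new
```

Each iteration spends exactly one "expensive" call on verification, mirroring the paper's point that the cheap model does the searching while ab initio calls are reserved for checking and retraining.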

  11. Secure multiparty computation goes live

    NARCIS (Netherlands)

    Bogetoft, P.; Christensen, D.L.; Damgard, Ivan; Geisler, M.; Jakobsen, T.; Kroigaard, M.; Nielsen, J.D.; Nielsen, J.B.; Nielsen, K.; Pagter, J.; Schwartzbach, M.; Toft, T.; Dingledine, R.; Golle, Ph.

    2009-01-01

    In this note, we report on the first large-scale and practical application of secure multiparty computation, which took place in January 2008. We also report on the novel cryptographic protocols that were used. This work was supported by the Danish Strategic Research Council and the European

  12. Secure multiparty computation goes live

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Christensen, Dan Lund; Damgård, Ivan Bjerre

    2009-01-01

    In this note, we report on the first large-scale and practical application of secure multiparty computation, which took place in January 2008. We also report on the novel cryptographic protocols that were used. This work was supported by the Danish Strategic Research Council and the European...

  13. Improving Strategic Planning for Federal Public Health Agencies Through Collaborative Strategic Management

    Science.gov (United States)

    2013-03-01

    and Results Act (GPRA) was passed, requiring all federal agencies to engage in strategic planning and nudging them towards comprehensive strategic... involves the social-psychological process of sense making that leads to negotiations. This stage is when the individual partner organizations... expectations through informal bargaining and informal sense making. Commitments: for future action through formal legal contract or psychological contract

  14. galario: Gpu Accelerated Library for Analyzing Radio Interferometer Observations

    Science.gov (United States)

    Tazzari, Marco; Beaujean, Frederik; Testi, Leonardo

    2017-10-01

    The galario library exploits the computing power of modern graphic cards (GPUs) to accelerate the comparison of model predictions to radio interferometer observations. It speeds up the computation of the synthetic visibilities given a model image (or an axisymmetric brightness profile) and their comparison to the observations.

  15. Strategic issues in information technology international implications for decision makers

    CERN Document Server

    Schütte, Hellmut

    1988-01-01

    Strategic Issues in Information Technology: International Implications for Decision Makers presents the significant development of information technology in the output of components, computers, and communication equipment and systems. This book discusses the integration of information technology into factories and offices to increase productivity. Organized into six parts encompassing 12 chapters, this book begins with an overview of the advancement towards an automated interpretation communication system to achieve real international communication. This text then examines the main determining

  16. Alibaba's strategic drift

    OpenAIRE

    Kim, Young-Chan; Chen, Pi-Chi

    2016-01-01

    It is fundamental, in both a theoretical and practical sense, to analyse the strategies of successful e-businesses that were formulated and operated alongside incumbent competitors. Thus, there has been an array of strategic arguments concerning the rapidly burgeoning virtual powerhouse Alibaba, who amidst a sea of fortified competitors found their ground to become one of the most prominent e-businesses of the decade. At the commencing stages, Alibaba lacked a specific strategic goal, aside f...

  17. New Military Strategic Communications System

    National Research Council Canada - National Science Library

    Baldwin, Robert F

    2007-01-01

    ... audience through unified action. The Quadrennial Defense Review Roadmap for Strategic Communications and the Department of Defense, Report of the Defense Science Board Task Force on Strategic Communication both concluded that the US...

  18. Strategic planning and managerial control

    OpenAIRE

    Mihaela Ghicajanu

    2004-01-01

    This paper presents the relationship between strategic planning and the managerial control process. To begin, I present a few elements of strategic planning and managerial control in order to identify the links between these elements.

  19. Strategic alliances in engineering, technology and development

    International Nuclear Information System (INIS)

    Jazrawi, W.

    1991-01-01

    The role of strategic alliances in the development of heavy oil resources, both mineable and in-situ, is discussed. A strategic alliance is defined as a custom designed, long term collaborative working arrangement between two parties to pool, exchange, and integrate their resources to maximize mutual gain. A combination of one or more of the following success factors is seen as contributing to the unlocking of static heavy oil resources: sufficiently high and sustained crude oil prices; strategic intent to pursue heavy oil development regardless of short-term setbacks or economic downturns; technology breakthroughs that can reduce bitumen supply and upgrading costs; and strategic alliances. An idealized model for strategic alliances designed to help develop heavy oil resources is illustrated. The advantages and pitfalls involved in strategic alliances are listed along with the characteristics of viable contract agreements for such alliances. Some examples of strategic alliances in engineering and technology development are presented from Alberta experience. 2 figs., 1 tab

  20. The strategic impact of social networks on the online gaming industry : strategic use of technology

    OpenAIRE

    Sousa, Sofia Taveira de

    2012-01-01

    This dissertation focuses on assessing the strategic potential of social networks by answering the following research question: Is there any strategic impact of social networks on the online gaming industry? In order to analyze the strategic potential of social networks for online games, we identify the main factors that online players consider as crucial for them to keep playing. These factors can either be related to the game’s strategy itself, such as all the details, graphics and ambig...

  1. 12 CFR 228.27 - Strategic plan.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Strategic plan. 228.27 Section 228.27 Banks and... REINVESTMENT (REGULATION BB) Standards for Assessing Performance § 228.27 Strategic plan. (a) Alternative...(s) under a strategic plan if: (1) The bank has submitted the plan to the Board as provided for in...

  2. 13 CFR 313.6 - Strategic Plans.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Strategic Plans. 313.6 Section 313... § 313.6 Strategic Plans. (a) General. An Impacted Community that intends to apply for a grant for implementation assistance under § 313.7 shall develop and submit a Strategic Plan to EDA for evaluation and...

  3. The value contribution of strategic foresight

    DEFF Research Database (Denmark)

    Rohrbeck, René; Schwarz, Jan Oliver

    2013-01-01

    This paper focuses on exploring the potential and empirically observable value creation of strategic foresight activities in firms. We first review the literature on strategic foresight, innovation management and strategic management in order to identify the potential value contributions. We use ... (3) influencing other actors, and (4) through an enhanced capacity for organizational learning.

  4. 23 CFR 1335.6 - Strategic plan.

    Science.gov (United States)

    2010-04-01

    ... 23 Highways 1 2010-04-01 2010-04-01 false Strategic plan. 1335.6 Section 1335.6 Highways NATIONAL... § 1335.6 Strategic plan. A strategic plan shall— (a) Be a multi-year plan that identifies and prioritizes... performance-based measures by which progress toward those goals will be determined; and (c) Be submitted to...

  5. Accurate and efficient spin integration for particle accelerators

    International Nuclear Information System (INIS)

    Abell, Dan T.; Meiser, Dominic; Ranjbar, Vahid H.; Barber, Desmond P.

    2015-01-01

    Accurate spin tracking is a valuable tool for understanding spin dynamics in particle accelerators and can help improve the performance of an accelerator. In this paper, we present a detailed discussion of the integrators in the spin tracking code GPUSPINTRACK. We have implemented orbital integrators based on drift-kick, bend-kick, and matrix-kick splits. On top of the orbital integrators, we have implemented various integrators for the spin motion. These integrators use quaternions and Romberg quadratures to accelerate both the computation and the convergence of spin rotations. We evaluate their performance and accuracy in quantitative detail for individual elements as well as for the entire RHIC lattice. We exploit the inherently data-parallel nature of spin tracking to accelerate our algorithms on graphics processing units.
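    The use of quaternions to compose spin rotations, as described in this abstract, can be sketched in a few lines. The fragment below is an illustrative reconstruction under general assumptions, not code from GPUSPINTRACK; the function names and conventions are hypothetical:

    ```python
    import math

    def quat_mul(a, b):
        """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
        aw, ax, ay, az = a
        bw, bx, by, bz = b
        return (aw*bw - ax*bx - ay*by - az*bz,
                aw*bx + ax*bw + ay*bz - az*by,
                aw*by - ax*bz + ay*bw + az*bx,
                aw*bz + ax*by - ay*bx + az*bw)

    def rotation_quat(axis, angle):
        """Unit quaternion for a rotation of `angle` radians about `axis`."""
        n = math.sqrt(sum(c*c for c in axis))
        s = math.sin(angle / 2.0) / n
        return (math.cos(angle / 2.0), axis[0]*s, axis[1]*s, axis[2]*s)

    def rotate_spin(q, s):
        """Rotate a spin vector s by quaternion q: s' = q s q*."""
        conj = (q[0], -q[1], -q[2], -q[3])
        qs = quat_mul(quat_mul(q, (0.0, *s)), conj)
        return qs[1:]
    ```

    Accumulating a spin rotation through successive lattice elements then amounts to one quaternion product per element, rather than a 3x3 matrix multiplication, which is the usual motivation for a quaternion representation.
    
    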

  6. Accurate and efficient spin integration for particle accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Abell, Dan T.; Meiser, Dominic [Tech-X Corporation, Boulder, CO (United States); Ranjbar, Vahid H. [Brookhaven National Laboratory, Upton, NY (United States); Barber, Desmond P. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2015-01-15

    Accurate spin tracking is a valuable tool for understanding spin dynamics in particle accelerators and can help improve the performance of an accelerator. In this paper, we present a detailed discussion of the integrators in the spin tracking code GPUSPINTRACK. We have implemented orbital integrators based on drift-kick, bend-kick, and matrix-kick splits. On top of the orbital integrators, we have implemented various integrators for the spin motion. These integrators use quaternions and Romberg quadratures to accelerate both the computation and the convergence of spin rotations. We evaluate their performance and accuracy in quantitative detail for individual elements as well as for the entire RHIC lattice. We exploit the inherently data-parallel nature of spin tracking to accelerate our algorithms on graphics processing units.

  7. Strategic planning: today's hot buttons.

    Science.gov (United States)

    Bohlmann, R C

    1998-01-01

    The first generation of mergers and managed care hasn't slowed down group practices' need for strategic planning. Even groups that already went through one merger are asking about new mergers or ownership possibilities, the future of managed care, performance standards and physician unhappiness. Strategic planning, including consideration of bench-marking, production of ancillary services and physician involvement, can help. Even if only a short, general look at the future, strategic planning shows the proactive leadership needed in today's environment.

  8. Automatic Optimization of Hardware Accelerators for Image Processing

    OpenAIRE

    Reiche, Oliver; Häublein, Konrad; Reichenbach, Marc; Hannig, Frank; Teich, Jürgen; Fey, Dietmar

    2015-01-01

    In the domain of image processing, often real-time constraints are required. In particular, in safety-critical applications, such as X-ray computed tomography in medical imaging or advanced driver assistance systems in the automotive domain, timing is of utmost importance. A common approach to maintain real-time capabilities of compute-intensive applications is to offload those computations to dedicated accelerator hardware, such as Field Programmable Gate Arrays (FPGAs). Programming such arc...

  9. Lo Strategic Management Accounting

    OpenAIRE

    G. INVERNIZZI

    2005-01-01

    This essay investigates the information aggregates and the elements that make up strategic management accounting. It then analyses the functions performed at the various stages of the strategic management process, observing its role within management accounting. Finally, it examines the relationships between the levels of strategic management and strategic management accounting.

  10. Induction Accelerator Efficiency at 5 Hz

    International Nuclear Information System (INIS)

    Molvik, A.W.; Faltens, A.

    2000-01-01

    We simulate fusion power plant driver efficiency by pulsing small induction cores at 5 Hz (a typical projected power plant repetition rate), with a resistive load in the secondary winding that is scaled to simulate the beam loading for induction acceleration. Starting from a power plant driver design that is based on other constraints, we obtain the core mass and acceleration efficiency for several energy ranges of the driver accelerator and for three magnetic alloys. The resistor in the secondary is chosen to give the same acceleration efficiency, the ratio of beam energy gain to energy input to the core module (core plus acceleration gap), as was computed for the driver. The pulser consists of a capacitor switched by FETs (field-effect transistors), which are gated on for the desired pulse duration. The energy to the resistor is evaluated during the portion of the pulse that is adequately flat. We present data over a range of 0.6 to 5 μs pulse lengths. With 1 μs pulses, the acceleration efficiency at 5 Hz is measured to be 75%, 52%, and 32% for thin-tape-wound cores of nanocrystalline, amorphous, and 3% silicon steel materials respectively, including only core losses. The efficiency increases for shorter pulse durations.
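    The acceleration efficiency defined in this abstract (beam energy gain divided by total energy input to the core module) reduces to a one-line calculation. The sketch below is a hypothetical illustration with made-up energy values, not the experimental analysis code:

    ```python
    def acceleration_efficiency(beam_energy_gain, core_loss, gap_loss=0.0):
        """Ratio of beam energy gain to total energy delivered to the core
        module (beam gain plus core losses plus acceleration-gap losses).
        All quantities in the same energy units, e.g. joules per pulse."""
        energy_in = beam_energy_gain + core_loss + gap_loss
        return beam_energy_gain / energy_in

    # Hypothetical numbers: 75 J delivered to the beam, 25 J lost in the
    # core (gap losses neglected, as when counting only core losses).
    eff = acceleration_efficiency(75.0, 25.0)  # -> 0.75
    ```
    
    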

  11. Rewarding Stakeholders: The Perspective of Strategic Entrepreneurship

    OpenAIRE

    Dissanayake, Srinath

    2013-01-01

    Concern for stakeholders is a crucial aspect of every business's success. Among the wide spectrum of organizational strategies, strategic entrepreneurship places particular emphasis on it. This essay details practical as well as empirical grounds for the notion of strategic entrepreneurship. Focally, strategic entrepreneurship is an integration of entrepreneurship (opportunity-seeking behavior) and strategic management (advantage-seeking behavior). Thus I conclude, an amalgamation of Str...

  12. Strategic Planning for Interdisciplinary Science: a Geoscience Success Story

    Science.gov (United States)

    Harshvardhan, D.; Harbor, J. M.

    2003-12-01

    The Department of Earth and Atmospheric Sciences at Purdue University has engaged in a continuous strategic planning exercise for several years, including annual retreats since 1997 as an integral part of the process. The daylong Saturday retreat at the beginning of the fall semester has been used to flesh out the faculty hiring plan for the coming year based on the prior years' plans. The finalized strategic plan is built around the choice of three signature areas, two in disciplinary fields, (i) geodynamics and active tectonics, (ii) multi-scale atmospheric interactions and one interdisciplinary area, (iii) atmosphere/surface interactions. Our experience with strategic planning and the inherently interdisciplinary nature of geoscience helped us recently when our School of Science, which consists of seven departments, announced a competition for 60 new faculty positions that would be assigned based on the following criteria, listed in order of priority - (i) scientific merit and potential for societal impact, (ii) multidisciplinary nature of topic - level of participation and leveraging potential, (iii) alignment with Purdue's strategic plan - discovery, learning, engagement, (iv) existence of critical mass at Purdue and availability of faculty and student candidate pools, (v) corporate and federal sponsor interest. Some fifty white papers promoting diverse fields were submitted to the school and seven were chosen after a school-wide retreat. The department fared exceedingly well and we now have significant representation on three of the seven school areas of coalescence - (i) climate change, (ii) computational science and (iii) science education research. We are now in the process of drawing up hiring plans and developing strategies for allocation and reallocation of resources such as laboratory space and faculty startup to accommodate the 20% growth in faculty strength that is expected over the next five years.

  13. Heavy ions acceleration in RF wells of 2-frequency electromagnetic field and in the inverted FEL

    International Nuclear Information System (INIS)

    Dzergach, A.I.; Kabanov, V.S.; Nikulin, M.G.; Vinogradov, S.V.

    1995-03-01

    Latest results of the study of heavy-ion acceleration by electrons trapped in moving 2-frequency 3-D RF wells are described. A linearized theoretical model of ion acceleration in a polarized spheroidal plasmoid is proposed. The equilibrium state of this plasmoid is described by the modified microcanonical distribution of the Courant-Snyder invariant (the "quasienergy" of electrons). Some new results of computational simulation of the acceleration process are given. The method of computation takes into account the given cylindrical field E011(φ,r,z) and the self-fields of electrons and ions. The results of the computation at relatively short time intervals confirm the idea and estimated parameters of acceleration. A heavy-ion accelerator using this principle may be constructed with the use of compact cm-band iris-loaded and biperiodic waveguides with double-sided 2-frequency RF feeding. It can accelerate heavy ions with a charge number Z_i from small initial energies of ∼ 50 keV/a.u. at a rate of ∼ Z_i · 10 MeV/m. Semirelativistic ions may be accelerated at a similar rate also in the inverted FEL

  14. Strategizing in multiple ways

    DEFF Research Database (Denmark)

    Larsen, Mette Vinther; Madsen, Charlotte Øland; Rasmussen, Jørgen Gulddahl

    2013-01-01

    Strategy processes are kinds of wayfaring where different actors interpret a formally defined strategy differently. In the everyday practice of organizations, strategizing takes place in multiple ways through narratives and sensible actions. This forms a meshwork of polyphonic ways to enact one and the same strategy. The paper focusses on such processes as they develop in a Danish service company. It is done on the basis of an empirical and longitudinal study of a strategy process in the Service Company, where the strategic purpose was to implement value-based management. The theme to be developed based on this paper is whether one can understand these divergent strategic wayfaring processes as constructive for organizations.

  15. Department of Defense Strategic and Business Case Analyses for Commercial Products in Secure Mobile Computing

    Science.gov (United States)

    2011-06-01

    Results indicate growing strategic opportunities for the DoD to acquire more economical commercial handsets and more flexible network services for current smartphone implementations. The business cases may potentially save

  16. Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes

    Science.gov (United States)

    Smith, L. M.; Hochstedler, R. D.

    1997-02-01

    Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
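    The replacement of linear search algorithms with binary versions, one of the acceleration techniques listed above, can be illustrated on a sorted lookup grid. This is a generic sketch of the technique, not the actual ITS FORTRAN routines; the function names and the energy-grid interpretation are assumptions:

    ```python
    import bisect

    def linear_lookup(grid, x):
        """Original-style linear scan: index i such that grid[i] <= x < grid[i+1]."""
        for i in range(len(grid) - 1):
            if grid[i] <= x < grid[i + 1]:
                return i
        raise ValueError("x outside grid")

    def binary_lookup(grid, x):
        """Binary-search replacement: same result in O(log n) via bisect."""
        i = bisect.bisect_right(grid, x) - 1
        if i < 0 or i >= len(grid) - 1:
            raise ValueError("x outside grid")
        return i
    ```

    In a Monte Carlo transport code, a lookup like this sits in the inner loop (e.g. locating an energy bin for a cross-section table), so the O(n) to O(log n) change compounds over millions of histories.
    
    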

  17. Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes

    International Nuclear Information System (INIS)

    Smith, L.M.; Hochstedler, R.D.

    1997-01-01

    Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code)

  18. Fel simulations using distributed computing

    NARCIS (Netherlands)

    Einstein, J.; Biedron, S.G.; Freund, H.P.; Milton, S.V.; Van Der Slot, P. J M; Bernabeu, G.

    2016-01-01

    While simulation tools are available and have been used regularly for simulating light sources, including Free-Electron Lasers, the increasing availability and lower cost of accelerated computing opens up new opportunities. This paper highlights a method for accelerating and parallelizing code

  19. Department of Accelerator Physics and Technology: Overview

    International Nuclear Information System (INIS)

    Plawski, E.

    2004-01-01

    Full text: Due to the drastic reduction (in previous years) of the scientific and technical staff of the Department, our basic work in 2003 was limited to the following subjects: - the development of a radiographic 4 MeV electron accelerator, - computational verification of the basic parameters of a simplified version of the "6/15 MeV" medical accelerator, - continuation of the study of photon and electron spectra of narrow photon beams with the use of the BEAMnrc Monte Carlo codes, - a study of accelerating and deflecting travelling-wave RF structures based on experience already gained. The small 4-6 MeV electron linac was constructed in the Department as a tool for radiographic services which may be offered by our Institute. In 2003, the most important sub-units of the accelerator were constructed and completed. An accelerated electron beam intensity of up to 80 mA was already obtained, and for the following year the energy spectrum measurement, energy and intensity optimisation for e-/X-ray conversion, and also first exposures are planned. Because in the realisation of the 6/15 MeV Accelerator Project the Department was responsible for calculations of beam guiding and acceleration (accelerating section with triode electron gun, beam focusing, achromatic deviation), some verifying computations were done last year. These concerned mainly the influence of variations of the gun injection energy and RF frequency shifts on beam dynamics. The computational codes written in the Department are still used and continuously developed for this and similar purposes. The triode gun, originally conceived as a part of the 6/15 MeV medical accelerator, is on long-term testing, showing very good performance; a new pulse modulator for that sub-unit was designed. The Monte Carlo calculations of narrow photon beams are continued. Intensity-modulated radiation therapy (IMRT) is expected to play a dominant role in the years to come. Our principal researcher, after receiving a PhD degree, collaborates on IMRT

  20. Concept of an accelerator-driven subcritical research reactor within the TESLA accelerator installation

    International Nuclear Information System (INIS)

    Pesic, Milan; Neskovic, Nebojsa

    2006-01-01

    Study of a small accelerator-driven subcritical research reactor in the Vinca Institute of Nuclear Sciences was initiated in 1999. The idea was to extract a beam of medium-energy protons or deuterons from the TESLA accelerator installation, and to transport and inject it into the reactor. The reactor core was to be composed of highly enriched uranium fuel elements. The reactor was designated ADSRR-H. Since the use of this type of fuel element was no longer recommended, the study of a small accelerator-driven subcritical research reactor employing low-enriched uranium fuel elements began in 2004. This reactor was designated ADSRR-L. We compare here the results of the initial computer simulations of ADSRR-H and ADSRR-L. The results have confirmed that our concept could be the basis for the design and construction of a low-neutron-flux model of the proposed accelerator-driven subcritical power reactor, to be moderated and cooled by lead. Our objective is to study the physics and technologies necessary to design and construct ADSRR-L. The reactor would be used for the development of nuclear techniques and technologies, and for basic and applied research in neutron physics, metrology, radiation protection and radiobiology

  1. Networks and meshworks in strategizing

    DEFF Research Database (Denmark)

    Esbjerg, Lars; Andersen, Poul Houman

    The purpose of this paper is to examine the business network metaphor in relation to strategizing in business and to tentatively propose an alternative metaphor, that of the business meshwork. The paper reviews existing work on strategy and strategizing within the IMP literature, particularly the literature on networks and network pictures, and identifies several shortcomings of this work. To develop the notion of business meshworks as an alternative for understanding strategizing practices in business interaction, the paper draws on recent writings within anthropology and the strategy...

  2. Computationally efficient dynamic modeling of robot manipulators with multiple flexible-links using acceleration-based discrete time transfer matrix method

    DEFF Research Database (Denmark)

    Zhang, Xuping; Sørensen, Rasmus; RahbekIversen, Mathias

    2018-01-01

    This paper presents a novel and computationally efficient modeling method for the dynamics of flexible-link robot manipulators. In this method, a robot manipulator is decomposed into components/elements. The component/element dynamics is established using Newton–Euler equations, and then is linearized based on the acceleration-based state vector. The transfer matrices for each type of components/elements are developed, and used to establish the system equations of a flexible robot manipulator by concatenating the state vector from the base to the end-effector. With this strategy, the size ... manipulators, and only involves calculating and transferring component/element dynamic equations that have small size. The numerical simulations and experimental testing of flexible-link manipulators are conducted to validate the proposed methodologies.
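    The concatenation step of a transfer matrix method, chaining per-element matrices from the base to the end-effector, can be sketched as follows. This is a generic illustration under an assumed ordering convention (base-to-tip, left-multiplication), not the paper's actual formulation:

    ```python
    import numpy as np

    def system_transfer_matrix(element_matrices):
        """Chain per-element transfer matrices ordered base -> end-effector,
        so the system matrix is U = U_n ... U_2 U_1 acting on the base
        state vector. Each element matrix stays small; only products are
        carried forward, which keeps the system-level computation cheap."""
        n = element_matrices[0].shape[0]
        U = np.eye(n)
        for Ue in element_matrices:
            U = Ue @ U  # left-multiply: later elements act after earlier ones
        return U
    ```

    The design point mirrored here is that the overall system matrix never grows with the number of links; each step is a product of fixed-size matrices.
    
    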

  3. Strengthening Deterrence for 21st Century Strategic Conflicts and Competition: Accelerating Adaptation and Integration - Annotated Bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Durkalec, J. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-11-01

    This was the fourth in a series of annual events convened at Livermore to explore the emerging place of the "new domains" in U.S. deterrence strategies. The purposes of the series are to facilitate the emergence of a community of interest that cuts across the policy, military, and technical communities, and to inform laboratory strategic planning. U.S. allies have also been drawn into the conversation, as U.S. deterrence strategies are in part about their protection. Discussion in these workshops is on a not-for-attribution basis. It also makes no use of classified information. On this occasion, there were nearly 100 participants from a dozen countries.

  4. Peran Strategic Entrepreneurship dalam Membangun Sustainable Competitive Advantage

    Directory of Open Access Journals (Sweden)

    Agustinus Dedy Handrimurtjahjo

    2014-11-01

    Full Text Available Strategic entrepreneurship has emerged as a new concept for examining the convergence of entrepreneurship studies (opportunity-seeking behavior) and strategic management (advantage-seeking behavior). Studies in the area of strategic management have gradually exposed the relationship between strategic management and entrepreneurship: entrepreneurial strategy making; intrapreneurship; entrepreneurial strategic posture within organizations; entrepreneurial orientation; strategic management integration as a context for entrepreneurial actions; and entrepreneurship theory with strategic management and the resource-based view (RBV). A conceptual model of SE developed by Ireland et al. suggests that a firm which linearly and sequentially employs an entrepreneurial mindset to identify opportunities, manages resources strategically to tackle the opportunity, applies creativity and innovation, and generates a competitive advantage is performing a strategic entrepreneurship operation. Managers must maximize the pursuit of new business opportunities while simultaneously maximizing the generation and application of temporary competitive advantages to sustainably create organizational value. This paper develops a conceptual framework that demonstrates the role of strategic entrepreneurship in building sustainable competitive advantage.

  5. THE STRATEGIC DIAGNOSIS ANALYSIS - AN ESSENTIAL STAGE OF STRATEGIC MANAGEMENT PROCESS IN SMALL AND MEDIUM ENTERPRISES

    OpenAIRE

    Vladimir-Codrin IONESCU; Horea COROIU

    2010-01-01

    The strategic diagnosis analysis aims to assess the potential of small and medium enterprises by evaluating their inner resources and the business environment within which these enterprises perform their activity. As a first stage in the strategic management process, the strategic diagnosis analysis ensures the premises for founding, elaborating and operationalizing a competitive managerial strategy. In this context, the paper presents the conceptual criteria which are essential for thematica...

  6. How strategic dynamics complicate the framing of alternatives in strategic environmental assessment

    DEFF Research Database (Denmark)

    Lyhne, Ivar

    2012-01-01

    Unpredictable and complex developments challenge the application of strategic environmental assessment (SEA), e.g. in terms of timing, prediction, and relevance of assessments. Especially multi-actor and unstructured strategic-level decision-making processes often seem to be characterised by unpredictable and complex changes. Despite apparent implications, explorative investigations about how unpredictability influences SEA application in practice are rare. This article aims at shedding light on contextual changes and reactions to such changes in practice by a case study of the specific SEA process of the Danish Natural Gas Security of Supply Plan. Special emphasis is given to the framing of alternatives in the SEA process, since alternatives are directly related to the contextual developments. Based on a participative approach, strategic dynamics are mapped and the reactions and concerns in the SEA team ...

  7. Equipartitioning in linear accelerators

    International Nuclear Information System (INIS)

    Jameson, R.A.

    1982-01-01

    Emittance growth has long been a concern in linear accelerators, as has the idea that some kind of energy balance, or equipartitioning, between the degrees of freedom, would ameliorate the growth. M. Prome observed that the average transverse and longitudinal velocity spreads tend to equalize as current in the channel is increased, while the sum of the energy in the system stays nearly constant. However, only recently have we shown that an equipartitioning requirement on a bunched injected beam can indeed produce remarkably small emittance growth. The simple set of equations leading to this condition are outlined. At the same time, Hofmann has investigated collective instabilities in transported beams and has identified thresholds and regions in parameter space where instabilities occur. Evidence is presented that shows transport system boundaries to be quite accurate in computer simulations of accelerating systems. Discussed are preliminary results of efforts to design accelerators that avoid parameter regions where emittance is affected by the instabilities identified by Hofmann. These efforts suggest that other mechanisms are present. The complicated behavior of the RFQ linac in this framework also is shown

  8. Particle acceleration and injection problem in relativistic and nonrelativistic shocks

    International Nuclear Information System (INIS)

    Hoshino, M.

    2008-01-01

    Acceleration of charged particles at collisionless shocks is believed to be responsible for the production of cosmic rays in a variety of astrophysical objects such as supernovae, AGN jets, and GRBs, and the diffusive shock acceleration model is widely accepted as a key process for generating cosmic rays with a non-thermal, power-law energy spectrum. Yet it is not well understood how the collisionless shock can produce such high-energy particles. Among several unresolved issues, two major problems are the so-called "injection" problem of the supra-thermal particles and the generation of plasma waves and turbulence in and around the shock front. With the recent advance of computer simulations, however, it is now possible to discuss those issues together with the dynamical evolution of the kinetic shock structure. A wealth of modern astrophysical observations also informs the dynamical shock structure and acceleration processes, along with the theoretical and computational studies of shocks. In this presentation, we focus on the plasma wave generation and the associated particle energization that directly link to the injection problem, taking into account the kinetic plasma processes of both non-relativistic and relativistic shocks by using a particle-in-cell simulation. We will also discuss some new particle acceleration mechanisms such as stochastic surfing acceleration and wakefield acceleration by the action of nonlinear electrostatic fields. (author)

  9. NASA strategic plan

    Science.gov (United States)

    1994-01-01

    The NASA Strategic Plan is a living document. It provides far-reaching goals and objectives to create stability for NASA's efforts. The Plan presents NASA's top-level strategy: it articulates what NASA does and for whom; it differentiates between ends and means; it states where NASA is going and what NASA intends to do to get there. This Plan is not a budget document, nor does it present priorities for current or future programs. Rather, it establishes a framework for shaping NASA's activities and developing a balanced set of priorities across the Agency. Such priorities will then be reflected in the NASA budget. The document includes vision, mission, and goals; external environment; conceptual framework; strategic enterprises (Mission to Planet Earth, aeronautics, human exploration and development of space, scientific research, space technology, and synergy); strategic functions (transportation to space, space communications, human resources, and physical resources); values and operating principles; implementing strategy; and senior management team concurrence.

  10. Strategic Management Accounting Development during Last 30 Years

    OpenAIRE

    Šoljaková, Libuše

    2012-01-01

    This paper analyses some reasons why strategic management accounting has not been widely accepted. After an initial boom, strategic management accounting has stagnated in recent years. Application of strategic management accounting in practice does not extend beyond pilot case studies. Strategic management accounting lessons are not commonly included in educational programs. Finally, research on strategic management accounting has produced only limited results. The paper is based on a literature review and empirica...

  11. Leadership side in changing strategic creation of firms

    OpenAIRE

    Malinovska, Elizabeta

    2013-01-01

    The research in this master's thesis focuses on strategic leadership, or the role that strategic leadership plays in creating strategic change within companies. The particular matters this paper considers are the concepts of leadership and strategic leadership as found in countries with developed market economies and extensive knowledge and experience in management, and furthermore the concepts of strategic management in which leadership becomes a vital eleme...

  12. Crew and Thermal Systems Strategic Communications Initiatives in Support of NASA's Strategic Goals

    Science.gov (United States)

    Paul, Heather L.; Lamberth, Erika Guillory; Jennings, Mallory A.

    2012-01-01

    NASA has defined strategic goals to invest in next-generation technologies and innovations, inspire students to become the future leaders of space exploration, and expand partnerships with industry and academia around the world. The Crew and Thermal Systems Division (CTSD) at the NASA Johnson Space Center actively supports these NASA initiatives. In July 2011, CTSD created a strategic communications team to communicate CTSD capabilities, technologies, and personnel to external technical audiences for business development and collaborative initiatives, and to students, educators, and the general public for education and public outreach efforts. This paper summarizes the CTSD Strategic Communications efforts and metrics through the first half of fiscal year 2012 with projections for end of fiscal year data.

  13. Crew and Thermal Systems Strategic Communications Initiatives in Support of NASA's Strategic Goals

    Science.gov (United States)

    Paul, Heather L.

    2012-01-01

    NASA has defined strategic goals to invest in next-generation technologies and innovations, to inspire students to become the future leaders of space exploration, and to expand partnerships with industry and academia around the world. The Crew and Thermal Systems Division (CTSD) at the NASA Johnson Space Center actively supports these NASA initiatives. In July 2011, CTSD created a strategic communications team to communicate CTSD capabilities, technologies, and personnel to internal NASA and external technical audiences for business development and collaborative initiatives, and to students, educators, and the general public for education and public outreach efforts. This paper summarizes the CTSD Strategic Communications efforts and metrics through the first nine months of fiscal year 2012.

  14. Extending Ansoff’s Strategic Diagnosis Model

    OpenAIRE

    Daniel Kipley; Alfred O. Lewis; Jau-Lian Jeng

    2012-01-01

    Given the complex and disruptive open-ended dynamics in the current dynamic global environment, senior management recognizes the need for a formalized, consistent, and comprehensive framework to analyze the firm’s strategic posture. Modern assessment tools, such as H. Igor Ansoff’s seminal contributions to strategic diagnosis, primarily focused on identifying and enhancing the firm’s strategic performance potential thr...

  15. Accelerator simulation and theoretical modelling of radiation effects (SMoRE)

    CERN Document Server

    2018-01-01

    This publication summarizes the findings and conclusions of the IAEA coordinated research project (CRP) on accelerator simulation and theoretical modelling of radiation effects, aimed at supporting Member States in the development of advanced radiation-resistant structural materials for implementation in innovative nuclear systems. This aim can be achieved by enhancing the experimental neutron-emulation capabilities of ion accelerators and by improving the predictive efficiency of theoretical models and computer codes. This dual approach is challenging but necessary, because the outputs of accelerator simulation experiments need adequate theoretical interpretation, and theoretical models and codes need high-dose experimental data for their verification. Both ion irradiation investigations and computer modelling were the specific subjects of the CRP, and the results of these studies are presented in this publication, which also includes state-of-the-art reviews of four major aspects of the project...

  16. Computer automation of an accelerator mass spectrometry system

    International Nuclear Information System (INIS)

    Gressett, J.D.; Maxson, D.L.; Matteson, S.; McDaniel, F.D.; Duggan, J.L.; Mackey, H.J.; North Texas State Univ., Denton, TX; Anthony, J.M.

    1989-01-01

    The determination of trace impurities in electronic materials using accelerator mass spectrometry (AMS) requires efficient automation of the beam transport and mass discrimination hardware. The ability to choose between a variety of charge states, isotopes and injected molecules is necessary to provide survey capabilities similar to that available on conventional mass spectrometers. This paper will discuss automation hardware and software for flexible, high-sensitivity trace analysis of electronic materials, e.g. Si, GaAs and HgCdTe. Details regarding settling times will be presented, along with proof-of-principle experimental data. Potential and present applications will also be discussed. (orig.)

  17. CudaPre3D: An Alternative Preprocessing Algorithm for Accelerating 3D Convex Hull Computation on the GPU

    Directory of Open Access Journals (Sweden)

    MEI, G.

    2015-05-01

    Full Text Available In computing convex hulls for point sets, a preprocessing step that filters the input by discarding non-extreme points is commonly used to improve computational efficiency. We previously proposed a straightforward preprocessing approach for accelerating 2D convex hull computation on the GPU. In this paper, we extend that algorithm to 3D cases. The basic ideas behind the two preprocessing algorithms are similar: first, several groups of extreme points are found from the original set of input points and several rotated versions of the input set; then, a convex polyhedron is created from the found extreme points; and finally, the interior points lying inside the formed convex polyhedron are discarded. Experimental results show that the proposed preprocessing algorithm achieves speedups of about 4x on average, and 5x to 6x in the best cases, over the cases where it is not used. In addition, more than 95 percent of the input points can be discarded in most experimental tests.
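The three-step filter described above (find extreme points in several directions, form a convex polytope from them, discard everything inside) can be illustrated with a minimal 2D sketch, the form the paper extends to 3D. This is an illustrative CPU version, not the authors' GPU code; the function name and the choice of eight directions are assumptions:

```python
import math

def preprocess_discard_interior(points, n_dirs=8):
    """2D analogue of the preprocessing: find extreme points in n_dirs
    directions, form a convex polygon from them, and discard every input
    point strictly inside that polygon (it cannot lie on the hull)."""
    dirs = [(math.cos(2 * math.pi * k / n_dirs),
             math.sin(2 * math.pi * k / n_dirs)) for k in range(n_dirs)]
    # one extreme point per direction (the farthest point along it)
    extremes = {max(points, key=lambda p: p[0] * dx + p[1] * dy)
                for dx, dy in dirs}
    # order the extreme points counter-clockwise around their centroid
    cx = sum(p[0] for p in extremes) / len(extremes)
    cy = sum(p[1] for p in extremes) / len(extremes)
    poly = sorted(extremes, key=lambda p: math.atan2(p[1] - cy, p[0] - cx))

    def strictly_inside(p):
        for i in range(len(poly)):
            a, b = poly[i], poly[(i + 1) % len(poly)]
            cross = (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])
            if cross <= 0:  # on or outside this edge of the CCW polygon
                return False
        return True

    return [p for p in points if not strictly_inside(p)]
```

On a square of corner points plus interior points, only the corners survive the filter; a full convex-hull routine then runs on the much smaller remainder.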

  18. Using electronic patient records to inform strategic decision making in primary care.

    Science.gov (United States)

    Mitchell, Elizabeth; Sullivan, Frank; Watt, Graham; Grimshaw, Jeremy M; Donnan, Peter T

    2004-01-01

    Although absolute risk of death associated with raised blood pressure increases with age, the benefits of treatment are greater in elderly patients. Despite this, the 'rule of halves' particularly applies to this group. We conducted a randomised controlled trial to evaluate different levels of feedback designed to improve identification, treatment and control of elderly hypertensives. Fifty-two general practices were randomly allocated to either: Control (n=19), Audit only feedback (n=16) or Audit plus Strategic feedback, prioritising patients by absolute risk (n=17). Feedback was based on electronic data, annually extracted from practice computer systems. Data were collected for 265,572 patients, 30,345 aged 65-79. The proportion of known hypertensives in each group with BP recorded increased over the study period and the numbers of untreated and uncontrolled patients reduced. There was a significant difference in mean systolic pressure between the Audit plus Strategic and Audit only groups and significantly greater control in the Audit plus Strategic group. Providing patient-specific practice feedback can impact on identification and management of hypertension in the elderly and produce a significant increase in control.

  19. Strategic Planning in Irish Quantity Surveying Practices

    OpenAIRE

    Murphy, Roisin

    2011-01-01

    The role and usefulness of strategic planning have been well documented over several decades of strategic management research. Despite the significant body of existing knowledge in the field of strategic planning, there remains a paucity of investigation into the construction sector, specifically into the Professional Service Firms (PSFs) operating within it. The aim of this research was to ascertain the type, scope and extent of strategic planning within Irish Quantity Surveying (QS) practices and...

  20. STRATEGIC MANAGEMENT OF A TERRITORIAL DISTRIBUTED COMPLEX

    OpenAIRE

    Vidovskiy L. A.; Yanaeva M. V.; Murlin A. G.; Murlinа V. A.

    2015-01-01

    The article is devoted to strategic management and the implementation of strategy. Management strategy is based on managing the strategic potential of the enterprise. The strategic potential of the company is generated only by those resources that can be changed as a result of strategic decisions. Analysis of the potential of the enterprise should cover almost all spheres of its activity: enterprise management, production, marketing, finance, and human resources. The article has designed a system ...

  1. Strategic entrepreneurship in the future: Theoretical views

    Directory of Open Access Journals (Sweden)

    Martinović Milan

    2016-01-01

    Full Text Available The paper describes the role and importance of strategy in contemporary entrepreneurship, which operates under conditions of rapid change. Strategic entrepreneurship comprises a set of entrepreneurial actions undertaken from a strategic perspective. The paper describes the key characteristics of modern entrepreneurs, emphasizing the need for strategic analysis of the business opportunities and capabilities of enterprises under the specific conditions at the beginning of the third millennium.

  2. Services of strategic consulting: special features and types

    OpenAIRE

    Klenin Oleh Volodymyrovych

    2016-01-01

    In the article the essence of the terms “consulting” and “strategic consulting” was studied. It was argued that strategic consulting must be analyzed as a professional activity within the system of enterprise strategic management. The specific features of consulting services and their composition were examined from the point of view of strategic management and innovative development. In the process of providing consulting services it was suggested to take into account the strategic orientation of huma...

  3. Strategic environmental assessment

    DEFF Research Database (Denmark)

    Kørnøv, Lone

    1997-01-01

    The integration of environmental considerations into strategic decision making is recognized as a key to achieving sustainability. In the European Union a draft directive on Strategic Environmental Assessment (SEA) is currently being reviewed by the member states. The nature of the proposed SEA...... that the SEA directive will influence the decision-making process positively and will help to promote improved environmental decisions. However, the guidelines for public participation are not sufficient and the democratic element is strongly limited. On the basis of these findings, recommendations relating...

  4. Estimating strategic interactions in petroleum exploration

    International Nuclear Information System (INIS)

    Lin, C.-Y. Cynthia

    2009-01-01

    When individual petroleum-producing firms make their exploration decisions, information externalities and extraction externalities may lead them to interact strategically with their neighbors. If they do occur, strategic interactions in exploration would lead to a loss in both firm profit and government royalty revenue. Since these strategic interactions would be inefficient, changes in the government offshore leasing policy would need to be considered. The possibility of strategic interactions thus poses a concern to policy-makers and affects the optimal government policy. This paper examines whether these inefficient strategic interactions take place in U.S. federal lands in the Gulf of Mexico. In particular, it analyzes whether a firm's exploration decisions depend on the decisions of firms owning neighboring tracts of land. A discrete response model of a firm's exploration timing decision that uses variables based on the timing of a neighbor's lease term as instruments for the neighbor's decision is employed. The results suggest that strategic interactions do not actually take place, at least not in exploration, and therefore that the current parameters of the government offshore leasing policy do not lead to inefficient petroleum exploration. (author)

  5. GPU accelerated manifold correction method for spinning compact binaries

    Science.gov (United States)

    Ran, Chong-xi; Liu, Song; Zhong, Shuang-ying

    2018-04-01

    The graphics processing unit (GPU) acceleration of a manifold correction algorithm, based on the compute unified device architecture (CUDA), is designed to simulate the dynamical evolution of the post-Newtonian (PN) Hamiltonian formulation of spinning compact binaries. The feasibility and efficiency of parallel computation on the GPU have been confirmed by various numerical experiments. The numerical comparisons show that the manifold correction method executed on the GPU agrees well in accuracy with the same codes executed on the central processing unit (CPU) alone. The acceleration achieved on the GPU can be increased enormously through shared-memory and register optimization techniques without additional hardware cost: the speedup is nearly 13 times that of the CPU codes for a phase-space scan (including 314 × 314 orbits). In addition, the GPU-accelerated manifold correction method is used to numerically study how the dynamics are affected by the spin-induced quadrupole-monopole interaction for black hole binary systems.
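The core idea of a manifold correction is to project the numerically integrated state back onto an invariant manifold, here the energy surface. A minimal sketch for a Newtonian two-body orbit, using a velocity-rescaling correction after each leapfrog step; this is an assumed simplification for illustration, not the paper's post-Newtonian scheme or its CUDA code:

```python
import math

def accel(x, y, mu=1.0):
    """Newtonian two-body acceleration for gravitational parameter mu."""
    r3 = (x * x + y * y) ** 1.5
    return -mu * x / r3, -mu * y / r3

def energy(x, y, vx, vy, mu=1.0):
    """Specific orbital energy: kinetic plus potential."""
    return 0.5 * (vx * vx + vy * vy) - mu / math.hypot(x, y)

def step_with_correction(state, h, E0, mu=1.0):
    """One leapfrog step followed by a velocity-rescaling manifold
    correction that projects the state back onto the energy surface E0."""
    x, y, vx, vy = state
    ax, ay = accel(x, y, mu)
    vx += 0.5 * h * ax; vy += 0.5 * h * ay
    x += h * vx; y += h * vy
    ax, ay = accel(x, y, mu)
    vx += 0.5 * h * ax; vy += 0.5 * h * ay
    # correction: rescale the speed so the energy equals E0 exactly
    v2_target = 2.0 * (E0 + mu / math.hypot(x, y))
    if v2_target > 0:
        s = math.sqrt(v2_target / (vx * vx + vy * vy))
        vx *= s; vy *= s
    return (x, y, vx, vy)
```

After each step the energy is restored exactly to E0, so secular energy drift, the usual symptom of long integrations, is removed by construction.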

  6. Solving radiation problems at particle accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Nikolai V. Mokhov

    2001-12-11

    At high-intensity high-energy particle accelerators, consequences of a beam-induced radiation impact on machine and detector components, people, environment and complex performance can range from negligible to severe. The specifics, general approach and tools used at such machines for radiation analysis are described. In particular, the world leader Fermilab accelerator complex is considered, with its fixed target and collider experiments, as well as new challenging projects such as LHC, VLHC, muon collider and neutrino factory. The emphasis is on mitigation of deleterious beam-induced radiation effects and on the key role of effective computer simulations.

  7. Solving radiation problems at particle accelerators

    International Nuclear Information System (INIS)

    Mokhov, N.V.

    2001-01-01

    At high-intensity high-energy particle accelerators, consequences of a beam-induced radiation impact on machine and detector components, people, environment and complex performance can range from negligible to severe. The specifics, general approach and tools used at such machines for radiation analysis are described. In particular, the world leader Fermilab accelerator complex is considered, with its fixed target and collider experiments, as well as new challenging projects such as LHC, VLHC, muon collider and neutrino factory. The emphasis is on mitigation of deleterious beam-induced radiation effects and on the key role of effective computer simulations

  8. Cloud computing approaches to accelerate drug discovery value chain.

    Science.gov (United States)

    Garg, Vibhav; Arora, Suchir; Gupta, Chitra

    2011-12-01

    Continued advancements in technology have helped high throughput screening (HTS) evolve from a linear to a parallel approach by performing system-level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data of the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze this data to identify informational tags. This need in turn challenges computer scientists to offer matching hardware and software infrastructure while managing the varying degrees of computational power required. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SAAS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as Cloud Computing, is now transforming drug discovery research. Integration of cloud computing with parallel computing is also expanding its footprint in the life sciences community. Speed, efficiency and cost effectiveness have made cloud computing a 'good to have' tool for researchers, providing them significant flexibility and allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, the Discovery-Cloud would be best placed to manage drug discovery and clinical development data generated using advanced HTS techniques, supporting the vision of personalized medicine.

  9. 2015 Enterprise Strategic Vision

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-08-01

    This document aligns with the Department of Energy Strategic Plan for 2014-2018 and provides a framework for integrating our missions and direction for pursuing DOE’s strategic goals. The vision is a guide to advancing world-class science and engineering, supporting our people, modernizing our infrastructure, and developing a management culture that operates a safe and secure enterprise in an efficient manner.

  10. Department of Accelerator Physics and Technology: Overview

    International Nuclear Information System (INIS)

    Plawski, E.

    2003-01-01

    Full text: The main activities of the Accelerator Physics and Technology Department were focused on the following subjects: - contribution to the development and construction of a new therapeutic electron accelerator delivering photon beams of 6 and 15 MeV, - study of the photon and electron spectra of narrow photon beams using the BEAM/EGSnrc codes, - design and construction of special RF structures for use in the CLIC Test Facility at CERN, - design and construction of 1:1 copper, room-temperature models of superconducting 1.3 GHz accelerating structures for the TESLA Project at DESY. In spite of a drastic reduction of scientific and technical staff (from 16 to 10 persons), the planned work was successfully completed, though it required extraordinary effort. In the realisation of the 6/15 MeV accelerator project, the Department was responsible throughout for the calculation of the most important parts (electron gun, accelerating structure, beam focusing, achromatic deviation) and also for the construction and physical modelling of some strategic subassemblies. The scientific and technical achievements of our Department in this work are documented in the Annex to the Final Report on the realisation of KBN Scientific Project No. PBZ 009-13 and in the earlier Annual Reports 2000 and 2001. The Monte Carlo calculations of narrow photon beams, with experimental verification using Varian Clinac 2003CD, Siemens Mevatron and CGR MeV Saturn accelerators, culminated in a PhD thesis prepared by MSc Anna Wysocka. Her thesis, 'Collimation and Dosimetry of X-ray Beams for Stereotactic Radiotherapy with Linear Accelerators,' was sponsored by KBN Scientific Project No. T11E 04121. In collaboration with LNF INFN Frascati, electron beam deflectors were designed for the CERN CLIC Test Facility CTF3. These special travelling-wave RF structures were built by our Department and are currently operated in the CTF3 experiment.
As a result of collaboration with the TESLA-FEL Project at DESY, the set of RF

  11. The Science of Strategic Communication

    Science.gov (United States)

    The field of Strategic Communication involves a focused effort to identify, develop, and present multiple types of communication media on a given subject. A Strategic Communication program recognizes the limitations of the most common communication models (primarily “one s...

  12. Strategic Partnerships in International Development

    Science.gov (United States)

    Treat, Tod; Hartenstine, Mary Beth

    2013-01-01

    This chapter provides a framework and recommendations for development of strategic partnerships in a variety of cultural contexts. Additionally, this study elucidates barriers and possibilities in interagency collaborations. Without careful consideration regarding strategic partnerships' approaches, functions, and goals, the ability to…

  13. Computing on Knights and Kepler Architectures

    International Nuclear Information System (INIS)

    Bortolotti, G; Caberletti, M; Ferraro, A; Giacomini, F; Manzali, M; Maron, G; Salomoni, D; Crimi, G; Zanella, M

    2014-01-01

    A recent trend in scientific computing is the increasingly important role of co-processors, originally built to accelerate graphics rendering and now used for general high-performance computing. The INFN Computing On Knights and Kepler Architectures (COKA) project focuses on assessing the suitability of co-processor boards for scientific computing in a wide range of physics applications, and on studying the best programming methodologies for these systems. Here we present a comparative account of our results in porting a Lattice Boltzmann code to two state-of-the-art accelerators: the NVIDIA K20X and the Intel Xeon Phi. We describe our implementations, analyze the results and compare them with a baseline architecture using Intel Sandy Bridge CPUs.

  14. Hospital boards and hospital strategic focus: the impact of board involvement in strategic decision making.

    Science.gov (United States)

    Ford-Eickhoff, Karen; Plowman, Donde Ashmos; McDaniel, Reuben R

    2011-01-01

    Despite pressures to change the role of hospital boards, hospitals have made few changes in board composition or director selection criteria. Hospital boards have often continued to operate in their traditional roles as either "monitors" or "advisors." More attention to the direct involvement of hospital boards in the strategic decision-making process of the organizations they serve, the timing and circumstances under which board involvement occurs, and the board composition that enhances their abilities to participate fully is needed. We investigated the relationship between broader expertise among hospital board members, board involvement in the stages of strategic decision making, and the hospital's strategic focus. We surveyed top management team members of 72 nonacademic hospitals to explore the participation of critical stakeholder groups such as the board of directors in the strategic decision-making process. We used hierarchical regression analysis to explore our hypotheses that there is a relationship between both the nature and involvement of the board and the hospital's strategic orientation. Hospitals with broader expertise on their boards reported an external focus. For some of their externally-oriented goals, hospitals also reported that their boards were involved earlier in the stages of decision making. In light of the complex and dynamic environment of hospitals today, those charged with developing hospital boards should match the variety in the external issues that the hospital faces with more variety in board makeup. By developing a board with greater breadth of expertise, the hospital responds to its complex environment by absorbing that complexity, enabling a greater potential for sensemaking and learning. Rather than acting only as monitors and advisors, boards impact their hospitals' strategic focus through their participation in the strategic decision-making process.

  15. Conjectural variation based learning model of strategic bidding in spot market

    International Nuclear Information System (INIS)

    Yiqun Song; Yixin Ni; Fushuan Wen; Wu, F.F.

    2004-01-01

    An actual electricity market operates repeatedly, on an hourly or half-hourly basis, so each firm can learn or estimate its competitors' strategic behavior from available historical market operation data while rationally pursuing maximum profit in the repeated biddings. A conjectural-variation-based learning method is proposed in this paper for a generation firm to improve its strategic bidding performance. In the method, each firm learns and dynamically adjusts its conjecture about its rivals' reactions to its bidding, according to information published in the electricity market, and then makes its optimal generation decision based on the updated conjectural variation of its rivals. Through this learning process, the equilibrium reached in the market is proven to be a Nash equilibrium. The motivation of a generation firm to learn in the changing market environment and the consequences of learning behavior in the market are also discussed through computer tests. (author)
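The flavor of conjectural-variation bidding can be illustrated with a stylized linear-demand duopoly in quantities. Everything here (inverse demand P = a - bQ, constant marginal cost c, a fixed conjecture theta) is an assumption for illustration, not the paper's market model:

```python
def best_response(q_rival, theta=0.0, a=10.0, b=1.0, c=1.0):
    """Profit-maximizing quantity under inverse demand P = a - b*Q when the
    firm conjectures dQ_rival/dq = theta (theta = 0 is the Cournot conjecture).
    Derived from the first-order condition a - b*(q + q_rival) - b*q*(1+theta) = c."""
    return max(0.0, (a - c - b * q_rival) / (b * (2.0 + theta)))

def simulate(n_rounds=60, theta=0.0):
    """Repeated simultaneous-move bidding: each firm best-responds to the
    rival's last observed quantity, mimicking learning from market history."""
    q1 = q2 = 0.0
    for _ in range(n_rounds):
        q1, q2 = best_response(q2, theta), best_response(q1, theta)
    return q1, q2
```

With the Cournot conjecture theta = 0, the repeated best responses converge to the Cournot-Nash quantity (a - c)/(3b); a learned nonzero theta shifts the equilibrium output accordingly.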

  16. 77 FR 25706 - Notice of Advisory Committee Closed Meeting; U.S. Strategic Command Strategic Advisory Group

    Science.gov (United States)

    2012-05-01

    ... DEPARTMENT OF DEFENSE Notice of Advisory Committee Closed Meeting; U.S. Strategic Command Strategic Advisory Group AGENCY: Department of Defense. ACTION: Notice of Advisory Committee closed meeting....

  17. Radiation shielding technology development for proton linear accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Ouk; Lee, Y. O.; Cho, Y. S. [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of); Kim, M. H.; Sin, M. W.; Park, B. I. [Kyunghee Univ., Seoul (Korea, Republic of)] [and others

    2005-09-01

    This report was presented as an output of the 2-year first-phase Proton Engineering Frontier Project (PEFP) task on 'Radiation Shielding Technology Development for Proton Linear Accelerator' for the 20/100 MeV accelerator beam line and facility. It describes the general design concept, the provision and updating of basic design data, and the establishment of a computer code system. It also includes the results of conceptual and preliminary designs of the beam line, beam dump and beam facilities, as well as an analysis of air activation inside the accelerator equipment. This report will guide the detailed shielding design and the production of the radiation safety analysis report scheduled for the second-phase project.

  18. Accurate and efficient spin integration for particle accelerators

    Directory of Open Access Journals (Sweden)

    Dan T. Abell

    2015-02-01

    Full Text Available Accurate spin tracking is a valuable tool for understanding spin dynamics in particle accelerators and can help improve the performance of an accelerator. In this paper, we present a detailed discussion of the integrators in the spin tracking code gpuSpinTrack. We have implemented orbital integrators based on drift-kick, bend-kick, and matrix-kick splits. On top of the orbital integrators, we have implemented various integrators for the spin motion. These integrators use quaternions and Romberg quadratures to accelerate both the computation and the convergence of spin rotations. We evaluate their performance and accuracy in quantitative detail for individual elements as well as for the entire RHIC lattice. We exploit the inherently data-parallel nature of spin tracking to accelerate our algorithms on graphics processing units.
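A quaternion encodes a spin rotation about a unit axis by a given angle in four numbers, and successive rotations compose by quaternion multiplication, which is the representational trick behind the speedups described above. A minimal sketch with assumed names; this is not the gpuSpinTrack implementation, and it omits the Romberg-quadrature machinery:

```python
import math

def quat_mul(a, b):
    """Hamilton product of quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw * bw - ax * bx - ay * by - az * bz,
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw)

def rotation_quat(axis, angle):
    """Unit quaternion for a rotation by `angle` about `axis`."""
    n = math.sqrt(sum(c * c for c in axis))
    s = math.sin(angle / 2.0) / n
    return (math.cos(angle / 2.0), axis[0] * s, axis[1] * s, axis[2] * s)

def rotate_spin(spin, q):
    """Rotate the spin vector: s' = q (0, s) q*, reading off the vector part."""
    qc = (q[0], -q[1], -q[2], -q[3])
    _, x, y, z = quat_mul(quat_mul(q, (0.0,) + tuple(spin)), qc)
    return (x, y, z)
```

Composing many per-element rotations then reduces to chained quat_mul calls, and the spin vector itself is rotated only once at the end of the chain.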

  19. Accelerated spike resampling for accurate multiple testing controls.

    Science.gov (United States)

    Harrison, Matthew T

    2013-02-01

    Controlling for multiple hypothesis tests using standard spike resampling techniques often requires prohibitive amounts of computation. Importance sampling techniques can be used to accelerate the computation. The general theory is presented, along with specific examples for testing differences across conditions using permutation tests and for testing pairwise synchrony and precise lagged-correlation between many simultaneously recorded spike trains using interval jitter.
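For orientation, here is the plain Monte Carlo permutation test that such importance-sampling schemes accelerate; the acceleration itself (tilting the permutation distribution toward the tail and reweighting the counts) is beyond this sketch. The function name and the difference-in-means statistic are assumptions:

```python
import random

def perm_test_pvalue(x, y, n_perm=999, seed=0):
    """Monte Carlo permutation p-value for the absolute difference in means.
    Importance sampling (per the paper) accelerates exactly this kind of
    computation when the true p-value is tiny; here we use plain resampling."""
    rng = random.Random(seed)
    pooled = list(x) + list(y)
    n = len(x)

    def stat(a, b):
        return abs(sum(a) / len(a) - sum(b) / len(b))

    observed = stat(x, y)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # random relabeling of the pooled observations
        if stat(pooled[:n], pooled[n:]) >= observed:
            extreme += 1
    return (extreme + 1) / (n_perm + 1)
```

The `+1` terms give the standard valid Monte Carlo p-value; when the true p-value is far below 1/n_perm, plain resampling needs prohibitively many permutations, which is exactly the regime importance sampling targets.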

  20. Correlated histogram representation of Monte Carlo derived medical accelerator photon-output phase space

    Science.gov (United States)

    Schach Von Wittenau, Alexis E.

    2003-01-01

    A method is provided to represent the calculated phase space of photons emanating from medical accelerators used in photon teletherapy. The method reproduces the energy distributions and trajectories of the photons originating in the bremsstrahlung target and of photons scattered by components within the accelerator head. The method reproduces the energy and directional information from sources up to several centimeters in radial extent, so it is expected to generalize well to accelerators made by different manufacturers. The method is computationally both fast and efficient, with an overall sampling efficiency of 80% or higher for most field sizes. The computational cost is independent of the number of beams used in the treatment plan.
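The correlated-histogram idea can be sketched as two-stage sampling: draw an energy bin from the marginal histogram, then draw the direction bin from that energy bin's conditional histogram, so energy-direction correlations are preserved. The bin structure and names below are assumptions for illustration, not the paper's binning:

```python
import bisect
import random

def make_cdf(weights):
    """Normalized cumulative distribution over histogram bins."""
    total = float(sum(weights))
    acc, cdf = 0.0, []
    for w in weights:
        acc += w
        cdf.append(acc / total)
    return cdf

def sample_correlated(joint, rng):
    """joint[i][j]: counts for (energy bin i, angle bin j).
    Draw the energy bin from the marginal histogram, then the angle bin
    from the conditional histogram of that energy bin.
    (A real implementation would precompute the CDFs once.)"""
    marginal = make_cdf([sum(row) for row in joint])
    i = min(bisect.bisect_left(marginal, rng.random()), len(marginal) - 1)
    cond = make_cdf(joint[i])
    j = min(bisect.bisect_left(cond, rng.random()), len(cond) - 1)
    return i, j
```

A real phase-space source would add radial-position bins and convert bin indices back to physical energies and angles; the table lookup is what keeps the per-photon sampling cost constant.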