WorldWideScience

Sample records for accelerated strategic computing

  1. Delivering Insight The History of the Accelerated Strategic Computing Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Larzelere II, A R

    2007-01-03

    The history of the Accelerated Strategic Computing Initiative (ASCI) tells of the development of computational simulation into a third fundamental piece of the scientific method, on a par with theory and experiment. ASCI did not invent the idea, nor was it alone in bringing it to fruition. But ASCI provided the wherewithal - hardware, software, environment, funding, and, most of all, the urgency - that made it happen. On October 1, 2005, the Initiative completed its tenth year of funding. The advances made by ASCI over its first decade are truly incredible. Lawrence Livermore, Los Alamos, and Sandia National Laboratories, along with leadership provided by the Department of Energy's Defense Programs Headquarters, fundamentally changed computational simulation and how it is used to enable scientific insight. To do this, astounding advances were made in simulation applications, computing platforms, and user environments. ASCI dramatically changed existing - and forged new - relationships, both among the Laboratories and with outside partners. By its tenth anniversary, despite daunting challenges, ASCI had accomplished all of the major goals set at its beginning. The history of ASCI is about the vision, leadership, endurance, and partnerships that made these advances possible.

  2. The DOE Accelerated Strategic Computing Initiative: Challenges and opportunities for predictive materials simulation capabilities

    Science.gov (United States)

    Mailhiot, Christian

    1998-05-01

    In response to the unprecedented national security challenges emerging from the end of nuclear testing, the Defense Programs of the Department of Energy has developed a long-term strategic plan based on a vigorous Science-Based Stockpile Stewardship (SBSS) program. The main objective of the SBSS program is to ensure confidence in the performance, safety, and reliability of the stockpile on the basis of a fundamental science-based approach. A central element of this approach is the development of predictive, ‘full-physics’, full-scale computer simulation tools. As a critical component of the SBSS program, the Accelerated Strategic Computing Initiative (ASCI) was established to provide the required advances in computer platforms and to enable predictive, physics-based simulation capabilities. In order to achieve the ASCI goals, fundamental problems in the fields of computer and physical sciences of great significance to the entire scientific community must be successfully solved. Foremost among the key elements needed to develop predictive simulation capabilities, the development of improved physics-based materials models is a cornerstone. We indicate some of the materials theory, modeling, and simulation challenges and illustrate how the ASCI program will enable both the hardware and the software tools necessary to advance the state-of-the-art in the field of computational condensed matter and materials physics.

  3. National Strategic Computing Initiative Strategic Plan

    Science.gov (United States)

    2016-07-01

explore and accelerate new paths for future computing architectures and technologies, including digital computing and alternative computing ... technologies that will move digital computing performance past the theoretical limits of complementary metal-oxide semiconductors, and (2) the R&D of ... select individual technology paths for alternative computing paradigms or alternative digital computing technologies, so the NSCI will initially focus on

  4. Accelerating Strategic Change Through Action Learning

    DEFF Research Database (Denmark)

    Younger, Jon; Sørensen, René; Cleemann, Christine;

    2013-01-01

    Purpose – The purpose of this paper is to describe how a leading global company used action-learning based leadership development to accelerate strategic culture change. Design/methodology/approach – It describes the need for change, and the methodology and approach by which the initiative, Impac...

  5. Applications of the Strategic Defense Initiative's compact accelerators

    Science.gov (United States)

    Montanarelli, Nick; Lynch, Ted

    1991-12-01

    The Strategic Defense Initiative's (SDI) investment in particle accelerator technology for its directed energy weapons program has produced breakthroughs in the size and power of new accelerators. These accelerators, in turn, have produced spinoffs in several areas: the radio frequency quadrupole linear accelerator (RFQ linac) was recently incorporated into the design of a cancer therapy unit at the Loma Linda University Medical Center, an SDI-sponsored compact induction linear accelerator may replace Cobalt-60 radiation and hazardous ethylene-oxide as a method for sterilizing medical products, and other SDIO-funded accelerators may be used to produce the radioactive isotopes oxygen-15, nitrogen-13, carbon-11, and fluorine-18 for positron emission tomography (PET). Other applications of these accelerators include bomb detection, non-destructive inspection, decomposing toxic substances in contaminated ground water, and eliminating nuclear waste.

  6. Computational Biology: A Strategic Initiative LDRD

    Energy Technology Data Exchange (ETDEWEB)

    Barksy, D; Colvin, M

    2002-02-07

The goal of this Strategic Initiative LDRD project was to establish at LLNL a new core capability in computational biology, combining laboratory strengths in high performance computing, molecular biology, and computational chemistry and physics. As described in this report, this project has been very successful in achieving this goal. This success is demonstrated by the large number of refereed publications, invited talks, and follow-on research grants that have resulted from this project. Additionally, this project has helped build connections to internal and external collaborators and funding agencies that will be critical to the long-term vitality of LLNL programs in computational biology. Most importantly, this project has helped establish on-going research groups in the Biology and Biotechnology Research Program, the Physics and Applied Technology Directorate, and the Computation Directorate. These groups include three laboratory staff members originally hired as post-doctoral researchers for this strategic initiative.

  7. Accelerating Clean Energy Commercialization. A Strategic Partnership Approach

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Richard [National Renewable Energy Lab. (NREL), Golden, CO (United States); Pless, Jacquelyn [Joint Institute for Strategic Energy Analysis, Golden, CO (United States); Arent, Douglas J. [Joint Institute for Strategic Energy Analysis, Golden, CO (United States); Locklin, Ken [Impax Asset Management Group (United Kingdom)

    2016-04-01

    Technology development in the clean energy and broader clean tech space has proven to be challenging. Long-standing methods for advancing clean energy technologies from science to commercialization are best known for relatively slow, linear progression through research and development, demonstration, and deployment (RDD&D); and characterized by well-known valleys of death for financing. Investment returns expected by traditional venture capital investors have been difficult to achieve, particularly for hardware-centric innovations, and companies that are subject to project finance risks. Commercialization support from incubators and accelerators has helped address these challenges by offering more support services to start-ups; however, more effort is needed to fulfill the desired clean energy future. The emergence of new strategic investors and partners in recent years has opened up innovative opportunities for clean tech entrepreneurs, and novel commercialization models are emerging that involve new alliances among clean energy companies, RDD&D, support systems, and strategic customers. For instance, Wells Fargo and Company (WFC) and the National Renewable Energy Laboratory (NREL) have launched a new technology incubator that supports faster commercialization through a focus on technology development. The incubator combines strategic financing, technology and technical assistance, strategic customer site validation, and ongoing financial support.

  8. Snowmass 2013 Computing Frontier: Accelerator Science

    CERN Document Server

    Spentzouris, P; Joshi, C; Amundson, J; An, W; Bruhwiler, D L; Cary, J R; Cowan, B; Decyk, V K; Esarey, E; Fonseca, R A; Friedman, A; Geddes, C G R; Grote, D P; Kourbanis, I; Leemans, W P; Lu, W; Mori, W B; Ng, C; Qiang, Ji; Roberts, T; Ryne, R D; Schroeder, C B; Silva, L O; Tsung, F S; Vay, J -L; Vieira, J

    2013-01-01

    This is the working summary of the Accelerator Science working group of the Computing Frontier of the Snowmass meeting 2013. It summarizes the computing requirements to support accelerator technology in both Energy and Intensity Frontiers.

  9. JGI Computing 5-Year Strategic Plan

    Energy Technology Data Exchange (ETDEWEB)

    Bader, D A; Brettin, T S; Cottingham, R W; Folta, P A; Golder, Y; Gregurick, S K; Himmel, M E; Mann, R C; Remington, K A; Slezak, T R

    2008-10-01

A broad range of scientific goals and a similarly diverse set of consumers drive the informatics requirements and computing needs of the JGI. The scope of work in this area encompasses not only the informatics and analysis pipelines in support of the PGF sequence production, but also the integration of data from a variety of sources and sophisticated large scale analyses led by investigators within JGI and driven by the user science community. In laying out a forward-looking strategy, the full range of these activities needs to be examined together to build a comprehensive program that will serve as a catalyst for the DOE research community. The science landscape envisioned in the overall strategic plan calls for significantly increasing the throughput of microbial genomes sequenced to cover their phylogenetic space and building a set of finished reference plant genomes to enable DOE relevant science. Additionally, the established impact of microbial communities on global energy cycles and their potential in remediation endeavors warrant building upon JGI's established expertise in metagenomic analysis. Not only is each of these program areas relevant and exciting in its own right, but they also can and should be undertaken in a way that allows synthesis across domains (e.g. utilize knowledge from the sequence of plants and the soil from which they are grown). Both dramatic increases in the scale of genomic data collection and the synergistic potential of integrating data across domains will demand new strategies in the informatics pipeline within the JGI and in the facility's approach to computational analysis and user access to the data in aggregated form. In addition to a robust and scalable informatics infrastructure, fulfilling the strategic science goals of the JGI will require ongoing investment in usability of the data, to ensure that the data collected will be used to maximal effect. It must be recognized that 'usability' will have a

  10. Accelerating Climate Simulations Through Hybrid Computing

    Science.gov (United States)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPU) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for the connection, we identified two challenges: (1) an identical MPI implementation is required in both systems; and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors, two IBM QS22 Cell blades, connected with InfiniBand), allowing compute-intensive functions to be seamlessly offloaded to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approx. 10% network overhead.

  11. FPGA-accelerated simulation of computer systems

    CERN Document Server

    Angepat, Hari; Chung, Eric S; Hoe, James C; Chung, Eric S

    2014-01-01

To date, the most common simulators of computer systems are software-based, running on standard computers. One promising approach to improving simulation performance is to apply hardware, specifically reconfigurable hardware in the form of field programmable gate arrays (FPGAs). This manuscript describes various approaches to using FPGAs to accelerate software-implemented simulation of computer systems and selected simulators that incorporate those techniques. More precisely, we describe a simulation architecture taxonomy that incorporates a simulation architecture specifically designed f

  12. Cloud computing strategic framework (FY13 - FY15).

    Energy Technology Data Exchange (ETDEWEB)

    Arellano, Lawrence R.; Arroyo, Steven C.; Giese, Gerald J.; Cox, Philip M.; Rogers, G. Kelly

    2012-11-01

    This document presents an architectural framework (plan) and roadmap for the implementation of a robust Cloud Computing capability at Sandia National Laboratories. It is intended to be a living document and serve as the basis for detailed implementation plans, project proposals and strategic investment requests.

  13. Detonation Type Ram Accelerator: A Computational Investigation

    Directory of Open Access Journals (Sweden)

    Sunil Bhat

    2000-01-01

An analytical model explaining the functional characteristics of a detonation-type ram accelerator is presented. Major flow processes, namely, (i) supersonic flow over the cone of the projectile, (ii) initiation of the conical shock wave and its reflection from the tube wall, (iii) supersonic combustion, and (iv) the expansion wave and its reflection, are modelled. The Taylor-Maccoll approach is adopted for modelling the flow over the cone of the projectile. Shock reflection is treated in accordance with wave-angle theory for flows over a wedge. Prandtl-Meyer analysis is used to model the expansion wave and its reflection. Steady one-dimensional flow with heat transfer, along with the Rayleigh line equation for perfect gases, is used to model supersonic combustion. A computer code is developed to compute the thrust produced by combustion of the gases. Ballistic parameters like thrust-pressure ratio and ballistic efficiency of the accelerator are evaluated, and their maximum values are 0.032 and 0.068, respectively. The code indicates the possibility of achieving a high velocity of 7 km/s on utilising a gaseous mixture of 2H2+O2 in the operation. The velocity range suitable for operation of the accelerator lies between 3.8 and 7.0 km/s. The maximum thrust value is 33721 N, which corresponds to a projectile velocity of 5 km/s.

  14. GPU-accelerated computation of electron transfer.

    Science.gov (United States)

    Höfinger, Siegfried; Acocella, Angela; Pop, Sergiu C; Narumi, Tetsu; Yasuoka, Kenji; Beu, Titus; Zerbetto, Francesco

    2012-11-05

    Electron transfer is a fundamental process that can be studied with the help of computer simulation. The underlying quantum mechanical description renders the problem a computationally intensive application. In this study, we probe the graphics processing unit (GPU) for suitability to this type of problem. Time-critical components are identified via profiling of an existing implementation and several different variants are tested involving the GPU at increasing levels of abstraction. A publicly available library supporting basic linear algebra operations on the GPU turns out to accelerate the computation approximately 50-fold with minor dependence on actual problem size. The performance gain does not compromise numerical accuracy and is of significant value for practical purposes.
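The pattern this record describes, routing a profiled hot spot through a drop-in GPU linear-algebra library, can be sketched as follows. This is illustrative only, not the authors' code: CuPy is assumed as the GPU BLAS library (the abstract does not name one), the kernel name is hypothetical, and a NumPy fallback keeps the sketch runnable without a GPU.

```python
import numpy as np

# Assumed GPU library: CuPy exposes a NumPy-compatible API, so the
# hot kernel needs no rewrite. Fall back to NumPy when unavailable.
try:
    import cupy as xp  # GPU-backed BLAS, if present
except ImportError:
    xp = np            # CPU fallback

def hot_kernel(H, C):
    """Hypothetical profiled hot spot: a dense matrix product."""
    Hd = xp.asarray(H)       # copy operands to device (no-op on CPU)
    Cd = xp.asarray(C)
    out = Hd @ Cd            # dispatched to GPU BLAS when xp is cupy
    # Move the result back to host memory if it lives on the GPU.
    return out.get() if hasattr(out, "get") else out

H = np.random.rand(256, 256)
C = np.random.rand(256, 256)
P = hot_kernel(H, C)
```

Because only the array module changes, this mirrors the abstract's point that the speedup comes with "minor dependence on actual problem size" handled entirely inside the library.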

  15. Computers and Strategic Advantage: III. Games, Computer Technology, and a Strategic Power Ratio

    Science.gov (United States)

    1975-05-01

decisionmaker were given the (r, P) quadrant as a tabula rasa and expressed the same opinion about the gamble based on a rectangle, then the ... role of technology, permitting the force sizes though not costs to stay constant. We take the position that each player is trying to maximize his ... nonaggregated costing must play its role in strategic modeling, as it must in actual posture decisions. Costing must be done at least by classes of weapon

  16. Strategic engineering for cloud computing and big data analytics

    CERN Document Server

    Ramachandran, Muthu; Sarwar, Dilshad

    2017-01-01

This book demonstrates the use of a wide range of strategic engineering concepts, theories and applied case studies to improve the safety, security and sustainability of complex and large-scale engineering and computer systems. It first details the concepts of system design, life cycle, impact assessment and security to show how these ideas can be brought to bear on the modeling, analysis and design of information systems with a focused view on cloud-computing systems and big data analytics. This informative book is a valuable resource for graduate students, researchers and industry-based practitioners working in engineering, information and business systems as well as strategy.

  17. Strategic Plan for a Scientific Cloud Computing infrastructure for Europe

    CERN Document Server

    Lengert, Maryline

    2011-01-01

Here we present the vision, concept and direction for forming a European Industrial Strategy for a Scientific Cloud Computing Infrastructure to be implemented by 2020. This will be the framework for decisions and for securing support and approval in establishing, initially, an R&D European Cloud Computing Infrastructure that serves the need of the European Research Area (ERA) and Space Agencies. This Cloud Infrastructure will have the potential beyond this initial user base to evolve to provide similar services to a broad range of customers including government and SMEs. We explain how this plan aims to support the broader strategic goals of our organisations and identify the benefits to be realised by adopting an industrial Cloud Computing model. We also outline the prerequisites and commitment needed to achieve these objectives.

  18. Advanced Computing Tools and Models for Accelerator Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  19. Molecular dynamics-based virtual screening: accelerating the drug discovery process by high-performance computing.

    Science.gov (United States)

    Ge, Hu; Wang, Yu; Li, Chanjuan; Chen, Nanhao; Xie, Yufang; Xu, Mengyan; He, Yingyan; Gu, Xinchun; Wu, Ruibo; Gu, Qiong; Zeng, Liang; Xu, Jun

    2013-10-28

High-performance computing (HPC) has become a state strategic technology in a number of countries. One hypothesis is that HPC can accelerate biopharmaceutical innovation. Our experimental data demonstrate that HPC can significantly accelerate biopharmaceutical innovation by employing molecular dynamics-based virtual screening (MDVS). Without HPC, MDVS for a 10K-compound library with tens of nanoseconds of MD simulations requires years of computer time. In contrast, a state-of-the-art HPC system can be 600 times faster than an eight-core PC server in screening a typical drug target (which contains about 40K atoms). Also, careful design of the GPU/CPU architecture can reduce the HPC costs. However, the communication cost of parallel computing is a bottleneck that acts as the main limit on further virtual screening improvements for drug innovations.

  20. Computational Examination of Parameters Influencing Practicability of Ram Accelerator

    Directory of Open Access Journals (Sweden)

    Sunil Bhat

    2004-07-01

The problems concerning the practicability of a ram accelerator, such as intense in-bore projectile ablation, the large accelerator tube length needed to achieve high projectile muzzle velocity, and the high entry velocity of the projectile required to start the accelerator, have been examined. Computational models of processes such as projectile ablation and the flow in the aero-window used as the accelerator tube-end closure device in the case of high drive-gas filling pressure in the ram accelerator tube are presented. A new projectile design to minimise the starting velocity of the ram accelerator is discussed. The possibility of deploying the ram accelerator in a defence-oriented role has been investigated, to utilise its high-velocity potential.

  1. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Spentzouris, P.; /Fermilab; Cary, J.; /Tech-X, Boulder; McInnes, L.C.; /Argonne; Mori, W.; /UCLA; Ng, C.; /SLAC; Ng, E.; Ryne, R.; /LBL, Berkeley

    2011-11-14

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization

  2. Berkeley Lab Computing Sciences: Accelerating Scientific Discovery

    OpenAIRE

    Hules, John A.

    2009-01-01

    Scientists today rely on advances in computer science, mathematics, and computational science, as well as large-scale computing and networking facilities, to increase our understanding of ourselves, our planet, and our universe. Berkeley Lab's Computing Sciences organization researches, develops, and deploys new tools and technologies to meet these needs and to advance research in such areas as global climate change, combustion, fusion energy, nanotechnology, biology, and astrophysics.

  3. Software Accelerates Computing Time for Complex Math

    Science.gov (United States)

    2014-01-01

Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphics processing unit (GPU) technology, traditionally used for computer video games, to develop high-performance computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.

  4. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Spentzouris, Panagiotis; /Fermilab; Cary, John; /Tech-X, Boulder; Mcinnes, Lois Curfman; /Argonne; Mori, Warren; /UCLA; Ng, Cho; /SLAC; Ng, Esmond; Ryne, Robert; /LBL, Berkeley

    2011-10-21

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  5. Scientific computing with multicore and accelerators

    CERN Document Server

    Kurzak, Jakub; Dongarra, Jack

    2010-01-01

Dense Linear Algebra: Implementing Matrix Multiplication on the Cell B.E., Wesley Alvaro, Jakub Kurzak, and Jack Dongarra; Implementing Matrix Factorizations on the Cell BE, Jakub Kurzak and Jack Dongarra; Dense Linear Algebra for Hybrid GPU-Based Systems, Stanimire Tomov and Jack Dongarra; BLAS for GPUs, Rajib Nath, Stanimire Tomov, and Jack Dongarra. Sparse Linear Algebra: Sparse Matrix-Vector Multiplication on Multicore and Accelerators, Samuel Williams, Nathan B

  6. Quality Function Deployment (QFD) House of Quality for Strategic Planning of Computer Security of SMEs

    Directory of Open Access Journals (Sweden)

    Jorge A. Ruiz-Vanoye

    2013-01-01

This article proposes to implement the Quality Function Deployment (QFD) House of Quality for the strategic planning of computer security for Small and Medium Enterprises (SMEs). The House of Quality (HoQ) applied to the computer security of SMEs is a framework to convert the security needs of corporate computing into a set of specifications to improve computer security.

  7. Accelerating Iterative Big Data Computing Through MPI

    Institute of Scientific and Technical Information of China (English)

    梁帆; 鲁小亿

    2015-01-01

Current popular systems, Hadoop and Spark, cannot achieve satisfactory performance when running iterative big data applications, because of the inefficient overlapping of computation and communication. The pipeline of computing, data movement, and data management plays a key role in current distributed data computing systems. In this paper, we first analyze the overhead of the shuffle operation in Hadoop and Spark when running a PageRank workload, and then propose an event-driven pipeline and in-memory shuffle design with better overlapping of computation and communication, implemented as DataMPI-Iteration, an MPI-based library for iterative big data computing. Our performance evaluation shows DataMPI-Iteration can achieve a 9X-21X speedup over Apache Hadoop, and a 2X-3X speedup over Apache Spark, for PageRank and K-means.
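PageRank, one of the workloads benchmarked in this record, is a typical iterative job: each superstep computes per-vertex contributions and then shuffles them to neighbouring vertices before the next iteration can start. A toy single-process version (plain Python, not the DataMPI library) makes that compute-shuffle loop concrete; here the "shuffle" is just an in-memory array update, the step that disk-based implementations pay heavily for.

```python
def pagerank(edges, n, damping=0.85, iters=50):
    """Iterative PageRank over a directed edge list; vertices are 0..n-1."""
    out_deg = [0] * n
    for u, _ in edges:
        out_deg[u] += 1
    rank = [1.0 / n] * n
    for _ in range(iters):
        # "Map" phase: each vertex spreads its rank along its out-edges.
        contrib = [0.0] * n
        for u, v in edges:
            contrib[v] += rank[u] / out_deg[u]
        # "Shuffle + reduce" phase: combine contributions into new ranks.
        rank = [(1 - damping) / n + damping * c for c in contrib]
    return rank

# Tiny 3-vertex graph: 0->1, 0->2, 1->2, 2->0.
ranks = pagerank([(0, 1), (0, 2), (1, 2), (2, 0)], n=3)
```

Every iteration repeats the same map/shuffle/reduce cycle, which is why overlapping the communication of one superstep with the computation of the next, as the abstract proposes, pays off.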

  8. GPU-accelerated micromagnetic simulations using cloud computing

    Science.gov (United States)

    Jermain, C. L.; Rowlands, G. E.; Buhrman, R. A.; Ralph, D. C.

    2016-03-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics.

  9. GPU-accelerated micromagnetic simulations using cloud computing

    CERN Document Server

    Jermain, C L; Buhrman, R A; Ralph, D C

    2015-01-01

    Highly-parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics.

  10. Accelerated Matrix Element Method with Parallel Computing

    CERN Document Server

    Schouten, Doug; Stelzer, Bernd

    2014-01-01

    The matrix element method utilizes ab initio calculations of probability densities as powerful discriminants for processes of interest in experimental particle physics. The method has already been used successfully at previous and current collider experiments. However, the computational complexity of this method for final states with many particles and degrees of freedom sets it at a disadvantage compared to supervised classification methods such as decision trees, k nearest-neighbour, or neural networks. This note presents a concrete implementation of the matrix element technique using graphics processing units. Due to the intrinsic parallelizability of multidimensional integration, dramatic speedups can be readily achieved, which makes the matrix element technique viable for general usage at collider experiments.
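The speedup claimed in this record rests on the fact that the multidimensional integration evaluates the same integrand at many independent phase-space points. A vectorized NumPy Monte Carlo sketch (a CPU stand-in for the GPU kernels the note describes, with a toy integrand rather than a real matrix element) shows why those evaluations parallelize so readily.

```python
import numpy as np

def mc_integrate(f, dim, n=200_000, seed=0):
    """Monte Carlo estimate of the integral of f over the unit hypercube."""
    rng = np.random.default_rng(seed)
    x = rng.random((n, dim))       # n independent sample points
    # Each f(x_i) is independent, so each could run on its own GPU thread.
    return float(f(x).mean())

# Toy "matrix element": the integral of x0^2 + x1^2 + x2^2 over [0,1]^3
# is exactly 1.0 (three terms, each contributing 1/3).
estimate = mc_integrate(lambda x: (x ** 2).sum(axis=1), dim=3)
```

Since no sample depends on any other, the only serial step is the final reduction, which is the structural reason GPUs yield the "dramatic speedups" the note reports.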

  11. Computational Tools to Accelerate Commercial Development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David C

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  12. Accelerating scientific computations with mixed precision algorithms

    Science.gov (United States)

    Baboulin, Marc; Buttari, Alfredo; Dongarra, Jack; Kurzak, Jakub; Langou, Julie; Langou, Julien; Luszczek, Piotr; Tomov, Stanimire

    2009-12-01

    On modern architectures, the performance of 32-bit operations is often at least twice as fast as the performance of 64-bit operations. By using a combination of 32-bit and 64-bit floating point arithmetic, the performance of many dense and sparse linear algebra algorithms can be significantly enhanced while maintaining the 64-bit accuracy of the resulting solution. The approach presented here can apply not only to conventional processors but also to other technologies such as Field Programmable Gate Arrays (FPGA), Graphical Processing Units (GPU), and the STI Cell BE processor. Results on modern processor architectures and the STI Cell BE are presented.

    Program summary
    Program title: ITER-REF
    Catalogue identifier: AECO_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AECO_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 7211
    No. of bytes in distributed program, including test data, etc.: 41 862
    Distribution format: tar.gz
    Programming language: FORTRAN 77
    Computer: desktop, server
    Operating system: Unix/Linux
    RAM: 512 Mbytes
    Classification: 4.8
    External routines: BLAS (optional)
    Nature of problem: On modern architectures, the performance of 32-bit operations is often at least twice as fast as the performance of 64-bit operations. By using a combination of 32-bit and 64-bit floating point arithmetic, the performance of many dense and sparse linear algebra algorithms can be significantly enhanced while maintaining the 64-bit accuracy of the resulting solution.
    Solution method: Mixed precision algorithms stem from the observation that, in many cases, a single precision solution of a problem can be refined to the point where double precision accuracy is achieved. A common approach to the solution of linear systems, either dense or sparse, is to perform the LU
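    The solution method described, iterative refinement, can be sketched compactly: solve cheaply in single precision, then repeatedly correct using a double-precision residual. The NumPy version below is an illustration of the idea, not the distributed Fortran code (the matrix size and conditioning are arbitrary, and `np.linalg.solve` re-factors on every call where the real code would reuse the LU factors):

```python
import numpy as np

def mixed_precision_solve(A, b, iters=5):
    """Solve Ax = b with float32 solves refined by float64 residuals."""
    A32 = A.astype(np.float32)
    x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
    for _ in range(iters):
        r = b - A @ x                                 # residual in float64
        dx = np.linalg.solve(A32, r.astype(np.float32))
        x += dx.astype(np.float64)                    # accumulate in float64
    return x

rng = np.random.default_rng(1)
A = rng.random((200, 200)) + 200 * np.eye(200)        # well-conditioned
x_true = rng.random(200)
x = mixed_precision_solve(A, A @ x_true)
```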

  13. Neural computation and particle accelerators research, technology and applications

    CERN Document Server

    D'Arras, Horace

    2010-01-01

    This book discusses neural computation, a network or circuit of biological neurons and relatedly, particle accelerators, a scientific instrument which accelerates charged particles such as protons, electrons and deuterons. Accelerators have a very broad range of applications in many industrial fields, from high energy physics to medical isotope production. Nuclear technology is one of the fields discussed in this book. The development that has been reached by particle accelerators in energy and particle intensity has opened the possibility to a wide number of new applications in nuclear technology. This book reviews the applications in the nuclear energy field and the design features of high power neutron sources are explained. Surface treatments of niobium flat samples and superconducting radio frequency cavities by a new technique called gas cluster ion beam are also studied in detail, as well as the process of electropolishing. Furthermore, magnetic devices such as solenoids, dipoles and undulators, which ...

  14. A computer-based aid for the design of a strategic organizational culture

    OpenAIRE

    1998-01-01

    This paper presents a theoretical framework for the alignment of organizational culture and strategy by integrating knowledge from diverse areas of organizational studies including strategic human resource management, organizational culture, and the specific design of human resource practices. It then describes a computer-based aid which offers practitioners a step by step guide for improving their competitive position through the development of a "strategic" culture. It is proposed that orga...

  15. Computational Tools for Accelerating Carbon Capture Process Development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David; Sahinidis, N V; Cozad, A; Lee, A; Kim, H; Morinelly, J; Eslick, J; Yuan, Z

    2013-06-04

    This presentation reports the development of advanced computational tools to accelerate next-generation technology development. The tools support building an optimized process from rigorous models. They include: Process Models; Simulation-Based Optimization; Optimized Process; Uncertainty Quantification; Algebraic Surrogate Models; and Superstructure Optimization (Determine Configuration).

  16. The computer-based control system of the NAC accelerator

    Science.gov (United States)

    Burdzik, G. F.; Bouckaert, R. F. A.; Cloete, I.; Dutoit, J. S.; Kohler, I. H.; Truter, J. N. J.; Visser, K.; Wikner, V. C. S. J.

    The National Accelerator Center (NAC) of the CSIR is building a two-stage accelerator which will provide charged-particle beams for use in medical and research applications. The control system for this accelerator is based on three mini-computers and a CAMAC interfacing network. Closed-loop control is being relegated to the various subsystems of the accelerators, and the computers and CAMAC network will be used in the first instance for data transfer, monitoring and servicing of the control consoles. The processing power of the computers will be utilized for automating start-up and beam-change procedures, for providing flexible and convenient information at the control consoles, for fault diagnosis and for beam-optimizing procedures. Tasks of a localized or dedicated nature are being off-loaded onto microcomputers, which are being used either in front-end devices or as slaves to the mini-computers. On the control consoles only a few instruments for setting and monitoring variables are being provided, but these instruments are universally-linkable to any appropriate machine variable.

  17. Accelerating Climate and Weather Simulations through Hybrid Computing

    Science.gov (United States)

    Zhou, Shujia; Cruz, Carlos; Duffy, Daniel; Tucker, Robert; Purcell, Mark

    2011-01-01

    Unconventional multi- and many-core processors (e.g. IBM (R) Cell B.E.(TM) and NVIDIA (R) GPU) have emerged as effective accelerators in trial climate and weather simulations. Yet these climate and weather models typically run on parallel computers with conventional processors (e.g. Intel, AMD, and IBM) using Message Passing Interface. To address challenges involved in efficiently and easily connecting accelerators to parallel computers, we investigated using IBM's Dynamic Application Virtualization (TM) (IBM DAV) software in a prototype hybrid computing system with representative climate and weather model components. The hybrid system comprises two Intel blades and two IBM QS22 Cell B.E. blades, connected with both InfiniBand(R) (IB) and 1-Gigabit Ethernet. The system significantly accelerates a solar radiation model component by offloading compute-intensive calculations to the Cell blades. Systematic tests show that IBM DAV can seamlessly offload compute-intensive calculations from Intel blades to Cell B.E. blades in a scalable, load-balanced manner. However, noticeable communication overhead was observed, mainly due to IP over the IB protocol. Full utilization of IB Sockets Direct Protocol and the lower latency production version of IBM DAV will reduce this overhead.

  18. Accelerating Neuroimage Registration through Parallel Computation of Similarity Metric.

    Directory of Open Access Journals (Sweden)

    Yun-Gang Luo

    Neuroimage registration is crucial for brain morphometric analysis and treatment efficacy evaluation. However, existing advanced registration algorithms such as FLIRT and ANTs are not efficient enough for clinical use. In this paper, a GPU implementation of FLIRT with the correlation ratio (CR) as the similarity metric and a GPU-accelerated correlation coefficient (CC) calculation for the symmetric diffeomorphic registration of ANTs have been developed. The comparison with their corresponding original tools shows that our accelerated algorithms greatly outperform the originals in terms of computational efficiency. This paper demonstrates the great potential of applying these registration tools in clinical applications.
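    Registration metrics like CC are evaluated many times per optimization step, and each evaluation is a large per-voxel reduction, exactly the shape of work GPUs accelerate. A plain NumPy sketch of the metric itself (the metric definition is standard; the example images are made up):

```python
import numpy as np

def correlation_coefficient(fixed, moving):
    """Correlation coefficient (CC) similarity between two images.

    The mean removal and dot-product reductions below are the
    data-parallel inner loop that a GPU implementation offloads."""
    f = fixed.ravel() - fixed.mean()
    m = moving.ravel() - moving.mean()
    return float(f @ m / np.sqrt((f @ f) * (m @ m)))
```

Perfectly aligned identical images score 1.0, and an intensity-inverted copy scores -1.0, which is why registration maximizes |CC|.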

  19. Quantum computing accelerator I/O : LDRD 52750 final report.

    Energy Technology Data Exchange (ETDEWEB)

    Schroeppel, Richard Crabtree; Modine, Normand Arthur; Ganti, Anand; Pierson, Lyndon George; Tigges, Christopher P.

    2003-12-01

    In a superposition of quantum states, a bit can be in both the states '0' and '1' at the same time. This feature of the quantum bit or qubit has no parallel in classical systems. Currently, quantum computers consisting of 4 to 7 qubits in a 'quantum computing register' have been built. Innovative algorithms suited to quantum computing are now beginning to emerge, applicable to sorting and cryptanalysis, and other applications. A framework for overcoming slightly inaccurate quantum gate interactions and for causing quantum states to survive interactions with surrounding environment is emerging, called quantum error correction. Thus there is the potential for rapid advances in this field. Although quantum information processing can be applied to secure communication links (quantum cryptography) and to crack conventional cryptosystems, the first few computing applications will likely involve a 'quantum computing accelerator' similar to a 'floating point arithmetic accelerator' interfaced to a conventional Von Neumann computer architecture. This research is to develop a roadmap for applying Sandia's capabilities to the solution of some of the problems associated with maintaining quantum information, and with getting data into and out of such a 'quantum computing accelerator'. We propose to focus this work on 'quantum I/O technologies' by applying quantum optics on semiconductor nanostructures to leverage Sandia's expertise in semiconductor microelectronic/photonic fabrication techniques, as well as its expertise in information theory, processing, and algorithms. The work will be guided by understanding of practical requirements of computing and communication architectures. This effort will incorporate ongoing collaboration between 9000, 6000 and 1000 and between junior and senior personnel. Follow-on work to fabricate and evaluate appropriate experimental nano/microstructures will be

  20. A Quantitative Study of the Relationship between Leadership Practice and Strategic Intentions to Use Cloud Computing

    Science.gov (United States)

    Castillo, Alan F.

    2014-01-01

    The purpose of this quantitative correlational cross-sectional research study was to examine a theoretical model consisting of leadership practice, attitudes of business process outsourcing, and strategic intentions of leaders to use cloud computing and to examine the relationships between each of the variables respectively. This study…

  1. Pennsylvania's Transition to Enterprise Computing as a Study in Strategic Alignment

    Science.gov (United States)

    Sawyer, Steve; Hinnant, Charles C.; Rizzuto, Tracey

    2008-01-01

    We theorize about the strategic alignment of computing with organizational mission, using the Commonwealth of Pennsylvania's efforts to pursue digital government initiatives as evidence. To do this we draw on a decade (1995-2004) of changes in Pennsylvania to characterize how a state government shifts from an organizational to an enterprise…

  2. Computing at DESY — current setup, trends and strategic directions

    Science.gov (United States)

    Ernst, Michael

    1998-05-01

    Since the HERA experiments H1 and ZEUS started data taking in '92, the computing environment at DESY has changed dramatically. After running mainframe-centred computing for more than 20 years, DESY switched to a heterogeneous, fully distributed computing environment within only about two years in almost every corner where computing has its applications. The computing strategy was highly influenced by the needs of the user community. The collaborations are usually limited by current technology, and their ever-increasing demands are the driving force for central computing to always move close to the technology edge. While DESY's central computing has multi-decade experience in running Central Data Recording/Central Data Processing for HEP experiments, the most challenging task today is to provide clear and homogeneous concepts in the desktop area. Given that lowest-level commodity hardware draws more and more attention, combined with the financial constraints we are facing already today, we quickly need concepts for integrated support of a versatile device which has the potential to move into basically any computing area in HEP. Though commercial solutions, especially addressing the PC management/support issues, are expected to come to market in the next 2-3 years, we need to provide suitable solutions now. Buying PCs at DESY at a rate of about 30/month will otherwise absorb any available manpower in central computing and still leave hundreds of unhappy people alone. Though certainly not the only region, the desktop issue is one of the most important ones where we need HEP-wide collaboration to a large extent, and right now. Taking into account that there is traditionally no room for R&D at DESY, collaboration, meaning sharing experience and development resources within the HEP community, is a predominant factor for us.

  3. Fast acceleration of 2D wave propagation simulations using modern computational accelerators.

    Directory of Open Access Journals (Sweden)

    Wei Wang

    Recent developments in modern computational accelerators like Graphics Processing Units (GPUs) and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as an Intel Many Integrated Core (MIC) coprocessor to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved more than 150x speedup over the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation provided speedups on GPUs of at least 200x over the sequential implementation and 30x over a parallelized OpenMP implementation. An implementation of OpenMP on the Intel MIC coprocessor provided speedups of 120x with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieve parallelization of 2D cardiac wave simulations on GPUs. Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other

  4. Accelerating MATLAB with GPU computing a primer with examples

    CERN Document Server

    Suh, Jung W

    2013-01-01

    Beyond simulation and algorithm development, many developers increasingly use MATLAB even for product deployment in computationally heavy fields. This often demands that MATLAB codes run faster by leveraging the distributed parallelism of Graphics Processing Units (GPUs). While MATLAB successfully provides high-level functions as a simulation tool for rapid prototyping, the underlying details and knowledge needed for utilizing GPUs make MATLAB users hesitate to step into it. Accelerating MATLAB with GPUs offers a primer on bridging this gap. Starting with the basics, setting up MATLAB for

  5. A Study on Strategic Provisioning of Cloud Computing Services

    Science.gov (United States)

    Rejaul Karim Chowdhury, Md

    2014-01-01

    Cloud computing is currently emerging as an ever-changing, growing paradigm that models “everything-as-a-service.” Virtualised physical resources, infrastructure, and applications are supplied by service provisioning in the cloud. The evolution in the adoption of cloud computing is driven by clear and distinct promising features for both cloud users and cloud providers. However, the increasing number of cloud providers and the variety of service offerings have made it difficult for the customers to choose the best services. By employing successful service provisioning, the essential services required by customers, such as agility and availability, pricing, security and trust, and user metrics can be guaranteed by service provisioning. Hence, continuous service provisioning that satisfies the user requirements is a mandatory feature for the cloud user and vitally important in cloud computing service offerings. Therefore, we aim to review the state-of-the-art service provisioning objectives, essential services, topologies, user requirements, necessary metrics, and pricing mechanisms. We synthesize and summarize different provision techniques, approaches, and models through a comprehensive literature review. A thematic taxonomy of cloud service provisioning is presented after the systematic review. Finally, future research directions and open research issues are identified. PMID:25032243

  6. A study on strategic provisioning of cloud computing services.

    Science.gov (United States)

    Whaiduzzaman, Md; Haque, Mohammad Nazmul; Rejaul Karim Chowdhury, Md; Gani, Abdullah

    2014-01-01

    Cloud computing is currently emerging as an ever-changing, growing paradigm that models "everything-as-a-service." Virtualised physical resources, infrastructure, and applications are supplied by service provisioning in the cloud. The evolution in the adoption of cloud computing is driven by clear and distinct promising features for both cloud users and cloud providers. However, the increasing number of cloud providers and the variety of service offerings have made it difficult for the customers to choose the best services. By employing successful service provisioning, the essential services required by customers, such as agility and availability, pricing, security and trust, and user metrics can be guaranteed by service provisioning. Hence, continuous service provisioning that satisfies the user requirements is a mandatory feature for the cloud user and vitally important in cloud computing service offerings. Therefore, we aim to review the state-of-the-art service provisioning objectives, essential services, topologies, user requirements, necessary metrics, and pricing mechanisms. We synthesize and summarize different provision techniques, approaches, and models through a comprehensive literature review. A thematic taxonomy of cloud service provisioning is presented after the systematic review. Finally, future research directions and open research issues are identified.

  7. A Study on Strategic Provisioning of Cloud Computing Services

    Directory of Open Access Journals (Sweden)

    Md Whaiduzzaman

    2014-01-01

    Cloud computing is currently emerging as an ever-changing, growing paradigm that models “everything-as-a-service.” Virtualised physical resources, infrastructure, and applications are supplied by service provisioning in the cloud. The evolution in the adoption of cloud computing is driven by clear and distinct promising features for both cloud users and cloud providers. However, the increasing number of cloud providers and the variety of service offerings have made it difficult for the customers to choose the best services. By employing successful service provisioning, the essential services required by customers, such as agility and availability, pricing, security and trust, and user metrics can be guaranteed by service provisioning. Hence, continuous service provisioning that satisfies the user requirements is a mandatory feature for the cloud user and vitally important in cloud computing service offerings. Therefore, we aim to review the state-of-the-art service provisioning objectives, essential services, topologies, user requirements, necessary metrics, and pricing mechanisms. We synthesize and summarize different provision techniques, approaches, and models through a comprehensive literature review. A thematic taxonomy of cloud service provisioning is presented after the systematic review. Finally, future research directions and open research issues are identified.

  8. On-Chip Reconfigurable Hardware Accelerators for Popcount Computations

    Directory of Open Access Journals (Sweden)

    Valery Sklyarov

    2016-01-01

    Popcount computations are widely used in such areas as combinatorial search, data processing, statistical analysis, and bio- and chemical informatics. In many practical problems the size of the initial data is very large and increased throughput is important. The paper suggests two types of hardware accelerators that are (1) designed in FPGAs and (2) implemented in Zynq-7000 all-programmable systems-on-chip, with partitioning of algorithms that use popcounts between software of the ARM Cortex-A9 processing system and advanced programmable logic. A three-level system architecture that includes a general-purpose computer, the problem-specific ARM, and reconfigurable hardware is then proposed. The results of experiments and comparisons with existing benchmarks demonstrate that although the throughput of popcount computations is increased in FPGA-based designs interacting with general-purpose computers, communication overheads (in experiments with PCI Express) are significant, and actual advantages can be gained if not only popcount but also other types of relevant computations are implemented in hardware. The comparison of software/hardware designs for Zynq-7000 all-programmable systems-on-chip with pure software implementations in the same Zynq-7000 devices demonstrates an increase in performance by a factor ranging from 5 to 19 (taking into account all the involved communication overheads between the programmable logic and the processing systems).
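    For reference, the reduction these accelerators implement in hardware is the classic SWAR population count. A pure-software 64-bit version (a standard bit-twiddling formulation, not code from the paper) looks like:

```python
def popcount64(x: int) -> int:
    """Branch-free 64-bit population count (SWAR).

    Pairs of bits, then nibbles, are summed in place; the final
    multiply gathers all byte counts into the top byte."""
    x = x - ((x >> 1) & 0x5555555555555555)
    x = (x & 0x3333333333333333) + ((x >> 2) & 0x3333333333333333)
    x = (x + (x >> 4)) & 0x0F0F0F0F0F0F0F0F
    return ((x * 0x0101010101010101) & 0xFFFFFFFFFFFFFFFF) >> 56
```

In an FPGA the same tree of small adders evaluates in a few gate delays per word, which is where the throughput gain over software comes from.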

  9. Distance Computation Between Non-Holonomic Motions with Constant Accelerations

    Directory of Open Access Journals (Sweden)

    Enrique J. Bernabeu

    2013-09-01

    A method for computing the distance between two moving robots or between a mobile robot and a dynamic obstacle with linear or arc-like motions and with constant accelerations is presented in this paper. This distance is obtained without stepping or discretizing the motions of the robots or obstacles. The robots and obstacles are modelled by convex hulls. This technique obtains the future instant in time when two moving objects will be at their minimum translational distance, i.e., at their minimum separation or maximum penetration (if they will collide). This distance and the future instant in time are computed in parallel. This method is intended to be run each time new information from the world is received and, consequently, it can be used for generating collision-free trajectories for non-holonomic mobile robots.
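    The reason no time-stepping is needed: with constant accelerations the relative position is a quadratic in t, so the squared distance is a quartic whose minimum lies at a root of its cubic derivative. A point-object sketch of that observation (the paper handles convex hulls; points keep the illustration short, and the root-finding approach here is generic, not the paper's algorithm):

```python
import numpy as np

def min_distance(p1, v1, a1, p2, v2, a2, horizon):
    """Minimum distance, and when it occurs, for two points with
    constant accelerations over [0, horizon], without time-stepping."""
    dp = np.asarray(p1, float) - np.asarray(p2, float)
    dv = np.asarray(v1, float) - np.asarray(v2, float)
    da = 0.5 * (np.asarray(a1, float) - np.asarray(a2, float))
    # |r(t)|^2 = c4 t^4 + c3 t^3 + c2 t^2 + c1 t + c0
    c4, c3 = da @ da, 2 * da @ dv
    c2, c1 = dv @ dv + 2 * da @ dp, 2 * dv @ dp
    deriv = np.array([4 * c4, 3 * c3, 2 * c2, c1])
    candidates = [0.0, float(horizon)]
    if np.any(deriv != 0):
        for r in np.roots(deriv):          # interior critical points
            if abs(r.imag) < 1e-9 and 0.0 <= r.real <= horizon:
                candidates.append(float(r.real))
    dist = lambda t: float(np.linalg.norm(dp + dv * t + da * t * t))
    t_best = min(candidates, key=dist)
    return dist(t_best), t_best
```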

  10. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    Energy Technology Data Exchange (ETDEWEB)

    Smith, P.R.; Sarfaty, R.

    1993-05-01

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan is the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used world wide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper will describe and define CM elements, and discuss how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of the CASE tool provides a methodology for consistency in approach, graphics, and database capability combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than those stated above. Some examples are supporting a joint application development group (JAD) to prepare a software functional specification document and, if necessary, provide the capability to automatically generate software application code. This paper briefly discusses characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables.

  11. Computation of Normal Conducting and Superconducting Linear Accelerator (LINAC) Availabilities

    Energy Technology Data Exchange (ETDEWEB)

    Haire, M.J.

    2000-07-11

    A brief study was conducted to roughly estimate the availability of a superconducting (SC) linear accelerator (LINAC) as compared to a normal conducting (NC) one. Potentially, SC radio frequency cavities have substantial reserve capability, which allows them to compensate for failed cavities, thus increasing the availability of the overall LINAC. In the initial SC design, there is a klystron and associated equipment (e.g., power supply) for every cavity of an SC LINAC. On the other hand, a single klystron may service eight cavities in the NC LINAC. This study modeled that portion of the Spallation Neutron Source LINAC (between 200 and 1,000 MeV) that is initially proposed for conversion from NC to SC technology. Equipment common to both designs was not evaluated. Tabular fault-tree calculations and computer event-driven simulation (EDS) computations were performed. The estimated gain in availability when using the SC option ranges from 3 to 13% under certain equipment conditions and spatial separation requirements. The availability of an NC LINAC is estimated to be 83%. Tabular fault-tree calculations and computer EDS modeling gave the same 83% answer to within one-tenth of a percent for the NC case. Tabular fault-tree calculations of the availability of the SC LINAC (where a klystron and associated equipment drive a single cavity) give 97%, whereas EDS computer calculations give 96%, a disagreement of only 1%. This result may be somewhat fortuitous because of limitations of tabular fault-tree calculations. For example, tabular fault-tree calculations cannot handle spatial effects (separation distance between failures), equipment network configurations, and some failure combinations. EDS computer modeling of various equipment configurations was examined. When there is a klystron and associated equipment for every cavity and adjacent-cavity failure can be tolerated, the SC availability was estimated to be 96%. SC availability decreased as
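    The arithmetic behind the tabular estimates is standard reliability algebra: a series chain is up only if every unit is up, while reserve capability lets a redundant group survive single failures. A generic sketch of both formulas (textbook expressions, not the study's actual equipment data or its 83%/97% figures):

```python
from math import prod

def series_availability(avails):
    """A chain is up only if every unit is up: A = product of A_i."""
    return prod(avails)

def single_spare_availability(a, n):
    """n identical units that tolerate any one failure (reserve
    capability): either all n are up, or exactly one is down."""
    return a**n + n * (1 - a) * a**(n - 1)
```

For example, two 90%-available units in series give only 81%, while the same two units with single-failure tolerance give 99%, which is the qualitative effect the SC cavities' reserve capability provides.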

  12. Computer modeling of test particle acceleration at oblique shocks

    Science.gov (United States)

    Decker, Robert B.

    1988-01-01

    The present evaluation of the basic techniques and illustrative results of charged particle-modeling numerical codes suitable for particle acceleration at oblique, fast-mode collisionless shocks emphasizes the treatment of ions as test particles, calculating particle dynamics through numerical integration along exact phase-space orbits. Attention is given to the acceleration of particles at planar, infinitessimally thin shocks, as well as to plasma simulations in which low-energy ions are injected and accelerated at quasi-perpendicular shocks with internal structure.
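    Following a test ion along an exact phase-space orbit means integrating the Lorentz force through the prescribed fields. The Boris scheme below is the standard pusher for such codes; it is shown here with uniform fields for brevity (a shock model would evaluate E and B at the particle's position each step), and is an illustration rather than the specific codes the article evaluates:

```python
import numpy as np

def boris_push(x, v, E, B, q_over_m, dt, steps):
    """Advance a charged test particle with the Boris scheme:
    half electric kick, exact magnetic rotation, half kick."""
    x, v = np.asarray(x, float), np.asarray(v, float)
    E, B = np.asarray(E, float), np.asarray(B, float)
    for _ in range(steps):
        v_minus = v + 0.5 * q_over_m * dt * E        # half electric kick
        t = 0.5 * q_over_m * dt * B                  # rotation vector
        s = 2.0 * t / (1.0 + t @ t)
        v_prime = v_minus + np.cross(v_minus, t)     # magnetic rotation
        v_plus = v_minus + np.cross(v_prime, s)
        v = v_plus + 0.5 * q_over_m * dt * E         # second half kick
        x = x + v * dt
    return x, v
```

With E = 0 the rotation is norm-preserving, so particle energy is conserved over arbitrarily many gyrations, a key property for long shock-crossing orbits.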

  13. Accelerating Computation of the Unit Commitment Problem (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Hummon, M.; Barrows, C.; Jones, W.

    2013-10-01

    Production cost models (PCMs) simulate power system operation at hourly (or higher) resolution. While computation times often extend into multiple days, the sequential nature of PCMs makes parallelism difficult. We exploit the persistence of unit commitment decisions to select partition boundaries for simulation horizon decomposition and parallel computation. Partitioned simulations are benchmarked against sequential solutions for optimality and computation time.
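    The persistence idea can be made concrete: cut the horizon at hours where the commitment vector has been stable for a while, so little state crosses each boundary, then simulate the partitions in parallel. The boundary rule below is a simplified illustration, not the presentation's actual selection criterion:

```python
def partition_points(commitment, min_run=4):
    """Return hours at which to split the horizon.

    A cut is placed where the per-unit commitment tuple has just been
    unchanged for `min_run` consecutive hours; each resulting segment
    can then be handed to a separate worker."""
    cuts, run = [], 1
    for t in range(1, len(commitment)):
        run = run + 1 if commitment[t] == commitment[t - 1] else 1
        if run == min_run:
            cuts.append(t)
    return cuts
```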

  14. Cloud Computing and Validated Learning for Accelerating Innovation in IoT

    Science.gov (United States)

    Suciu, George; Todoran, Gyorgy; Vulpe, Alexandru; Suciu, Victor; Bulca, Cristina; Cheveresan, Romulus

    2015-01-01

    Innovation in Internet of Things (IoT) requires more than just creation of technology and use of cloud computing or big data platforms. It requires accelerated commercialization or aptly called go-to-market processes. To successfully accelerate, companies need a new type of product development, the so-called validated learning process.…

  15. GpuCV : a GPU-accelerated framework for image processing and computer vision

    OpenAIRE

    ALLUSSE, Yannick; Horain, Patrick; Agarwal, Ankit; Saipriyadarshan, Cindula

    2008-01-01

    This paper briefly describes the state of the art of accelerating image processing with graphics hardware (GPU) and discusses some of its caveats. Then it describes GpuCV, an open source multi-platform library for GPU-accelerated image processing and Computer Vision operators and applications. It is meant for computer vision scientists not familiar with GPU technologies. GpuCV is designed to be compatible with the popular OpenCV library by offering GPU-accelera...

  16. Cloud computing approaches to accelerate drug discovery value chain.

    Science.gov (United States)

    Garg, Vibhav; Arora, Suchir; Gupta, Chitra

    2011-12-01

    Continued advancements in the area of technology have helped high throughput screening (HTS) evolve from a linear to a parallel approach by performing system-level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data of the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze this data to identify informational tags. This need is again posing challenges to computer scientists to offer the matching hardware and software infrastructure, while managing the varying degree of desired computational power. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SAAS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as Cloud Computing, is now transforming drug discovery research. Also, integration of Cloud computing with parallel computing is certainly expanding its footprint in the life sciences community. The speed, efficiency and cost effectiveness have made cloud computing a 'good to have' tool for researchers, providing them significant flexibility, allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, Discovery-Cloud would fit best to manage drug discovery and clinical development data, generated using advanced HTS techniques, hence supporting the vision of personalized medicine.

  17. Accelerating patch-based directional wavelets with multicore parallel computing in compressed sensing MRI.

    Science.gov (United States)

    Li, Qiyue; Qu, Xiaobo; Liu, Yunsong; Guo, Di; Lai, Zongying; Ye, Jing; Chen, Zhong

    2015-06-01

    Compressed sensing MRI (CS-MRI) is a promising technology to accelerate magnetic resonance imaging. Both improving the image quality and reducing the computation time are important for this technology. Recently, a patch-based directional wavelet (PBDW) has been applied in CS-MRI to improve edge reconstruction. However, this method is time consuming since it involves extensive computations, including geometric direction estimation and numerous iterations of the wavelet transform. To accelerate the computations of PBDW, we propose a general parallelization of patch-based processing that takes advantage of multicore processors. Additionally, two pertinent optimizations that exploit the sparsity of MR images, excluding smooth patches and a pre-arranged insertion sort, are also proposed. Simulation results demonstrate that the acceleration factor of the parallel architecture of PBDW approaches the number of central processing unit cores, and that the pertinent optimizations are also effective in achieving further acceleration. The proposed approaches allow compressed sensing MRI reconstruction to be accomplished within several seconds.

  18. Lua(Jit) for computing accelerator beam physics

    CERN Document Server

    CERN. Geneva

    2016-01-01

    As mentioned in the 2nd developers meeting, I would like to open the debate with a special presentation on another language - Lua - and a tremendous technology - LuaJIT. Lua is much less known at CERN, but it is very simple, much smaller than Python, and its JIT is extremely performant. The language is a dynamic scripting language that is easy to learn and easy to embed in applications. I will show how we use it in HPC for accelerator beam physics as a replacement for C, C++, Fortran and Python, with some benchmarks versus Python, PyPy and C/C++.

  19. Accelerating Scientific Discovery Through Computation and Visualization III. Tight-Binding Wave Functions for Quantum Dots.

    Science.gov (United States)

    Sims, James S; George, William L; Griffin, Terence J; Hagedorn, John G; Hung, Howard K; Kelso, John T; Olano, Marc; Peskin, Adele P; Satterfield, Steven G; Terrill, Judith Devaney; Bryant, Garnett W; Diaz, Jose G

    2008-01-01

    This is the third in a series of articles that describe, through examples, how the Scientific Applications and Visualization Group (SAVG) at NIST has utilized high performance parallel computing, visualization, and machine learning to accelerate scientific discovery. In this article we focus on the use of high performance computing and visualization for simulations of nanotechnology.

  20. Accelerating Scientific Discovery Through Computation and Visualization III. Tight-Binding Wave Functions for Quantum Dots

    OpenAIRE

    2008-01-01

    This is the third in a series of articles that describe, through examples, how the Scientific Applications and Visualization Group (SAVG) at NIST has utilized high performance parallel computing, visualization, and machine learning to accelerate scientific discovery. In this article we focus on the use of high performance computing and visualization for simulations of nanotechnology.

  1. Computational algorithms for multiphase magnetohydrodynamics and applications to accelerator targets

    Directory of Open Access Journals (Sweden)

    R.V. Samulyak

    2010-01-01

    An interface-tracking numerical algorithm for the simulation of magnetohydrodynamic multiphase/free-surface flows in the low-magnetic-Reynolds-number approximation of Samulyak R., Du J., Glimm J., Xu Z. (J. Comp. Phys., 2007, 226, 1532) is described. The algorithm has been implemented in the multi-physics code FronTier and used for the simulation of MHD processes in liquids and weakly ionized plasmas. In this paper, numerical simulations of a liquid mercury jet entering a strong and nonuniform magnetic field and interacting with a powerful proton pulse have been performed and compared with experiments. Such a mercury jet is a prototype target for the proposed Muon Collider/Neutrino Factory, a future particle accelerator. Simulations demonstrate the elliptic distortion of the mercury jet as it enters the magnetic solenoid at a small angle to the magnetic axis, jet-surface instabilities (filamentation) induced by the interaction with proton pulses, and the stabilizing effect of the magnetic field.

  2. Accelerating Missile Threat Engagement Simulations Using Personal Computer Graphics Cards

    Science.gov (United States)

    2005-03-01

    Graphics processing units, found in virtually every personal computer on the market today, have reached a level of power and programmability that enables them to be used as high-performance stream processors. This growth is expected to continue at this rate for another five years, perhaps achieving tera-FLOP performance by 2005 [Mac03]. ...

  3. Unified Compression-Based Acceleration of Edit-Distance Computation

    CERN Document Server

    Hermelin, Danny; Landau, Shir; Weimann, Oren

    2010-01-01

    The edit distance problem is a classical fundamental problem in computer science in general, and in combinatorial pattern matching in particular. The standard dynamic programming solution for this problem computes the edit distance between a pair of strings of total length O(N) in O(N^2) time. To date, this quadratic upper bound has never been substantially improved for general strings. However, there are known techniques for breaking this bound when the strings are known to compress well under a particular compression scheme. The basic idea is to first compress the strings, and then to compute the edit distance between the compressed strings. As it turns out, practically all known o(N^2) edit-distance algorithms work, in some sense, under the same paradigm described above. It is therefore natural to ask whether there is a single edit-distance algorithm that works for strings which are compressed under any compression scheme. A rephrasing of this question is to ask whether a single algorithm can explo...
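The standard quadratic dynamic program referred to above can be written in a few lines; this sketch (in Python, with illustrative names) computes the Levenshtein distance row by row, keeping only two rows of the O(N^2) table in memory:

```python
def edit_distance(a: str, b: str) -> int:
    """Classical O(|a|*|b|) dynamic program for edit (Levenshtein) distance."""
    prev = list(range(len(b) + 1))   # row for the empty prefix of a
    for i, ca in enumerate(a, 1):
        curr = [i]                   # distance from a[:i] to the empty string
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                # delete from a
                            curr[j - 1] + 1,            # insert into a
                            prev[j - 1] + (ca != cb)))  # match / substitute
        prev = curr
    return prev[-1]

print(edit_distance("kitten", "sitting"))   # 3
```

Compression-based accelerations replace this character-by-character recurrence with block operations on the compressed representations of the two strings.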

  4. Modern hardware architectures accelerate porous media flow computations

    Science.gov (United States)

    Kulczewski, Michal; Kurowski, Krzysztof; Kierzynka, Michal; Dohnalik, Marek; Kaczmarczyk, Jan; Borujeni, Ali Takbiri

    2012-05-01

    Investigation of rock properties, particularly porosity and permeability, which determine the transport characteristics of the medium, is crucial to reservoir engineering. Nowadays, micro-tomography (micro-CT) methods allow one to obtain a wealth of petrophysical properties. The micro-CT method facilitates visualization of pore structures and acquisition of the total porosity factor, determined by sticking together 2D slices of scanned rock and applying a proper absorption cut-off point. Proper segmentation of the 3D representation of pores is important for solving the permeability of porous media. This factor has recently been determined by means of Computational Fluid Dynamics (CFD), a popular method for analyzing problems related to fluid flows that takes advantage of numerical methods and constantly growing computing power. The recent advent of novel multi-core, many-core and graphics processing unit (GPU) hardware architectures allows scientists to benefit even more from parallel processing and built-in new features. The high level of parallel scalability offers both decreased time-to-solution and greater accuracy, top factors in reservoir engineering. This paper presents research results related to fluid flow simulations, particularly solving the total porosity and permeability of porous media, taking advantage of modern hardware architectures. In our approach total porosity is calculated by means of general-purpose computing on multiple GPUs. This application sticks together 2D slices of scanned rock and, by means of a marching tetrahedra algorithm, creates a 3D representation of pores and calculates the total porosity. Experimental results are compared with data obtained via other popular methods, including Nuclear Magnetic Resonance (NMR), helium porosity and nitrogen permeability tests. CFD simulations are then performed on a large-scale high performance hardware architecture to solve the flow and permeability of porous media. In our experiments we used Lattice Boltzmann

  5. Computational Tools for Accelerating Carbon Capture Process Development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  6. Modeling Strategic Use of Human Computer Interfaces with Novel Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Laura Jane Mariano

    2015-07-01

    Immersive software tools are virtual environments designed to give their users an augmented view of real-world data and ways of manipulating that data. As virtual environments, every action users make while interacting with these tools can be carefully logged, as can the state of the software and the information it presents to the user, giving these actions context. This data provides a high-resolution lens through which dynamic cognitive and behavioral processes can be viewed. In this report, we describe new methods for the analysis and interpretation of such data, utilizing a novel implementation of the Beta Process Hidden Markov Model (BP-HMM) for the analysis of software activity logs. We further report the results of a preliminary study designed to establish the validity of our modeling approach. A group of 20 participants were asked to play a simple computer game, instrumented to log every interaction with the interface. Participants had no previous experience with the game’s functionality or rules, so the activity logs collected during their naïve interactions capture patterns of exploratory behavior and skill acquisition as they attempted to learn the rules of the game. Pre- and post-task questionnaires probed for self-reported styles of problem solving, as well as task engagement, difficulty, and workload. We jointly modeled the activity log sequences collected from all participants using the BP-HMM approach, identifying a global library of activity patterns representative of the collective behavior of all the participants. Analyses show systematic relationships between both pre- and post-task questionnaires, self-reported approaches to analytic problem solving, and metrics extracted from the BP-HMM decomposition. Overall, we find that this novel approach to decomposing unstructured behavioral data within software environments provides a sensible means for understanding how users learn to integrate software functionality for strategic

  7. Accelerate!

    Science.gov (United States)

    Kotter, John P

    2012-11-01

    The old ways of setting and implementing strategy are failing us, writes the author of Leading Change, in part because we can no longer keep up with the pace of change. Organizational leaders are torn between trying to stay ahead of increasingly fierce competition and needing to deliver this year's results. Although traditional hierarchies and managerial processes--the components of a company's "operating system"--can meet the daily demands of running an enterprise, they are rarely equipped to identify important hazards quickly, formulate creative strategic initiatives nimbly, and implement them speedily. The solution Kotter offers is a second system--an agile, networklike structure--that operates in concert with the first to create a dual operating system. In such a system the hierarchy can hand off the pursuit of big strategic initiatives to the strategy network, freeing itself to focus on incremental changes to improve efficiency. The network is populated by employees from all levels of the organization, giving it organizational knowledge, relationships, credibility, and influence. It can liberate information from silos with ease. It has a dynamic structure free of bureaucratic layers, permitting a level of individualism, creativity, and innovation beyond the reach of any hierarchy. The network's core is a guiding coalition that represents each level and department in the hierarchy, with a broad range of skills. Its drivers are members of a "volunteer army" who are energized by and committed to the coalition's vividly formulated, high-stakes vision and strategy. Kotter has helped eight organizations, public and private, build dual operating systems over the past three years. He predicts that such systems will lead to long-term success in the 21st century--for shareholders, customers, employees, and companies themselves.

  8. Accelerating Computation of Large Biological Datasets using MapReduce Framework.

    Science.gov (United States)

    Wang, Chao; Dai, Dong; Li, Xi; Wang, Aili; Zhou, Xuehai

    2016-04-05

    The maximal information coefficient (MIC) has been proposed to discover relationships and associations between pairs of variables. Accelerating the MIC calculation poses significant challenges for bioinformatics scientists, especially in genome sequencing and biological annotation. In this paper we explore a parallel approach which uses the MapReduce framework to improve the computing efficiency and throughput of the MIC computation. The acceleration system includes biological data storage on HDFS, preprocessing algorithms, a distributed memory cache mechanism, and the partitioning of MapReduce jobs. Based on this acceleration approach, we extend the traditional two-variable algorithm to a multiple-variable algorithm. The experimental results show that our parallel solution provides a linear speedup compared with the original algorithm without affecting correctness or sensitivity.
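The map/reduce decomposition of a pairwise-association scan can be sketched as below. The `mapper`/`reducer` names are illustrative, and the absolute Pearson correlation stands in for the MIC statistic (a real MIC estimator involves a grid search over binnings) to keep the example self-contained:

```python
import itertools
import numpy as np

def mapper(pair):
    # Map step: emit ((i, j), score) for one variable pair.
    # |Pearson r| is a stand-in for MIC here.
    (i, xi), (j, xj) = pair
    return (i, j), abs(np.corrcoef(xi, xj)[0, 1])

def reducer(records):
    # Reduce step: keep the strongest association.
    # (A real MapReduce job would first shuffle records by key.)
    return max(records, key=lambda kv: kv[1])

# Three variables: 1 depends linearly on 0; 2 is independent noise.
rng = np.random.default_rng(0)
data = {0: rng.normal(size=200)}
data[1] = 2 * data[0] + rng.normal(scale=0.1, size=200)
data[2] = rng.normal(size=200)

pairs = itertools.combinations(data.items(), 2)
best = reducer(map(mapper, pairs))
print(best[0])   # the strongly related (0, 1) pair wins
```

Because each pair's score is computed independently in the map phase, the scan parallelizes trivially across workers, which is the property the framework exploits.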

  9. Adaptation and optimization of basic operations for an unstructured mesh CFD algorithm for computation on massively parallel accelerators

    Science.gov (United States)

    Bogdanov, P. B.; Gorobets, A. V.; Sukov, S. A.

    2013-08-01

    The design of efficient algorithms for large-scale gas dynamics computations with hybrid (heterogeneous) computing systems whose high performance relies on massively parallel accelerators is addressed. A high-order accurate finite volume algorithm with polynomial reconstruction on unstructured hybrid meshes is used to compute compressible gas flows in domains of complex geometry. The basic operations of the algorithm are implemented in detail for massively parallel accelerators, including AMD and NVIDIA graphics processing units (GPUs). Major optimization approaches and a computation transfer technique are covered. The underlying programming tool is the Open Computing Language (OpenCL) standard, which performs on accelerators of various architectures, both existing and emerging.

  10. Improving Quality of Service and Reducing Power Consumption with WAN accelerator in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Shin-ichi Kuribayashi

    2013-02-01

    The widespread use of cloud computing services is expected to deteriorate Quality of Service and to increase the power consumption of ICT devices, since the distance to a server becomes longer than before. Migration of virtual machines over a wide area can solve many problems such as load balancing and power saving in cloud computing environments. This paper proposes to dynamically apply a WAN accelerator within the network when a virtual machine is moved to a distant center, in order to prevent the degradation in performance after live migration of virtual machines over a wide area. mSCTP-based data transfer using different TCP connections before and after migration is proposed in order to use a currently available WAN accelerator. This paper does not consider the performance degradation of live migration itself. The paper then proposes to reduce the power consumption of ICT devices by actively installing WAN accelerators as part of cloud resources and temporarily increasing the packet transfer rate of the communication link. It is demonstrated that the power consumption with a WAN accelerator could be reduced to one-tenth of that without a WAN accelerator.

  11. Ultrasound window-modulated compounding Nakagami imaging: Resolution improvement and computational acceleration for liver characterization.

    Science.gov (United States)

    Ma, Hsiang-Yang; Lin, Ying-Hsiu; Wang, Chiao-Yin; Chen, Chiung-Nien; Ho, Ming-Chih; Tsui, Po-Hsiang

    2016-08-01

    Ultrasound Nakagami imaging is an attractive method for visualizing changes in envelope statistics. Window-modulated compounding (WMC) Nakagami imaging was reported to improve image smoothness. The sliding window technique is typically used for constructing ultrasound parametric and Nakagami images. Using a large window overlap ratio may improve the WMC Nakagami image resolution but reduces computational efficiency. Therefore, the objectives of this study include: (i) exploring the effects of the window overlap ratio on the resolution and smoothness of WMC Nakagami images; (ii) proposing a fast algorithm that is based on the convolution operator (FACO) to accelerate WMC Nakagami imaging. Computer simulations and preliminary clinical tests on liver fibrosis samples (n=48) were performed to validate the FACO-based WMC Nakagami imaging. The results demonstrated that the width of the autocorrelation function and the parameter distribution of the WMC Nakagami image reduce with the increase in the window overlap ratio. One-pixel shifting (i.e., sliding the window on the image data in steps of one pixel for parametric imaging) as the maximum overlap ratio significantly improves the WMC Nakagami image quality. Concurrently, the proposed FACO method combined with a computational platform that optimizes the matrix computation can accelerate WMC Nakagami imaging, allowing the detection of liver fibrosis-induced changes in envelope statistics. FACO-accelerated WMC Nakagami imaging is a new-generation Nakagami imaging technique with an improved image quality and fast computation.
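The general idea of accelerating sliding-window statistics with a convolution operator, which the FACO approach builds on, can be illustrated in one dimension: a windowed mean computed window by window equals a convolution with a box kernel, replacing many interpreter-level loops with one vectorized pass. This is a generic sketch, not the FACO algorithm itself:

```python
import numpy as np

def window_mean_loop(x, w):
    # Explicit sliding window: one Python-level mean per window position.
    return np.array([x[i:i + w].mean() for i in range(len(x) - w + 1)])

def window_mean_conv(x, w):
    # Same statistic as a single convolution with a box kernel of width w.
    return np.convolve(x, np.ones(w) / w, mode="valid")

x = np.random.default_rng(1).normal(size=4096)
a = window_mean_loop(x, 65)
b = window_mean_conv(x, 65)
print(np.allclose(a, b))   # identical results, far fewer Python-level ops
```

The same trick extends to 2-D parametric imaging, where each pixel of the parameter map is a windowed statistic of the envelope data and a one-pixel window shift corresponds to `mode="valid"` convolution over the whole image.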

  12. Experimental, Theoretical and Computational Studies of Plasma-Based Concepts for Future High Energy Accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, Chan [Univ. of California, Los Angeles, CA (United States); Mori, W. [Univ. of California, Los Angeles, CA (United States)

    2013-10-21

    This is the final report on DOE grant number DE-FG02-92ER40727, titled “Experimental, Theoretical and Computational Studies of Plasma-Based Concepts for Future High Energy Accelerators.” During this grant period the UCLA program on Advanced Plasma Based Accelerators, headed by Professor C. Joshi, made many key scientific advances and trained a generation of students, many of whom have stayed in this research field and even started research programs of their own. In this final report, however, we focus on the last three years of the grant and report on the scientific progress made in each of the four tasks listed under this grant: Plasma Wakefield Accelerator research at FACET, SLAC National Accelerator Laboratory; in-house research at UCLA’s Neptune and 20 TW laser laboratories; laser-wakefield acceleration (LWFA) in the self-guided regime, with experiments at the Callisto laser at LLNL; and theory and simulations. Major scientific results have been obtained in each of the four tasks described in this report. These have led to publications in prestigious scientific journals and the graduation and continued training of high-quality Ph.D. students, and have kept the U.S. at the forefront of the plasma-based accelerator research field.

  13. Acceleration of Radiance for Lighting Simulation by Using Parallel Computing with OpenCL

    Energy Technology Data Exchange (ETDEWEB)

    Zuo, Wangda; McNeil, Andrew; Wetter, Michael; Lee, Eleanor

    2011-09-06

    We report on the acceleration of annual daylighting simulations for fenestration systems in the Radiance ray-tracing program. The algorithm was optimized to reduce both the redundant data input/output operations and the floating-point operations. To further accelerate the simulation speed, the calculation for matrix multiplications was implemented using parallel computing on a graphics processing unit. We used OpenCL, which is a cross-platform parallel programming language. Numerical experiments show that the combination of the above measures can speed up the annual daylighting simulations 101.7 times or 28.6 times when the sky vector has 146 or 2306 elements, respectively.

  14. Acceleration of the matrix multiplication of Radiance three phase daylighting simulations with parallel computing on heterogeneous hardware of personal computer

    Energy Technology Data Exchange (ETDEWEB)

    Zuo, Wangda [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McNeil, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wetter, Michael [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lee, Eleanor S. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2013-05-23

    Building designers are increasingly relying on complex fenestration systems to reduce energy consumed for lighting and HVAC in low energy buildings. Radiance, a lighting simulation program, has been used to conduct daylighting simulations for complex fenestration systems. Depending on the configuration, the simulation can take hours or even days using a personal computer. This paper describes how to accelerate the matrix multiplication portion of a Radiance three-phase daylight simulation by conducting parallel computing on the heterogeneous hardware of a personal computer. The algorithm was optimized and the computational part was implemented in parallel using OpenCL. The speed of the new approach was evaluated using various daylighting simulation cases on a multicore central processing unit and a graphics processing unit. Based on the measurements and analysis of the time usage for the Radiance daylighting simulation, further speedups can be achieved by using fast I/O devices and storing the data in a binary format.

  15. Convergence acceleration for vector sequences and applications to computational fluid dynamics

    Science.gov (United States)

    Sidi, Avram; Celestina, Mark L.

    1990-01-01

    Some recent developments in acceleration of convergence methods for vector sequences are reviewed. The methods considered are the minimal polynomial extrapolation, the reduced rank extrapolation, and the modified minimal polynomial extrapolation. The vector sequences to be accelerated are those that are obtained from the iterative solution of linear or nonlinear systems of equations. The convergence and stability properties of these methods as well as different ways of numerical implementation are discussed in detail. Based on the convergence and stability results, strategies that are useful in practical applications are suggested. Two applications to computational fluid mechanics involving the three dimensional Euler equations for ducted and external flows are considered. The numerical results demonstrate the usefulness of the methods in accelerating the convergence of the time marching techniques in the solution of steady state problems.
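Of the methods reviewed, reduced rank extrapolation (RRE) admits a compact least-squares formulation; the sketch below uses a standard form of RRE on an illustrative linear fixed-point iteration (the function names and test problem are ours, not the paper's):

```python
import numpy as np

def rre(xs):
    """Reduced rank extrapolation of a vector sequence xs = [x_0, ..., x_{k+1}].

    Returns s = x_0 + U @ gamma, where gamma minimizes ||u_0 + W @ gamma||,
    U holds the first differences u_i and W the second differences w_i.
    """
    X = np.asarray(xs, dtype=float)   # shape (k+2, n)
    U = np.diff(X, axis=0).T          # first differences as columns, n x (k+1)
    W = np.diff(U, axis=1)            # second differences, n x k
    gamma, *_ = np.linalg.lstsq(W, -U[:, 0], rcond=None)
    return X[0] + U[:, :-1] @ gamma

# Demo: linear iteration x_{n+1} = M x_n + c with known fixed point (I - M)^-1 c.
M = np.array([[0.2, 0.1], [0.05, 0.3]])
c = np.array([1.0, 2.0])
xs = [np.zeros(2)]
for _ in range(3):                    # x_0 .. x_3 suffice in two dimensions
    xs.append(M @ xs[-1] + c)

s = rre(xs)
exact = np.linalg.solve(np.eye(2) - M, c)
print(np.allclose(s, exact))          # RRE recovers the fixed point
```

For an n-dimensional linear iteration, n+2 iterates already recover the fixed point exactly; for nonlinear systems such as time-marched CFD iterations, the extrapolation is typically restarted in cycles.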

  16. A contribution to the computation of the impedance in acceleration resonators

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Cong

    2016-05-15

    This thesis focuses on the numerical computation of the impedance in acceleration resonators and corresponding components. For this purpose, a dedicated solver based on the Finite Element Method (FEM) has been developed to compute the broadband impedance in accelerating components. In addition, various numerical approaches have been used to calculate the narrow-band impedance in superconducting radio frequency (RF) cavities. An overview of the calculated results as well as comparisons between the applied numerical approaches is provided. During the design phase of superconducting RF accelerating cavities and components, a challenging and difficult task is the determination of the impedance inside the accelerators with the help of proper computer simulations. Impedance describes the electromagnetic interaction between the particle beam and the accelerator, and it can affect the stability of the particle beam. For a superconducting RF accelerating cavity with waveguides (beam pipes and couplers), the narrow-band impedance, also called the shunt impedance, corresponds to the eigenmodes of the cavity. It depends on the eigenfrequencies and the electromagnetic field distributions of the eigenmodes inside the cavity. On the other hand, the broadband impedance describes the interaction of the particle beam in the waveguides with its environment at arbitrary frequency and beam velocity. Together, the narrow-band and broadband impedances give a complete picture of the impedance of the accelerator. In order to calculate the broadband longitudinal space charge impedance for acceleration components, a three-dimensional (3D) solver based on the FEM in the frequency domain has been developed. To calculate the narrow-band impedance for superconducting RF cavities, we used various numerical approaches. Firstly, the eigenmode solver based on the Finite Integration Technique (FIT) and a parallel real-valued FEM (CEM3Dr) eigenmode solver based on

  17. FINAL REPORT DE-FG02-04ER41317 Advanced Computation and Chaotic Dynamics for Beams and Accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Cary, John R [U. Colorado

    2014-09-08

    During the year ending in August 2013, we continued to investigate the potential of photonic crystal (PhC) materials for acceleration purposes. We worked to characterize the acceleration ability of simple PhC accelerator structures, as well as to characterize PhC materials to determine whether current fabrication techniques can meet the needs of future accelerating structures. We have also continued to design and optimize PhC accelerator structures, with the ultimate goal of finding a new kind of accelerator structure that could offer significant advantages over current RF acceleration technology. The design and optimization of these structures requires high-performance computation, and we continue to work on methods to make such computation faster and more efficient.

  18. Proceedings of the conference on computer codes and the linear accelerator community

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, R.K. (comp.)

    1990-07-01

    The conference whose proceedings you are reading was envisioned as the second in a series, the first having been held in San Diego in January 1988. The intended participants were those people who are actively involved in writing and applying computer codes for the solution of problems related to the design and construction of linear accelerators. The first conference reviewed many of the codes both extant and under development. This second conference provided an opportunity to update the status of those codes, and to provide a forum in which emerging new 3D codes could be described and discussed. The afternoon poster session on the second day of the conference provided an opportunity for extended discussion. All in all, this conference was felt to be quite a useful interchange of ideas and developments in the field of 3D calculations, parallel computation, higher-order optics calculations, and code documentation and maintenance for the linear accelerator community. A third conference is planned.

  19. A Low-Power Scalable Stream Compute Accelerator for General Matrix Multiply (GEMM

    Directory of Open Access Journals (Sweden)

    Antony Savich

    2014-01-01

    play an important role in determining the performance of such applications. This paper proposes a novel, efficient, highly scalable hardware accelerator that is of equivalent performance to a 2 GHz quad-core PC but can be used in low-power applications targeting embedded systems requiring high-performance computation. Power, performance, and resource consumption are demonstrated on a fully functional prototype. The proposed hardware accelerator is 36× more energy efficient per unit of computation compared to a state-of-the-art Xeon processor of equal vintage and is 14× more efficient as a stand-alone platform with equivalent performance. An important comparison between simulated system estimates and real system performance is carried out.

  20. Accelerating Spaceborne SAR Imaging Using Multiple CPU/GPU Deep Collaborative Computing

    Directory of Open Access Journals (Sweden)

    Fan Zhang

    2016-04-01

    With the development of synthetic aperture radar (SAR) technologies in recent years, the huge amount of remote sensing data brings challenges for real-time imaging processing. Therefore, high performance computing (HPC) methods have been presented to accelerate SAR imaging, especially GPU based methods. In the classical GPU based imaging algorithm, the GPU is employed to accelerate image processing by massively parallel computing, and the CPU is only used to perform auxiliary work such as data input/output (IO). However, the computing capability of the CPU is ignored and underestimated. In this work, a new deep collaborative SAR imaging method based on multiple CPUs/GPUs is proposed to achieve real-time SAR imaging. Through the proposed task partitioning and scheduling strategy, the whole image can be generated with deep collaborative multiple CPU/GPU computing. In the CPU parallel imaging part, the advanced vector extension (AVX) method is first introduced into the multi-core CPU parallel method for higher efficiency. As for the GPU parallel imaging, not only are the bottlenecks of memory limitation and frequent data transfers broken, but various optimization strategies are also applied, such as streaming, parallel pipelining and so on. Experimental results demonstrate that the deep CPU/GPU collaborative imaging method enhances the efficiency of SAR imaging by 270 times over a single-core CPU and realizes real-time imaging in that the imaging rate outperforms the raw data generation rate.

  1. Accelerating Spaceborne SAR Imaging Using Multiple CPU/GPU Deep Collaborative Computing.

    Science.gov (United States)

    Zhang, Fan; Li, Guojun; Li, Wei; Hu, Wei; Hu, Yuxin

    2016-04-07

    With the development of synthetic aperture radar (SAR) technologies in recent years, the huge amount of remote sensing data brings challenges for real-time imaging processing. Therefore, high performance computing (HPC) methods have been presented to accelerate SAR imaging, especially GPU based methods. In the classical GPU based imaging algorithm, the GPU is employed to accelerate image processing by massively parallel computing, and the CPU is only used to perform auxiliary work such as data input/output (IO). However, the computing capability of the CPU is ignored and underestimated. In this work, a new deep collaborative SAR imaging method based on multiple CPUs/GPUs is proposed to achieve real-time SAR imaging. Through the proposed task partitioning and scheduling strategy, the whole image can be generated with deep collaborative multiple CPU/GPU computing. In the CPU parallel imaging part, the advanced vector extension (AVX) method is first introduced into the multi-core CPU parallel method for higher efficiency. As for the GPU parallel imaging, not only are the bottlenecks of memory limitation and frequent data transfers broken, but various optimization strategies are also applied, such as streaming, parallel pipelining and so on. Experimental results demonstrate that the deep CPU/GPU collaborative imaging method enhances the efficiency of SAR imaging by 270 times over a single-core CPU and realizes real-time imaging in that the imaging rate outperforms the raw data generation rate.

  2. BioEM: GPU-accelerated computing of Bayesian inference of electron microscopy images

    CERN Document Server

    Cossio, Pilar; Baruffa, Fabio; Rampp, Markus; Lindenstruth, Volker; Hummer, Gerhard

    2016-01-01

    In cryo-electron microscopy (EM), molecular structures are determined from large numbers of projection images of individual particles. To harness the full power of this single-molecule information, we use the Bayesian inference of EM (BioEM) formalism. By ranking structural models using posterior probabilities calculated for individual images, BioEM in principle addresses the challenge of working with highly dynamic or heterogeneous systems not easily handled in traditional EM reconstruction. However, the calculation of these posteriors for large numbers of particles and models is computationally demanding. Here we present highly parallelized, GPU-accelerated computer software that performs this task efficiently. Our flexible formulation employs CUDA, OpenMP, and MPI parallelization combined with both CPU and GPU computing. The resulting BioEM software scales nearly ideally both on pure CPU and on CPU+GPU architectures, thus enabling Bayesian analysis of tens of thousands of images in a reasonable time. The g...
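The model-ranking step described above can be illustrated with a toy posterior computation: per-image log-likelihoods for each candidate model are summed, a log-prior is added, and the result is normalized. This is a generic Bayesian sketch, not the BioEM code; all sizes and values are synthetic.

```python
import numpy as np

# Toy illustration of ranking structural models by posterior probability
# accumulated over many particle images:
#   log P(m | images) ∝ sum_i log P(image_i | m) + log P(m)

rng = np.random.default_rng(0)
n_models, n_images = 3, 1000
log_lik = rng.normal(size=(n_models, n_images))   # synthetic per-image log-likelihoods
log_lik[1] += 0.1                                 # make model 1 fit slightly better
log_prior = np.log(np.full(n_models, 1.0 / n_models))  # uniform prior over models

log_post = log_lik.sum(axis=1) + log_prior
log_post -= np.max(log_post)                      # stabilize before exponentiating
post = np.exp(log_post) / np.exp(log_post).sum()
best_model = int(np.argmax(post))
```

The per-image sums are what make the computation expensive at scale: with tens of thousands of images and many models (and, in the real code, orientation and nuisance-parameter integrals per image), the inner loop is the natural target for GPU parallelism.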

  3. Concepts and techniques: Active electronics and computers in safety-critical accelerator operation

    Energy Technology Data Exchange (ETDEWEB)

    Frankel, R.S.

    1995-12-31

The Relativistic Heavy Ion Collider (RHIC), under construction at Brookhaven National Laboratory, requires an extensive Access Control System to protect personnel from radiation, oxygen-deficiency, and electrical hazards. In addition, the complicated nature of operating the Collider as part of a complex of other accelerators necessitates the use of active electronic measurement circuitry to ensure compliance with established Operational Safety Limits. Solutions were devised that permit the use of modern computer and interconnection technology for safety-critical applications while preserving and enhancing tried-and-proven protection methods. In addition, a set of guidelines regarding required performance for accelerator safety systems and a handbook of design criteria and rules were developed to assist future system designers and to provide a framework for internal review and regulation.

  4. Accelerating Relevance-Vector-Machine-Based Classification of Hyperspectral Image with Parallel Computing

    Directory of Open Access Journals (Sweden)

    Chao Dong

    2012-01-01

Full Text Available Benefiting from the kernel trick and the sparsity property, the relevance vector machine (RVM) can acquire a sparse solution with generalization ability equivalent to that of the support vector machine. The sparse solution requires much less time in prediction, making the RVM a strong candidate for classifying large-scale hyperspectral images. However, the RVM is not widely used because of its slow training procedure. To solve this problem, the classification of hyperspectral images using the RVM is accelerated by parallel computing techniques in this paper. The parallelization is examined from the aspects of the multiclass strategy, the ensemble of multiple weak classifiers, and the matrix operations. The parallel RVMs are implemented using the C language plus the parallel functions of linear algebra packages and the message passing interface (MPI) library. The proposed methods are evaluated on the AVIRIS Indian Pines data set on a Beowulf cluster and on multicore platforms. The results show that the parallel RVMs accelerate the training procedure considerably.
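The multiclass level of parallelism mentioned above exploits the fact that, in a one-vs-rest scheme, the per-class binary trainings are independent and can run concurrently. The sketch below is a generic illustration (in Python rather than the paper's C/MPI), with a trivial centroid "classifier" standing in for an actual RVM.

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

# One-vs-rest multiclass parallelism: train one binary model per class,
# running the independent trainings concurrently. The centroid model below
# is a placeholder for a real per-class RVM training.

def train_binary(X, y, positive_class):
    mask = (y == positive_class)
    return positive_class, X[mask].mean(axis=0)   # "model" = class centroid

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(c, 0.1, size=(50, 4)) for c in range(3)])  # 3 classes
y = np.repeat(np.arange(3), 50)

with ThreadPoolExecutor() as pool:
    centroids = dict(pool.map(lambda c: train_binary(X, y, c), range(3)))

def predict(x):
    # Nearest-centroid decision, standing in for the per-class RVM votes
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

pred = predict(np.full(4, 2.0))
```

Because the class-wise trainings share no state, the same decomposition maps directly onto MPI ranks or cluster nodes, which is the coarse-grained layer the paper combines with parallel matrix operations.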

  5. Computational Materials Science and Chemistry: Accelerating Discovery and Innovation through Simulation-Based Engineering and Science

    Energy Technology Data Exchange (ETDEWEB)

    Crabtree, George [Argonne National Lab. (ANL), Argonne, IL (United States); Glotzer, Sharon [University of Michigan; McCurdy, Bill [University of California Davis; Roberto, Jim [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2010-07-26

    This report is based on a SC Workshop on Computational Materials Science and Chemistry for Innovation on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software. This rate of improvement, which shows no sign of

  6. Proposing a Strategic Framework for Distributed Manufacturing Execution System Using Cloud Computing

    Directory of Open Access Journals (Sweden)

    Shiva Khalili Gheidari

    2013-07-01

Full Text Available This paper introduces a strategic framework that uses service-oriented architecture to design a distributed MES over the cloud. The main structure of the framework is defined as a series of modules that communicate with each other through a design pattern called mediator. The framework focuses on the main module, which coordinates distributed orders with the other modules, and finally argues the benefit of using the cloud in comparison with previous architectures. The main structure of the framework (the mediator) and the benefit of focusing on the main module by using the cloud should be emphasized more; also, the aim and the results of comparing this method with previous architectures, whether qualitative or quantitative, are not described.
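The mediator pattern named above decouples modules by routing all communication through a central broker, so modules never call each other directly. A minimal sketch follows; the module and topic names are illustrative, not taken from the paper.

```python
# Minimal mediator-pattern sketch: modules publish messages to a mediator,
# which routes them to registered handlers. "OrderModule" and the "order"
# topic are invented names for illustration.

class Mediator:
    def __init__(self):
        self.handlers = {}

    def register(self, topic, handler):
        self.handlers.setdefault(topic, []).append(handler)

    def send(self, topic, payload):
        # Fan the message out to every handler subscribed to this topic
        return [h(payload) for h in self.handlers.get(topic, [])]

class OrderModule:
    def __init__(self, mediator):
        self.mediator = mediator

    def dispatch(self, order):
        # The order module only talks to the mediator, never to peers directly
        return self.mediator.send("order", order)

med = Mediator()
med.register("order", lambda order: f"site-A scheduled {order}")
results = OrderModule(med).dispatch("job-42")
```

In a cloud-hosted MES, the mediator role is typically played by a message broker or service bus, which is what makes adding or relocating execution sites cheap.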

  7. Computer simulation of 2-D and 3-D ion beam extraction and acceleration

    Energy Technology Data Exchange (ETDEWEB)

    Ido, Shunji; Nakajima, Yuji [Saitama Univ., Urawa (Japan). Faculty of Engineering

    1997-03-01

The two-dimensional code and the three-dimensional code have been developed to study the physical features of ion beams in the extraction and acceleration stages. Using the two-dimensional code, the design of the first electrode (plasma grid) is examined with regard to beam divergence. In the computational studies using the three-dimensional code, the axis-off model of the ion beam is investigated. It is found that the deflection angle of the ion beam is proportional to the gap displacement of the electrodes. (author)

  9. Semiempirical Quantum Chemical Calculations Accelerated on a Hybrid Multicore CPU-GPU Computing Platform.

    Science.gov (United States)

    Wu, Xin; Koslowski, Axel; Thiel, Walter

    2012-07-10

    In this work, we demonstrate that semiempirical quantum chemical calculations can be accelerated significantly by leveraging the graphics processing unit (GPU) as a coprocessor on a hybrid multicore CPU-GPU computing platform. Semiempirical calculations using the MNDO, AM1, PM3, OM1, OM2, and OM3 model Hamiltonians were systematically profiled for three types of test systems (fullerenes, water clusters, and solvated crambin) to identify the most time-consuming sections of the code. The corresponding routines were ported to the GPU and optimized employing both existing library functions and a GPU kernel that carries out a sequence of noniterative Jacobi transformations during pseudodiagonalization. The overall computation times for single-point energy calculations and geometry optimizations of large molecules were reduced by one order of magnitude for all methods, as compared to runs on a single CPU core.
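The Jacobi transformations mentioned above zero out one off-diagonal element of a symmetric matrix at a time via plane rotations. The snippet below is a generic single-sweep textbook sketch in NumPy, not the ported GPU kernel.

```python
import numpy as np

# One sweep of classical Jacobi rotations over a symmetric matrix: for each
# pair (p, q), pick the rotation angle that annihilates A[p, q] and apply
# the similarity transform Jᵀ A J.

def jacobi_sweep(A):
    A = A.copy()
    n = A.shape[0]
    for p in range(n - 1):
        for q in range(p + 1, n):
            if abs(A[p, q]) < 1e-15:
                continue
            theta = 0.5 * np.arctan2(2 * A[p, q], A[q, q] - A[p, p])
            c, s = np.cos(theta), np.sin(theta)
            J = np.eye(n)
            J[p, p] = J[q, q] = c
            J[p, q], J[q, p] = s, -s
            A = J.T @ A @ J          # similarity transform preserves eigenvalues
    return A

A = np.array([[4.0, 1.0], [1.0, 3.0]])
D = jacobi_sweep(A)
off_diag = abs(D[0, 1])
```

For a 2x2 matrix one rotation diagonalizes exactly; in general, sweeps are repeated until the off-diagonal norm is negligible. Rotations acting on disjoint (p, q) pairs are independent, which is what makes the sequence amenable to a GPU kernel.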

  10. A Unified Algorithm for Accelerating Edit-Distance Computation via Text-Compression

    CERN Document Server

    Hermelin, Danny; Landau, Shir; Weimann, Oren

    2009-01-01

We present a unified framework for accelerating edit-distance computation between two compressible strings using straight-line programs. For two strings of total length $N$ having straight-line program representations of total size $n$, we provide an algorithm running in $O(n^{1.4}N^{1.2})$ time for computing the edit-distance of these two strings under any rational scoring function, and an $O(n^{1.34}N^{1.34})$ time algorithm for arbitrary scoring functions. This improves on a recent algorithm of Tiskin that runs in $O(nN^{1.5})$ time and works only for rational scoring functions. In the last part of the paper, we show how the classical Four-Russians technique can be incorporated into our SLP edit-distance scheme, giving a simple $\Omega(\lg N)$ speed-up in the case of arbitrary scoring functions, for any pair of strings.
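For context, the accelerated schemes above improve on the classical quadratic dynamic program, which fills an $O(|a| \cdot |b|)$ table of prefix distances. That baseline recurrence is:

```python
# Classical edit-distance DP (unit costs), the O(|a|*|b|) baseline that the
# SLP and Four-Russians techniques accelerate. Only two table rows are kept.

def edit_distance(a, b):
    m, n = len(a), len(b)
    prev = list(range(n + 1))                 # distances from "" to prefixes of b
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cur[j] = min(prev[j] + 1,                               # deletion
                         cur[j - 1] + 1,                            # insertion
                         prev[j - 1] + (a[i - 1] != b[j - 1]))      # substitution
        prev = cur
    return prev[n]

d = edit_distance("kitten", "sitting")   # the classic example, distance 3
```

The compression-based algorithms exploit repeated substrings (captured by the straight-line program) to avoid recomputing identical sub-blocks of this table.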

  11. Computationally efficient methods for modelling laser wakefield acceleration in the blowout regime

    CERN Document Server

    Cowan, B M; Beck, A; Davoine, X; Bunkers, K; Lifschitz, A F; Lefebvre, E; Bruhwiler, D L; Shadwick, B A; Umstadter, D P

    2012-01-01

    Electron self-injection and acceleration until dephasing in the blowout regime is studied for a set of initial conditions typical of recent experiments with 100 terawatt-class lasers. Two different approaches to computationally efficient, fully explicit, three-dimensional particle-in-cell modelling are examined. First, the Cartesian code VORPAL using a perfect-dispersion electromagnetic solver precisely describes the laser pulse and bubble dynamics, taking advantage of coarser resolution in the propagation direction, with a proportionally larger time step. Using third-order splines for macroparticles helps suppress the sampling noise while keeping the usage of computational resources modest. The second way to reduce the simulation load is using reduced-geometry codes. In our case, the quasi-cylindrical code CALDER-CIRC uses decomposition of fields and currents into a set of poloidal modes, while the macroparticles move in the Cartesian 3D space. Cylindrical symmetry of the interaction allows using just two mo...

  12. Historic Seismicity, Computed Peak Ground Accelerations, and Seismic Site Conditions for Northeast Mexico

    Science.gov (United States)

    Montalvo-Arriet, J. C.; Galván-Ramírez, I. N.; Ramos-Zuñiga, L. G.; Navarro de León, I.; Ramírez-Fernández, J. A.; Quintanilla-López, Y.; Cavazos-Tovar, N. P.

    2007-05-01

    In this study we present the historic seismicity, computed peak ground accelerations, and mapping of seismic site conditions for northeast Mexico. We start with a compilation of the regional seismicity in northeast Mexico (24- 31°N, 87-106°W) for the 1787-2006 period. Our study area lies within three morphotectonic provinces: Basin and Range and Rio Grande rift, Sierra Madre Oriental and Gulf Coastal Plain. Peak ground acceleration (PGA) maps were computed for three different scenarios: 1928 Parral, Chihuahua (MW = 6.5); 1931 Valentine, Texas (MW = 6.4); and a hypothetical earthquake located in central Coahuila (MW = 6.5). Ground acceleration values were computed using attenuation relations developed for central and eastern North America and the Basin and Range province. The hypothetical earthquake in central Coahuila is considered a critical scenario for the main cities of northeast Mexico. The damage associated with this hypothetical earthquake could be severe because the majority of the buildings were constructed without allowance for seismic accelerations. The expected PGA values in Monterrey, Saltillo and Monclova range from 30 to 70 cm/s2 (0.03 to 0.07g). This earthquake might also produce or trigger significant landslides and rock falls in the Sierra Madre Oriental, where several cities are located (e.g. suburbs of Monterrey). Additionally, the Vs30 distribution for the state of Nuevo Leon and the cities of Linares and Monterrey are presented. The Vs30 data was obtained using seismic refraction profiling correlated with borehole information. According to NEHRP soil classification, sites classes A, B and C are dominant. Sites with class D occupy minor areas in both cities. Due to the semi-arid conditions in northeast Mexico, we obtained the highest values of Vs30 in Quaternary deposits (alluvium) cemented by caliche. Similar values of Vs30 were obtained in Reno and Las Vegas, Nevada. This work constitutes the first attempt at understanding and
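A PGA map cell is computed by evaluating an attenuation relation for the scenario magnitude at the cell's distance from the source. The sketch below uses a generic placeholder form, ln PGA = c0 + c1·M − c2·ln R − c3·R, with invented coefficients; it is not one of the central/eastern North America or Basin and Range relations actually used in the study.

```python
import math

# Hedged illustration of an attenuation-relation evaluation. The functional
# form and the coefficients c0..c3 are generic placeholders, not the
# published relations used for northeast Mexico.

def pga_cm_s2(magnitude, distance_km, c0=1.2, c1=1.1, c2=1.0, c3=0.002):
    ln_pga = c0 + c1 * magnitude - c2 * math.log(distance_km) - c3 * distance_km
    return math.exp(ln_pga)

# A MW 6.5 scenario evaluated at two site distances
near = pga_cm_s2(6.5, 50.0)
far = pga_cm_s2(6.5, 200.0)
```

Evaluating such a relation over a grid of site distances (optionally adjusted by the Vs30 site classes) yields the scenario PGA maps described above.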

  13. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing

    Science.gov (United States)

    Fang, Ye; Ding, Yun; Feinstein, Wei P.; Koppelman, David M.; Moreno, Juana; Jarrell, Mark; Ramanujam, J.; Brylinski, Michal

    2016-01-01

    Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249. PMID:27420300
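The Monte Carlo algorithm GeauxDock builds on is, at its core, a Metropolis acceptance loop over perturbed poses. The sketch below shows that loop on a one-dimensional toy "energy landscape"; the energy function and all parameters are stand-ins, not GeauxDock's scoring function.

```python
import random, math

# Generic Metropolis Monte Carlo loop of the kind a docking engine is built
# on: propose a perturbed pose, accept downhill moves always and uphill
# moves with Boltzmann probability, and track the best pose seen.

def energy(x):
    return (x - 2.0) ** 2          # toy score with its minimum at x = 2

def metropolis(steps=5000, temperature=0.5, seed=42):
    rng = random.Random(seed)
    x = 0.0
    best_x, best_e = x, energy(x)
    for _ in range(steps):
        cand = x + rng.uniform(-0.5, 0.5)          # perturb the "pose"
        delta = energy(cand) - energy(x)
        if delta <= 0 or rng.random() < math.exp(-delta / temperature):
            x = cand                               # Metropolis acceptance
            if energy(x) < best_e:
                best_x, best_e = x, energy(x)
    return best_x, best_e

best_x, best_e = metropolis()
```

In a real docking run the "pose" is a ligand's position, orientation, and conformation, and the energy combines the physics-based and knowledge-based terms described above; many such chains run independently, which is what heterogeneous accelerators exploit.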

  16. Computing Principal Eigenvectors of Large Web Graphs: Algorithms and Accelerations Related to PageRank and HITS

    Science.gov (United States)

    Nagasinghe, Iranga

    2010-01-01

This thesis investigates and develops a few acceleration techniques for the search engine algorithms used in PageRank and HITS computations. PageRank and HITS are two highly successful applications of modern linear algebra in computer science and engineering. They constitute the essential technologies that account for the immense growth and…
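The computation being accelerated is the power iteration for the principal eigenvector of the (damped) link matrix. A standard small-scale version:

```python
import numpy as np

# Standard PageRank power iteration: iterate r <- d*M*r + (1-d)/n until the
# rank vector converges, where M is the column-stochastic link matrix.

def pagerank(adj, damping=0.85, tol=1e-10, max_iter=200):
    n = adj.shape[0]
    out_deg = adj.sum(axis=1)
    # Column-stochastic transition matrix (rows with no outlinks -> uniform)
    M = np.where(out_deg[:, None] > 0,
                 adj / np.maximum(out_deg[:, None], 1),
                 1.0 / n).T
    r = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        r_new = damping * M @ r + (1 - damping) / n
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new
    return r

# 3-page web forming a cycle 0 -> 1 -> 2 -> 0; by symmetry the ranks are uniform
adj = np.array([[0, 1, 0], [0, 0, 1], [1, 0, 0]], dtype=float)
r = pagerank(adj)
```

On web-scale graphs M is sparse with billions of nonzeros, so acceleration techniques target the number of iterations and the cost of each sparse matrix-vector product.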

  17. Computer-Based Training for Strategic Decision Making Development of Three Tutorials.

    Science.gov (United States)

    1987-09-01

application package are discussed. … growth of computer usage and the current state of the art in industry and education. It will present current instructional techniques for teaching

  18. Command, Control, Communication, Computers and Information Technology (C4&IT). Strategic Plan, FY2008 - 2012

    Science.gov (United States)

    2008-01-01

environment that includes migration of Microsite participants and documents • External Web site via FatWire Content Management System (CMS) to…

  19. Computer-mediated communication as a channel for social resistance : The strategic side of SIDE

    NARCIS (Netherlands)

    Spears, R; Lea, M; Corneliussen, RA; Postmes, T; Ter Haar, W

    2002-01-01

    In two studies, the authors tested predictions derived from the social identity model of deindividuation effects (SIDE) concerning the potential of computer-mediated communication (CMC) to serve as a means to resist powerful out-groups. Earlier research using the SIDE model indicates that the anonym

  20. Teaching Strategic Text Review by Computer and Interaction with Student Characteristics.

    Science.gov (United States)

    Tobias, Sigmund

    1988-01-01

    Discussion of reading strategies focuses on a study of high school students that used three presentation modes via computer, with and without explanations about the value of text review. Highlights include pretests and posttests, student characteristics, test anxiety, prior knowledge, and implications for aptitude treatment interaction research…

  1. Accelerating Computation of DCM for ERP in MATLAB by External Function Calls to the GPU.

    Science.gov (United States)

    Wang, Wei-Jen; Hsieh, I-Fan; Chen, Chun-Chuan

    2013-01-01

This study aims to improve the performance of Dynamic Causal Modelling for Event Related Potentials (DCM for ERP) in MATLAB by using external function calls to a graphics processing unit (GPU). DCM for ERP is an advanced method for studying neuronal effective connectivity. DCM utilizes an iterative procedure, the expectation maximization (EM) algorithm, to find the optimal parameters given a set of observations and the underlying probability model. As the EM algorithm is computationally demanding and the analysis faces possible combinatorial explosion of models to be tested, we propose a parallel computing scheme using the GPU to achieve a fast estimation of DCM for ERP. The computation of DCM for ERP is dynamically partitioned and distributed to threads for parallel processing, according to the DCM model complexity and the hardware constraints. The performance efficiency of this hardware-dependent thread arrangement strategy was evaluated using the synthetic data. The experimental data were used to validate the accuracy of the proposed computing scheme and quantify the time saving in practice. The simulation results show that the proposed scheme can accelerate the computation by a factor of 155 for the parallel part. For experimental data, the speedup factor is about 7 per model on average, depending on the model complexity and the data. This GPU-based implementation of DCM for ERP gives qualitatively the same results as the original MATLAB implementation does at the group level analysis. In conclusion, we believe that the proposed GPU-based implementation is very useful for users as a fast screening tool to select the most likely model and may provide implementation guidance for possible future clinical applications such as online diagnosis.

  3. Parallelizing Epistasis Detection in GWAS on FPGA and GPU-Accelerated Computing Systems.

    Science.gov (United States)

    González-Domínguez, Jorge; Wienbrandt, Lars; Kässens, Jan Christian; Ellinghaus, David; Schimmler, Manfred; Schmidt, Bertil

    2015-01-01

    High-throughput genotyping technologies (such as SNP-arrays) allow the rapid collection of up to a few million genetic markers of an individual. Detecting epistasis (based on 2-SNP interactions) in Genome-Wide Association Studies is an important but time consuming operation since statistical computations have to be performed for each pair of measured markers. Computational methods to detect epistasis therefore suffer from prohibitively long runtimes; e.g., processing a moderately-sized dataset consisting of about 500,000 SNPs and 5,000 samples requires several days using state-of-the-art tools on a standard 3 GHz CPU. In this paper, we demonstrate how this task can be accelerated using a combination of fine-grained and coarse-grained parallelism on two different computing systems. The first architecture is based on reconfigurable hardware (FPGAs) while the second architecture uses multiple GPUs connected to the same host. We show that both systems can achieve speedups of around four orders-of-magnitude compared to the sequential implementation. This significantly reduces the runtimes for detecting epistasis to only a few minutes for moderately-sized datasets and to a few hours for large-scale datasets.
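The pairwise scan being accelerated above can be made concrete with a naive version: for every pair of SNPs, tally the 3x3 joint-genotype counts in cases and controls and score the pair. The chi-square-style statistic below is a simple stand-in for the tools' actual tests, and all data are synthetic.

```python
from itertools import combinations
import numpy as np

# Naive 2-SNP epistasis scan: score every SNP pair by comparing case/control
# counts across the 9 joint genotypes. Real tools compute statistics like
# this for billions of pairs, hence the FPGA/GPU acceleration.

def chi2_pair(g1, g2, pheno):
    """Chi-square-style score over the 9 joint-genotype cells."""
    stat = 0.0
    for a in range(3):
        for b in range(3):
            cell = (g1 == a) & (g2 == b)
            n_case = np.sum(cell & (pheno == 1))
            n_ctrl = np.sum(cell & (pheno == 0))
            total = n_case + n_ctrl
            if total == 0:
                continue
            exp_case = total * np.mean(pheno)   # expected cases under independence
            if exp_case in (0, total):
                continue                        # degenerate cell, skip
            exp_ctrl = total - exp_case
            stat += ((n_case - exp_case) ** 2 / exp_case
                     + (n_ctrl - exp_ctrl) ** 2 / exp_ctrl)
    return stat

rng = np.random.default_rng(3)
genotypes = rng.integers(0, 3, size=(5, 100))   # 5 SNPs x 100 samples (0/1/2)
pheno = rng.integers(0, 2, size=100)            # 0 = control, 1 = case
scores = {pair: chi2_pair(genotypes[pair[0]], genotypes[pair[1]], pheno)
          for pair in combinations(range(5), 2)}
```

With 500,000 SNPs there are about 1.25 x 10^11 pairs, which is why even a cheap per-pair statistic yields multi-day runtimes on a CPU and four-orders-of-magnitude speedups matter.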

  4. An Experimental and Computational Study of a Shock-Accelerated Heavy Gas Cylinder

    Science.gov (United States)

    Zoldi, Cindy; Prestridge, Katherine; Tomkins, Christopher; Marr-Lyon, Mark; Rightley, Paul; Benjamin, Robert; Vorobieff, Peter

    2002-11-01

We present updated results of an experimental and computational study that examines the evolution of a heavy gas (SF6) cylinder surrounded by air when accelerated by a planar Mach 1.2 shock wave. From each shock tube experiment, we obtain one image of the experimental initial conditions and six images of the time evolution of the cylinder. The implementation of Particle Image Velocimetry (PIV) also allows us to determine the velocity field at the last experimental time. Simulations incorporating the two-dimensional image of the experimental initial conditions are performed using the adaptive-mesh Eulerian code, RAGE. A computational study shows that agreement between the measured and computed velocities is achieved by decreasing the peak SF6 concentration to 60%, which was measured in the previous "gas curtain" experiments, and by diffusing the air/SF6 interface in the experimental initial conditions. These modifications are consistent with the observation that the SF6 gas diffuses faster than the fog particles used to track it. Images of the experimental initial conditions, obtained using planar laser Rayleigh scattering, quantify the diffusion lag between the SF6 gas and the fog particles.

  5. Seize the “Broadband China” Strategic Opportunity to Accelerate the Comprehensive Development of Radio and Television Networks

    Institute of Scientific and Technical Information of China (English)

    李智勇

    2015-01-01

    Over recent years, China has issued a series of policies around the “Broadband China” strategy. The four state-owned giants, China Telecom, China Unicom, China Mobile and China Radio and Television Network, have unveiled their respective strategies, measures and development goals. Judging from the strategies and goals of these four national operators, each is playing to its strengths, leveraging its advantages to the fullest, while sparing no effort to expand into other sectors. The Internet, cloud computing, big data and “Broadband China” have become key factors in the development of modern enterprises, and “Internet plus” has ignited innovation across industries. Innovation through reform has become a fundamental rule and logic of economic and social development. China Radio and Television Networks must firmly seize the “Broadband China” strategic opportunity and plan for network convergence and the long-term strategic evolution of the next-generation broadcast television network, in order to accelerate the development of intelligent broadband networks.

  6. X-ray beam hardening correction for measuring density in linear accelerator industrial computed tomography

    Institute of Scientific and Technical Information of China (English)

    ZHOU Ri-Feng; WANG Jue; CHEN Wei-Min

    2009-01-01

    Because X-ray attenuation is approximately proportional to material density, it is possible to measure internal density accurately from Industrial Computed Tomography (ICT) images. In practice, however, a number of factors, including the non-linear effects of beam hardening and diffuse scattered radiation, complicate the quantitative measurement of density variations in materials. This paper builds on the linearization method of beam hardening correction, using polynomial fitting coefficients obtained from the curvature of polychromatic beam data for iron to fit other materials. Through theoretical deduction, the paper shows that the density measurement error is less than 2% if pre-filters are used to confine the linear accelerator spectrum mainly to the range 0.3 MeV to 3 MeV. Experiments were carried out on an ICT system with a 9 MeV electron linear accelerator, with satisfactory results. This technique makes beam hardening correction easy and simple, and it is valuable for ICT density measurement and for using CT images to recognize materials.
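    The linearization idea can be illustrated with a small synthetic example: fit a polynomial that maps measured polychromatic projection values back onto the ideal linear relation p = mu * t obtained from a calibration step wedge. This is a hedged sketch of the general technique, not the paper's coefficients; the attenuation value and the quadratic hardening term are invented for illustration.

```python
import numpy as np

def fit_linearization(p_meas, t, mu_ref, deg=3):
    """Fit polynomial coefficients mapping measured polychromatic
    projection values onto the ideal linear relation p = mu_ref * t."""
    return np.poly1d(np.polyfit(p_meas, mu_ref * t, deg))

# Synthetic calibration: beam hardening makes thick paths under-attenuate
t = np.linspace(0.0, 10.0, 50)            # step-wedge thicknesses (cm)
mu_ref = 0.5                              # illustrative attenuation coefficient (1/cm)
p_lin = mu_ref * t                        # ideal monochromatic projection
p_meas = p_lin - 0.02 * p_lin ** 2        # simulated beam-hardened measurement

correct = fit_linearization(p_meas, t, mu_ref)
p_corr = correct(p_meas)                  # corrected projections, nearly linear again
```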

  7. A Fast GPU-accelerated Mixed-precision Strategy for Fully Nonlinear Water Wave Computations

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig; Engsig-Karup, Allan Peter; Madsen, Morten G.

    2011-01-01

    We present performance results of a mixed-precision strategy developed to improve a recently developed massively parallel GPU-accelerated tool for fast and scalable simulation of unsteady fully nonlinear free surface water waves over uneven depths (Engsig-Karup et al. 2011). The underlying wave model is based on a potential flow formulation, which requires efficient solution of a Laplace problem at large scales. We report recent results on a new mixed-precision strategy for efficient, iterative, high-order accurate and scalable solution of the Laplace problem using a multigrid-preconditioned defect correction method. The strategy improves performance by exploiting architectural features of modern GPUs for mixed-precision computations and is tested in a recently developed generic library for fast prototyping of PDE solvers. The new wave tool is applicable to solve and analyze...
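    The defect correction loop at the heart of such a strategy is compact: the residual (defect) is computed in double precision, the correction is solved cheaply in single precision, and the solution is accumulated in double precision. The sketch below is a generic illustration of this pattern, with a single-precision direct solve standing in for the paper's GPU multigrid cycle:

```python
import numpy as np

def mixed_precision_defect_correction(A, b, inner_solve, tol=1e-12, max_iter=50):
    """Defect correction: residual in float64, correction solved in
    float32 by `inner_solve`, solution accumulated in float64."""
    x = np.zeros_like(b, dtype=np.float64)
    for _ in range(max_iter):
        r = b - A @ x                                   # high-precision defect
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        e = inner_solve(A.astype(np.float32), r.astype(np.float32))
        x += e.astype(np.float64)                       # double-precision accumulate
    return x

# A single-precision direct solve stands in for a GPU multigrid cycle
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])
x = mixed_precision_defect_correction(A, b, lambda A32, r32: np.linalg.solve(A32, r32))
```

    Because the inner solve only needs to reduce the defect, its low precision does not limit the final accuracy; it mainly trades memory bandwidth for extra outer iterations, which is the GPU-friendly trade-off the abstract describes.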

  8. GPU-accelerated computational tool for studying the effectiveness of asteroid disruption techniques

    Science.gov (United States)

    Zimmerman, Ben J.; Wie, Bong

    2016-10-01

    This paper presents the development of a new Graphics Processing Unit (GPU) accelerated computational tool for asteroid disruption techniques. Numerical simulations are completed using the high-order spectral difference (SD) method. Due to the compact nature of the SD method, it is well suited for implementation on the GPU architecture, hence solutions are generated orders of magnitude faster than with the Central Processing Unit (CPU) counterpart. A multiphase model integrated with the SD method is introduced, and several asteroid disruption simulations are conducted, including kinetic-energy impactors, multi-kinetic-energy impactor systems, and nuclear options. Results illustrate the benefits of using multi-kinetic-energy impactor systems when compared to a single impactor system. In addition, the effectiveness of nuclear options is observed.

  9. Flexusi Interface Builder For Computer Based Accelerator Monitoring And Control System

    CERN Document Server

    Kurakin, V G; Kurakin, P V

    2004-01-01

    We have developed computer code for designing any desired graphical user interface for a monitoring and control system at the executable level. This means that an operator can build a measurement console consisting of virtual devices before, or even during, a real experiment without recompiling source files. Such functionality yields a number of advantages compared with traditional programming. First of all, the risk of introducing bugs into the source code disappears. Another important point is that program developers and operator staff do not interfere with each other in developing the ultimate product (the measurement console). Thus, a small team without a detailed project plan can design even a very complicated monitoring and control system. For the reasons mentioned above, the suggested approach is especially helpful for large complexes to be monitored and controlled, accelerators being among them. The program code consists of several modules responsible for data acquisition, control and representation. Borland C++ Builder technologies based on VCL...

  10. Strategic Adaptation

    DEFF Research Database (Denmark)

    Andersen, Torben Juul

    2015-01-01

    This article provides an overview of theoretical contributions that have influenced the discourse around strategic adaptation including contingency perspectives, strategic fit reasoning, decision structure, information processing, corporate entrepreneurship, and strategy process. The related concepts of strategic renewal, dynamic managerial capabilities, dynamic capabilities, and strategic response capabilities are discussed and contextualized against strategic responsiveness. The insights derived from this article are used to outline the contours of a dynamic process of strategic adaptation...

  11. An improved coarse-grained parallel algorithm for computational acceleration of ordinary Kriging interpolation

    Science.gov (United States)

    Hu, Hongda; Shu, Hong

    2015-05-01

    Heavy computation limits the use of Kriging interpolation methods in many real-time applications, especially with ever-increasing problem sizes. Many researchers have realized that parallel processing techniques are critical to fully exploit computational resources and feasibly solve computation-intensive problems like Kriging. Much research has addressed the parallelization of the traditional approach to Kriging, but this computation-intensive procedure may not be suitable for high-resolution interpolation of spatial data. On the basis of a more effective serial approach, we propose an improved coarse-grained parallel algorithm to accelerate ordinary Kriging interpolation. In particular, the interpolation task of each unobserved point is treated as a basic parallel unit. To reduce time complexity and memory consumption, the large right-hand-side matrix in the Kriging linear system is transformed and fixed at only two columns, and is therefore no longer directly dependent on the number of unobserved points. The MPI (Message Passing Interface) model is employed to implement our parallel programs in a homogeneous distributed memory system. Experimentally, the improved parallel algorithm performs better than the traditional one in spatial interpolation of annual average precipitation in Victoria, Australia. For example, when the number of processors is 24, the improved algorithm keeps the speed-up at 20.8, while the speed-up of the traditional algorithm only reaches 9.3. Likewise, the weak scaling efficiency of the improved algorithm is nearly 90%, while that of the traditional algorithm drops to almost 40% with 16 processors. Experimental results also demonstrate that the performance of the improved algorithm is enhanced by increasing the problem size.
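    The property that makes coarse-grained parallelization natural here is that each unobserved point is an independent work unit: the Kriging matrix is factored once and shared, and only the right-hand side changes per point. The sketch below illustrates that structure for ordinary Kriging (it does not reproduce the paper's two-column right-hand-side transformation; the linear variogram is only an example):

```python
import numpy as np

def ordinary_kriging(obs_xy, obs_val, query_xy, variogram):
    """Ordinary Kriging where each query point is an independent work
    unit: the factored system is shared, only the RHS varies, so each
    loop iteration could be dispatched to a separate MPI rank."""
    n = len(obs_xy)
    d = np.linalg.norm(obs_xy[:, None, :] - obs_xy[None, :, :], axis=2)
    K = np.ones((n + 1, n + 1))          # bordered system enforcing the
    K[:n, :n] = variogram(d)             # unbiasedness constraint sum(w) = 1
    K[n, n] = 0.0
    K_inv = np.linalg.inv(K)             # factor once, reuse for every point
    out = np.empty(len(query_xy))
    for i, q in enumerate(query_xy):     # each iteration = one parallel task
        rhs = np.ones(n + 1)
        rhs[:n] = variogram(np.linalg.norm(obs_xy - q, axis=1))
        w = K_inv @ rhs
        out[i] = w[:n] @ obs_val
    return out
```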

  12. Strategic Implications for E-Business Organizations in the Ubiquitous Computing Economy

    Institute of Scientific and Technical Information of China (English)

    YUM Jihwan; KIM Hyoungdo

    2004-01-01

    The ubiquitous economy brings both pros and cons for organizations. The third space that has emerged from the development of ubiquitous computing generates a new concept of community, one tightly coupled with people, products, and systems. Organizational strategies need to be reshaped for the changing environment of the third space and its community. Organizational structure also needs to change toward a community-serving organization. A community-serving concept built on standardized technology will be essential. Among the key technologies, RFID services will play a key role in identification and in acknowledging the services required. As the need to sense the environment increases, technologies such as the ubiquitous sensor network (USN) will be critically needed.

  13. Efficient acceleration of mutual information computation for nonrigid registration using CUDA.

    Science.gov (United States)

    Ikeda, Kei; Ino, Fumihiko; Hagihara, Kenichi

    2014-05-01

    In this paper, we propose an efficient acceleration method for the nonrigid registration of multimodal images that uses a graphics processing unit. The key contribution of our method is efficient utilization of on-chip memory for both normalized mutual information (NMI) computation and hierarchical B-spline deformation, which compose a well-known registration algorithm. We implement this registration algorithm as a compute unified device architecture program with an efficient parallel scheme and several optimization techniques such as hierarchical data organization, data reuse, and multiresolution representation. We experimentally evaluate our method with four clinical datasets consisting of up to 512 × 512 × 296 voxels. We find that exploitation of on-chip memory achieves a 12-fold increase in speed over an off-chip memory version and, therefore, it increases the efficiency of parallel execution from 4% to 46%. We also find that our method running on a GeForce GTX 580 card is approximately 14 times faster than a fully optimized CPU-based implementation running on four cores. Some multimodal registration results are also provided to understand the limitation of our method. We believe that our highly efficient method, which completes an alignment task within a few tens of seconds, will be useful to realize rapid nonrigid registration.
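    The similarity measure at the core of this registration algorithm can be stated compactly. Below is a generic NMI computation from a joint intensity histogram (a textbook formulation, not the paper's CUDA kernel); the GPU version described in the abstract accelerates precisely this histogram accumulation by keeping it in on-chip memory:

```python
import numpy as np

def normalized_mutual_information(a, b, bins=32):
    """NMI = (H(A) + H(B)) / H(A, B) from a joint intensity histogram,
    the similarity measure maximized during multimodal registration."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()              # joint probability
    px = pxy.sum(axis=1)                 # marginals
    py = pxy.sum(axis=0)

    def entropy(p):
        p = p[p > 0]                     # 0 * log 0 := 0
        return -np.sum(p * np.log(p))

    return (entropy(px) + entropy(py)) / entropy(pxy)
```

    Identical images give NMI = 2 and statistically independent images give NMI = 1, which is why the optimizer drives the deformation toward higher NMI.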

  14. Computations of longitudinal electron dynamics in the recirculating cw RF accelerator-recuperator for the high average power FEL

    Science.gov (United States)

    Sokolov, A. S.; Vinokurov, N. A.

    1994-03-01

    The use of optimal longitudinal phase-energy motion conditions for bunched electrons in a recirculating RF accelerator makes it possible to increase the final electron peak current and, correspondingly, the FEL gain. The computer code RECFEL, developed for simulations of the longitudinal compression of electron bunches with high average current, which substantially load the cw RF cavities of the recirculator-recuperator, is briefly described and illustrated with some computational results.

  15. Computation of Material Demand in the Risk Assessment and Mitigation Framework for Strategic Materials (RAMF-SM) Process

    Science.gov (United States)

    2015-08-01

    Schwartz, Eleanor L.; Thomason, James S. (Project Leader). Institute for Defense Analyses, 4850 Mark Center Drive, Alexandria, VA.

  16. Accelerating groundwater flow simulation in MODFLOW using JASMIN-based parallel computing.

    Science.gov (United States)

    Cheng, Tangpei; Mo, Zeyao; Shao, Jingli

    2014-01-01

    To accelerate the groundwater flow simulation process, this paper reports our work on developing an efficient parallel simulator by rebuilding the well-known software MODFLOW on JASMIN (J Adaptive Structured Meshes applications Infrastructure). The rebuilding is achieved by designing patch-based data structures and parallel algorithms, as well as adding slight modifications to the computational flow and subroutines of MODFLOW. Both the memory requirements and computing efforts are distributed among all processors; to reduce communication cost, data transfers are batched and conveniently handled by adding ghost nodes to each patch. To further improve performance, constant-head/inactive cells are tagged and neglected during the linear solving process, and an efficient load balancing strategy is presented. The accuracy and efficiency are demonstrated through modeling three scenarios: the first application is a field flow problem located at Yanming Lake in China, used to help design a reasonable quantity of groundwater exploitation. Desirable numerical accuracy and significant performance enhancement are obtained. Typically, the tagged program with the load balancing strategy running on 40 cores is six times faster than the fastest MICCG-based MODFLOW program. The second test simulates flow in a highly heterogeneous aquifer; the AMG-based JASMIN program running on 40 cores is nine times faster than the GMG-based MODFLOW program. The third test is a simplified transient flow problem with on the order of tens of millions of cells, used to examine scalability. Compared to 32 cores, parallel efficiencies of 77% and 68% are obtained on 512 and 1024 cores, respectively, which indicates impressive scalability.

  17. Users' guide for the Accelerated Leach Test Computer Program

    Energy Technology Data Exchange (ETDEWEB)

    Fuhrmann, M.; Heiser, J.H.; Pietrzak, R.; Franz, Eena-Mai; Colombo, P.

    1990-11-01

    This report is a step-by-step guide to the Accelerated Leach Test (ALT) Computer Program, developed to accompany a new leach test for solidified waste forms. The program is designed to be used as a tool for performing the calculations necessary to analyze leach test data, as a modeling program to determine whether diffusion is the operating leaching mechanism (and, if not, to indicate other possible mechanisms), and as a means to make extrapolations using the diffusion models. The ALT program contains four mathematical models that can be used to represent the data. The leaching mechanisms described by these models are: (1) diffusion through a semi-infinite medium (for low fractional releases), (2) diffusion through a finite cylinder (for high fractional releases), (3) diffusion plus partitioning of the source term, and (4) solubility-limited leaching. Results are presented as a graph containing the experimental data and the best-fit model curve. Results can also be output as LOTUS 1-2-3 files. 2 refs.
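    Model (1) above, diffusion from a semi-infinite medium, has a closed form that is linear in the square root of time, which makes fitting the effective diffusion coefficient straightforward. The sketch below is a hypothetical illustration of that model and fit, not the ALT program's actual code; the values of De and S/V are invented:

```python
import numpy as np

def semi_infinite_cfl(t, De, S_over_V):
    """Model (1): cumulative fraction leached for diffusion from a
    semi-infinite medium, CFL = 2 * (S/V) * sqrt(De * t / pi)."""
    return 2.0 * S_over_V * np.sqrt(De * t / np.pi)

def fit_diffusion_coefficient(t, cfl, S_over_V):
    """Least-squares estimate of De: the model is linear in sqrt(t),
    with slope = 2 * (S/V) * sqrt(De / pi)."""
    slope = np.sum(cfl * np.sqrt(t)) / np.sum(t)
    return np.pi * (slope / (2.0 * S_over_V)) ** 2

# Synthetic leach data with a known effective diffusion coefficient
t = np.array([1.0, 2.0, 5.0, 7.0, 14.0, 28.0])   # sampling times (days)
De_true = 1e-9                                    # illustrative De
S_over_V = 1.2                                    # surface-to-volume ratio (1/cm)
cfl = semi_infinite_cfl(t, De_true, S_over_V)
De_fit = fit_diffusion_coefficient(t, cfl, S_over_V)
```

    If the fitted curve deviates systematically from the data, the other three models (finite cylinder, partitioning, solubility limit) are candidates, which mirrors how the report describes using the program to diagnose the leaching mechanism.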

  18. Computer simulations for a deceleration and radio frequency quadrupole instrument for accelerator ion beams

    Energy Technology Data Exchange (ETDEWEB)

    Eliades, J.A., E-mail: j.eliades@alum.utoronto.ca; Kim, J.K.; Song, J.H.; Yu, B.Y.

    2015-10-15

    Radio-frequency quadrupole (RFQ) technology incorporated into the low energy ion beam line of an accelerator system can greatly broaden the range of applications and facilitate unique experimental capabilities. However, negative ion beams with tens of keV kinetic energy, large emittances and large energy spreads must first be decelerated to <100 eV for ion–gas interactions, placing special demands on the deceleration optics and RFQ design. A system with large analyte transmission in the presence of gas has so far proven challenging. Presented are computer simulations using SIMION 8.1 of an ion deceleration and RFQ ion guide instrument design. The code included user-defined gas pressure gradients and threshold energies for ion–gas collisional losses. Results suggest a 3 mm diameter, 35 keV {sup 36}Cl{sup −} ion beam with an 8 eV full-width-half-maximum Gaussian energy spread and 35 mrad angular divergence can be efficiently decelerated and then cooled in He gas, with a maximum pressure of 7 mTorr, to 2 eV within 450 mm in the RFQs. Vacuum transmission was 100%. Ion energy distributions at initial RFQ capture are shown to be much larger than the average value expected from the deceleration potential, and this appears to be a general result arising from kinetic energy gain in the RFQ field. In these simulations, a potential for deceleration to 25 eV resulted in a 30 eV average energy distribution, with a small fraction of ions >70 eV.

  19. Thinking Strategically.

    Science.gov (United States)

    Jeffress, Conway

    2000-01-01

    Asserts that community college leaders must think strategically and understand the difference between what is important and immediate, and what is strategic and essential to the long-term survival of a college. States that thinking strategically aligns decision-making and actions with the core purpose of the college; produces core competencies in…

  20. FDTD Acceleration for Cylindrical Resonator Design Based on the Hybrid of Single and Double Precision Floating-Point Computation

    Directory of Open Access Journals (Sweden)

    Hasitha Muthumala Waidyasooriya

    2014-01-01

    Acceleration of FDTD (finite-difference time-domain) is very important for fields such as computational electromagnetic simulation. We consider an FDTD simulation model of cylindrical resonator design that requires double-precision floating-point and cannot be done using single precision. Conventional FDTD acceleration methods share a common problem of memory-bandwidth limitation due to the large amount of parallel data access. To overcome this problem, we propose a hybrid single/double-precision floating-point computation method that reduces the data-transfer amount. We analyze the characteristics of the FDTD simulation to find out when single precision can be used instead of double precision. According to the experimental results, we achieved a speed-up of over 15 times compared to a single-core CPU implementation and over 1.52 times compared to the conventional GPU-based implementation.
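    The general hybrid idea can be illustrated on a toy 1D FDTD loop: store the field arrays in single precision (halving memory traffic, the bottleneck named above) while carrying out the update arithmetic in double precision before rounding back. This is only an illustration of the single/double split, not the paper's scheme; grid size, Courant number and source are invented:

```python
import numpy as np

def fdtd_1d_hybrid(steps=200, n=200):
    """1D FDTD sketch: fields stored in float32 to cut memory traffic,
    update arithmetic performed in float64, results rounded back."""
    Ez = np.zeros(n, dtype=np.float32)
    Hy = np.zeros(n - 1, dtype=np.float32)
    c = np.float64(0.5)                  # Courant number, kept in double
    for t in range(steps):
        # promote to float64 for the update, demote for storage
        Hy = (Hy.astype(np.float64) + c * np.diff(Ez.astype(np.float64))).astype(np.float32)
        Ez[1:-1] = (Ez[1:-1].astype(np.float64) + c * np.diff(Hy.astype(np.float64))).astype(np.float32)
        Ez[n // 2] += np.float32(np.exp(-((t - 30) / 10.0) ** 2))  # soft Gaussian source
    return Ez

field = fdtd_1d_hybrid()
```

    The analysis step the abstract describes amounts to deciding, per field array and per region, where this demotion to single precision leaves the resonator solution unchanged.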

  1. Strategizing Communication

    DEFF Research Database (Denmark)

    Gulbrandsen, Ib Tunby; Just, Sine Nørholm

    Strategizing Communication offers a unique perspective on the theory and practice of strategic communication. Written for students and practitioners interested in learning about and acquiring tools for dealing with the technological, environmental and managerial challenges which organizations face when communicating in today’s mediascape, this book presents an array of theories, concepts and models through which we can understand and practice communication strategically. The core of the argument is in the title: strategizing – meaning the act of making something strategic. This entails looking beyond, but not past, instrumental, rational plans in order to become better able to understand and manage the concrete, incremental practices and contexts in which communication becomes strategic. Thus, we argue that although strategic communicators do (and should) make plans, a plan in itself does...

  2. Strategic Entrepreneurship

    DEFF Research Database (Denmark)

    Klein, Peter G.; Barney, Jay B.; Foss, Nicolai Juul

    Strategic entrepreneurship is a newly recognized field that draws, not surprisingly, from the fields of strategic management and entrepreneurship. The field emerged officially with the 2001 special issue of the Strategic Management Journal on “strategic entrepreneurship”; the first dedicated periodical, the Strategic Entrepreneurship Journal, appeared in 2007. Strategic entrepreneurship is built around two core ideas. (1) Strategy formulation and execution involves attributes that are fundamentally entrepreneurial, such as alertness, creativity, and judgment, and entrepreneurs try to create and capture value through resource acquisition and competitive positioning. (2) Opportunity-seeking and advantage-seeking—the former the central subject of the entrepreneurship field, the latter the central subject of the strategic management field—are processes that should be considered jointly. This entry...

  3. Performance analysis and acceleration of cross-correlation computation using FPGA implementation for digital signal processing

    Science.gov (United States)

    Selma, R.

    2016-09-01

    The paper compares the cross-correlation computation speed of the most commonly used computation platforms (CPU, GPU) with an FPGA-based design. It also describes the structure of the cross-correlation unit implemented for testing purposes. A speed-up was achieved using the FPGA-based design, varying between 16 and 5400 times compared to CPU computations and between 3 and 175 times compared to GPU computations.
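    For reference, the operation being accelerated is the direct cross-correlation over all lags, r[k] = sum over n of x[n] * y[n+k]. The reference sketch below defines that computation; the mapping of one lag per FPGA processing element is a plausible layout, not a description of the paper's unit:

```python
import numpy as np

def xcorr(x, y):
    """Direct cross-correlation r[k] = sum_n x[n] * y[n+k] for all lags.
    Each lag's multiply-accumulate chain is independent, so an FPGA
    pipeline can evaluate many lags concurrently."""
    n = len(x)
    out = []
    for k in range(-(n - 1), n):
        if k >= 0:
            out.append(float(np.dot(x[:n - k], y[k:])))
        else:
            out.append(float(np.dot(x[-k:], y[:n + k])))
    return np.array(out)
```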

  4. Observed differences in upper extremity forces, muscle efforts, postures, velocities and accelerations across computer activities in a field study of office workers

    NARCIS (Netherlands)

    Garza, J.L.B.; Eijckelhof, B.H.W.; Johnson, P.W.; Raina, S.M.; Rynell, P.W.; Huysmans, M.A.; Dieën, J.H. van; Beek, A.J. van der; Blatter, B.M.; Dennerlein, J.T.

    2012-01-01

    This study, a part of the PRedicting Occupational biomechanics in OFfice workers (PROOF) study, investigated whether there are differences in field-measured forces, muscle efforts, postures, velocities and accelerations across computer activities. These parameters were measured continuously for 120

  5. Graphics Processing Unit-Accelerated Code for Computing Second-Order Wiener Kernels and Spike-Triggered Covariance

    Science.gov (United States)

    Mano, Omer

    2017-01-01

    Sensory neuroscience seeks to understand and predict how sensory neurons respond to stimuli. Nonlinear components of neural responses are frequently characterized by the second-order Wiener kernel and the closely-related spike-triggered covariance (STC). Recent advances in data acquisition have made it increasingly common and computationally intensive to compute second-order Wiener kernels/STC matrices. In order to speed up this sort of analysis, we developed a graphics processing unit (GPU)-accelerated module that computes the second-order Wiener kernel of a system’s response to a stimulus. The generated kernel can be easily transformed for use in standard STC analyses. Our code speeds up such analyses by factors of over 100 relative to current methods that utilize central processing units (CPUs). It works on any modern GPU and may be integrated into many data analysis workflows. This module accelerates data analysis so that more time can be spent exploring parameter space and interpreting data. PMID:28068420
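    The computation being accelerated has a short reference form: the covariance of the stimulus windows preceding spikes, minus the covariance of all windows (the prior). The sketch below is a generic CPU formulation for orientation, not the authors' GPU module; the alignment convention (a window of samples strictly before each spike) is an assumption:

```python
import numpy as np

def spike_triggered_covariance(stimulus, spikes, window):
    """STC: spike-weighted covariance of the stimulus histories minus
    the covariance of all histories. The dense accumulations here are
    what a GPU kernel parallelizes."""
    # stack every length-`window` stimulus history
    segs = np.array([stimulus[t - window:t] for t in range(window, len(stimulus))])
    counts = spikes[window:]                     # spike count aligned to each history
    sta = (counts @ segs) / counts.sum()         # spike-triggered average
    centered = segs - sta
    stc = (centered.T * counts) @ centered / counts.sum()
    prior = np.cov(segs.T, bias=True)
    return stc - prior
```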

  6. Strategic Supply

    Science.gov (United States)

    2003-01-01


  7. Accelerating statistical image reconstruction algorithms for fan-beam x-ray CT using cloud computing

    Science.gov (United States)

    Srivastava, Somesh; Rao, A. Ravishankar; Sheinin, Vadim

    2011-03-01

    Statistical image reconstruction algorithms potentially offer many advantages for x-ray computed tomography (CT), e.g. lower radiation dose. But their adoption in practical CT scanners requires extra computational power, which is traditionally provided by incorporating additional computing hardware (e.g. CPU clusters, GPUs, FPGAs, etc.) into a scanner. An alternative solution is to access the required computational power over the internet from a cloud computing service, which is orders of magnitude more cost-effective. This is because users only pay a small pay-as-you-go fee for the computation resources used (i.e. CPU time, storage, etc.), and completely avoid purchase, maintenance and upgrade costs. In this paper, we investigate the benefits and shortcomings of using cloud computing for statistical image reconstruction. We parallelized the most time-consuming parts of our application, the forward and back projectors, using MapReduce, the standard parallelization library on clouds. From preliminary investigations, we found that a large speedup is possible at a very low cost. But communication overheads inside MapReduce can limit the maximum speedup, and a better MapReduce implementation might become necessary in the future. All the experiments for this paper, including development and testing, were completed on the Amazon Elastic Compute Cloud (EC2) for less than $20.
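    The MapReduce decomposition of the projectors can be sketched in miniature: mappers back-project disjoint subsets of projection angles into partial images, and the reducer sums the partials. The toy code below (trivial all-ones sinogram, nearest-neighbour smearing) only illustrates the map/reduce split, not the authors' projectors or their cloud deployment:

```python
from functools import reduce
import numpy as np

def map_backproject(chunk):
    """Mapper: back-project one chunk of projection angles into a
    partial image (a stand-in for a real projector kernel)."""
    angles, sino, size = chunk
    img = np.zeros((size, size))
    ys, xs = np.mgrid[:size, :size] - size / 2.0
    for theta, row in zip(angles, sino):
        # nearest-neighbour smearing of each projection row along its angle
        t = (xs * np.cos(theta) + ys * np.sin(theta)) + size / 2.0
        idx = np.clip(t.astype(int), 0, size - 1)
        img += row[idx]
    return img

def reduce_sum(a, b):
    """Reducer: partial images combine by summation."""
    return a + b

# Split the angle set across "workers" and combine the partial results
size = 16
angles = np.linspace(0.0, np.pi, 8, endpoint=False)
sino = np.ones((8, size))                   # trivial sinogram for illustration
chunks = [(angles[i::2], sino[i::2], size) for i in range(2)]
image = reduce(reduce_sum, map(map_backproject, chunks))
```

    Because back-projection is additive over angles, the reducer is a plain sum; the communication cost the abstract warns about comes from shipping these partial images between map and reduce stages.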

  8. The Role of Implicit Motives in Strategic Decision-Making: Computational Models of Motivated Learning and the Evolution of Motivated Agents

    Directory of Open Access Journals (Sweden)

    Kathryn Merrick

    2015-11-01

    Individual behavioral differences in humans have been linked to measurable differences in their mental activities, including differences in their implicit motives. In humans, individual differences in the strength of motives such as power, achievement and affiliation have been shown to have a significant impact on behavior in social dilemma games and during other kinds of strategic interactions. This paper presents agent-based computational models of power-, achievement- and affiliation-motivated individuals engaged in game-play. The first model captures learning by motivated agents during strategic interactions. The second model captures the evolution of a society of motivated agents. It is demonstrated that misperception, when it is a result of motivation, causes agents with different motives to play a given game differently. When motivated agents who misperceive a game are present in a population, a higher explicit payoff can result for the population as a whole. The implications of these results are discussed, both for modeling human behavior and for designing artificial agents with certain salient behavioral characteristics.

  9. Study of irradiation induced restructuring of high burnup fuel - Use of computer and accelerator for fuel science and engineering -

    Energy Technology Data Exchange (ETDEWEB)

    Sataka, M.; Ishikawa, N.; Chimn, Y.; Nakamura, J.; Amaya, M. [Japan Atomic Energy Agency, Naka Gun (Japan); Iwasawa, M.; Ohnuma, T.; Sonoda, T. [Central Research Institute of Electric Power Industry, Tokyo (Japan); Kinoshita, M.; Geng, H. Y.; Chen, Y.; Kaneta, Y. [The Univ. of Tokyo, Tokyo (Japan); Yasunaga, K.; Matsumura, S.; Yasuda, K. [Kyushu Univ., Motooka (Japan); Iwase [Osaka Prefecture Univ., Osaka (Japan); Ichinomiya, T.; Nishiuran, Y. [Hokkaido Univ., Kitaku (Japan); Matzke, HJ. [Academy of Ceramics, Karlsruhe (Germany)

    2008-10-15

    In order to develop advanced fuel for future LWR reactors, trials were made to simulate the high burnup restructuring of ceramic fuel using accelerator irradiation outside the reactor together with computer simulation. The target is to reproduce the principal complex process as a whole. The reproduction of grain subdivision (sub-grain formation) was achieved in experiments with sequential combined irradiation. It occurred through recovery of the accumulated dislocations, forming cells and sub-boundaries at grain boundaries and pore surfaces. Details of the grain subdivision mechanism are now accessible outside the reactor. Extensive computational science studies, using first-principles and molecular dynamics methods, revealed the behavior of fission gas atoms and interstitial oxygen, supporting the study of high burnup restructuring.

  10. Computation of thermal properties via 3D homogenization of multiphase materials using FFT-based accelerated scheme

    CERN Document Server

    Lemaitre, Sophie; Choi, Daniel; Karamian, Philippe

    2015-01-01

    In this paper we study the effective thermal behaviour of a 3D multiphase composite material consisting of three isotropic phases: the matrix, the inclusions and the coating medium. For this purpose we use an accelerated FFT-based scheme, initially proposed in Eyre and Milton (1999), to evaluate the thermal conductivity tensor. The matrix and spherical-inclusion media are polymers with similar properties, whereas the coating medium is metallic and hence a better conductor. Thus, the contrast between the coating and the other media is very large. For our study, we use RVEs (representative volume elements) generated by the RSA (random sequential adsorption) method developed in our previous works; we then compute effective thermal properties using an FFT-based homogenization technique validated by comparison with the direct finite element method. We study the thermal behaviour of the 3D multiphase composite material and show which features should be taken into account to make the computational approach efficient.

  11. [Series: Medical Applications of the PHITS Code (2): Acceleration by Parallel Computing].

    Science.gov (United States)

    Furuta, Takuya; Sato, Tatsuhiko

    2015-01-01

    Time-consuming Monte Carlo dose calculations have become feasible owing to the development of computer technology, in particular the emergence of multi-core high-performance computers. Parallel computing is therefore a key to achieving good software performance. The Monte Carlo simulation code PHITS contains two parallel computing functions: distributed-memory parallelization using the message passing interface (MPI) protocol and shared-memory parallelization using open multi-processing (OpenMP) directives. Users can choose between the two functions according to their needs. This paper explains the two functions, with their advantages and disadvantages. Some test applications are also provided to show their performance on a typical multi-core high-performance workstation.

  12. Accelerating Dust Storm Simulation by Balancing Task Allocation in Parallel Computing Environment

    Science.gov (United States)

    Gui, Z.; Yang, C.; XIA, J.; Huang, Q.; YU, M.

    2013-12-01

    Dust storms have serious negative impacts on the environment, human health, and assets. Continuing global climate change has increased the frequency and intensity of dust storms in the past decades. To better understand and predict the distribution, intensity and structure of dust storms, a series of dust storm models have been developed, such as the Dust Regional Atmospheric Model (DREAM), the NMM meteorological module (NMM-dust) and the Chinese Unified Atmospheric Chemistry Environment for Dust (CUACE/Dust). The development and application of these models have contributed significantly to both scientific research and our daily life. However, dust storm simulation is a data- and computing-intensive process. Normally, a simulation of a single dust storm event may take several hours or days to run, which seriously impacts the timeliness of prediction and potential applications. To speed up the process, high performance computing is widely adopted. By partitioning a large study area into small subdomains according to their geographic location and executing them on different computing nodes in a parallel fashion, the computing performance can be significantly improved. Since spatiotemporal correlations exist in the geophysical process of dust storm simulation, each subdomain allocated to a node needs to communicate with geographically adjacent subdomains to exchange data. Inappropriate allocations may introduce imbalanced task loads and unnecessary communications among computing nodes. Therefore, the task allocation method is the key factor that may impact the feasibility of parallelization. The allocation algorithm needs to carefully leverage the computing cost and communication cost of each computing node to minimize the total execution time and reduce the overall communication cost for the entire system. This presentation introduces two algorithms for such allocation and compares them with an evenly distributed allocation method.
Specifically, 1) In order to get optimized solutions, a

  13. Intro - High Performance Computing for 2015 HPC Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Klitsner, Tom [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    The recent Executive Order creating the National Strategic Computing Initiative (NSCI) recognizes the value of high performance computing for economic competitiveness and scientific discovery and commits to accelerating the delivery of exascale computing. The HPC programs at Sandia (the NNSA ASC program and Sandia's Institutional HPC Program) are focused on ensuring that Sandia has the resources necessary to deliver computation in the national interest.

  14. Strategic Forecasting

    DEFF Research Database (Denmark)

    Duus, Henrik Johannsen

    2016-01-01

    Purpose: The purpose of this article is to present an overview of the area of strategic forecasting and its research directions and to put forward some ideas for improving management decisions. Design/methodology/approach: This article is conceptual but also informed by the author’s long contact and collaboration with various business firms. It starts by presenting an overview of the area and argues that the area is as much a way of thinking as a toolbox of theories and methodologies. It then spells out a number of research directions and ideas for management. Findings: Strategic forecasting is seen as a rebirth of long range planning, albeit with new methods and theories. Firms should make the building of strategic forecasting capability a priority. Research limitations/implications: The article subdivides strategic forecasting into three research avenues and suggests avenues for further research efforts...... a somewhat neglected area. It also suggests several new practical ideas that may improve management decisions.

  15. Acceleration of FEM-based transfer matrix computation for forward and inverse problems of electrocardiography.

    Science.gov (United States)

    Farina, Dmytro; Jiang, Y; Dössel, O

    2009-12-01

    The distributions of transmembrane voltage (TMV) within the cardiac tissue are linearly related to the patient's body surface potential maps (BSPMs) at every time instant. The matrix describing the relation between the respective distributions is referred to as the transfer matrix. This matrix can be employed to carry out forward calculations, finding the BSPM for any given distribution of TMV inside the heart. Its inverse can be used to reconstruct cardiac activity non-invasively, which can be an important diagnostic tool in clinical practice. The computation of this matrix using the finite element method can be quite time-consuming. In this work, a method is proposed that speeds up this process by computing an approximate transfer matrix instead of the precise one. The method is tested on three realistic anatomical models of real patients. It is shown that the computation time can be reduced by 50% without loss of accuracy.
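    The linear relation described above can be sketched in miniature. The 2x2 matrix below is purely hypothetical (a real FEM-derived transfer matrix couples thousands of heart and torso nodes), but it shows both the forward multiplication and the inverse reconstruction:

```python
# Toy sketch of the forward/inverse transfer-matrix relation
# (hypothetical 2x2 numbers; a real FEM-derived matrix is much larger).

def mat_vec(A, x):
    """Forward problem: body-surface potentials BSPM = A @ TMV."""
    return [sum(aij * xj for aij, xj in zip(row, x)) for row in A]

def inverse_2x2(A):
    """Analytic inverse of a 2x2 transfer matrix (assumes det(A) != 0)."""
    (a, b), (c, d) = A
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2.0, 1.0], [1.0, 3.0]]             # hypothetical transfer matrix
tmv = [1.0, -2.0]                        # transmembrane voltages in the heart
bspm = mat_vec(A, tmv)                   # forward calculation
tmv_rec = mat_vec(inverse_2x2(A), bspm)  # inverse reconstruction
```

In practice the inverse problem is ill-posed and is solved with regularization rather than a plain matrix inverse; the sketch only illustrates the linear relation itself.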

  16. Accelerating selected columns of the density matrix computations via approximate column selection

    CERN Document Server

    Damle, Anil; Ying, Lexing

    2016-01-01

    Localized representation of the Kohn-Sham subspace plays an important role in quantum chemistry and materials science. The recently developed selected columns of the density matrix (SCDM) method [J. Chem. Theory Comput. 11, 1463, 2015] is a simple and robust procedure for finding a localized representation of a set of Kohn-Sham orbitals from an insulating system. The SCDM method allows the direct construction of a well conditioned (or even orthonormal) and localized basis for the Kohn-Sham subspace. The SCDM procedure avoids the use of an optimization procedure and does not depend on any adjustable parameters. The most computationally expensive step of the SCDM method is a column pivoted QR factorization that identifies the important columns for constructing the localized basis set. In this paper, we develop a two stage approximate column selection strategy to find the important columns at much lower computational cost. We demonstrate the effectiveness of this process using a dissociation process of a BH$_{3}...
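    The column-selection step described above can be illustrated with a pure-Python greedy pivoting sketch (a stand-in for the column-pivoted QR; the matrix and the function name are illustrative, not taken from the SCDM code):

```python
def select_columns(A, k):
    """Greedy column-pivoted selection: repeatedly pick the column with the
    largest residual norm, then orthogonalize the remaining columns against
    it (the pivoting rule of a column-pivoted QR factorization)."""
    m, n = len(A), len(A[0])
    # store columns as lists for easy in-place orthogonalization
    cols = [[A[i][j] for i in range(m)] for j in range(n)]
    chosen = []
    for _ in range(k):
        norms = [sum(v * v for v in c) for c in cols]
        p = max((j for j in range(n) if j not in chosen), key=lambda j: norms[j])
        chosen.append(p)
        nq = norms[p] ** 0.5
        if nq == 0:
            break  # remaining columns are numerically dependent
        q = [v / nq for v in cols[p]]
        for j in range(n):
            if j in chosen:
                continue
            proj = sum(qi * cj for qi, cj in zip(q, cols[j]))
            cols[j] = [cj - proj * qi for cj, qi in zip(cols[j], q)]
    return chosen

# columns 0 and 1 are parallel; a good selection keeps one of them plus column 2
A = [[1.0, 2.0, 0.0],
     [0.0, 0.0, 1.0],
     [0.0, 0.0, 0.0]]
picked = select_columns(A, 2)
```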

  17. Accelerating the discovery of space-time patterns of infectious diseases using parallel computing.

    Science.gov (United States)

    Hohl, Alexander; Delmelle, Eric; Tang, Wenwu; Casas, Irene

    2016-11-01

    Infectious diseases have complex transmission cycles, and effective public health responses require the ability to monitor outbreaks in a timely manner. Space-time statistics facilitate the discovery of disease dynamics including rate of spread and seasonal cyclic patterns, but are computationally demanding, especially for datasets of increasing size, diversity and availability. High-performance computing reduces the effort required to identify these patterns, however heterogeneity in the data must be accounted for. We develop an adaptive space-time domain decomposition approach for parallel computation of the space-time kernel density. We apply our methodology to individual reported dengue cases from 2010 to 2011 in the city of Cali, Colombia. The parallel implementation reaches significant speedup compared to sequential counterparts. Density values are visualized in an interactive 3D environment, which facilitates the identification and communication of uneven space-time distribution of disease events. Our framework has the potential to enhance the timely monitoring of infectious diseases.
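    Domain decomposition over such heterogeneous point data can be sketched with a simple greedy 1D partitioner that balances per-cell event counts (illustrative only; the paper's adaptive space-time decomposition is more elaborate):

```python
def partition_1d(counts, n_workers):
    """Split a 1D sequence of per-cell event counts into contiguous chunks
    with roughly equal total load (greedy threshold heuristic)."""
    total = sum(counts)
    target = total / n_workers
    chunks, current, load = [], [], 0
    for i, c in enumerate(counts):
        current.append(i)
        load += c
        remaining_workers = n_workers - len(chunks) - 1
        remaining_cells = len(counts) - i - 1
        # close the chunk once the target load is met, leaving at least
        # one cell for every remaining worker
        if load >= target and remaining_workers > 0 and remaining_cells >= remaining_workers:
            chunks.append(current)
            current, load = [], 0
    chunks.append(current)
    return chunks

# 9 spatial cells with uneven case counts, split across 3 workers
counts = [5, 1, 1, 5, 1, 1, 5, 1, 1]
chunks = partition_1d(counts, 3)
```

Here every worker receives a contiguous span with a load of 7, whereas an even split by cell count alone would leave the loads unbalanced.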

  18. PIC codes for plasma accelerators on emerging computer architectures (GPUS, Multicore/Manycore CPUS)

    Science.gov (United States)

    Vincenti, Henri

    2016-03-01

    The advent of exascale computers will enable 3D simulations of new laser-plasma interaction regimes that were previously out of reach of current Petascale computers. However, the paradigm used to write current PIC codes will have to change in order to fully exploit the potential of these new computing architectures. Indeed, achieving exascale computing facilities in the next decade will be a great challenge in terms of energy consumption and will imply hardware developments that directly impact the way we implement PIC codes. As data movement (from die to network) is by far the most energy-consuming part of an algorithm, future computers will tend to increase memory locality at the hardware level and reduce the energy consumed by data movement by using more and more cores on each compute node ("fat nodes") with a reduced clock speed to allow for efficient cooling. To compensate for the frequency decrease, CPU vendors are making use of long SIMD instruction registers that can process multiple data with one arithmetic operator in one clock cycle; SIMD register length is expected to double every four years. GPUs also have a reduced clock speed per core and can process Multiple Instructions on Multiple Data (MIMD). At the software level, Particle-In-Cell (PIC) codes will thus have to achieve both good memory locality and vectorization (for Multicore/Manycore CPUs) to fully take advantage of these upcoming architectures. In this talk, we present the portable solutions we implemented in our high performance skeleton PIC code PICSAR to achieve good memory locality and cache reuse as well as good vectorization on SIMD architectures. We also present the portable solutions used to parallelize the pseudo-spectral quasi-cylindrical code FBPIC on GPUs using the Numba Python compiler.

  20. Strategic Responsiveness

    DEFF Research Database (Denmark)

    Pedersen, Carsten; Juul Andersen, Torben

    The analysis of major resource-committing decisions is a central focus in the strategy field, but despite decades of rich conceptual and empirical research we still seem distant from a level of understanding that can guide corporate practices under dynamic and unpredictable conditions. Strategic decision making is often conceived as ‘standing on the two feet’ of deliberate or intended strategic decisions by top management and emergent strategic decisions pursued by lower-level managers and employees. In this view, the paper proposes that bottom-up initiatives have a hard time surfacing in hierarchical organizations and that lower-level managers and employees, therefore, pursue various strategies to bypass the official strategy processes to act on emerging strategic issues and adapt to changing environmental conditions.

  1. LCODE: a parallel quasistatic code for computationally heavy problems of plasma wakefield acceleration

    CERN Document Server

    Sosedkin, Alexander

    2015-01-01

    LCODE is a freely distributed quasistatic 2D3V code for simulating plasma wakefield acceleration, mainly specialized in resource-efficient studies of long-term propagation of ultrarelativistic particle beams in plasmas. The beam is modeled with fully relativistic macro-particles in a simulation window copropagating at the speed of light; the plasma can be simulated with either a kinetic or a fluid model. Several techniques are used to obtain exceptional numerical stability and precision while maintaining high resource efficiency, enabling LCODE to simulate the evolution of long particle beams over long propagation distances even on a laptop. A recent upgrade enabled LCODE to perform the calculations in parallel: a pipeline of several LCODE processes communicating via MPI (Message-Passing Interface) is capable of executing multiple consecutive time steps of the simulation in a single pass. This approach can speed up the calculations by hundreds of times.

  2. ActiWiz – optimizing your nuclide inventory at proton accelerators with a computer code

    CERN Document Server

    Vincke, Helmut

    2014-01-01

    When operating an accelerator one always faces unwanted, but inevitable beam losses. These result in activation of adjacent material, which in turn has an obvious impact on safety and handling constraints. One of the key parameters responsible for activation is the chemical composition of the material which often can be optimized in that respect. In order to facilitate this task also for non-expert users the ActiWiz software has been developed at CERN. Based on a large amount of generic FLUKA Monte Carlo simulations the software applies a specifically developed risk assessment model to provide support to decision makers especially during the design phase as well as common operational work in the domain of radiation protection.

  3. LCODE: A parallel quasistatic code for computationally heavy problems of plasma wakefield acceleration

    Energy Technology Data Exchange (ETDEWEB)

    Sosedkin, A.P.; Lotov, K.V. [Budker Institute of Nuclear Physics SB RAS, 630090 Novosibirsk (Russian Federation); Novosibirsk State University, 630090 Novosibirsk (Russian Federation)

    2016-09-01

    LCODE is a freely distributed quasistatic 2D3V code for simulating plasma wakefield acceleration, mainly specialized in resource-efficient studies of long-term propagation of ultrarelativistic particle beams in plasmas. The beam is modeled with fully relativistic macro-particles in a simulation window copropagating at the speed of light; the plasma can be simulated with either a kinetic or a fluid model. Several techniques are used to obtain exceptional numerical stability and precision while maintaining high resource efficiency, enabling LCODE to simulate the evolution of long particle beams over long propagation distances even on a laptop. A recent upgrade enabled LCODE to perform the calculations in parallel: a pipeline of several LCODE processes communicating via MPI (Message-Passing Interface) is capable of executing multiple consecutive time steps of the simulation in a single pass. This approach can speed up the calculations by hundreds of times.

  5. Design Tools for Accelerating Development and Usage of Multi-Core Computing Platforms

    Science.gov (United States)

    2014-04-01

    e.g., see [Bhattacharyya 2013]. Through their connections to computation graphs [Karp 1966] and Kahn process networks [Kahn 1974, Lee 1995], …

  6. Computer-aided molecular design of solvents for accelerated reaction kinetics.

    Science.gov (United States)

    Struebing, Heiko; Ganase, Zara; Karamertzanis, Panagiotis G; Siougkrou, Eirini; Haycock, Peter; Piccione, Patrick M; Armstrong, Alan; Galindo, Amparo; Adjiman, Claire S

    2013-11-01

    Solvents can significantly alter the rates and selectivity of liquid-phase organic reactions, often hindering the development of new synthetic routes or, if chosen wisely, facilitating routes by improving rates and selectivities. To address this challenge, a systematic methodology is proposed that quickly identifies improved reaction solvents by combining quantum mechanical computations of the reaction rate constant in a few solvents with a computer-aided molecular design (CAMD) procedure. The approach allows the identification of a high-performance solvent within a very large set of possible molecules. The validity of the CAMD approach is demonstrated through application to a classical nucleophilic substitution reaction used for the study of solvent effects, the Menschutkin reaction. The results were validated successfully by in situ kinetic experiments. A space of 1,341 solvents was explored in silico, requiring quantum-mechanical calculations of the rate constant for only nine of them, and the search uncovered a solvent that increases the rate constant by 40%.

  7. Accelerating inference for diffusions observed with measurement error and large sample sizes using approximate Bayesian computation

    DEFF Research Database (Denmark)

    Picchini, Umberto; Forman, Julie Lyng

    2016-01-01

    In recent years, dynamical modelling has been provided with a range of breakthrough methods to perform exact Bayesian inference. However, it is often computationally unfeasible to apply exact statistical methodologies in the context of large data sets and complex models. This paper considers...... a nonlinear stochastic differential equation model observed with correlated measurement errors and an application to protein folding modelling. An approximate Bayesian computation (ABC)-MCMC algorithm is suggested to allow inference for model parameters within reasonable time constraints. The ABC algorithm...... applications. A simulation study is conducted to compare our strategy with exact Bayesian inference, the latter being two orders of magnitude slower than ABC-MCMC for the considered set-up. Finally, the ABC algorithm is applied to a large protein dataset. The suggested methodology is fairly general...
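    As a minimal illustration of the ABC principle underlying the paper's approach, here is a rejection-sampling sketch for the mean of a normal distribution (the paper itself uses an ABC-MCMC scheme for SDE models; all names and numbers below are illustrative):

```python
import random

random.seed(7)

def abc_rejection(data, n_draws=5000, tol=0.05):
    """Minimal ABC rejection sampler for the mean of a unit-variance
    normal: draw theta from a flat prior, simulate a dataset of the same
    size, and keep theta when the simulated summary statistic (sample
    mean) falls within tol of the observed one."""
    obs = sum(data) / len(data)
    accepted = []
    for _ in range(n_draws):
        theta = random.uniform(-5, 5)          # flat prior on the mean
        sim = [random.gauss(theta, 1.0) for _ in data]
        if abs(sum(sim) / len(sim) - obs) < tol:
            accepted.append(theta)
    return accepted

data = [random.gauss(2.0, 1.0) for _ in range(50)]  # "observed" data, true mean 2
posterior = abc_rejection(data)
```

The accepted draws approximate the posterior without ever evaluating a likelihood, which is exactly the property that makes ABC attractive when the likelihood is intractable.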

  8. Prediction of peak ground acceleration of Iran's tectonic regions using a hybrid soft computing technique

    Directory of Open Access Journals (Sweden)

    Mostafa Gandomi

    2016-01-01

    A new model is derived to predict peak ground acceleration (PGA) utilizing a hybrid method coupling an artificial neural network (ANN) and simulated annealing (SA), called SA-ANN. The proposed model relates PGA to earthquake source-to-site distance, earthquake magnitude, average shear-wave velocity, faulting mechanism, and focal depth. A database of strong ground-motion recordings of 36 earthquakes that occurred in Iran's tectonic regions is used to establish the model. For further validation, the SA-ANN model is employed to predict the PGA for a part of the database beyond the training data domain. The proposed SA-ANN model is compared with a simple ANN in addition to 10 well-known models proposed in the literature. The proposed model's performance is superior to the single ANN and the other existing attenuation models. The SA-ANN model is highly correlated with the actual records (R = 0.835 and ρ = 0.0908) and is subsequently converted into a tractable design equation.

  9. Adjacency-Based Data Reordering Algorithm for Acceleration of Finite Element Computations

    Directory of Open Access Journals (Sweden)

    Min Zhou

    2010-01-01

    Effective use of the processor memory hierarchy is an important issue in high performance computing. In this work, a part-level mesh topological traversal algorithm is used to define a reordering of both mesh vertices and regions that increases the spatial locality of data and improves overall cache utilization during on-processor finite element calculations. Examples based on adaptively created unstructured meshes are considered to demonstrate the effectiveness of the procedure in cases where the load per processing core is varied but balanced (e.g., elements are equally distributed across cores for a given partition). In one example, the effect of the current adjacency-based data reordering is studied for different phases of an implicit analysis, including element-data blocking, element-level computations, sparse-matrix filling and equation solution. These results are compared to a case where reordering is applied to mesh vertices only. The computations are performed on various supercomputers including IBM Blue Gene (BG/L and BG/P), Cray XT (XT3 and XT5) and the Sun Constellation Cluster. It is observed that reordering improves the per-core performance by up to 24% on Blue Gene/L and up to 40% on Cray XT5. The CrayPat hardware performance tool is used to measure the number of cache misses across each level of the memory hierarchy. It is determined that the measured decrease in L1, L2 and L3 cache misses when data reordering is used closely accounts for the observed decrease in overall execution time.
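    The effect of adjacency-aware renumbering can be sketched with a breadth-first reordering (a simplified Cuthill-McKee-style heuristic, not the paper's part-level traversal), using the maximum index distance between neighbors as a crude locality proxy:

```python
from collections import deque

def bfs_reorder(adj):
    """Breadth-first renumbering of mesh vertices so that adjacent vertices
    receive nearby indices; low-degree neighbors are visited first, as in
    the Cuthill-McKee heuristic."""
    n = len(adj)
    order, seen = [], set()
    for start in range(n):
        if start in seen:
            continue
        queue = deque([start])
        seen.add(start)
        while queue:
            v = queue.popleft()
            order.append(v)
            for w in sorted(adj[v], key=lambda u: len(adj[u])):
                if w not in seen:
                    seen.add(w)
                    queue.append(w)
    return order

def bandwidth(adj, order):
    """Maximum index distance between connected vertices under an ordering
    (smaller means neighboring data sit closer together in memory)."""
    pos = {v: i for i, v in enumerate(order)}
    return max((abs(pos[u] - pos[v]) for u in range(len(adj)) for v in adj[u]),
               default=0)

# a path graph 0-3-1-4-2 whose labels are badly scattered
adj = [[3], [3, 4], [4], [0, 1], [1, 2]]
order = bfs_reorder(adj)
```

For this small example the BFS ordering reduces the bandwidth from 3 to 1, which is the kind of locality gain that translates into fewer cache misses at scale.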

  10. Using the fast fourier transform to accelerate the computational search for RNA conformational switches.

    Directory of Open Access Journals (Sweden)

    Evan Senter

    Using complex roots of unity and the Fast Fourier Transform, we design a new thermodynamics-based algorithm, FFTbor, that computes the Boltzmann probability that secondary structures differ by [Formula: see text] base pairs from an arbitrary initial structure of a given RNA sequence. The algorithm, which runs in quartic time O(n^4) and quadratic space O(n^2), is used to determine the correlation between kinetic folding speed and the ruggedness of the energy landscape, and to predict the location of riboswitch expression platform candidates. A web server is available at http://bioinformatics.bc.edu/clotelab/FFTbor/.
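    The roots-of-unity trick behind FFTbor can be sketched in a few lines: evaluate a polynomial at the n-th roots of unity, then recover its coefficients by an inverse DFT (shown here in the direct O(n^2) form for clarity; the polynomial is illustrative, standing in for the distance-indexed partition function):

```python
import cmath

def idft(values):
    """Recover polynomial coefficients from its values at the n-th roots
    of unity (inverse discrete Fourier transform, direct O(n^2) form)."""
    n = len(values)
    coeffs = []
    for k in range(n):
        s = sum(values[j] * cmath.exp(-2j * cmath.pi * j * k / n)
                for j in range(n))
        coeffs.append((s / n).real)  # coefficients are real here
    return coeffs

# Suppose Z(x) = 1 + 4x + 2x^2 encodes Boltzmann weights indexed by
# base-pair distance. Evaluating Z at the roots of unity and inverting
# recovers the individual weights.
n = 4
roots = [cmath.exp(2j * cmath.pi * j / n) for j in range(n)]
values = [1 + 4 * w + 2 * w * w for w in roots]
coeffs = idft(values)
```

Replacing the direct sum with an FFT is what brings the interpolation step down from O(n^2) to O(n log n).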

  11. Computational acceleration of orbital neutral sensor ionizer simulation through phenomena separation

    Science.gov (United States)

    Font, Gabriel I.

    2016-07-01

    Simulation of orbital phenomena is often difficult because of the non-continuum nature of the flow, which forces the use of particle methods, and the disparate time scales, which make long run times necessary. In this work, the computational work load has been reduced by taking advantage of the low number of collisions between different species. This allows each population of particles to be brought into convergence separately using a time step size optimized for its particular motion. The converged populations are then brought together to simulate low probability phenomena, such as ionization or excitation, on much longer time scales. The result of this technique has the effect of reducing run times by a factor of 10^3-10^4. The technique was applied to the simulation of a low earth orbit neutral species sensor with an ionizing element. Comparison with laboratory experiments of ion impacts generated by electron flux shows very good agreement.

  12. Tempest: Accelerated MS/MS Database Search Software for Heterogeneous Computing Platforms.

    Science.gov (United States)

    Adamo, Mark E; Gerber, Scott A

    2016-09-07

    MS/MS database search algorithms derive a set of candidate peptide sequences from in silico digest of a protein sequence database, and compute theoretical fragmentation patterns to match these candidates against observed MS/MS spectra. The original Tempest publication described these operations mapped to a CPU-GPU model, in which the CPU (central processing unit) generates peptide candidates that are asynchronously sent to a discrete GPU (graphics processing unit) to be scored against experimental spectra in parallel. The current version of Tempest expands this model, incorporating OpenCL to offer seamless parallelization across multicore CPUs, GPUs, integrated graphics chips, and general-purpose coprocessors. Three protocols describe how to configure and run a Tempest search, including discussion of how to leverage Tempest's unique feature set to produce optimal results. © 2016 by John Wiley & Sons, Inc.

  13. An accelerated conjugate gradient algorithm to compute low-lying eigenvalues: a study for the Dirac operator in SU(2) lattice QCD

    CERN Document Server

    Kalkreuter, T; Kalkreuter, Thomas; Simma, Hubert

    1995-01-01

    The low-lying eigenvalues of a (sparse) Hermitian matrix can be computed with controlled numerical errors by a conjugate gradient (CG) method. This CG algorithm is accelerated by alternating it with exact diagonalisations in the subspace spanned by the numerically computed eigenvectors. We study this combined algorithm in the case of the Dirac operator with (dynamical) Wilson fermions in four-dimensional SU(2) gauge fields. The algorithm is numerically very stable and can be parallelized in an efficient way. On lattices of sizes 4^4-16^4, an acceleration of the pure CG method by a factor of 4-8 is found.

  14. Isosurface Computation Made Simple: Hardware acceleration,Adaptive Refinement and tetrahedral Stripping

    Energy Technology Data Exchange (ETDEWEB)

    Pascucci, V

    2004-02-18

    This paper presents a simple approach for rendering isosurfaces of a scalar field. Using the vertex programming capability of commodity graphics cards, we transfer the cost of computing an isosurface from the Central Processing Unit (CPU), running the main application, to the Graphics Processing Unit (GPU), rendering the images. We consider a tetrahedral decomposition of the domain and draw one quadrangle (quad) primitive per tetrahedron. A vertex program transforms the quad into the piece of isosurface within the tetrahedron (see Figure 2). In this way, the main application is only devoted to streaming the vertices of the tetrahedra from main memory to the graphics card. For adaptively refined rectilinear grids, the optimization of this streaming process leads to the definition of a new 3D space-filling curve, which generalizes the 2D Sierpinski curve used for efficient rendering of triangulated terrains. We maintain the simplicity of the scheme when constructing view-dependent adaptive refinements of the domain mesh. In particular, we guarantee the absence of T-junctions by satisfying local bounds in our nested error basis. The expensive stage of fixing cracks in the mesh is completely avoided. We discuss practical tradeoffs in the distribution of the workload between the application and the graphics hardware. With current GPUs it is convenient to perform certain computations on the main CPU. Beyond the performance considerations that will change with new generations of GPUs, this approach has the major advantage of completely avoiding the storage in memory of the isosurface vertices and triangles.
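    A locality-preserving streaming order of the kind mentioned above can be illustrated with a Morton (Z-order) key; note the paper defines a new Sierpinski-like 3D curve, which this generic sketch does not reproduce:

```python
def morton3d(x, y, z, bits=10):
    """Interleave the bits of integer coordinates (x, y, z) into a single
    Morton (Z-order) key. Sorting cells by this key yields a simple
    locality-preserving streaming order: cells that are close in 3D tend
    to be close in the stream."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (3 * i)
        key |= ((y >> i) & 1) << (3 * i + 1)
        key |= ((z >> i) & 1) << (3 * i + 2)
    return key

# stream a 4x4x4 block of grid cells in Z-order rather than row-major order
cells = [(x, y, z) for x in range(4) for y in range(4) for z in range(4)]
stream_order = sorted(cells, key=lambda c: morton3d(*c))
```

Space-filling curves like this one (or the Sierpinski-style curve of the paper) keep nearby tetrahedra adjacent in the vertex stream, which is what makes the CPU-to-GPU transfer cache-friendly.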

  15. Strategic Aspirations

    DEFF Research Database (Denmark)

    Christensen, Lars Thøger; Morsing, Mette; Thyssen, Ole

    2016-01-01

    Strategic aspirations are public announcements designed to inspire, motivate, and create expectations about the future. Vision statements or value declarations are examples of such talk, through which organizations announce their ideal selves and declare what they (intend to) do. While aspirations...... aspirations, in other words, have exploratory and inspirational potential—two features that are highly essential in complex areas such as sustainability and CSR. This entry takes a communicative focus on strategic aspirations, highlighting the value of aspirational talk, understood as ideals and intentions...

  16. CudaPre3D: An Alternative Preprocessing Algorithm for Accelerating 3D Convex Hull Computation on the GPU

    Directory of Open Access Journals (Sweden)

    MEI, G.

    2015-05-01

    In computing convex hulls of point sets, a preprocessing step that filters the input by discarding non-extreme points is commonly used to improve computational efficiency. We previously proposed a quite straightforward preprocessing approach for accelerating 2D convex hull computation on the GPU. In this paper, we extend that algorithm to 3D. The basic ideas behind the two preprocessing algorithms are similar: first, several groups of extreme points are found according to the original set of input points and several rotated versions of the input set; then, a convex polyhedron is created from the found extreme points; and finally the interior points lying inside the formed convex polyhedron are discarded. Experimental results show that, when the proposed preprocessing algorithm is employed, speedups of about 4x on average and 5x to 6x in the best cases are achieved over the cases where it is not used. In addition, more than 95 percent of the input points can be discarded in most experimental tests.
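    The interior-point filtering idea can be sketched in 2D: take the extreme points in the four axis directions, form the quadrilateral they span, and discard points strictly inside it (the paper adds rotated copies of the input and extends the construction to 3D polyhedra):

```python
def cross(o, a, b):
    """2D cross product of vectors o->a and o->b (positive if b is to the
    left of the directed edge o->a)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def filter_interior(points):
    """Discard points strictly inside the quadrilateral spanned by the
    extreme points in the +x, +y, -x, -y directions (2D sketch of the
    preprocessing idea)."""
    ex = [max(points, key=lambda p: p[0]),   # rightmost
          max(points, key=lambda p: p[1]),   # topmost
          min(points, key=lambda p: p[0]),   # leftmost
          min(points, key=lambda p: p[1])]   # bottommost
    def inside(p):
        # strictly left of every CCW edge -> strictly inside the quad
        return all(cross(ex[i], ex[(i + 1) % 4], p) > 0 for i in range(4))
    return [p for p in points if not inside(p)]

pts = [(2, 0), (0, 2), (-2, 0), (0, -2), (0, 0), (0.5, 0.5), (2, 2)]
kept = filter_interior(pts)
```

Points surviving the filter are then passed to the actual convex hull algorithm; the extreme points themselves are never discarded because they lie on the quad's boundary.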

  17. Computer simulations predict that chromosome movements and rotations accelerate mitotic spindle assembly without compromising accuracy.

    Science.gov (United States)

    Paul, Raja; Wollman, Roy; Silkworth, William T; Nardi, Isaac K; Cimini, Daniela; Mogilner, Alex

    2009-09-15

    The mitotic spindle self-assembles in prometaphase by a combination of a centrosomal pathway, in which dynamically unstable microtubules search in space until chromosomes are captured, and a chromosomal pathway, in which microtubules grow from chromosomes and focus to the spindle poles. Quantitative mechanistic understanding of how spindle assembly can be both fast and accurate is lacking. Specifically, it is unclear how, if at all, chromosome movements and the combination of the centrosomal and chromosomal pathways affect assembly speed and accuracy. We used computer simulations and high-resolution microscopy to test plausible pathways of spindle assembly in realistic geometry. Our results suggest that an optimal combination of centrosomal and chromosomal pathways, spatially biased microtubule growth, and chromosome movements and rotations is needed to complete prometaphase in 10-20 min while keeping erroneous merotelic attachments down to a few percent. The simulations also provide kinetic constraints for alternative error correction mechanisms, shed light on the dual role of chromosome arm volume, and compare well with experimental data for bipolar and multipolar HT-29 colorectal cancer cells.

  18. Strategic Bonding.

    Science.gov (United States)

    Davis, Lynn; Tyson, Ben

    2003-01-01

    Many school buildings are in dire need of renovation, expansion, or replacement. Brief case studies from around the country illustrate the importance of finding out why people vote for or against a construction referendum. Lists recommendations for a strategic campaign. (MLF)

  19. Strategic Staffing

    Science.gov (United States)

    Clark, Ann B.

    2012-01-01

    Business and industry leaders do not flinch at the idea of placing top talent in struggling departments and divisions. This is not always the case in public education. The Charlotte-Mecklenburg Schools made a bold statement to its community in its strategic plan by identifying two key reform levers--(1) an effective principal leading each school;…

  20. Computational Approach for Securing Radiology-Diagnostic Data in Connected Health Network using High-Performance GPU-Accelerated AES.

    Science.gov (United States)

    Adeshina, A M; Hashim, R

    2017-03-01

    Diagnostic radiology is a core and integral part of modern medicine, paving the way for primary care physicians in disease diagnosis, treatment and therapy management. All recent standard healthcare procedures have benefitted immensely from contemporary information technology, which has revolutionized the approaches to acquiring, storing and sharing diagnostic data for efficient and timely diagnosis of diseases. The connected health network was introduced as an alternative to the ageing traditional concept of the healthcare system, improving hospital-physician connectivity and clinical collaboration. Undoubtedly, this modern approach has drastically improved healthcare, but at the expense of high computational cost and possible breaches of diagnostic privacy. Consequently, a number of cryptographic techniques have recently been applied to clinical applications, but the challenge of successfully encrypting both image and textual data persists. Furthermore, reducing the encryption-decryption time of medical datasets to a considerably lower computational cost, without jeopardizing the required security strength of the encryption algorithm, remains an outstanding issue. This study proposes a secured radiology-diagnostic data framework for the connected health network using high-performance GPU-accelerated Advanced Encryption Standard. The study was evaluated with radiology image datasets consisting of brain MR and CT datasets obtained from the Department of Surgery, University of North Carolina, USA, and the Swedish National Infrastructure for Computing. Sample patients' notes from the University of North Carolina School of Medicine at Chapel Hill were also used to evaluate the framework's strength in encrypting and decrypting textual data in the form of medical reports. Significantly, the framework is not only able to accurately encrypt and decrypt medical image datasets, but it also

  1. OpenVX-based Python Framework for real-time cross platform acceleration of embedded computer vision applications

    Directory of Open Access Journals (Sweden)

    Ori Heimlich

    2016-11-01

    Full Text Available Embedded real-time vision applications are being rapidly deployed across a large realm of consumer electronics, ranging from automotive safety to surveillance systems. However, the relatively limited computational power of embedded platforms is a bottleneck for many vision applications, necessitating optimization. OpenVX is a standardized interface, released in late 2014, that attempts to provide both system-level and kernel-level optimization to vision applications. With OpenVX, vision processing is modeled with coarse-grained data-flow graphs, which can be optimized and accelerated by the platform implementer. Current full implementations of OpenVX are given in the programming language C, which does not support advanced programming paradigms such as object-oriented and functional programming, nor does it provide runtime type-checking. Here we present a Python-based full implementation of OpenVX, which eliminates much of the discrepancy between the object-oriented paradigm used by many modern applications and the native C implementations. Our open-source implementation can be used for rapid development of OpenVX applications on embedded platforms. Demonstrations include static and real-time image acquisition and processing using a Raspberry Pi and a GoPro camera. Code is given as supplementary information. The code project and a linked deployable virtual machine are located on GitHub: https://github.com/NBEL-lab/PythonOpenVX.
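
    OpenVX's coarse-grained graph model - declare processing nodes up front, then let the implementation schedule and optimize the whole graph - can be sketched in a few lines of plain Python. The class and node functions below are a hypothetical illustration of the idea, not the actual PythonOpenVX or OpenVX API.

```python
class Graph:
    """Toy coarse-grained data-flow graph; nodes run in insertion order.

    A real OpenVX implementation is free to fuse, tile, or offload nodes,
    because the whole graph is known before execution begins.
    """

    def __init__(self):
        self.nodes = []

    def node(self, func):
        self.nodes.append(func)
        return func

    def process(self, image):
        for func in self.nodes:  # the schedule an implementer may optimize
            image = func(image)
        return image

graph = Graph()
graph.node(lambda img: [[min(p + 10, 255) for p in row] for row in img])  # brighten
graph.node(lambda img: [row[::-1] for row in img])                        # mirror

result = graph.process([[0, 100], [200, 250]])  # a 2x2 "image"
```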

  2. Accelerated Aging of BKC 44306-10 Rigid Polyurethane Foam: FT-IR Spectroscopy, Dimensional Analysis, and Micro Computed Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Gilbertson, Robert D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Patterson, Brian M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Smith, Zachary [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-01-02

    An accelerated aging study of BKC 44306-10 rigid polyurethane foam was carried out. Foam samples were aged in a nitrogen atmosphere at three different temperatures: 50 °C, 65 °C, and 80 °C. Samples were periodically removed from the aging canisters at 1, 3, 6, 9, 12, and 15 month intervals, when FT-IR spectroscopy, dimensional analysis, and mechanical testing experiments were performed. Micro computed tomography imaging was also employed to study the morphology of the foams. Over the course of the aging study the foams decreased in size by roughly 0.001 inches per inch of foam. Micro CT showed the heterogeneous nature of the foam structure, likely resulting from flow effects during the molding process. The effect of aging on the compression and tensile strength of the foam was minor and no cause for concern. FT-IR spectroscopy was used to follow the foam chemistry; however, it was difficult to draw definitive conclusions about changes in the chemical nature of the materials due to large variability throughout the samples.

  3. Strategic Marketing

    OpenAIRE

    Potter, Ned

    2015-01-01

    This chapter from The Library Marketing Toolkit focuses on marketing strategy. Marketing is more successful when it happens as part of a constantly renewing cycle. The aim of this chapter is to demystify the process of strategic marketing, simplifying it into seven key stages with advice on how to implement each one. Particular emphasis is put on dividing your audience and potential audience into segments, and marketing different messages to each group. It includes case studies from T...

  4. Implementation of GPU-based acceleration of the m-line reconstruction algorithm for circle-plus-line trajectory computed tomography

    Science.gov (United States)

    Li, Zengguang; Xi, Xiaoqi; Han, Yu; Yan, Bin; Li, Lei

    2016-10-01

    The circle-plus-line trajectory satisfies the exact-reconstruction data-sufficiency condition and can be applied in C-arm X-ray computed tomography (CT) systems to increase reconstruction image quality at large cone angles. The m-line reconstruction algorithm is adopted for this trajectory. The selection of the direction of the m-lines is quite flexible, and the m-line algorithm needs less data for accurate reconstruction than FDK-type algorithms. However, the computational complexity of the algorithm is too large for efficient serial processing, and the resulting reconstruction speed has become an important issue limiting its practical application; accelerating the algorithm is therefore of great practical value. Among hardware accelerators, the graphics processing unit (GPU) has become the mainstream choice in CT image reconstruction, and GPU acceleration has achieved good results for FDK-type algorithms. But accelerating the m-line algorithm for the circle-plus-line trajectory differs from the FDK case: the parallelism of the circle-plus-line algorithm must be analyzed to design an appropriate acceleration strategy. The implementation can be divided into the following steps. First, select m-lines that cover the entire object to be reconstructed; second, calculate the differentiated backprojection at points on the m-lines; third, perform Hilbert filtering along the m-line direction; finally, resample the m-line reconstruction results into Cartesian coordinates. In this paper, we design reasonable GPU acceleration strategies for each step to improve the reconstruction speed as much as possible. The main contribution is an appropriate acceleration strategy for the circle-plus-line trajectory m-line reconstruction algorithm. The Shepp-Logan phantom is used to simulate the experiment on a single K20 GPU. The
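
    The Hilbert-filtering step above is the kind of per-m-line kernel that maps naturally onto one GPU work item. A dependency-free sketch of that single step, using a truncated ideal discrete Hilbert kernel (kernel length and data are illustrative), is:

```python
import math

def hilbert_kernel(half_width):
    """Truncated ideal discrete Hilbert kernel: h[n] = 2/(pi*n) for odd n."""
    return {n: 2.0 / (math.pi * n)
            for n in range(-half_width, half_width + 1) if n % 2 != 0}

def hilbert_filter(samples, half_width=63):
    """Convolve one m-line's samples with the Hilbert kernel (one GPU work item)."""
    kernel = hilbert_kernel(half_width)
    out = []
    for i in range(len(samples)):
        acc = 0.0
        for n, h in kernel.items():
            if 0 <= i - n < len(samples):
                acc += h * samples[i - n]
        out.append(acc)
    return out

# The Hilbert transform maps cos to sin; check a sample far from the edges.
line = [math.cos(2 * math.pi * k / 16) for k in range(256)]
filtered = hilbert_filter(line)
assert abs(filtered[132] - math.sin(2 * math.pi * 132 / 16)) < 0.1
```

    On the GPU, each m-line is independent, so the loop over m-lines parallelizes trivially; the filtering itself can also be carried out in Fourier space.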

  5. Computation of the head-related transfer function via the fast multipole accelerated boundary element method and its spherical harmonic representation.

    Science.gov (United States)

    Gumerov, Nail A; O'Donovan, Adam E; Duraiswami, Ramani; Zotkin, Dmitry N

    2010-01-01

    The head-related transfer function (HRTF) is computed using the fast multipole accelerated boundary element method. For efficiency, the HRTF is computed using the reciprocity principle by placing a source at the ear and computing its field. Analysis is presented to modify the boundary value problem accordingly. To compute the HRTF corresponding to different ranges via a single computation, a compact and accurate representation of the HRTF, termed the spherical spectrum, is developed. Computations are reduced to a two stage process, the computation of the spherical spectrum and a subsequent evaluation of the HRTF. This representation allows easy interpolation and range extrapolation of HRTFs. HRTF computations are performed for the range of audible frequencies up to 20 kHz for several models including a sphere, human head models [the Neumann KU-100 ("Fritz") and the Knowles KEMAR ("Kemar") manikins], and head-and-torso model (the Kemar manikin). Comparisons between the different cases are provided. Comparisons with the computational data of other authors and available experimental data are conducted and show satisfactory agreement for the frequencies for which reliable experimental data are available. Results show that, given a good mesh, it is feasible to compute the HRTF over the full audible range on a regular personal computer.

  6. Integrating SOA and cloud computing technology into enterprise information technology strategic planning

    Institute of Scientific and Technical Information of China (English)

    牛昊天

    2014-01-01

    Applying SOA and cloud computing technology to enterprise information-technology strategic planning is a strong measure in support of a company's established development strategy, with positive implications for business-process innovation, improved operational efficiency, and reduced information operating costs. This paper analyses the problems in enterprise IT strategic planning and the ideas behind such planning, examines the advantages and limitations of SOA and cloud computing technologies, and, building on the strengths of both, designs an integrated SOA/cloud-computing structure to better serve the construction of enterprise information systems.

  7. Strategic Windows

    DEFF Research Database (Denmark)

    Risberg, Annette; King, David R.; Meglio, Olimpia

    We examine the importance of speed and timing in acquisitions with a framework that identifies management considerations for three interrelated acquisition phases (selection, deal closure and integration) from an acquiring firm’s perspective. Using a process perspective, we pinpoint items within...... acquisition phases that relate to speed. In particular, we present the idea of time-bounded strategic windows in acquisitions consistent with the notion of kairòs, where opportunities appear and must be pursued at the right time for success to occur....

  8. Strategic Management

    CERN Document Server

    Jeffs, Chris

    2008-01-01

    The Sage Course Companion on Strategic Management is an accessible introduction to the subject that avoids lengthy debate in order to focus on the core concepts. It will help the reader to develop their understanding of the key theories, whilst enabling them to bring diverse topics together in line with course requirements. The Sage Course Companion also provides advice on getting the most from your course work, help with analysing case studies, and tips on how to prepare for examinations. Designed to complement existing strategy textbooks, the Companion provides: -Quick and easy access to the

  9. Thinking strategically.

    Science.gov (United States)

    Goree, Michael

    2002-01-01

    Over the course of the past 20 years, human resources has tried a variety of strategic initiatives to add value to the working environment, from the alphabets of TQM, CQI, EVA, ROI, ISO, QS, Theory X, Y, Z, Generation X and Y to re-engineering, balanced scorecard, lean, hoshin, six sigma, to Margaret Wheatley's "The Simpler Way" and finally to cheese and fish. The problem is that none of these is a strategy. They are all tactics to accomplish or achieve a strategy.

  10. Strategic conversation

    Directory of Open Access Journals (Sweden)

    Nicholas Asher

    2013-08-01

    Full Text Available Models of conversation that rely on a strong notion of cooperation don’t apply to strategic conversation — that is, to conversation where the agents’ motives don’t align, such as courtroom cross examination and political debate. We provide a game-theoretic framework that provides an analysis of both cooperative and strategic conversation. Our analysis features a new notion of safety that applies to implicatures: an implicature is safe when it can be reliably treated as a matter of public record. We explore the safety of implicatures within cooperative and non cooperative settings. We then provide a symbolic model enabling us (i to prove a correspondence result between a characterisation of conversation in terms of an alignment of players’ preferences and one where Gricean principles of cooperative conversation like Sincerity hold, and (ii to show when an implicature is safe and when it is not. http://dx.doi.org/10.3765/sp.6.2 BibTeX info

  11. Strategic Engagement

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    “Pakistan regards China as a strategic partner and the bilateral ties have endured the test of time.” Pakistani Prime Minister Shaukat Aziz made the comment during his four-day official visit to China on April 16, when he met Chinese President Hu Jintao, Premier Wen Jiabao and NPC Standing Committee Chairman Wu Bangguo. His visit to China also included a trip to Boao, where he delivered a keynote speech at the Boao Forum for Asia held on April 20-22. During his stay in Beijing, the two countries signed 13 agreements on cooperation in the fields of space, telecommunications, education and legal assistance, which enhanced an already close strategic partnership. In an interview with Beijing Review reporter Pan Shuangqin, Prime Minister Aziz addressed a number of issues, ranging from Asia's search for a win-win economic situation to the influence of Sino-Pakistani relations on regional peace.

  12. Using compute unified device architecture-enabled graphic processing unit to accelerate fast Fourier transform-based regression Kriging interpolation on a MODIS land surface temperature image

    Science.gov (United States)

    Hu, Hongda; Shu, Hong; Hu, Zhiyong; Xu, Jianhui

    2016-04-01

    Kriging interpolation provides the best linear unbiased estimation for unobserved locations, but its heavy computation limits the manageable problem size in practice. To address this issue, an efficient interpolation procedure incorporating the fast Fourier transform (FFT) was developed. Extending this efficient approach, we propose an FFT-based parallel algorithm to accelerate regression Kriging interpolation on an NVIDIA® compute unified device architecture (CUDA)-enabled graphic processing unit (GPU). A high-performance cuFFT library in the CUDA toolkit was introduced to execute computation-intensive FFTs on the GPU, and three time-consuming processes were redesigned as kernel functions and executed on the CUDA cores. A MODIS land surface temperature 8-day image tile at a resolution of 1 km was resampled to create experimental datasets at eight different output resolutions. These datasets were used as the interpolation grids with different sizes in a comparative experiment. Experimental results show that speedup of the FFT-based regression Kriging interpolation accelerated by GPU can exceed 1000 when processing datasets with large grid sizes, as compared to the traditional Kriging interpolation running on the CPU. These results demonstrate that the combination of FFT methods and GPU-based parallel computing techniques greatly improves the computational performance without loss of precision.
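
    The record's FFT trick rests on one identity: multiplying by a stationary covariance on a regular periodic grid is a circular convolution, which becomes a pointwise product in Fourier space. The sketch below checks that identity in miniature, using a naive O(n²) DFT built from the standard library in place of cuFFT; the grid size and exponential covariance model are illustrative.

```python
import cmath
import math

def dft(x, inverse=False):
    """Naive DFT (a tiny stand-in for an FFT library such as cuFFT)."""
    n = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[k] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
               for k in range(n)) for j in range(n)]
    return [v / n for v in out] if inverse else out

def cov_times_vector(cov_row, weights):
    """Circulant-covariance-times-vector via the convolution theorem."""
    fa, fb = dft(cov_row), dft(weights)
    return [v.real for v in dft([a * b for a, b in zip(fa, fb)], inverse=True)]

# First row of a circulant covariance on an 8-cell periodic grid.
n, corr_range = 8, 2.0
cov_row = [math.exp(-min(h, n - h) / corr_range) for h in range(n)]
weights = [1.0, 0.5, 0.0, 0.0, 0.0, 0.0, 0.0, -0.5]

fast = cov_times_vector(cov_row, weights)
direct = [sum(cov_row[(i - j) % n] * weights[j] for j in range(n))  # O(n^2) check
          for i in range(n)]
assert all(abs(a - b) < 1e-9 for a, b in zip(fast, direct))
```

    Replacing the O(n²) matrix-vector products with FFTs is what allows the GPU to handle the large grids mentioned in the abstract without loss of precision.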

  13. Strategic Planning: What's so Strategic about It?

    Science.gov (United States)

    Strong, Bart

    2005-01-01

    The words "strategic" and "planning" used together can lead to confusion unless one spent the early years of his career in never-ending, team-oriented, corporate training sessions. Doesn't "strategic" have something to do with extremely accurate bombing or a defensive missile system or Star Wars or something? Don't "strategic" and "planning" both…

  14. Case Studies in Strategic Planning

    Science.gov (United States)

    1990-03-06

    Contains developed case studies in strategic planning on the Navy General Board, Joint Service War Planning 1919 to 1941, Navy Strategic Planning, NASA...in Strategic Planning NPS-56-88-031-PR of September 1988. Strategic planning, strategic management.

  15. On "enabling systems - A strategic review"

    Digital Repository Service at National Institute of Oceanography (India)

    Nayak, M.R.

    Enabling Systems is a formal strategic planning exercise that sets its direction in an organization for the 21st century. Information technology (IT), Computer Centre (CC) and Analytical Laboratory (AnLab) are identified as three important...

  16. Strategic serendipity

    DEFF Research Database (Denmark)

    Knudsen, Gry Høngsmark; Lemmergaard, Jeanette

    2014-01-01

    This paper contributes to critical voices on the issue of strategic communication. It does so by exploring how an organisation can seize the moment of serendipity based on careful preparation of its issues management and communication channels. The focus of the study is the media coverage of – and communicative responses to – Kopenhagen Fur's campaign The World's Best – but not perfect in both broadcast media (e.g. print and television) and social media, more specifically Facebook. Through understanding how an organisation can plan for and take advantage of the unpredictable through state… organisations as communicative actors can take advantage of the serendipity afforded by other actors' campaigns when advocating and campaigning.

  17. Experiences of How Developed Countries Accelerated the Transformation of Scientific and Technological Achievements in Strategic Emerging Industries --- Taking Electric Vehicles as an Example

    Institute of Scientific and Technical Information of China (English)

    杨荣

    2013-01-01

    Strategic emerging industries are highly challenging, and improving the conversion rate of their scientific and technological achievements is an important goal. Taking electric vehicles as an example, the author analyses how developed countries have promoted the transformation of strategic emerging industries' scientific and technological achievements into practice: formulating dedicated strategic plans, enacting special laws and regulations, conducting research on technical standards, emphasizing industry-university-research cooperation, carrying out demonstration operations, promoting the results vigorously, setting up management and intermediary organizations, and focusing on the education and training of professional and technical personnel. These experiences offer useful references for accelerating the development of strategic emerging industries and improving the conversion rate of their scientific and technological achievements.

  18. Effects of dimensionality on computer simulations of laser-ion acceleration: When are three-dimensional simulations needed?

    Science.gov (United States)

    Yin, L.; Stark, D. J.; Albright, B. J.

    2016-10-01

    Laser-ion acceleration via relativistic induced transparency provides an effective means to accelerate ions to tens of MeV/nucleon over distances of tens of μm. These ion sources may enable a host of applications, from fast ignition and x-ray sources to medical treatments. Understanding whether two-dimensional (2D) PIC simulations can capture the relevant 3D physics is important to the development of a predictive capability for short-pulse laser-ion acceleration and for economical design studies for applications of these accelerators. In this work, PIC simulations are performed in 3D and in 2D with the laser polarization either in the simulation plane (2D-P) or out of the plane (2D-S). Our studies indicate modeling sensitivity to dimensionality and laser polarization: differences arise in energy partition, electron heating, ion peak energy, and ion spectral shape, and 2D-P simulations are found to over-predict electron heating and ion peak energy. The origin of these differences and the extent to which 2D simulations may capture the key acceleration dynamics will be discussed. Work performed under the auspices of the U.S. DOE by LANS, LLC, Los Alamos National Laboratory under Contract No. DE-AC52-06NA25396. Funding provided by the Los Alamos National Laboratory Directed Research and Development Program.

  19. 7 March 2013 -Stanford University Professor N. McKeown FREng, Electrical Engineering and Computer Science and B. Leslie, Creative Labs visiting CERN Control Centre and the LHC tunnel with Director for Accelerators and Technology S. Myers.

    CERN Multimedia

    Anna Pantelia

    2013-01-01


  20. Evaluation of Computer-Aided System Design Tools for SDI (Strategic Defense Initiative) Battle Management/C3 (Command, Control and Communications) Architecture Development

    Science.gov (United States)

    1987-10-01


  1. Learning without experience: Understanding the strategic implications of deregulation and competition in the electricity industry

    Energy Technology Data Exchange (ETDEWEB)

    Lomi, A. [School of Economics, University of Bologna, Bologna (Italy); Larsen, E.R. [Dept. of Managements Systems and Information, City University Business School, London (United Kingdom)

    1998-11-01

    As deregulation of the electricity industry continues to gain momentum around the world, electricity companies face unprecedented challenges. Competitive complexity and intensity will increase substantially as deregulated companies find themselves competing in new industries, with new rules, against unfamiliar competitors - and without any history to learn from. We describe the different kinds of strategic issues that newly deregulated utility companies are facing, and the risks that those issues entail. We identify a number of problems induced by experiential learning under conditions of competence-destroying change, and we illustrate ways in which companies can activate history-independent learning processes. We suggest that microworlds - a new generation of computer-based learning environments made possible by conceptual and technological progress in the fields of system dynamics and systems thinking - are particularly appropriate tools for accelerating and enhancing organizational and managerial learning under conditions of increased competitive complexity. (au)
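
    A microworld of the sort the authors describe can be boiled down to a tiny system-dynamics loop: a stock of generating capacity, growing demand, and an investment rule acting through a construction delay. All parameter values below are invented purely to illustrate the mechanism a manager would experiment with.

```python
def simulate(investment_gain, years=30):
    """Toy electricity-market microworld: a capacity stock with a build delay.

    history[t] is the reserve margin (capacity - demand); negative = shortage.
    """
    capacity, demand = 100.0, 100.0
    pipeline = [0.0, 0.0, 0.0]  # plants under construction (multi-year delay)
    history = []
    for _ in range(years):
        demand *= 1.03                                 # demand grows 3%/year
        starts = max(0.0, investment_gain * (demand - capacity))
        pipeline.append(starts)                        # investment decision rule
        capacity += pipeline.pop(0)                    # old orders come online
        capacity *= 0.98                               # 2%/year plant retirement
        history.append(capacity - demand)
    return history

cautious = simulate(investment_gain=0.5)
aggressive = simulate(investment_gain=3.0)
```

    Because orders only arrive years later, an aggressive investment rule overshoots and swings between glut and shortage - exactly the kind of dynamic a manager can discover safely inside the model rather than in a deregulated market.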

  2. Monte Carlo simulations of molecular gas flow: some applications in accelerator vacuum technology using a versatile personal computer program

    Energy Technology Data Exchange (ETDEWEB)

    Pace, A.; Poncet, A. (European Organization for Nuclear Research, Geneva (Switzerland))

    1990-01-01

    The Monte Carlo technique has been used extensively in the past to solve the problem of molecular flow through vacuum pipes or structures with specific boundary conditions for which analytical or even approximate solutions do not exist. Starting from a specific program written in 1975, the idea germinated over the years to produce handy, rather general, problem solving applications capable of running efficiently on modern microcomputers, mainly for ease of transportability and interactivity. Here, the latest version is described. The capabilities and limitations of these tools are presented through a few practical cases of conductance and pumping speed calculations pertinent to accelerator vacuum technology. (author).
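
    The canonical calculation such programs perform - the transmission probability (Clausing factor) of a tube in molecular flow - fits in a short test-particle Monte Carlo. The version below is a plain-Python sketch with illustrative particle counts; entrance flux and wall re-emission both follow the cosine (diffuse) law.

```python
import math
import random

def transmission_probability(length, radius=1.0, n_particles=8000, seed=1):
    """Monte Carlo estimate of a cylindrical tube's molecular-flow transmission."""
    rng = random.Random(seed)

    def cosine_direction(n):
        """Sample a cosine-law (diffuse) direction about the unit normal n."""
        nx, ny, nz = n
        a = (0.0, 0.0, 1.0) if abs(nz) < 0.9 else (1.0, 0.0, 0.0)
        # Orthonormal basis (t1, t2, n) around the normal.
        t1 = (a[1] * nz - a[2] * ny, a[2] * nx - a[0] * nz, a[0] * ny - a[1] * nx)
        norm = math.sqrt(sum(c * c for c in t1))
        t1 = tuple(c / norm for c in t1)
        t2 = (ny * t1[2] - nz * t1[1], nz * t1[0] - nx * t1[2], nx * t1[1] - ny * t1[0])
        ct = math.sqrt(rng.random())               # cosine-law polar angle
        st = math.sqrt(1.0 - ct * ct)
        phi = 2.0 * math.pi * rng.random()
        return tuple(st * math.cos(phi) * t1[i] + st * math.sin(phi) * t2[i] + ct * n[i]
                     for i in range(3))

    transmitted = 0
    for _ in range(n_particles):
        r = radius * math.sqrt(rng.random())       # uniform over the entrance disc
        ang = 2.0 * math.pi * rng.random()
        x, y, z = r * math.cos(ang), r * math.sin(ang), 0.0
        d = cosine_direction((0.0, 0.0, 1.0))
        while True:
            a2 = d[0] * d[0] + d[1] * d[1]
            if a2 > 1e-12:                         # distance to the cylinder wall
                b = x * d[0] + y * d[1]
                disc = b * b + a2 * (radius * radius - x * x - y * y)
                t_wall = (-b + math.sqrt(max(0.0, disc))) / a2
            else:
                t_wall = math.inf
            t_exit = (length - z) / d[2] if d[2] > 0 else math.inf
            t_back = -z / d[2] if d[2] < 0 else math.inf
            t = min(t_wall, t_exit, t_back)
            if t == t_exit:                        # escaped through the far end
                transmitted += 1
                break
            if t == t_back:                        # returned out the entrance
                break
            x, y, z = x + t * d[0], y + t * d[1], z + t * d[2]
            d = cosine_direction((-x / radius, -y / radius, 0.0))  # diffuse wall hit
    return transmitted / n_particles
```

    For a tube of length equal to two radii, the estimate should land near the tabulated Clausing factor of about 0.51, and it falls as the tube lengthens.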

  3. Moving Beyond Strategic Planning to Strategic Thinking.

    Science.gov (United States)

    Wolverton, Mimi; Gmelch, Walter

    1999-01-01

    Examines a moderately sized Washington school district's efforts to move beyond strategic planning as a segregated activity toward thinking strategically about long-term plans to govern both tactical operations and the district's future. Top management grew to recognize the legitimacy of multiple external and internal constituent claims. (25…

  4. Revisiting Strategic versus Non-strategic Cooperation

    NARCIS (Netherlands)

    Reuben, E.; Suetens, S.

    2009-01-01

    We use a novel experimental design to disentangle strategically- and non-strategically-motivated cooperation. By using contingent responses in a repeated sequential prisoners’ dilemma with a known probabilistic end, we differentiate end-game behavior from continuation behavior within individuals whi

  5. Strategic information security

    CERN Document Server

    Wylder, John

    2003-01-01

    Introduction to Strategic Information Security: What Does It Mean to Be Strategic?; Information Security Defined; The Security Professional's View of Information Security; The Business View of Information Security; Changes Affecting Business and Risk Management; Strategic Security; Strategic Security or Security Strategy?; Monitoring and Measurement; Moving Forward. ORGANIZATIONAL ISSUES: The Life Cycles of Security Managers; Introduction; The Information Security Manager's Responsibilities; The Evolution of Data Security to Information Security; The Repository Concept; Changing Job Requirements; Business Life Cycles

  6. Understanding the effect of touchdown distance and ankle joint kinematics on sprint acceleration performance through computer simulation.

    Science.gov (United States)

    Bezodis, Neil Edward; Trewartha, Grant; Salo, Aki Ilkka Tapio

    2015-06-01

    This study determined the effects of simulated technique manipulations on early acceleration performance. A planar seven-segment angle-driven model was developed and quantitatively evaluated based on the agreement of its output to empirical data from an international-level male sprinter (100 m personal best = 10.28 s). The model was then applied to independently assess the effects of manipulating touchdown distance (horizontal distance between the foot and centre of mass) and range of ankle joint dorsiflexion during early stance on horizontal external power production during stance. The model matched the empirical data with a mean difference of 5.2%. When the foot was placed progressively further forward at touchdown, horizontal power production continually reduced. When the foot was placed further back, power production initially increased (a peak increase of 0.7% occurred at 0.02 m further back) but decreased as the foot continued to touchdown further back. When the range of dorsiflexion during early stance was reduced, exponential increases in performance were observed. Increasing negative touchdown distance directs the ground reaction force more horizontally; however, a limit to the associated performance benefit exists. Reducing dorsiflexion, which required achievable increases in the peak ankle plantar flexor moment, appears potentially beneficial for improving early acceleration performance.
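
    A drastically simplified geometric sketch (not the authors' seven-segment, angle-driven model) shows why placing the foot further back directs the ground reaction force more horizontally: if the force is assumed to act along the line from foot to centre of mass, the propulsive share grows as touchdown distance becomes more negative. The sign convention and all numbers below are illustrative only.

```python
import math

def horizontal_force_fraction(touchdown_distance, com_height=0.9):
    """Horizontal share of a force directed from the foot to the centre of mass.

    touchdown_distance > 0 means the foot lands ahead of the CoM;
    negative values mean the foot lands behind it. Units: metres.
    """
    leg = math.hypot(touchdown_distance, com_height)
    return -touchdown_distance / leg   # > 0 is propulsive, < 0 is braking

behind = horizontal_force_fraction(-0.10)  # foot 10 cm behind the CoM
ahead = horizontal_force_fraction(+0.10)   # foot 10 cm ahead of the CoM
assert behind > 0 > ahead                  # propulsion vs braking
assert horizontal_force_fraction(-0.20) > behind
```

    The study's finding that the benefit peaks at a modest shift and then declines marks where this toy breaks down: a full simulation must also capture whether the leg can still generate force in each configuration.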

  7. Strategizing NATO's Narratives

    DEFF Research Database (Denmark)

    Nissen, Thomas Elkjer

    2014-01-01

    …implementation structures, and capabilities can be used to inform the construction of strategic narratives in NATO. Using Libya as a case study, he explains that the formulation and implementation of strategic narratives in NATO is currently a fragmented process that rarely takes into account the grand strategic…

  8. CARAT-GxG: CUDA-Accelerated Regression Analysis Toolkit for Large-Scale Gene-Gene Interaction with GPU Computing System.

    Science.gov (United States)

    Lee, Sungyoung; Kwon, Min-Seok; Park, Taesung

    2014-01-01

    In genome-wide association studies (GWAS), regression analysis has been the most common way to establish an association between a phenotype and genetic variants such as single nucleotide polymorphisms (SNPs). However, most applications of regression analysis have been restricted to the investigation of a single marker because of the large computational burden, so there have been limited applications to multiple SNPs, including gene-gene interaction (GGI), in large-scale GWAS data. To overcome this limitation, we propose CARAT-GxG, a GPU-computing-oriented toolkit for performing regression analysis with GGI using CUDA (compute unified device architecture). Compared to other methods, CARAT-GxG achieved an almost 700-fold execution speed-up and delivered highly reliable results through our GPU-specific optimization techniques. In addition, almost linear speed-up could be achieved on a GPU computing cluster managed by the TORQUE Resource Manager. We expect that CARAT-GxG will enable large-scale regression analysis with GGI for GWAS data.
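
    The unit of work that CARAT-GxG parallelises across CUDA cores is a small regression with an interaction term, one per SNP pair. A dependency-free sketch of a single such fit (ordinary least squares via the normal equations; the genotype coding and data are invented) is:

```python
def fit_interaction_model(snp1, snp2, phenotype):
    """OLS fit of y ~ 1 + g1 + g2 + g1*g2 via the normal equations."""
    X = [[1.0, a, b, a * b] for a, b in zip(snp1, snp2)]
    xtx = [[sum(r[i] * r[j] for r in X) for j in range(4)] for i in range(4)]
    xty = [sum(r[i] * y for r, y in zip(X, phenotype)) for i in range(4)]
    # Solve the 4x4 system by Gaussian elimination with partial pivoting.
    for col in range(4):
        pivot = max(range(col, 4), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[pivot] = xtx[pivot], xtx[col]
        xty[col], xty[pivot] = xty[pivot], xty[col]
        for r in range(col + 1, 4):
            f = xtx[r][col] / xtx[col][col]
            xtx[r] = [v - f * w for v, w in zip(xtx[r], xtx[col])]
            xty[r] -= f * xty[col]
    beta = [0.0] * 4
    for i in range(3, -1, -1):
        beta[i] = (xty[i] - sum(xtx[i][j] * beta[j]
                                for j in range(i + 1, 4))) / xtx[i][i]
    return beta

# Synthetic check: a phenotype built from known effects is recovered exactly.
g1 = [0, 1, 2, 0, 1, 2, 0, 1, 2]          # additive genotype coding
g2 = [0, 0, 0, 1, 1, 1, 2, 2, 2]
y = [1.0 + 0.5 * a - 0.3 * b + 0.8 * a * b for a, b in zip(g1, g2)]
beta = fit_interaction_model(g1, g2, y)
```

    A GWAS with m SNPs has m(m-1)/2 such independent fits, which is why the problem parallelises so well across GPU cores.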

  9. Final Report on Institutional Computing Project s15_hilaserion, “Kinetic Modeling of Next-Generation High-Energy, High-Intensity Laser-Ion Accelerators as an Enabling Capability”

    Energy Technology Data Exchange (ETDEWEB)

    Albright, Brian James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Yin, Lin [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Stark, David James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-06

    This proposal sought of order 1M core-hours of Institutional Computing time intended to enable computing by a new LANL Postdoc (David Stark) working under LDRD ER project 20160472ER (PI: Lin Yin) on laser-ion acceleration. The project was “off-cycle,” initiating in June of 2016 with a postdoc hire.

  10. Learning to think strategically.

    Science.gov (United States)

    1994-01-01

    Strategic thinking focuses on issues that directly affect the ability of a family planning program to attract and retain clients. This issue of "The Family Planning Manager" outlines the five steps of strategic thinking in family planning administration: 1) define the organization's mission and strategic goals; 2) identify opportunities for improving quality, expanding access, and increasing demand; 3) evaluate each option in terms of its compatibility with the organization's goals; 4) select an option; and 5) transform strategies into action. Also included in this issue is a 20-question test designed to permit readers to assess their "strategic thinking quotient" and a list of sample questions to guide a strategic analysis.

  11. Golden-Finger and Back-Door: Two HW/SW Mechanisms for Accelerating Multicore Computer Systems

    Directory of Open Access Journals (Sweden)

    Slo-Li Chu

    2012-01-01

    Full Text Available Continuously growing demands for high-performance computing push computer systems to adopt more processors to improve parallelism and throughput. Although multiple processing cores may be implemented in a computer system, complicated hardware communication mechanisms between processors can decrease the performance of the overall system, and the process scheduling mechanism of a conventional operating system cannot fully utilize the computational power of the additional processors. Accordingly, this paper provides two mechanisms to overcome these challenges, one in hardware and one in software. On the software side, we propose a tool, called Golden-Finger, to dynamically adjust the scheduling policy of the Linux process scheduler. This mechanism can improve the performance of a specified process by letting it occupy a processor exclusively. On the hardware side, we design an effective mechanism, called Back-Door, to enable communication between two independent processors that otherwise cannot operate together, such as the dual PowerPC 405 cores in the Xilinx ML310 system. The experimental results reveal that the two mechanisms obtain significant performance enhancements.
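
    The Golden-Finger idea - letting a favoured process monopolise one processor - has a rough user-space analogue in Linux's CPU-affinity interface, shown below. This only narrows the kernel's placement choices for the calling process; it does not modify the scheduler itself as Golden-Finger does, and it degrades gracefully on platforms without the call.

```python
import os

def pin_to_one_cpu():
    """Briefly restrict the calling process to a single CPU, then restore.

    Returns the narrowed affinity set, or None where unsupported (e.g. macOS).
    """
    if not hasattr(os, "sched_setaffinity"):
        return None
    original = os.sched_getaffinity(0)        # CPUs currently allowed
    os.sched_setaffinity(0, {min(original)})  # sole use of one CPU
    try:
        return os.sched_getaffinity(0)
    finally:
        os.sched_setaffinity(0, original)     # undo the demonstration

result = pin_to_one_cpu()
```

    Pinning avoids cross-core migrations and cache invalidations, which is the same motivation the paper gives for dedicating a processor to one process.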

  12. Examining the Impact of Strategic Learning on Strategic Agility

    OpenAIRE

    Wael Mohamad Subhi Idris; Methaq Taher Kadhim AL-Rubaie

    2013-01-01

    The main aim of this study is to examine the impact of strategic learning on strategic agility in the Elba House Company in Jordan. The study adopts the descriptive analytical approach to achieve its objectives. Of a total of (55) individuals, (47) responded and answered the distributed questionnaire. The study finds that strategic learning (strategic knowledge creation, strategic knowledge distribution, strategic knowledge interpretation and strategic knowledge implementation) ...

  13. Strategic Analysis Overview

    Science.gov (United States)

    Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe

    2008-01-01

    NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for Lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks among all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.

  14. Implementation of the networked computer based control system for PEFP 100MeV proton linear accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Song, Young Gi; Kwon, Hyeok Jung; Jang, Ji Ho; Cho, Yong Sub [KAERI, Daejeon (Korea, Republic of)

    2012-10-15

    The 100MeV Radio Frequency (RF) linac for the pulsed proton source is under development at KAERI. The main systems of the linac, such as the general timing control, the high-power RF system, the control system of the klystrons, the power supply system of the magnets, the vacuum subsystem, and the cooling system, should be integrated into the PEFP control system. The various subsystem units of the linac are made by different manufacturers to different standards. The technical integration will be based upon the Experimental Physics and Industrial Control System (EPICS) software framework. Network-attached computers, such as workstations, servers, VME crates, and embedded systems, will be applied as control devices. This paper discusses the integration and implementation of the distributed control system using networked computer systems.

  15. Vol. 34 - Optimization of quench protection heater performance in high-field accelerator magnets through computational and experimental analysis

    CERN Document Server

    Salmi, Tiina

    2016-01-01

    Superconducting accelerator magnets with increasingly high magnetic fields are being designed to improve the performance of the Large Hadron Collider (LHC) at CERN. One of the technical challenges is magnet quench protection, i.e., preventing damage in the case of an unexpected loss of superconductivity and the heat generation related to that. Traditionally this is done by disconnecting the magnet current supply and using so-called protection heaters. The heaters suppress the superconducting state across a large fraction of the winding, thus leading to a uniform dissipation of the stored energy. Preliminary studies suggested that the high-field Nb3Sn magnets under development for the LHC luminosity upgrade (HiLumi) could not be reliably protected using the existing heaters. In this thesis work I analyzed in detail the present state-of-the-art protection heater technology, aiming to optimize its performance and evaluate its prospects in high-field magnet protection. The heater efficiency analyses ...

  16. Prediction of peak ground acceleration of Iran’s tectonic regions using a hybrid soft computing technique

    Institute of Scientific and Technical Information of China (English)

    Mostafa Gandomi; Mohsen Soltanpour; Mohammad R. Zolfaghari; Amir H. Gandomi

    2016-01-01

    A new model is derived to predict the peak ground acceleration (PGA) utilizing a hybrid method coupling an artificial neural network (ANN) and simulated annealing (SA), called SA-ANN. The proposed model relates PGA to earthquake source-to-site distance, earthquake magnitude, average shear-wave velocity, faulting mechanism, and focal depth. A database of strong ground-motion recordings of 36 earthquakes that occurred in Iran's tectonic regions is used to establish the model. For further validation, the SA-ANN model is employed to predict the PGA for a part of the database beyond the training data domain. The proposed SA-ANN model is compared with a simple ANN as well as 10 well-known models proposed in the literature. The proposed model's performance is superior to that of the single ANN and the other existing attenuation models. The SA-ANN model is highly correlated with the actual records (R = 0.835 and r = 0.0908) and is subsequently converted into a tractable design equation.
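The SA component can be illustrated in isolation. A minimal simulated-annealing loop under stated assumptions (linear cooling, uniform neighbourhood, a toy 1-D objective; the paper's actual SA-ANN coupling, which tunes network parameters against ground-motion records, is far more involved):

```python
import math
import random

def anneal(f, x0, steps=20000, t0=1.0, seed=42):
    """Minimise f by simulated annealing with a linear cooling schedule."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9      # temperature decays toward 0
        cand = x + rng.uniform(-0.5, 0.5)    # random neighbour of x
        fc = f(cand)
        # always accept improvements; accept worse moves with
        # Boltzmann probability exp(-delta / t)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest

best, fbest = anneal(lambda x: (x - 3.0) ** 2, x0=0.0)  # minimum at x = 3
```

Early on, the high temperature lets the search escape local minima; as the temperature drops, it settles into the best basin found.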

  17. Strategic Shock: Managing the Strategic Gap

    Science.gov (United States)

    2013-03-01

    ... planning for strategic shocks. In Blindside, Francis Fukuyama covers much of the same intellectual territory with a specific focus on national security ... "Anticipating Strategic Surprise," in Blindside, ed. Francis Fukuyama (Washington, D.C.: Brookings Institution Press, 2007), 93. ... Their theories and application are focused on the business environment rather than the national security environment.

  18. Strategic Thoughts in Organizations

    Directory of Open Access Journals (Sweden)

    Juliane Inês Di Francesco Kich

    2014-08-01

    Full Text Available This paper aims to analyze a new way of thinking about organizational strategies through a theoretical discussion of the term "strategic thought" and its development in organizations. To achieve this, bibliographical research was conducted in order to go more deeply into the theme and reach a conceptual background that can support further analysis. Among the results of this research, it is emphasized that the pragmatic character of strategic planning appears to have no more space in the current organizational world; this tool needs to be interconnected with the strategic thought process to bring more effective results. In this regard, the challenge lies in how organizations can develop strategic planning that encourages strategic thought instead of undermining it, as well as in developing tools that promote the ability to think strategically in all employees, regardless of hierarchical level.

  19. Cultivating strategic thinking skills.

    Science.gov (United States)

    Shirey, Maria R

    2012-06-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches helpful to nurse leaders advancing organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the author presents an overview of strategic leadership and offers approaches for cultivating strategic thinking skills.

  20. Strategic Marketing Planning Audit

    OpenAIRE

    Violeta Radulescu

    2012-01-01

    Market-oriented strategic planning is the process of defining and maintaining a viable relationship between the objectives, personnel training, and resources of an organization, on the one hand, and market conditions, on the other. Strategic marketing planning is an integral part of the strategic planning process of the organization. For a marketing organization to be successful, to obtain a competitive advantage, and also to measure the effectiveness of its marketing actions, the company is required to ...

  1. Acceleration of Computational Fluid Dynamics Codes on GPU

    Institute of Scientific and Technical Information of China (English)

    董廷星; 李新亮; 李森; 迟学斌

    2011-01-01

    Computational Fluid Dynamics (CFD) codes based on incompressible Navier-Stokes, compressible Euler and compressible Navier-Stokes solvers are ported to an NVIDIA GPU. As validation tests, we have simulated a two-dimensional cavity flow, a Riemann problem and a transonic flow over a RAE2822 airfoil. A maximum speedup of 33.2x is reported in our tests. To maximize the GPU code performance, we also explore a number of GPU-specific optimization strategies. The work demonstrates that the GPU code gives the expected results compared with the CPU code and experimental data, and that GPU computing has good compatibility and a bright future.
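The kernels that benefit most from such a port are typically structured-grid stencils. A CPU-side sketch of one Jacobi relaxation step for a 2-D Laplace problem with a driven top boundary, loosely in the spirit of the cavity-flow test (illustrative only; the paper's solvers are full Navier-Stokes/Euler codes):

```python
import numpy as np

def jacobi_step(u):
    """One Jacobi iteration: each interior cell becomes the mean of its
    four neighbours; boundary rows/columns are left untouched."""
    v = u.copy()
    v[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                            u[1:-1, :-2] + u[1:-1, 2:])
    return v

u = np.zeros((64, 64))
u[0, :] = 1.0                 # driven (hot) lid on the top boundary
for _ in range(500):
    u = jacobi_step(u)
```

On a GPU, every interior cell can be updated by its own thread, which is why such stencils reach the large speedups reported.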

  2. ARCHER, a New Monte Carlo Software Tool for Emerging Heterogeneous Computing Environments

    Science.gov (United States)

    Xu, X. George; Liu, Tianyu; Su, Lin; Du, Xining; Riblett, Matthew; Ji, Wei; Gu, Deyang; Carothers, Christopher D.; Shephard, Mark S.; Brown, Forrest B.; Kalra, Mannudeep K.; Liu, Bob

    2014-06-01

    The Monte Carlo radiation transport community faces a number of challenges associated with peta- and exa-scale computing systems that rely increasingly on heterogeneous architectures involving hardware accelerators such as GPUs. Existing Monte Carlo codes and methods must be strategically upgraded to meet emerging hardware and software needs. In this paper, we describe the development of a software tool called ARCHER (Accelerated Radiation-transport Computations in Heterogeneous EnviRonments), which is designed as a versatile testbed for future Monte Carlo codes. Preliminary results from five projects in nuclear engineering and medical physics are presented.
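A toy example of the kind of Monte Carlo transport kernel such testbeds target: estimating uncollided photon transmission through a slab by sampling exponential free-path lengths (an illustrative sketch only, not ARCHER's physics; all parameters are hypothetical):

```python
import math
import random

def transmission(mu, d, n=100_000, seed=1):
    """Fraction of photons whose sampled free path exceeds slab depth d,
    in a medium with attenuation coefficient mu (free paths follow an
    exponential distribution with mean 1/mu)."""
    rng = random.Random(seed)
    passed = sum(1 for _ in range(n)
                 if -math.log(rng.random()) / mu > d)
    return passed / n

est = transmission(mu=0.2, d=5.0)   # analytic answer: exp(-1) ~ 0.368
```

Each history is independent, which is what makes this workload embarrassingly parallel and a natural fit for GPUs.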

  3. DCOI Strategic Plan

    Data.gov (United States)

    General Services Administration — Under the Data Center Optimization Initiative (DCOI), covered agencies are required to post DCOI Strategic Plans and updates to their FITARA milestones publicly on...

  4. Sandia Strategic Plan 1997

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-01

    Sandia embarked on its first exercise in corporate strategic planning during the winter of 1989. The results of that effort were disseminated with the publication of Strategic Plan 1990. Four years later Sandia conducted its second major planning effort and published Strategic Plan 1994. Sandia's 1994 planning effort linked very clearly to the Department of Energy's first strategic plan, Fueling a Competitive Economy. It benefited as well from the leadership of Lockheed Martin Corporation, the management and operating contractor. Lockheed Martin's corporate success is founded on visionary strategic planning and annual operational planning driven by customer requirements and technology opportunities. In 1996 Sandia conducted another major planning effort that resulted in the development of eight long-term Strategic Objectives. Strategic Plan 1997 differs from its predecessors in that the robust elements of previous efforts have been integrated into one comprehensive body. The changes implemented so far have helped establish a living strategic plan with a stronger business focus and with clear deployment throughout Sandia. The concept of a personal line of sight for all employees to this strategic plan and its objectives, goals, and annual milestones is becoming a reality.

  5. Strategic Belief Management

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul

    While (managerial) beliefs are central to many aspects of strategic organization, interactive beliefs are almost entirely neglected, save for some game theory treatments. In an increasingly connected and networked economy, firms confront coordination problems that arise because of network effects. The capability to manage beliefs will increasingly be a strategic one, a key source of wealth creation, and a key research area for strategic organization scholars.

  6. Strategic planning for neuroradiologists.

    Science.gov (United States)

    Berlin, Jonathan W; Lexa, Frank J

    2012-08-01

    Strategic planning is becoming essential to neuroradiology as the health care environment continues to emphasize cost efficiency, teamwork and collaboration. A strategic plan begins with a mission statement and a vision of where the neuroradiology division would like to be in the near future. Formalized strategic planning frameworks, such as the strengths, weaknesses, opportunities and threats (SWOT) and the Balanced Scorecard frameworks, can help neuroradiology divisions determine their current position in the marketplace. Communication, delegation, and accountability in neuroradiology are essential to executing an effective strategic plan.

  7. GPU-based implementation of an accelerated SR-NLUT based on N-point one-dimensional sub-principal fringe patterns in computer-generated holograms

    Directory of Open Access Journals (Sweden)

    Hee-Min Choi

    2015-06-01

    Full Text Available An accelerated spatial-redundancy-based novel look-up table (A-SR-NLUT) method based on a new concept of the N-point one-dimensional sub-principal fringe pattern (N-point 1-D sub-PFP) is implemented on a graphics processing unit (GPU) for fast calculation of computer-generated holograms (CGHs) of three-dimensional (3-D) objects. Since the proposed method can generate the N-point two-dimensional (2-D) PFPs for CGH calculation from the pre-stored N-point 1-D PFPs, the loading time of the N-point PFPs on the GPU can be dramatically reduced, which results in a great increase in the computational speed of the proposed method. Experimental results confirm that the average calculation time for one object point has been reduced by 49.6% and 55.4% compared to those of the conventional 2-D SR-NLUT methods for the 2-point and 3-point SR maps, respectively.
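The reason 1-D storage can stand in for 2-D tables is the separability of the Fresnel fringe pattern: exp(ik(x² + y²)/2z) = exp(ikx²/2z) · exp(iky²/2z), so a 2-D fringe pattern can be rebuilt as the outer product of two pre-stored 1-D patterns. A sketch with hypothetical constants (not the paper's actual table layout or sampling):

```python
import numpy as np

k_over_2z = 1.0e4                    # hypothetical k/(2z) constant
x = np.linspace(-1e-2, 1e-2, 256)    # hypothetical hologram-plane axis
pfp_1d = np.exp(1j * k_over_2z * x**2)       # pre-stored 1-D pattern

# Rebuild the 2-D fringe pattern on the fly from the 1-D one.
pfp_2d = np.outer(pfp_1d, pfp_1d)

# Sanity check against the direct 2-D evaluation.
X, Y = np.meshgrid(x, x, indexing="ij")
direct = np.exp(1j * k_over_2z * (X**2 + Y**2))
```

Storing 256 complex samples instead of 256×256 is what cuts the GPU loading time the abstract highlights.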

  8. How Strategic are Strategic Information Systems?

    Directory of Open Access Journals (Sweden)

    Alan Eardley

    1996-11-01

    Full Text Available There are many examples of information systems which are claimed to have created and sustained competitive advantage, allowed beneficial collaboration or simply ensured the continued survival of the organisations which used them. These systems are often referred to as being 'strategic'. This paper argues that many of the examples of strategic information systems reported in the literature are not sufficiently critical in determining whether the systems meet the generally accepted definition of the term 'strategic' - that of achieving sustainable competitive advantage. Eight of the information systems considered to be strategic are examined here from the standpoint of one widely accepted 'competition' framework - Porter's model of industry competition. The framework is then used to question the linkage between the information systems and the mechanisms which are required for the enactment of strategic business objectives based on competition. Conclusions indicate that the systems are compatible with Porter's framework. Finally, some limitations of the framework are discussed and aspects of the systems which extend beyond the framework are highlighted.

  9. Strategic HRD within companies

    NARCIS (Netherlands)

    Wognum, A.A.M.; Mulder, M.M.

    1999-01-01

    This article reports a preliminary survey that was conducted within the framework of the project on strategic human resource development (HRD), in which the effects of strategic HRD are explored for various aspects of organisations. The aim of the survey was to explore some conditions that are important ...

  10. 11. Strategic planning.

    Science.gov (United States)

    2014-05-01

    There are several types of planning processes and plans, including strategic, operational, tactical, and contingency. For this document, operational planning includes tactical planning. This chapter examines the strategic planning process and includes an introduction into disaster response plans. "A strategic plan is an outline of steps designed with the goals of the entire organisation as a whole in mind, rather than with the goals of specific divisions or departments". Strategic planning includes all measures taken to provide a broad picture of what must be achieved and in which order, including how to organise a system capable of achieving the overall goals. Strategic planning often is done pre-event, based on previous experience and expertise. The strategic planning for disasters converts needs into a strategic plan of action. Strategic plans detail the goals that must be achieved. The process of converting needs into plans has been deconstructed into its components and includes consideration of: (1) disaster response plans; (2) interventions underway or planned; (3) available resources; (4) current status vs. pre-event status; (5) history and experience of the planners; and (6) access to the affected population. These factors are tempered by the local: (a) geography; (b) climate; (c) culture; (d) safety; and (e) practicality. The planning process consumes resources (costs). All plans must be adapted to the actual conditions--things never happen exactly as planned.

  11. Improved Strategic Planning

    Science.gov (United States)

    1966-04-08

    ... to analyze the difficulties of providing the improved strategic planning needed for more orderly progress in human affairs. This analysis consists of an ... identification of important conceptual difficulties which stand in the way of improving strategic planning. This thesis concludes that it is necessary ...

  12. Manage "Human Capital" Strategically

    Science.gov (United States)

    Odden, Allan

    2011-01-01

    To strategically manage human capital in education means restructuring the entire human resource system so that schools not only recruit and retain smart and capable individuals, but also manage them in ways that support the strategic directions of the organization. These management practices must be aligned with a district's education improvement…

  13. On strategic spatial planning

    Directory of Open Access Journals (Sweden)

    Tošić Branka

    2014-01-01

    Full Text Available The goal of this paper is to explain the origin and development of strategic spatial planning, to show its complex features, and to highlight the differences from and/or advantages over traditional, physical spatial planning. Strategic spatial planning is viewed as one of the approaches in legally defined planning documents, through a display of the properties of sectoral national strategies, as well as through issues of strategic planning at the local level in Serbia. The strategic approach is clearly recognized at the national and sub-national levels of spatial planning in European countries and in our country. This is confirmed by the goals outlined in documents of the European Union and Serbia that promote the grounds of territorial cohesion and strategic integrated planning, emphasizing cooperation and the principles of sustainable spatial development. [Project of the Ministry of Science of the Republic of Serbia, no. 176017]

  14. FY17 Strategic Themes.

    Energy Technology Data Exchange (ETDEWEB)

    Leland, Robert W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-03-01

    I am pleased to present this summary of the FY17 Division 1000 Science and Technology Strategic Plan. As this plan represents a continuation of the work we started last year, the four strategic themes (Mission Engagement, Bold Outcomes, Collaborative Environment, and Safety Imperative) remain the same, along with many of the goals. You will see most of the changes in the actions listed for each goal: We completed some actions, modified others, and added a few new ones. As I’ve stated previously, this is not a strategy to be pursued in tension with the Laboratory strategic plan. The Division 1000 strategic plan is intended to chart our course as we strive to contribute our very best in service of the greater Laboratory strategy. I welcome your feedback and look forward to our dialogue about these strategic themes. Please join me as we move forward to implement the plan in the coming months.

  15. Future accelerators (?)

    Energy Technology Data Exchange (ETDEWEB)

    John Womersley

    2003-08-21

    I describe the future accelerator facilities that are currently foreseen for electroweak scale physics, neutrino physics, and nuclear structure. I will explore the physics justification for these machines, and suggest how the case for future accelerators can be made.

  16. The strategic issues - structural elements of strategic management

    OpenAIRE

    Balta Corneliu

    2013-01-01

    The paper presents the most important concepts related to strategic management and the connection with strategic results as they are obtained after the main steps in strategic management are followed. The dynamics of the relationship between strategic issues and objectives is included

  17. Strategic Communication Institutionalized

    DEFF Research Database (Denmark)

    Kjeldsen, Anna Karina

    2013-01-01

    ... of institutionalization when strategic communication is not yet visible as organizational practice, and how can such detections provide an explanation for the later outcome of the process? (2) How can studies of strategic communication benefit from an institutional perspective? How can the virus metaphor generate a deeper understanding of the mechanisms that interact from the time an organization is exposed to a new organizational idea such as strategic communication until it surfaces in the form of symptoms such as mission and vision statements, communication manuals and communication positions? The first part of the article ... communication in three Danish art museums.

  18. THE STRATEGIC OPTIONS IN INVESTMENT PROJECTS VALUATION

    Directory of Open Access Journals (Sweden)

    VIOLETA SĂCUI

    2012-11-01

    Full Text Available The topic of real options applies option valuation techniques to capital budgeting exercises in which a project is coupled with a put or call option. In many project valuation settings, the firm has one or more options to make strategic changes to the project during its life. These strategic options, which are known as real options, are typically ignored in standard discounted cash-flow analysis, where a single expected present value is computed. This paper presents the types of real options that are encountered in economic activity.
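The gap between static discounted cash-flow analysis and option-aware valuation can be made concrete with a one-step binomial sketch (all numbers and the risk-neutral probability are hypothetical; real applications use multi-step lattices or Black-Scholes-style models):

```python
def expansion_option_value(v_up, v_down, invest, r, p):
    """Value today of the option to invest `invest` next year and receive
    the then-known project value; the bad branch is simply abandoned."""
    payoff_up = max(v_up - invest, 0.0)      # exercise in the up state
    payoff_down = max(v_down - invest, 0.0)  # abandon in the down state
    return (p * payoff_up + (1 - p) * payoff_down) / (1 + r)

# Committing today: (0.5*150 + 0.5*40)/1.05 - 100 < 0, so static DCF
# rejects the project. Waiting keeps the upside and drops the downside.
val = expansion_option_value(v_up=150.0, v_down=40.0, invest=100.0,
                             r=0.05, p=0.5)
```

A negative-NPV commitment can thus coexist with a strictly positive option value, which is exactly the effect single-expected-present-value analysis ignores.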

  19. Strategic agility for nursing leadership.

    Science.gov (United States)

    Shirey, Maria R

    2015-06-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change. In this article, the author discusses strategic agility as an important leadership competency and offers approaches for incorporating strategic agility in healthcare systems. A strategic agility checklist and infrastructure-building approach are presented.

  20. Accelerating Value Creation with Accelerators

    DEFF Research Database (Denmark)

    Jonsson, Eythor Ivar

    2015-01-01

    ... accelerator programs. Microsoft runs accelerators in seven different countries. Accelerators have grown out of the infancy stage and are now an accepted approach to developing new ventures based on cutting-edge technology like the internet of things, mobile technology, big data and virtual reality. It is also ... with the traditional audit and legal universes and industries are examples of emerging potentials, both from a research and a business point of view, to exploit and explore further. The accelerator approach may therefore be an Idea Watch to consider, no matter which industry you are in, because in essence accelerators ...

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  2. Accelerated Metals Development by Computation

    Science.gov (United States)

    2008-02-01

    MS&T 2006, Cincinnati, Shade et al.; MRS 2006 Fall Meeting, Boston, Shade et al.; 2007 International Workshop on Small Scale Plasticity, Braunwald, Switzerland; “Characterization of Grain Growth Behavior in a Nickel-Base Alloy,” Materials Science & Technology 2006 conference (MS&T’06)

  3. Accelerating Value Creation with Accelerators

    DEFF Research Database (Denmark)

    Jonsson, Eythor Ivar

    2015-01-01

    Accelerators can help to accelerate value creation. Accelerators are short-term programs that have the objective of creating innovative and fast-growing ventures. They have gained attraction as larger corporations like Microsoft, Barclays bank and Nordea bank have initiated and sponsored accelerator programs ... an approach to facilitate implementation and realization of business ideas and is a lucrative approach to transform research into ventures and to revitalize regions and industries in transition. Investors have noticed that the accelerator approach is a way to increase the possibility of success by funnelling ...

  4. Strategic planning in transition

    DEFF Research Database (Denmark)

    Olesen, Kristian; Richardson, Tim

    2012-01-01

    In this paper, we analyse how contested transitions in planning rationalities and spatial logics have shaped the processes and outputs of recent episodes of Danish ‘strategic spatial planning’. The practice of ‘strategic spatial planning’ in Denmark has undergone a concerted reorientation in recent years as a consequence of an emerging neoliberal agenda promoting a growth-oriented planning approach emphasising a new spatial logic of growth centres in the major cities and urban regions. The analysis of the three planning episodes, at different subnational scales, highlights how this new style of ‘strategic spatial planning’ with its associated spatial logics is continuously challenged by a persistent regulatory, top-down rationality of ‘strategic spatial planning’, rooted in spatial Keynesianism, which has long characterised the Danish approach. The findings reveal the emergence ...

  5. The IAU Strategic Plan

    Science.gov (United States)

    Miley, George

    2016-10-01

    I shall review the content of the IAU Strategic Plan (SP) to use astronomy as a tool for stimulating development globally during the decade 2010 - 2020. Considerable progress has been made in its implementation since the last General Assembly.

  6. Complex Strategic Choices

    DEFF Research Database (Denmark)

    Leleur, Steen

    Effective decision making requires a clear methodology, particularly in a complex world of globalisation. Institutions and companies in all disciplines and sectors are faced with increasingly multi-faceted areas of uncertainty which cannot always be effectively handled by traditional strategies. Complex Strategic Choices provides clear principles and methods which can guide and support strategic decision making to face the many current challenges. By considering ways in which planning practices can be renewed and exploring the possibilities for acquiring awareness and tools to add value to strategic decision making, Complex Strategic Choices presents a methodology which is further illustrated by a number of case studies and example applications. Dr. Techn. Steen Leleur has adapted previously established research based on feedback and input from various conferences, journals and students ...

  7. Strategic Management: General Concepts

    Directory of Open Access Journals (Sweden)

    Shahram Tofighi

    2010-05-01

    Full Text Available In the era after long-term planning was superseded by strategic planning, it was hoped that managers could be more successful in implementing their plans. The outcomes fell short of expectations, with only minor improvements. In organizations, plenty of nominally strategic plans have been developed during strategic planning processes, but most of these plans have been kept on the shelves; only a few played their role as guiding documents for the entire organization. What factors induce such outcomes? Different scientists have offered a variety of justifications, according to their experiences.

    The first issue examined was the misunderstanding of strategic planning by managers and staff; the strategic planning process may be executed erroneously, and what they had expected from the process was not accurate. Substantially, strategic planning looks at the future and coming situations, and is designed to answer the questions which will emerge in the future. Unfortunately, this critical and fundamental characteristic of strategic planning is often obscured.

    Strategic planning conveys the concept of drawing the future and developing a set of different probable scenarios, along with defining a set of solutions, in order to combat undesirable coming conditions and to position the system or business. It helps organizations keep themselves safe and maintain their success. In other words, in strategic planning efforts we seek solutions fit for problems which will appear in the future, for the conditions that will emerge in the future. Unfortunately, most strategic plans developed in organizations lack this important and critical characteristic; in most of them the developers had offered solutions to solve today's problems in the future!

    The second issue considered by the scientists was the task of ensuring the continuity of the effectiveness of the planning; there was a

  8. Engineering Forum Strategic Plan

    Science.gov (United States)

    This Strategic Plan highlights the purpose, mission, goals, and objectives of the U.S. Environmental Protection Agency (EPA) Engineering Forum (EF). It sets forth the principles that guide the EF's decision-making, helps clarify the EF's priorities, and...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  11. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  12. 2015 Enterprise Strategic Vision

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-08-01

    This document aligns with the Department of Energy Strategic Plan for 2014-2018 and provides a framework for integrating our missions and direction for pursuing DOE’s strategic goals. The vision is a guide to advancing world-class science and engineering, supporting our people, modernizing our infrastructure, and developing a management culture that operates a safe and secure enterprise in an efficient manner.

  13. USAF Strategic Master Plan

    Science.gov (United States)

    2015-05-01

    Strategic Posture Annex (SPA). The Strategic Posture Annex provides direction on where and how the Air Force will pursue the mid- and far-term development ... command and control. ● Adaptive Organizations through new and more agile structures and processes. Development and Education An agile Air... control lifecycle costs and reliably deliver timely, suitable solutions to the warfighter. ● Use experimentation for agile capability development. Meeting

  14. Strategizing in multiple ways

    DEFF Research Database (Denmark)

    Larsen, Mette Vinther; Madsen, Charlotte Øland; Rasmussen, Jørgen Gulddahl

    2013-01-01

    Strategy processes are kinds of wayfaring where different actors interpret a formally defined strategy differently. In the everyday practice of organizations strategizing takes place in multiple ways through narratives and sensible actions. This forms a meshwork of polyphonic ways to enact one a...... based on this development paper is whether one can understand these divergent strategic wayfaring processes as constructive for organizations....

  15. Restriction of the use of hazardous substances (RoHS in the personal computer segment: analysis of the strategic adoption by the manufacturers settled in Brazil

    Directory of Open Access Journals (Sweden)

    Ademir Brescansin

    2015-09-01

    Full Text Available The enactment of the RoHS Directive (Restriction of Hazardous Substances in 2003, limiting the use of certain hazardous substances in electronic equipment has forced companies to adjust their products to comply with this legislation. Even in the absence of similar legislation in Brazil, manufacturers of personal computers which are located in this country have been seen to adopt RoHS for products sold in the domestic market and abroad. The purpose of this study is to analyze whether these manufacturers have really adopted RoHS, focusing on their motivations, concerns, and benefits. This is an exploratory study based on literature review and interviews with HP, Dell, Sony, Lenovo, Samsung, LG, Itautec, and Positivo, using summative content analysis. The results showed that initially, global companies adopted RoHS to market products in Europe, and later expanded this practice to all products. Brazilian companies, however, adopted RoHS to participate in the government’s sustainable procurement bidding processes. It is expected that this study can assist manufacturers in developing strategies for reducing or eliminating hazardous substances in their products and processes, as well as help the government to formulate public policies on reducing risks of environmental contamination.

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  17. Developing a framework for predicting upper extremity muscle activities, postures, velocities, and accelerations during computer use: the effect of keyboard use, mouse use, and individual factors on physical exposures.

    Science.gov (United States)

    Bruno Garza, Jennifer L; Catalano, Paul J; Katz, Jeffrey N; Huysmans, Maaike A; Dennerlein, Jack T

    2012-01-01

    Prediction models were developed based on keyboard and mouse use in combination with individual factors that could be used to predict median upper extremity muscle activities, postures, velocities, and accelerations experienced during computer use. In the laboratory, 25 participants performed five simulated computer trials with different amounts of keyboard and mouse use, ranging from a highly keyboard-intensive trial to a highly mouse-intensive trial. During each trial, muscle activity and postures of the shoulder and wrist and velocities and accelerations of the wrists, along with percentage keyboard and mouse use, were measured. Four individual factors (hand length, shoulder width, age, and gender) were also measured on the day of data collection. Percentage keyboard and mouse use explained a large amount of the variability in wrist velocities and accelerations. Although hand length, shoulder width, and age were each significant predictors of at least one median muscle activity, posture, velocity, or acceleration exposure, these individual factors explained very little variability beyond percentage keyboard and mouse use in any of the physical exposures investigated. The amounts of variability explained by models predicting median wrist velocities and accelerations ranged from 75 to 84% but were much lower for median muscle activities and postures (0-50%). RMS errors ranged from 8 to 13% of the range observed. While the predictions for wrist velocities and accelerations may be able to be used to improve exposure assessment for future epidemiologic studies, more research is needed to identify other factors that may improve the predictions for muscle activities and postures.
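    The kind of prediction model described above (a physical exposure regressed on percentage keyboard use, with variance explained reported as R-squared) can be sketched as an ordinary least-squares fit. The data, variable names, and coefficients below are hypothetical illustrations, not the study's actual model:

    ```python
    import numpy as np

    # Hypothetical data for 25 participants: percentage of task time spent on
    # the keyboard, and a synthetic median wrist-velocity exposure in which
    # more keyboard use means lower wrist velocity, plus measurement noise.
    rng = np.random.default_rng(0)
    pct_keyboard = rng.uniform(0, 100, size=25)
    wrist_velocity = 40.0 - 0.25 * pct_keyboard + rng.normal(0, 2.0, size=25)

    # Ordinary least squares: design matrix with an intercept column.
    X = np.column_stack([np.ones_like(pct_keyboard), pct_keyboard])
    beta, *_ = np.linalg.lstsq(X, wrist_velocity, rcond=None)

    # R^2 = 1 - SS_residual / SS_total, the "variability explained" statistic.
    pred = X @ beta
    ss_res = np.sum((wrist_velocity - pred) ** 2)
    ss_tot = np.sum((wrist_velocity - wrist_velocity.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    print(f"intercept={beta[0]:.2f}, slope={beta[1]:.3f}, R^2={r2:.2f}")
    ```

    Adding further predictors (hand length, shoulder width, age) is just a matter of appending columns to `X`; the abstract's finding is that such columns raise R-squared only marginally once percentage keyboard and mouse use is included.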

  18. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television, as many long feared, but the computer: the ubiquitous portal of work and personal lives. At this point, the computer is so common we scarcely notice it in our view. It is difficult to envision that, not that long ago, it was a gigantic, room-sized structure accessible to only a few, inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little more noted than a toaster. These dramati

  19. Strategic business planning and development for competitive health care systems.

    Science.gov (United States)

    Nauert, Roger C

    2005-01-01

    The health care industry has undergone enormous evolutionary changes in recent years. Competitive transitions have accelerated the compelling need for aggressive strategic business planning and dynamic system development. Success is driven by organizational commitments to farsighted market analyses, timely action, and effective management.

  20. Guidelines for strategic planning

    Energy Technology Data Exchange (ETDEWEB)

    1991-07-01

    Strategic planning needs to be done as one of the integral steps in fulfilling our overall Departmental mission. The role of strategic planning is to assure that the longer term destinations, goals, and objectives which the programs and activities of the Department are striving towards are the best we can envision today, so that our courses can then be set to move in those directions. Strategic planning will assist the Secretary, Deputy Secretary, and Under Secretary in setting the long-term directions and policies for the Department and in making final decisions on near-term priorities and resource allocations. It will assist program developers and implementors by providing the necessary guidance for multi-year program plans and budgets. It is one of the essential steps in the Secretary's Strategic Planning Initiative. The operational planning most of us are so familiar with deals with how to get things done and with the resources needed (people, money, facilities, time) to carry out tasks. Operating plans like budgets, capital line item projects, R&D budgets, project proposals, etc., are vital to the mission of the Department. They deal, however, with how to carry out programs to achieve some objective or budget assumption. Strategic planning deals with the prior question of what it is that should be attempted. It deals with what objectives the many programs and activities of the Department should be striving toward. The purpose of this document is to provide guidance to those organizations and personnel starting the process for the first time, as well as those who have prepared strategic plans in the past and now wish to review and update them. This guideline should not be construed as a rigid, restrictive, or confining rulebook. Each organization is encouraged to develop such enhancements as they think may be useful in their planning. The steps outlined in this document represent a very simplified approach to strategic planning. 9 refs.

  1. LIBO accelerates

    CERN Multimedia

    2002-01-01

    The prototype module of LIBO, a linear accelerator project designed for cancer therapy, has passed its first proton-beam acceleration test. In parallel a new version - LIBO-30 - is being developed, which promises to open up even more interesting avenues.

  2. RECIRCULATING ACCELERATION

    Energy Technology Data Exchange (ETDEWEB)

    BERG,J.S.; GARREN,A.A.; JOHNSTONE,C.

    2000-04-07

    This paper compares various types of recirculating accelerators, outlining the advantages and disadvantages of various approaches. The accelerators are characterized according to the types of arcs they use: whether there is a single arc for the entire recirculator or there are multiple arcs, and whether the arc(s) are isochronous or non-isochronous.

  3. Computational investigation of 99Mo, 89Sr, and 131I production rates in a subcritical UO2(NO3)2 aqueous solution reactor driven by a 30-MeV proton accelerator

    Directory of Open Access Journals (Sweden)

    Z. Gholamzadeh

    2015-12-01

    Full Text Available The use of subcritical aqueous homogenous reactors driven by accelerators presents an attractive alternative for producing 99Mo. In this method, the medical isotope production system itself is used to extract 99Mo or other radioisotopes so that there is no need to irradiate common targets. In addition, it can operate at much lower power compared to a traditional reactor to produce the same amount of 99Mo by irradiating targets. In this study, the neutronic performance and 99Mo, 89Sr, and 131I production capacity of a subcritical aqueous homogenous reactor fueled with low-enriched uranyl nitrate was evaluated using the MCNPX code. A proton accelerator with a maximum proton energy of 30 MeV was used to drive the subcritical core. The computational results indicate a good potential for the modeled system to produce the radioisotopes under completely safe conditions because of the high negative reactivity coefficients of the modeled core. The results show that application of an optimized beam window material can increase the fission power of the aqueous nitrate fuel by up to 80%. This accelerator-based procedure using low-enriched uranium nitrate fuel to produce radioisotopes presents a potentially competitive alternative to reactor-based or other accelerator-based methods. This system produces ∼1,500 Ci/wk (∼325 6-day Ci) of 99Mo at the end of a cycle.
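    The "6-day Ci" unit used above is the activity remaining six days after the end of a production cycle, which follows from simple radioactive decay. A minimal sketch of that conversion, assuming the standard 99Mo half-life of about 65.94 hours (the 1,500 Ci end-of-cycle figure is taken from the abstract):

    ```python
    import math

    # 99Mo decays with a half-life of ~65.94 hours.
    HALF_LIFE_H = 65.94
    decay_const = math.log(2) / HALF_LIFE_H          # per hour

    end_of_cycle_ci = 1500.0                         # Ci at end of cycle (from the abstract)
    hours = 6 * 24                                   # six days

    # Exponential decay: A(t) = A0 * exp(-lambda * t)
    six_day_ci = end_of_cycle_ci * math.exp(-decay_const * hours)
    print(f"{six_day_ci:.0f} 6-day Ci")              # ~330, consistent with the quoted ~325
    ```

    The small difference between ~330 here and the quoted ~325 is plausibly rounding in the source figures; the point is that roughly 78% of the activity is lost to decay over the six-day standardization window.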

  4. COMPUTING

    CERN Document Server

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per RAW event. The central collisions are more complex and...

  5. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini, and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  6. FY16 Strategic Themes.

    Energy Technology Data Exchange (ETDEWEB)

    Leland, Robert W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-03-01

    I am pleased to present this summary of the Division 1000 Science and Technology Strategic Plan. This plan was created with considerable participation from all levels of management in Division 1000, and is intended to chart our course as we strive to contribute our very best in service of the greater Laboratory strategy. The plan is characterized by four strategic themes: Mission Engagement, Bold Outcomes, Collaborative Environment, and the Safety Imperative. Each theme is accompanied by a brief vision statement, several goals, and planned actions to support those goals throughout FY16. I want to be clear that this is not a strategy to be pursued in tension with the Laboratory strategic plan. Rather, it is intended to describe “how” we intend to show up for the “what” described in Sandia’s Strategic Plan. I welcome your feedback and look forward to our dialogue about these strategic themes. Please join me as we move forward to implement the plan in the coming year.

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  11. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  14. Thinking strategically about capitation.

    Science.gov (United States)

    Boland, P

    1997-05-01

    All managed care stakeholders--health plan members, employers, providers, community organizations, and government entities--share a common interest in reducing healthcare costs while improving the quality of care health plan members receive. Although capitation is usually thought of primarily as a payment mechanism, it can be a powerful tool providers and health plans can use to accomplish these strategic objectives and others, such as restoring and maintaining the health of plan members or improving a community's health status. For capitation to work effectively as a strategic tool, its use must be tied to a corporate agenda of partnering with stakeholders to achieve broader strategic goals. Health plans and providers must develop a partnership strategy in which each stakeholder has well-defined roles and responsibilities. The capitation structure must reinforce interdependence, shift focus from meeting organizational needs to meeting customer needs, and develop risk-driven care strategies.

  15. Strategic Self-Ignorance

    DEFF Research Database (Denmark)

    Thunström, Linda; Nordström, Leif Jonas; Shogren, Jason F.

    We examine strategic self-ignorance—the use of ignorance as an excuse to overindulge in pleasurable activities that may be harmful to one’s future self. Our model shows that guilt aversion provides a behavioral rationale for present-biased agents to avoid information about negative future impacts...... of such activities. We then confront our model with data from an experiment using prepared, restaurant-style meals — a good that is transparent in immediate pleasure (taste) but non-transparent in future harm (calories). Our results support the notion that strategic self-ignorance matters: nearly three of five...... subjects (58 percent) chose to ignore free information on calorie content, leading at-risk subjects to consume significantly more calories. We also find evidence consistent with our model on the determinants of strategic self-ignorance....

  16. Strategic self-ignorance

    DEFF Research Database (Denmark)

    Thunström, Linda; Nordström, Leif Jonas; Shogren, Jason F.

    2016-01-01

    We examine strategic self-ignorance—the use of ignorance as an excuse to over-indulge in pleasurable activities that may be harmful to one’s future self. Our model shows that guilt aversion provides a behavioral rationale for present-biased agents to avoid information about negative future impacts...... of such activities. We then confront our model with data from an experiment using prepared, restaurant-style meals—a good that is transparent in immediate pleasure (taste) but non-transparent in future harm (calories). Our results support the notion that strategic self-ignorance matters: nearly three of five...... subjects (58%) chose to ignore free information on calorie content, leading at-risk subjects to consume significantly more calories. We also find evidence consistent with our model on the determinants of strategic self-ignorance....

  17. Strategic CSR in Afghanistan

    DEFF Research Database (Denmark)

    Azizi, Sameer

    CSR is a rising phenomenon in Afghanistan - but why are firms concerned about CSR in a least-developed context such as Afghanistan, and what are the strategic benefits? This paper is one of the first to explore these CSR issues in a least-developed country. It does so by focusing on CSR...... in the Afghan telecommunication sector, and in particular on ‘Roshan’ as a case company. The findings of this paper are twofold. First, it provides an overview of CSR practices in the telecommunication sector in Afghanistan. Second, it focuses on one case and explains whether Roshan can gain strategic...... advantages through CSR in Afghanistan and, if so, which strategic benefits are gained and how. The paper shows that the developmental challenges of Afghanistan are the key explanation for why companies engage in CSR. Roshan has engaged in proactive CSR to overcome the contextual barriers for growth...

  18. Accelerating QDP++ using GPUs

    CERN Document Server

    Winter, Frank

    2011-01-01

    Graphic Processing Units (GPUs) are getting increasingly important as target architectures in scientific High Performance Computing (HPC). NVIDIA established CUDA as a parallel computing architecture controlling and making use of the compute power of GPUs. CUDA provides sufficient support for C++ language elements to enable the Expression Template (ET) technique in the device memory domain. QDP++ is a C++ vector class library suited for quantum field theory which provides vector data types and expressions and forms the basis of the lattice QCD software suite Chroma. In this work, acceleration of QDP++ expression evaluation on a GPU was successfully implemented, leveraging the ET technique and using Just-In-Time (JIT) compilation. The Portable Expression Template Engine (PETE) and the C API for CUDA kernel arguments were used to build the bridge between host and device memory domains. This provides the possibility of accelerating Chroma routines on a GPU that are typically not subject to special optimisation. As an ...

  19. Mapping strategic diversity: strategic thinking from a variety of perspectives

    NARCIS (Netherlands)

    Jacobs, D.

    2010-01-01

    In his influential work, Strategy Safari, Henry Mintzberg and his colleagues presented ten schools of strategic thought. In this impressive book, Dany Jacobs demonstrates that the real world of strategic management is much wider and richer. In Mapping Strategic Diversity, Jacobs distinguishes between...

  20. Strategic environmental assessment

    DEFF Research Database (Denmark)

    Kørnøv, Lone

    1997-01-01

    The integration of environmental considerations into strategic decision making is recognized as a key to achieving sustainability. In the European Union a draft directive on Strategic Environmental Assessment (SEA) is currently being reviewed by the member states. The nature of the proposed SEA...... that the SEA directive will influence the decision-making process positively and will help to promote improved environmental decisions. However, the guidelines for public participation are not sufficient and the democratic element is strongly limited. On the basis of these findings, recommendations relating...

  1. Vacuum Brazing of Accelerator Components

    Science.gov (United States)

    Singh, Rajvir; Pant, K. K.; Lal, Shankar; Yadav, D. P.; Garg, S. R.; Raghuvanshi, V. K.; Mundra, G.

    2012-11-01

    Commonly used materials for accelerator components are those which are vacuum compatible and thermally conductive; stainless steel, aluminum and copper are common among them. Stainless steel is a poor heat conductor and is rarely used where good thermal conductivity is required. Aluminum, copper and their alloys meet both requirements and are frequently used for this purpose. Fabricating accelerator components from aluminum and its alloys by welding is now common practice, and copper and its various grades are mandatory in RF devices required for accelerators. Beam line and front-end components of accelerators are fabricated from stainless steel and OFHC copper. Fabricating copper components by welding is very difficult and in most cases impossible; fabrication and joining in such cases is possible using a brazing process, especially under vacuum or an inert gas atmosphere. Several accelerator components have been vacuum brazed for the Indus projects at Raja Ramanna Centre for Advanced Technology (RRCAT), Indore, using the vacuum brazing facility available there. This paper presents details of the development of these high-value, strategic components/assemblies, including the basics of vacuum brazing, details of the vacuum brazing facility, joint design, fixturing of the jobs, selection of filler alloys, optimization of brazing parameters to obtain high-quality brazed joints, and brief descriptions of vacuum brazed accelerator components.

  2. Macrofoundation for Strategic Technology Management

    DEFF Research Database (Denmark)

    Pedersen, Jørgen Lindgaard

    1995-01-01

    Neoclassical mainstream economics has no perspective on strategic technology management issues. Market failure economics (externalities etc.)can be of some use to analyze problems of relevance in strategic management problems with technology as a part. Environment, inequality and democratic...

  3. Horizontal Accelerator

    Data.gov (United States)

    Federal Laboratory Consortium — The Horizontal Accelerator (HA) Facility is a versatile research tool available for use on projects requiring simulation of the crash environment. The HA Facility is...

  4. A Handbook for Strategic Planning

    Science.gov (United States)

    1994-01-01

    This handbook was written for Department of the Navy (DON) commanding officers, TQL coordinators, and strategic planning facilitators in response to...questions about the strategic planning process and how it should be conducted within the DON. It is not intended to teach the intricacies of strategic...planning, but is provided to answer process questions. While every question cannot be anticipated, the handbook details one way to do strategic...

  5. Strategic Alignment of Business Intelligence

    OpenAIRE

    Cederberg, Niclas

    2010-01-01

    This thesis is about the concept of strategic alignment of business intelligence. It is based on a theoretical foundation that is used to define and explain business intelligence, data warehousing and strategic alignment. By combining a number of different methods for strategic alignment a framework for alignment of business intelligence is suggested. This framework addresses all different aspects of business intelligence identified as relevant for strategic alignment of business intelligence...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  7. COMPUTING

    CERN Document Server

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  8. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  10. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  11. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  12. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  13. Strategic Management of Large Projects

    Institute of Scientific and Technical Information of China (English)

    Wang Yingluo; Liu Yi; Li Yuan

    2004-01-01

    The strategic management of large projects is both theoretically and practically important. Some scholars have advanced flexible strategy theory in China. The difference of strategic flexibility and flexible strategy is pointed out. The supporting system and characteristics of flexible strategy are analyzed. The changes of flexible strategy and integration of strategic management are discussed.

  14. Being Strategic in HE Management

    Science.gov (United States)

    West, Andrew

    2008-01-01

    The call to be strategic--and with it the concept of strategic management--can bring to mind a wide range of definitions, and there is now a huge array of academic literature supporting the different schools of thought. At a basic level, however, strategic thinking is probably most simply about focusing on the whole, rather than the part. In…

  15. Strategic market planning for hospitals.

    Science.gov (United States)

    Zallocco, R L; Joseph, W B; Doremus, H

    1984-01-01

    The application of strategic market planning to hospital management is discussed, along with features of the strategic marketing management process. A portfolio analysis tool, the McKinsey/G.E. Business Screen, is presented and, using a large urban hospital as an example, discussed in detail relative to hospital administration. Finally, strategic implications of the portfolio analysis are examined.

  16. The Possibilities of Strategic Finance

    Science.gov (United States)

    Chaffee, Ellen

    2010-01-01

    Strategic finance is aligning financial decisions--regarding revenues, creating and maintaining institutional assets, and using those assets--with the institution's mission and strategic plan. The concept known as "strategic finance" increasingly is being seen as a useful perspective for helping boards and presidents develop a sustainable…

  17. Strategic Planning and Financial Management

    Science.gov (United States)

    Conneely, James F.

    2010-01-01

    Strong financial management is a strategy for strategic planning success in student affairs. It is crucial that student affairs professionals understand the necessity of linking their strategic planning with their financial management processes. An effective strategic planner needs strong financial management skills to implement the plan over…

  18. Towards Strategic Language Learning

    NARCIS (Netherlands)

    Oostdam, R.; Rijlaarsdam, Gert

    1995-01-01

    Towards Strategic Language Learning is the result of extensive research in the relationship between mother tongue education and foreign language learning. As language skills that are taught during native language lessons are applied in foreign language performance as well, it is vital that curricula

  19. A Strategic Planning Workbook.

    Science.gov (United States)

    Austin, William

    This workbook outlines the Salem Community College's (New Jersey) Strategic Planning Initiative (SPI), which will enable the college to enter the 21st Century as an active agent in the educational advancement of the Salem community. SPI will allow college faculty, staff, students, and the local community to reflect on the vitality of the college…

  20. The strategic research positioning:

    DEFF Research Database (Denmark)

    Viala, Eva Silberschmidt

    to provide new insights into ‘immigrant’ parents’ perspective on home/school partnership in Denmark. The majority of the immigrant parents came from non-Western countries, and they had already been ‘labelled’ difficult in terms of home/school partnership. This calls for what I call ‘strategic research...

  1. The Strategic Resources

    Institute of Scientific and Technical Information of China (English)

    Liu Zhiyang

    2011-01-01

    “The reason I pay close attention to and am very concerned about standards is that from my point of view standards are very important resources or even strategic resources. The meteorological work is highly professional and requires standards in every aspect. With disjoint standards, businesses, services and scientific researches cannot be properly done.”

  2. Strategic Tutor Monitoring.

    Science.gov (United States)

    Chee-kwong, Kenneth Chao

    1996-01-01

    Discusses effective tutor monitoring strategies based on experiences at the Open Learning Institute of Hong Kong. Highlights include key performance and strategic control points; situational factors, including tutor expectations and relevant culture; Theory X versus Theory Y leadership theories; and monitoring relationships with tutors. (LRW)

  3. Adaptive Airport Strategic Planning

    NARCIS (Netherlands)

    Kwakkel, J.H.; Walker, W.E.; Marchau, V.A.W.J.

    2010-01-01

    Airport Strategic Planning (ASP) focuses on the development of plans for the long-term development of an airport. The dominant approach for ASP is Airport Master Planning (AMP). The goal of AMP is to provide a detailed blueprint for how the airport should look in the future, and how it can get there

  4. Strategic Targeted Advertising

    NARCIS (Netherlands)

    A. Galeotti; J.L. Moraga-Gonzalez (José Luis)

    2003-01-01

    textabstractWe present a strategic game of pricing and targeted-advertising. Firms can simultaneously target price advertisements to different groups of customers, or to the entire market. Pure strategy equilibria do not exist and thus market segmentation cannot occur surely. Equilibria exhibit rand

  5. Strategic planning for marketers.

    Science.gov (United States)

    Wilson, I

    1978-12-01

    The merits of strategic planning as a marketing tool are discussed in this article which takes the view that although marketers claim to be future-oriented, they focus too little attention on long-term planning and forecasting. Strategic planning, as defined by these authors, usually encompasses periods of between five and twenty-five years and places less emphasis on the past as an absolute predictor of the future. It takes a more probabilistic view of the future than conventional marketing strategy and looks at the corporation as but one component interacting with the total environment. Inputs are examined in terms of environmental, social, political, technological and economic importance. Because of its futuristic orientation, an important tenet of strategic planning is the preparation of several alternative scenarios ranging from most to least likely. By planning for a wide range of future market conditions, a corporation is more able to be flexible by anticipating the course of future events, and is less likely to become a captive reactor--as the authors believe is now the case. An example of strategic planning at General Electric is cited.

  6. Strategic Leadership Development Model

    Science.gov (United States)

    2012-03-19

    system in vogue is relatively streamlined and ensures better grooming of potential strategic leaders at varying stages of their career; however, it...

  7. Strategic Marketing for Agribusiness.

    Science.gov (United States)

    Welch, Mary A., Ed.

    1993-01-01

    The steps for strategic market planning are discussed including: (1) assessing the situation with market conditions, customers, competitors, and your firm; and (2) crafting a strategy to prioritize target markets, develop a core strategy, and create a marketing mix. Examples of agribusiness successes are presented. The booklet concludes with a…

  8. EMSL Strategic Plan 2008

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Allison A.

    2008-08-15

    This Strategic Plan is EMSL’s template for achieving our vision of simultaneous excellence in all aspects of our mission as a national scientific user facility. It reflects our understanding of the long-term stewardship we must work toward to meet the scientific challenges of the Department of Energy and the nation. During the next decade, we will implement the strategies contained in this Plan, working closely with the scientific community, our advisory committees, DOE’s Office of Biological and Environmental Research, and other key stakeholders. This Strategic Plan is fully aligned with the strategic plans of DOE and its Office of Science. We recognize that shifts in science and technology, national priorities, and resources made available through the Federal budget process create planning uncertainties and, ultimately, a highly dynamic planning environment. Accordingly, this Strategic Plan should be viewed as a living document for which we will continually evaluate changing needs and opportunities posed by our stakeholders (i.e., DOE, users, staff, advisory committees), work closely with them to understand and respond to those changes, and align our strategy accordingly.

  9. TACITUS: Text Understanding for Strategic Computing

    Science.gov (United States)

    1990-11-01

    problems of suprasegmental phonology will be left for another paper. 3 Backwards Rules I shall start by making explicit what it means to apply a... suprasegmental issues like stress. The goal of this paper is to contrast two different ways of doing segmental phonology. Both would presumably benefit

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  11. Strategic Management and Business Analysis

    CERN Document Server

    Williamson, David; Jenkins, Wyn; Moreton, Keith Michael

    2003-01-01

    Strategic Business Analysis shows students how to carry out a strategic analysis of a business, with clear guidelines on where and how to apply the core strategic techniques and models that are the integral tools of strategic management. The authors identify the key questions in strategic analysis and provide an understandable framework for answering these questions. Several case studies are used to focus understanding and enable a more thorough analysis of the concepts and issues, especially useful for students involved with case study analysis. Accompanying the text is a CD-Rom containing the m...

  12. Strategic Planning in U.S. Municipalities

    Directory of Open Access Journals (Sweden)

    James VAN RAVENSWAY

    2015-12-01

    Strategic planning started in the U.S. as a corporate planning endeavor. By the 1960’s, it had become a major corporate management tool in the Fortune 500. At first, it was seen as a way of interweaving policies, values and purposes with management, resources and market information in a way that held the organization together. By the 1950’s, the concept was simplified somewhat to focus on SWOT as a way of keeping the corporation afloat in a more turbulent world. The public sector has been under pressure for a long time to become more efficient, effective and responsive. Many have felt that the adoption of business practices would help to accomplish that. One tool borrowed from business has been strategic planning. At the local government level, strategic planning became popular starting in the 1980’s, and the community’s planning office was called on to lead the endeavor. The planning office was often the advocate of the process. Urban planning offices had been doing long-range plans for decades, but with accelerating urban change a more rapid action-oriented response was desired. The paper describes this history and process in East Lansing, Michigan, U.S., where comprehensive community plans are the result of a multi-year visioning process and call for action-oriented strategies for targeted parts of the community.

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  14. Strategizing Communication. Theory and Practice

    DEFF Research Database (Denmark)

    Gulbrandsen, Ib Tunby; Just, Sine Nørholm

    Strategizing Communication offers a unique perspective on the theory and practice of strategic communication. Written for students and practitioners interested in learning about and acquiring tools for dealing with the technological, environmental and managerial challenges, which organizations face...... when communicating in today’s mediascape, this book presents an array of theories, concepts and models through which we can understand and practice communication strategically. The core of the argument is in the title: strategizing – meaning the act of making something strategic. This entails looking...... beyond, but not past instrumental, rational plans in order to become better able to understand and manage the concrete, incremental practices and contexts in which communication becomes strategic. Thus, we argue that although strategic communicators do (and should) make plans, a plan in itself does...

  15. Strategic port development

    DEFF Research Database (Denmark)

    Olesen, Peter Bjerg; Dukovska-Popovska, Iskra; Hvolby, Hans-Henrik

    2014-01-01

    developments. This paper examines a series of models from the port development literature and then proposes an approach for conceptualizing the strategic development of a port’s collaboration with local operators and the local hinterland based on connected development steps. The paper is based on a literature...... review relevant to international port development and a case study done in a Danish port as part of the main author’s PhD project. The proposed model provides a strategic approach to control and improve the development of a port system and the connected hinterland. While the model is generic in its......While large global ports are recognised as playing a central role in many supply chains as logistic gateways, smaller regional ports have been more stagnant and have not reached the same level of development as the larger ports. The research literature in relation to port development is also...

  16. The Strategic Mediator

    DEFF Research Database (Denmark)

    Rossignoli, Cecilia; Carugati, Andrea; Mola, Lapo

    2009-01-01

    The last 10 years have witnessed the emergence of electronic marketplaces as players that leverage new technologies to facilitate B2B internet-mediated collaborative business. Nowadays these players are augmenting their services from simple intermediation to include new inter-organizational relat......The last 10 years have witnessed the emergence of electronic marketplaces as players that leverage new technologies to facilitate B2B internet-mediated collaborative business. Nowadays these players are augmenting their services from simple intermediation to include new inter......-marketplace assumes the paradoxical role of strategic mediator: an agent who upholds and heightens the fences of the transactions instead of leveling them. The results have implication in shaping how we see the role of technology as strategic or commoditized....

  17. Tourism and Strategic Planning

    DEFF Research Database (Denmark)

    Pasgaard, Jens Christian

    2012-01-01

    The main purpose of this report is to explore and unfold the complexity of the tourism phenomenon in order to qualify the general discussion of tourism-related planning challenges. Throughout the report I aim to demonstrate the strategic potential of tourism in a wider sense and more specifically...... the potential of ‘the extraordinary’ tourism-dominated space. As highlighted in the introduction, this report does not present any systematic analysis of strategic planning processes; neither does it provide any unequivocal conclusions. Rather, the report presents a collection of so-called ‘detours...... as a pilot project which can inform future studies seeking to address the tourism phenomenon from a spatial perspective....

  18. Accelerated Unification

    OpenAIRE

    Arkani-Hamed, Nima; Cohen, Andrew; Georgi, Howard

    2001-01-01

    We construct four dimensional gauge theories in which the successful supersymmetric unification of gauge couplings is preserved but accelerated by N-fold replication of the MSSM gauge and Higgs structure. This results in a low unification scale of $10^{13/N}$ TeV.

  19. Strategic Appraisal 1996.

    Science.gov (United States)

    1996-01-01

    the Zapatista National Liberation Front (EZLN) insurgency and the discovery of oil in the Lacandon jungle, the southern state of Chiapas—largely...populated by indigenous peoples—has increased in strategic importance. Talks between the military and the EZLN appear to be deadlocked, and while the...and support. Strong networks of nongovernmental organizations have grown around EZLN-related issues, often with the effect of keeping the Mexican...

  20. Strategic Leadership towards Sustainability

    OpenAIRE

    Robèrt, Karl-Henrik; Broman, Göran; Waldron, David; Ny, Henrik; Byggeth, Sophie; Cook, David; Johansson, Lena; Oldmark, Jonas; Basile, George; Haraldsson, Hördur V.

    2004-01-01

    The Master's programme named "Strategic Leadership Towards Sustainability" is offered at the Blekinge Institute of Technology (Blekinge Tekniska Högskola) in Karlskrona, Sweden. This Master's programme builds on four central themes: (1) four scientific principles for socio-ecological sustainability; (2) a planning methodology of "backcasting" based on those scientific principles for sustainability; (3) a five-level model for planning in complex systems, into which backcasting is incorporated ...

  1. Naming as Strategic Communication

    DEFF Research Database (Denmark)

    Schmeltz, Line; Kjeldsen, Anna Karina

    2016-01-01

    This article presents a framework for understanding corporate name change as strategic communication. From a corporate branding perspective, the choice of a new name can be seen as a wish to stand out from a group of similar organizations. Conversely, from an institutional perspective, name change....... Second, it offers practical support to organizations, private as well as public, who find themselves in a situation where changing the name of the organization could be a way to reach either communicative or organizational goals....

  2. Strategic Human Resources Management

    OpenAIRE

    Marta Muqaj

    2016-01-01

Strategic Human Resources Management (SHRM) represents an important and sensitive aspect of the functioning and development of a company, business, institution, state, or public or private agency of a country. SHRM is based on psychological practices, especially investing in empowerment, broad training and teamwork. In this way it remains the primary resource for maintaining stability and competitiveness. SHRM has lately evolved in fast and secure steps, and the transformation...

  3. Strategic Transfer Pricing

    OpenAIRE

    Michael Alles; Srikant Datar

    1998-01-01

    Most research into cost systems has focused on their motivational implications. This paper takes a different approach, by developing a model where two oligopolistic firms strategically select their cost-based transfer prices. Duopoly models frequently assume that firms game on their choice of prices. Product prices, however, are ultimately based on the firms' transfer prices that communicate manufacturing costs to marketing departments. It is for this reason that transfer prices will have a s...

  4. Thinking strategically about assessment

    OpenAIRE

    Mutch, A

    2002-01-01

    Drawing upon the literature on strategy formulation in organisations, this paper argues for a focus on strategy as process. It relates this to the need to think strategically about assessment, a need engendered by resource pressures, developments in learning and the demands of external stakeholders. It is argued that in practice assessment strategies are often formed at the level of practice, but that this produces contradiction and confusion at higher levels. Such tensions cannot be managed ...

  5. Making Strategic Analysis Matter

    Science.gov (United States)

    2012-01-01

way, so intelligence can help them ask, for instance, “Does Plan Colombia offer insights for Afghanistan?” If the analogy is to be useful, it...Intelligence Agency DNI Director of National Intelligence EADS European Aeronautic Defence and Space Company HIV human immunodeficiency virus NATO...can be a task for strategic analysis. For instance, does Plan Colombia offer any insights by analogy for counterinsurgency in Afghanistan? If

  6. Strategic Team AI Path Plans: Probabilistic Pathfinding

    Directory of Open Access Journals (Sweden)

    Tng C. H. John

    2008-01-01

Full Text Available This paper proposes a novel method to generate strategic team AI pathfinding plans for computer games and simulations using probabilistic pathfinding. The method is inspired by genetic algorithms (Russell and Norvig, 2002) in that a fitness function is used to test the quality of the path plans. The method generates high-quality path plans by eliminating the low-quality ones. The path plans are generated by probabilistic pathfinding, and the elimination is done by a fitness test of the path plans. This path plan generation method can generate varied high-quality paths, which is desirable for games to increase replay value. This work is an extension of our earlier work on team AI: probabilistic pathfinding (John et al., 2006). We explore ways to combine probabilistic pathfinding and genetic algorithms to create a new method for generating strategic team AI pathfinding plans.
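The generate-and-filter loop described above can be sketched as follows. This is a minimal illustration under our own assumptions (a 4-connected grid, a goal-biased random walk standing in for probabilistic pathfinding, and path length as the fitness criterion); it is not the authors' implementation.

```python
import random

# Sketch of the generate-and-filter idea: candidate paths are produced by a
# randomized walk toward the goal (a stand-in for probabilistic pathfinding),
# and a fitness test eliminates the low-quality plans. Grid, walk rules, and
# fitness are illustrative assumptions, not the paper's actual method.

def random_path(start, goal, rng, max_steps=200):
    """Random walk on a 4-connected grid, biased toward the goal."""
    x, y = start
    path = [(x, y)]
    for _ in range(max_steps):
        if (x, y) == goal:
            return path
        moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
        # Prefer the move that most reduces Manhattan distance to the goal.
        moves.sort(key=lambda m: abs(x + m[0] - goal[0]) + abs(y + m[1] - goal[1]))
        dx, dy = moves[0] if rng.random() < 0.7 else rng.choice(moves)
        x, y = x + dx, y + dy
        path.append((x, y))
    return None  # walk failed to reach the goal within max_steps

def fitness(path):
    """Assumed fitness criterion: shorter plans score higher."""
    return -len(path)

rng = random.Random(0)
candidates = [p for p in (random_path((0, 0), (5, 5), rng) for _ in range(50)) if p]
survivors = sorted(candidates, key=fitness, reverse=True)[:10]
```

Because the walk is randomized, repeated runs with different seeds yield different surviving plans, which is the variation the paper exploits to increase replay value.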

  7. THE MODELS OF STRATEGIC MANAGEMENT OF INFOCOMM BUSINESS

    Directory of Open Access Journals (Sweden)

    M. A. Lyashenko

    2015-01-01

From the analysis of information and communication business made in the article, one general idea for forming a strategy of managing infocommunication business was selected, which consists in full recognition of the inevitability of globalization processes in the modern world amid the accelerated development of information technologies. In these conditions, companies use such strategic means of competition as increasing productivity, mastering new markets, creating new business models, and attracting talent on a global scale.

  8. Particle Accelerators in China

    Science.gov (United States)

    Zhang, Chuang; Fang, Shouxian

    As the special machines that can accelerate charged particle beams to high energy by using electromagnetic fields, particle accelerators have been widely applied in scientific research and various areas of society. The development of particle accelerators in China started in the early 1950s. After a brief review of the history of accelerators, this article describes in the following sections: particle colliders, heavy-ion accelerators, high-intensity proton accelerators, accelerator-based light sources, pulsed power accelerators, small scale accelerators, accelerators for applications, accelerator technology development and advanced accelerator concepts. The prospects of particle accelerators in China are also presented.

  9. MUON ACCELERATION

    Energy Technology Data Exchange (ETDEWEB)

    BERG,S.J.

    2003-11-18

    One of the major motivations driving recent interest in FFAGs is their use for the cost-effective acceleration of muons. This paper summarizes the progress in this area that was achieved leading up to and at the FFAG workshop at KEK from July 7-12, 2003. Much of the relevant background and references are also given here, to give a context to the progress we have made.

  10. 7 CFR 25.202 - Strategic plan.

    Science.gov (United States)

    2010-01-01

... 7 CFR 25.202 Strategic plan. (a) Principles of strategic plan. The strategic plan included in the application must be developed in accordance with the following four key principles: (1) Strategic vision...

  11. SYSTEM REFLEXIVE STRATEGIC MARKETING MANAGEMENT

    Directory of Open Access Journals (Sweden)

    A. Dligach

    2013-10-01

Full Text Available This article reviews the System Reflexive paradigm of strategic marketing management, based on the alignment of the strategic economic interests of stakeholders, specifically enterprise owners and hired managers, and consumers. The marketing concept of management comes under review, along with strategic management approaches to business and the buildup and alignment of the economic interests of business stakeholders. A roadmap for resolving the problems of modern marketing is proposed through the adoption of System Reflexive marketing theory.

  12. Laser acceleration

    Science.gov (United States)

    Tajima, T.; Nakajima, K.; Mourou, G.

    2017-02-01

    The fundamental idea of Laser Wakefield Acceleration (LWFA) is reviewed. An ultrafast intense laser pulse drives coherent wakefield with a relativistic amplitude robustly supported by the plasma. While the large amplitude of wakefields involves collective resonant oscillations of the eigenmode of the entire plasma electrons, the wake phase velocity ˜ c and ultrafastness of the laser pulse introduce the wake stability and rigidity. A large number of worldwide experiments show a rapid progress of this concept realization toward both the high-energy accelerator prospect and broad applications. The strong interest in this has been spurring and stimulating novel laser technologies, including the Chirped Pulse Amplification, the Thin Film Compression, the Coherent Amplification Network, and the Relativistic Mirror Compression. These in turn have created a conglomerate of novel science and technology with LWFA to form a new genre of high field science with many parameters of merit in this field increasing exponentially lately. This science has triggered a number of worldwide research centers and initiatives. Associated physics of ion acceleration, X-ray generation, and astrophysical processes of ultrahigh energy cosmic rays are reviewed. Applications such as X-ray free electron laser, cancer therapy, and radioisotope production etc. are considered. A new avenue of LWFA using nanomaterials is also emerging.

  13. STRATEGIC COMMUNICATION IN MULTINATIONAL COMPANIES

    Directory of Open Access Journals (Sweden)

    Alexandrina Cristina VASILE

    2014-11-01

Full Text Available The article shows how multinational companies gain market share and visibility by using appropriate strategic communication. The study evaluates the base framework, analysis, tools, data sources, improvement plans, and results that some multinational companies obtain by using strategic communication. The analysed companies are mainly American-based communications corporations, and the importance of communication in the current economic environment is underlined. The results show how important strategic communication is, alongside the information used and strategic management, in targeting a position in the market.

  14. Bucharest heavy ion accelerator facility

    Energy Technology Data Exchange (ETDEWEB)

    Ceausescu, V.; Dobrescu, S.; Duma, M.; Indreas, G.; Ivascu, M.; Papureanu, S.; Pascovici, G.; Semenescu, G.

    1986-02-15

    The heavy ion accelerator facility of the Heavy Ion Physics Department at the Institute of Physics and Nuclear Engineering in Bucharest is described. The Tandem accelerator development and the operation of the first stage of the heavy ion postaccelerating system are discussed. Details are given concerning the resonance cavities, the pulsing system matching the dc beam to the RF cavities and the computer control system.

  15. Strategic Human Resources Management

    Directory of Open Access Journals (Sweden)

    Marta Muqaj

    2016-07-01

Full Text Available Strategic Human Resources Management (SHRM) represents an important and sensitive aspect of the functioning and development of a company, business, institution, state, or public or private agency of a country. SHRM is based on psychological practices, especially investing in empowerment, broad training and teamwork. In this way it remains the primary resource for maintaining stability and competitiveness. SHRM has lately evolved in fast and secure steps, and the transformation from Management of Human Resources to SHRM is becoming popular, but it still remains impossible to estimate exactly how far SHRM has gone in updating the practices of HRM in organizations and institutions in general. This manuscript aims to reflect on strategic management and the factors influencing its practice in some organizations. The researchers aim to identify influential factors that play key roles in SHRM, and to determine the challenges and priorities which lie ahead, in order to select the most appropriate model for achieving desirable performance. SHRM is a key factor in the achievement of the objectives of the organization, based on HR through continuous performance growth; it is a complex, unpredictable process influenced by many outside and inside factors, which aims to find the shortest way to achieve strategic competitive advantage by creating structure, planning, organizing, thinking, values, culture, communication, perspectives and the image of the organization. While traditional management of HR is focused on the individual performance of employees, the scientific one is based on organizational performance and the role of the HRM system as the main factor in solving business issues and achieving competitive advantage within its kind.

  16. Strategic planning and republicanism

    Directory of Open Access Journals (Sweden)

    Mazza Luigi

    2010-01-01

Full Text Available The paper develops two main linked themes: (i) strategic planning reveals in practice limits that are hard to overcome; (ii) a complete planning system is efficacious only in the framework of a republican political, social and government culture. It is argued that the growing disappointment associated with strategic planning practices may be due to excessive expectations, and the difficulties encountered by strategic planning are traced to three main issues: (a) the relationship between politics and planning; (b) the relationship between government and governance; and (c) the relationship between space and socioeconomic development. Some authors have recently supported an idea of development as consisting in the qualitative evolution of forms of social rationality and argued that reflection on the relationships between physical transformations and visions of development could be a way of testing innovations. But such strong demands might be satisfied only if we manage to make a 'new social and territorial pact for development', recreating a social fabric imbued with shared values. The re-creation of a social fabric imbued with shared values requires a rich conception of the political community and the possibility that the moral purposes of the community may be incorporated by the state. All this is missing today. Outside a republican scheme, planning activities are principally instruments for legitimising vested interests and facilitating their investments, and the resolution of the conflicts that arise between the planning decisions of the various levels of government becomes all but impracticable. A complete planning system can be practised only if it can refer to the authority and syntheses expressed in and by statehood, which suggests that in a democratic system planning is republican by necessity rather than by choice.

  17. Strategic performance management evaluation for the Navy's SPLICE local area networks

    OpenAIRE

    Blankenship, David D.

    1985-01-01

Approved for public release; distribution is unlimited This thesis investigates those aspects of network performance evaluation thought to pertain specifically to strategic performance management evaluation of the Navy's Stock Point Logistics Integrated Communications Environment (SPLICE) local area networks at stock point and inventory control point sites. Background is provided concerning the SPLICE Project, strategic management, computer performance evaluation tools...

  18. Beyond Strategic Vision

    CERN Document Server

    Cowley, Michael

    2012-01-01

Hoshin is a system which was developed in Japan in the 1960s, and is a derivative of Management By Objectives (MBO). It is a Management System for determining the appropriate course of action for an organization, and effectively accomplishing the relevant actions and results. Having recognized the power of this system, Beyond Strategic Vision tailors the Hoshin system to fit the culture of North American and European organizations. It is a "how-to" guide to the Hoshin method for executives, managers, and any other professionals who must plan as part of their normal job. The management of an o

  19. The unfocused strategic vision.

    Science.gov (United States)

    Friedman, L H

    1997-01-01

    Integrated delivery systems are often seen as the answer to the question of how to deliver high quality health services to a defined population at the lowest possible cost. This case examines the birth, growth, and ultimate demise of one such system. At first glance, all of the elements necessary for a successful integration were present including visionary leadership and a well defined strategic plan. However, the senior managers did not foresee the problems that would result from a clash of organizational cultures, significant mistrust between and among staff and physicians, and inability to manage the emotional-cognitive landscape.

  20. Strategizing and leadership

    OpenAIRE

    Marín Tuyá, Belén

    2013-01-01

The development of strategizing, a concept introduced by Whittington (1996) that approaches strategy in practice "as something that people do", arose from growing dissatisfaction with conventional strategy research. Thus, while people were carrying out strategy, theories centred on multivariate analyses of the effects of strategy on organizational performance, with a curious absence of the human actors. With the aim of advancing the...

  1. Strategic port development

    DEFF Research Database (Denmark)

    Olesen, Peter Bjerg; Dukovska-Popovska, Iskra; Steger-Jensen, Kenn;

    2012-01-01

This paper proposes a framework for strategic development of a port’s collaboration with its hinterland. The framework is based on literature relevant to port development and takes a market perspective by considering import/export data relevant for the region of interest. The series of steps...... proposed in the framework provides ports with a systematic approach to finding possibilities for new business ventures and increasing integration with the hinterland. The framework is generic in its approach. A case study illustrates possible usage of the framework in terms of hinterland development....

  2. Guam Strategic Energy Plan

    Energy Technology Data Exchange (ETDEWEB)

    Conrad, M. D.

    2013-07-01

Describes various energy strategies available to Guam to meet the territory's goal of diversifying fuel sources and reducing fossil energy consumption 20% by 2020. The information presented in this strategic energy plan will be used by the Guam Energy Task Force to develop an energy action plan. Available energy strategies include policy changes, education and outreach, reducing energy consumption at federal facilities, and expanding the use of a range of energy technologies, including buildings energy efficiency and conservation, renewable electricity production, and alternative transportation. The strategies are categorized based on the time required to implement them.

  3. Accelerated Profile HMM Searches.

    Directory of Open Access Journals (Sweden)

    Sean R Eddy

    2011-10-01

Full Text Available Profile hidden Markov models (profile HMMs) and probabilistic inference methods have made important contributions to the theory of sequence database homology search. However, practical use of profile HMM methods has been hindered by the computational expense of existing software implementations. Here I describe an acceleration heuristic for profile HMMs, the "multiple segment Viterbi" (MSV) algorithm. The MSV algorithm computes an optimal sum of multiple ungapped local alignment segments using a striped vector-parallel approach previously described for fast Smith/Waterman alignment. MSV scores follow the same statistical distribution as gapped optimal local alignment scores, allowing rapid evaluation of significance of an MSV score and thus facilitating its use as a heuristic filter. I also describe a 20-fold acceleration of the standard profile HMM Forward/Backward algorithms using a method I call "sparse rescaling". These methods are assembled in a pipeline in which high-scoring MSV hits are passed on for reanalysis with the full HMM Forward/Backward algorithm. This accelerated pipeline is implemented in the freely available HMMER3 software package. Performance benchmarks show that the use of the heuristic MSV filter sacrifices negligible sensitivity compared to unaccelerated profile HMM searches. HMMER3 is substantially more sensitive and 100- to 1000-fold faster than HMMER2. HMMER3 is now about as fast as BLAST for protein searches.
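The filter-then-reanalyze structure of the pipeline can be illustrated with a toy sketch. The scoring functions below are deliberate stand-ins (a best ungapped match count instead of MSV, and a placeholder for Forward/Backward); only the two-stage control flow reflects the abstract, nothing here is HMMER3 code.

```python
# Toy two-stage search pipeline: a cheap filter score gates which targets
# reach the expensive full comparison. All scoring here is a stand-in.

def filter_score(query: str, target: str) -> int:
    """Best ungapped match count over all offsets (cheap heuristic filter)."""
    best = 0
    for offset in range(-len(query) + 1, len(target)):
        matches = sum(
            1
            for i, q in enumerate(query)
            if 0 <= i + offset < len(target) and target[i + offset] == q
        )
        best = max(best, matches)
    return best

def full_score(query: str, target: str) -> int:
    """Placeholder for the expensive full algorithm (assumed)."""
    return filter_score(query, target)  # stand-in only

def search(query, database, threshold=3):
    # Stage 1: fast filter eliminates most targets.
    survivors = [t for t in database if filter_score(query, t) >= threshold]
    # Stage 2: full scoring of survivors only, best hits first.
    return sorted(survivors, key=lambda t: full_score(query, t), reverse=True)

hits = search("ACDEF", ["ACDEF", "ACDXX", "QQQQQ"])
```

The speedup of such a pipeline comes from the filter rejecting the vast majority of database sequences before the expensive stage runs.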

  4. Accelerators and the Accelerator Community

    Energy Technology Data Exchange (ETDEWEB)

    Malamud, Ernest; Sessler, Andrew

    2008-06-01

    In this paper, standing back--looking from afar--and adopting a historical perspective, the field of accelerator science is examined. How it grew, what are the forces that made it what it is, where it is now, and what it is likely to be in the future are the subjects explored. Clearly, a great deal of personal opinion is invoked in this process.

  5. Application of local area networks to accelerator control systems at the Stanford Linear Accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Fox, J.D.; Linstadt, E.; Melen, R.

    1983-03-01

    The history and current status of SLAC's SDLC networks for distributed accelerator control systems are discussed. These local area networks have been used for instrumentation and control of the linear accelerator. Network topologies, protocols, physical links, and logical interconnections are discussed for specific applications in distributed data acquisition and control system, computer networks and accelerator operations.

  6. accelerating cavity

    CERN Multimedia

On the inside of the cavity there is a layer of niobium. Operating at 4.2 degrees above absolute zero, the niobium is superconducting and carries an accelerating field of 6 million volts per metre with negligible losses. Each cavity has a surface of 6 m2. The niobium layer is only 1.2 microns thick, ten times thinner than a hair. Such a large area had never been coated to such a high accuracy. A speck of dust could ruin the performance of the whole cavity so the work had to be done in an extremely clean environment.

  7. Impact accelerations

    Science.gov (United States)

    Vongierke, H. E.; Brinkley, J. W.

    1975-01-01

    The degree to which impact acceleration is an important factor in space flight environments depends primarily upon the technology of capsule landing deceleration and the weight permissible for the associated hardware: parachutes or deceleration rockets, inflatable air bags, or other impact attenuation systems. The problem most specific to space medicine is the potential change of impact tolerance due to reduced bone mass and muscle strength caused by prolonged weightlessness and physical inactivity. Impact hazards, tolerance limits, and human impact tolerance related to space missions are described.

  8. Operationalizing strategic marketing.

    Science.gov (United States)

    Chambers, S B

    1989-05-01

    The strategic marketing process, like any administrative practice, is far simpler to conceptualize than operationalize within an organization. It is for this reason that this chapter focused on providing practical techniques and strategies for implementing the strategic marketing process. First and foremost, the marketing effort needs to be marketed to the various publics of the organization. This chapter advocated the need to organize the marketing analysis into organizational, competitive, and market phases, and it provided examples of possible designs of the phases. The importance and techniques for exhausting secondary data sources and conducting efficient primary data collection methods were explained and illustrated. Strategies for determining marketing opportunities and threats, as well as segmenting markets, were detailed. The chapter provided techniques for developing marketing strategies, including considering the five patterns of coverage available; determining competitor's position and the marketing mix; examining the stage of the product life cycle; and employing a consumer decision model. The importance of developing explicit objectives, goals, and detailed action plans was emphasized. Finally, helpful hints for operationalizing the communication variable and evaluating marketing programs were provided.

  9. Strategic Human Resource Development. Symposium.

    Science.gov (United States)

    2002

    This document contains three papers on strategic human resource (HR) development. "Strategic HR Orientation and Firm Performance in India" (Kuldeep Singh) reports findings from a study of Indian business executives that suggests there is a positive link between HR policies and practices and workforce motivation and loyalty and…

  10. Strategic Planning for Higher Education.

    Science.gov (United States)

    Kotler, Philip; Murphy, Patrick E.

    1981-01-01

    The framework necessary for achieving a strategic planning posture in higher education is outlined. The most important benefit of strategic planning for higher education decision makers is that it forces them to undertake a more market-oriented and systematic approach to long- range planning. (Author/MLW)

  11. Transfers, Contracts and Strategic Games

    NARCIS (Netherlands)

    Kleppe, J.; Hendrickx, R.L.P.; Borm, P.E.M.; Garcia-Jurado, I.; Fiestras-Janeiro, G.

    2007-01-01

This paper analyses the role of transfer payments and strategic contracting within two-person strategic form games with monetary payoffs. First, it introduces the notion of transfer equilibrium as a strategy combination for which individual stability can be supported by allowing the possibilit

  12. Strategic Interactions in Franchise Relationships

    NARCIS (Netherlands)

    Croonen, Evelien Petronella Maria

    2006-01-01

    This dissertation deals with understanding strategic interactions between franchisors and franchisees. The empirical part of this study consists of in-depth case studies in four franchise systems in the Dutch drugstore industry. The case studies focus on a total of eight strategic change processes i

  13. Strategic directions in tissue engineering.

    NARCIS (Netherlands)

    Johnson, P.C.; Mikos, A.G.; Fisher, J.P.; Jansen, J.A.

    2007-01-01

    The field of tissue engineering is developing rapidly. Given its ultimate importance to clinical care, the time is appropriate to assess the field's strategic directions to optimize research and development activities. To characterize strategic directions in tissue engineering, a distant but reachab

  14. NASA Space Sciences Strategic Planning

    Science.gov (United States)

    Crane, Philippe

    2004-01-01

The purpose of the strategic planning roadmap is to: fulfill the strategic planning requirements; provide a guide to the science community in presenting research requests to NASA; inform and inspire; focus investments in technology and research for future missions; and provide the scientific and technical justification for augmentation requests.

  15. Strategic Planning Is an Oxymoron

    Science.gov (United States)

    Bassett, Patrick F.

    2012-01-01

    The thinking on "strategic thinking" has evolved significantly over the years. In the previous century, the independent school strategy was to focus on long-range planning, blithely projecting 10 years into the future. For decades this worked well enough, but in the late 20th century, independent schools shifted to "strategic planning," with its…

  16. IS and Business Leaders' Strategizing

    DEFF Research Database (Denmark)

    Hansen, Anne Mette

, and productivity. However, strategizing in such dynamic environments is not a straightforward process. While IS and business leaders must develop new IS strategic objectives and move quickly towards new opportunities, they must also be good at exploiting the value of current assets and reducing the costs of existing...

  17. Strategic Aspects of Cost Management

    Directory of Open Access Journals (Sweden)

    Angelika I. Petrova

    2013-01-01

Full Text Available This report is a summary of research done in the area of Strategic Cost Management (SCM). It includes a detailed discussion and application of Life Cycle Costing (LCC), which a company can use to achieve its strategic objectives in today's dynamic business environment. Hence, the main focus of this report is on LCC as mentioned

  18. Tax rates as strategic substitutes

    NARCIS (Netherlands)

    H. Vrijburg (Hendrik); R.A. de Mooij (Ruud)

    2016-01-01

This paper analytically derives conditions under which the slope of the tax-reaction function is negative in a classical tax competition model. If countries maximize welfare, a negative slope (reflecting strategic substitutability) occurs under relatively mild conditions. The strategic t

  19. Energy Innovation Acceleration Program

    Energy Technology Data Exchange (ETDEWEB)

    Wolfson, Johanna [Fraunhofer USA Inc., Center for Sustainable Energy Systems, Boston, MA (United States)

    2015-06-15

The Energy Innovation Acceleration Program (IAP) – also called U-Launch – has had a significant impact on early stage clean energy companies in the Northeast and on the clean energy economy in the Northeast, not only during program execution (2010-2014), but continuing into the future. Key results include: Leverage ratio of 105:1; $105M in follow-on funding (upon $1M investment by EERE); At least 19 commercial products launched; At least 17 new industry partnerships formed; At least $6.5M in revenue generated; >140 jobs created; 60% of assisted companies received follow-on funding within 1 year of program completion. In addition to the direct measurable program results summarized above, two primary lessons emerged from our work executing Energy IAP: Validation and demonstration awards have an outsized, ‘tipping-point’ effect for startups looking to secure investments and strategic partnerships. An ecosystem approach is valuable, but an approach that evaluates the needs of individual companies and then draws from diverse ecosystem resources to fill them, is most valuable of all.

  20. ABSTRACTS Preliminary Study of Strategic Inner Cores

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

When a strategic entity attempts to make a decision, the project must first be in accordance with its strategic framework, as well as make the strategic inner cores prominent. The existing theories of development strategy indicate that the formation of the framework can be divided into the following parts: inside and outside environments, purpose, goal, key points, and countermeasures. The strategic inner cores put forward by this paper are an intensification and advancement of the theory of the strategic framework; strategic orientation, strategic vision and main line are included. The appearance of these ideas has improved the theory and enhanced strategic practice.

  1. STRATEGIC PLANNING AT SPORTS ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Radovan Ilić

    2013-10-01

Full Text Available The article defines the terminology of strategic planning at sports organizations and puts an accent on its specifics. The first part explains what planning is and what its functions are in strategic management, in order to further shed light on the theoretical terminology of strategic planning and strategic management, as well as to explain the relation between them. In the second part the phases of planning in sports are reviewed as follows: (1) preplanning phase, (2) strategy formulating phase, (3) implementing strategy phase, and (4) evaluation and control of the planned assignments. The last part of the article is dedicated to concluding remarks. The conclusions from research into this complex problem are enumerated with a long-term view of strategic planning in sports organizations.

  2. Massively parallel computational fluid dynamics calculations for aerodynamics and aerothermodynamics applications

    Energy Technology Data Exchange (ETDEWEB)

    Payne, J.L.; Hassan, B.

    1998-09-01

    Massively parallel computers have enabled the analyst to solve complicated flow fields (turbulent, chemically reacting) that were previously intractable. Calculations are presented using a massively parallel CFD code called SACCARA (Sandia Advanced Code for Compressible Aerothermodynamics Research and Analysis) currently under development at Sandia National Laboratories as part of the Department of Energy (DOE) Accelerated Strategic Computing Initiative (ASCI). Computations were made on a generic reentry vehicle in a hypersonic flowfield utilizing three different distributed parallel computers to assess the parallel efficiency of the code with increasing numbers of processors. The parallel efficiencies for the SACCARA code will be presented for cases using 1, 150, 100 and 500 processors. Computations were also made on a subsonic/transonic vehicle using both 236 and 521 processors on a grid containing approximately 14.7 million grid points. Ongoing and future plans to implement a parallel overset grid capability and couple SACCARA with other mechanics codes in a massively parallel environment are discussed.
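For reference, parallel efficiency in scaling studies like the one above is conventionally computed as E(N) = T(1) / (N * T(N)), where T(N) is the wall-clock time on N processors. The sketch below uses made-up timings purely to show the arithmetic; they are not SACCARA measurements.

```python
# Conventional strong-scaling metrics: speedup S(N) = T(1)/T(N) and
# parallel efficiency E(N) = S(N)/N. Timings below are hypothetical.

def speedup(t1: float, tn: float) -> float:
    """Speedup of an N-processor run relative to the serial run."""
    return t1 / tn

def parallel_efficiency(t1: float, tn: float, n: int) -> float:
    """Fraction of ideal linear speedup achieved on n processors."""
    return speedup(t1, tn) / n

t1 = 1000.0                      # hypothetical serial time (s)
timings = {100: 12.5, 500: 4.0}  # hypothetical parallel times (s)
for n, tn in timings.items():
    print(f"{n:4d} procs: efficiency {parallel_efficiency(t1, tn, n):.0%}")
```

An efficiency near 1.0 indicates the code scales almost linearly; communication overhead and load imbalance pull it below that as the processor count grows.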

  3. Multinational Corporation and International Strategic Alliance

    Institute of Scientific and Technical Information of China (English)

    陆兮

    2015-01-01

    The world is now deep into the second great wave of globalization, in which products, capital, and markets are becoming more and more integrated across countries. Multinational corporations (MNCs) are growing rapidly around the globe and playing a significant role in the world economy. Meanwhile, the accelerated rate of globalization has also imposed pressures on MNCs, leaving them desperately seeking overseas alliances in order to remain competitive. International strategic alliances, which bring together large and commonly competitive firms for specific purposes, have gradually shown their importance in the world market, and the form of the international joint venture is now widely adopted. Selecting the right partner, formulating the right strategies, and establishing a harmonious and effective partnership are generally the keys to the success of an alliance.

  4. Strategizing on innovation systems

    DEFF Research Database (Denmark)

    Jofre, Sergio

    This paper explores the strategic context of the implementation of the European Institute of Technology (EIT) from the perspective of National Innovation Systems (NIS) and the Triple Helix of University-Government-Industry relationships. The analytical framework is given by a comparative study ... and the supranational levels. Data is gathered from available literature. Conclusions and discussion: the findings suggest that there are important disparities among NIS, particularly at the level of systemic functions such as knowledge creation, knowledge diffusion, guidance, and human and financial resource ... the role of the university in the innovation system and its co-dependency with and within government and industry. The model supports the hypothesis that universities, governments and industry play an equally important role in innovation and that interdependency and evolution are what define the systemic ...

  5. Trust in Strategic Alliances

    DEFF Research Database (Denmark)

    Nielsen, Bo

    2011-01-01

    This article examines the dynamic and multi-dimensional nature of trust in strategic alliances. Adopting a co-evolutionary approach, I developed a framework to show how trust, conceptualised in different forms, plays distinct roles at various evolutionary stages of the alliance relationship. Emphasising the multi-dimensional and dynamic role of trust, the framework illustrates how initial levels of a particular type of trust may co-evolve with the alliance and influence subsequent phases of the relationship, either on its own or in combination with other types or dimensions of trust. The theoretical distinction between trust as antecedent, moderator and outcome during the evolution of the alliance relationship leads to research questions that may guide future empirical research.

  6. Accelerating abelian gauge dynamics

    CERN Document Server

    Adler, Stephen Louis

    1991-01-01

    In this paper, we suggest a new acceleration method for Abelian gauge theories based on linear transformations to variables which weight all length scales equally. We measure the autocorrelation time for the Polyakov loop and the plaquette at β=1.0 in the U(1) gauge theory in four dimensions, for the new method and for standard Metropolis updates. We find a dramatic improvement for the new method over the Metropolis method. Computing the critical exponent z for the new method remains an important open issue.
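The autocorrelation times compared above are typically computed from a per-sweep series of observable measurements (plaquette or Polyakov-loop values) with a self-consistently truncated window sum. A common estimator, sketched here as an illustration rather than the authors' code:

```python
import numpy as np

def integrated_autocorr_time(series, c=5.0):
    """Integrated autocorrelation time of a Markov-chain observable
    (e.g. per-sweep plaquette or Polyakov-loop measurements), using the
    standard self-consistent truncation window t >= c * tau."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()
    n = len(x)
    # normalised autocorrelation function at lags 0..n-1
    acf = np.correlate(x, x, mode="full")[n - 1:]
    acf /= np.arange(n, 0, -1) * x.var()
    tau = 1.0
    for t in range(1, n):
        tau += 2.0 * acf[t]
        if t >= c * tau:          # stop summing once the window is wide enough
            break
    return tau
```

For uncorrelated measurements the estimate is close to 1; a "dramatic improvement" of the kind reported means the accelerated update's tau is far smaller than the Metropolis one on the same observable.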

  7. Thread Group Multithreading: Accelerating the Computation of an Agent-Based Power System Modeling and Simulation Tool -- GridLAB-D

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Shuangshuang; Chassin, David P.

    2014-01-06

    GridLAB-D is an open-source, next-generation, agent-based smart-grid simulator that provides unprecedented capability to model the performance of smart-grid technologies. Over the past few years, GridLAB-D has been used to conduct important analyses of smart-grid concepts, but it is still quite limited by its computational performance. In order to break through the performance bottleneck and meet the need for large-scale power-grid simulations, we developed a thread-group mechanism to implement highly granular multithreaded computation in GridLAB-D. We achieve close-to-linear speedups with the multithreaded version over the single-threaded version of the same code running on general-purpose multi-core commodity hardware for a benchmark simple house model. The multithreaded code shows favorable scalability and resource utilization, and much shorter execution times for large-scale power-grid simulations.
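GridLAB-D itself is implemented in C/C++, but the thread-group idea, farming the independent object updates of one timestep out to a pool of worker threads, can be sketched in outline. The house-update function below is a hypothetical stand-in, not the GridLAB-D house model:

```python
from concurrent.futures import ThreadPoolExecutor

def simulate_house(params):
    """Stand-in for one per-object update (hypothetical relaxation
    of an indoor temperature toward 20 degrees)."""
    temp, gain = params
    return temp + 0.1 * (20.0 - temp) * gain

def run_timestep(houses, n_threads=4):
    """Update all independent house objects for one timestep using a
    group of worker threads, mirroring the thread-group scheme."""
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        return list(pool.map(simulate_house, houses))
```

Note that in CPython the global interpreter lock limits speedups for pure-Python work; the paper's speedups come from native threads operating on C/C++ objects, where the pattern above does scale.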

  8. FAMOUS, faster: using parallel computing techniques to accelerate the FAMOUS/HadCM3 climate model with a focus on the radiative transfer algorithm

    Directory of Open Access Journals (Sweden)

    P. Hanappe

    2011-09-01

    Full Text Available We have optimised the atmospheric radiation algorithm of the FAMOUS climate model on several hardware platforms. The optimisation involved translating the Fortran code to C and restructuring the algorithm around the computation of a single air column. Instead of the existing MPI-based domain decomposition, we used a task queue and a thread pool to schedule the computation of individual columns on the available processors. Finally, four air columns are packed together in a single data structure and computed simultaneously using Single Instruction Multiple Data operations.

    The modified algorithm runs more than 50 times faster on the CELL's Synergistic Processing Element than on its main PowerPC processing element. On Intel-compatible processors, the new radiation code runs 4 times faster. On the tested graphics processor, using OpenCL, we find a speed-up of more than 2.5 times as compared to the original code on the main CPU. Because the radiation code takes more than 60 % of the total CPU time, FAMOUS executes more than twice as fast. Our version of the algorithm returns bit-wise identical results, which demonstrates the robustness of our approach. We estimate that this project required around two and a half man-years of work.
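The column-packing idea, computing several independent air columns with one array-wide operation so the hardware's SIMD lanes stay full, looks like this in outline. The "radiation" below is placeholder physics (a Beer-Lambert-style transmission), not the FAMOUS scheme:

```python
import numpy as np

def transmitted_fraction_packed(columns):
    """Toy per-column radiative computation applied to four columns at
    once: `columns` has shape (4, n_levels) of layer optical thicknesses;
    operating on the packed array mirrors the SIMD packing described
    above (placeholder physics, not the FAMOUS radiation code)."""
    tau = np.cumsum(columns, axis=1)   # cumulative optical depth per column
    return np.exp(-tau)                # transmitted fraction at each level
```

Each NumPy operation here touches all four columns in lock-step, which is exactly the data layout the packed SIMD version exploits.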

  9. FAMOUS, faster: using parallel computing techniques to accelerate the FAMOUS/HadCM3 climate model with a focus on the radiative transfer algorithm

    Directory of Open Access Journals (Sweden)

    P. Hanappe

    2011-06-01

    Full Text Available We have optimised the atmospheric radiation algorithm of the FAMOUS climate model on several hardware platforms. The optimisation involved translating the Fortran code to C and restructuring the algorithm around the computation of a single air column. A task queue and a thread pool are used to distribute the computation to several processors. Finally, four air columns are packed together in a single data structure and computed simultaneously using Single Instruction Multiple Data operations.

    The modified algorithm runs more than 50 times faster on the CELL's Synergistic Processing Elements than on its main PowerPC processing element. On Intel-compatible processors, the new radiation code runs 4 times faster and on graphics processors, using OpenCL, more than 2.5 times faster, as compared to the original code. Because the radiation code takes more than 60 % of the total CPU time, FAMOUS executes more than twice as fast. Our version of the algorithm returns bit-wise identical results, which demonstrates the robustness of our approach.

  10. Complex Strategic Choices Applying Systemic Planning for Strategic Decision Making

    CERN Document Server

    Leleur, Steen

    2012-01-01

    Effective decision making requires a clear methodology, particularly in a complex world of globalisation. Institutions and companies in all disciplines and sectors are faced with increasingly multi-faceted areas of uncertainty which cannot always be effectively handled by traditional strategies. Complex Strategic Choices provides clear principles and methods which can guide and support strategic decision making to face the many current challenges. By considering ways in which planning practices can be renewed and exploring the possibilities for acquiring awareness and tools to add value to strategic decision making, Complex Strategic Choices presents a methodology which is further illustrated by a number of case studies and example applications. Dr. Techn. Steen Leleur has adapted previously established research based on feedback and input from various conferences, journals and students resulting in new material stemming from and focusing on practical application of a systemic approach. The outcome is a coher...

  11. New strategic roles of manufacturing

    DEFF Research Database (Denmark)

    Yang, Cheng; Johansen, John; Boer, Harry

    2008-01-01

    This paper aims to view manufacturing from a new angle, and tries to look beyond fit, focus and trade-offs, approaches which may no longer be sufficient for long-term competitive success. Four cases from different industries are described and used to illustrate and discuss the possibility of manufacturing playing new strategic roles. Backward, forward and lateral interactive support are suggested to explicate how manufacturing can realize its new strategic roles. Finally, four new strategic roles of manufacturing are suggested: innovation manufacturing, ramp-up manufacturing, primary manufacturing, and service manufacturing.

  12. Application of Plasma Waveguides to High Energy Accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Milchberg, Howard [Univ. of Maryland, College Park, MD (United States)

    2016-07-01

    This grant supported basic experimental, theoretical and computer simulation research into developing a compact, high pulse repetition rate laser accelerator using the direct laser acceleration mechanism in plasma-based slow wave structures.

  13. 75 FR 18824 - Federal Advisory Committee; U.S. Strategic Command Strategic Advisory Group; Closed Meeting

    Science.gov (United States)

    2010-04-13

    ..., intelligence, and policy-related issues to the Commander, U.S. Strategic Command, during the development of the... of the Secretary Federal Advisory Committee; U.S. Strategic Command Strategic Advisory Group; Closed... announces that the U.S. Strategic Command Strategic Advisory Group will meet on May 6 and 7, 2010....

  14. Standard test method for accelerated leach test for diffusive releases from solidified waste and a computer program to model diffusive, fractional leaching from cylindrical waste forms

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This test method provides procedures for measuring the leach rates of elements from a solidified matrix material, determining if the releases are controlled by mass diffusion, computing values of diffusion constants based on models, and verifying projected long-term diffusive releases. This test method is applicable to any material that does not degrade or deform during the test. 1.1.1 If mass diffusion is the dominant step in the leaching mechanism, then the results of this test can be used to calculate diffusion coefficients using mathematical diffusion models. A computer program developed for that purpose is available as a companion to this test method (Note 1). 1.1.2 It should be verified that leaching is controlled by diffusion by a means other than analysis of the leach test solution data. Analysis of concentration profiles of species of interest near the surface of the solid waste form after the test is recommended for this purpose. 1.1.3 Potential effects of partitioning on the test results can...
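For diffusion-controlled release from a semi-infinite medium, the cumulative fraction leached grows as the square root of time, which is the basis for extracting an effective diffusion coefficient from leach data. The sketch below implements that textbook semi-infinite relation, not the companion computer program distributed with the test method, and the variable names are illustrative:

```python
import math

def effective_diffusivity(cfl, t, surface_area, volume):
    """Effective diffusion coefficient from the cumulative fraction
    leached (CFL) at elapsed time t, using the semi-infinite-medium
    model  CFL = 2*(S/V)*sqrt(De*t/pi),  solved for De.
    Units must be consistent (e.g. cm for S and V, s for t)."""
    return math.pi * (cfl * volume / (2.0 * surface_area)) ** 2 / t
```

Diffusion control can be checked by confirming that De computed this way is roughly constant across sampling intervals; the test method itself recommends verifying the mechanism independently of the leachate data.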

  15. The neoliberalisation of strategic spatial planning

    DEFF Research Database (Denmark)

    Olesen, Kristian

    2014-01-01

    Strategic spatial planning practices have recently taken a neoliberal turn in many northwestern European countries. This neoliberalisation of strategic spatial planning has materialised partly in governance reforms aiming to reduce or abolish strategic spatial planning at national and regional scales, and partly through the normalisation of neoliberal discourses in strategic spatial planning processes. This paper analyses the complex relationship, partly of unease and partly of coevolution, between neoliberalism and strategic spatial planning. Furthermore, the paper discusses the key challenges for strategic spatial planning in the face of neoliberalism and argues for a need to strengthen strategic spatial planning’s critical dimension.

  16. Control of robot dynamics using acceleration control

    Science.gov (United States)

    Workman, G. L.; Prateru, S.; Li, W.; Hinman, Elaine

    1992-01-01

    Acceleration control of robotic devices can provide improvements to many space-based operations using flexible manipulators and to ground-based operations requiring better precision and efficiency than current industrial robots can provide. This paper reports on a preliminary study of acceleration measurement on robotic motion during parabolic flights on the NASA KC-135 and a parallel study of accelerations with and without gravity arising from computer simulated motions using TREETOPS software.

  17. Abstract Acceleration of General Linear Loops

    OpenAIRE

    2014-01-01

    We present abstract acceleration techniques for computing loop invariants for numerical programs with linear assignments and conditionals. Whereas abstract interpretation techniques typically over-approximate the set of reachable states iteratively, abstract acceleration captures the effect of the loop with a single, non-iterative transfer function applied to the initial states at the loop head. In contrast to previous acceleration techniques, our approach applies to a...

  18. 2011 Computation Directorate Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, D L

    2012-04-11

    From its founding in 1952 until today, Lawrence Livermore National Laboratory (LLNL) has made significant strategic investments to develop high performance computing (HPC) and its application to national security and basic science. Now, 60 years later, the Computation Directorate and its myriad resources and capabilities have become a key enabler for LLNL programs and an integral part of the effort to support our nation's nuclear deterrent and, more broadly, national security. In addition, the technological innovation HPC makes possible is seen as vital to the nation's economic vitality. LLNL, along with other national laboratories, is working to make supercomputing capabilities and expertise available to industry to boost the nation's global competitiveness. LLNL is on the brink of an exciting milestone with the 2012 deployment of Sequoia, the National Nuclear Security Administration's (NNSA's) 20-petaFLOP/s resource that will apply uncertainty quantification to weapons science. Sequoia will bring LLNL's total computing power to more than 23 petaFLOP/s, all brought to bear on basic science and national security needs. The computing systems at LLNL provide game-changing capabilities. Sequoia and other next-generation platforms will enable predictive simulation in the coming decade and leverage industry trends, such as massively parallel and multicore processors, to run petascale applications. Efficient petascale computing necessitates refining accuracy in materials property data, improving models for known physical processes, identifying and then modeling missing physics, quantifying uncertainty, and enhancing the performance of complex models and algorithms in macroscale simulation codes. Nearly 15 years ago, NNSA's Accelerated Strategic Computing Initiative (ASCI), now called the Advanced Simulation and Computing (ASC) Program, was the critical element needed to shift from test-based confidence to science-based confidence.

  19. Strategic Arrivals Recommendation Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — During the conduct of a NASA Research Announcement (NRA) in 2012 and 2013, the Mosaic ATM team first developed the Strategic Arrivals Recommendation Tool concept, or...

  20. Dominant investors and strategic transparency

    NARCIS (Netherlands)

    E.C. Perotti; E.-L. von Thadden

    1999-01-01

    This paper studies product market competition under a strategic transparency decision. Dominant investors can influence information collection in the financial market, and thereby corporate transparency, by affecting market liquidity or the cost of information collection. More transparency on a firm

  1. Dominant investors and strategic transparency

    NARCIS (Netherlands)

    E.C. Perotti; E.-L. von Thadden

    1998-01-01

    This paper studies product market competition under a strategic transparency decision. Dominant investors can influence information collection in the financial market, and thereby corporate transparency, by affecting market liquidity or the cost of information collection. More transparency on a firm

  2. The Strategic Process in Organisations

    DEFF Research Database (Denmark)

    Sørensen, Lene; Vidal, Rene Victor Valqui

    1999-01-01

    Organisational strategy development is often conceptualised through methodological frameworks. In this paper strategy development is seen as a strategic process characterised by inherent contradictions between actors, OR methods and the problem situation. The paper presents the dimensions...

  3. STRATEGIC ALLIANCES – THEIR DEFINITION AND FORMATION

    OpenAIRE

    Kinderis, Remigijus; Jucevičius, Giedrius

    2013-01-01

    The article presents an analysis of the definition of strategic alliances and research on the strategic alliance concept; furthermore, it focuses on the contingent hierarchy of alliances. The motives for the formation of strategic alliances, their categories, groups and benefits for business are revealed in this article. Special attention is paid to the process of strategic alliance formation and the analysis of factors that influence the formation of strategic alliances...

  4. Strategic Leadership of Corporate Sustainability

    DEFF Research Database (Denmark)

    Strand, Robert

    2014-01-01

    Strategic leadership and corporate sustainability have recently come together in conspicuously explicit fashion through the emergence of top management team (TMT) positions with dedicated corporate sustainability responsibilities. These TMT positions, commonly referred to as 'Chief Sustainability ... What effects do corporate sustainability TMT positions have at their organizations? We consider these questions through strategic leadership and neoinstitutional theoretical frameworks. Through the latter, we also engage with Weberian considerations of bureaucracy. We find that the reasons why ...

  5. Executive presence for strategic influence.

    Science.gov (United States)

    Shirey, Maria R

    2013-01-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches helpful to nurse leaders advancing organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the author discusses cultivating executive presence, a crucial component of great leadership, needed for strategic influence and to drive change.

  6. TOPSIS Method for Determining The Priority of Strategic Training Program

    Directory of Open Access Journals (Sweden)

    Rohmatulloh Rohmatulloh

    2014-01-01

    Full Text Available The voice of stakeholders is an important issue for government and public organizations, and it becomes an input in designing strategic programs. Decision makers should evaluate the priorities of candidate programs to establish their importance. The decision-making process is complex because it is influenced by many criteria. The purpose of this study is to solve this multi-criteria decision-making problem using the TOPSIS method, which is proposed for its easy and simple computation. The case sample is determining the priority of strategic training programs in the energy and mineral resources field. TOPSIS analysis can assist decision makers in allocating resources for the preparation of strategic training programs in accordance with the priorities.
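The TOPSIS computation is short enough to sketch directly: normalise the decision matrix, weight it, and score each alternative by its relative closeness to the ideal solution. This is a generic implementation of the method, not the study's own code; the weights and benefit/cost flags are inputs the decision maker supplies:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) against criteria (columns) with TOPSIS.
    `benefit[j]` is True for a benefit criterion (more is better) and
    False for a cost criterion. Returns the relative-closeness scores;
    higher means closer to the ideal solution."""
    m = np.asarray(matrix, dtype=float)
    norm = m / np.linalg.norm(m, axis=0)          # vector normalisation
    v = norm * np.asarray(weights, dtype=float)   # weighted normalised matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)     # distance to ideal
    d_neg = np.linalg.norm(v - anti, axis=1)      # distance to anti-ideal
    return d_neg / (d_pos + d_neg)
```

An alternative that is best on every criterion scores 1.0; one worst on every criterion scores 0.0. (If one alternative is simultaneously ideal and anti-ideal, the denominator vanishes, so degenerate single-alternative inputs need guarding.)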

  7. CINT 2020 Strategic Plan

    Energy Technology Data Exchange (ETDEWEB)

    Shinn, Neal D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-12-01

    CINT’s role is to enable world-leading science towards realizing these benefits, and our strategic objectives describe what is needed to deliver on this promise. As a vibrant partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL), CINT leverages the unmatched scientific and engineering expertise of our host DOE Laboratories in an Office of Science open-access user facility to benefit hundreds of researchers annually. We have world-leading scientific expertise in four thrust areas, as described in section 1, and specialized capabilities to create, characterize and understand nanomaterials in increasingly complex integrated environments. Building upon these current strengths, we identify some of the capabilities and expertise that the nanoscience community will need in the future and that CINT is well positioned to develop and offer as a user facility. These include an expanding portfolio of our signature Discovery Platforms that can be used alone or as sophisticated “experiments within an experiment”; novel synthetic approaches for exquisitely heterostructured nanowires, nanoparticles and quasi-two-dimensional materials; ultra-high resolution spectroscopic techniques of nanomaterial dynamics; in situ microscopies that provide real-time, spatially resolved structure/property information for increasingly complex materials systems; advanced simulation techniques for integrated nanomaterials; and multi-scale theory for interfaces and dynamics.

  8. Maintenance Process Strategic Analysis

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Stachowiak, A.

    2016-08-01

    The performance and competitiveness of manufacturing companies depend on the availability, reliability and productivity of their production facilities. Low productivity, downtime, and poor machine performance are often linked to inadequate plant maintenance, which in turn can lead to reduced production levels, increasing costs, lost market opportunities, and lower profits. These pressures have given firms worldwide the motivation to explore and embrace proactive maintenance strategies over the traditional reactive firefighting methods. The traditional view of maintenance has shifted into an overall view that encompasses Overall Equipment Efficiency, Stakeholder Management and Life Cycle Assessment. From a practical point of view, this requires changes in the approach to maintenance taken by managers and changes in the actions performed within the maintenance area. Managers have to understand that maintenance is not only about repairs and conservation of machines and devices, but also about actions striving for more efficient resource management and care for the safety and health of employees. The purpose of this work is to present a strategic analysis, based on SWOT analysis, that identifies the opportunities and strengths of the maintenance process so as to benefit from them as much as possible, as well as its weaknesses and threats, so that they can be eliminated or minimized.

  9. Accelerated shallow water modeling

    Science.gov (United States)

    Gandham, Rajesh; Medina, David; Warburton, Timothy

    2015-04-01

    In this talk we will describe our ongoing developments in accelerated numerical methods for modeling tsunamis and oceanic fluid flows using a two-dimensional shallow-water model and/or a three-dimensional incompressible Navier-Stokes model discretized with high-order discontinuous Galerkin methods. High-order discontinuous Galerkin methods can be computationally demanding, requiring extensive computational time to simulate real-time events on traditional CPU architectures. However, recent advances in computing architectures and hardware-aware algorithms make it possible to reduce simulation time and provide accurate predictions in a timely manner. Hence we tailor these algorithms to take advantage of the single instruction multiple data (SIMD) architecture seen in modern many-core compute devices such as GPUs. We will discuss our unified and extensive many-core programming library OCCA, which alleviates the need to completely re-design the solvers to keep up with constantly evolving parallel programming models and hardware architectures. We will present performance results for the flow simulations, demonstrating performance leveraging multiple different multi-threading APIs on GPU and CPU targets.

  10. CLOUD COMPUTING AND SECURITY

    Directory of Open Access Journals (Sweden)

    Asharani Shinde

    2015-10-01

    Full Text Available This document gives an insight into cloud computing, providing an overview of its key features as well as a detailed study of how it works. Cloud computing lets you access all your applications and documents from anywhere in the world, freeing you from the confines of the desktop and making it easier for group members in different locations to collaborate. Certainly, cloud computing can bring about strategic, transformational and even revolutionary benefits fundamental to future enterprise computing, but it also offers immediate and pragmatic opportunities to improve efficiencies today while cost-effectively and systematically setting the stage for strategic change. As this technology makes computing, sharing, and networking easy and interesting, we should also think about the security and privacy of information. Thus the key points to be discussed are: what the cloud is, its key features, current applications, future status, and security issues and their possible solutions.

  11. Anderson Acceleration for Fixed-Point Iterations

    Energy Technology Data Exchange (ETDEWEB)

    Walker, Homer F. [Worcester Polytechnic Institute, MA (United States)

    2015-08-31

    The purpose of this grant was to support research on acceleration methods for fixed-point iterations, with applications to computational frameworks and simulation problems that are of interest to DOE.
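Anderson acceleration replaces the plain fixed-point update x = g(x) with a step that mixes the last m iterates, with mixing weights from a small least-squares problem on the residual differences. A hedged sketch of the standard (type-II) scheme, illustrative rather than any code from the grant:

```python
import numpy as np

def anderson(g, x0, m=3, iters=50, tol=1e-10):
    """Anderson acceleration for the fixed-point problem x = g(x)."""
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    hist = []                                   # recent (iterate, residual) pairs
    for _ in range(iters):
        gx = np.atleast_1d(np.asarray(g(x), dtype=float))
        f = gx - x                              # residual of the plain iteration
        if np.linalg.norm(f) < tol:
            break
        hist.append((x, f))
        hist = hist[-(m + 1):]                  # keep at most m differences
        if len(hist) > 1:
            dX = np.column_stack([hist[i + 1][0] - hist[i][0]
                                  for i in range(len(hist) - 1)])
            dF = np.column_stack([hist[i + 1][1] - hist[i][1]
                                  for i in range(len(hist) - 1)])
            gamma = np.linalg.lstsq(dF, f, rcond=None)[0]
            x = gx - (dX + dF) @ gamma          # type-II Anderson update
        else:
            x = gx                              # first step: plain fixed point
    return x
```

For example, `anderson(lambda v: np.cos(v), [1.0])` typically reaches the fixed point of cos (about 0.73909) in a handful of iterations, versus dozens for the plain iteration.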

  12. 77 FR 54615 - Strategic Management Program; Fiscal Year 2013-2016 Strategic Plan

    Science.gov (United States)

    2012-09-05

    ... Doc No: 2012-21820] NATIONAL TRANSPORTATION SAFETY BOARD Strategic Management Program; Fiscal Year..., Strategic Management Program. FOR FURTHER INFORMATION CONTACT: Agency contact, Shamicka Fulson, Program Manager, Strategic Management Program; National Transportation Safety Board, 490 L'Enfant Plaza SW.,...

  13. A Study on the Effect of the Strategic Intelligence on Decision Making and Strategic Planning

    OpenAIRE

    Mahmoud Reza Esmaili

    2014-01-01

    The present research aims to identify the factors affecting strategic intelligence, strategic decision making and strategic planning, and to study the effect of strategic intelligence on strategic decision making and strategic planning in organizations and companies using intelligence systems in the city of Khorram-abad. In terms of method, this study is an analytical survey. The statistical population for the research is the companies and organiza...

  14. Whole scale change for real-time strategic application in complex health systems.

    Science.gov (United States)

    Shirey, Maria R; Calarco, Margaret M

    2014-11-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. In this article, the authors introduce Whole Scale Change™, an action learning approach that accelerates organizational transformation to meet the challenges of dynamic environments.

  15. Children's strategic theory of mind.

    Science.gov (United States)

    Sher, Itai; Koenig, Melissa; Rustichini, Aldo

    2014-09-16

    Human strategic interaction requires reasoning about other people's behavior and mental states, combined with an understanding of their incentives. However, the ontogenic development of strategic reasoning is not well understood: At what age do we show a capacity for sophisticated play in social interactions? Several lines of inquiry suggest an important role for recursive thinking (RT) and theory of mind (ToM), but these capacities leave out the strategic element. We posit a strategic theory of mind (SToM) integrating ToM and RT with reasoning about incentives of all players. We investigated SToM in 3- to 9-y-old children and adults in two games that represent prevalent aspects of social interaction. Children anticipate deceptive and competitive moves from the other player and play both games in a strategically sophisticated manner by 7 y of age. One game has a pure strategy Nash equilibrium: In this game, children achieve equilibrium play by the age of 7 y on the first move. In the other game, with a single mixed-strategy equilibrium, children's behavior moved toward the equilibrium with experience. These two results also correspond to two ways in which children's behavior resembles adult behavior in the same games. In both games, children's behavior becomes more strategically sophisticated with age on the first move. Beyond the age of 7 y, children begin to think about strategic interaction not myopically, but in a farsighted way, possibly with a view to cooperating and capitalizing on mutual gains in long-run relationships.
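For a 2x2 zero-sum game like the mixed-strategy game used in the study, the unique mixed equilibrium toward which play converges follows from the players' indifference conditions. A sketch of that standard textbook computation (illustrative; not the authors' analysis code, and the payoff matrix is an assumption):

```python
def mixed_equilibrium_2x2(a):
    """Mixed-strategy Nash equilibrium of a 2x2 zero-sum game, given the
    row player's payoff matrix a. Each player's mixing probability makes
    the opponent indifferent between their two actions. Assumes the game
    has no pure-strategy equilibrium (nonzero denominator)."""
    (a11, a12), (a21, a22) = a
    denom = a11 - a12 - a21 + a22
    p = (a22 - a21) / denom   # row player's probability of the first action
    q = (a22 - a12) / denom   # column player's probability of the first action
    return p, q
```

For matching pennies, `mixed_equilibrium_2x2([[1, -1], [-1, 1]])` gives (0.5, 0.5): each player randomises evenly, which is the equilibrium children's play approaches with experience.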

  16. OpenMP for Accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Beyer, J C; Stotzer, E J; Hart, A; de Supinski, B R

    2011-03-15

    OpenMP [13] is the dominant programming model for shared-memory parallelism in C, C++ and Fortran due to its easy-to-use directive-based style, portability and broad support by compiler vendors. Similar characteristics are needed for a programming model for devices such as GPUs and DSPs that are gaining popularity to accelerate compute-intensive application regions. This paper presents extensions to OpenMP that provide that programming model. Our results demonstrate that a high-level programming model can provide accelerated performance comparable to hand-coded implementations in CUDA.

  17. Development of high quality electron beam accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Kando, Masaki; Dewa, Hideki; Kotaki, Hideyuki; Kondo, Shuji; Hosokai, Tomonao; Kanazawa, Shuhei; Yokoyama, Takashi; Nakajima, Kazuhisa [Advanced Photon Research Center, Kansai Research Establishment, Japan Atomic Energy Research Institute, Kizu, Kyoto (Japan)

    2000-03-01

    A design study on a high quality electron beam accelerator is described. This accelerator will be used for second-generation experiments on laser wakefield acceleration, short x-ray generation, and other experiments on the interaction of a high-intensity laser with an electron beam at the Advanced Photon Research Center, Kansai Research Establishment, Japan Atomic Energy Research Institute. The system consists of a photocathode rf gun and a race-track microtron (RTM). To combine these two components, injection and extraction beamlines are designed employing transfer matrices and computer codes. The present status of the accelerator system is also presented. (author)

  18. Navigation Satellite Velocity and Acceleration Computation: Methods and Accuracy Analysis

    Institute of Scientific and Technical Information of China (English)

    李显; 吴美平; 张开东; 曹聚亮; 黄杨明

    2012-01-01

    A systematic analysis is made of the different methods for calculating the velocities and accelerations of navigation satellites, including (1) the closed analytical method based on the broadcast ephemeris, (2) the numerical differencing method based on a position series of the satellite, and (3) the analytical differencing method based on a position series of the satellite. First, analytical expressions are derived from the broadcast ephemeris; three types of broadcast ephemeris, including Kepler-element, GEO, and position-velocity types, are discussed. The precision comparison leads to the following conclusions: (1) the accuracy of velocity and acceleration derived from the broadcast ephemeris is relatively low and cannot meet high-precision applications such as airborne gravimetric measurement; (2) the acceleration derived from the position-velocity broadcast ephemeris is more accurate, while the Kepler-element type yields higher velocity accuracy; (3) the orbit height is one of the factors affecting computation precision. Then, analytical differencing and numerical differencing based on the precise ephemeris are analyzed and compared. The results show that although the analytical method is more efficient, its velocity precision is lower, because the orbit model built from a short-term position series is inaccurate; its acceleration precision, however, is comparable to that of the numerical differencing method. Finally, a static experiment using data from two CORS (continuously operating reference system) stations is conducted to evaluate and compare the accuracy of the methods above.
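    The numerical differencing method compared in this record can be sketched with central differences over a position series. The helper below is a hypothetical illustration, not code from the paper, and omits the coordinate-frame and ephemeris-interpolation details a real implementation needs:

    ```python
    import numpy as np

    def derive_vel_acc(t, pos):
        """Estimate velocity and acceleration from a satellite position
        series by central differencing (the 'numerical differencing
        method'); interior points are second-order accurate."""
        t = np.asarray(t, dtype=float)
        pos = np.asarray(pos, dtype=float)
        vel = np.gradient(pos, t, axis=0)  # dx/dt
        acc = np.gradient(vel, t, axis=0)  # d2x/dt2
        return vel, acc

    # A 1-D check: x(t) = 0.5 * a * t^2 recovers the constant
    # acceleration a away from the edges of the series.
    t = np.linspace(0.0, 10.0, 101)
    x = 0.5 * 3.0 * t**2
    v, a = derive_vel_acc(t, x)
    ```

    Central differences are exact for a quadratic trajectory at interior points, which is why the check above recovers the input acceleration.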

  19. Systems 2020: Strategic Initiative

    Science.gov (United States)

    2010-08-29

    Auction Pits Google's Game Theorists Against the FCC's | Epicenter | Wired.com. http://www.wired.com/epicenter/2007/11/so-two-game-the/. (Babar et al. 2010) Babar, M.A., Chen, Lianping, Shull, F. "Managing Variability in Software Product Lines," IEEE Software, published by the IEEE Computer

  20. Accelerated large-scale multiple sequence alignment

    Directory of Open Access Journals (Sweden)

    Lloyd Scott

    2011-12-01

    Full Text Available Abstract Background Multiple sequence alignment (MSA is a fundamental analysis method used in bioinformatics and many comparative genomic applications. Prior MSA acceleration attempts with reconfigurable computing have only addressed the first stage of progressive alignment and consequently exhibit performance limitations according to Amdahl's Law. This work is the first known to accelerate the third stage of progressive alignment on reconfigurable hardware. Results We reduce subgroups of aligned sequences into discrete profiles before they are pairwise aligned on the accelerator. Using an FPGA accelerator, an overall speedup of up to 150 has been demonstrated on a large data set when compared to a 2.4 GHz Core2 processor. Conclusions Our parallel algorithm and architecture accelerates large-scale MSA with reconfigurable computing and allows researchers to solve the larger problems that confront biologists today. Program source is available from http://dna.cs.byu.edu/msa/.
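    The profile reduction described in the results can be illustrated in a few lines: each subgroup of already-aligned sequences collapses into per-column residue frequencies before profile-profile alignment. This is a minimal sketch of the general technique, not the paper's FPGA kernel, which operates on such discrete profiles in hardware:

    ```python
    from collections import Counter

    def column_profile(aligned_seqs):
        """Reduce a subgroup of aligned sequences (equal length, with
        '-' gaps) into a per-column residue-frequency profile."""
        length = len(aligned_seqs[0])
        assert all(len(s) == length for s in aligned_seqs)
        profile = []
        for col in range(length):
            counts = Counter(s[col] for s in aligned_seqs)
            total = sum(counts.values())
            profile.append({res: n / total for res, n in counts.items()})
        return profile

    # Three aligned sequences; column 2 mixes a gap with residues.
    prof = column_profile(["AC-GT", "ACAGT", "ACAGA"])
    ```

    Two profiles built this way can then be pairwise aligned like ordinary sequences, with column-vs-column substitution scores, which is what makes the third progressive-alignment stage amenable to acceleration.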

  1. NATO's Strategic Partnership with Ukraine

    DEFF Research Database (Denmark)

    Breitenbauch, Henrik Ø.

    2014-01-01

    Russian actions in Ukraine have altered the security landscape in Europe, highlighting a renewed emphasis on the differences between members and non-members. In this context, NATO must a) create a strategic understanding of partnerships as something that can be transformative, even if it will not lead to membership in the short or even long term, and b) build such a strategic relationship with Ukraine. In sum, the Russian-induced Ukraine crisis should spur the reform of NATO partnerships, with Ukraine as a case in point.

  2. The Test of Strategic Culture

    DEFF Research Database (Denmark)

    Dalgaard-Nielsen, Anja

    2005-01-01

    Germany was the first country to issue a categorical refusal to support the US-led war in Iraq. Some have interpreted this as the result of a clash between the strategic cultures of Germany and the USA, others as a sign that a more nationalistic and assertive Germany is emerging. This article explains the apparently contradictory aspects of Germany's stance on Iraq by identifying two competing strands within Germany's strategic culture. It concludes that the German refusal signals neither a reversion to a pacifist stance nor that Germany is in a process of shedding the bonds and alliances that have so far framed the reunified Germany's military policy. Iraq simply showed that Germany, like most other countries, has conditions that have to be met; in Germany's case, conditions flowing from the coexistence of two competing schools of thought within Germany's strategic culture.

  3. Managing transdisciplinarity in strategic foresight

    DEFF Research Database (Denmark)

    Rasmussen, Birgitte; Andersen, Per Dannemand; Borch, Kristian

    2010-01-01

    Strategic foresight deals with the long-term future and is a transdisciplinary exercise which, among other aims, addresses the prioritization of science and other decision making in science and innovation advisory and funding bodies. This article discusses challenges in strategic foresight in relation to transdisciplinarity, based on empirical as well as theoretical work in technological domains. By strategic foresight is meant future-oriented, participatory consultation of actors and stakeholders, both within and outside a scientific community. It therefore allows multiple stakeholders to negotiate over how to attain a desirable future. This requires creative thinking from the participants, who need to extend their knowledge into the uncertainty of the future. Equally important is skilled facilitating in order to create a space for dialogue and exploration in a contested territory. Although...

  4. The Emerging Strategic Entrepreneurship Field

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Lyngsie, Jacob

    The field of strategic entrepreneurship (SE) is a fairly recent one. Its central idea is that opportunity-seeking and advantage-seeking, the former the central subject of the entrepreneurship field, the latter the central subject of the strategic management field, are processes that need to be considered jointly. The purpose of this brief chapter is to explain the emergence of the SE field in terms of a response to research gaps in the neighboring fields of entrepreneurship and strategic management; describe the main tenets of SE theory; discuss its relations to neighboring fields; and finally describe some research gaps in extant theory, mainly focusing on the need to provide clear microfoundations for SE theory and link it to organizational design theory.

  5. Strategic planning in healthcare organizations.

    Science.gov (United States)

    Rodríguez Perera, Francisco de Paula; Peiró, Manel

    2012-08-01

    Strategic planning is a completely valid and useful tool for guiding all types of organizations, including healthcare organizations. The organizational level at which the strategic planning process is relevant depends on the unit's size, its complexity, and the differentiation of the service provided. A cardiology department, a hemodynamic unit, or an electrophysiology unit can be an appropriate level, as long as their plans align with other plans at higher levels. The leader of each unit is the person responsible for promoting the planning process, a core and essential part of his or her role. The process of strategic planning is programmable, systematic, rational, and holistic and integrates the short, medium, and long term, allowing the healthcare organization to focus on relevant and lasting transformations for the future.

  6. Limited rationality and strategic interaction

    DEFF Research Database (Denmark)

    Fehr, Ernst; Tyran, Jean-Robert

    2008-01-01

    Much evidence suggests that people are heterogeneous with regard to their abilities to make rational, forward-looking decisions. This raises the question as to when the rational types are decisive for aggregate outcomes and when the boundedly rational types shape aggregate results. We examine this question in the context of a long-standing and important economic problem: the adjustment of nominal prices after an anticipated monetary shock. Our experiments suggest that two types of bounded rationality, money illusion and anchoring, are important behavioral forces behind nominal inertia. However, depending on the strategic environment, bounded rationality has vastly different effects on aggregate price adjustment. If agents' actions are strategic substitutes, adjustment to the new equilibrium is extremely quick, whereas under strategic complementarity, adjustment is both very slow and associated...

  7. Likelihood Analysis of the Local Group Acceleration

    CERN Document Server

    Schmoldt, I M; Teodoro, L; Efstathiou, G P; Frenk, C S; Keeble, O; Maddox, S J; Oliver, S; Rowan-Robinson, M; Saunders, W J; Sutherland, W; Tadros, H; White, S D M

    1999-01-01

    We compute the acceleration on the Local Group using 11206 IRAS galaxies from the recently completed all-sky PSCz redshift survey. Measuring the acceleration vector in redshift space generates systematic uncertainties due to the redshift space distortions in the density field. We therefore assign galaxies to their real space positions by adopting a non-parametric model for the velocity field that solely relies on the linear gravitational instability and linear biasing hypotheses. Remaining systematic contributions to the measured acceleration vector are corrected for by using PSCz mock catalogues from N-body experiments. The resulting acceleration vector points approx. 15 degrees away from the CMB dipole apex, with a remarkable alignment between small and large scale contributions. A considerable fraction of the measured acceleration is generated within 40 h-1 Mpc with a non-negligible contribution from scales between 90 and 140 h-1 Mpc after which the acceleration amplitude seems to have converged. The local...
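    The acceleration vector in such analyses is, schematically, a weighted direct sum over survey galaxies. The sketch below is a hypothetical illustration of that basic dipole estimator only; the survey's selection weighting, linear-bias correction, and redshift-space treatment described in the abstract are all omitted:

    ```python
    import numpy as np

    def dipole_acceleration(positions, weights):
        """Schematic dipole estimator: each galaxy at r_i contributes
        w_i * r_i / |r_i|^3 to the acceleration on an observer at the
        origin (physical constants and survey corrections omitted)."""
        positions = np.asarray(positions, dtype=float)
        weights = np.asarray(weights, dtype=float)
        r3 = np.linalg.norm(positions, axis=1) ** 3
        return (weights[:, None] * positions / r3[:, None]).sum(axis=0)

    # Two equal-weight galaxies placed symmetrically cancel exactly,
    # illustrating why the estimator measures the net dipole.
    g = dipole_acceleration([[1.0, 0.0, 0.0], [-1.0, 0.0, 0.0]], [1.0, 1.0])
    ```

    Summing this contribution shell by shell in distance is what lets an analysis like this one track where the acceleration amplitude converges.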

  8. Hadron accelerators for radiotherapy

    Science.gov (United States)

    Owen, Hywel; MacKay, Ranald; Peach, Ken; Smith, Susan

    2014-04-01

    Over the last twenty years the treatment of cancer with protons and light nuclei such as carbon ions has moved from being the preserve of research laboratories into widespread clinical use. A number of choices now exist for the creation and delivery of these particles, key amongst these being the adoption of pencil beam scanning using a rotating gantry; attention is now being given to what technologies will enable cheaper and more effective treatment in the future. In this article the physics and engineering used in these hadron therapy facilities are presented, together with the research areas likely to lead to substantive improvements. The wider use of superconducting magnets is an emerging trend, whilst further ahead novel high-gradient acceleration techniques may enable much smaller treatment systems. Imaging techniques to improve the accuracy of treatment plans must also be developed hand-in-hand with future sources of particles, a notable example of which is proton computed tomography.

  9. Characteristics of Useful and Practical Organizational Strategic Plans

    Science.gov (United States)

    Kaufman, Roger

    2014-01-01

    Most organizational strategic plans are not strategic but rather tactical or operational plans masquerading as "strategic." This article identifies the basic elements required in a useful and practical strategic plan and explains why they are important.

  10. Contrasting strategic and Milan therapies.

    Science.gov (United States)

    MacKinnon, L

    1983-12-01

    Three related models of therapy are often grouped together as the strategic therapies. These are the brief therapy model associated with the Mental Research Institute, the approaches developed by Jay Haley and Cloë Madanes, and the model developed by the Milan associates. Controversy exists, however, as to whether the Milan model should be included as a strategic therapy. It appears that the similarities among the three models can mask deeper differences, thus compounding the confusion. This paper contrasts the models in their development, theory, and practice.

  11. Final Draft Strategic Marketing Plan.

    Energy Technology Data Exchange (ETDEWEB)

    United States. Bonneville Power Administration.

    1994-02-01

    The Bonneville Power Administration (BPA) has developed a marketing plan to define how BPA can be viable and competitive in the future, a result important to BPA's customers and constituents. The Marketing Plan represents the preferred customer outcomes, marketplace achievements, and competitive advantage required to accomplish the Vision and the Strategic Business Objectives of the agency. The Marketing Plan contributes to successful implementation of BPA's Strategic Business Objectives (SBOs) by providing common guidance to organizations and activities throughout the agency responsible for (1) planning, constructing, operating, and maintaining the Federal Columbia River Power System; (2) conducting business with BPA's customers; and (3) providing required internal support services.

  12. Issues in Strategic Decision Modelling

    CERN Document Server

    Jennings, Paula

    2008-01-01

    [Spreadsheet] Models are invaluable tools for strategic planning. Models help key decision makers develop a shared conceptual understanding of complex decisions, identify sensitivity factors and test management scenarios. Different modelling approaches are specialist areas in themselves. Model development can be onerous, expensive, time consuming, and often bewildering. It is also an iterative process where the true magnitude of the effort, time and data required is often not fully understood until well into the process. This paper explores the traditional approaches to strategic planning modelling commonly used in organisations and considers the application of a real-options approach to match and benefit from the increasing uncertainty in today's rapidly changing world.

  13. The Strategic Data Project's Strategic Performance Indicators

    Science.gov (United States)

    Page, Lindsay C.; Fullerton, Jon; Bacher-Hicks, Andrew; Owens, Antoniya; Cohodes, Sarah R.; West, Martin R.; Glover, Sarah

    2013-01-01

    Strategic Performance Indicators (SPIs) are summary measures derived from parallel, descriptive analyses conducted across educational agencies. The SPIs are designed to inform agency management and efforts to improve student outcomes. We developed the SPIs to reveal patterns common across partner agencies, to highlight exceptions to those…

  14. Strategic Planning and Strategic Thinking Clothed in STRATEGO

    Science.gov (United States)

    Baaki, John; Moseley, James L.

    2011-01-01

    This article shares experiences that participants had playing the game of STRATEGO and how the activity may be linked to strategic planning and thinking. Among the human performance technology implications of playing this game are that gamers agreed on a framework for rules, took stock on where they wanted to go in the future, and generated a risk…

  15. The strategic labor allocation proces : a model of strategic HRM

    NARCIS (Netherlands)

    Bax, Erik H.

    2002-01-01

    In this article the Strategic Labor Allocation Process model (SLAP) is described. The model relates HR-strategies to structure, culture and task technology to HR-policies like recruitment, appraisal and rewarding, to business strategy and to socio-cultural, economic, institutional and technological

  16. The Danish experience of strategic environment assesment

    DEFF Research Database (Denmark)

    Kørnøv, Lone

    2004-01-01

    The article recounts a number of examples of the Danish experience with Strategic Environmental Assessment (SEA).......The article recounts a number of examples of the Danish experience with Strategic Environmental Assessment (SEA)....

  17. Strategic Planning as a Perceptual Process,

    Science.gov (United States)

    1981-03-01

    Like any complex human endeavor, the sort of broad scope or long range organizational planning often referred to as strategic planning can be viewed...might be of use to planners and managers concerned with broad scope strategic planning.

  18. Design of a nonscaling fixed field alternating gradient accelerator

    CERN Document Server

    Trbojevic, D; Blaskiewicz, M

    2005-01-01

    We present a design of nonscaling fixed field alternating gradient accelerators (FFAG) minimizing the dispersion action function H. The design is considered both analytically and via computer modeling. We present the basic principles of a nonscaling FFAG lattice and discuss optimization strategies so that one can accelerate over a broad range of momentum with reasonable apertures. Acceleration schemes for muons are discussed.

  19. Design of a nonscaling fixed field alternating gradient accelerator

    Science.gov (United States)

    Trbojevic, D.; Courant, E. D.; Blaskiewicz, M.

    2005-05-01

    We present a design of nonscaling fixed field alternating gradient accelerators (FFAG) minimizing the dispersion action function H. The design is considered both analytically and via computer modeling. We present the basic principles of a nonscaling FFAG lattice and discuss optimization strategies so that one can accelerate over a broad range of momentum with reasonable apertures. Acceleration schemes for muons are discussed.

  20. Acceleration of saddle-point searches with machine learning.

    Science.gov (United States)

    Peterson, Andrew A

    2016-08-21

    In atomistic simulations, the location of the saddle point on the potential-energy surface (PES) gives important information on transitions between local minima, for example, via transition-state theory. However, the search for saddle points often involves hundreds or thousands of ab initio force calls, which are typically all done at full accuracy. This results in the vast majority of the computational effort being spent calculating the electronic structure of states not important to the researcher, and very little time performing the calculation of the saddle point state itself. In this work, we describe how machine learning (ML) can reduce the number of intermediate ab initio calculations needed to locate saddle points. Since machine-learning models can learn from, and thus mimic, atomistic simulations, the saddle-point search can be conducted rapidly in the machine-learning representation. The saddle-point prediction can then be verified by an ab initio calculation; if it is incorrect, this strategically has identified regions of the PES where the machine-learning representation has insufficient training data. When these training data are used to improve the machine-learning model, the estimates greatly improve. This approach can be systematized, and in two simple example problems we demonstrate a dramatic reduction in the number of ab initio force calls. We expect that this approach and future refinements will greatly accelerate searches for saddle points, as well as other searches on the potential energy surface, as machine-learning methods see greater adoption by the atomistics community.
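    On a toy one-dimensional surface, the learn-verify-retrain loop the abstract describes can be sketched as follows. This is a hypothetical illustration using a polynomial surrogate rather than the paper's machine-learning model; `true_energy` and `true_force` stand in for the expensive ab initio calls:

    ```python
    import numpy as np

    def true_energy(x):   # toy double-well PES; stands in for ab initio energy
        return (x**2 - 1.0) ** 2

    def true_force(x):    # negative gradient; one "ab initio force call"
        return -4.0 * x * (x**2 - 1.0)

    def ml_saddle_search(train_x, tol=1e-6, max_iter=20):
        """Fit a cheap surrogate (degree-4 polynomial) to sampled
        energies, locate its barrier top between the two minima, and
        verify with a true force call; a failed verification becomes
        new training data and the surrogate is refit."""
        train_x = list(train_x)
        for _ in range(max_iter):
            xs = np.array(train_x)
            coeffs = np.polyfit(xs, true_energy(xs), 4)   # surrogate model
            roots = np.roots(np.polyder(coeffs))          # stationary points
            real_roots = [r.real for r in roots if abs(r.imag) < 1e-6]
            x_guess = min(real_roots, key=abs)            # barrier top near 0
            if abs(true_force(x_guess)) < tol:            # verification call
                return x_guess
            train_x.append(x_guess)                       # refine the model
        return x_guess

    # Five cheap samples suffice to locate the barrier top at x = 0.
    x_sp = ml_saddle_search([-1.2, -0.5, 0.3, 0.8, 1.1])
    ```

    The point of the pattern is the one in the abstract: nearly all iterations run in the cheap surrogate, and the expensive calculation is spent only on verification and on patching regions where the surrogate lacked data.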

  1. Acceleration of saddle-point searches with machine learning

    Science.gov (United States)

    Peterson, Andrew A.

    2016-08-01

    In atomistic simulations, the location of the saddle point on the potential-energy surface (PES) gives important information on transitions between local minima, for example, via transition-state theory. However, the search for saddle points often involves hundreds or thousands of ab initio force calls, which are typically all done at full accuracy. This results in the vast majority of the computational effort being spent calculating the electronic structure of states not important to the researcher, and very little time performing the calculation of the saddle point state itself. In this work, we describe how machine learning (ML) can reduce the number of intermediate ab initio calculations needed to locate saddle points. Since machine-learning models can learn from, and thus mimic, atomistic simulations, the saddle-point search can be conducted rapidly in the machine-learning representation. The saddle-point prediction can then be verified by an ab initio calculation; if it is incorrect, this strategically has identified regions of the PES where the machine-learning representation has insufficient training data. When these training data are used to improve the machine-learning model, the estimates greatly improve. This approach can be systematized, and in two simple example problems we demonstrate a dramatic reduction in the number of ab initio force calls. We expect that this approach and future refinements will greatly accelerate searches for saddle points, as well as other searches on the potential energy surface, as machine-learning methods see greater adoption by the atomistics community.

  2. Strategic Alliance Poker: Demonstrating the Importance of Complementary Resources and Trust in Strategic Alliance Management

    Science.gov (United States)

    Reutzel, Christopher R.; Worthington, William J.; Collins, Jamie D.

    2012-01-01

    Strategic Alliance Poker (SAP) provides instructors with an opportunity to integrate the resource based view with their discussion of strategic alliances in undergraduate Strategic Management courses. Specifically, SAP provides Strategic Management instructors with an experiential exercise that can be used to illustrate the value creation…

  3. 75 FR 67695 - U.S. Strategic Command Strategic Advisory Group Closed Meeting

    Science.gov (United States)

    2010-11-03

    ... of the Secretary of Defense U.S. Strategic Command Strategic Advisory Group Closed Meeting AGENCY.... Strategic Command Strategic Advisory Group. DATES: December 9, 2010: 8 a.m. to 5 p.m. December 10, 2010: 8 a... of the meeting is to provide advice on scientific, technical, intelligence, and policy-related...

  4. 76 FR 14950 - Closed Meeting of the U.S. Strategic Command Strategic Advisory Group

    Science.gov (United States)

    2011-03-18

    ... of the Secretary Closed Meeting of the U.S. Strategic Command Strategic Advisory Group AGENCY... notice pertaining to the following federal advisory committee: U.S. Strategic Command Strategic Advisory... meeting is to provide advice on scientific, technical, intelligence, and policy-related issues to...

  5. 75 FR 22561 - Federal Advisory Committee; United States Strategic Command Strategic Advisory Group; Charter...

    Science.gov (United States)

    2010-04-29

    ..., intelligence, and policy-related matters of interest to the Joint Chiefs of Staff and the U.S. Strategic..., communications, intelligence and information operations, or other important aspects of the Nation's strategic... of the Secretary Federal Advisory Committee; United States Strategic Command Strategic Advisory...

  6. Hoshin Kanri Forest : lean strategic organizational design

    OpenAIRE

    Villalba-Díez, Javier

    2017-01-01

    Strategic Lean Management (LM) efforts almost always fail because Leaders often lack a map of their own organization. The reason for this might be that scholars have so far mostly provided qualitative or rigid one-size-fits-all frameworks for strategically designing organizations. The purpose of this work is to provide a comprehensive quantifiable framework for strategically designing organizations for LM. Combining knowledge about Strategic Organizational Design and LM, we introduce a novel ...

  7. Hoshin Kanri Forest : lean strategic organizational design

    OpenAIRE

    Villalba-Díez, Javier

    2016-01-01

    Strategic Lean Management (LM) efforts almost always fail because Leaders often lack a map of their own organization. The reason for this might be that scholars have so far mostly provided qualitative or rigid one-size-fits-all frameworks for strategically designing organizations. The purpose of this work is to provide a comprehensive quantifiable framework for strategically designing organizations for LM. Combining knowledge about Strategic Organizational Design and LM, we introduce a novel ...

  8. Children’s strategic theory of mind

    OpenAIRE

    Sher, Itai; Koenig, Melissa; Rustichini, Aldo

    2014-01-01

    Human interaction requires reasoning not only about other people’s observed behavior and mental states but also about their incentives and goals. The development of children’s strategic thinking is not well understood, leaving open critical questions about early human capacity for strategic interaction. We investigated strategic reasoning in 3- to 9-y-old children and adults in two strategic games that represent prevalent aspects of social interaction: incentives to mislead and competition. W...

  9. Crisis - Strategic Management in Public Relation

    OpenAIRE

    Saari Ahmad

    2012-01-01

    This is a concept paper exploring strategic management approaches in public relations during a crisis. The main objective of this article is to identify the most effective action plan for public relations. The review of the strategic management in public relations literature reveals that the relationship between strategic management and public relations is still vague. Four stages were identified in the process of establishing the action plan for public relations, and eleven strategic action...

  10. Maturity of strategic management in organizations

    OpenAIRE

    Anna Witek-Crabb

    2015-01-01

    There is some ambivalence with regard to how to improve the strategic management of organizations. On the one hand, the example of big companies emphasizes the need for formalization and good organization of the strategic management process. On the other hand, the example of small companies draws attention to such qualities as entrepreneurship, flexibility and adaptability. The concept of strategic management maturity embraces both of these priorities. In this paper a framework for strategic managemen...

  11. Strategic Planning and Army Installation Management.

    Science.gov (United States)

    1996-01-01

    program. The U.S. Army has adopted the Malcolm Baldrige National Quality Award criteria for use in the ACOE program. Strategic planning is one of the...seven pillars of the Baldrige criteria. The Army has recognized that strategic planning is the key to the future. Strategic planning is the key to...and utilization of strategic planning. This paper examines through case study analysis several civilian communities and lessons learned through their

  12. Strategic Simulation - Support of Innovation and Operation in Distribution and production Networks

    DEFF Research Database (Denmark)

    Hansen, Mette Sanne

    Today’s business environment is characterized by global competition, changing conditions, and uncertainty. Many Western companies have responded by developing global distribution and production networks. The increasingly challenging business environment and the more complex structure of companies...... or quantitative approaches. Strategic simulation is the combination of narrative and numerical simulation and can be used as a tool to support strategic decision making by providing different scenarios in combination with computer modelling. The core of the combined simulation approach (CSA) is to make...

  13. Strategic issues in information technology international implications for decision makers

    CERN Document Server

    Schütte, Hellmut

    1988-01-01

    Strategic Issues in Information Technology: International Implications for Decision Makers presents the significant development of information technology in the output of components, computers, and communication equipment and systems. This book discusses the integration of information technology into factories and offices to increase productivity.Organized into six parts encompassing 12 chapters, this book begins with an overview of the advancement towards an automated interpretation communication system to achieve real international communication. This text then examines the main determining

  14. 13 CFR 313.6 - Strategic Plans.

    Science.gov (United States)

    2010-01-01

    .... EDA shall evaluate the Strategic Plan based on the following minimum requirements: (1) An analysis of... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Strategic Plans. 313.6 Section 313... § 313.6 Strategic Plans. (a) General. An Impacted Community that intends to apply for a grant...

  15. The value contribution of strategic foresight

    DEFF Research Database (Denmark)

    Rohrbeck, René; Schwarz, Jan Oliver

    2013-01-01

    This paper focuses on exploring the potential and empirically observable value creation of strategic foresight activities in firms. We first review the literature on strategic foresight, innovation management and strategic management in order to identify the potential value contributions. We use...

  16. Strategic Activism, Educational Leadership and Social Justice

    Science.gov (United States)

    Ryan, James

    2016-01-01

    This article describes the strategic activism of educational leaders who promote social justice. Given the risks, educational leaders need to be strategic about the ways in which they pursue their activism. Citing current research, this article explores the ways in which leaders strategically pursue their social justice agendas within their own…

  17. Strategic decision quality in Flemish municipalities

    NARCIS (Netherlands)

    B.R.J. George (Bert); S. Desmidt (Sebastian); J. De Moyer (Julie)

    2016-01-01

    Strategic planning (SP) has taken the public sector by storm because it is widely believed that SP’s approach to strategic decision-making strengthens strategic decision quality (SDQ) in public organizations. However, if or how SP relates to SDQ seems to lack empirical evidence. Drawing

  18. Strategic Management in Chinese Manufacturing SMEs

    OpenAIRE

    Chen, Muxia; Bowen, Liu

    2012-01-01

    This study investigates whether and how strategic management is employed in Chinese manufacturing SMEs, and explores the main characteristics of the strategic management process in these SMEs. It aims to serve as a reference for senior managers in these firms seeking to improve and utilize strategic management tools for their future growth.

  19. Collaborative Strategic Planning: Myth or Reality?

    Science.gov (United States)

    Mbugua, Flora; Rarieya, Jane F. A.

    2014-01-01

    The concept and practice of strategic planning, while entrenched in educational institutions in the West, is just catching on in Kenya. While literature emphasizes the importance of collaborative strategic planning, it does not indicate the challenges presented by collaboratively engaging in strategic planning. This article reports on findings of…

  20. Accelerated Adaptive MGS Phase Retrieval

    Science.gov (United States)

    Lam, Raymond K.; Ohara, Catherine M.; Green, Joseph J.; Bikkannavar, Siddarayappa A.; Basinger, Scott A.; Redding, David C.; Shi, Fang

    2011-01-01

    The Modified Gerchberg-Saxton (MGS) algorithm is an image-based wavefront-sensing method that can turn any science instrument focal plane into a wavefront sensor. MGS characterizes optical systems by estimating the wavefront errors in the exit pupil using only intensity images of a star or other point source of light. This innovative implementation significantly accelerates the MGS phase retrieval algorithm by using stream-processing hardware on conventional graphics cards. Stream processing is a relatively new, yet powerful, paradigm that allows parallel processing of applications that apply single instructions to multiple data (SIMD). These stream processors are designed specifically to support large-scale parallel computing on a single graphics chip. Computationally intensive algorithms, such as the Fast Fourier Transform (FFT), are particularly well suited to this computing environment. This high-speed version of MGS exploits commercially available hardware to accomplish the same objective in a fraction of the original time, performing the matrix calculations on nVidia graphics cards. The graphics processing unit (GPU) is hardware specialized for computationally intensive, highly parallel computation. From the software perspective, a parallel programming model called CUDA is used to transparently scale multicore parallelism in hardware. This technology gives computationally intensive applications access to the processing power of nVidia GPUs through a C/C++ programming interface. The AAMGS (Accelerated Adaptive MGS) software takes advantage of these technologies to accelerate optical phase error characterization. With a single PC containing four nVidia GTX-280 graphics cards, the new implementation can process four images simultaneously to produce a JWST (James Webb Space Telescope) wavefront measurement 60 times faster than the previous code.
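
    The batched, SIMD-style FFT processing described in this record can be illustrated with a small sketch. This is not the AAMGS code: NumPy's CPU FFT stands in for the CUDA FFT kernels, and the array shapes and function name are illustrative assumptions.

```python
import numpy as np

def batch_pupil_fft(images):
    """Transform a batch of focal-plane images with one vectorized FFT call.

    images: array of shape (batch, n, n). In the GPU implementation
    described above, the analogous batched FFT runs on CUDA stream
    processors; NumPy's CPU FFT stands in here to show the data layout.
    """
    # one SIMD-style call transforms every image in the batch at once
    return np.fft.fft2(images, axes=(-2, -1))

# four 64x64 point-source images processed simultaneously, echoing the
# four-graphics-card configuration mentioned in the record
imgs = np.random.rand(4, 64, 64)
spectra = batch_pupil_fft(imgs)
```

    Each image's transform is independent, which is why the workload maps naturally onto per-card parallelism.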

  1. DISCOM2: Distance Computing the SP2 Pilot FY98 Report

    Energy Technology Data Exchange (ETDEWEB)

    Beiriger, Judy; Byers, Rupert K.; Ernest, Martha J.; Goudy, Sue P.; Noe, John P.; Pratt, Thomas J.; Shirley, David N.; Tarman, Thomas D.; VanDevender, Walter H.; Wiltzius, David P.

    1999-05-01

    As a way to bootstrap the DISCOM(2) Distance Computing Program, the SP2 Pilot Project was launched in March 1998. The Pilot was directed towards creating an environment that allows Sandia users to run their applications on the Accelerated Strategic Computing Initiative's (ASCI) Blue Pacific computation platform, the unclassified IBM SP2 platform at Lawrence Livermore National Laboratory (LLNL). The DISCOM(2) Pilot leverages the ASCI PSE (Problem Solving Environment) efforts in networking and services to baseline the performance of the current system. Efforts in the following areas of the pilot are documented: applications, services, networking, visualization, and the system model. The report details not only the running of two Sandia codes, CTH and COYOTE, on the Blue Pacific platform, but also the building of the Sandia National Laboratories (SNL) proxy environment of RS6000 platforms to support Sandia users.

  2. A strategic PACS maturity approach

    NARCIS (Netherlands)

    van de Wetering, R.

    2011-01-01

    Finding the key determinants of Picture Archiving and Communication Systems (PACS) performance in hospitals has been a conundrum for decades. This research provides a method to assess the strategic alignment of PACS in hospitals in order to find these key determinants. PACS touches upon every single part

  3. Always Strategic: Jointly Essential Landpower

    Science.gov (United States)

    2015-02-01

    modifier of the concept of power. It has been my first-hand experience for nearly 50 years as a teacher and author on strategy, that it can be...official definition of Landpower, there is no strict requirement for strategic Landpower to be delivered by ground forces. The geopolitical and

  4. Networks and meshworks in strategizing

    DEFF Research Database (Denmark)

    Esbjerg, Lars; Andersen, Poul Houman

    the literature on networks and network pictures, and identifies several shortcomings of this work. To develop the notion of business meshworks as an alternative for understanding strategizing practices in business interaction, the paper draws on recent writings within anthropology and the strategy...

  5. Strategic petroleum reserve annual report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-15

    Section 165 of the Energy Policy and Conservation Act (Public Law 94- 163), as amended, requires the Secretary of Energy to submit annual reports to the President and the Congress on activities of the Strategic Petroleum Reserve (SPR). This report describes activities for the year ending December 31, 1995.

  6. Strategic Communication: A Departmental Transformation

    Science.gov (United States)

    2010-03-24

    Representatives Adam Smith and Mac Thornberry echoed this view in early March 2010 when they invited other members of the U.S. House of Representatives to...jfq_pages/ editions/i55/1.pdf (accessed January 15, 2010). 5 Ibid. 21 6 U.S. Representatives Adam Smith and Mac Thornberry, “Join the new Strategic

  7. Strategic School Planning in Jordan

    Science.gov (United States)

    Al-Zboon, Mohammad Saleem; Hasan, Manal Subhi

    2012-01-01

    The study aimed to measure the degree to which the stages of strategic school planning are applied at governmental high schools, from the perspective of the educational supervisors and principals in the directorates of Amman city. The study population consisted of the educational supervisors and principals working at the educational directorates related to…

  8. Strategic Scenario Construction Made Easy

    DEFF Research Database (Denmark)

    Duus, Henrik Johannsen

    2016-01-01

    insights from the area of strategic forecasting (of which scenario planning is a proper subset) and experiences gained from a recent course in that area to develop a simpler, more direct, hands-on method for scenario construction and to provide several ideas for scenario construction that can be used...

  9. Raising financing through strategic timing

    Science.gov (United States)

    Maine, Elicia; Thomas, V. J.

    2017-02-01

    Strategic timing can be key for nano-drug-delivery ventures to get financing. Timely publications engage potential partners; early broad, blocking, relevant patents demonstrate the potential to appropriate value; and venture formation closer to clinical viability better aligns its timeline with that of venture capitalists.

  10. Strategic Asset Seeking by EMNEs

    DEFF Research Database (Denmark)

    Petersen, Bent; Seifert, Jr., Rene E.

    2014-01-01

    as the more relevant concept to use when explaining strategic asset seeking of EMNEs. A set of propositions are formulated to guide empirical testing. Originality/value: The insights gained from using the springboard perspective and the LOO concept are non-trivial: They basically predict future dominance...

  11. Thinking strategically about electricity pricing

    Energy Technology Data Exchange (ETDEWEB)

    Toulson, D. (Barakat and Chamberlin, Inc., Oakland, CA (United States))

    1992-12-01

    This report describes an approach by which utilities can view pricing from a strategic, market-oriented perspective. It begins by reviewing pricing practices found in private industry and develops a framework for utility rate design that incorporates both customer value and cost of service. A market intelligence system for gathering data relevant to pricing decisions is also briefly outlined.

  12. Negotiation for Strategic Video Games

    OpenAIRE

    Afiouni, Einar Nour; Ovrelid, Leif Julian

    2013-01-01

    This project aims to examine the possibilities of using game theoretic concepts and multi-agent systems in modern video games with real time demands. We have implemented a multi-issue negotiation system for the strategic video game Civilization IV, evaluating different negotiation techniques with a focus on the use of opponent modeling to improve negotiation results.

  13. Using Intellectual Property Rights Strategically

    DEFF Research Database (Denmark)

    Reitzig, Markus

    2003-01-01

    With the share of intellectual property among corporate value constantly rising, management's understanding of the strategic use of patents, trademarks, and copyrights becomes ever more crucial. The vast majority of articles on patent or trademark strategies, however, is written by and for lawyers...

  14. Strategic Groups and Banks’ Performance

    Directory of Open Access Journals (Sweden)

    Gregorz Halaj

    2009-06-01

    Full Text Available The theory of strategic groups predicts the existence of stable groups of companies that adopt similar business strategies. The theory also predicts that groups will differ in performance and in their reaction to external shocks. We use cluster analysis to identify strategic groups in the Polish banking sector. We find stable groups in the Polish banking sector constituted after the year 2000, following the major privatisation and ownership changes connected with the transition to the mostly privately owned banking sector in the late 90s. Using panel regression methods we show that the allocation of banks to groups is statistically significant in explaining the profitability of banks. Thus, breaking down the banks into strategic groups and allowing for the different reaction of the groups to external shocks helps in a more accurate explanation of profits of the banking sector as a whole. Therefore, a more precise ex ante assessment of the loss absorption capabilities of banks is possible, which is crucial for an analysis of banking sector stability. However, we did not find evidence of the usefulness of strategic groups in explaining the quality of bank portfolios as measured by irregular loans over total loans, which is a more direct way to assess risks to financial stability.

  15. Strategic Audit and Marketing Plan

    Science.gov (United States)

    Wright, Lianna S.

    2013-01-01

    The purpose of this audit was to revise the marketing plan for ADSum Professional Development School and give the owner a long-term vision of the school to operate competitively in the crowded field of for-profit schools. It is fairly simple to create a strategic plan but harder to implement and execute. Execution requires weeks and months of…

  16. Developing a strategic marketing plan.

    Science.gov (United States)

    Zipin, M L

    1989-06-01

    Strategic planning is essential to the survival of today's hospital. Whether the hospital is an academic medical center, a community hospital or some other type of organization, the key to success is a thorough market planning process. The four-phase process described here focuses on an academic medical center, but it is equally applicable to other types of hospitals.

  17. BARRIERS OF STRATEGIC ALLIANCES ORGANIZATION

    Directory of Open Access Journals (Sweden)

    Vladislav M. Sannikov

    2014-01-01

    Full Text Available General barriers to organizing different types of strategic alliances have been considered in the article. Several recommendations are given for overcoming them in the case of international alliances and in the case of work within one state. The article also identifies the goals and tasks of a single coordination center of an alliance for overcoming organizational barriers.

  18. Strategic and Everyday Innovative Narratives

    DEFF Research Database (Denmark)

    Reff Pedersen, Anne; Brehm Johansen, Mette

    2012-01-01

    in making sense of innovative ideas in everyday practice. An empirical case is offered to demonstrate how two types of innovation narratives emerge: strategic and everyday narratives through involvement of spokespersons and employees. These findings suggest that an advanced understanding of the roles...

  19. A Strategizing Perspective in Foresight

    DEFF Research Database (Denmark)

    The overall purpose of the paper is partly to contribute to the discussion on the theoretical perspectives behind the practice of foresight and partly to suggest a strategizing approach in foresight practice. More specifically we focus on foresight as a policy tool for sectoral innovation. Approach...

  20. Strategic development: a new focus.

    Science.gov (United States)

    Lefko, J J

    1989-03-01

    Despite popular convention, a health-care organization's chief executive officer should be the organization's chief strategist. Planners and marketers must begin to function as strategic development officers responsible for sending the right information to the right decision makers so that they can make the right decisions as quickly as possible.

  1. IT Strategic and Operational Controls

    CERN Document Server

    Kyriazoglou, J

    2010-01-01

    This book provides a comprehensive guide to implementing an integrated and flexible set of IT controls in a systematic way. It can help organisations formulate a complete culture for all areas that must be supervised and controlled, allowing them simultaneously to ensure a secure, high standard while striving to attain the strategic and operational goals of the company.

  2. Tax Rates as Strategic Substitutes

    NARCIS (Netherlands)

    R.A. de Mooij (Ruud); H. Vrijburg (Hendrik)

    2012-01-01

    This paper analytically derives the conditions under which the slope of the tax reaction function is negative in a classical tax competition model. If countries maximize welfare, we show that a negative slope (reflecting strategic substitutability) occurs under relatively mild conditions

  3. Entrepreneurial Spirit in Strategic Planning.

    Science.gov (United States)

    Riggs, Donald E.

    1987-01-01

    Presents a model which merges the concepts of entrepreneurship with those of strategic planning to create a library management system. Each step of the process, including needs assessment and policy formation, strategy choice and implementation, and evaluation, is described in detail. (CLB)

  4. Strategic Planning for School Success.

    Science.gov (United States)

    Herman, Jerry J.

    1993-01-01

    Strategic planners concerned with such matters as high-achieving students, high-performing teachers, broad-based community support, and a two-way involvement with the community must analyze the strengths, weaknesses, opportunities, and threats existing in the school's internal and external environment. A sample SWOT analysis is included. (MLH)

  5. Research of Virtual Accelerator Control System

    Institute of Scientific and Technical Information of China (English)

    DongJinmei; YuanYoujin; ZhengJianhua

    2003-01-01

    A Virtual Accelerator is a computer process that simulates the behavior of the beam in an accelerator and responds to the accelerator control program under development in the same way as an actual accelerator. To realize a Virtual Accelerator, the control system should provide the same program interface to the top-layer Application Control Program; this lets the 'Real Accelerator' and the 'Virtual Accelerator' use the same GUI. The control system should therefore have a layer that hides hardware details, so that the Application Control Program accesses control devices through logical names rather than coded hardware addresses. Without this layer, it is difficult to develop application programs that can access both 'Virtual' and 'Real' Accelerators through the same program interfaces. For this reason, we created the CSR Runtime Database, which allows application programs to access hardware devices and data on a simulation process in a unified way. A device is represented as a collection of records in the CSR Runtime Database. A control program on the host computer can access devices in the system only through the names of record fields, called channels.
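
    The channel-name layer this record describes can be sketched as a minimal registry. The class and channel names below are hypothetical (the abstract does not give the CSR record schema), and the caget/caput method names merely echo EPICS-style conventions; the point is that application code addresses only logical channel names, so a simulated device can stand in for real hardware.

```python
class VirtualMagnet:
    """Simulated device: the 'hardware' state lives entirely in software."""
    def __init__(self):
        self.current = 0.0
    def read(self):
        return self.current
    def write(self, value):
        self.current = value

class RuntimeDatabase:
    """Maps logical channel names to device record fields, hiding hardware details."""
    def __init__(self):
        self._channels = {}
    def register(self, channel, device):
        self._channels[channel] = device
    def caget(self, channel):
        # application code never sees a hardware address, only the channel name
        return self._channels[channel].read()
    def caput(self, channel, value):
        self._channels[channel].write(value)

db = RuntimeDatabase()
db.register("DIPOLE:D01:CURRENT", VirtualMagnet())
db.caput("DIPOLE:D01:CURRENT", 120.5)
```

    Swapping VirtualMagnet for a driver class with the same read/write interface leaves the application code unchanged, which is the property the record describes.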

  6. Use of hardware accelerators for ATLAS computing

    CERN Document Server

    Bauce, Matteo; Dankel, Maik; Howard, Jacob; Kama, Sami

    2015-01-01

    Modern HEP experiments produce tremendous amounts of data. These data are processed by in-house built software frameworks which have lifetimes longer than the detector itself. Such frameworks were traditionally based on serial code and relied on advances in CPU technologies, mainly clock frequency, to cope with increasing data volumes. With the advent of many-core architectures and GPGPUs this paradigm has to shift to parallel processing and has to include the use of co-processors. However, since the design of most existing frameworks is based on the assumption of frequency scaling and predate co-processors, parallelisation and integration of co-processors are not an easy task. The ATLAS experiment is an example of such a big experiment with a big software framework called Athena. In this talk we will present the studies on parallelisation and co-processor (GPGPU) use in data preparation and tracking for trigger and offline reconstruction as well as their integration into a multiple process based Athena frame...

  7. GPU-accelerated molecular mechanics computations.

    Science.gov (United States)

    Anthopoulos, Athanasios; Grimstead, Ian; Brancale, Andrea

    2013-10-05

    In this article, we describe an improved cell-list approach designed to match the Kepler architecture of general-purpose graphics processing units (GPGPUs). We explain how our approach improves load balancing for the above algorithm and how warp intrinsics are used to implement Newton's third law for the nonbonded force calculations. We also describe our approach to exclusions handling, together with a method to calculate bonded forces and 1-4 electrostatic scaling using a single CUDA kernel. Performance benchmarks are included in the last sections to show the linear scaling of our implementation using a step minimization method. In addition, multiple performance benchmarks demonstrate the contribution of the various optimizations used in our implementations. © 2013 Wiley Periodicals, Inc.
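
    As an illustration of the cell-list idea with Newton's-third-law pair pruning, here is a plain-Python/NumPy sketch. It is not the authors' CUDA implementation (no warp intrinsics, no exclusions or bonded terms), and it assumes a cubic periodic box with at least three cells per side; Lennard-Jones parameters are set to unity.

```python
import numpy as np

def cell_list_lj_forces(pos, box, rc):
    """Cell-list nonbonded Lennard-Jones forces using Newton's third law.

    pos: (N, 3) positions in a cubic periodic box of side `box`; rc: cutoff.
    Each pair is visited exactly once and the computed force is applied to
    both particles, halving the pair work -- the same saving the warp-level
    N3L trick delivers on the GPU.
    """
    n_cells = int(box // rc)
    assert n_cells >= 3, "sketch assumes at least 3 cells per box side"
    cell_size = box / n_cells
    cells = {}
    for i, p in enumerate(pos):
        key = tuple((p // cell_size).astype(int) % n_cells)
        cells.setdefault(key, []).append(i)
    forces = np.zeros_like(pos)
    for key, members in cells.items():
        for dx in (-1, 0, 1):          # scan this cell and its 26
            for dy in (-1, 0, 1):      # periodic neighbour cells
                for dz in (-1, 0, 1):
                    nkey = tuple((key[d] + o) % n_cells
                                 for d, o in enumerate((dx, dy, dz)))
                    for i in members:
                        for j in cells.get(nkey, ()):
                            if j <= i:
                                continue  # visit each pair once (N3L below)
                            r = pos[i] - pos[j]
                            r -= box * np.round(r / box)  # minimum image
                            d2 = r @ r
                            if d2 < rc * rc:
                                inv6 = (1.0 / d2) ** 3  # r^-6 with sigma=1
                                f = (24 * inv6 * (2 * inv6 - 1) / d2) * r
                                forces[i] += f
                                forces[j] -= f  # Newton's third law
    return forces

np.random.seed(0)
pos = np.random.rand(20, 3) * 9.0
forces = cell_list_lj_forces(pos, box=9.0, rc=3.0)
```

    Because any pair within the cutoff must lie in the same or an adjacent cell, the scan touches only 27 cells per particle instead of all N partners.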

  8. Use of hardware accelerators for ATLAS computing

    CERN Document Server

    Dankel, Maik; The ATLAS collaboration; Howard, Jacob; Bauce, Matteo; Boing, Rene

    2015-01-01

    Modern HEP experiments produce tremendous amounts of data. This data is processed by in-house built software frameworks which have lifetimes longer than the detector itself. Such frameworks were traditionally based on serial code and relied on advances in CPU technologies, mainly clock frequency, to cope with increasing data volumes. With the advent of many-core architectures and GPGPUs this paradigm has to shift to parallel processing and has to include the use of co-processors. However, since the design of most existing frameworks is based on the assumption of frequency scaling and predate co-processors, parallelisation and integration of co-processors are not an easy task. The ATLAS experiment is an example of such a big experiment with a big software framework called Athena. In this proceedings we will present the studies on parallelisation and co-processor (GPGPU) use in data preparation and tracking for trigger and offline reconstruction as well as their integration into a multiple process based...

  9. Parallel Computing Methods For Particle Accelerator Design

    CERN Document Server

    Popescu, Diana Andreea; Hersch, Roger

    We present methods for parallelizing the transport map construction for multi-core processors and for Graphics Processing Units (GPUs). We provide an efficient implementation of the transport map construction. We describe a method for multi-core processors using the OpenMP framework which brings performance improvement over the serial version of the map construction. We developed a novel and efficient algorithm for multivariate polynomial multiplication for GPUs and we implemented it using the CUDA framework. We show the benefits of using the multivariate polynomial multiplication algorithm for GPUs in the map composition operation for high orders. Finally, we present an algorithm for map composition for GPUs.
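
    The multivariate polynomial multiplication at the heart of the map composition can be sketched with exponent-tuple dictionaries. This is a serial illustration of the operation, not the authors' GPU algorithm; the representation and truncation order are assumptions.

```python
from collections import defaultdict

def poly_mul(p, q, max_order):
    """Multiply two truncated multivariate polynomials.

    Polynomials are dicts mapping exponent tuples (one entry per variable)
    to coefficients. Terms whose total degree exceeds max_order are
    discarded, mirroring the truncation used when composing transport maps.
    """
    out = defaultdict(float)
    for ea, ca in p.items():
        for eb, cb in q.items():
            e = tuple(a + b for a, b in zip(ea, eb))
            if sum(e) <= max_order:   # truncate high-order terms
                out[e] += ca * cb
    return dict(out)

# (1 + x) * (1 + y) in variables (x, y), truncated at total order 2
p = {(0, 0): 1.0, (1, 0): 1.0}
q = {(0, 0): 1.0, (0, 1): 1.0}
r = poly_mul(p, q, 2)
```

    Map composition repeats this multiply while substituting one map's polynomials into another, truncating at each step so the term count stays bounded.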

  10. Modeling the value of strategic actions in the superior colliculus

    Directory of Open Access Journals (Sweden)

    Dhushan Thevarajah

    2010-02-01

    Full Text Available In learning models of strategic game play, an agent constructs a valuation (action value) over possible future choices as a function of past actions and rewards. Choices are then stochastic functions of these action values. Our goal is to uncover a neural signal that correlates with the action value posited by behavioral learning models. We measured activity from neurons in the superior colliculus (SC), a midbrain region involved in planning saccadic eye movements, in monkeys while they performed two saccade tasks. In the strategic task, monkeys competed against a computer in a saccade version of the mixed-strategy game “matching pennies”. In the instructed task, stochastic saccades were elicited through explicit instruction rather than free choices. In both tasks, neuronal activity and behavior were shaped by past actions and rewards, with more recent events exerting a larger influence. Further, SC activity predicted upcoming choices during the strategic task and upcoming reaction times during the instructed task. Finally, we found that neuronal activity in both tasks correlated with an established learning model, the Experience Weighted Attraction model of action valuation (Ho, Camerer, and Chong, 2007). Collectively, our results provide evidence that action values hypothesized by learning models are represented in the motor planning regions of the brain in a manner that could be used to select strategic actions.
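
    The Experience Weighted Attraction update referenced in this record can be sketched as follows. The parameter names (phi, delta, rho) follow common presentations of EWA rather than the cited paper's exact notation, and the default values are illustrative assumptions.

```python
def ewa_update(attractions, experience, chosen, payoffs,
               phi=0.9, delta=0.5, rho=0.9):
    """One Experience Weighted Attraction (EWA) update step.

    attractions: current action values A_j; experience: scalar N(t-1);
    chosen: index of the action actually taken; payoffs[j]: the payoff
    action j would have earned this round. phi decays old attractions,
    rho decays experience, and delta weights foregone (unchosen) payoffs.
    """
    n_new = rho * experience + 1.0
    updated = []
    for j, a in enumerate(attractions):
        # chosen action gets full payoff weight; unchosen get delta
        weight = 1.0 if j == chosen else delta
        updated.append((phi * experience * a + weight * payoffs[j]) / n_new)
    return updated, n_new

# matching-pennies style example: two actions, action 0 chosen and rewarded
A, N = ewa_update([0.0, 0.0], 1.0, chosen=0, payoffs=[1.0, 0.0])
```

    Choice probabilities are then typically a stochastic (e.g. logit) function of the attractions, which is the action-value signal the study correlates with SC activity.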

  11. A Study on Strategic Planning and Procurement of Medicals in Uganda's Regional Referral Hospitals.

    Science.gov (United States)

    Masembe, Ishak Kamaradi

    2016-12-31

    This study was an analysis of the effect of strategic planning on the procurement of medicals in Uganda's regional referral hospitals (RRHs). Medicals were defined as essential medicines, medical devices and medical equipment. The Ministry of Health (MOH) has been carrying out strategic planning for the last 15 years via the Health Sector Strategic Plans. The assumption was that strategic planning would translate into strategic procurement and, consequently, availability of medicals in the RRHs. However, despite the existence of these plans, there have been many complaints about expired drugs and shortages in RRHs. For this purpose, a third variable was important because it served the role of mediation. A questionnaire was used to obtain information on the perceptions of 206 respondents, who were selected using simple random sampling. Eight key informant interviews were held, two in each RRH. Four focus group discussions were held, one for each RRH, and between five and eight staff took part as discussants for approximately three hours. The findings suggested that funding affected strategic planning by approximately 34%, while the relationship between funding and procurement was 35%. The direct relationship between strategic planning and procurement was 18%. However, when the total causal effect was computed, it turned out that strategic planning and the related variable of funding contributed 77% to the procurement of medicals under the current hierarchical model, in which the MOH is charged with developing strategic plans for the entire health sector. Since even with this contribution there were complaints, the study proposed a new model called CALF; according to a simulation, if this model were adopted by the MOH, strategic planning would contribute 87% to effectiveness in the procurement of medicals.

  12. Automatic generation of application specific FPGA multicore accelerators

    DEFF Research Database (Denmark)

    Hindborg, Andreas Erik; Schleuniger, Pascal; Jensen, Nicklas Bo

    2014-01-01

    High performance computing systems make increasing use of hardware accelerators to improve performance and power properties. For large high-performance FPGAs to be successfully integrated in such computing systems, methods to raise the abstraction level of FPGA programming are required... to identify optimal performance-energy trade-off points for a multicore-based FPGA accelerator.

  13. 2014 CERN Accelerator Schools: Plasma Wake Acceleration

    CERN Multimedia

    2014-01-01

    A specialised school on Plasma Wake Acceleration will be held at CERN, Switzerland from 23-29 November, 2014.   This course will be of interest to staff and students in accelerator laboratories, university departments and companies working in or having an interest in the field of new acceleration techniques. Following introductory lectures on plasma and laser physics, the course will cover the different components of a plasma wake accelerator and plasma beam systems. An overview of the experimental studies, diagnostic tools and state of the art wake acceleration facilities, both present and planned, will complement the theoretical part. Topical seminars and a visit of CERN will complete the programme. Further information can be found at: http://cas.web.cern.ch/cas/PlasmaWake2014/CERN-advert.html http://indico.cern.ch/event/285444/

  14. STRATEGIC MANAGEMENT OF A TERRITORIAL DISTRIBUTED COMPLEX

    Directory of Open Access Journals (Sweden)

    Vidovskiy L. A.

    2015-10-01

    Full Text Available The article is devoted to strategic management and the implementation of strategy. The management strategy is based on managing the strategic potential of the enterprise. The strategic potential of the company is generated only by those resources that can be changed as a result of strategic decisions. Analysis of the potential of the enterprise should cover almost all spheres of its activity: enterprise management, production, marketing, finance, and human resources. The article designs a system of strategic management using the example of a construction company within an information management system for territorially distributed building complexes, thus improving the competitiveness of the organization and providing timely, high-quality implementation of business plans.

  15. Strategic Context of Project Portfolio Management

    Directory of Open Access Journals (Sweden)

    Nedka Nikolova

    2016-06-01

    Full Text Available In 2014 Bulgaria entered its second programming period (2014-2020), which opened a new stage in the development of project management in our country. Project-oriented companies are entering a new stage in which, based on experience and increased design capacity, they will develop their potential and accelerate growth. This poses new challenges for science and business to identify strategic opportunities and to formulate project objectives, programs and portfolios of projects that will increase the competitive potential of companies and the economy as a whole. This article is an expression of the shared responsibility of science to advance the scientific front in solving methodologically difficult and practically new tasks derived from the need to increase the competitive potential of business through the project approach. The main objective of this study is, based on a systematization of the results of theoretical research and the development of a methodology of Project Portfolio Management, to explore the opportunities for its application in Bulgarian industrial companies.

  16. An Analysis of Strategic M&A among Foreign Cloud Computing Enterprises and Its Implications

    Institute of Scientific and Technical Information of China (English)

    赖斌慧; 林晓伟

    2014-01-01

    This article analyzes the current state of cloud computing industry development, identifies 149 openly reported M&A transactions among foreign cloud computing enterprises between 2009 and February 2014 through internet searches, analyzes the characteristics and motivations of these transactions, and draws out their implications for China. It is hoped to provide a theoretical and practical reference for the development of China's cloud computing industry.

  17. A Strategic-Equilibrium Based

    Directory of Open Access Journals (Sweden)

    Gabriel J. Turbay

    2011-03-01

    Full Text Available The strategic equilibrium of an N-person cooperative game with transferable utility is a system composed of a cover collection of subsets of N and a set of extended imputations attainable through such an equilibrium cover. The system describes a state of coalitional bargaining stability where every player has a bargaining alternative against any other player to support his corresponding equilibrium claim. Any coalition in the stable system may form and divide the characteristic value function of the coalition as prescribed by the equilibrium payoffs. If syndicates are allowed to form, a formed coalition may become a syndicate, using the equilibrium payoffs as disagreement values in bargaining for a part of the complementary coalition's incremental value to the grand coalition when formed. The emergent well-known constant-sum derived game in partition function form is described in terms of parameters that result from incumbent binding agreements. The strategic equilibrium corresponding to the derived game gives an equal value claim to all players. This surprising result is alternatively explained in terms of strategic-equilibrium-based possible outcomes by a sequence of bargaining stages: when the binding agreements are in the right sequential order, von Neumann and Morgenstern (vN-M) non-discriminatory solutions emerge. In these solutions a branch preferred by a sufficient number of players is identified: the weaker players syndicate against the stronger player. This condition is referred to as the stronger player paradox. A strategic alternative available to the stronger player to overcome the anticipated undesirable results is to voluntarily lower his bargaining equilibrium claim. In doing so, the original strategic equilibrium is modified and vN-M discriminatory solutions may occur, but a different stronger player may also emerge who will eventually have to lower his equilibrium claim. A sequence of such measures converges to the equal

  18. Improved plasma accelerator

    Science.gov (United States)

    Cheng, D. Y.

    1971-01-01

    Converging, coaxial accelerator electrode configuration operates in vacuum as plasma gun. Plasma forms by periodic injections of high pressure gas that is ionized by electrical discharges. Deflagration mode of discharge provides acceleration, and converging contours of plasma gun provide focusing.

  19. Accelerator Technology Division

    Science.gov (United States)

    1992-04-01

    In fiscal year (FY) 1991, the Accelerator Technology (AT) division continued fulfilling its mission to pursue accelerator science and technology and to develop new accelerator concepts for application to research, defense, energy, industry, and other areas of national interest. This report discusses the following programs: The Ground Test Accelerator Program; APLE Free-Electron Laser Program; Accelerator Transmutation of Waste; JAERI, OMEGA Project, and Intense Neutron Source for Materials Testing; Advanced Free-Electron Laser Initiative; Superconducting Super Collider; The High-Power Microwave Program; (Phi) Factory Collaboration; Neutral Particle Beam Power System Highlights; Accelerator Physics and Special Projects; Magnetic Optics and Beam Diagnostics; Accelerator Design and Engineering; Radio-Frequency Technology; Free-Electron Laser Technology; Accelerator Controls and Automation; Very High-Power Microwave Sources and Effects; and GTA Installation, Commissioning, and Operations.

  20. High Energy Particle Accelerators

    CERN Multimedia

    Audio Productions, Inc, New York

    1960-01-01

    Film about the different particle accelerators in the US. Nuclear research in the US has developed into a broad and well-balanced program. Tour of accelerator installations, accelerator development work now in progress, and a number of typical experiments with high energy particles. Brookhaven, Cosmotron. Univ. Calif. Berkeley, Bevatron. Anti-proton experiment. Negative k meson experiment. Bubble chambers. A section on an electron accelerator. Projection of new accelerators. Princeton/Penn. build proton synchrotron. Argonne National Lab. Brookhaven, PS construction. Cambridge Electron Accelerator; Harvard/MIT. SLAC studying a linear accelerator. Other research at Madison, Wisconsin, Fixed Field Alternate Gradient Focusing (FFAG). Oakridge, Tenn., cyclotron. Two-beam machine. Comments: Interesting overview of high energy particle accelerator installations in the US in these early years.

  1. Shortfall of Strategic Governance and Strategic Management in the Czech Republic

    Directory of Open Access Journals (Sweden)

    Ochrana František

    2016-12-01

    Full Text Available The article analyses the problems of strategic governance and strategic management of the Czechoslovak Government, as well as the Government of the Czech Republic, in the years 1989-2016. It seeks the causes and factors that have caused the low levels of strategic governance and strategic management at the level of the ministries of the Czech Republic. It examines the problem from a genetic and historical perspective, and from the organizational and human capacity to exercise strategic governance. The study is based on two pieces of empirical research within the ministries of the Czech Republic. It identifies the main causes of failure of strategic governance and strategic management at the level of the central government of the Czech Republic. These include, in particular, the persistent distrust of the ideas of strategic governance and strategic management held by the right-wing governments and the generally low capacity of governments of the Czech Republic to engage in strategic governance. The organizational structure of the central state administration lacks the strategic units that generate ideas for supporting strategic governance. The empirical research of the ministries of the Czech Republic also revealed that policy workers in Czech ministries dedicate a large proportion of their work time to operational and administrative activities at the expense of analytical and strategic activities. The changes require implementation of reforms within the public administration, which (among other things) will eliminate the existing causes and inhibiting factors regarding the lack of strategic governance in the Czech Republic.

  2. Accelerators, Colliders, and Snakes

    Science.gov (United States)

    Courant, Ernest D.

    2003-12-01

    The author traces his involvement in the evolution of particle accelerators over the past 50 years. He participated in building the first billion-volt accelerator, the Brookhaven Cosmotron, which led to the introduction of the "strong-focusing" method that has in turn led to the very large accelerators and colliders of the present day. The problems of acceleration of spin-polarized protons are also addressed, with discussions of depolarizing resonances and "Siberian snakes" as a technique for mitigating these resonances.

  3. Far field acceleration

    Energy Technology Data Exchange (ETDEWEB)

    Fernow, R.C.

    1995-07-01

    Far fields are propagating electromagnetic waves far from their source, boundary surfaces, and free charges. The general principles governing the acceleration of charged particles by far fields are reviewed. A survey of proposed field configurations is given. The two most important schemes, Inverse Cerenkov acceleration and Inverse free electron laser acceleration, are discussed in detail.

  4. Accelerators and Dinosaurs

    CERN Document Server

    Turner, Michael Stanley

    2003-01-01

    Using naturally occurring particles for research might have made accelerators extinct. But in fact, results from astrophysics have made accelerator physics even more important. Not only are accelerators used in hospitals, but they are also being used to understand nature's inner workings by searching for Higgs bosons, CP violation, neutrino mass and dark matter (2 pages)

  5. The CERN Accelerator School

    CERN Multimedia

    2016-01-01

    Introduction to accelerator physics The CERN Accelerator School: Introduction to Accelerator Physics, which should have taken place in Istanbul, Turkey, later this year, has now been relocated to Budapest, Hungary. Further details regarding the new hotel and dates will be made available as soon as possible on a new Indico site at the end of May.

  6. Acceleration: It's Elementary

    Science.gov (United States)

    Willis, Mariam

    2012-01-01

    Acceleration is one tool for providing high-ability students the opportunity to learn something new every day. Some people talk about acceleration as taking a student out of step. In actuality, what one is doing is putting a student in step with the right curriculum. Whole-grade acceleration, also called grade-skipping, usually happens between…

  7. GPU Accelerated Vector Median Filter

    Science.gov (United States)

    Aras, Rifat; Shen, Yuzhong

    2011-01-01

    Noise reduction is an important step for most image processing tasks. For three-channel color images, a widely used technique is the vector median filter, in which the color values of pixels are treated as 3-component vectors. Vector median filters are computationally expensive; for a window size of n x n, each of the n(sup 2) vectors has to be compared with the other n(sup 2) - 1 vectors in distances. General purpose computation on graphics processing units (GPUs) is the paradigm of utilizing high-performance many-core GPU architectures for computation tasks that are normally handled by CPUs. In this work, NVIDIA's Compute Unified Device Architecture (CUDA) paradigm is used to accelerate vector median filtering, which has to the best of our knowledge never been done before. The performance of the GPU accelerated vector median filter is compared to that of the CPU and MPI-based versions for different image and window sizes. Initial findings of the study showed a 100x improvement in performance of the vector median filter implementation on GPUs over CPU implementations, and further speed-up is expected after more extensive optimizations of the GPU algorithm.
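As a rough illustration of the filter itself (not the authors' CUDA implementation), the following is a minimal NumPy sketch; the function name is hypothetical, and the per-window pairwise-distance cost it exposes is exactly what motivates GPU offloading:

```python
import numpy as np

def vector_median_filter(img, w=3):
    """Replace each pixel's RGB vector with the window vector that
    minimizes the sum of L2 distances to all other window vectors."""
    pad = w // 2
    padded = np.pad(img, ((pad, pad), (pad, pad), (0, 0)), mode='edge')
    out = np.empty_like(img)
    H, W, _ = img.shape
    for y in range(H):
        for x in range(W):
            # w*w color vectors in the window, as a (w*w, 3) float array
            win = padded[y:y + w, x:x + w].reshape(-1, 3).astype(float)
            # each vector compared against all others: (w*w, w*w) distances
            d = np.linalg.norm(win[:, None, :] - win[None, :, :], axis=2).sum(axis=1)
            out[y, x] = win[np.argmin(d)]
    return out
```

For example, a single outlier pixel surrounded by uniform neighbors is replaced by the neighbors' value, since the outlier has the largest total distance in its window.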

  8. Human dimension of strategic partnerships

    Directory of Open Access Journals (Sweden)

    Petković Mirjana M.

    2004-01-01

    Full Text Available This paper aims to point to the widespread practice of neglecting behavioral aspects of different forms of fusions and integrations of enterprises that have emerged in the process of privatization through strategic partnerships with foreign companies among Serbian enterprises. The initial hypothesis in this paper is that the process of privatization, restructuring and transformation in Serbian enterprises cannot be completely successful and equally advantageous for all the subjects involved if there is no concern for human dimension of these processes. Without this concern there is a possibility for behavioral problems to arise, and the only way to resolve them is through post festum respecting and introducing elements that should never have been neglected in the first place. This paper refers to the phenomenon of collision of cultures and the ways of resolving it while forming strategic partnerships.

  9. Arms Control and Strategic Stability

    Institute of Scientific and Technical Information of China (English)

    Hu; Yumin

    2014-01-01

    This essay intends to offer a comment on concepts, trends and attitudes concerning arms control and strategic stability with reference to the current international security situation. It also offers observations from two different perspectives on strategic stability: one proceeds from the concept of universal security and aims to prevent conflicts and instability from disrupting the regional and international security environment on which nation states depend so much for their peaceful development; the other starts from maintaining global leadership by a superpower and aims to contain any challenge that sways or is likely to sway its dominating status. If China and the United States commit themselves to the undertaking of a new type of major-power relationship that stresses win-win cooperation, they will be able to contribute greatly to a stable international security architecture that is good for world peaceful development.

  10. Corporate Foresight and Strategic Decisions

    DEFF Research Database (Denmark)

    Gomez Portaleoni, Claudio; Marinova, Svetla Trifonova; Ul-Haq, Rehan;

    The investigation of the future of an organization has always captivated the attention of academics and business managers. Presently, the aspiration to entrench future-relevant insights into management practices is a must. Companies that have made attempts to use corporate foresight have generally...... dealt successfully with internal information sharing processes that in most cases have prepared them for the challenges of the future. Corporate Foresights and Strategic Decisions investigates the relationships between corporate foresight and management decision-making processes in organizations...... accountability and integrity of the participating departments as well as by the apparent nature of environmental explosiveness. This book provides clear confirmations showing that the impacts of corporate foresight on strategic decisions are critically affected by the evaluative and analytical verdicts...

  11. Strategic Sourcing in the Army

    Science.gov (United States)

    2013-09-01

    sector:  Total Cost of Ownership tools are applied to understand the life cycle costs of a product or service  Supplier Scorecards to apply a...sourcing efforts, as well as prioritizing new initiatives. In addition to cost and performance goals, any strategic sourcing plan must be balanced with...article/97687/ 39 INITIAL DISTRIBUTION LIST 1. Defense Technical Information Center Ft. Belvoir, Virginia 2. Dudley Knox Library Naval Postgraduate School Monterey, California

  12. Strategic Complexity and Global Expansion

    DEFF Research Database (Denmark)

    Oladottir, Asta Dis; Hobdari, Bersant; Papanastassiou, Marina

    2012-01-01

    The purpose of this paper is to analyse the determinants of global expansion strategies of newcomer Multinational Corporations (MNCs) by focusing on Iceland, Israel and Ireland. We argue that newcomer MNCs from small open economies pursue complex global expansion strategies (CGES). We distinguish....... The empirical evidence suggests that newcomer MNCs move away from simplistic dualities in the formulation of their strategic choices towards more complex options as a means of maintaining and enhancing their global competitiveness....

  13. Strategic Sealift Supporting Army Deployments

    Science.gov (United States)

    2016-06-10

    Managing the timeliness of these options, as well as the suitability...force for overseas missions. Building on this is the assumption that strategic sealift options will continue to be the high-volume leg of the mobility

  14. 2011 Army Strategic Planning Guidance

    Science.gov (United States)

    2011-03-25

    ...(TESI) of 22,000 Soldiers, the Army’s total force by the end of the mid-term period is programmed to be 520K (AC). We will achieve a more...dwell ratios, extending TESI authority to adequately man deploying units and sustain the All-Volunteer Force, right-sizing the generating force, and...

  15. Strategic Analysis for Patch Ltd.

    OpenAIRE

    Louis, Owen

    2012-01-01

    This paper is a strategic analysis for the start-up Patch Ltd. Patch has developed innovative products for growing produce in homes and will compete in the consumer containergrowing industry. The industry and the company are introduced along with urban agriculture trends. The industry is analysed using Porter’s 5 forces analysis, and a competitive analysis compares Patch to its competitors in key success factors found in the 5 forces analysis. A strategy is developed using opportunities and t...

  16. Checkpointing for a hybrid computing node

    Energy Technology Data Exchange (ETDEWEB)

    Cher, Chen-Yong

    2016-03-08

    According to an aspect, a method for checkpointing in a hybrid computing node includes executing a task in a processing accelerator of the hybrid computing node. A checkpoint is created in a local memory of the processing accelerator. The checkpoint includes state data to restart execution of the task in the processing accelerator upon a restart operation. Execution of the task is resumed in the processing accelerator after creating the checkpoint. The state data of the checkpoint are transferred from the processing accelerator to a main processor of the hybrid computing node while the processing accelerator is executing the task.
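The checkpoint flow described in this record (snapshot into accelerator-local memory, immediate resume, then transfer of the state data to the main processor) can be sketched roughly as follows. All names are hypothetical, and the transfer is done synchronously here for clarity, whereas the described scheme overlaps it with continued task execution:

```python
import copy

def run_with_checkpoints(steps, interval=10):
    """Sketch of the hybrid-node checkpoint flow: the task runs on the
    accelerator, periodically snapshots its state locally, resumes, and
    the snapshot is drained to the main processor for use on restart."""
    accel_state = {"iteration": 0}   # task state in accelerator memory
    accel_local_ckpt = None          # checkpoint in accelerator-local memory
    host_ckpt = None                 # main-processor copy used for restart
    for i in range(1, steps + 1):
        accel_state["iteration"] = i            # task executes a step
        if i % interval == 0:
            # step 1: create checkpoint in accelerator-local memory
            accel_local_ckpt = copy.deepcopy(accel_state)
            # step 2: execution resumes immediately (next loop iteration);
            # step 3: transfer to the host -- overlapped with execution in
            # the real scheme, done inline here for simplicity
            host_ckpt = copy.deepcopy(accel_local_ckpt)
    return host_ckpt

# on a restart operation, the task would be rehydrated from host_ckpt
```

The design point is that the local snapshot decouples the (fast) checkpoint creation from the (slow) transfer to the host, so the accelerator is stalled only for the snapshot.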

  17. The plasma physics of shock acceleration

    Science.gov (United States)

    Jones, Frank C.; Ellison, Donald C.

    1991-01-01

    The history and theory of shock acceleration is reviewed, paying particular attention to theories of parallel shocks which include the backreaction of accelerated particles on the shock structure. The work that computer simulations, both plasma and Monte Carlo, are playing in revealing how thermal ions interact with shocks and how particle acceleration appears to be an inevitable and necessary part of the basic plasma physics that governs collisionless shocks is discussed. Some of the outstanding problems that still confront theorists and observers in this field are described.

  18. Plasma is a strategic resource.

    Science.gov (United States)

    Strengers, Paul F W; Klein, Harvey G

    2016-12-01

    Plasma-derived medicinal products (PDMPs) such as immunoglobulins and clotting factors are listed by the World Health Organization as essential medicines. These and other PDMPs are crucial for the prophylaxis and treatment of patients with bleeding disorders, immune deficiencies, autoimmune and inflammatory diseases, and a variety of congenital deficiency disorders. While changes in clinical practice in developed countries have reduced the need for red blood cell transfusions, thereby significantly reducing the collection volumes of whole blood and recovered plasma suitable for fractionation, the need for PDMPs worldwide continues to increase. The majority of the plasma supply for the manufacture of PDMPs is met by the US commercial plasma industry. However, the geographic imbalance in the collection of plasma raises concerns that local disruptions of plasma supplies could result in regional and global shortages of essential PDMPs. Plasma, which fits the definition of a strategic resource, that is, "an economically important raw material which is subject to a higher risk of supply interruption," should be considered a strategic resource comparable to energy and drinking water. Plasma collections should be increased outside the United States, including in low- and middle-income countries. Capacity building in these countries is essential to strengthening quality plasma collection. This will require changes in national and regional policies. We advocate the restoration of an equitable balance of the international plasma supply to reduce the risk of supply shortages worldwide. Strategic independence of plasma should be endorsed on a global level.

  19. Accelerator R&D: Research for Science - Science for Society

    Energy Technology Data Exchange (ETDEWEB)

    The HEP Accelerator R&D Task Force: N.R. Holtkamp, S. Biedron, S.V. Milton, L. Boeh, J.E. Clayton, G. Zdasiuk, S.A. Gourlay, M.S. Zisman, R.W. Hamm, S. Henderson, G.H. Hoffstaetter, L. Merminga, S. Ozaki, F.C. Pilat, M. White

    2012-07-01

    In September 2011 the US Senate Appropriations Committee requested a ten-year strategic plan from the Department of Energy (DOE) that would describe how accelerator R&D today could advance applications directly relevant to society. Based on the 2009 workshop 'Accelerators for America's Future', an assessment was made of how accelerator technology developed by the nation's laboratories and universities could directly translate into a competitive strength for industrial partners and a variety of government agencies in the research, defense and national security sectors. The Office of High Energy Physics, traditionally the steward for advanced accelerator R&D within DOE, commissioned a task force under its auspices to generate and compile ideas on how best to implement strategies that would help fulfill the needs of industry and other agencies, while maintaining focus on its core mission of fundamental science investigation.

  20. The Accelerator Reliability Forum

    CERN Document Server

    Lüdeke, Andreas; Giachino, R

    2014-01-01

    High reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with high reliability. In order to optimize the overall reliability of an accelerator, one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum [1]. This contribution will describe the forum and advertise its usage in the community.

  1. Hacking control systems, switching… accelerators off?

    CERN Multimedia

    Computer Security Team

    2013-01-01

    In response to our article in the last Bulletin, we received the following comment: “Wasn’t Stuxnet designed to stop the Iranian nuclear programme? Why then all this noise with regard to CERN accelerators? Don’t you realize that ‘computer security’ is not the raison d'être of CERN?”. Thank you for this golden opportunity to delve into this issue.   Given the sophistication of Stuxnet, it might have been hard to detect such a targeted attack against CERN, if at all. But this is not the point. There are much simpler risks for our accelerator complex and infrastructure. And, while “‘computer security’ is [indeed] not the raison d' être”, it is our collective responsibility to keep this risk at bay.   Examples? Just think of a simple computer virus infecting Windows-based control PCs connected to the accelerator network (the Technical Network, &ld...

  2. GPUs as Storage System Accelerators

    CERN Document Server

    Al-Kiswany, Samer; Ripeanu, Matei

    2012-01-01

    Massively multicore processors, such as Graphics Processing Units (GPUs), provide, at a comparable price, a one order of magnitude higher peak performance than traditional CPUs. This drop in the cost of computation, as any order-of-magnitude drop in the cost per unit of performance for a class of system components, triggers the opportunity to redesign systems and to explore new ways to engineer them to recalibrate the cost-to-performance relation. This project explores the feasibility of harnessing GPUs' computational power to improve the performance, reliability, or security of distributed storage systems. In this context, we present the design of a storage system prototype that uses GPU offloading to accelerate a number of computationally intensive primitives based on hashing, and introduce techniques to efficiently leverage the processing power of GPUs. We evaluate the performance of this prototype under two configurations: as a content addressable storage system that facilitates online similarity detectio...
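The hashing-based content-addressable primitive mentioned in this record can be sketched as follows. This is a plain CPU sketch with hypothetical names, using `hashlib`; the prototype described in the record instead offloads the hashing computation to the GPU:

```python
import hashlib

class ContentAddressableStore:
    """Minimal content-addressable store: chunks are keyed by their hash,
    so identical data is detected and stored only once (online similarity
    detection). The hash computation is the offloadable primitive."""

    def __init__(self):
        self.chunks = {}

    def put(self, data: bytes) -> str:
        key = hashlib.sha256(data).hexdigest()  # computationally intensive step
        self.chunks.setdefault(key, data)       # duplicates map to the same key
        return key

    def get(self, key: str) -> bytes:
        return self.chunks[key]
```

Because the key is derived from the content alone, writing the same chunk twice costs one hash computation but no additional storage, which is why batching many such hashes on a GPU pays off.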

  3. Industrial Application of Accelerators

    CERN Document Server

    CERN. Geneva

    2017-01-01

    At CERN, we are very familiar with large, high energy particle accelerators. However, in the world outside CERN, there are more than 35000 accelerators which are used for applications ranging from treating cancer, through making better electronics to removing harmful micro-organisms from food and water. These are responsible for around $0.5T of commerce each year. Almost all are less than 20 MeV and most use accelerator types that are somewhat different from what is at CERN. These lectures will describe some of the most common applications, some of the newer applications in development and the accelerator technology used for them. It will also show examples of where technology developed for particle physics is now being studied for these applications. Rob Edgecock is a Professor of Accelerator Science, with a particular interest in the medical applications of accelerators. He works jointly for the STFC Rutherford Appleton Laboratory and the International Institute for Accelerator Applications at the Univer...

  5. ISLAM PROJECT: Interface between the signals from various experiments of a Van de Graaff accelerator and a PDP 11/44 computer; PROYECTO ISLAM: Interfase para las señales de diversas experiencias de un acelerador de Van de Graaff y un ordenador PDP 11/44

    Energy Technology Data Exchange (ETDEWEB)

    Martinez Piquer, T. A.; Yuste Santos, C.

    1986-07-01

    This paper describes an interface between the signals from an in-beam experiment at a Van de Graaff accelerator and a PDP 11/44 computer. The information corresponding to one spectrum is taken from a digital voltammeter and is processed by means of equipment controlled by an M6809 microprocessor. The software package has been developed in assembly language and has a size of 1/2 K. (Author) 12 refs.

  6. STRATEGIC ALLIANCES – VIABLE ALTERNATIVE TO CREATE A COMPETITIVE ADVANTAGE IN A GLOBAL MARKET

    Directory of Open Access Journals (Sweden)

    Irina NICOLAU

    2010-12-01

    Full Text Available In past years, in the light of the economic turbulence all around the world, one of the most important ways to assure a competitive advantage has been creating a strategic alliance. Such collaborative ventures between firms developed as a response to changes in the world economy such as increased competition, higher costs of developing new products, accelerated technological change and, perhaps most important, the recent world economic crisis. Being part of a strategic alliance creates competitive advantage for companies by establishing their presence worldwide, by building up operating experience in overseas markets and by gaining access to national markets that were inaccessible before. At the same time, a strategic alliance means management commitment, special skills and forward planning for each company that takes part in the alliance.

  7. Architectural requirements for the Red Storm computing system.

    Energy Technology Data Exchange (ETDEWEB)

    Camp, William J.; Tomkins, James Lee

    2003-10-01

    This report is based on the Statement of Work (SOW) describing the various requirements for delivering a new supercomputer system to Sandia National Laboratories (Sandia) as part of the Department of Energy's (DOE) Accelerated Strategic Computing Initiative (ASCI) program. This system is named Red Storm and will be a distributed memory, massively parallel processor (MPP) machine built primarily out of commodity parts. The requirements presented here distill extensive architectural and design experience accumulated over a decade and a half of research, development and production operation of similar machines at Sandia. Red Storm will have an unusually high bandwidth, low latency interconnect, specially designed hardware and software reliability features, a lightweight kernel compute node operating system and the ability to rapidly switch major sections of the machine between classified and unclassified computing environments. Particular attention has been paid to architectural balance in the design of Red Storm, and it is therefore expected to achieve an atypically high fraction of its peak speed of 41 TeraOPS on real scientific computing applications. In addition, Red Storm is designed to be upgradeable to many times this initial peak capability while still retaining appropriate balance in key design dimensions. Installation of the Red Storm computer system at Sandia's New Mexico site is planned for 2004, and it is expected that the system will be operated for a minimum of five years following installation.

  8. Platform computing powers enterprise grid

    CERN Multimedia

    2002-01-01

    Platform Computing, today announced that the Stanford Linear Accelerator Center is using Platform LSF 5, to carry out groundbreaking research into the origins of the universe. Platform LSF 5 will deliver the mammoth computing power that SLAC's Linear Accelerator needs to process the data associated with intense high-energy physics research (1 page).

  9. Securing a biomedical communications future: thinking strategically.

    Science.gov (United States)

    Stein, D

    1985-11-01

    Ensuring continued growth and viability of the biomedical communications function has become a critical task of the biomedical communications director. Thinking strategically is a cognitive process which assists a director in visualizing programs and tactics that meet clients' needs, creates competitive advantages for the biomedical communications unit, and builds on existing unit strengths. Thinking strategically can be divided into five phases: strategic vision, strategy development, strategic plan implementation, strategic plan dissemination, and strategic plan evaluation. Each phase leads the biomedical communications director through a process designed to increase the effectiveness of the biomedical unit and to meet the challenges posed by an environment characterized by diminished financial, material, and human resources, as well as to respond to threats and opportunities posed by increased competition in the biomedical communications product and marketplace.

  10. SWIM: FUTURISTIC FRAMEWORK FOR STRATEGIC MANAGEMENT PROCESS

    Directory of Open Access Journals (Sweden)

    Rajendran Muthuveloo

    2014-01-01

    Full Text Available The field of strategic management is undergoing significant change due to constant shifts in the business environment: the emergence of new economic powers, conflicts within and among countries, environmental crises and social crises. This study explicates how including strategic agility, ethical issues and legal issues in the strategic management process will help organizations anticipate and manage changes that arise, attain business sustainability and meet the organization's vision. It begins by examining and analyzing the gaps in the current strategic management process. The paper concludes with a more comprehensive strategic management process that incorporates strategic agility, ethical issues and legal issues.

  11. Preparing design students for strategic design

    DEFF Research Database (Denmark)

    Rasmussen, Jørgen; Schiønning Mortensen, Bo; Geert Jensen, Birgitte

    2012-01-01

    This paper deals with how the visual approach from a design process can help inform companies about future opportunities at a strategic level. The paper follows an innovation project where design students worked with five companies at a 1-day workshop and with one company through a 2-week project...... can be used to facilitate discussions for companies facing strategic challenges. It also underlines the importance of rethinking design skills and communication when moving into strategic processes....

  12. Strategic Entrepreneurship: A Review and Research Agenda

    DEFF Research Database (Denmark)

    Lassen, Astrid Heidemann; Timenes Laugen, Bjørge; Middel, Rick

    2009-01-01

    This paper argues that in order to move the emerging construct of strategic entrepreneurship beyond a theoretically appealing one, we need to improve our theoretical and analytical frameworks in several key areas. Our analysis firstly discusses several challenges for the strategic entrepreneurship...... of research foci are proposed, which will enhance the understanding of the integration of advantage-seeking behaviour and opportunity-seeking behaviour which composes strategic entrepreneurship...

  13. Debunking the Myth of the Strategic Corporal

    Science.gov (United States)

    2015-04-13

    in the widely read essay "The Strategic Corporal: Leadership in the Three Block War." General Krulak indicated that, given the modern battlefield...

  14. NERSC Strategic Implementation Plan 2002-2006

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, William; Bethel, Wes; Craw, James; Draney, Brent; Fortney, William; Gorda, Brend; Harris, William; Meyer, Nancy; Ng, Esmond; Verdier, Francesca; Walter, Howard; Welcome, Tammy

    2002-09-01

    This strategic proposal presents NERSC's vision for its activities and new directions over the next five years. NERSC's continuing commitment to providing high-end systems and comprehensive scientific support for its users will be enhanced, and these activities will be augmented by two new strategic thrusts: support for Scientific Challenge Teams and deployment of a Unified Science Environment. The proposal is in two volumes, the Strategic Plan and the Implementation Plan.

  15. Particle-accelerator decommissioning

    Energy Technology Data Exchange (ETDEWEB)

    Opelka, J.H.; Mundis, R.L.; Marmer, G.J.; Peterson, J.M.; Siskind, B.; Kikta, M.J.

    1979-12-01

    Generic considerations involved in decommissioning particle accelerators are examined. There are presently several hundred accelerators operating in the United States that can produce material containing nonnegligible residual radioactivity. Residual radioactivity after final shutdown is generally short-lived induced activity and is localized in hot spots around the beam line. The decommissioning options addressed are mothballing, entombment, dismantlement with interim storage, and dismantlement with disposal. The recycle of components or entire accelerators following dismantlement is a definite possibility and has occurred in the past. Accelerator components can be recycled either immediately at accelerator shutdown or following a period of storage, depending on the nature of induced activation. Considerations of cost, radioactive waste, and radiological health are presented for four prototypic accelerators. Prototypes considered range from small accelerators having minimal amounts of radioactive material to a very large accelerator having massive components containing nonnegligible amounts of induced activation. Archival information on past decommissionings is presented, and recommendations concerning regulations and accelerator design that will aid in the decommissioning of an accelerator are given.

  16. Parallel computing and application of the Element-Free Galerkin method with GPU acceleration

    Institute of Scientific and Technical Information of China (English)

    龚曙光; 刘奇良; 卢海山; 周志勇; 张佳

    2015-01-01

    To reduce the computing time of the Element-Free Galerkin (EFG) method, a GPU-accelerated parallel algorithm is proposed in which essential boundary conditions are imposed by the penalty function method, the stiffness matrix is assembled by a node pair-wise approach, and the sparse linear system, stored in CSR format, is solved by the conjugate gradient method. A unified format for the stiffness matrix and the penalty stiffness matrix is derived, and a flow chart of the parallel algorithm is given. GPU code was written on the CUDA platform and tested with numerical examples on an NVIDIA GeForce GTX 660, and the factors affecting the speedup ratio are discussed. The results verify the feasibility of the proposed algorithm: with computational accuracy maintained, a maximum speedup of 17 times was achieved, and the solution of the linear system is the dominant factor in the overall speedup.
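    The solver stage described in this abstract — a sparse, symmetric positive-definite stiffness system stored in CSR format and solved by the conjugate gradient method — can be sketched in a few lines on the CPU. This is a minimal illustration of the CSR matrix-vector product and the CG iteration, not the paper's CUDA implementation; the sample matrix values are invented for demonstration.

    ```python
    import numpy as np

    def csr_matvec(data, indices, indptr, x):
        """y = A @ x for a sparse matrix A in CSR storage (data, indices, indptr)."""
        y = np.zeros(len(indptr) - 1)
        for row in range(len(y)):
            start, end = indptr[row], indptr[row + 1]
            y[row] = np.dot(data[start:end], x[indices[start:end]])
        return y

    def conjugate_gradient(data, indices, indptr, b, tol=1e-10, max_iter=1000):
        """Solve A x = b for symmetric positive-definite A in CSR storage."""
        x = np.zeros_like(b)
        r = b - csr_matvec(data, indices, indptr, x)  # initial residual
        p = r.copy()                                  # initial search direction
        rs_old = r @ r
        for _ in range(max_iter):
            Ap = csr_matvec(data, indices, indptr, p)
            alpha = rs_old / (p @ Ap)                 # step length along p
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs_old) * p             # conjugate direction update
            rs_old = rs_new
        return x

    # Example: the 3x3 SPD matrix [[4,1,0],[1,3,0],[0,0,2]] in CSR form.
    data = np.array([4.0, 1.0, 1.0, 3.0, 2.0])
    indices = np.array([0, 1, 0, 1, 2])
    indptr = np.array([0, 2, 4, 5])
    b = np.array([1.0, 2.0, 3.0])
    x = conjugate_gradient(data, indices, indptr, b)
    ```

    The per-row independence of `csr_matvec` is what makes the operation attractive for GPU parallelization: each CUDA thread can compute one row's dot product, which is the dominant cost inside each CG iteration.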

  17. The ASCI Network for SC '98: Dense Wave Division Multiplexing for Distributed and Distance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Adams, R.L.; Butman, W.; Martinez, L.G.; Pratt, T.J.; Vahle, M.O.

    1999-06-01

    This document highlights the DISCOM Distance Computing and Communication team's activities at the 1998 Supercomputing conference in Orlando, Florida. This conference is sponsored by the IEEE and ACM. Sandia National Laboratories, Lawrence Livermore National Laboratory, and Los Alamos National Laboratory have participated in this conference for ten years. For the last three years, the three laboratories have shared a joint booth at the conference under the DOE's Accelerated Strategic Computing Initiative (ASCI). The DISCOM communication team uses the forum to demonstrate and focus communications and networking developments. At SC '98, DISCOM demonstrated the capabilities of Dense Wave Division Multiplexing. We exhibited an OC48 ATM encryptor. We also coordinated the other networking activities within the booth. This paper documents those accomplishments, discusses the details of their implementation, and describes how these demonstrations support overall strategies in ATM networking.

  18. Improving Strategic Planning for Federal Public Health Agencies Through Collaborative Strategic Management

    Science.gov (United States)

    2013-03-01

    preparedness have created an opportunity to rethink the collaborative approach to strategic planning. This thesis considers the role that collaborative ... strategic management and collaborative frameworks may play in strengthening strategic planning at the federal level through a policy options analysis.

  19. Strategic Planning with Critical Success Factors and Future Scenarios: An Integrated Strategic Planning Framework

    Science.gov (United States)

    2010-11-01

    This report explores the value of enhancing typical strategic planning techniques with the critical success factor (CSF) method and scenario planning. ... It synthesizes documented theory and research in strategic planning, CSFs, and future scenarios. It proposes an enhanced, integrated information ... framework for strategic planning that can help organizations understand the broad range of interrelated elements that influence strategy development.

  20. Accelerator and radiation physics

    CERN Document Server

    Basu, Samita; Nandy, Maitreyee

    2013-01-01

    "Accelerator and radiation physics" encompasses radiation shielding design and strategies for hadron therapy accelerators, neutron facilities, and laser-based accelerators. A fascinating article describes detailed transport theory and its application to radiation transport. Detailed information on the planning and design of a very high energy proton accelerator can be obtained from the article on the radiological safety of J-PARC. Besides safety for proton accelerators, the book provides information on radiological safety issues for electron synchrotrons and on prevention and preparedness for radiological emergencies. Different methods for neutron dosimetry, including LET-based monitoring, time-of-flight spectrometry, and track detectors, are documented along with newly measured experimental data on radiation interaction with dyes, polymers, bones, and other materials. Design of a deuteron accelerator, shielding in beam line hutches in synchrotrons and a 14 MeV neutron generator, various radiation detection methods, their characteriza...