WorldWideScience

Sample records for accelerated strategic computing

  1. Delivering Insight The History of the Accelerated Strategic Computing Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Larzelere II, A R

    2007-01-03

    The history of the Accelerated Strategic Computing Initiative (ASCI) tells of the development of computational simulation into a third fundamental piece of the scientific method, on a par with theory and experiment. ASCI did not invent the idea, nor was it alone in bringing it to fruition. But ASCI provided the wherewithal - hardware, software, environment, funding, and, most of all, the urgency - that made it happen. On October 1, 2005, the Initiative completed its tenth year of funding. The advances made by ASCI over its first decade are truly incredible. Lawrence Livermore, Los Alamos, and Sandia National Laboratories, along with leadership provided by the Department of Energy's Defense Programs Headquarters, fundamentally changed computational simulation and how it is used to enable scientific insight. To do this, astounding advances were made in simulation applications, computing platforms, and user environments. ASCI dramatically changed existing - and forged new - relationships, both among the Laboratories and with outside partners. By its tenth anniversary, despite daunting challenges, ASCI had accomplished all of the major goals set at its beginning. The history of ASCI is about the vision, leadership, endurance, and partnerships that made these advances possible.

  2. The DOE Accelerated Strategic Computing Initiative: Challenges and opportunities for predictive materials simulation capabilities

    Science.gov (United States)

    Mailhiot, Christian

    1998-05-01

    In response to the unprecedented national security challenges emerging from the end of nuclear testing, the Defense Programs of the Department of Energy has developed a long-term strategic plan based on a vigorous Science-Based Stockpile Stewardship (SBSS) program. The main objective of the SBSS program is to ensure confidence in the performance, safety, and reliability of the stockpile on the basis of a fundamental science-based approach. A central element of this approach is the development of predictive, ‘full-physics’, full-scale computer simulation tools. As a critical component of the SBSS program, the Accelerated Strategic Computing Initiative (ASCI) was established to provide the required advances in computer platforms and to enable predictive, physics-based simulation capabilities. In order to achieve the ASCI goals, fundamental problems in the fields of computer and physical sciences of great significance to the entire scientific community must be successfully solved. Foremost among the key elements needed to develop predictive simulation capabilities, the development of improved physics-based materials models is a cornerstone. We indicate some of the materials theory, modeling, and simulation challenges and illustrate how the ASCI program will enable both the hardware and the software tools necessary to advance the state-of-the-art in the field of computational condensed matter and materials physics.

  3. Accelerating Strategic Change Through Action Learning

    DEFF Research Database (Denmark)

    Younger, Jon; Sørensen, René; Cleemann, Christine

    2013-01-01

    Purpose – The purpose of this paper is to describe how a leading global company used action-learning based leadership development to accelerate strategic culture change. Design/methodology/approach – It describes the need for change, and the methodology and approach by which the initiative, Impact...

  4. Applications of the Strategic Defense Initiative's compact accelerators

    Science.gov (United States)

    Montanarelli, Nick; Lynch, Ted

    1991-12-01

    The Strategic Defense Initiative's (SDI) investment in particle accelerator technology for its directed energy weapons program has produced breakthroughs in the size and power of new accelerators. These accelerators, in turn, have produced spinoffs in several areas: the radio frequency quadrupole linear accelerator (RFQ linac) was recently incorporated into the design of a cancer therapy unit at the Loma Linda University Medical Center, an SDI-sponsored compact induction linear accelerator may replace Cobalt-60 radiation and hazardous ethylene-oxide as a method for sterilizing medical products, and other SDIO-funded accelerators may be used to produce the radioactive isotopes oxygen-15, nitrogen-13, carbon-11, and fluorine-18 for positron emission tomography (PET). Other applications of these accelerators include bomb detection, non-destructive inspection, decomposing toxic substances in contaminated ground water, and eliminating nuclear waste.

  6. Computational Biology: A Strategic Initiative LDRD

    Energy Technology Data Exchange (ETDEWEB)

    Barksy, D; Colvin, M

    2002-02-07

    The goal of this Strategic Initiative LDRD project was to establish at LLNL a new core capability in computational biology, combining laboratory strengths in high performance computing, molecular biology, and computational chemistry and physics. As described in this report, this project has been very successful in achieving this goal. This success is demonstrated by the large number of refereed publications, invited talks, and follow-on research grants that have resulted from this project. Additionally, this project has helped build connections to internal and external collaborators and funding agencies that will be critical to the long-term vitality of LLNL programs in computational biology. Most importantly, this project has helped establish on-going research groups in the Biology and Biotechnology Research Program, the Physics and Applied Technology Directorate, and the Computation Directorate. These groups include three laboratory staff members originally hired as post-doctoral researchers for this strategic initiative.

  7. Accelerating Clean Energy Commercialization. A Strategic Partnership Approach

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Richard [National Renewable Energy Lab. (NREL), Golden, CO (United States); Pless, Jacquelyn [Joint Institute for Strategic Energy Analysis, Golden, CO (United States); Arent, Douglas J. [Joint Institute for Strategic Energy Analysis, Golden, CO (United States); Locklin, Ken [Impax Asset Management Group (United Kingdom)

    2016-04-01

    Technology development in the clean energy and broader clean tech space has proven to be challenging. Long-standing methods for advancing clean energy technologies from science to commercialization are best known for relatively slow, linear progression through research and development, demonstration, and deployment (RDD&D); and characterized by well-known valleys of death for financing. Investment returns expected by traditional venture capital investors have been difficult to achieve, particularly for hardware-centric innovations, and companies that are subject to project finance risks. Commercialization support from incubators and accelerators has helped address these challenges by offering more support services to start-ups; however, more effort is needed to fulfill the desired clean energy future. The emergence of new strategic investors and partners in recent years has opened up innovative opportunities for clean tech entrepreneurs, and novel commercialization models are emerging that involve new alliances among clean energy companies, RDD&D, support systems, and strategic customers. For instance, Wells Fargo and Company (WFC) and the National Renewable Energy Laboratory (NREL) have launched a new technology incubator that supports faster commercialization through a focus on technology development. The incubator combines strategic financing, technology and technical assistance, strategic customer site validation, and ongoing financial support.

  8. Snowmass 2013 Computing Frontier: Accelerator Science

    CERN Document Server

    Spentzouris, P; Joshi, C; Amundson, J; An, W; Bruhwiler, D L; Cary, J R; Cowan, B; Decyk, V K; Esarey, E; Fonseca, R A; Friedman, A; Geddes, C G R; Grote, D P; Kourbanis, I; Leemans, W P; Lu, W; Mori, W B; Ng, C; Qiang, Ji; Roberts, T; Ryne, R D; Schroeder, C B; Silva, L O; Tsung, F S; Vay, J -L; Vieira, J

    2013-01-01

    This is the working summary of the Accelerator Science working group of the Computing Frontier of the Snowmass meeting 2013. It summarizes the computing requirements to support accelerator technology in both Energy and Intensity Frontiers.

  9. Accelerating Scientific Computations using FPGAs

    Science.gov (United States)

    Pell, O.; Atasu, K.; Mencer, O.

    Field Programmable Gate Arrays (FPGAs) are semiconductor devices that contain a grid of programmable cells, which the user configures to implement any digital circuit of up to a few million gates. Modern FPGAs allow the user to reconfigure these circuits many times each second, making FPGAs fully programmable and general purpose. Recent FPGA technology provides sufficient resources to tackle scientific applications on large-scale parallel systems. As a case study, we implement the Fast Fourier Transform [1] in a flexible floating point implementation. We utilize A Stream Compiler [2] (ASC), which combines C++ syntax with flexible floating point support by providing a 'HWfloat' data-type. The resulting FFT can be targeted to a variety of FPGA platforms in FFTW style, though not yet completely automatically. The resulting FFT circuit can be adapted to the particular resources available on the system. The optimal implementation of an FFT accelerator depends on the length and dimensionality of the FFT, the available FPGA area, the available hard DSP blocks, the FPGA board architecture, and the precision and range of the application [3]. Software-style object-oriented abstractions allow us to pursue an accelerated pace of development by maximizing re-use of design patterns. ASC allows a few core hardware descriptions to generate hundreds of different circuit variants to meet particular speed, area and precision goals. The key to achieving maximum acceleration of FFT computation is to match memory and compute bandwidths so that maximum use is made of computational resources. Modern FPGAs contain up to hundreds of independent SRAM banks to store intermediate results, providing ample scope for optimizing memory parallelism. At 175 MHz, one of Maxeler's Radix-4 FFT cores computes 4x as many 1024-point FFTs per second as a dual Pentium-IV Xeon machine running FFTW. Eight such parallel cores fit onto the largest FPGA in the Xilinx Virtex-4 family, providing a 32x speed-up over
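
    The throughput claim above is made relative to FFTW on a CPU. As a rough point of comparison, a CPU-side baseline of 1024-point FFT throughput can be sketched as below; this is an illustrative NumPy measurement under assumed batch sizes, not the FFTW benchmark used in the record.

```python
# Hypothetical CPU baseline: how many 1024-point FFTs per second?
# Illustrative only; not the FFTW benchmark referenced above.
import time
import numpy as np

N = 1024        # FFT length, matching the 1024-point FFTs cited above
BATCH = 4096    # number of transforms per timed batch (assumption)

data = (np.random.randn(BATCH, N) + 1j * np.random.randn(BATCH, N)).astype(np.complex64)

start = time.perf_counter()
np.fft.fft(data, axis=1)   # batched 1024-point FFTs
elapsed = time.perf_counter() - start

print(f"{BATCH / elapsed:,.0f} FFTs per second on this CPU (NumPy baseline)")
```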

  10. Accelerating Climate Simulations Through Hybrid Computing

    Science.gov (United States)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

    Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPU) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for connection, we identified two challenges: (1) identical MPI implementation is required in both systems, and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors, two IBM QS22 Cell blades, connected with InfiniBand), allowing for seamlessly offloading compute-intensive functions to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.
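
    IBM DAV itself is proprietary middleware, so the sketch below only illustrates the general offloading pattern the record describes - dispatching compute-intensive functions to separate workers while the host process continues - using Python's standard concurrent.futures; the solar_radiation_step kernel is a hypothetical stand-in, not the actual model component.

```python
# Minimal sketch of offloading an expensive kernel to worker processes,
# in the spirit of the function offloading described above.
# `solar_radiation_step` is a hypothetical stand-in for the real model.
from concurrent.futures import ProcessPoolExecutor

import numpy as np

def solar_radiation_step(column: np.ndarray) -> float:
    # stand-in for a compute-intensive per-column physics calculation
    return float(np.sum(np.exp(-column) * column ** 2))

def main() -> None:
    columns = [np.random.rand(10_000) for _ in range(64)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        # the expensive kernels run in worker processes; the host is free
        # to do other work until the results are gathered
        results = list(pool.map(solar_radiation_step, columns))
    print(f"first offloaded result: {results[0]:.4f}")

if __name__ == "__main__":
    main()
```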

  11. Computing tools for accelerator design calculations

    Energy Technology Data Exchange (ETDEWEB)

    Fischler, M.; Nash, T.

    1984-01-01

    This note is intended as a brief, summary guide for accelerator designers to the new generation of commercial and special processors that allow great increases in computing cost effectiveness. New thinking is required to take best advantage of these computing opportunities, in particular, when moving from analytical approaches to tracking simulations. In this paper, we outline the relevant considerations.

  12. FPGA-accelerated simulation of computer systems

    CERN Document Server

    Angepat, Hari; Chung, Eric S; Hoe, James C; Chung, Eric S

    2014-01-01

    To date, the most common simulators of computer systems are software-based and run on standard computers. One promising approach to improving simulation performance is to apply hardware, specifically reconfigurable hardware in the form of field programmable gate arrays (FPGAs). This manuscript describes various approaches to using FPGAs to accelerate software-implemented simulation of computer systems and selected simulators that incorporate those techniques. More precisely, we describe a simulation architecture taxonomy that incorporates a simulation architecture specifically designed f

  13. Cloud computing strategic framework (FY13 - FY15).

    Energy Technology Data Exchange (ETDEWEB)

    Arellano, Lawrence R.; Arroyo, Steven C.; Giese, Gerald J.; Cox, Philip M.; Rogers, G. Kelly

    2012-11-01

    This document presents an architectural framework (plan) and roadmap for the implementation of a robust Cloud Computing capability at Sandia National Laboratories. It is intended to be a living document and serve as the basis for detailed implementation plans, project proposals and strategic investment requests.

  14. Computer networks in future accelerator control systems

    International Nuclear Information System (INIS)

    Some findings of a study concerning a computer based control and monitoring system for the proposed ISABELLE Intersecting Storage Accelerator are presented. Requirements for development and implementation of such a system are discussed. An architecture is proposed where the system components are partitioned along functional lines. Implementation of some conceptually significant components is reviewed

  15. The strategic planning initiative for accelerated cleanup of Rocky Flats

    International Nuclear Information System (INIS)

    The difficulties associated with the congressional funding cycles, regulatory redirection, remediation schedule deadlines, and the lack of a mixed waste (MW) repository have adversely impacted the environmental restoration (ER) program across the entire U.S. Department of Energy (DOE) complex including Rocky Flats Plant (RFP). In an effort to counteract and reduce the impacts of these difficulties, RFP management saw the need for developing a revised ER Program. The objective of the revised ER approach is to identify an initiative that would accelerate the cleanup process and reduce costs without compromising either protection of human health or the environment. A special analysis with that assigned objective was initiated in June 1993 using a team that included DOE Headquarters and Rocky Flats Field Office (RFFO), EG&G personnel, and experts from nationally recognized ER firms. The analysis relied on recent regulatory and process innovations such as DOE's Streamlined Approach for Environmental Restoration (SAFER) and EPA's Superfund Accelerated Cleanup Model (SACM) and Corrective Action Management Units (CAMU). The analysis also incorporated other ongoing improvement efforts initiated by RFP, such as the Quality Action Team and the Integrated Planning Process

  16. Strategic Plan for a Scientific Cloud Computing infrastructure for Europe

    CERN Document Server

    Lengert, Maryline

    2011-01-01

    Here we present the vision, concept and direction for forming a European Industrial Strategy for a Scientific Cloud Computing Infrastructure to be implemented by 2020. This will be the framework for decisions and for securing support and approval in establishing, initially, an R&D European Cloud Computing Infrastructure that serves the need of European Research Area (ERA) and Space Agencies. This Cloud Infrastructure will have the potential beyond this initial user base to evolve to provide similar services to a broad range of customers including government and SMEs. We explain how this plan aims to support the broader strategic goals of our organisations and identify the benefits to be realised by adopting an industrial Cloud Computing model. We also outline the prerequisites and commitment needed to achieve these objectives.

  17. Strategic Cognitive Sequencing: A Computational Cognitive Neuroscience Approach

    Directory of Open Access Journals (Sweden)

    Seth A. Herd

    2013-01-01

    We address strategic cognitive sequencing, the “outer loop” of human cognition: how the brain decides what cognitive process to apply at a given moment to solve complex, multistep cognitive tasks. We argue that this topic has been neglected relative to its importance for systematic reasons but that recent work on how individual brain systems accomplish their computations has set the stage for productively addressing how brain regions coordinate over time to accomplish our most impressive thinking. We present four preliminary neural network models. The first addresses how the prefrontal cortex (PFC) and basal ganglia (BG) cooperate to perform trial-and-error learning of short sequences; the next, how several areas of PFC learn to make predictions of likely reward, and how this contributes to the BG making decisions at the level of strategies. The third model addresses how PFC, BG, parietal cortex, and hippocampus can work together to memorize sequences of cognitive actions from instruction (or “self-instruction”). The last shows how a constraint satisfaction process can find useful plans. The PFC maintains current and goal states and associates from both of these to find a “bridging” state, an abstract plan. We discuss how these processes could work together to produce strategic cognitive sequencing and discuss future directions in this area.

  18. Advanced Computing Tools and Models for Accelerator Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  19. Computational Examination of Parameters Influencing Practicability of Ram Accelerator

    Directory of Open Access Journals (Sweden)

    Sunil Bhat

    2004-07-01

    The problems concerning the practicability of a ram accelerator, such as intense in-bore projectile ablation, the large accelerator tube length needed to achieve high projectile muzzle velocity, and the high entry velocity of the projectile required to start the accelerator, have been examined. Computational models of processes such as projectile ablation and the flow in the aero-window used as the accelerator tube-end closure device in the case of high drive-gas filling pressure in the ram accelerator tube are presented. A new projectile design to minimise the starting velocity of the ram accelerator is discussed. The possibility of deploying the ram accelerator in a defence-oriented role has been investigated to utilise its high velocity potential.

  20. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Spentzouris, P.; /Fermilab; Cary, J.; /Tech-X, Boulder; McInnes, L.C.; /Argonne; Mori, W.; /UCLA; Ng, C.; /SLAC; Ng, E.; Ryne, R.; /LBL, Berkeley

    2011-11-14

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization

  1. Computational studies and optimization of wakefield accelerators

    International Nuclear Information System (INIS)

    Laser- and particle beam-driven plasma wakefield accelerators produce accelerating fields thousands of times higher than radio-frequency accelerators, offering compactness and ultrafast bunches to extend the frontiers of high energy physics and to enable laboratory-scale radiation sources. Large-scale kinetic simulations provide essential understanding of accelerator physics to advance beam performance and stability, and they reveal and predict the physics behind the recent demonstration of narrow energy spread bunches. Benchmarking between codes is establishing the validity of the models used and, by testing new reduced models, is extending the reach of simulations to cover upcoming meter-scale multi-GeV experiments. This includes new models that exploit Lorentz-boosted simulation frames to speed calculations. Simulations of experiments showed that the recently demonstrated plasma gradient injection of electrons can be used as an injector to increase beam quality by orders of magnitude. Simulations are now also modeling accelerator stages of tens of GeV, staging of modules, and new positron sources to design next-generation experiments and to use in applications in high energy physics and light sources.

  2. Quality Function Deployment (QFD) House of Quality for Strategic Planning of Computer Security of SMEs

    Directory of Open Access Journals (Sweden)

    Jorge A. Ruiz-Vanoye

    2013-01-01

    This article proposes to implement the Quality Function Deployment (QFD) House of Quality for strategic planning of computer security for Small and Medium Enterprises (SME). The House of Quality (HoQ) applied to the computer security of SMEs is a framework to convert the security needs of corporate computing into a set of specifications to improve computer security.

  3. Software Accelerates Computing Time for Complex Math

    Science.gov (United States)

    2014-01-01

    Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphics processing unit (GPU) technology - traditionally used for computer video games - to develop high-performance computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.

  4. Scientific computing with multicore and accelerators

    CERN Document Server

    Kurzak, Jakub; Dongarra, Jack

    2010-01-01

    Dense Linear Algebra: Implementing Matrix Multiplication on the Cell B.E., Wesley Alvaro, Jakub Kurzak, and Jack Dongarra; Implementing Matrix Factorizations on the Cell BE, Jakub Kurzak and Jack Dongarra; Dense Linear Algebra for Hybrid GPU-Based Systems, Stanimire Tomov and Jack Dongarra; BLAS for GPUs, Rajib Nath, Stanimire Tomov, and Jack Dongarra. Sparse Linear Algebra: Sparse Matrix-Vector Multiplication on Multicore and Accelerators, Samuel Williams, Nathan B

  5. Accelerating Iterative Big Data Computing Through MPI

    Institute of Scientific and Technical Information of China (English)

    梁帆; 鲁小亿

    2015-01-01

    Current popular systems, Hadoop and Spark, cannot achieve satisfactory performance because of the inefficient overlapping of computation and communication when running iterative big data applications. The pipeline of computing, data movement, and data management plays a key role in current distributed data computing systems. In this paper, we first analyze the overhead of the shuffle operation in Hadoop and Spark when running the PageRank workload, and then propose DataMPI-Iteration, an MPI-based library for iterative big data computing, with an event-driven pipeline and in-memory shuffle design that better overlaps computation and communication. Our performance evaluation shows DataMPI-Iteration can achieve 9X∼21X speedup over Apache Hadoop, and 2X∼3X speedup over Apache Spark for PageRank and K-means.
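
    The central idea, overlapping computation with communication, can be illustrated with non-blocking MPI calls; the mpi4py sketch below is a generic two-rank example of that pattern, not the DataMPI-Iteration implementation itself.

```python
# Generic illustration of overlapping computation with communication
# using non-blocking MPI; not the DataMPI-Iteration library itself.
# Run with e.g.: mpiexec -n 2 python overlap.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
peer = 1 - rank                     # assumes exactly 2 ranks

send_buf = np.full(1_000_000, rank, dtype=np.float64)
recv_buf = np.empty_like(send_buf)

# start the exchange ...
reqs = [comm.Isend(send_buf, dest=peer), comm.Irecv(recv_buf, source=peer)]

# ... and compute on local data while the messages are in flight
local_result = np.sum(send_buf * 0.5)

MPI.Request.Waitall(reqs)           # the shuffle-like exchange completes here
print(f"rank {rank}: local={local_result:.1f}, received sum={recv_buf.sum():.1f}")
```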

  6. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Spentzouris, Panagiotis; /Fermilab; Cary, John; /Tech-X, Boulder; Mcinnes, Lois Curfman; /Argonne; Mori, Warren; /UCLA; Ng, Cho; /SLAC; Ng, Esmond; Ryne, Robert; /LBL, Berkeley

    2011-10-21

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  7. (U) Computation acceleration using dynamic memory

    Energy Technology Data Exchange (ETDEWEB)

    Hakel, Peter [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-10-24

    Many computational applications require the repeated use of quantities, whose calculations can be expensive. In order to speed up the overall execution of the program, it is often advantageous to replace computation with extra memory usage. In this approach, computed values are stored and then, when they are needed again, they are quickly retrieved from memory rather than being calculated again at great cost. Sometimes, however, the precise amount of memory needed to store such a collection is not known in advance, and only emerges in the course of running the calculation. One problem accompanying such a situation is wasted memory space in overdimensioned (and possibly sparse) arrays. Another issue is the overhead of copying existing values to a new, larger memory space, if the original allocation turns out to be insufficient. In order to handle these runtime problems, the programmer therefore has the extra task of addressing them in the code.
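
    The trade-off described, storing computed quantities so they can be retrieved instead of recalculated, with storage that grows only as values actually appear, can be sketched as follows; the expensive function shown is a hypothetical placeholder, not the application discussed in the report.

```python
# Sketch of trading memory for computation: results are cached in a
# dictionary that grows dynamically, so no array needs to be
# over-dimensioned or reallocated by hand.
from functools import lru_cache
import math

@lru_cache(maxsize=None)            # cache grows only as new inputs appear
def expensive_quantity(n: int) -> float:
    # hypothetical stand-in for a costly calculation
    return sum(math.sin(k) / (k + 1) for k in range(n * 1000))

# the first call for each argument computes and stores the value;
# repeated calls become fast lookups
for n in (10, 200, 10, 200):
    print(n, expensive_quantity(n))
```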

  8. GPU-accelerated micromagnetic simulations using cloud computing

    Science.gov (United States)

    Jermain, C. L.; Rowlands, G. E.; Buhrman, R. A.; Ralph, D. C.

    2016-03-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics.

  9. GPU-accelerated micromagnetic simulations using cloud computing

    CERN Document Server

    Jermain, C L; Buhrman, R A; Ralph, D C

    2015-01-01

    Highly-parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics.

  10. From experiment to design -- Fault characterization and detection in parallel computer systems using computational accelerators

    Science.gov (United States)

    Yim, Keun Soo

    This dissertation summarizes experimental validation and co-design studies conducted to optimize the fault detection capabilities and overheads in hybrid computer systems (e.g., using CPUs and Graphics Processing Units, or GPUs), and consequently to improve the scalability of parallel computer systems using computational accelerators. The experimental validation studies were conducted to help us understand the failure characteristics of CPU-GPU hybrid computer systems under various types of hardware faults. The main characterization targets were faults that are difficult to detect and/or recover from, e.g., faults that cause long latency failures (Ch. 3), faults in dynamically allocated resources (Ch. 4), faults in GPUs (Ch. 5), faults in MPI programs (Ch. 6), and microarchitecture-level faults with specific timing features (Ch. 7). The co-design studies were based on the characterization results. One of the co-designed systems has a set of source-to-source translators that customize and strategically place error detectors in the source code of target GPU programs (Ch. 5). Another co-designed system uses an extension card to learn the normal behavioral and semantic execution patterns of message-passing processes executing on CPUs, and to detect abnormal behaviors of those parallel processes (Ch. 6). The third co-designed system is a co-processor that has a set of new instructions in order to support software-implemented fault detection techniques (Ch. 7). The work described in this dissertation gains more importance because heterogeneous processors have become an essential component of state-of-the-art supercomputers. GPUs were used in three of the five fastest supercomputers that were operating in 2011. Our work included comprehensive fault characterization studies in CPU-GPU hybrid computers. In CPUs, we monitored the target systems for a long period of time after injecting faults (a temporally comprehensive experiment), and injected faults into various types of

  11. Distributed computer controls for accelerator systems

    International Nuclear Information System (INIS)

    A distributed control system has been designed and installed at the Lawrence Livermore National Laboratory Multi-user Tandem Facility using an extremely modular approach in hardware and software. The two-tiered, geographically organized design allowed total system implementation within four months with a computer and instrumentation cost of approximately $100K. Since the system structure is modular, application to a variety of facilities is possible. Such a system allows rethinking of the operational style of the facilities, making possible highly reproducible and unattended operation. The impact of industry standards, i.e., UNIX, CAMAC, and IEEE-802.3, and the use of a graphics-oriented controls software suite allowed the efficient implementation of the system. The definition, design, implementation, operation and total system performance will be discussed. 3 refs

  12. Computational Tools to Accelerate Commercial Development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David C

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  13. Accelerated Matrix Element Method with Parallel Computing

    CERN Document Server

    Schouten, Doug; Stelzer, Bernd

    2014-01-01

    The matrix element method utilizes ab initio calculations of probability densities as powerful discriminants for processes of interest in experimental particle physics. The method has already been used successfully at previous and current collider experiments. However, the computational complexity of this method for final states with many particles and degrees of freedom sets it at a disadvantage compared to supervised classification methods such as decision trees, k nearest-neighbour, or neural networks. This note presents a concrete implementation of the matrix element technique using graphics processing units. Due to the intrinsic parallelizability of multidimensional integration, dramatic speedups can be readily achieved, which makes the matrix element technique viable for general usage at collider experiments.
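
    The reported speedups rest on the fact that multidimensional integration is embarrassingly parallel: every sample of the integrand can be evaluated independently. The sketch below illustrates that structure with a vectorised Monte Carlo estimate of a toy integrand (NumPy on the CPU here; on a GPU the same per-sample map is what gets parallelised). The integrand is an illustrative assumption, not a physical matrix element.

```python
# Toy illustration of why matrix-element-style integration parallelises
# well: every sample of the integrand is independent, so the evaluation
# can be mapped across GPU threads (here, vectorised NumPy on the CPU).
import numpy as np

rng = np.random.default_rng(0)

def toy_integrand(x: np.ndarray) -> np.ndarray:
    # hypothetical smooth integrand over the unit hypercube,
    # not a physical matrix element
    return np.exp(-np.sum(x**2, axis=1))

DIM, N_SAMPLES = 6, 1_000_000
samples = rng.random((N_SAMPLES, DIM))   # independent evaluation points
values = toy_integrand(samples)          # all evaluations are independent
estimate = values.mean()                 # integral over the unit hypercube
error = values.std() / np.sqrt(N_SAMPLES)
print(f"Monte Carlo estimate: {estimate:.5f} +/- {error:.5f}")
```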

  14. The impact of new computer technology on accelerator control

    International Nuclear Information System (INIS)

    This paper describes some recent developments in computing and stresses their application in accelerator control systems. Among the advances that promise to have a significant impact are (1) low cost scientific workstations; (2) the use of "windows", pointing devices and menus in a multi-tasking operating system; (3) high resolution large-screen graphics monitors; (4) new kinds of high bandwidth local area networks. The relevant features are related to a general accelerator control system. For example, this paper examines the implications of a computing environment which permits and encourages graphical manipulation of system components, rather than traditional access through the writing of programs or "canned" access via touch panels

  15. Neural computation and particle accelerators research, technology and applications

    CERN Document Server

    D'Arras, Horace

    2010-01-01

    This book discusses neural computation, a network or circuit of biological neurons, and, relatedly, particle accelerators, scientific instruments which accelerate charged particles such as protons, electrons and deuterons. Accelerators have a very broad range of applications in many industrial fields, from high energy physics to medical isotope production. Nuclear technology is one of the fields discussed in this book. The development that has been reached by particle accelerators in energy and particle intensity has opened the possibility to a wide number of new applications in nuclear technology. This book reviews the applications in the nuclear energy field and the design features of high power neutron sources are explained. Surface treatments of niobium flat samples and superconducting radio frequency cavities by a new technique called gas cluster ion beam are also studied in detail, as well as the process of electropolishing. Furthermore, magnetic devices such as solenoids, dipoles and undulators, which ...

  16. Computational Tools for Accelerating Carbon Capture Process Development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David; Sahinidis, N V; Cozad, A; Lee, A; Kim, H; Morinelly, J; Eslick, J; Yuan, Z

    2013-06-04

    This presentation reports the development of advanced computational tools to accelerate next generation technology development. These tools use rigorous models to develop an optimized process. They include: Process Models; Simulation-Based Optimization; Optimized Process; Uncertainty Quantification; Algebraic Surrogate Models; and Superstructure Optimization (Determine Configuration).

  17. Collaborative strategic board games as a site for distributed computational thinking

    OpenAIRE

    Berland, Matthew; Lee, Victor R.

    2011-01-01

    This paper examines the idea that contemporary strategic board games represent an informal, interactional context in which complex computational thinking takes place. When games are collaborative – that is, a game requires that players work in joint pursuit of a shared goal – the computational thinking is easily observed as distributed across several participants. This raises the possibility that a focus on such board games is profitable for those who wish to understand computational thinkin...

  18. Accelerating Climate and Weather Simulations through Hybrid Computing

    Science.gov (United States)

    Zhou, Shujia; Cruz, Carlos; Duffy, Daniel; Tucker, Robert; Purcell, Mark

    2011-01-01

    Unconventional multi- and many-core processors (e.g. IBM (R) Cell B.E.(TM) and NVIDIA (R) GPU) have emerged as effective accelerators in trial climate and weather simulations. Yet these climate and weather models typically run on parallel computers with conventional processors (e.g. Intel, AMD, and IBM) using Message Passing Interface. To address challenges involved in efficiently and easily connecting accelerators to parallel computers, we investigated using IBM's Dynamic Application Virtualization (TM) (IBM DAV) software in a prototype hybrid computing system with representative climate and weather model components. The hybrid system comprises two Intel blades and two IBM QS22 Cell B.E. blades, connected with both InfiniBand(R) (IB) and 1-Gigabit Ethernet. The system significantly accelerates a solar radiation model component by offloading compute-intensive calculations to the Cell blades. Systematic tests show that IBM DAV can seamlessly offload compute-intensive calculations from Intel blades to Cell B.E. blades in a scalable, load-balanced manner. However, noticeable communication overhead was observed, mainly due to IP over the IB protocol. Full utilization of IB Sockets Direct Protocol and the lower latency production version of IBM DAV will reduce this overhead.

  19. Accelerating Neuroimage Registration through Parallel Computation of Similarity Metric.

    Directory of Open Access Journals (Sweden)

    Yun-Gang Luo

    Neuroimage registration is crucial for brain morphometric analysis and treatment efficacy evaluation. However, existing advanced registration algorithms such as FLIRT and ANTs are not efficient enough for clinical use. In this paper, a GPU implementation of FLIRT with the correlation ratio (CR) as the similarity metric and a GPU-accelerated correlation coefficient (CC) calculation for the symmetric diffeomorphic registration of ANTs have been developed. The comparison with their corresponding original tools shows that our accelerated algorithms can greatly outperform the original algorithms in terms of computational efficiency. This paper demonstrates the great potential of applying these registration tools in clinical applications.
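
    As a point of reference for the similarity metrics named above, the correlation coefficient between a fixed and a moving image reduces to a few large elementwise reductions, which is what a GPU version parallelises; the NumPy sketch below is illustrative only and is not the FLIRT or ANTs implementation.

```python
# Minimal sketch of the correlation coefficient (CC) similarity metric
# between two images; the reductions involved are what a GPU version
# parallelises. Illustrative only, not the FLIRT/ANTs code.
import numpy as np

def correlation_coefficient(fixed: np.ndarray, moving: np.ndarray) -> float:
    f = fixed.ravel().astype(np.float64)
    m = moving.ravel().astype(np.float64)
    f -= f.mean()
    m -= m.mean()
    return float(np.dot(f, m) / (np.linalg.norm(f) * np.linalg.norm(m) + 1e-12))

# toy volumes standing in for a fixed and a transformed moving image
rng = np.random.default_rng(1)
fixed = rng.random((64, 64, 64))
moving = 0.8 * fixed + 0.2 * rng.random((64, 64, 64))
print(f"CC = {correlation_coefficient(fixed, moving):.3f}")
```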

  20. Quantum computing accelerator I/O : LDRD 52750 final report.

    Energy Technology Data Exchange (ETDEWEB)

    Schroeppel, Richard Crabtree; Modine, Normand Arthur; Ganti, Anand; Pierson, Lyndon George; Tigges, Christopher P.

    2003-12-01

    In a superposition of quantum states, a bit can be in both the states '0' and '1' at the same time. This feature of the quantum bit or qubit has no parallel in classical systems. Currently, quantum computers consisting of 4 to 7 qubits in a 'quantum computing register' have been built. Innovative algorithms suited to quantum computing are now beginning to emerge, applicable to sorting and cryptanalysis, and other applications. A framework for overcoming slightly inaccurate quantum gate interactions and for causing quantum states to survive interactions with surrounding environment is emerging, called quantum error correction. Thus there is the potential for rapid advances in this field. Although quantum information processing can be applied to secure communication links (quantum cryptography) and to crack conventional cryptosystems, the first few computing applications will likely involve a 'quantum computing accelerator' similar to a 'floating point arithmetic accelerator' interfaced to a conventional Von Neumann computer architecture. This research is to develop a roadmap for applying Sandia's capabilities to the solution of some of the problems associated with maintaining quantum information, and with getting data into and out of such a 'quantum computing accelerator'. We propose to focus this work on 'quantum I/O technologies' by applying quantum optics on semiconductor nanostructures to leverage Sandia's expertise in semiconductor microelectronic/photonic fabrication techniques, as well as its expertise in information theory, processing, and algorithms. The work will be guided by understanding of practical requirements of computing and communication architectures. This effort will incorporate ongoing collaboration between 9000, 6000 and 1000 and between junior and senior personnel. Follow-on work to fabricate and evaluate appropriate experimental nano/microstructures will be

  1. Computing at DESY — current setup, trends and strategic directions

    Science.gov (United States)

    Ernst, Michael

    1998-05-01

    Since the HERA experiments H1 and ZEUS started data taking in '92, the computing environment at DESY has changed dramatically. Having run mainframe-centred computing for more than 20 years, DESY switched to a heterogeneous, fully distributed computing environment within only about two years in almost every corner where computing has its applications. The computing strategy was highly influenced by the needs of the user community. The collaborations are usually limited by current technology, and their ever-increasing demands are the driving force for central computing to always move close to the technology edge. While DESY's central computing has a multidecade experience in running Central Data Recording/Central Data Processing for HEP experiments, the most challenging task today is to provide for clear and homogeneous concepts in the desktop area. Given that lowest-level commodity hardware draws more and more attention, combined with the financial constraints we already face today, we quickly need concepts for integrated support of a versatile device which has the potential to move into basically any computing area in HEP. Though commercial solutions, especially addressing the PC management/support issues, are expected to come to market in the next 2-3 years, we need to provide for suitable solutions now. Buying PCs at DESY at the current rate of about 30 per month will otherwise absorb any available manpower in central computing and still leave hundreds of users without adequate support. Though certainly not the only area, the desktop issue is one of the most important ones where we need HEP-wide collaboration to a large extent, and right now. Taking into account that there is traditionally no room for R&D at DESY, collaboration, meaning sharing experience and development resources within the HEP community, is a predominant factor for us.

  2. Strategic Analysis of Autodesk and the Move to Cloud Computing

    OpenAIRE

    Kewley, Kathleen

    2012-01-01

    This paper provides an analysis of the opportunity for Autodesk to move its core technology to a cloud delivery model. Cloud computing offers clients a number of advantages, such as lower costs for computer hardware, increased access to technology and greater flexibility. With the IT industry embracing this transition, software companies need to plan for future change and lead with innovative solutions. Autodesk is in a unique position to capitalize on this market shift, as it is the leader i...

  3. A Quantitative Study of the Relationship between Leadership Practice and Strategic Intentions to Use Cloud Computing

    Science.gov (United States)

    Castillo, Alan F.

    2014-01-01

    The purpose of this quantitative correlational cross-sectional research study was to examine a theoretical model consisting of leadership practice, attitudes of business process outsourcing, and strategic intentions of leaders to use cloud computing and to examine the relationships between each of the variables respectively. This study…

  4. Hardware-accelerated Components for Hybrid Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Chavarría-Miranda, Daniel; Nieplocha, Jaroslaw; Gorton, Ian

    2008-10-31

    We present a study on the use of component technology for encapsulating platform-specific hardware-accelerated algorithms on hybrid HPC systems. Our research shows that component technology can have significant benefits from a software engineering point of view to increase encapsulation, portability and reduce or eliminate platform dependence for hardware-accelerated algorithms. As a demonstration of this concept, we discuss our experience in designing, implementing and integrating an FPGA-accelerated kernel for Polygraph, an application in computational proteomics.

  5. COMPASS, the COMmunity Petascale Project for Accelerator Science and Simulation, a broad computational accelerator physics initiative

    Energy Technology Data Exchange (ETDEWEB)

    J.R. Cary; P. Spentzouris; J. Amundson; L. McInnes; M. Borland; B. Mustapha; B. Norris; P. Ostroumov; Y. Wang; W. Fischer; A. Fedotov; I. Ben-Zvi; R. Ryne; E. Esarey; C. Geddes; J. Qiang; E. Ng; S. Li; C. Ng; R. Lee; L. Merminga; H. Wang; D.L. Bruhwiler; D. Dechow; P. Mullowney; P. Messmer; C. Nieter; S. Ovtchinnikov; K. Paul; P. Stoltz; D. Wade-Stein; W.B. Mori; V. Decyk; C.K. Huang; W. Lu; M. Tzoufras; F. Tsung; M. Zhou; G.R. Werner; T. Antonsen; T. Katsouleas

    2007-06-01

    Accelerators are the largest and most costly scientific instruments of the Department of Energy, with uses across a broad range of science, including colliders for particle physics and nuclear science and light sources and neutron sources for materials studies. COMPASS, the Community Petascale Project for Accelerator Science and Simulation, is a broad, four-office (HEP, NP, BES, ASCR) effort to develop computational tools for the prediction and performance enhancement of accelerators. The tools being developed can be used to predict the dynamics of beams in the presence of optical elements and space charge forces, the calculation of electromagnetic modes and wake fields of cavities, the cooling induced by comoving beams, and the acceleration of beams by intense fields in plasmas generated by beams or lasers. In SciDAC-1, the computational tools had multiple successes in predicting the dynamics of beams and beam generation. In SciDAC-2 these tools will be petascale enabled to allow the inclusion of an unprecedented level of physics for detailed prediction.

  6. COMPASS, the COMmunity Petascale project for Accelerator Science and Simulation, a broad computational accelerator physics initiative

    Energy Technology Data Exchange (ETDEWEB)

    Cary, J.R.; Spentzouris, P.; Amundson, J.; McInnes, L.; Borland, M.; Mustapha, B.; Ostroumov, P.; Wang, Y.; Fischer, W.; Fedotov, A.; Ben-Zvi, I.; Ryne, R.; Esarey, E.; Geddes, C.; Qiang, J.; Ng, E.; Li, S.; Ng, C.; Lee, R.; Merminga, L.; Wang, H.; Bruhwiler, D.L.; Dechow, D.; Mullowney, P.; Messmer, P.; Nieter, C.; Ovtchinnikov, S.; Paul, K.; Stoltz, P.; Wade-Stein, D.; Mori, W.B.; Decyk, V.; Huang, C.K.; Lu, W.; Tzoufras, M.; Tsung, F.; Zhou, M.; Werner, G.R.; Antonsen, T.; Katsouleas, T.; Morris, B.

    2007-07-16

    Accelerators are the largest and most costly scientific instruments of the Department of Energy, with uses across a broad range of science, including colliders for particle physics and nuclear science and light sources and neutron sources for materials studies. COMPASS, the Community Petascale Project for Accelerator Science and Simulation, is a broad, four-office (HEP, NP, BES, ASCR) effort to develop computational tools for the prediction and performance enhancement of accelerators. The tools being developed can be used to predict the dynamics of beams in the presence of optical elements and space charge forces, the calculation of electromagnetic modes and wake fields of cavities, the cooling induced by comoving beams, and the acceleration of beams by intense fields in plasmas generated by beams or lasers. In SciDAC-1, the computational tools had multiple successes in predicting the dynamics of beams and beam generation. In SciDAC-2 these tools will be petascale enabled to allow the inclusion of an unprecedented level of physics for detailed prediction.

  7. COMPASS, the COMmunity Petascale Project for Accelerator Science And Simulation, a Broad Computational Accelerator Physics Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Cary, J.R.; /Tech-X, Boulder /Colorado U.; Spentzouris, P.; Amundson, J.; /Fermilab; McInnes, L.; Borland, M.; Mustapha, B.; Norris, B.; Ostroumov, P.; Wang, Y.; /Argonne; Fischer, W.; Fedotov, A.; Ben-Zvi, I.; /Brookhaven; Ryne, R.; Esarey, E.; Geddes, C.; Qiang, J.; Ng, E.; Li, S.; /LBL, Berkeley; Ng, C.; Lee, R.; /SLAC; Merminga, L.; /Jefferson Lab /Tech-X, Boulder /UCLA /Colorado U. /Maryland U. /Southern California U.

    2007-11-09

    Accelerators are the largest and most costly scientific instruments of the Department of Energy, with uses across a broad range of science, including colliders for particle physics and nuclear science and light sources and neutron sources for materials studies. COMPASS, the Community Petascale Project for Accelerator Science and Simulation, is a broad, four-office (HEP, NP, BES, ASCR) effort to develop computational tools for the prediction and performance enhancement of accelerators. The tools being developed can be used to predict the dynamics of beams in the presence of optical elements and space charge forces, the calculation of electromagnetic modes and wake fields of cavities, the cooling induced by comoving beams, and the acceleration of beams by intense fields in plasmas generated by beams or lasers. In SciDAC-1, the computational tools had multiple successes in predicting the dynamics of beams and beam generation. In SciDAC-2 these tools will be petascale enabled to allow the inclusion of an unprecedented level of physics for detailed prediction.

  8. A study on strategic provisioning of cloud computing services.

    Science.gov (United States)

    Whaiduzzaman, Md; Haque, Mohammad Nazmul; Rejaul Karim Chowdhury, Md; Gani, Abdullah

    2014-01-01

    Cloud computing is currently emerging as an ever-changing, growing paradigm that models "everything-as-a-service." Virtualised physical resources, infrastructure, and applications are supplied by service provisioning in the cloud. The evolution in the adoption of cloud computing is driven by clear and distinct promising features for both cloud users and cloud providers. However, the increasing number of cloud providers and the variety of service offerings have made it difficult for the customers to choose the best services. By employing successful service provisioning, the essential services required by customers, such as agility and availability, pricing, security and trust, and user metrics can be guaranteed by service provisioning. Hence, continuous service provisioning that satisfies the user requirements is a mandatory feature for the cloud user and vitally important in cloud computing service offerings. Therefore, we aim to review the state-of-the-art service provisioning objectives, essential services, topologies, user requirements, necessary metrics, and pricing mechanisms. We synthesize and summarize different provision techniques, approaches, and models through a comprehensive literature review. A thematic taxonomy of cloud service provisioning is presented after the systematic review. Finally, future research directions and open research issues are identified.

  9. A Study on Strategic Provisioning of Cloud Computing Services

    Directory of Open Access Journals (Sweden)

    Md Whaiduzzaman

    2014-01-01

    Cloud computing is currently emerging as an ever-changing, growing paradigm that models “everything-as-a-service.” Virtualised physical resources, infrastructure, and applications are supplied by service provisioning in the cloud. The evolution in the adoption of cloud computing is driven by clear and distinct promising features for both cloud users and cloud providers. However, the increasing number of cloud providers and the variety of service offerings have made it difficult for the customers to choose the best services. By employing successful service provisioning, the essential services required by customers, such as agility and availability, pricing, security and trust, and user metrics can be guaranteed by service provisioning. Hence, continuous service provisioning that satisfies the user requirements is a mandatory feature for the cloud user and vitally important in cloud computing service offerings. Therefore, we aim to review the state-of-the-art service provisioning objectives, essential services, topologies, user requirements, necessary metrics, and pricing mechanisms. We synthesize and summarize different provision techniques, approaches, and models through a comprehensive literature review. A thematic taxonomy of cloud service provisioning is presented after the systematic review. Finally, future research directions and open research issues are identified.

  10. A Study on Strategic Provisioning of Cloud Computing Services

    Science.gov (United States)

    Rejaul Karim Chowdhury, Md

    2014-01-01

    Cloud computing is currently emerging as an ever-changing, growing paradigm that models “everything-as-a-service.” Virtualised physical resources, infrastructure, and applications are supplied by service provisioning in the cloud. The evolution in the adoption of cloud computing is driven by clear and distinct promising features for both cloud users and cloud providers. However, the increasing number of cloud providers and the variety of service offerings have made it difficult for customers to choose the best services. Successful service provisioning can guarantee the essential services required by customers, such as agility and availability, pricing, security and trust, and user metrics. Hence, continuous service provisioning that satisfies the user requirements is a mandatory feature for the cloud user and vitally important in cloud computing service offerings. Therefore, we aim to review the state-of-the-art service provisioning objectives, essential services, topologies, user requirements, necessary metrics, and pricing mechanisms. We synthesize and summarize different provision techniques, approaches, and models through a comprehensive literature review. A thematic taxonomy of cloud service provisioning is presented after the systematic review. Finally, future research directions and open research issues are identified. PMID:25032243

  11. Learning to play like a human: case injected genetic algorithms for strategic computer gaming

    Science.gov (United States)

    Louis, Sushil J.; Miles, Chris

    2006-05-01

    We use case-injected genetic algorithms to learn how to competently play computer strategy games that involve long-range planning across complex dynamics. The imperfect knowledge presented to players requires them to adapt their strategies in order to anticipate opponent moves. We focus on the problem of acquiring knowledge learned from human players; in particular, we learn general routing information from a human player in the context of a strike-force planning game. By incorporating case injection into a genetic algorithm, we show methods for incorporating general knowledge elicited from human players into future plans, in effect allowing the GA to take important strategic elements from human play and merge those elements into its own strategic thinking. Results show that with an appropriate representation, case injection is effective at biasing the genetic algorithm toward producing plans that contain important strategic elements used by human players.
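
    The case-injection idea described above can be illustrated with a small sketch. The following Python fragment is not the authors' implementation; the population size, injection period, and the plan-scoring routine (`fitness`) are illustrative assumptions, and the "cases" are simply chromosomes recorded from human play.

```python
import random

def case_injected_ga(fitness, case_base, genome_len, pop_size=50,
                     generations=200, inject_every=10, inject_count=3):
    """Toy GA that periodically injects human-derived cases (chromosomes)."""
    pop = [[random.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    for gen in range(generations):
        # Periodically replace the worst individuals with stored human cases.
        if case_base and gen % inject_every == 0:
            pop.sort(key=fitness)
            for i in range(min(inject_count, len(case_base))):
                pop[i] = list(random.choice(case_base))
        # Standard tournament selection, one-point crossover, bit-flip mutation.
        new_pop = []
        while len(new_pop) < pop_size:
            a, b = (max(random.sample(pop, 3), key=fitness) for _ in range(2))
            cut = random.randrange(1, genome_len)
            child = a[:cut] + b[cut:]
            if random.random() < 0.05:
                j = random.randrange(genome_len)
                child[j] ^= 1
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)
```

    In this reading, the case base biases the search toward plans that resemble human play while the GA remains free to improve on them.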

  12. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    International Nuclear Information System (INIS)

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan is the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used worldwide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper will describe and define CM elements, and discuss how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of the CASE tool provides a methodology for consistency in approach, graphics, and database capability combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than those stated above. Some examples are supporting a joint application development group (JAD) to prepare a software functional specification document and, if necessary, providing the capability to automatically generate software application code. This paper briefly discusses characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables.

  13. Fast acceleration of 2D wave propagation simulations using modern computational accelerators.

    Directory of Open Access Journals (Sweden)

    Wei Wang

    Full Text Available Recent developments in modern computational accelerators like Graphics Processing Units (GPUs) and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as an Intel Many Integrated Core (MIC) coprocessor to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case-study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved more than 150x speedup over the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation provided speedups on GPUs of at least 200x relative to the sequential implementation and 30x relative to a parallelized OpenMP implementation. An implementation of OpenMP on the Intel MIC coprocessor provided speedups of 120x with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieve parallelization of 2D cardiac wave simulations on GPUs. Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other

  14. Accelerating MATLAB with GPU computing: a primer with examples

    CERN Document Server

    Suh, Jung W

    2013-01-01

    Beyond simulation and algorithm development, many developers increasingly use MATLAB even for product deployment in computationally heavy fields. This often demands that MATLAB codes run faster by leveraging the distributed parallelism of Graphics Processing Units (GPUs). While MATLAB successfully provides high-level functions as a simulation tool for rapid prototyping, the underlying details and knowledge needed for utilizing GPUs make MATLAB users hesitate to step into it. Accelerating MATLAB with GPUs offers a primer on bridging this gap. Starting with the basics, setting up MATLAB for

  15. Towards full automation of accelerators through computer control

    CERN Document Server

    Gamble, J; Kemp, D; Keyser, R; Koutchouk, Jean-Pierre; Martucci, P P; Tausch, Lothar A; Vos, L

    1980-01-01

    The computer control system of the Intersecting Storage Rings (ISR) at CERN has always laid emphasis on two particular operational aspects, the first being the reproducibility of machine conditions and the second that of giving the operators the possibility to work in terms of machine parameters such as the tune. Already certain phases of the operation are optimized by the control system, whilst others are automated with a minimum of manual intervention. The authors describe the present control system with emphasis on the existing automated facilities and the features of the control system which make them possible. The paper then discusses the steps needed to completely automate the operational procedure of accelerators. (7 refs).

  16. Distance Computation Between Non-Holonomic Motions with Constant Accelerations

    Directory of Open Access Journals (Sweden)

    Enrique J. Bernabeu

    2013-09-01

    Full Text Available A method for computing the distance between two moving robots or between a mobile robot and a dynamic obstacle with linear or arc‐like motions and with constant accelerations is presented in this paper. This distance is obtained without stepping or discretizing the motions of the robots or obstacles. The robots and obstacles are modelled by convex hulls. This technique obtains the future instant in time when two moving objects will be at their minimum translational distance ‐ i.e., at their minimum separation or maximum penetration (if they will collide). This distance and the future instant in time are computed in parallel. This method is intended to be run each time new information from the world is received and, consequently, it can be used for generating collision‐free trajectories for non‐holonomic mobile robots.
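
    As a simplified illustration of finding the instant of minimum separation without time-stepping, the sketch below treats the two objects as points with constant accelerations (the paper itself works with convex hulls and arc-like motions); the squared separation is a quartic in time, so its stationary points are the real roots of a cubic. All names and the time horizon are illustrative assumptions.

```python
import numpy as np

def min_separation(p1, v1, a1, p2, v2, a2, horizon):
    """Closest approach of two point objects moving with constant accelerations.

    Relative motion: r(t) = dp + dv*t + 0.5*da*t**2.  The squared distance
    |r(t)|**2 is a quartic in t; its stationary points are roots of a cubic.
    """
    dp = np.asarray(p1, float) - np.asarray(p2, float)
    dv = np.asarray(v1, float) - np.asarray(v2, float)
    da = np.asarray(a1, float) - np.asarray(a2, float)
    # d/dt |r(t)|^2 = 2 * r(t) . (dv + da*t)  ->  cubic coefficients below
    c3 = 0.5 * (da @ da)
    c2 = 1.5 * (dv @ da)
    c1 = dv @ dv + dp @ da
    c0 = dp @ dv
    roots = np.roots([c3, c2, c1, c0]) if abs(c3) > 1e-12 else np.roots([c2, c1, c0])
    candidates = [0.0, horizon] + [r.real for r in roots
                                   if abs(r.imag) < 1e-9 and 0.0 <= r.real <= horizon]
    dist = lambda t: np.linalg.norm(dp + dv * t + 0.5 * da * t ** 2)
    t_min = min(candidates, key=dist)
    return t_min, dist(t_min)
```

    Extending this to convex hulls, as in the paper, amounts to minimizing the same kind of expression over pairs of support points rather than over a single relative trajectory.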

  17. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    Energy Technology Data Exchange (ETDEWEB)

    Smith, P.R.; Sarfaty, R.

    1993-05-01

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan is the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used worldwide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper will describe and define CM elements, and discuss how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of the CASE tool provides a methodology for consistency in approach, graphics, and database capability combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than those stated above. Some examples are supporting a joint application development group (JAD) to prepare a software functional specification document and, if necessary, providing the capability to automatically generate software application code. This paper briefly discusses characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables.

  18. Computer network for on-line control system of the IHEP ring accelerator

    International Nuclear Information System (INIS)

    A block diagram for the computer network of the IHEP ring accelerator control system is substantiated. The interface card for the ES-1010 computer, which operates simultaneously on 4 channels, is described. The system software for the computer network is also considered.

  19. Accelerating Battery Design Using Computer-Aided Engineering Tools: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Pesaran, A.; Heon, G. H.; Smith, K.

    2011-01-01

    Computer-aided engineering (CAE) is a proven pathway, especially in the automotive industry, to improve performance by resolving the relevant physics in complex systems, shortening the product development design cycle, thus reducing cost, and providing an efficient way to evaluate parameters for robust designs. Academic models include the relevant physics details, but neglect engineering complexities. Industry models include the relevant macroscopic geometry and system conditions, but simplify the fundamental physics too much. Most of the CAE battery tools for in-house use are custom model codes and require expert users. There is a need to make these battery modeling and design tools more accessible to end users such as battery developers, pack integrators, and vehicle makers. Developing integrated and physics-based CAE battery tools can reduce the design, build, test, break, re-design, re-build, and re-test cycle and help lower costs. NREL has been involved in developing various models to predict the thermal and electrochemical performance of large-format cells and has used commercial three-dimensional finite-element analysis and computational fluid dynamics tools to study battery pack thermal issues. These NREL cell and pack design tools can be integrated to help support the automotive industry and to accelerate battery design.

  20. How endemic countries can accelerate lymphatic filariasis elimination? An analytical review to identify strategic and programmatic interventions

    Directory of Open Access Journals (Sweden)

    Chandrakant Lahariya & Shailendra S. Tomar

    2011-03-01

    Full Text Available Lymphatic filariasis (LF) is endemic in 81 countries in the world, and a number of these countries have targeted LF for elimination. This review of literature and analysis was conducted to identify additional and sustainable strategies to accelerate LF elimination from endemic countries. This review noted that adverse events due to mass drug administration (MDA) of diethyl carbamazine (DEC) tablets, poor knowledge and information about LF amongst health workers and community members, and limited focus on information, education and communication (IEC) activities and interpersonal communication are the major barriers in LF elimination. New approaches to increase compliance with DEC tablets (including exploring the possibility of DEC fortification of salt), targeted education programmes for physicians and health workers, and IEC material and interpersonal communication to improve the knowledge of the community are immediately required. There is a renewed and pressing need to conduct operational research, evolve sustainable and institutional mechanisms for the education of physicians and health workers, ensure the quality of trainings on MDA, strengthen IEC delivery mechanisms, implement internal and external monitoring of MDA activities, ensure sufficient funding in a timely manner, and improve political and programmatic leadership. It is also time that lessons from other elimination programmes are utilized to accelerate targeted LF elimination from the endemic countries.

  1. Computer codes for particle accelerator design and analysis: A compendium. Second edition

    International Nuclear Information System (INIS)

    The design of the next generation of high-energy accelerators will probably be done as an international collaborative effort, and it would make sense to establish, either formally or informally, an international center for accelerator codes with branches for maintenance, distribution, and consultation at strategically located accelerator centers around the world. This arrangement could have at least three beneficial effects. It would cut down duplication of effort, provide long-term support for the best codes, and provide a stimulating atmosphere for the evolution of new codes. It does not take much foresight to see that the natural evolution of accelerator design codes is toward the development of so-called Expert Systems, systems capable of taking design specifications of future accelerators and producing specifications for optimized magnetic transport and acceleration components, making a layout, and giving a fairly impartial cost estimate. Such an expert program would use present-day programs such as TRANSPORT, POISSON, and SUPERFISH as tools in the optimization process. Such a program would also serve to codify the experience of two generations of accelerator designers before it is lost as these designers reach retirement age. This document describes 203 codes that originate from 10 countries and are currently in use. The authors feel that this compendium will contribute to the dialogue supporting the international collaborative effort that is taking place in the field of accelerator physics today.

  2. The computer based patient record: a strategic issue in process innovation.

    Science.gov (United States)

    Sicotte, C; Denis, J L; Lehoux, P

    1998-12-01

    Reengineering of the workplace through Information Technology is an important strategic issue for today's hospitals. The computer-based patient record (CPR) is one technology that has the potential to profoundly modify the work routines of the care unit. This study investigates a CPR project aimed at allowing physicians and nurses to work in a completely electronic environment. The focus of our analysis was the patient nursing care process. The rationale behind the introduction of this technology was based on its alleged capability to both enhance quality of care and control costs. This is done by better managing the flow of information within the organization and by introducing mechanisms such as the timeless and spaceless organization of the work place, de-localization, and automation of work processes. The present case study analyzed the implementation of a large CPR project ($45 million U.S.) conducted in four hospitals in joint venture with two computer firms. The computerized system had to be withdrawn because of boycotts from both the medical and nursing personnel. User-resistance was not the problem. Despite its failure, this project was a good opportunity to understand better the intricate complexity of introducing technology in professional work where the usefulness of information is short lived and where it is difficult to predetermine the relevancy of information. Profound misconceptions in achieving a tighter fit (synchronization) between care processes and information processes were the main problems. PMID:9871877

  3. Cloud Computing and Validated Learning for Accelerating Innovation in IoT

    Science.gov (United States)

    Suciu, George; Todoran, Gyorgy; Vulpe, Alexandru; Suciu, Victor; Bulca, Cristina; Cheveresan, Romulus

    2015-01-01

    Innovation in Internet of Things (IoT) requires more than just creation of technology and use of cloud computing or big data platforms. It requires accelerated commercialization, or aptly named go-to-market processes. To successfully accelerate, companies need a new type of product development, the so-called validated learning process.…

  4. Electromagnetic field computation and optimization in accelerator dipole magnets. Doctoral thesis

    Energy Technology Data Exchange (ETDEWEB)

    Ikaeheimo, J.

    1996-03-01

    Contents: Introduction; Dipole magnets in particle accelerators; Field computation; Optimization of the straight section of the coil; Optimization of the end section of the coil; Adaptive mesh generation; Conclusions.

  5. Combined Compute and Storage: Configurable Memristor Arrays to Accelerate Search

    OpenAIRE

    Liu, Yang; Dwyer, Chris; Lebeck, Alvin R.

    2016-01-01

    Emerging technologies present opportunities for system designers to meet the challenges presented by competing trends of big data analytics and limitations on CMOS scaling. Specifically, memristors are an emerging high-density technology where the individual memristors can be used as storage or to perform computation. The voltage applied across a memristor determines its behavior (storage vs. compute), which enables a configurable memristor substrate that can embed computation with storage. T...

  6. Cloud computing approaches to accelerate drug discovery value chain.

    Science.gov (United States)

    Garg, Vibhav; Arora, Suchir; Gupta, Chitra

    2011-12-01

    Continued advancements in the area of technology have helped high throughput screening (HTS) evolve from a linear to a parallel approach by performing system level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data of the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze this data to identify informational tags. This need is again posing challenges to computer scientists to offer the matching hardware and software infrastructure, while managing the varying degree of desired computational power. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SAAS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as Cloud Computing, is now transforming the drug discovery research. Also, integration of Cloud computing with parallel computing is certainly expanding its footprint in the life sciences community. The speed, efficiency and cost effectiveness have made cloud computing a 'good-to-have' tool for researchers, providing them significant flexibility, allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, Discovery-Cloud would fit best to manage drug discovery and clinical development data, generated using advanced HTS techniques, hence supporting the vision of personalized medicine.

  7. Accelerating patch-based directional wavelets with multicore parallel computing in compressed sensing MRI.

    Science.gov (United States)

    Li, Qiyue; Qu, Xiaobo; Liu, Yunsong; Guo, Di; Lai, Zongying; Ye, Jing; Chen, Zhong

    2015-06-01

    Compressed sensing MRI (CS-MRI) is a promising technology to accelerate magnetic resonance imaging. Both improving the image quality and reducing the computation time are important for this technology. Recently, a patch-based directional wavelet (PBDW) has been applied in CS-MRI to improve edge reconstruction. However, this method is time consuming since it involves extensive computations, including geometric direction estimation and numerous iterations of wavelet transform. To accelerate computations of PBDW, we propose a general parallelization of patch-based processing by taking advantage of multicore processors. Additionally, two pertinent optimizations, excluding smooth patches and pre-arranged insertion sort, that make use of sparsity in MR images are also proposed. Simulation results demonstrate that the acceleration factor with the parallel architecture of PBDW approaches the number of central processing unit cores, and that the pertinent optimizations are also effective in providing further acceleration. The proposed approaches allow compressed sensing MRI reconstruction to be accomplished within several seconds.
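
    As a rough illustration of the parallelization strategy (not the authors' PBDW code), the sketch below distributes patch-wise processing over CPU cores and skips smooth patches whose intensity variance falls below an assumed threshold; the patch size, threshold, and per-patch transform are placeholders.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

PATCH = 8           # assumed patch edge length
SMOOTH_VAR = 1e-6   # assumed variance threshold for "smooth" patches

def process_patch(patch):
    """Placeholder for the per-patch work (e.g., direction estimation plus a transform)."""
    if patch.var() < SMOOTH_VAR:
        return patch                      # smooth patch: skip the expensive processing
    return np.fft.fft2(patch).real        # stand-in for a directional transform

def parallel_patches(image, workers=4):
    rows, cols = image.shape
    patches = [image[i:i + PATCH, j:j + PATCH]
               for i in range(0, rows, PATCH) for j in range(0, cols, PATCH)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(process_patch, patches))
    out = np.empty(image.shape, dtype=float)
    k = 0
    for i in range(0, rows, PATCH):
        for j in range(0, cols, PATCH):
            out[i:i + PATCH, j:j + PATCH] = results[k]
            k += 1
    return out
```

    Because each patch is independent, the achievable speedup scales with the number of cores, which matches the behaviour reported in the abstract.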

  8. Lua(Jit) for computing accelerator beam physics

    CERN Document Server

    CERN. Geneva

    2016-01-01

    As mentioned in the 2nd developers meeting, I would like to open the debate with a special presentation on another language - Lua, and a tremendous technology - LuaJit. Lua is much less known at CERN, but it is very simple, much smaller than Python and its JIT is extremely performant. The language is a dynamic scripting language that is easy to learn and easy to embed in applications. I will show how we use it in HPC for accelerator beam physics as a replacement for C, C++, Fortran and Python, with some benchmarks versus Python, PyPy4 and C/C++.

  9. Convergence acceleration and shock fitting for transonic aerodynamics computations

    Science.gov (United States)

    Hafez, M. M.; Cheng, H. K.

    1975-01-01

    Two problems in computational fluid dynamics are studied in the context of transonic small-disturbance theory - namely, (1) how to speed up the convergence for currently available iterative procedures, and (2) how a shock-fitting method may be adapted to existing relaxation procedures with minimal alterations in computer programming and storage requirements. The paper contributes to a clarification of error analyses for sequence transformations based on the power method (including also the nonlinear transforms of Aitken, Shanks, and Wilkinson), and to developing a cyclic iterative procedure applying the transformations. Examples testing the procedure for a model Dirichlet problem and for a transonic airfoil problem show that savings in computer time by a factor of three to five are generally possible, depending on accuracy requirements and the particular iterative procedure used.
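
    Of the sequence transformations named above, Aitken's delta-squared process is the simplest to illustrate. The sketch below is an illustrative example (not the paper's cyclic procedure), applied to a slowly converging fixed-point iteration.

```python
import math

def aitken(seq):
    """Aitken's delta-squared extrapolation of a convergent sequence."""
    out = []
    for n in range(len(seq) - 2):
        x0, x1, x2 = seq[n], seq[n + 1], seq[n + 2]
        denom = x2 - 2.0 * x1 + x0
        out.append(x2 - (x2 - x1) ** 2 / denom if denom != 0 else x2)
    return out

# Example: fixed-point iteration x <- cos(x), which converges slowly to ~0.739085
xs = [1.0]
for _ in range(10):
    xs.append(math.cos(xs[-1]))
print(xs[-1], aitken(xs)[-1])   # the accelerated value is much closer to the limit
```

    In the relaxation setting of the paper, the same idea is applied cyclically to the iterates of the solver rather than to a scalar sequence.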

  10. Computational algorithms for multiphase magnetohydrodynamics and applications to accelerator targets

    Directory of Open Access Journals (Sweden)

    R.V. Samulyak

    2010-01-01

    Full Text Available An interface-tracking numerical algorithm for the simulation of magnetohydrodynamic multiphase/free surface flows in the low-magnetic-Reynolds-number approximation of (Samulyak R., Du J., Glimm J., Xu Z., J. Comp. Phys., 2007, 226, 1532) is described. The algorithm has been implemented in the multi-physics code FronTier and used for the simulation of MHD processes in liquids and weakly ionized plasmas. In this paper, numerical simulations of a liquid mercury jet entering a strong and nonuniform magnetic field and interacting with a powerful proton pulse have been performed and compared with experiments. Such a mercury jet is a prototype of the proposed Muon Collider/Neutrino Factory, a future particle accelerator. Simulations demonstrate the elliptic distortion of the mercury jet as it enters the magnetic solenoid at a small angle to the magnetic axis, the jet-surface instabilities (filamentation) induced by the interaction with proton pulses, and the stabilizing effect of the magnetic field.

  11. Computational Science Guides and Accelerates Hydrogen Research (Fact Sheet)

    Energy Technology Data Exchange (ETDEWEB)

    2010-12-01

    This fact sheet describes NREL's accomplishments in using computational science to enhance hydrogen-related research and development in areas such as storage and photobiology. Work was performed by NREL's Chemical and Materials Science Center and Biosciences Center.

  12. Accelerator

    International Nuclear Information System (INIS)

    The invention claims equipment for stabilizing the position of the front covers of the accelerator chamber in cyclic accelerators which significantly increases accelerator reliability. For stabilizing, it uses hydraulic cushions placed between the electromagnet pole pieces and the front chamber covers. The top and the bottom cushions are hydraulically connected. The cushions are disconnected and removed from the hydraulic line using valves. (J.P.)

  13. Computation of Eigenmodes in Long and Complex Accelerating Structures by Means of Concatenation Strategies

    CERN Document Server

    Fligsen, T; Van Rienen, U

    2014-01-01

    The computation of eigenmodes for complex accelerating structures is a challenging and important task for the design and operation of particle accelerators. Discretizing long and complex structures to determine their eigenmodes leads to demanding computations typically performed on supercomputers. This contribution presents an application example of a method to compute eigenmodes and other parameters derived from these eigenmodes for long and complex structures using standard workstation computers. This is accomplished by the decomposition of the complex structure into several single segments. In the next step, the electromagnetic properties of the segments are described in terms of a compact state-space model. Subsequently, the state-space models of the single structures are concatenated to form the full structure. The results of direct calculations are compared with results obtained by the concatenation scheme in terms of computational time and accuracy.

  14. Modeling Strategic Use of Human Computer Interfaces with Novel Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Laura Jane Mariano

    2015-07-01

    Full Text Available Immersive software tools are virtual environments designed to give their users an augmented view of real-world data and ways of manipulating that data. As virtual environments, every action users make while interacting with these tools can be carefully logged, as can the state of the software and the information it presents to the user, giving these actions context. This data provides a high-resolution lens through which dynamic cognitive and behavioral processes can be viewed. In this report, we describe new methods for the analysis and interpretation of such data, utilizing a novel implementation of the Beta Process Hidden Markov Model (BP-HMM) for analysis of software activity logs. We further report the results of a preliminary study designed to establish the validity of our modeling approach. A group of 20 participants were asked to play a simple computer game, instrumented to log every interaction with the interface. Participants had no previous experience with the game’s functionality or rules, so the activity logs collected during their naïve interactions capture patterns of exploratory behavior and skill acquisition as they attempted to learn the rules of the game. Pre- and post-task questionnaires probed for self-reported styles of problem solving, as well as task engagement, difficulty, and workload. We jointly modeled the activity log sequences collected from all participants using the BP-HMM approach, identifying a global library of activity patterns representative of the collective behavior of all the participants. Analyses show systematic relationships between both pre- and post-task questionnaires, self-reported approaches to analytic problem solving, and metrics extracted from the BP-HMM decomposition. Overall, we find that this novel approach to decomposing unstructured behavioral data within software environments provides a sensible means for understanding how users learn to integrate software functionality for strategic

  15. Computer control of large accelerators design concepts and methods

    International Nuclear Information System (INIS)

    Unlike most of the specialities treated in this volume, control system design is still an art, not a science. These lectures are an attempt to produce a primer for prospective practitioners of this art. A large modern accelerator requires a comprehensive control system for commissioning, machine studies and day-to-day operation. Faced with the requirement to design a control system for such a machine, the control system architect has a bewildering array of technical devices and techniques at his disposal, and it is our aim in the following chapters to lead him through the characteristics of the problems he will have to face and the practical alternatives available for solving them. We emphasize good system architecture using commercially available hardware and software components, but in addition we discuss the actual control strategies which are to be implemented since it is at the point of deciding what facilities shall be available that the complexity of the control system and its cost are implicitly decided. 19 references

  16. Modern hardware architectures accelerate porous media flow computations

    Science.gov (United States)

    Kulczewski, Michal; Kurowski, Krzysztof; Kierzynka, Michal; Dohnalik, Marek; Kaczmarczyk, Jan; Borujeni, Ali Takbiri

    2012-05-01

    Investigation of rock properties, porosity and permeability particularly, which determine the transport characteristics of the medium, is crucial to reservoir engineering. Nowadays, micro-tomography (micro-CT) methods allow one to obtain a vast range of petro-physical properties. The micro-CT method facilitates visualization of pore structures and acquisition of the total porosity factor, determined by stacking 2D slices of scanned rock and applying a proper absorption cut-off point. Proper segmentation of the pore representation in 3D is important for solving the permeability of porous media. This factor is now commonly determined by means of Computational Fluid Dynamics (CFD), a popular method to analyze problems related to fluid flows, taking advantage of numerical methods and constantly growing computing power. The recent advent of novel multi-core, many-core and graphics processing unit (GPU) hardware architectures allows scientists to benefit even more from parallel processing and new built-in features. The high level of parallel scalability offers both a decrease in time-to-solution and greater accuracy - top factors in reservoir engineering. This paper aims to present research results related to fluid flow simulations, particularly solving the total porosity and permeability of porous media, taking advantage of modern hardware architectures. In our approach total porosity is calculated by means of general-purpose computing on multiple GPUs. This application stacks together 2D slices of scanned rock and, by means of a marching tetrahedra algorithm, creates a 3D representation of the pores and calculates the total porosity. Experimental results are compared with data obtained via other popular methods, including Nuclear Magnetic Resonance (NMR), helium porosity and nitrogen permeability tests. Then CFD simulations are performed on a large-scale high performance hardware architecture to solve the flow and permeability of porous media. In our experiments we used Lattice Boltzmann
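
    A minimal sketch of the total-porosity step described above (not the authors' GPU implementation): given a stack of segmented micro-CT slices, the porosity is simply the fraction of voxels classified as pore after applying an absorption cut-off. The cut-off value below is an assumption.

```python
import numpy as np

def total_porosity(slices, cutoff=120):
    """Stack 2D micro-CT slices into a 3D volume and compute the pore fraction.

    `slices` is an iterable of 2D grayscale arrays; voxels with intensity
    below `cutoff` (an assumed absorption threshold) are treated as pore space.
    """
    volume = np.stack(list(slices), axis=0)      # shape: (n_slices, rows, cols)
    pore_mask = volume < cutoff
    return pore_mask.sum() / pore_mask.size

# Example with synthetic data (uniform noise, so roughly cutoff/256 of voxels read as pore)
rng = np.random.default_rng(0)
fake_slices = [rng.integers(0, 256, size=(64, 64)) for _ in range(32)]
print(round(total_porosity(fake_slices), 3))
```

    The GPU work in the paper parallelizes this counting and the subsequent surface extraction (marching tetrahedra) over many voxels at once; the permeability is then obtained from a separate CFD solve.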

  17. Computational Tools for Accelerating Carbon Capture Process Development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  18. The operant reserve: a computer simulation in (accelerated) real time.

    Science.gov (United States)

    Catania, A Charles

    2005-05-31

    In Skinner's Reflex Reserve theory, reinforced responses added to a reserve depleted by responding. It could not handle the finding that partial reinforcement generated more responding than continuous reinforcement, but it would have worked if its growth had depended not just on the last response but also on earlier responses preceding a reinforcer, each weighted by delay. In that case, partial reinforcement generates steady states in which reserve decrements produced by responding balance increments produced when reinforcers follow responding. A computer simulation arranged schedules for responses produced with probabilities proportional to reserve size. Each response subtracted a fixed amount from the reserve and added an amount weighted by the reciprocal of the time to the next reinforcer. Simulated cumulative records and quantitative data for extinction, random-ratio, random-interval, and other schedules were consistent with those of real performances, including some effects of history. The model also simulated rapid performance transitions with changed contingencies that did not depend on molar variables or on differential reinforcement of inter-response times. The simulation can be extended to inhomogeneous contingencies by way of continua of reserves arrayed along response and time dimensions, and to concurrent performances and stimulus control by way of different reserves created for different response classes. PMID:15845312
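
    The mechanics described above can be sketched in a few lines. The following Python fragment is an illustrative reading of the model (random-ratio schedule only), with the response cost, increment scale, and time step chosen arbitrarily rather than taken from the paper.

```python
import random

def simulate_reserve(steps=10000, ratio=20, cost=0.001, gain=0.05, dt=1.0):
    """Toy operant-reserve simulation on a random-ratio schedule.

    Each time step, a response is emitted with probability proportional to the
    reserve.  Every response subtracts a fixed amount; when a reinforcer occurs,
    recent responses add to the reserve, weighted by 1/(time to the reinforcer).
    """
    reserve, responses, pending = 0.5, 0, []   # pending: times of unreinforced responses
    t = 0.0
    for _ in range(steps):
        t += dt
        if random.random() < reserve:           # response probability ~ reserve size
            responses += 1
            reserve = max(0.0, reserve - cost)
            pending.append(t)
            if random.random() < 1.0 / ratio:    # random-ratio reinforcement
                for t_resp in pending:
                    reserve += gain / (t - t_resp + dt)   # delay-weighted increment
                reserve = min(reserve, 1.0)
                pending.clear()
    return responses, reserve

print(simulate_reserve())
```

    At steady state the depletion per response and the delay-weighted increments at reinforcement roughly balance, which is the equilibrium the abstract describes.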

  19. Accelerating unstructured finite volume computations on field-programmable gate arrays

    OpenAIRE

    Nagy, Zoltan; Nemes, Csaba; Hiba, Antal; Csik, Arpad; Kiss, Andras; Ruszinko, Miklos; Szolgay, Peter

    2014-01-01

    Accurate simulation of various physical processes on digital computers requires huge computing performance; therefore, accelerating these scientific and engineering applications is of great importance. The density of programmable logic devices doubles every 18 months according to Moore's Law. On recent devices, around one hundred double-precision floating-point adders and multipliers can be implemented. In the paper an FPGA-based framework is described to efficiently utilize this huge compu...

  20. Accelerate!

    Science.gov (United States)

    Kotter, John P

    2012-11-01

    The old ways of setting and implementing strategy are failing us, writes the author of Leading Change, in part because we can no longer keep up with the pace of change. Organizational leaders are torn between trying to stay ahead of increasingly fierce competition and needing to deliver this year's results. Although traditional hierarchies and managerial processes--the components of a company's "operating system"--can meet the daily demands of running an enterprise, they are rarely equipped to identify important hazards quickly, formulate creative strategic initiatives nimbly, and implement them speedily. The solution Kotter offers is a second system--an agile, networklike structure--that operates in concert with the first to create a dual operating system. In such a system the hierarchy can hand off the pursuit of big strategic initiatives to the strategy network, freeing itself to focus on incremental changes to improve efficiency. The network is populated by employees from all levels of the organization, giving it organizational knowledge, relationships, credibility, and influence. It can liberate information from silos with ease. It has a dynamic structure free of bureaucratic layers, permitting a level of individualism, creativity, and innovation beyond the reach of any hierarchy. The network's core is a guiding coalition that represents each level and department in the hierarchy, with a broad range of skills. Its drivers are members of a "volunteer army" who are energized by and committed to the coalition's vividly formulated, high-stakes vision and strategy. Kotter has helped eight organizations, public and private, build dual operating systems over the past three years. He predicts that such systems will lead to long-term success in the 21st century--for shareholders, customers, employees, and companies themselves. PMID:23155997

  1. Accelerate!

    Science.gov (United States)

    Kotter, John P

    2012-11-01

    The old ways of setting and implementing strategy are failing us, writes the author of Leading Change, in part because we can no longer keep up with the pace of change. Organizational leaders are torn between trying to stay ahead of increasingly fierce competition and needing to deliver this year's results. Although traditional hierarchies and managerial processes--the components of a company's "operating system"--can meet the daily demands of running an enterprise, they are rarely equipped to identify important hazards quickly, formulate creative strategic initiatives nimbly, and implement them speedily. The solution Kotter offers is a second system--an agile, networklike structure--that operates in concert with the first to create a dual operating system. In such a system the hierarchy can hand off the pursuit of big strategic initiatives to the strategy network, freeing itself to focus on incremental changes to improve efficiency. The network is populated by employees from all levels of the organization, giving it organizational knowledge, relationships, credibility, and influence. It can liberate information from silos with ease. It has a dynamic structure free of bureaucratic layers, permitting a level of individualism, creativity, and innovation beyond the reach of any hierarchy. The network's core is a guiding coalition that represents each level and department in the hierarchy, with a broad range of skills. Its drivers are members of a "volunteer army" who are energized by and committed to the coalition's vividly formulated, high-stakes vision and strategy. Kotter has helped eight organizations, public and private, build dual operating systems over the past three years. He predicts that such systems will lead to long-term success in the 21st century--for shareholders, customers, employees, and companies themselves.

  2. Improving Quality of Service and Reducing Power Consumption with WAN accelerator in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Shin-ichi Kuribayashi

    2013-02-01

    Full Text Available The widespread use of cloud computing services is expected to deteriorate the Quality of Service and to increase the power consumption of ICT devices, since the distance to a server becomes longer than before. Migration of virtual machines over a wide area can solve many problems such as load balancing and power saving in cloud computing environments. This paper proposes to dynamically apply a WAN accelerator within the network when a virtual machine is moved to a distant center, in order to prevent the degradation in performance after live migration of virtual machines over a wide area. mSCTP-based data transfer using different TCP connections before and after migration is proposed in order to use a currently available WAN accelerator. This paper does not consider the performance degradation of live migration itself. This paper then proposes to reduce the power consumption of ICT devices, which consists of installing WAN accelerators actively as part of cloud resources and increasing the packet transfer rate of the communication link temporarily. It is demonstrated that the power consumption with a WAN accelerator could be reduced to one-tenth of that without a WAN accelerator.

  3. Acceleration of the matrix multiplication of Radiance three phase daylighting simulations with parallel computing on heterogeneous hardware of personal computer

    Energy Technology Data Exchange (ETDEWEB)

    Zuo, Wangda [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McNeil, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wetter, Michael [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lee, Eleanor S. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2013-05-23

    Building designers are increasingly relying on complex fenestration systems to reduce energy consumed for lighting and HVAC in low energy buildings. Radiance, a lighting simulation program, has been used to conduct daylighting simulations for complex fenestration systems. Depending on the configurations, the simulation can take hours or even days using a personal computer. This paper describes how to accelerate the matrix multiplication portion of a Radiance three-phase daylight simulation by conducting parallel computing on heterogeneous hardware of a personal computer. The algorithm was optimized and the computational part was implemented in parallel using OpenCL. The speed of the new approach was evaluated using various daylighting simulation cases on a multicore central processing unit and a graphics processing unit. Based on the measurements and analysis of the time usage for the Radiance daylighting simulation, further speedups can be achieved by using fast I/O devices and storing the data in a binary format.
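
    The matrix-multiplication step being accelerated is, in essence, the three-phase daylight calculation, commonly written as i = V T D s (view, transmission, and daylight matrices applied to a sky vector). A plain NumPy sketch is shown below; the matrix dimensions are illustrative, and this is not the LBNL OpenCL implementation.

```python
import numpy as np

def three_phase(V, T, D, sky_vectors):
    """Illuminance for each timestep: i = V @ T @ D @ s.

    V: sensor x window-patch view matrix
    T: window-patch x window-patch BSDF transmission matrix
    D: window-patch x sky-patch daylight matrix
    sky_vectors: sky-patch x timestep matrix (one column per timestep)
    """
    VTD = V @ T @ D                 # precompute the flux-transfer matrix once
    return VTD @ sky_vectors        # then one multiply per batch of timesteps

# Illustrative sizes: 100 sensors, 145 window patches, 146 sky patches, 8760 hours
rng = np.random.default_rng(1)
V, T, D = rng.random((100, 145)), rng.random((145, 145)), rng.random((145, 146))
s = rng.random((146, 8760))
print(three_phase(V, T, D, s).shape)   # (100, 8760)
```

    Because the same flux-transfer matrix is applied to thousands of sky vectors, this multiply is the part that benefits most from a GPU, which is the observation the paper exploits.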

  4. Examination of the relationship between Sustainable Competitive Advantage and Strategic Leadership in the Computer Industry: based on the evaluation and analysis of Dell and HP

    OpenAIRE

    Guan, Yueyao

    2008-01-01

    Abstract Based on the evaluation and analysis of the performances of two predominant companies in the computer industry, Hewlett-Packard and Dell Inc., this paper has found that a right strategic move at the right time is crucial but tells only half of the story. In order to achieve long-term success, strategic leaders must make sure that compatibility between the corporate strategy and the competitive advantage exists before a major strategic change is made....

  5. Accelerating patch-based directional wavelets with multicore parallel computing in compressed sensing MRI.

    Science.gov (United States)

    Li, Qiyue; Qu, Xiaobo; Liu, Yunsong; Guo, Di; Lai, Zongying; Ye, Jing; Chen, Zhong

    2015-06-01

    Compressed sensing MRI (CS-MRI) is a promising technology to accelerate magnetic resonance imaging. Both improving the image quality and reducing the computation time are important for this technology. Recently, a patch-based directional wavelet (PBDW) has been applied in CS-MRI to improve edge reconstruction. However, this method is time consuming since it involves extensive computations, including geometric direction estimation and numerous iterations of wavelet transform. To accelerate computations of PBDW, we propose a general parallelization of patch-based processing by taking the advantage of multicore processors. Additionally, two pertinent optimizations, excluding smooth patches and pre-arranged insertion sort, that make use of sparsity in MR images are also proposed. Simulation results demonstrate that the acceleration factor with the parallel architecture of PBDW approaches the number of central processing unit cores, and that pertinent optimizations are also effective to make further accelerations. The proposed approaches allow compressed sensing MRI reconstruction to be accomplished within several seconds. PMID:25620521

  6. Acceleration of Radiance for Lighting Simulation by Using Parallel Computing with OpenCL

    Energy Technology Data Exchange (ETDEWEB)

    Zuo, Wangda; McNeil, Andrew; Wetter, Michael; Lee, Eleanor

    2011-09-06

    We report on the acceleration of annual daylighting simulations for fenestration systems in the Radiance ray-tracing program. The algorithm was optimized to reduce both the redundant data input/output operations and the floating-point operations. To further accelerate the simulation speed, the calculation for matrix multiplications was implemented using parallel computing on a graphics processing unit. We used OpenCL, which is a cross-platform parallel programming language. Numerical experiments show that the combination of the above measures can speed up the annual daylighting simulations 101.7 times or 28.6 times when the sky vector has 146 or 2306 elements, respectively.

  7. Accelerating Astronomy & Astrophysics in the New Era of Parallel Computing: GPUs, Phi and Cloud Computing

    Science.gov (United States)

    Ford, Eric B.; Dindar, Saleh; Peters, Jorg

    2015-08-01

    The realism of astrophysical simulations and statistical analyses of astronomical data are set by the available computational resources. Thus, astronomers and astrophysicists are constantly pushing the limits of computational capabilities. For decades, astronomers benefited from massive improvements in computational power that were driven primarily by increasing clock speeds and required relatively little attention to details of the computational hardware. For nearly a decade, increases in computational capabilities have come primarily from increasing the degree of parallelism, rather than increasing clock speeds. Further increases in computational capabilities will likely be led by many-core architectures such as Graphical Processing Units (GPUs) and Intel Xeon Phi. Successfully harnessing these new architectures requires significantly more understanding of the hardware architecture, cache hierarchy, compiler capabilities and network characteristics. I will provide an astronomer's overview of the opportunities and challenges provided by modern many-core architectures and elastic cloud computing. The primary goal is to help an astronomical audience understand what types of problems are likely to yield more than an order of magnitude speed-up and which problems are unlikely to parallelize sufficiently efficiently to be worth the development time and/or costs. I will draw on my experience leading a team in developing the Swarm-NG library for parallel integration of large ensembles of small n-body systems on GPUs, as well as several smaller software projects. I will share lessons learned from collaborating with computer scientists, including both technical and soft skills. Finally, I will discuss the challenges of training the next generation of astronomers to be proficient in this new era of high-performance computing, drawing on experience teaching a graduate class on High-Performance Scientific Computing for Astrophysics and organizing a 2014 advanced summer

  8. Experimental and Computational Study of Acceleration Response in Layered Cylindrical Structure Considering Impedance Mismatch Effect

    Directory of Open Access Journals (Sweden)

    Sachiko Sueki

    2011-01-01

    Full Text Available Electronic devices, especially those having high performance capabilities, are sensitive to mechanical shocks and vibrations. Failure of such devices in smart projectiles caused by vibrations has been observed. The currently accepted methodology to protect electronic devices in smart projectiles is the use of stiffeners and dampers. However, these methods are not effective in protecting the electronic devices from high frequency accelerations in excess of 5,000 Hz. Therefore, it is important to find more effective methods to reduce high frequency vibrations for smart projectiles. In this study, layered cylindrical structures are studied experimentally and computationally to understand the effect of impedance mismatch on the axial acceleration response under impact loading. Experiments are conducted by applying impact forces at one end of cylindrical structures and measuring accelerations at the other end. Experimental results suggest that high frequency accelerations in layered structures could be lower than those in homogeneous cylinders if a returning wave from the end of the projectile does not interfere with the applied impact force. Computational studies using finite element analysis (FEA) verified the experimental results of our interference hypothesis.
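
    The impedance-mismatch effect the study exploits can be illustrated with the standard one-dimensional relation for stress-wave reflection and transmission at an interface between two materials. The sketch below uses that textbook formula with invented material values; it is not taken from the paper.

```python
def interface_coefficients(rho1, c1, rho2, c2):
    """1-D acoustic impedance Z = rho*c; reflection/transmission of stress amplitude."""
    z1, z2 = rho1 * c1, rho2 * c2
    reflection = (z2 - z1) / (z2 + z1)
    transmission = 2.0 * z2 / (z2 + z1)
    return reflection, transmission

# Illustrative values: a steel layer into a polymer layer (SI units)
r, t = interface_coefficients(rho1=7800, c1=5900, rho2=1200, c2=2300)
print(f"reflection={r:+.2f}, transmission={t:.2f}")  # large mismatch: most incident energy is reflected
```

    A large mismatch between layer impedances reflects much of the incoming stress wave back toward the source, which is why a layered cylinder can pass less high-frequency acceleration to its far end than a homogeneous one.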

  9. A Low-Power Scalable Stream Compute Accelerator for General Matrix Multiply (GEMM

    Directory of Open Access Journals (Sweden)

    Antony Savich

    2014-01-01

    Full Text Available Many applications ranging from machine learning, image processing, and machine vision to optimization utilize matrix multiplication as a fundamental block. Matrix operations play an important role in determining the performance of such applications. This paper proposes a novel efficient, highly scalable hardware accelerator that is of equivalent performance to a 2 GHz quad core PC but can be used in low-power applications targeting embedded systems requiring high performance computation. Power, performance, and resource consumption are demonstrated on a fully-functional prototype. The proposed hardware accelerator is 36× more energy efficient per unit of computation compared to a state-of-the-art Xeon processor of equal vintage and is 14× more efficient as a stand-alone platform with equivalent performance. An important comparison between simulated system estimates and real system performance is carried out.

  10. Proceedings of the conference on computer codes and the linear accelerator community

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, R.K. (comp.)

    1990-07-01

    The conference whose proceedings you are reading was envisioned as the second in a series, the first having been held in San Diego in January 1988. The intended participants were those people who are actively involved in writing and applying computer codes for the solution of problems related to the design and construction of linear accelerators. The first conference reviewed many of the codes both extant and under development. This second conference provided an opportunity to update the status of those codes, and to provide a forum in which emerging new 3D codes could be described and discussed. The afternoon poster session on the second day of the conference provided an opportunity for extended discussion. All in all, this conference was felt to be quite a useful interchange of ideas and developments in the field of 3D calculations, parallel computation, higher-order optics calculations, and code documentation and maintenance for the linear accelerator community. A third conference is planned.

  11. Proceedings of the conference on computer codes and the linear accelerator community

    International Nuclear Information System (INIS)

    The conference whose proceedings you are reading was envisioned as the second in a series, the first having been held in San Diego in January 1988. The intended participants were those people who are actively involved in writing and applying computer codes for the solution of problems related to the design and construction of linear accelerators. The first conference reviewed many of the codes both extant and under development. This second conference provided an opportunity to update the status of those codes, and to provide a forum in which emerging new 3D codes could be described and discussed. The afternoon poster session on the second day of the conference provided an opportunity for extended discussion. All in all, this conference was felt to be quite a useful interchange of ideas and developments in the field of 3D calculations, parallel computation, higher-order optics calculations, and code documentation and maintenance for the linear accelerator community. A third conference is planned

  12. FINAL REPORT DE-FG02-04ER41317 Advanced Computation and Chaotic Dynamics for Beams and Accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Cary, John R [U. Colorado

    2014-09-08

    During the year ending in August 2013, we continued to investigate the potential of photonic crystal (PhC) materials for acceleration purposes. We worked to characterize the acceleration ability of simple PhC accelerator structures, as well as to characterize PhC materials to determine whether current fabrication techniques can meet the needs of future accelerating structures. We have also continued to design and optimize PhC accelerator structures, with the ultimate goal of finding a new kind of accelerator structure that could offer significant advantages over current RF acceleration technology. The design and optimization of these structures requires high-performance computation, and we continue to work on methods to make such computation faster and more efficient.

  13. ACE3P Computations of Wakefield Coupling in the CLIC Two-Beam Accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Candel, Arno; Li, Z.; Ng, C.; Rawat, V.; Schussman, G.; Ko, K.; /SLAC; Syratchev, I.; Grudiev, A.; Wuensch, W.; /CERN

    2010-10-27

    The Compact Linear Collider (CLIC) provides a path to a multi-TeV accelerator to explore the energy frontier of High Energy Physics. Its novel two-beam accelerator concept envisions rf power transfer to the accelerating structures from a separate high-current decelerator beam line consisting of power extraction and transfer structures (PETS). It is critical to numerically verify the fundamental and higher-order mode properties in and between the two beam lines with high accuracy and confidence. To solve these large-scale problems, SLAC's parallel finite element electromagnetic code suite ACE3P is employed. Using curvilinear conformal meshes and higher-order finite element vector basis functions, unprecedented accuracy and computational efficiency are achieved, enabling high-fidelity modeling of complex detuned structures such as the CLIC TD24 accelerating structure. In this paper, time-domain simulations of wakefield coupling effects in the combined system of PETS and the TD24 structures are presented. The results will help to identify potential issues and provide new insights on the design, leading to further improvements on the novel CLIC two-beam accelerator scheme.

  14. A Low-Power Scalable Stream Compute Accelerator for General Matrix Multiply (GEMM)

    OpenAIRE

    Antony Savich; Shawki Areibi

    2014-01-01

    Many applications ranging from machine learning, image processing, and machine vision to optimization utilize matrix multiplication as a fundamental block. Matrix operations play an important role in determining the performance of such applications. This paper proposes a novel efficient, highly scalable hardware accelerator that is of equivalent performance to a 2 GHz quad core PC but can be used in low-power applications targeting embedded systems requiring high performance computation. P...

  15. RACETRACK - a computer code for the simulation of nonlinear particle motion in accelerators

    International Nuclear Information System (INIS)

    RACETRACK is a computer code to simulate transverse nonlinear particle motion in accelerators. Transverse magnetic fields of higher order are treated in the thin-magnet approximation. Multipoles up to 20 poles are included. Energy oscillations due to the nonlinear synchrotron motion are taken into account. Several additional features, such as linear optics calculations, chromaticity adjustment, tune variation, orbit adjustment and others, are available to guarantee a fast treatment of nonlinear dynamical problems. (orig.)
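
    As a rough illustration of the thin-magnet multipole treatment described above (a sketch only, not the RACETRACK code; the function name and strength conventions are illustrative), a single thin-lens multipole kick can be written as:

```python
import math

def thin_multipole_kick(x, px, y, py, knl, ksl):
    """Apply one thin-lens multipole kick (illustrative sketch, not RACETRACK).

    knl, ksl: integrated normal/skew strengths K_n*L, indexed from the
    quadrupole term (n = 1); a 20-pole corresponds to n = 9 in this counting.
    """
    z = x + 1j * y
    kick = 0j
    for n, (kn, ks) in enumerate(zip(knl, ksl), start=1):
        # contribution of the n-th order thin multipole to the transverse kick
        kick += (kn + 1j * ks) * z**n / math.factorial(n)
    return x, px - kick.real, y, py + kick.imag

# usage: one particle passing a thin sextupole with k2*L = 0.8 m^-2
x, px, y, py = thin_multipole_kick(0.01, 0.0, 0.005, 0.0, knl=[0.0, 0.8], ksl=[0.0, 0.0])
```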

  16. Multi-GPU Jacobian accelerated computing for soft-field tomography

    International Nuclear Information System (INIS)

    Image reconstruction in soft-field tomography is based on an inverse problem formulation, where a forward model is fitted to the data. In medical applications, where the anatomy presents complex shapes, it is common to use finite element models (FEMs) to represent the volume of interest and solve a partial differential equation that models the physics of the system. Over the last decade, there has been a shift of interest from 2D modeling to 3D modeling, as the underlying physics of most problems is 3D. Although the increased computational power of modern computers allows working with much larger FEM models, the computational time required to reconstruct 3D images on a fine 3D FEM model can be significant, on the order of hours. For example, in electrical impedance tomography (EIT) applications using a dense 3D FEM mesh with half a million elements, a single reconstruction iteration takes approximately 15–20 min with optimized routines running on a modern multi-core PC. It is desirable to accelerate image reconstruction to enable researchers to more easily and rapidly explore data and reconstruction parameters. Furthermore, providing high-speed reconstructions is essential for some promising clinical applications of EIT. For 3D problems, 70% of the computing time is spent building the Jacobian matrix, and 25% of the time in forward solving. In this work, we focus on accelerating the Jacobian computation by using single and multiple GPUs. First, we discuss an optimized implementation on a modern multi-core PC architecture and show how computing time is bounded by the CPU-to-memory bandwidth; this factor limits the rate at which data can be fetched by the CPU. Gains associated with the use of multiple CPU cores are minimal, since data operands cannot be fetched fast enough to saturate the processing power of even a single CPU core. GPUs have much faster memory bandwidths compared to CPUs and better parallelism. We are able to obtain acceleration factors of 20 times
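
    The Jacobian assembly that dominates the computing time above is commonly expressed with adjoint fields, where each entry is an integral of the dot product of the drive and measurement field gradients over an element. A minimal NumPy sketch of that assembly, under assumed array layouts (the field gradients and element volumes are hypothetical inputs, not the paper's data structures), is:

```python
import numpy as np

def eit_jacobian(grad_u_drive, grad_u_meas, elem_volumes):
    """Illustrative adjoint-field Jacobian assembly for EIT (not the paper's code).

    grad_u_drive : (n_drive, n_elem, 3) gradients of the forward (drive) fields
    grad_u_meas  : (n_meas,  n_elem, 3) gradients of the adjoint (measurement) fields
    elem_volumes : (n_elem,) element volumes of the FEM mesh
    Returns J of shape (n_drive * n_meas, n_elem).
    """
    n_drive, n_elem, _ = grad_u_drive.shape
    n_meas = grad_u_meas.shape[0]
    J = np.empty((n_drive * n_meas, n_elem))
    for d in range(n_drive):
        # element-wise dot product of gradients, weighted by element volume
        J[d * n_meas:(d + 1) * n_meas] = -(grad_u_meas * grad_u_drive[d]).sum(axis=2) * elem_volumes
    return J

# toy usage: 16 drive fields, 16 measurement fields, 5000 elements
J = eit_jacobian(np.random.randn(16, 5000, 3), np.random.randn(16, 5000, 3), np.ones(5000))
```

    Since every block of rows is independent, the same expression maps directly onto one or more GPUs (for example with CuPy in place of NumPy), which is the kind of parallelism this record exploits.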

  17. Enterprise-process: computer-based application for obtaining a process-organisation matrix during strategic information system planning

    Directory of Open Access Journals (Sweden)

    José Alirio Rondón

    2010-04-01

    Full Text Available A lot of material has been published about strategic information system planning (SISP) methodologies. These methods are designed to help information system planners integrate their strategies with organisational strategies. Classic business system planning for strategical alignment (BSP/SA) theory stands out because it gives information systems a reactive role regarding an organisation’s objectives and strategy. BSP/SA has been described in terms of phases and the specific tasks within them. This work was aimed at presenting a computer-based application automating one of the most important tasks in BSP/SA methodology: the process-organisation matrix. This matrix stores information about the levels of responsibility currently assigned to positions and processes. Automating this task has made it easier for students to analyse the process-organisation matrix during SISP workshops forming part of the Systems Management course (Systems Engineering, Universidad Nacional de Colombia), and improved results have thus arisen from such workshops. The present work aims to motivate software development for supporting SISP tasks.

  18. Proposing a Strategic Framework for Distributed Manufacturing Execution System Using Cloud Computing

    Directory of Open Access Journals (Sweden)

    Shiva Khalili Gheidari

    2013-07-01

    Full Text Available This paper introduces a strategic framework that uses service-oriented architecture to design a distributed MES over the cloud. In this study, the main structure of the framework is defined in terms of a series of modules that communicate with each other through a design pattern called mediator. The framework focuses on the main module, which handles distributed orders in cooperation with the other modules, and finally suggests the benefit of using the cloud in comparison with previous architectures. The main structure of the framework (the mediator) and the benefit of focusing on the main module by using the cloud should be elaborated further; in addition, the aim and the results of comparing this method with previous architectures, in qualitative and quantitative terms, are not described.

  19. Accelerating Spaceborne SAR Imaging Using Multiple CPU/GPU Deep Collaborative Computing.

    Science.gov (United States)

    Zhang, Fan; Li, Guojun; Li, Wei; Hu, Wei; Hu, Yuxin

    2016-01-01

    With the development of synthetic aperture radar (SAR) technologies in recent years, the huge amount of remote sensing data brings challenges for real-time imaging processing. Therefore, high performance computing (HPC) methods have been presented to accelerate SAR imaging, especially GPU based methods. In the classical GPU based imaging algorithm, the GPU is employed to accelerate image processing by massive parallel computing, and the CPU is only used to perform auxiliary work such as data input/output (IO). However, the computing capability of the CPU is ignored and underestimated. In this work, a new deep collaborative SAR imaging method based on multiple CPU/GPU is proposed to achieve real-time SAR imaging. Through the proposed task partitioning and scheduling strategy, the whole image can be generated with deep collaborative multiple CPU/GPU computing. In the CPU parallel imaging part, the advanced vector extension (AVX) method is first introduced into the multi-core CPU parallel method for higher efficiency. As for the GPU parallel imaging, not only are the bottlenecks of memory limitation and frequent data transfer overcome, but various optimization strategies are also applied, such as streaming and parallel pipelining. Experimental results demonstrate that the deep CPU/GPU collaborative imaging method enhances the efficiency of SAR imaging over a single-core CPU by 270 times and realizes real-time imaging in that the imaging rate outperforms the raw data generation rate. PMID:27070606

  20. Accelerating Spaceborne SAR Imaging Using Multiple CPU/GPU Deep Collaborative Computing.

    Science.gov (United States)

    Zhang, Fan; Li, Guojun; Li, Wei; Hu, Wei; Hu, Yuxin

    2016-04-07

    With the development of synthetic aperture radar (SAR) technologies in recent years, the huge amount of remote sensing data brings challenges for real-time imaging processing. Therefore, high performance computing (HPC) methods have been presented to accelerate SAR imaging, especially GPU based methods. In the classical GPU based imaging algorithm, the GPU is employed to accelerate image processing by massive parallel computing, and the CPU is only used to perform auxiliary work such as data input/output (IO). However, the computing capability of the CPU is ignored and underestimated. In this work, a new deep collaborative SAR imaging method based on multiple CPU/GPU is proposed to achieve real-time SAR imaging. Through the proposed task partitioning and scheduling strategy, the whole image can be generated with deep collaborative multiple CPU/GPU computing. In the CPU parallel imaging part, the advanced vector extension (AVX) method is first introduced into the multi-core CPU parallel method for higher efficiency. As for the GPU parallel imaging, not only are the bottlenecks of memory limitation and frequent data transfer overcome, but various optimization strategies are also applied, such as streaming and parallel pipelining. Experimental results demonstrate that the deep CPU/GPU collaborative imaging method enhances the efficiency of SAR imaging over a single-core CPU by 270 times and realizes real-time imaging in that the imaging rate outperforms the raw data generation rate.

  1. Accelerating Spaceborne SAR Imaging Using Multiple CPU/GPU Deep Collaborative Computing

    Directory of Open Access Journals (Sweden)

    Fan Zhang

    2016-04-01

    Full Text Available With the development of synthetic aperture radar (SAR) technologies in recent years, the huge amount of remote sensing data brings challenges for real-time imaging processing. Therefore, high performance computing (HPC) methods have been presented to accelerate SAR imaging, especially GPU based methods. In the classical GPU based imaging algorithm, the GPU is employed to accelerate image processing by massive parallel computing, and the CPU is only used to perform auxiliary work such as data input/output (IO). However, the computing capability of the CPU is ignored and underestimated. In this work, a new deep collaborative SAR imaging method based on multiple CPU/GPU is proposed to achieve real-time SAR imaging. Through the proposed task partitioning and scheduling strategy, the whole image can be generated with deep collaborative multiple CPU/GPU computing. In the CPU parallel imaging part, the advanced vector extension (AVX) method is first introduced into the multi-core CPU parallel method for higher efficiency. As for the GPU parallel imaging, not only are the bottlenecks of memory limitation and frequent data transfer overcome, but various optimization strategies are also applied, such as streaming and parallel pipelining. Experimental results demonstrate that the deep CPU/GPU collaborative imaging method enhances the efficiency of SAR imaging over a single-core CPU by 270 times and realizes real-time imaging in that the imaging rate outperforms the raw data generation rate.
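
    A structural sketch of the CPU/GPU task split described in these records (illustrative only; the split fraction, the thread layout, and the stand-in focusing kernel are assumptions, and a real implementation would hand the GPU share to CUDA/CuPy rather than NumPy) could look like:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def focus_block(raw_block):
    """Stand-in for a range/azimuth focusing kernel (illustrative only)."""
    return np.fft.ifft(np.fft.fft(raw_block, axis=1), axis=1)

def collaborative_focus(raw, gpu_fraction=0.7):
    """Split azimuth lines between a 'GPU' worker and a 'CPU' worker.

    gpu_fraction is a hypothetical static split; the paper's scheduler would
    derive it from measured CPU and GPU throughput. Both workers run the same
    NumPy kernel here so the sketch stays runnable without a GPU.
    """
    split = int(raw.shape[0] * gpu_fraction)
    with ThreadPoolExecutor(max_workers=2) as pool:
        gpu_job = pool.submit(focus_block, raw[:split])   # large share -> accelerator
        cpu_job = pool.submit(focus_block, raw[split:])   # remainder -> multi-core CPU
        return np.vstack([gpu_job.result(), cpu_job.result()])

image = collaborative_focus(np.random.randn(1024, 2048) + 0j)
```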

  2. BioEM: GPU-accelerated computing of Bayesian inference of electron microscopy images

    CERN Document Server

    Cossio, Pilar; Baruffa, Fabio; Rampp, Markus; Lindenstruth, Volker; Hummer, Gerhard

    2016-01-01

    In cryo-electron microscopy (EM), molecular structures are determined from large numbers of projection images of individual particles. To harness the full power of this single-molecule information, we use the Bayesian inference of EM (BioEM) formalism. By ranking structural models using posterior probabilities calculated for individual images, BioEM in principle addresses the challenge of working with highly dynamic or heterogeneous systems not easily handled in traditional EM reconstruction. However, the calculation of these posteriors for large numbers of particles and models is computationally demanding. Here we present highly parallelized, GPU-accelerated computer software that performs this task efficiently. Our flexible formulation employs CUDA, OpenMP, and MPI parallelization combined with both CPU and GPU computing. The resulting BioEM software scales nearly ideally both on pure CPU and on CPU+GPU architectures, thus enabling Bayesian analysis of tens of thousands of images in a reasonable time. The g...
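
    The aggregation step behind BioEM's ranking, summing per-image log-likelihoods into a log-posterior for each candidate model, is simple to sketch; the expensive part that the record offloads to GPUs is producing the per-image likelihoods themselves. A minimal sketch assuming a flat prior over models (the array shapes are illustrative):

```python
import numpy as np

def rank_models(log_likelihood):
    """Rank structural models by total log-posterior over all particle images.

    log_likelihood : (n_models, n_images) per-image log-likelihoods, e.g. from
    comparing model projections with each experimental image. A flat prior
    over models is assumed, so the posterior ordering follows the summed
    log-likelihoods.
    """
    log_posterior = log_likelihood.sum(axis=1)   # images treated as independent
    order = np.argsort(log_posterior)[::-1]      # best-supported model first
    return order, log_posterior

# toy usage: 3 candidate structures evaluated against 1000 particle images
order, lp = rank_models(np.random.randn(3, 1000))
```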

  3. Accelerating Development of EV Batteries Through Computer-Aided Engineering (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Pesaran, A.; Kim, G. H.; Smith, K.; Santhanagopalan, S.

    2012-12-01

    The Department of Energy's Vehicle Technology Program has launched the Computer-Aided Engineering for Automotive Batteries (CAEBAT) project to work with national labs, industry, and software vendors to develop sophisticated software. As coordinator, NREL has teamed with a number of companies to help improve and accelerate battery design and production. This presentation provides an overview of CAEBAT, including its predictive computer simulation of Li-ion batteries known as the Multi-Scale Multi-Dimensional (MSMD) model framework. MSMD's modular, flexible architecture connects the physics of battery charge/discharge processes, thermal control, safety and reliability in a computationally efficient manner. This allows independent development of submodels at the cell and pack levels.

  4. Accelerating Relevance-Vector-Machine-Based Classification of Hyperspectral Image with Parallel Computing

    Directory of Open Access Journals (Sweden)

    Chao Dong

    2012-01-01

    Full Text Available Benefiting from the kernel trick and its sparse property, the relevance vector machine (RVM) can acquire a sparse solution with a generalization ability equivalent to that of the support vector machine. The sparse property requires much less time in prediction, making RVM promising for classifying large-scale hyperspectral images. However, RVM is not widely used because of its slow training procedure. To solve the problem, the classification of hyperspectral images using RVM is accelerated by parallel computing techniques in this paper. The parallelization is addressed from the aspects of the multiclass strategy, the ensemble of multiple weak classifiers, and the matrix operations. The parallel RVMs are implemented using the C language plus the parallel functions of the linear algebra packages and the message passing interface library. The proposed methods are evaluated on the AVIRIS Indian Pines data set on a Beowulf cluster and on multicore platforms. The results show that the parallel RVMs markedly accelerate the training procedure.
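
    One of the parallelization axes mentioned above, the multiclass (one-vs-rest) strategy, distributes naturally over MPI ranks. A rough mpi4py sketch of that idea (the paper's implementation is in C with MPI, and train_binary_rvm here is only a least-squares placeholder for a real RVM trainer):

```python
import numpy as np
from mpi4py import MPI

def train_binary_rvm(X, y):
    """Placeholder for a binary RVM trainer (here a ridge-like least-squares fit)."""
    return np.linalg.lstsq(X.T @ X + 1e-3 * np.eye(X.shape[1]), X.T @ y, rcond=None)[0]

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

X = np.random.randn(500, 20)                  # toy stand-in: pixels x spectral bands
labels = np.random.randint(0, 8, size=500)    # 8 land-cover classes

# each MPI rank trains the one-vs-rest classifiers for its share of the classes
my_classes = [c for c in range(8) if c % size == rank]
my_models = {c: train_binary_rvm(X, (labels == c).astype(float)) for c in my_classes}

# gather the per-class models on rank 0 for prediction
all_models = comm.gather(my_models, root=0)
```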

  5. Concepts and techniques: Active electronics and computers in safety-critical accelerator operation

    Energy Technology Data Exchange (ETDEWEB)

    Frankel, R.S.

    1995-12-31

    The Relativistic Heavy Ion Collider (RHIC), under construction at Brookhaven National Laboratory, requires an extensive Access Control System to protect personnel from radiation, oxygen deficiency and electrical hazards. In addition, the complicated nature of operation of the Collider as part of a complex of other accelerators necessitates the use of active electronic measurement circuitry to ensure compliance with established Operational Safety Limits. Solutions were devised which permit the use of modern computer and interconnection technology for safety-critical applications, while preserving and enhancing tried and proven protection methods. In addition, a set of guidelines regarding required performance for accelerator safety systems and a handbook of design criteria and rules were developed to assist future system designers and to provide a framework for internal review and regulation.

  6. Computational Materials Science and Chemistry: Accelerating Discovery and Innovation through Simulation-Based Engineering and Science

    Energy Technology Data Exchange (ETDEWEB)

    Crabtree, George [Argonne National Lab. (ANL), Argonne, IL (United States)]; Glotzer, Sharon [University of Michigan]; McCurdy, Bill [University of California Davis]; Roberto, Jim [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)]

    2010-07-26

    This report is based on a SC Workshop on Computational Materials Science and Chemistry for Innovation on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software. This rate of improvement, which shows no sign of

  7. Computational Materials Science and Chemistry: Accelerating Discovery and Innovation through Simulation-Based Engineering and Science

    Energy Technology Data Exchange (ETDEWEB)

    Crabtree, George [Argonne National Lab. (ANL), Argonne, IL (United States)]; Glotzer, Sharon [University of Michigan]; McCurdy, Bill [University of California Davis]; Roberto, Jim [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)]

    2010-07-26

    This report is based on a SC Workshop on Computational Materials Science and Chemistry for Innovation on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software. This rate of improvement, which shows no sign of

  8. THE ACCELERATED SEARCH-EXTENSION METHOD FOR COMPUTING MULTIPLE SOLUTIONS OF SEMILINEAR PDEs

    Institute of Scientific and Technical Information of China (English)

    Liu Yuewu; Xie Ziqing; Chen Chuanmiao

    2009-01-01

    In this paper, we propose an accelerated search-extension method (ASEM) based on the interpolated coefficient finite element method, the search-extension method (SEM) and the two-grid method to obtain the multiple solutions for semilinear elliptic equations. This strategy is not only successfully implemented to obtain multiple solutions for a class of semilinear elliptic boundary value problems, but also reduces the expensive computation greatly. The numerical results in 1-D and 2-D cases will show the efficiency of our approach.

  9. Computer simulation of 2-D and 3-D ion beam extraction and acceleration

    Energy Technology Data Exchange (ETDEWEB)

    Ido, Shunji; Nakajima, Yuji [Saitama Univ., Urawa (Japan). Faculty of Engineering

    1997-03-01

    The two-dimensional code and the three-dimensional code have been developed to study the physical features of ion beams in the extraction and acceleration stages. Using the two-dimensional code, the design of the first electrode (plasma grid) is examined with regard to beam divergence. In the computational studies using the three-dimensional code, the off-axis model of the ion beam is investigated. It is found that the deflection angle of the ion beam is proportional to the gap displacement of the electrodes. (author)

  10. Computer-mediated communication as a channel for social resistance : The strategic side of SIDE

    NARCIS (Netherlands)

    Spears, R; Lea, M; Corneliussen, RA; Postmes, T; Ter Haar, W

    2002-01-01

    In two studies, the authors tested predictions derived from the social identity model of deindividuation effects (SIDE) concerning the potential of computer-mediated communication (CMC) to serve as a means to resist powerful out-groups. Earlier research using the SIDE model indicates that the anonym

  11. A computational study of dielectric photonic-crystal-based accelerator cavities

    Science.gov (United States)

    Bauer, C. A.

    Future particle accelerator cavities may use dielectric photonic crystals to reduce harmful wakefields and increase the accelerating electric field (or gradient). Reduced wakefields are predicted based on the bandgap property of some photonic crystals (i.e. frequency-selective reflection/transmission). Larger accelerating gradients are predicted based on certain dielectrics' strong resistance to electrical breakdown. Using computation, this thesis investigated a hybrid design of a 2D sapphire photonic crystal and traditional copper conducting cavity. The goals were to test the claim of reduced wakefields and, in general, judge the effectiveness of such structures as practical accelerating cavities. In the process, we discovered the following: (1) resonant cavities in truncated photonic crystals may confine radiation weakly compared to conducting cavities (depending on the level of truncation); however, confinement can be dramatically increased through optimizations that break lattice symmetry (but retain certain rotational symmetries); (2) photonic crystal cavities do not ideally reduce wakefields; using band structure calculations, we found that wakefields are increased by flat portions of the frequency dispersion (where the waves have vanishing group velocities). A complete comparison was drawn between the proposed photonic crystal cavities and the copper cavities for the Compact Linear Collider (CLIC); CLIC is one of the candidates for a future high-energy electron-positron collider that will study in greater detail the physics learned at the Large Hadron Collider. We found that the photonic crystal cavity, when compared to the CLIC cavity: (1) can lower maximum surface magnetic fields on conductors (growing evidence suggests this limits accelerating gradients by inducing electrical breakdown); (2) shows increased transverse dipole wakefields but decreased longitudinal monopole wakefields; and (3) exhibits lower accelerating efficiencies (unless a large photonic

  12. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing.

    Science.gov (United States)

    Fang, Ye; Ding, Yun; Feinstein, Wei P; Koppelman, David M; Moreno, Juana; Jarrell, Mark; Ramanujam, J; Brylinski, Michal

    2016-01-01

    Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249. PMID:27420300
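
    The overall control flow of a Monte Carlo docking engine of this kind, a Metropolis loop over perturbed ligand poses, can be sketched as follows; this is only the generic skeleton, not GeauxDock's actual move set or its combined physics/knowledge-based scoring function:

```python
import numpy as np

def metropolis_docking(score_fn, pose0, n_steps=10000, step=0.3, kT=1.0, seed=0):
    """Generic Metropolis Monte Carlo pose search (structural sketch only)."""
    rng = np.random.default_rng(seed)
    pose, energy = pose0.copy(), score_fn(pose0)
    best_pose, best_energy = pose.copy(), energy
    for _ in range(n_steps):
        trial = pose + rng.normal(scale=step, size=pose.shape)  # perturb pose parameters
        e_trial = score_fn(trial)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if e_trial < energy or rng.random() < np.exp((energy - e_trial) / kT):
            pose, energy = trial, e_trial
            if energy < best_energy:
                best_pose, best_energy = pose.copy(), energy
    return best_pose, best_energy

# toy usage with a quadratic stand-in for the docking score
best, e = metropolis_docking(lambda p: float(np.sum(p**2)), np.ones(6))
```

    On a heterogeneous machine, many such independent searches (for example, one per ligand or per replica) can run concurrently on CPU cores, Xeon Phi, or GPU threads, which is where the reported speedups come from.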

  13. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing

    Science.gov (United States)

    Fang, Ye; Ding, Yun; Feinstein, Wei P.; Koppelman, David M.; Moreno, Juana; Jarrell, Mark; Ramanujam, J.; Brylinski, Michal

    2016-01-01

    Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249. PMID:27420300

  14. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing.

    Directory of Open Access Journals (Sweden)

    Ye Fang

    Full Text Available Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249.

  15. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing.

    Science.gov (United States)

    Fang, Ye; Ding, Yun; Feinstein, Wei P; Koppelman, David M; Moreno, Juana; Jarrell, Mark; Ramanujam, J; Brylinski, Michal

    2016-01-01

    Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249.

  16. Strategic Implications for E-Business Organizations in the Ubiquitous Computing Economy

    Institute of Scientific and Technical Information of China (English)

    YUM Jihwan; KIM Hyoungdo

    2004-01-01

    The ubiquitous economy brings both advantages and disadvantages for organizations. The third space that emerges from the development of ubiquitous computing generates a new concept of community, one tightly coupled with people, products, and systems. Organizational strategies need to be reshaped for the changing environment in the third space and community, and organizational structure also needs to change toward a community-serving organization. A community-serving concept built on standardized technology will be essential. Among the key technologies, RFID services will play a key role in acknowledging the identification and services required. As the need for sensing the environment increases, technological requirements such as the ubiquitous sensor network (USN) will become critical.

  17. Strategic Entrepreneurship

    OpenAIRE

    Peter G. Klein; Jay B. Barney; Nicolai J. Foss

    2012-01-01

    Strategic entrepreneurship is a newly recognized field that draws, not surprisingly, from the fields of strategic management and entrepreneurship. The field emerged officially with the 2001 special issue of the Strategic Management Journal on “strategic entrepreneurship”; the first dedicated periodical, the Strategic Entrepreneurship Journal, appeared in 2007. Strategic entrepreneurship is built around two core ideas. (1) Strategy formulation and execution involves attributes that are fundame...

  18. Accelerating Computation of DCM for ERP in MATLAB by External Function Calls to the GPU.

    Science.gov (United States)

    Wang, Wei-Jen; Hsieh, I-Fan; Chen, Chun-Chuan

    2013-01-01

    This study aims to improve the performance of Dynamic Causal Modelling for Event Related Potentials (DCM for ERP) in MATLAB by using external function calls to a graphics processing unit (GPU). DCM for ERP is an advanced method for studying neuronal effective connectivity. DCM utilizes an iterative procedure, the expectation maximization (EM) algorithm, to find the optimal parameters given a set of observations and the underlying probability model. As the EM algorithm is computationally demanding and the analysis faces possible combinatorial explosion of models to be tested, we propose a parallel computing scheme using the GPU to achieve a fast estimation of DCM for ERP. The computation of DCM for ERP is dynamically partitioned and distributed to threads for parallel processing, according to the DCM model complexity and the hardware constraints. The performance efficiency of this hardware-dependent thread arrangement strategy was evaluated using the synthetic data. The experimental data were used to validate the accuracy of the proposed computing scheme and quantify the time saving in practice. The simulation results show that the proposed scheme can accelerate the computation by a factor of 155 for the parallel part. For experimental data, the speedup factor is about 7 per model on average, depending on the model complexity and the data. This GPU-based implementation of DCM for ERP gives qualitatively the same results as the original MATLAB implementation does at the group level analysis. In conclusion, we believe that the proposed GPU-based implementation is very useful for users as a fast screen tool to select the most likely model and may provide implementation guidance for possible future clinical applications such as online diagnosis.

  19. Acceleration of Hessenberg Reduction for Nonsymmetric Eigenvalue Problems in a Hybrid CPU-GPU Computing Environment

    Directory of Open Access Journals (Sweden)

    Kinji Kimura

    2011-07-01

    Full Text Available The solution of large-scale dense nonsymmetric eigenvalue problems is required in many areas of scientific and engineering computing, such as vibration analysis of automobiles and analysis of electronic diffraction patterns. In this study, we focus on the Hessenberg reduction step and consider accelerating it in a hybrid CPU-GPU computing environment. Considering that the Hessenberg reduction algorithm consists almost entirely of BLAS (Basic Linear Algebra Subprograms) operations, we propose three approaches for distributing the BLAS operations between the CPU and the GPU. Among them, the third approach, which assigns small-size BLAS operations to the CPU and distributes large-size BLAS operations between the CPU and the GPU in some optimal manner, was found to be consistently faster than the other two approaches. On a machine with an Intel Core i7 processor and an NVIDIA Tesla C1060 GPU, this approach achieved a 3.2 times speedup over the CPU-only case when computing the Hessenberg form of an 8,192×8,192 real matrix.
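
    The third approach described above, small BLAS operations on the CPU and large ones split between CPU and GPU, reduces to a simple size-based dispatcher. A sketch under assumed threshold and split values (the gpu_gemm stand-in just calls NumPy so the example runs anywhere; a real code would call a GPU BLAS such as cuBLAS or CuPy):

```python
import numpy as np

GPU_THRESHOLD = 256   # hypothetical: below this size, transfer cost outweighs GPU speed
GPU_SHARE = 0.8       # hypothetical fraction of columns assigned to the GPU

def gpu_gemm(a, b):
    # stand-in so the sketch is runnable without a GPU; real code would use GPU BLAS
    return a @ b

def hybrid_gemm(a, b):
    """Dispatch a GEMM by size: small products stay on the CPU,
    large products are column-split between GPU and CPU."""
    n = b.shape[1]
    if n < GPU_THRESHOLD:
        return a @ b                       # small: CPU BLAS only
    split = int(n * GPU_SHARE)
    left = gpu_gemm(a, b[:, :split])       # large slice: GPU
    right = a @ b[:, split:]               # remaining columns: CPU, overlapping the GPU work
    return np.hstack([left, right])

c = hybrid_gemm(np.random.randn(512, 512), np.random.randn(512, 512))
```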

  20. X-ray beam hardening correction for measuring density in linear accelerator industrial computed tomography

    Institute of Scientific and Technical Information of China (English)

    ZHOU Ri-Feng; WANG Jue; CHEN Wei-Min

    2009-01-01

    Because X-ray attenuation is approximately proportional to material density, it is possible to measure inner density accurately from Industrial Computed Tomography (ICT) images. In practice, however, a number of factors, including the non-linear effects of beam hardening and diffuse scattered radiation, complicate the quantitative measurement of density variations in materials. This paper is based on the linearization method of beam hardening correction, and uses polynomial fitting coefficients, obtained from the curvature of polychromatic beam data for iron, to fit other materials. Through theoretical deduction, the paper shows that the density measurement error is less than 2% if pre-filters are used to confine the linear accelerator spectrum mainly to the range 0.3 MeV to 3 MeV. An experiment was set up on an ICT system with a 9 MeV electron linear accelerator, and the results are satisfactory. This technique makes beam hardening correction easy and simple, and it is valuable for measuring density with ICT and for using the CT images to recognize materials.
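
    The linearization idea can be illustrated with a few lines of NumPy: calibrate a polynomial that maps the measured polychromatic projection onto the ideal monochromatic (linear) projection, then apply it to raw data before reconstruction. All numbers below are hypothetical placeholders, not values from the paper:

```python
import numpy as np

# hypothetical calibration data for iron: step-wedge thickness (cm) and the
# measured polychromatic projection value -ln(I/I0) for each thickness
thickness = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 6.0, 8.0])
poly_proj = np.array([0.0, 0.42, 0.80, 1.45, 2.55, 3.45, 4.20])

mu_ref = 0.9                       # assumed effective linear attenuation of iron (1/cm)
mono_proj = mu_ref * thickness     # what an ideal monochromatic beam would measure

# linearization: low-order polynomial mapping measured -> monochromatic projections
coeffs = np.polyfit(poly_proj, mono_proj, deg=3)

def correct(projection):
    """Apply the beam-hardening correction to raw projection data before reconstruction."""
    return np.polyval(coeffs, projection)
```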

  1. GPU-accelerated computational tool for studying the effectiveness of asteroid disruption techniques

    Science.gov (United States)

    Zimmerman, Ben J.; Wie, Bong

    2016-10-01

    This paper presents the development of a new Graphics Processing Unit (GPU) accelerated computational tool for asteroid disruption techniques. Numerical simulations are completed using the high-order spectral difference (SD) method. Due to the compact nature of the SD method, it is well suited for implementation on the GPU architecture, hence solutions are generated orders of magnitude faster than with the Central Processing Unit (CPU) counterpart. A multiphase model integrated with the SD method is introduced, and several asteroid disruption simulations are conducted, including kinetic-energy impactors, multi-kinetic energy impactor systems, and nuclear options. Results illustrate the benefits of using multi-kinetic energy impactor systems when compared to a single impactor system. In addition, the effectiveness of nuclear options is observed.

  2. An improved coarse-grained parallel algorithm for computational acceleration of ordinary Kriging interpolation

    Science.gov (United States)

    Hu, Hongda; Shu, Hong

    2015-05-01

    Heavy computation limits the use of Kriging interpolation methods in many real-time applications, especially with the ever-increasing problem size. Many researchers have realized that parallel processing techniques are critical to fully exploit computational resources and feasibly solve computation-intensive problems like Kriging. Much research has addressed the parallelization of the traditional approach to Kriging, but this computation-intensive procedure may not be suitable for high-resolution interpolation of spatial data. On the basis of a more effective serial approach, we propose an improved coarse-grained parallel algorithm to accelerate ordinary Kriging interpolation. In particular, the interpolation task of each unobserved point is treated as a basic parallel unit. To reduce time complexity and memory consumption, the large right-hand-side matrix in the Kriging linear system is transformed and fixed at only two columns, and therefore no longer depends directly on the number of unobserved points. The MPI (Message Passing Interface) model is employed to implement our parallel programs in a homogeneous distributed memory system. Experimentally, the improved parallel algorithm performs better than the traditional one in spatial interpolation of annual average precipitation in Victoria, Australia. For example, when the number of processors is 24, the improved algorithm keeps the speed-up at 20.8 while the speed-up of the traditional algorithm only reaches 9.3. Likewise, the weak scaling efficiency of the improved algorithm is nearly 90% while that of the traditional algorithm drops to almost 40% with 16 processors. Experimental results also demonstrate that the performance of the improved algorithm is enhanced by increasing the problem size.
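
    Treating each unobserved point as an independent parallel unit maps naturally onto MPI. The sketch below (mpi4py and SciPy, with a toy exponential covariance model and random data standing in for the precipitation stations) factors the ordinary-Kriging matrix once and distributes the prediction points across ranks; it does not reproduce the paper's two-column right-hand-side transformation:

```python
import numpy as np
from mpi4py import MPI
from scipy.linalg import lu_factor, lu_solve

def covariance(d, sill=1.0, corr_range=50.0):
    """Exponential covariance model (an illustrative choice, not from the paper)."""
    return sill * np.exp(-d / corr_range)

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

obs_xy = np.random.rand(200, 2) * 100.0     # observed station coordinates (toy data)
obs_z = np.random.rand(200)                 # observed values, e.g. precipitation
grid = np.random.rand(10000, 2) * 100.0     # unobserved points to interpolate

# ordinary-Kriging system [C 1; 1^T 0], factored once and reused for every point
n = len(obs_xy)
A = np.ones((n + 1, n + 1)); A[-1, -1] = 0.0
A[:n, :n] = covariance(np.linalg.norm(obs_xy[:, None] - obs_xy[None, :], axis=2))
lu = lu_factor(A)

my_pts = grid[rank::size]                   # cyclic distribution of prediction points
rhs = np.ones((n + 1, 1))
my_est = np.empty(len(my_pts))
for i, p in enumerate(my_pts):
    rhs[:n, 0] = covariance(np.linalg.norm(obs_xy - p, axis=1))
    w = lu_solve(lu, rhs)[:n, 0]            # Kriging weights for this point
    my_est[i] = w @ obs_z

estimates = comm.gather(my_est, root=0)     # reassemble the interpolated field on rank 0
```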

  3. Strategic Adaptation

    DEFF Research Database (Denmark)

    Andersen, Torben Juul

    2015-01-01

    This article provides an overview of theoretical contributions that have influenced the discourse around strategic adaptation including contingency perspectives, strategic fit reasoning, decision structure, information processing, corporate entrepreneurship, and strategy process. The related concepts of strategic renewal, dynamic managerial capabilities, dynamic capabilities, and strategic response capabilities are discussed and contextualized against strategic responsiveness. The insights derived from this article are used to outline the contours of a dynamic process of strategic adaptation. This model incorporates elements of central strategizing, autonomous entrepreneurial behavior, interactive information processing, and open communication systems that enhance the organization's ability to observe exogenous changes and respond effectively to them.

  4. Accelerating Design of Batteries Using Computer-Aided Engineering Tools (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Pesaran, A.; Kim, G. H.; Smith, K.

    2010-11-01

    Computer-aided engineering (CAE) is a proven pathway, especially in the automotive industry, to improve performance by resolving the relevant physics in complex systems, shortening the product development design cycle, thus reducing cost, and providing an efficient way to evaluate parameters for robust designs. Academic models include the relevant physics details but neglect engineering complexities. Industry models include the relevant macroscopic geometry and system conditions but simplify the fundamental physics too much. Most of the CAE battery tools for in-house use are custom model codes and require expert users. There is a need to make these battery modeling and design tools more accessible to end users such as battery developers, pack integrators, and vehicle makers. Developing integrated and physics-based CAE battery tools can reduce the design, build, test, break, re-design, re-build, and re-test cycle and help lower costs. NREL has been involved in developing various models to predict the thermal and electrochemical performance of large-format cells and has used commercial three-dimensional finite-element analysis and computational fluid dynamics tools to study battery pack thermal issues. These NREL cell and pack design tools can be integrated to help support the automotive industry and to accelerate battery design.

  5. Computations of longitudinal electron dynamics in the recirculating cw RF accelerator-recuperator for the high average power FEL

    Science.gov (United States)

    Sokolov, A. S.; Vinokurov, N. A.

    1994-03-01

    The use of optimal longitudinal phase-energy motion conditions for bunched electrons in a recirculating RF accelerator makes it possible to increase the final electron peak current and, correspondingly, the FEL gain. The computer code RECFEL, developed for simulations of the longitudinal compression of electron bunches with high average current, which substantially load the cw RF cavities of the recirculator-recuperator, is briefly described and illustrated with some computational results.

  6. Accelerating groundwater flow simulation in MODFLOW using JASMIN-based parallel computing.

    Science.gov (United States)

    Cheng, Tangpei; Mo, Zeyao; Shao, Jingli

    2014-01-01

    To accelerate the groundwater flow simulation process, this paper reports our work on developing an efficient parallel simulator by rebuilding the well-known software MODFLOW on JASMIN (J Adaptive Structured Meshes applications Infrastructure). The rebuilding process is achieved by designing patch-based data structures and parallel algorithms as well as adding slight modifications to the compute flow and subroutines in MODFLOW. Both the memory requirements and computing efforts are distributed among all processors; and to reduce communication cost, data transfers are batched and conveniently handled by adding ghost nodes to each patch. To further improve performance, constant-head/inactive cells are tagged and neglected during the linear solving process, and an efficient load balancing strategy is presented. The accuracy and efficiency are demonstrated through modeling three scenarios: the first application is a field flow problem located at Yanming Lake in China to help design a reasonable quantity of groundwater exploitation. Desirable numerical accuracy and significant performance enhancement are obtained. Typically, the tagged program with the load balancing strategy running on 40 cores is six times faster than the fastest MICCG-based MODFLOW program. The second test simulates flow in a highly heterogeneous aquifer. The AMG-based JASMIN program running on 40 cores is nine times faster than the GMG-based MODFLOW program. The third test is a simplified transient flow problem with on the order of tens of millions of cells to examine scalability. Compared to 32 cores, parallel efficiencies of 77% and 68% are obtained on 512 and 1024 cores, respectively, which indicates impressive scalability.

  7. Accelerating the Gauss-Seidel Power Flow Solver on a High Performance Reconfigurable Computer

    Energy Technology Data Exchange (ETDEWEB)

    Byun, Jong-Ho; Ravindran, Arun; Mukherjee, Arindam; Joshi, Bharat; Chassin, David P.

    2009-09-01

    The computationally intensive power flow problem determines the voltage magnitude and phase angle at each bus in a power system, for hundreds of thousands of buses, under balanced three-phase steady-state conditions. We report an FPGA acceleration of the Gauss-Seidel based power flow solver employed in the transmission module of the GridLAB-D power distribution simulator and analysis tool. The prototype hardware is implemented on an SGI Altix-RASC system equipped with a Xilinx Virtex II 6000 FPGA. Due to capacity limitations of the FPGA, only the bus voltage calculations of the power network are implemented in hardware, while the branch current calculations are implemented in software. For a 200,000-bus system, the bus voltage calculation on the FPGA achieves a 48x speed-up for PQ buses and a 62x speed-up for PV buses over an equivalent sequential software implementation. The average overall speed-up of the FPGA-CPU implementation with 100 iterations of the Gauss-Seidel power flow solver is 2.6x over a software implementation, with the branch calculations on the CPU accounting for 85% of the total execution time. The FPGA-CPU implementation also shows linear scaling with increase in the size of the input power network.
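
    The bus-voltage update that the record moves into FPGA hardware is the classic Gauss-Seidel relation V_i = (S_i*/V_i* - sum_{k != i} Y_ik V_k) / Y_ii for PQ buses. A plain NumPy sketch of that inner loop (PV-bus handling and the branch-current step are omitted, and the two-bus data at the end are toy values):

```python
import numpy as np

def gauss_seidel_pq(Y, S, V, slack=0, n_iter=100):
    """Classic Gauss-Seidel power-flow sweeps for PQ buses (illustrative only).

    Y : (n, n) complex bus admittance matrix
    S : (n,)   complex injected power at each bus (negative for loads)
    V : (n,)   complex initial bus voltages (a flat start is typical)
    """
    V = V.copy()
    for _ in range(n_iter):
        for i in range(len(V)):
            if i == slack:
                continue                              # slack-bus voltage stays fixed
            sigma = Y[i] @ V - Y[i, i] * V[i]         # sum over k != i
            V[i] = (np.conj(S[i]) / np.conj(V[i]) - sigma) / Y[i, i]
    return V

# toy two-bus usage: bus 0 is the slack, bus 1 draws 0.5 + j0.2 p.u.
y = 10 - 30j
Y = np.array([[y, -y], [-y, y]])
V = gauss_seidel_pq(Y, np.array([0.0, -0.5 - 0.2j]), np.ones(2, dtype=complex))
```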

  8. Subcritical set coupled to accelerator (ADS) for transmutation of radioactive wastes: an approach of computational modelling

    International Nuclear Information System (INIS)

    Nuclear fission devices coupled to particle accelerators (ADS) are being widely studied. These devices have several applications, including nuclear waste transmutation and hydrogen production, both with strong social and environmental impact. The essence of this work was to model an ADS geometry composed of small TRISO fuel particles loaded with a mixture of uranium and thorium MOX and a uranium spallation target, using probabilistic computational modelling methods, in particular the MCNPX 2.6e program, to evaluate the physical characteristics of the device and its transmutation capability. From the characterization of the spallation target, it can be concluded that the production of neutrons per incident proton increases with increasing dimensions of the spallation target (thickness and radius) until the maximum production of neutrons per incident proton, the so-called saturation region, is reached. The results obtained in modelling the pebble-bed type ADS device, with respect to the isotopic variation of the plutonium isotopes and the minor actinides considered in the analysis, revealed that the accumulated mass of the plutonium isotopes and minor actinides increases for the subcritical configuration considered. In the particular case of the isotope 239Pu, a reduction in mass is observed from a burning time of 99 days. Increasing the power of the core and considering tungsten and lead spallation targets are among the key future developments of this work

  9. Users' guide for the Accelerated Leach Test Computer Program

    Energy Technology Data Exchange (ETDEWEB)

    Fuhrmann, M.; Heiser, J.H.; Pietrzak, R.; Franz, Eena-Mai; Colombo, P.

    1990-11-01

    This report is a step-by-step guide for the Accelerated Leach Test (ALT) Computer Program developed to accompany a new leach test for solidified waste forms. The program is designed to be used as a tool for performing the calculations necessary to analyze leach test data, a modeling program to determine if diffusion is the operating leaching mechanism (and, if not, to indicate other possible mechanisms), and a means to make extrapolations using the diffusion models. The ALT program contains four mathematical models that can be used to represent the data. The leaching mechanisms described by these models are: (1) diffusion through a semi-infinite medium (for low fractional releases), (2) diffusion through a finite cylinder (for high fractional releases), (3) diffusion plus partitioning of the source term, (4) solubility limited leaching. Results are presented as a graph containing the experimental data and the best-fit model curve. Results can also be output as LOTUS 1-2-3 files. 2 refs.
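
    The first of the four models, diffusion from a semi-infinite medium, is the familiar CFL = 2 (S/V) sqrt(De t / pi) relation, and fitting it to leach data reduces to a straight-line fit against sqrt(t). A small sketch with hypothetical numbers (the data points, S/V value, and variable names are placeholders, not output of the ALT program):

```python
import numpy as np

def cfl_semi_infinite(t, De, s_over_v):
    """Cumulative fraction leached for diffusion from a semi-infinite medium:
    CFL = 2 (S/V) sqrt(De t / pi), the low-fractional-release model."""
    return 2.0 * s_over_v * np.sqrt(De * t / np.pi)

# hypothetical leach-test data: sampling times and measured cumulative fractions
t_sec = np.array([1, 2, 5, 7, 11, 14], dtype=float) * 86400.0
cfl_meas = np.array([0.010, 0.014, 0.023, 0.027, 0.034, 0.038])
s_over_v = 1.2                                  # specimen surface-to-volume ratio, 1/cm

# the model is linear in sqrt(t), so De follows from the least-squares slope
slope = np.linalg.lstsq(np.sqrt(t_sec)[:, None], cfl_meas, rcond=None)[0][0]
De = np.pi * (slope / (2.0 * s_over_v)) ** 2
model_cfl = cfl_semi_infinite(t_sec, De, s_over_v)
print(f"effective diffusion coefficient De = {De:.3e} cm^2/s")
```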

  10. Definition of the loading of process digital computer, used in the same class of accelerator control systems

    International Nuclear Information System (INIS)

    A relationship has been studied between the computer loading, on the one hand, and the properties of the parameter under control and the value of the discretization interval, on the other. The computer loading is characterized by the probability of an inquiry for calculation of the correcting signal. A mathematical expression has been obtained which determines the inquiry probability; the expression is a multidimensional integral. The Monte-Carlo method has been employed to compute the integral. A structural diagram of the algorithm that implements the method for computing the probability is presented, and the error of the method has been assessed. The algorithm has been employed on the M-220 computer. The results obtained confirm the correctness of the suggested method for determining the computer loading for operation in the accelerator control system
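
    The structure of the calculation, estimating a multidimensional integral that represents a probability by Monte-Carlo sampling, can be sketched generically; the actual integrand in the record depends on the statistics of the controlled parameter and the discretization interval, so the indicator below is only a placeholder:

```python
import numpy as np

def mc_probability(indicator, dim, n_samples=1_000_000, seed=0):
    """Monte-Carlo estimate of a probability expressed as a multidimensional
    integral over the unit hypercube, together with its statistical error."""
    rng = np.random.default_rng(seed)
    x = rng.random((n_samples, dim))
    hits = indicator(x)                          # True where an inquiry would be issued
    p = hits.mean()
    err = np.sqrt(p * (1.0 - p) / n_samples)     # one-sigma error of the estimate
    return p, err

# placeholder integrand: an inquiry whenever the deviation exceeds a threshold anywhere
p, err = mc_probability(lambda x: (np.abs(x - 0.5) > 0.45).any(axis=1), dim=4)
print(f"inquiry probability ~ {p:.4f} +/- {err:.4f}")
```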

  11. Strategic Leadership

    Science.gov (United States)

    Davies, Barbara; Davies, Brent

    2004-01-01

    This article explores the nature of strategic leadership and assesses whether a framework can be established to map the dimensions of strategic leadership. In particular it establishes a model which outlines both the organizational abilities and the individual characteristics of strategic leaders.

  12. Strategic Entrepreneurship

    DEFF Research Database (Denmark)

    Klein, Peter G.; Barney, Jay B.; Foss, Nicolai Juul

    periodical, the Strategic Entrepreneurship Journal, appeared in 2007. Strategic entrepreneurship is built around two core ideas. (1) Strategy formulation and execution involves attributes that are fundamentally entrepreneurial, such as alertness, creativity, and judgment, and entrepreneurs try to create ... explains the specific links between strategy and entrepreneurship, reviews the emergence and development of the strategic entrepreneurship field, and discusses key implications and applications.

  13. Specific features of planning algorithms for dispatching software of a digital computer, operating in an accelerator control system

    International Nuclear Information System (INIS)

    The main principles are presented of the program dispatching system (DS) of the computer operating in the control and data acquisition system of the accelerator. The DS is intended for planning and execution of the operating program sequence in accordance with the operational features of the accelerator. Modularity and hierarchy are the main characteristics of the system. The "Planner" module is described. The module regulates requests for utilization of the processor and memory and ensures their servicing. The "Planner" operation algorithm provides execution of programs simultaneously with the processes occurring in the accelerator. The "Planner" planning algorithm monitors the presence of requested programs and ensures their execution under multiprogram conditions. Brief characteristics of the other modules of the DS are given: the "distributor", "loader", and "interrupter". The characteristics of the planning algorithms described have been realized in the DS and found to be in full agreement with all the conditions and limitations of the system

  14. Rigorous bounds on survival times in circular accelerators and efficient computation of fringe-field transfer maps

    International Nuclear Information System (INIS)

    Analyzing stability of particle motion in storage rings contributes to the general field of stability analysis in weakly nonlinear motion. A method which we call pseudo invariant estimation (PIE) is used to compute lower bounds on the survival time in circular accelerators. The pseudo invariants needed for this approach are computed via nonlinear perturbative normal form theory, and the required global maxima of the highly complicated multivariate functions could only be rigorously bounded with an extension of interval arithmetic. The bounds on the survival times are large enough to be relevant; the same is true for the lower bounds on dynamical apertures, which can be computed. The PIE method can lead to novel design criteria with the objective of maximizing the survival time. A major effort in the direction of rigorous predictions only makes sense if accurate models of accelerators are available. Fringe fields often have a significant influence on optical properties, but the computation of fringe-field maps by DA based integration is slower by several orders of magnitude than DA evaluation of the propagator for main-field maps. A novel computation of fringe-field effects called symplectic scaling (SYSCA) is introduced. It exploits the advantages of Lie transformations, generating functions, and scaling properties and is extremely accurate. The computation of fringe-field maps is typically made nearly two orders of magnitude faster. (orig.)

  15. High Performance Computer Acoustic Data Accelerator: A New System for Exploring Marine Mammal Acoustics for Big Data Applications

    OpenAIRE

    Dugan, Peter; Zollweg, John; Popescu, Marian; Risch, Denise; Glotin, Herve; LeCun, Yann; Clark, Christopher

    2015-01-01

    This paper presents a new software model designed for distributed sonic signal detection runtime using machine learning algorithms, called DeLMA. A new algorithm--the Acoustic Data-mining Accelerator (ADA)--is also presented. ADA is a robust yet scalable solution for efficiently processing big sound archives using distributed computing technologies. Together, DeLMA and the ADA algorithm provide a powerful tool currently being used by the Bioacoustics Research Program (BRP) at the Cornell Lab of O...

  16. ISLAM PROJECT: Interface between the signals from various experiments of a Van de Graaff accelerator and a PDP 11/44 computer

    International Nuclear Information System (INIS)

    This paper describes an interface between the signals from an in-beam experiment at a Van de Graaff accelerator and a PDP 11/44 computer. The information corresponding to one spectrum is taken from a digital voltammeter and is processed by means of equipment controlled by an M6809 microprocessor. The software package has been developed in assembly language and has a size of 1/2 K. (Author) 12 refs

  17. Acceleration of color computer-generated hologram from three-dimensional scenes with texture and depth information

    Science.gov (United States)

    Shimobaba, Tomoyoshi; Kakue, Takashi; Ito, Tomoyoshi

    2014-06-01

    We propose acceleration of color computer-generated holograms (CGHs) from three-dimensional (3D) scenes that are expressed as texture (RGB) and depth (D) images. These images are obtained from 3D graphics libraries and RGB-D cameras: for example, OpenGL and Kinect, respectively. We can regard them as two-dimensional (2D) cross-sectional images along the depth direction. The generation of CGHs from the 2D cross-sectional images requires multiple diffraction calculations. If we use convolution-based diffraction such as the angular spectrum method, the diffraction calculation takes a long time and requires large memory usage, because the convolution-based calculation requires expansion of the 2D cross-sectional images to avoid wraparound noise. In this paper, we first describe the acceleration of the diffraction calculation using "Band-limited double-step Fresnel diffraction," which does not require the expansion. Next, we describe color CGH acceleration using color space conversion. In general, color CGHs are generated in RGB color space; however, the same calculation must be repeated for each color component, so the computational burden of color CGH generation increases threefold compared with monochrome CGH generation. We can reduce the computational burden by using YCbCr color space, because the 2D cross-sectional images in YCbCr color space can be down-sampled without impairing the image quality.
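
    The color-space part of this scheme can be illustrated outside of holography: convert the RGB layers to YCbCr, keep the luma (Y) plane at full resolution, and down-sample the chroma planes before the per-channel diffraction calculations. The following NumPy sketch illustrates only that conversion and down-sampling step; the BT.601 coefficients and the down-sampling factor of 2 are assumptions, not values taken from the paper.

        import numpy as np

        def rgb_to_ycbcr(rgb):
            """Convert an (H, W, 3) float RGB image in [0, 1] to Y, Cb, Cr planes (BT.601)."""
            r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
            y  =  0.299 * r + 0.587 * g + 0.114 * b
            cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 0.5
            cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 0.5
            return y, cb, cr

        def downsample(plane, factor=2):
            """Down-sample a chroma plane by block averaging (the factor is illustrative)."""
            h, w = plane.shape
            h, w = h - h % factor, w - w % factor
            blocks = plane[:h, :w].reshape(h // factor, factor, w // factor, factor)
            return blocks.mean(axis=(1, 3))

        rgb = np.random.rand(512, 512, 3)            # stand-in for one texture (RGB) layer
        y, cb, cr = rgb_to_ycbcr(rgb)
        cb_small, cr_small = downsample(cb), downsample(cr)
        # The diffraction calculations would now run on y at full resolution and on the
        # smaller cb/cr planes, reducing the three-fold cost of a per-RGB-channel treatment.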

  18. The Role of Implicit Motives in Strategic Decision-Making: Computational Models of Motivated Learning and the Evolution of Motivated Agents

    Directory of Open Access Journals (Sweden)

    Kathryn Merrick

    2015-11-01

    Full Text Available Individual behavioral differences in humans have been linked to measurable differences in their mental activities, including differences in their implicit motives. In humans, individual differences in the strength of motives such as power, achievement and affiliation have been shown to have a significant impact on behavior in social dilemma games and during other kinds of strategic interactions. This paper presents agent-based computational models of power-, achievement- and affiliation-motivated individuals engaged in game-play. The first model captures learning by motivated agents during strategic interactions. The second model captures the evolution of a society of motivated agents. It is demonstrated that misperception, when it is a result of motivation, causes agents with different motives to play a given game differently. When motivated agents who misperceive a game are present in a population, higher explicit payoff can result for the population as a whole. The implications of these results are discussed, both for modeling human behavior and for designing artificial agents with certain salient behavioral characteristics.

  19. Observed differences in upper extremity forces, muscle efforts, postures, velocities and accelerations across computer activities in a field study of office workers

    NARCIS (Netherlands)

    Garza, J.L.B.; Eijckelhof, B.H.W.; Johnson, P.W.; Raina, S.M.; Rynell, P.W.; Huysmans, M.A.; Dieën, J.H. van; Beek, A.J. van der; Blatter, B.M.; Dennerlein, J.T.

    2012-01-01

    This study, a part of the PRedicting Occupational biomechanics in OFfice workers (PROOF) study, investigated whether there are differences in field-measured forces, muscle efforts, postures, velocities and accelerations across computer activities. These parameters were measured continuously for 120

  20. Collective Tuning Initiative: automating and accelerating development and optimization of computing systems

    OpenAIRE

    Fursin, Grigori

    2009-01-01

    International audience Computing systems rarely deliver the best possible performance due to ever-increasing hardware and software complexity and the limitations of current optimization technology. Additional code and architecture optimizations are often required to improve the execution time, size, power consumption, reliability and other important characteristics of computing systems. However, this is often a tedious, repetitive, isolated and time-consuming process. In order to automate, simplify ...

  1. Strategic Responsiveness

    DEFF Research Database (Denmark)

    Pedersen, Carsten; Juul Andersen, Torben

    The analysis of major resource committing decisions is a central focus in the strategy field, but despite decades of rich conceptual and empirical research we still seem distant from a level of understanding that can guide corporate practices under dynamic and unpredictable conditions. Strategic decision making is often conceived as ‘standing on the two feet’ of deliberate or intended strategic decisions by top management and emergent strategic decisions pursued by lower-level managers and employees. In this view, the paper proposes that bottom-up initiatives have a hard time surfacing in hierarchical organizations and that lower-level managers and employees, therefore, pursue various strategies to bypass the official strategy processes to act on emerging strategic issues and adapt to changing environmental conditions.

  2. Strategic Forecasting

    DEFF Research Database (Denmark)

    Duus, Henrik Johannsen

    2016-01-01

    Purpose: The purpose of this article is to present an overview of the area of strategic forecasting and its research directions and to put forward some ideas for improving management decisions. Design/methodology/approach: This article is conceptual but also informed by the author’s long contact and collaboration with various business firms. It starts by presenting an overview of the area and argues that the area is as much a way of thinking as a toolbox of theories and methodologies. It then spells out a number of research directions and ideas for management. Findings: Strategic forecasting is seen as a rebirth of long range planning, albeit with new methods and theories. Firms should make the building of strategic forecasting capability a priority. Research limitations/implications: The article subdivides strategic forecasting into three research avenues and suggests avenues for further research efforts...

  3. Strategic Forecasting

    DEFF Research Database (Denmark)

    Duus, Henrik Johannsen

    2016-01-01

    Purpose: The purpose of this article is to present an overview of the area of strategic forecasting and its research directions and to put forward some ideas for improving management decisions. Design/methodology/approach: This article is conceptual but also informed by the author’s long contact and collaboration with various business firms. It starts by presenting an overview of the area and argues that the area is as much a way of thinking as a toolbox of theories and methodologies. It then spells out a number of research directions and ideas for management. Findings: Strategic forecasting is seen as a rebirth of long range planning, albeit with new methods and theories. Firms should make the building of strategic forecasting capability a priority. Research limitations/implications: The article subdivides strategic forecasting into three research avenues and suggests avenues for further research efforts...

  4. Strategic Management

    OpenAIRE

    Vančata, Jan

    2012-01-01

    Strategic management is a process in which managers determine the long-term direction of the company, set specific performance targets and develop appropriate strategies to achieve those objectives, considering all reasonable internal and external factors of the company, and then take concrete steps to realize the selected plan. Why is strategic management important in a company? The answer is quite simple. It assigns a specific role to each person, it leads towards the differenc...

  5. Study of irradiation induced restructuring of high burnup fuel - Use of computer and accelerator for fuel science and engineering -

    Energy Technology Data Exchange (ETDEWEB)

    Sataka, M.; Ishikawa, N.; Chimn, Y.; Nakamura, J.; Amaya, M. [Japan Atomic Energy Agency, Naka Gun (Japan); Iwasawa, M.; Ohnuma, T.; Sonoda, T. [Central Research Institute of Electric Power Industry, Tokyo (Japan); Kinoshita, M.; Geng, H. Y.; Chen, Y.; Kaneta, Y. [The Univ. of Tokyo, Tokyo (Japan); Yasunaga, K.; Matsumura, S.; Yasuda, K. [Kyushu Univ., Motooka (Japan); Iwase [Osaka Prefecture Univ., Osaka (Japan); Ichinomiya, T.; Nishiuran, Y. [Hokkaido Univ., Kitaku (Japan); Matzke, HJ. [Academy of Ceramics, Karlsruhe (Germany)

    2008-10-15

    In order to develop advanced fuel for future LWRs, trials were made to simulate the high-burnup restructuring of the ceramic fuel using accelerator irradiation out of pile together with computer simulation. The target is to reproduce the principal complex process as a whole. The reproduction of grain subdivision (sub-grain formation) was successful in experiments with sequential combined irradiation. It proceeded by a recovery process of the accumulated dislocations, forming cells and sub-boundaries at grain boundaries and pore surfaces. Details of the grain subdivision mechanism can thus now be studied outside the reactor. Extensive computational studies, first-principles and molecular dynamics, gave the behavior of fission gas atoms and interstitial oxygen, assisting the high-burnup restructuring.

  6. Study of irradiation induced restructuring of high burnup fuel - Use of computer and accelerator for fuel science and engineering -

    International Nuclear Information System (INIS)

    In order to develop advanced fuel for future LWRs, trials were made to simulate the high-burnup restructuring of the ceramic fuel using accelerator irradiation out of pile together with computer simulation. The target is to reproduce the principal complex process as a whole. The reproduction of grain subdivision (sub-grain formation) was successful in experiments with sequential combined irradiation. It proceeded by a recovery process of the accumulated dislocations, forming cells and sub-boundaries at grain boundaries and pore surfaces. Details of the grain subdivision mechanism can thus now be studied outside the reactor. Extensive computational studies, first-principles and molecular dynamics, gave the behavior of fission gas atoms and interstitial oxygen, assisting the high-burnup restructuring.

  7. Computation of thermal properties via 3D homogenization of multiphase materials using FFT-based accelerated scheme

    CERN Document Server

    Lemaitre, Sophie; Choi, Daniel; Karamian, Philippe

    2015-01-01

    In this paper we study the effective thermal behaviour of a 3D multiphase composite material consisting of three isotropic phases: the matrix, the inclusions and the coating medium. For this purpose we use an accelerated FFT-based scheme, initially proposed in Eyre and Milton (1999), to evaluate the thermal conductivity tensor. The matrix and spherical inclusion media are polymers with similar properties, whereas the coating medium is metallic and hence better conducting, so the contrast between the coating and the other media is very large. For our study, we use RVEs (representative volume elements) generated by the RSA (random sequential adsorption) method developed in our previous works; we then compute effective thermal properties using an FFT-based homogenization technique validated by comparison with the direct finite element method. We study the thermal behaviour of the 3D multiphase composite material and show which features should be taken into account to make the computational approach efficient.

  8. Large full band gaps for photonic crystals in two dimensions computed by an inverse method with multigrid acceleration

    Science.gov (United States)

    Chern, R. L.; Chang, C. Chung; Chang, Chien C.; Hwang, R. R.

    2003-08-01

    In this study, two fast and accurate methods of inverse iteration with multigrid acceleration are developed to compute band structures of photonic crystals of general shape. In particular, we report two-dimensional silicon/air photonic crystals with an optimal full band gap of gap-midgap ratio Δω/ωmid=0.2421, which is 30% larger than any previously reported in the literature. The crystals consist of a hexagonal array of circular columns, each connected to its nearest neighbors by slender rectangular rods. A systematic study with respect to the geometric parameters of the photonic crystals was made possible with the present method in drawing a three-dimensional band-gap diagram within reasonable computing time.
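
    Inverse iteration itself, the eigensolver that the multigrid scheme accelerates, is compact enough to sketch generically. The following SciPy sketch uses a 1-D Laplacian as a stand-in operator rather than the discretized Maxwell operator of the paper, and omits the multigrid acceleration entirely.

        import numpy as np
        import scipy.sparse as sp
        from scipy.sparse.linalg import splu

        def inverse_iteration(A, sigma=0.0, tol=1e-10, max_iter=200):
            """Return the eigenpair of A closest to the shift sigma."""
            n = A.shape[0]
            lu = splu((A - sigma * sp.identity(n, format="csc")).tocsc())
            v = np.random.default_rng(0).standard_normal(n)
            v /= np.linalg.norm(v)
            lam = sigma
            for _ in range(max_iter):
                w = lu.solve(v)                      # one shifted solve per iteration
                w /= np.linalg.norm(w)
                lam_new = w @ (A @ w)                # Rayleigh quotient estimate
                if abs(lam_new - lam) < tol:
                    return lam_new, w
                lam, v = lam_new, w
            return lam, v

        # 1-D Laplacian as a stand-in for the discretized operator of the band-structure problem.
        n = 500
        A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
        lam, v = inverse_iteration(A, sigma=0.0)
        print("eigenvalue closest to the shift:", lam)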

  9. Computer-controlled back scattering and sputtering-experiment using a heavy-ion-accelerator

    International Nuclear Information System (INIS)

    Control and data acquisition with a PDP 11/40 computer and CAMAC instrumentation are reported for an experiment developed to measure sputtering yields and energy losses of heavy 100 - 300 keV ions in thin metal foils. Besides a quadrupole mass filter or a bending magnet, a multichannel analyser is coupled to the computer, so that pulse-height analysis can also be performed under computer control. The CAMAC instrumentation and measuring programs are built in modular form to enable easy application to other experimental problems. (orig.)

  10. [Series: Medical Applications of the PHITS Code (2): Acceleration by Parallel Computing].

    Science.gov (United States)

    Furuta, Takuya; Sato, Tatsuhiko

    2015-01-01

    Time-consuming Monte Carlo dose calculations have become feasible owing to the development of computer technology. However, the recent gains come largely from the emergence of multi-core high-performance computers, so parallel computing has become a key to achieving good performance of software programs. The Monte Carlo simulation code PHITS contains two parallel computing functions: distributed-memory parallelization using the message passing interface (MPI) protocol and shared-memory parallelization using open multi-processing (OpenMP) directives. Users can choose between the two functions according to their needs. This paper explains the two functions together with their advantages and disadvantages. Some test applications are also provided to show their performance on a typical multi-core high-performance workstation.
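
    PHITS itself is a compiled code driven by its own input files, so the sketch below is only a generic illustration of the distributed-memory (MPI) style of parallelism described above: independent Monte Carlo histories are split across ranks with mpi4py and a shared tally is summed at the end. The history count and tally shape are invented for the example.

        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        n_histories = 100_000                        # total histories (illustrative)
        local_n = n_histories // size + (rank < n_histories % size)

        rng = np.random.default_rng(seed=rank)       # independent random stream per rank
        local_tally = np.zeros(100)                  # stand-in for a 1-D dose tally
        for _ in range(local_n):
            bin_index = rng.integers(0, 100)         # stand-in for one transport history
            local_tally[bin_index] += rng.random()

        total_tally = np.zeros_like(local_tally)
        comm.Reduce(local_tally, total_tally, op=MPI.SUM, root=0)
        if rank == 0:
            print("tally sum over all ranks:", total_tally.sum())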

  11. Accelerating Dust Storm Simulation by Balancing Task Allocation in Parallel Computing Environment

    Science.gov (United States)

    Gui, Z.; Yang, C.; XIA, J.; Huang, Q.; YU, M.

    2013-12-01

    Dust storms have serious negative impacts on the environment, human health, and assets. Continuing global climate change has increased the frequency and intensity of dust storms in the past decades. To better understand and predict the distribution, intensity and structure of dust storms, a series of dust storm models have been developed, such as the Dust Regional Atmospheric Model (DREAM), the NMM meteorological module (NMM-dust) and the Chinese Unified Atmospheric Chemistry Environment for Dust (CUACE/Dust). The development and application of these models have contributed significantly to both scientific research and our daily life. However, dust storm simulation is a data- and computing-intensive process: a simulation of a single dust storm event may take hours or even days to run, which seriously impacts the timeliness of prediction and potential applications. To speed up the process, high performance computing is widely adopted. By partitioning a large study area into small subdomains according to their geographic location and executing them on different computing nodes in parallel, the computing performance can be significantly improved. Since spatiotemporal correlations exist in the geophysical process of dust storm simulation, each subdomain allocated to a node needs to communicate with geographically adjacent subdomains to exchange data. Inappropriate allocations may introduce imbalanced task loads and unnecessary communication among computing nodes. The task allocation method is therefore a key factor that may determine the feasibility of the parallelization. The allocation algorithm needs to carefully balance the computing cost and communication cost of each computing node to minimize the total execution time and reduce the overall communication cost of the entire system. This presentation introduces two algorithms for such allocation and compares them with an evenly distributed allocation method. Specifically, 1) In order to get optimized solutions, a
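
    As a simplified illustration of the allocation trade-off discussed above (not one of the algorithms compared in the presentation), the sketch below uses a greedy heuristic that hands the heaviest subdomains first to the currently least-loaded node. The per-subdomain weights standing in for estimated computing cost are invented for the example, and communication cost is ignored.

        import heapq

        def greedy_allocate(subdomain_weights, n_nodes):
            """Assign subdomains to nodes so the heaviest-loaded node stays as light as possible."""
            heap = [(0.0, node) for node in range(n_nodes)]   # (current load, node id)
            heapq.heapify(heap)
            assignment = {}
            for sub, w in sorted(subdomain_weights.items(), key=lambda kv: -kv[1]):
                load, node = heapq.heappop(heap)              # least-loaded node so far
                assignment[sub] = node
                heapq.heappush(heap, (load + w, node))
            return assignment

        # Invented per-subdomain cost estimates; a real allocator would also penalize
        # placing geographically adjacent subdomains on different nodes (communication cost).
        weights = {"d1": 9.0, "d2": 7.5, "d3": 4.0, "d4": 3.5, "d5": 1.0}
        print(greedy_allocate(weights, n_nodes=2))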

  12. Emerging Multinational Companies and Strategic Fit

    DEFF Research Database (Denmark)

    Gammeltoft, Peter; Filatotchev, Igor; Hobdari, Bersant

    2012-01-01

    By building on the early literature on fit in strategic management, we outline an institutional framework of strategic fit. This theoretical approach may provide important insights concerning both the original impetus to the contemporary acceleration of these flows and their specific features.

  13. Accelerating selected columns of the density matrix computations via approximate column selection

    CERN Document Server

    Damle, Anil; Ying, Lexing

    2016-01-01

    Localized representation of the Kohn-Sham subspace plays an important role in quantum chemistry and materials science. The recently developed selected columns of the density matrix (SCDM) method [J. Chem. Theory Comput. 11, 1463, 2015] is a simple and robust procedure for finding a localized representation of a set of Kohn-Sham orbitals from an insulating system. The SCDM method allows the direct construction of a well-conditioned (or even orthonormal) and localized basis for the Kohn-Sham subspace. It avoids the use of an optimization procedure and does not depend on any adjustable parameters. The most computationally expensive step of the SCDM method is a column-pivoted QR factorization that identifies the important columns for constructing the localized basis set. In this paper, we develop a two-stage approximate column selection strategy to find the important columns at much lower computational cost. We demonstrate the effectiveness of this process using a dissociation process of a BH$_{3}...
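
    The column-pivoted QR step at the heart of SCDM can be sketched with SciPy. The example below selects grid points from a random stand-in for the orbital matrix; it illustrates the plain factorization only, not the two-stage approximate selection strategy developed in the paper.

        import numpy as np
        from scipy.linalg import qr

        rng = np.random.default_rng(0)
        # Stand-in for Kohn-Sham orbitals sampled on a real-space grid
        # (rows: grid points, columns: orbitals); the sizes are illustrative only.
        Psi = rng.standard_normal((2000, 40))

        # Column-pivoted QR of Psi^T ranks the grid points (columns of Psi^T) by how well
        # they span the orbital subspace -- the "selected columns" of the density matrix.
        Q, R, piv = qr(Psi.T, mode="economic", pivoting=True)
        selected = piv[:Psi.shape[1]]                # one grid point per orbital
        print("selected grid points:", selected[:10])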

  14. Acceleration of FEM-based transfer matrix computation for forward and inverse problems of electrocardiography.

    Science.gov (United States)

    Farina, Dmytro; Jiang, Y; Dössel, O

    2009-12-01

    The distributions of transmembrane voltage (TMV) within the cardiac tissue are linearly connected with the patient's body surface potential maps (BSPMs) at every time instant. The matrix describing the relation between the respective distributions is referred to as the transfer matrix. This matrix can be employed to carry out forward calculations in order to find the BSPM for any given distribution of TMV inside the heart. Its inverse can be used to reconstruct the cardiac activity non-invasively, which can be an important diagnostic tool in clinical practice. The computation of this matrix using the finite element method can be quite time-consuming. In this work, a method is proposed that speeds up this process by computing an approximate transfer matrix instead of the precise one. The method is tested on three realistic anatomical models of real-world patients. It is shown that the computation time can be reduced by 50% without loss of accuracy.

  15. Intro - High Performance Computing for 2015 HPC Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Klitsner, Tom [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    The recent Executive Order creating the National Strategic Computing Initiative (NSCI) recognizes the value of high performance computing for economic competitiveness and scientific discovery and commits to accelerate delivery of exascale computing. The HPC programs at Sandia –the NNSA ASC program and Sandia’s Institutional HPC Program– are focused on ensuring that Sandia has the resources necessary to deliver computation in the national interest.

  16. PIC codes for plasma accelerators on emerging computer architectures (GPUS, Multicore/Manycore CPUS)

    Science.gov (United States)

    Vincenti, Henri

    2016-03-01

    The advent of exascale computers will enable 3D simulations of new laser-plasma interaction regimes that were previously out of reach of current petascale computers. However, the paradigm used to write current PIC codes will have to change in order to fully exploit the potential of these new computing architectures. Indeed, achieving exascale computing facilities in the next decade will be a great challenge in terms of energy consumption and will imply hardware developments directly impacting our way of implementing PIC codes. As data movement (from die to network) is by far the most energy-consuming part of an algorithm, future computers will tend to increase memory locality at the hardware level and reduce the energy consumed by data movement by using more and more cores on each compute node (''fat nodes'') with a reduced clock speed to allow for efficient cooling. To compensate for the frequency decrease, CPU vendors are making use of long SIMD instruction registers that are able to process multiple data with one arithmetic operator in one clock cycle. SIMD register length is expected to double every four years. GPUs also have a reduced clock speed per core and can process Multiple Instructions on Multiple Data (MIMD). At the software level, Particle-In-Cell (PIC) codes will thus have to achieve both good memory locality and vectorization (for multicore/manycore CPUs) to fully take advantage of these upcoming architectures. In this talk, we present the portable solutions we implemented in our high-performance skeleton PIC code PICSAR to achieve both good memory locality and cache reuse as well as good vectorization on SIMD architectures. We also present the portable solutions used to parallelize the pseudo-spectral quasi-cylindrical code FBPIC on GPUs using the Numba python compiler.

  17. LCODE: a parallel quasistatic code for computationally heavy problems of plasma wakefield acceleration

    CERN Document Server

    Sosedkin, Alexander

    2015-01-01

    LCODE is a freely-distributed quasistatic 2D3V code for simulating plasma wakefield acceleration, mainly specialized at resource-efficient studies of long-term propagation of ultrarelativistic particle beams in plasmas. The beam is modeled with fully relativistic macro-particles in a simulation window copropagating with the light velocity; the plasma can be simulated with either kinetic or fluid model. Several techniques are used to obtain exceptional numerical stability and precision while maintaining high resource efficiency, enabling LCODE to simulate the evolution of long particle beams over long propagation distances even on a laptop. A recent upgrade enabled LCODE to perform the calculations in parallel. A pipeline of several LCODE processes communicating via MPI (Message-Passing Interface) is capable of executing multiple consecutive time steps of the simulation in a single pass. This approach can speed up the calculations by hundreds of times.

  18. LCODE: A parallel quasistatic code for computationally heavy problems of plasma wakefield acceleration

    Science.gov (United States)

    Sosedkin, A. P.; Lotov, K. V.

    2016-09-01

    LCODE is a freely distributed quasistatic 2D3V code for simulating plasma wakefield acceleration, mainly specialized at resource-efficient studies of long-term propagation of ultrarelativistic particle beams in plasmas. The beam is modeled with fully relativistic macro-particles in a simulation window copropagating with the light velocity; the plasma can be simulated with either kinetic or fluid model. Several techniques are used to obtain exceptional numerical stability and precision while maintaining high resource efficiency, enabling LCODE to simulate the evolution of long particle beams over long propagation distances even on a laptop. A recent upgrade enabled LCODE to perform the calculations in parallel. A pipeline of several LCODE processes communicating via MPI (Message-Passing Interface) is capable of executing multiple consecutive time steps of the simulation in a single pass. This approach can speed up the calculations by hundreds of times.
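
    The pipelining idea can be illustrated with a toy mpi4py sketch in which each MPI rank owns one time step and beam slices stream through the ranks, so several time steps are in flight at once. This is a generic illustration of the pattern, not LCODE's actual implementation; the slice contents and the advance function are placeholders.

        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        def advance(beam_slice, step):
            """Stand-in for pushing one beam slice through one simulation time step."""
            return beam_slice + 0.01 * (step + 1)

        n_slices = 16                                # the beam sliced along its length (illustrative)
        for s in range(n_slices):
            if rank == 0:
                beam_slice = np.full(1000, float(s))         # stand-in for a freshly read slice
            else:
                beam_slice = comm.recv(source=rank - 1, tag=s)
            beam_slice = advance(beam_slice, step=rank)      # this rank owns one time step
            if rank < size - 1:
                comm.send(beam_slice, dest=rank + 1, tag=s)
            elif s == n_slices - 1:
                print(f"all {n_slices} slices advanced through {size} time steps")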

  19. ActiWiz – optimizing your nuclide inventory at proton accelerators with a computer code

    CERN Document Server

    Vincke, Helmut

    2014-01-01

    When operating an accelerator one always faces unwanted, but inevitable, beam losses. These result in activation of adjacent material, which in turn has an obvious impact on safety and handling constraints. One of the key parameters responsible for activation is the chemical composition of the material, which can often be optimized in that respect. In order to facilitate this task also for non-expert users, the ActiWiz software has been developed at CERN. Based on a large amount of generic FLUKA Monte Carlo simulations, the software applies a specifically developed risk assessment model to provide support to decision makers, especially during the design phase as well as in common operational work in the domain of radiation protection.

  20. A Fast GPU-accelerated Mixed-precision Strategy for Fully Nonlinear Water Wave Computations

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig; Engsig-Karup, Allan Peter; Madsen, Morten G.

    2011-01-01

    -preconditioned defect correction method. The strategy improves performance by exploiting architectural features of modern GPUs for mixed-precision computations, and is tested in a recently developed generic library for fast prototyping of PDE solvers. The new wave tool is applicable to solve and analyze...
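
    The defect-correction idea behind such a mixed-precision strategy can be sketched in a few lines: the residual (defect) is evaluated in double precision, while the approximate correction comes from a cheaper single-precision solve standing in for the GPU-preconditioned step. This is a generic NumPy illustration, not the library's actual solver.

        import numpy as np

        def defect_correction(A, b, inner_solve, tol=1e-10, max_iter=50):
            """Outer defect-correction loop in float64 with an approximate inner solve."""
            x = np.zeros_like(b)
            for _ in range(max_iter):
                r = b - A @ x                        # defect, evaluated in double precision
                if np.linalg.norm(r) < tol * np.linalg.norm(b):
                    break
                x = x + inner_solve(r)               # cheap approximate correction
            return x

        rng = np.random.default_rng(1)
        n = 200
        A = rng.standard_normal((n, n)) + n * np.eye(n)      # well-conditioned test matrix
        b = rng.standard_normal(n)

        # The inner solve runs entirely in single precision (stand-in for the GPU step).
        A32 = A.astype(np.float32)
        inner = lambda r: np.linalg.solve(A32, r.astype(np.float32)).astype(np.float64)

        x = defect_correction(A, b, inner)
        print("relative residual:", np.linalg.norm(b - A @ x) / np.linalg.norm(b))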

  1. Strategic serendipity

    DEFF Research Database (Denmark)

    Knudsen, Gry Høngsmark; Lemmergaard, Jeanette

    2014-01-01

    This paper contributes to critical voices on the issue of strategic communication. It does so by exploring how an organisation can seize the moment of serendipity based on careful preparation of its issues management and communication channels. The focus of the study is the media coverage of – and communicative responses to – Kopenhagen Fur's campaign The World's Best – but not perfect in both broadcast media (e.g. print and television) and social media, more specifically Facebook. Through understanding how an organisation can plan for and take advantage of the unpredictable through state-of-the-art knowledge and in-depth understanding of the affordances of different communication channels, we discuss the importance of establishing opportunities for serendipity in strategic communication planning. The contribution of the paper is to develop the concept of strategic serendipity and show how...

  2. Strategic analysis

    OpenAIRE

    Chládek, Vítězslav

    2012-01-01

    The objective of this Bachelor thesis is to carry out a strategic analysis of a Czech owned limited company, Česky národní podnik s.r.o. This company sells traditional Czech products and manufactures cosmetics and body care products. The first part of the thesis provides theoretical background and methodology that are used later for the strategic analysis of the company. The theory outlined in this paper is based on the analysis of external and internal factors. Firstly the PEST analysis has ...

  3. Strategic analysis

    OpenAIRE

    Bartuňková, Alena

    2008-01-01

    The objective of this Bachelor thesis is to carry out a strategic analysis of a Czech owned limited company, Česky národní podnik s.r.o. This company sells traditional Czech products and manufactures cosmetics and body care products. The first part of the thesis provides theoretical background and methodology that are used later for the strategic analysis of the company. The theory outlined in this paper is based on the analysis of external and internal factors. Firstly the PEST analysis has ...

  4. Computer programme for control and maintenance and object-oriented database: application to the realisation of a particle accelerator, the VIVITRON

    International Nuclear Information System (INIS)

    The command and control system for the Vivitron, a new-generation electrostatic particle accelerator, has been implemented using workstations and front-end computers based on the VME standard, within a UNIX/VxWorks environment. This architecture is distributed over an Ethernet network. Measurements and commands for the different sensors and actuators are concentrated in the front-end computers. The development of a second version of the software, giving better performance and more functionality, is described. X11-based communication is used to transmit all the information necessary to display parameters from the front-end computers on the graphic screens. All other communication between processes uses the Remote Procedure Call (RPC) method. The design of the system is based largely on the object-oriented database O2, which integrates a full description of the equipment and the code necessary to manage it. This code is generated by the database. This innovation permits easy maintenance of the system and removes the need for a specialist when adding new equipment. The new version of the command and control system has been progressively installed since August 1995. (author)

  5. Jacobian-free Newton-Krylov methods with GPU acceleration for computing nonlinear ship wave patterns

    CERN Document Server

    Pethiyagoda, Ravindra; Moroney, Timothy J; Back, Julian M

    2014-01-01

    The nonlinear problem of steady free-surface flow past a submerged source is considered as a case study for three-dimensional ship wave problems. Of particular interest is the distinctive wedge-shaped wave pattern that forms on the surface of the fluid. By reformulating the governing equations with a standard boundary-integral method, we derive a system of nonlinear algebraic equations that enforce a singular integro-differential equation at each midpoint on a two-dimensional mesh. Our contribution is to solve the system of equations with a Jacobian-free Newton-Krylov method together with a banded preconditioner that is carefully constructed with entries taken from the Jacobian of the linearised problem. Further, we are able to utilise graphics processing unit acceleration to significantly increase the grid refinement and decrease the run-time of our solutions in comparison to schemes that are presently employed in the literature. Our approach provides opportunities to explore the nonlinear features of three-...
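
    The Jacobian-free ingredient can be sketched generically: the Jacobian-vector product is approximated by a finite difference of the residual function, wrapped in a LinearOperator, and handed to a Krylov solver. The toy residual below stands in for the singular integro-differential equations of the paper, and the banded preconditioner is omitted.

        import numpy as np
        from scipy.sparse.linalg import LinearOperator, gmres

        def residual(u):
            """Toy nonlinear residual standing in for the discretized boundary-integral equations."""
            return u**3 + 2.0 * u - 1.0

        def jfnk(F, u0, tol=1e-10, max_newton=20, eps=1e-7):
            u = u0.copy()
            for _ in range(max_newton):
                r = F(u)
                if np.linalg.norm(r) < tol:
                    break
                # Jacobian-vector product J(u) v ~ (F(u + eps*v) - F(u)) / eps, no Jacobian stored.
                J = LinearOperator((u.size, u.size),
                                   matvec=lambda v: (F(u + eps * v) - r) / eps,
                                   dtype=float)
                du, _ = gmres(J, -r)
                u = u + du
            return u

        u = jfnk(residual, np.zeros(100))
        print("max residual after Newton-Krylov:", np.abs(residual(u)).max())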

  6. Adjacency-Based Data Reordering Algorithm for Acceleration of Finite Element Computations

    Directory of Open Access Journals (Sweden)

    Min Zhou

    2010-01-01

    Full Text Available Effective use of the processor memory hierarchy is an important issue in high performance computing. In this work, a part-level mesh topological traversal algorithm is used to define a reordering of both mesh vertices and regions that increases the spatial locality of data and improves overall cache utilization during on-processor finite element calculations. Examples based on adaptively created unstructured meshes are considered to demonstrate the effectiveness of the procedure in cases where the load per processing core is varied but balanced (e.g., elements are equally distributed across cores for a given partition). In one example, the effect of the current adjacency-based data reordering is studied for different phases of an implicit analysis, including element-data blocking, element-level computations, sparse-matrix filling and equation solution. These results are compared to a case where reordering is applied to mesh vertices only. The computations are performed on various supercomputers including IBM Blue Gene (BG/L and BG/P), Cray XT (XT3 and XT5) and the Sun Constellation Cluster. It is observed that reordering improves the per-core performance by up to 24% on Blue Gene/L and up to 40% on Cray XT5. The CrayPat hardware performance tool is used to measure the number of cache misses across each level of the memory hierarchy. It is determined that the measured decrease in L1, L2 and L3 cache misses when data reordering is used closely accounts for the observed decrease in the overall execution time.
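
    The same idea, renumbering mesh entities so that neighbours in the adjacency graph end up close together in memory, can be tried on any sparse adjacency structure with the reverse Cuthill-McKee ordering shipped with SciPy. This is a generic bandwidth-reducing reordering on a random graph, not the part-level topological traversal used in the paper.

        import numpy as np
        from scipy.sparse import random as sparse_random
        from scipy.sparse.csgraph import reverse_cuthill_mckee

        # Random sparse matrix standing in for the vertex adjacency of an unstructured mesh.
        A = sparse_random(1000, 1000, density=0.005, random_state=2, format="csr")
        A = ((A + A.T) > 0).astype(np.int8).tocsr()          # symmetrize the pattern

        perm = reverse_cuthill_mckee(A, symmetric_mode=True)
        A_reordered = A[perm][:, perm]

        def bandwidth(M):
            rows, cols = M.nonzero()
            return int(np.abs(rows - cols).max())

        # A smaller bandwidth means adjacent entities sit closer together in memory.
        print("bandwidth before:", bandwidth(A), "after:", bandwidth(A_reordered))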

  7. Strategic Bonding.

    Science.gov (United States)

    Davis, Lynn; Tyson, Ben

    2003-01-01

    Many school buildings are in dire need of renovation, expansion, or replacement. Brief case studies from around the country illustrate the importance of finding out why people vote for or against a construction referendum. Lists recommendations for a strategic campaign. (MLF)

  8. Computational acceleration of orbital neutral sensor ionizer simulation through phenomena separation

    Science.gov (United States)

    Font, Gabriel I.

    2016-07-01

    Simulation of orbital phenomena is often difficult because of the non-continuum nature of the flow, which forces the use of particle methods, and the disparate time scales, which make long run times necessary. In this work, the computational work load has been reduced by taking advantage of the low number of collisions between different species. This allows each population of particles to be brought into convergence separately, using a time step size optimized for its particular motion. The converged populations are then brought together to simulate low-probability phenomena, such as ionization or excitation, on much longer time scales. This technique has the effect of reducing run times by a factor of 10^3-10^4. The technique was applied to the simulation of a low Earth orbit neutral species sensor with an ionizing element. Comparison with laboratory experiments of ion impacts generated by electron flux shows very good agreement.

  9. Tempest: Accelerated MS/MS Database Search Software for Heterogeneous Computing Platforms.

    Science.gov (United States)

    Adamo, Mark E; Gerber, Scott A

    2016-09-07

    MS/MS database search algorithms derive a set of candidate peptide sequences from in silico digest of a protein sequence database, and compute theoretical fragmentation patterns to match these candidates against observed MS/MS spectra. The original Tempest publication described these operations mapped to a CPU-GPU model, in which the CPU (central processing unit) generates peptide candidates that are asynchronously sent to a discrete GPU (graphics processing unit) to be scored against experimental spectra in parallel. The current version of Tempest expands this model, incorporating OpenCL to offer seamless parallelization across multicore CPUs, GPUs, integrated graphics chips, and general-purpose coprocessors. Three protocols describe how to configure and run a Tempest search, including discussion of how to leverage Tempest's unique feature set to produce optimal results. © 2016 by John Wiley & Sons, Inc.

  10. Tempest: Accelerated MS/MS Database Search Software for Heterogeneous Computing Platforms.

    Science.gov (United States)

    Adamo, Mark E; Gerber, Scott A

    2016-01-01

    MS/MS database search algorithms derive a set of candidate peptide sequences from in silico digest of a protein sequence database, and compute theoretical fragmentation patterns to match these candidates against observed MS/MS spectra. The original Tempest publication described these operations mapped to a CPU-GPU model, in which the CPU (central processing unit) generates peptide candidates that are asynchronously sent to a discrete GPU (graphics processing unit) to be scored against experimental spectra in parallel. The current version of Tempest expands this model, incorporating OpenCL to offer seamless parallelization across multicore CPUs, GPUs, integrated graphics chips, and general-purpose coprocessors. Three protocols describe how to configure and run a Tempest search, including discussion of how to leverage Tempest's unique feature set to produce optimal results. © 2016 by John Wiley & Sons, Inc. PMID:27603022

  11. Strategic patenting and software innovation

    OpenAIRE

    Noel, Michael; Schankerman, Mark

    2013-01-01

    Strategic patenting is widely believed to raise the costs of innovating, especially in industries characterised by cumulative innovation. This paper studies the effects of strategic patenting on R&D, patenting and market value in the computer software industry. We focus on two key aspects: patent portfolio size, which affects bargaining power in patent disputes, and the fragmentation of patent rights (‘patent thickets’) which increases the transaction costs of enforcement. We develop a model ...

  12. CudaPre3D: An Alternative Preprocessing Algorithm for Accelerating 3D Convex Hull Computation on the GPU

    Directory of Open Access Journals (Sweden)

    MEI, G.

    2015-05-01

    Full Text Available In the computation of convex hulls for point sets, a preprocessing procedure that filters the input points by discarding non-extreme points is commonly used to improve computational efficiency. We previously proposed a quite straightforward preprocessing approach for accelerating 2D convex hull computation on the GPU. In this paper, we extend that algorithm to 3D cases. The basic ideas behind the two preprocessing algorithms are similar: first, several groups of extreme points are found according to the original set of input points and several rotated versions of the input set; then, a convex polyhedron is created using the found extreme points; and finally, those interior points located inside the formed convex polyhedron are discarded. Experimental results show that, when the proposed preprocessing algorithm is employed, speedups of about 4x on average and 5x to 6x in the best cases are achieved over the cases where it is not used. In addition, more than 95 percent of the input points can be discarded in most experimental tests.
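
    The filtering idea, keeping a small set of extreme points, forming a polyhedron from them and discarding every input point that falls inside it before the full hull computation, can be prototyped on the CPU with SciPy. The sketch below uses only the axis-aligned extremes of a random point cloud; the rotated point sets and GPU kernels of the paper are not reproduced.

        import numpy as np
        from scipy.spatial import ConvexHull, Delaunay

        rng = np.random.default_rng(3)
        points = rng.random((100_000, 3))                    # illustrative 3D point cloud

        # Cheap extreme-point set: the min/max points along each coordinate axis.
        extremes = np.vstack([points[points[:, d].argmin()] for d in range(3)] +
                             [points[points[:, d].argmax()] for d in range(3)])

        # Points inside the polyhedron spanned by the extremes can never be hull vertices,
        # so they are discarded; the extremes themselves are kept as hull candidates.
        inner_region = Delaunay(extremes)
        keep = inner_region.find_simplex(points) < 0         # -1 means outside the polyhedron
        filtered = np.vstack([extremes, points[keep]])

        hull = ConvexHull(filtered)
        print(f"kept {len(filtered)} of {len(points)} points; hull has {len(hull.vertices)} vertices")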

  13. Strategic Marketing

    OpenAIRE

    Potter, Ned

    2012-01-01

    This chapter from The Library Marketing Toolkit focuses on marketing strategy. Marketing is more successful when it happens as part of a constantly-renewing cycle. The aim of this chapter is to demystify the process of strategic marketing, simplifying it into seven key stages with advice on how to implement each one. Particular emphasis is put on dividing your audience and potential audience into segments, and marketing different messages to each group. It includes case studies from Terr...

  14. Thinking strategically.

    Science.gov (United States)

    Goree, Michael

    2002-01-01

    Over the course of the past 20 years, human resources has tried a variety of strategic initiatives to add value to the working environment, from the alphabets of TQM, CQI, EVA, ROI, ISO, QS, Theory X, Y, Z, Generation X and Y to re-engineering, balanced scorecard, lean, hoshin, six sigma, to Margaret Wheatley's "The Simpler Way" and finally to cheese and fish. The problem is that none of these is a strategy. They are all tactics to accomplish or achieve a strategy.

  15. Strategic Management

    CERN Document Server

    Jeffs, Chris

    2008-01-01

    The Sage Course Companion on Strategic Management is an accessible introduction to the subject that avoids lengthy debate in order to focus on the core concepts. It will help the reader to develop their understanding of the key theories, whilst enabling them to bring diverse topics together in line with course requirements. The Sage Course Companion also provides advice on getting the most from your course work; help with analysing case studies and tips on how to prepare for examinations. Designed to complement existing strategy textbooks, the Companion provides: -Quick and easy access to the

  16. Accelerated Aging of BKC 44306-10 Rigid Polyurethane Foam: FT-IR Spectroscopy, Dimensional Analysis, and Micro Computed Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Gilbertson, Robert D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Patterson, Brian M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Smith, Zachary [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-01-02

    An accelerated aging study of BKC 44306-10 rigid polyurethane foam was carried out. Foam samples were aged in a nitrogen atmosphere at three different temperatures: 50 °C, 65 °C, and 80 °C. Foam samples were periodically removed from the aging canisters at 1, 3, 6, 9, 12, and 15 months, when FT-IR spectroscopy, dimensional analysis, and mechanical testing were performed. Micro computed tomography imaging was also employed to study the morphology of the foams. Over the course of the aging study the foams decreased in size by roughly 0.001 inches per inch of foam. Micro CT showed the heterogeneous nature of the foam structure, likely resulting from flow effects during the molding process. The effect of aging on the compression and tensile strength of the foam was minor and no cause for concern. FT-IR spectroscopy was used to follow the foam chemistry; however, it was difficult to draw definitive conclusions about changes in the chemical nature of the materials due to large variability throughout the samples.

  17. On the integration of SOA and cloud computing technology in enterprise informatization strategic planning

    Institute of Scientific and Technical Information of China (English)

    牛昊天

    2014-01-01

    Applying SOA and cloud computing technology to the formulation of enterprise informatization strategic planning is a powerful measure for supporting an enterprise in achieving its established development strategy, and it has positive significance for business process innovation, improving operational efficiency and reducing the cost of information operations. This paper analyses the problems in enterprise informatization strategic planning and the ideas behind such planning, examines the advantages and limitations of SOA technology and cloud computing technology, and, building on the strengths of both, designs a structure that integrates SOA and cloud computing technology so as to better serve enterprise informatization.

  18. Strategic Engagement

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    “Pakistan regards China as a strategic partner and the bilateral ties have endured the test of time.” Pakistani Prime Minister Shaukat Aziz made the comment during his four-day official visit to China on April 16, when he met Chinese President Hu Jintao, Premier Wen Jiabao and NPC Standing Committee Chairman Wu Bangguo. His visit to China also included a trip to Boao, where he delivered a keynote speech at the Boao Forum for Asia held on April 20-22. During his stay in Beijing, the two countries signed 13 agreements on cooperation in the fields of space, telecommunications, education and legal assistance, which enhanced an already close strategic partnership. In an interview with Beijing Review reporter Pan Shuangqin, Prime Minister Aziz addressed a number of issues ranging from Asia's search for a win-win economic situation to the influence of Sino-Pakistani relations on regional peace.

  19. Accelerating an Ordered-Subset Low-Dose X-Ray Cone Beam Computed Tomography Image Reconstruction with a Power Factor and Total Variation Minimization.

    Science.gov (United States)

    Huang, Hsuan-Ming; Hsiao, Ing-Tsung

    2016-01-01

    In recent years, there has been increased interest in low-dose X-ray cone beam computed tomography (CBCT) in many fields, including dentistry, guided radiotherapy and small animal imaging. Despite reducing the radiation dose, low-dose CBCT has not gained widespread acceptance in routine clinical practice. In addition to performing more evaluation studies, developing fast, high-quality reconstruction algorithms is required. In this work, we propose an iterative reconstruction method that accelerates ordered-subsets (OS) reconstruction using a power factor and combine it with total-variation (TV) minimization. Both simulation and phantom studies were conducted to evaluate the performance of the proposed method. Results show that the proposed method accelerates conventional OS methods, greatly increasing the convergence speed in early iterations. Moreover, applying TV minimization to the power acceleration scheme can further improve the image quality while preserving the fast convergence rate. PMID:27073853
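
    The total-variation part of such a scheme is often implemented as a few gradient-descent steps on a smoothed TV norm interleaved with the ordered-subsets updates. The sketch below shows only such a TV step on a noisy 2-D image; the step size and iteration count are arbitrary, and the power-factor acceleration itself is not reproduced.

        import numpy as np

        def tv_gradient(img, eps=1e-8):
            """Gradient of a smoothed isotropic total-variation norm of a 2-D image."""
            dx = np.diff(img, axis=1, append=img[:, -1:])
            dy = np.diff(img, axis=0, append=img[-1:, :])
            mag = np.sqrt(dx**2 + dy**2 + eps)
            px, py = dx / mag, dy / mag
            div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
            return -div                                      # descent direction is +div

        def tv_steps(img, n_steps=20, step=0.1):
            """A few explicit gradient-descent steps that reduce the TV norm of the image."""
            out = img.copy()
            for _ in range(n_steps):
                out -= step * tv_gradient(out)
            return out

        noisy = np.clip(np.random.default_rng(4).normal(0.5, 0.2, (128, 128)), 0.0, 1.0)
        smoothed = tv_steps(noisy)
        print("TV measure before/after:",
              np.abs(np.diff(noisy)).sum(), np.abs(np.diff(smoothed)).sum())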

  20. Strategic Planning: What's so Strategic about It?

    Science.gov (United States)

    Strong, Bart

    2005-01-01

    The words "strategic" and "planning" used together can lead to confusion unless one spent the early years of his career in never-ending, team-oriented, corporate training sessions. Doesn't "strategic" have something to do with extremely accurate bombing or a defensive missile system or Star Wars or something? Don't "strategic" and "planning" both…

  1. On "enabling systems - A strategic review"

    Digital Repository Service at National Institute of Oceanography (India)

    Nayak, M.R.

    Enabling Systems is a formal strategic planning exercise that sets its direction in an organization for the 21st century. Information technology (IT), Computer Centre (CC) and Analytical Laboratory (AnLab) are identified as three important...

  2. Using compute unified device architecture-enabled graphic processing unit to accelerate fast Fourier transform-based regression Kriging interpolation on a MODIS land surface temperature image

    Science.gov (United States)

    Hu, Hongda; Shu, Hong; Hu, Zhiyong; Xu, Jianhui

    2016-04-01

    Kriging interpolation provides the best linear unbiased estimation for unobserved locations, but its heavy computation limits the manageable problem size in practice. To address this issue, an efficient interpolation procedure incorporating the fast Fourier transform (FFT) was developed. Extending this efficient approach, we propose an FFT-based parallel algorithm to accelerate regression Kriging interpolation on an NVIDIA® compute unified device architecture (CUDA)-enabled graphic processing unit (GPU). A high-performance cuFFT library in the CUDA toolkit was introduced to execute computation-intensive FFTs on the GPU, and three time-consuming processes were redesigned as kernel functions and executed on the CUDA cores. A MODIS land surface temperature 8-day image tile at a resolution of 1 km was resampled to create experimental datasets at eight different output resolutions. These datasets were used as the interpolation grids with different sizes in a comparative experiment. Experimental results show that speedup of the FFT-based regression Kriging interpolation accelerated by GPU can exceed 1000 when processing datasets with large grid sizes, as compared to the traditional Kriging interpolation running on the CPU. These results demonstrate that the combination of FFT methods and GPU-based parallel computing techniques greatly improves the computational performance without loss of precision.
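
    The computational core that benefits from cuFFT is a large 2-D convolution evaluated via the FFT. The NumPy sketch below shows the pattern on the CPU with an invented covariance-like kernel; the CUDA kernels and the regression-Kriging weights themselves are not reproduced.

        import numpy as np

        def fft_convolve2d(grid, kernel):
            """Circular 2-D convolution via the FFT; pad beforehand if linear convolution is needed."""
            f_grid = np.fft.rfft2(grid)
            f_kernel = np.fft.rfft2(np.fft.ifftshift(kernel), s=grid.shape)
            return np.fft.irfft2(f_grid * f_kernel, s=grid.shape)

        # Invented covariance-like kernel (exponential decay with distance); not Kriging weights.
        n = 1024
        y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
        kernel = np.exp(-np.hypot(x, y) / 50.0)
        kernel /= kernel.sum()

        grid = np.random.default_rng(5).random((n, n))       # stand-in for residuals on the grid
        smoothed = fft_convolve2d(grid, kernel)
        print(smoothed.shape, float(smoothed.mean()))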

  4. Experiences of How Developed Countries Accelerated the Transformation of Scientific and Technological Achievements in Strategic Emerging Industries: Taking Electric Vehicles as an Example

    Institute of Scientific and Technical Information of China (English)

    杨荣

    2013-01-01

    Strategic emerging industries are highly challenging, and raising the conversion rate of their scientific and technological achievements is an important goal. Taking electric vehicles as an example, this paper analyses how developed countries have promoted the transformation of scientific and technological achievements in strategic emerging industries into practice: formulating dedicated strategic plans, enacting special laws and regulations, carrying out research on technical standards, emphasizing industry-university-research cooperation, conducting demonstration operations, promoting the results vigorously, setting up management and intermediary organizations, and focusing on the education and training of professional and technical personnel. This experience offers useful reference value for accelerating the development of strategic emerging industries and improving the conversion rate of their scientific and technological achievements.

  5. Strategic management or strategic planning for defense?

    OpenAIRE

    Tritten, James John; Roberts, Nancy Charlotte

    1989-01-01

    Approved for public release; distribution is unlimited. This report describes problems associated with strategic planning and strategic management within DoD. Authors offer a series of suggested reforms to enhance mono-level planning and management within DoD, primarily by closer ties with industry planning groups, education, organizational structure, management information systems, and better integration. Additional sponsors are: OSD competitive Strategies Office, OSD Strategic Planning B...

  6. Strategizing NATOs Narratives

    DEFF Research Database (Denmark)

    Nissen, Thomas Elkjer

    2014-01-01

    In chapter eleven, Thomas Elkjer Nissen argues that strategic narratives are a means for political actors to construct a shared meaning of international politics and to shape the perceptions, beliefs, and behavior of domestic and international actors. He demonstrates how time, position, legitimacy, implementation structures, and capabilities can be used to inform the construction of strategic narratives in NATO. Using Libya as a case study, he explains that the formulation and implementation of strategic narratives in NATO currently is a fragmented process that rarely takes into account the grand strategic objectives formulated in NATO headquarters. Consequently, the future construction of strategic narratives in NATO must be based on the strategic variables.

  7. 7 March 2013 -Stanford University Professor N. McKeown FREng, Electrical Engineering and Computer Science and B. Leslie, Creative Labs visiting CERN Control Centre and the LHC tunnel with Director for Accelerators and Technology S. Myers.

    CERN Multimedia

    Anna Pantelia

    2013-01-01

    7 March 2013 -Stanford University Professor N. McKeown FREng, Electrical Engineering and Computer Science and B. Leslie, Creative Labs visiting CERN Control Centre and the LHC tunnel with Director for Accelerators and Technology S. Myers.

  8. Revisiting Strategic versus Non-strategic Cooperation

    NARCIS (Netherlands)

    Reuben, E.; Suetens, S.

    2009-01-01

    We use a novel experimental design to disentangle strategically- and non-strategically-motivated cooperation. By using contingent responses in a repeated sequential prisoners’ dilemma with a known probabilistic end, we differentiate end-game behavior from continuation behavior within individuals whi

  9. Strategic Risk Management

    OpenAIRE

    Sax, Johanna

    2015-01-01

    The aim of this thesis is to contribute to the literature with an investigation into strategic risk management practices from a strategic management and management accounting perspective. Previous research in strategic risk management has not provided sufficient evidence on the mechanisms behind firm practices, processes and tools for managing strategic risks, and their contingencies for value creation. In particular, the purpose of the thesis has been to fill the gaps in the l...

  10. Strategic information security

    CERN Document Server

    Wylder, John

    2003-01-01

    Introduction to Strategic Information SecurityWhat Does It Mean to Be Strategic? Information Security Defined The Security Professional's View of Information Security The Business View of Information SecurityChanges Affecting Business and Risk Management Strategic Security Strategic Security or Security Strategy?Monitoring and MeasurementMoving Forward ORGANIZATIONAL ISSUESThe Life Cycles of Security ManagersIntroductionThe Information Security Manager's Responsibilities The Evolution of Data Security to Information SecurityThe Repository Concept Changing Job Requirements Business Life Cycles

  11. Strategic Leadership Reconsidered

    Science.gov (United States)

    Davies, Brent; Davies, Barbara J.

    2005-01-01

    This paper will address the challenge of how strategic leadership can be defined and articulated to provide a framework for developing a strategically focused school drawing on a NCSL research project. The paper is structured into three main parts. Part one outlines the elements that comprise a strategically focused school, develops an…

  12. Optimal Strategic Pricing of Reproducible Consumer Products

    OpenAIRE

    Fernando Nascimento; Vanhonacker, Wilfried R.

    1988-01-01

    This paper investigates the strategic pricing of consumer durable products which can be acquired through either purchase or reproduction (e.g., computer software). As copy piracy results in an opportunity loss, its adverse effect on profits needs to be incorporated in strategic decisions such as pricing. Using a dual diffusion model which parsimoniously describes sales and copying, and employing control theory methodology, optimal price trajectories are derived for the period of monopoly. The...

  13. Contribution to the algorithmic and efficient programming of new parallel architectures including accelerators for neutron physics and shielding computations

    International Nuclear Information System (INIS)

    In science, simulation is a key process for research and validation. Modern computer technology allows faster numerical experiments, which are cheaper than physical models. In the field of neutron simulation, the calculation of eigenvalues is one of the key challenges, and the complexity of these problems is such that a large amount of computing power may be necessary. The work of this thesis is first the evaluation of new computing hardware, such as graphics cards and massively multi-core chips, and its application to eigenvalue problems for neutron simulation. Then, in order to address the massive parallelism of national supercomputers, we also study the use of asynchronous hybrid methods for solving eigenvalue problems at this very high level of parallelism. We then test this work on several national supercomputers, such as the Titane hybrid machine of the Computing Centre for Research and Technology (CCRT), the Curie machine of the Very Large Computing Centre (TGCC), currently being installed, and the Hopper machine at the Lawrence Berkeley National Laboratory (LBNL). We also run our experiments on local workstations to illustrate the interest of this research for everyday use with local computing resources. (author)

  14. Learning without experience: Understanding the strategic implications of deregulation and competition in the electricity industry

    Energy Technology Data Exchange (ETDEWEB)

    Lomi, A. [School of Economics, University of Bologna, Bologna (Italy); Larsen, E.R. [Dept. of Managements Systems and Information, City University Business School, London (United Kingdom)

    1998-11-01

    As deregulation of the electricity industry continues to gain momentum around the world, electricity companies face unprecedented challenges. Competitive complexity and intensity will increase substantially as deregulated companies find themselves competing in new industries, with new rules, against unfamiliar competitors - and without any history to learn from. We describe the different kinds of strategic issues that newly deregulated utility companies are facing, and the risks that these strategic issues entail. We identify a number of problems induced by experiential learning under conditions of competence-destroying change, and we illustrate ways in which companies can activate history-independent learning processes. We suggest that microworlds - a new generation of computer-based learning environments made possible by conceptual and technological progress in the fields of system dynamics and systems thinking - are particularly appropriate tools to accelerate and enhance organizational and managerial learning under conditions of increased competitive complexity. (au)
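
    As a rough illustration of the stock-and-flow logic that such microworld environments wrap in an interactive interface (the functional forms and numbers below are invented, not taken from the article), a few lines of Euler integration already produce a scarcity-driven investment cycle:

        # Capacity investment loop for a deregulated market, Euler-integrated.
        capacity = 100.0            # GW installed
        demand = 90.0               # GW of peak demand
        dt, years = 0.25, 20

        for _ in range(int(years / dt)):
            margin = (capacity - demand) / demand       # reserve margin
            price_signal = max(0.0, 0.15 - margin)      # scarcity raises prices
            investment = 40.0 * price_signal            # GW/yr of new builds
            retirement = 0.03 * capacity                # GW/yr retired
            capacity += (investment - retirement) * dt
            demand *= 1 + 0.02 * dt                     # 2 %/yr demand growth

        print(f"after {years} years: capacity {capacity:.1f} GW, demand {demand:.1f} GW")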

  15. Understanding the effect of touchdown distance and ankle joint kinematics on sprint acceleration performance through computer simulation.

    Science.gov (United States)

    Bezodis, Neil Edward; Trewartha, Grant; Salo, Aki Ilkka Tapio

    2015-06-01

    This study determined the effects of simulated technique manipulations on early acceleration performance. A planar seven-segment angle-driven model was developed and quantitatively evaluated based on the agreement of its output to empirical data from an international-level male sprinter (100 m personal best = 10.28 s). The model was then applied to independently assess the effects of manipulating touchdown distance (horizontal distance between the foot and centre of mass) and range of ankle joint dorsiflexion during early stance on horizontal external power production during stance. The model matched the empirical data with a mean difference of 5.2%. When the foot was placed progressively further forward at touchdown, horizontal power production continually reduced. When the foot was placed further back, power production initially increased (a peak increase of 0.7% occurred at 0.02 m further back) but decreased as the foot continued to touch down further back. When the range of dorsiflexion during early stance was reduced, exponential increases in performance were observed. Increasing negative touchdown distance directs the ground reaction force more horizontally; however, a limit to the associated performance benefit exists. Reducing dorsiflexion, which required achievable increases in the peak ankle plantar flexor moment, appears potentially beneficial for improving early acceleration performance.

  16. Strategic Analysis Overview

    Science.gov (United States)

    Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe

    2008-01-01

    NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction - performance, affordability, and risk - and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.
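
    A toy sketch of the figure-of-merit roll-up described above (scenario names, scores and weights are invented for illustration; the actual FOMs and any weighting are defined by the Constellation analysis process itself):

        # Score two hypothetical scenarios against three figures of merit (FOMs);
        # here every FOM is normalised so that higher is better.
        scenarios = {
            "Scenario A": {"performance": 0.80, "affordability": 0.55, "risk": 0.70},
            "Scenario B": {"performance": 0.65, "affordability": 0.85, "risk": 0.60},
        }
        weights = {"performance": 0.4, "affordability": 0.3, "risk": 0.3}  # assumed

        def weighted_score(foms, w):
            """Weighted sum used only for side-by-side comparison of options."""
            return sum(w[k] * foms[k] for k in w)

        for name, foms in scenarios.items():
            print(f"{name}: {weighted_score(foms, weights):.3f}")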

  17. Implementation of the networked computer based control system for PEFP 100MeV proton linear accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Song, Young Gi; Kwon, Hyeok Jung; Jang, Ji Ho; Cho, Yong Sub [KAERI, Daejeon (Korea, Republic of)

    2012-10-15

    The 100 MeV radio-frequency (RF) linac for the pulsed proton source is under development at KAERI. The main systems of the linac, such as the general timing control, the high-power RF system, the control system of the klystrons, the power supply system of the magnets, the vacuum subsystem, and the cooling system, should be integrated into the PEFP control system. Various subsystem units of the linac are to be made by different manufacturers with different standards. The technical integration will be based upon the Experimental Physics and Industrial Control System (EPICS) software framework. Network-attached computers, such as workstations, servers, VME, and embedded systems, will be applied as control devices. This paper discusses the integration and implementation of the distributed control systems using networked computer systems.
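
    For illustration, client-side access to such an EPICS-integrated control system typically goes through Channel Access; the sketch below uses the pyepics bindings, and the process variable (PV) names are invented examples rather than the actual PEFP naming convention:

        # Read a few linac PVs and write one setpoint over EPICS Channel Access.
        from epics import caget, caput

        pvs = {
            "vacuum gauge":   "LINAC:VAC:GAUGE01:PRESSURE",
            "magnet current": "LINAC:PS:QM01:CURRENT",
            "RF forward pwr": "LINAC:RF:KLY01:FWD_PWR",
        }

        for label, name in pvs.items():
            value = caget(name, timeout=2.0)      # returns None on timeout
            print(f"{label:15s} {name:30s} -> {value}")

        # Setpoint write, blocking until the IOC has processed it
        caput("LINAC:PS:QM01:CURRENT_SP", 12.5, wait=True, timeout=5.0)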

  18. Next Processor Module: A Hardware Accelerator of UT699 LEON3-FT System for On-Board Computer Software Simulation

    Science.gov (United States)

    Langlois, Serge; Fouquet, Olivier; Gouy, Yann; Riant, David

    2014-08-01

    On-Board Computers (OBC) increasingly use integrated systems-on-chip (SoC) that embed processors running from 50 MHz up to several hundreds of MHz, around which are plugged dedicated communication controllers together with other Input/Output channels. For ground testing and On-Board SoftWare (OBSW) validation purposes, a representative simulation of these systems, faster than real time and with cycle-true timing of execution, is not achieved with current purely software simulators. For a few years some hybrid solutions have been put in place ([1], [2]), including hardware in the loop so as to add accuracy and performance to the computer software simulation. This paper presents the results of the work engaged by Thales Alenia Space (TAS-F) at the end of 2010, which led to a validated HW simulator of the UT699 by mid-2012 and which is now qualified and fully used in operational contexts.

  19. Strategic Thoughts in Organizations

    Directory of Open Access Journals (Sweden)

    Juliane Inês Di Francesco Kich

    2014-08-01

    Full Text Available This paper aims to analyze a new way of thinking about organizational strategies through a theoretical discussion of the term "strategic thoughts" and of its development in organizations. To achieve this, bibliographical research was conducted in order to go more deeply into the theme and reach a conceptual background that can support further analysis. Among the results of this research, it is emphasized that the pragmatic character of strategic planning appears to have no more space in the current organizational world; this tool needs to be interconnected with the strategic thought process to bring more effective results. In this regard, the challenge lies in how organizations can develop strategic planning that encourages strategic thought instead of undermining it, as well as in developing tools that promote the ability to think strategically in all employees, regardless of hierarchical level.

  20. Complex Strategic Choices

    DEFF Research Database (Denmark)

    Leleur, Steen

    Complex Strategic Choices provides clear principles and methods which can guide and support strategic decision making to face the many current challenges. By considering ways in which planning practices can be renewed and exploring the possibilities for acquiring awareness and tools to add value to strategic decision making, Complex Strategic Choices presents a methodology which is further illustrated by a number of case studies and example applications. Dr. Techn. Steen Leleur has adapted previously established research based on feedback and input from various conferences, journals and students… The book is a key resource for researchers and students in the field of planning and decision analysis as well as practitioners dealing with strategic analysis and decision making. More broadly, Complex Strategic Choices acts as a guide for professionals and students involved in complex planning tasks across several fields such as business…

  1. Strategic Belief Management

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul

    While (managerial) beliefs are central to many aspects of strategic organization, interactive beliefs are almost entirely neglected, save for some game theory treatments. In an increasingly connected and networked economy, firms confront coordination problems that arise because of network effects… The capability to manage beliefs will increasingly be a strategic one, a key source of wealth creation, and a key research area for strategic organization scholars.

  2. Cultivating strategic thinking skills.

    Science.gov (United States)

    Shirey, Maria R

    2012-06-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches helpful to nurse leaders advancing organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the author presents an overview of strategic leadership and offers approaches for cultivating strategic thinking skills.

  3. Strategic Marketing Planning Audit

    OpenAIRE

    Violeta Radulescu

    2012-01-01

    Market-oriented strategic planning is the process of defining and maintaining a viable relationship between the objectives, personnel training and resources of an organization, on the one hand, and market conditions, on the other. Strategic marketing planning is an integral part of the strategic planning process of the organization. For a marketing organization to be successful in obtaining a competitive advantage, and also to measure the effectiveness of its marketing actions, the company is required to ...

  4. Strategic thinking in business

    OpenAIRE

    Špatenková, Lenka

    2011-01-01

    Strategic thinking in business – summary This diploma work deals with the issue of strategic thinking in business, which is an inseparable part of the development of company strategy. The utilisation of the principles of strategic thinking, as well as of the processes and analyses of strategic management, is shown on the example of REBYTO BEAR Ltd. The Theoretical Background chapter provides an explanation of important terminology whose knowledge is necessary for the practical use of strate...

  5. Telepreneurship : Strategic bliss

    OpenAIRE

    Erasmus, Izak Pierre

    2010-01-01

    The strategic management literature indirectly considers entrepreneurship as a subset of strategy, as does the historical evolution of the field, specifically that of the Entrepreneurship division of the Academy of Management. Schendel (1990) placed great emphasis on the topic of entrepreneurship and admitted that some argue that entrepreneurship is at the very heart of strategic management. This thesis explores the strategic use of entrepreneurship in the telecommunication industry. Throug...

  6. Strategic Management: General Concepts

    OpenAIRE

    Shahram Tofighi

    2010-01-01

    In the era after the substitution of long-term planning by strategic planning, it was hoped that managers could act more successfully in implementing their plans. The outcomes were far from what was expected; there were only minor improvements. In organizations, plenty of nominally strategic plans have been developed during strategic planning processes, but most of these plans have been kept on the shelves; only a few of them played their role as guiding documents for the entire organization. What are the...

  7. Computational study of transport and energy deposition of intense laser-accelerated proton beams in solid density matter

    Science.gov (United States)

    Kim, J.; McGuffey, C.; Qiao, B.; Beg, F. N.; Wei, M. S.; Grabowski, P. E.

    2015-11-01

    With intense proton beams accelerated by high power short pulse lasers, solid targets are isochorically heated to become partially-ionized warm or hot dense matter. In this regime, the thermodynamic state of the matter significantly changes, varying the proton stopping power where both bound and free electrons contribute. Additionally, collective beam-matter interaction becomes important to the beam transport. We present self-consistent hybrid particle-in-cell (PIC) simulation results of proton beam transport and energy deposition in solid-density matter, where the individual proton stopping and the collective effects are taken into account simultaneously with updates of stopping power in the varying target conditions and kinetic motions of the beam in the driven fields. Broadening of propagation range and self-focusing of the beam led to unexpected target heating by the intense proton beams, with dependence on the beam profiles and target conditions. The behavior is specifically studied for the case of an experimentally measured proton beam from the 1.25 kJ, 10 ps OMEGA EP laser transporting through metal foils. This work was supported by the U.S. DOE under Contracts No. DE-NA0002034 and No. DE-AC52-07NA27344 and by the U.S. AFOSR under Contract FA9550-14-1-0346.
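
    A much-reduced sketch of the energy-deposition bookkeeping involved is shown below: protons are marched through one-dimensional slabs with a fixed toy stopping power. The paper's hybrid PIC model, by contrast, updates the stopping power with the evolving target conditions and tracks the beam kinetically in the self-consistent fields, none of which is modelled here.

        import numpy as np

        def deposit_1d(E0_MeV, n_protons, dz_um, n_cells, stopping):
            """March protons through slabs, depositing dE = (dE/dz) * dz per cell."""
            E = np.full(n_protons, E0_MeV, dtype=float)
            deposited = np.zeros(n_cells)
            for i in range(n_cells):
                dE = np.minimum(E, stopping(E) * dz_um)  # cannot lose more than carried
                deposited[i] = dE.sum()
                E -= dE
                if not np.any(E > 0.0):
                    break
            return deposited

        # Crude 1/E-like stopping power in MeV/um; purely illustrative numbers
        toy_stopping = lambda E: 0.1 / np.maximum(E, 0.1)
        profile = deposit_1d(E0_MeV=10.0, n_protons=1000, dz_um=1.0,
                             n_cells=600, stopping=toy_stopping)
        print("Bragg-peak cell:", int(profile.argmax()))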

  8. Prediction of peak ground acceleration of Iran’s tectonic regions using a hybrid soft computing technique

    Institute of Scientific and Technical Information of China (English)

    Mostafa Gandomi; Mohsen Soltanpour; Mohammad R. Zolfaghari; Amir H. Gandomi

    2016-01-01

    A new model is derived to predict the peak ground acceleration (PGA) utilizing a hybrid method coupling an artificial neural network (ANN) and simulated annealing (SA), called SA-ANN. The proposed model relates PGA to earthquake source-to-site distance, earthquake magnitude, average shear-wave velocity, faulting mechanism, and focal depth. A database of strong ground-motion recordings of 36 earthquakes that occurred in Iran's tectonic regions is used to establish the model. For further validation, the SA-ANN model is employed to predict the PGA of a part of the database beyond the training data domain. The proposed SA-ANN model is compared with a simple ANN as well as with 10 well-known models proposed in the literature. The proposed model's performance is superior to that of the single ANN and the other existing attenuation models. The SA-ANN model is highly correlated with the actual records (R = 0.835 and r = 0.0908) and it is subsequently converted into a tractable design equation.
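
    The coupling idea can be sketched in a few lines: let simulated annealing search the weight space of a small feed-forward network against a mean-squared-error objective. Everything below (synthetic data, network size, bounds, iteration budget) is a stand-in, not the paper's actual SA-ANN configuration or dataset.

        import numpy as np
        from scipy.optimize import dual_annealing

        rng = np.random.default_rng(1)
        # Synthetic stand-ins for (distance, magnitude, Vs30, mechanism, depth) -> PGA
        X = rng.random((200, 5))
        y = 0.5 * X[:, 1] - 0.3 * np.log1p(X[:, 0]) + 0.05 * rng.standard_normal(200)

        N_IN, N_HID = 5, 4

        def unpack(theta):
            i = 0
            W1 = theta[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
            b1 = theta[i:i + N_HID]; i += N_HID
            W2 = theta[i:i + N_HID]; i += N_HID
            return W1, b1, W2, theta[i]

        def mse(theta):
            W1, b1, W2, b2 = unpack(theta)
            pred = np.tanh(X @ W1 + b1) @ W2 + b2   # one hidden layer, tanh
            return float(np.mean((pred - y) ** 2))

        n_params = N_IN * N_HID + N_HID + N_HID + 1
        result = dual_annealing(mse, bounds=[(-3, 3)] * n_params, maxiter=200)
        print("training MSE found by annealing:", round(result.fun, 4))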

  9. Vol. 34 - Optimization of quench protection heater performance in high-field accelerator magnets through computational and experimental analysis

    CERN Document Server

    Salmi, Tiina

    2016-01-01

    Superconducting accelerator magnets with increasingly high magnetic fields are being designed to improve the performance of the Large Hadron Collider (LHC) at CERN. One of the technical challenges is the magnet quench protection, i.e., preventing damage in the case of an unexpected loss of superconductivity and the heat generation related to that. Traditionally this is done by disconnecting the magnet current supply and using so-called protection heaters. The heaters suppress the superconducting state across a large fraction of the winding, thus leading to a uniform dissipation of the stored energy. Preliminary studies suggested that the high-field Nb3Sn magnets under development for the LHC luminosity upgrade (HiLumi) could not be reliably protected using the existing heaters. In this thesis work I analyzed in detail the present state-of-the-art protection heater technology, aiming to optimize its performance and evaluate the prospects in high-field magnet protection. The heater efficiency analyses ...

  10. Computer simulation of rocket/missile safing and arming mechanism (containing pin pallet runaway escapement, three-pass involute gear train and acceleration driven rotor)

    Science.gov (United States)

    Gorman, P. T.; Tepper, F. R.

    1986-03-01

    A complete simulation of missile and rocket safing and arming (S&A) mechanisms containing an acceleration-driven rotor, a three-pass involute gear train, and a pin pallet runaway escapement was developed. In addition, a modification to this simulation was formulated for the special case of the PATRIOT M143 S&A mechanism which has a pair of driving gears in addition to the three-pass gear train. The three motion regimes involved in escapement operation - coupled motion, free motion, and impact - are considered in the computer simulation. The simulation determines both the arming time of the device and the non-impact contact forces of all interacting components. The program permits parametric studies to be made, and is capable of analyzing pallets with arbitrarily located centers of mass. A sample simulation of the PATRIOT M143 S&A in an 11.9 g constant acceleration arming test was run. The results were in good agreement with laboratory test data.
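
    A stripped-down illustration of the acceleration-driven rotor portion of such a model is given below; the pin pallet escapement and gear train are lumped into a single viscous drag term, and every parameter value is invented rather than taken from the report:

        import numpy as np
        from scipy.integrate import solve_ivp

        I_rot = 2.0e-7          # kg m^2, rotor moment of inertia (hypothetical)
        m_r = 1.5e-6            # kg m, rotor unbalance (mass x c.m. offset)
        c_drag = 1.0e-6         # N m s, lumped stand-in for the escapement
        a_set = 11.9 * 9.81     # m/s^2, constant setback acceleration (test case)
        theta_arm = np.pi / 2   # rad, rotation required to arm

        def rotor(t, state):
            theta, omega = state
            torque = a_set * m_r - c_drag * omega   # drive minus lumped resistance
            return [omega, torque / I_rot]

        armed = lambda t, state: state[0] - theta_arm
        armed.terminal = True                        # stop integration when armed

        sol = solve_ivp(rotor, (0.0, 1.0), [0.0, 0.0], events=armed, max_step=1e-4)
        print("arming time ~ %.1f ms" % (1e3 * sol.t_events[0][0]))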

  11. Sandia Strategic Plan 1997

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-01

    Sandia embarked on its first exercise in corporate strategic planning during the winter of 1989. The results of that effort were disseminated with the publication of Strategic Plan 1990. Four years later Sandia conducted their second major planning effort and published Strategic Plan 1994. Sandia's 1994 planning effort linked very clearly to the Department of Energy's first strategic plan, Fueling a Competitive Economy. It benefited as well from the leadership of Lockheed Martin Corporation, the management and operating contractor. Lockheed Martin's corporate success is founded on visionary strategic planning and annual operational planning driven by customer requirements and technology opportunities. In 1996 Sandia conducted another major planning effort that resulted in the development of eight long-term Strategic Objectives. Strategic Plan 1997 differs from its predecessors in that the robust elements of previous efforts have been integrated into one comprehensive body. The changes implemented so far have helped establish a living strategic plan with a stronger business focus and with clear deployment throughout Sandia. The concept of a personal line of sight for all employees to this strategic plan and its objectives, goals, and annual milestones is becoming a reality.

  12. Strategic planning for neuroradiologists.

    Science.gov (United States)

    Berlin, Jonathan W; Lexa, Frank J

    2012-08-01

    Strategic planning is becoming essential to neuroradiology as the health care environment continues to emphasize cost efficiency, teamwork and collaboration. A strategic plan begins with a mission statement and vision of where the neuroradiology division would like to be in the near future. Formalized strategic planning frameworks, such as the strengths, weaknesses, opportunities and threats (SWOT), and the Balanced Scorecard frameworks, can help neuroradiology divisions determine their current position in the marketplace. Communication, delegation, and accountability in neuroradiology is essential in executing an effective strategic plan. PMID:22902108

  13. DCOI Strategic Plan

    Data.gov (United States)

    General Services Administration — Under the Data Center Optimization Initiative (DCOI), covered agencies are required to post DCOI Strategic Plans and updates to their FITARA milestones publicly on...

  15. How Strategic are Strategic Information Systems?

    Directory of Open Access Journals (Sweden)

    Alan Eardley

    1996-11-01

    Full Text Available There are many examples of information systems which are claimed to have created and sustained competitive advantage, allowed beneficial collaboration or simply ensured the continued survival of the organisations which used them. These systems are often referred to as being 'strategic'. This paper argues that many of the examples of strategic information systems as reported in the literature are not examined sufficiently critically to determine whether the systems meet the generally accepted definition of the term 'strategic' - that of achieving sustainable competitive advantage. Eight of the information systems considered to be strategic are examined here from the standpoint of one widely accepted 'competition' framework - Porter's model of industry competition. The framework is then used to question the linkage between the information systems and the mechanisms which are required for the enactment of strategic business objectives based on competition. Conclusions indicate that the systems are compatible with Porter's framework. Finally, some limitations of the framework are discussed and aspects of the systems which extend beyond the framework are highlighted.

  16. The Relationship between Firms’ Strategic Orientations and Strategic Planning Process

    OpenAIRE

    Hasnanywati Hassan

    2010-01-01

    The study examines quantity surveying (QS) firms' strategic orientation and its relation to the strategic planning process. The strategic orientations based on the Miles and Snow typology were used to identify the strategic orientation of QS firms. The strategic planning process, which includes the effort put into strategic planning, the degree of involvement in strategic planning, and formality, was also determined. The period of decline in the Malaysian construction industry from 2001 to 2005 has been determin...

  17. Linear Accelerators

    CERN Document Server

    Vretenar, M

    2014-01-01

    The main features of radio-frequency linear accelerators are introduced, reviewing the different types of accelerating structures and presenting the main characteristics of linac beam dynamics.

  18. Developing Strategic Leaders.

    Science.gov (United States)

    Carter, Patricia; Terwilliger, Leatha; Alfred, Richard L.; Hartleb, David; Simone, Beverly

    2002-01-01

    Highlights the importance of developing community college leaders capable of demonstrating strategic leadership and responding to the global forces that influence community college education. Discusses the Consortium for Community College Development's Strategic Leadership Forum and its principles, format, content, and early results. (RC)

  19. Strategic Risk Assessment

    Science.gov (United States)

    Derleth, Jason; Lobia, Marcus

    2009-01-01

    This slide presentation provides an overview of the attempt to develop and demonstrate a methodology for the comparative assessment of risks across the entire portfolio of NASA projects and assets. It includes information about strategic risk identification, normalizing strategic risks, calculation of relative risk score, and implementation options.

  20. Strategic Leadership in Schools

    Science.gov (United States)

    Williams, Henry S.; Johnson, Teryl L.

    2013-01-01

    Strategic leadership is built upon traits and actions that encompass the successful execution of all leadership styles. In a world that is rapidly changing, strategic leadership in schools guides school leader through assuring constant improvement process by anticipating future trends and planning for them and noting that plans must be flexible to…

  1. Manage "Human Capital" Strategically

    Science.gov (United States)

    Odden, Allan

    2011-01-01

    To strategically manage human capital in education means restructuring the entire human resource system so that schools not only recruit and retain smart and capable individuals, but also manage them in ways that support the strategic directions of the organization. These management practices must be aligned with a district's education improvement…

  2. Strategic HRD within companies

    NARCIS (Netherlands)

    Wognum, A.A.M.; Mulder, M.M.

    1999-01-01

    This article reports a preliminary survey that was conducted within the framework of the project on strategic human resource development (HRD), in which for various aspects of organisations the effects of strategic HRD are explored. The aim of the survey was to explore some conditions that are impor

  3. 11. Strategic planning.

    Science.gov (United States)

    2014-05-01

    There are several types of planning processes and plans, including strategic, operational, tactical, and contingency. For this document, operational planning includes tactical planning. This chapter examines the strategic planning process and includes an introduction into disaster response plans. "A strategic plan is an outline of steps designed with the goals of the entire organisation as a whole in mind, rather than with the goals of specific divisions or departments". Strategic planning includes all measures taken to provide a broad picture of what must be achieved and in which order, including how to organise a system capable of achieving the overall goals. Strategic planning often is done pre-event, based on previous experience and expertise. The strategic planning for disasters converts needs into a strategic plan of action. Strategic plans detail the goals that must be achieved. The process of converting needs into plans has been deconstructed into its components and includes consideration of: (1) disaster response plans; (2) interventions underway or planned; (3) available resources; (4) current status vs. pre-event status; (5) history and experience of the planners; and (6) access to the affected population. These factors are tempered by the local: (a) geography; (b) climate; (c) culture; (d) safety; and (e) practicality. The planning process consumes resources (costs). All plans must be adapted to the actual conditions--things never happen exactly as planned.

  4. On strategic spatial planning

    Directory of Open Access Journals (Sweden)

    Tošić Branka

    2014-01-01

    Full Text Available The goal of this paper is to explain the origin and development of strategic spatial planning, to show its complex features, and to highlight its differences from and/or advantages over traditional physical spatial planning. Strategic spatial planning is seen as one of the approaches in legally defined planning documents, and is examined through a review of the properties of sectoral national strategies as well as of issues of strategic planning at the local level in Serbia. The strategic approach is clearly recognized at the national and sub-national level of spatial planning in European countries and in our country. It has been confirmed by the goals outlined in documents of the European Union and Serbia that promote the grounds of territorial cohesion and strategic integrated planning, emphasizing cooperation and the principles of sustainable spatial development. [Project of the Ministry of Science of the Republic of Serbia, No. 176017]

  5. Strategic planning in transition

    DEFF Research Database (Denmark)

    Olesen, Kristian; Richardson, Tim

    2012-01-01

    In this paper, we analyse how contested transitions in planning rationalities and spatial logics have shaped the processes and outputs of recent episodes of Danish ‘strategic spatial planning’. The practice of ‘strategic spatial planning’ in Denmark has undergone a concerted reorientation… This style of ‘strategic spatial planning’, with its associated spatial logics, is continuously challenged by a persistent regulatory, top-down rationality of ‘strategic spatial planning’, rooted in spatial Keynesianism, which has long characterised the Danish approach. The findings reveal the emergence of a particularly Danish approach, retaining strong regulatory aspects. However, this approach does not sit easily within the current neoliberal political climate, raising concerns of an emerging crisis of ‘strategic spatial planning’.

  6. The strategic issues - structural elements of strategic management

    OpenAIRE

    Balta Corneliu

    2013-01-01

    The paper presents the most important concepts related to strategic management and the connection with strategic results as they are obtained after the main steps in strategic management are followed. The dynamics of the relationship between strategic issues and objectives is included

  7. Strategic Communication Institutionalized

    DEFF Research Database (Denmark)

    Kjeldsen, Anna Karina

    2013-01-01

    … of institutionalization when strategic communication is not yet visible as organizational practice, and how can such detections provide an explanation for the later outcome of the process? (2) How can studies of strategic communication benefit from an institutional perspective? How can the virus metaphor generate a deeper understanding of the mechanisms that interact from the time an organization is exposed to a new organizational idea such as strategic communication until it surfaces in the form of symptoms such as mission and vision statements, communication manuals and communication positions? The first part of the article… communication in three Danish art museums.

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  10. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year in which the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  12. Strategic Management: General Concepts

    Directory of Open Access Journals (Sweden)

    Shahram Tofighi

    2010-05-01

    Full Text Available In the era after the substitution of long-term planning by strategic planning, it was hoped that managers could act more successfully in implementing their plans. The outcomes were far from what was expected; there were only minor improvements. In organizations, plenty of nominally strategic plans have been developed during strategic planning processes, but most of these plans have been kept on the shelves; only a few of them played their roles as guiding documents for the entire organization. What are the factors inducing such outcomes? Different scientists have offered a variety of justifications, according to their experiences. The first issue examined was the misunderstanding of strategic planning by managers and staff: the strategic planning process may be executed erroneously, and what they had expected from this process was not accurate. Substantially, strategic planning looks at the future and coming situations, and is designed to answer the questions which will emerge in the future. Unfortunately, this critical and fundamental characteristic of strategic planning is obscured. Strategic planning conveys the concept of drawing the future and developing a set of different probable scenarios, along with defining a set of solutions, in order to combat undesirable coming conditions and position the system or business. It helps organizations keep themselves safe and remain successful. In other words, in strategic planning efforts we are seeking solutions fit for problems which will appear in the future, for the conditions that will emerge in the future. Unfortunately, most of the strategic plans which have been developed in organizations lack this important and critical characteristic; in most of them the developers offered solutions intended to solve today's problems in the future! The second issue considered by the scientists was the task of ensuring the continuity of the effectiveness of the planning; there was a

  13. Complex Strategic Choices

    DEFF Research Database (Denmark)

    Leleur, Steen

    Complex Strategic Choices provides clear principles and methods which can guide and support strategic decision making to face the many current challenges. By considering ways in which planning practices can be renewed and exploring the possibilities for acquiring awareness and tools to add value to strategic decision making, the author has adapted previously established research, resulting in new material stemming from and focusing on practical application of a systemic approach. The outcome is a coherent and flexible approach named systemic planning. The inclusion of both the theoretical and practical aspects of systemic planning makes this book a key resource for researchers and students in the field of planning and decision analysis as well as practitioners dealing with strategic analysis and decision making. More broadly, Complex Strategic Choices acts as a guide for professionals and students involved in complex planning tasks across several fields such as business…

  14. Strategic agility for nursing leadership.

    Science.gov (United States)

    Shirey, Maria R

    2015-06-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change. In this article, the author discusses strategic agility as an important leadership competency and offers approaches for incorporating strategic agility in healthcare systems. A strategic agility checklist and infrastructure-building approach are presented.

  15. Cognitive ability and the effect of strategic uncertainty

    OpenAIRE

    Hanaki, Nobuyuki; Jacquemet, Nicolas; Luchini, Stéphane; Zylbersztejn, Adam

    2014-01-01

    How is one's cognitive ability related to the way one responds to strategic uncertainty? We address this question by conducting a set of experiments in simple 2 × 2 dominance solvable coordination games. Our experiments involve two main treatments: one in which two human subjects interact, and another in which one human subject interacts with a computer program whose behavior is known. By making the behavior of the computer perfectly predictable, the latter treatment eliminates strategic unce...

  16. Cognitive ability and the effect of strategic uncertainty

    OpenAIRE

    Hanaki, Nobuyuki; Jacquemet, Nicolas; Luchini, Stéphane; Zylbersztejn, Adam

    2015-01-01

    How is one's cognitive ability related to the way one responds to strategic uncertainty? We address this question by conducting a set of experiments in simple 2 x 2 dominance solvable coordination games. Our experiments involve two main treatments: one in which two human subjects interact, and another in which one human subject interacts with a computer program whose behavior is known. By making the behavior of the computer perfectly predictable, the latter treatment eliminates strategic unce...

  17. Cognitive Ability and the Effect of Strategic Uncertainty

    OpenAIRE

    Nobuyuki Hanaki; Nicolas Jacquemet; Stéphane Luchini; Adam Zylberstejn

    2014-01-01

    How is one’s cognitive ability related to the way one responds to strategic uncertainty? We address this question by conducting a set of experiments in simple 2 x 2 dominance solvable coordination games. Our experiments involve two main treatments: one in which two human subjects interact, and another in which one human subject interacts with a computer program whose behavior is known. By making the behavior of the computer perfectly predictable, the latter treatment eliminates strategic unce...
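
    The three records above study behaviour in 2 x 2 dominance-solvable coordination games. The payoffs below are a hypothetical instance of that game class, not the matrices used in the experiments; they show how iterated elimination of dominated actions and a best reply to a fully predictable computer opponent can be computed:

        import numpy as np

        # rows = player 1's actions, columns = player 2's actions
        A = np.array([[9.0, 3.0],    # player 1's payoffs
                      [8.0, 1.0]])
        B = np.array([[9.0, 8.0],    # player 2's payoffs
                      [3.0, 1.0]])

        def strictly_dominated(P):
            """Row actions strictly dominated by some other row of payoff matrix P."""
            n = P.shape[0]
            return [i for i in range(n)
                    if any(np.all(P[j] > P[i]) for j in range(n) if j != i)]

        print("dominated actions, player 1:", strictly_dominated(A))
        print("dominated actions, player 2:", strictly_dominated(B.T))

        # Against a computer whose behaviour is known (say it always plays column 0),
        # player 1 simply best-responds: no strategic uncertainty remains.
        known_column = 0
        print("best reply to the known computer strategy:", int(np.argmax(A[:, known_column])))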

  18. GPU accelerated face detection

    OpenAIRE

    Mäkelä, J.

    2013-01-01

    Graphics processing units have massive parallel processing capabilities, and there is a growing interest in utilizing them for generic computing. One area of interest is computationally heavy computer vision algorithms, such as face detection and recognition. Face detection is used in a variety of applications, for example the autofocus on cameras, face and emotion recognition, and access control. In this thesis, the face detection algorithm was accelerated with GPU using OpenCL. The goal was...
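
    For orientation, a plain CPU reference for the detection task is sketched below using OpenCV's bundled Haar cascade; the thesis itself moves this workload to the GPU with OpenCL kernels, which is not reproduced here, and the input file name is made up:

        import cv2

        # Load the frontal-face Haar cascade shipped with OpenCV
        cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
        detector = cv2.CascadeClassifier(cascade_path)

        image = cv2.imread("group_photo.jpg")            # hypothetical input image
        assert image is not None, "could not read the input image"
        gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
        gray = cv2.equalizeHist(gray)                    # normalise illumination

        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5,
                                          minSize=(30, 30))
        for (x, y, w, h) in faces:
            cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imwrite("faces_marked.jpg", image)
        print(f"{len(faces)} face(s) detected")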

  19. THE STRATEGIC OPTIONS IN INVESTMENT PROJECTS VALUATION

    Directory of Open Access Journals (Sweden)

    VIOLETA SĂCUI

    2012-11-01

    Full Text Available The topic of real options applies option valuation techniques to capital budgeting exercises in which a project is coupled with a put or call option. In many project valuation settings, the firm has one or more options to make strategic changes to the project during its life. These strategic options, which are known as real options, are typically ignored in standard discounted cash-flow analysis, where a single expected present value is computed. This paper presents the types of real options that are encountered in economic activity.
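
    A minimal worked example of the valuation idea: a one-period binomial tree with risk-neutral pricing of an option to expand the project. Every number below is illustrative rather than drawn from the paper.

        # Value of a strategic option to expand a project next year.
        V_up, V_down = 150.0, 70.0   # project value next year in the two states
        p_up = 0.5                   # risk-neutral probability of the up state (assumed)
        r = 0.05                     # one-period risk-free rate
        K = 60.0                     # extra investment required to expand
        scale = 0.5                  # expansion adds 50% of the then-current value

        def expansion_payoff(V):
            # Exercise only if the added value exceeds the expansion cost
            return max(scale * V - K, 0.0)

        option_value = (p_up * expansion_payoff(V_up)
                        + (1 - p_up) * expansion_payoff(V_down)) / (1 + r)
        print(f"value added by the expansion option: {option_value:.2f}")   # ~7.14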

  20. Strategizing in multiple ways

    DEFF Research Database (Denmark)

    Larsen, Mette Vinther; Madsen, Charlotte Øland; Rasmussen, Jørgen Gulddahl

    2013-01-01

    Strategy processes are kinds of wayfaring where different actors interpret a formally defined strategy differently. In the everyday practice of organizations, strategizing takes place in multiple ways through narratives and sensible actions. This forms a meshwork of polyphonic ways to enact one a… The question raised, based on this development paper, is whether one can understand these divergent strategic wayfaring processes as constructive for organizations.

  1. 2015 Enterprise Strategic Vision

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-08-01

    This document aligns with the Department of Energy Strategic Plan for 2014-2018 and provides a framework for integrating our missions and direction for pursuing DOE’s strategic goals. The vision is a guide to advancing world-class science and engineering, supporting our people, modernizing our infrastructure, and developing a management culture that operates a safe and secure enterprise in an efficient manner.

  2. Restriction of the use of hazardous substances (RoHS in the personal computer segment: analysis of the strategic adoption by the manufacturers settled in Brazil

    Directory of Open Access Journals (Sweden)

    Ademir Brescansin

    2015-09-01

    Full Text Available The enactment of the RoHS Directive (Restriction of Hazardous Substances) in 2003, limiting the use of certain hazardous substances in electronic equipment, has forced companies to adjust their products to comply with this legislation. Even in the absence of similar legislation in Brazil, manufacturers of personal computers located in this country have been seen to adopt RoHS for products sold in the domestic market and abroad. The purpose of this study is to analyze whether these manufacturers have really adopted RoHS, focusing on their motivations, concerns, and benefits. This is an exploratory study based on a literature review and interviews with HP, Dell, Sony, Lenovo, Samsung, LG, Itautec, and Positivo, using summative content analysis. The results showed that initially, global companies adopted RoHS to market products in Europe, and later expanded this practice to all products. Brazilian companies, however, adopted RoHS to participate in the government's sustainable procurement bidding processes. It is expected that this study can assist manufacturers in developing strategies for reducing or eliminating hazardous substances in their products and processes, as well as help the government to formulate public policies on reducing risks of environmental contamination.

  3. Three-dimensional conformal arc radiotherapy using a C-arm linear accelerator with a computed tomography on-rail system for prostate cancer: clinical outcomes

    International Nuclear Information System (INIS)

    We report the feasibility and treatment outcomes of image-guided three-dimensional conformal arc radiotherapy (3D-CART) using a C-arm linear accelerator with a computed tomography (CT) on-rail system for localized prostate cancer. Between 2006 and 2011, 282 consecutive patients with localized prostate cancer were treated with in-room CT-guided 3D-CART. Biochemical failure was defined as a rise of at least 2.0 ng/ml beyond the nadir prostate-specific antigen level. Toxicity was scored according to the National Cancer Institute Common Terminology Criteria for Adverse Events, version 4.0. A total of 261 patients were analyzed retrospectively (median follow-up: 61.6 months). The median prescribed 3D-CART dose was 82 Gy (2 Gy/fraction, dose range: 78–86 Gy), and 193 of the patients additionally received hormonal therapy. The 5-year overall survival rate was 93.9 %. Among low-, intermediate-, and high-risk patients, 5-year rates of freedom from biochemical failure were 100, 91.5 and 90.3 %, respectively. Rates of grade 2–3 late gastrointestinal and genitourinary toxicities were 2.3 and 11.4 %, respectively. No patient experienced late grade 4 or higher toxicity. In-room CT-guided 3D-CART was feasible and effective for localized prostate cancer. Treatment outcomes were comparable to those previously reported for intensity-modulated radiotherapy

  4. Future accelerators (?)

    Energy Technology Data Exchange (ETDEWEB)

    John Womersley

    2003-08-21

    I describe the future accelerator facilities that are currently foreseen for electroweak scale physics, neutrino physics, and nuclear structure. I will explore the physics justification for these machines, and suggest how the case for future accelerators can be made.

  5. Strategic analysis of the company

    OpenAIRE

    Matoušková, Irena

    2012-01-01

    Strategic analysis of the company In my thesis I developed a strategic analysis of the company Pacovské strojírny a.s. In the theoretical part I describe the various methods of internal and external strategic analysis; these methods are then applied in the practical part. In the internal strategic analysis I focused on the identification of internal resources and capabilities, financial analysis, and the value-creation chain. The external strategic analysis includes a PEST analysis, Porter's fiv...

  6. COMPUTING

    CERN Document Server

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  7. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television as many long feared, but the computer-the ubiquitous portal of work and personal lives. At this point, the computer is almost so common we don't notice it in our view. It's difficult to envision that not that long ago it was a gigantic, room-sized structure only to be accessed by a few inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little-more noted than a toaster. These dramati

  8. Working and strategic memory deficits in schizophrenia

    Science.gov (United States)

    Stone, M.; Gabrieli, J. D.; Stebbins, G. T.; Sullivan, E. V.

    1998-01-01

    Working memory and its contribution to performance on strategic memory tests in schizophrenia were studied. Patients (n = 18) and control participants (n = 15), all men, received tests of immediate memory (forward digit span), working memory (listening, computation, and backward digit span), and long-term strategic (free recall, temporal order, and self-ordered pointing) and nonstrategic (recognition) memory. Schizophrenia patients performed worse on all tests. Education, verbal intelligence, and immediate memory capacity did not account for deficits in working memory in schizophrenia patients. Reduced working memory capacity accounted for group differences in strategic memory but not in recognition memory. Working memory impairment may be central to the profile of impaired cognitive performance in schizophrenia and is consistent with hypothesized frontal lobe dysfunction associated with this disease. Additional medial-temporal dysfunction may account for the recognition memory deficit.

  9. Accelerator Toolbox for MATLAB

    International Nuclear Information System (INIS)

    This paper introduces the Accelerator Toolbox (AT) - a collection of tools to model particle accelerators and beam transport lines in the MATLAB environment. At SSRL, it has become the modeling code of choice for the ongoing design and future operation of the SPEAR 3 synchrotron light source. AT was designed to take advantage of the power and simplicity of MATLAB, a commercially developed environment for technical computing and visualization. Many examples in this paper illustrate the advantages of the AT approach and contrast it with existing accelerator code frameworks.
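
    The element-by-element tracking style that AT provides in MATLAB can be caricatured in a few lines of Python, here as a 2 x 2 linear drift / thin-quadrupole model of a FODO-like cell; this is an analogy only, not AT's own data structures or physics:

        import numpy as np

        def drift(L):
            """Transfer matrix of a field-free drift of length L (metres)."""
            return np.array([[1.0, L], [0.0, 1.0]])

        def thin_quad(f):
            """Thin-lens quadrupole of focal length f (negative f defocuses)."""
            return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

        # A simple FODO-like cell: focus, drift, defocus, drift
        cell = [thin_quad(+2.0), drift(1.0), thin_quad(-2.0), drift(1.0)]

        def track(particle, elements, n_turns=1):
            for _ in range(n_turns):
                for M in elements:
                    particle = M @ particle
            return particle

        x0 = np.array([1e-3, 0.0])     # 1 mm offset, zero angle
        print("after 100 cells:", track(x0, cell, n_turns=100))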

  10. Collective ion acceleration

    International Nuclear Information System (INIS)

    Progress achieved in the understanding and development of collective ion acceleration is presented. Extensive analytic and computational studies of slow cyclotron wave growth on an electron beam in a helix amplifier were performed. Research included precise determination of linear coupling between beam and helix, suppression of undesired transients and end effects, and two-dimensional simulations of wave growth in physically realizable systems. Electrostatic well depths produced exceed requirements for the Autoresonant Ion Acceleration feasibility experiment. Acceleration of test ions to modest energies in the troughs of such waves was also demonstrated. Smaller efforts were devoted to alternative acceleration mechanisms. Langmuir wave phase velocity in Converging Guide Acceleration was calculated as a function of the ratio of electron beam current to space-charge limiting current. A new collective acceleration approach, in which cyclotron wave phase velocity is varied by modulation of electron beam voltage, is proposed. Acceleration by traveling Virtual Cathode or Localized Pinch was considered, but appears less promising. In support of this research, fundamental investigations of beam propagation in evacuated waveguides, of nonneutral beam linear eigenmodes, and of beam stability were carried out. Several computer programs were developed or enhanced. Plans for future work are discussed

  11. Guidelines for strategic planning

    Energy Technology Data Exchange (ETDEWEB)

    1991-07-01

    Strategic planning needs to be done as one of the integral steps in fulfilling our overall Departmental mission. The role of strategic planning is to assure that the longer term destinations, goals, and objectives which the programs and activities of the Department are striving towards are the best we can envision today, so that our courses can then be set to move in those directions. Strategic planning will assist the Secretary, Deputy Secretary, and Under Secretary in setting the long-term directions and policies for the Department and in making final decisions on near-term priorities and resource allocations. It will assist program developers and implementors by providing the necessary guidance for multi-year program plans and budgets. It is one of the essential steps in the Secretary's Strategic Planning Initiative. The operational planning most of us are so familiar with deals with how to get things done and with the resources needed (people, money, facilities, time) to carry out tasks. Operating plans like budgets, capital line-item projects, R&D budgets, project proposals, etc., are vital to the mission of the Department. They deal, however, with how to carry out programs to achieve some objective or budget assumption. Strategic planning deals with the prior question of what it is that should be attempted. It deals with what objectives the many programs and activities of the Department should be striving toward. The purpose of this document is to provide guidance to those organizations and personnel starting the process for the first time as well as those who have prepared strategic plans in the past and now wish to review and update them. This guideline should not be construed as a rigid, restrictive or confining rulebook. Each organization is encouraged to develop such enhancements as they think may be useful in their planning. The steps outlined in this document represent a very simplified approach to strategic planning. 9 refs.

  12. Strategic forces briefing

    Energy Technology Data Exchange (ETDEWEB)

    Bing, G.; Chrzanowski, P.; May, M.; Nordyke, M.

    1989-04-06

    The "Strategic Forces Briefing" is our attempt, accomplished over the past several months, to outline and highlight the more significant strategic force issues that must be addressed in the near future. Some issues are recurrent: the need for an effective modernized Triad and a constant concern for force survivability. Some issues derive from arms control: the Strategic Arms Reduction Talks (START) are sufficiently advanced to set broad numerical limits on forces, but not so constraining as to preclude choices among weapon systems and deployment modes. Finally, a new administration faced with serious budgetary problems must strive for the most effective strategic forces limited dollars can buy and support. A review of strategic forces logically begins with consideration of the missions the forces are charged with. We begin the briefing with a short review of targeting policy and implementation within the constraints of available unclassified information. We then review each element of the Triad with sections on SLBMs, ICBMs, and Air-Breathing (bomber and cruise missile) systems. A short section at the end deals with the potential impact of strategic defense on offensive force planning. We consider ABM, ASAT, and air defense; but we do not attempt to address the technical issues of strategic defense per se. The final section gives a brief overview of the tritium supply problem. We conclude with a summary of recommendations that emerge from our review. The results of calculations on the effectiveness of various weapon systems as a function of cost that are presented in the briefing are by Paul Chrzanowski.

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  14. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  15. Quantum optical device accelerating dynamic programming

    OpenAIRE

    Grigoriev, D.; Kazakov, A.; Vakulenko, S

    2005-01-01

    In this paper we discuss analogue computers based on quantum optical systems that accelerate dynamic programming for some computational problems. These computers, at least in principle, can be realized by actually existing devices. We estimate the acceleration in solving some NP-hard problems that can be obtained in this way compared with deterministic computers.

  16. Computer simulation of the three-dimensional regime of proton acceleration in the interaction of laser radiation with a thin spherical target

    International Nuclear Information System (INIS)

    Results from particle-in-cell simulations of the three-dimensional regime of proton acceleration in the interaction of laser radiation with a thin spherical target are presented. It is shown that the density of accelerated protons can be several times higher than that in conventional accelerators. The focusing of fast protons created in the interaction of laser radiation with a spherical target is demonstrated. The focal spot of fast protons is localized near the center of the sphere. The conversion efficiency of laser energy into fast ion energy attains 5%. The acceleration mechanism is analyzed and the electron and proton energy spectra are obtained

  17. Features, Innovation Attributes and the Strategic Trend of the Chinese Cloud Computing Industry

    Institute of Scientific and Technical Information of China (English)

    田杰棠

    2012-01-01

    Cloud computing is a new model for delivering and using information technology resources. It can reduce costs, raise the efficiency of resource utilization and greatly promote information sharing, and it shows some of the characteristics of a general-purpose technology. Its pronounced cost-saving effect will let the information industry develop faster, lower the informatization barriers in other industries, deepen informatization, allow more industries to benefit and ultimately produce positive macroeconomic effects. In promoting the development of cloud computing in China, building the right environment matters more than demonstration projects, and the strategic distribution of large-scale data centers should be planned in a coordinated way. Cloud computing projects should be approached not only with an "infrastructure construction" mindset but also with a "technological upgrading" mindset, and national security risks should be avoided as far as possible.

  18. Software for virtual accelerator designing

    International Nuclear Information System (INIS)

    The article discusses appropriate technologies for a software implementation of the Virtual Accelerator. The Virtual Accelerator is considered a set of services and tools enabling transparent execution of computational software for modeling beam dynamics in accelerators on distributed computing resources. Distributed storage and information processing facilities utilized by the Virtual Accelerator make use of the Service-Oriented Architecture (SOA) according to a cloud computing paradigm. Control system toolkits (such as EPICS and TANGO), computing modules (including high-performance computing), realization of the GUI with existing frameworks and visualization of the data are discussed in the paper. The presented research consists of a software analysis for realizing the interaction between all levels of the Virtual Accelerator and some samples of middleware implementation. A set of servers and clusters at St. Petersburg State University forms the infrastructure of the computing environment for Virtual Accelerator design. The use of component-oriented technology to realize the interaction between the Virtual Accelerator levels is proposed. The article concludes with an overview and justification of the choice of technologies that will be used for the design and implementation of the Virtual Accelerator. (authors)
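
    A rough illustration of the service-oriented idea described above: a thin job-submission facade that hides which computing resource actually runs a beam-dynamics calculation. This is a hypothetical sketch, not the Virtual Accelerator software itself; all names (BeamDynamicsService, TrackingJob, run_tracking) are invented for illustration, and a real back-end would dispatch to cluster or cloud resources rather than a local process pool.

      # Hypothetical sketch of a service-style front end for beam-dynamics jobs.
      # Only the decoupling pattern is illustrated; none of these names come from
      # the Virtual Accelerator described in the record above.
      from concurrent.futures import Future, ProcessPoolExecutor
      from dataclasses import dataclass


      @dataclass
      class TrackingJob:
          """Parameters of one (toy) beam-dynamics run."""
          lattice: str       # identifier of the lattice to model
          n_particles: int   # number of macro-particles to track
          n_turns: int       # number of turns / integration steps


      def run_tracking(job: TrackingJob) -> dict:
          """Stand-in for the real modelling engine; returns a fake summary."""
          return {"lattice": job.lattice,
                  "survived_fraction": max(0.0, 1.0 - 1e-6 * job.n_turns)}


      class BeamDynamicsService:
          """Thin facade: clients submit jobs; the execution back-end is swappable."""

          def __init__(self, max_workers: int = 4):
              self._pool = ProcessPoolExecutor(max_workers=max_workers)

          def submit(self, job: TrackingJob) -> Future:
              return self._pool.submit(run_tracking, job)


      if __name__ == "__main__":
          service = BeamDynamicsService()
          fut = service.submit(TrackingJob("test-ring", n_particles=10_000, n_turns=1_000))
          print(fut.result())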

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  2. COMPUTING

    CERN Document Server

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  4. COMPUTING

    CERN Document Server

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  5. COMPUTING

    CERN Document Server

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  6. ‘DEOS CHAMP-01C 70’: a model of the Earth’s gravity field computed from accelerations of the CHAMP satellite

    NARCIS (Netherlands)

    Ditmar, P.G.; Kuznetsov, V.; Van Eck van der Sluis, A.A.; Schrama, E.; Klees, R.

    2005-01-01

    Performance of a recently proposed technique for gravity field modeling has been assessed with data from the CHAMP satellite. The modeling technique is a variant of the acceleration approach. It makes use of the satellite accelerations that are derived from the kinematic orbit with the 3-point numer
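
    The record above breaks off mid-sentence, but the 3-point scheme it refers to is the standard central-difference estimate of acceleration from consecutive kinematic positions. Below is a minimal sketch of that estimate, assuming equally spaced epochs; the smoothing, gap handling and reference-frame treatment of the actual DEOS processing are not reproduced here.

      # Minimal sketch: 3-point central-difference accelerations from a kinematic
      # orbit sampled at equally spaced epochs. Illustrative only; real gravity
      # field processing must also handle noise, data gaps and frame rotations.
      import numpy as np


      def three_point_acceleration(positions: np.ndarray, dt: float) -> np.ndarray:
          """positions: (N, 3) satellite positions [m]; dt: sampling interval [s].

          Returns (N-2, 3) accelerations at the interior epochs,
          a_i = (r_{i-1} - 2 r_i + r_{i+1}) / dt**2.
          """
          r = np.asarray(positions, dtype=float)
          return (r[:-2] - 2.0 * r[1:-1] + r[2:]) / dt**2


      if __name__ == "__main__":
          # Sanity check on a circular orbit, where |a| should equal omega**2 * R.
          dt, omega, R = 10.0, 2.0 * np.pi / 5400.0, 6.9e6
          t = np.arange(0.0, 5400.0, dt)
          orbit = R * np.column_stack([np.cos(omega * t), np.sin(omega * t), 0.0 * t])
          acc = three_point_acceleration(orbit, dt)
          print(np.linalg.norm(acc, axis=1)[0], omega**2 * R)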

  7. Accelerating Value Creation with Accelerators

    DEFF Research Database (Denmark)

    Jonsson, Eythor Ivar

    2015-01-01

    accelerator programs. Microsoft runs accelerators in seven different countries. Accelerators have grown out of the infancy stage and are now an accepted approach to develop new ventures based on cutting-edge technology like the internet of things, mobile technology, big data and virtual reality. It is also...... with the traditional audit and legal universes and industries are examples of emerging potentials both from a research and business point of view to exploit and explore further. The accelerator approach may therefore be an Idea Watch to consider, no matter which industry you are in, because in essence accelerators...

  8. Strategic self-ignorance

    DEFF Research Database (Denmark)

    Thunström, Linda; Nordström, Leif Jonas; Shogren, Jason F.;

    2016-01-01

    We examine strategic self-ignorance—the use of ignorance as an excuse to over-indulge in pleasurable activities that may be harmful to one’s future self. Our model shows that guilt aversion provides a behavioral rationale for present-biased agents to avoid information about negative future impacts...... of such activities. We then confront our model with data from an experiment using prepared, restaurant-style meals—a good that is transparent in immediate pleasure (taste) but non-transparent in future harm (calories). Our results support the notion that strategic self-ignorance matters: nearly three of five...... subjects (58%) chose to ignore free information on calorie content, leading at-risk subjects to consume significantly more calories. We also find evidence consistent with our model on the determinants of strategic self-ignorance....

  9. Strategic Self-Ignorance

    DEFF Research Database (Denmark)

    Thunström, Linda; Nordström, Leif Jonas; Shogren, Jason F.;

    We examine strategic self-ignorance—the use of ignorance as an excuse to overindulge in pleasurable activities that may be harmful to one’s future self. Our model shows that guilt aversion provides a behavioral rationale for present-biased agents to avoid information about negative future impacts...... of such activities. We then confront our model with data from an experiment using prepared, restaurant-style meals — a good that is transparent in immediate pleasure (taste) but non-transparent in future harm (calories). Our results support the notion that strategic self-ignorance matters: nearly three of five...... subjects (58 percent) chose to ignore free information on calorie content, leading at-risk subjects to consume significantly more calories. We also find evidence consistent with our model on the determinants of strategic self-ignorance....

  10. Tourism and Strategic Planning

    DEFF Research Database (Denmark)

    Pasgaard, Jens Christian

    2012-01-01

    The main purpose of this report is to explore and unfold the complexity of the tourism phenomenon in order to qualify the general discussion of tourism-related planning challenges. Throughout the report I aim to demonstrate the strategic potential of tourism in a wider sense and more specifically...... the potential of ‘the extraordinary’ tourism-dominated space. As highlighted in the introduction, this report does not present any systematic analysis of strategic planning processes; neither does it provide any unequivocal conclusions. Rather, the report presents a collection of so-called ‘detours......’ – a collection of theoretical discussions and case studies with the aim to inspire future strategic planning. Due to the complexity and heterogeneity of the phenomenon I use a non-linear and non-chronological report format with the ambition to create a new type of overview. In this regard the report is intended...

  11. Strategic environmental assessment

    DEFF Research Database (Denmark)

    Kørnøv, Lone

    1997-01-01

    The integration of environmental considerations into strategic decision making is recognized as a key to achieving sustainability. In the European Union a draft directive on Strategic Environmental Assessment (SEA) is currently being reviewed by the member states. The nature of the proposed SEA...... directive is outlined, together with its relationship with the process of spatial planning and public participation in Denmark. This paper analyses the adoption of the proposed directive and discusses whether the SEA is an appropriate tool for opening and democratizing political structures. It concludes...

  12. Laser accelerator

    OpenAIRE

    Vigil, Ricardo

    2014-01-01

    Approved for public release; distribution is unlimited. In 1979, W. B. Colson and S. K. Ride proposed a new kind of electron accelerator using a uniform magnetic field in combination with a circularly-polarized laser field. A key concept is to couple the oscillating electric field to the electron’s motion so that acceleration is sustained. This dissertation investigates the performance of the proposed laser accelerator using modern high-powered lasers and magnetic fields that are significan...

  13. Mapping strategic diversity: strategic thinking from a variety of perspectives

    NARCIS (Netherlands)

    D. Jacobs

    2010-01-01

    In his influential work, Strategy Safari, Henry Mintzberg and his colleagues presented ten schools of strategic thought. In this impressive book, Dany Jacobs demonstrates that the real world of strategic management is much wider and richer. In Mapping Strategic Diversity, Jacobs distinguishes betwee

  14. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  15. COMPUTING

    CERN Document Server

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  16. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  18. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  19. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  20. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  1. Macrofoundation for Strategic Technology Management

    DEFF Research Database (Denmark)

    Pedersen, Jørgen Lindgaard

    1995-01-01

    Neoclassical mainstream economics has no perspective on strategic technology management issues. Market failure economics (externalities etc.) can be of some use in analyzing problems of relevance to strategic management with technology as a part. Environment, inequality and democratic...

  2. Leadership Excellence Through Accelerated Development (LEAD)

    OpenAIRE

    Wanders, Stephen P.

    2014-01-01

    The Leadership Excellence through Accelerated Development (LEAD) Institute at CH2M HILL provides a yearlong curriculum designed to accelerate the development of leaders of individual contributors and leaders of managers who have demonstrated skill, potential, and aspiration for roles requiring strategic, operational, and leadership capabilities. An overview of the program and intended outcomes will be presented and the various elements comprising the LEAD Institute will be discussed.

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  4. Strategic Marketing for Presidents.

    Science.gov (United States)

    Pappas, Richard J., Ed.

    Designed to inform the marketing efforts of community college presidents, this document describes the importance of marketing, presents a targeted approach, and outlines the specific roles and skills needed by the president to ensure successful efforts and effective institutions. The first chapter, "Developing a Marketing-Strategic Plan," by…

  5. The strategic research positioning:

    DEFF Research Database (Denmark)

    Viala, Eva Silberschmidt

    to provide new insights into ‘immigrant’ parents’ perspective on home/school partnership in Denmark. The majority of the immigrant parents came from non-Western countries, and they had already been ‘labelled’ difficult in terms of home/school partnership. This calls for what I call ‘strategic...

  6. Strategically Planning to Change

    Science.gov (United States)

    Atkins, Kemal

    2010-01-01

    Higher education, like the private sector, is searching for innovative ways to respond to demographic shifts, globalization, greater accountability, and new technologies. New organizational models are needed to meet these challenges. In a rapidly changing world, the development of such models can occur through effective strategic analysis and…

  7. Strategic Targeted Advertising

    NARCIS (Netherlands)

    A. Galeotti; J.L. Moraga-Gonzalez (José Luis)

    2003-01-01

    textabstractWe present a strategic game of pricing and targeted-advertising. Firms can simultaneously target price advertisements to different groups of customers, or to the entire market. Pure strategy equilibria do not exist and thus market segmentation cannot occur surely. Equilibria exhibit rand

  8. EMSL Strategic Plan 2008

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Allison A.

    2008-08-15

    This Strategic Plan is EMSL’s template for achieving our vision of simultaneous excellence in all aspects of our mission as a national scientific user facility. It reflects our understanding of the long-term stewardship we must work toward to meet the scientific challenges of the Department of Energy and the nation. During the next decade, we will implement the strategies contained in this Plan, working closely with the scientific community, our advisory committees, DOE’s Office of Biological and Environmental Research, and other key stakeholders. This Strategic Plan is fully aligned with the strategic plans of DOE and its Office of Science. We recognize that shifts in science and technology, national priorities, and resources made available through the Federal budget process create planning uncertainties and, ultimately, a highly dynamic planning environment. Accordingly, this Strategic Plan should be viewed as a living document for which we will continually evaluate changing needs and opportunities posed by our stakeholders (i.e., DOE, users, staff, advisory committees), work closely with them to understand and respond to those changes, and align our strategy accordingly.

  9. Strategic planning for marketers.

    Science.gov (United States)

    Wilson, I

    1978-12-01

    The merits of strategic planning as a marketing tool are discussed in this article, which takes the view that although marketers claim to be future-oriented, they focus too little attention on long-term planning and forecasting. Strategic planning, as defined by these authors, usually encompasses periods of between five and twenty-five years and places less emphasis on the past as an absolute predictor of the future. It takes a more probabilistic view of the future than conventional marketing strategy and looks at the corporation as but one component interacting with the total environment. Inputs are examined in terms of environmental, social, political, technological and economic importance. Because of its futuristic orientation, an important tenet of strategic planning is the preparation of several alternative scenarios ranging from most to least likely. By planning for a wide range of future market conditions, a corporation is more able to be flexible by anticipating the course of future events, and is less likely to become a captive reactor--as the authors believe is now the case. An example of strategic planning at General Electric is cited.

  10. Strategic Tutor Monitoring.

    Science.gov (United States)

    Chee-kwong, Kenneth Chao

    1996-01-01

    Discusses effective tutor monitoring strategies based on experiences at the Open Learning Institute of Hong Kong. Highlights include key performance and strategic control points; situational factors, including tutor expectations and relevant culture; Theory X versus Theory Y leadership theories; and monitoring relationships with tutors. (LRW)

  11. Towards Strategic Language Learning

    NARCIS (Netherlands)

    Oostdam, R.; Rijlaarsdam, Gert

    1995-01-01

    Towards Strategic Language Learning is the result of extensive research in the relationship between mother tongue education and foreign language learning. As language skills that are taught during native language lessons are applied in foreign language performance as well, it is vital that curricula

  12. A Strategic Planning Workbook.

    Science.gov (United States)

    Austin, William

    This workbook outlines the Salem Community College's (New Jersey) Strategic Planning Initiative (SPI), which will enable the college to enter the 21st Century as an active agent in the educational advancement of the Salem community. SPI will allow college faculty, staff, students, and the local community to reflect on the vitality of the college…

  13. Strategic CSR in Afghanistan

    DEFF Research Database (Denmark)

    Azizi, Sameer

    CSR is a rising phenomenon in Afghanistan – but why are firms concerned about CSR in a least-developed context such as Afghanistan, and what are the strategic benefits? This paper is one of the first to explore these CSR issues in a least-developed country. It does so by focusing on CSR in the Afg......CSR is a rising phenomenon in Afghanistan – but why are firms concerned about CSR in a least-developed context such as Afghanistan, and what are the strategic benefits? This paper is one of the first to explore these CSR issues in a least-developed country. It does so by focusing on CSR...... advantages through CSR in Afghanistan, and if so which and how these strategic benefits are gained. The paper shows that the developmental challenges of Afghanistan are the key explanations for why companies engage in CSR. Roshan has engaged in proactive CSR to overcome the contextual barriers for growth......’, but is based on a ‘license to operate’ motivation, where businesses have free room for maneuvering CSR towards their strategic priorities and business goals. Whether this creates a ‘shared value’ for both business and in particular for society is however still questionable....

  14. Naming as Strategic Communication:

    DEFF Research Database (Denmark)

    Schmeltz, Line; Kjeldsen, Anna Karina

    2016-01-01

    This article presents a framework for understanding corporate name change as strategic communication. From a corporate branding perspective, the choice of a new name can be seen as a wish to stand out from a group of similar organizations. Conversely, from an institutional perspective, name change...

  15. Using Collaborative Strategic Reading.

    Science.gov (United States)

    Klingner, Janette K.; Vaughn, Sharon

    1998-01-01

    Describes collaborative strategic reading (CSR), a technique for teaching students, such as those with learning disabilities, reading comprehension and vocabulary skills in a cooperative setting. Covers teaching the four strategies of CSR (preview, click and clunk, get the gist, and wrap up), as well as teaching students cooperative learning group…

  16. Strategic Alignment of Business Intelligence

    OpenAIRE

    Cederberg, Niclas

    2010-01-01

    This thesis is about the concept of strategic alignment of business intelligence. It is based on a theoretical foundation that is used to define and explain business intelligence, data warehousing and strategic alignment. By combining a number of different methods for strategic alignment a framework for alignment of business intelligence is suggested. This framework addresses all different aspects of business intelligence identified as relevant for strategic alignment of business intelligence...

  17. Strategic Mismatches in Competing Teams

    OpenAIRE

    Kräkel, Matthias

    1999-01-01

    This paper discusses the strategic role of mismatching, where players voluntarily form inefficient teams or forego the formation of efficient teams, respectively. Strategic mismatching can be rational when players realize a competitive advantage (e.g. harming other competitors). In addition, the results show that free riding can be beneficial for a team in combination with strategic mismatching and that the loser's curse may be welfare improving by mitigating the problem of strategic mismatch...

  18. Strategic analysis of a chosen subject

    OpenAIRE

    Gebhart, Ondřej

    2014-01-01

    Thesis "Strategic Analysis of the selected entity 'of the chosen subject is about strategy BRISK Tabor as The aim is to assess the strategic focus of the company, the discovery and propose changes in the current Strategy. The work is divided into a theoretical part and a practical part of the conclusion. The theoretical part is from the perspective of economic theory and practice is portrayed strategy, strategic management process, thinking and planning. Strategic analysis of the external and...

  19. STRATEGIC CONTROLLING IMPLEMENTATION IN MOLDOVAN BAKERY INDUSTRY

    Directory of Open Access Journals (Sweden)

    Maria Oleiniuc

    2012-03-01

    Full Text Available Strategic controlling is a strategic management subsystem that coordinates the functions of strategic analysis, training objectives, planning and strategy correction, controls the operation of the entire system, as well as develops and monitors the strategic informational assurance subsystem.

  20. Being Strategic in HE Management

    Science.gov (United States)

    West, Andrew

    2008-01-01

    The call to be strategic--and with it the concept of strategic management--can bring to mind a wide range of definitions, and there is now a huge array of academic literature supporting the different schools of thought. At a basic level, however, strategic thinking is probably most simply about focusing on the whole, rather than the part. In…

  1. Peaceful Development and Strategic Opportunity

    Institute of Scientific and Technical Information of China (English)

    Yang Yi

    2006-01-01

    The international strategic situation and environment China faces have changed dramatically since September 11. China has closely followed and adapted itself to the ever-changing situation, seized strategic opportunity, adjusted its global strategy, adhered to peaceful development and displayed diplomacy and strategic flexibility. These are manifested in the following four aspects:

  2. Strategic Partnerships in Higher Education

    Science.gov (United States)

    Ortega, Janet L.

    2013-01-01

    The purpose of this study was to investigate the impacts of strategic partnerships between community colleges and key stakeholders; to specifically examine strategic partnerships; leadership decision-making; criteria to evaluate strategic partnerships that added value to the institution, value to the students, faculty, staff, and the local…

  3. Strategic market planning for hospitals.

    Science.gov (United States)

    Zallocco, R L; Joseph, W B; Doremus, H

    1984-01-01

    The application of strategic market planning to hospital management is discussed, along with features of the strategic marketing management process. A portfolio analysis tool, the McKinsey/G.E. Business Screen, is presented and, using a large urban hospital as an example, discussed in detail relative to hospital administration. Finally, strategic implications of the portfolio analysis are examined.

  4. The Possibilities of Strategic Finance

    Science.gov (United States)

    Chaffee, Ellen

    2010-01-01

    Strategic finance is aligning financial decisions--regarding revenues, creating and maintaining institutional assets, and using those assets--with the institution's mission and strategic plan. The concept known as "strategic finance" increasingly is being seen as a useful perspective for helping boards and presidents develop a sustainable…

  5. Strategic Management of Large Projects

    Institute of Scientific and Technical Information of China (English)

    Wang Yingluo; Liu Yi; Li Yuan

    2004-01-01

    The strategic management of large projects is both theoretically and practically important. Some scholars have advanced flexible strategy theory in China. The difference between strategic flexibility and flexible strategy is pointed out. The supporting system and characteristics of flexible strategy are analyzed. The changes of flexible strategy and the integration of strategic management are discussed.

  6. Strategic Planning and Financial Management

    Science.gov (United States)

    Conneely, James F.

    2010-01-01

    Strong financial management is a strategy for strategic planning success in student affairs. It is crucial that student affairs professionals understand the necessity of linking their strategic planning with their financial management processes. An effective strategic planner needs strong financial management skills to implement the plan over…

  7. The strategic entrepreneurial thinking imperative

    Directory of Open Access Journals (Sweden)

    S. Dhliwayo

    2007-12-01

    Full Text Available Purpose: The aim of this paper is to demonstrate that strategic entrepreneurial thinking is a unitary concept which should be viewed as a standalone construct. Design/Methodology/Approach: The concept strategic entrepreneurial thinking is modelled from an analysis of strategic thinking and entrepreneurial thinking from available literature. The strategic entrepreneurial mindset imperative is then emphasised and confirmed. Findings: This paper's finding is that there is no difference between strategic thinking and the entrepreneurial mindset. Instead, the composite strategic entrepreneurial mindset construct should be treated as a unitary construct. Practical implications: The importance for practitioners is that the paper integrates two constructs, strategic thinking and entrepreneurial thinking into a new concept, strategic entrepreneurial thinking. The paper shows how difficult it is to split this thinking and behaviour into separate strategic and entrepreneurial thought and action processes. Originality/Value: The paper explores the ''thinking'' aspect of the strategic entrepreneurial concept which prominent authors on the strategic entrepreneurship topic seem to have not focused on. The resultant strategic entrepreneurial mindset is modelled into a new stand alone concept on its own.

  8. LIBO accelerates

    CERN Multimedia

    2002-01-01

    The prototype module of LIBO, a linear accelerator project designed for cancer therapy, has passed its first proton-beam acceleration test. In parallel a new version - LIBO-30 - is being developed, which promises to open up even more interesting avenues.

  9. Acceleration of the FERM nodal program

    International Nuclear Information System (INIS)

    Three acceleration methods were tested in an attempt to reduce the number of outer iterations in the FERM nodal program. The results indicated that the Chebyshev polynomial acceleration method with variable degree yields an economy of about 50% in computer time. In contrast, acceleration by asymptotic source extrapolation or by zonal rebalancing did not reduce the overall computer time, although some acceleration of the outer iterations was observed. (M.C.K.)
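
    The Chebyshev polynomial acceleration referred to in this record (and in the next one) is a standard device for speeding up a slowly converging linear outer iteration. A generic sketch for x = Gx + b, given an estimate rho of the spectral radius of G, is shown below; the variable-degree and restart logic of the actual FERM implementation is not reproduced.

      # Generic sketch of Chebyshev (semi-iterative) acceleration of the fixed-point
      # iteration x_{k+1} = G x_k + b. Textbook recurrence only; not the FERM code.
      import numpy as np


      def chebyshev_accelerated(G, b, rho, n_outer=60):
          """rho is an estimate of the spectral radius of G (0 < rho < 1)."""
          x_prev = np.zeros_like(b)
          x = G @ x_prev + b                     # first sweep, omega_1 = 1
          omega = 2.0 / (2.0 - rho ** 2)         # omega_2
          for _ in range(2, n_outer + 1):
              x, x_prev = omega * (G @ x + b - x_prev) + x_prev, x
              omega = 1.0 / (1.0 - 0.25 * rho ** 2 * omega)
          return x


      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          S = rng.random((50, 50)); S = 0.5 * (S + S.T)       # symmetric test matrix
          G = 0.95 * S / np.abs(np.linalg.eigvalsh(S)).max()  # spectral radius 0.95
          b = rng.random(50)
          exact = np.linalg.solve(np.eye(50) - G, b)

          x_plain = np.zeros(50)
          for _ in range(60):                                 # unaccelerated iteration
              x_plain = G @ x_plain + b
          print("plain error:      ", np.linalg.norm(x_plain - exact))
          print("accelerated error:", np.linalg.norm(chebyshev_accelerated(G, b, 0.95) - exact))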

  10. Acceleration of the nodal program FERM

    International Nuclear Information System (INIS)

    Acceleration of the nodal program FERM was attempted with three acceleration schemes. The results of the calculations showed the best acceleration with the Chebyshev method, where the savings in computing time were of the order of 50%. Acceleration with the Asymptotic Source Extrapolation Method and with the Coarse-Mesh Rebalancing Method did not result in any improvement of the global computational time, although a reduction in the number of outer iterations was observed. (Author)

  11. Superposed-laser electron acceleration

    International Nuclear Information System (INIS)

    A new mechanism is proposed for electron acceleration using two superposed laser beams in vacuum. In this mechanism, an electron is accelerated by the longitudinal component of the wave electric field in the overlap region of the two laser beams. Single-particle computations and analytical work are performed to demonstrate the viability of the scheme. The results show that the electron can be accelerated effectively by the proposed mechanism. (author)
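
    The single-particle computations mentioned above boil down to integrating an electron's equation of motion in a prescribed field. The toy sketch below pushes a non-relativistic electron through an assumed longitudinal field with a Gaussian envelope; it illustrates the type of computation only, not the actual superposed-beam field (or the sustained phase coupling) of the cited work.

      # Toy single-particle pusher: non-relativistic electron in an assumed
      # longitudinal field E_z(z, t) = E0 cos(k z - w t) exp(-(z/L)^2).
      # Illustrative of the kind of computation only; not the cited mechanism.
      import numpy as np

      Q_E, M_E, C = -1.602e-19, 9.109e-31, 2.998e8   # charge [C], mass [kg], c [m/s]


      def e_z(z, t, E0=1.0e9, wavelength=1.0e-6, L=50.0e-6):
          k = 2.0 * np.pi / wavelength
          w = k * C
          return E0 * np.cos(k * z - w * t) * np.exp(-(z / L) ** 2)


      def track(z0=0.0, v0=0.1 * C, dt=1.0e-17, n_steps=100_000):
          """Explicit time stepping; returns the (toy) kinetic energy change in eV."""
          z, v = z0, v0
          for step in range(n_steps):
              v += Q_E * e_z(z, step * dt) / M_E * dt
              z += v * dt
          return 0.5 * M_E * (v ** 2 - v0 ** 2) / abs(Q_E)


      if __name__ == "__main__":
          print(f"toy energy change: {track():.3e} eV")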

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  13. Strategic Analysis of the Enterprise Mobile Device Management Software Industry

    OpenAIRE

    Shesterin, Dmitry

    2012-01-01

    This paper analyzes the enterprise mobile device management industry and evaluates three strategic alternatives by which an established computer systems management software manufacturing company can enter this industry. The analysis of the three strategic alternatives to build, buy or partner in order to bring to market an enterprise mobile device management product offering delves into an examination of the company’s existing position and performance; conducts an external analysis of the ent...

  14. Correct and efficient accelerator programming

    OpenAIRE

    Cohen, Albert; Donaldson, Alistair F.; Huisman, Marieke; Katoen, Joost-Pieter

    2013-01-01

    This report documents the program and the outcomes of Dagstuhl Seminar 13142 “Correct and Efficient Accelerator Programming”. The aim of this Dagstuhl seminar was to bring together researchers from various sub-disciplines of computer science to brainstorm and discuss the theoretical foundations, design and implementation of techniques and tools for correct and efficient accelerator programming.

  15. Analysis of Accelerated Gossip Algorithms

    NARCIS (Netherlands)

    Liu, J.; Anderson, B.D.O.; Cao, M.; Morse, A.S.

    2009-01-01

    This paper investigates accelerated gossip algorithms for distributed computations in networks where shift-registers are utilized at each node. By using tools from matrix analysis, we prove the existence of the desired acceleration and establish the fastest rate of convergence in expectation for two
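
    A generic way to picture the shift-register idea analysed above: each node keeps its value from the previous round and mixes it into the ordinary neighbour averaging, which for a suitable mixing weight converges faster than plain gossip. The ring topology, weight matrix W and mixing parameter omega in the sketch are illustrative assumptions, not the specific algorithms of the paper.

      # Illustrative sketch of gossip averaging with a one-slot shift register:
      #   x_{k+1} = omega * W x_k + (1 - omega) * x_{k-1}
      # where W is a symmetric, doubly stochastic averaging matrix of the network.
      import numpy as np


      def gossip(W, x0, n_rounds, omega=1.0):
          x_prev, x = x0.copy(), x0.copy()
          for _ in range(n_rounds):
              x, x_prev = omega * (W @ x) + (1.0 - omega) * x_prev, x
          return x


      if __name__ == "__main__":
          n = 20
          W = np.zeros((n, n))                       # ring: average self and two neighbours
          for i in range(n):
              W[i, i] = 0.5
              W[i, (i - 1) % n] = 0.25
              W[i, (i + 1) % n] = 0.25
          x0 = np.random.default_rng(1).random(n)
          target = x0.mean()                         # the consensus value to reach
          plain = gossip(W, x0, 200, omega=1.0)      # standard gossip
          accel = gossip(W, x0, 200, omega=1.4)      # with shift-register memory
          print("plain error:", np.abs(plain - target).max())
          print("accel error:", np.abs(accel - target).max())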

  16. Strategic Management and Business Analysis

    CERN Document Server

    Williamson, David; Jenkins, Wyn; Moreton, Keith Michael

    2003-01-01

    Strategic Business Analysis shows students how to carry out a strategic analysis of a business, with clear guidelines on where and how to apply the core strategic techniques and models that are the integral tools of strategic management. The authors identify the key questions in strategic analysis and provide an understandable framework for answering these questions. Several case studies are used to focus understanding and enable a more thorough analysis of the concepts and issues, especially useful for students involved with case study analysis. Accompanying the text is a CD-ROM containing the m

  17. Strategizing Communication. Theory and Practice

    DEFF Research Database (Denmark)

    Gulbrandsen, Ib Tunby; Just, Sine Nørholm

    Strategizing Communication offers a unique perspective on the theory and practice of strategic communication. Written for students and practitioners interested in learning about and acquiring tools for dealing with the technological, environmental and managerial challenges, which organizations face...... when communicating in today’s mediascape, this book presents an array of theories, concepts and models through which we can understand and practice communication strategically. The core of the argument is in the title: strategizing – meaning the act of making something strategic. This entails looking...

  18. ERP as a strategic management tool: six evolutionary stages

    OpenAIRE

    Shank, John K.; San Miguel, Joseph G.

    2002-01-01

    Approved for public release; distribution is unlimited Over the past 50 years computers have evolved from vacuum tubes to integrated circuits to PCs. As hardware changed so did managerial computing, from MIS to ERP. Outsourcers developed niche markets and the promise of seamless processing for strategic decision-making seemed possible. Here the history, progress to date, and unmet challenges are explored

  19. Strategizing Communication. Theory and Practice

    DEFF Research Database (Denmark)

    Gulbrandsen, Ib Tunby; Just, Sine Nørholm

    Strategizing Communication offers a unique perspective on the theory and practice of strategic communication. Written for students and practitioners interested in learning about and acquiring tools for dealing with the technological, environmental and managerial challenges, which organizations face...... when communicating in today’s mediascape, this book presents an array of theories, concepts and models through which we can understand and practice communication strategically. The core of the argument is in the title: strategizing – meaning the act of making something strategic. This entails looking...... beyond, but not past instrumental, rational plans in order to become better able to understand and manage the concrete, incremental practices and contexts in which communication becomes strategic. Thus, we argue that although strategic communicators do (and should) make plans, a plan in itself does...

  20. Strategic Planning in U.S. Municipalities

    Directory of Open Access Journals (Sweden)

    James VAN RAVENSWAY

    2015-12-01

    Full Text Available Strategic planning started in the U.S. as a corporate planning endeavor. By the 1960’s, it had become a major corporate management tool in the Fortune 500. At first, it was seen as a way of interweaving policies, values and purposes with management, resources and market information in a way that held the organization together. By the 1950’s, the concept was simplified somewhat to focus on SWOT as a way of keeping the corporation afloat in a more turbulent world. The public sector has been under pressure for a long time to become more efficient, effective and responsive. Many have felt that the adoption of business practices would help to accomplish that. One tool borrowed from business has been strategic planning. At the local government level, strategic planning became popular starting in the 1980’s, and the community’s planning office was called on to lead the endeavor. The planning office was often the advocate of the process. Urban planning offices had been doing long-range plans for decades, but with accelerating urban change a more rapid action-oriented response was desired. The paper describes this history and process in East Lansing, Michigan, U.S., where comprehensive community plans are the result of a multi-year visioning process and call for action-oriented strategies for targeted parts of the community.

  1. Accelerating QDP++ using GPUs

    CERN Document Server

    Winter, Frank

    2011-01-01

    Graphic Processing Units (GPUs) are becoming increasingly important as target architectures in scientific High Performance Computing (HPC). NVIDIA established CUDA as a parallel computing architecture controlling and making use of the compute power of GPUs. CUDA provides sufficient support for C++ language elements to enable the Expression Template (ET) technique in the device memory domain. QDP++ is a C++ vector class library suited for quantum field theory which provides vector data types and expressions and forms the basis of the lattice QCD software suite Chroma. In this work, accelerating QDP++ expression evaluation on a GPU was successfully implemented, leveraging the ET technique and using Just-In-Time (JIT) compilation. The Portable Expression Template Engine (PETE) and the C API for CUDA kernel arguments were used to build the bridge between the host and device memory domains. This makes it possible to accelerate on a GPU those Chroma routines that are typically not subject to special optimisation. As an ...
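
    A language-neutral way to see the expression-template plus JIT idea behind this work: arithmetic on lattice fields builds an expression tree instead of evaluating immediately, and the whole tree is later emitted as one fused kernel. The Python sketch below only generates the kernel source as text and invents all names; it is not QDP++, PETE, or the CUDA API.

      # Conceptual sketch: 'a + b * c' is captured as an expression tree and turned
      # into a single fused element-wise kernel (emitted here as source text) instead
      # of two loops with a temporary. Purely illustrative; not QDP++/PETE/CUDA.
      class Expr:
          def __add__(self, other): return BinOp("+", self, other)
          def __mul__(self, other): return BinOp("*", self, other)


      class Field(Expr):
          """A named lattice field (stand-in for a QDP++ lattice object)."""
          def __init__(self, name): self.name = name
          def emit(self): return f"{self.name}[i]"


      class BinOp(Expr):
          def __init__(self, op, lhs, rhs): self.op, self.lhs, self.rhs = op, lhs, rhs
          def emit(self): return f"({self.lhs.emit()} {self.op} {self.rhs.emit()})"


      def fused_kernel_source(dest, expr, args=("a", "b", "c")):
          """Emit one fused loop for dest[i] = expr; a JIT step would compile this."""
          params = ", ".join(f"const float* {name}" for name in args)
          return (f"void kernel(float* {dest}, {params}, int n) {{\n"
                  f"  for (int i = 0; i < n; ++i) {dest}[i] = {expr.emit()};\n"
                  f"}}\n")


      if __name__ == "__main__":
          a, b, c = Field("a"), Field("b"), Field("c")
          tree = a + b * c          # nothing is computed yet; only a tree is built
          print(fused_kernel_source("d", tree))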

  2. Strategizing on innovation systems

    DEFF Research Database (Denmark)

    Jofre, Sergio

    This paper explores the strategic context of the implementation of the European Institute of Technology (EIT) from the perspective of National Innovation Systems (NIS) and the Triple Helix of University-Government-Industry relationship. The analytical framework is given by a comparative study of...... welfare creation – therefore, the NIS analysis inherently focuses on economy and emphasizes the role of industry over government and university. A complementary analytical concept that regards NIS from a slightly different perspective is the Triple Helix Model of innovation (e.g. Etzkowitz & Leydesdorff...... NIS in the EU and its closer competitors the US and Japan, with emphasis on the particular features and developments of their respective Triple Helix models. Based on the results of this analysis, the paper suggests strategic recommendations regarding the EIT deployment. The work aims an additional...

  3. The Strategic Mediator

    DEFF Research Database (Denmark)

    Rossignoli, Cecilia; Carugati, Andrea; Mola, Lapo

    2009-01-01

    The last 10 years have witnessed the emergence of electronic marketplaces as players that leverage new technologies to facilitate B2B internet-mediated collaborative business. Nowadays these players are augmenting their services from simple intermediation to include new inter-organizational relat......The last 10 years have witnessed the emergence of electronic marketplaces as players that leverage new technologies to facilitate B2B internet-mediated collaborative business. Nowadays these players are augmenting their services from simple intermediation to include new inter...... as an exclusive club, the belonging to which provides a strategic advantage. The technology brought forth by the marketplace participates in shaping the strategic demands of the participants which in turn request the marketplace to redesign its own strategy. Profiting from this unintended demand, the e...

  4. 3RDWHALE STRATEGIC ANALYSIS

    OpenAIRE

    Johnson, Andrew

    2009-01-01

    This essay provides a detailed strategic analysis of 3rdWhale, a Vancouver-based start-up in the sustainability sector, along with an analysis of the smartphone applications industry. Porter's five forces model is used to perform an industry analysis of the smartphone application industry and identify key success factors for application developers. Using the identified factors, 3rdWhale is compared to its indirect competitors to identify opportunities and threats and produce a range of strate...

  5. Strategic Leadership towards Sustainability

    OpenAIRE

    Robèrt, Karl-Henrik; Broman, Göran; Waldron, David; Ny, Henrik; Byggeth, Sophie; Cook, David; Johansson, Lena; Oldmark, Jonas; Basile, George; Haraldsson, Hördur V.

    2004-01-01

    The Master's programme named "Strategic Leadership Towards Sustainability" is offered at the Blekinge Institute of Technology (Blekinge Tekniska Högskola) in Karlskrona, Sweden. This Master's programme builds on four central themes: (1) four scientific principles for socio-ecological sustainability; (2) a planning methodology of "backcasting" based on those scientific principles for sustainability; (3) a five-level model for planning in complex systems, into which backcasting is incorporated ...

  6. Centralized digital control of accelerators

    International Nuclear Information System (INIS)

    In contrasting the title of this paper with a second paper to be presented at this conference entitled Distributed Digital Control of Accelerators, a potential reader might be led to believe that this paper will focus on systems whose computing intelligence is centered in one or more computers in a centralized location. Instead, this paper will describe the architectural evolution of SLAC's computer based accelerator control systems with respect to the distribution of their intelligence. However, the use of the word centralized in the title is appropriate because these systems are based on the use of centralized large and computationally powerful processors that are typically supported by networks of smaller distributed processors

  7. Vacuum Brazing of Accelerator Components

    International Nuclear Information System (INIS)

    Commonly used materials for accelerator components are those which are vacuum compatible and thermally conductive. Stainless steel, aluminum and copper are common among them. Stainless steel is a poor heat conductor and is not commonly used where good thermal conductivity is required. Aluminum and copper and their alloys meet the above requirements and are frequently used for this purpose. Fabricating accelerator components from aluminum and its alloys by welding has become common practice nowadays. It is mandatory to use copper and its various grades in RF devices required for accelerators. Beam line and front-end components of the accelerators are fabricated from stainless steel and OFHC copper. Fabrication of copper components by welding is very difficult and in most cases impossible. Fabrication and joining in such cases is possible using a brazing process, especially under vacuum or inert gas atmosphere. Several accelerator components have been vacuum brazed for Indus projects at Raja Ramanna Centre for Advanced Technology (RRCAT), Indore using the vacuum brazing facility available at RRCAT, Indore. This paper presents details regarding development of the above mentioned high value and strategic components/assemblies. It includes the basics of vacuum brazing, details of the vacuum brazing facility, joint design, fixturing of the jobs, selection of filler alloys, and optimization of brazing parameters so as to obtain high quality brazed joints, along with a brief description of vacuum brazed accelerator components.

  8. Constructing the ASCI computational grid

    Energy Technology Data Exchange (ETDEWEB)

    BEIRIGER,JUDY I.; BIVENS,HUGH P.; HUMPHREYS,STEVEN L.; JOHNSON,WILBUR R.; RHEA,RONALD E.

    2000-06-01

    The Accelerated Strategic Computing Initiative (ASCI) computational grid is being constructed to interconnect the high performance computing resources of the nuclear weapons complex. The grid will simplify access to the diverse computing, storage, network, and visualization resources, and will enable the coordinated use of shared resources regardless of location. To match existing hardware platforms, required security services, and current simulation practices, the Globus MetaComputing Toolkit was selected to provide core grid services. The ASCI grid extends Globus functionality by operating as an independent grid, incorporating Kerberos-based security, interfacing to Sandia's Cplant™, and extending job monitoring services. To fully meet ASCI's needs, the architecture layers distributed work management and criteria-driven resource selection services on top of Globus. These services simplify the grid interface by allowing users to simply request "run code X anywhere". This paper describes the initial design and prototype of the ASCI grid.
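    As a rough illustration of what "criteria-driven resource selection" can mean in practice, the hypothetical sketch below picks a compute resource for a "run code X anywhere" request by matching simple properties (architecture, free nodes, security readiness). The Resource fields and the selection rule are assumptions made for illustration; this is not the ASCI grid software or the Globus API.

```cpp
// Hypothetical criteria-driven resource selection: choose the matching
// resource with the most free nodes, or none if no resource qualifies.
#include <iostream>
#include <optional>
#include <string>
#include <vector>

struct Resource {
    std::string name;
    std::string arch;      // e.g. "x86_64"
    int freeNodes;         // currently idle nodes
    bool kerberosReady;    // required security service available
};

struct Request {
    std::string arch;
    int nodesNeeded;
};

std::optional<Resource> select(const std::vector<Resource>& pool, const Request& req) {
    std::optional<Resource> best;
    for (const auto& r : pool) {
        if (r.arch == req.arch && r.kerberosReady && r.freeNodes >= req.nodesNeeded) {
            if (!best || r.freeNodes > best->freeNodes) best = r;
        }
    }
    return best;
}

int main() {
    std::vector<Resource> pool = {
        {"siteA", "x86_64", 128, true},
        {"siteB", "x86_64", 512, true},
        {"siteC", "alpha",  64,  true},
    };
    if (auto r = select(pool, {"x86_64", 256}))
        std::cout << "run code X on " << r->name << "\n";   // prints siteB
}
```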

  9. Contributions to algorithms and efficient programming of new parallel architectures including computational accelerators in the field of neutron transport and shielding

    OpenAIRE

    Jérôme, Dubois

    2011-01-01

    In science, simulation is a key process for research or validation. Modern computer technology allows faster numerical experiments, which are cheaper than real models. In the field of neutron simulation, the calculation of eigenvalues is one of the key challenges. The complexity of these problems is such that a lot of computing power may be necessary. The work of this thesis is first the evaluation of new computing hardware such as graphics card or massively multicore chips, and their applica...

  10. SYSTEM REFLEXIVE STRATEGIC MARKETING MANAGEMENT

    Directory of Open Access Journals (Sweden)

    A. Dligach

    2013-10-01

    Full Text Available This article reviews the System Reflexive paradigm of strategic marketing management, which is based on the alignment of the strategic economic interests of stakeholders, specifically enterprise owners, hired managers, and consumers. The essence of the marketing concept of management comes under review, along with strategic management approaches to business and the buildup and alignment of the economic interests of business stakeholders. A roadmap for resolving the problems of modern marketing is proposed through the adoption of System Reflexive marketing theory.

  11. An Introduction to Strategic Communication

    OpenAIRE

    Thomas, Gail Fann; Stephens, Kimberlie J.

    2014-01-01

    The article of record as published may be located at http://dx.doi.org/10.1177/2329488414560469 Strategic communication is an emerging area of study in the communication and management social sciences. Recent academic conversations around this topic have appeared in publications such as the International Journal of Strategic Communication, which was established in 2007, and The Routledge Handbook of Strategic Communication, which will be published this year. Likewise, the discu...

  12. The Strategic Human Resource Management

    OpenAIRE

    Merthová, Hana

    2012-01-01

    This thesis deals with the topic "Strategic management of human resources." It is divided into two main parts, the theoretical and the practical. The theoretical part focuses on the characteristics and definition of terms related to strategic human resources management and on defining the relationships between these concepts. This part also examines one of the most important fields of strategic human resource management in the company, which is education and per...

  13. STRATEGIC ENTREPRENEURSHIP IN FAMILY BUSINESS

    OpenAIRE

    G.T. LUMPKIN; L. STEIER; Wright, M

    2012-01-01

    The purpose of this special issue is to promote research on the role of family in nurturing entrepreneurial ventures as well as on the importance of strategic entrepreneurship in maintaining the strength and viability of established and multigenerational family firms. Two related research questions are at the heart of this inquiry: (1) In what ways does the influence of family matter to strategic entrepreneurship?; and (2) How can strategic entrepreneurship contribute to understanding and str...

  14. Strategic Human Resources Management

    Directory of Open Access Journals (Sweden)

    Marta Muqaj

    2016-07-01

    Full Text Available Strategic Human Resources Management (SHRM) represents an important and sensitive aspect of the functioning and development of a company, business, institution, state, or public or private agency of a country. SHRM is based on psychological practices, especially investing in empowerment, broad training and teamwork. In this way it remains the primary resource for maintaining stability and competitiveness. SHRM has lately evolved with fast and secure steps, and the transformation from Human Resources Management to SHRM is becoming popular, but it still remains impossible to estimate exactly how far SHRM has gone in updating HRM practices in organizations and institutions in general. This manuscript aims to reflect on strategic management and the factors that influence its practices in some organizations. The researchers aim to identify influential factors that play key roles in SHRM and to determine the challenges and priorities which lie ahead, in order to select the most appropriate model for achieving a desirable performance. SHRM is a key factor in the achievement of the objectives of the organization, based on HR through continuous performance growth. It is a complex process, unpredictable and influenced by many outside and inside factors, which aims to find the shortest way to achieve strategic competitive advantages by creating structure, planning, organizing, thinking, values, culture, communication, perspectives and the image of the organization. While traditional management of HR focuses on the individual performance of employees, the scientific one is based on organizational performance and on the role of the HRM system as a main factor in solving business issues and achieving competitive advantage.

  15. Strategic planning and republicanism

    Directory of Open Access Journals (Sweden)

    Mazza Luigi

    2010-01-01

    Full Text Available The paper develops two main linked themes: (i) strategic planning reveals in practice limits that are hard to overcome; (ii) a complete planning system is effective only in the framework of a republican political, social and government culture. It is argued that the growing disappointment associated with strategic planning practices may be due to excessive expectations, and the difficulties encountered by strategic planning are traced to three main issues: (a) the relationship between politics and planning; (b) the relationship between government and governance; and (c) the relationship between space and socioeconomic development. Some authors have recently supported an idea of development as consisting in the qualitative evolution of forms of social rationality and argued that a reflection on the relationships between physical transformations and visions of development could be a way of testing innovations. But such strong demands might be satisfied only if we manage to make a 'new social and territorial pact for development', recreating a social fabric imbued with shared values. The re-creation of a social fabric imbued with shared values requires a rich conception of the political community and the possibility that the moral purposes of the community may be incorporated by the state. All this is missing today. Outside a republican scheme, planning activities are principally instruments for legitimising vested interests and facilitating their investments, and the resolution of the conflicts that arise between the planning decisions of the various levels of government becomes all but impracticable. A complete planning system can be practised only if it can refer to the authority and syntheses expressed in and by statehood, which suggests that in a democratic system planning is republican by necessity rather than by choice.

  16. Strategic Planning of the Forest Sector: Summary Report of a Nordic Meeting

    OpenAIRE

    Seppaelae, R.; Loennstedt, L.; Morgan, A.

    1983-01-01

    Strategic planning of the forest sector in Nordic countries focuses on the major long-term problems and issues which are or will be confronting forestry and the forest industry. In this paper these problems and issues are described. Examples of strategic planning and the use of models and computers in the forest industry are given. It can be concluded that current forest sector modeling is of major importance for strategic planning of the forest industry in Nordic countries.

  17. Strategizing y liderazgo

    OpenAIRE

    Marín Tuyá, Belén

    2013-01-01

    The development of strategizing, a concept introduced by Whittington (1996) that approaches strategy in practice as "something that people do", arose from growing dissatisfaction with conventional strategy research. Thus, while people were carrying out strategy, theories focused on multivariate analyses of the effects of strategy on organizational performance, with a curious absence of the human actors. With the aim of advancing in the...

  18. Strategic Aspects of Bundling

    International Nuclear Information System (INIS)

    The increase of bundle supply has become widespread in several sectors (for instance in the telecommunications and energy fields). This paper reviews strategic aspects of bundling. The main purpose of this paper is to analyze the profitability of bundling strategies according to the degree of competition and the characteristics of goods. Moreover, bundling can be used as a price discrimination tool, a screening device, or an entry barrier. In the monopoly case, a bundling strategy is efficient for sorting consumers into different categories in order to capture a maximum of surplus. However, when competition increases, the profitability of bundling strategies depends on the correlation of consumers' reservation values. (author)

  19. Beyond Strategic Vision

    CERN Document Server

    Cowley, Michael

    2012-01-01

    Hoshin is a system which was developed in Japan in the 1960's, and is a derivative of Management By Objectives (MBO). It is a Management System for determining the appropriate course of action for an organization, and effectively accomplishing the relevant actions and results. Having recognized the power of this system, Beyond Strategic Vision tailors the Hoshin system to fit the culture of North American and European organizations. It is a "how-to" guide to the Hoshin method for executives, managers, and any other professionals who must plan as part of their normal job. The management of an o

  20. Guam Strategic Energy Plan

    Energy Technology Data Exchange (ETDEWEB)

    Conrad, M. D.

    2013-07-01

    Describes various energy strategies available to Guam to meet the territory's goal of diversifying fuel sources and reducing fossil energy consumption 20% by 2020. The information presented in this strategic energy plan will be used by the Guam Energy Task Force to develop an energy action plan. Available energy strategies include policy changes, education and outreach, reducing energy consumption at federal facilities, and expanding the use of a range of energy technologies, including buildings energy efficiency and conservation, renewable electricity production, and alternative transportation. The strategies are categorized based on the time required to implement them.

  1. Strategic Urban Governance

    DEFF Research Database (Denmark)

    Pagh, Jesper

    2014-01-01

    The days of long-term predict-and-provide planning that saw its heydays in the post-war decades are long gone. As our late-modern time presents us with an evermore complex and contrasting view of the world, planning has become a much more fragmented and ambivalent affair. That a country or a city...... should be run like a private corporation has increasingly become common sense, and thus the competition among entities – be it countries, regions or cities – to a greater and greater extent defines success and the means to achieve it. What has been collected under the umbrella term Strategic Urban...

  2. System of strategic planning of enterprises activity

    OpenAIRE

    Тригоб’юк, Сергій Сергійович

    2012-01-01

    The article reviews research on strategic management, in particular the features of strategic administrative decisions and strategic diagnostics within the system of strategic planning of enterprise activity, as well as the problems of leadership, the development of strategic thinking, and the social aspects of business conduct in a modern, changing environment.

  3. 7 CFR 25.202 - Strategic plan.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 1 2010-01-01 2010-01-01 false Strategic plan. 25.202 Section 25.202 Agriculture... Procedure § 25.202 Strategic plan. (a) Principles of strategic plan. The strategic plan included in the application must be developed in accordance with the following four key principles: (1) Strategic vision...

  4. Accelerator Experiments for Astrophysics

    OpenAIRE

    Ng, Johnny S. T.

    2003-01-01

    Many recent discoveries in astrophysics involve phenomena that are highly complex. Carefully designed experiments, together with sophisticated computer simulations, are required to gain insights into the underlying physics. We show that particle accelerators are unique tools in this area of research, by providing precision calibration data and by creating extreme experimental conditions relevant for astrophysics. In this paper we discuss laboratory experiments that can be carried out at the S...

  5. Operationalizing strategic marketing.

    Science.gov (United States)

    Chambers, S B

    1989-05-01

    The strategic marketing process, like any administrative practice, is far simpler to conceptualize than operationalize within an organization. It is for this reason that this chapter focused on providing practical techniques and strategies for implementing the strategic marketing process. First and foremost, the marketing effort needs to be marketed to the various publics of the organization. This chapter advocated the need to organize the marketing analysis into organizational, competitive, and market phases, and it provided examples of possible designs of the phases. The importance and techniques for exhausting secondary data sources and conducting efficient primary data collection methods were explained and illustrated. Strategies for determining marketing opportunities and threats, as well as segmenting markets, were detailed. The chapter provided techniques for developing marketing strategies, including considering the five patterns of coverage available; determining competitor's position and the marketing mix; examining the stage of the product life cycle; and employing a consumer decision model. The importance of developing explicit objectives, goals, and detailed action plans was emphasized. Finally, helpful hints for operationalizing the communication variable and evaluating marketing programs were provided.

  6. Accelerator operations

    International Nuclear Information System (INIS)

    This section is concerned with the operation of both the tandem-linac system and the Dynamitron, two accelerators that are used for entirely different research. Developmental activities associated with the tandem and the Dynamitron are also treated here, but developmental activities associated with the superconducting linac are covered separately because this work is a program of technology development in its own right

  7. Advanced accelerators

    International Nuclear Information System (INIS)

    This report discusses the suitability of four novel particle acceleration technologies for multi-TeV particle physics machines: laser driven linear accelerators (linac), plasma beat-wave devices, plasma wakefield devices, and switched power and cavity wakefield linacs. The report begins with the derivation of beam parameters practical for multi-TeV devices. Electromagnetic field breakdown of materials is reviewed. The two-beam accelerator scheme for using a free electron laser as the driver is discussed. The options recommended and the conclusions reached reflect the importance of cost. We recommend that more effort be invested in achieving a self-consistent range of TeV accelerator design parameters. Beat-wave devices have promise for 1-100 GeV applications and, while not directly scalable to TeV designs, the current generation of ideas are encouraging for the TeV regime. In particular, surfatrons, finite-angle optical mixing devices, plasma grating accelerator, and the Raman forward cascade schemes all deserve more complete analysis. The exploitation of standard linac geometry operated in an unconventional mode is in a phase of rapid evolution. While conceptual projects abound, there are no complete designs. We recommend that a fraction of sponsored research be devoted to this approach. Wakefield devices offer a great deal of potential; trades among their benefits and constraints are derived and discussed herein. The study of field limitation processes has received inadequate attention; this limits experiment designers. The costs of future experiments are such that investment in understanding these processes is prudent. 34 refs., 12 figs., 3 tabs

  8. Tax rates as strategic substitutes

    NARCIS (Netherlands)

    H. Vrijburg (Hendrik); R.A. de Mooij (Ruud)

    2016-01-01

    textabstractThis paper analytically derives conditions under which the slope of the tax-reaction function is negative in a classical tax competition model. If countries maximize welfare, a negative slope (reflecting strategic substitutability) occurs under relatively mild conditions. The strategic t

  9. Strategic planning for the board.

    Science.gov (United States)

    Davies, A

    1991-04-01

    Although the planning operation is regarded by some observers as unrealistic in conditions of rapid change and increasing competition, the discipline of strategic thinking and the need for strategic leadership continue to be of vital importance. The author examines the purpose of the Board of Directors and its role in the management of strategy. PMID:10111271

  10. Transfers, Contracts and Strategic Games

    NARCIS (Netherlands)

    Kleppe, J.; Hendrickx, R.L.P.; Borm, P.E.M.; Garcia-Jurado, I.; Fiestras-Janeiro, G.

    2007-01-01

    This paper analyses the role of transfer payments and strategic contracting within two-person strategic form games with monetary payoffs. First, it introduces the notion of transfer equilibrium as a strategy combination for which individual stability can be supported by allowing the possibilit

  11. Strategic Planning for Higher Education.

    Science.gov (United States)

    Kotler, Philip; Murphy, Patrick E.

    1981-01-01

    The framework necessary for achieving a strategic planning posture in higher education is outlined. The most important benefit of strategic planning for higher education decision makers is that it forces them to undertake a more market-oriented and systematic approach to long-range planning. (Author/MLW)

  12. Strategic Partnerships in International Development

    Science.gov (United States)

    Treat, Tod; Hartenstine, Mary Beth

    2013-01-01

    This chapter provides a framework and recommendations for development of strategic partnerships in a variety of cultural contexts. Additionally, this study elucidates barriers and possibilities in interagency collaborations. Without careful consideration regarding strategic partnerships' approaches, functions, and goals, the ability to…

  13. Strategic Planning Is an Oxymoron

    Science.gov (United States)

    Bassett, Patrick F.

    2012-01-01

    The thinking on "strategic thinking" has evolved significantly over the years. In the previous century, the independent school strategy was to focus on long-range planning, blithely projecting 10 years into the future. For decades this worked well enough, but in the late 20th century, independent schools shifted to "strategic planning," with its…

  14. Strategic Interactions in Franchise Relationships

    NARCIS (Netherlands)

    Croonen, Evelien Petronella Maria

    2006-01-01

    This dissertation deals with understanding strategic interactions between franchisors and franchisees. The empirical part of this study consists of in-depth case studies in four franchise systems in the Dutch drugstore industry. The case studies focus on a total of eight strategic change processes i

  15. Development of leadership capacities as a strategic factor for sustainability

    OpenAIRE

    Cabeza-Erikson, Isabel; Edwards, Kimberly; Brabant, Theo Van

    2008-01-01

    Building capacities of sustainability change agents is primordial to increase the effectiveness and to accelerate the process towards a sustainable society. This research investigates the current challenges and practices of sustainability change agents and analyses current research in the field of leadership development. A Framework for Strategic Sustainable Development is described as a means to overcome and address the complex challenges that society faces today. Furthermore the development...

  16. IS and Business Leaders' Strategizing

    DEFF Research Database (Denmark)

    Hansen, Anne Mette

    business environments characterized by changes in market demands, technology options, and government regulations. Under such dynamic conditions, IS and business leaders must constantly strategize and set new IS strategic objectives so that they can retain and improve innovation, competitiveness......, and productivity. However, strategizing in such dynamic environments is not straightforward process. While IS and business leaders must develop new IS strategic objectives and move quickly towards new opportunities, they must also be good at exploiting the value of current assets and reducing the costs of existing...... operations. Drawing upon organizational learning theory (March 1991), this process involves two learning processes: exploration and exploitation. Leaders explore new strategic initiatives in response to environmental shifts while at the same time effectively exploiting what they have already learned from...

  17. Strategic groups in the Belgian fishing fleet

    OpenAIRE

    Stouten, H.; A. HEENE; Gellynck, X.; Polet, H

    2011-01-01

    This study examines the heterogeneity of the Belgian fishing fleet based on “strategic groups”, a concept borrowed from the field of strategic management. Its objectives are: (1) to define strategic groups within the Belgian fishing fleet; (2) to examine the performance differences among these strategic groups; (3) to examine whether firms (i.e., vessels) move between strategic groups over time; and (4) to examine if firm-movement (i.e., vessel-movement) differs across strategic groups. In th...

  18. Strategic cycling: shaking complacency in healthcare strategic planning.

    Science.gov (United States)

    Begun, J; Heatwole, K B

    1999-01-01

    As the conditions affecting business and healthcare organizations in the United States have become more turbulent and uncertain, strategic planning has decreased in popularity. Strategic planning is criticized for stifling creative responses to the new marketplace and for fostering compartmentalized organizations, adherence to outmoded strategies, tunnel vision in strategy formulation, and overemphasis on planning to the detriment of implementation. However, effective strategic planning can be a force for mobilizing all the constituents of an organization, creating discipline in pursuit of a goal, broadening an organization's perspective, improving communication among disciplines, and motivating the organization's workforce. It is worthwhile for healthcare organizations to preserve these benefits of strategic planning while at the same time recognizing the many sources of turbulence and uncertainty in the healthcare environment. A model of "strategic cycling" is presented to address the perceived shortcomings of traditional strategic planning in a dynamic environment. The cycling model facilitates continuous assessment of the organization's mission/values/vision and primary strategies based on feedback from benchmark analysis, shareholder impact, and progress in strategy implementation. Multiple scenarios and contingency plans are developed in recognition of the uncertain future. The model represents a compromise between abandoning strategic planning and the traditional, linear model of planning based on progress through predetermined stages to a masterpiece plan. PMID:10621138

  19. Capabilities for Strategic Adaptation

    DEFF Research Database (Denmark)

    Distel, Andreas Philipp

    firms’ ability to absorb and leverage new knowledge. The third paper is an empirical study which conceptualizes top managers’ resource cognition as a managerial capability underlying firms’ resource adaptation; it empirically examines the performance implications of this capability and organizational......This dissertation explores capabilities that enable firms to strategically adapt to environmental changes and preserve competitiveness over time – often referred to as dynamic capabilities. While dynamic capabilities are a popular research domain, too little is known about what these capabilities... empirical studies through the dynamic capabilities lens and develops propositions for future research. The second paper is an empirical study on the origins of firm-level absorptive capacity; it explores how organization-level antecedents, through their impact on individual-level antecedents, influence

  20. Strategizing on innovation systems

    DEFF Research Database (Denmark)

    Jofre, Sergio

    and the supranational levels. Data is gathered from available literature. Conclusions and discussion The findings suggest that there are important disparities among NIS particularly at the level of systemic functions such as knowledge creation, knowledge diffusion, guidance, and human and financial resource......This paper explores the strategic context of the implementation of the European Institute of Technology (EIT) from the perspective of National Innovation Systems (NIS) and the Triple Helix of University-Government-Industry relationship. The analytical framework is given by a comparative study...... within the US innovation system such as the Massachusetts Institute of Technology (MIT), the EIT deployment does not considers the establishment of a physical institution such as MIT but a supranational network of pre-existing institutions. Whether this strategy will be able to bring the expected...

  1. ABSTRACTS Preliminary Study of Strategic Inner Cores

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    When a strategic entity attempts to make a decision, the project must first be in accordance with its strategic framework and must make the strategic inner cores prominent. The existing theories of development strategy indicate that the formation of the framework can be divided into the following parts: inside and outside environments, purpose, goal, key points, and countermeasures. The strategic inner cores put forward by this paper are an intensification and advancement of the theory of the strategic framework; strategic orientation, strategic vision and main line are included. The appearance of these ideas has improved the theory and enhanced strategic practice.

  2. Strategic Cost Management

    Institute of Scientific and Technical Information of China (English)

    陆春玲

    1999-01-01

    The authors of this article, management professor Robin Cooper and management accounting professor Regine Slagmulder, argue that a company's overall strategic cost management typically comprises two modules. One module focuses on operational improvement and aims to raise the company's efficiency; the other focuses on strategic costing and aims to help the company identify where profit can be earned. The cost information required by the two modules is quite different. Unfortunately, many companies attempt to use only one of the modules to perform functions that can be achieved only when the two modules work in concert, and such a single-module system inevitably leads to inefficient operation. The structure of the article is very clear. After discussing the differences between the two modules, the two professors emphasize that they share the same basic goal and should be linked to each other; the critical point is that the operational improvement module should be the primary source of much of the information used by the strategic costing module. Among the several ways of maintaining the relationship between the two, a particularly effective one is to use two sets of standards: the "updated" and the "strategic".

  3. KEKB accelerator

    International Nuclear Information System (INIS)

    KEKB, the B-Factory at the High Energy Accelerator Research Organization (KEK), recently achieved a luminosity of 1 x 10^34 cm^-2 s^-1. This luminosity is two orders of magnitude higher than the world's level in 1990, when the design of KEKB started. This unprecedented result was made possible by KEKB's innovative design and technology in three aspects: beam focusing optics, high current storage, and beam-beam interaction. Now KEKB is leading the luminosity frontier of the colliders in the world. (author)

  4. Accelerating networks

    International Nuclear Information System (INIS)

    Evolving out-of-equilibrium networks have been under intense scrutiny recently. In many real-world settings the number of links added per new node is not constant but depends on the time at which the node is introduced in the system. This simple idea gives rise to the concept of accelerating networks, for which we review an existing definition and-after finding it somewhat constrictive-offer a new definition. The new definition provided here views network acceleration as a time dependent property of a given system as opposed to being a property of the specific algorithm applied to grow the network. The definition also covers both unweighted and weighted networks. As time-stamped network data becomes increasingly available, the proposed measures may be easily applied to such empirical datasets. As a simple case study we apply the concepts to study the evolution of three different instances of Wikipedia, namely, those in English, German, and Japanese, and find that the networks undergo different acceleration regimes in their evolution
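    As a simple illustration of how such a measure might be applied to time-stamped network data, the sketch below bins node arrivals into time windows and reports the average number of links each new node brings; a systematic growth of that average over time would indicate acceleration. The windowed mean used here is a stand-in measure chosen for illustration, not the specific definition proposed in the paper.

```cpp
// Windowed "links added per new node" from time-stamped arrival data.
#include <iostream>
#include <map>
#include <vector>

struct NodeArrival {
    double time;       // time stamp of the node's introduction
    int linksAdded;    // links added together with the node
};

int main() {
    std::vector<NodeArrival> arrivals = {
        {0.5, 1}, {1.2, 1}, {2.7, 2}, {3.1, 2}, {4.4, 3}, {5.9, 4}, {6.3, 5},
    };
    const double window = 2.0;
    std::map<int, std::pair<long, long>> bins;   // bin -> (sum of links, node count)
    for (const auto& a : arrivals) {
        auto& b = bins[static_cast<int>(a.time / window)];
        b.first += a.linksAdded;
        b.second += 1;
    }
    for (const auto& [bin, stats] : bins)
        std::cout << "t in [" << bin * window << ", " << (bin + 1) * window
                  << "): " << double(stats.first) / stats.second
                  << " links per new node\n";
}
```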

  5. The Beta Tech electron accelerator

    International Nuclear Information System (INIS)

    After describing the background of the Swedish Electron Sterilization Centre, the proposed linear accelerator sterilization plant is outlined. The accelerator will produce electrons of energy 10 MeV and a beam power of 30 KW. The handling system, control and identification systems are also described. Documentation will be designed around a bar code system on line to a computer. The various uses of dosimetry in plant performance and process control are described. (U.K.)

  6. Multinational Corporation and International Strategic Alliance

    Institute of Scientific and Technical Information of China (English)

    陆兮

    2015-01-01

    The world is now deeply into the second great wave of globalization, in which products, capital, and markets are becoming more and more integrated across countries. Multinational corporations are growing rapidly around the globe and playing a significant role in the world economy. Meanwhile, the accelerated rate of globalization has also imposed pressures on MNCs, leaving them desperately seeking overseas alliances in order to remain competitive. International strategic alliances, which bring together large and commonly competitive firms for specific purposes, have gradually shown their importance in the world market, and the form of the international joint venture is now widely adopted. In such alliances, selecting the right partner, formulating the right strategies, and establishing a harmonious and effective partnership are generally the keys to success.

  7. Accelerated Profile HMM Searches.

    Directory of Open Access Journals (Sweden)

    Sean R Eddy

    2011-10-01

    Full Text Available Profile hidden Markov models (profile HMMs) and probabilistic inference methods have made important contributions to the theory of sequence database homology search. However, practical use of profile HMM methods has been hindered by the computational expense of existing software implementations. Here I describe an acceleration heuristic for profile HMMs, the "multiple segment Viterbi" (MSV) algorithm. The MSV algorithm computes an optimal sum of multiple ungapped local alignment segments using a striped vector-parallel approach previously described for fast Smith/Waterman alignment. MSV scores follow the same statistical distribution as gapped optimal local alignment scores, allowing rapid evaluation of significance of an MSV score and thus facilitating its use as a heuristic filter. I also describe a 20-fold acceleration of the standard profile HMM Forward/Backward algorithms using a method I call "sparse rescaling". These methods are assembled in a pipeline in which high-scoring MSV hits are passed on for reanalysis with the full HMM Forward/Backward algorithm. This accelerated pipeline is implemented in the freely available HMMER3 software package. Performance benchmarks show that the use of the heuristic MSV filter sacrifices negligible sensitivity compared to unaccelerated profile HMM searches. HMMER3 is substantially more sensitive and 100- to 1000-fold faster than HMMER2. HMMER3 is now about as fast as BLAST for protein searches.
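    The pipeline idea, in which a cheap ungapped score decides whether a target is worth the expensive full computation, can be sketched as follows. This is only an illustration of the filter-then-rescore concept, not the HMMER3 MSV implementation; the scoring scheme, threshold, and the placeholder "full" stage are arbitrary assumptions.

```cpp
// Filter pipeline sketch: a cheap ungapped-diagonal score gates targets before
// an expensive full scoring stage is run.
#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

// Best ungapped local segment score over all diagonals (match +2, mismatch -1,
// score reset at 0, as in local alignment).
int ungappedFilterScore(const std::string& q, const std::string& t) {
    int best = 0;
    const int nq = static_cast<int>(q.size()), nt = static_cast<int>(t.size());
    for (int d = -(nq - 1); d < nt; ++d) {             // diagonal offset j - i
        int run = 0;
        for (int i = 0; i < nq; ++i) {
            int j = i + d;
            if (j < 0 || j >= nt) continue;
            run = std::max(0, run + (q[i] == t[j] ? 2 : -1));
            best = std::max(best, run);
        }
    }
    return best;
}

int expensiveFullScore(const std::string& q, const std::string& t) {
    // Stand-in for the full (e.g. Forward/Backward) computation.
    return ungappedFilterScore(q, t);
}

int main() {
    std::string query = "ACDEFGHIK";
    std::vector<std::string> db = {"ACDEFGHIK", "WWWWWWWWW", "XXACDEFXX"};
    const int threshold = 8;   // assumed filter cutoff
    for (const auto& t : db) {
        int f = ungappedFilterScore(query, t);
        if (f >= threshold)
            std::cout << t << " passes filter, full score " << expensiveFullScore(query, t) << "\n";
        else
            std::cout << t << " filtered out (score " << f << ")\n";
    }
}
```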

  8. Complex Strategic Choices Applying Systemic Planning for Strategic Decision Making

    CERN Document Server

    Leleur, Steen

    2012-01-01

    Effective decision making requires a clear methodology, particularly in a complex world of globalisation. Institutions and companies in all disciplines and sectors are faced with increasingly multi-faceted areas of uncertainty which cannot always be effectively handled by traditional strategies. Complex Strategic Choices provides clear principles and methods which can guide and support strategic decision making to face the many current challenges. By considering ways in which planning practices can be renewed and exploring the possibilities for acquiring awareness and tools to add value to strategic decision making, Complex Strategic Choices presents a methodology which is further illustrated by a number of case studies and example applications. Dr. Techn. Steen Leleur has adapted previously established research based on feedback and input from various conferences, journals and students resulting in new material stemming from and focusing on practical application of a systemic approach. The outcome is a coher...

  9. Accelerators and the Accelerator Community

    Energy Technology Data Exchange (ETDEWEB)

    Malamud, Ernest; Sessler, Andrew

    2008-06-01

    In this paper, standing back, looking from afar, and adopting a historical perspective, the field of accelerator science is examined. How it grew, what forces made it what it is, where it is now, and what it is likely to be in the future are the subjects explored. Clearly, a great deal of personal opinion is invoked in this process.

  10. Networks and meshworks in strategizing

    DEFF Research Database (Denmark)

    Esbjerg, Lars; Andersen, Poul Houman

    The purpose of this paper is to examine the business network metaphor in relation to strategizing in business and to tentatively propose an alternative metaphor, that of the business meshwork. The paper reviews existing work on strategy and strategizing within the IMP literature, particularly...... the literature on networks and network pictures, and identifies several shortcomings of this work. To develop the notion of business meshworks as an alternative for understanding strategizing practices in business interaction, the paper draws on recent writings within anthropology and the strategy......-as-practice literature....

  11. Application of local area networks to accelerator control systems at the Stanford Linear Accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Fox, J.D.; Linstadt, E.; Melen, R.

    1983-03-01

    The history and current status of SLAC's SDLC networks for distributed accelerator control systems are discussed. These local area networks have been used for instrumentation and control of the linear accelerator. Network topologies, protocols, physical links, and logical interconnections are discussed for specific applications in distributed data acquisition and control system, computer networks and accelerator operations.

  12. Application of local area networks to accelerator control systems at the Stanford Linear Accelerator

    International Nuclear Information System (INIS)

    The history and current status of SLAC's SDLC networks for distributed accelerator control systems are discussed. These local area networks have been used for instrumentation and control of the linear accelerator. Network topologies, protocols, physical links, and logical interconnections are discussed for specific applications in distributed data acquisition and control system, computer networks and accelerator operations

  13. Global Market. Global Strategic Reflection? A Strategic Approach Methodology.

    OpenAIRE

    Vivas, Carla; Sousa, António

    2011-01-01

    This paper aims to analyze the competitive environment of international companies in the wine sector and to assess the implications in terms of the development of strategic guidelines and various performance levels. We propose the application of the methodological Integrated Grid of Strategic Reflection (IGSR), based on a review of the literature (business management and viticulture), primary data collection and creative reflection. For qualitative analysis we used the PEST method and five fo...

  14. accelerating cavity

    CERN Multimedia

    On the inside of the cavity there is a layer of niobium. Operating at 4.2 degrees above absolute zero, the niobium is superconducting and carries an accelerating field of 6 million volts per metre with negligible losses. Each cavity has a surface of 6 m². The niobium layer is only 1.2 microns thick, ten times thinner than a hair. Such a large area had never been coated to such a high accuracy. A speck of dust could ruin the performance of the whole cavity so the work had to be done in an extremely clean environment.

  15. Cognitive ability and the effect of strategic uncertainty

    OpenAIRE

    Hanaki, Nobuyuki; Jacquemet, Nicolas; Luchini, Stéphane; Zylbersztejn, Adam

    2016-01-01

    How is one's cognitive ability related to the way one responds to strategic uncertainty? We address this question by conducting a set of experiments in simple 2 × 2 dominance solvable coordination games. Our experiments involve two main treatments: one in which two human subjects interact, and another in which one human subject interacts with a computer program whose behavior is known. By making the behavior of the computer perfectly predictable, the latter treatment...

  16. Cyber Conflict Between Taiwan and China; Strategic Insights, Spring 2011

    OpenAIRE

    Chang, Yao-Chung

    2011-01-01

    This article appeared in Strategic Insights, Spring 2011 The Republic of China (Taiwan hereafter) and the People’s Republic of China (China hereafter) are two particularly attractive targets for internet hackers. Reports have found that, compared to other countries in the Asia and Pacific regions, China and Taiwan rank as the top two countries in terms of malicious computer activity. Reports have also shown that most hacking into Taiwanese computer systems is initiated from wit...

  17. Thread Group Multithreading: Accelerating the Computation of an Agent-Based Power System Modeling and Simulation Tool -- C GridLAB-D

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Shuangshuang; Chassin, David P.

    2014-01-06

    GridLAB-D™ is an open source next generation agent-based smart-grid simulator that provides unprecedented capability to model the performance of smart grid technologies. Over the past few years, GridLAB-D has been used to conduct important analyses of smart grid concepts, but it is still quite limited by its computational performance. In order to break through the performance bottleneck to meet the need for large scale power grid simulations, we develop a thread group mechanism to implement highly granular multithreaded computation in GridLAB-D. We achieve close to linear speedups with the multithreaded version compared against the single-thread version of the same code running on general-purpose multi-core commodity hardware for a benchmark simple house model. The performance of the multithreaded code shows favorable scalability properties and resource utilization, and much shorter execution time for large-scale power grid simulations.
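    A minimal sketch of the thread-group idea, assuming a made-up House model: the set of simulated objects is partitioned into groups and each group is updated by its own thread within a simulation step. This is a generic illustration of granular multithreading, not GridLAB-D source code.

```cpp
// One simulation step: partition the houses into groups and update each group
// on its own thread; join before the next step.
#include <algorithm>
#include <iostream>
#include <thread>
#include <vector>

struct House {
    double indoorTempF = 70.0;
    void update(double outdoorTempF, double dtHours) {
        // Toy first-order relaxation toward the outdoor temperature.
        indoorTempF += 0.1 * (outdoorTempF - indoorTempF) * dtHours;
    }
};

int main() {
    const std::size_t nHouses = 100000, nThreads = 4;
    std::vector<House> houses(nHouses);

    auto worker = [&](std::size_t begin, std::size_t end) {
        for (std::size_t i = begin; i < end; ++i) houses[i].update(95.0, 1.0);
    };

    std::vector<std::thread> group;
    const std::size_t chunk = (nHouses + nThreads - 1) / nThreads;
    for (std::size_t t = 0; t < nThreads; ++t) {
        std::size_t b = t * chunk, e = std::min(nHouses, b + chunk);
        group.emplace_back(worker, b, e);
    }
    for (auto& th : group) th.join();

    double avg = 0.0;
    for (const auto& h : houses) avg += h.indoorTempF;
    std::cout << "average indoor temperature: " << avg / nHouses << " F\n";
}
```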

  18. FAMOUS, faster: using parallel computing techniques to accelerate the FAMOUS/HadCM3 climate model with a focus on the radiative transfer algorithm

    Science.gov (United States)

    Hanappe, P.; Beurivé, A.; Laguzet, F.; Steels, L.; Bellouin, N.; Boucher, O.; Yamazaki, Y. H.; Aina, T.; Allen, M.

    2011-09-01

    We have optimised the atmospheric radiation algorithm of the FAMOUS climate model on several hardware platforms. The optimisation involved translating the Fortran code to C and restructuring the algorithm around the computation of a single air column. Instead of the existing MPI-based domain decomposition, we used a task queue and a thread pool to schedule the computation of individual columns on the available processors. Finally, four air columns are packed together in a single data structure and computed simultaneously using Single Instruction Multiple Data operations. The modified algorithm runs more than 50 times faster on the CELL's Synergistic Processing Element than on its main PowerPC processing element. On Intel-compatible processors, the new radiation code runs 4 times faster. On the tested graphics processor, using OpenCL, we find a speed-up of more than 2.5 times as compared to the original code on the main CPU. Because the radiation code takes more than 60 % of the total CPU time, FAMOUS executes more than twice as fast. Our version of the algorithm returns bit-wise identical results, which demonstrates the robustness of our approach. We estimate that this project required around two and a half man-years of work.
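    The packing of four air columns into a single data structure can be sketched as follows: values for the four columns are stored side by side so that each arithmetic statement touches four values at once, which is the access pattern a compiler or explicit SIMD intrinsics can vectorise. The structure layout and the "radiative update" formula below are made-up placeholders, not the FAMOUS radiation code.

```cpp
// Four columns interleaved by level so the innermost loop runs over lanes.
#include <array>
#include <cstddef>
#include <iostream>

constexpr std::size_t kLanes = 4;    // columns processed simultaneously
constexpr std::size_t kLevels = 3;   // vertical levels per column (toy value)

struct ColumnPack {
    // temperature[level][lane]: same level of 4 different columns side by side
    std::array<std::array<double, kLanes>, kLevels> temperature{};
    std::array<std::array<double, kLanes>, kLevels> heatingRate{};
};

void radiativeUpdate(ColumnPack& p, double dt) {
    for (std::size_t k = 0; k < kLevels; ++k)
        for (std::size_t l = 0; l < kLanes; ++l)      // vectorisable lane loop
            p.temperature[k][l] += dt * p.heatingRate[k][l];
}

int main() {
    ColumnPack pack;
    for (std::size_t k = 0; k < kLevels; ++k)
        for (std::size_t l = 0; l < kLanes; ++l) {
            pack.temperature[k][l] = 250.0 + 10.0 * k;
            pack.heatingRate[k][l] = 1.0e-3 * (l + 1);
        }
    radiativeUpdate(pack, 1800.0);                     // one half-hour step
    std::cout << pack.temperature[0][0] << " " << pack.temperature[0][3] << "\n";
}
```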

  19. FAMOUS, faster: using parallel computing techniques to accelerate the FAMOUS/HadCM3 climate model with a focus on the radiative transfer algorithm

    Directory of Open Access Journals (Sweden)

    P. Hanappe

    2011-09-01

    Full Text Available We have optimised the atmospheric radiation algorithm of the FAMOUS climate model on several hardware platforms. The optimisation involved translating the Fortran code to C and restructuring the algorithm around the computation of a single air column. Instead of the existing MPI-based domain decomposition, we used a task queue and a thread pool to schedule the computation of individual columns on the available processors. Finally, four air columns are packed together in a single data structure and computed simultaneously using Single Instruction Multiple Data operations.

    The modified algorithm runs more than 50 times faster on the CELL's Synergistic Processing Element than on its main PowerPC processing element. On Intel-compatible processors, the new radiation code runs 4 times faster. On the tested graphics processor, using OpenCL, we find a speed-up of more than 2.5 times as compared to the original code on the main CPU. Because the radiation code takes more than 60 % of the total CPU time, FAMOUS executes more than twice as fast. Our version of the algorithm returns bit-wise identical results, which demonstrates the robustness of our approach. We estimate that this project required around two and a half man-years of work.

  20. Maintenance Process Strategic Analysis

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Stachowiak, A.

    2016-08-01

    The performance and competitiveness of manufacturing companies is dependent on the availability, reliability and productivity of their production facilities. Low productivity, downtime, and poor machine performance is often linked to inadequate plant maintenance, which in turn can lead to reduced production levels, increasing costs, lost market opportunities, and lower profits. These pressures have given firms worldwide the motivation to explore and embrace proactive maintenance strategies over the traditional reactive firefighting methods. The traditional view of maintenance has shifted into one of an overall view that encompasses Overall Equipment Efficiency, Stakeholders Management and Life Cycle assessment. From practical point of view it requires changes in approach to maintenance represented by managers and changes in actions performed within maintenance area. Managers have to understand that maintenance is not only about repairs and conservations of machines and devices, but also actions striving for more efficient resources management and care for safety and health of employees. The purpose of the work is to present strategic analysis based on SWOT analysis to identify the opportunities and strengths of maintenance process, to benefit from them as much as possible, as well as to identify weaknesses and threats, so that they could be eliminated or minimized.

  1. Dominant investors and strategic transparency

    NARCIS (Netherlands)

    E.C. Perotti; E.-L. von Thadden

    1999-01-01

    This paper studies product market competition under a strategic transparency decision. Dominant investors can influence information collection in the financial market, and thereby corporate transparency, by affecting market liquidity or the cost of information collection. More transparency on a firm

  2. Dominant investors and strategic transparency

    NARCIS (Netherlands)

    E.C. Perotti; E.-L. von Thadden

    1998-01-01

    This paper studies product market competition under a strategic transparency decision. Dominant investors can influence information collection in the financial market, and thereby corporate transparency, by affecting market liquidity or the cost of information collection. More transparency on a firm

  3. Strategic Arrivals Recommendation Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — During the conduct of a NASA Research Announcement (NRA) in 2012 and 2013, the Mosaic ATM team first developed the Strategic Arrivals Recommendation Tool concept,...

  4. The Strategic Process in Organisations

    DEFF Research Database (Denmark)

    Sørensen, Lene; Vidal, Rene Victor Valqui

    1999-01-01

    Organisational strategy development is often conceptualised through methodological frameworks. In this paper strategy development is seen as a strategic process characterised by inherent contradictions between actors, OR methods and the problem situation. The paper presents the dimensions...

  5. The Economics of Strategic Opportunity

    OpenAIRE

    Denrell, Jerker; Fang, Christina; Winter, Sidney G.

    2003-01-01

    As emphasized by Barney (1986), any explanation of superior profitability must account for why the resources supporting such profitability could have been acquired for a price below their rent generating capacity. Building upon the literature in economics on coordination failures and incomplete markets, we suggest a framework for analyzing such strategic factor market inefficiencies. Our point of departure is that a strategic opportunity exists whenever prices fail to reflect the value of a r...

  6. The Emerging Strategic Entrepreneurship Field

    OpenAIRE

    Foss, Nicolai J.; Lyngsie, Jacob

    2011-01-01

    The field of strategic entrepreneurship is a fairly recent one. Its central idea is that opportunity-seeking and advantage-seeking—the former the central subject of the entrepreneurship field, the latter the central subject of the strategic management field— are processes that need to be considered jointly. The purpose of this brief chapter is to explain the emergence of SE theory field in terms of a response to research gaps in the neighboring fields of entrepreneurship and st...

  7. Executive presence for strategic influence.

    Science.gov (United States)

    Shirey, Maria R

    2013-01-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches helpful to nurse leaders advancing organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the author discusses cultivating executive presence, a crucial component of great leadership, needed for strategic influence and to drive change.

  8. Pharmaceutical industry in strategic development

    OpenAIRE

    2013-01-01

    The world pharmaceutical industry has been changing profoundly as it has been steadily concentrating and consolidating over the last decade. According to our survey, we may underline that intensive marketing management represents an extremely important operational and even strategic function for proper business performance and long-term strategic orientation of the world pharmaceutical companies. We may even conclude that the intensive consolidation of the world pharmaceutical industry is a market driven a...

  9. ASPECTS REGARDING SMEs’ STRATEGIC MANAGEMENT

    OpenAIRE

    MAGDALENA VORZSÁK; MONICA MARIA COROS

    2007-01-01

    The main objective of the present research is to formulate a set of pertinent recommendations regarding the efficient adjustment of the principles and instruments of strategic management to the particularities of the SMEs. Thus, the authors intend to clarify the aspects referring to matters such as: the manner of defining the organizational vision and objectives; the drawing up of the strategic options regarding the needed resources (financial, material, human); the understanding of the clien...

  10. The neoliberalisation of strategic spatial planning

    DEFF Research Database (Denmark)

    Olesen, Kristian

    2014-01-01

    Strategic spatial planning practices have recently taken a neoliberal turn in many northwestern European countries. This neoliberalisation of strategic spatial planning has materialised partly in governance reforms aiming to reduce or abolish strategic spatial planning at national and regional...... scales, and partly through the normalisation of neoliberal discourses in strategic spatial planning processes. This paper analyses the complex relationship, partly of unease and partly of coevolution, between neoliberalism and strategic spatial planning. Furthermore, the paper discusses the key...... challenges for strategic spatial planning in the face of neoliberalism and argues for a need to strengthen strategic spatial planning’s critical dimension....

  11. Implementation of Hardware Accelerators on Zynq

    DEFF Research Database (Denmark)

    Toft, Jakob Kenn

    processors, which has made hardware accelerators an essential part of several datacentres and the world's fastest super-computers. In this work, two different hardware accelerators were implemented on a Xilinx Zynq SoC platform mounted on the ZedBoard platform. The two accelerators are based on two different... of the ARM Cortex-A9 processor featured on the Zynq SoC, with regard to execution time, power dissipation and energy consumption. The implementation of the hardware accelerators was successful. Use of the Monte Carlo processor resulted in a significant increase in performance. The Telco hardware accelerator...

  12. Accelerating abelian gauge dynamics

    CERN Document Server

    Adler, Stephen Louis

    1991-01-01

    In this paper, we suggest a new acceleration method for Abelian gauge theories based on linear transformations to variables which weight all length scales equally. We measure the autocorrelation time for the Polyakov loop and the plaquette at β=1.0 in the U(1) gauge theory in four dimensions, for the new method and for standard Metropolis updates. We find a dramatic improvement for the new method over the Metropolis method. Computing the critical exponent z for the new method remains an important open issue.
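    For readers unfamiliar with the quantity being measured, the sketch below shows one common way to estimate an integrated autocorrelation time from a Monte Carlo time series: compute the normalised autocorrelation function and sum it up to a window. The simple windowing rule (stop at the first non-positive value) and the synthetic AR(1) test series are assumptions for illustration, not the specific procedure used in the paper.

```cpp
// Integrated autocorrelation time tau_int = 1/2 + sum_t rho(t), with a crude
// window that truncates the sum at the first non-positive rho(t).
#include <cstdint>
#include <iostream>
#include <vector>

double integratedAutocorrelationTime(const std::vector<double>& x) {
    const std::size_t n = x.size();
    double mean = 0.0;
    for (double v : x) mean += v;
    mean /= n;

    auto cov = [&](std::size_t t) {           // unnormalised autocovariance C(t)
        double c = 0.0;
        for (std::size_t i = 0; i + t < n; ++i) c += (x[i] - mean) * (x[i + t] - mean);
        return c / (n - t);
    };

    const double c0 = cov(0);
    double tau = 0.5;
    for (std::size_t t = 1; t < n / 2; ++t) {
        double rho = cov(t) / c0;
        if (rho <= 0.0) break;                 // crude windowing rule
        tau += rho;
    }
    return tau;
}

int main() {
    // Synthetic correlated series: AR(1) process with coefficient 0.9
    // (exact tau_int = 0.5 * (1 + 0.9) / (1 - 0.9) = 9.5).
    std::vector<double> series;
    double v = 0.0;
    std::uint32_t seed = 12345u;
    for (int i = 0; i < 20000; ++i) {
        seed = 1664525u * seed + 1013904223u;              // simple LCG noise
        double noise = (seed / 4294967296.0) - 0.5;
        v = 0.9 * v + noise;
        series.push_back(v);
    }
    std::cout << "tau_int ~ " << integratedAutocorrelationTime(series) << "\n";
}
```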

  13. 2011 Computation Directorate Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, D L

    2012-04-11

    From its founding in 1952 until today, Lawrence Livermore National Laboratory (LLNL) has made significant strategic investments to develop high performance computing (HPC) and its application to national security and basic science. Now, 60 years later, the Computation Directorate and its myriad resources and capabilities have become a key enabler for LLNL programs and an integral part of the effort to support our nation's nuclear deterrent and, more broadly, national security. In addition, the technological innovation HPC makes possible is seen as vital to the nation's economic vitality. LLNL, along with other national laboratories, is working to make supercomputing capabilities and expertise available to industry to boost the nation's global competitiveness. LLNL is on the brink of an exciting milestone with the 2012 deployment of Sequoia, the National Nuclear Security Administration's (NNSA's) 20-petaFLOP/s resource that will apply uncertainty quantification to weapons science. Sequoia will bring LLNL's total computing power to more than 23 petaFLOP/s-all brought to bear on basic science and national security needs. The computing systems at LLNL provide game-changing capabilities. Sequoia and other next-generation platforms will enable predictive simulation in the coming decade and leverage industry trends, such as massively parallel and multicore processors, to run petascale applications. Efficient petascale computing necessitates refining accuracy in materials property data, improving models for known physical processes, identifying and then modeling for missing physics, quantifying uncertainty, and enhancing the performance of complex models and algorithms in macroscale simulation codes. Nearly 15 years ago, NNSA's Accelerated Strategic Computing Initiative (ASCI), now called the Advanced Simulation and Computing (ASC) Program, was the critical element needed to shift from test-based confidence to science-based confidence

  14. Standard test method for accelerated leach test for diffusive releases from solidified waste and a computer program to model diffusive, fractional leaching from cylindrical waste forms

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This test method provides procedures for measuring the leach rates of elements from a solidified matrix material, determining if the releases are controlled by mass diffusion, computing values of diffusion constants based on models, and verifying projected long-term diffusive releases. This test method is applicable to any material that does not degrade or deform during the test. 1.1.1 If mass diffusion is the dominant step in the leaching mechanism, then the results of this test can be used to calculate diffusion coefficients using mathematical diffusion models. A computer program developed for that purpose is available as a companion to this test method (Note 1). 1.1.2 It should be verified that leaching is controlled by diffusion by a means other than analysis of the leach test solution data. Analysis of concentration profiles of species of interest near the surface of the solid waste form after the test is recommended for this purpose. 1.1.3 Potential effects of partitioning on the test results can...
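    As a rough illustration of the kind of diffusion-model calculation the companion computer program automates, the sketch below fits an effective diffusion coefficient to cumulative-fraction-leached data using the semi-infinite-medium approximation CFL ≈ 2(S/V)·sqrt(De·t/π), valid for small leached fractions. The cylindrical geometry and the data values are assumptions made for illustration and are not taken from the standard.

    ```python
    import numpy as np

    def fit_effective_diffusivity(t, cfl, surface_area, volume):
        """Least-squares fit of De in CFL = 2*(S/V)*sqrt(De*t/pi) (small-fraction regime)."""
        t = np.asarray(t, float)
        cfl = np.asarray(cfl, float)
        # CFL is linear in sqrt(t): CFL = k*sqrt(t), with k = 2*(S/V)*sqrt(De/pi)
        k = np.sum(cfl * np.sqrt(t)) / np.sum(t)
        return np.pi * (k * volume / (2.0 * surface_area)) ** 2

    # Assumed cylindrical waste form: 5 cm radius, 10 cm height.
    r, h = 0.05, 0.10                                  # m
    S = 2 * np.pi * r * h + 2 * np.pi * r ** 2         # m^2
    V = np.pi * r ** 2 * h                             # m^3
    t = np.array([1, 2, 5, 7, 14, 28]) * 86400.0       # s
    cfl = np.array([0.004, 0.006, 0.009, 0.011, 0.015, 0.021])   # assumed measurements
    print(f"De ≈ {fit_effective_diffusivity(t, cfl, S, V):.2e} m^2/s")
    ```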

  15. Stratway: A Modular Approach to Strategic Conflict Resolution

    Science.gov (United States)

    Hagen, George E.; Butler, Ricky W.; Maddalon, Jeffrey M.

    2011-01-01

    In this paper we introduce Stratway, a modular approach to finding long-term strategic resolutions to conflicts between aircraft. The modular approach provides both advantages and disadvantages. Our primary concern is to investigate the implications on the verification of safety-critical properties of a strategic resolution algorithm. By partitioning the problem into verifiable modules, much stronger verification claims can be established. Since strategic resolution involves searching for solutions over an enormous state space, Stratway, like most similar algorithms, searches these spaces by applying heuristics, which present especially difficult verification challenges. An advantage of a modular approach is that it makes a clear distinction between the resolution function and the trajectory generation function. This allows the resolution computation to be independent of any particular vehicle. The Stratway algorithm was developed in both Java and C++ and is available under an open source license. Additionally, there is a visualization application that is helpful when analyzing and quickly creating conflict scenarios.

  16. TOPSIS Method for Determining The Priority of Strategic Training Program

    Directory of Open Access Journals (Sweden)

    Rohmatulloh Rohmatulloh

    2014-01-01

    Full Text Available The voice of stakeholders is an important issue for government or public organizations. The issue becomes an input in designing strategic programs. Decision makers should evaluate the priority to get the importance level. The decision-making process is a complex problem because it is influenced by many criteria. The purpose of this study is to solve a multi-criteria decision-making problem using the TOPSIS method. This method is proposed due to its easy and simple computation process. The case sample is determining the strategic training program in the energy and mineral resources field. TOPSIS analysis may be able to assist decision makers in allocating resources for the preparation of strategic training programs in accordance with the priorities
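    Since the record names the method but not its mechanics, a compact Python sketch of the standard TOPSIS steps (vector normalisation, weighting, ideal and anti-ideal points, relative closeness) is given below; the example decision matrix, weights, and criterion directions are invented for illustration and are not taken from the study.

    ```python
    import numpy as np

    def topsis(matrix, weights, benefit):
        """Rank alternatives with TOPSIS.
        matrix: alternatives x criteria; weights sum to 1;
        benefit: True for criteria to maximise, False for criteria to minimise."""
        X = np.asarray(matrix, float)
        w = np.asarray(weights, float)
        V = (X / np.linalg.norm(X, axis=0)) * w           # weighted normalised matrix
        ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
        anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
        d_plus = np.linalg.norm(V - ideal, axis=1)
        d_minus = np.linalg.norm(V - anti, axis=1)
        return d_minus / (d_plus + d_minus)               # closeness coefficient in [0, 1]

    # Illustrative: four candidate training programs scored on impact, demand and cost,
    # where cost is a non-benefit criterion (hypothetical numbers).
    scores = [[7, 9, 30], [8, 6, 20], [5, 8, 10], [9, 7, 40]]
    closeness = topsis(scores, weights=[0.5, 0.3, 0.2], benefit=[True, True, False])
    print(np.argsort(closeness)[::-1])    # program indices, highest priority first
    ```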

  17. CLOUD COMPUTING AND SECURITY

    Directory of Open Access Journals (Sweden)

    Asharani Shinde

    2015-10-01

    Full Text Available This document gives an insight into Cloud Computing, giving an overview of key features as well as a detailed study of the exact working of cloud computing. Cloud Computing lets you access all your applications and documents from anywhere in the world, freeing you from the confines of the desktop and thus making it easier for group members in different locations to collaborate. Certainly, cloud computing can bring about strategic, transformational and even revolutionary benefits fundamental to future enterprise computing, but it also offers immediate and pragmatic opportunities to improve efficiencies today while cost-effectively and systematically setting the stage for strategic change. As this technology makes computing, sharing and networking easy and interesting, we should think about the security and privacy of information too. Thus the key points to be discussed are: what the cloud is, its key features, current applications, its future status, and the security issues and possible solutions.

  18. Application of Plasma Waveguides to High Energy Accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Milchberg, Howard [Univ. of Maryland, College Park, MD (United States)

    2016-07-01

    This grant supported basic experimental, theoretical and computer simulation research into developing a compact, high pulse repetition rate laser accelerator using the direct laser acceleration mechanism in plasma-based slow wave structures.

  19. A model for strategic leadership.

    Science.gov (United States)

    Carver, J

    1989-01-01

    Boards can exercise strategic leadership, but to do so well, they must change their concept of governance. Most of what is traditionally accepted in the boardroom is not conducive to excellence in strategic leadership and, in fact, actually impedes it. First, trustees must be able to escape the clutter that interferes with a continual emphasis on strategic issues. This calls for a far more incisive way to withdraw responsibility from administrative issues in favor of control by policy. Second, trustees must understand that strategic leadership is a matter of how the board approaches its job, not a mechanical annual tussle with setting long-range goals. Third, trustees must structure their annual plans around updating their long-range intentions--chiefly concerning mission issues and priorities--and restating them in amended ends policies. Finally, trustees must realize that strategic leadership--hence, good governance--is not so much about budgets, personnel practices, construction projects or medical staff bylaws as it is about public policy, purpose, values and the vision that shapes institutional destiny. PMID:10295445

  20. Strategic Management in Radiology Department

    Directory of Open Access Journals (Sweden)

    Arash Deljou

    2009-01-01

    Full Text Available "nA radiologist makes literally hundreds of decisions each day, but once each of those decisions is made, the case is finished. The radiologist is free to move on to the next set of decisions. Radiology practice administrators, in contrast, may be tied up solidly for 6 months while carrying out one business decision. The decision-making processes, environments, and timetables differ so greatly between physicians and administrators that bridging those cultural gaps becomes, in itself, an important step in planning, strategic and operational. "nOne of the main premise is that there is no “Holy Grail” to be found in strategic management, only an understanding that planning and change are the responsibility of senior management in radiology sector. "nIn fact, it is now their primary job in today’s radiology world of constant change, “Analogue to Digital”.  "nExcellent organizations don’t just have a budgeting cycle each year; they have a “strategic management” cycle led by senior management as they work on the organization, rather than just in the organization. As planning is just the first function of management, and strategic planning is just the highest order of planning and the purview of senior management in every radiology department, then every department has three basic goals: "n1. Develop strategic and operational plans. "n2. Ensure their successful implementation and change. "n3. Build and sustain high performance over the long term.  

  1. The Los Alamos accelerator code group

    International Nuclear Information System (INIS)

    The Los Alamos Accelerator Code Group (LAACG) is a national resource for members of the accelerator community who use and/or develop software for the design and analysis of particle accelerators, beam transport systems, light sources, storage rings, and components of these systems. Below the authors describe the LAACG's activities in high performance computing, maintenance and enhancement of POISSON/SUPERFISH and related codes and the dissemination of information on the INTERNET

  2. Computational thinking and thinking about computing

    OpenAIRE

    Wing, Jeannette M.

    2008-01-01

    Computational thinking will influence everyone in every field of endeavour. This vision poses a new educational challenge for our society, especially for our children. In thinking about computing, we need to be attuned to the three drivers of our field: science, technology and society. Accelerating technological advances and monumental societal demands force us to revisit the most basic scientific questions of computing.

  3. The Emerging Strategic Entrepreneurship Field

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Lyngsie, Jacob

    The field of strategic entrepreneurship is a fairly recent one. Its central idea is that opportunity-seeking and advantage-seeking — the former the central subject of the entrepreneurship field, the latter the central subject of the strategic management field — are processes that need to be considered jointly. The purpose of this brief chapter is to explain the emergence of the SE theory field in terms of a response to research gaps in the neighboring fields of entrepreneurship and strategic management; describe the main tenets of SE theory; discuss its relations to neighboring fields; and finally describe some research gaps in extant theory, mainly focusing on the need to provide clear microfoundations for SE theory and link it to organizational design theory.

  4. Using Intellectual Property Rights Strategically

    DEFF Research Database (Denmark)

    Reitzig, Markus

    2003-01-01

    With the share of intellectual property among corporate value constantly rising, management's understanding of the strategic use of patents, trademarks, and copyrights becomes ever more crucial. The vast majority of articles on patent or trademark strategies, however, is written by and for lawyers...... describing advanced procedural legal tactics but not addressing the problems faced by managers. On the other hand, the marketing, technology management, and economics literature provides important basic insights to managerial aspects of IPRs but does hardly expand on the fundamental knowledge so that it would...... in various industries, this article attempts at defining the content of an `intellectual property strategy' from the perspective of a strategic management scholar by linking the use of IPRs to a series of strategic management questions. Ultimately, the paper helps strategic managers to understand recent...

  5. Corporate Foresight and Strategic Decisions

    DEFF Research Database (Denmark)

    Gomez Portaleoni, Claudio; Marinova, Svetla Trifonova; Ul-Haq, Rehan;

    Corporate Foresight and Strategic Decisions investigates the relationships between corporate foresight and management decision-making processes in organizations. It provides an extensive analysis of extant theories of corporate foresight and strategic management, brings in new notions and insights, and presents an in-depth case study exploration of corporate foresight of a European bank. The understanding of organizational future is influenced by the perceived accountability and integrity of the participating departments as well as by the apparent nature of environmental explosiveness. This book provides clear confirmations showing that the impacts of corporate foresight on strategic decisions are critically affected by the evaluative and analytical verdicts...... dealt successfully with internal information sharing processes that in most cases have prepared them for the challenges of the future.

  6. Managing transdisciplinarity in strategic foresight

    DEFF Research Database (Denmark)

    Rasmussen, Birgitte; Andersen, Per Dannemand; Borch, Kristian

    2010-01-01

    Strategic foresight deals with the long term future and is a transdisciplinary exercise which, among other aims, addresses the prioritization of science and other decision making in science and innovation advisory and funding bodies. This article discusses challenges in strategic foresight...... to negotiate over how to attain a desirable future. This requires creative thinking from the participants, who need to extend their knowledge into the uncertainty of the future. Equally important is skilled facilitating in order to create a space for dialogue and exploration in a contested territory. Although strategic foresight has now been widely accepted for strategy-making and priority-setting in science and innovation policy, the methodologies underpinning it still need further development. Key findings are the identification of challenges, aspects and issues related to management and facilitation...

  7. NATO's Strategic Partnership with Ukraine

    DEFF Research Database (Denmark)

    Breitenbauch, Henrik Ø.

    2014-01-01

    Russian actions in Ukraine have altered the security landscape in Europe, highlighting a renewed emphasis on the differences between members and non-members. In this context, NATO must a) create a strategic understanding of partnerships as something that can be transformative, even if it will not lead to membership in the short or even long term, and b) build such a strategic relationship with Ukraine. In sum, the Russian-induced Ukraine crisis should spur the reform of NATO partnerships – with Ukraine as a case in point.

  8. Strategic Leadership of Corporate Sustainability

    DEFF Research Database (Denmark)

    Strand, Robert

    2014-01-01

    Strategic leadership and corporate sustainability have recently come together in conspicuously explicit fashion through the emergence of top management team (TMT) positions with dedicated corporate sustainability responsibilities. These TMT positions, commonly referred to as 'Chief Sustainability...... ? What effects do corporate sustainability TMT positions have at their organizations? We consider these questions through strategic leadership and neoinstitutional theoretical frameworks. Through the latter, we also engage with Weberian considerations of bureaucracy. We find that the reasons why...... unrealized without concerted attention and coordination afforded by a strategic level position. Regarding effects, we determine the position can relate to the establishment of bureaucratic structures dedicated to corporate sustainability within the corporation through which formalized processes and key...

  9. Strategic planning in healthcare organizations.

    Science.gov (United States)

    Rodríguez Perera, Francisco de Paula; Peiró, Manel

    2012-08-01

    Strategic planning is a completely valid and useful tool for guiding all types of organizations, including healthcare organizations. The organizational level at which the strategic planning process is relevant depends on the unit's size, its complexity, and the differentiation of the service provided. A cardiology department, a hemodynamic unit, or an electrophysiology unit can be an appropriate level, as long as their plans align with other plans at higher levels. The leader of each unit is the person responsible for promoting the planning process, a core and essential part of his or her role. The process of strategic planning is programmable, systematic, rational, and holistic and integrates the short, medium, and long term, allowing the healthcare organization to focus on relevant and lasting transformations for the future.

  10. Limited rationality and strategic interaction

    DEFF Research Database (Denmark)

    Fehr, Ernst; Tyran, Jean-Robert

    2008-01-01

    Much evidence suggests that people are heterogeneous with regard to their abilities to make rational, forward-looking decisions. This raises the question as to when the rational types are decisive for aggregate outcomes and when the boundedly rational types shape aggregate results. We examine...... this question in the context of a long-standing and important economic problem: the adjustment of nominal prices after an anticipated monetary shock. Our experiments suggest that two types of bounded rationality-money illusion and anchoring-are important behavioral forces behind nominal inertia. However......, depending on the strategic environment, bounded rationality has vastly different effects on aggregate price adjustment. If agents' actions are strategic substitutes, adjustment to the new equilibrium is extremely quick, whereas under strategic complementarity, adjustment is both very slow and associated...

  11. Whole scale change for real-time strategic application in complex health systems.

    Science.gov (United States)

    Shirey, Maria R; Calarco, Margaret M

    2014-11-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. In this article, the authors introduce Whole Scale Change™, an action learning approach that accelerates organizational transformation to meet the challenges of dynamic environments.

  12. Anderson Acceleration for Fixed-Point Iterations

    Energy Technology Data Exchange (ETDEWEB)

    Walker, Homer F. [Worcester Polytechnic Institute, MA (United States)

    2015-08-31

    The purpose of this grant was to support research on acceleration methods for fixed-point iterations, with applications to computational frameworks and simulation problems that are of interest to DOE.
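    The record does not describe the method itself, but Anderson acceleration of a fixed-point map g is compact enough to sketch; the history depth, the least-squares formulation, and the test problem below are illustrative assumptions rather than details of the funded work.

    ```python
    import numpy as np

    def anderson(g, x0, m=5, tol=1e-10, max_iter=100):
        """Anderson acceleration for the fixed-point iteration x = g(x)."""
        x = np.asarray(x0, float)
        X_hist, F_hist = [], []                    # past iterates and residuals f = g(x) - x
        for _ in range(max_iter):
            f = g(x) - x
            if np.linalg.norm(f) < tol:
                return x
            X_hist.append(x.copy()); F_hist.append(f.copy())
            if len(F_hist) > m + 1:                # keep at most m residual differences
                X_hist.pop(0); F_hist.pop(0)
            if len(F_hist) == 1:
                x = x + f                          # plain fixed-point step to start
            else:
                dF = np.column_stack([F_hist[i + 1] - F_hist[i] for i in range(len(F_hist) - 1)])
                dX = np.column_stack([X_hist[i + 1] - X_hist[i] for i in range(len(X_hist) - 1)])
                gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
                x = x + f - (dX + dF) @ gamma      # undamped Anderson mixing step
        return x

    # Illustrative test: solve x = cos(x) componentwise.
    print(anderson(np.cos, np.zeros(3)))           # each component ≈ 0.739085
    ```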

  13. Final Draft Strategic Marketing Plan.

    Energy Technology Data Exchange (ETDEWEB)

    United States. Bonneville Power Administration.

    1994-02-01

    The Bonneville Power Administration (BPA) has developed a marketing plan to define how BPA can be viable and competitive in the future, a result important to BPA's customers and constituents. The Marketing Plan represents the preferred customer outcomes, marketplace achievements, and competitive advantage required to accomplish the Vision and the Strategic Business Objectives of the agency. The Marketing Plan contributes to successful implementation of BPA's Strategic Business Objectives (SBOs) by providing common guidance to organizations and activities throughout the agency responsible for (1) planning, constructing, operating, and maintaining the Federal Columbia River Power System; (2) conducting business with BPA's customers; and (3) providing required internal support services.

  14. Is Strategic Entrepreneurship a Pleonasm?

    OpenAIRE

    2013-01-01

    A plethora of terms have been used to describe the notion of entrepreneurial and innovative orientation and action within organizations. While 'corporate' and 'entrepreneurship' coupled together in one term can be viewed as antithetical by some, is the arrival of its newer cousin 'strategic entrepreneurship' essentially the same notion? Irrespective of how opinions and logic may pivot on this debate, strategic entrepreneurship nevertheless seems to have some of the markings of an admitt...

  15. Contrasting strategic and Milan therapies.

    Science.gov (United States)

    MacKinnon, L

    1983-12-01

    Three related models of therapy are often grouped together as the strategic therapies. These are the brief therapy model associated with the Mental Research Institute, the approaches developed by Jay Haley and Cloë Madanes, and the model developed by the Milan associates. Controversy exists, however, as to whether the Milan model should be included as a strategic therapy. It appears that the similarities among the three models can mask deeper differences, thus compounding the confusion. This paper contrasts the models in their development, theory, and practice.

  16. Issues in Strategic Decision Modelling

    CERN Document Server

    Jennings, Paula

    2008-01-01

    [Spreadsheet] Models are invaluable tools for strategic planning. Models help key decision makers develop a shared conceptual understanding of complex decisions, identify sensitivity factors and test management scenarios. Different modelling approaches are specialist areas in themselves. Model development can be onerous, expensive, time consuming, and often bewildering. It is also an iterative process where the true magnitude of the effort, time and data required is often not fully understood until well into the process. This paper explores the traditional approaches to strategic planning modelling commonly used in organisations and considers the application of a real-options approach to match and benefit from the increasing uncertainty in today's rapidly changing world.
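    Since the record points to a real-options approach without detail, the sketch below shows a generic one-period binomial valuation of the option to defer an investment, a common building block in real-options spreadsheet models; all numbers and the payoff structure are assumptions made for illustration.

    ```python
    import math

    def one_period_deferral_option(v, investment, u, d, r):
        """Value of the choice between investing now and waiting one period.
        v: present value of project cash flows; u, d: up/down factors; r: risk-free rate."""
        p = (math.exp(r) - d) / (u - d)            # risk-neutral probability
        assert 0.0 < p < 1.0, "u, d and r must admit a risk-neutral measure"
        wait = math.exp(-r) * (p * max(u * v - investment, 0.0)
                               + (1 - p) * max(d * v - investment, 0.0))
        invest_now = max(v - investment, 0.0)
        return max(invest_now, wait)

    # Hypothetical numbers: project worth 100, cost 105, 20% up/down moves, 5% rate.
    print(one_period_deferral_option(v=100.0, investment=105.0, u=1.2, d=0.8, r=0.05))
    ```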

  17. Accelerators for heavy-ion fusion

    International Nuclear Information System (INIS)

    The author discusses accelerators for heavy-ion fusion rather than accelerators for strategic defense systems, focusing first on generic fusion issues. The author maintains that a sensible fusion system must satisfy three conditions: the total capital cost of the system must be acceptable; the cost of electricity must also be acceptable; and there must be a reasonable way to get from where we are today to where we want to be ultimately, i.e., there must be a sensible R&D path. The author believes that inertial confinement fusion (ICF) provides a reasonable R&D path and explains why. The heavy-ion experiments performed so far have tested only transverse beam dynamics. The author believes that heavy-ion fusion is a promising fusion option and that multistage accelerators are capable of satisfying the engineering requirements of fusion power production

  18. Strategic Planning and Strategic Thinking Clothed in STRATEGO

    Science.gov (United States)

    Baaki, John; Moseley, James L.

    2011-01-01

    This article shares experiences that participants had playing the game of STRATEGO and how the activity may be linked to strategic planning and thinking. Among the human performance technology implications of playing this game are that gamers agreed on a framework for rules, took stock of where they wanted to go in the future, and generated a risk…

  19. The strategic labor allocation proces : a model of strategic HRM

    NARCIS (Netherlands)

    Bax, Erik H.

    2002-01-01

    In this article the Strategic Labor Allocation Process model (SLAP) is described. The model relates HR-strategies to structure, culture and task technology to HR-policies like recruitment, appraisal and rewarding, to business strategy and to socio-cultural, economic, institutional and technological

  20. Acceleration of Runge-Kutta integration schemes

    OpenAIRE

    Udwadia, Firdaus E.; Phailaung Phohomsiri

    2004-01-01

    A simple accelerated third-order Runge-Kutta-type, fixed time step, integration scheme that uses just two function evaluations per step is developed. Because of the lower number of function evaluations, the scheme proposed herein has a lower computational cost than the standard third-order Runge-Kutta scheme while maintaining the same order of local accuracy. Numerical examples illustrating the computational efficiency and accuracy are presented and the actual speedup when the accelerated alg...
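    The accelerated scheme itself is not reproduced here; for context, below is a minimal Python sketch of the classical explicit third-order Runge-Kutta (Kutta) method that serves as the cost baseline in the record, requiring three function evaluations per step. The test equation is an assumption for illustration.

    ```python
    import math

    def rk3_step(f, t, y, h):
        """One step of Kutta's classical third-order method (three evaluations of f)."""
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h, y - h * k1 + 2 * h * k2)
        return y + h / 6 * (k1 + 4 * k2 + k3)

    # Illustrative: integrate y' = -y, y(0) = 1 on [0, 1]; exact value is exp(-1).
    f = lambda t, y: -y
    t, y, h = 0.0, 1.0, 0.1
    while t < 1.0 - 1e-12:
        y = rk3_step(f, t, y, h)
        t += h
    print(y, math.exp(-1.0))
    ```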

  1. Navigation Satellites Velocity and Acceleration Computation: Methods and Accuracy Analysis

    Institute of Scientific and Technical Information of China (English)

    李显; 吴美平; 张开东; 曹聚亮; 黄杨明

    2012-01-01

    A systematic analysis of the different methods to calculate the velocities and accelerations of navigation satellites is made, including (1) the closed analytical method based on broadcast ephemeris, (2) the numerical differencing method based on position series of the satellite, and (3) the analytical differencing method based on position series of the satellite. Firstly, the analytical expressions are deduced based on broadcast ephemeris; three types of broadcast ephemeris, including the Kepler-element type, GE~, and the position-velocity type, are discussed. The following results are drawn from the precision comparison: (1) the accuracy of velocities and accelerations derived from broadcast ephemeris is relatively low and cannot meet high-precision applications such as airborne gravimetric measurement; (2) the acceleration accuracy is higher when derived from position-velocity broadcast ephemeris, while the Kepler type gives higher velocity accuracy; (3) the orbit height is one of the factors determining computation precision. Then, the analytical differencing and numerical differencing methods based on precise ephemeris for deriving velocities and accelerations are analyzed and compared; the results show that although the analytical method is more efficient, its velocity computation precision is lower because the orbit analytical model built from a short-term position series is inaccurate, whereas its acceleration computation precision is comparable to that of the numerical differencing method. Finally, a static experiment is conducted with data from two CORS (continuously operating reference system) stations to evaluate and compare the computation accuracy of the methods mentioned above.
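    As an illustration of the position-series numerical-differencing approach compared in the record, the sketch below applies second-order central differences to an evenly sampled orbit; the sampling interval and the circular, roughly GPS-like test orbit are assumptions for illustration, and the broadcast-ephemeris analytical formulas are not reproduced.

    ```python
    import numpy as np

    def central_differences(positions, dt):
        """Velocity and acceleration from an evenly sampled position series (N x 3).
        Returns arrays for the interior epochs (the two endpoints are dropped)."""
        r = np.asarray(positions, float)
        vel = (r[2:] - r[:-2]) / (2.0 * dt)
        acc = (r[2:] - 2.0 * r[1:-1] + r[:-2]) / dt ** 2
        return vel, acc

    # Assumed test case: circular orbit of radius a and angular rate n, sampled every 30 s.
    a, n, dt = 26_560e3, 1.458e-4, 30.0            # m, rad/s, s (roughly GPS-like)
    t = np.arange(0.0, 3600.0, dt)
    pos = np.column_stack([a * np.cos(n * t), a * np.sin(n * t), np.zeros_like(t)])
    vel, acc = central_differences(pos, dt)
    print(np.linalg.norm(vel[0]), a * n)           # speed should be close to a*n
    print(np.linalg.norm(acc[0]), a * n ** 2)      # centripetal acceleration ≈ a*n^2
    ```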

  2. The Danish experience of strategic environment assesment

    DEFF Research Database (Denmark)

    Kørnøv, Lone

    2004-01-01

    The article recounts a number of examples of the Danish experience with Strategic Environmental Assessment (SEA).

  3. Are human resource professionals strategic business partners?

    DEFF Research Database (Denmark)

    Chiu, Randy; Selmer, Jan

    2011-01-01

    Theoretical speculations and prescriptive discussions abound in the literature regarding the strategic importance of human resource management. However, evidence based on rigorous empirical studies that the transformation from an administrative service function to strategic partnership has taken...

  4. Linear Accelerator (LINAC)

    Science.gov (United States)

    A linear accelerator (LINAC) customizes high energy x-rays to ... What is this equipment used for? A linear accelerator (LINAC) is the device most commonly used ...

  5. OpenMP for Accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Beyer, J C; Stotzer, E J; Hart, A; de Supinski, B R

    2011-03-15

    OpenMP [13] is the dominant programming model for shared-memory parallelism in C, C++ and Fortran due to its easy-to-use directive-based style, portability and broad support by compiler vendors. Similar characteristics are needed for a programming model for devices such as GPUs and DSPs that are gaining popularity to accelerate compute-intensive application regions. This paper presents extensions to OpenMP that provide that programming model. Our results demonstrate that a high-level programming model can provide accelerated performance comparable to hand-coded implementations in CUDA.

  6. Strategic Listening for School Leaders

    Science.gov (United States)

    Tate, Jeannine S.; Dunklee, Dennis R.

    2005-01-01

    The ability to communicate effectively with multiple constituencies is recognized as an essential characteristic of effective leaders. Listening strategically is a way of showing parents, students, faculty, staff, and others that their ideas and beliefs are of value. The authors' practitioner-friendly book concentrates on the importance of…

  7. Strategic petroleum reserve annual report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-15

    Section 165 of the Energy Policy and Conservation Act (Public Law 94-163), as amended, requires the Secretary of Energy to submit annual reports to the President and the Congress on activities of the Strategic Petroleum Reserve (SPR). This report describes activities for the year ending December 31, 1995.

  8. Strategic Asset Seeking by EMNEs

    DEFF Research Database (Denmark)

    Petersen, Bent; Seifert, Jr., Rene E.

    2014-01-01

    Purpose: The chapter provides an economic explanation and perspectivation of strategic asset seeking of multinational enterprises from emerging economies (EMNEs) as a prominent feature of today’s global economy. Approach: The authors apply and extend the “springboard perspective.” This perspectiv...

  9. Strategic interactions in environmental economics

    NARCIS (Netherlands)

    Heijnen, Pim

    2007-01-01

    This thesis deals with the strategic interactions important to environmental economics. That includes a wide range of topics: environmental groups influencing or informing consumers, the reaction to taxation by car and fuel producers, the spread of an innovative tax and binary public good games.

  10. Corporate Social Responsibility: Strategic Implications

    OpenAIRE

    Abagail McWilliams; Siegel, Donald S.; Patrick M. Wright

    2005-01-01

    We describe a variety of perspectives on corporate social responsibility (CSR), which we use to develop a framework for consideration of the strategic implications of CSR. Based on this framework, we propose an agenda for additional theoretical and empirical research on CSR. We then review the papers in this special issue and relate them to the proposed agenda.

  11. Strategic alliances : fit or failure

    NARCIS (Netherlands)

    Douma, Marc Ulco

    1997-01-01

    Strategic alliances have come to form a structural element within present day economic systems. In spite of a considerable increase in the number of alliances, in practice these are quite often less successful than initially anticipated by the partners involved. These two observations formed the imm

  12. The Test of Strategic Culture

    DEFF Research Database (Denmark)

    Dalgaard-Nielsen, Anja

    2005-01-01

    that have so far framed the reunified Germany’s military policy. Iraq simply showed that Germany, like most other countries, has conditions that have to be met – in Germany’s case, conditions flowing from the coexistence of two competing schools of thought within Germany’s strategic culture....

  13. A strategic PACS maturity approach

    NARCIS (Netherlands)

    van de Wetering, R.

    2011-01-01

    Finding the key determinants of Picture Archiving and Communication Systems (PACS) performance in hospitals has been a conundrum for decades. This research provides a method to assess the strategic alignment of PACS in hospitals in order to find these key determinants. PACS touches upon every single part

  14. Entrepreneurial Spirit in Strategic Planning.

    Science.gov (United States)

    Riggs, Donald E.

    1987-01-01

    Presents a model which merges the concepts of entrepreneurship with those of strategic planning to create a library management system. Each step of the process, including needs assessment and policy formation, strategy choice and implementation, and evaluation, is described in detail. (CLB)

  15. Organising purchasing and (strategic) sourcing

    DEFF Research Database (Denmark)

    Lidegaard, Nina; Boer, Harry; Munkgaard Møller, Morten

    2015-01-01

    or hybrid overall structure, deliver the expected results. Contingency theory predicts that the success of a firm depends on the fit among characteristics of, amongst others, the firm’s processes and organisational structure. The objective of this paper is to propose and illustrate a process-based...... typological theory of purchasing and (strategic) sourcing organisation....

  16. Strategic Groups and Banks’ Performance

    Directory of Open Access Journals (Sweden)

    Gregorz Halaj

    2009-06-01

    Full Text Available The theory of strategic groups predicts the existence of stable groups of companies that adopt similar business strategies. The theory also predicts that groups will differ in performance and in their reaction to external shocks. We use cluster analysis to identify strategic groups in the Polish banking sector. We find stable groups in the Polish banking sector constituted after the year 2000 following the major privatisation and ownership changes connected with transition to the mostly-privately-owned banking sector in the late 90s. Using panel regression methods we show that the allocation of banks to groups is statistically significant in explaining the profitability of banks. Thus, breaking down the banks into strategic groups and allowing for the different reaction of the groups to external shocks helps in a more accurate explanation of profits of the banking sector as a whole. Therefore, a more precise ex ante assessment of the loss absorption capabilities of banks is possible, which is crucial for an analysis of banking sector stability. However, we did not find evidence of the usefulness of strategic groups in explaining the quality of bank portfolios as measured by irregular loans over total loans, which is a more direct way to assess risks to financial stability.
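    As a hedged illustration of the cluster-analysis step, the sketch below groups banks into strategic groups with k-means on a few standardised strategy variables; the chosen variables, the number of groups, and the synthetic data are assumptions for illustration, not the study's data or specification.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    # Assumed strategy variables per bank: loans/assets, deposits/liabilities, fee income share.
    # Two latent strategy profiles generate 80 synthetic banks.
    banks = rng.normal(loc=[[0.6, 0.7, 0.2], [0.3, 0.5, 0.5]],
                       scale=0.05, size=(40, 2, 3)).reshape(-1, 3)

    X = StandardScaler().fit_transform(banks)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    print(np.bincount(labels))    # sizes of the identified strategic groups
    ```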

  17. IT Strategic and Operational Controls

    CERN Document Server

    Kyriazoglou, J

    2010-01-01

    This book provides a comprehensive guide to implementing an integrated and flexible set of IT controls in a systematic way. It can help organisations to formulate a complete culture for all areas which must be supervised and controlled; allowing them to simultaneously ensure a secure, high standard whilst striving to obtain the strategic and operational goals of the company.

  18. TFV as a strategic tool

    DEFF Research Database (Denmark)

    Bonke, Sten; Bertelsen, Sven

    2011-01-01

    The paper investigates the use of the Transformation-Flow-Value theory as a strategic tool in the development of the project production firm. When producing products such as ships, focus on value more than on cost may be the best approach, but in service industries such as construction, focus...

  19. Strategic School Planning in Jordan

    Science.gov (United States)

    Al-Zboon, Mohammad Saleem; Hasan, Manal Subhi

    2012-01-01

    The study aimed to measure the degree to which the strategic school planning stages are applied at governmental high schools, from the perspective of educational supervisors and principals in the directorates related to Amman city. The study population consisted of the educational supervisors and principals working at the educational directorates related to…

  20. Strategic Approaches with Resistant Families.

    Science.gov (United States)

    Breit, Miranda; And Others

    1983-01-01

    Describes the operation of a 10-session brief therapy unit for families who have failed in more traditional treatment modalities. Case material is presented to exemplify five different treatment strategies: symptom prescription, reframing, illusion of alternatives, role play, and strategic alliances. Advantages and limitations are discussed.…

  1. Strategic Audit and Marketing Plan

    Science.gov (United States)

    Wright, Lianna S.

    2013-01-01

    The purpose of this audit was to revise the marketing plan for ADSum Professional Development School and give the owner a long-term vision of the school to operate competitively in the crowded field of for-profit schools. It is fairly simple to create a strategic plan but harder to implement and execute. Execution requires weeks and months of…

  2. Negotiation for Strategic Video Games

    OpenAIRE

    Afiouni, Einar Nour; Ovrelid, Leif Julian

    2013-01-01

    This project aims to examine the possibilities of using game theoretic concepts and multi-agent systems in modern video games with real time demands. We have implemented a multi-issue negotiation system for the strategic video game Civilization IV, evaluating different negotiation techniques with a focus on the use of opponent modeling to improve negotiation results.

  3. Strategic Planning for School Success.

    Science.gov (United States)

    Herman, Jerry J.

    1993-01-01

    Strategic planners concerned with such matters as high-achieving students, high-performing teachers, broad-based community support, and a two-way involvement with the community must analyze the strengths, weaknesses, opportunities, and threats existing in the school's internal and external environment. A sample SWOT analysis is included. (MLH)

  4. Rewarding Stakeholders: The Perspective of Strategic Entrepreneurship

    OpenAIRE

    Dissanayake, Srinath

    2013-01-01

    A prime concern for stakeholders is a crucial aspect of any business's success. Among the wide spectrum of organizational strategies, Strategic Entrepreneurship places particular emphasis on this. This essay details practical as well as empirical grounds with regard to the notion of Strategic Entrepreneurship. Focally, Strategic Entrepreneurship is an integration of Entrepreneurship (Opportunity Seeking Behavior) and Strategic Management (Advantage Seeking Behavior). Thus I conclude that an amalgamation of Str...

  5. Strategic Interaction, Competition, Cooperation and Observability

    OpenAIRE

    Groh, Christian

    2001-01-01

    Both the Cournot and the Bertrand oligopoly model make an important distinction between buyers and sellers. Sellers act as strategic players and are able to influence prices, buyers act non-strategically and are just represented by a demand function. This dissertation looks at situations without distinctions between buyers and sellers. Chapter two analyzes strategic market games à la Shapley and Shubik (Journal of Political Economy, 1977) with sequential moves. Usually, strategic market games...

  6. Maturity of strategic management in organizations

    OpenAIRE

    Anna Witek-Crabb

    2015-01-01

    There is some ambivalence with regards to how to improve strategic management of organizations. On the one hand the example of big companies emphasizes the need for formalization and good organization of strategic management process. On the other hand the example of small companies draws attention to such qualities as entrepreneurship, flexibility and adaptability. The concept of strategic management maturity embraces both of these priorities. In this paper a framework for strategic managemen...

  7. Legislation strategic enterprises of machine building industry

    OpenAIRE

    Kotovska, Iryna; Halushchak, Olha Yaropolkivna

    2011-01-01

    The article explores the stages of strategic planning in Ukraine and determines its value for industry in general and the engineering industry in particular. We consider the legislative framework that regulates the process of strategic planning at three levels: national, regional and sectoral. It was found that the number of strategic documents at various levels is increasing, which decreases the effectiveness of their implementation. The legal framework of strategic planning is not complete and is mostly declarat...

  8. Legislation strategic enterprises of machine building industry

    Directory of Open Access Journals (Sweden)

    Kotovska, Iryna

    2011-11-01

    Full Text Available The article explores the stages of strategic planning in Ukraine and determines its value for industry in general and the engineering industry in particular. We consider the legislative framework that regulates the process of strategic planning at three levels: national, regional and sectoral. It was found that the number of strategic documents at various levels is increasing, which decreases the effectiveness of their implementation. The legal framework of strategic planning is not complete and is mostly declarative.

  9. Information System of Scenario Strategic Planning

    OpenAIRE

    Denis, R.; Maxim, P.; Sergei, M.

    2009-01-01

    This paper gives an overview of the concept of a new system to support the decision-making process in the area of strategic management of the company DEM. To ensure that in the modern world the company remains a leader in its industry, it needs to continually adjust its development strategy to changes. Nevertheless, there is no unified methodological framework for strategic planning. Also, there are no strategic planning systems to help managers make strategic decisions (a problem of choice of the one of th...

  10. Characteristics of Useful and Practical Organizational Strategic Plans

    Science.gov (United States)

    Kaufman, Roger

    2014-01-01

    Most organizational strategic plans are not strategic but rather tactical or operational plans masquerading as "strategic." This article identifies the basic elements required in a useful and practical strategic plan and explains why they are important.

  11. Fiscal years 1994--1998 Information Technology Strategic Plan

    International Nuclear Information System (INIS)

    A team of senior managers from across the US Nuclear Regulatory Commission (NRC), working with the Office of Information Resources Management (IRM), has completed an NRC Strategic Information Technology (IT) Plan. The Plan addresses three major areas: (1) IT Program Management, (2) IT Infrastructure, and (3) Information and Applications Management. Key recommendations call for accelerating the replacement of Agency workstations, implementing a new document management system, applying business process reengineering to selected Agency work processes, and establishing an Information Technology Council to advise the Director of IRM

  12. 77 FR 54615 - Strategic Management Program; Fiscal Year 2013-2016 Strategic Plan

    Science.gov (United States)

    2012-09-05

    ... SAFETY BOARD Strategic Management Program; Fiscal Year 2013-2016 Strategic Plan AGENCY: National... Board, 490 L'Enfant Plaza SW., Washington, DC 20594. Attn: MD-1, Strategic Management Program. FOR FURTHER INFORMATION CONTACT: Agency contact, Shamicka Fulson, Program Manager, Strategic...

  13. Strategic Alliance Poker: Demonstrating the Importance of Complementary Resources and Trust in Strategic Alliance Management

    Science.gov (United States)

    Reutzel, Christopher R.; Worthington, William J.; Collins, Jamie D.

    2012-01-01

    Strategic Alliance Poker (SAP) provides instructors with an opportunity to integrate the resource based view with their discussion of strategic alliances in undergraduate Strategic Management courses. Specifically, SAP provides Strategic Management instructors with an experiential exercise that can be used to illustrate the value creation…

  14. Strategic Activism, Educational Leadership and Social Justice

    Science.gov (United States)

    Ryan, James

    2016-01-01

    This article describes the strategic activism of educational leaders who promote social justice. Given the risks, educational leaders need to be strategic about the ways in which they pursue their activism. Citing current research, this article explores the ways in which leaders strategically pursue their social justice agendas within their own…

  15. Rethinking Strategy and Strategic Leadership in Schools.

    Science.gov (United States)

    Davies, Brent

    2003-01-01

    Reviews nature of strategy and strategic leadership in schools. Considers how leaders can map and reconceptualize the nature of strategy and develop strategic capabilities for longer-term sustainability. Questions hierarchical models of leadership. Highlights three characteristics of strategically oriented schools; suggests ways to improve art of…

  16. 12 CFR 563e.27 - Strategic plan.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Strategic plan. 563e.27 Section 563e.27 Banks... for Assessing Performance § 563e.27 Strategic plan. (a) Alternative election. The OTS will assess a... strategic plan if: (1) The savings association has submitted the plan to the OTS as provided for in...

  17. 12 CFR 228.27 - Strategic plan.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Strategic plan. 228.27 Section 228.27 Banks and... REINVESTMENT (REGULATION BB) Standards for Assessing Performance § 228.27 Strategic plan. (a) Alternative...(s) under a strategic plan if: (1) The bank has submitted the plan to the Board as provided for...

  18. Strategic Management in Chinese Manufacturing SMEs

    OpenAIRE

    Chen, Muxia; Bowen, Liu

    2012-01-01

    This study is about to find out whether and how strategic management is employed in the Chinese manufacturing SMEs, as well as to explore the main characteristics of the strategic management process in these SMEs. It aims to work as a reference for the senior managers in these firms to better improve and utilize the strategic management tools for their future growth.

  19. 13 CFR 313.6 - Strategic Plans.

    Science.gov (United States)

    2010-01-01

    .... EDA shall evaluate the Strategic Plan based on the following minimum requirements: (1) An analysis of... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Strategic Plans. 313.6 Section 313... § 313.6 Strategic Plans. (a) General. An Impacted Community that intends to apply for a grant...

  20. Strategic decision quality in Flemish municipalities

    NARCIS (Netherlands)

    B.R.J. George (Bert); S. Desmidt (Sebastian); J. De Moyer (Julie)

    2016-01-01

    Strategic planning (SP) has taken the public sector by storm because it is widely believed that SP’s approach to strategic decision-making strengthens strategic decision quality (SDQ) in public organizations. However, if or how SP relates to SDQ seems to lack empirical evidence. Drawing

  1. Collaborative Strategic Planning: Myth or Reality?

    Science.gov (United States)

    Mbugua, Flora; Rarieya, Jane F. A.

    2014-01-01

    The concept and practice of strategic planning, while entrenched in educational institutions in the West, is just catching on in Kenya. While literature emphasizes the importance of collaborative strategic planning, it does not indicate the challenges presented by collaboratively engaging in strategic planning. This article reports on findings of…

  2. The value contribution of strategic foresight

    DEFF Research Database (Denmark)

    Rohrbeck, René; Schwarz, Jan Oliver

    2013-01-01

    This paper focuses on exploring the potential and empirically observable value creation of strategic foresight activities in firms. We first review the literature on strategic foresight, innovation management and strategic management in order to identify the potential value contributions. We use...

  3. Recent activities in accelerator code development

    International Nuclear Information System (INIS)

    In this paper we will review recent activities in the area of code development as it affects the accelerator community. We will first discuss the changing computing environment. We will review how the computing environment has changed in the last 10 years, with emphasis on computing power, operating systems, computer languages, graphics standards, and massively parallel processing. Then we will discuss recent code development activities in the areas of electromagnetics codes and beam dynamics codes

  4. Strategic issues in information technology international implications for decision makers

    CERN Document Server

    Schütte, Hellmut

    1988-01-01

    Strategic Issues in Information Technology: International Implications for Decision Makers presents the significant development of information technology in the output of components, computers, and communication equipment and systems. This book discusses the integration of information technology into factories and offices to increase productivity. Organized into six parts encompassing 12 chapters, this book begins with an overview of the advancement towards an automated interpretation communication system to achieve real international communication. This text then examines the main determining

  5. Robustly Strategic Consumption-Portfolio Rules with Informational Frictions

    OpenAIRE

    Luo, Yulei

    2015-01-01

    This paper provides a tractable continuous-time constant-absolute-risk averse (CARA)-Gaussian framework to explore how the interactions of fundamental uncertainty, model uncertainty due to a preference for robustness (RB), and state uncertainty due to information-processing constraints (rational inattention or RI) affect strategic consumption-portfolio rules and precautionary savings in the presence of uninsurable labor income. Specifically, after solving the model explicitly, I compute and c...

  6. Cyber Terrorism demands a Global Risks and Threats Strategic Management

    International Nuclear Information System (INIS)

    The world is in the third wave of development, which is digitally managed and networked. Information, which creates knowledge, is transferred through the Internet at an exponential rate. The rapid advancement of computer technology has a great influence on the development of the critical information infrastructure, thus changing the safety environment and national values and interests. This advancement produces threats and risks, from a computer perspective, which are sublimated in different forms of international terrorism and particularly in cyber terrorism. The main aim of this paper is a thorough analysis of what is scientifically known and practiced now that the critical information infrastructure is in the focus of cyber terrorism. The rapid IT development demands changes in the focus of strategic management. As a result of time-consuming theoretical and empirical research, this paper suggests a methodology for the strategic management of threats, risks and vulnerabilities. The proposed methodology is seen as a means to increase human security consciousness in every sense of the word, and to promote the need for establishing rules, procedures and standards from the aspect of strategic management in the new information epoch. In addition, through a scientific discourse, a short attempt is made to relate the Macedonian reality to the phenomenon mentioned above. The most fundamental premise is that the efficiency and promptness of decisions made during strategic planning are a projection of the systematic organization of functions and models for managing the risks and threats to the critical information infrastructure. Hence, this paper can be seen as a perspective on regional strategic management and the vital functioning of cyberspace. (author)

  7. Accelerated large-scale multiple sequence alignment

    Directory of Open Access Journals (Sweden)

    Lloyd Scott

    2011-12-01

    Full Text Available Abstract Background Multiple sequence alignment (MSA is a fundamental analysis method used in bioinformatics and many comparative genomic applications. Prior MSA acceleration attempts with reconfigurable computing have only addressed the first stage of progressive alignment and consequently exhibit performance limitations according to Amdahl's Law. This work is the first known to accelerate the third stage of progressive alignment on reconfigurable hardware. Results We reduce subgroups of aligned sequences into discrete profiles before they are pairwise aligned on the accelerator. Using an FPGA accelerator, an overall speedup of up to 150 has been demonstrated on a large data set when compared to a 2.4 GHz Core2 processor. Conclusions Our parallel algorithm and architecture accelerates large-scale MSA with reconfigurable computing and allows researchers to solve the larger problems that confront biologists today. Program source is available from http://dna.cs.byu.edu/msa/.
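    The profile reduction mentioned above is simple to express in software; a minimal Python sketch, independent of the FPGA implementation and using an assumed alphabet and a toy alignment, is shown below.

    ```python
    from collections import Counter

    def column_profile(aligned_seqs, alphabet="ACGT-"):
        """Reduce a subgroup of aligned sequences (equal length, '-' for gaps)
        to a profile of per-column symbol frequencies."""
        length = len(aligned_seqs[0])
        profile = []
        for col in range(length):
            counts = Counter(seq[col] for seq in aligned_seqs)
            total = sum(counts.values())
            profile.append({sym: counts.get(sym, 0) / total for sym in alphabet})
        return profile

    # Toy subgroup of three pre-aligned DNA sequences (assumed data).
    group = ["AC-GT", "ACAGT", "GC-GT"]
    for i, col in enumerate(column_profile(group)):
        print(i, {s: round(f, 2) for s, f in col.items() if f > 0})
    ```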

  8. The Role of Strategic Entrepreneurship in Building Sustainable Competitive Advantage

    OpenAIRE

    Agustinus Dedy Handrimurtjahjo

    2014-01-01

    Strategic entrepreneurship has emerged as a new concept in examining convergence in entrepreneurship studies (opportunity-seeking behavior) and strategic management (advantage-seeking behavior). Studies in the area of strategic management have gradually exposed the relationship between strategic management and entrepreneurship: entrepreneurial strategy making; intrapreneurship; entrepreneurial strategic posture within organizations; entrepreneurial orientation; strategic management integratio...

  9. Computer Science Security

    OpenAIRE

    Ocotlan Diaz-Parra; Ruiz-Vanoye, Jorge A.; Barrera-Cámara, Ricardo A.; Alejandro Fuentes-Penna; Natalia Sandoval

    2014-01-01

    Soft Systems Methodology (SSM) is a problem-solving methodology employing systems thinking. SSM has been applied to the management, planning, health and medical systems, information systems planning, human resource management, analysis of the logistics systems, knowledge management, project management, construction management and engineering, and development of expert systems. This paper proposes using SSM for strategic planning of Enterprise Computer Security.

  10. VLHC accelerator physics

    Energy Technology Data Exchange (ETDEWEB)

    Michael Blaskiewicz et al.

    2001-11-01

    A six-month design study for a future high energy hadron collider was initiated by the Fermilab director in October 2000. The request was to study a staged approach where a large circumference tunnel is built that initially would house a low field (~2 T) collider with center-of-mass energy greater than 30 TeV and a peak (initial) luminosity of 10^34 cm^-2 s^-1. The tunnel was to be scoped, however, to support a future upgrade to a center-of-mass energy greater than 150 TeV with a peak luminosity of 2 x 10^34 cm^-2 sec^-1 using high field (~10 T) superconducting magnet technology. In a collaboration with Brookhaven National Laboratory and Lawrence Berkeley National Laboratory, a report of the Design Study was produced by Fermilab in June 2001 [1]. The Design Study focused on a Stage 1, 20 x 20 TeV collider using a 2-in-1 transmission line magnet and leads to a Stage 2, 87.5 x 87.5 TeV collider using 10 T Nb3Sn magnet technology. The article that follows is a compilation of accelerator physics designs and computational results which contributed to the Design Study. Many of the parameters found in this report evolved during the study, and thus slight differences between this text and the Design Study report can be found. The present text, however, presents the major accelerator physics issues of the Very Large Hadron Collider as examined by the Design Study collaboration and provides a basis for discussion and further studies of VLHC accelerator parameters and design philosophies.

  11. Strategic regulation of gas transport

    International Nuclear Information System (INIS)

    The basis of the article is the growing focus on competition within natural gas markets, particularly in the EU. Increased competition, whether upstream or downstream, may influence the distribution of profit between producing and consuming countries. For Norway, a large exporter of natural gas to the European market, this is an important issue. The value chain in the gas market consists of three complementary parts: production, transport, and distribution. With this in mind, the article studies how countries may use strategic availability pricing for transport and distribution systems to move as large a part of the total profit as possible to the parts of the value chain they control themselves. The focus is on how increased competition in the market for natural gas influences the authorities' incentives for stipulating a high or low availability price, and to what extent increased competition influences the welfare level in the producing and consuming countries when they use strategic availability pricing. The analysis builds on a theoretical model developed by Nese and Straume (2005). Finally, some of the more interesting results regarding Norway's position as a gas producer are presented. One of the more surprising results is that, with one exporting country and one importing country, increased upstream competition may be an advantage for the exporting country while negative for the importing country. The result also holds when a competing export country is included, provided this country does not use strategic availability pricing. If the competing country also acts strategically, the result is reversed. However, if the gas-exporting countries are capable of perfect coordination of their availability pricing, the case reverts to the situation with only one exporting country and the result holds again. If a future formation of a 'gas-OPEC' is considered, where for example Norway and Russia cooperate in a gas

  12. Developing strategic thinking in senior management.

    Science.gov (United States)

    Zabriskie, N B; Huellmantel, A B

    1991-12-01

    Chief Executive Officers have recently stated that their greatest staffing challenge for the 1990s is the development of strategic leadership in their senior management. In order to do this, it is necessary to identify the substance of strategic thinking, and the capabilities that must be mastered. Writers on strategy have identified six major elements of strategic thinking and these have been organized to reveal the tasks, questions, decisions, and skills that senior executives must acquire in order to lead their organizations strategically. Finally, the article identifies training programme elements which are used by Directors of Manpower Development to develop strategic leadership ability. PMID:10118696

  13. STRATEGIC MANAGEMENT OF A TERRITORIAL DISTRIBUTED COMPLEX

    Directory of Open Access Journals (Sweden)

    Vidovskiy L. A.

    2015-10-01

    Full Text Available The article is devoted to strategic management and strategy implementation. Management strategy is based on managing the strategic potential of the enterprise, which is formed only by those resources that can be changed as a result of strategic decisions. Analysis of an enterprise's potential should cover almost all spheres of its activity: management, production, marketing, finance, and human resources. The article designs a system of strategic management, using the example of a construction company within an information management system for territorially distributed building complexes, thereby improving the competitiveness of the organization and providing timely, high-quality implementation of business plans

  14. Modeling the value of strategic actions in the superior colliculus

    Directory of Open Access Journals (Sweden)

    Dhushan Thevarajah

    2010-02-01

    Full Text Available In learning models of strategic game play, an agent constructs a valuation (action value) over possible future choices as a function of past actions and rewards. Choices are then stochastic functions of these action values. Our goal is to uncover a neural signal that correlates with the action value posited by behavioral learning models. We measured activity from neurons in the superior colliculus (SC), a midbrain region involved in planning saccadic eye movements, in monkeys while they performed two saccade tasks. In the strategic task, monkeys competed against a computer in a saccade version of the mixed-strategy game “matching-pennies”. In the instructed task, stochastic saccades were elicited through explicit instruction rather than free choices. In both tasks, neuronal activity and behavior were shaped by past actions and rewards with more recent events exerting a larger influence. Further, SC activity predicted upcoming choices during the strategic task and upcoming reaction times during the instructed task. Finally, we found that neuronal activity in both tasks correlated with an established learning model, the Experience Weighted Attraction model of action valuation (Ho, Camerer, and Chong, 2007). Collectively, our results provide evidence that action values hypothesized by learning models are represented in the motor planning regions of the brain in a manner that could be used to select strategic actions.
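    The abstract reports that SC activity correlated with the Experience Weighted Attraction (EWA) model of action valuation. The sketch below follows the standard published EWA update rather than the specific variant fitted in the paper; the parameter names (phi, delta, rho), the softmax choice rule, and the matching-pennies toy loop are illustrative assumptions.

```python
# Hedged sketch of an Experience Weighted Attraction (EWA) action-value update,
# the class of learning model the abstract relates to SC activity. This follows
# the standard published EWA form; the paper's fitted variant and parameter
# values may differ, and all names here are illustrative.

import math
import random

def ewa_update(attractions, experience, chosen, payoffs,
               phi=0.9, delta=0.5, rho=0.9):
    """One EWA step.

    attractions: current action values A_j
    experience:  scalar experience weight N
    chosen:      index of the action actually taken
    payoffs:     payoff each action would have earned this round
    phi, delta, rho: decay of old attractions, weight on foregone payoffs,
                     and decay of the experience weight, respectively
    """
    new_experience = rho * experience + 1.0
    new_attractions = []
    for j, (a, pi) in enumerate(zip(attractions, payoffs)):
        weight = delta + (1.0 - delta) * (1.0 if j == chosen else 0.0)
        new_attractions.append((phi * experience * a + weight * pi) / new_experience)
    return new_attractions, new_experience

def choice_probabilities(attractions, sensitivity=2.0):
    """Softmax over attractions, as in stochastic-choice versions of EWA."""
    exps = [math.exp(sensitivity * a) for a in attractions]
    total = sum(exps)
    return [e / total for e in exps]

if __name__ == "__main__":
    # Matching-pennies-like toy: two actions, payoff 1 when matching a randomly
    # playing opponent, payoff 0 otherwise.
    attractions, experience = [0.0, 0.0], 1.0
    for _ in range(50):
        probs = choice_probabilities(attractions)
        chosen = 0 if random.random() < probs[0] else 1
        opponent = random.randint(0, 1)
        payoffs = [1.0 if j == opponent else 0.0 for j in (0, 1)]
        attractions, experience = ewa_update(attractions, experience, chosen, payoffs)
    print("final action values:", [round(a, 3) for a in attractions])
```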

  15. Use of hardware accelerators for ATLAS computing

    CERN Document Server

    Bauce, Matteo; Dankel, Maik; Howard, Jacob; Kama, Sami

    2015-01-01

    Modern HEP experiments produce tremendous amounts of data. These data are processed by in-house built software frameworks which have lifetimes longer than the detector itself. Such frameworks were traditionally based on serial code and relied on advances in CPU technologies, mainly clock frequency, to cope with increasing data volumes. With the advent of many-core architectures and GPGPUs this paradigm has to shift to parallel processing and has to include the use of co-processors. However, since the design of most existing frameworks is based on the assumption of frequency scaling and predate co-processors, parallelisation and integration of co-processors are not an easy task. The ATLAS experiment is an example of such a big experiment with a big software framework called Athena. In this talk we will present the studies on parallelisation and co-processor (GPGPU) use in data preparation and tracking for trigger and offline reconstruction as well as their integration into a multiple process based Athena frame...

  16. Use of hardware accelerators for ATLAS computing

    CERN Document Server

    Dankel, Maik; The ATLAS collaboration; Howard, Jacob; Bauce, Matteo; Boing, Rene

    2015-01-01

    Modern HEP experiments produce tremendous amounts of data. This data is processed by in-house built software frameworks which have lifetimes longer than the detector itself. Such frameworks were traditionally based on serial code and relied on advances in CPU technologies, mainly clock frequency, to cope with increasing data volumes. With the advent of many-core architectures and GPGPUs this paradigm has to shift to parallel processing and has to include the use of co-processors. However, since the design of most existing frameworks is based on the assumption of frequency scaling and predate co-processors, parallelisation and integration of co-processors are not an easy task. The ATLAS experiment is an example of such a big experiment with a big software framework called Athena. In this proceedings we will present the studies on parallelisation and co-processor (GPGPU) use in data preparation and tracking for trigger and offline reconstruction as well as their integration into a multiple process based...

  17. Parallel Computing Methods For Particle Accelerator Design

    CERN Document Server

    Popescu, Diana Andreea; Hersch, Roger

    We present methods for parallelizing the transport map construction for multi-core processors and for Graphics Processing Units (GPUs). We provide an efficient implementation of the transport map construction. We describe a method for multi-core processors using the OpenMP framework which brings performance improvement over the serial version of the map construction. We developed a novel and efficient algorithm for multivariate polynomial multiplication for GPUs and we implemented it using the CUDA framework. We show the benefits of using the multivariate polynomial multiplication algorithm for GPUs in the map composition operation for high orders. Finally, we present an algorithm for map composition for GPUs.
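    The abstract names the core kernel (multivariate polynomial multiplication used in transport-map construction and composition) without giving details. As a hedged, serial illustration of the operation being parallelized, the sketch below multiplies two truncated multivariate polynomials stored as exponent-tuple-to-coefficient maps; the representation, the truncation-by-total-order rule, and all names are assumptions, not the authors' GPU algorithm.

```python
# Serial, hedged sketch of the kernel the abstract parallelizes: multiplying
# truncated multivariate polynomials, a building block of transport-map
# construction and composition. Polynomials are dicts mapping exponent tuples
# to coefficients, truncated at a maximum total order. The GPU scheduling
# described in the paper is not reproduced; representation and names are assumptions.

def poly_multiply(p, q, max_order):
    """Multiply two truncated multivariate polynomials.

    p, q: dict[tuple[int, ...], float], e.g. {(1, 0): 2.0} represents 2*x
          in the variables (x, y).
    max_order: terms whose total degree exceeds this are dropped, as in a
               transport map truncated at a fixed order.
    """
    result = {}
    for exp_p, coeff_p in p.items():
        for exp_q, coeff_q in q.items():
            exp = tuple(a + b for a, b in zip(exp_p, exp_q))
            if sum(exp) > max_order:
                continue  # truncation keeps the map at the chosen order
            result[exp] = result.get(exp, 0.0) + coeff_p * coeff_q
    return result

if __name__ == "__main__":
    # (1 + x + y) * (1 + x), truncated at total order 2, in variables (x, y).
    p = {(0, 0): 1.0, (1, 0): 1.0, (0, 1): 1.0}
    q = {(0, 0): 1.0, (1, 0): 1.0}
    print(poly_multiply(p, q, max_order=2))  # -> 1 + 2x + y + x^2 + xy
```

    The pairwise term products are independent of one another, which is the property a GPU implementation can exploit; only the final accumulation into shared output terms needs coordination.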

  18. DISCOM2: Distance Computing the SP2 Pilot FY98 Report

    Energy Technology Data Exchange (ETDEWEB)

    Beiriger, Judy; Byers, Rupert K.; Ernest, Martha J.; Goudy, Sue P.; Noe, John P.; Pratt, Thomas J.; Shirley, David N.; Tarman, Thomas D.; VanDevender, Walter H.; Wiltzius, David P.

    1999-05-01

    As a way to bootstrap the DISCOM(2) Distance Computing Program, the SP2 Pilot Project was launched in March 1998. The Pilot was directed towards creating an environment to allow Sandia users to run their applications on the Accelerated Strategic Computing Initiative's (ASCI) Blue Pacific computation platform, the unclassified IBM SP2 platform at Lawrence Livermore National Laboratory (LLNL). The DISCOM(2) Pilot leverages the ASCI PSE (Problem Solving Environment) efforts in networking and services to baseline the performance of the current system. Efforts in the following areas of the pilot are documented: applications, services, networking, visualization, and the system model. It details not only the running of two Sandia codes, CTH and COYOTE, on the Blue Pacific platform, but also the building of the Sandia National Laboratories (SNL) proxy environment of RS6000 platforms to support Sandia users.

  19. Strategic leadership: the essential skills.

    Science.gov (United States)

    Schoemaker, Paul J H; Krupp, Steve; Howland, Samantha

    2013-01-01

    The more uncertain your environment, the greater the opportunity--if you have the leadership skills to capitalize on it. Research at the Wharton School and at the authors' consulting firm, involving more than 20,000 executives to date, has identified six skills that, when mastered and used in concert, allow leaders to think strategically and navigate the unknown effectively. They are the abilities to anticipate, challenge, interpret, decide, align, and learn. This article describes the six skills in detail and includes a self-assessment that will enable you to identify the ones that most need your attention. The authors have found that strength in one skill cannot easily compensate for a deficit in another. An adaptive strategic leader has learned to apply all six at once. PMID:23390746

  20. Strategic alliances and market risk.

    Science.gov (United States)

    Havenaar, Matthias; Hiscocks, Peter

    2012-08-01

    Strategic alliances in product development and marketing are crucial to the biotechnology industry. Many alliances, however, are terminated before the drug reaches the market. In this article we make the case that strategic alliances can fail because of how they are negotiated. Alliance contracts are often inflexible and do not allow for changes in market conditions. We propose a model for contract valuation that can assist biotech and/or pharma deal makers in negotiating alliances that have a higher chance of survival in uncertain market conditions. The model makes use of variable royalties and milestone payments. Because licensing is key to the biotech and/or pharma business model this article will be of interest not only to professionals in licensing, but to all professionals active in the industry. PMID:22484547

  1. Human dimension of strategic partnerships

    Directory of Open Access Journals (Sweden)

    Petković Mirjana M.

    2004-01-01

    Full Text Available This paper points to the widespread practice of neglecting the behavioral aspects of the various forms of mergers and integrations of enterprises that have emerged among Serbian enterprises in the process of privatization through strategic partnerships with foreign companies. The initial hypothesis of the paper is that the process of privatization, restructuring and transformation in Serbian enterprises cannot be completely successful and equally advantageous for all the parties involved if there is no concern for the human dimension of these processes. Without this concern, behavioral problems may arise, and the only way to resolve them is to respect and introduce, post festum, elements that should never have been neglected in the first place. The paper addresses the phenomenon of culture clash and ways of resolving it when forming strategic partnerships.

  2. Building a strategic security organisation.

    Science.gov (United States)

    Howard, Mike

    2016-01-01

    In everyone's day-to-day job there is a constant need to deal with current and newly detected matters. This is now a world of immediacy, driven by the cadence of the business and its needs. These concerns should not be ignored, as failing to deal with these issues would not bode well for the future. It is essential that the gears are kept spinning. The challenge for any security organisation is to identify its short-term tactical requirements while developing longer-term strategic needs. Once done, the differences can be accounted for and strides can be made toward a desired future state. This paper highlights several steps that the author and his team have taken in their own journey. There is no magic answer; each organisation will have its own unique challenges. Nevertheless, some of the approaches to building a strategic security organisation described in this paper are applicable to all organisations, irrespective of their size. PMID:27318284

  3. A Strategizing Perspective in Foresight

    DEFF Research Database (Denmark)

    The overall purpose of the paper is partly to contribute to the discussion on the theoretical perspectives behind the practice of foresight and partly to suggest a strategizing approach in foresight practice. More specifically we focus on foresight as a policy tool for sectoral innovation. Approach......: As repeated by numerous practitioners and scholars, foresight is not only about looking into the future but also about making things happen today. Also, as noted by several scholars, the practice of foresight over recent decades has changed from focusing on intra-organisational planning and forecasting...... and inputs in order to facilitate strategizing in sectoral innovation systems. First, the literature on innovation systems gives theoretical elements for ex post analyses of actors, institutions, knowledge flows, interaction patterns and dynamics of the considered system. This approach comprises both...

  4. Arms Control and Strategic Stability

    Institute of Scientific and Technical Information of China (English)

    Hu; Yumin

    2014-01-01

    This essay offers a comment on concepts, trends and attitudes concerning arms control and strategic stability with reference to the current international security situation. It also offers observations on strategic stability from two different perspectives: one proceeds from the concept of universal security and aims to prevent conflicts and instability from disrupting the regional and international security environment on which nation states depend so much for their peaceful development; the other starts from maintaining the global leadership of a superpower and aims to contain any challenge that sways or is likely to sway its dominant status. If China and the United States commit themselves to building a new type of major-power relationship that stresses win-win cooperation, they will be able to contribute greatly to a stable international security architecture that is good for peaceful world development.

  5. Acceleration of saddle-point searches with machine learning.

    Science.gov (United States)

    Peterson, Andrew A

    2016-08-21

    In atomistic simulations, the location of the saddle point on the potential-energy surface (PES) gives important information on transitions between local minima, for example, via transition-state theory. However, the search for saddle points often involves hundreds or thousands of ab initio force calls, which are typically all done at full accuracy. This results in the vast majority of the computational effort being spent calculating the electronic structure of states not important to the researcher, and very little time performing the calculation of the saddle point state itself. In this work, we describe how machine learning (ML) can reduce the number of intermediate ab initio calculations needed to locate saddle points. Since machine-learning models can learn from, and thus mimic, atomistic simulations, the saddle-point search can be conducted rapidly in the machine-learning representation. The saddle-point prediction can then be verified by an ab initio calculation; if it is incorrect, this strategically has identified regions of the PES where the machine-learning representation has insufficient training data. When these training data are used to improve the machine-learning model, the estimates greatly improve. This approach can be systematized, and in two simple example problems we demonstrate a dramatic reduction in the number of ab initio force calls. We expect that this approach and future refinements will greatly accelerate searches for saddle points, as well as other searches on the potential energy surface, as machine-learning methods see greater adoption by the atomistics community.
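    The abstract describes the accelerate-and-verify loop in words: search for the saddle point on a cheap machine-learned surrogate, verify the candidate with a single ab initio call, and add that point to the training data whenever the surrogate was wrong. The sketch below is a schematic version of such a loop applied to a 1D toy potential; every function and parameter is a placeholder assumption, not the authors' code.

```python
# Schematic sketch of the machine-learning-accelerated saddle search described
# in the abstract: the expensive potential is queried only to verify candidates,
# while the full search runs on a cheap surrogate. All names are placeholders.

import numpy as np

def ml_accelerated_saddle_search(initial_points, ab_initio, fit_surrogate,
                                 saddle_search_on, is_converged, max_iterations=50):
    """Active-learning loop: fit surrogate, search it, verify, refit if wrong."""
    training_data = [(x, ab_initio(x)) for x in initial_points]
    candidate = None
    for _ in range(max_iterations):
        surrogate = fit_surrogate(training_data)
        # Run the whole saddle-point search on the cheap surrogate.
        candidate = saddle_search_on(surrogate, [x for x, _ in training_data])
        energy_force = ab_initio(candidate)          # single expensive verification
        if is_converged(candidate, energy_force):
            return candidate                          # forces vanish: saddle found
        # Surrogate was wrong exactly where it matters: add the point and refit.
        training_data.append((candidate, energy_force))
    return candidate

if __name__ == "__main__":
    # Toy 1D double well E(x) = (x^2 - 1)^2; the barrier top (saddle) is at x = 0.
    def ab_initio(x):
        return (x**2 - 1)**2, -4.0 * x * (x**2 - 1)   # (energy, force = -dE/dx)

    def fit_surrogate(data):
        xs = np.array([x for x, _ in data])
        energies = np.array([ef[0] for _, ef in data])
        return np.poly1d(np.polyfit(xs, energies, min(4, len(xs) - 1)))

    def saddle_search_on(surrogate, xs):
        grid = np.linspace(min(xs), max(xs), 2001)
        return float(grid[np.argmax(surrogate(grid))])  # barrier top on the surrogate

    def is_converged(x, energy_force):
        return abs(energy_force[1]) < 1e-3

    saddle = ml_accelerated_saddle_search([-1.2, -0.5, 0.7, 1.2], ab_initio,
                                          fit_surrogate, saddle_search_on, is_converged)
    print(f"estimated saddle point: x = {saddle:.3f}")
```

    In this toy the loop converges after refitting once; the abstract's point is that the expensive calls are spent only on verification and on the regions where the surrogate was wrong.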

  6. Acceleration of saddle-point searches with machine learning

    Science.gov (United States)

    Peterson, Andrew A.

    2016-08-01

    In atomistic simulations, the location of the saddle point on the potential-energy surface (PES) gives important information on transitions between local minima, for example, via transition-state theory. However, the search for saddle points often involves hundreds or thousands of ab initio force calls, which are typically all done at full accuracy. This results in the vast majority of the computational effort being spent calculating the electronic structure of states not important to the researcher, and very little time performing the calculation of the saddle point state itself. In this work, we describe how machine learning (ML) can reduce the number of intermediate ab initio calculations needed to locate saddle points. Since machine-learning models can learn from, and thus mimic, atomistic simulations, the saddle-point search can be conducted rapidly in the machine-learning representation. The saddle-point prediction can then be verified by an ab initio calculation; if it is incorrect, this strategically has identified regions of the PES where the machine-learning representation has insufficient training data. When these training data are used to improve the machine-learning model, the estimates greatly improve. This approach can be systematized, and in two simple example problems we demonstrate a dramatic reduction in the number of ab initio force calls. We expect that this approach and future refinements will greatly accelerate searches for saddle points, as well as other searches on the potential energy surface, as machine-learning methods see greater adoption by the atomistics community.

  7. Strategic analysis of Czech Airlines

    OpenAIRE

    Moiseeva, Polina

    2016-01-01

    The thesis, called Strategic Analysis of Czech Airlines, comprehensively analyses the current situation within the company. It presents a theoretical base for such an analysis and subsequently offers a situational analysis, which includes an analysis of the external environment, the internal environment, and suggestions for improvement. The thesis includes a complete SWOT analysis of the company and applies Porter's five forces framework. The thesis also includes recommendations and suggestions for th...

  8. Optimal Policies with Strategic Distortions

    OpenAIRE

    Kala Krishna; Marie Thursby

    1988-01-01

    Recent work on optimal trade policy for imperfectly competitive markets usually identifies the optimal level of a single instrument, and when more instruments are allowed, general interpretations have been unavailable. This paper analyzes the jointly optimal levels of a variety of instruments under oligopolistic competition. A targeting principle for identifying optimal policies is derived using the concept of a "strategic distortion." It is shown how optimal policies vary with the distortions prese...

  9. Strategic Workshops on Cancer Nanotechnology

    OpenAIRE

    Nagahara, Larry A.; Lee, Jerry S.H.; Molnar, Linda K.; Panaro, Nicholas J.; Farrell, Dorothy; Ptak, Krzysztof; Alper, Joseph; Grodzinski, Piotr

    2010-01-01

    Nanotechnology offers the potential for new approaches to detecting, treating and preventing cancer. To determine the current status of the cancer nanotechnology field and the optimal path forward, the National Cancer Institute’s Alliance for Nanotechnology in Cancer held three strategic workshops, covering the areas of in-vitro diagnostics and prevention, therapy and post-treatment, and in-vivo diagnosis and imaging. At each of these meetings, a wide range of experts from academia, industry,...

  10. Grandparental effects on reproductive strategizing

    OpenAIRE

    G. William Skinner

    2004-01-01

    This paper analyzes data from the household registers for two villages in the Nôbi region of central Japan in the late Edo period (1717-1869) to assess how grandparents may have affected reproductive strategizing in stem families. The particulars of the family system fostered a culturally favored set of reproductive goals, in particular, a daughter as eldest child, followed by a son (and heir), coupled with gender alternation in subsequent reproduction and overall gender balance. This reprodu...

  11. Strategic pricing of equity issues

    OpenAIRE

    Klaus Ritzberger; Frank Milne

    2002-01-01

    Consider a general equilibrium model where agents may behave strategically. Specifically, suppose some firm issues new shares. If the primary market price is controlled by the issuing institution and investors' expectations on future equity prices are constant in their share purchases, the share price on the primary market cannot exceed the secondary market share price. In certain cases this may imply strict underpricing of newly issued shares. If investors perceive an influence on future sha...

  12. Strategic negotiations towards sustainability for entrepreneurs

    OpenAIRE

    Hurry, Jovin

    2012-01-01

    The purpose of this thesis is to find out what it takes for entrepreneurs to negotiate strategically in order to ultimately influence systemic change towards sustainability. It focuses on the challenges sustainability entrepreneurs face as they negotiate their twin objectives of social mission and positive cash flow during their collaborative processes. To answer this purpose, I conducted participatory action research with the entrepreneurs at Hubs Westminster, King’s Cross and Islington in ...

  13. Collective intelligence in strategic leadership

    OpenAIRE

    Laine, Sebastian

    2011-01-01

    Collective intelligence is forecasted to transform businesses and the way companies make decisions. However, many senior business managers have a poor understanding of collective intelligence and thus they face challenges in utilizing it. The main objective of this study was to provide understanding of collective intelligence and examine how it can benefit strategic leadership. The research was conducted by using the methods of a qualitative case study. The study combines literature revie...

  14. Strategic Complexity and Global Expansion

    DEFF Research Database (Denmark)

    Oladottir, Asta Dis; Hobdari, Bersant; Papanastassiou, Marina;

    2012-01-01

    The purpose of this paper is to analyse the determinants of global expansion strategies of newcomer Multinational Corporations (MNCs) by focusing on Iceland, Israel and Ireland. We argue that newcomer MNCs from small open economies pursue complex global expansion strategies (CGES). We distinguish....... The empirical evidence suggests that newcomer MNCs move away from simplistic dualities in the formulation of their strategic choices towards more complex options as a means of maintaining and enhancing their global competitiveness....

  15. Strategic fit analysis - RW tracers

    OpenAIRE

    Berea Bellon, Carlos Adolfo

    2005-01-01

    This paper analyses RW Tracers, a company dedicated to tracing lost individuals, to determine how to improve its performance, since expected sales growth and profitability margins have yet to be met. The company's strategy is examined to determine if the company meets the criteria of strategic fit. To add context, an Industry Analysis is conducted to determine the Key Success Factors of the tracing industry. These Key Success Factors are then linked to the Industry Value Chain to build a model...

  16. Strategic Management in Estonian SMEs

    OpenAIRE

    Juhan Teder; Urve Venesaar

    2005-01-01

    The research is based on an empirical survey conducted among the members of the Estonian Association of SMEs. The study involved goal setting, development and implementation of strategies, competitive advantages striven for, orientation toward growth, and factors hindering enterprises' development. As a result, characteristics of strategic management in enterprises with different levels of growth orientation (expanding, stable and declining enterprises) as well as those depending on other enterp...

  17. Strategic marketing planning in library

    OpenAIRE

    Karmen Štular-Sotošek

    2000-01-01

    The article is based on the idea that every library can design instruments for creating events and managing the important resources of today's world, especially for managing change. This process can only be successful if libraries use adequate marketing methods. Strategic marketing planning starts with an analysis of the library's mission, its objectives, goals and corporate culture. By analysing the public environment, the competitive environment and the macro environment, libraries recognise...

  18. Strategic analysis of Swedish agriculture

    OpenAIRE

    Fogelfors, Håkan; Wivstad, Maria; Eckersten, Henrik; Holstein, Fredrik; Johansson, Susanne; Verwijst, Theo

    2009-01-01

    This strategic analysis of Swedish agriculture – production systems and agricultural landscapes in a time of change – focuses on climate change, future availability of natural resources and economic regulation in a global food market. The background to the project was that the Faculty of Natural Resources and Agriculture of the Swedish University of Agricultural Sciences identified an urgent need to explore the implications and opportunities of coming changes for agricultural production syste...

  19. Strategic Analysis for Patch Ltd.

    OpenAIRE

    Louis, Owen

    2012-01-01

    This paper is a strategic analysis for the start-up Patch Ltd. Patch has developed innovative products for growing produce in homes and will compete in the consumer container-growing industry. The industry and the company are introduced along with urban agriculture trends. The industry is analysed using Porter’s 5 forces analysis, and a competitive analysis compares Patch to its competitors in key success factors found in the 5 forces analysis. A strategy is developed using opportunities and t...

  20. Managing the strategic capital cycle.

    Science.gov (United States)

    Kaufman, K

    1997-12-01

    A healthcare organization's financial strategy should be defined within the context of the capital cycle and provide for the management of three critical components that will ensure the expansion and renewal of capital capacity--the design and implementation of the financial plan, the design and implementation of an appropriate capital structure, and a means to strategically utilize capital and reinvest it in the organization. The capital cycle comprises two parts--strategic planning and implementation, and the development of the support infrastructure that includes financial planning, capital structure, and capital allocation. The financial plan positions an organization within an area of financial equilibrium and defines its organizational capabilities. The financial infrastructure gives integrity and momentum to the capital cycle. Capital structure addresses critical funding and financing questions and is best defined as the combination of debt and equity that funds the strategic plan. In regard to capital allocation, healthcare organizations should follow a corporate "best practices" approach for such areas as financial objectives and policies, project review, and the capital expenditures approval process.