WorldWideScience

Sample records for energy research supercomputer

  1. Centralized supercomputer support for magnetic fusion energy research

    International Nuclear Information System (INIS)

    Fuss, D.; Tull, G.G.

    1984-01-01

High-speed computers with large memories are vital to magnetic fusion energy research. Magnetohydrodynamic (MHD), transport, equilibrium, Vlasov, particle, and Fokker-Planck codes that model plasma behavior play an important role in designing experimental hardware and interpreting the resulting data, as well as in advancing plasma theory itself. The size, architecture, and software of supercomputers to run these codes are often the crucial constraints on the benefits such computational modeling can provide. Hence, vector computers such as the CRAY-1 offer a valuable research resource. To meet the computational needs of the fusion program, the National Magnetic Fusion Energy Computer Center (NMFECC) was established in 1974 at the Lawrence Livermore National Laboratory. Supercomputers at the central computing facility are linked to smaller computer centers at each of the major fusion laboratories by a satellite communication network. In addition to providing large-scale computing, the NMFECC environment stimulates collaboration and the sharing of computer codes and data among the many fusion researchers in a cost-effective manner.

  2. Role of supercomputers in magnetic fusion and energy research programs

    International Nuclear Information System (INIS)

    Killeen, J.

    1985-06-01

The importance of computer modeling in magnetic fusion energy (MFE) and energy research (ER) programs is discussed. The need for the most advanced supercomputers is described, and the role of the National Magnetic Fusion Energy Computer Center in meeting these needs is explained.

  3. Proceedings of the first energy research power supercomputer users symposium

    International Nuclear Information System (INIS)

    1991-01-01

The Energy Research Power Supercomputer Users Symposium was arranged to showcase the richness of science that has been pursued and accomplished in this program through the use of supercomputers, and now high-performance parallel computers, over the last year; this report is the collection of the presentations given at the Symposium. "Power users" were invited by the ER Supercomputer Access Committee to show that the use of these computational tools and the associated data communications network, ESNet, goes beyond merely speeding up computations. Today the work often directly contributes to the advancement of conceptual developments in their fields, and the computational and network resources form the very infrastructure of today's science. The Symposium also provided an opportunity, rare in this day of network access to computing resources, for the invited users to compare and discuss their techniques and approaches with those used in other ER disciplines. The significance of new parallel architectures was highlighted by an interesting evening talk given by Dr. Stephen Orszag of Princeton University.

  4. Supercomputer applications in nuclear research

    International Nuclear Information System (INIS)

    Ishiguro, Misako

    1992-01-01

The utilization of supercomputers at the Japan Atomic Energy Research Institute is reported. The fields of atomic energy research that use supercomputers heavily and the contents of their computations are outlined. Vectorization is briefly explained, and the discussion covers nuclear fusion, nuclear reactor physics, the thermal-hydraulic safety of nuclear reactors, the parallelism inherent in atomic energy computations such as fluid dynamics, algorithms suited to vector processing, and the speedup achieved by vectorization. At present the Japan Atomic Energy Research Institute operates two FACOM VP 2600/10 systems and three M-780 systems. The dominant computations have shifted from criticality calculations around 1970, through the analysis of LOCA after the TMI accident, to nuclear fusion research, the design of new reactor types and reactor safety assessment today. The mode of computer use has likewise advanced: from batch processing to time-sharing, from one-dimensional to three-dimensional computation, from steady linear to unsteady nonlinear computation, and from experimental analysis to numerical simulation. (K.I.)

  5. What is supercomputing?

    International Nuclear Information System (INIS)

    Asai, Kiyoshi

    1992-01-01

    Supercomputing means high-speed computation using a supercomputer. Supercomputers and the technical term ''supercomputing'' have come into wide use over the past ten years. The performances of the main computers installed to date at the Japan Atomic Energy Research Institute are compared. There are two ways to increase computing speed using existing circuit elements: parallel processor systems and vector processor systems. The CRAY-1 was the first successful vector computer. Supercomputing technology was first applied to meteorological organizations abroad, and to aviation and atomic energy research institutes in Japan. Supercomputing for atomic energy follows the trend of technical development in atomic energy, and its contents divide into speeding up existing simulation calculations and accelerating the development of new atomic energy technology. Examples of supercomputing at the Japan Atomic Energy Research Institute are reported. (K.I.)

  6. Supercomputing and related national projects in Japan

    International Nuclear Information System (INIS)

    Miura, Kenichi

    1985-01-01

    Japanese supercomputer development activities in the industry and research projects are outlined. Architecture, technology, software, and applications of Fujitsu's Vector Processor Systems are described as an example of Japanese supercomputers. Applications of supercomputers to high energy physics are also discussed. (orig.)

  7. Enabling department-scale supercomputing

    Energy Technology Data Exchange (ETDEWEB)

    Greenberg, D.S.; Hart, W.E.; Phillips, C.A.

    1997-11-01

    The Department of Energy (DOE) national laboratories have one of the longest and most consistent histories of supercomputer use. The authors summarize the architecture of DOE's new supercomputers that are being built for the Accelerated Strategic Computing Initiative (ASCI). The authors then argue that in the near future scaled-down versions of these supercomputers with petaflop-per-weekend capabilities could become widely available to hundreds of research and engineering departments. The availability of such computational resources will allow simulation of physical phenomena to become a full-fledged third branch of scientific exploration, along with theory and experimentation. They describe the ASCI and other supercomputer applications at Sandia National Laboratories, and discuss which lessons learned from Sandia's long history of supercomputing can be applied in this new setting.

  8. Adaptability of supercomputers to nuclear computations

    International Nuclear Information System (INIS)

    Asai, Kiyoshi; Ishiguro, Misako; Matsuura, Toshihiko.

    1983-01-01

    Recently, in the field of scientific and technical calculation, the usefulness of supercomputers represented by the CRAY-1 has been recognized, and they are utilized in various countries. The rapid computation of supercomputers rests on their vector processing capability. The authors investigated the adaptability to vector computation of about 40 typical atomic energy codes over the past six years. Based on the results of this investigation, the adaptability of atomic energy codes to the vector computation supercomputers provide, problems regarding their utilization, and the future prospects are explained. The adaptability of individual calculation codes to vector computation depends largely on the algorithm and program structure used in the codes. The speedup achieved by pipelined vector processing, the investigation at the Japan Atomic Energy Research Institute and its results, and examples of vectorizing codes for atomic energy, environmental safety and nuclear fusion are reported. The speedup factors for the 40 examples ranged from 1.5 to 9.0. It can be said that the adaptability of supercomputers to atomic energy codes is fairly good. (Kako, I.)
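The speedups reported above come from recasting element-by-element loops as whole-array vector operations. Purely as an illustrative sketch (using NumPy on an ordinary machine rather than a pipelined vector processor, and a made-up kernel rather than one of the 40 surveyed codes), the following compares a scalar loop against a vectorized formulation of the same computation:

```python
import time
import numpy as np

# Hypothetical kernel: one multiply-add per element, the kind of
# inner loop that vectorizes well on pipelined vector hardware.
n = 1_000_000
a = np.random.rand(n)
b = np.random.rand(n)

# Scalar formulation: explicit element-by-element loop.
t0 = time.perf_counter()
c_loop = np.empty(n)
for i in range(n):
    c_loop[i] = a[i] * b[i] + a[i]
t_loop = time.perf_counter() - t0

# Vectorized formulation: the same arithmetic as one array operation.
t0 = time.perf_counter()
c_vec = a * b + a
t_vec = time.perf_counter() - t0

assert np.allclose(c_loop, c_vec)   # identical results
print(f"vector speedup: {t_loop / t_vec:.1f}x")
```

The ratio printed will differ from the 1.5x to 9.0x reported for the surveyed codes, since the hardware and kernel are different; the point is only that the same arithmetic, restructured for array-at-a-time execution, runs much faster.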

  9. Graphics supercomputer for computational fluid dynamics research

    Science.gov (United States)

    Liaw, Goang S.

    1994-11-01

    The objective of this project is to purchase a state-of-the-art graphics supercomputer to improve the Computational Fluid Dynamics (CFD) research capability at Alabama A & M University (AAMU) and to support Air Force research projects. A cutting-edge graphics supercomputer system, an Onyx VTX from Silicon Graphics Computer Systems (SGI), was purchased and installed. Other equipment was also purchased, including a desktop personal computer (a PC-486 DX2 with a built-in 10-BaseT Ethernet card), a 10-BaseT hub, an Apple Laser Printer Select 360, and a notebook computer from Zenith. A reading room has been converted to a research computer lab by adding furniture and an air conditioning unit in order to provide an appropriate working environment for researchers and the purchased equipment. All the purchased equipment was successfully installed and is fully functional. Several research projects, including two existing Air Force projects, are being performed using these facilities.

  10. Supercomputational science

    CERN Document Server

    Wilson, S

    1990-01-01

    In contemporary research, the supercomputer now ranks, along with radio telescopes, particle accelerators and the other apparatus of "big science", as an expensive resource that is nevertheless essential for state-of-the-art research. Supercomputers are usually provided as shared central facilities. However, unlike telescopes and accelerators, they find a wide range of applications extending across a broad spectrum of research activity. The difference in performance between a "good" and a "bad" computer program on a traditional serial computer may be a factor of two or three, but on a contemporary supercomputer it can easily be a factor of one hundred or even more! Furthermore, this factor is likely to increase with future generations of machines. In keeping with the large capital and recurrent costs of these machines, it is appropriate to devote effort to training and familiarization so that supercomputers are employed to best effect. This volume records the lectures delivered at a Summer School ...

  11. KAUST Supercomputing Laboratory

    KAUST Repository

    Bailey, April Renee

    2011-11-15

    KAUST has partnered with IBM to establish a Supercomputing Research Center. KAUST is hosting the Shaheen supercomputer, named after the Arabian falcon famed for its swiftness of flight. This 16-rack IBM Blue Gene/P system is equipped with 4 gigabytes of memory per node and is capable of 222 teraflops, making the KAUST campus the site of one of the world’s fastest supercomputers in an academic environment. KAUST is targeting petaflop capability within 3 years.

  12. KAUST Supercomputing Laboratory

    KAUST Repository

    Bailey, April Renee; Kaushik, Dinesh; Winfer, Andrew

    2011-01-01

    KAUST has partnered with IBM to establish a Supercomputing Research Center. KAUST is hosting the Shaheen supercomputer, named after the Arabian falcon famed for its swiftness of flight. This 16-rack IBM Blue Gene/P system is equipped with 4 gigabytes of memory per node and is capable of 222 teraflops, making the KAUST campus the site of one of the world’s fastest supercomputers in an academic environment. KAUST is targeting petaflop capability within 3 years.

  13. Supercomputing - Use Cases, Advances, The Future (1/2)

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Supercomputing has become a staple of science and the poster child for aggressive developments in silicon technology, energy efficiency and programming. In this series we examine the key components of supercomputing setups and the various advances – recent and past – that made headlines and delivered bigger and bigger machines. We also take a closer look at the future prospects of supercomputing, and the extent of its overlap with high throughput computing, in the context of main use cases ranging from oil exploration to market simulation. On the first day, we will focus on the history and theory of supercomputing, the top500 list and the hardware that makes supercomputers tick. Lecturer's short bio: Andrzej Nowak has 10 years of experience in computing technologies, primarily from CERN openlab and Intel. At CERN, he managed a research lab collaborating with Intel and was part of the openlab Chief Technology Office. Andrzej also worked closely and initiated projects with the private sector (e.g. HP an...

  14. Supercomputing - Use Cases, Advances, The Future (2/2)

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Supercomputing has become a staple of science and the poster child for aggressive developments in silicon technology, energy efficiency and programming. In this series we examine the key components of supercomputing setups and the various advances – recent and past – that made headlines and delivered bigger and bigger machines. We also take a closer look at the future prospects of supercomputing, and the extent of its overlap with high throughput computing, in the context of main use cases ranging from oil exploration to market simulation. On the second day, we will focus on software and software paradigms driving supercomputers, workloads that need supercomputing treatment, advances in technology and possible future developments. Lecturer's short bio: Andrzej Nowak has 10 years of experience in computing technologies, primarily from CERN openlab and Intel. At CERN, he managed a research lab collaborating with Intel and was part of the openlab Chief Technology Office. Andrzej also worked closely and i...

  15. Computational fluid dynamics research at the United Technologies Research Center requiring supercomputers

    Science.gov (United States)

    Landgrebe, Anton J.

    1987-01-01

    An overview of research activities at the United Technologies Research Center (UTRC) in the area of Computational Fluid Dynamics (CFD) is presented. The requirement and use of various levels of computers, including supercomputers, for the CFD activities is described. Examples of CFD directed toward applications to helicopters, turbomachinery, heat exchangers, and the National Aerospace Plane are included. Helicopter rotor codes for the prediction of rotor and fuselage flow fields and airloads were developed with emphasis on rotor wake modeling. Airflow and airload predictions and comparisons with experimental data are presented. Examples are presented of recent parabolized Navier-Stokes and full Navier-Stokes solutions for hypersonic shock-wave/boundary layer interaction, and hydrogen/air supersonic combustion. In addition, other examples of CFD efforts in turbomachinery Navier-Stokes methodology and separated flow modeling are presented. A brief discussion of the 3-tier scientific computing environment is also presented, in which the researcher has access to workstations, mid-size computers, and supercomputers.

  16. Computational Dimensionalities of Global Supercomputing

    Directory of Open Access Journals (Sweden)

    Richard S. Segall

    2013-12-01

    Full Text Available This Invited Paper pertains to the subject of my Plenary Keynote Speech at the 17th World Multi-Conference on Systemics, Cybernetics and Informatics (WMSCI 2013), held in Orlando, Florida on July 9-12, 2013. The title of my Plenary Keynote Speech was: "Dimensionalities of Computation: from Global Supercomputing to Data, Text and Web Mining", but this Invited Paper will focus only on the "Computational Dimensionalities of Global Supercomputing" and is based upon a summary of the contents of several individual articles that have been previously written with myself as lead author and published in [75], [76], [77], [78], [79], [80] and [11]. The topics of the Plenary Speech included Overview of Current Research in Global Supercomputing [75], Open-Source Software Tools for Data Mining Analysis of Genomic and Spatial Images using High Performance Computing [76], Data Mining Supercomputing with SAS™ JMP® Genomics ([77], [79], [80]), and Visualization by Supercomputing Data Mining [81]. ______________________ [11.] Committee on the Future of Supercomputing, National Research Council (2003), The Future of Supercomputing: An Interim Report, ISBN-13: 978-0-309-09016-2, http://www.nap.edu/catalog/10784.html [75.] Segall, Richard S.; Zhang, Qingyu and Cook, Jeffrey S. (2013), "Overview of Current Research in Global Supercomputing", Proceedings of Forty-Fourth Meeting of Southwest Decision Sciences Institute (SWDSI), Albuquerque, NM, March 12-16, 2013. [76.] Segall, Richard S. and Zhang, Qingyu (2010), "Open-Source Software Tools for Data Mining Analysis of Genomic and Spatial Images using High Performance Computing", Proceedings of 5th INFORMS Workshop on Data Mining and Health Informatics, Austin, TX, November 6, 2010. [77.] Segall, Richard S., Zhang, Qingyu and Pierce, Ryan M. (2010), "Data Mining Supercomputing with SAS™ JMP® Genomics: Research-in-Progress", Proceedings of 2010 Conference on Applied Research in Information Technology, sponsored by

  17. An assessment of worldwide supercomputer usage

    Energy Technology Data Exchange (ETDEWEB)

    Wasserman, H.J.; Simmons, M.L.; Hayes, A.H.

    1995-01-01

    This report provides a comparative study of advanced supercomputing usage in Japan and the United States as of Spring 1994. It is based on the findings of a group of US scientists whose careers have centered on programming, evaluating, and designing high-performance supercomputers for over ten years. The report is a follow-on to an assessment of supercomputing technology in Europe and Japan that was published in 1993. Whereas the previous study focused on supercomputer manufacturing capabilities, the primary focus of the current work was to compare where and how supercomputers are used. Research for this report was conducted through both literature studies and field research in Japan.

  18. Japanese supercomputer technology

    International Nuclear Information System (INIS)

    Buzbee, B.L.; Ewald, R.H.; Worlton, W.J.

    1982-01-01

    In February 1982, computer scientists from the Los Alamos National Laboratory and Lawrence Livermore National Laboratory visited several Japanese computer manufacturers. The purpose of these visits was to assess the state of the art of Japanese supercomputer technology and to advise Japanese computer vendors of the needs of the US Department of Energy (DOE) for more powerful supercomputers. The Japanese foresee a domestic need for large-scale computing capabilities for nuclear fusion, image analysis for the Earth Resources Satellite, meteorological forecasting, electrical power system analysis (power flow, stability, optimization), structural and thermal analysis of satellites, and very-large-scale integrated circuit design and simulation. To meet this need, Japan has launched an ambitious program to advance supercomputer technology. This program is described.

  19. Visualization environment of the large-scale data of JAEA's supercomputer system

    Energy Technology Data Exchange (ETDEWEB)

    Sakamoto, Kensaku [Japan Atomic Energy Agency, Center for Computational Science and e-Systems, Tokai, Ibaraki (Japan); Hoshi, Yoshiyuki [Research Organization for Information Science and Technology (RIST), Tokai, Ibaraki (Japan)

    2013-11-15

    In research and development across the various fields of nuclear energy, visualization of calculated data is especially useful for understanding simulation results in an intuitive way. Many researchers who run simulations on the supercomputer at the Japan Atomic Energy Agency (JAEA) are accustomed to transferring calculated data files from the supercomputer to their local PCs for visualization. In recent years, as calculated data have grown larger with improving supercomputer performance, reducing visualization processing time and using the JAEA network efficiently have become necessary. As a solution, we introduced a remote visualization system that can utilize parallel processors on the supercomputer and reduce network usage by transferring data from an intermediate stage of the visualization process. This paper reports a study on the performance of image processing with the remote visualization system. The visualization processing time is measured, and the influence of network speed is evaluated, by varying the drawing mode, the size of the visualization data and the number of processors. Based on this study, a guideline for using the remote visualization system is provided to show how the system can be used effectively. An upgrade policy for the next system is also shown. (author)

  20. A training program for scientific supercomputing users

    Energy Technology Data Exchange (ETDEWEB)

    Hanson, F.; Moher, T.; Sabelli, N.; Solem, A.

    1988-01-01

    There is a need for a mechanism to transfer supercomputing technology into the hands of scientists and engineers in such a way that they will acquire a foundation of knowledge that will permit integration of supercomputing as a tool in their research. Most computing center training emphasizes computer-specific information about how to use a particular computer system; most academic programs teach concepts to computer scientists. Only a few brief courses and new programs are designed for computational scientists. This paper describes an eleven-week training program aimed principally at graduate and postdoctoral students in computationally-intensive fields. The program is designed to balance the specificity of computing center courses, the abstractness of computer science courses, and the personal contact of traditional apprentice approaches. It is based on the experience of computer scientists and computational scientists, and consists of seminars and clinics given by many visiting and local faculty. It covers a variety of supercomputing concepts, issues, and practices related to architecture, operating systems, software design, numerical considerations, code optimization, graphics, communications, and networks. Its research component encourages understanding of scientific computing and supercomputer hardware issues. Flexibility in thinking about computing needs is emphasized by the use of several different supercomputer architectures, such as the Cray X/MP48 at the National Center for Supercomputing Applications at University of Illinois at Urbana-Champaign, IBM 3090 600E/VF at the Cornell National Supercomputer Facility, and Alliant FX/8 at the Advanced Computing Research Facility at Argonne National Laboratory. 11 refs., 6 tabs.

  1. Integration of Panda Workload Management System with supercomputers

    Science.gov (United States)

    De, K.; Jha, S.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Nilsson, P.; Novikov, A.; Oleynik, D.; Panitkin, S.; Poyda, A.; Read, K. F.; Ryabinkin, E.; Teslyuk, A.; Velikhov, V.; Wells, J. C.; Wenaus, T.

    2016-09-01

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment relies on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System to manage the workflow for all data processing on over 140 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3+ petaFLOPS, the next LHC data-taking runs will require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of the PanDA WMS with supercomputers in the United States, Europe and Russia (in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), the supercomputer at the National Research Center "Kurchatov Institute", IT4 in Ostrava, and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on Titan's multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads.
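The light-weight wrapper idea can be sketched as follows. This is a hypothetical, minimal stand-in, not the actual PanDA pilot code: the payload command, the working-directory naming, and the PMI_RANK environment variable are illustrative assumptions. Each process launched by the MPI runtime runs one independent single-threaded payload in its own working directory, so many serial jobs fill a multi-core allocation in parallel.

```python
import os
import subprocess
import sys

def run_payload(rank: int, workdir_base: str = "work") -> int:
    """Run one single-threaded payload for the given rank in an
    isolated working directory, returning its exit code."""
    workdir = f"{workdir_base}_{rank:05d}"
    os.makedirs(workdir, exist_ok=True)
    # Placeholder payload: a real pilot would stage input files here
    # and invoke the experiment's simulation binary instead.
    cmd = [sys.executable, "-c", f"print('rank {rank}: payload done')"]
    result = subprocess.run(cmd, cwd=workdir, capture_output=True, text=True)
    return result.returncode

if __name__ == "__main__":
    # The rank would normally come from the MPI launcher's environment
    # (PMI_RANK is an assumption); default to 0 for a standalone run.
    rank = int(os.environ.get("PMI_RANK", "0"))
    rc = run_payload(rank)
    print(f"rank {rank} exit code: {rc}")
```

Launched under an MPI runtime with one process per core, each rank executes its own serial workload, which is the essence of wrapping single-threaded jobs for a batch system that only accepts large parallel allocations.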

  2. Large scale computing in the Energy Research Programs

    International Nuclear Information System (INIS)

    1991-05-01

    The Energy Research Supercomputer Users Group (ERSUG) comprises all investigators using resources of the Department of Energy Office of Energy Research supercomputers. At the December 1989 meeting held at Florida State University (FSU), the ERSUG executive committee determined that the continuing rapid advances in computational sciences and computer technology demanded a reassessment of the role computational science should play in meeting DOE's commitments. Initial studies were to be performed for four subdivisions: (1) Basic Energy Sciences (BES) and Applied Mathematical Sciences (AMS), (2) Fusion Energy, (3) High Energy and Nuclear Physics, and (4) Health and Environmental Research. The first two subgroups produced formal subreports that provided a basis for several sections of this report. Additional information provided in the AMS/BES subreport is included as Appendix C in an abridged form that eliminates most duplication. Additionally, each member of the executive committee was asked to contribute area-specific assessments; these assessments are included in the next section. In the following sections, brief assessments are given for specific areas, a conceptual model is proposed in which the entire computational effort for energy research is best viewed as one giant nationwide computer, and then specific recommendations are made for the appropriate evolution of the system.

  3. Desktop supercomputer: what can it do?

    Science.gov (United States)

    Bogdanov, A.; Degtyarev, A.; Korkhov, V.

    2017-12-01

    The paper addresses the issues of solving complex problems that require the use of supercomputers or multiprocessor clusters, which are available to most researchers nowadays. Efficient distribution of high-performance computing resources according to actual application needs has been a major research topic since high-performance computing (HPC) technologies became widely introduced. At the same time, comfortable and transparent access to these resources has been a key user requirement. In this paper we discuss approaches to building a virtual private supercomputer available at the user's desktop: a virtual computing environment tailored specifically for a target user with a particular target application. We describe and evaluate possibilities to create the virtual supercomputer based on light-weight virtualization technologies, and analyze the efficiency of our approach compared to traditional methods of HPC resource management.

  4. Desktop supercomputer: what can it do?

    International Nuclear Information System (INIS)

    Bogdanov, A.; Degtyarev, A.; Korkhov, V.

    2017-01-01

    The paper addresses the issues of solving complex problems that require the use of supercomputers or multiprocessor clusters, which are available to most researchers nowadays. Efficient distribution of high-performance computing resources according to actual application needs has been a major research topic since high-performance computing (HPC) technologies became widely introduced. At the same time, comfortable and transparent access to these resources has been a key user requirement. In this paper we discuss approaches to building a virtual private supercomputer available at the user's desktop: a virtual computing environment tailored specifically for a target user with a particular target application. We describe and evaluate possibilities to create the virtual supercomputer based on light-weight virtualization technologies, and analyze the efficiency of our approach compared to traditional methods of HPC resource management.

  5. Status reports of supercomputing astrophysics in Japan

    International Nuclear Information System (INIS)

    Nakamura, Takashi; Nagasawa, Mikio

    1990-01-01

    The Workshop on Supercomputing Astrophysics was held at the National Laboratory for High Energy Physics (KEK, Tsukuba) from August 31 to September 2, 1989. More than 40 physicists and astronomers attended and discussed many topics in an informal atmosphere. The main purpose of this workshop was to survey theoretical activity in computational astrophysics in Japan. It also aimed to promote effective collaboration among the numerical experimentalists working on supercomputing techniques. The various subjects of the presented papers, spanning hydrodynamics, plasma physics, gravitating systems, radiative transfer and general relativity, are all stimulating. In fact, these numerical calculations have now become possible in Japan owing to the power of Japanese supercomputers such as the HITAC S820, Fujitsu VP400E and NEC SX-2. (J.P.N.)

  6. Supercomputers to transform Science

    CERN Multimedia

    2006-01-01

    "New insights into the structure of space and time, climate modeling, and the design of novel drugs, are but a few of the many research areas that will be transforned by the installation of three supercomputers at the Unversity of Bristol." (1/2 page)

  7. Computational Science with the Titan Supercomputer: Early Outcomes and Lessons Learned

    Science.gov (United States)

    Wells, Jack

    2014-03-01

    Modeling and simulation with petascale computing has supercharged the process of innovation and understanding, dramatically accelerating time-to-insight and time-to-discovery. This presentation will focus on early outcomes from the Titan supercomputer at the Oak Ridge National Laboratory. Titan has over 18,000 hybrid compute nodes consisting of both CPUs and GPUs. In this presentation, I will discuss the lessons we have learned in deploying Titan and preparing applications to move from conventional CPU architectures to a hybrid machine. I will present early results of materials applications running on Titan and the implications for the research community as we prepare for exascale supercomputers in the next decade. Lastly, I will provide an overview of user programs at the Oak Ridge Leadership Computing Facility, with specific information on how researchers may apply for allocations of computing resources. This research used resources of the Oak Ridge Leadership Computing Facility at the Oak Ridge National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC05-00OR22725.

  8. Comprehensive efficiency analysis of supercomputer resource usage based on system monitoring data

    Science.gov (United States)

    Mamaeva, A. A.; Shaykhislamov, D. I.; Voevodin, Vad V.; Zhumatiy, S. A.

    2018-03-01

    One of the main problems of modern supercomputers is the low efficiency of their usage, which leads to significant idle time of computational resources and, in turn, to a decrease in the speed of scientific research. This paper presents three approaches to studying the efficiency of supercomputer resource usage based on monitoring data analysis. The first approach analyzes computing resource utilization statistics, which makes it possible to identify different typical classes of programs, to explore the structure of the supercomputer job flow and to track overall trends in the supercomputer's behavior. The second approach is aimed specifically at analyzing off-the-shelf software packages and libraries installed on the supercomputer, since the efficiency of their usage is becoming an increasingly important factor for the efficient functioning of the entire supercomputer. Within the third approach, abnormal jobs – jobs with abnormally inefficient behavior that differs significantly from the standard behavior of the overall supercomputer job flow – are detected. For each approach, results obtained in practice at the Supercomputer Center of Moscow State University are demonstrated.
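The third approach, detecting abnormally inefficient jobs, can be illustrated with a minimal statistical sketch: flag jobs whose monitored metric (e.g. CPU load) deviates strongly from the mean of the job flow. This is a simplified stand-in for the paper's method, not the authors' actual detector; the function name and threshold are ours:

```python
from statistics import mean, stdev

def zscore_outliers(values, threshold=3.0):
    """Return indices of jobs whose metric lies more than `threshold`
    standard deviations from the mean of the whole job flow."""
    m, s = mean(values), stdev(values)
    if s == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - m) / s > threshold]

# Example: twenty ordinary jobs and one wildly different one.
cpu_loads = [1.0] * 20 + [100.0]
print(zscore_outliers(cpu_loads))  # prints the anomalous job's index
```

A production detector would of course compare richer per-job profiles (CPU, memory, network, I/O) rather than a single scalar.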

  9. Dust modelling and forecasting in the Barcelona Supercomputing Center: Activities and developments

    Energy Technology Data Exchange (ETDEWEB)

    Perez, C; Baldasano, J M; Jimenez-Guerrero, P; Jorba, O; Haustein, K; Basart, S [Earth Sciences Department. Barcelona Supercomputing Center. Barcelona (Spain); Cuevas, E [Izaña Atmospheric Research Center. Agencia Estatal de Meteorologia, Tenerife (Spain); Nickovic, S [Atmospheric Research and Environment Branch, World Meteorological Organization, Geneva (Switzerland)], E-mail: carlos.perez@bsc.es

    2009-03-01

    The Barcelona Supercomputing Center (BSC) is the National Supercomputer Facility in Spain, hosting MareNostrum, one of the most powerful Supercomputers in Europe. The Earth Sciences Department of BSC operates daily regional dust and air quality forecasts and conducts intensive modelling research for short-term operational prediction. This contribution summarizes the latest developments and current activities in the field of sand and dust storm modelling and forecasting.

  10. Dust modelling and forecasting in the Barcelona Supercomputing Center: Activities and developments

    International Nuclear Information System (INIS)

    Perez, C; Baldasano, J M; Jimenez-Guerrero, P; Jorba, O; Haustein, K; Basart, S; Cuevas, E; Nickovic, S

    2009-01-01

    The Barcelona Supercomputing Center (BSC) is the National Supercomputer Facility in Spain, hosting MareNostrum, one of the most powerful Supercomputers in Europe. The Earth Sciences Department of BSC operates daily regional dust and air quality forecasts and conducts intensive modelling research for short-term operational prediction. This contribution summarizes the latest developments and current activities in the field of sand and dust storm modelling and forecasting.

  11. Status of supercomputers in the US

    International Nuclear Information System (INIS)

    Fernbach, S.

    1985-01-01

    Current supercomputers, that is, the Class VI machines which first became available in 1976, are being delivered in greater quantity than ever before. In addition, manufacturers are busily working on Class VII machines to be ready for delivery in CY 1987. Mainframes are being modified or designed to take on some features of the supercomputers, and new companies, intent on either competing directly in the supercomputer arena or providing entry-level systems from which to graduate to supercomputers, are springing up everywhere. Even well-established organizations like IBM and CDC are adding machines with vector instructions to their repertoires. Japanese-manufactured supercomputers are also being introduced into the U.S. Will these begin to compete with those of U.S. manufacture? Are they truly competitive? It turns out that, from both the hardware and software points of view, they may be superior. We may be facing the same problems in supercomputers that we faced in video systems.

  12. Automatic discovery of the communication network topology for building a supercomputer model

    Science.gov (United States)

    Sobolev, Sergey; Stefanov, Konstantin; Voevodin, Vadim

    2016-10-01

    The Research Computing Center of Lomonosov Moscow State University is developing the Octotron software suite for automatic monitoring and mitigation of emergency situations in supercomputers so as to maximize hardware reliability. The suite is based on a software model of the supercomputer. The model uses a graph to describe the computing system components and their interconnections. One of the most complex components of a supercomputer that needs to be included in the model is its communication network. This work describes the proposed approach for automatically discovering the Ethernet communication network topology in a supercomputer and its description in terms of the Octotron model. This suite automatically detects computing nodes and switches, collects information about them and identifies their interconnections. The application of this approach is demonstrated on the "Lomonosov" and "Lomonosov-2" supercomputers.
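The core of the approach above is turning per-device neighbor information into the graph that the Octotron model uses. A minimal sketch of that merge step (the function name and input shape are ours, not Octotron's API):

```python
def build_topology(neighbor_reports):
    """Build an undirected adjacency map of the network from per-device
    neighbor lists, as might be collected via LLDP/SNMP from each switch."""
    graph = {}
    for device, neighbors in neighbor_reports.items():
        graph.setdefault(device, set())
        for n in neighbors:
            graph[device].add(n)
            graph.setdefault(n, set()).add(device)  # links are bidirectional
    return graph

# Reports may be partial and overlapping; the merge is idempotent.
topo = build_topology({"sw1": ["node1", "node2"], "sw2": ["node2", "sw1"]})
```

Because each link is recorded from both endpoints, a node that never answered a query still appears in the graph as long as some switch reported it.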

  13. Research center Juelich to install Germany's most powerful supercomputer new IBM System for science and research will achieve 5.8 trillion computations per second

    CERN Multimedia

    2002-01-01

    "The Research Center Juelich, Germany, and IBM today announced that they have signed a contract for the delivery and installation of a new IBM supercomputer at the Central Institute for Applied Mathematics" (1/2 page).

  14. Integration Of PanDA Workload Management System With Supercomputers for ATLAS and Data Intensive Science

    Energy Technology Data Exchange (ETDEWEB)

    De, K [University of Texas at Arlington; Jha, S [Rutgers University; Klimentov, A [Brookhaven National Laboratory (BNL); Maeno, T [Brookhaven National Laboratory (BNL); Nilsson, P [Brookhaven National Laboratory (BNL); Oleynik, D [University of Texas at Arlington; Panitkin, S [Brookhaven National Laboratory (BNL); Wells, Jack C [ORNL; Wenaus, T [Brookhaven National Laboratory (BNL)

    2016-01-01

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited with the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with supercomputers in the United States, Europe and Russia (in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), the MIRA supercomputer at the Argonne Leadership Computing Facility (ALCF), the supercomputer at the National Research Center Kurchatov Institute, IT4 in Ostrava and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads on several supercomputing platforms for the ALICE and ATLAS experiments and has been in full production for ATLAS since September 2015.
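The MPI-wrapper idea described in this record, one serial payload per rank, can be sketched roughly as follows. This is a simplified stand-in, not PanDA code; a real wrapper would obtain rank and size from mpi4py, and the PMI_* environment variable names used as a fallback here are illustrative:

```python
import os
import subprocess

def my_jobs(jobs, rank, size):
    """Static round-robin split of a list of serial jobs across ranks."""
    return [job for i, job in enumerate(jobs) if i % size == rank]

def run_rank_jobs(jobs):
    """Each rank runs only its own share of the serial payloads, so
    launching this script under mpirun fills every core with work."""
    rank = int(os.environ.get("PMI_RANK", "0"))
    size = int(os.environ.get("PMI_SIZE", "1"))
    for cmd in my_jobs(jobs, rank, size):
        subprocess.run(cmd, shell=True, check=True)
```

The point of the technique is that the batch system sees one large MPI job, which leadership-class schedulers favor, while the actual workloads remain unmodified single-threaded executables.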

  15. Convex unwraps its first grown-up supercomputer

    Energy Technology Data Exchange (ETDEWEB)

    Manuel, T.

    1988-03-03

    Convex Computer Corp.'s new supercomputer family is even more of an industry blockbuster than its first system. With a tenfold jump in performance, it's far from just an incremental upgrade over its first minisupercomputer, the C-1. The heart of the new family, the new C-2 processor, churning at 50 million floating-point operations/s, spawns a group of systems whose performance could pass for that of some fancy supercomputers, namely those of the Cray Research Inc. family. When added to the C-1, Convex's five new supercomputers create the C series, a six-member product group offering a performance range from 20 to 200 Mflops. They mark an important transition for Convex from a one-product high-tech startup to a multinational company with a wide-ranging product line. It's a tough transition, but the Richardson, Texas, company seems to be making it. The extended product line propels Convex into the upper end of the minisupercomputer class and nudges it into the low end of the big supercomputers. It positions Convex in an uncrowded segment of the market, in the $500,000 to $1 million range, offering 50 to 200 Mflops of performance. The company is making this move because the minisuper area, which it pioneered, quickly became crowded with new vendors, causing prices and gross margins to drop drastically.

  16. TOP500 Supercomputers for June 2004

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2004-06-23

    23rd Edition of TOP500 List of World's Fastest Supercomputers Released: Japan's Earth Simulator Enters Third Year in Top Position MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a closely watched event in the world of high-performance computing, the 23rd edition of the TOP500 list of the world's fastest supercomputers was released today (June 23, 2004) at the International Supercomputer Conference in Heidelberg, Germany.

  17. TOP500 Supercomputers for June 2005

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2005-06-22

    25th Edition of TOP500 List of World's Fastest Supercomputers Released: DOE/LLNL BlueGene/L and IBM gain Top Positions MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a closely watched event in the world of high-performance computing, the 25th edition of the TOP500 list of the world's fastest supercomputers was released today (June 22, 2005) at the 20th International Supercomputing Conference (ISC2005) in Heidelberg, Germany.

  18. QCD on the BlueGene/L Supercomputer

    International Nuclear Information System (INIS)

    Bhanot, G.; Chen, D.; Gara, A.; Sexton, J.; Vranas, P.

    2005-01-01

    In June 2004 QCD was simulated for the first time at sustained speed exceeding 1 TeraFlops in the BlueGene/L supercomputer at the IBM T.J. Watson Research Lab. The implementation and performance of QCD in the BlueGene/L is presented

  19. QCD on the BlueGene/L Supercomputer

    Science.gov (United States)

    Bhanot, G.; Chen, D.; Gara, A.; Sexton, J.; Vranas, P.

    2005-03-01

    In June 2004 QCD was simulated for the first time at sustained speed exceeding 1 TeraFlops in the BlueGene/L supercomputer at the IBM T.J. Watson Research Lab. The implementation and performance of QCD in the BlueGene/L is presented.

  20. TOP500 Supercomputers for November 2003

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2003-11-16

    22nd Edition of TOP500 List of World's Fastest Supercomputers Released MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a much-anticipated event in the world of high-performance computing, the 22nd edition of the TOP500 list of the world's fastest supercomputers was released today (November 16, 2003). The Earth Simulator supercomputer retains the number one position with its Linpack benchmark performance of 35.86 Tflop/s (''teraflops'' or trillions of calculations per second). It was built by NEC and installed last year at the Earth Simulator Center in Yokohama, Japan.
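Linpack figures like the 35.86 Tflop/s above come from dividing the benchmark's floating-point operation count by the wall-clock time. A minimal sketch using the standard HPL accounting of ~(2/3)n³ + 2n² operations for solving one dense n×n linear system (the function names are ours):

```python
def linpack_flops(n):
    """Approximate floating-point operation count for solving a dense
    n x n linear system via LU factorization (the Linpack/HPL kernel)."""
    return (2.0 / 3.0) * n**3 + 2.0 * n**2

def tflops(n, seconds):
    """Sustained Tflop/s for a Linpack run of size n taking `seconds`."""
    return linpack_flops(n) / seconds / 1e12
```

Because the operation count grows as n³, TOP500 runs use enormous problem sizes; the ranking metric is this sustained rate, not the machine's theoretical peak.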

  1. INTEL: Intel based systems move up in supercomputing ranks

    CERN Multimedia

    2002-01-01

    "The TOP500 supercomputer rankings released today at the Supercomputing 2002 conference show a dramatic increase in the number of Intel-based systems being deployed in high-performance computing (HPC) or supercomputing areas" (1/2 page).

  2. Integration Of PanDA Workload Management System With Supercomputers for ATLAS and Data Intensive Science

    Science.gov (United States)

    Klimentov, A.; De, K.; Jha, S.; Maeno, T.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Wells, J.; Wenaus, T.

    2016-10-01

    The LHC, operating at CERN, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than the Grid can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with supercomputers in the United States, in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility. The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCF's multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads on several supercomputing platforms for the ALICE and ATLAS experiments and it is in full production for ATLAS since September 2015. We will present our current accomplishments with running PanDA at supercomputers and demonstrate our ability to use PanDA as a portal independent of the

  3. Integration Of PanDA Workload Management System With Supercomputers for ATLAS and Data Intensive Science

    International Nuclear Information System (INIS)

    Klimentov, A; Maeno, T; Nilsson, P; Panitkin, S; Wenaus, T; De, K; Oleynik, D; Jha, S; Wells, J

    2016-01-01

    The LHC, operating at CERN, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than the Grid can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with supercomputers in the United States, in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility. The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCF's multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads on several supercomputing platforms for the ALICE and ATLAS experiments and it is in full production for ATLAS since September 2015. We will present our current accomplishments with running PanDA at supercomputers and demonstrate our ability to use PanDA as a portal independent of the

  4. World's fastest supercomputer opens up to users

    Science.gov (United States)

    Xin, Ling

    2016-08-01

    China's latest supercomputer - Sunway TaihuLight - has claimed the crown as the world's fastest computer according to the latest TOP500 list, released at the International Supercomputer Conference in Frankfurt in late June.

  5. OpenMP Performance on the Columbia Supercomputer

    Science.gov (United States)

    Haoqiang, Jin; Hood, Robert

    2005-01-01

    This presentation discusses the Columbia supercomputer, one of the world's fastest, providing 61 TFLOPs (10/20/04). It was conceived, designed, built, and deployed in just 120 days. Columbia is a 20-node supercomputer built on proven 512-processor nodes, and the largest SGI system in the world with over 10,000 Intel Itanium 2 processors. It provides the largest node size incorporating commodity parts (512 processors) and the largest shared-memory environment (2048 processors); with 88% efficiency it tops the scalar systems on the Top500 list.
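The 88% efficiency quoted for the shared-memory environment is the usual parallel-efficiency metric: achieved speedup divided by processor count. A small illustrative sketch (the function names and example timings are ours):

```python
def speedup(t_serial, t_parallel):
    """How many times faster the parallel run is than the serial one."""
    return t_serial / t_parallel

def parallel_efficiency(t_serial, t_parallel, nprocs):
    """Fraction of ideal linear speedup actually achieved on nprocs."""
    return speedup(t_serial, t_parallel) / nprocs

# Hypothetical timings: a run that would take 880 s on one processor
# finishing in 1 s on 1000 processors corresponds to 88% efficiency.
eff = parallel_efficiency(880.0, 1.0, 1000)
```

Efficiencies below 1.0 reflect communication, synchronization, and load-imbalance overheads that grow with processor count.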

  6. Novel Supercomputing Approaches for High Performance Linear Algebra Using FPGAs, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Supercomputing plays a major role in many areas of science and engineering, and it has had tremendous impact for decades in areas such as aerospace, defense, energy,...

  7. Application of Supercomputer Technologies for Simulation Of Socio-Economic Systems

    Directory of Open Access Journals (Sweden)

    Vladimir Valentinovich Okrepilov

    2015-06-01

    To date, extensive experience has been accumulated in the investigation of problems related to quality, the assessment of management systems, and the modeling of economic system sustainability. These studies have created the basis for a new research area, the Economics of Quality, whose tools make it possible to construct mathematical models that adequately reflect the role of quality in the natural, technical and social regularities governing the functioning of complex socio-economic systems. In our firm belief, the extensive application and development of such models, together with system modeling using supercomputer technologies, will bring research on socio-economic systems to an essentially new level. Moreover, this line of research makes a significant contribution to the simulation of multi-agent social systems and, no less importantly, belongs to the priority areas of science and technology development in our country. This article is devoted to the application of supercomputer technologies in the social sciences, first of all to the technical realization of large-scale agent-focused models (AFM). The essence of this tool is that, owing to the growth of computing power, it has become possible to describe the behavior of many separate elements of a complex system, such as a socio-economic system. The article also reviews the experience of foreign scientists and practitioners in running AFM on supercomputers, presents the example of an AFM developed at CEMI RAS, and analyzes the stages and methods of efficiently mapping the computational kernel of a multi-agent system onto the architecture of a modern supercomputer. Experiments based on model simulation, forecasting the population of St. Petersburg under three scenarios as one of the major factors influencing the development of the socio-economic system and the quality of life of the population, are presented.
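As a toy illustration of the agent-focused modeling idea described above, here is a minimal sketch in which each agent is represented only by an age, with an age-dependent death probability and a uniform birth probability per survivor. All names and rates are illustrative assumptions, not the CEMI RAS model:

```python
import random

def step_population(ages, birth_rate, death_rate_by_age):
    """Advance the agent population by one simulated year: each agent
    survives with probability 1 - death_rate_by_age(age), and each
    survivor produces a newborn with probability birth_rate."""
    survivors = [age + 1 for age in ages
                 if random.random() >= death_rate_by_age(age)]
    births = sum(1 for _ in survivors if random.random() < birth_rate)
    return survivors + [0] * births

# A forecast is then repeated stepping under a chosen scenario
# (hypothetical rates, purely for illustration):
population = [random.randint(0, 90) for _ in range(1000)]
for year in range(10):
    population = step_population(population, 0.012, lambda a: 0.001 * a)
```

Real AFMs carry many attributes per agent (income, location, household ties), which is precisely why supercomputers become necessary at realistic population scales.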

  8. Summaries of research and development activities by using supercomputer system of JAEA in FY2015. April 1, 2015 - March 31, 2016

    International Nuclear Information System (INIS)

    2017-01-01

    Japan Atomic Energy Agency (JAEA) conducts research and development (R and D) in various fields related to nuclear power as a comprehensive institution of nuclear energy R and Ds, and utilizes computational science and technology in many activities. As about 20 percent of papers published by JAEA are concerned with R and D using computational science, the supercomputer system of JAEA has become an important infrastructure to support computational science and technology. In FY2015, the system was used for R and D aiming to restore Fukushima (nuclear plant decommissioning and environmental restoration) as a priority issue, as well as for JAEA's major projects such as Fast Reactor Cycle System, Fusion R and D and Quantum Beam Science. This report presents a great number of R and D results accomplished by using the system in FY2015, as well as user support, operational records and overviews of the system, and so on. (author)

  9. Summaries of research and development activities by using supercomputer system of JAEA in FY2014. April 1, 2014 - March 31, 2015

    International Nuclear Information System (INIS)

    2016-02-01

    Japan Atomic Energy Agency (JAEA) conducts research and development (R and D) in various fields related to nuclear power as a comprehensive institution of nuclear energy R and Ds, and utilizes computational science and technology in many activities. As about 20 percent of papers published by JAEA are concerned with R and D using computational science, the supercomputer system of JAEA has become an important infrastructure to support computational science and technology. In FY2014, the system was used for R and D aiming to restore Fukushima (nuclear plant decommissioning and environmental restoration) as a priority issue, as well as for JAEA's major projects such as Fast Reactor Cycle System, Fusion R and D and Quantum Beam Science. This report presents a great number of R and D results accomplished by using the system in FY2014, as well as user support, operational records and overviews of the system, and so on. (author)

  10. Summaries of research and development activities by using supercomputer system of JAEA in FY2013. April 1, 2013 - March 31, 2014

    International Nuclear Information System (INIS)

    2015-02-01

    Japan Atomic Energy Agency (JAEA) conducts research and development (R and D) in various fields related to nuclear power as a comprehensive institution of nuclear energy R and Ds, and utilizes computational science and technology in many activities. As about 20 percent of papers published by JAEA are concerned with R and D using computational science, the supercomputer system of JAEA has become an important infrastructure to support computational science and technology utilization. In FY2013, the system was used not only for JAEA's major projects such as Fast Reactor Cycle System, Fusion R and D and Quantum Beam Science, but also for R and D aiming to restore Fukushima (nuclear plant decommissioning and environmental restoration) as a priority issue. This report presents a great amount of R and D results accomplished by using the system in FY2013, as well as user support, operational records and overviews of the system, and so on. (author)

  11. Summaries of research and development activities by using supercomputer system of JAEA in FY2012. April 1, 2012 - March 31, 2013

    International Nuclear Information System (INIS)

    2014-01-01

    Japan Atomic Energy Agency (JAEA) conducts research and development (R and D) in various fields related to nuclear power as a comprehensive institution of nuclear energy R and Ds, and utilizes computational science and technology in many activities. As more than 20 percent of papers published by JAEA are concerned with R and D using computational science, the supercomputer system of JAEA has become an important infrastructure to support computational science and technology utilization. In FY2012, the system was used not only for JAEA's major projects such as Fast Reactor Cycle System, Fusion R and D and Quantum Beam Science, but also for R and D aiming to restore Fukushima (nuclear plant decommissioning and environmental restoration) as a priority issue. This report presents a great amount of R and D results accomplished by using the system in FY2012, as well as user support, operational records and overviews of the system, and so on. (author)

  12. Summaries of research and development activities by using supercomputer system of JAEA in FY2011. April 1, 2011 - March 31, 2012

    International Nuclear Information System (INIS)

    2013-01-01

    Japan Atomic Energy Agency (JAEA) conducts research and development (R and D) in various fields related to nuclear power as a comprehensive institution of nuclear energy R and Ds, and utilizes computational science and technology in many activities. As more than 20 percent of papers published by JAEA are concerned with R and D using computational science, the supercomputer system of JAEA has become an important infrastructure to support computational science and technology utilization. In FY2011, the system was used for analyses of the accident at the Fukushima Daiichi Nuclear Power Station and establishment of radioactive decontamination plan, as well as the JAEA's major projects such as Fast Reactor Cycle System, Fusion R and D and Quantum Beam Science. This report presents a great amount of R and D results accomplished by using the system in FY2011, as well as user support structure, operational records and overviews of the system, and so on. (author)

  13. Advanced parallel processing with supercomputer architectures

    International Nuclear Information System (INIS)

    Hwang, K.

    1987-01-01

    This paper investigates advanced parallel processing techniques and innovative hardware/software architectures that can be applied to boost the performance of supercomputers. Critical issues on architectural choices, parallel languages, compiling techniques, resource management, concurrency control, programming environment, parallel algorithms, and performance enhancement methods are examined and the best answers are presented. The authors cover advanced processing techniques suitable for supercomputers, high-end mainframes, minisupers, and array processors. The coverage emphasizes vectorization, multitasking, multiprocessing, and distributed computing. In order to achieve these operation modes, parallel languages, smart compilers, synchronization mechanisms, load balancing methods, mapping parallel algorithms, operating system functions, application library, and multidiscipline interactions are investigated to ensure high performance. At the end, they assess the potentials of optical and neural technologies for developing future supercomputers

  14. The Pawsey Supercomputer geothermal cooling project

    Science.gov (United States)

    Regenauer-Lieb, K.; Horowitz, F.; Western Australian Geothermal Centre Of Excellence, T.

    2010-12-01

    The Australian Government has funded the Pawsey supercomputer in Perth, Western Australia, providing computational infrastructure intended to support the future operations of the Australian Square Kilometre Array radiotelescope and to boost next-generation computational geosciences in Australia. Supplementary funds have been directed to the development of a geothermal exploration well to research the potential for direct heat use applications at the Pawsey Centre site. Cooling the Pawsey supercomputer may be achieved by geothermal heat exchange rather than by conventional electrical power cooling, thus reducing the carbon footprint of the Pawsey Centre and demonstrating an innovative green technology that is widely applicable in industry and urban centres across the world. The exploration well is scheduled to be completed in 2013, with drilling due to commence in the third quarter of 2011. One year is allocated to finalizing the design of the exploration, monitoring and research well. Success in the geothermal exploration and research program will result in an industrial-scale geothermal cooling facility at the Pawsey Centre, and will provide a world-class student training environment in geothermal energy systems. A similar system is partially funded and in advanced planning to provide base-load air-conditioning for the main campus of the University of Western Australia. Both systems are expected to draw ~80-95 degrees C water from aquifers lying between 2000 and 3000 meters depth from naturally permeable rocks of the Perth sedimentary basin. The geothermal water will be run through absorption chilling devices, which only require heat (as opposed to mechanical work) to power a chilled water stream adequate to meet the cooling requirements. Once the heat has been removed from the geothermal water, licensing issues require the water to be re-injected back into the aquifer system. These systems are intended to demonstrate the feasibility of powering large-scale air

  15. Extending ATLAS Computing to Commercial Clouds and Supercomputers

    CERN Document Server

    Nilsson, P; The ATLAS collaboration; Filipcic, A; Klimentov, A; Maeno, T; Oleynik, D; Panitkin, S; Wenaus, T; Wu, W

    2014-01-01

    The Large Hadron Collider will resume data collection in 2015 with substantially increased computing requirements relative to its first 2009-2013 run. A near doubling of the energy and the data rate, high level of event pile-up, and detector upgrades will mean the number and complexity of events to be analyzed will increase dramatically. A naive extrapolation of the Run 1 experience would suggest that a 5-6 fold increase in computing resources are needed - impossible within the anticipated flat computing budgets in the near future. Consequently ATLAS is engaged in an ambitious program to expand its computing to all available resources, notably including opportunistic use of commercial clouds and supercomputers. Such resources present new challenges in managing heterogeneity, supporting data flows, parallelizing workflows, provisioning software, and other aspects of distributed computing, all while minimizing operational load. We will present the ATLAS experience to date with clouds and supercomputers, and des...

  16. Guide to dataflow supercomputing basic concepts, case studies, and a detailed example

    CERN Document Server

    Milutinovic, Veljko; Trifunovic, Nemanja; Giorgi, Roberto

    2015-01-01

    This unique text/reference describes an exciting and novel approach to supercomputing in the DataFlow paradigm. The major advantages and applications of this approach are clearly described, and a detailed explanation of the programming model is provided using simple yet effective examples. The work is developed from a series of lecture courses taught by the authors in more than 40 universities across more than 20 countries, and from research carried out by Maxeler Technologies, Inc. Topics and features: presents a thorough introduction to DataFlow supercomputing for big data problems; revie

  17. Supporting Scientific Research with the Energy Sciences Network

    CERN Multimedia

    CERN. Geneva; Monga, Inder

    2016-01-01

The Energy Sciences Network (ESnet) is a high-performance, unclassified national network built to support scientific research. Funded by the U.S. Department of Energy’s Office of Science (SC) and managed by Lawrence Berkeley National Laboratory, ESnet provides services to more than 40 DOE research sites, including the entire National Laboratory system, its supercomputing facilities, and its major scientific instruments. ESnet also connects to 140 research and commercial networks, permitting DOE-funded scientists to productively collaborate with partners around the world. ESnet Division Director (Interim) Inder Monga and ESnet Networking Engineer David Mitchell will present current ESnet projects and research activities which help support the HEP community. ESnet helps support the CERN community by providing 100Gbps trans-Atlantic network transport for the LHCONE and LHCOPN services. ESnet is also actively engaged in researching connectivity to cloud computing resources for HEP workflows a...

  18. TOP500 Supercomputers for November 2004

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2004-11-08

24th Edition of TOP500 List of World's Fastest Supercomputers Released: DOE/IBM BlueGene/L and NASA/SGI's Columbia gain Top Positions MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a closely watched event in the world of high-performance computing, the 24th edition of the TOP500 list of the world's fastest supercomputers was released today (November 8, 2004) at the SC2004 Conference in Pittsburgh, Pa.

  19. TOP500 Supercomputers for June 2003

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2003-06-23

21st Edition of TOP500 List of World's Fastest Supercomputers Released MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a much-anticipated event in the world of high-performance computing, the 21st edition of the TOP500 list of the world's fastest supercomputers was released today (June 23, 2003). The Earth Simulator supercomputer built by NEC and installed last year at the Earth Simulator Center in Yokohama, Japan, with its Linpack benchmark performance of 35.86 Tflop/s (teraflops, or trillions of calculations per second), retains the number one position. The number 2 position is held by the re-measured ASCI Q system at Los Alamos National Laboratory. With 13.88 Tflop/s, it is the second system ever to exceed the 10 Tflop/s mark. ASCI Q was built by Hewlett-Packard and is based on the AlphaServer SC computer system.

  20. TOP500 Supercomputers for June 2002

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2002-06-20

19th Edition of TOP500 List of World's Fastest Supercomputers Released MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a much-anticipated event in the world of high-performance computing, the 19th edition of the TOP500 list of the world's fastest supercomputers was released today (June 20, 2002). The recently installed Earth Simulator supercomputer at the Earth Simulator Center in Yokohama, Japan, is, as expected, the clear new number 1. Its performance of 35.86 Tflop/s (trillions of calculations per second) running the Linpack benchmark is almost five times higher than the performance of the now No. 2 IBM ASCI White system at Lawrence Livermore National Laboratory (7.2 Tflop/s). This powerful leapfrogging to the top by a system so much faster than the previous top system is unparalleled in the history of the TOP500.

  1. Development of a Cloud Resolving Model for Heterogeneous Supercomputers

    Science.gov (United States)

    Sreepathi, S.; Norman, M. R.; Pal, A.; Hannah, W.; Ponder, C.

    2017-12-01

A cloud resolving climate model is needed to reduce major systematic errors in climate simulations due to structural uncertainty in numerical treatments of convection, such as convective storm systems. This research describes the porting effort to enable the SAM (System for Atmosphere Modeling) cloud resolving model to run on heterogeneous supercomputers using GPUs (Graphics Processing Units). We have isolated a standalone configuration of SAM that is targeted for integration into the DOE ACME (Accelerated Climate Modeling for Energy) Earth System model. We have identified key computational kernels from the model and offloaded them to a GPU using the OpenACC programming model. Furthermore, we are investigating various optimization strategies intended to enhance GPU utilization, including loop fusion/fission, coalesced data access, and loop refactoring to a higher abstraction level. We will present early performance results and lessons learned, as well as optimization strategies. The computational platform used in this study is the Summitdev system, an early testbed that is one generation removed from Summit, the next leadership-class supercomputer at Oak Ridge National Laboratory. The system contains 54 nodes, wherein each node has 2 IBM POWER8 CPUs and 4 NVIDIA Tesla P100 GPUs. This work is part of a larger project, the ACME-MMF component of the U.S. Department of Energy (DOE) Exascale Computing Project. The ACME-MMF approach addresses structural uncertainty in cloud processes by replacing traditional parameterizations with a cloud resolving "superparameterization" within each grid cell of the global climate model. Superparameterization dramatically increases arithmetic intensity, making the MMF approach an ideal strategy to achieve good performance on emerging exascale computing architectures. The goal of the project is to integrate superparameterization into ACME and explore its full potential to scientifically and computationally advance climate simulation and prediction.

  2. Comments on the parallelization efficiency of the Sunway TaihuLight supercomputer

    OpenAIRE

    Végh, János

    2016-01-01

In the world of supercomputers, the large number of processors requires minimizing the inefficiencies of parallelization, which appear as a sequential part of the program from the point of view of Amdahl's law. The recently suggested new figure of merit is applied to the recently presented supercomputer, and the timeline of "Top 500" supercomputers is scrutinized using the metric. It is demonstrated that, in addition to the computing performance and power consumption, the new supercomputer i...
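The record above invokes Amdahl's law, under which the sequential fraction of a program bounds achievable speedup no matter how many processors are added. A minimal sketch of that bound (the fractions and processor counts are illustrative, not taken from the paper):

```python
def amdahl_speedup(serial_fraction: float, n_processors: int) -> float:
    """Upper bound on parallel speedup when serial_fraction of the
    work cannot be parallelized (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_processors)

# Even 0.1% sequential work caps speedup near 1000, regardless of
# how many of a supercomputer's cores are applied to the problem:
print(round(amdahl_speedup(0.001, 10_000_000), 1))      # 999.9
print(round(amdahl_speedup(0.001, 10_000_000_000), 1))  # 1000.0 (asymptote 1/0.001)
```

This is why a figure of merit for machines with millions of cores must account for the effective sequential fraction, not just peak performance and power.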

  3. The ETA10 supercomputer system

    International Nuclear Information System (INIS)

    Swanson, C.D.

    1987-01-01

The ETA Systems, Inc. ETA 10 is a next-generation supercomputer featuring multiprocessing, a large hierarchical memory system, high performance input/output, and network support for both batch and interactive processing. Advanced technology used in the ETA 10 includes liquid nitrogen cooled CMOS logic with 20,000 gates per chip, a single printed circuit board for each CPU, and high density static and dynamic MOS memory chips. Software for the ETA 10 includes an underlying kernel that supports multiple user environments, a new ETA FORTRAN compiler with an advanced automatic vectorizer, a multitasking library and debugging tools. Possible developments for future supercomputers from ETA Systems are discussed. (orig.)

  4. Development of seismic tomography software for hybrid supercomputers

    Science.gov (United States)

    Nikitin, Alexandr; Serdyukov, Alexandr; Duchkov, Anton

    2015-04-01

Seismic tomography is a technique for computing the velocity model of a geologic structure from the first-arrival travel times of seismic waves. The technique is used in processing of regional and global seismic data, in seismic exploration for prospecting of mineral and hydrocarbon deposits, and in seismic engineering for monitoring the condition of engineering structures and the surrounding host medium. As a consequence of the development of seismic monitoring systems and the increasing volume of seismic data, there is a growing need for new, more effective computational algorithms for seismic tomography applications, with improved performance, accuracy and resolution. To achieve this goal, it is necessary to use modern high performance computing systems, such as supercomputers with hybrid architecture that use not only CPUs, but also accelerators and co-processors for computation. The goal of this research is the development of parallel seismic tomography algorithms and a software package for such systems, to be used in processing large volumes of seismic data (hundreds of gigabytes and more). These algorithms and the software package will be optimized for the most common computing devices used in modern hybrid supercomputers, such as Intel Xeon CPUs, NVIDIA Tesla accelerators and Intel Xeon Phi co-processors. In this work, the following general scheme of seismic tomography is utilized. Using an eikonal equation solver, arrival times of seismic waves are computed based on an assumed velocity model of the geologic structure being analyzed. In order to solve the linearized inverse problem, a tomographic matrix is computed that connects model adjustments with travel time residuals, and the resulting system of linear equations is regularized and solved to adjust the model. The effectiveness of parallel implementations of existing algorithms on target architectures is considered. During the first stage of this work, algorithms were developed for execution on

  5. FPS scientific computers and supercomputers in chemistry

    International Nuclear Information System (INIS)

    Curington, I.J.

    1987-01-01

    FPS Array Processors, scientific computers, and highly parallel supercomputers are used in nearly all aspects of compute-intensive computational chemistry. A survey is made of work utilizing this equipment, both published and current research. The relationship of the computer architecture to computational chemistry is discussed, with specific reference to Molecular Dynamics, Quantum Monte Carlo simulations, and Molecular Graphics applications. Recent installations of the FPS T-Series are highlighted, and examples of Molecular Graphics programs running on the FPS-5000 are shown

  6. Applications of supercomputing and the utility industry: Calculation of power transfer capabilities

    International Nuclear Information System (INIS)

    Jensen, D.D.; Behling, S.R.; Betancourt, R.

    1990-01-01

    Numerical models and iterative simulation using supercomputers can furnish cost-effective answers to utility industry problems that are all but intractable using conventional computing equipment. An example of the use of supercomputers by the utility industry is the determination of power transfer capability limits for power transmission systems. This work has the goal of markedly reducing the run time of transient stability codes used to determine power distributions following major system disturbances. To date, run times of several hours on a conventional computer have been reduced to several minutes on state-of-the-art supercomputers, with further improvements anticipated to reduce run times to less than a minute. In spite of the potential advantages of supercomputers, few utilities have sufficient need for a dedicated in-house supercomputing capability. This problem is resolved using a supercomputer center serving a geographically distributed user base coupled via high speed communication networks

  7. Plane-wave electronic structure calculations on a parallel supercomputer

    International Nuclear Information System (INIS)

    Nelson, J.S.; Plimpton, S.J.; Sears, M.P.

    1993-01-01

    The development of iterative solutions of Schrodinger's equation in a plane-wave (pw) basis over the last several years has coincided with great advances in the computational power available for performing the calculations. These dual developments have enabled many new and interesting condensed matter phenomena to be studied from a first-principles approach. The authors present a detailed description of the implementation on a parallel supercomputer (hypercube) of the first-order equation-of-motion solution to Schrodinger's equation, using plane-wave basis functions and ab initio separable pseudopotentials. By distributing the plane-waves across the processors of the hypercube many of the computations can be performed in parallel, resulting in decreases in the overall computation time relative to conventional vector supercomputers. This partitioning also provides ample memory for large Fast Fourier Transform (FFT) meshes and the storage of plane-wave coefficients for many hundreds of energy bands. The usefulness of the parallel techniques is demonstrated by benchmark timings for both the FFT's and iterations of the self-consistent solution of Schrodinger's equation for different sized Si unit cells of up to 512 atoms
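The record above describes distributing plane-wave coefficients across hypercube processors so that local work is followed by a global reduction. A toy sketch of that partition-and-reduce pattern in plain Python (the function names and coefficients are invented for illustration; the actual code runs on hypercube hardware with message passing):

```python
import math

def round_robin_partition(items, nproc):
    """Deal plane-wave coefficients round-robin across 'processors'."""
    return [items[rank::nproc] for rank in range(nproc)]

def parallel_norm_sq(coeffs, nproc):
    """Each 'processor' sums |c_G|^2 over its local coefficients;
    the partial sums are then combined in a global reduction, as an
    inner product would be inside the iterative eigensolver."""
    partials = [sum(c.real**2 + c.imag**2 for c in local)
                for local in round_robin_partition(coeffs, nproc)]
    return sum(partials)  # the global reduction step

coeffs = [1 + 0j, 0.5j, -0.5 + 0j, 0.25 + 0.25j]
serial = sum(c.real**2 + c.imag**2 for c in coeffs)
print(math.isclose(parallel_norm_sq(coeffs, 2), serial))  # True
```

The same owner-computes split is what gives each processor room to store its share of the FFT mesh and band coefficients.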

  8. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    Energy Technology Data Exchange (ETDEWEB)

Hules, J. [ed.]

    1996-11-01

National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout the US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  9. Supercomputer debugging workshop 1991 proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.

    1991-01-01

    This report discusses the following topics on supercomputer debugging: Distributed debugging; use interface to debugging tools and standards; debugging optimized codes; debugging parallel codes; and debugger performance and interface as analysis tools. (LSP)

  11. The ETA systems plans for supercomputers

    International Nuclear Information System (INIS)

    Swanson, C.D.

    1987-01-01

The ETA 10 from ETA Systems is a class VII supercomputer featuring multiprocessing, a large hierarchical memory system, high performance input/output, and network support for both batch and interactive processing. Advanced technology used in the ETA 10 includes liquid nitrogen cooled CMOS logic with 20,000 gates per chip, a single printed circuit board for each CPU, and high density static and dynamic MOS memory chips. Software for the ETA 10 includes an underlying kernel that supports multiple user environments, a new ETA FORTRAN compiler with an advanced automatic vectorizer, a multitasking library and debugging tools. Possible developments for future supercomputers from ETA Systems are discussed

  12. Intelligent Personal Supercomputer for Solving Scientific and Technical Problems

    Directory of Open Access Journals (Sweden)

    Khimich, O.M.

    2016-09-01

A new domestic intelligent personal supercomputer of hybrid architecture, Inparkom_pg, was developed for the mathematical modeling of processes in the defense industry, engineering, construction, etc. Intelligent software for the automatic investigation of computational mathematics tasks with approximate data of different structures was designed. Applied software supporting mathematical modeling problems in construction, welding and filtration processes was implemented.

  13. Direct exploitation of a top 500 Supercomputer for Analysis of CMS Data

    International Nuclear Information System (INIS)

    Cabrillo, I; Cabellos, L; Marco, J; Fernandez, J; Gonzalez, I

    2014-01-01

The Altamira Supercomputer, hosted at the Instituto de Fisica de Cantabria (IFCA), entered operation in summer 2012. Its last-generation FDR InfiniBand network, used for message passing in parallel jobs, supports the connection to General Parallel File System (GPFS) servers, enabling efficient simultaneous processing of multiple data-demanding jobs. Sharing a common GPFS system and a single LDAP-based identification with the existing Grid clusters at IFCA allows CMS researchers to exploit the large instantaneous capacity of this supercomputer to execute analysis jobs. The detailed experience describing this opportunistic use for skimming and final analysis of CMS 2012 data for a specific physics channel, resulting in an order of magnitude reduction of the waiting time, is presented.

  14. PNNL supercomputer to become largest computing resource on the Grid

    CERN Multimedia

    2002-01-01

    Hewlett Packard announced that the US DOE Pacific Northwest National Laboratory will connect a 9.3-teraflop HP supercomputer to the DOE Science Grid. This will be the largest supercomputer attached to a computer grid anywhere in the world (1 page).

  15. Supercomputer and cluster performance modeling and analysis efforts:2004-2006.

    Energy Technology Data Exchange (ETDEWEB)

Sturtevant, Judith E.; Ganti, Anand; Meyer, Harold (Hal) Edward; Stevenson, Joel O.; Benner, Robert E., Jr.; Goudy, Susan Phelps; Doerfler, Douglas W.; Domino, Stefan Paul; Taylor, Mark A.; Malins, Robert Joseph; Scott, Ryan T.; Barnette, Daniel Wayne; Rajan, Mahesh; Ang, James Alfred; Black, Amalia Rebecca; Laub, Thomas William; Vaughan, Courtenay Thomas; Franke, Brian Claude

    2007-02-01

    This report describes efforts by the Performance Modeling and Analysis Team to investigate performance characteristics of Sandia's engineering and scientific applications on the ASC capability and advanced architecture supercomputers, and Sandia's capacity Linux clusters. Efforts to model various aspects of these computers are also discussed. The goals of these efforts are to quantify and compare Sandia's supercomputer and cluster performance characteristics; to reveal strengths and weaknesses in such systems; and to predict performance characteristics of, and provide guidelines for, future acquisitions and follow-on systems. Described herein are the results obtained from running benchmarks and applications to extract performance characteristics and comparisons, as well as modeling efforts, obtained during the time period 2004-2006. The format of the report, with hypertext links to numerous additional documents, purposefully minimizes the document size needed to disseminate the extensive results from our research.

  16. Communication Characterization and Optimization of Applications Using Topology-Aware Task Mapping on Large Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

Sreepathi, Sarat [ORNL]; D'Azevedo, Eduardo [ORNL]; Philip, Bobby [ORNL]; Worley, Patrick H [ORNL]

    2016-01-01

On large supercomputers, the job scheduling systems may assign a non-contiguous node allocation for user applications depending on available resources. With parallel applications using MPI (Message Passing Interface), the default process ordering does not take into account the actual physical node layout available to the application. This contributes to non-locality in terms of physical network topology and impacts communication performance of the application. In order to mitigate such performance penalties, this work describes techniques to identify a suitable task mapping that takes the layout of the allocated nodes, as well as the application's communication behavior, into account. During the first phase of this research, we instrumented and collected performance data to characterize the communication behavior of critical US DOE (United States Department of Energy) applications using an augmented version of the mpiP tool. Subsequently, we developed several reordering methods (spectral bisection, neighbor join tree, etc.) to combine node layout and application communication data for optimized task placement. We developed a tool called mpiAproxy to facilitate detailed evaluation of the various reordering algorithms without requiring full application executions. This work presents a comprehensive performance evaluation (14,000 experiments) of the various task mapping techniques in lowering communication costs on Titan, the leadership class supercomputer at Oak Ridge National Laboratory.
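The reordering methods described above all minimize some communication cost over candidate task-to-node mappings. A toy version of one common metric, hop-bytes (traffic volume times network distance), on an illustrative 1-D node layout; the matrix, coordinates, and mappings are invented for the example and are not from the paper:

```python
import itertools

def hop_bytes(comm, coords, mapping):
    """Total bytes x Manhattan hops for a task-to-node mapping.

    comm[i][j]  - bytes exchanged between tasks i and j
    coords[n]   - physical coordinates of node n (torus/tree position)
    mapping[i]  - node assigned to task i
    """
    total = 0
    for i, j in itertools.combinations(range(len(comm)), 2):
        a, b = coords[mapping[i]], coords[mapping[j]]
        hops = sum(abs(x - y) for x, y in zip(a, b))
        total += comm[i][j] * hops
    return total

# Four tasks on a 1-D chain of nodes; tasks 0-1 and 2-3 talk heavily.
comm = [[0, 100, 0, 1],
        [100, 0, 1, 0],
        [0, 1, 0, 100],
        [1, 0, 100, 0]]
coords = [(0,), (1,), (2,), (3,)]
default = [0, 2, 1, 3]    # heavy partners placed far apart
reordered = [0, 1, 2, 3]  # heavy partners placed adjacent
print(hop_bytes(comm, coords, default))    # 404
print(hop_bytes(comm, coords, reordered))  # 204
```

A reordering algorithm such as spectral bisection searches for the mapping that drives this kind of cost down, using measured traffic (e.g., from mpiP) as the `comm` matrix.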

  17. The TeraGyroid Experiment – Supercomputing 2003

    Directory of Open Access Journals (Sweden)

    R.J. Blake

    2005-01-01

Amphiphiles are molecules with hydrophobic tails and hydrophilic heads. When dispersed in solvents, they self-assemble into complex mesophases including the beautiful cubic gyroid phase. The goal of the TeraGyroid experiment was to study defect pathways and dynamics in these gyroids. The UK's supercomputing and the USA's TeraGrid facilities were coupled together, through a dedicated high-speed network, into a single computational Grid for research work that peaked around the Supercomputing 2003 conference. The gyroids were modeled using lattice Boltzmann methods, with parameter spaces explored using many 128³-grid-point simulations, this data being used to inform the world's largest three-dimensional time-dependent simulation, with 1024³ grid points. The experiment generated some 2 TBytes of useful data. In terms of Grid technology, the project demonstrated the migration of simulations (using Globus middleware) to and fro across the Atlantic, exploiting the availability of resources. Integration of the systems accelerated the time to insight. Distributed visualisation of the output datasets enabled the parameter space of the interactions within the complex fluid to be explored from a number of sites, informed by discourse over the Access Grid. The project was sponsored by EPSRC (UK) and NSF (USA), with trans-Atlantic optical bandwidth provided by British Telecommunications.

  18. Magnetic fusion energy and computers. The role of computing in magnetic fusion energy research and development (second edition)

    International Nuclear Information System (INIS)

    1983-01-01

    This report documents the structure and uses of the MFE Network and presents a compilation of future computing requirements. Its primary emphasis is on the role of supercomputers in fusion research. One of its key findings is that with the introduction of each successive class of supercomputer, qualitatively improved understanding of fusion processes has been gained. At the same time, even the current Class VI machines severely limit the attainable realism of computer models. Many important problems will require the introduction of Class VII or even larger machines before they can be successfully attacked

  19. Supercomputers Of The Future

    Science.gov (United States)

    Peterson, Victor L.; Kim, John; Holst, Terry L.; Deiwert, George S.; Cooper, David M.; Watson, Andrew B.; Bailey, F. Ron

    1992-01-01

Report evaluates supercomputer needs of five key disciplines: turbulence physics, aerodynamics, aerothermodynamics, chemistry, and mathematical modeling of human vision. Predicts these fields will require computer speeds greater than 10^18 floating-point operations per second (FLOPS) and memory capacities greater than 10^15 words. Also, new parallel computer architectures and new structured numerical methods will make the necessary speed and capacity available.

  20. NASA Advanced Supercomputing Facility Expansion

    Science.gov (United States)

    Thigpen, William W.

    2017-01-01

    The NASA Advanced Supercomputing (NAS) Division enables advances in high-end computing technologies and in modeling and simulation methods to tackle some of the toughest science and engineering challenges facing NASA today. The name "NAS" has long been associated with leadership and innovation throughout the high-end computing (HEC) community. We play a significant role in shaping HEC standards and paradigms, and provide leadership in the areas of large-scale InfiniBand fabrics, Lustre open-source filesystems, and hyperwall technologies. We provide an integrated high-end computing environment to accelerate NASA missions and make revolutionary advances in science. Pleiades, a petaflop-scale supercomputer, is used by scientists throughout the U.S. to support NASA missions, and is ranked among the most powerful systems in the world. One of our key focus areas is in modeling and simulation to support NASA's real-world engineering applications and make fundamental advances in modeling and simulation methods.

  1. ATLAS Software Installation on Supercomputers

    CERN Document Server

    Undrus, Alexander; The ATLAS collaboration

    2018-01-01

PowerPC and high performance computers (HPC) are important resources for computing in the ATLAS experiment. The future LHC data processing will require more resources than Grid computing, currently using approximately 100,000 cores at well over 100 sites, can provide. Supercomputers are extremely powerful, as they join together the resources of hundreds of thousands of CPUs. However, their architectures have different instruction sets. ATLAS binary software distributions for x86 chipsets do not fit these architectures, as emulation of these chipsets results in a huge performance loss. This presentation describes the methodology of ATLAS software installation from source code on supercomputers. The installation procedure includes downloading the ATLAS code base as well as the source of about 50 external packages, such as ROOT and Geant4, followed by compilation, and rigorous unit and integration testing. The presentation reports the application of this procedure at Titan HPC and Summit PowerPC at Oak Ridge Computin...

  2. JINR supercomputer of the module type for event parallel analysis

    International Nuclear Information System (INIS)

    Kolpakov, I.F.; Senner, A.E.; Smirnov, V.A.

    1987-01-01

A model of a supercomputer performing 50 million operations per second is suggested. Its realization would allow one to solve JINR data analysis problems for large spectrometers (in particular, for the DELPHI collaboration). The suggested modular supercomputer is based on commercially available 32-bit microprocessors with a processing rate of about 1 MFLOPS. The processors are combined by means of VME standard buses. A MicroVAX II host computer organizes the operation of the system. Data input and output are realized via the MicroVAX II peripherals. Users' software is based on FORTRAN-77. The supercomputer is connected to a JINR network port, so all JINR users get access to the suggested system

  3. Supercomputers and quantum field theory

    International Nuclear Information System (INIS)

    Creutz, M.

    1985-01-01

    A review is given of why recent simulations of lattice gauge theories have resulted in substantial demands from particle theorists for supercomputer time. These calculations have yielded first principle results on non-perturbative aspects of the strong interactions. An algorithm for simulating dynamical quark fields is discussed. 14 refs

  4. Supercomputing Centers and Electricity Service Providers

    DEFF Research Database (Denmark)

    Patki, Tapasya; Bates, Natalie; Ghatikar, Girish

    2016-01-01

Supercomputing Centers (SCs) have high and variable power demands, which increase the challenges of the Electricity Service Providers (ESPs) with regard to efficient electricity distribution and reliable grid operation. High penetration of renewable energy generation further exacerbates this problem. In order to develop a symbiotic relationship between the SCs and their ESPs and to support effective power management at all levels, it is critical to understand and analyze how the existing relationships were formed and how these are expected to evolve. In this paper, we first present results from a detailed, quantitative survey-based analysis and compare the perspectives of the European grid and SCs to the ones of the United States (US). We then show that, contrary to expectation, SCs in the US are more open toward cooperating and developing demand-management strategies with their ESPs.

  5. Computing at the leading edge: Research in the energy sciences

    Energy Technology Data Exchange (ETDEWEB)

Mirin, A.A.; Van Dyke, P.T. [eds.]

    1994-02-01

The purpose of this publication is to highlight selected scientific challenges that have been undertaken by the DOE Energy Research community. The high quality of the research reflected in these contributions underscores the growing importance both of the Grand Challenge scientific efforts sponsored by DOE and of the related supporting technologies that the National Energy Research Supercomputer Center (NERSC) and other facilities are able to provide. The continued improvement of the computing resources available to DOE scientists is prerequisite to ensuring their future progress in solving the Grand Challenges. Titles of articles included in this publication include: the numerical tokamak project; static and animated molecular views of a tumorigenic chemical bound to DNA; toward a high-performance climate systems model; modeling molecular processes in the environment; lattice Boltzmann models for flow in porous media; parallel algorithms for modeling superconductors; parallel computing at the Superconducting Super Collider Laboratory; the advanced combustion modeling environment; adaptive methodologies for computational fluid dynamics; lattice simulations of quantum chromodynamics; simulating high-intensity charged-particle beams for the design of high-power accelerators; electronic structure and phase stability of random alloys.

  7. Design of multiple sequence alignment algorithms on parallel, distributed memory supercomputers.

    Science.gov (United States)

    Church, Philip C; Goscinski, Andrzej; Holt, Kathryn; Inouye, Michael; Ghoting, Amol; Makarychev, Konstantin; Reumann, Matthias

    2011-01-01

    The challenge of comparing two or more genomes that have undergone recombination and substantial amounts of segmental loss and gain has recently been addressed for small numbers of genomes. However, datasets of hundreds of genomes are now common and their sizes will only increase in the future. Multiple sequence alignment of hundreds of genomes remains an intractable problem due to quadratic increases in compute time and memory footprint. To date, most alignment algorithms are designed for commodity clusters without parallelism. Hence, we propose the design of a multiple sequence alignment algorithm on massively parallel, distributed memory supercomputers to enable research into comparative genomics on large data sets. Following the methodology of the sequential progressiveMauve algorithm, we design data structures including sequences and sorted k-mer lists on the IBM Blue Gene/P supercomputer (BG/P). Preliminary results show that we can reduce the memory footprint so that we can potentially align over 250 bacterial genomes on a single BG/P compute node. We verify our results on a dataset of E.coli, Shigella and S.pneumoniae genomes. Our implementation returns results matching those of the original algorithm but in 1/2 the time and with 1/4 the memory footprint for scaffold building. In this study, we have laid the basis for multiple sequence alignment of large-scale datasets on a massively parallel, distributed memory supercomputer, thus enabling comparison of hundreds instead of a few genome sequences within reasonable time.
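The sorted k-mer lists described above are the seed-finding backbone of progressiveMauve-style alignment. The sketch below (illustrative only, not the paper's BG/P implementation; function names are assumptions) shows how two sorted k-mer lists can be merged in linear time to find shared seeds between a pair of sequences:

```python
def sorted_kmer_list(seq, k):
    """Return a lexicographically sorted list of (k-mer, position) pairs."""
    return sorted((seq[i:i + k], i) for i in range(len(seq) - k + 1))

def shared_kmers(seq_a, seq_b, k):
    """Find k-mers common to both sequences by merging two sorted lists,
    a stand-in for the seed-finding step of progressive alignment."""
    list_a, list_b = sorted_kmer_list(seq_a, k), sorted_kmer_list(seq_b, k)
    i = j = 0
    matches = []
    while i < len(list_a) and j < len(list_b):
        ka, kb = list_a[i][0], list_b[j][0]
        if ka == kb:
            # Record the match as (k-mer, position in A, position in B).
            matches.append((ka, list_a[i][1], list_b[j][1]))
            i += 1
            j += 1
        elif ka < kb:
            i += 1
        else:
            j += 1
    return matches
```

Because the lists are pre-sorted, the merge touches each entry once, which is why the memory layout of the sorted lists, rather than raw compute, dominates on a BG/P node.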

  8. SERS internship fall 1995 abstracts and research papers

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Beverly

    1996-05-01

This report is a compilation of twenty abstracts and their corresponding full papers of research projects done under the US Department of Energy Science and Engineering Research Semester (SERS) program. Papers cover a broad range of topics, for example, environmental transport, supercomputers, databases, and biology. Selected papers were indexed separately for inclusion in the Energy Science and Technology Database.

  9. Research and development of grid computing technology in center for computational science and e-systems of Japan Atomic Energy Agency

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

Center for Computational Science and E-systems of the Japan Atomic Energy Agency (CCSE/JAEA) has carried out R and D of grid computing technology. Since 1995, R and D to realize computational assistance for researchers, called Seamless Thinking Aid (STA), and then to share intellectual resources, called Information Technology Based Laboratory (ITBL), have been conducted, leading to the construction of an intelligent infrastructure for atomic energy research called Atomic Energy Grid InfraStructure (AEGIS) under the Japanese national project 'Development and Applications of Advanced High-Performance Supercomputer'. It aims to enable synchronization of three themes: 1) Computer-Aided Research and Development (CARD) to realize an environment for STA, 2) Computer-Aided Engineering (CAEN) to establish Multi Experimental Tools (MEXT), and 3) Computer Aided Science (CASC) to promote the Atomic Energy Research and Investigation (AERI). This article reviews the achievements in R and D of grid computing technology obtained so far. (T. Tanaka)

  10. Computational plasma physics and supercomputers

    International Nuclear Information System (INIS)

    Killeen, J.; McNamara, B.

    1984-09-01

    The Supercomputers of the 80's are introduced. They are 10 to 100 times more powerful than today's machines. The range of physics modeling in the fusion program is outlined. New machine architecture will influence particular codes, but parallel processing poses new coding difficulties. Increasing realism in simulations will require better numerics and more elaborate mathematics

  11. Mistral Supercomputer Job History Analysis

    OpenAIRE

    Zasadziński, Michał; Muntés-Mulero, Victor; Solé, Marc; Ludwig, Thomas

    2018-01-01

In this technical report, we show insights and results of operational data analysis from the petascale supercomputer Mistral, ranked as the 42nd most powerful in the world as of January 2018. Data sources include hardware monitoring data, job scheduler history, topology, and hardware information. We explore job state sequences, spatial distribution, and electric power patterns.

  12. Interactive real-time nuclear plant simulations on a UNIX based supercomputer

    International Nuclear Information System (INIS)

    Behling, S.R.

    1990-01-01

    Interactive real-time nuclear plant simulations are critically important to train nuclear power plant engineers and operators. In addition, real-time simulations can be used to test the validity and timing of plant technical specifications and operational procedures. To accurately and confidently simulate a nuclear power plant transient in real-time, sufficient computer resources must be available. Since some important transients cannot be simulated using preprogrammed responses or non-physical models, commonly used simulation techniques may not be adequate. However, the power of a supercomputer allows one to accurately calculate the behavior of nuclear power plants even during very complex transients. Many of these transients can be calculated in real-time or quicker on the fastest supercomputers. The concept of running interactive real-time nuclear power plant transients on a supercomputer has been tested. This paper describes the architecture of the simulation program, the techniques used to establish real-time synchronization, and other issues related to the use of supercomputers in a new and potentially very important area. (author)

  13. Porting Ordinary Applications to Blue Gene/Q Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Maheshwari, Ketan C.; Wozniak, Justin M.; Armstrong, Timothy; Katz, Daniel S.; Binkowski, T. Andrew; Zhong, Xiaoliang; Heinonen, Olle; Karpeyev, Dmitry; Wilde, Michael

    2015-08-31

Efficiently porting ordinary applications to Blue Gene/Q supercomputers is a significant challenge. Codes are often originally developed without considering advanced architectures and related tool chains. Science needs frequently lead users to want to run large numbers of relatively small jobs (often called many-task computing, an ensemble, or a workflow), which can conflict with supercomputer configurations. In this paper, we discuss techniques developed to execute ordinary applications over leadership-class supercomputers. We use the high-performance Swift parallel scripting framework and build two workflow execution techniques: sub-jobs and main-wrap. The sub-jobs technique, built on top of the IBM Blue Gene/Q resource manager Cobalt's sub-block jobs, lets users submit multiple, independent, repeated smaller jobs within a single larger resource block. The main-wrap technique is a scheme that enables C/C++ programs to be defined as functions that are wrapped by a high-performance Swift wrapper and that are invoked as a Swift script. We discuss the needs, benefits, technicalities, and current limitations of these techniques. We further discuss the real-world science enabled by these techniques and the results obtained.

  14. Flux-Level Transit Injection Experiments with NASA Pleiades Supercomputer

    Science.gov (United States)

    Li, Jie; Burke, Christopher J.; Catanzarite, Joseph; Seader, Shawn; Haas, Michael R.; Batalha, Natalie; Henze, Christopher; Christiansen, Jessie; Kepler Project, NASA Advanced Supercomputing Division

    2016-06-01

    Flux-Level Transit Injection (FLTI) experiments are executed with NASA's Pleiades supercomputer for the Kepler Mission. The latest release (9.3, January 2016) of the Kepler Science Operations Center Pipeline is used in the FLTI experiments. Their purpose is to validate the Analytic Completeness Model (ACM), which can be computed for all Kepler target stars, thereby enabling exoplanet occurrence rate studies. Pleiades, a facility of NASA's Advanced Supercomputing Division, is one of the world's most powerful supercomputers and represents NASA's state-of-the-art technology. We discuss the details of implementing the FLTI experiments on the Pleiades supercomputer. For example, taking into account that ~16 injections are generated by one core of the Pleiades processors in an hour, the “shallow” FLTI experiment, in which ~2000 injections are required per target star, can be done for 16% of all Kepler target stars in about 200 hours. Stripping down the transit search to bare bones, i.e. only searching adjacent high/low periods at high/low pulse durations, makes the computationally intensive FLTI experiments affordable. The design of the FLTI experiments and the analysis of the resulting data are presented in “Validating an Analytic Completeness Model for Kepler Target Stars Based on Flux-level Transit Injection Experiments” by Catanzarite et al. (#2494058).Kepler was selected as the 10th mission of the Discovery Program. Funding for the Kepler Mission has been provided by the NASA Science Mission Directorate.

  15. NREL Receives Editors' Choice Awards for Supercomputer Research | News |

    Science.gov (United States)

NREL received Editors' Choice awards for the Peregrine high-performance computer and the groundbreaking research it made possible.

  16. Extracting the Textual and Temporal Structure of Supercomputing Logs

    Energy Technology Data Exchange (ETDEWEB)

    Jain, S; Singh, I; Chandra, A; Zhang, Z; Bronevetsky, G

    2009-05-26

    Supercomputers are prone to frequent faults that adversely affect their performance, reliability and functionality. System logs collected on these systems are a valuable resource of information about their operational status and health. However, their massive size, complexity, and lack of standard format makes it difficult to automatically extract information that can be used to improve system management. In this work we propose a novel method to succinctly represent the contents of supercomputing logs, by using textual clustering to automatically find the syntactic structures of log messages. This information is used to automatically classify messages into semantic groups via an online clustering algorithm. Further, we describe a methodology for using the temporal proximity between groups of log messages to identify correlated events in the system. We apply our proposed methods to two large, publicly available supercomputing logs and show that our technique features nearly perfect accuracy for online log-classification and extracts meaningful structural and temporal message patterns that can be used to improve the accuracy of other log analysis techniques.
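A minimal illustration of the syntactic-clustering idea is to mask the variable fields of each message (numbers, hexadecimal identifiers) so that structurally identical messages collapse to one template. This is a sketch in the spirit of the approach, not the authors' algorithm; the message formats are invented:

```python
import re
from collections import defaultdict

def template_of(message):
    """Mask variable fields so messages with the same syntactic
    structure map to one template string."""
    tok = re.sub(r'0x[0-9a-fA-F]+', '<HEX>', message)  # hex ids first
    tok = re.sub(r'\d+', '<NUM>', tok)                 # then plain numbers
    return tok

def cluster_logs(lines):
    """Group raw log lines by their extracted template."""
    groups = defaultdict(list)
    for line in lines:
        groups[template_of(line)].append(line)
    return groups
```

Grouping by template turns millions of raw lines into a handful of message classes, over which temporal proximity between groups can then be measured.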

  17. Introduction to Reconfigurable Supercomputing

    CERN Document Server

    Lanzagorta, Marco; Rosenberg, Robert

    2010-01-01

This book covers technologies, applications, tools, languages, procedures, advantages, and disadvantages of reconfigurable supercomputing using Field Programmable Gate Arrays (FPGAs). The target audience is the community of users of High Performance Computers (HPC) who may benefit from porting their applications into a reconfigurable environment. As such, this book is intended to guide the HPC user through the many algorithmic considerations, hardware alternatives, usability issues, programming languages, and design tools that need to be understood before embarking on the creation of reconfigurable applications.

  18. Personal Supercomputing for Monte Carlo Simulation Using a GPU

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Jae-Yong; Koo, Yang-Hyun; Lee, Byung-Ho [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2008-05-15

Since the usability, accessibility, and maintenance of a personal computer (PC) are very good, a PC is a useful computer simulation tool for researchers. With the improved performance of a PC's CPU, it has enough calculation power to simulate a small-scale system. However, if a system is large or involves a long time scale, a cluster computer or supercomputer is needed. Recently, great changes have occurred in the PC calculation environment. A graphics processing unit (GPU) on a graphics card, once used only to calculate display data, has a calculation capability superior to a PC's CPU; this GPU performance matches that of a supercomputer of 2000. Although it has such great calculation potential, it is not easy to program a simulation code for a GPU because of the difficult programming techniques needed to convert a calculation matrix to a 3D rendering image using graphics APIs. In 2006, NVIDIA provided a Software Development Kit (SDK) as the programming environment for NVIDIA's graphic cards, called the Compute Unified Device Architecture (CUDA). It makes programming on the GPU easy without knowledge of the graphics APIs. This paper describes the basic architectures of NVIDIA's GPU and CUDA, and carries out a performance benchmark for the Monte Carlo simulation.

  19. Personal Supercomputing for Monte Carlo Simulation Using a GPU

    International Nuclear Information System (INIS)

    Oh, Jae-Yong; Koo, Yang-Hyun; Lee, Byung-Ho

    2008-01-01

Since the usability, accessibility, and maintenance of a personal computer (PC) are very good, a PC is a useful computer simulation tool for researchers. With the improved performance of a PC's CPU, it has enough calculation power to simulate a small-scale system. However, if a system is large or involves a long time scale, a cluster computer or supercomputer is needed. Recently, great changes have occurred in the PC calculation environment. A graphics processing unit (GPU) on a graphics card, once used only to calculate display data, has a calculation capability superior to a PC's CPU; this GPU performance matches that of a supercomputer of 2000. Although it has such great calculation potential, it is not easy to program a simulation code for a GPU because of the difficult programming techniques needed to convert a calculation matrix to a 3D rendering image using graphics APIs. In 2006, NVIDIA provided a Software Development Kit (SDK) as the programming environment for NVIDIA's graphic cards, called the Compute Unified Device Architecture (CUDA). It makes programming on the GPU easy without knowledge of the graphics APIs. This paper describes the basic architectures of NVIDIA's GPU and CUDA, and carries out a performance benchmark for the Monte Carlo simulation
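Monte Carlo sampling suits the GPU because every sample is independent. The sketch below is plain CPU Python, not CUDA, but the loop body is exactly the work one would hand to a single CUDA thread, with a final reduction over per-thread hit counts (the function and its parameters are illustrative):

```python
import random

def monte_carlo_pi(n_samples, seed=0):
    """Estimate pi by sampling points in the unit square and counting
    those that fall inside the quarter circle. Each sample is independent,
    which is the data-parallel structure a GPU exploits: in CUDA one
    thread would handle one (or a strided set of) samples, followed by
    a parallel reduction to sum the hit counts."""
    rng = random.Random(seed)  # seeded for reproducibility
    hits = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return 4.0 * hits / n_samples
```

On a GPU the serial loop disappears: thousands of threads each draw their own samples from an independent random stream, and only the small reduction step is synchronized.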

  20. High temporal resolution mapping of seismic noise sources using heterogeneous supercomputers

    Science.gov (United States)

    Gokhberg, Alexey; Ermert, Laura; Paitz, Patrick; Fichtner, Andreas

    2017-04-01

Time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems. Significant interest in seismic noise source maps with high temporal resolution (days) is expected to come from a number of domains, including natural resources exploration, analysis of active earthquake fault zones and volcanoes, as well as geothermal and hydrocarbon reservoir monitoring. Currently, knowledge of noise sources is insufficient for high-resolution subsurface monitoring applications. Near-real-time seismic data, as well as advanced imaging methods to constrain seismic noise sources, have recently become available. These methods are based on the massive cross-correlation of seismic noise records from all available seismic stations in the region of interest and are therefore very computationally intensive. Heterogeneous massively parallel supercomputing systems introduced in recent years combine conventional multi-core CPUs with GPU accelerators and provide an opportunity for a manifold increase in computing performance. Therefore, these systems represent an efficient platform for implementing a noise source mapping solution. We present the first results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service that provides seismic noise source maps for Central Europe with high temporal resolution (days to a few weeks depending on frequency and data availability). The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept in order to provide interested external researchers with regular access to the noise source maps.
The solution architecture includes the following sub-systems: (1) data acquisition responsible for
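The computational kernel behind such noise-source imaging is the pairwise cross-correlation of station records. A time-domain sketch of that kernel follows (production codes work in the frequency domain and on GPUs, as the abstract notes; function names are illustrative):

```python
def cross_correlate(a, b, max_lag):
    """Time-domain cross-correlation of two equal-length noise records
    for lags in [-max_lag, max_lag]: c(l) = sum_t a[t] * b[t + l]."""
    n = len(a)
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        s = 0.0
        for t in range(n):
            u = t + lag
            if 0 <= u < n:  # ignore samples shifted outside the record
                s += a[t] * b[u]
        out[lag] = s
    return out

def best_lag(a, b, max_lag):
    """Lag of maximum correlation, a proxy for inter-station travel time."""
    c = cross_correlate(a, b, max_lag)
    return max(c, key=c.get)
```

With hundreds of stations the number of station pairs grows quadratically, which is what makes the workload massive enough to warrant a heterogeneous supercomputer.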

  1. SUPERCOMPUTERS FOR AIDING ECONOMIC PROCESSES WITH REFERENCE TO THE FINANCIAL SECTOR

    Directory of Open Access Journals (Sweden)

    Jerzy Balicki

    2014-12-01

Full Text Available The article discusses the use of supercomputers to support business processes, with particular emphasis on the financial sector. Reference is made to selected projects that support economic development. In particular, we propose the use of supercomputers to perform artificial intelligence methods in banking. The proposed methods, combined with modern technology, enable a significant increase in the competitiveness of enterprises and banks by adding new functionality.

  2. Exploiting Thread Parallelism for Ocean Modeling on Cray XC Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Sarje, Abhinav [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jacobsen, Douglas W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Samuel W. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ringler, Todd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-05-01

    The incorporation of increasing core counts in modern processors used to build state-of-the-art supercomputers is driving application development towards exploitation of thread parallelism, in addition to distributed memory parallelism, with the goal of delivering efficient high-performance codes. In this work we describe the exploitation of threading and our experiences with it with respect to a real-world ocean modeling application code, MPAS-Ocean. We present detailed performance analysis and comparisons of various approaches and configurations for threading on the Cray XC series supercomputers.

  3. Multi-petascale highly efficient parallel supercomputer

    Science.gov (United States)

    Asaad, Sameh; Bellofatto, Ralph E.; Blocksome, Michael A.; Blumrich, Matthias A.; Boyle, Peter; Brunheroto, Jose R.; Chen, Dong; Cher, Chen -Yong; Chiu, George L.; Christ, Norman; Coteus, Paul W.; Davis, Kristan D.; Dozsa, Gabor J.; Eichenberger, Alexandre E.; Eisley, Noel A.; Ellavsky, Matthew R.; Evans, Kahn C.; Fleischer, Bruce M.; Fox, Thomas W.; Gara, Alan; Giampapa, Mark E.; Gooding, Thomas M.; Gschwind, Michael K.; Gunnels, John A.; Hall, Shawn A.; Haring, Rudolf A.; Heidelberger, Philip; Inglett, Todd A.; Knudson, Brant L.; Kopcsay, Gerard V.; Kumar, Sameer; Mamidala, Amith R.; Marcella, James A.; Megerian, Mark G.; Miller, Douglas R.; Miller, Samuel J.; Muff, Adam J.; Mundy, Michael B.; O'Brien, John K.; O'Brien, Kathryn M.; Ohmacht, Martin; Parker, Jeffrey J.; Poole, Ruth J.; Ratterman, Joseph D.; Salapura, Valentina; Satterfield, David L.; Senger, Robert M.; Smith, Brian; Steinmacher-Burow, Burkhard; Stockdell, William M.; Stunkel, Craig B.; Sugavanam, Krishnan; Sugawara, Yutaka; Takken, Todd E.; Trager, Barry M.; Van Oosten, James L.; Wait, Charles D.; Walkup, Robert E.; Watson, Alfred T.; Wisniewski, Robert W.; Wu, Peng

    2015-07-14

A Multi-Petascale Highly Efficient Parallel Supercomputer of 100 petaOPS-scale computing, at decreased cost, power and footprint, that allows for a maximum packaging density of processing nodes from an interconnect point of view. The Supercomputer exploits technological advances in VLSI that enable a computing model where many processors can be integrated into a single Application Specific Integrated Circuit (ASIC). Each ASIC computing node comprises a system-on-chip ASIC utilizing four or more processors integrated into one die, each having full access to all system resources, enabling adaptive partitioning of the processors to functions such as compute or messaging I/O on an application-by-application basis and, preferably, adaptive partitioning of functions in accordance with various algorithmic phases within an application; if I/O or other processors are underutilized, they can participate in computation or communication. Nodes are interconnected by a five-dimensional torus network with DMA that maximizes the throughput of packet communications between nodes and minimizes latency.
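The five-dimensional torus can be illustrated by its neighbor addressing: each node has two links per dimension, with wraparound at the edges. A small sketch follows (illustrative only, not the patent's routing logic; the function name and dimensions are assumptions):

```python
def torus_neighbors(coord, dims):
    """Neighbors of a node in a d-dimensional torus: one hop in each
    direction along every axis, with modular wraparound. A 5-entry dims
    tuple models a five-dimensional network like the one described above;
    each node then has 2 * d = 10 links."""
    neighbors = []
    for axis, size in enumerate(dims):
        for step in (-1, 1):
            n = list(coord)
            n[axis] = (n[axis] + step) % size  # wrap around the torus
            neighbors.append(tuple(n))
    return neighbors
```

The wraparound is what bounds the network diameter: the farthest node along any axis of size s is only s // 2 hops away, which keeps worst-case packet latency low.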

  4. Federal Market Information Technology in the Post Flash Crash Era: Roles for Supercomputing

    Energy Technology Data Exchange (ETDEWEB)

    Bethel, E. Wes; Leinweber, David; Ruebel, Oliver; Wu, Kesheng

    2011-09-16

This paper describes collaborative work between active traders, regulators, economists, and supercomputing researchers to replicate and extend investigations of the Flash Crash and other market anomalies in a National Laboratory HPC environment. Our work suggests that supercomputing tools and methods will be valuable to market regulators in achieving the goal of market safety, stability, and security. Research results using high frequency data and analytics are described, and directions for future development are discussed. Currently the key mechanism for preventing catastrophic market action is the “circuit breaker.” We believe a more graduated approach, similar to the “yellow light” approach in motorsports to slow down traffic, might be a better way to achieve the same goal. To enable this objective, we study a number of indicators that could foresee hazards in market conditions and explore options to confirm such predictions. Our tests confirm that Volume Synchronized Probability of Informed Trading (VPIN) and a version of the volume Herfindahl-Hirschman Index (HHI) for measuring market fragmentation can indeed give strong signals ahead of the Flash Crash event on May 6, 2010. This is a preliminary step toward a full-fledged early-warning system for unusual market conditions.
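As a rough illustration of how a VPIN-style indicator works, the sketch below fills equal-volume buckets from a trade stream and averages the buy/sell imbalance over recent buckets. This is a simplified sketch, not the authors' implementation; it assumes trades are already classified into signed volumes (positive = buyer-initiated, negative = seller-initiated):

```python
def vpin(signed_volumes, bucket_volume, n_buckets):
    """Simplified VPIN: pour signed trade volumes into equal-volume
    buckets, record each bucket's normalized |buy - sell| imbalance,
    and average the imbalance over the last n_buckets buckets."""
    buckets = []
    buy = sell = filled = 0.0
    for v in signed_volumes:
        remaining = abs(v)
        side_buy = v > 0
        while remaining > 0:
            # A single trade may straddle a bucket boundary.
            take = min(remaining, bucket_volume - filled)
            if side_buy:
                buy += take
            else:
                sell += take
            filled += take
            remaining -= take
            if filled >= bucket_volume:
                buckets.append(abs(buy - sell) / bucket_volume)
                buy = sell = filled = 0.0
    recent = buckets[-n_buckets:]
    return sum(recent) / len(recent) if recent else None
```

A value near 1 means recent buckets were dominated by one side of the market (possible informed trading); balanced flow yields a value near 0.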

  5. Integration of PanDA workload management system with Titan supercomputer at OLCF

    Science.gov (United States)

    De, K.; Klimentov, A.; Oleynik, D.; Panitkin, S.; Petrosyan, A.; Schovancova, J.; Vaniachine, A.; Wenaus, T.

    2015-12-01

    The PanDA (Production and Distributed Analysis) workload management system (WMS) was developed to meet the scale and complexity of LHC distributed computing for the ATLAS experiment. While PanDA currently distributes jobs to more than 100,000 cores at well over 100 Grid sites, the future LHC data taking runs will require more resources than Grid computing can possibly provide. To alleviate these challenges, ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF). The current approach utilizes a modified PanDA pilot framework for job submission to Titan's batch queues and local data management, with light-weight MPI wrappers to run single threaded workloads in parallel on Titan's multicore worker nodes. It also gives PanDA new capability to collect, in real time, information about unused worker nodes on Titan, which allows precise definition of the size and duration of jobs submitted to Titan according to available free resources. This capability significantly reduces PanDA job wait time while improving Titan's utilization efficiency. This implementation was tested with a variety of Monte-Carlo workloads on Titan and is being tested on several other supercomputing platforms. Notice: This manuscript has been authored, by employees of Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy. The publisher by accepting the manuscript for publication acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes.

  6. A workbench for tera-flop supercomputing

    International Nuclear Information System (INIS)

    Resch, M.M.; Kuester, U.; Mueller, M.S.; Lang, U.

    2003-01-01

    Supercomputers currently reach a peak performance in the range of TFlop/s. With but one exception - the Japanese Earth Simulator - none of these systems has so far been able to also show a level of sustained performance for a variety of applications that comes close to the peak performance. Sustained TFlop/s are therefore rarely seen. The reasons are manifold and are well known: Bandwidth and latency both for main memory and for the internal network are the key internal technical problems. Cache hierarchies with large caches can bring relief but are no remedy to the problem. However, there are not only technical problems that inhibit the full exploitation by scientists of the potential of modern supercomputers. More and more organizational issues come to the forefront. This paper shows the approach of the High Performance Computing Center Stuttgart (HLRS) to deliver a sustained performance of TFlop/s for a wide range of applications from a large group of users spread over Germany. The core of the concept is the role of the data. Around this we design a simulation workbench that hides the complexity of interacting computers, networks and file systems from the user. (authors)

  7. A visual analytics system for optimizing the performance of large-scale networks in supercomputing systems

    Directory of Open Access Journals (Sweden)

    Takanori Fujiwara

    2018-03-01

Full Text Available The overall efficiency of an extreme-scale supercomputer largely relies on the performance of its network interconnects. Several state-of-the-art supercomputers use networks based on the increasingly popular Dragonfly topology. It is crucial to study the behavior and performance of different parallel applications running on Dragonfly networks in order to make optimal system configurations and design choices, such as job scheduling and routing strategies. However, to study this temporal network behavior, we need a tool to analyze and correlate the numerous sets of multivariate time-series data collected from the Dragonfly's multi-level hierarchies. This paper presents such a tool, a visual analytics system, that uses the Dragonfly network to investigate the temporal behavior and optimize the communication performance of a supercomputer. We couple interactive visualization with time-series analysis methods to help reveal hidden patterns in the network behavior with respect to different parallel applications and system configurations. Our system also provides multiple coordinated views for connecting behaviors observed at different levels of the network hierarchies, which effectively helps visual analysis tasks. We demonstrate the effectiveness of the system with a set of case studies. Our system and findings can help improve not only the communication performance of supercomputing applications, but also the network performance of next-generation supercomputers. Keywords: Supercomputing, Parallel communication network, Dragonfly networks, Time-series data, Performance analysis, Visual analytics

  8. KfK seminar series on supercomputing and visualization from May till September 1992

    International Nuclear Information System (INIS)

    Hohenhinnebusch, W.

    1993-05-01

During the period from May 1992 to September 1992, a series of seminars was held at KfK on several topics of supercomputing in different fields of application. The aim was to demonstrate the importance of supercomputing and visualization in numerical simulations of complex physical and technical phenomena. This report contains the collection of all submitted seminar papers. (orig./HP) [de]

  9. Research to application: Supercomputing trends for the 90's - Opportunities for interdisciplinary computations

    International Nuclear Information System (INIS)

    Shankar, V.

    1991-01-01

The progression of supercomputing is reviewed from the point of view of computational fluid dynamics (CFD), and multidisciplinary problems impacting the design of advanced aerospace configurations are addressed. The application of full potential and Euler equations to transonic and supersonic problems in the 70s and early 80s is outlined, along with the Navier-Stokes computations widespread during the late 80s and early 90s. Multidisciplinary computations currently in progress are discussed, including CFD and aeroelastic coupling for both static and dynamic flexible computations; CFD, aeroelastic, and controls coupling for flutter suppression and active control; and the development of a computational electromagnetics technology based on CFD methods. Attention is given to the computational challenges standing in the way of establishing a computational environment encompassing many technologies. 40 refs

  10. Computational plasma physics and supercomputers. Revision 1

    International Nuclear Information System (INIS)

    Killeen, J.; McNamara, B.

    1985-01-01

    The Supercomputers of the 80's are introduced. They are 10 to 100 times more powerful than today's machines. The range of physics modeling in the fusion program is outlined. New machine architecture will influence particular models, but parallel processing poses new programming difficulties. Increasing realism in simulations will require better numerics and more elaborate mathematical models

  11. Quantum Hamiltonian Physics with Supercomputers

    International Nuclear Information System (INIS)

    Vary, James P.

    2014-01-01

    The vision of solving the nuclear many-body problem in a Hamiltonian framework with fundamental interactions tied to QCD via Chiral Perturbation Theory is gaining support. The goals are to preserve the predictive power of the underlying theory, to test fundamental symmetries with the nucleus as laboratory and to develop new understandings of the full range of complex quantum phenomena. Advances in theoretical frameworks (renormalization and many-body methods) as well as in computational resources (new algorithms and leadership-class parallel computers) signal a new generation of theory and simulations that will yield profound insights into the origins of nuclear shell structure, collective phenomena and complex reaction dynamics. Fundamental discovery opportunities also exist in such areas as physics beyond the Standard Model of Elementary Particles, the transition between hadronic and quark–gluon dominated dynamics in nuclei and signals that characterize dark matter. I will review some recent achievements and present ambitious consensus plans along with their challenges for a coming decade of research that will build new links between theory, simulations and experiment. Opportunities for graduate students to embark upon careers in the fast developing field of supercomputer simulations is also discussed

  12. Quantum Hamiltonian Physics with Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Vary, James P.

    2014-06-15

    The vision of solving the nuclear many-body problem in a Hamiltonian framework with fundamental interactions tied to QCD via Chiral Perturbation Theory is gaining support. The goals are to preserve the predictive power of the underlying theory, to test fundamental symmetries with the nucleus as a laboratory and to develop new understandings of the full range of complex quantum phenomena. Advances in theoretical frameworks (renormalization and many-body methods) as well as in computational resources (new algorithms and leadership-class parallel computers) signal a new generation of theory and simulations that will yield profound insights into the origins of nuclear shell structure, collective phenomena and complex reaction dynamics. Fundamental discovery opportunities also exist in such areas as physics beyond the Standard Model of Elementary Particles, the transition between hadronic and quark–gluon dominated dynamics in nuclei and signals that characterize dark matter. I will review some recent achievements and present ambitious consensus plans along with their challenges for a coming decade of research that will build new links between theory, simulations and experiment. Opportunities for graduate students to embark upon careers in the fast-developing field of supercomputer simulations are also discussed.

  13. Lectures in Supercomputational Neurosciences Dynamics in Complex Brain Networks

    CERN Document Server

    Graben, Peter beim; Thiel, Marco; Kurths, Jürgen

    2008-01-01

    Computational Neuroscience is a burgeoning field of research in which only the combined effort of neuroscientists, biologists, psychologists, physicists, mathematicians, computer scientists, engineers and other specialists, e.g. from linguistics and medicine, seems able to expand the limits of our knowledge. The present volume is an introduction, largely from the physicists' perspective, to the subject matter with in-depth contributions by system neuroscientists. A conceptual model for complex networks of neurons is introduced that incorporates many important features of the real brain, such as various types of neurons, various brain areas, inhibitory and excitatory coupling and the plasticity of the network. The computational implementation on supercomputers, which is introduced and discussed in detail in this book, will enable the readers to modify and adapt the algorithm for their own research. Worked-out examples of applications are presented for networks of Morris-Lecar neurons to model the cortical co...

  14. Tryton Supercomputer Capabilities for Analysis of Massive Data Streams

    Directory of Open Access Journals (Sweden)

    Krawczyk Henryk

    2015-09-01

    The recently deployed supercomputer Tryton, located in the Academic Computer Center of Gdansk University of Technology, provides great means for massive parallel processing. Moreover, the status of the Center as one of the main network nodes in the PIONIER network enables the fast and reliable transfer of data produced by miscellaneous devices scattered across the whole country. Typical examples of such data are streams containing radio-telescope and satellite observations. Their analysis, especially under real-time constraints, can be challenging and requires the use of dedicated software components. We propose a solution for such parallel analysis using the supercomputer, supervised by the KASKADA platform, which, in conjunction with immersive 3D visualization techniques, can be used to solve problems such as pulsar detection and chronometry, or oil-spill simulation on the sea surface.

  15. Performance modeling of hybrid MPI/OpenMP scientific applications on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu; Taylor, Valerie

    2013-01-01

    In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore supercomputers: IBM POWER4, POWER5+ and BlueGene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks and Intel's MPI benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore supercomputers because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application: Gyrokinetic Toroidal Code (GTC) in magnetic fusion to validate our performance model of the hybrid application on these multicore supercomputers. The validation results for our performance modeling method show less than 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore supercomputers. © 2013 Elsevier Inc.
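The decomposition the abstract describes can be illustrated with a toy contention model (a hypothetical simplification for illustration only, not the authors' actual framework; all parameter names and default values here are assumptions):

```python
# Toy sketch of a bandwidth-contention performance model. Idea: when all
# cores on a node demand memory at once, each sees only its fair share of
# the sustained (STREAM-like) node bandwidth, and a latency/bandwidth term
# accounts for communication. Hypothetical; not the paper's formulation.

def predicted_time(compute_s, bytes_moved, cores, node_bw_gbs,
                   msg_bytes=0, latency_s=2e-6, net_bw_gbs=10.0):
    """Predicted per-core runtime (seconds) for one weak-scaling step."""
    per_core_bw = node_bw_gbs * 1e9 / cores       # fair share under contention
    mem_s = bytes_moved / per_core_bw             # contention-inflated memory time
    comm_s = latency_s + msg_bytes / (net_bw_gbs * 1e9)
    return compute_s + mem_s + comm_s
```

With fixed work per core (weak scaling), adding cores per node inflates the memory term, which is exactly the contention effect a STREAM measurement helps calibrate.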

  16. Performance modeling of hybrid MPI/OpenMP scientific applications on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu

    2013-12-01

    In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore supercomputers: IBM POWER4, POWER5+ and BlueGene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks and Intel's MPI benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore supercomputers because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application: Gyrokinetic Toroidal Code (GTC) in magnetic fusion to validate our performance model of the hybrid application on these multicore supercomputers. The validation results for our performance modeling method show less than 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore supercomputers. © 2013 Elsevier Inc.

  17. Use of high performance networks and supercomputers for real-time flight simulation

    Science.gov (United States)

    Cleveland, Jeff I., II

    1993-01-01

    In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations must be consistent in processing time and be completed in as short a time as possible. These operations include simulation mathematical model computation and data input/output to the simulators. In 1986, in response to increased demands for flight simulation performance, NASA's Langley Research Center (LaRC), working with the contractor, developed extensions to the Computer Automated Measurement and Control (CAMAC) technology which resulted in a factor of ten increase in the effective bandwidth and reduced latency of modules necessary for simulator communication. This technology extension is being used by more than 80 leading technological developers in the United States, Canada, and Europe. Included among the commercial applications are nuclear process control, power grid analysis, process monitoring, real-time simulation, and radar data acquisition. Personnel at LaRC are completing the development of the use of supercomputers for mathematical model computation to support real-time flight simulation. This includes the development of a real-time operating system and development of specialized software and hardware for the simulator network. This paper describes the data acquisition technology and the development of supercomputing for flight simulation.

  18. Enabling Diverse Software Stacks on Supercomputers using High Performance Virtual Clusters.

    Energy Technology Data Exchange (ETDEWEB)

    Younge, Andrew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pedretti, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grant, Ryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brightwell, Ron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-05-01

    While large-scale simulations have been the hallmark of the High Performance Computing (HPC) community for decades, Large Scale Data Analytics (LSDA) workloads are gaining attention within the scientific community not only as a processing component of large HPC simulations, but also as standalone scientific tools for knowledge discovery. With the path towards Exascale, new HPC runtime systems are also emerging in a way that differs from classical distributed computing models. However, system software on the latest extreme-scale DOE supercomputers needs to be enhanced to more appropriately support these types of emerging software ecosystems. In this paper, we propose the use of Virtual Clusters on advanced supercomputing resources to enable systems to support not only HPC workloads, but also emerging big data stacks. Specifically, we have deployed the KVM hypervisor within Cray's Compute Node Linux on an XC-series supercomputer testbed. We also use libvirt and QEMU to manage and provision VMs directly on compute nodes, leveraging Ethernet-over-Aries network emulation. To our knowledge, this is the first known use of KVM on a true MPP supercomputer. We investigate the overhead of our solution using HPC benchmarks, evaluating both single-node performance and weak scaling of a 32-node virtual cluster. Overall, we find single-node performance of our solution using KVM on a Cray is very efficient, with near-native performance. However, overhead increases by up to 20% as virtual cluster size increases, due to limitations of the Ethernet-over-Aries bridged network. Furthermore, we deploy Apache Spark with large data analysis workloads in a Virtual Cluster, effectively demonstrating how diverse software ecosystems can be supported by High Performance Virtual Clusters.

  19. Cellular-automata supercomputers for fluid-dynamics modeling

    International Nuclear Information System (INIS)

    Margolus, N.; Toffoli, T.; Vichniac, G.

    1986-01-01

    We report recent developments in the modeling of fluid dynamics, and give experimental results (including dynamical exponents) obtained using cellular automata machines. Because of their locality and uniformity, cellular automata lend themselves to an extremely efficient physical realization; with a suitable architecture, an amount of hardware resources comparable to that of a home computer can achieve (in the simulation of cellular automata) the performance of a conventional supercomputer
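The locality and uniformity argument can be made concrete with a minimal HPP-style lattice-gas step (a toy NumPy sketch for illustration; the classic HPP rule stands in here for whatever rules the authors' machines actually ran):

```python
import numpy as np

# HPP lattice gas on a periodic grid: four boolean fields hold particles
# moving N, E, S, W. Collision: an exact head-on pair (N+S alone, or E+W
# alone) scatters into the perpendicular pair; all other configurations
# pass through. Streaming then shifts each field one cell. Both phases
# are local and uniform, which is what makes the model hardware-friendly.

def hpp_step(n, e, s, w):
    ns_pair = n & s & ~e & ~w                  # vertical head-on collisions
    ew_pair = e & w & ~n & ~s                  # horizontal head-on collisions
    n2, s2 = (n & ~ns_pair) | ew_pair, (s & ~ns_pair) | ew_pair
    e2, w2 = (e & ~ew_pair) | ns_pair, (w & ~ew_pair) | ns_pair
    return (np.roll(n2, -1, axis=0), np.roll(e2, 1, axis=1),
            np.roll(s2, 1, axis=0), np.roll(w2, -1, axis=1))
```

Particle number is conserved exactly by both phases, a property any implementation can assert after every step.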

  20. Evaluating the networking characteristics of the Cray XC-40 Intel Knights Landing-based Cori supercomputer at NERSC

    Energy Technology Data Exchange (ETDEWEB)

    Doerfler, Douglas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Austin, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cook, Brandon [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Deslippe, Jack [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kandalla, Krishna [Cray Inc, Bloomington, MN (United States); Mendygral, Peter [Cray Inc, Bloomington, MN (United States)

    2017-09-12

    There are many potential issues associated with deploying the Intel Xeon Phi™ (code-named Knights Landing [KNL]) manycore processor in a large-scale supercomputer. One in particular is the ability to fully utilize the high-speed communications network, given that the serial performance of a Xeon Phi™ core is a fraction of that of a Xeon® core. In this paper, we take a look at the trade-offs associated with allocating enough cores to fully utilize the Aries high-speed network versus cores dedicated to computation, e.g., the trade-off between MPI and OpenMP. In addition, we evaluate new features of Cray MPI in support of KNL, such as internode optimizations. We also evaluate one-sided programming models such as Unified Parallel C. We quantify the impact of the above trade-offs and features using a suite of National Energy Research Scientific Computing Center applications.
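The MPI-versus-OpenMP split can be enumerated with a small helper (a hypothetical illustration; it assumes 64 of KNL's cores are handed to the application, a common though not universal configuration):

```python
# Enumerate hybrid MPI/OpenMP layouts that exactly fill every core:
# each layout is (total MPI ranks, OpenMP threads per rank). Walking the
# list trades many thin ranks (more concurrent network traffic) for fewer
# fat ranks (less MPI state, more shared memory). Names are illustrative.

def hybrid_layouts(cores_per_node=64, nodes=8):
    layouts = []
    for threads in range(1, cores_per_node + 1):
        if cores_per_node % threads == 0:
            ranks_per_node = cores_per_node // threads
            layouts.append((nodes * ranks_per_node, threads))
    return layouts
```

Every layout uses the same total core count; benchmarking along this one-dimensional family is one way to expose the compute-versus-communication trade-off the paper studies.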

  1. Analyzing the Interplay of Failures and Workload on a Leadership-Class Supercomputer

    Energy Technology Data Exchange (ETDEWEB)

    Meneses, Esteban [University of Pittsburgh; Ni, Xiang [University of Illinois at Urbana-Champaign; Jones, Terry R [ORNL; Maxwell, Don E [ORNL

    2015-01-01

    The unprecedented computational power of current supercomputers now makes possible the exploration of complex problems in many scientific fields, from genomic analysis to computational fluid dynamics. Modern machines are powerful because they are massive: they assemble millions of cores and a huge quantity of disks, cards, routers, and other components. But it is precisely the size of these machines that clouds the future of supercomputing. A system that comprises many components has a high chance of failing, and failing often. In order to make the next generation of supercomputers usable, it is imperative to use some type of fault-tolerance platform to run applications on large machines. Most fault-tolerance strategies can be optimized for the peculiarities of each system and boost efficacy by keeping the system productive. In this paper, we aim to understand how failure characterization can improve resilience in several layers of the software stack: applications, runtime systems, and job schedulers. We examine the Titan supercomputer, one of the fastest systems in the world. We analyze a full year of Titan in production and distill the failure patterns of the machine. By looking into Titan's log files and using the criteria of experts, we provide a detailed description of the types of failures. In addition, we inspect the job submission files and describe how the system is used. Using those two sources, we cross-correlate failures in the machine with executing jobs and provide a picture of how failures affect the user experience. We believe such characterization is fundamental in developing appropriate fault-tolerance solutions for Cray systems similar to Titan.

  2. Reactive flow simulations in complex geometries with high-performance supercomputing

    International Nuclear Information System (INIS)

    Rehm, W.; Gerndt, M.; Jahn, W.; Vogelsang, R.; Binninger, B.; Herrmann, M.; Olivier, H.; Weber, M.

    2000-01-01

    In this paper, we report on a modern field code cluster consisting of state-of-the-art reactive Navier-Stokes- and reactive Euler solvers that has been developed on vector- and parallel supercomputers at the research center Juelich. This field code cluster is used for hydrogen safety analyses of technical systems, for example, in the field of nuclear reactor safety and conventional hydrogen demonstration plants with fuel cells. Emphasis is put on the assessment of combustion loads, which could result from slow, fast or rapid flames, including transition from deflagration to detonation. As a sample of proof tests, the special tools have been tested for specific tasks, based on the comparison of experimental and numerical results, which are in reasonable agreement. (author)

  3. Frequently updated noise threat maps created with use of supercomputing grid

    Directory of Open Access Journals (Sweden)

    Szczodrak Maciej

    2014-09-01

    Innovative supercomputing grid services devoted to noise threat evaluation are presented. The services described in this paper concern two issues: the first is related to noise mapping, while the second focuses on assessment of the noise dose and its influence on the human hearing system. The discussed services were developed within the PL-Grid Plus Infrastructure, which aggregates Polish academic supercomputer centers. Selected experimental results achieved through the proposed services are presented. The assessment of environmental noise threats includes creation of noise maps using either offline or online data, acquired through a grid of monitoring stations. A concept of estimating the source model parameters based on the measured sound level, for the purpose of creating frequently updated noise maps, is presented. Connecting the noise-mapping grid service with a distributed sensor network makes it possible to automatically update noise maps for a specified time period. Moreover, a unique attribute of the developed software is the estimation of the auditory effects evoked by exposure to noise. The estimation method uses a modified psychoacoustic model of hearing and is based on the calculated noise level values and the given exposure period. Potential use scenarios of the grid services for research or educational purposes are introduced. Presentation of the predicted hearing threshold shift caused by exposure to excessive noise can raise public awareness of noise threats.

  4. Energy. Supermaterial for solar cells, membranes against the global warming, energy conservation in the greenhouse; Energie. Supermaterial fuer Solarzellen, Membranen gegen die globale Erwaermung, Energiesparen im Treibhaus

    Energy Technology Data Exchange (ETDEWEB)

    Roegener, Wiebke; Frick, Frank; Tillemans, Axel; Stahl-Busse, Brigitte

    2010-07-01

    A kaleidoscope of pictures presents highlights from research at Forschungszentrum Jülich - from moving into a new computer era, through the development of a detector for dangerous liquids, to a new method of treatment for tinnitus. The highlights of this brochure are: (a) an interview with the director of the Oak Ridge National Laboratory on the energy mix of the future; (b) environmentally friendly power generation by means of fuel cells; (c) transfer of knowledge from fusion experiments to larger plants using a supercomputer; (d) development of powerful batteries for electrically powered cars using know-how from fuel cell research; (e) investigation of the contact of spent fuel elements with water; (f) reduction of energy consumption in a greenhouse using a combination of glass and foils; (g) news on energy and environmental research.

  5. Ultrascalable petaflop parallel supercomputer

    Science.gov (United States)

    Blumrich, Matthias A [Ridgefield, CT; Chen, Dong [Croton On Hudson, NY; Chiu, George [Cross River, NY; Cipolla, Thomas M [Katonah, NY; Coteus, Paul W [Yorktown Heights, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Hall, Shawn [Pleasantville, NY; Haring, Rudolf A [Cortlandt Manor, NY; Heidelberger, Philip [Cortlandt Manor, NY; Kopcsay, Gerard V [Yorktown Heights, NY; Ohmacht, Martin [Yorktown Heights, NY; Salapura, Valentina [Chappaqua, NY; Sugavanam, Krishnan [Mahopac, NY; Takken, Todd [Brewster, NY

    2010-07-20

    A massively parallel supercomputer of petaOPS-scale includes node architectures based upon System-On-a-Chip technology, where each processing node comprises a single Application Specific Integrated Circuit (ASIC) having up to four processing elements. The ASIC nodes are interconnected by multiple independent networks that optimally maximize the throughput of packet communications between nodes with minimal latency. The multiple networks may include three high-speed networks for parallel algorithm message passing including a Torus, collective network, and a Global Asynchronous network that provides global barrier and notification functions. These multiple independent networks may be collaboratively or independently utilized according to the needs or phases of an algorithm for optimizing algorithm processing performance. The use of a DMA engine is provided to facilitate message passing among the nodes without the expenditure of processing resources at the node.

  6. Toward a Proof of Concept Cloud Framework for Physics Applications on Blue Gene Supercomputers

    International Nuclear Information System (INIS)

    Dreher, Patrick; Scullin, William; Vouk, Mladen

    2015-01-01

    Traditional high performance supercomputers are capable of delivering large sustained state-of-the-art computational resources to physics applications over extended periods of time using batch processing mode operating environments. However, today there is an increasing demand for more complex workflows that involve large fluctuations in the levels of HPC physics computational requirements during the simulations. Some of the workflow components may also require a richer set of operating system features and schedulers than normally found in a batch oriented HPC environment. This paper reports on progress toward a proof of concept design that implements a cloud framework onto BG/P and BG/Q platforms at the Argonne Leadership Computing Facility. The BG/P implementation utilizes the Kittyhawk utility and the BG/Q platform uses an experimental heterogeneous FusedOS operating system environment. Both platforms use the Virtual Computing Laboratory as the cloud computing system embedded within the supercomputer. This proof of concept design allows a cloud to be configured so that it can capitalize on the specialized infrastructure capabilities of a supercomputer and the flexible cloud configurations without resorting to virtualization. Initial testing of the proof of concept system is done using the lattice QCD MILC code. These types of user reconfigurable environments have the potential to deliver experimental schedulers and operating systems within a working HPC environment for physics computations that may be different from the native OS and schedulers on production HPC supercomputers. (paper)

  7. Problem solving in nuclear engineering using supercomputers

    International Nuclear Information System (INIS)

    Schmidt, F.; Scheuermann, W.; Schatz, A.

    1987-01-01

    The availability of supercomputers enables the engineer to formulate new strategies for problem solving. One such strategy is the Integrated Planning and Simulation System (IPSS). With the integrated systems, simulation models with greater consistency and good agreement with actual plant data can be effectively realized. In the present work some of the basic ideas of IPSS are described as well as some of the conditions necessary to build such systems. Hardware and software characteristics as realized are outlined. (orig.) [de

  8. Visualizing quantum scattering on the CM-2 supercomputer

    International Nuclear Information System (INIS)

    Richardson, J.L.

    1991-01-01

    We implement parallel algorithms for solving the time-dependent Schroedinger equation on the CM-2 supercomputer. These methods are unconditionally stable as well as unitary at each time step and have the advantage of being spatially local and explicit. We show how to visualize the dynamics of quantum scattering using techniques for visualizing complex wave functions. Several scattering problems are solved to demonstrate the use of these methods. (orig.)
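A scheme with the stated properties (explicit, spatially local, norm-preserving in practice) can be sketched as a Visscher-style staggered update of the real and imaginary parts of the wave function; this is an illustrative reconstruction under those assumptions, not the paper's CM-2 implementation (units hbar = m = 1, free particle on a periodic 1-D grid):

```python
import numpy as np

# Visscher-style explicit scheme: advance psi = re + i*im by updating the
# real part from the current imaginary part, then the imaginary part from
# the freshly updated real part. Each update touches only nearest
# neighbors, which is what makes the method spatially local and easy to
# parallelize. Illustrative sketch, not the paper's exact algorithm.

def hamiltonian(phi, dx, v=0.0):
    """Apply H = -(1/2) d^2/dx^2 + V with a 3-point periodic Laplacian."""
    lap = (np.roll(phi, 1) + np.roll(phi, -1) - 2.0 * phi) / dx**2
    return -0.5 * lap + v * phi

def schrodinger_steps(re, im, dx, dt, steps):
    """Take `steps` staggered time steps; stable for dt well below 2/max|E|."""
    for _ in range(steps):
        re = re + dt * hamiltonian(im, dx)   # R update uses current I
        im = im - dt * hamiltonian(re, dx)   # I update uses the new R
    return re, im
```

Rendering re**2 + im**2 (the probability density) together with the local phase atan2(im, re) at each step gives the kind of complex-wave-function visualization the abstract describes.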

  9. Integration of Titan supercomputer at OLCF with ATLAS Production System

    CERN Document Server

    AUTHOR|(SzGeCERN)643806; The ATLAS collaboration; De, Kaushik; Klimentov, Alexei; Nilsson, Paul; Oleynik, Danila; Padolski, Siarhei; Panitkin, Sergey; Wenaus, Torre

    2017-01-01

    The PanDA (Production and Distributed Analysis) workload management system was developed to meet the scale and complexity of distributed computing for the ATLAS experiment. PanDA managed resources are distributed worldwide, on hundreds of computing sites, with thousands of physicists accessing hundreds of Petabytes of data and the rate of data processing already exceeds Exabyte per year. While PanDA currently uses more than 200,000 cores at well over 100 Grid sites, future LHC data taking runs will require more resources than Grid computing can possibly provide. Additional computing and storage resources are required. Therefore ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. In this paper we will describe a project aimed at integration of ATLAS Production System with Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF). Current approach utilizes modified PanDA Pilot framework for jo...

  10. Integration of Titan supercomputer at OLCF with ATLAS production system

    CERN Document Server

    Panitkin, Sergey; The ATLAS collaboration

    2016-01-01

    The PanDA (Production and Distributed Analysis) workload management system was developed to meet the scale and complexity of distributed computing for the ATLAS experiment. PanDA managed resources are distributed worldwide, on hundreds of computing sites, with thousands of physicists accessing hundreds of Petabytes of data and the rate of data processing already exceeds Exabyte per year. While PanDA currently uses more than 200,000 cores at well over 100 Grid sites, future LHC data taking runs will require more resources than Grid computing can possibly provide. Additional computing and storage resources are required. Therefore ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. In this talk we will describe a project aimed at integration of ATLAS Production System with Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF). Current approach utilizes modified PanDA Pilot framework for job...

  11. Supercomputer algorithms for reactivity, dynamics and kinetics of small molecules

    International Nuclear Information System (INIS)

    Lagana, A.

    1989-01-01

    Even for small systems, the accurate characterization of reactive processes is so demanding of computer resources as to suggest the use of supercomputers having vector and parallel facilities. The full advantages of vector and parallel architectures can sometimes be obtained by simply modifying existing programs, vectorizing the manipulation of vectors and matrices, and requiring the parallel execution of independent tasks. More often, however, a significant time saving can be obtained only when the computer code undergoes a deeper restructuring, requiring a change in the computational strategy or, more radically, the adoption of a different theoretical treatment. This book discusses supercomputer strategies based upon exact and approximate methods aimed at calculating the electronic structure and the reactive properties of small systems. The book shows how, in recent years, intense design activity has led to the ability to calculate accurate electronic structures for reactive systems, exact and high-level approximations to three-dimensional reactive dynamics, and efficient directive and declaratory software for the modelling of complex systems

  12. Adventures in supercomputing: An innovative program for high school teachers

    Energy Technology Data Exchange (ETDEWEB)

    Oliver, C.E.; Hicks, H.R.; Summers, B.G. [Oak Ridge National Lab., TN (United States); Staten, D.G. [Wartburg Central High School, TN (United States)

    1994-12-31

    Within the realm of education, seldom does an innovative program become available with the potential to change an educator's teaching methodology. Adventures in Supercomputing (AiS), sponsored by the U.S. Department of Energy (DOE), is such a program. It is a program for high school teachers that changes the teacher paradigm from a teacher-directed approach of teaching to a student-centered approach. "A student-centered classroom offers better opportunities for development of internal motivation, planning skills, goal setting and perseverance than does the traditional teacher-directed mode". Not only is the process of teaching changed, but the cross-curricula integration within the AiS materials is remarkable. Written from a teacher's perspective, this paper will describe the AiS program and its effects on teachers and students, primarily at Wartburg Central High School, in Wartburg, Tennessee. The AiS program in Tennessee is sponsored by Oak Ridge National Laboratory (ORNL).

  13. Assessment techniques for a learning-centered curriculum: evaluation design for adventures in supercomputing

    Energy Technology Data Exchange (ETDEWEB)

    Helland, B. [Ames Lab., IA (United States); Summers, B.G. [Oak Ridge National Lab., TN (United States)

    1996-09-01

    As the classroom paradigm shifts from being teacher-centered to being learner-centered, student assessments are evolving from typical paper-and-pencil testing to other methods of evaluation. Students should be probed for understanding, reasoning, and critical thinking abilities rather than their ability to return memorized facts. The assessment of the Department of Energy's pilot program, Adventures in Supercomputing (AiS), offers one example of assessment techniques developed for learner-centered curricula. This assessment has employed a variety of methods to collect student data. Methods of assessment used were traditional testing, performance testing, interviews, short questionnaires via email, and student presentations of projects. The data obtained from these sources have been analyzed by a professional assessment team at the Center for Children and Technology. The results have been used to improve the AiS curriculum and establish the quality of the overall AiS program. This paper will discuss the various methods of assessment used and the results.

  14. BSMBench: a flexible and scalable supercomputer benchmark from computational particle physics

    CERN Document Server

    Bennett, Ed; Del Debbio, Luigi; Jordan, Kirk; Patella, Agostino; Pica, Claudio; Rago, Antonio

    2016-01-01

    Benchmarking plays a central role in the evaluation of High Performance Computing architectures. Several benchmarks have been designed that allow users to stress various components of supercomputers. In order for the figures they provide to be useful, benchmarks need to be representative of the most common real-world scenarios. In this work, we introduce BSMBench, a benchmarking suite derived from Monte Carlo code used in computational particle physics. The advantage of this suite (which can be freely downloaded from http://www.bsmbench.org/) over others is the capacity to vary the relative importance of computation and communication. This enables the tests to simulate various practical situations. To showcase BSMBench, we perform a wide range of tests on various architectures, from desktop computers to state-of-the-art supercomputers, and discuss the corresponding results. Possible future directions of development of the benchmark are also outlined.

  15. High Performance Networks From Supercomputing to Cloud Computing

    CERN Document Server

    Abts, Dennis

    2011-01-01

    Datacenter networks provide the communication substrate for large parallel computer systems that form the ecosystem for high performance computing (HPC) systems and modern Internet applications. The design of new datacenter networks is motivated by an array of applications ranging from communication intensive climatology, complex material simulations and molecular dynamics to such Internet applications as Web search, language translation, collaborative Internet applications, streaming video and voice-over-IP. For both Supercomputing and Cloud Computing the network enables distributed applicati...

  16. Plasma turbulence calculations on supercomputers

    International Nuclear Information System (INIS)

    Carreras, B.A.; Charlton, L.A.; Dominguez, N.; Drake, J.B.; Garcia, L.; Leboeuf, J.N.; Lee, D.K.; Lynch, V.E.; Sidikman, K.

    1991-01-01

    Although the single-particle picture of magnetic confinement is helpful in understanding some basic physics of plasma confinement, it does not give a full description. Collective effects dominate plasma behavior. Any analysis of plasma confinement requires a self-consistent treatment of the particles and fields. The general picture is further complicated because the plasma, in general, is turbulent. The study of fluid turbulence is a rather complex field by itself. In addition to the difficulties of classical fluid turbulence, plasma turbulence studies face the problems caused by the induced magnetic turbulence, which couples back to the fluid. Since the fluid is not a perfect conductor, this turbulence can lead to changes in the topology of the magnetic field structure, causing the magnetic field lines to wander radially. Because the plasma fluid flows along the field lines, the field lines carry the particles with them, and this enhances the losses caused by collisions. These changes in topology are critical for plasma confinement. The study of plasma turbulence and the concomitant transport is a challenging problem. Because of the importance of solving the plasma turbulence problem for controlled thermonuclear research, the high complexity of the problem, and the necessity of attacking it with supercomputers, the study of plasma turbulence in magnetic confinement devices is a Grand Challenge problem

  17. Unique Methodologies for Nano/Micro Manufacturing Job Training Via Desktop Supercomputer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, Clyde [Northern Illinois Univ., DeKalb, IL (United States); Karonis, Nicholas [Northern Illinois Univ., DeKalb, IL (United States); Lurio, Laurence [Northern Illinois Univ., DeKalb, IL (United States); Piot, Philippe [Northern Illinois Univ., DeKalb, IL (United States); Xiao, Zhili [Northern Illinois Univ., DeKalb, IL (United States); Glatz, Andreas [Northern Illinois Univ., DeKalb, IL (United States); Pohlman, Nicholas [Northern Illinois Univ., DeKalb, IL (United States); Hou, Minmei [Northern Illinois Univ., DeKalb, IL (United States); Demir, Veysel [Northern Illinois Univ., DeKalb, IL (United States); Song, Jie [Northern Illinois Univ., DeKalb, IL (United States); Duffin, Kirk [Northern Illinois Univ., DeKalb, IL (United States); Johns, Mitrick [Northern Illinois Univ., DeKalb, IL (United States); Sims, Thomas [Northern Illinois Univ., DeKalb, IL (United States); Yin, Yanbin [Northern Illinois Univ., DeKalb, IL (United States)

    2012-11-21

    This project establishes an initiative in high-speed (Teraflop)/large-memory desktop supercomputing for modeling and simulation of dynamic processes important for energy and industrial applications. It provides a training ground for employment of current students in an emerging field, with the skills necessary to access the large supercomputing systems now present at DOE laboratories. It also provides a foundation for NIU faculty to leap beyond their current small-cluster facilities. The funding extends faculty and student capability to a new level of analytic skill, with concomitant publication avenues. The components of the Hewlett-Packard computer obtained with the DOE funds create a hybrid combination of a graphics processing system (12 GPUs/Teraflops) and a Beowulf CPU system (144 CPUs), the first expandable via the NIU GAEA system to ~60 Teraflops integrated with a 720-CPU Beowulf system. The software is based on access to the NVIDIA/CUDA library and the ability, through multiple MATLAB licenses, to create additional local programs. A number of existing programs are being transferred to the CPU Beowulf cluster. Since the expertise necessary to create the parallel processing applications has only recently been obtained at NIU, this software development effort is at an early stage. The educational program has been initiated via formal tutorials and classroom curricula designed for the coming year. Specifically, the cost focus was on hardware acquisitions and on the appointment of graduate students for a wide range of applications in engineering, physics and computer science.

  18. Supercomputers and the future of computational atomic scattering physics

    International Nuclear Information System (INIS)

    Younger, S.M.

    1989-01-01

    The advent of the supercomputer has opened new vistas for the computational atomic physicist. Problems of hitherto unparalleled complexity are now being examined using these new machines, and important connections with other fields of physics are being established. This talk briefly reviews some of the most important trends in computational scattering physics and suggests some exciting possibilities for the future. 7 refs., 2 figs

  19. Re-inventing electromagnetics - Supercomputing solution of Maxwell's equations via direct time integration on space grids

    International Nuclear Information System (INIS)

    Taflove, A.

    1992-01-01

    This paper summarizes the present state and future directions of applying finite-difference and finite-volume time-domain techniques for Maxwell's equations on supercomputers to model complex electromagnetic wave interactions with structures. Applications so far have been dominated by radar cross section technology, but by no means are limited to this area. In fact, the gains we have made place us on the threshold of being able to make tremendous contributions to non-defense electronics and optical technology. Some of the most interesting research in these commercial areas is summarized. 47 refs

  20. Visualization on supercomputing platform level II ASC milestone (3537-1B) results from Sandia.

    Energy Technology Data Exchange (ETDEWEB)

    Geveci, Berk (Kitware, Inc., Clifton Park, NY); Fabian, Nathan; Marion, Patrick (Kitware, Inc., Clifton Park, NY); Moreland, Kenneth D.

    2010-09-01

    This report provides documentation for the completion of the Sandia portion of the ASC Level II Visualization on the platform milestone. This ASC Level II milestone is a joint milestone between Sandia National Laboratories and Los Alamos National Laboratories. This milestone contains functionality required for performing visualization directly on a supercomputing platform, which is necessary for peta-scale visualization. Sandia's contribution concerns in-situ visualization: running a visualization in tandem with a solver. Visualization and analysis of petascale data is limited by several factors which must be addressed as ACES delivers the Cielo platform. Two primary difficulties are: (1) Performance of interactive rendering, which is the most computationally intensive portion of the visualization process. For terascale platforms, commodity clusters with graphics processors (GPUs) have been used for interactive rendering. For petascale platforms, visualization and rendering may be able to run efficiently on the supercomputer platform itself. (2) I/O bandwidth, which limits how much information can be written to disk. If we simply analyze the sparse information that is saved to disk, we miss the opportunity to analyze the rich information produced every timestep by the simulation. For the first issue, we are pursuing in-situ analysis, in which simulations are coupled directly with analysis libraries at runtime. This milestone will evaluate the visualization and rendering performance of current and next generation supercomputers in contrast to GPU-based visualization clusters, and evaluate the performance of common analysis libraries coupled with the simulation that analyze and write data to disk during a running simulation. This milestone will explore, evaluate and advance the maturity level of these technologies and their applicability to problems of interest to the ASC program. Scientific simulation on parallel supercomputers is traditionally performed in four...

  1. Micro-mechanical Simulations of Soils using Massively Parallel Supercomputers

    Directory of Open Access Journals (Sweden)

    David W. Washington

    2004-06-01

    Full Text Available In this research a computer program, Trubal version 1.51, based on the Discrete Element Method, was converted to run on a Connection Machine (CM-5), a massively parallel supercomputer with 512 nodes, to expedite the computational times of simulating geotechnical boundary value problems. The dynamic memory algorithm in the Trubal program did not perform efficiently on the CM-2 machine with its Single Instruction Multiple Data (SIMD) architecture. This was due to the communication overhead involving global array reductions, global array broadcasts and random data movement. Therefore, the dynamic memory algorithm in the Trubal program was converted to a static memory arrangement, and the program was successfully converted to run on CM-5 machines. The converted program was called "TRUBAL for Parallel Machines (TPM)." Simulating two physical triaxial experiments and comparing the simulation results with Trubal simulations validated the TPM program. With a 512-node CM-5 machine, TPM produced a nine-fold speedup, demonstrating the inherent parallelism within algorithms based on the Discrete Element Method.
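A nine-fold speedup on a 512-node machine implies that much of the runtime was not parallelizable. Amdahl's law makes that relationship concrete; the sketch below is illustrative only (the record does not report a parallel fraction, and the 0.89 figure is back-calculated here, not taken from the study):

```python
def amdahl_speedup(parallel_fraction, n_procs):
    """Amdahl's-law speedup bound: the serial fraction (1 - p) limits
    speedup no matter how many processors are used."""
    p = parallel_fraction
    return 1.0 / ((1.0 - p) + p / n_procs)

# Under this model, a roughly nine-fold speedup on 512 nodes corresponds
# to a parallel fraction of about 0.89 (hypothetical back-calculation).
print(amdahl_speedup(0.89, 512))
```

The same formula shows why adding nodes beyond a few hundred would have yielded rapidly diminishing returns for this workload.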

  2. High Temporal Resolution Mapping of Seismic Noise Sources Using Heterogeneous Supercomputers

    Science.gov (United States)

    Paitz, P.; Gokhberg, A.; Ermert, L. A.; Fichtner, A.

    2017-12-01

    The time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems like earthquake fault zones, volcanoes, geothermal and hydrocarbon reservoirs. We present results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service providing seismic noise source maps for Central Europe with high temporal resolution. We use source imaging methods based on the cross-correlation of seismic noise records from all seismic stations available in the region of interest. The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept to provide the interested researchers worldwide with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for the generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise source mapping itself rests on the measurement of logarithmic amplitude ratios in suitably pre-processed noise correlations, and the use of simplified sensitivity kernels. During the implementation we addressed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service-oriented architecture for coordination of various sub-systems, and
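The mapping described above rests on cross-correlating noise records and measuring amplitude asymmetry between the causal and acausal branches. The following stdlib-Python sketch illustrates the idea only: the energy-ratio proxy and the function names are assumptions for illustration, not the project's actual kernel-based measurement on pre-processed correlations.

```python
import math

def cross_correlate(x, y):
    """Full discrete cross-correlation of two equal-length records,
    returned for lags -(n-1) .. (n-1)."""
    n = len(x)
    return [sum(x[i] * y[i + lag] for i in range(max(0, -lag), min(n, n - lag)))
            for lag in range(-(n - 1), n)]

def log_amplitude_ratio(corr):
    """Log ratio of causal- vs. acausal-branch energy: a crude proxy
    (an assumption for illustration) for the logarithmic amplitude-ratio
    measurement used to infer the direction of dominant noise sources."""
    mid = len(corr) // 2          # index of zero lag
    causal = sum(v * v for v in corr[mid + 1:])
    acausal = sum(v * v for v in corr[:mid])
    return math.log(causal / acausal)
```

For an autocorrelation the two branches are identical and the ratio is zero; an azimuthally one-sided noise source skews the correlation and pushes the ratio away from zero.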

  3. Integration of Titan supercomputer at OLCF with ATLAS Production System

    Science.gov (United States)

    Barreiro Megino, F.; De, K.; Jha, S.; Klimentov, A.; Maeno, T.; Nilsson, P.; Oleynik, D.; Padolski, S.; Panitkin, S.; Wells, J.; Wenaus, T.; ATLAS Collaboration

    2017-10-01

    The PanDA (Production and Distributed Analysis) workload management system was developed to meet the scale and complexity of distributed computing for the ATLAS experiment. PanDA-managed resources are distributed worldwide across hundreds of computing sites, with thousands of physicists accessing hundreds of petabytes of data, and the rate of data processing already exceeds an exabyte per year. While PanDA currently uses more than 200,000 cores at well over 100 Grid sites, future LHC data-taking runs will require more resources than Grid computing can possibly provide. Additional computing and storage resources are required. Therefore ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. In this paper we describe a project aimed at integration of the ATLAS Production System with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF). The current approach utilizes a modified PanDA Pilot framework for job submission to Titan's batch queues and local data management, with lightweight MPI wrappers to run single-node workloads in parallel on Titan's multi-core worker nodes. It enables standard ATLAS production jobs to run on Titan's unused resources (backfill). The system has already allowed ATLAS to collect millions of core-hours per month on Titan and to execute hundreds of thousands of jobs, while simultaneously improving Titan's utilization efficiency. We discuss the details of the implementation, current experience with running the system, and future plans aimed at improvements in scalability and efficiency. Notice: This manuscript has been authored by employees of Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy. The publisher, by accepting the manuscript for publication, acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to...

  4. Multi-petascale highly efficient parallel supercomputer

    Science.gov (United States)

    Asaad, Sameh; Bellofatto, Ralph E.; Blocksome, Michael A.; Blumrich, Matthias A.; Boyle, Peter; Brunheroto, Jose R.; Chen, Dong; Cher, Chen-Yong; Chiu, George L.; Christ, Norman; Coteus, Paul W.; Davis, Kristan D.; Dozsa, Gabor J.; Eichenberger, Alexandre E.; Eisley, Noel A.; Ellavsky, Matthew R.; Evans, Kahn C.; Fleischer, Bruce M.; Fox, Thomas W.; Gara, Alan; Giampapa, Mark E.; Gooding, Thomas M.; Gschwind, Michael K.; Gunnels, John A.; Hall, Shawn A.; Haring, Rudolf A.; Heidelberger, Philip; Inglett, Todd A.; Knudson, Brant L.; Kopcsay, Gerard V.; Kumar, Sameer; Mamidala, Amith R.; Marcella, James A.; Megerian, Mark G.; Miller, Douglas R.; Miller, Samuel J.; Muff, Adam J.; Mundy, Michael B.; O'Brien, John K.; O'Brien, Kathryn M.; Ohmacht, Martin; Parker, Jeffrey J.; Poole, Ruth J.; Ratterman, Joseph D.; Salapura, Valentina; Satterfield, David L.; Senger, Robert M.; Steinmacher-Burow, Burkhard; Stockdell, William M.; Stunkel, Craig B.; Sugavanam, Krishnan; Sugawara, Yutaka; Takken, Todd E.; Trager, Barry M.; Van Oosten, James L.; Wait, Charles D.; Walkup, Robert E.; Watson, Alfred T.; Wisniewski, Robert W.; Wu, Peng

    2018-05-15

    A Multi-Petascale Highly Efficient Parallel Supercomputer of 100 petaflop scale includes node architectures based upon System-On-a-Chip technology, where each processing node comprises a single Application Specific Integrated Circuit (ASIC). The ASIC nodes are interconnected by a five-dimensional torus network that maximizes the throughput of packet communications between nodes and minimizes latency. The network implements a collective network and a global asynchronous network that provides global barrier and notification functions. Integrated in the node design is a list-based prefetcher. The memory system implements transactional memory, thread-level speculation, and a multiversioning cache that improves the soft error rate while supporting DMA functionality, allowing for parallel-processing message passing.
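The defining property of a torus interconnect is that every node has two neighbors per dimension, with wraparound at the edges. A minimal Python sketch of that neighbor structure (the 4x4x4x4x4 shape is hypothetical, chosen for illustration, not the machine's actual partition geometry):

```python
def torus_neighbors(coord, dims):
    """Nearest neighbors of a node on a multi-dimensional torus.

    Each node is linked to 2 * len(dims) neighbors: one step in the
    +1 and -1 direction along every dimension, with wraparound at
    the edges (the defining property of a torus network).
    """
    neighbors = []
    for axis in range(len(dims)):
        for step in (-1, +1):
            n = list(coord)
            n[axis] = (n[axis] + step) % dims[axis]
            neighbors.append(tuple(n))
    return neighbors

# Hypothetical 4x4x4x4x4 partition of a five-dimensional torus.
dims = (4, 4, 4, 4, 4)
nbrs = torus_neighbors((0, 0, 0, 0, 0), dims)
print(len(nbrs))  # 10: one +/-1 hop along each of the 5 dimensions
```

The wraparound means no node is ever more than half the dimension size away along any axis, which is what keeps worst-case latency low.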

  5. Mathematical methods and supercomputing in nuclear applications. Proceedings. Vol. 2

    International Nuclear Information System (INIS)

    Kuesters, H.; Stein, E.; Werner, W.

    1993-04-01

    All papers of the two volumes are separately indexed in the data base. Main topics are: Progress in advanced numerical techniques, fluid mechanics, on-line systems, artificial intelligence applications, nodal methods reactor kinetics, reactor design, supercomputer architecture, probabilistic estimation of risk assessment, methods in transport theory, advances in Monte Carlo techniques, and man-machine interface. (orig.)

  6. Mathematical methods and supercomputing in nuclear applications. Proceedings. Vol. 1

    International Nuclear Information System (INIS)

    Kuesters, H.; Stein, E.; Werner, W.

    1993-04-01

    All papers of the two volumes are separately indexed in the data base. Main topics are: Progress in advanced numerical techniques, fluid mechanics, on-line systems, artificial intelligence applications, nodal methods reactor kinetics, reactor design, supercomputer architecture, probabilistic estimation of risk assessment, methods in transport theory, advances in Monte Carlo techniques, and man-machine interface. (orig.)

  7. Virtual laboratory for fusion research in Japan

    International Nuclear Information System (INIS)

    Tsuda, K.; Nagayama, Y.; Yamamoto, T.; Horiuchi, R.; Ishiguro, S.; Takami, S.

    2008-01-01

    A virtual laboratory system for nuclear fusion research in Japan has been developed using SuperSINET, a super high-speed network operated by the National Institute of Informatics. Sixteen sites, including major Japanese universities, the Japan Atomic Energy Agency and the National Institute for Fusion Science (NIFS), were mutually connected to SuperSINET at 1 Gbps by the end of fiscal year 2006. Collaboration categories in this virtual laboratory are as follows: large helical device (LHD) remote participation; remote use of the supercomputer system; and the all-Japan ST (Spherical Tokamak) research program. This virtual laboratory is a closed network system, connected to the Internet through the NIFS firewall in order to maintain a high level of security. Collaborators at a remote station can control their diagnostic devices at LHD and analyze the LHD data as if they were in the LHD control room. Researchers at a remote station can use the supercomputer of NIFS in the same environment as at NIFS. In this paper we describe the technologies in detail and the present status of the virtual laboratory. Furthermore, the items that should be developed in the near future are also described.

  8. Energy Secretary Dedicates ESIF at NREL | News | NREL

    Science.gov (United States)

    Energy Secretary Dedicates ESIF at NREL, September 18. Energy Secretary Ernest Moniz (center) joins NREL Director Dan Arvizu (left) and the newest Energy Department supercomputer. The high performance computer inside NREL's new Energy Systems...

  9. BigData and computing challenges in high energy and nuclear physics

    Science.gov (United States)

    Klimentov, A.; Grigorieva, M.; Kiryanov, A.; Zarochentsev, A.

    2017-06-01

    In this contribution we discuss the various aspects of the computing resource needs of experiments in High Energy and Nuclear Physics, in particular at the Large Hadron Collider. These needs will evolve when moving from the LHC to the HL-LHC ten years from now, when the already exascale levels of data we are processing could increase by a further order of magnitude. The distributed computing environment has been a great success, and the inclusion of new supercomputing facilities, cloud computing and volunteer computing in the future is a big challenge, which we are successfully mastering with a considerable contribution from many supercomputing centres around the world and from academic and commercial cloud providers. We also discuss R&D computing projects started recently in the National Research Center "Kurchatov Institute"

  10. BigData and computing challenges in high energy and nuclear physics

    International Nuclear Information System (INIS)

    Klimentov, A.; Grigorieva, M.; Kiryanov, A.; Zarochentsev, A.

    2017-01-01

    In this contribution we discuss the various aspects of the computing resource needs of experiments in High Energy and Nuclear Physics, in particular at the Large Hadron Collider. These needs will evolve when moving from the LHC to the HL-LHC ten years from now, when the already exascale levels of data we are processing could increase by a further order of magnitude. The distributed computing environment has been a great success, and the inclusion of new supercomputing facilities, cloud computing and volunteer computing in the future is a big challenge, which we are successfully mastering with a considerable contribution from many supercomputing centres around the world and from academic and commercial cloud providers. We also discuss R&D computing projects started recently in the National Research Center "Kurchatov Institute"

  11. Design and performance characterization of electronic structure calculations on massively parallel supercomputers

    DEFF Research Database (Denmark)

    Romero, N. A.; Glinsvad, Christian; Larsen, Ask Hjorth

    2013-01-01

    Density functional theory (DFT) is the most widely employed electronic structure method because of its favorable scaling with system size and accuracy for a broad range of molecular and condensed-phase systems. The advent of massively parallel supercomputers has enhanced the scientific community...

  12. Aviation Research and the Internet

    Science.gov (United States)

    Scott, Antoinette M.

    1995-01-01

    The Internet is a network of networks. It was originally funded by the Defense Advanced Research Projects Agency (DOD/DARPA) and evolved in part from the connection of supercomputer sites across the United States. The National Science Foundation (NSF) made the most of its supercomputers by connecting the sites to each other. This made the supercomputers more efficient and now allows scientists, engineers and researchers to access the supercomputers from their own labs and offices. The high speed networks that connect the NSF supercomputers form the backbone of the Internet. The World Wide Web (WWW) is a menu system. It gathers Internet resources from all over the world into a series of screens that appear on your computer. The WWW is also a distributed system: it stores information on many computers (servers), and these servers can go out and get data when you ask for it. Hypermedia is the basis of the WWW. One can 'click' on a section and visit other hypermedia (pages). Our approach to demonstrating the importance of aviation research through the Internet began with learning how to put pages on the Internet (on-line) ourselves. We were assigned two aviation companies: Vision Micro Systems Inc. and Innovative Aerodynamic Technologies (IAT). We developed home pages for these SBIR companies. The equipment used to create the pages consisted of UNIX and Macintosh machines. HTML Supertext software was used to write the pages, and the Sharp JX600S scanner to scan the images. As a result, with the use of the UNIX, Macintosh, Sun, PC, and AXIL machines, we were able to present our home pages to over 800,000 visitors.

  13. Support of theoretical high energy physics research at the Supercomputer Computations Research Institute. Final report, September 30, 1992 - July 31, 1997

    International Nuclear Information System (INIS)

    Bitar, K.M.; Edwards, R.G.; Heller, U.M.; Kennedy, A.D.

    1998-01-01

    The research primarily involved lattice field theory simulations such as Quantum Chromodynamics (QCD) and the Standard Model of electroweak interactions. Among the works completed by the members of the lattice group and their outside collaborators in QCD simulations are extensive hadronic spectrum computations with both Wilson and staggered fermions, and calculations of hadronic matrix elements and wavefunctions. Studies of the QCD β function with two flavors of Wilson fermions, and the study of a possible flavor-parity breaking phase in QCD with two flavors of Wilson fermions, have been completed. Studies of the finite temperature behavior of QCD have also been a major activity within the group. Studies of non-relativistic QCD, both for heavy-heavy mesons and for the heavy quark in heavy-light mesons, have been done. Combining large-N analytic computations within the Higgs sector of the Standard Model with numerical simulations at N = 4 has yielded a computation of the upper bound on the mass of the Higgs particle, as well as the energy scale above which deviations from the Standard Model may be expected. A major research topic during the second half of the grant period was the study of improved lattice actions, designed to diminish finite lattice spacing effects and thus accelerate the approach to the continuum limit. A new exact Local Hybrid Monte Carlo (overrelaxation) algorithm with a tunable overrelaxation parameter has been developed for pure gauge theories. The characteristics of this algorithm have been investigated. A study of possible instabilities in the global HMC algorithm has been completed

  14. Harnessing Petaflop-Scale Multi-Core Supercomputing for Problems in Space Science

    Science.gov (United States)

    Albright, B. J.; Yin, L.; Bowers, K. J.; Daughton, W.; Bergen, B.; Kwan, T. J.

    2008-12-01

    The particle-in-cell kinetic plasma code VPIC has been migrated successfully to the world's fastest supercomputer, Roadrunner, a hybrid multi-core platform built by IBM for the Los Alamos National Laboratory. How this was achieved will be described and examples of state-of-the-art calculations in space science, in particular, the study of magnetic reconnection, will be presented. With VPIC on Roadrunner, we have performed, for the first time, plasma PIC calculations with over one trillion particles, >100× larger than calculations considered "heroic" by community standards. This allows examination of physics at unprecedented scale and fidelity. Roadrunner is an example of an emerging paradigm in supercomputing: the trend toward multi-core systems with deep hierarchies, where memory bandwidth optimization is vital to achieving high performance. Getting VPIC to perform well on such systems is a formidable challenge: the core algorithm is memory bandwidth limited with a low compute-to-data ratio and requires random access to memory in its inner loop. That we were able to get VPIC to perform and scale well, achieving >0.374 Pflop/s and linear weak scaling on real physics problems on up to the full 12240-core Roadrunner machine, bodes well for harnessing these machines for our community's needs in the future. Many of the design considerations encountered carry over to other multi-core and accelerated (e.g., via GPU) platforms, and we modified VPIC with flexibility in mind. These will be summarized and strategies for how one might adapt a code for such platforms will be shared. Work performed under the auspices of the U.S. DOE by the LANS LLC Los Alamos National Laboratory. Dr. Bowers is a LANL Guest Scientist; he is presently at D. E. Shaw Research LLC, 120 W 45th Street, 39th Floor, New York, NY 10036.
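The two figures of merit quoted above, sustained Pflop/s and weak-scaling behavior, are simple to compute once the workload is characterized. A hedged sketch (all function names and numeric inputs below are hypothetical, for illustration only):

```python
def weak_scaling_efficiency(t_ref, t_n):
    """Weak-scaling efficiency: problem size grows in proportion to core
    count, so the ideal runtime stays constant and efficiency is the
    reference runtime divided by the measured runtime."""
    return t_ref / t_n

def sustained_pflops(particles, flops_per_particle_step, steps, seconds):
    """Sustained Pflop/s estimated from a particle-push workload model
    (flops_per_particle_step is a hypothetical per-particle cost)."""
    return particles * flops_per_particle_step * steps / seconds / 1e15

# Hypothetical run: 1e12 particles, 100 flops per particle per step,
# 1000 steps, finishing in 200 seconds.
print(sustained_pflops(1e12, 100, 1000, 200))
```

Linear weak scaling, as reported for VPIC, corresponds to an efficiency near 1.0 as the core count grows.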

  15. Integration of PanDA workload management system with Titan supercomputer at OLCF

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00300320; Klimentov, Alexei; Oleynik, Danila; Panitkin, Sergey; Petrosyan, Artem; Vaniachine, Alexandre; Wenaus, Torre; Schovancova, Jaroslava

    2015-01-01

    The PanDA (Production and Distributed Analysis) workload management system (WMS) was developed to meet the scale and complexity of LHC distributed computing for the ATLAS experiment. While PanDA currently distributes jobs to more than 100,000 cores at well over 100 Grid sites, the next LHC data-taking run will require more resources than Grid computing can possibly provide. To alleviate these challenges, ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of the PanDA WMS with the Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF). The current approach utilizes a modified PanDA pilot framework for job submission to Titan's batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on Titan's multi-core worker nodes. It also gives PanDA new capability to collect, in real time, information about unused...

  16. Integration of PanDA workload management system with Titan supercomputer at OLCF

    CERN Document Server

    Panitkin, Sergey; The ATLAS collaboration; Klimentov, Alexei; Oleynik, Danila; Petrosyan, Artem; Schovancova, Jaroslava; Vaniachine, Alexandre; Wenaus, Torre

    2015-01-01

    The PanDA (Production and Distributed Analysis) workload management system (WMS) was developed to meet the scale and complexity of LHC distributed computing for the ATLAS experiment. While PanDA currently uses more than 100,000 cores at well over 100 Grid sites with a peak performance of 0.3 petaFLOPS, the next LHC data-taking run will require more resources than Grid computing can possibly provide. To alleviate these challenges, ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of the PanDA WMS with the Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF). The current approach utilizes a modified PanDA pilot framework for job submission to Titan's batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on Titan's multi-core worker nodes. It also gives PanDA new capability to collect, in real tim...

  17. ParaBTM: A Parallel Processing Framework for Biomedical Text Mining on Supercomputers.

    Science.gov (United States)

    Xing, Yuting; Wu, Chengkun; Yang, Xi; Wang, Wei; Zhu, En; Yin, Jianping

    2018-04-27

    A prevailing way of extracting valuable information from biomedical literature is to apply text mining methods on unstructured texts. However, the massive amount of literature that needs to be analyzed poses a big data challenge to the processing efficiency of text mining. In this paper, we address this challenge by introducing parallel processing on a supercomputer. We developed paraBTM, a runnable framework that enables parallel text mining on the Tianhe-2 supercomputer. It employs a low-cost yet effective load balancing strategy to maximize the efficiency of parallel processing. We evaluated the performance of paraBTM on several datasets, utilizing three types of named entity recognition tasks as demonstration. Results show that, in most cases, the processing efficiency can be greatly improved with parallel processing, and the proposed load balancing strategy is simple and effective. In addition, our framework can be readily applied to other tasks of biomedical text mining besides NER.
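The abstract describes the load-balancing strategy only as "low-cost yet effective" and does not give its details. As a generic illustration of the idea (not paraBTM's actual algorithm; the `balance` function and task format are hypothetical), here is a longest-processing-time greedy assignment of documents to workers:

```python
import heapq

def balance(tasks, n_workers):
    """Greedy longest-processing-time (LPT) assignment: sort tasks by
    decreasing size, then repeatedly give the next task to the
    currently least-loaded worker (tracked with a min-heap)."""
    heap = [(0, w) for w in range(n_workers)]  # (current load, worker id)
    heapq.heapify(heap)
    assignment = {w: [] for w in range(n_workers)}
    for size, task_id in sorted(((t["size"], t["id"]) for t in tasks),
                                reverse=True):
        load, w = heapq.heappop(heap)
        assignment[w].append(task_id)
        heapq.heappush(heap, (load + size, w))
    return assignment
```

For text mining, "size" could be a document's length in characters, which is a cheap proxy for its processing cost; the greedy pass keeps per-worker loads within one task size of each other.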

  18. Explaining the gap between theoretical peak performance and real performance for supercomputer architectures

    International Nuclear Information System (INIS)

    Schoenauer, W.; Haefner, H.

    1993-01-01

    The basic architectures of vector and parallel computers and their properties are presented. Then the memory size and the arithmetic operations in the context of memory bandwidth are discussed. For an exemplary discussion of a single operation, micro-measurements of the vector triad for the IBM 3090 VF and the CRAY Y-MP/8 are presented; they reveal the details of the losses for a single operation. We then analyze the global performance of a whole supercomputer by identifying reduction factors that bring the theoretical peak performance down to the poor real performance. The responsibilities of the manufacturer and of the user for these losses are discussed. Then the price-performance ratio for different architectures, in a snapshot of January 1991, is briefly mentioned. Finally, some remarks on a user-friendly supercomputer architecture are made. (orig.)
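The vector triad mentioned above is the classic memory-bandwidth micro-benchmark a[i] = b[i] + c[i]*d[i]: two floating-point operations per element against four memory streams. A minimal pure-Python sketch of the measurement (illustrative only; the actual micro-measurements used vectorized loops on the machines named in the abstract):

```python
import time

def vector_triad(n, repeats=10):
    """Vector triad a[i] = b[i] + c[i] * d[i], returning the result
    vector and the measured MFLOPS rate (2 flops per element: one
    multiply and one add)."""
    b = [1.0] * n
    c = [2.0] * n
    d = [3.0] * n
    a = [0.0] * n
    t0 = time.perf_counter()
    for _ in range(repeats):
        for i in range(n):
            a[i] = b[i] + c[i] * d[i]
    elapsed = time.perf_counter() - t0
    mflops = 2.0 * n * repeats / elapsed / 1e6
    return a, mflops
```

Because each triad element needs three loads and one store for just two flops, the measured rate exposes memory bandwidth limits rather than peak arithmetic throughput, which is exactly the gap the paper analyzes.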

  19. HPL and STREAM Benchmarks on SANAM Supercomputer

    KAUST Repository

    Bin Sulaiman, Riman A.

    2017-01-01

SANAM supercomputer was jointly built by KACST and FIAS in 2012, ranking second that year in the Green500 list with a power efficiency of 2.3 GFLOPS/W (Rohr et al., 2014). It is a heterogeneous accelerator-based HPC system that has 300 compute nodes. Each node includes two Intel Xeon E5-2650 CPUs, two AMD FirePro S10000 dual GPUs and 128 GiB of main memory. In this work, the seven benchmarks of HPCC were installed and configured to reassess the performance of SANAM, as part of an unpublished master thesis, after it was reassembled in the Kingdom of Saudi Arabia. We present here detailed results of HPL and STREAM benchmarks.

  20. HPL and STREAM Benchmarks on SANAM Supercomputer

    KAUST Repository

    Bin Sulaiman, Riman A.

    2017-03-13

SANAM supercomputer was jointly built by KACST and FIAS in 2012, ranking second that year in the Green500 list with a power efficiency of 2.3 GFLOPS/W (Rohr et al., 2014). It is a heterogeneous accelerator-based HPC system that has 300 compute nodes. Each node includes two Intel Xeon E5-2650 CPUs, two AMD FirePro S10000 dual GPUs and 128 GiB of main memory. In this work, the seven benchmarks of HPCC were installed and configured to reassess the performance of SANAM, as part of an unpublished master thesis, after it was reassembled in the Kingdom of Saudi Arabia. We present here detailed results of HPL and STREAM benchmarks.

  1. Energy research and energy technology

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    Research and development in the field of energy technologies was and still is a rational necessity of our time. However, the current point of main effort has shifted from security of supply to environmental compatibility and safety of the technological processes used. Nuclear fusion is not expected to provide an extension of currently available energy resources until the middle of the next century. Its technological translation will be measured by the same conditions and issues of political acceptance that are relevant to nuclear technology today. Approaches in the major research establishments to studies of regenerative energy systems as elements of modern energy management have led to research and development programs on solar and hydrogen technologies as well as energy storage. The percentage these systems might achieve in a secured energy supply of European national economies is controversial yet today. In the future, the Arbeitsgemeinschaft Grossforschungseinrichtungen (AGF) (Cooperative of Major Research Establishments) will predominantly focus on nuclear safety research and on areas of nuclear waste disposal, which will continue to be a national task even after a reorganization of cooperation in Europe. In addition, they will above all assume tasks of nuclear plant safety research within international cooperation programs based on government agreements, in order to maintain access for the Federal Republic of Germany to an advancing development of nuclear technology in a concurrent partnership with other countries. (orig./HSCH) [de

  2. Swedish Energy Research 2009

    Energy Technology Data Exchange (ETDEWEB)

    2009-07-01

    Swedish Energy Research 2009 provides a brief, easily accessible overview of the Swedish energy research programme. The aims of the programme are to create knowledge and skills, as needed in order to commercialise the results and contribute to development of the energy system. Much of the work is carried out through about 40 research programmes in six thematic areas: energy system analysis, the building as an energy system, the transport sector, energy-intensive industries, biomass in energy systems and the power system. Swedish Energy Research 2009 describes the overall direction of research, with examples of current research, and results to date within various thematic areas and highlights

  3. An efficient implementation of a backpropagation learning algorithm on quadrics parallel supercomputer

    International Nuclear Information System (INIS)

    Taraglio, S.; Massaioli, F.

    1995-08-01

    A parallel implementation of a library to build and train Multi-Layer Perceptrons via the Back Propagation algorithm is presented. The target machine is the SIMD massively parallel supercomputer Quadrics. Performance measures are provided on three different machines with different numbers of processors, for two network examples. A sample source code is given.
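As an illustration of the algorithm named in the title, here is a minimal serial sketch of back-propagation for a one-hidden-layer perceptron; the Quadrics implementation distributes the same arithmetic across SIMD processors, and all sizes, data, and learning rates below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((32, 4))                 # 32 samples, 4 inputs (toy data)
Y = (X.sum(axis=1, keepdims=True) > 0) * 1.0     # toy binary target

W1 = rng.standard_normal((4, 8)) * 0.5; b1 = np.zeros(8)
W2 = rng.standard_normal((8, 1)) * 0.5; b2 = np.zeros(1)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

def forward(X):
    H = sig(X @ W1 + b1)                         # hidden activations
    return H, sig(H @ W2 + b2)                   # network output

_, out0 = forward(X)
loss0 = np.mean((out0 - Y) ** 2)
lr = 0.5
for _ in range(500):
    H, out = forward(X)
    d_out = (out - Y) * out * (1 - out)          # error signal at output layer
    d_H = (d_out @ W2.T) * H * (1 - H)           # back-propagated to hidden layer
    W2 -= lr * H.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_H / len(X);  b1 -= lr * d_H.mean(axis=0)

_, out1 = forward(X)
loss1 = np.mean((out1 - Y) ** 2)                 # training loss should decrease
```

On a SIMD machine the matrix products above are the natural unit of parallel work, which is why MLP training mapped well onto Quadrics.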

  4. Women in Energy: Rinku Gupta - Argonne Today

    Science.gov (United States)

    Women in Energy: Rinku Gupta. Apr 1, 2016 | Posted by Argonne Today. Profile interview with Rinku Gupta, who works with high-performance clusters and supercomputers at Argonne National Laboratory.

  5. Summaries of research and development activities by using JAEA computer system in FY2007. April 1, 2007 - March 31, 2008

    International Nuclear Information System (INIS)

    2008-11-01

    Center for Computational Science and e-Systems (CCSE) of Japan Atomic Energy Agency (JAEA) installed large computer systems including super-computers in order to support research and development activities in JAEA. This report presents usage records of the JAEA computer system and the big users' research and development activities by using the computer system in FY2007 (April 1, 2007 - March 31, 2008). (author)

  6. Summaries of research and development activities by using JAEA computer system in FY2009. April 1, 2009 - March 31, 2010

    International Nuclear Information System (INIS)

    2010-11-01

    Center for Computational Science and e-Systems (CCSE) of Japan Atomic Energy Agency (JAEA) installed large computer systems including super-computers in order to support research and development activities in JAEA. This report presents usage records of the JAEA computer system and the big users' research and development activities by using the computer system in FY2009 (April 1, 2009 - March 31, 2010). (author)

  7. Research for energy

    International Nuclear Information System (INIS)

    Garbers, C.F.

    1983-01-01

    This paper deals with energy R and D and its funding in the South African public sector. The objectives of the National Programme for Energy Research are discussed within the framework of the country's manpower and financial needs and limitations. It is shown that energy research is multidisciplinary, with the focus on infrastructure development within the constraints of technical, economic and environmental factors. Possible mechanisms to cater for the country's energy research funding are suggested

  8. Building more powerful less expensive supercomputers using Processing-In-Memory (PIM) LDRD final report.

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, Richard C.

    2009-09-01

    This report details the accomplishments of the 'Building More Powerful Less Expensive Supercomputers Using Processing-In-Memory (PIM)' LDRD ('PIM LDRD', number 105809) for FY07-FY09. Latency dominates all levels of supercomputer design. Within a node, increasing memory latency, relative to processor cycle time, limits CPU performance. Between nodes, the same increase in relative latency impacts scalability. Processing-In-Memory (PIM) is an architecture that directly addresses this problem using enhanced chip fabrication technology and machine organization. PIMs combine high-speed logic and dense, low-latency, high-bandwidth DRAM, and lightweight threads that tolerate latency by performing useful work during memory transactions. This work examines the potential of PIM-based architectures to support mission critical Sandia applications and an emerging class of more data intensive informatics applications. This work has resulted in a stronger architecture/implementation collaboration between 1400 and 1700. Additionally, key technology components have impacted vendor roadmaps, and we are in the process of pursuing these new collaborations. This work has the potential to impact future supercomputer design and construction, reducing power and increasing performance. This final report is organized as follows: this summary chapter discusses the impact of the project (Section 1), provides an enumeration of publications and other public discussion of the work (Section 1), and concludes with a discussion of future work and impact from the project (Section 1). The appendix contains reprints of the refereed publications resulting from this work.

  9. Supercomputers and the mathematical modeling of high complexity problems

    International Nuclear Information System (INIS)

    Belotserkovskii, Oleg M

    2010-01-01

    This paper is a review of many works carried out by members of our scientific school in past years. The general principles of constructing numerical algorithms for high-performance computers are described. Several techniques are highlighted and these are based on the method of splitting with respect to physical processes and are widely used in computing nonlinear multidimensional processes in fluid dynamics, in studies of turbulence and hydrodynamic instabilities and in medicine and other natural sciences. The advances and developments related to the new generation of high-performance supercomputing in Russia are presented.

  10. Summaries of research and development activities by using JAERI computer system in FY2003. April 1, 2003 - March 31, 2004

    International Nuclear Information System (INIS)

    2005-03-01

    Center for Promotion of Computational Science and Engineering (CCSE) of Japan Atomic Energy Research Institute (JAERI) installed large computer systems including super-computers in order to support research and development activities in JAERI. CCSE operates and manages the computer system and network system. This report presents usage records of the JAERI computer system and the big users' research and development activities by using the computer system in FY2003 (April 1, 2003 - March 31, 2004). (author)

  11. Wavelet transform-vector quantization compression of supercomputer ocean model simulation output

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, J N; Brislawn, C M

    1992-11-12

    We describe a new procedure for efficient compression of digital information for storage and transmission purposes. The algorithm involves a discrete wavelet transform subband decomposition of the data set, followed by vector quantization of the wavelet transform coefficients using application-specific vector quantizers. The new vector quantizer design procedure optimizes the assignment of both memory resources and vector dimensions to the transform subbands by minimizing an exponential rate-distortion functional subject to constraints on both overall bit-rate and encoder complexity. The wavelet-vector quantization method, which originates in digital image compression, is applicable to the compression of other multidimensional data sets possessing some degree of smoothness. In this paper we discuss the use of this technique for compressing the output of supercomputer simulations of global climate models. The data presented here come from Semtner-Chervin global ocean models run at the National Center for Atmospheric Research and at the Los Alamos Advanced Computing Laboratory.
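The two stages the abstract names can be sketched in miniature with a one-level 1D Haar transform and nearest-neighbour quantization of the coefficients; the paper's multilevel subband decomposition and rate-distortion-optimized vector quantizers are considerably more elaborate, so treat this only as an orientation aid.

```python
import numpy as np

def haar_forward(x):
    """One level of the orthonormal 1D Haar wavelet transform:
    pairwise averages (coarse band) and differences (detail band)."""
    x = np.asarray(x, dtype=float)
    avg = (x[0::2] + x[1::2]) / np.sqrt(2)
    diff = (x[0::2] - x[1::2]) / np.sqrt(2)
    return avg, diff

def haar_inverse(avg, diff):
    """Exact inverse of haar_forward."""
    x = np.empty(2 * len(avg))
    x[0::2] = (avg + diff) / np.sqrt(2)
    x[1::2] = (avg - diff) / np.sqrt(2)
    return x

def quantize(coeffs, codebook):
    """Replace each coefficient by its nearest codebook entry
    (a scalar stand-in for the paper's vector quantizers)."""
    codebook = np.asarray(codebook, dtype=float)
    idx = np.argmin(np.abs(np.asarray(coeffs)[:, None] - codebook[None, :]), axis=1)
    return codebook[idx]
```

Compression comes from entropy-coding the codebook indices; smooth data concentrates energy in the coarse band, so the detail band tolerates a small codebook.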

  12. Heat dissipation computations of a HVDC ground electrode using a supercomputer

    International Nuclear Information System (INIS)

    Greiss, H.; Mukhedkar, D.; Lagace, P.J.

    1990-01-01

    This paper reports on the temperature, of soil surrounding a High Voltage Direct Current (HVDC) toroidal ground electrode of practical dimensions, in both homogeneous and non-homogeneous soils that was computed at incremental points in time using finite difference methods on a supercomputer. Curves of the response were computed and plotted at several locations within the soil in the vicinity of the ground electrode for various values of the soil parameters
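The finite-difference time stepping mentioned in this record can be sketched in one spatial dimension; the geometry, material constants, and heat source below are simplified placeholders, not the paper's toroidal HVDC electrode model.

```python
import numpy as np

def step(T, alpha, dx, dt, q):
    """One explicit time step of dT/dt = alpha * d2T/dx2 + q,
    with fixed-temperature boundaries."""
    lap = (np.roll(T, 1) - 2 * T + np.roll(T, -1)) / dx**2
    T_new = T + dt * (alpha * lap + q)
    T_new[0], T_new[-1] = T[0], T[-1]        # clamp boundary temperatures
    return T_new

n = 50
T = np.full(n, 15.0)                         # ambient soil temperature, deg C
q = np.zeros(n); q[n // 2] = 5.0             # Joule heating near the electrode
alpha, dx, dt = 1e-2, 0.1, 0.1               # needs dt <= dx^2/(2*alpha) for stability
for _ in range(200):
    T = step(T, alpha, dx, dt, q)            # temperature response over time
```

Sampling T at several grid points after each step reproduces, in miniature, the response curves the paper computed at locations near the ground electrode; the supercomputer was needed because the real problem is 3D, inhomogeneous, and long-running.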

  13. Argonne National Lab deploys Force10 networks' massively dense ethernet switch for supercomputing cluster

    CERN Multimedia

    2003-01-01

    "Force10 Networks, Inc. today announced that Argonne National Laboratory (Argonne, IL) has successfully deployed Force10 E-Series switch/routers to connect to the TeraGrid, the world's largest supercomputing grid, sponsored by the National Science Foundation (NSF)" (1/2 page).

  14. Energy and technology review

    Energy Technology Data Exchange (ETDEWEB)

    1984-03-01

    The Lawrence Livermore National Laboratory publishes the Energy and Technology Review monthly. This periodical reviews progress made in selected programs at the Laboratory. This issue includes articles on in-situ coal gasification, on chromosomal aberrations in human sperm, on high-speed cell sorting and on supercomputers.

  15. Energy and technology review

    International Nuclear Information System (INIS)

    1984-03-01

    The Lawrence Livermore National Laboratory publishes the Energy and Technology Review monthly. This periodical reviews progress made in selected programs at the Laboratory. This issue includes articles on in-situ coal gasification, on chromosomal aberrations in human sperm, on high-speed cell sorting and on supercomputers

  16. Climate@Home: Crowdsourcing Climate Change Research

    Science.gov (United States)

    Xu, C.; Yang, C.; Li, J.; Sun, M.; Bambacus, M.

    2011-12-01

    Climate change deeply impacts human wellbeing. Significant amounts of resources have been invested in building super-computers that are capable of running advanced climate models, which help scientists understand climate change mechanisms, and predict its trend. Although climate change influences all human beings, the general public is largely excluded from the research. On the other hand, scientists are eagerly seeking communication mediums for effectively enlightening the public on climate change and its consequences. The Climate@Home project is devoted to connect the two ends with an innovative solution: crowdsourcing climate computing to the general public by harvesting volunteered computing resources from the participants. A distributed web-based computing platform will be built to support climate computing, and the general public can 'plug-in' their personal computers to participate in the research. People contribute the spare computing power of their computers to run a computer model, which is used by scientists to predict climate change. Traditionally, only super-computers could handle such a large computing processing load. By orchestrating massive amounts of personal computers to perform atomized data processing tasks, investments on new super-computers, energy consumed by super-computers, and carbon release from super-computers are reduced. Meanwhile, the platform forms a social network of climate researchers and the general public, which may be leveraged to raise climate awareness among the participants. A portal is to be built as the gateway to the climate@home project. Three types of roles and the corresponding functionalities are designed and supported. The end users include the citizen participants, climate scientists, and project managers. Citizen participants connect their computing resources to the platform by downloading and installing a computing engine on their personal computers. Computer climate models are defined at the server side. 

  17. A supercomputing application for reactors core design and optimization

    International Nuclear Information System (INIS)

    Hourcade, Edouard; Gaudier, Fabrice; Arnaud, Gilles; Funtowiez, David; Ammar, Karim

    2010-01-01

    Advanced nuclear reactor design is often an intuition-driven process in which designers first develop or use simplified simulation tools for each physical phenomenon involved. As the project develops, complexity in each discipline increases, and the implementation of chaining/coupling capabilities adapted to a supercomputing optimization process is often postponed to a later step, so the task gets increasingly challenging. In the context of renewed reactor designs, first-realization projects are often run in parallel with advanced design work, although they depend strongly on the final options. As a consequence, tools to globally assess/optimize reactor core features, with the accuracy of the on-going design methods, are needed. This should be possible within reasonable simulation time and without requiring advanced computer skills at the project-management scale. Also, these tools should easily accommodate modeling progress in each discipline throughout the project's lifetime. An early-stage development of a multi-physics package adapted to supercomputing is presented. The URANIE platform, developed at CEA and based on the data analysis framework ROOT, is very well adapted to this approach. It allows diversified sampling techniques (SRS, LHS, qMC), fitting tools (neural networks...) and optimization techniques (genetic algorithms). Data-base management and visualization are also made very easy. In this paper, we present the various implementation steps of this core physics tool, in which neutronics, thermo-hydraulics, and fuel mechanics codes are run simultaneously. A relevant example of optimization of nuclear reactor safety characteristics is presented, and the flexibility of the URANIE tool is illustrated with several approaches to improve Pareto front quality. (author)
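Of the sampling techniques the abstract lists (SRS, LHS, qMC), Latin hypercube sampling is compact enough to sketch; this is the generic textbook construction on the unit hypercube, not URANIE's own implementation.

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, seed=None):
    """Latin hypercube sample on [0,1)^d: each dimension is split into
    n_samples equal strata, and each stratum is hit exactly once."""
    rng = np.random.default_rng(seed)
    # One point per stratum per dimension, jittered within its stratum...
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    # ...then independently permute each dimension to decorrelate them.
    for d in range(n_dims):
        u[:, d] = rng.permutation(u[:, d])
    return u
```

Scaling each column to a physical parameter range gives a space-filling design for the code chains (neutronics, thermo-hydraulics, fuel mechanics) at far fewer runs than simple random sampling would need.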

  18. Summaries of research and development activities by using JAERI computer system in FY2004 (April 1, 2004 - March 31, 2005)

    International Nuclear Information System (INIS)

    2005-08-01

    Center for Promotion of Computational Science and Engineering (CCSE) of Japan Atomic Energy Research Institute (JAERI) installed large computer systems including super-computers in order to support research and development activities in JAERI. CCSE operates and manages the computer system and network system. This report presents usage records of the JAERI computer system and the big users' research and development activities by using the computer system in FY2004 (April 1, 2004 - March 31, 2005). (author)

  19. Energy research

    International Nuclear Information System (INIS)

    1979-03-01

    Status reports are given for the Danish Trade Ministry's energy research projects on uranium prospecting and extraction, oil and gas recovery, underground storage of district heating, electrochemical energy storage systems, wind mills, coal deposits, coal cambustion, energy consumption in buildings, solar heat, biogas, compost heat. (B.P.)

  20. Performance Evaluation of Supercomputers using HPCC and IMB Benchmarks

    Science.gov (United States)

    Saini, Subhash; Ciotti, Robert; Gunney, Brian T. N.; Spelce, Thomas E.; Koniges, Alice; Dossa, Don; Adamidis, Panagiotis; Rabenseifner, Rolf; Tiyyagura, Sunil R.; Mueller, Matthias

    2006-01-01

    The HPC Challenge (HPCC) benchmark suite and the Intel MPI Benchmark (IMB) are used to compare and evaluate the combined performance of processor, memory subsystem and interconnect fabric of five leading supercomputers - SGI Altix BX2, Cray X1, Cray Opteron Cluster, Dell Xeon cluster, and NEC SX-8. These five systems use five different networks (SGI NUMALINK4, Cray network, Myrinet, InfiniBand, and NEC IXS). The complete set of HPCC benchmarks is run on each of these systems. Additionally, we present Intel MPI Benchmarks (IMB) results to study the performance of 11 MPI communication functions on these systems.

  1. Strategic research field no.4, industrial innovations

    International Nuclear Information System (INIS)

    Kato, Chisachi

    2011-01-01

    'Kei'-supercomputer is planned to start its full-scale operation in about one year and a half. With this, High Performance Computing (HPC) is most likely to contribute not only to further progress in basic and applied sciences, but also to bringing about innovations in various fields of industry. It is expected to substantially shorten design time, drastically improve the performance and/or reliability of various industrial products, and greatly enhance the safety of large-scale power plants. In this article, the six research themes currently being prepared in this strategic research field, 'industrial innovations', so that they can use the 'Kei'-supercomputer as soon as it starts operations, are briefly described with regard to their specific goals and the breakthroughs they are expected to bring about in industry. It is also explained how we determined these themes. We are also planning several measures to promote the widespread use of HPC, including the 'Kei'-supercomputer, in industry; these are also elaborated in this article. (author)

  2. An Interface for Biomedical Big Data Processing on the Tianhe-2 Supercomputer.

    Science.gov (United States)

    Yang, Xi; Wu, Chengkun; Lu, Kai; Fang, Lin; Zhang, Yong; Li, Shengkang; Guo, Guixin; Du, YunFei

    2017-12-01

    Big data, cloud computing, and high-performance computing (HPC) are on the verge of convergence. Cloud computing is already playing an active part in big data processing with the help of big data frameworks like Hadoop and Spark. The recent upsurge of high-performance computing in China provides extra possibilities and capacity to address the challenges associated with big data. In this paper, we propose Orion, a big data interface on the Tianhe-2 supercomputer, to enable big data applications to run on Tianhe-2 via a single command or a shell script. Orion supports multiple users, and each user can launch multiple tasks. It minimizes the effort needed to initiate big data applications on the Tianhe-2 supercomputer via automated configuration. Orion follows the "allocate-when-needed" paradigm, and it avoids the idle occupation of computational resources. We tested the utility and performance of Orion using a big genomic dataset and achieved a satisfactory performance on Tianhe-2 with very few modifications to existing applications that were implemented in Hadoop/Spark. In summary, Orion provides a practical and economical interface for big data processing on Tianhe-2.

  3. European Union Energy Research

    International Nuclear Information System (INIS)

    Valdalbero, D.R.; Schmitz, B.; Raldow, W.; Poireau, M.

    2007-01-01

    This article presents an extensive state of the art of the energy research conducted at European Union level between 1984 and 2006, i.e. from the first to the sixth European Community Framework Programmes (FP1-FP6) for Research, Technological Development and Demonstration (RTD and D). The FP is the main legal tool and financial instrument of EU RTD and D policy. It sets the objectives, priorities and budgets for a period of several years. It has been complemented over time with a number of policy oriented initiatives and notably with the launch of the European Research Area. FP7 will cover the period 2007-2013 and will have a total budget of more than euros 50 billion. Energy has been a main research area in Europe since the founding Treaties (European Coal and Steel Community, European Atomic Energy Community-Euratom and European Economic Community), and energy RTD and D has always been a substantial part of common EU research. Nevertheless, when inflation and successive European enlargements are taken into account, over time the RTD and D effort in the field of energy has decreased significantly in relative terms. In nominal terms it has remained relatively stable at about euros 500 million per year. For the next years (FP7), it is expected that energy will still represent about 10 % of total EU research effort but with an annual budget of more than euros 800 million per year. This article presents a detailed review of the thematic areas and budget in both European nuclear energy research (fusion and fission) and non-nuclear energy research (energy efficiency/rational use of energy, fossil fuels, CO2 capture and storage, fuel cells and hydrogen, renewable energy sources, strategic energy research/socio-economy). (authors)

  4. Supercomputations and big-data analysis in strong-field ultrafast optical physics: filamentation of high-peak-power ultrashort laser pulses

    Science.gov (United States)

    Voronin, A. A.; Panchenko, V. Ya; Zheltikov, A. M.

    2016-06-01

    High-intensity ultrashort laser pulses propagating in gas media or in condensed matter undergo complex nonlinear spatiotemporal evolution where temporal transformations of optical field waveforms are strongly coupled to an intricate beam dynamics and ultrafast field-induced ionization processes. At the level of laser peak powers orders of magnitude above the critical power of self-focusing, the beam exhibits modulation instabilities, producing random field hot spots and breaking up into multiple noise-seeded filaments. This problem is described by a (3+1)-dimensional nonlinear field evolution equation, which needs to be solved jointly with the equation for ultrafast ionization of a medium. Analysis of this problem, which is equivalent to solving a billion-dimensional evolution problem, is only possible by means of supercomputer simulations augmented with coordinated big-data processing of large volumes of information acquired through theory-guiding experiments and supercomputations. Here, we review the main challenges of supercomputations and big-data processing encountered in strong-field ultrafast optical physics and discuss strategies to confront these challenges.
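The (3+1)-dimensional field-evolution problem is far beyond a snippet, but its simplest 1D relative, the nonlinear Schroedinger equation, illustrates the split-step Fourier scheme such pulse-propagation codes typically build on; grid and step sizes here are illustrative, and ionization terms are omitted.

```python
import numpy as np

# Split-step Fourier integration of i dA/dz = -(1/2) d2A/dt2 - |A|^2 A.
# Dispersion is applied as a phase in the Fourier domain, the Kerr
# nonlinearity as a phase in the time domain; both steps conserve energy.
n, dz, steps = 256, 1e-3, 100
t = np.linspace(-10, 10, n, endpoint=False)
w = 2 * np.pi * np.fft.fftfreq(n, d=t[1] - t[0])   # angular frequency grid
A = 1.0 / np.cosh(t)                               # fundamental soliton input

for _ in range(steps):
    A = np.fft.ifft(np.exp(-0.5j * w**2 * dz) * np.fft.fft(A))  # dispersion
    A = A * np.exp(1j * np.abs(A) ** 2 * dz)                    # Kerr phase
```

In the full (3+1)D problem the same structure holds with 3D FFTs over huge grids plus coupled ionization equations, which is what drives the supercomputing and big-data requirements the abstract discusses.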

  5. Institutional Research and Development: (Annual report), FY 1986

    Energy Technology Data Exchange (ETDEWEB)

    Strack, B. (ed.)

    1987-01-01

    The Institutional Research and Development (IR and D) program was established at the Lawrence Livermore National Laboratory (LLNL) by the Director in October 1984. The IR and D program fosters exploratory work to advance science and technology; disciplinary research to create varied, innovative approaches to selected scientific fields; and long-term research in support of the defense and energy missions at LLNL. Each project in the IR and D program was selected after personal interviews by the Director and his delegates and was deemed to show unusual promise. These projects include research in the following fields: chemistry and materials science, computation, earth sciences, engineering, nuclear chemistry, biotechnology, environmental consequences of nuclear war, geophysics and planetary physics, and supercomputer research and development. A separate section of the report is devoted to research projects receiving individual awards.

  6. Institutional Research and Development: [Annual report], FY 1986

    International Nuclear Information System (INIS)

    Strack, B.

    1987-01-01

    The Institutional Research and Development (IR and D) program was established at the Lawrence Livermore National Laboratory (LLNL) by the Director in October 1984. The IR and D program fosters exploratory work to advance science and technology; disciplinary research to create varied, innovative approaches to selected scientific fields; and long-term research in support of the defense and energy missions at LLNL. Each project in the IR and D program was selected after personal interviews by the Director and his delegates and was deemed to show unusual promise. These projects include research in the following fields: chemistry and materials science, computation, earth sciences, engineering, nuclear chemistry, biotechnology, environmental consequences of nuclear war, geophysics and planetary physics, and supercomputer research and development. A separate section of the report is devoted to research projects receiving individual awards

  7. Summaries of research and development activities by using JAEA computer system in FY2005. April 1, 2005 - March, 31, 2006

    International Nuclear Information System (INIS)

    2006-10-01

    Center for Promotion of Computational Science and Engineering (CCSE) of Japan Atomic Energy Agency (JAEA) installed large computer systems including super-computers in order to support research and development activities in JAEA. CCSE operates and manages the computer system and network system. This report presents usage records of the JAEA computer system and the big users' research and development activities by using the computer system in FY2005 (April 1, 2005 - March 31, 2006). (author)

  8. Summaries of research and development activities by using JAEA computer system in FY2006. April 1, 2006 - March 31, 2007

    International Nuclear Information System (INIS)

    2008-02-01

    Center for Promotion of Computational Science and Engineering (CCSE) of Japan Atomic Energy Agency (JAEA) installed large computer systems including super-computers in order to support research and development activities in JAEA. CCSE operates and manages the computer system and network system. This report presents usage records of the JAEA computer system and the big users' research and development activities by using the computer system in FY2006 (April 1, 2006 - March 31, 2007). (author)

  9. Coherent 40 Gb/s SP-16QAM and 80 Gb/s PDM-16QAM in an Optimal Supercomputer Optical Switch Fabric

    DEFF Research Database (Denmark)

    Karinou, Fotini; Borkowski, Robert; Zibar, Darko

    2013-01-01

    We demonstrate, for the first time, the feasibility of using 40 Gb/s SP-16QAM and 80 Gb/s PDM-16QAM in an optimized cell switching supercomputer optical interconnect architecture based on semiconductor optical amplifiers as ON/OFF gates.

  10. Cooperative visualization and simulation in a supercomputer environment

    International Nuclear Information System (INIS)

    Ruehle, R.; Lang, U.; Wierse, A.

    1993-01-01

    The article takes a closer look at the requirements imposed by the idea of integrating all the components into a homogeneous software environment. To this end, several methods for the distribution of applications depending on certain problem types are discussed. The methods currently available at the University of Stuttgart Computer Center for the distribution of applications are further explained. Finally, the aims and characteristics of a European-sponsored project called PAGEIN, which fits perfectly into the line of developments at RUS, are explained. The aim of the project is to experiment with future cooperative working modes of aerospace scientists in a high-speed distributed supercomputing environment. Project results will have an impact on the development of real future scientific application environments. (orig./DG)

  11. Accelerators for atomic energy research

    International Nuclear Information System (INIS)

    Shibata, Tokushi

    1999-01-01

    The research and educational activities accomplished using accelerators for atomic energy research were studied. The studied items are research subjects, facility operation, the number of master theses and doctor theses on atomic energy research using accelerators and the future role of accelerators in atomic energy research. The strategy for promotion of the accelerator facility for atomic energy research is discussed. (author)

  12. Argonne Leadership Computing Facility 2011 annual report : Shaping future supercomputing.

    Energy Technology Data Exchange (ETDEWEB)

    Papka, M.; Messina, P.; Coffey, R.; Drugan, C. (LCF)

    2012-08-16

    The ALCF's Early Science Program aims to prepare key applications for the architecture and scale of Mira and to solidify libraries and infrastructure that will pave the way for other future production applications. Two billion core-hours have been allocated to 16 Early Science projects on Mira. The projects, in addition to promising delivery of exciting new science, are all based on state-of-the-art, petascale, parallel applications. The project teams, in collaboration with ALCF staff and IBM, have undertaken intensive efforts to adapt their software to take advantage of Mira's Blue Gene/Q architecture, which, in a number of ways, is a precursor to future high-performance-computing architecture. The Argonne Leadership Computing Facility (ALCF) enables transformative science that solves some of the most difficult challenges in biology, chemistry, energy, climate, materials, physics, and other scientific realms. Users partnering with ALCF staff have reached research milestones previously unattainable, due to the ALCF's world-class supercomputing resources and expertise in computational science. In 2011, the ALCF's commitment to providing outstanding science and leadership-class resources was honored with several prestigious awards. Research on multiscale brain blood flow simulations was named a Gordon Bell Prize finalist. Intrepid, the ALCF's BG/P system, ranked No. 1 on the Graph 500 list for the second consecutive year. The next-generation BG/Q prototype again topped the Green500 list. Skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. The Catalyst Team matches project PIs with experienced computational scientists to maximize and accelerate research in their specific scientific domains. The Performance Engineering Team facilitates the effective use of applications on the Blue Gene system by assessing and improving the algorithms used by applications and the techniques used to

  13. Energy research for tomorrow

    International Nuclear Information System (INIS)

    Arzberger, Isolde; Breh, Wolfgang; Brendler, Vinzenz; Danneil, Friederike; Eulenburg, Katharina; Messner, Frank; Ossing, Franz; Saupe, Stephan; Sieber, Julia; Zeiss, Erhard

    2011-04-01

    One of the central challenges of the 21st century is to ensure a sustainable energy supply for the world's people and its economy. That's why scientists are searching for solutions that will provide sufficient amounts of energy - reliably, affordably and without endangering the natural environment on which our lives are based. One thing everyone agrees on is that there are no obvious solutions. No single energy carrier or technology will suffice to safeguard our future energy supply. Consequently, researchers must examine a broad range of options and develop many different kinds of technologies. This is the only way to create a sustainable energy system that adequately takes local environmental, political, social and economic conditions into account. Germany's largest scientific organisation, the Helmholtz Association of German Research Centres, is carrying out world-class research into diverse aspects of this existential challenge in its Research Field Energy. A broad spectrum of energy sources such as the sun, nuclear fusion, fossil fuels, geothermal energy, water, wind, nuclear fission and biomass are being investigated - but this is not all. Technologies for energy storage, energy distribution and efficient energy use also play a key role. This comprehensive approach corresponds to the energy concept of the government of the Federal Republic of Germany, which calls for a dynamic energy mix that includes the expanded use of renewable energies, a corresponding extension of the power grid, the development of new energy storage systems and increased energy efficiency. The scientists of the Helmholtz Association are investigating entire chains of energy processes, including boundary conditions and side effects such as the impact on the climate and the environment and acceptance issues. They are taking into account interactions with other sectors such as the raw materials, construction and mobility industries. 
Energy research is directed at industrial application and

  14. Energy research for tomorrow

    Energy Technology Data Exchange (ETDEWEB)

    Arzberger, Isolde; Breh, Wolfgang; Brendler, Vinzenz; Danneil, Friederike; Eulenburg, Katharina; Messner, Frank; Ossing, Franz; Saupe, Stephan; Sieber, Julia; Zeiss, Erhard (eds.)

    2011-04-15

    One of the central challenges of the 21st century is to ensure a sustainable energy supply for the world's people and its economy. That's why scientists are searching for solutions that will provide sufficient amounts of energy - reliably, affordably and without endangering the natural environment on which our lives are based. One thing everyone agrees on is that there are no obvious solutions. No single energy carrier or technology will suffice to safeguard our future energy supply. Consequently, researchers must examine a broad range of options and develop many different kinds of technologies. This is the only way to create a sustainable energy system that adequately takes local environmental, political, social and economic conditions into account. Germany's largest scientific organisation, the Helmholtz Association of German Research Centres, is carrying out world-class research into diverse aspects of this existential challenge in its Research Field Energy. A broad spectrum of energy sources such as the sun, nuclear fusion, fossil fuels, geothermal energy, water, wind, nuclear fission and biomass are being investigated - but this is not all. Technologies for energy storage, energy distribution and efficient energy use also play a key role. This comprehensive approach corresponds to the energy concept of the government of the Federal Republic of Germany, which calls for a dynamic energy mix that includes the expanded use of renewable energies, a corresponding extension of the power grid, the development of new energy storage systems and increased energy efficiency. The scientists of the Helmholtz Association are investigating entire chains of energy processes, including boundary conditions and side effects such as the impact on the climate and the environment and acceptance issues. They are taking into account interactions with other sectors such as the raw materials, construction and mobility industries. Energy research is directed at industrial

  15. [High energy particle physics]: Progress report covering the five year period from August 1, 1984 to May 31, 1989 with special emphasis for the period of August 1, 1988 to May 31, 1989: Part 1

    International Nuclear Information System (INIS)

    1989-01-01

In this document the High Energy Physics group reviews its accomplishments and progress during the past five years, with special emphasis on the past year, and presents plans for continuing research during the next several years. During the last few years the effort of the experimental group has been divided approximately equally between fixed-target physics and preparations for future collider experiments. The main emphasis of the theory group has been in the area of strong and electroweak phenomenology, with an emphasis on hard scattering processes. With the recent creation of the Supercomputer Computations Research Institute, some work has also been done in the area of numerical simulations of condensed matter spin models and techniques for implementing numerical simulations on supercomputers

  16. Feynman diagrams sampling for quantum field theories on the QPACE 2 supercomputer

    Energy Technology Data Exchange (ETDEWEB)

    Rappl, Florian

    2016-08-01

This work discusses the application of Feynman diagram sampling in quantum field theories. The method uses a computer simulation to sample the diagrammatic space obtained in a series expansion. Running large physical simulations requires powerful computers, which effectively splits the thesis into two parts. The first part deals with the method of Feynman diagram sampling. Here the theoretical background of the method itself is discussed. Additionally, important statistical concepts and the theory of the strong force, quantum chromodynamics, are introduced. This sets the context of the simulations. We create and evaluate a variety of models to estimate the applicability of diagrammatic methods. The method is then applied to sample the perturbative expansion of the vertex correction. In the end we obtain the value for the anomalous magnetic moment of the electron. The second part looks at the QPACE 2 supercomputer. This includes a short introduction to supercomputers in general, as well as a closer look at the architecture and the cooling system of QPACE 2. Benchmarks of the InfiniBand network are presented as guidance. At the core of this part, a collection of best practices and useful programming concepts is outlined, which enables the development of efficient, yet easily portable, applications for the QPACE 2 system.
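
The core idea of diagrammatic Monte Carlo, sampling the terms of a series expansion stochastically, can be illustrated on a toy example far simpler than QCD: a Metropolis walk over the orders n of the exponential series, with weights w_n = x^n/n!. This is an illustrative sketch only, not the algorithm of the thesis; for these weights the order distribution is Poisson(x), so the mean sampled order should come out near x.

```python
import random

def sample_series_orders(x, n_steps=200_000, seed=1):
    """Metropolis random walk over the orders n of the series
    exp(x) = sum_n x**n / n!, i.e. weights w_n = x**n / n!."""
    random.seed(seed)
    n = 0
    total = 0
    for _ in range(n_steps):
        proposal = n + random.choice((-1, 1))
        if proposal >= 0:
            # acceptance ratio w_proposal / w_n
            ratio = x / proposal if proposal > n else n / x
            if random.random() < min(1.0, ratio):
                n = proposal
        total += n
    return total / n_steps  # for Poisson(x) weights this approaches x
```

Calling `sample_series_orders(3.0)` should return a value close to 3.0, mirroring how diagrammatic samplers spend most of their time at the expansion orders that dominate the series.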

  17. Improving the energy efficiency of sparse linear system solvers on multicore and manycore systems.

    Science.gov (United States)

    Anzt, H; Quintana-Ortí, E S

    2014-06-28

While most recent breakthroughs in scientific research rely on complex simulations carried out in large-scale supercomputers, the power draw and energy spent for this purpose are increasingly becoming a limiting factor to this trend. In this paper, we provide an overview of the current status of energy-efficient scientific computing by reviewing different technologies used to monitor power draw as well as power- and energy-saving mechanisms available in commodity hardware. For the particular domain of sparse linear algebra, we analyse the energy efficiency of a broad collection of hardware architectures and investigate how algorithmic and implementation modifications can improve the energy performance of sparse linear system solvers, without negatively impacting their performance. © 2014 The Author(s). Published by the Royal Society. All rights reserved.

  18. A fast random number generator for the Intel Paragon supercomputer

    Science.gov (United States)

    Gutbrod, F.

    1995-06-01

A pseudo-random number generator is presented which makes optimal use of the architecture of the i860 microprocessor and which is expected to have a very long period. It is therefore a good candidate for use on the parallel supercomputer Paragon XP. In the assembler version, it needs 6.4 cycles for a real*4 random number. There is a FORTRAN routine which yields identical numbers up to rare and minor rounding discrepancies, and it needs 28 cycles. The FORTRAN performance on other microprocessors is somewhat better. Arguments for the quality of the generator and some numerical tests are given.
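
Gutbrod's generator itself is tied to i860 assembler; as a generic illustration of the long-period lagged-Fibonacci family that such generators belong to, here is a minimal additive lagged-Fibonacci sketch. The lags (55, 24) are Knuth's classic choice, not the paper's parameters, and the LCG used to fill the lag table is likewise only a placeholder:

```python
class LaggedFibonacci:
    """Additive lagged-Fibonacci PRNG: x_n = (x_{n-55} + x_{n-24}) mod 2^32.
    Illustrative sketch only; not the Paragon generator of the paper."""
    P, Q, MOD = 55, 24, 2 ** 32

    def __init__(self, seed=12345):
        # fill the lag table with a simple linear congruential generator
        state = seed
        self.table = []
        for _ in range(self.P):
            state = (1664525 * state + 1013904223) % self.MOD
            self.table.append(state)
        self.i = 0

    def next_uniform(self):
        """Return the next pseudo-random float in [0, 1)."""
        p_idx = self.i % self.P                    # slot of x_{n-55}, reused for x_n
        q_idx = (self.i + self.P - self.Q) % self.P  # slot of x_{n-24}
        x = (self.table[p_idx] + self.table[q_idx]) % self.MOD
        self.table[p_idx] = x
        self.i += 1
        return x / self.MOD
```

Two instances seeded identically reproduce the same stream, the property the abstract relies on when comparing the assembler and FORTRAN versions number for number.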

  19. The past, present, and future of test and research reactor physics

    International Nuclear Information System (INIS)

    Ryskamp, J.M.

    1992-01-01

Reactor physics calculations have been performed on research reactors since the first one was built 50 years ago under the University of Chicago stadium. Since then, reactor physics calculations have evolved from Fermi-age theory calculations performed with slide rules to three-dimensional, continuous-energy, coupled neutron-photon Monte Carlo computations performed with supercomputers and workstations. Such enormous progress in reactor physics leads us to believe that the next 50 years will be just as exciting. This paper reviews this transition from the past to the future

  20. Making Research Cyberinfrastructure a Strategic Choice

    Science.gov (United States)

    Hacker, Thomas J.; Wheeler, Bradley C.

    2007-01-01

    The commoditization of low-cost hardware has enabled even modest-sized laboratories and research projects to own their own "supercomputers." The authors argue that this local solution undermines rather than amplifies the research potential of scholars. CIOs, provosts, and research technologists should consider carefully an overall…

  1. Energy research program 82

    International Nuclear Information System (INIS)

    1982-01-01

The energy research program 82 (EFP-82) is prepared by the Danish Ministry of Energy in order to continue the extension of the Danish energy research and development started through the former Trade Ministry's programs EM-1 (1976) and EM-2 (1978), and the Ministry of Energy's programs EFP-80 and EFP-81. The new program is a continuation of the activities in the period 1982-84 with a total budget of 100 mio. DKK. The program gives a brief description of background, principles, organization and financing, and a detailed description of each research area. (BP)

  2. Holistic Approach to Data Center Energy Efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Hammond, Steven W [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-18

This presentation discusses NREL's Energy Systems Integration Facility and NREL's holistic design approach to sustainable data centers, which led to the world's most energy-efficient data center. It describes Peregrine, a warm-water liquid-cooled supercomputer; waste heat reuse in the data center; demonstrated power usage effectiveness (PUE) and energy reuse effectiveness (ERE); and lessons learned during four years of operation.

  3. Energy research 2003 - Overview

    International Nuclear Information System (INIS)

    2004-01-01

This publication issued by the Swiss Federal Office of Energy (SFOE) presents an overview of advances made in energy research in Switzerland in 2003. In the report, the heads of various programmes present projects and summarise the results of research in four main areas: efficient use of energy, renewable energies, nuclear energy and energy policy fundamentals. Energy efficiency is illustrated by examples from the areas of building, traffic, electricity, ambient heat and combined heat and power, combustion, fuel cells and process engineering. In the renewable energy area, projects concerning energy storage, photovoltaics, solar chemistry and hydrogen, biomass, small-scale hydro, geothermal energy and wind energy are presented. Work being done on nuclear safety and disposal regulations, as well as on controlled thermonuclear fusion, is also discussed

  4. Energy research program 84

    International Nuclear Information System (INIS)

    1984-01-01

    The energy research program 84 (EFP-84) is prepared by the Danish Ministry of Energy in order to continue the extension of the Danish energy research and development started through the former Trade Ministry's programs EM-1 (1976) and EM-2 (1978), and the Ministry of Energy's programs EFP-80, EFP-81, EFP-82 and EFP-83. The new program is a continuation of the activities in the period 1984-86 with a total budget of 112 mio. DKK. The program gives a brief description of background, principles, organization and financing, and a detailed description of each research area. (ln)

  5. Energy research program 83

    International Nuclear Information System (INIS)

    1983-01-01

    The energy research program 83 (EFP-83) is prepared by the Danish Ministry of Energy in order to continue the extension of the Danish energy research and development started through the former Trade Ministry's programs EM-1 (1976) and EM-2 (1978), and the Ministry of Energy's programs EFP-80, EFP-81 and EFP-82. The new program is a continuation of the activities in the period 1983-85 with a total budget of 111 mio. DKK. The program gives a brief description of background, principles, organization and financing, and a detailed description of each research area. (ln)

  6. Energy research program 85

    International Nuclear Information System (INIS)

    1985-01-01

    The energy research program 85 (EFP-85) is prepared by the Danish Ministry of Energy in order to continue the extension of the Danish energy research and development started through the former Trade Ministry's programs EM-1 (1976) and EM-2 (1978), and Ministry of Energy's programs EFP-80, EFP-81, EFP-82, EFP-83, and EFP-84. The new program is a continuation of the activities in the period 1985-87 with a total budget of 110 mio. DKK. The program gives a brief description of background, principles, organization and financing, and a detailed description of each research area. (ln)

  7. Performance characteristics of hybrid MPI/OpenMP implementations of NAS parallel benchmarks SP and BT on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu; Taylor, Valerie

    2011-01-01

The NAS Parallel Benchmarks (NPB) are well-known applications with fixed algorithms for evaluating parallel systems and tools. Multicore supercomputers provide a natural programming paradigm for hybrid programs, whereby OpenMP is used for data sharing among the cores that comprise a node and MPI is used for communication between nodes. In this paper, we use the SP and BT benchmarks of MPI NPB 3.3 as a basis for a comparative approach to implement hybrid MPI/OpenMP versions of SP and BT. In particular, we compare the performance of the hybrid SP and BT with their MPI counterparts on large-scale multicore supercomputers. Our performance results indicate that the hybrid SP outperforms the MPI SP by up to 20.76%, and the hybrid BT outperforms the MPI BT by up to 8.58%, on up to 10,000 cores on BlueGene/P at Argonne National Laboratory and Jaguar (Cray XT4/5) at Oak Ridge National Laboratory. We also use performance tools and MPI trace libraries available on these supercomputers to further investigate the performance characteristics of the hybrid SP and BT.
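
The two-level decomposition underlying such hybrid codes (MPI ranks split the problem across nodes, OpenMP threads split each rank's block within a node) can be sketched, purely as an illustration and not as the NPB code itself, by computing the index range each (rank, thread) pair would own:

```python
def partition(n_items, n_parts, part):
    """Contiguous block decomposition of range(n_items) into n_parts,
    remainders spread over the first parts (the usual MPI idiom)."""
    base, extra = divmod(n_items, n_parts)
    start = part * base + min(part, extra)
    size = base + (1 if part < extra else 0)
    return start, start + size

def hybrid_ranges(n_items, n_ranks, n_threads):
    """Index range owned by each (rank, thread) pair: ranks split the
    global range (MPI level), threads split each rank's block (OpenMP level)."""
    out = {}
    for rank in range(n_ranks):
        r0, r1 = partition(n_items, n_ranks, rank)
        for t in range(n_threads):
            t0, t1 = partition(r1 - r0, n_threads, t)
            out[(rank, t)] = (r0 + t0, r0 + t1)
    return out
```

For 100 items on 3 ranks with 4 threads each, the 12 ranges tile 0..99 exactly, with remainders absorbed by the first ranks and threads.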

  8. Performance characteristics of hybrid MPI/OpenMP implementations of NAS parallel benchmarks SP and BT on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu

    2011-03-29

The NAS Parallel Benchmarks (NPB) are well-known applications with fixed algorithms for evaluating parallel systems and tools. Multicore supercomputers provide a natural programming paradigm for hybrid programs, whereby OpenMP is used for data sharing among the cores that comprise a node and MPI is used for communication between nodes. In this paper, we use the SP and BT benchmarks of MPI NPB 3.3 as a basis for a comparative approach to implement hybrid MPI/OpenMP versions of SP and BT. In particular, we compare the performance of the hybrid SP and BT with their MPI counterparts on large-scale multicore supercomputers. Our performance results indicate that the hybrid SP outperforms the MPI SP by up to 20.76%, and the hybrid BT outperforms the MPI BT by up to 8.58%, on up to 10,000 cores on BlueGene/P at Argonne National Laboratory and Jaguar (Cray XT4/5) at Oak Ridge National Laboratory. We also use performance tools and MPI trace libraries available on these supercomputers to further investigate the performance characteristics of the hybrid SP and BT.

  9. Energy research program 86

    International Nuclear Information System (INIS)

    1986-01-01

    The energy research program 86 (EFP-86) is prepared by the Danish Ministry of Energy in order to continue the extension of the Danish energy research and development started through the former Trade Ministry's programs EM-1 (1976) and EM-2 (1978), and the Ministry of Energy's programs EFP-80, EFP-81, EFP-82, EFP-83, EFP-84, and EFP-85. The new program is a continuation of the activities in the period 1986-88 with a total budget of 116 mio. DKK. The program gives a brief description of background, principles, organization and financing, and a detailed description of each research area. (ln)

  10. Simulation of x-rays in refractive structure by the Monte Carlo method using the supercomputer SKIF

    International Nuclear Information System (INIS)

    Yaskevich, Yu.R.; Kravchenko, O.I.; Soroka, I.I.; Chembrovskij, A.G.; Kolesnik, A.S.; Serikova, N.V.; Petrov, P.V.; Kol'chevskij, N.N.

    2013-01-01

Software 'Xray-SKIF' for simulating X-rays in refractive structures by the Monte Carlo method on the supercomputer SKIF BSU has been developed. The program generates a large number of rays propagating from a source to the refractive structure. Each ray trajectory is calculated under the assumption of geometrical optics, and absorption is computed for each ray inside the refractive structure. The calculated ray parameters are stored in dynamic arrays, which makes it possible to restore the X-ray field distribution very quickly for different detector positions. It was found that increasing the number of processors decreases the calculation time proportionally: simulating 10^8 X-rays with 1 and 30 processors takes 3 hours and 6 minutes, respectively. The 'Xray-SKIF' software can calculate 10^9 X-rays, which allows the X-ray field behind the refractive structure to be reconstructed with a spatial resolution of 1 micron. (authors)
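
The per-ray absorption bookkeeping described above can be caricatured with a one-dimensional Monte Carlo sketch: a homogeneous absorbing slab rather than a refractive lens stack, with `mu` a hypothetical attenuation coefficient. Each ray draws an exponential free path and survives if it exceeds the slab thickness, so the transmitted fraction should approach the Beer-Lambert value exp(-mu*d).

```python
import math
import random

def transmitted_fraction(mu, thickness, n_rays=100_000, seed=7):
    """Toy Monte Carlo: shoot rays at a homogeneous absorbing slab and
    count survivors. Illustrative only; not the 'Xray-SKIF' geometry."""
    random.seed(seed)
    survived = 0
    for _ in range(n_rays):
        # depth at which this ray would be absorbed (exponential free path)
        depth = -math.log(1.0 - random.random()) / mu
        if depth > thickness:
            survived += 1
    return survived / n_rays
```

With mu = 2.0 and thickness 0.5, the survivor fraction converges to exp(-1) ≈ 0.368, and the statistical error shrinks as the number of rays grows, which is why simulating 10^8 or 10^9 rays calls for parallel hardware.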

  11. Cyberinfrastructure for high energy physics in Korea

    International Nuclear Information System (INIS)

    Cho, Kihyeon; Kim, Hyunwoo; Jeung, Minho

    2010-01-01

We introduce the hierarchy of cyberinfrastructure, which consists of infrastructure (supercomputing and networks), Grid, e-Science, community and physics, from the bottom layer to the top. KISTI is the national headquarters for supercomputing, networks, Grid and e-Science in Korea. Therefore, KISTI is the best place for high-energy physicists to use cyberinfrastructure. We illustrate this concept with the CDF and ALICE experiments. The goal of e-Science is to study high energy physics anytime and anywhere, even when not on-site at the accelerator laboratories. The components are data production, data processing and data analysis. Data production means taking both on-line and off-line shifts remotely. Data processing means running jobs anytime, anywhere using Grid farms. Data analysis means working together to publish papers using collaborative environments such as the EVO (Enabling Virtual Organization) system. We also present the global community activities of FKPPL (France-Korea Particle Physics Laboratory) and physics as the top layer.

  12. Neutrons and sustainable energy research

    International Nuclear Information System (INIS)

    Peterson, V.

    2009-01-01

Full text: Neutron scattering is essential for the study of sustainable energy materials, including the areas of hydrogen research (such as its separation, storage, and use in fuel cells) and energy transport (such as fuel-cell and battery materials). Researchers at the Bragg Institute address critical questions in sustainable energy research, providing expertise and specialist analysis equipment for external collaborators and acting as a point of contact for the study of sustainable energy materials using neutron scattering. Some recent examples of sustainable energy materials research using neutron scattering will be presented. These examples include the storage of energy, as hydrogen (through a study of its location in, and interaction with, new porous hydrogen storage materials [1-3]) and in battery materials (through in-situ studies of structure during charge-discharge cycling), and the use of energy in fuel cells, by studying proton diffusion through fuel cell membranes.

  13. [Supercomputer investigation of the protein-ligand system low-energy minima].

    Science.gov (United States)

    Oferkin, I V; Sulimov, A V; Katkova, E V; Kutov, D K; Grigoriev, F V; Kondakova, O A; Sulimov, V B

    2015-01-01

The accuracy of protein-ligand binding energy calculations and of ligand positioning is strongly influenced by the choice of the docking target function. This work evaluates five different target functions used in docking: functions based on the MMFF94 force field and functions based on the PM7 quantum-chemical method, with or without an implicit solvent model (PCM, COSMO or SGB). For this purpose, the ligand positions corresponding to the minima of the target function were compared with the experimentally known ligand positions in the protein active site (crystal ligand positions). Each function was examined on the same test set of 16 protein-ligand complexes. A new parallelized docking program, FLM, based on a Monte Carlo search algorithm was developed to perform a comprehensive low-energy minima search and to calculate the protein-ligand binding energy. This study demonstrates that the docking target function based on the MMFF94 force field can be used to detect the crystal or near-crystal positions of the ligand by finding the low-energy local minima spectrum of the target function. The importance of accounting for solvent in the docking process for accurate ligand positioning is also shown. The accuracy of ligand positioning, as well as the correlation between the calculated and experimentally determined protein-ligand binding energies, improves when the MMFF94 force field is replaced by the new PM7 method with implicit solvent accounting.
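
The low-energy minima search that FLM performs in a force field can be caricatured on a one-dimensional toy landscape (illustrative only; the energy function and all parameters below are made up, not FLM's target functions): random restarts with stochastic downhill moves collect a spectrum of distinct local minima, sorted by energy.

```python
import math
import random

def energy(x):
    """Toy rugged landscape with several local minima; the global
    minimum sits near x = -0.30."""
    return x * x + 2.0 * math.sin(5.0 * x)

def collect_minima(n_starts=200, n_steps=400, seed=3):
    """Random restarts plus greedy stochastic downhill moves; returns
    the distinct local minima found, sorted by increasing energy."""
    random.seed(seed)
    minima = []
    for _ in range(n_starts):
        x = random.uniform(-4.0, 4.0)
        for _ in range(n_steps):
            trial = x + random.gauss(0.0, 0.05)
            if energy(trial) < energy(x):
                x = trial
        # keep only minima at least 0.1 apart as "distinct"
        if all(abs(x - m) > 0.1 for m in minima):
            minima.append(x)
    return sorted(minima, key=energy)
```

The first entry of the returned spectrum approximates the global minimum; the rest of the list is the analogue of the "low-energy local minima spectrum" examined in the paper.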

  14. Energy 2007. Research, development, demonstration; Energi 07. Forskning, udvikling, demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Byriel, I.P.; Justesen, Helle; Beck, A.; Borup Jensen, J.; Rosenfeldt Jakobsen, Kl; Jacobsen, Steen Hartvig (eds.)

    2007-08-10

    Danish energy research is in an exciting and challenging situation. Rising oil prices, unstable energy supply, climate policy responsibilities and globalization have brought development of new environmentally friendly and more efficient energy technologies into focus. Promising international markets for newly developed energy technologies are emerging, and at the same time well established Danish positions of strength are challenged by new strong actors on the global market. The Danish government has set to work on its vision of an appreciable strengthening of public energy research funding through the recent law on the energy technological development and demonstration programme EUDP and the realization of globalization funds. The interaction between basic and applied research must be kept intact. In this report the various Danish energy research programmes administered by Energinet.dk, Danish Energy Authority, Danish Energy Association, Danish Council for Strategic Research's Programme Commission on Energy and Environment and Danish National Advanced Technology Foundation, coordinate their annual reports for the first time. The aim of Energy 2007 is to give the reader an idea of how the energy research programmes collaborate on solving the major energy technology challenges - also in an international context. (BA)

  15. Performance Analysis and Scaling Behavior of the Terrestrial Systems Modeling Platform TerrSysMP in Large-Scale Supercomputing Environments

    Science.gov (United States)

    Kollet, S. J.; Goergen, K.; Gasper, F.; Shresta, P.; Sulis, M.; Rihani, J.; Simmer, C.; Vereecken, H.

    2013-12-01

In studies of the terrestrial hydrologic, energy and biogeochemical cycles, integrated multi-physics simulation platforms take a central role in characterizing non-linear interactions, variances and uncertainties of system states and fluxes in reciprocity with observations. Recently developed integrated simulation platforms attempt to honor the complexity of the terrestrial system across multiple time and space scales, from the deeper subsurface including groundwater dynamics up into the atmosphere. Technically, this requires the coupling of atmospheric, land surface, and subsurface-surface flow models in supercomputing environments, while ensuring a high degree of efficiency in the utilization of, e.g., standard Linux clusters and massively parallel resources. A systematic performance analysis, including profiling and tracing, is crucial in such an application for understanding the runtime behavior, identifying optimum model settings, and efficiently pinpointing potential parallel deficiencies. On sophisticated leadership-class supercomputers, such as the 28-rack 5.9 petaFLOP IBM Blue Gene/Q 'JUQUEEN' of the Jülich Supercomputing Centre (JSC), this is a challenging task, but all the more important when complex coupled component models are to be analysed. Here we present our experience with the coupling, application tuning (e.g., a 5-times speedup through compiler optimizations), parallel scaling and performance monitoring of the parallel Terrestrial Systems Modeling Platform TerrSysMP. The modeling platform consists of the weather prediction system COSMO of the German Weather Service; the Community Land Model, CLM, of NCAR; and the variably saturated surface-subsurface flow code ParFlow. The model system relies on the Multiple Program Multiple Data (MPMD) execution model, in which the external Ocean-Atmosphere-Sea-Ice-Soil coupler (OASIS3) links the component models. TerrSysMP has been instrumented with the performance analysis tool Scalasca and analyzed

  16. Activity report of Computing Research Center

    Energy Technology Data Exchange (ETDEWEB)

    1997-07-01

In April 1997, the National Laboratory for High Energy Physics (KEK), the Institute for Nuclear Study, University of Tokyo (INS), and the Meson Science Laboratory, Faculty of Science, University of Tokyo were reorganized into the High Energy Accelerator Research Organization, aiming at the further development of the wide field of accelerator science based on high-energy accelerators. Within this Research Organization, the Applied Research Laboratory is composed of four Centers that support research activities common to the whole Organization and carry out related research and development (R and D), integrating the four previous centers and their related sections in Tanashi. This support is expected to cover not only general assistance but also the preparation of, and R and D on, the systems required for promoting the research and planning its future. Computer technology is essential to the development of this research and is shared across the various research activities of the Organization. In response to these expectations, the new Computing Research Center promotes its duties in cooperation with researchers, covering everything from R and D on data analysis of various experiments to computational physics driven by powerful computing capacity such as supercomputers. This report describes the work and present state of the Data Processing Center of KEK in the first chapter and of the computer room of INS in the second chapter, as well as future issues for the Computing Research Center. (G.K.)

  17. Report of the Subpanel on Theoretical Computing of the High Energy Physics Advisory Panel

    International Nuclear Information System (INIS)

    1984-09-01

The Subpanel on Theoretical Computing of the High Energy Physics Advisory Panel (HEPAP) was formed in July 1984 to make recommendations concerning the need for state-of-the-art computing for theoretical studies. The specific Charge to the Subpanel is attached as Appendix A, and the full membership is listed in Appendix B. For the purposes of this study, theoretical computing was interpreted as encompassing both investigations in the theory of elementary particles and computation-intensive aspects of accelerator theory and design. Many problems in both areas are well suited to realizing the advantages of vectorized processing. The body of the Subpanel Report is organized as follows. The Introduction, Section I, explains some of the goals of computational physics as it applies to elementary particle theory and accelerator design. Section II reviews the availability of mainframe supercomputers to researchers in the United States, in Western Europe, and in Japan. Other promising approaches to large-scale computing are summarized in Section III. Section IV details the current computing needs for problems in high energy theory, and for beam dynamics studies. The Subpanel Recommendations appear in Section V. The Appendices attached to this Report give the Charge to the Subpanel, the Subpanel membership, and some background information on the financial implications of establishing a supercomputer center

  18. New generation of docking programs: Supercomputer validation of force fields and quantum-chemical methods for docking.

    Science.gov (United States)

    Sulimov, Alexey V; Kutov, Danil C; Katkova, Ekaterina V; Ilin, Ivan S; Sulimov, Vladimir B

    2017-11-01

Discovery of new inhibitors of the protein associated with a given disease is the initial and most important stage of the whole process of rational development of new pharmaceutical substances. New inhibitors block the active site of the target protein, and the disease is cured. Computer-aided molecular modeling can considerably increase the effectiveness of new inhibitor development. Reliable prediction of target protein inhibition by a small molecule (ligand) depends on the accuracy of docking programs. Such programs position a ligand in the target protein and estimate the protein-ligand binding energy. The positioning accuracy of modern docking programs is satisfactory; however, the accuracy of binding energy calculations is too low to predict good inhibitors. For effective application of docking programs to the development of new inhibitors, the accuracy of binding energy calculations should be better than 1 kcal/mol. Reasons for the limited accuracy of modern docking programs are discussed. One of the most important factors limiting this accuracy is the imperfection of protein-ligand energy calculations. Results of supercomputer validation of several force fields and quantum-chemical methods for docking are presented. The validation was performed by quasi-docking as follows. First, the low-energy minima spectra of 16 protein-ligand complexes were found by exhaustive minima search in the MMFF94 force field. Second, the energies of the lowest 8192 minima of each complex were recalculated with the CHARMM force field and with the PM6-D3H4X and PM7 quantum-chemical methods. The analysis of the minima energies reveals that the docking positioning accuracies of the PM7 and PM6-D3H4X quantum-chemical methods and of the CHARMM force field are close to one another, and all are better than the positioning accuracy of the MMFF94 force field. Copyright © 2017 Elsevier Inc. All rights reserved.
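
The quasi-docking workflow (rank many candidate poses with a cheap target function, then re-score the shortlist with a more expensive method) can be sketched with two made-up one-dimensional scoring functions; both functions and all parameters below are illustrative placeholders, not the paper's force fields or quantum-chemical methods:

```python
import math

def cheap_score(x):
    """Stand-in for a cheap force-field-style energy."""
    return (x - 1.0) ** 2

def expensive_score(x):
    """Stand-in for a costlier re-scoring method whose minimum sits
    at a slightly different pose."""
    return (x - 1.2) ** 2 + 0.1 * math.sin(3.0 * x)

def quasi_dock(candidates, keep=5):
    """Rank candidates with the cheap function, keep the best few,
    then re-rank only the survivors with the expensive function."""
    shortlist = sorted(candidates, key=cheap_score)[:keep]
    return min(shortlist, key=expensive_score)

poses = [i * 0.1 for i in range(-20, 40)]  # hypothetical 1-D ligand poses
best = quasi_dock(poses)
```

The cheap function shortlists poses near x = 1.0; the expensive re-scoring then picks x = 1.2, illustrating how recalculating the energies of a minima spectrum with a better method can shift the predicted pose.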

  19. Danish energy research

    International Nuclear Information System (INIS)

    1976-04-01

Review of current Danish research and development on energy, with the main emphasis on publicly financed work. Based on this review, a proposal is presented for extended research and development in Denmark. (B.P.)

  20. NEA activities in 1993. 22. Annual Report of the OECD Nuclear Energy Agency

    International Nuclear Information System (INIS)

    1994-01-01

The titles and themes of the ten chapters of this report on NEA activities are: trends in nuclear energy; nuclear development and the fuel cycle (potential contribution of nuclear energy, policy alternatives, maintaining the nuclear option, prospects); reactor safety and regulation (safety research, regulatory approach, safety assessment, accident phenomenology and management, human factors, international standards); radiation protection (revision of the standards, assessment of the protection, international emergency exercises); radioactive waste management (long term safety assessment, in situ evaluation, other radioactive wastes); nuclear science (role, nuclear data, use of supercomputers, actinide transmutation, NEA Data Bank); joint projects (Three Mile Island vessel investigation, Halden reactor project...); legal affairs (liability aspects...); information programme; relations with non-member countries. 28 figs

  1. Accelerating Science Impact through Big Data Workflow Management and Supercomputing

    Directory of Open Access Journals (Sweden)

    De K.

    2016-01-01

Full Text Available The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. ATLAS, one of the largest collaborations ever assembled in the history of science, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment relies on a heterogeneous distributed computational infrastructure. To manage the workflow for all data processing on hundreds of data centers, the PanDA (Production and Distributed Analysis) Workload Management System is used. An ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF), is being realized within the BigPanDA and megaPanDA projects. These projects are now exploring how PanDA might be used for managing computing jobs that run on supercomputers, including OLCF's Titan and NRC-KI HPC2. The main idea is to reuse, as much as possible, existing components of the PanDA system that are already deployed on the LHC Grid for analysis of physics data. The next generation of PanDA will allow many data-intensive sciences employing a variety of computing platforms to benefit from ATLAS experience and proven tools in highly scalable processing.
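The brokerage idea behind a workload manager of this kind (one system dispatching jobs across heterogeneous resources such as grid sites, clouds, and HPC facilities) can be illustrated with a toy scheduler. This is not the PanDA API; the resource names, capacities, and the least-loaded policy below are invented for illustration:

```python
# Toy job broker: assign each queued job to whichever resource currently
# has the most free slots. A stand-in for real workload-management logic,
# not an actual PanDA component.

from collections import deque

def dispatch(jobs, resources):
    """Assign each queued job to the resource with the most free slots."""
    queue = deque(jobs)
    assignment = {}
    while queue:
        job = queue.popleft()
        # pick the resource with the most remaining capacity
        name = max(resources, key=resources.get)
        if resources[name] == 0:
            break  # no capacity left; remaining jobs stay unassigned
        resources[name] -= 1
        assignment[job] = name
    return assignment

# Invented example: two grid sites and one HPC allocation.
capacity = {"grid_site_1": 2, "grid_site_2": 1, "hpc_titan": 3}
plan = dispatch(["job%d" % i for i in range(5)], capacity)
print(plan)
```

A real broker would also weigh data locality, queue wait times, and job requirements; the sketch only shows the capacity-driven assignment loop.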

  2. Bioprocessing research for energy applications

    Energy Technology Data Exchange (ETDEWEB)

    Scott, C.D.; Gaden, E.L. Jr.; Humphrey, A.E.; Carta, G.; Kirwan, D.J.

    1989-04-01

The new biotechnology that is emerging could have a major impact on many of the industries important to our country, especially those associated with energy production and conservation. Advances in bioprocessing systems will provide important alternatives for the future utilization of various energy resources and for the control of environmental hazards that can result from energy generation. Although research in the fundamental biological sciences has helped set the scene for a "new biotechnology," the major impediment to rapid commercialization for energy applications is the lack of a firm understanding of the necessary engineering concepts. Engineering research is now the essential "bridge" that will allow the development of a wide range of energy-related bioprocessing systems. A workshop entitled "Bioprocessing Research for Energy Applications" was held to address this technological area, to define the engineering research needs, and to identify those opportunities which would encourage rapid implementation of advanced bioprocessing concepts.

  3. Fossil energy research meeting

    Energy Technology Data Exchange (ETDEWEB)

    Kropschot, R. H.; Phillips, G. C.

    1977-12-01

    U.S. ERDA's research programs in fossil energy are reviewed with brief descriptions, budgets, etc. Of general interest are discussions related to the capabilities for such research of national laboratories, universities, energy centers, etc. Of necessity many items are treated briefly, but a general overview of the whole program is provided. (LTN)

  4. Programs of the Office of Energy Research

    International Nuclear Information System (INIS)

    1992-09-01

    The programs of the Office of Energy Research provide basic science support for energy technologies as well as advancing understanding in general science and training future scientists. Energy Research provides insights into fundamental science and associated phenomena and develops new or advanced concepts and techniques. Research of this type has been supported by the Department of Energy and its predecessors for over 40 years and includes research in the natural and physical sciences, including high energy and nuclear physics; magnetic fusion energy; biological and environmental research; and basic energy sciences research in the materials, chemical, and applied mathematical sciences, engineering and geosciences, and energy biosciences. These basic research programs help build the science and technology base that underpins energy development by Government and industry

  5. The green computing book tackling energy efficiency at large scale

    CERN Document Server

    Feng, Wu-chun

    2014-01-01

Low-Power, Massively Parallel, Energy-Efficient Supercomputers (The Blue Gene Team); Compiler-Driven Energy Efficiency (Mahmut Kandemir and Shekhar Srikantaiah); An Adaptive Run-Time System for Improving Energy Efficiency (Chung-Hsing Hsu, Wu-chun Feng, and Stephen W. Poole); Energy-Efficient Multithreading through Run-Time Adaptation; Exploring Trade-Offs between Energy Savings and Reliability in Storage Systems (Ali R. Butt, Puranjoy Bhattacharjee, Guanying Wang, and Chris Gniady); Cross-Layer Power Management (Zhikui Wang and Parthasarathy Ranganathan); Energy-Efficient Virtualized Systems (Ripal Nathuji and K

  6. Energy research and development in Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Hultberg, S.; Lindstroem Thomsen, P.

    1996-06-01

The document describes some of the most important results produced during the last twenty years under the Danish government's Energy Research Programme (ERP). Some of the involved research groups, and their current research projects, are described. The aim is to invite international cooperation on research in this field. Research areas are divided under the main headings of energy policy, energy supply and energy end-use. The document is illustrated with coloured photographs, diagrams and graphs. The names of contact persons, firms and institutions relevant to the described projects are listed. (AB)

  7. Storage-Intensive Supercomputing Benchmark Study

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, J; Dossa, D; Gokhale, M; Hysom, D; May, J; Pearce, R; Yoo, A

    2007-10-30

Critical data science applications requiring frequent access to storage perform poorly on today's computing architectures. This project addresses efficient computation of data-intensive problems in national security and basic science by exploring, advancing, and applying a new form of computing called storage-intensive supercomputing (SISC). Our goal is to enable applications that simply cannot run on current systems and, for a broad range of data-intensive problems, to deliver an order of magnitude improvement in price/performance over today's data-intensive architectures. This technical report documents much of the work done under LDRD 07-ERD-063, Storage Intensive Supercomputing, during the period 05/07-09/07. The following chapters describe: (1) a new file I/O monitoring tool, iotrace, developed to capture the dynamic I/O profiles of Linux processes; (2) an out-of-core graph benchmark for level-set expansion of scale-free graphs; (3) an entity extraction benchmark consisting of a pipeline of eight components; and (4) an image resampling benchmark drawn from the SWarp program in the LSST data processing pipeline. The performance of the graph and entity extraction benchmarks was measured in three different scenarios: data sets residing on the NFS file server and accessed over the network; data sets stored on local disk; and data sets stored on the Fusion I/O parallel NAND Flash array. The image resampling benchmark compared the performance of software-only and GPU-accelerated implementations. In addition to the work reported here, a text processing application was developed that used an FPGA to accelerate n-gram profiling for language classification. The n-gram application will be presented at SC07 at the High Performance Reconfigurable Computing Technologies and Applications Workshop. The graph and entity extraction benchmarks were run on a Supermicro server housing the 40 GB parallel NAND Flash disk array, the Fusion-io. The Fusion system specs are as follows
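The "level-set expansion" graph benchmark named above can be sketched with an ordinary breadth-first search: each level set is the ring of vertices one more hop from a seed vertex. The real benchmark runs out-of-core on scale-free graphs; this invented toy version assumes the whole graph fits in memory:

```python
# Minimal in-memory sketch of level-set expansion: BFS from a seed vertex,
# collecting the set of newly reached vertices at each hop distance.
# The example graph is invented.

from collections import deque

def level_sets(adjacency, seed):
    """Return a list of vertex sets, one per BFS level from the seed."""
    seen = {seed}
    frontier = [seed]
    levels = [set(frontier)]
    while frontier:
        nxt = []
        for v in frontier:
            for w in adjacency.get(v, ()):
                if w not in seen:
                    seen.add(w)
                    nxt.append(w)
        if nxt:
            levels.append(set(nxt))
        frontier = nxt
    return levels

# Invented example: a small hub-and-spoke graph.
graph = {0: [1, 2, 3], 1: [0, 4], 2: [0], 3: [0], 4: [1]}
print(level_sets(graph, 0))  # [{0}, {1, 2, 3}, {4}]
```

The storage-intensive version replaces the in-memory adjacency lookup with reads against an on-disk graph, which is what makes I/O throughput the limiting factor.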

  8. Energy research strategic plan

    International Nuclear Information System (INIS)

    1995-08-01

    Research and development is an essential element of economic prosperity and a traditional source of strength for the U.S. economy. During the past two decades, the way of introducing technological developments into the national economy has changed steadily. Previously, industry did most long-term technology development and some basic research with private funding. Today, the Nation's industry relies mostly on federally-funded research to provide the knowledge base that leads to new technologies and economic growth. In the 1980s, U.S. firms lost major technology markets to foreign competition. In response, many firms increased emphasis on technology development for near term payoff while decreasing long term research for new technology. The purpose of the Office of Energy Research of the U.S. Department of Energy (DOE) is to provide basic research and technology development that triggers and drives economic development and helps maintain U.S. world leadership in science. We do so through programs of basic and applied research that support the Department's energy, environmental and national defense missions and that provide the foundation for technical advancement. We do so by emphasizing research that maintains our world leadership in science, mathematics, and engineering and through partnerships with universities, National Laboratories, and industries across the Nation

  9. Parallel Multivariate Spatio-Temporal Clustering of Large Ecological Datasets on Hybrid Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Sreepathi, Sarat [ORNL; Kumar, Jitendra [ORNL; Mills, Richard T. [Argonne National Laboratory; Hoffman, Forrest M. [ORNL; Sripathi, Vamsi [Intel Corporation; Hargrove, William Walter [United States Department of Agriculture (USDA), United States Forest Service (USFS)

    2017-09-01

A proliferation of data from vast networks of remote sensing platforms (satellites, unmanned aircraft systems (UAS), airborne, etc.), observational facilities (meteorological, eddy covariance, etc.), state-of-the-art sensors, and simulation models offers unprecedented opportunities for scientific discovery. Unsupervised classification is a widely applied data mining approach to derive insights from such data. However, classification of very large data sets is a complex computational problem that requires efficient numerical algorithms and implementations on high performance computing (HPC) platforms. Additionally, increasing power, space, cooling, and efficiency requirements have led to the deployment of hybrid supercomputing platforms with complex architectures and memory hierarchies, like the Titan system at Oak Ridge National Laboratory. The advent of such accelerated computing architectures offers new challenges and opportunities for big data analytics in general and, specifically, for large-scale cluster analysis in our case. Although there is an existing body of work on parallel cluster analysis, those approaches do not fully meet the needs imposed by the nature and size of our large data sets. Moreover, they had scaling limitations and were mostly limited to traditional distributed-memory computing platforms. We present a parallel Multivariate Spatio-Temporal Clustering (MSTC) technique based on k-means cluster analysis that can target hybrid supercomputers like Titan. We developed a hybrid MPI, CUDA, and OpenACC implementation that can utilize both CPU and GPU resources on computational nodes. We describe performance results on Titan that demonstrate the scalability and efficacy of our approach in processing large ecological data sets.
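The k-means core of the MSTC technique alternates two steps: assign each observation to its nearest centroid, then move each centroid to the mean of its members. The paper's implementation is a hybrid MPI + CUDA/OpenACC code for Titan; this serial sketch with invented 2-D "observation" vectors only illustrates the loop that gets parallelized:

```python
# Serial sketch of Lloyd's k-means iteration: the assignment and update
# steps that a hybrid CPU/GPU implementation distributes across nodes.
# Data points and initial centroids are invented.

def kmeans(points, centroids, iterations=10):
    """Lloyd's algorithm: alternate point assignment and centroid update."""
    for _ in range(iterations):
        clusters = [[] for _ in centroids]
        for p in points:
            # assign each point to its nearest centroid (squared distance)
            best = min(range(len(centroids)),
                       key=lambda i: sum((a - b) ** 2
                                         for a, b in zip(p, centroids[i])))
            clusters[best].append(p)
        for i, members in enumerate(clusters):
            if members:  # recompute centroid as the mean of its members
                centroids[i] = tuple(sum(xs) / len(xs)
                                     for xs in zip(*members))
    return centroids, clusters

data = [(0.0, 0.1), (0.2, 0.0), (5.0, 5.1), (5.2, 4.9)]
centers, groups = kmeans(data, [(0.0, 0.0), (5.0, 5.0)])
print(centers)
```

In the parallel setting, the assignment step is embarrassingly parallel over points, while the centroid update requires a reduction across processes, which is where MPI enters.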

  10. Energy Frontier Research Center Materials Science of Actinides (A 'Life at the Frontiers of Energy Research' contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    International Nuclear Information System (INIS)

    Burns, Peter

    2011-01-01

    'Energy Frontier Research Center Materials Science of Actinides' was submitted by the EFRC for Materials Science of Actinides (MSA) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. MSA is directed by Peter Burns at the University of Notre Dame, and is a partnership of scientists from ten institutions.The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.

  11. Advances in energy research

    CERN Document Server

    Acosta, Morena J

    2013-01-01

    This book presents a comprehensive review of energy research studies from authors around the globe, including recent research in new technologies associated with the construction of nuclear power plants; oil disperse systems study using nuclear magnetic resonance relaxometry (NMRR); low energy consumption for cooling and heating systems; experimental investigation of the performance of a ground-source heat pump system for buildings heating and cooling; sustainable development of bioenergy from agricultural wastes and the environment; hazard identification and parametric analysis of toxic pollutants dispersion from large liquid hydrocarbon fuel-tank fires; maintenance benchmarking in petrochemicals plants by means of a multicriteria model; wind energy development innovation; power, people and pollution; nature and technology of geothermal energy and clean sustainable energy for the benefit of humanity and the environment; and soil thermal properties and the effects of groundwater on closed loops.

  12. Nuclear energy research in Germany 2008. Research centers and universities

    International Nuclear Information System (INIS)

    Tromm, Walter

    2009-01-01

    This summary report presents nuclear energy research at research centers and universities in Germany in 2008. Activities are explained on the basis of examples of research projects and a description of the situation of research and teaching in general. Participants are the - Karlsruhe Research Center, - Juelich Research Center (FZJ), - Dresden-Rossendorf Research Center (FZD), - Verein fuer Kernverfahrenstechnik und Analytik Rossendorf e.V. (VKTA), - Technical University of Dresden, - University of Applied Sciences, Zittau/Goerlitz, - Institute for Nuclear Energy and Energy Systems (IKE) at the University of Stuttgart, - Reactor Simulation and Reactor Safety Working Group at the Bochum Ruhr University. (orig.)

  13. International energy: Research organizations, 1986--1990

    Energy Technology Data Exchange (ETDEWEB)

    Hendricks, P.; Jordan, S. (eds.) (USDOE Office of Scientific and Technical Information, Oak Ridge, TN (USA))

    1991-03-01

The International Energy: Research Organizations publication contains the standardized names of energy research organizations used in energy information databases. Involved in this cooperative task are (1) the technical staff of the USDOE Office of Scientific and Technical Information (OSTI) in cooperation with the member countries of the Energy Technology Data Exchange (ETDE) and (2) the International Nuclear Information System (INIS). This publication identifies current organizations doing research in all energy fields, standardizes the format for recording these organization names in bibliographic citations, assigns a numeric code to facilitate data entry, and identifies report number prefixes assigned by these organizations. These research organization names may be used in searching the databases "Energy Science Technology" on DIALOG and "Energy" on STN International. These organization names are also used in USDOE databases on the Integrated Technical Information System. Research organizations active in the past five years, as indicated by database records, were identified to form this publication. This directory includes approximately 34,000 organizations that reported energy-related literature from 1986 to 1990 and updates the DOE Energy Data Base: Corporate Author Entries.

  14. US Department of Energy nuclear energy research initiative

    International Nuclear Information System (INIS)

    Ross, F.

    2001-01-01

    This paper describes the Department of Energy's (DOE's) Nuclear Energy Research Initiative (NERI) that has been established to address and help overcome the principal technical and scientific issues affecting the future use of nuclear energy in the United States. (author)

  15. Swiss Federal Energy Research Concept 2008 - 2011

    International Nuclear Information System (INIS)

    2007-04-01

This report for the Swiss Federal Office of Energy (SFOE) presents the plan for the activities of the Swiss Federal Commission on Energy Research CORE during the period 2008 - 2011. The motivation behind state promotion of energy research is discussed, along with the visions, aims and strategies of the energy research programme. The main areas of research to be addressed during the period are presented. These include the efficient use of energy in buildings and traffic; batteries and supercaps, electrical technologies, combustion systems, fuel cells and power generation are discussed. Research to be done in the area of renewable sources of energy is listed; here, solar-thermal, photovoltaics, hydrogen, biomass, geothermal energy, wind energy and ambient heat are among the areas to be examined. Research on nuclear energy and safety aspects is mentioned. Finally, work on the basics of the energy economy and the allocation of funding for the period 2008 - 2011 are discussed

  16. Use of QUADRICS supercomputer as embedded simulator in emergency management systems

    International Nuclear Information System (INIS)

    Bove, R.; Di Costanzo, G.; Ziparo, A.

    1996-07-01

The experience gained in implementing MRBT, an atmospheric dispersion model for short-duration releases, on a QUADRICS-Q1 supercomputer is reported. First, the MRBT model is described. It is an analytical model for studying the spreading of light gases released into the atmosphere by accidents. The solution of the diffusion equation is Gaussian-like and yields the concentration of the released pollutant substance as a function of space and time. The QUADRICS architecture is then introduced and the implementation of the model is described. Finally, the integration of the QUADRICS-based model as an embedded simulator in an emergency management system is considered
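A Gaussian-type analytical solution of the diffusion equation, of the kind this abstract describes, can be sketched with the standard textbook Gaussian plume formula. This is a generic illustration (not the MRBT code itself), and all source and dispersion parameters below are invented:

```python
# Generic Gaussian plume sketch: concentration downwind of a continuous
# point source at height h, with a ground-reflection (image source) term.
# Not the MRBT model; parameter values are invented.

import math

def plume_concentration(q, u, sigma_y, sigma_z, y, z, h):
    """Concentration (g/m^3) at crosswind offset y, height z.

    q : emission rate (g/s), u : wind speed (m/s),
    sigma_y, sigma_z : dispersion widths (m) at this downwind distance.
    """
    lateral = math.exp(-y * y / (2 * sigma_y * sigma_y))
    # reflection term: an image source mirrored below ground level
    vertical = (math.exp(-(z - h) ** 2 / (2 * sigma_z * sigma_z)) +
                math.exp(-(z + h) ** 2 / (2 * sigma_z * sigma_z)))
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Invented example: 10 g/s release, 2 m/s wind, 20 m dispersion widths.
c_centerline = plume_concentration(10.0, 2.0, 20.0, 20.0, y=0.0, z=0.0, h=50.0)
c_offaxis = plume_concentration(10.0, 2.0, 20.0, 20.0, y=30.0, z=0.0, h=50.0)
print(c_centerline > c_offaxis)  # True: concentration peaks on the centerline
```

Evaluating such a closed-form field over a large grid of receptor points for many time steps is the kind of data-parallel workload that maps naturally onto a machine like the QUADRICS.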

  17. MILC Code Performance on High End CPU and GPU Supercomputer Clusters

    Science.gov (United States)

    DeTar, Carleton; Gottlieb, Steven; Li, Ruizi; Toussaint, Doug

    2018-03-01

    With recent developments in parallel supercomputing architecture, many core, multi-core, and GPU processors are now commonplace, resulting in more levels of parallelism, memory hierarchy, and programming complexity. It has been necessary to adapt the MILC code to these new processors starting with NVIDIA GPUs, and more recently, the Intel Xeon Phi processors. We report on our efforts to port and optimize our code for the Intel Knights Landing architecture. We consider performance of the MILC code with MPI and OpenMP, and optimizations with QOPQDP and QPhiX. For the latter approach, we concentrate on the staggered conjugate gradient and gauge force. We also consider performance on recent NVIDIA GPUs using the QUDA library.
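The staggered conjugate gradient this abstract concentrates on is, at its core, the standard CG iteration for a symmetric positive-definite linear system. The toy sketch below (plain Python, a dense 2x2 system, no lattice structure) is only meant to show the kernel whose matrix-vector products dominate solver time; it is not MILC, QPhiX, or QUDA code:

```python
# Plain conjugate gradient for A x = b with A symmetric positive-definite,
# given only as a matrix-vector product. The 2x2 example system is invented.

def conjugate_gradient(matvec, b, tol=1e-10, max_iter=100):
    """Solve A x = b for SPD A supplied as a matvec callable."""
    x = [0.0] * len(b)
    r = list(b)            # residual r = b - A x, with x starting at zero
    p = list(r)
    rs = sum(v * v for v in r)
    for _ in range(max_iter):
        ap = matvec(p)
        alpha = rs / sum(pi * api for pi, api in zip(p, ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, ap)]
        rs_new = sum(v * v for v in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# Invented example: A = [[4, 1], [1, 3]], b = [1, 2].
A = [[4.0, 1.0], [1.0, 3.0]]
x = conjugate_gradient(lambda v: [sum(a * vi for a, vi in zip(row, v))
                                  for row in A], [1.0, 2.0])
print(x)  # approximately [1/11, 7/11]
```

On the lattice, `matvec` becomes the staggered Dirac operator applied across the whole volume, which is why vectorization and GPU offload of that one routine drive overall performance.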

  18. MILC Code Performance on High End CPU and GPU Supercomputer Clusters

    Directory of Open Access Journals (Sweden)

    DeTar Carleton

    2018-01-01

    Full Text Available With recent developments in parallel supercomputing architecture, many core, multi-core, and GPU processors are now commonplace, resulting in more levels of parallelism, memory hierarchy, and programming complexity. It has been necessary to adapt the MILC code to these new processors starting with NVIDIA GPUs, and more recently, the Intel Xeon Phi processors. We report on our efforts to port and optimize our code for the Intel Knights Landing architecture. We consider performance of the MILC code with MPI and OpenMP, and optimizations with QOPQDP and QPhiX. For the latter approach, we concentrate on the staggered conjugate gradient and gauge force. We also consider performance on recent NVIDIA GPUs using the QUDA library.

  19. SUPERCOMPUTER SIMULATION OF CRITICAL PHENOMENA IN COMPLEX SOCIAL SYSTEMS

    Directory of Open Access Journals (Sweden)

    Petrus M.A. Sloot

    2014-09-01

Full Text Available The paper describes the problem of computer simulation of critical phenomena in complex social systems on petascale computing systems within the framework of the complex-networks approach. A three-layer system of nested models of complex networks is proposed, including an aggregated analytical model to identify critical phenomena, a detailed model of individualized network dynamics, and a model to adjust the topological structure of a complex network. A scalable parallel algorithm covering all layers of complex-network simulation is proposed. Performance of the algorithm is studied on different supercomputing systems. The issues of software and information infrastructure for complex-network simulation are discussed, including the organization of distributed calculations, crawling data from social networks, and visualization of results. Applications of the developed methods and technologies are considered, including simulation of criminal network disruption, fast rumor spreading in social networks, evolution of financial networks, and epidemic spreading.
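One application named above, fast rumor spreading in social networks, can be sketched as a discrete-time contagion process on a toy graph. The paper runs individualized network dynamics at petascale; this invented example only shows the per-step spreading rule:

```python
# Minimal rumor-spreading sketch: each step, every informed node informs
# all of its neighbors (a deterministic SI-style process). The chain graph
# is invented for illustration.

def spread_rumor(adjacency, seeds, steps):
    """Return the final informed set and the informed count after each step."""
    informed = set(seeds)
    history = [len(informed)]
    for _ in range(steps):
        newly = {w for v in informed for w in adjacency.get(v, ())}
        informed |= newly
        history.append(len(informed))
    return informed, history

# Invented example: a 6-node chain; the rumor starts at one end.
chain = {i: [i - 1, i + 1] for i in range(1, 5)}
chain[0], chain[5] = [1], [4]
final, counts = spread_rumor(chain, seeds={0}, steps=5)
print(counts)  # [1, 2, 3, 4, 5, 6]
```

On a chain the rumor advances one hop per step; on the scale-free networks studied in such simulations, hub vertices make the informed set grow far faster, which is the critical phenomenon of interest.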

  20. Jointly Sponsored Research Program Energy Related Research

    Energy Technology Data Exchange (ETDEWEB)

    Western Research Institute

    2009-03-31

Cooperative Agreement DE-FC26-98FT40323, the Jointly Sponsored Research (JSR) Program at Western Research Institute (WRI), began in 1998. Over the course of the Program, a total of seventy-seven tasks were proposed, utilizing a total of $23,202,579 in USDOE funds. Against this funding, cosponsors committed $26,557,649 in private funds to produce a program valued at $49,760,228. The goal of the Jointly Sponsored Research Program was to develop or assist in the development of innovative technology solutions that will: (1) Increase the production of United States energy resources - coal, natural gas, oil, and renewable energy resources; (2) Enhance the competitiveness of United States energy technologies in international markets and assist in technology transfer; (3) Reduce the nation's dependence on foreign energy supplies and strengthen both the United States and regional economies; and (4) Minimize environmental impacts of energy production and utilization. Under the JSR Program, energy-related tasks emphasized enhanced oil recovery, heavy oil upgrading and characterization, coal beneficiation and upgrading, coal combustion systems development including oxy-combustion, emissions monitoring and abatement, coal gasification technologies including gas clean-up and conditioning, hydrogen and liquid fuels production, coal-bed methane recovery, and the development of technologies for the utilization of renewable energy resources. Environmental-related activities emphasized cleaning contaminated soils and waters, processing of oily wastes, mitigating acid mine drainage, and demonstrating uses for solid waste from clean coal technologies and other advanced coal-based systems. Technology enhancement activities included resource characterization studies and the development of improved methods, monitors, and sensors. In general, the goals of the tasks proposed were to enhance the competitiveness of U.S. technology, increase production of domestic resources, and reduce environmental

  1. Overview of energy-conservation research opportunities

    Energy Technology Data Exchange (ETDEWEB)

    Hopp, W.J.; Hauser, S.G.; Hane, G.J.; Gurwell, W.E.; Bird, S.P.; Cliff, W.C.; Williford, R.E.; Williams, T.A.; Ashton, W.B.

    1981-12-01

This document is a study of research opportunities that are important to developing advanced technologies for efficient energy use. The study's purpose is to describe a wide array of attractive technical areas from which specific research and development programs could be implemented. Research areas are presented for potential application in each of the major end-use sectors. The study develops and applies a systematic approach to identifying and screening applied energy conservation research opportunities. To broadly cover the energy end-use sectors, this study develops useful information relating to the areas where federally-funded applied research will most likely play an important role in promoting energy conservation. This study is not designed to produce a detailed agenda of specific recommended research activities. The general information presented allows uniform comparisons of disparate research areas and as such provides the basis for formulating a cost-effective, comprehensive federal applied energy conservation research strategy. Chapter 2 discusses the various methodologies that have been used in the past to identify research opportunities and details the approach used here. In Chapters 3, 4, and 5 the methodology is applied to the buildings, transportation, and industrial end-use sectors, and the opportunities for applied research in these sectors are discussed. Chapter 6 synthesizes the results of the previous three chapters to give a comprehensive picture of applied energy conservation research opportunities across all end-use sectors and presents the conclusions of the report.

  2. Future plan of basic research for nuclear energy by university researchers

    International Nuclear Information System (INIS)

    Shibata, Toshikazu

    1984-01-01

The National Committee for Nuclear Energy Research of the Japan Science Council has completed a future plan for basic nuclear energy research by university researchers. Based on the plan, the JSC recommended the promotion of basic research for nuclear energy in 1983. The future plan consists of four main research fields, namely, (1) improvements of reactor safety, (2) the downstream fuel cycle, (3) thorium fuel reactors, and (4) applications of research reactors and radioisotopes. (author)

  3. Energy research 2002 - Overview; Energie-Forschung 2002 / Recherche energetique 2002

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    This publication issued by the Swiss Federal Office of Energy presents an overview of advances made in energy research in Switzerland in 2002. In the report, the heads of various programmes present projects and summarise the results of research in four main areas: Efficient use of energy, renewable energy sources, nuclear energy and energy policy fundamentals. Energy-efficiency is illustrated by examples from the areas of building, traffic, electricity, ambient heat and combined heat and power, fuel cells and combustion. In the renewable energy area, projects concerning energy storage, photovoltaics, solar chemistry and hydrogen, biomass, geothermal energy, wind energy and small-scale hydro are presented. Nuclear safety and controlled thermonuclear fusion are discussed.

  4. Energy research at DOE, was it worth it?: energy efficiency and fossil energy research 1978 to 2000

    National Research Council Canada - National Science Library

    2001-01-01

    ... from the R&D conducted since 1978 in DOE's energy efficiency and fossil energy programs. In response to the congressional charge, the National Research Council formed the Committee on Benefits of DOE...

  5. Research using energy landscape

    International Nuclear Information System (INIS)

    Kim, Hack Jin

    2007-01-01

The energy landscape is a theoretical tool used for the study of systems in which cooperative processes occur, such as liquids, glasses, clusters, and proteins. Theoretical and experimental research related to the energy landscape is introduced in this review

  6. Basic Research Needs for Advanced Nuclear Systems. Report of the Basic Energy Sciences Workshop on Basic Research Needs for Advanced Nuclear Energy Systems, July 31-August 3, 2006

    Energy Technology Data Exchange (ETDEWEB)

    Roberto, J.; Diaz de la Rubia, T.; Gibala, R.; Zinkle, S.; Miller, J.R.; Pimblott, S.; Burns, C.; Raymond, K.; Grimes, R.; Pasamehmetoglu, K.; Clark, S.; Ewing, R.; Wagner, A.; Yip, S.; Buchanan, M.; Crabtree, G.; Hemminger, J.; Poate, J.; Miller, J.C.; Edelstein, N.; Fitzsimmons, T.; Gruzalski, G.; Michaels, G.; Morss, L.; Peters, M.; Talamini, K.

    2006-10-01

-ray sources, neutron sources, nanoscale science research centers, and supercomputers, offer the opportunity to transform and accelerate the fundamental materials and chemical sciences that underpin technology development for advanced nuclear energy systems. The fundamental challenge is to understand and control chemical and physical phenomena in multi-component systems from femto-seconds to millennia, at temperatures to 1000°C, and for radiation doses to hundreds of displacements per atom (dpa). This is a scientific challenge of enormous proportions, with broad implications in the materials science and chemistry of complex systems. New understanding is required for microstructural evolution and phase stability under relevant chemical and physical conditions, chemistry and structural evolution at interfaces, chemical behavior of actinide and fission-product solutions, and nuclear and thermomechanical phenomena in fuels and waste forms. First-principles approaches are needed to describe f-electron systems, design molecules for separations, and explain materials failure mechanisms. Nanoscale synthesis and characterization methods are needed to understand and design materials and interfaces with radiation, temperature, and corrosion resistance. Dynamical measurements are required to understand fundamental physical and chemical phenomena. New multiscale approaches are needed to integrate this knowledge into accurate models of relevant phenomena and complex systems across multiple length and time scales.

  7. Basic Research Needs for Advanced Nuclear Systems. Report of the Basic Energy Sciences Workshop on Basic Research Needs for Advanced Nuclear Energy Systems, July 31-August 3, 2006

    International Nuclear Information System (INIS)

    Roberto, J.; Diaz de la Rubia, T.; Gibala, R.; Zinkle, S.; Miller, J.R.; Pimblott, S.; Burns, C.; Raymond, K.; Grimes, R.; Pasamehmetoglu, K.; Clark, S.; Ewing, R.; Wagner, A.; Yip, S.; Buchanan, M.; Crabtree, G.; Hemminger, J.; Poate, J.; Miller, J.C.; Edelstein, N.; Fitzsimmons, T.; Gruzalski, G.; Michaels, G.; Morss, L.; Peters, M.; Talamini, K.

    2006-01-01

    X-ray sources, neutron sources, nanoscale science research centers, and supercomputers offer the opportunity to transform and accelerate the fundamental materials and chemical sciences that underpin technology development for advanced nuclear energy systems. The fundamental challenge is to understand and control chemical and physical phenomena in multi-component systems from femtoseconds to millennia, at temperatures to 1000 °C, and for radiation doses to hundreds of displacements per atom (dpa). This is a scientific challenge of enormous proportions, with broad implications in the materials science and chemistry of complex systems. New understanding is required for microstructural evolution and phase stability under relevant chemical and physical conditions, chemistry and structural evolution at interfaces, chemical behavior of actinide and fission-product solutions, and nuclear and thermomechanical phenomena in fuels and waste forms. First-principles approaches are needed to describe f-electron systems, design molecules for separations, and explain materials failure mechanisms. Nanoscale synthesis and characterization methods are needed to understand and design materials and interfaces with radiation, temperature, and corrosion resistance. Dynamical measurements are required to understand fundamental physical and chemical phenomena. New multiscale approaches are needed to integrate this knowledge into accurate models of relevant phenomena and complex systems across multiple length and time scales.

  8. Nuclear energy related research

    International Nuclear Information System (INIS)

    Toerroenen, K.; Kilpi, K.

    1985-01-01

    This research programme plan for 1985 covers the nuclear energy related research planned to be carried out at the Technical Research Centre of Finland (VTT) and funded by the Ministry of Trade and Industry in Finland, the Nordic Council of Ministers and VTT

  9. Symbolic simulation of engineering systems on a supercomputer

    International Nuclear Information System (INIS)

    Ragheb, M.; Gvillo, D.; Makowitz, H.

    1986-01-01

    Model-based production-rule systems are developed for the symbolic simulation of complex engineering systems on a CRAY X-MP supercomputer. The fault-tree and event-tree analysis methodologies from systems analysis are used for problem representation and are coupled to the rule-based system paradigm from knowledge engineering to model engineering devices. Modelling is based on knowledge of the structure and function of the device rather than on human expertise alone. To implement the methodology, we developed a production-rule analysis system, HAL-1986, that uses both backward-chaining and forward-chaining. The inference engine uses an induction-deduction-oriented antecedent-consequent logic and is programmed in Portable Standard Lisp (PSL). The inference engine is general and can accommodate modifications and additions to the knowledge base. The methodologies are demonstrated using a model for the identification of faults, and subsequent recovery from abnormal situations, in nuclear reactor safety analysis. The use of these methodologies for the prediction of future device responses under operational and accident conditions, using coupled symbolic and procedural programming, is discussed
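
    The coupling of fault-tree logic to a rule-based paradigm can be illustrated with a minimal forward-chaining engine. The sketch below is illustrative only, not the HAL-1986 system itself: the rules mimic fault-tree AND-gates (a top event fires when all of its basic events are asserted), and the fact names are invented for the example.

    ```python
    # Minimal forward-chaining production-rule sketch (illustrative; not HAL-1986).
    # Each rule maps a set of antecedent facts to a consequent fact, like a
    # fault-tree AND-gate: the consequent fires once all antecedents hold.

    RULES = [
        ({"pump_failed", "backup_pump_failed"}, "loss_of_coolant_flow"),
        ({"loss_of_coolant_flow", "scram_failed"}, "core_overheat"),
    ]

    def forward_chain(facts, rules):
        """Repeatedly fire rules whose antecedents are satisfied, to fixpoint."""
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for antecedents, consequent in rules:
                if antecedents <= facts and consequent not in facts:
                    facts.add(consequent)
                    changed = True
        return facts

    derived = forward_chain({"pump_failed", "backup_pump_failed", "scram_failed"}, RULES)
    ```

    Chaining the two rules derives the intermediate event and then the top event from the three asserted basic events.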

  10. Nuclear energy related research

    International Nuclear Information System (INIS)

    Rintamaa, R.

    1992-05-01

    The annual Research Programme Plan describes publicly funded nuclear energy related research to be carried out mainly at the Technical Research Centre of Finland (VTT) in 1992. The research is financed primarily by the Ministry of Trade and Industry (KTM), the Finnish Centre for Radiation and Nuclear Safety (STUK) and VTT itself. Other research institutes, utilities and industry also contribute to many projects

  11. Research for the energy transition. The organization of the energy systems

    International Nuclear Information System (INIS)

    2017-01-01

    The volume on research for the energy transition includes contributions to the FVEE annual meeting 2016 on the following issues: status and perspectives of the energy transition, key technologies for the energy transition, political boundary conditions, development trends in photovoltaics, components for the energy supply (wind energy, hydrogen technologies, smart bioenergy concepts, contribution of the geosphere), grids and storage systems for the energy transition, and the renewable energies research network (FVEE).

  12. Energy research, national and international

    International Nuclear Information System (INIS)

    Rhijn, A.A.T. van

    1976-01-01

    The Dutch Energy Research Programme inaugurated by the National Steering Group for Energy Research (LSEO) is discussed. Three types of criteria to be borne in mind in the selection of new directions in development are considered: the setting of targets for energy policy; the country's general social and economic aims; and the scientific, financial and organisational possibilities. International aspects are reviewed with reference to the IEA, CERN, Euratom, ELDO and ESRO. (D.J.B.)

  13. Nuclear energy related research

    International Nuclear Information System (INIS)

    Salminen, Pertti

    1989-03-01

    This annual Research Programme Plan covers the publicly funded nuclear energy related research planned to be carried out at the Technical Research Centre of Finland (VTT) in 1989. The research will be financed by the Ministry of Trade and Industry, the Finnish Centre for Radiation and Nuclear Safety, the Nordic Council of Ministers and VTT itself

  14. Nuclear energy related research

    International Nuclear Information System (INIS)

    Salminen, P.

    1988-02-01

    This annual Research Programme Plan covers the publicly funded nuclear energy related research planned to be carried out at the Technical Research Centre of Finland (VTT) in 1988. The research will be financed by the Ministry of Trade and Industry, the Finnish Centre for Radiation and Nuclear Safety, the Nordic Council of Ministers and VTT itself

  15. Nuclear energy related research

    International Nuclear Information System (INIS)

    Mattila, L.; Vanttola, T.

    1991-10-01

    The annual Research Programme Plan describes the publicly funded nuclear energy related research to be carried out mainly at the Technical Research Centre of Finland (VTT) in 1991. The research is financed primarily by the Ministry of Trade and Industry (KTM), the Finnish Centre for Radiation and Nuclear Safety (STUK) and VTT itself. Other research institutes, utilities and industry also contribute to many projects

  16. Three-dimensional kinetic simulations of whistler turbulence in solar wind on parallel supercomputers

    Science.gov (United States)

    Chang, Ouliang

    The objective of this dissertation is to study the physics of whistler turbulence evolution and its role in energy transport and dissipation in solar wind plasmas through computational and theoretical investigations. This dissertation presents the first fully three-dimensional (3D) particle-in-cell (PIC) simulations of whistler turbulence forward cascade in a homogeneous, collisionless plasma with a uniform background magnetic field Bo, and the first 3D PIC simulation of whistler turbulence with both forward and inverse cascades. Such computationally demanding research is made possible through the use of massively parallel, high-performance electromagnetic PIC simulations on state-of-the-art supercomputers. Simulations are carried out to study characteristic properties of whistler turbulence under variable solar wind fluctuation amplitude (epsilon_e) and electron beta (beta_e), relative contributions to energy dissipation and electron heating in whistler turbulence from the quasilinear scenario and the intermittency scenario, and whistler turbulence preferential cascading direction and wavevector anisotropy. The 3D simulations of whistler turbulence exhibit a forward cascade of fluctuations into a broadband, anisotropic, turbulent spectrum at shorter wavelengths with wavevectors preferentially quasi-perpendicular to Bo. The overall electron heating yields T∥ > T⊥ for all epsilon_e and beta_e values, indicating the primary linear wave-particle interaction is Landau damping. But linear wave-particle interactions play a minor role in shaping the wavevector spectrum, whereas nonlinear wave-wave interactions are overall stronger and faster processes, and ultimately determine the wavevector anisotropy. Simulated magnetic energy spectra as a function of wavenumber show a spectral break to steeper slopes, which scales as k⊥ lambda_e ≃ 1 independent of beta_e values, where lambda_e is the electron inertial length, qualitatively similar to solar wind observations.
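
    The spectral-break scale quoted above, k⊥ lambda_e ≃ 1, can be made concrete by evaluating the electron inertial length lambda_e = c / omega_pe. The sketch below does this in Python for an assumed typical solar wind electron density near 1 AU (the density value is our assumption for illustration, not a figure taken from the dissertation).

    ```python
    import math

    # Illustrative calculation of the electron inertial length lambda_e = c / omega_pe,
    # the scale at which the simulated magnetic spectra break (k_perp * lambda_e ~ 1).
    # The density below is an assumed typical solar wind value, not from the thesis.

    C = 2.998e8       # speed of light, m/s
    E = 1.602e-19     # elementary charge, C
    EPS0 = 8.854e-12  # vacuum permittivity, F/m
    ME = 9.109e-31    # electron mass, kg

    def electron_inertial_length(n_e):
        """lambda_e = c / omega_pe for electron number density n_e in m^-3."""
        omega_pe = math.sqrt(n_e * E**2 / (EPS0 * ME))  # electron plasma frequency, rad/s
        return C / omega_pe

    lambda_e = electron_inertial_length(5e6)  # ~5 electrons per cm^3 near 1 AU
    ```

    For this assumed density the inertial length comes out on the order of a few kilometers, which is the scale at which solar wind observations also show a steepening of the magnetic spectrum.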

  17. Leveraging the national cyberinfrastructure for biomedical research.

    Science.gov (United States)

    LeDuc, Richard; Vaughn, Matthew; Fonner, John M; Sullivan, Michael; Williams, James G; Blood, Philip D; Taylor, James; Barnett, William

    2014-01-01

    In the USA, the national cyberinfrastructure refers to a system of research supercomputers and other IT facilities and the high-speed networks that connect them. These resources have been heavily leveraged by scientists in disciplines such as high energy physics, astronomy, and climatology, but until recently they have been little used by biomedical researchers. We suggest that many of the 'Big Data' challenges facing the medical informatics community can be efficiently handled using national-scale cyberinfrastructure. Resources such as the Extreme Science and Engineering Discovery Environment, the Open Science Grid, and Internet2 provide economical and proven infrastructures for Big Data challenges, but these resources can be difficult to approach. Specialized web portals, support centers, and virtual organizations can be constructed on these resources to meet defined computational challenges, specifically for genomics. We provide examples of how this has been done in basic biology as an illustration for the biomedical informatics community.

  18. Large scale simulations of lattice QCD thermodynamics on Columbia Parallel Supercomputers

    International Nuclear Information System (INIS)

    Ohta, Shigemi

    1989-01-01

    The Columbia Parallel Supercomputer project aims at the construction of a parallel-processing, multi-gigaflop computer optimized for numerical simulations of lattice QCD. The project has three stages: a 16-node, 1/4 GF machine completed in April 1985; a 64-node, 1 GF machine completed in August 1987; and a 256-node, 16 GF machine now under construction. The machines all share a common architecture: a two-dimensional torus formed from a rectangular array of N1 × N2 independent and identical processors. A processor is capable of operating in a multi-instruction multi-data mode, except for periods of synchronous interprocessor communication with its four nearest neighbors. Here the thermodynamics simulations on the two working machines are reported. (orig./HSI)
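
    The nearest-neighbor addressing on such a two-dimensional torus is easy to sketch: each processor (i, j) in an N1 × N2 array communicates with four neighbors, with indices wrapping at the edges. The snippet below is a generic illustration of the topology, not of the Columbia machine's actual communication hardware.

    ```python
    # Nearest-neighbor addressing on a 2D torus of N1 x N2 processors
    # (illustrative sketch of the topology described above).

    def torus_neighbors(i, j, n1, n2):
        """Return the four nearest neighbors of processor (i, j) on an n1 x n2 torus."""
        return {
            ((i - 1) % n1, j),  # neighbor above (wraps at the edge)
            ((i + 1) % n1, j),  # neighbor below
            (i, (j - 1) % n2),  # neighbor to the left
            (i, (j + 1) % n2),  # neighbor to the right
        }

    corner = torus_neighbors(0, 0, 16, 16)
    ```

    The wraparound is what makes a corner processor, like any other, have exactly four neighbors, so synchronous nearest-neighbor exchanges are uniform across the array.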

  19. Nuclear Energy Research in Europe

    International Nuclear Information System (INIS)

    Schenkel, Roland; Haas, Didier

    2008-01-01

    The energy situation in Europe is mainly characterized by a growth in consumption, together with increasing import dependence in all energy resources. Assuring security of energy supply is a major goal at European Union level, and this can best be achieved by an adequate energy mix, including nuclear energy, producing now 32 % of our electricity. An increase of this proportion would not only improve our independence, but also reduce greenhouse gases emissions in Europe. Another major incentive in favor of nuclear is its competitiveness, as compared to other energy sources, and above all the low dependence of the electricity price on variation of the price of the raw material. The European Commission has launched a series of initiatives aiming at better coordinating energy policies and research. Particular emphasis in future European research will be given on the long-term sustainability of nuclear energy through the development of fast reactors, and to potential industrial heat applications. (authors)

  20. Base Program on Energy Related Research

    Energy Technology Data Exchange (ETDEWEB)

    Western Research Institute

    2008-06-30

    The main objective of the Base Research Program was to conduct both fundamental and applied research that will assist industry in developing, deploying, and commercializing efficient, nonpolluting fossil energy technologies that can compete effectively in meeting the energy requirements of the Nation. In that regard, tasks proposed under the WRI research areas were aligned with DOE objectives of secure and reliable energy; clean power generation; development of hydrogen resources; energy efficiency; and development of innovative fuels from low- and no-cost sources. The goal of the Base Research Program was to develop innovative technology solutions that will: (1) Increase the production of United States energy resources--coal, natural gas, oil, and renewable energy resources; (2) Enhance the competitiveness of United States energy technologies in international markets and assist in technology transfer; (3) Reduce the nation's dependence on foreign energy supplies and strengthen both the United States and regional economies; and (4) Minimize environmental impacts of energy production and utilization. This report summarizes the accomplishments of the overall Base Program. This document represents a stand-alone Final Report for the entire Program. It should be noted that an interim report describing the Program achievements was prepared in 2003, covering the progress made under various tasks completed during the first five years of this Program.

  1. Energy research program 99. Program for expansion of the Danish energy research and development in the period 1999-2001

    International Nuclear Information System (INIS)

    2000-08-01

    The present 'Energy research program 99' contains descriptions of projects under the Energy Research Programme (EFP) supported by the Danish Energy Agency. The research programme covers the areas of fuel oils and natural gas, biomass, production and distribution of electric power and heating, wind energy, energy consumption in buildings, solar energy, energy conservation, fuel cells, superconductors, industrial processes, and international co-operation. The manuscript is based on print-outs of the Danish input from the database Nordic Energy Index (NEI). The descriptions give project titles; summary descriptions of aims, methods etc.; names, addresses, telephone and telefax numbers of the institutions responsible for the projects; names of project leaders and of other involved firms, institutes or institutions; and details of the total budget and the financing of the energy research projects. (EHS)

  2. Nuclear energy related research

    International Nuclear Information System (INIS)

    Salminen, P.; Mattila, L.

    1990-08-01

    The annual Research Programme Plan describes the publicly funded nuclear energy related research to be carried out at the Technical Research Centre of Finland (VTT) in 1990. The research is financed primarily by the Ministry of Trade and Industry (KTM), the Finnish Centre for Radiation and Nuclear Safety (STUK) and VTT itself. Utilities and industry also contribute to some projects

  3. Molecularly Engineered Energy Materials, an Energy Frontier Research Center

    Energy Technology Data Exchange (ETDEWEB)

    Ozolins, Vidvuds [Univ. of California, Los Angeles, CA (United States). Materials Science and Engineering Dept.

    2016-09-28

    Molecularly Engineered Energy Materials (MEEM) was established as an interdisciplinary cutting-edge UCLA-based research center uniquely equipped to attack the challenge of rationally designing, synthesizing and testing revolutionary new energy materials. Our mission was to achieve transformational improvements in the performance of materials via controlling the nano- and mesoscale structure using selectively designed, earth-abundant, inexpensive molecular building blocks. MEEM has focused on materials that are inherently abundant, can be easily assembled from intelligently designed building blocks (molecules, nanoparticles), and have the potential to deliver transformative economic benefits in comparison with current crystalline- and polycrystalline-based energy technologies. MEEM addressed basic science issues related to the fundamental mechanisms of carrier generation, energy conversion, and transport and storage of charge and mass in tunable, architectonically complex materials. Fundamental understanding of these processes will enable rational design, efficient synthesis and effective deployment of novel three-dimensional material architectures for energy applications. Three interrelated research directions were initially identified where these novel architectures hold great promise for high-reward research: solar energy generation, electrochemical energy storage, and materials for CO2 capture. Of these, the first two remained throughout the project performance period, while carbon capture was phased out in consultation with, and with approval from, the BES program manager.

  4. Programs of the Office of Energy Research

    International Nuclear Information System (INIS)

    1986-04-01

    The programs of the Office of Energy Research, DOE, include several thousand individual projects and hundreds of laboratories, universities, and other research facilities throughout the United States. The major programs and activities are described briefly, and include high energy and nuclear physics, fusion energy, basic energy sciences, and health and environmental research, as well as advisory, assessment, support, and scientific computing activities

  5. R&D 100 Awards Demonstrate Clean Energy Legacy - Continuum Magazine |

    Science.gov (United States)

    Intel to develop an innovative warm-water, liquid-cooled supercomputer that later won an R&D 100 Award. Photo by Dennis Schroeder, NREL R&D 100 Awards Demonstrate Clean Energy Legacy NREL has won 57 R&D 100 Awards since 1982, many of which led directly to industry successes today. R&D 100

  6. Energy consumption optimization of the total-FETI solver by changing the CPU frequency

    Science.gov (United States)

    Horak, David; Riha, Lubomir; Sojka, Radim; Kruzik, Jakub; Beseda, Martin; Cermak, Martin; Schuchart, Joseph

    2017-07-01

    The energy consumption of supercomputers is one of the critical problems for the upcoming Exascale supercomputing era. Awareness of power and energy consumption is required on both the software and the hardware side. This paper deals with the energy consumption evaluation of the Finite Element Tearing and Interconnect (FETI) based solvers of linear systems, an established method for solving real-world engineering problems. We have evaluated the effect of the CPU frequency on the energy consumption of the FETI solver using a linear elasticity 3D cube synthetic benchmark. In this problem, we have evaluated the effect of frequency tuning on the energy consumption of the essential processing kernels of the FETI method. The paper provides results for two types of frequency tuning: (1) static tuning and (2) dynamic tuning. For static tuning experiments, the frequency is set before execution and kept constant during the runtime. For dynamic tuning, the frequency is changed during the program execution to adapt the system to the actual needs of the application. The paper shows that static tuning brings up to 12% energy savings when compared to the default CPU setting (the highest clock rate). Dynamic tuning improves this further by up to 3%.
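
    Why frequency tuning saves energy can be seen from a toy model: energy is power times runtime, dynamic power grows roughly with the cube of the clock rate, and the runtime of memory-bound kernels barely improves with frequency. The model and all constants below are our own illustration, not measurements from the paper; it simply shows that the energy-optimal frequency depends on whether a kernel is memory- or compute-bound, which is what motivates dynamic tuning.

    ```python
    # Toy energy model for CPU frequency (DVFS) tuning. Illustrative constants
    # only; not measurements from the FETI study above.

    def energy(f, work=1.0, mem_bound=0.8, p_static=100.0, c=2.0):
        """Energy (arbitrary units) to finish `work` at clock frequency f (GHz).

        A fraction `mem_bound` of the runtime is memory-bound and does not
        scale with f; the remainder is compute-bound and scales as 1/f.
        Power is modeled as static power plus dynamic power ~ f^3.
        """
        runtime = work * (mem_bound + (1.0 - mem_bound) / f)
        power = p_static + c * f**3
        return power * runtime

    e_high = energy(3.0)  # default: highest clock rate, memory-bound kernel
    e_low = energy(2.0)   # statically reduced frequency, same kernel
    ```

    With these assumed constants, the memory-bound kernel uses less energy at the lower frequency, while a purely compute-bound kernel (`mem_bound=0.0`) finishes cheaper at the high clock rate, illustrating why adapting the frequency to each kernel can beat any single static setting.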

  7. Energy research and energy technologies. Fossil energy sources. Annual report 1994

    International Nuclear Information System (INIS)

    1995-01-01

    After an introduction to the research programme and an overview of the sponsored projects, the main part of the book describes the projects in the research area of fossil energy sources. Several indexes provide access to this comprehensive compilation: a project number index, an index of interconnected projects, and an index of companies. The organisation plan of BEO, the project group for biology, energy and ecology, is appended. (UA)

  8. Programs of the Office of Energy Research

    International Nuclear Information System (INIS)

    1985-07-01

    The purpose of this research has been to support the energy technology development programs by providing insight into fundamental science and associated phenomena and developing new or advanced concepts and techniques. Today, this responsibility rests with the Office of Energy Research (ER), DOE, whose present programs have their origins in pioneering energy-related research which was initiated nearly 40 years ago. The Director, Office of Energy Research, also acts as the chief scientist and scientific advisor to the Secretary of Energy for the entire spectrum of energy research and development (R and D) programs of the Department. ER programs include several thousand individual projects and hundreds of laboratories, universities, and other research facilities throughout the United States. The current organization of ER is shown. The budgets for the various ER programs for the last two fiscal years are shown. In the following pages, each of these programs and activities are described briefly for the information of the scientific community and the public at large

  9. Fast and Accurate Simulation of the Cray XMT Multithreaded Supercomputer

    Energy Technology Data Exchange (ETDEWEB)

    Villa, Oreste; Tumeo, Antonino; Secchi, Simone; Manzano Franco, Joseph B.

    2012-12-31

    Irregular applications, such as data mining and analysis or graph-based computations, show unpredictable memory/network access patterns and control structures. Highly multithreaded architectures with large processor counts, like the Cray MTA-1, MTA-2 and XMT, appear to address their requirements better than commodity clusters. However, research on highly multithreaded systems is currently limited by the lack of adequate architectural simulation infrastructures, due to issues such as the size of the machines, memory footprint, simulation speed, accuracy and customization. At the same time, shared-memory multiprocessors (SMPs) with multi-core processors have become an attractive platform for simulating large-scale machines. In this paper, we introduce a cycle-level simulator of the highly multithreaded Cray XMT supercomputer. The simulator runs unmodified XMT applications. We discuss how we tackled the challenges posed by its development, detailing the techniques introduced to make the simulation as fast as possible while maintaining high accuracy. By mapping XMT processors (ThreadStorm with 128 hardware threads) to host computing cores, the simulation speed remains constant as the number of simulated processors increases, up to the number of available host cores. The simulator supports zero-overhead switching among different accuracy levels at run-time and includes a network model that takes contention into account. On a modern 48-core SMP host, our infrastructure simulates a large set of irregular applications 500 to 2000 times slower than real time when compared to a 128-processor XMT, while remaining within 10% accuracy. Emulation is only 25 to 200 times slower than real time.

  10. Nuclear energy research in Germany 2009

    International Nuclear Information System (INIS)

    2010-01-01

    Research and development (R and D) in the fields of nuclear reactor safety and safety of nuclear waste and spent fuel management in Germany are carried out at research centers and, in addition, some 32 universities. In addition, industrial research is conducted by plant vendors, and research in plant and operational safety of power plants in operation is organized by operators and by organizations of technical and scientific research and expert consultant organizations. This summary report presents nuclear energy research conducted at research centers and universities in Germany in 2009, including examples of research projects and descriptions of the situation of research and teaching. These are the organizations covered: - Hermann von Helmholtz Association of German Research Centers, - Karlsruhe Institute of Technology (KIT, responsibility of the former Karlsruhe Research Center), - Juelich Research Center (FZJ), - Nuclear Technology Competence Center East, - Dresden-Rossendorf Research Center (FZD), - Rossendorf Nuclear Process Technology and Analysis Association (VKTA), - Dresden Technical University, - Zittau/Goerlitz University of Applied Science, - Institute of Nuclear Energy and Energy Systems (IKE) of the University of Stuttgart. (orig.)

  11. Leveraging HPC resources for High Energy Physics

    International Nuclear Information System (INIS)

    O'Brien, B; Washbrook, A; Walker, R

    2014-01-01

    High Performance Computing (HPC) supercomputers provide unprecedented computing power for a diverse range of scientific applications. The most powerful supercomputers now deliver petaflop peak performance with the expectation of 'exascale' technologies available in the next five years. More recent HPC facilities use x86-based architectures managed by Linux-based operating systems which could potentially allow unmodified HEP software to be run on supercomputers. There is now a renewed interest from both the LHC experiments and the HPC community to accommodate data analysis and event simulation production on HPC facilities. This study provides an outline of the challenges faced when incorporating HPC resources for HEP software by using the HECToR supercomputer as a demonstrator.

  12. NREL Research Earns Two Prestigious R&D 100 Awards | News | NREL

    Science.gov (United States)

    and development not only create jobs in America but help advance the goal of a clean energy future and Steve Johnston. High Performance Supercomputing Platform Uses Warm Water to Prevent Heat Build-up initiative were NREL's Steve Hammond and Nicolas Dube of HP. "Oscars" of Innovation Winners of the

  13. Programs of the Office of Energy Research

    International Nuclear Information System (INIS)

    1984-04-01

    An overview is given for the DOE research programs in high energy and nuclear physics; fusion energy; basic energy sciences; health and environmental research; and advisory, assessment and support activities

  14. De Novo Ultrascale Atomistic Simulations On High-End Parallel Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Nakano, A; Kalia, R K; Nomura, K; Sharma, A; Vashishta, P; Shimojo, F; van Duin, A; Goddard, III, W A; Biswas, R; Srivastava, D; Yang, L H

    2006-09-04

    We present a de novo hierarchical simulation framework for first-principles based predictive simulations of materials and their validation on high-end parallel supercomputers and geographically distributed clusters. In this framework, high-end chemically reactive and non-reactive molecular dynamics (MD) simulations explore a wide solution space to discover microscopic mechanisms that govern macroscopic material properties, into which highly accurate quantum mechanical (QM) simulations are embedded to validate the discovered mechanisms and quantify the uncertainty of the solution. The framework includes an embedded divide-and-conquer (EDC) algorithmic framework for the design of linear-scaling simulation algorithms with minimal bandwidth complexity and tight error control. The EDC framework also enables adaptive hierarchical simulation with automated model transitioning assisted by graph-based event tracking. A tunable hierarchical cellular decomposition parallelization framework then maps the O(N) EDC algorithms onto Petaflops computers, while achieving performance tunability through a hierarchy of parameterized cell data/computation structures, as well as its implementation using hybrid Grid remote procedure call + message passing + threads programming. High-end computing platforms such as IBM BlueGene/L, SGI Altix 3000 and the NSF TeraGrid provide excellent test grounds for the framework. On these platforms, we have achieved unprecedented scales of quantum-mechanically accurate and well validated, chemically reactive atomistic simulations--1.06 billion-atom fast reactive force-field MD and 11.8 million-atom (1.04 trillion grid points) quantum-mechanical MD in the framework of the EDC density functional theory on adaptive multigrids--in addition to 134 billion-atom non-reactive space-time multiresolution MD, with parallel efficiency as high as 0.998 on 65,536 dual-processor BlueGene/L nodes. We have also achieved an automated execution of hierarchical QM
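
    The cellular decomposition that underlies O(N) molecular dynamics can be illustrated with a minimal linked-cell neighbor search: particles are binned into cells at least one cutoff radius wide, so every interacting pair lies in the same or an adjacent cell. This is a generic textbook sketch in the spirit of the framework above, not the EDC algorithms themselves; the box geometry and names are invented for the example.

    ```python
    import itertools

    # Minimal linked-cell (cell decomposition) neighbor search, the idea behind
    # O(N) short-range MD. Generic illustration; not the EDC framework's code.

    def neighbor_pairs(points, cutoff, box):
        """All index pairs (i, j), i < j, closer than `cutoff` in a cubic box.

        Points are binned into cells of edge >= cutoff, so any pair within the
        cutoff lies in the same or an adjacent cell (27-cell stencil).
        """
        ncell = max(1, int(box / cutoff))  # guarantees cell edge >= cutoff
        size = box / ncell
        cells = {}
        for idx, p in enumerate(points):
            key = tuple(min(int(c / size), ncell - 1) for c in p)
            cells.setdefault(key, []).append(idx)
        pairs = set()
        for key, members in cells.items():
            for offset in itertools.product((-1, 0, 1), repeat=3):
                nkey = tuple(k + o for k, o in zip(key, offset))
                for i in members:
                    for j in cells.get(nkey, []):
                        if i < j:
                            d2 = sum((a - b) ** 2 for a, b in zip(points[i], points[j]))
                            if d2 <= cutoff ** 2:
                                pairs.add((i, j))
        return pairs
    ```

    Because each particle only examines the 27 surrounding cells instead of all other particles, the cost scales linearly with particle count at fixed density, which is what makes billion-atom runs feasible at all.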

  15. A proposed programme for energy risk research

    International Nuclear Information System (INIS)

    1979-01-01

    The report consists of two parts. Part I presents an overview of technological risk management, noting major contributions and current research needs. Part II details a proposed program of energy research, including discussions of some seven recommended projects. The proposed energy risk research program addresses two basic problem areas: improving the management of energy risks and energy risk communication and public response. Specific recommended projects are given for each. (Auth.)

  16. Programs of the Office of Energy Research: Revision

    International Nuclear Information System (INIS)

    1987-06-01

    In establishing each of the Federal Agencies that have been successively responsible for energy technologies and their development - the Atomic Energy Commission, the Energy Research and Development Administration, and, currently, the US Department of Energy (DOE) - Congress made specific provisions for the conduct of advanced and fundamental research. The purpose of this research has been to support the energy technology development programs by providing insight into fundamental science and associated phenomena and developing new or advanced concepts and techniques. Today, this responsibility rests with the Office of Energy Research (ER), DOE, whose present programs have their origins in pioneering energy-related research of this nature, which was initiated nearly 40 years ago. The Director, Office of Energy Research, also acts as the chief scientist and scientific advisor to the Secretary of Energy for the entire spectrum of energy research and development (R and D) programs of the Department. ER programs include several thousand individual projects and hundreds of laboratories, universities, and other research facilities throughout the United States. In the following pages, each of these programs and activities are described briefly for the information of the scientific community and the public at large. 5 figs., 6 tabs

  17. Energy Systems Modelling Research and Analysis

    DEFF Research Database (Denmark)

    Møller Andersen, Frits; Alberg Østergaard, Poul

    2015-01-01

    This editorial introduces the seventh volume of the International Journal of Sustainable Energy Planning and Management. The volume presents part of the outcome of the project Energy Systems Modelling Research and Analysis (ENSYMORA), funded by the Danish Innovation Fund. The project, carried out by 11 university and industry partners, has improved the basis for decision-making within energy planning and energy scenario making by providing new and improved tools and methods for energy systems analyses.

  18. An Optimized Parallel FDTD Topology for Challenging Electromagnetic Simulations on Supercomputers

    Directory of Open Access Journals (Sweden)

    Shugang Jiang

    2015-01-01

    It may not be a challenge to run a Finite-Difference Time-Domain (FDTD) code for electromagnetic simulations on a supercomputer with more than ten thousand CPU cores; the challenge is to make the FDTD code run with the highest efficiency. In this paper, the performance of parallel FDTD is optimized through MPI (message passing interface) virtual topology, on the basis of which a communication model is established. General rules for the optimal topology are derived from the model. The performance of the method is tested and analyzed on three high-performance computing platforms with different architectures in China. Simulations of an airplane with a 700-wavelength wingspan and of a complex microstrip antenna array with nearly 2000 elements are performed very efficiently using a maximum of 10,240 CPU cores.
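    The optimal-topology idea in this record, choosing the MPI process grid that minimizes halo-exchange traffic, can be sketched with a toy cost model. This is a simplified stand-in for the paper's communication model, not its actual formulation; the function names and the uniform-block assumption are ours:

    ```python
    from itertools import product

    def process_grids(nprocs):
        """Enumerate all 3D factorizations (px, py, pz) of nprocs."""
        grids = []
        for px, py in product(range(1, nprocs + 1), repeat=2):
            if nprocs % (px * py) == 0:
                grids.append((px, py, nprocs // (px * py)))
        return grids

    def halo_volume(grid, cells):
        """Per-process halo-exchange volume (cells sent per time step) for a
        uniform block decomposition of a cells=(nx, ny, nz) FDTD domain."""
        (px, py, pz), (nx, ny, nz) = grid, cells
        lx, ly, lz = nx // px, ny // py, nz // pz
        vol = 0
        if px > 1: vol += 2 * ly * lz   # two faces exchanged along x
        if py > 1: vol += 2 * lx * lz
        if pz > 1: vol += 2 * lx * ly
        return vol

    def best_grid(nprocs, cells):
        """Pick the factorization that minimizes halo traffic."""
        return min(process_grids(nprocs), key=lambda g: halo_volume(g, cells))

    # A cubic domain favors a balanced, cube-like process grid:
    print(best_grid(64, (512, 512, 512)))   # (4, 4, 4)
    ```

    The same surface-to-volume argument is why communication models of this kind generally steer a cubic domain toward a balanced process grid rather than a 1D "slab" decomposition.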

  19. Energy research in the public sector

    International Nuclear Information System (INIS)

    Gfeller, J.

    1980-01-01

    The objectives of state-sponsored energy research in Switzerland are stated to include specialist training in co-operation with the technical universities, long-term energy technology, and international liaison. Tables are presented which indicate the trends in sources of research funding and the division between various technological areas, including energy conservation (10%), solar energy (10%), bioenergy, geothermal energy and wind power (4.5%), atomic energy (40%), nuclear fusion (20%), electricity (6%), and environmental studies (7%). These ratios are compared with those of other developed countries, and it is concluded that the aim must be to approach the 'post-oil era' smoothly. (Auth.)

  20. Research on Utilization of Geo-Energy

    Science.gov (United States)

    Bock, Michaela; Scheck-Wenderoth, Magdalena; GeoEn Working Group

    2013-04-01

    The world's energy demand increases year by year, and we have to search for alternative energy resources. New concepts for energy production from geo-resources have to be provided and developed. The joint project GeoEn combines research on four core themes: geothermal energy, shale gas, CO2 capture and CO2 storage. Sustainable energy production from deep geothermal resources is addressed, including all processes related to geothermal technologies, from reservoir exploitation to energy conversion in the power plant. The research on the unconventional natural gas resource, shale gas, is focussed on the sedimentological, diagenetic and compositional characteristics of gas shales. Technologies and solutions for reducing emissions of the greenhouse gas carbon dioxide are developed in the research fields of CO2 capture technologies, utilization, transport, and CO2 storage. These four core themes are studied with an integrated approach using the synergy of cross-cutting methodologies. New exploration and reservoir technologies and innovative monitoring methods, e.g. CSMT (controlled-source magnetotellurics), are examined and developed. All disciplines are complemented by numerical simulations of the relevant processes. A particular strength of the project is the availability of large experimental infrastructures where the respective technologies are tested and monitored. These include the power plant Schwarze Pumpe, where the Oxyfuel process is being improved, the pilot CO2 storage site in Ketzin, and the geothermal research platform Groß Schönebeck, with two deep wells and an above-ground experimental plant for research on corrosion. In addition to fundamental research, the acceptance of new technologies, especially in the field of CCS, is examined. Another focus is the impact of shale gas production on the environment. A further important goal is the education of young scientists in the new field of "geo-energy" to address the skills shortage in this field.

  1. Rationale for energy research and development programme

    Energy Technology Data Exchange (ETDEWEB)

    1976-04-01

    This paper describes the rationale for the expenditure of government money on energy research and development. The Committee, organized in 1974, established the following order of project priorities: projects to determine current and future energy demand; projects concerned with the conservation and more efficient use of energy; projects concerned with the assessment of indigenous energy resources; projects concerned with the assessment of the human, financial, and organizational resources for energy production and use; and projects concerned with economic, technological, social, and environmental aspects of energy use and production over the next 15 years and beyond. Significant factors affecting the national energy economy, the strategy for energy research and development, and the results of committee activities are summarized. Energy scenario research is laid out. (MCW)

  2. Research and Energy Efficiency: Selected Success Stories

    Science.gov (United States)

    Garland, P. W.; Garland, R. W.

    1997-06-26

    Energy use and energy technology play critical roles in the U.S. economy and modern society. The Department of Energy (DOE) conducts civilian energy research and development (R&D) programs for the purpose of identifying promising technologies that promote energy security, energy efficiency, and renewable energy use. DOE-sponsored research ranges from basic investigation of phenomena all the way through development of applied technology in partnership with industry. DOE's research programs are conducted in support of national strategic energy objectives; however, austere financial times have dictated that R&D programs be measured in terms of cost vs. benefit. In some cases it is difficult to measure the return on investment for basic "curiosity-driven" research; however, many applied technology development programs have resulted in measurable commercial successes. The DOE has published summaries of its most successful applied technology energy R&D programs. In this paper, we will discuss five examples from the Building Technologies area of the DOE Energy Efficiency program. Each story will describe the technology, discuss the level of federal funding, and discuss the returns in terms of energy savings, cost savings, or national economic impacts.

  3. Development of a high performance eigensolver on the peta-scale next generation supercomputer system

    International Nuclear Information System (INIS)

    Imamura, Toshiyuki; Yamada, Susumu; Machida, Masahiko

    2010-01-01

    For present supercomputer systems, multicore and multisocket processors are necessary to build a system, and the choice of interconnect is essential. In addition, for effective development of a new code, high-performance, scalable, and reliable numerical software is one of the key items. ScaLAPACK and PETSc are well-known software packages for distributed-memory parallel computer systems. It is needless to say that highly tuned software targeting new architectures such as many-core processors must be chosen for real computation. In this study, we present a high-performance and highly scalable eigenvalue solver for the next-generation supercomputer system, the so-called 'K computer'. We have developed two versions, the standard version (eigen_s) and the enhanced-performance version (eigen_sx), on the T2K cluster system housed at the University of Tokyo. Eigen_s employs the conventional algorithms: Householder tridiagonalization, the divide and conquer (DC) algorithm, and Householder back-transformation. They are carefully implemented with a blocking technique and a flexible two-dimensional data distribution to reduce the overhead of memory traffic and data transfer, respectively. Eigen_s performs excellently on the T2K system with 4096 cores (theoretical peak 37.6 TFLOPS), achieving 3.0 TFLOPS on a matrix of dimension two hundred thousand. The enhanced version, eigen_sx, uses more advanced algorithms: the narrow-band reduction algorithm, DC for band matrices, and block Householder back-transformation with the WY representation. Even though this version is still at a test stage, it reaches 4.7 TFLOPS on a matrix of the same dimension, outperforming eigen_s. (author)
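    The three-stage pipeline this record describes (tridiagonalization, a tridiagonal eigensolve, back-transformation) can be illustrated with a small pure-NumPy sketch. This is a textbook unblocked Householder version for intuition only, not the blocked, distributed implementation of the record:

    ```python
    import numpy as np

    def householder_tridiagonalize(A):
        """Reduce symmetric A to tridiagonal T with T = Q^T A Q (stage 1)."""
        A = A.astype(float).copy()
        n = A.shape[0]
        Q = np.eye(n)
        for k in range(n - 2):
            x = A[k+1:, k]
            s = np.sign(x[0]) if x[0] != 0 else 1.0
            v = x.copy()
            v[0] += s * np.linalg.norm(x)      # reflector direction
            norm_v = np.linalg.norm(v)
            if norm_v == 0:
                continue
            v /= norm_v
            H = np.eye(n)
            H[k+1:, k+1:] -= 2.0 * np.outer(v, v)
            A = H @ A @ H                      # annihilate column k below subdiagonal
            Q = Q @ H
        return A, Q

    rng = np.random.default_rng(0)
    M = rng.standard_normal((6, 6))
    S = (M + M.T) / 2                          # symmetric test matrix

    T, Q = householder_tridiagonalize(S)       # stage 1: tridiagonalization
    w, V = np.linalg.eigh(T)                   # stage 2: tridiagonal eigensolve
    X = Q @ V                                  # stage 3: back-transformation

    # X holds eigenvectors of the original S:  S X = X diag(w)
    assert np.allclose(S @ X, X * w)
    ```

    The blocking and two-dimensional data distribution mentioned in the record change how these reflectors are applied (in aggregated panels, across processes), but not the three-stage structure shown here.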

  4. Watson will see you now: a supercomputer to help clinicians make informed treatment decisions.

    Science.gov (United States)

    Doyle-Lindrud, Susan

    2015-02-01

    IBM has collaborated with several cancer care providers to develop and train the IBM supercomputer Watson to help clinicians make informed treatment decisions. When a patient is seen in clinic, the oncologist can input all of the clinical information into the computer system. Watson will then review all of the data and recommend treatment options based on the latest evidence and guidelines. Once the oncologist makes the treatment decision, this information can be sent directly to the insurance company for approval. Watson has the ability to standardize care and accelerate the approval process, a benefit to the healthcare provider and the patient.

  5. Programs of the Office of Energy Research

    International Nuclear Information System (INIS)

    1990-01-01

    The Office of Energy Research sponsors long-term research in certain fundamental areas and in technical areas associated with energy resources, production, use, and the resulting health and environmental effects. This document describes these activities, including recent accomplishments and types of facilities, and notes some impacts on energy, science, and scientific manpower development. The document is intended to respond to the many requests from diverse communities --- such as government, education, and public and private research --- for a summary of the types of research sponsored by the Department of Energy's Office of Energy Research. This is important since the Office relies to a considerable extent on unsolicited proposals from capable university and industrial groups, self-motivated interested individuals, and organizations that may wish to use the Department's extensive facilities and resources. By describing our activities and facilities, we hope not only to inform, but also to encourage interest and participation

  6. Affordable and accurate large-scale hybrid-functional calculations on GPU-accelerated supercomputers

    Science.gov (United States)

    Ratcliff, Laura E.; Degomme, A.; Flores-Livas, José A.; Goedecker, Stefan; Genovese, Luigi

    2018-03-01

    Performing high accuracy hybrid functional calculations for condensed matter systems containing a large number of atoms is at present computationally very demanding or even out of reach if high quality basis sets are used. We present a highly optimized multiple graphics processing unit implementation of the exact exchange operator which allows one to perform fast hybrid functional density-functional theory (DFT) calculations with systematic basis sets without additional approximations for up to a thousand atoms. With this method hybrid DFT calculations of high quality become accessible on state-of-the-art supercomputers within a time-to-solution that is of the same order of magnitude as traditional semilocal-GGA functionals. The method is implemented in a portable open-source library.

  7. Energy in Ireland: context, management and research

    International Nuclear Information System (INIS)

    Saintherant, N.; Lerouge, Ch.; Welcker, A.

    2008-01-01

    In the framework of climate change and the fossil fuel shortage, Ireland has defined a new energy policy. The priority is security of energy supply, and the research programs show great interest in ocean energies, which represent an important resource for Ireland. The report presents the context, the Irish energy policy, the energy research programs, and the different actors in the domain. (A.L.B.)

  8. The Evolution of Research and Education Networks and their Essential Role in Modern Science

    Energy Technology Data Exchange (ETDEWEB)

    Johnston, W.; Chaniotakis, E.; Dart, E.; Guok, C.; Metzger, J.; Tierney, B.

    2009-06-15

    ESnet - the Energy Sciences Network - has the mission of enabling the aspects of the US Department of Energy's Office of Science programs and facilities that depend on large collaborations and large-scale data sharing to accomplish their science. The Office of Science supports a large fraction of all U.S. physical science research and operates many large science instruments and supercomputers that are used by both DOE and University researchers. The network requirements of this community have been explored in some detail by ESnet and a long-term plan has been developed in order to ensure adequate networking to support the science. In this paper we describe the planning process (which has been in place for several years and was the basis of a new network that is just now being completed and a new set of network services) and examine the effectiveness and adequacy of the planning process in the light of evolving science requirements.

  9. Solar energy storage researchers information user study

    Energy Technology Data Exchange (ETDEWEB)

    Belew, W.W.; Wood, B.L.; Marle, T.L.; Reinhardt, C.L.

    1981-03-01

    The results of a series of telephone interviews with groups of users of information on solar energy storage are described. In the current study only high-priority groups were examined. Results from 2 groups of researchers are analyzed: DOE-Funded Researchers and Non-DOE-Funded Researchers. The data will be used as input to the determination of information products and services the Solar Energy Research Institute, the Solar Energy Information Data Bank Network, and the entire information outreach community should be preparing and disseminating.

  10. Earth and environmental science in the 1980's: Part 1: Environmental data systems, supercomputer facilities and networks

    Science.gov (United States)

    1986-01-01

    Overview descriptions of on-line environmental data systems, supercomputer facilities, and networks are presented. Each description addresses the concepts of content, capability, and user access relevant to the point of view of potential utilization by the Earth and environmental science community. The information on similar systems or facilities is presented in parallel fashion to encourage and facilitate intercomparison. In addition, summary sheets are given for each description, and a summary table precedes each section.

  11. Energy transitions research: Insights and cautionary tales

    International Nuclear Information System (INIS)

    Grubler, Arnulf

    2012-01-01

    This short essay first reviews the pioneers of energy transition research, in terms of both data and theories. Three major insights that have emerged from this nascent research field are summarized, highlighting the importance of energy end-use and services, the lengthy process of transitions, and the patterns that characterize the successful scale-up of the technologies and industries that drive historical energy transitions. The essay concludes with cautionary notes, also derived from historical experience. In order to trigger the next energy transition, policies and innovation efforts need to be persistent and continuous, aligned, and balanced. It is argued that current policy frameworks invariably do not meet these criteria and need to change in order to successfully trigger a next energy transition towards sustainability. - Highlights: ► Includes the first literature review of early energy transition research. ► Summarizes three major research findings from the literature. ► Reviews policy implications of recent case studies of energy technology innovation. ► Argues that current policy frameworks are deficient in view of the above lessons.

  12. Advanced energy projects FY 1994 research summaries

    International Nuclear Information System (INIS)

    1994-09-01

    The Division of Advanced Energy Projects (AEP) provides support to explore the feasibility of novel, energy-related concepts that evolve from advances in basic research. These concepts are typically at an early stage of scientific definition and, therefore, are premature for consideration by applied research or technology development programs. The AEP also supports high-risk, exploratory concepts that do not readily fit into a program area but could have several applications that may span scientific disciplines or technical areas. Projects supported by the Division arise from unsolicited ideas and concepts submitted by researchers. The portfolio of projects is dynamic and reflects the broad role of the Department in supporting research and development for improving the Nation's energy outlook. FY 1994 projects include the following topical areas: novel materials for energy technology; renewable and biodegradable materials; exploring uses of new scientific discoveries; alternate pathways to energy efficiency; alternative energy sources; and innovative approaches to waste treatment and reduction. Summaries are given for 66 projects

  13. Future of nuclear energy research

    International Nuclear Information System (INIS)

    Fuketa, Toyojiro

    1989-09-01

    In spite of the easing of the worldwide energy supply and demand situation in recent years, we believe that research efforts towards the next generation of nuclear energy are indispensable. Firstly, the nuclear community believes that nuclear energy is the best major energy source from many points of view, including the global environmental viewpoint. Secondly, in the medium- and long-range view, there will once again be a high possibility of a tight supply and demand situation for oil. Thirdly, nuclear energy is the key energy source for overcoming the vulnerability of the energy supply structure in industrialized countries like Japan, where virtually no fossil energy source exists; in this situation, nuclear energy is a sort of quasi-domestic energy as a technology-intensive energy. Fourthly, the intensive efforts to develop next-generation nuclear technology will give rise to a further evolution of science and technology in the future. A few examples of medium- and long-range goals of nuclear energy research are the development of new types of reactors which can meet various energy needs more flexibly and reliably than existing reactors, a fundamental and ultimate solution of the radioactive waste problems, the creation and development of new types of energy production systems beyond fusion, new developments in the biological risk assessment of radiation effects, and so on. In order to accomplish these goals it is quite important to introduce innovations in such underlying technologies as materials control in more microscopic manners, photon and particle beam techniques, accelerator engineering, artificial intelligence, and so on. 32 refs, 2 figs

  14. Research progress about chemical energy storage of solar energy

    Science.gov (United States)

    Wu, Haifeng; Xie, Gengxin; Jie, Zheng; Hui, Xiong; Yang, Duan; Du, Chaojun

    2018-01-01

    In recent years, the application of solar energy has shown obvious advantages. Solar energy is intermittent and inhomogeneous, so energy storage technology becomes the key to the popularization and utilization of solar energy. Chemical storage is the most efficient way to store and transport solar energy. In the first and second sections of this paper, we discuss the solar energy collector/reactor and solar energy storage via hydrogen production, respectively. The third section describes the basic application of solar energy storage systems and proposes an association system combining solar energy storage and power equipment. The fourth section briefly describes several research directions which need to be strengthened.

  15. Nuclear energy research until 2000

    International Nuclear Information System (INIS)

    Reiman, L.; Rintamaa, R.; Vanttola, T.

    1994-03-01

    The working group was to assess the need for and orientation of nuclear energy research (apart from research on nuclear waste management and fusion technology) up until the year 2000 in Finland and to propose framework schemes and organization guidelines for any forthcoming publicly financed research programmes from 1995 onwards. The main purpose of nuclear energy research is to ensure the safety and continued development of Finland's existing nuclear power plants. Factors necessarily influencing the orientation of research are Parliament's decision of late 1993 against further nuclear capacity in the country, the need to assess reactor safety in the eastern neighbour regions, and Finland's potential membership in the European Union. The working group proposes two new research programmes similar to the current ones but with slightly modified emphasis. Dedicated to reactor safety and structural safety respectively, they would both cover the four years from 1995 to 1998. A separate research project is proposed for automation technology. In addition, environmental research projects should have a joint coordination unit. (9 figs., 4 tabs.)

  16. Ocean energy researchers information user study

    Energy Technology Data Exchange (ETDEWEB)

    Belew, W.W.; Wood, B.L.; Marle, T.L.; Reinhardt, C.L.

    1981-03-01

    This report describes the results of a series of telephone interviews with groups of users of information on ocean energy systems. These results, part of a larger study on many different solar technologies, identify types of information each group needed and the best ways to get information to each group. The report is 1 of 10 discussing study results. The overall study provides baseline data about information needs in the solar community. Only high-priority groups were examined. Results from 2 groups of researchers are analyzed in this report: DOE-Funded Researchers and Non-DOE-Funded Researchers. The data will be used as input to the determination of information products and services the Solar Energy Research Institute, the Solar Energy Information Data Bank Network, and the entire information outreach community should be preparing and disseminating.

  17. Public Engagement in Energy Research

    NARCIS (Netherlands)

    Jellema, Jako; Mulder, Henk A. J.

    Public Engagement in Research is a key element in "Responsible Research and Innovation"; a cross-cutting issue in current European research funding. Public engagement can advance energy R&D, by delivering results that are more in-line with society's views and demands; and collaboration also unlocks

  18. Research for energy efficiency; Forschung fuer Energieeffizienz

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-09-15

    The Federal Ministry of Economy enhanced its funding for research in the field of non-nuclear energy in the programme ''Forschung fuer Energieeffizienz'' (Research for Energy Efficiency). The programme focuses on established areas like modern power plant technologies (''Moderne Kraftwerkstechnologien''), fuel cells and hydrogen (''Brennstoffzelle, Wasserstoff''), and energy-optimized building construction (''Energieoptimiertes Bauen''). New subjects are energy-efficient towns and cities (''Energieeffiziente Stadt''), power grids for future power supply (''Netze fuer die Stromversorgung der Zukunft''), power storage (''Stromspeicher''), and electromobility (''Elektromobilitaet''). The brochure presents research and demonstration projects that illustrate the situation in 2010 when the programme was initiated. (orig.)

  19. Summaries of FY 1984 research in high energy physics

    International Nuclear Information System (INIS)

    1984-12-01

    The US Department of Energy, through the Office of Energy Research, Division of High Energy and Nuclear Physics, provides approximately 90 percent of the total federal support for the high energy physics research effort in the United States. The High Energy Physics Program primarily utilizes four major US high energy accelerator facilities and over 90 universities under contract to do experimental and theoretical investigations on the properties, structure, and transformation of matter and energy in their most basic forms. This compilation of research summaries is intended to present a convenient report of the scope and nature of high energy physics research presently funded by the US Department of Energy. The areas covered include: (1) conception, design, construction, and operation of particle accelerators; (2) experimental research using the accelerators and ancillary equipment; (3) theoretical research; and (4) research and development programs to advance accelerator technology, particle detector systems, and data analysis capabilities. Major concepts and experimental facts in high energy physics have recently been discovered which hold the promise of unifying the fundamental forces and of understanding the basic nature of matter and energy

  20. Efficient development of memory bounded geo-applications to scale on modern supercomputers

    Science.gov (United States)

    Räss, Ludovic; Omlin, Samuel; Licul, Aleksandar; Podladchikov, Yuri; Herman, Frédéric

    2016-04-01

    Numerical modeling is a key tool in the geosciences. The current challenge is to solve problems that are multi-physics and for which the length scale and the place of occurrence might not be known in advance. Also, the spatial extent of the investigated domain might vary strongly in size, ranging from millimeters for reactive transport to kilometers for glacier erosion dynamics. An efficient way to proceed is to develop simple but robust algorithms that perform well and scale on modern supercomputers and therefore permit very high-resolution simulations. We propose an efficient approach to solving memory-bounded real-world applications on modern supercomputer architectures. We optimize the software to run on our newly acquired state-of-the-art GPU cluster "octopus". Our approach shows promising preliminary results on important geodynamical and geomechanical problems: we have developed a Stokes solver for glacier flow and a poromechanical solver including complex rheologies for nonlinear waves in stressed porous rocks. We solve the system of partial differential equations on a regular Cartesian grid and use an iterative finite difference scheme with preconditioning of the residuals. The MPI communication happens only locally (point-to-point); this method is known to scale linearly by construction. The "octopus" GPU cluster, which we use for the computations, has been designed to achieve maximal data transfer throughput at minimal hardware cost. It is composed of twenty compute nodes, each hosting four Nvidia Titan X GPU accelerators. These high-density nodes are interconnected with a parallel (dual-rail) FDR InfiniBand network. Our efforts show promising preliminary results for the different physics investigated. The glacier flow solver achieves good accuracy in the relevant benchmarks, and the coupled poromechanical solver permits to explain previously unresolvable focused fluid flow as a natural outcome of the porosity setup. In both cases
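    The iterative finite-difference scheme with damped (preconditioned) residual updates that this record describes can be illustrated in 1D. The toy below is our own construction, with a Poisson problem standing in for the Stokes and poromechanical equations:

    ```python
    import numpy as np

    def pseudo_transient_poisson(f, n=128, damp=0.9, tol=1e-8, max_iter=100_000):
        """Solve u'' = f on (0, 1) with u(0) = u(1) = 0 by a damped
        pseudo-transient residual iteration on a regular grid."""
        dx = 1.0 / (n + 1)
        x = np.linspace(dx, 1.0 - dx, n)      # interior nodes
        u = np.zeros(n)
        dudt = np.zeros(n)
        dt = 0.4 * dx**2                      # stability-limited pseudo time step
        fx = f(x)
        for _ in range(max_iter):
            lap = np.roll(u, -1) - 2.0 * u + np.roll(u, 1)
            lap[0] = u[1] - 2.0 * u[0]        # zero Dirichlet boundary values
            lap[-1] = u[-2] - 2.0 * u[-1]
            r = lap / dx**2 - fx              # residual of u'' - f = 0
            dudt = damp * dudt + r            # damped residual update
            u += dt * dudt
            if np.max(np.abs(r)) < tol:
                break
        return x, u

    # Manufactured solution: u = sin(pi x) solves u'' = -pi^2 sin(pi x).
    x, u = pseudo_transient_poisson(lambda x: -np.pi**2 * np.sin(np.pi * x))
    assert np.max(np.abs(u - np.sin(np.pi * x))) < 2e-3   # O(dx^2) accuracy
    ```

    Each update touches only nearest-neighbor grid points, which is what makes the distributed version of such a scheme need only local, point-to-point communication and hence scale linearly by construction.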

  1. Parallel supercomputing: Advanced methods, algorithms, and software for large-scale linear and nonlinear problems

    Energy Technology Data Exchange (ETDEWEB)

    Carey, G.F.; Young, D.M.

    1993-12-31

    The program outlined here is directed to research on methods, algorithms, and software for distributed parallel supercomputers. Of particular interest are finite element methods and finite difference methods together with sparse iterative solution schemes for scientific and engineering computations of very large-scale systems. Both linear and nonlinear problems will be investigated. In the nonlinear case, applications with bifurcation to multiple solutions will be considered using continuation strategies. The parallelizable numerical methods of particular interest are a family of partitioning schemes embracing domain decomposition, element-by-element strategies, and multi-level techniques. The methods will be further developed incorporating parallel iterative solution algorithms with associated preconditioners in parallel computer software. The schemes will be implemented on distributed memory parallel architectures such as the CRAY MPP, Intel Paragon, the NCUBE3, and the Connection Machine. We will also consider other new architectures such as the Kendall-Square (KSQ) and proposed machines such as the TERA. The applications will focus on large-scale three-dimensional nonlinear flow and reservoir problems with strong convective transport contributions. These are legitimate grand challenge class computational fluid dynamics (CFD) problems of significant practical interest to DOE. The methods developed and algorithms will, however, be of wider interest.
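    A minimal serial illustration of one ingredient named above, a sparse iterative solver with an associated preconditioner, is Jacobi-preconditioned conjugate gradients. This is our own toy for a symmetric positive-definite system, not software from the program described:

    ```python
    import numpy as np

    def pcg(A, b, M_inv_diag, tol=1e-10, max_iter=500):
        """Jacobi-preconditioned conjugate gradients for SPD A."""
        x = np.zeros_like(b)
        r = b - A @ x
        z = M_inv_diag * r                # apply the diagonal preconditioner
        p = z.copy()
        rz = r @ z
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol:
                break
            z = M_inv_diag * r
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x

    # 1D Laplacian test system (SPD, tridiagonal)
    n = 50
    A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    b = np.ones(n)
    x = pcg(A, b, M_inv_diag=1.0 / np.diag(A))
    assert np.linalg.norm(A @ x - b) < 1e-8
    ```

    In a domain-decomposed or element-by-element setting, the matrix-vector product `A @ p` and the preconditioner application are the operations that get distributed across processors; the outer iteration is unchanged.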

  2. 2017 Publications Demonstrate Advancements in Wind Energy Research

    Energy Technology Data Exchange (ETDEWEB)

    2018-01-17

    In 2017, wind energy experts at the National Renewable Energy Laboratory (NREL) made significant strides to advance wind energy. Many of these achievements were presented in articles published in scientific and engineering journals and technical reports that detailed research accomplishments in new and progressing wind energy technologies. During fiscal year 2017, NREL wind energy thought leaders shared knowledge and insights through 45 journal articles and 25 technical reports, benefiting academic and national-lab research communities; industry stakeholders; and local, state, and federal decision makers. Such publications serve as important outreach, informing the public of how NREL wind research, analysis, and deployment activities complement advanced energy growth in the United States and around the world. The publications also illustrate some of the noteworthy outcomes of U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE) and Laboratory Directed Research and Development funding, as well as funding and facilities leveraged through strategic partnerships and other collaborations.

  3. The National Geothermal Energy Research Program

    Science.gov (United States)

    Green, R. J.

    1974-01-01

    The continuous demand for energy and the concern for shortages of conventional energy resources have spurred the nation to consider alternate energy resources, such as geothermal. Although significant growth in the one natural steam field located in the United States has occurred, a major effort is now needed if geothermal energy, in its several forms, is to contribute to the nation's energy supplies. From the early informal efforts of an Interagency Panel for Geothermal Energy Research, a 5-year Federal program has evolved whose objective is the rapid development of a commercial industry for the utilization of geothermal resources for electric power production and other products. The Federal program seeks to evaluate the realistic potential of geothermal energy, to support the necessary research and technology needed to demonstrate the economic and environmental feasibility of the several types of geothermal resources, and to address the legal and institutional problems concerned in the stimulation and regulation of this new industry.

  4. MEGADOCK 4.0: an ultra-high-performance protein-protein docking software for heterogeneous supercomputers.

    Science.gov (United States)

    Ohue, Masahito; Shimoda, Takehiro; Suzuki, Shuji; Matsuzaki, Yuri; Ishida, Takashi; Akiyama, Yutaka

    2014-11-15

    The application of protein-protein docking in large-scale interactome analysis is a major challenge in structural bioinformatics and requires huge computing resources. In this work, we present MEGADOCK 4.0, an FFT-based docking software package that makes extensive use of recent heterogeneous supercomputers and shows powerful, scalable performance of >97% strong scaling. MEGADOCK 4.0 is written in C++ with OpenMPI and NVIDIA CUDA 5.0 (or later) and is freely available to all academic and non-profit users at http://www.bi.cs.titech.ac.jp/megadock. Contact: akiyama@cs.titech.ac.jp. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
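    The core trick behind FFT-based rigid docking, scoring every translational placement of a ligand grid against a receptor grid via one correlation computed with FFTs (the Katchalski-Katzir approach that MEGADOCK's scoring builds on), can be sketched in a few lines of NumPy. The grids and shapes here are toy assumptions of ours:

    ```python
    import numpy as np

    def fft_translation_scores(receptor, ligand):
        """Score every translational placement of `ligand` against `receptor`
        by circular cross-correlation, computed in O(N log N) with FFTs.
        One rotation is shown; a full docking run repeats this over many
        ligand rotations."""
        R = np.fft.fftn(receptor)
        L = np.fft.fftn(ligand)
        # correlation theorem: corr = IFFT(conj(FFT(ligand)) * FFT(receptor))
        return np.real(np.fft.ifftn(np.conj(L) * R))

    # Toy demo on 3D grids: a probe blob scores best at the translation
    # that overlays it onto the matching receptor feature.
    grid = np.zeros((8, 8, 8)); grid[2:4, 2:4, 2:4] = 1.0    # "receptor" feature
    probe = np.zeros((8, 8, 8)); probe[0:2, 0:2, 0:2] = 1.0  # same shape at origin
    scores = fft_translation_scores(grid, probe)
    best = np.unravel_index(np.argmax(scores), scores.shape)
    print(best)   # (2, 2, 2): shifting the probe by +2 on each axis aligns it
    ```

    Replacing the naive O(N^2) sum over translations with this single FFT pair is what makes grid scans cheap enough that the rotations, and many protein pairs, become the parallel workload for a heterogeneous supercomputer.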

  5. The law for the Japan Atomic Energy Research Institute

    International Nuclear Information System (INIS)

    1985-01-01

    The Act for the Japan Atomic Energy Research Institute has been promulgated anew. Its contents are the following: general rules; officials, advisors and personnel; duties; financial affairs and accounts; supervision; miscellaneous rules; penal provisions; and additional rules. (In the additional rules, the merger of the Japan Nuclear Ship Research and Development Agency into JAERI is treated.) The Japan Atomic Energy Research Institute conducts research for the development of atomic energy comprehensively and efficiently, thereby contributing to the promotion of atomic energy research, development and utilization, in accordance with the Atomic Energy Fundamental Act. Its duties include basic and applied atomic energy research, reactor-related work, personnel training, radioisotope (RI) work, etc. (Mori, K.)

  6. Energy from Biomass Research and Technology Transfer Program

    Energy Technology Data Exchange (ETDEWEB)

    Schumacher, Dorin

    2015-12-31

    The purpose of CPBR is to foster and facilitate research that will lead to commercial applications. The goals of CPBR’s Energy from Biomass Research and Technology Transfer Program are to bring together industry, academe, and federal resources to conduct research in plant biotechnology and other bio-based technologies and to facilitate the commercialization of the research results to: (1) improve the utilization of plants as energy sources; (2) reduce the cost of renewable energy production; (3) facilitate the replacement of petroleum by plant-based materials; (4) create an energy supply that is safer in its effect on the environment, and (5) contribute to U.S. energy independence.

  7. NETL Super Computer

    Data.gov (United States)

    Federal Laboratory Consortium — The NETL Super Computer was designed for performing engineering calculations that apply to fossil energy research. It is one of the world’s larger supercomputers,...

  8. Jointly working on research for the energies of the future. Objectives of research 2013; Gemeinsam forschen fuer die Energie der Zukunft. Forschungsziele 2013

    Energy Technology Data Exchange (ETDEWEB)

    Szczepanski, Petra (comp.)

    2012-11-01

    The Renewable Energy Research Association (Berlin-Adlershof, Federal Republic of Germany) is a nationwide cooperation of research institutes. The members of this Research Association develop technologies for renewable energies and their system-oriented cooperation, as well as technologies for energy efficiency, energy storage and power distribution grids. The spectrum of research areas covers all renewable energy sources. These renewable energy sources complement each other quantitatively and temporally in an electrical-thermal-chemical energy mix which is optimized by system technology, efficiency and storage technologies.

  9. Swiss Federal Energy Research Commission - Annual report 2009

    International Nuclear Information System (INIS)

    Maus, K.

    2010-02-01

    This annual report for the Swiss Federal Office of Energy (SFOE) provides an overview of the work carried out by the Swiss Federal Energy Research Commission CORE in 2009. The commission's main work included preparatory work for the revised energy research concept for the period 2013 - 2016, a review of all research programmes operated by the SFOE, the enhancement of cooperation with public and private research and promotion institutions, the coordination and consultation of research institutions, and the improvement of international information exchange. The report summarises coordination work with the many CORE programmes and defines strategic main areas of interest for future work.

  10. Solar Energy Innovation Network | Solar Research | NREL

    Science.gov (United States)

    The Solar Energy Innovation Network is a collaborative research effort administered by NREL for the U.S. Department of Energy (DOE) Solar Energy Technologies Office to develop and demonstrate new ways for solar energy to improve the grid.

  11. University of Kentucky Center for Applied Energy Research

    Science.gov (United States)

    The University of Kentucky Center for Applied Energy Research (CAER) conducts research in areas including environmental remediation and power generation.

  12. Research Needs for Magnetic Fusion Energy Sciences

    Energy Technology Data Exchange (ETDEWEB)

    Neilson, Hutch

    2009-07-01

    Nuclear fusion — the process that powers the sun — offers an environmentally benign, intrinsically safe energy source with an abundant supply of low-cost fuel. It is the focus of an international research program, including the ITER fusion collaboration, which involves seven parties representing half the world’s population. The realization of fusion power would change the economics and ecology of energy production as profoundly as petroleum exploitation did two centuries ago. The 21st century finds fusion research in a transformed landscape. The worldwide fusion community broadly agrees that the science has advanced to the point where an aggressive action plan, aimed at the remaining barriers to practical fusion energy, is warranted. At the same time, and largely because of its scientific advance, the program faces new challenges; above all it is challenged to demonstrate the timeliness of its promised benefits. In response to this changed landscape, the Office of Fusion Energy Sciences (OFES) in the US Department of Energy commissioned a number of community-based studies of the key scientific and technical foci of magnetic fusion research. The Research Needs Workshop (ReNeW) for Magnetic Fusion Energy Sciences is a capstone to these studies. In the context of magnetic fusion energy, ReNeW surveyed the issues identified in previous studies, and used them as a starting point to define and characterize the research activities that the advance of fusion as a practical energy source will require. Thus, ReNeW’s task was to identify (1) the scientific and technological research frontiers of the fusion program, and, especially, (2) a set of activities that will most effectively advance those frontiers. (Note that ReNeW was not charged with developing a strategic plan or timeline for the implementation of fusion power.)

  13. Energy in Ireland: context, strategy and research

    International Nuclear Information System (INIS)

    Saintherant, N.; Lerouge, Ch.; Welcker, A.

    2008-01-01

    In the present-day context of sudden awareness of climate change and an announced fossil-fuel shortage, Ireland has defined a new strategy for its energy future. Context: Ireland is strongly dependent on oil and gas imports, which increase regularly to meet demand. A small part of the electricity consumed is imported from Ulster. The share of renewable energies remains weak but is increasing significantly: from 1990 to 2006, the proportion of renewable energies rose from 1.9% (mainly of hydroelectric origin) to 4.5%. Wind power is now the main renewable energy source. The transportation sector is the biggest energy consumer and the biggest source of greenhouse gases. Strategy: Irish policy is driven by multi-annual strategic plans which define objectives and means. Priority is given to security of supply at affordable prices: 8.5 billion euros will be invested during the 2007-2013 period in the modernization of existing energy infrastructures and companies and, to a lesser extent, in the development of renewable energy sources. During this period, a further 415 million euros will be devoted to the research, development and demonstration (RD and D) of new energy solutions. Research: in 2005 the energy RD and D expenses reached 12.8 million euros, shared between R and D (54%) and demonstration projects (46%). Half of the financing goes to higher-education institutions and is devoted to energy savings (33%) and to renewable energies (29%, mainly wind power and biomass). Academic research pays particular attention to ocean energy, which represents an important potential resource in Ireland and has already led to the creation of innovative companies. The integration of renewable energy sources into the power grid and the stability of supplies are also the subject of active research. (J.S.)

  14. Summaries of FY 1977, research in high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    1977-10-01

    The U.S. Department of Energy, through the Office of Energy Research and the Division of High Energy and Nuclear Physics, provides approximately 90% of the total federal support for high energy physics research effort in the United States. The High Energy Physics Program primarily utilizes four major U.S. high energy accelerator facilities and over 50 universities under contract to do experimental and theoretical investigations on the properties, structure and transformation of matter and energy in their most basic forms. This compilation of research summaries is intended to present a convenient report of the scope and nature of high energy physics research presently funded by the U.S. Department of Energy. The areas covered include conception, design, construction, and operation of particle accelerators; experimental research using the accelerators and ancillary equipment; theoretical research; and research and development programs to advance accelerator technology, particle detector systems, and data analysis capabilities. Major concepts and experimental facts in high energy physics have recently been discovered which have the promise of unifying the fundamental forces and of understanding the basic nature of matter and energy. The summaries contained in this document were reproduced in essentially the form submitted by contractors as of January 1977.

  15. Summaries of FY 1977, research in high energy physics

    International Nuclear Information System (INIS)

    1977-10-01

    The U.S. Department of Energy, through the Office of Energy Research and the Division of High Energy and Nuclear Physics, provides approximately 90% of the total federal support for high energy physics research effort in the United States. The High Energy Physics Program primarily utilizes four major U.S. high energy accelerator facilities and over 50 universities under contract to do experimental and theoretical investigations on the properties, structure and transformation of matter and energy in their most basic forms. This compilation of research summaries is intended to present a convenient report of the scope and nature of high energy physics research presently funded by the U.S. Department of Energy. The areas covered include conception, design, construction, and operation of particle accelerators; experimental research using the accelerators and ancillary equipment; theoretical research; and research and development programs to advance accelerator technology, particle detector systems, and data analysis capabilities. Major concepts and experimental facts in high energy physics have recently been discovered which have the promise of unifying the fundamental forces and of understanding the basic nature of matter and energy. The summaries contained in this document were reproduced in essentially the form submitted by contractors as of January 1977

  16. Conference on energy research at historically black universities

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    A conference was convened to present and discuss significant research and development in Historically Black Institutions (current and past); areas that show potential for inter-institutional collaboration and the sharing of facilities; existing capabilities to sustain funded research activities and future potential for expansion and enhancement; and appropriate arrangements for maximum interaction with industry and government agencies. Papers were presented at small-group meetings in various energy research areas, and abstracts of the projects or programs are presented. The Solar Energy small group provided contributions in the areas of photovoltaics, biomass, solar thermal, and wind. Research reported on by the Fossil Fuel small group comprises efforts in the areas of fluidized-bed combustion of coal, coal liquefaction, and oil shale pyrolysis. Five research programs reported on by the Conservation Research small group involve a summer workshop for high school students on energy conservation; use of industrial waste heat for a greenhouse; solar energy and energy conservation research and demonstration; energy efficiency and management; and a conservation program targeted at developing a model for educating low-income families. The two Environmental Impact groups presented contributions on physical and chemical impacts and on biological monitors and impacts. The Policy Research group presented four papers, including a careful analysis of equity issues; a model for examining the economic issues in the interaction between energy technology and the state of the economy; and an examination of the institutional constraints on environmentally oriented energy policy. Six additional abstracts by invited participants are presented. (MCW)

  17. Energy engineering: Student-researcher collaboration

    DEFF Research Database (Denmark)

    Leban, Krisztina Monika; Ritchie, Ewen; Beckowska, Patrycja Maria

    2013-01-01

    This article reports on methods of cooperation between researchers and students at the BSc, MSc and PhD levels. At Aalborg University's Department of Energy Technology, education and research are closely linked. The relationship between student

  18. Environmental Systems Research Candidates Program--FY2000 Annual report

    Energy Technology Data Exchange (ETDEWEB)

    Piet, Steven James

    2001-01-01

    The Environmental Systems Research Candidates (ESRC) Program, which is scheduled to end September 2001, was established in April 2000 as part of the Environmental Systems Research and Analysis Program at the Idaho National Engineering and Environmental Laboratory (INEEL) to provide key science and technology for the clean-up mission of the U.S. Department of Energy Office of Environmental Management, and to perform research and development that will help solve current legacy problems and enhance the INEEL's scientific and technical capability for solving longer-term challenges. This report documents the progress and accomplishments of the ESRC Program from April through September 2000. The ESRC Program consists of 24 tasks subdivided within four research areas: A. Environmental Characterization Science and Technology. This research explores new data acquisition, processing, and interpretation methods that support cleanup and long-term stewardship decisions. B. Subsurface Understanding. This research expands understanding of the biology, chemistry, physics, hydrology, and geology needed to improve models of contamination problems in the earth's subsurface. C. Environmental Computational Modeling. This research develops INEEL computing capability for modeling subsurface contaminants and contaminated facilities. D. Environmental Systems Science and Technology. This research explores novel processes to treat waste and decontaminate facilities. Our accomplishments during FY 2000 include the following: • We determined, through analysis of samples taken in and around the INEEL site, that mercury emissions from the INEEL calciner have not raised regional off-INEEL mercury contamination levels above normal background. • We demonstrated initial use of x-ray fluorescence to image uranium and heavy-metal concentrations in soil samples. • We increased our understanding of the subsurface environment, applying mathematical complexity theory to the problem of

  19. Energy research program 80

    International Nuclear Information System (INIS)

    1980-01-01

    The Energy Research Program 80 contains an extension of activities for the period 1980-82, within a budget of 100 million DKK, as part of the government's employment plan for 1980. The research program is based on a number of project proposals that were collected, analysed, and supplemented in October-November 1979. This report consists of two parts. Part 1: a survey of the program, with a brief description of the background, principles, organization and financing. Part 2: detailed descriptions of the individual research programs. (LN)

  20. Solar Energy Research Center Instrumentation Facility

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Thomas, J.; Papanikolas, John, P.

    2011-11-11

    SOLAR ENERGY RESEARCH CENTER INSTRUMENTATION FACILITY The mission of the Solar Energy Research Center (UNC SERC) at the University of North Carolina at Chapel Hill (UNC-CH) is to establish a world-leading effort in solar fuels research and to develop the materials and methods needed to fabricate the next generation of solar energy devices. We are addressing the fundamental issues that will drive new strategies for solar energy conversion and the engineering challenges that must be met in order to convert discoveries made in the laboratory into commercially available devices. The development of a photoelectrosynthesis cell (PEC) for solar fuels production faces daunting requirements: (1) absorb a large fraction of sunlight; (2) carry out artificial photosynthesis, which involves multiple complex reaction steps; (3) avoid competitive and deleterious side and reverse reactions; (4) perform 13 million catalytic cycles per year with minimal degradation; (5) use non-toxic materials; (6) be cost-effective. PEC efficiency is directly determined by the kinetics of each reaction step. The UNC SERC is addressing this challenge by taking a broad interdisciplinary approach in a highly collaborative setting, drawing on expertise across a broad range of disciplines in chemistry, physics and materials science. By taking a systematic approach toward a fundamental understanding of the mechanism of each step, we will be able to gain unique insight and optimize PEC design. Access to cutting-edge spectroscopic tools is critical to this research effort. We have built professionally staffed facilities equipped with the state-of-the-art instrumentation funded by this award. The combination of staff, facilities, and instrumentation specifically tailored for solar fuels research establishes the UNC Solar Energy Research Center Instrumentation Facility as a unique, world-class capability. This congressionally directed project funded the development of two user facilities: TASK 1: SOLAR

  1. Strategies and directions of Malaysian energy research

    International Nuclear Information System (INIS)

    Baharudin Yatim

    1995-01-01

    Research on energy efficiency could help reconcile environmental issues with economic development. It could enhance energy supplies, improve the environment and develop alternative energy sources. The author reviews some of Malaysia's main energy R and D programmes.

  2. The Center for Frontiers of Subsurface Energy Security (A 'Life at the Frontiers of Energy Research' contest entry from the 2011 Energy Frontier Research Centers (EFRCs) Summit and Forum)

    International Nuclear Information System (INIS)

    Pope, Gary A.

    2011-01-01

    'The Center for Frontiers of Subsurface Energy Security (CFSES)' was submitted to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. CFSES is directed by Gary A. Pope at the University of Texas at Austin and partners with Sandia National Laboratories. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.

  3. USU Alternative and Unconventional Energy Research and Development

    Energy Technology Data Exchange (ETDEWEB)

    Behunin, Robert [Utah State Univ., Logan, UT (United States); Wood, Byard [Utah State Univ., Logan, UT (United States); Heaslip, Kevin [Utah State Univ., Logan, UT (United States); Zane, Regan [Utah State Univ., Logan, UT (United States); Lyman, Seth [Utah State Univ., Logan, UT (United States); Simmons, Randy [Utah State Univ., Logan, UT (United States); Christensen, David [Utah State Univ., Logan, UT (United States)

    2014-01-29

    The purpose and rationale of this project has been to develop enduring research capabilities at Utah State University (USU) and the Utah State University Research Foundation (USURF) in a number of energy-efficient and renewable energy areas, including primarily a) algae energy systems, b) solar lighting, c) intuitive buildings, d) electric transportation, e) unconventional energy environmental monitoring and beneficial reuse technologies (water and CO2), f) wind energy profiling, and g) land use impacts. The long-term goal of this initiative has been to create high-wage jobs in Utah and a platform for sustained faculty and student engagement in energy research. The program's objective has been to provide a balanced portfolio of R&D conducted by faculty, students, and permanent staff. This objective has been met. While some of the project's tasks met with more success than others, as with any research project of this scope, overall the research has contributed valuable technical insight and broader understanding in key energy-related areas. The algae energy systems research resulted in a highly productive workforce development enterprise, as it graduated a large number of well-prepared students entering alternative energy development fields and scholarship. Moreover, research in this area has demonstrated both the technological and economic limitations and the tremendous potential of algae feedstock-based energy and co-products. Research conducted in electric transportation, specifically in both stationary and dynamic wireless inductive coupling charging technologies, has resulted in impactful advances. The project initiated the annual Conference on Electric Roads and Vehicles (http://www.cervconference.org/), which is growing and attracts more than 100 industry experts and scholars. As a direct result of the research, the USU/USURF spin-out startup, WAVE (Wireless Advanced Vehicle Electrification), continues work in wirelessly charged bus transit systems

  4. Combining density functional theory calculations, supercomputing, and data-driven methods to design new materials (Conference Presentation)

    Science.gov (United States)

    Jain, Anubhav

    2017-04-01

    Density functional theory (DFT) simulations solve for the electronic structure of materials starting from the Schrödinger equation. Many case studies have now demonstrated that researchers can often use DFT to design new compounds in the computer (e.g., for batteries, catalysts, and hydrogen storage) before synthesis and characterization in the lab. In this talk, I will focus on how DFT calculations can be executed on large supercomputing resources in order to generate very large data sets on new materials for functional applications. First, I will briefly describe the Materials Project, an effort at LBNL that has virtually characterized over 60,000 materials using DFT and has shared the results with over 17,000 registered users. Next, I will talk about how such data can help discover new materials, describing how preliminary computational screening led to the identification and confirmation of a new family of bulk AMX2 thermoelectric compounds with measured zT reaching 0.8. I will outline future plans for how such data-driven methods can be used to better understand the factors that control thermoelectric behavior, e.g., for the rational design of electronic band structures, in ways that are different from conventional approaches.
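The screening workflow described, computing descriptors for many compounds and then filtering by target properties, can be sketched as a simple filter over a table of computed values. The compounds, property values, and thresholds below are invented for illustration and are not Materials Project data:

```python
# Hypothetical DFT-computed descriptors for candidate thermoelectric compounds.
candidates = [
    {"formula": "CuGaTe2", "band_gap_eV": 1.2, "e_above_hull_eV": 0.00},
    {"formula": "AgInSe2", "band_gap_eV": 0.9, "e_above_hull_eV": 0.02},
    {"formula": "NaAlO2",  "band_gap_eV": 5.1, "e_above_hull_eV": 0.00},
    {"formula": "CuFeS2",  "band_gap_eV": 0.5, "e_above_hull_eV": 0.12},
]

def screen(rows, gap_range=(0.3, 1.5), max_hull=0.05):
    """Keep near-stable compounds (low energy above hull) with a moderate band gap."""
    lo, hi = gap_range
    return [r["formula"] for r in rows
            if lo <= r["band_gap_eV"] <= hi and r["e_above_hull_eV"] <= max_hull]

print(screen(candidates))  # → ['CuGaTe2', 'AgInSe2']
```

In the real workflow, each row would come from a supercomputer-scale batch of DFT calculations, and the survivors of the filter become synthesis targets for experimental confirmation.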

  5. Heuristic Scheduling in Grid Environments: Reducing the Operational Energy Demand

    Science.gov (United States)

    Bodenstein, Christian

    In a world where more and more businesses trade in an online market, the supply of online services to the ever-growing demand could quickly reach its capacity limits. Online service providers may find themselves maxed out during high-traffic timeslots yet facing too little demand during low-traffic timeslots, although the latter is becoming less frequent. At this point, deciding which user is allocated what level of service becomes essential. The concept of Grid computing could offer a meaningful alternative to conventional supercomputing centres. Not only can Grids reach the same computing speeds as some of the fastest supercomputers, but distributed computing harbors a great energy-saving potential. When scheduling projects in such a Grid environment, however, computing an assignment of processes to systems becomes so complex that schedules are often produced too late to execute, rendering their optimizations useless. Current schedulers attempt to maximize utility under some constraint, often resorting to heuristics. This optimization often comes at the cost of environmental impact, in this case CO2 emissions. This work proposes an alternate model of energy-efficient scheduling while keeping a respectable amount of economic incentives untouched. Using this model, it is possible to reduce the total energy consumed by a Grid environment using 'just-in-time' flowtime management, paired with ranking nodes by efficiency.
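The closing idea, 'just-in-time' flowtime management paired with ranking nodes by efficiency, can be illustrated with a toy greedy dispatcher: among the nodes whose backlog still lets a job finish by its deadline, pick the most energy-efficient one. This is a sketch of the general idea, not the paper's actual model; the node data and efficiency figures are invented:

```python
# Hypothetical Grid nodes: energy efficiency and current backlog (seconds of work).
nodes = [
    {"name": "A", "flops_per_watt": 2.0e9, "queued_s": 120.0},
    {"name": "B", "flops_per_watt": 5.0e9, "queued_s": 300.0},
    {"name": "C", "flops_per_watt": 3.5e9, "queued_s": 60.0},
]

def schedule(job_runtime_s, deadline_s, nodes):
    """Greedy energy-aware dispatch: among nodes whose backlog still allows the
    job to finish 'just in time', pick the most efficient one and book the work."""
    feasible = [n for n in nodes if n["queued_s"] + job_runtime_s <= deadline_s]
    if not feasible:
        return None  # no node can meet the deadline
    best = max(feasible, key=lambda n: n["flops_per_watt"])
    best["queued_s"] += job_runtime_s
    return best["name"]

print(schedule(100.0, 250.0, nodes))  # → C  (B is most efficient but too backlogged)
```

The deadline acts as the economic incentive kept "untouched": efficiency only breaks ties among nodes that still deliver the service level on time.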

  6. Energy Efficiency Evaluation and Benchmarking of AFRL’s Condor High Performance Computer

    Science.gov (United States)

    2011-08-01

    Conference paper (post print), dated August 2011; reporting period January-June 2011. The study benchmarks AFRL's Condor high performance computer with the High Performance LINPACK (HPL) benchmark while also measuring the energy consumed to achieve such performance; supercomputers are ranked by ... For PlayStation 3 nodes executing the HPL benchmark: when idle, the two PS3s consume 188.49 W on average; at peak HPL performance, the nodes draw an average of ...
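The metric behind such energy-efficiency rankings is performance per watt: sustained HPL FLOPS divided by average power draw, as used by the Green500 list. A minimal sketch with invented numbers, not the report's measurements:

```python
def gflops_per_watt(hpl_gflops, avg_power_w):
    """Green500-style energy-efficiency metric: sustained GFLOPS per watt."""
    return hpl_gflops / avg_power_w

# Invented example: 140 GFLOPS sustained at 280 W average draw during the run.
print(f"{gflops_per_watt(140.0, 280.0):.2f} GFLOPS/W")  # → 0.50 GFLOPS/W
```

Note that the power figure must be the average draw during the benchmark run, not idle power, which is why the evaluation above measures both states separately.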

  7. Production, consumption and research on solar energy

    DEFF Research Database (Denmark)

    Sanz-Casado, Elias; Lascurain-Sánchez, Maria Luisa; Serrano-Lopez, Antonio Eleazar

    2014-01-01

    An analysis of scientific publications on solar energy was conducted to determine whether public interest in the subject is mirrored by more intense research in the area. To this end, the research published by Spain and Germany, the two EU countries with the highest installed photovoltaic capacity, was analyzed based on Web of Science data. The results show that: solar output has risen substantially; solar research has a greater impact (measured in terms of citations) than publications on other renewables such as wind power; scientific production on solar energy is high in Germany and Spain, which ... intense. The main conclusion is the divergence in Germany and Spain between solar energy demand/output growth, which is exponential, and the growth of research papers on the subject, which is linear...
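The claimed divergence, exponential demand/output growth versus linear growth in papers, can be checked by comparing least-squares fits on raw versus log-transformed counts: an exponential series becomes linear after taking logarithms. A self-contained sketch with invented yearly series, not the study's data:

```python
import math

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept, plus R^2 goodness of fit."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1 - ss_res / ss_tot

years = list(range(2000, 2010))
capacity = [2 * 1.5 ** (y - 2000) for y in years]  # invented: exponential growth
papers = [50 + 12 * (y - 2000) for y in years]     # invented: linear growth

# Exponential data fits a line almost perfectly only after log-transforming.
_, _, r2_log = linear_fit(years, [math.log(c) for c in capacity])
_, _, r2_lin = linear_fit(years, papers)
print(r2_log > 0.999, r2_lin > 0.999)  # → True True
```

On real data the two R^2 values would not both be near 1; whichever transform fits better indicates the growth regime, which is how the exponential-vs-linear divergence can be demonstrated.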

  8. Design for energy efficiency: Energy efficient industrialized housing research program. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Kellett, R.; Berg, R.; Paz, A.; Brown, G.Z.

    1991-03-01

    Since 1989, the U.S. Department of Energy has sponsored the Energy Efficient Industrialized Housing research program (EEIH) to improve the energy efficiency of industrialized housing. Two research centers share responsibility for this program: The Center for Housing Innovation at the University of Oregon and the Florida Solar Energy Center, a research institute of the University of Central Florida. Additional funding is provided through the participation of private industry, state governments and utilities. The program is guided by a steering committee comprised of industry and government representatives. This report summarizes Fiscal Year (FY) 1990 activities and progress, and proposed activities for FY 1991 in Task 2.1 Design for Energy Efficiency. This task establishes a vision of energy conservation opportunities in critical regions, market segments, climate zones and manufacturing strategies significant to industrialized housing in the 21st Century. In early FY 1990, four problem statements were developed to define future housing demand scenarios inclusive of issues of energy efficiency, housing design and manufacturing. Literature surveys were completed to assess seven areas of influence for industrialized housing and energy conservation in the future. Fifty-five future trends were identified in computing and design process; manufacturing process; construction materials, components and systems; energy and environment; demographic context; economic context; and planning policy and regulatory context.

  9. NASA's Climate in a Box: Desktop Supercomputing for Open Scientific Model Development

    Science.gov (United States)

    Wojcik, G. S.; Seablom, M. S.; Lee, T. J.; McConaughy, G. R.; Syed, R.; Oloso, A.; Kemp, E. M.; Greenseid, J.; Smith, R.

    2009-12-01

    NASA's High Performance Computing Portfolio, in cooperation with its Modeling, Analysis, and Prediction program, intends to make its climate and earth science models more accessible to a larger community. A key goal of this effort is to open the model development and validation process to the scientific community at large, such that a natural selection process is enabled and results in a more efficient scientific process. One obstacle to others using NASA models is the complexity of the models and the difficulty in learning how to use them. This situation applies not only to scientists who regularly use these models but also to non-typical users who may want to use the models, such as scientists from different domains, policy makers, and teachers. Another obstacle is that access to the high performance computing (HPC) accounts on which the models are run can be restrictive, with long wait times in job queues and delays caused by an arduous process of obtaining an account, especially for foreign nationals. This project explores the utility of desktop supercomputers in providing a complete ready-to-use toolkit of climate research products to investigators and on-demand access to an HPC system. One objective of this work is to pre-package NASA and NOAA models so that new users will not have to spend significant time porting them. In addition, the pre-packaged toolkit will include tools, such as workflow, visualization, social networking web sites, and analysis tools, to assist users in running the models and analyzing the data. The system architecture to be developed will allow for automatic code updates for each user and an effective means with which to deal with the data that are generated. We plan to investigate several desktop systems, but our work to date has focused on a Cray CX1. Currently, we are investigating the potential capabilities of several non-traditional development environments. While most NASA and NOAA models are

  10. AHPCRC (Army High Performance Computing Research Center) Bulletin. Volume 1, Issue 2

    Science.gov (United States)

    2011-01-01

    area and the researchers working on these projects. Also inside: news from the AHPCRC consortium partners at Morgan State University and the NASA ...Computing Research Center is provided by the supercomputing and research facilities at Stanford University and at the NASA Ames Research Center at...atomic and molecular level, he said. He noted that “every general would like to have” a Star Trek -like holodeck, where holographic avatars could

  11. The Case to Widen Defence Acquisition Research Paradigms

    Science.gov (United States)

    2012-04-30

    Group, BAE Systems, the Pension Benefit Guaranty Corporation, and the Departments of Defense, Energy, Justice, and State. Prior to that, he served as...the U.S. Air Force ordered 2,200 Sony PlayStation 3 videogame consoles which then formed the building block of a supercomputer. Soldiers in Iraq and...Afghanistan used Apple iPods and iPhones to run translation software and calculate bullet trajectories. XBox videogame controllers have been modified

  12. DNA Sequence Patterns – A Successful Example of Grid Computing in Genome Research and Building Virtual Super-Computers for the Research Commons of e-Societies

    NARCIS (Netherlands)

    T.A. Knoch (Tobias); A. Abuseiris (Anis); M. Lesnussa (Michael); F.N. Kepper (Nick); R.M. de Graaf (Rob); F.G. Grosveld (Frank)

    2011-01-01

    textabstractThe amount of information is growing exponentially with ever-new technologies emerging and is believed to be always at the limit. In contrast, huge resources are obviously available, which are underused in the IT sector, similar as e.g. in the renewable energy sector. Genome research is

  13. Norway's centres for environment-friendly energy research (CEERs)

    Energy Technology Data Exchange (ETDEWEB)

    2009-07-01

    In February 2009 Norway's Minister of Petroleum and Energy announced the establishment of eight new Centres for Environment-friendly Energy Research (CEERs). The centres form national teams within the areas of offshore wind energy, solar energy, energy efficiency, bioenergy, energy planning and design, and carbon capture and storage. These centres are: BIGCCS Centre - International CCS Research Centre; Centre for Environmental Design of Renewable Energy (CEDREN); Bioenergy Innovation Centre (CenBio); Norwegian Centre for Offshore Wind Energy (NORCOWE); Norwegian Research Centre for Offshore Wind Technology (NOWITECH); The Norwegian Research Centre for Solar Cell Technology; SUbsurface CO{sub 2} storage - Critical Elements and Superior Strategy (SUCCESS); and The Research Centre on Zero Emission Buildings (ZEB). (AG)

  14. Jointly Sponsored Research Program on Energy Related Research

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-12-31

    Cooperative Agreement DE-FC26-08NT43293, the DOE-WRI Cooperative Research and Development Program for Fossil Energy-Related Resources, began in June 2009. The goal of the Program was to develop, commercialize, and deploy technologies of value to the nation's fossil and renewable energy industries. To ensure relevancy and early commercialization, the involvement of an industrial partner was encouraged. In that regard, the Program stipulated that a minimum of 20% cost share be achieved in a fiscal year. This allowed WRI to carry a diverse portfolio of technologies and projects at various technology readiness levels. Depending upon the maturity of the research concept and technology, cost share for a given task ranged from none to as high as 67% (two-thirds). Over the course of the Program, a total of twenty-six tasks were proposed for DOE approval. Over the period of performance of the Cooperative Agreement, WRI put in place projects utilizing a total of $7,089,581 in USDOE funds. Against this funding, cosponsors committed $7,398,476 in private funds to produce a program valued at $14,488,057. Tables 1 and 2, presented at the end of this section, are a compilation of the funding for all the tasks conducted under the program. The goal of the Cooperative Research and Development Program for Fossil Energy-Related Resources was, through collaborative research with industry, to develop or assist in the development of innovative technology solutions that will: • Increase the production of United States energy resources – coal, natural gas, oil, and renewable energy resources; • Enhance the competitiveness of United States energy technologies in international markets and assist in technology transfer; • Reduce the nation's dependence on foreign energy supplies and strengthen both the United States and regional economies; and • Minimize environmental impacts of energy production and utilization. Success of the Program can be measured by

  15. Computational fluid dynamics: complex flows requiring supercomputers. January 1975-July 1988 (Citations from the INSPEC: Information Services for the Physics and Engineering Communities data base). Report for January 1975-July 1988

    International Nuclear Information System (INIS)

    1988-08-01

    This bibliography contains citations concerning computational fluid dynamics (CFD), a new method in computational science to perform complex flow simulations in three dimensions. Applications include aerodynamic design and analysis for aircraft, rockets, and missiles, and automobiles; heat-transfer studies; and combustion processes. Included are references to supercomputers, array processors, and parallel processors where needed for complete, integrated design. Also included are software packages and grid-generation techniques required to apply CFD numerical solutions. Numerical methods for fluid dynamics, not requiring supercomputers, are found in a separate published search. (Contains 83 citations fully indexed and including a title list.)

  16. The application of contrast explanation to energy policy research: UK nuclear energy policy 2002–2012

    International Nuclear Information System (INIS)

    Heffron, Raphael J.

    2013-01-01

    This paper advances the application of the methodology, contrast explanation, to energy policy research. Research in energy policy is complex and often involves inter-disciplinary work, which traditional economic methodologies fail to capture. Consequently, the more encompassing methodology of contrast explanation is assessed and its use in other social science disciplines explored in brief. It is then applied to an energy policy research topic—in this case, nuclear energy policy research in the UK. Contrast explanation facilitates research into policy and decision-making processes in energy studies and offers an alternative to the traditional economic methods used in energy research. Further, contrast explanation is extended by the addition of contested and uncontested hypotheses analyses. This research focuses on the methods employed to deliver the new nuclear programme of the UK government. In order to achieve a sustainable nuclear energy policy three issues are of major importance: (1) law, policy and development; (2) public administration; and (3) project management. Further, the research identifies that policy in the area remains to be resolved, in particular at an institutional and legal level. However, contrary to the literature, in some areas, the research identifies a change of course as the UK concentrates on delivering a long-term policy for the nuclear energy sector and the overall energy sector. - Highlights: ► Energy policy research is interdisciplinary and needs additional methodological approaches. ► New method of contrast explanation advanced for energy policy research. ► This methodology is based on dialectical learning which examines conflict between sources of data. ► Research example used here is of UK nuclear energy policy. ► Major issues in UK nuclear energy policy are planning law, public administration, and project management

  17. Performance Characteristics of Hybrid MPI/OpenMP Scientific Applications on a Large-Scale Multithreaded BlueGene/Q Supercomputer

    KAUST Repository

    Wu, Xingfu; Taylor, Valerie

    2013-01-01

    In this paper, we investigate the performance characteristics of five hybrid MPI/OpenMP scientific applications (two NAS Parallel Benchmarks Multi-Zone codes, SP-MZ and BT-MZ; an earthquake simulation, PEQdyna; an aerospace application, PMLB; and a 3D particle-in-cell application, GTC) on a large-scale multithreaded Blue Gene/Q supercomputer at Argonne National Laboratory, and quantify the performance gap resulting from using different numbers of threads per node. We use the performance tools and MPI profile and trace libraries available on the supercomputer to analyze and compare the performance of these hybrid applications as the number of OpenMP threads per node increases, and find that beyond a point, adding threads saturates or worsens performance. For the strong-scaling applications SP-MZ, BT-MZ, PEQdyna and PMLB, using 32 threads per node results in much better application efficiency than using 64 threads per node; as the number of threads per node increases, the FPU (floating point unit) percentage decreases, while the MPI percentage (except for PMLB) and IPC (instructions per cycle) per core (except for BT-MZ) increase. For the weak-scaling application GTC, the performance trend (relative speedup) with increasing threads per node is very similar regardless of how many nodes (32, 128, 512) are used. © 2013 IEEE.
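    The strong-scaling saturation described above is consistent with a simple Amdahl's-law model: a fixed serial fraction bounds the speedup achievable by adding threads. A minimal sketch (the 5% serial fraction is an illustrative assumption, not a figure from the paper):

```python
# Amdahl's-law model of strong-scaling speedup: with serial fraction s,
# speedup on n threads is 1 / (s + (1 - s) / n).
def amdahl_speedup(n_threads: int, serial_fraction: float) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_threads)

s = 0.05  # assumed serial fraction -- illustrative only
for n in (16, 32, 64):
    # prints 9.1x, 12.5x, 15.4x: doubling 32 -> 64 threads gains little
    print(f"{n:2d} threads: speedup {amdahl_speedup(n, s):.1f}x")
```

    Real codes saturate even faster than this model suggests, since on-node contention for memory bandwidth and the shared FPU grows with thread count, which is one way to read the falling FPU percentages reported above.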

  19. Advanced energy projects FY 1992 research summaries

    International Nuclear Information System (INIS)

    1992-09-01

    The Division of Advanced Energy Projects (AEP) provides support to explore the feasibility of novel, energy-related concepts that evolve from advances in basic research. These concepts are typically at an early stage of scientific definition and, therefore, are beyond the scope of ongoing applied research or technology development programs. The Division provides a mechanism for converting basic research findings to applications that eventually could impact the Nation's energy economy. Technical topics include physical, chemical, materials, engineering, and biotechnologies. Projects can involve interdisciplinary approaches to solve energy-related problems. Projects are supported for a finite period of time, which is typically three years. Annual funding levels for projects are usually about $300,000 but can vary from approximately $50,000 to $500,000. It is expected that, following AEP support, each concept will be sufficiently developed and promising to attract further funding from other sources in order to realize its full potential. There were 39 research projects in the Division of Advanced Energy Projects during Fiscal Year 1992 (October 1, 1991 -- September 30, 1992). The abstracts of those projects are provided to introduce the overall program in Advanced Energy Projects. Further information on a specific project may be obtained by contacting the principal investigator, who is listed below the project title. Projects completed during FY 1992 are indicated

  20. Sociologies of energy. Towards a research agenda

    Directory of Open Access Journals (Sweden)

    Tomás Ariztía

    2017-12-01

    This article offers a panoramic view of the field of the social studies of energy while introducing the articles of the special issue. It begins by discussing the growing interest in studying the social aspects of energy. We relate this interest to the increasing challenges posed by global climate change as well as to growing sociological attention to the material dimension of social life. The article suggests understanding energy and energy-related phenomena as socio-technical objects that involve material, social, cultural and technical elements. It then briefly describes different research areas at the intersection of energy and society and presents the contributions to the monograph. We suggest that the articles in this special issue are relevant not only to social scientists interested in energy-related issues; they might also help energy professionals and researchers from outside the social sciences to further problematize the social aspects and challenges of energy.

  1. Advanced Energy Projects FY 1990 research summaries

    International Nuclear Information System (INIS)

    1990-09-01

    This report serves as a guide to prepare proposals and provides summaries of the research projects active in FY 1990, sponsored by the Office of Basic Energy Sciences Division of Advanced Energy Projects, Department of Energy. (JF)

  2. Energy Materials Research Laboratory (EMRL)

    Data.gov (United States)

    Federal Laboratory Consortium — The Energy Materials Research Laboratory at the Savannah River National Laboratory (SRNL) creates a cross-disciplinary laboratory facility that lends itself to the...

  3. PSI nuclear energy research progress report 1988

    International Nuclear Information System (INIS)

    Alder, H.P.; Wiedemann, K.H.

    1989-07-01

    The progress report at hand deals with nuclear energy research at PSI. The collection of articles covers a large number of topics: different reactor systems, parts of the fuel cycle, and the behaviour of structural materials. Examples of the state of knowledge in different disciplines are given: reactor physics, thermal-hydraulics, heat transfer, fracture mechanics, instrumental analysis, and mathematical modelling. The purpose of this collection is to give a fair account of nuclear energy research at PSI. It should demonstrate that nuclear energy research remains a central activity in the new institute; that the scientific basis for the continuing exploitation of nuclear power in Switzerland is preserved; that work has continued not only along established lines but that new research topics were also tackled; that the quality of the work corresponds to international standards and in selected areas is at the forefront; and that the expertise acquired also finds applications in non-nuclear research tasks. (author) 92 figs., 18 tabs., 316 refs

  4. International energy: Research organizations, 1988--1992. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Hendricks, P.; Jordan, S. [eds.] [USDOE Office of Scientific and Technical Information, Oak Ridge, TN (United States)

    1993-06-01

    This publication contains the standardized names of energy research organizations used in energy information databases. Involved in this cooperative task are (1) the technical staff of the US DOE Office of Scientific and Technical Information (OSTI) in cooperation with the member countries of the Energy Technology Data Exchange (ETDE) and (2) the International Nuclear Information System (INIS). ETDE member countries are also members of the International Nuclear Information System (INIS). Nuclear organization names recorded for INIS by these ETDE member countries are also included in the ETDE Energy Database. Therefore, these organization names are cooperatively standardized for use in both information systems. This publication identifies current organizations doing research in all energy fields, standardizes the format for recording these organization names in bibliographic citations, assigns a numeric code to facilitate data entry, and identifies report number prefixes assigned by these organizations. These research organization names may be used in searching the databases "Energy Science & Technology" on DIALOG and "Energy" on STN International. These organization names are also used in USDOE databases on the Integrated Technical Information System. Research organizations active in the past five years, as indicated by database records, were identified to form this publication. This directory includes approximately 31,000 organizations that reported energy-related literature from 1988 to 1992 and updates the DOE Energy Data Base: Corporate Author Entries.

  5. Consumer energy research: an annotated bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, C.D.; McDougall, G.H.G.

    1980-01-01

    This document is an updated and expanded version of an earlier annotated bibliography by Dr. C. Dennis Anderson and Carman Cullen (A Review and Annotation of Energy Research on Consumers, March 1978). It is the final draft of the major report that will be published in English and French and made publicly available through the Consumer Research and Evaluation Branch of Consumer and Corporate Affairs, Canada. Two agencies granting permission to include some of their energy abstracts are the Rand Corporation and the DOE Technical Information Center. The bibliography consists mainly of empirical studies, including surveys and experiments. It also includes a number of descriptive and econometric studies that utilize secondary data. Many of the studies provide summaries of research in specific areas and point out directions for future research efforts. 14 tables.

  6. 1997: BMBF expenditures for energy research

    International Nuclear Information System (INIS)

    Anon.

    1996-01-01

    Departmental budget No. 30 in the 1997 draft federal budget covers the activities of the Federal Ministry for Education, Science, Research and Technology (BMBF). Its level of DM 15,000 million represents a 4.5% decrease from the funds earmarked for the current year of 1996. DM 72.600 million is to be spent on safety research for nuclear plants, and DM 239.978 million has been planned for decommissioning and demolition of nuclear experimental and demonstration plants. The operation of, and investments into, the research centers are funded to the tune of DM 1314.268 million and DM 325.728 million, respectively. Institutions of basic research will receive DM 444.088 million, and renewable energies, economical energy uses, conversion and combustion technologies will be funded in the amount of DM 328.100 million. (orig.) [de]

  7. 1999: BMBF expenditures for energy research

    International Nuclear Information System (INIS)

    Anon.

    1998-01-01

    Departmental budget No. 30 in the 1999 draft federal budget covers the activities of the Federal Ministry for Education, Science, Research and Technology (BMBF). Its level of DM 15,428 million represents a 3.34% increase from the funds earmarked for the current year of 1998. DM 66 million is to be spent on safety research for nuclear plants, and DM 220 million has been planned for decommissioning and demolition of nuclear experimental and demonstration plants. The operation of, and investments into, the research centers are funded to the tune of DM 1307 million and DM 350 million, respectively. Institutions of basic research will receive DM 471 million, and renewable energies, economical energy uses, conversion and combustion technologies will be funded in the amount of DM 234 million. [de]

  8. Car2x with software defined networks, network functions virtualization and supercomputers technical and scientific preparations for the Amsterdam Arena telecoms fieldlab

    NARCIS (Netherlands)

    Meijer R.J.; Cushing R.; De Laat C.; Jackson P.; Klous S.; Koning R.; Makkes M.X.; Meerwijk A.

    2015-01-01

    In the invited talk 'Car2x with SDN, NFV and supercomputers' we report on how our past work with SDN [1, 2] allows the design of a smart mobility fieldlab in the huge parking lot of the Amsterdam Arena. We explain how we can engineer and test software that handles the complex conditions of the Car2X

  9. Energy Technology Division research summary - 1999.

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-31

    The Energy Technology Division provides materials and engineering technology support to a wide range of programs important to the US Department of Energy. As shown on the preceding page, the Division is organized into ten sections, five with concentrations in the materials area and five in engineering technology. Materials expertise includes fabrication, mechanical properties, corrosion, friction and lubrication, and irradiation effects. Our major engineering strengths are in heat and mass flow, sensors and instrumentation, nondestructive testing, transportation, and electromechanics and superconductivity applications. The Division Safety Coordinator, Environmental Compliance Officers, Quality Assurance Representative, Financial Administrator, and Communication Coordinator report directly to the Division Director. The Division Director is personally responsible for cultural diversity and is a member of the Laboratory-wide Cultural Diversity Advisory Committee. The Division's capabilities are generally applied to issues associated with energy production, transportation, utilization, or conservation, or with environmental issues linked to energy. As shown in the organization chart on the next page, the Division reports administratively to the Associate Laboratory Director (ALD) for Energy and Environmental Science and Technology (EEST) through the General Manager for Environmental and Industrial Technologies. While most of our programs are under the purview of the EEST ALD, we also have had programs funded under every one of the ALDs. Some of our research in superconductivity is funded through the Physical Research Program ALD. We also continue to work on a number of nuclear-energy-related programs under the ALD for Engineering Research. Detailed descriptions of our programs on a section-by-section basis are provided in the remainder of this book.

  10. Atlantic Canada's energy research and development website and database

    International Nuclear Information System (INIS)

    2005-01-01

    Petroleum Research Atlantic Canada maintains a website devoted to energy research and development in Atlantic Canada. The site can be viewed on the world wide web at www.energyresearch.ca. It includes a searchable database with information about researchers in Nova Scotia, their projects and published materials on issues related to hydrocarbons, alternative energy technologies, energy efficiency, climate change, environmental impacts and policy. The website also includes links to research funding agencies, external related databases and related energy organizations around the world. Nova Scotia-based users are invited to submit their academic, private or public research to the site. Before being uploaded into the database, a site administrator reviews and processes all new information. Users are asked to identify their areas of interest according to the following research categories: alternative or renewable energy technologies; climate change; coal; computer applications; economics; energy efficiency; environmental impacts; geology; geomatics; geophysics; health and safety; human factors; hydrocarbons; meteorology and oceanology (metocean) activities; petroleum operations in deep and shallow waters; policy; and power generation and supply. The database can be searched 5 ways according to topic, researchers, publication, projects or funding agency. refs., tabs., figs

  11. The Global Climate and Energy Project at Stanford University: Fundamental Research Towards Future Energy Technologies

    Science.gov (United States)

    Milne, Jennifer L.; Sassoon, Richard E.; Hung, Emilie; Bosshard, Paolo; Benson, Sally M.

    The Global Climate and Energy Project (GCEP), at Stanford University, invests in research with the potential to lead to energy technologies with lower greenhouse gas emissions than current technologies. GCEP is sponsored by four international companies (ExxonMobil, GE, Schlumberger, and Toyota) and supports research programs in academic institutions worldwide. Research falls into the broad areas of carbon-based energy systems, renewables, electrochemistry, and the electric grid. Within these areas, research efforts are underway aimed at achieving breakthroughs and innovations that greatly improve the efficiency, performance, functionality and cost of many potential energy technologies of the future, including solar, batteries, fuel cells, biofuels, hydrogen storage, and carbon capture and storage. This paper presents a summary of some of GCEP's activities over the past 7 years, with current research areas of interest and potential research directions in the near future.

  12. Energy research for practice; Energieforschung fuer die Praxis

    Energy Technology Data Exchange (ETDEWEB)

    Lang, Johannes (ed.) [FIZ Karlsruhe, Bonn (Germany). BINE Informationsdienst

    2006-07-01

    The BINE editorial team, experts with a background in engineering and journalism, provide information in an independent, experienced and critical manner. Current information from research and pilot projects is thoroughly researched and prepared in a target-group-oriented way. The three series of brochures (Projektinfo, Themeninfo and basisEnergie), which describe results and experience gathered from research projects, are geared toward those who could potentially apply this information in practice, i.e. developers, planners, consultants, investors, energy suppliers and occupants. These publications, as well as the BINE newsletter, can be subscribed to at no cost. At www.bine.info, the information provided is systematically interconnected with additional information. The BINE Information Service facilitates the transfer of knowledge and information from energy research to practice, while cooperating closely with companies and institutions which, within the framework of sponsored projects, work to make efficiency technologies and renewable energy sources ready for use. Numerous collaborations with establishments in the fields of research, education and practice, as well as with trade press and politicians, serve to accelerate the application of energy research topics. The BINE Information Service is provided by FIZ Karlsruhe and sponsored by the German Federal Ministry of Economics and Technology. (orig.)

  13. Energy Research - Sandia Energy

    Science.gov (United States)

    Sandia National Laboratories' energy research program pursues a secure and sustainable energy future, with stationary power research spanning solar energy, wind energy, water power, supercritical CO2, and energy conversion efficiency (increasing the amount of electricity produced from a given thermal energy input).

  14. A user-friendly web portal for T-Coffee on supercomputers

    Directory of Open Access Journals (Sweden)

    Koetsier Jos

    2011-05-01

    Abstract Background: Parallel T-Coffee (PTC) was the first parallel implementation of the T-Coffee multiple sequence alignment tool. It is based on MPI and RMA mechanisms. Its purpose is to reduce the execution time of large-scale sequence alignments. It can be run on distributed-memory clusters, allowing users to align data sets consisting of hundreds of proteins within a reasonable time. However, most of the potential users of this tool are not familiar with the use of grids or supercomputers. Results: In this paper we show how PTC can be easily deployed and controlled on a supercomputer architecture using a web portal developed using Rapid. Rapid is a tool for efficiently generating standardized portlets for a wide range of applications, and the approach described here is generic enough to be applied to other applications or to deploy PTC on different HPC environments. Conclusions: The PTC portal allows users to upload a large number of sequences to be aligned by the parallel version of T-Coffee that cannot be aligned on a single machine due to memory and execution-time constraints. The web portal provides a user-friendly solution.

  15. A multilayered analysis of energy security research and the energy supply process

    International Nuclear Information System (INIS)

    Kiriyama, Eriko; Kajikawa, Yuya

    2014-01-01

    Highlights: • The analysis reveals that energy security research is highly multidisciplinary. • Diversification is important for ensuring security in the energy supply process. • A multilayered overview of the energy supply process is important for energy risk management. • Consumer lifestyle innovation will be a part of energy security in the future. - Abstract: After the Fukushima nuclear disaster, a reassessment of the energy system is needed in order to include such aspects as human security and resilience. More open and careful discussions are needed concerning the various risks and uncertainties of future energy options, both in Japan and globally. In this paper, we aim to offer a fundamental basis for discourse on energy security by analyzing the status and trends in academic publications on that issue. Our bibliometrics analysis indicates that research has shifted from promoting strategies for ensuring the self-sufficiency of the primary energy to diversification of the secondary energy supply chain by introducing energy networks consisting of an infrastructure established through international coordination. In the literature, the concept of energy security is ambiguous and allows for multiple interpretations. Our results illustrate the existence of highly multidisciplinary topics within energy security, which can be categorized into four perspectives: geopolitical, economic, policy related, and technological

  16. The Centres for Environment-friendly Energy Research (FME)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    High expectations surround Norway's Centres for Environment-friendly Energy Research (FME). The FME centres address a broad range of areas, all central to developing the energy sector of the future. The activities of the eight centres established in 2009 focus on renewable energy, raising energy efficiency, energy planning, and carbon capture and storage (CCS). In 2011 three new FME centres were established which focus on social science-related energy research. The FME scheme is a direct follow-up of the broad-based political agreement on climate policy achieved in the Storting in January 2008, and of the national R&D Energi21 strategy submitted in February 2008 to the Ministry of Petroleum and Energy. In April 2008 the Research Council of Norway's Executive Board decided to launch a process to establish centres for environment-friendly energy research, and a funding announcement was issued that same year. In 2010 it was decided that additional FME centres would be established in the field of social science-related energy research. After a thorough assessment of each project (based on feasibility, scientific merit, potential to generate value creation and innovation, and composition of the consortium), eight applicants were selected to become FME centres in February 2009. A new call for proposals was issued in 2010, and three more centres were awarded FME status in February 2011. The objective of the FME scheme is to establish time-limited research centres which conduct concentrated, focused and long-term research of high international calibre in order to solve specific challenges in the energy sphere. The selected centres must exhibit higher goals, a longer-term perspective and a more concentrated focus than is required under other funding instruments for the same scientific area. The make-up of the centres is critical to achieving this objective. The centres bring together Norway's leading research institutions and key players in private enterprise, the

  17. [Applications of GIS in biomass energy source research].

    Science.gov (United States)

    Su, Xian-Ming; Wang, Wu-Kui; Li, Yi-Wei; Sun, Wen-Xiang; Shi, Hai; Zhang, Da-Hong

    2010-03-01

    Biomass resources are widespread but dispersed, and their distribution is closely related to environment, climate, soil, and land use. Geographic information systems (GIS) offer spatial analysis functions and the flexibility to integrate with other application models and algorithms, making them particularly well suited to biomass energy research. This paper summarized research on GIS applications in biomass energy, focusing on feasibility studies of bioenergy development, assessment of the amount and distribution of biomass resources, layout of biomass exploitation and utilization, evaluation of gaseous emissions from biomass burning, and biomass energy information systems. Three directions for future GIS applications in biomass energy research were proposed: enriching the data sources, improving the capacity for data processing and decision support, and delivering proposals online.
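    The weighted-overlay analysis described above can be sketched in a few lines. The sketch below is purely illustrative: the layers, weights, cell size, and thresholds are invented for the example and do not come from the paper.

    ```python
    import numpy as np

    # Hypothetical GIS overlay for bioenergy siting: combine gridded
    # layers (biomass yield, slope, distance to roads) into a single
    # suitability raster. All values and weights are made up.

    rng = np.random.default_rng(1)
    shape = (50, 50)                           # raster grid cells
    yield_t_ha = rng.uniform(0, 12, shape)     # biomass yield layer
    slope_deg = rng.uniform(0, 30, shape)      # terrain slope layer
    road_km = rng.uniform(0, 20, shape)        # distance-to-road layer

    # Normalise each layer to [0, 1]; higher means more suitable.
    s_yield = yield_t_ha / yield_t_ha.max()
    s_slope = 1 - slope_deg / slope_deg.max()
    s_road = 1 - road_km / road_km.max()

    # Weighted overlay, a standard GIS spatial-analysis operation.
    suitability = 0.5 * s_yield + 0.2 * s_slope + 0.3 * s_road

    # Mask out unusable cells (e.g. slope > 25 degrees), report area.
    usable = suitability * (slope_deg <= 25)
    cell_area_ha = 25.0                        # assumed 500 m cells
    suitable_area = cell_area_ha * np.count_nonzero(usable > 0.6)
    print(f"Highly suitable area: {suitable_area:.0f} ha")
    ```

    In a real study the random layers would be replaced by rasters read from a GIS data source, but the overlay arithmetic is the same.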

  18. Research planning in the energy sector

    International Nuclear Information System (INIS)

    Graenicher, H.

    1977-06-01

    The author considers research planning split into four separate aspects: the character of the research situation; the function of planning stages; the type of research target; and the limit of the application of research planning by planning stages. He then considers the specific problem of energy research and discusses the question of what the state is to do and how to do it with particular attention to the Swiss situation. (G.T.H)

  19. Sustainable energy systems and the EURATOM research programme

    International Nuclear Information System (INIS)

    Webster, S.; Van Goethem, G.; )

    2007-01-01

    We are at a turning point in European research. With the launch of the EU's 7th Framework Programme, committing some Euro 53 billion of public funds to the European research effort over the next 7 years, Europe has finally woken up to the importance of Research and Development in the realisation of the most fundamental objectives defining the Union: growth, competitiveness, and knowledge. At the same time, and with strong links to growth and competitiveness but also to environmental protection, the Union is in the throes of an intense debate on future energy policy and climate change. Part of the research budget, some would say too small a part, is earmarked for energy - in particular the technological aspects of low carbon systems such as renewables. This effort, together with measures to improve the EU's security and independence of supply, is essential if Europe is to respond effectively to the future energy conundrum. But where does nuclear fit in all this? What will the Union be doing in the area of nuclear research? Indeed, does nuclear figure at all in the long-term plans of the Union? Through the EURATOM part of the Framework Programme, the EU is maintaining important support to up-stream research in the area of advanced reactor technologies. This effort is being coordinated at the global level through EURATOM's membership of the Generation-IV International Forum. Though EU research in this field still has its critics among the Member States, and despite the relatively small sums currently committed, the leverage effect of current actions is significant and this is set to grow in the future. The imminent setting up of a Strategic Energy Technology Plan, as part of the European Commission's ongoing activities in the field of energy policy, and the feedback from independent experts in the Advisory Group on Energy and the EURATOM Scientific and Technical Committee all point to the following conclusions: EU support for research on advanced nuclear fission

  20. Jointly working on research for the energies of the future. Objectives of research; Gemeinsam forschen fuer die Energie der Zukunft. Forschungsziele

    Energy Technology Data Exchange (ETDEWEB)

    Stadermann, G.; Szczepanski, P. (comps.)

    2006-07-01

    The booklet consists of chapters and various articles: Doing research work with joint efforts; R and D - political objectives of FVS; fields of research and development; electrical system techniques; network management and separated power plants; heat and coolness from renewable energies; solar construction works: building covers and system techniques; generating and utilizing chemical energy sources from renewable energies; estimating consequences of techniques.

  1. History of the Energy Research and Development Administration

    Energy Technology Data Exchange (ETDEWEB)

    Buck, A.L.

    1982-03-01

    Congress created the Energy Research and Development Administration on October 11, 1974 in response to the Nation's growing need for additional sources of energy. The new agency would coordinate energy programs formerly scattered among many federal agencies, and serve as the focus point for a major effort by the Federal Government to expand energy research and development efforts. New ways to conserve existing supplies as well as the commercial demonstration of new technologies would hopefully be the fruit of the Government's first significant effort to amalgamate energy resource development programs. This history briefly summarizes the accomplishments of the agency.

  2. CREATIV: Research-based innovation for industry energy efficiency

    International Nuclear Information System (INIS)

    Tangen, Grethe; Hemmingsen, Anne Karin T.; Neksa, Petter

    2011-01-01

    Improved energy efficiency is imperative to minimise greenhouse gas emissions and to ensure future energy security. It is also a key to continued profitability in energy consuming industry. The project CREATIV is a research initiative for industry energy efficiency focusing on utilisation of surplus heat and efficient heating and cooling. In CREATIV, international research groups work together with key vendors of energy efficiency equipment and an industry consortium spanning metallurgy, pulp and paper, food and fishery, and commercial refrigeration in supermarkets. The ambition of CREATIV is to bring forward technology and solutions enabling Norway to reduce both energy consumption and greenhouse gas emissions by 25% by 2020. The main research topics are electricity production from low temperature heat sources in supercritical CO2 cycles, energy efficient end-user technology for heating and cooling based on natural working fluids and system optimisation, and efficient utilisation of low temperature heat by developing new sorption systems and compact compressor-expander units. A defined innovation strategy in the project will ensure exploitation of research results and promote implementation in industry processes. CREATIV will contribute to the recruitment of competent personnel to industry and academia by educating PhD and postdoc candidates and several MSc students. The paper presents the CREATIV project, discusses its scientific achievements so far, and outlines how the project results can contribute to reducing industry energy consumption. - Highlights: → New technology for improved energy efficiency relevant across several industries. → Surplus heat exploitation and efficient heating and cooling are important means. → Focus on power production from low temperature heat and heat pumping technologies. → Education and competence building are given priority. → The project consortium includes 20 international industry companies and

  3. Research in high energy physics

    International Nuclear Information System (INIS)

    1992-01-01

    This report discusses research being conducted in high energy physics in the following areas: quantum chromodynamics; drift chambers; proton-antiproton interactions; particle decays; particle production; polarimeters; quark-gluon plasma; and conformal field theory

  4. Research in high energy physics

    International Nuclear Information System (INIS)

    1992-01-01

    This report discusses research being conducted in high energy physics in the following areas: quantum chromodynamics; drift chambers; proton-antiproton interactions; particle decays; particle production; polarimeters; quark-gluon plasma; and conformal field theory

  5. Portable implementation model for CFD simulations. Application to hybrid CPU/GPU supercomputers

    Science.gov (United States)

    Oyarzun, Guillermo; Borrell, Ricard; Gorobets, Andrey; Oliva, Assensi

    2017-10-01

    Nowadays, high performance computing (HPC) systems are experiencing a disruptive moment, with a variety of novel architectures and frameworks and no clarity as to which one is going to prevail. In this context, the portability of codes across different architectures is of major importance. This paper presents a portable implementation model based on an algebraic operational approach for direct numerical simulation (DNS) and large eddy simulation (LES) of incompressible turbulent flows using unstructured hybrid meshes. The strategy proposed consists of representing the whole time-integration algorithm using only three basic algebraic operations: the sparse matrix-vector product (SpMV), a linear combination of vectors, and the dot product. The main idea is to decompose the nonlinear operators into a concatenation of two SpMV operations. This provides high modularity and portability. An exhaustive analysis of the proposed implementation for hybrid CPU/GPU supercomputers has been conducted with tests using up to 128 GPUs. The main objective is to understand the challenges of implementing CFD codes on new architectures.
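    The three-kernel decomposition described in the abstract can be sketched in a few lines. The sketch below is illustrative only, not the authors' code: the operator `C_of`, the problem sizes, and the explicit Euler step are assumptions. It shows how an entire update can be built from SpMV, axpy-style linear combinations, and a dot product, with the nonlinear term evaluated as a concatenation of two SpMV-like operations.

    ```python
    import numpy as np

    # Illustrative sketch of an algebraic operational approach: the
    # whole time-integration algorithm is expressed with only three
    # kernels, each easy to port to CPU or GPU back-ends. Dense NumPy
    # arrays stand in for the sparse operators.

    def spmv(A, x):
        """Kernel 1: (sparse) matrix-vector product."""
        return A @ x

    def axpy(a, x, y):
        """Kernel 2: linear combination of vectors, a*x + y."""
        return a * x + y

    def dot(x, y):
        """Kernel 3: dot product."""
        return float(x @ y)

    def convective(C_of, u):
        """Nonlinear term as a concatenation of two SpMV-like steps:
        first evaluate the operator C(u), then apply it to u."""
        return spmv(C_of(u), u)

    # Toy explicit Euler step for du/dt = -C(u)u + nu*L u (assumed
    # form; L is a stand-in diffusion operator, C_of is hypothetical).
    n = 64
    rng = np.random.default_rng(0)
    L = rng.standard_normal((n, n)) / n
    u = rng.standard_normal(n)
    C_of = lambda v: np.diag(v) @ L

    dt, nu = 1e-3, 0.01
    rhs = axpy(nu, spmv(L, u), -convective(C_of, u))
    u_new = axpy(dt, rhs, u)
    residual = dot(rhs, rhs)   # convergence monitor built from kernel 3
    ```

    Because every step of the solver reduces to these three calls, porting to a new architecture only requires reimplementing the three kernels, which is the portability argument the paper makes.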

  6. Basic Solar Energy Research in Japan (2011 EFRC Forum)

    International Nuclear Information System (INIS)

    Domen, Kazunari

    2011-01-01

    Kazunari Domen, Chemical System Engineering Professor at the University of Tokyo, was the second speaker in the May 26, 2011 EFRC Forum session, 'Global Perspectives on Frontiers in Energy Research.' In his presentation, Professor Domen talked about basic solar energy research in Japan. The 2011 EFRC Summit and Forum brought together the EFRC community and science and policy leaders from universities, national laboratories, industry and government to discuss 'Science for our Nation's Energy Future.' In August 2009, the Office of Science established 46 Energy Frontier Research Centers. The EFRCs are collaborative research efforts intended to accelerate high-risk, high-reward fundamental research, the scientific basis for transformative energy technologies of the future. These Centers involve universities, national laboratories, nonprofit organizations, and for-profit firms, singly or in partnerships, selected by scientific peer review. They are funded at $2 to $5 million per year for a total planned DOE commitment of $777 million over the initial five-year award period, pending Congressional appropriations. These integrated, multi-investigator Centers are conducting fundamental research focusing on one or more of several 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The purpose of the EFRCs is to integrate the talents and expertise of leading scientists in a setting designed to accelerate research that transforms the future of energy and the environment.

  7. Progress in turbulence research

    International Nuclear Information System (INIS)

    Bradshaw, P.

    1990-01-01

    Recent developments in experiments and eddy simulations are reviewed as an introduction to a discussion of turbulence modeling for engineers. The most important advances in the last decade rely on computers: microcomputers to control laboratory experiments, especially for multidimensional imaging, and supercomputers to simulate turbulence. These basic studies in turbulence research are leading to genuine breakthroughs in prediction methods for engineers and earth scientists. The three main branches of turbulence research are discussed: experiments; simulations (numerically-accurate three-dimensional, time-dependent solutions of the Navier-Stokes equations, with any empiricism confined to the smallest eddies); and modeling (empirical closure of time-averaged equations for turbulent flow). 33 refs

  8. Energy research information system projects report, volume 5, number 1

    Science.gov (United States)

    Johnson, J.; Schillinger, L.

    1980-07-01

    The system (ERIS) provides an inventory of energy-related programs and research activities from 1974 to the present in the states of Montana, Nebraska, North Dakota, South Dakota and Wyoming. Areas of research covered include coal, reclamation, water resources, environmental impacts, socioeconomic impacts, energy conversion, mining methodology, petroleum, natural gas, oil shale, renewable energy resources, nuclear energy, energy conservation and land use. Each project description lists title, investigator(s), research institution, sponsor, funding, time frame, location, a descriptive abstract of the research, and the titles of reports and/or publications generated by the research. All projects are indexed by location, personal names, organizations and subject keywords.

  9. Synthesis of the 1. ANR Energy Assessment colloquium - Which research for tomorrow's energy?

    International Nuclear Information System (INIS)

    Lecourtier, Jacqueline; Pappalardo, Michele; Bucaille, Alain; Falanga, Anne; Fouillac, Christian; Amouroux, Jacques; Bouchard, Patrick; Cadet, Daniel; Fioni, Gabriele; Appert, Olivier; Le Quere, Patrick; Bernard, Herve; Moisan, Francois; Witte, Marc de; Cochevelou, Gilles; Bastien, Remi; Heitzmann, Martha; Lefebvre, Thierry; Michon, Ulysse; Perrier, Olivier; Tarascon, Jean-Marie; Lincot, Daniel; Hadziioannou, Georges; Jacquemelle, Michele; Mermilliod, Nicole; Saulnier, Jean-Bernard

    2009-11-01

    Presented by representatives of the main companies, agencies and institutions involved, the contributions of this colloquium addressed the following issues: the role of new energy technologies in sustainable development in France and worldwide; the 'New energy technologies' programmes; research priorities for these new technologies; industry perspectives and challenges; SMEs and the ANR; research perspectives and challenges (electrochemical storage of energy, solar photovoltaic energy, new materials for energy, integration of renewable energies in electric systems, technological innovations for new energy technologies)

  10. Research on high energy density plasmas and applications

    International Nuclear Information System (INIS)

    1999-01-01

    Recently, technologies for lasers, accelerators, and pulse power machines have advanced significantly, and input power densities now cover the range from 10^10 W/cm^2 to higher than 10^20 W/cm^2. As a result, high pressure gas and solid targets can be heated to very high temperatures, creating hot dense plasmas that have never before appeared on the earth. These high energy density plasmas have opened up new research fields such as inertial confinement fusion, high brightness X-ray radiation sources, the interiors of galactic nuclei, supernovae, stars and planets, ultra high pressure condensed matter physics, plasma particle accelerators, X-ray lasers, and so on. Furthermore, since these fields are intimately connected with various industrial sciences and technologies, high energy density plasmas are now studied in industry, government institutions, and elsewhere. This special issue of the Journal of Plasma Physics and Nuclear Fusion Research reviews high energy density plasma science to provide a comprehensive understanding of these new fields. In May 1998, a review committee for investigating the present status and future prospects of high energy density plasma science was established in the Japan Society of Plasma Science and Nuclear Fusion Research. We held three committee meetings to discuss the present status and critical issues of research related to high energy density plasmas. This special issue summarizes the understandings of the committee. It consists of four chapters: Chapter 1: Physics important in high energy density plasmas; Chapter 2: Technologies related to plasma generation, drivers such as lasers, pulse power machines and particle beams, and the fabrication of various targets; Chapter 3: Plasma diagnostics important in high energy density plasma experiments; Chapter 4: A variety of applications of high energy density plasmas, including X-ray radiation, particle acceleration, inertial confinement fusion, and laboratory astrophysics

  11. Swiss Energy research 2007 - Overview from the Heads of the Programs; Energie-Forschung 2007. Ueberblicksberichte der Programmleiter

    Energy Technology Data Exchange (ETDEWEB)

    Calisesi, Y

    2008-04-15

    This comprehensive document issued by the Swiss Federal Office of Energy (SFOE) presents the overview reports elaborated by the heads of the various Swiss energy research programmes. Topics covered include the efficient use of energy, with reports covering energy in buildings, traffic and accumulators, electrical technologies, applications and grids, ambient heat, combined heat and power, cooling, combustion, the 'power station 2000', fuel cells and hydrogen and process engineering. Renewable energy topics reported on include solar heat, photovoltaics, industrial solar energy, biomass and wood energy, hydropower, geothermal heat and wind energy. Nuclear energy topics include safety, regulatory safety research and nuclear fusion. Finally, energy economics basics are reviewed. The report is completed with annexes on the Swiss Energy Research Commission, energy research organisations and a list of important addresses.

  12. Swiss Energy research 2007 - Overview from the Heads of the Programs; Energie-Forschung 2007. Ueberblicksberichte der Programmleiter

    Energy Technology Data Exchange (ETDEWEB)

    Calisesi, Y.

    2008-04-15

    This comprehensive document issued by the Swiss Federal Office of Energy (SFOE) presents the overview reports elaborated by the heads of the various Swiss energy research programmes. Topics covered include the efficient use of energy, with reports covering energy in buildings, traffic and accumulators, electrical technologies, applications and grids, ambient heat, combined heat and power, cooling, combustion, the 'power station 2000', fuel cells and hydrogen and process engineering. Renewable energy topics reported on include solar heat, photovoltaics, industrial solar energy, biomass and wood energy, hydropower, geothermal heat and wind energy. Nuclear energy topics include safety, regulatory safety research and nuclear fusion. Finally, energy economics basics are reviewed. The report is completed with annexes on the Swiss Energy Research Commission, energy research organisations and a list of important addresses.

  13. RESEARCH OF GLOBAL NEW INVESTMENT IN RENEWABLE ENERGY

    Directory of Open Access Journals (Sweden)

    О. Chernyak

    2015-10-01

    This article contains results of studying the experiences of the leading countries in the development of renewable energy technologies. A classification of renewable energy was presented. The article investigated modern trends and prospects of wind power, solar energy, hydropower, bioenergy and geothermal energy. The authors analyzed different national strategies for attracting investments in 'green' energy. A rating of the 10 countries with the largest investments in alternative energy was presented. The authors researched investments in developed and developing countries, depending on the type of renewable energy. A model for research and forecasting of investment in renewable energy was built on annual data for the period 1990-2012. In addition, the authors used methods such as the moving average, exponential smoothing, the Holt-Winters method and different types of trends, based on quarterly data for 2004-2014.
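    As an illustration of one of the smoothing methods named above, here is a minimal sketch of Holt's two-parameter (linear-trend) exponential smoothing, the non-seasonal core of the Holt-Winters method. The investment figures and the alpha/beta values are invented for the example and are not the authors' data or model.

    ```python
    # Holt's linear-trend exponential smoothing: maintain a smoothed
    # level and a smoothed trend, then extrapolate both for forecasts.

    def holt_forecast(series, alpha=0.5, beta=0.3, horizon=3):
        """Return `horizon` forecasts from Holt's two-parameter method."""
        level, trend = series[0], series[1] - series[0]
        for y in series[1:]:
            prev_level = level
            level = alpha * y + (1 - alpha) * (level + trend)
            trend = beta * (level - prev_level) + (1 - beta) * trend
        return [level + (h + 1) * trend for h in range(horizon)]

    # Hypothetical annual investment figures (billion USD), rising trend.
    investment = [10, 13, 17, 22, 28, 35, 43]
    print(holt_forecast(investment))
    ```

    Adding a third smoothed component for seasonality turns this into the full Holt-Winters method, which is better suited to the quarterly data the abstract mentions.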

  14. Energy research shows the way to sustainable energy policy

    International Nuclear Information System (INIS)

    Glatthard, T.

    2000-01-01

    This article takes a look at the work of the Swiss research programme on energy economics basics that aims to provide advice for policy makers. The programme investigates not only the technological but also the social and economic factors to be taken into consideration. In particular, the article reviews the programme's work on promotion strategies for sustainability in the energy area in connection with a proposed levy on energy. Examples are given of possible implementation strategies concerning new and existing buildings. The responsibilities of the parties to be involved in the implementation of promotional measures such as cantonal authorities, professional associations and agencies are discussed

  15. EDF's experience with supercomputing and challenges ahead - towards multi-physics and multi-scale approaches

    International Nuclear Information System (INIS)

    Delbecq, J.M.; Banner, D.

    2003-01-01

    Nuclear power plants are a major asset of the EDF company. To remain so, in particular in a context of deregulation, competitiveness, safety and public acceptance are three conditions. These stakes apply both to existing plants and to future reactors. The purpose of the presentation is to explain how supercomputing can help EDF to satisfy these requirements. Three examples are described in detail: ensuring optimal use of nuclear fuel under wholly safe conditions, understanding and simulating the material deterioration mechanisms and moving forward with numerical simulation for the performance of EDF's activities. In conclusion, a broader vision of EDF long term R and D in the field of numerical simulation is given and especially of five challenges taken up by EDF together with its industrial and scientific partners. (author)

  16. Energy Technology Division research summary 1997

    International Nuclear Information System (INIS)

    1997-01-01

    The Energy Technology Division provides materials and engineering technology support to a wide range of programs important to the US Department of Energy. As shown on the preceding page, the Division is organized into ten sections, five with concentrations in the materials area and five in engineering technology. Materials expertise includes fabrication, mechanical properties, corrosion, friction and lubrication, and irradiation effects. Our major engineering strengths are in heat and mass flow, sensors and instrumentation, nondestructive testing, transportation, and electromechanics and superconductivity applications. The Division Safety Coordinator, Environmental Compliance Officers, Quality Assurance Representative, Financial Administrator, and Communication Coordinator report directly to the Division Director. The Division Director is personally responsible for cultural diversity and is a member of the Laboratory-wide Cultural Diversity Advisory Committee. The Division's capabilities are generally applied to issues associated with energy production, transportation, utilization or conservation, or with environmental issues linked to energy. As shown in the organization chart on the next page, the Division reports administratively to the Associate Laboratory Director (ALD) for Energy and Environmental Science and Technology (EEST) through the General Manager for Environmental and Industrial Technologies. While most of our programs are under the purview of the EEST ALD, we also have had programs funded under every one of the ALDs. Some of our research in superconductivity is funded through the Physical Research Program ALD. We also continue to work on a number of nuclear-energy-related programs under the ALD for Engineering Research. Detailed descriptions of our programs on a section-by-section basis are provided in the remainder of this book. This Overview highlights some major trends. 
Research related to the operational safety of commercial light water nuclear

  17. Energy Technology Division research summary 1997.

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-10-21

    The Energy Technology Division provides materials and engineering technology support to a wide range of programs important to the US Department of Energy. As shown on the preceding page, the Division is organized into ten sections, five with concentrations in the materials area and five in engineering technology. Materials expertise includes fabrication, mechanical properties, corrosion, friction and lubrication, and irradiation effects. Our major engineering strengths are in heat and mass flow, sensors and instrumentation, nondestructive testing, transportation, and electromechanics and superconductivity applications. The Division Safety Coordinator, Environmental Compliance Officers, Quality Assurance Representative, Financial Administrator, and Communication Coordinator report directly to the Division Director. The Division Director is personally responsible for cultural diversity and is a member of the Laboratory-wide Cultural Diversity Advisory Committee. The Division's capabilities are generally applied to issues associated with energy production, transportation, utilization or conservation, or with environmental issues linked to energy. As shown in the organization chart on the next page, the Division reports administratively to the Associate Laboratory Director (ALD) for Energy and Environmental Science and Technology (EEST) through the General Manager for Environmental and Industrial Technologies. While most of our programs are under the purview of the EEST ALD, we also have had programs funded under every one of the ALDs. Some of our research in superconductivity is funded through the Physical Research Program ALD. We also continue to work on a number of nuclear-energy-related programs under the ALD for Engineering Research. Detailed descriptions of our programs on a section-by-section basis are provided in the remainder of this book. This Overview highlights some major trends. Research related to the operational safety of commercial light water

  18. 3rd programme 'Energy research and energy technologies'

    International Nuclear Information System (INIS)

    1990-01-01

    In the light of developments in the 80s, the questions of dependence and available resources seem less grave in the long and medium term; on the other hand, a further problem has arisen which might prove even more serious with a view to safeguarding long-term energy supply: the use of fossil energy sources such as coal, petroleum, and natural gas involves effects constituting a considerable threat to the environment and the world climate. Examples are acid rain and the greenhouse effect. Furthermore, new safety issues and, to a larger extent, acceptance issues have arisen as regards nuclear energy utilization. To contribute towards solving these problems through research and development is the main objective of this programme. The strategy adopted comprises two complementary approaches: elaboration of scientific bases, system connections, and new techniques permitting - continued use of primary and secondary energy sources to the extent required while taking into account the needs of an increasingly more vulnerable environment; - to ensure the lowest possible energy consumption in the future, reducing, at the same time, considerably the amount of greenhouse gases emitted. (orig./UA) [de

  19. Energy Frontier Research Centers: Helping Win the Energy Innovation Race (2011 EFRC Summit Keynote Address, Secretary of Energy Chu)

    International Nuclear Information System (INIS)

    Chu, Steven

    2011-01-01

    Secretary of Energy Steven Chu gave the keynote address at the 2011 EFRC Summit and Forum. In his talk, Secretary Chu highlighted the need to 'unleash America's science and research community' to achieve energy breakthroughs. The 2011 EFRC Summit and Forum brought together the EFRC community and science and policy leaders from universities, national laboratories, industry and government to discuss 'Science for our Nation's Energy Future.' In August 2009, the Office of Science established 46 Energy Frontier Research Centers. The EFRCs are collaborative research efforts intended to accelerate high-risk, high-reward fundamental research, the scientific basis for transformative energy technologies of the future. These Centers involve universities, national laboratories, nonprofit organizations, and for-profit firms, singly or in partnerships, selected by scientific peer review. They are funded at $2 to $5 million per year for a total planned DOE commitment of $777 million over the initial five-year award period, pending Congressional appropriations. These integrated, multi-investigator Centers are conducting fundamental research focusing on one or more of several 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The purpose of the EFRCs is to integrate the talents and expertise of leading scientists in a setting designed to accelerate research that transforms the future of energy and the environment.

  20. [Research in high energy physics

    International Nuclear Information System (INIS)

    1991-01-01

    This report discusses progress in the following research in high energy physics: the Crystal Ball experiment; DELCO at PEP; the proton decay experiment; the MACRO detector; the Mark III detector; the SLD detector; the CLEO II detector; and the Caltech L3 group

  1. Energy Frontier Research Centers: Science for Our Nation's Energy Future, September 2016

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2016-09-01

    As world demand for energy rapidly expands, transforming the way energy is collected, stored, and used has become a defining challenge of the 21st century. At its heart, this challenge is a scientific one, inspiring the U.S. Department of Energy’s (DOE) Office of Basic Energy Sciences (BES) to establish the Energy Frontier Research Center (EFRC) program in 2009. The EFRCs represent a unique approach, bringing together creative, multidisciplinary scientific teams to perform energy-relevant basic research with a complexity beyond the scope of single-investigator projects. These centers take full advantage of powerful new tools for characterizing, understanding, modeling, and manipulating matter from atomic to macroscopic length scales. They also train the next-generation scientific workforce by attracting talented students and postdoctoral researchers interested in energy science. The EFRCs have collectively demonstrated the potential to substantially advance the scientific understanding underpinning transformational energy technologies. Both a BES Committee of Visitors and a Secretary of Energy Advisory Board Task Force have found the EFRC program to be highly successful in meeting its goals. The scientific output from the EFRCs is impressive, and many centers have reported that their results are already impacting both technology research and industry. This report on the EFRC program includes selected highlights from the initial 46 EFRCs and the current 36 EFRCs.

  2. Application of diffusion research to solar energy policy issues

    Energy Technology Data Exchange (ETDEWEB)

    Roessner, J. D.; Posner, D.; Shoemaker, F.; Shama, A.

    1979-03-01

    This paper examines two types of information requirements that appear to be basic to DOE solar-energy-policy decisions: (1) how can the future market success of solar energy technologies be estimated, and (2) what factors influence the adoption of solar energy technologies, and what specific programs could promote solar energy adoption most effectively. This paper assesses the ability of a body of research, referred to here as diffusion research, to supply information that could partially satisfy these requirements. This assessment proceeds, first, by defining in greater detail a series of policy issues that face DOE. These are divided into cost reduction and performance improvement issues which include issues confronting the technology development component of the solar energy program, and barriers and incentives issues which are most relevant to problems of solar energy application. Second, these issues are translated into a series of questions that the diffusion approach can help resolve. Third, various elements within diffusion research are assessed in terms of their abilities to answer policy questions. Finally, the strengths and limitations of current knowledge about the diffusion of innovations are summarized, the applicability of both existing knowledge and the diffusion approach to the identified solar-energy-policy issues are discussed, and ways are suggested in which diffusion approaches can be modified and existing knowledge employed to meet short- and long-term goals of DOE. The inquiry covers the field of classical diffusion research, market research and consumer behavior, communication research, and solar-energy market-penetration modeling.

  3. Energy in Ireland: context, strategy and research; Energie en Irlande: contexte, strategie et recherche

    Energy Technology Data Exchange (ETDEWEB)

    Saintherant, N.; Lerouge, Ch.; Welcker, A

    2008-01-15

    In the present-day context of sudden awareness of climate change and an announced fossil fuel shortage, Ireland has defined a new strategy for its energy future. Context: Ireland is strongly dependent on oil and gas imports, which increase regularly to meet demand. A small part of the electricity consumed is imported from Ulster. The share of renewable energies remains weak but is increasing significantly: from 1990 to 2006, the proportion of renewable energies rose from 1.9% (mainly of hydroelectric origin) to 4.5%. Wind power is now the main renewable energy source. The transportation sector is the most energy-consuming and the biggest source of greenhouse gases. Strategy: Irish policy is driven by pluri-annual strategic plans which define objectives and means. Priority is given to security of supplies at affordable prices: 8.5 billion euros will be invested during the 2007-2013 period in the modernization of existing energy infrastructures and companies and, to a lesser extent, in the development of renewable energy sources. During this period, a further 415 million euros will be devoted to the research, development and demonstration (RD and D) of new energy solutions. Research: in 2005 the energy RD and D expenses reached 12.8 million euros, shared between R and D (54%) and demonstration projects (46%). Half of the financing goes to higher education institutions and is devoted to energy saving (33%) and to renewable energies (29%, mainly wind power and biomass). Academic research pays particular attention to ocean energy, which represents an important potential resource for Ireland and has already led to the creation of innovative companies. The integration of renewable energy sources into the power grid and the stability of supplies are also the subject of active research. (J.S.)

  4. SOFTWARE FOR SUPERCOMPUTER SKIF “ProLit-lC” and “ProNRS-lC” FOR FOUNDRY AND METALLURGICAL PRODUCTIONS

    Directory of Open Access Journals (Sweden)

    A. N. Chichko

    2008-01-01

    Full Text Available. Results of modeling the mold-filling process on the SKIF supercomputer system with the computer system 'ProLIT-lc', together with results of modeling the steel-pouring process with 'ProNRS-lc', are presented. The influence of the number of processors of the multi-core SKIF system on the speedup and modeling time of the technological processes connected with the production of castings and slugs is shown.

  5. Palacios and Kitten : high performance operating systems for scalable virtualized and native supercomputing.

    Energy Technology Data Exchange (ETDEWEB)

    Widener, Patrick (University of New Mexico); Jaconette, Steven (Northwestern University); Bridges, Patrick G. (University of New Mexico); Xia, Lei (Northwestern University); Dinda, Peter (Northwestern University); Cui, Zheng.; Lange, John (Northwestern University); Hudson, Trammell B.; Levenhagen, Michael J.; Pedretti, Kevin Thomas Tauke; Brightwell, Ronald Brian

    2009-09-01

    Palacios and Kitten are new open source tools that enable applications, whether ported or not, to achieve scalable high performance on large machines. They provide a thin layer over the hardware to support both full-featured virtualized environments and native code bases. Kitten is an OS under development at Sandia that implements a lightweight kernel architecture to provide predictable behavior and increased flexibility on large machines, while also providing Linux binary compatibility. Palacios is a VMM that is under development at Northwestern University and the University of New Mexico. Palacios, which can be embedded into Kitten and other OSes, supports existing, unmodified applications and operating systems by using virtualization that leverages hardware technologies. We describe the design and implementation of both Kitten and Palacios. Our benchmarks show that they provide near native, scalable performance. Palacios and Kitten provide an incremental path to using supercomputer resources that is not performance-compromised.

  6. Measuring scientific research in emerging nano-energy field

    Science.gov (United States)

    Guan, Jiancheng; Liu, Na

    2014-04-01

    The purpose of this paper is to comprehensively explore scientific research profiles in the field of emerging nano-energy during 1991-2012, based on bibliometrics and social network analysis. We investigate the growth pattern of research output and then compare research performances across countries/regions. Furthermore, we examine scientific collaboration across countries/regions by analyzing collaborative intensity and networks in 3- to 4-year intervals. Results indicate that, alongside an impressive exponential growth of nano-energy articles, the world shares of the scientific "giants", such as the USA, Germany, England, France and Japan, display decreasing trends, especially for the USA. Emerging economies, including China, South Korea and India, exhibit a rise in world share, illustrating the strong momentum of these countries in nano-energy research. Strikingly, China displays a remarkable rise in scientific influence, rivaling Germany, Japan, France, and England in the last few years. Finally, the scientific collaborative network in nano-energy research has expanded steadily. Although the USA and several major European countries play significant roles in scientific collaboration, China and South Korea have exerted great influence on scientific collaboration in recent years. The findings imply that emerging economies can earn competitive advantages in some emerging fields by properly pursuing a catch-up strategy.
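The collaborative-intensity measure used in this kind of bibliometric study can be sketched in a few lines. The sketch below uses Salton's cosine measure (a common normalization in bibliometrics); the paper counts and country sets are hypothetical, not taken from the study:

```python
import math
from collections import Counter
from itertools import combinations

# Hypothetical sample: each record lists the countries of its author affiliations.
papers = [
    {"USA", "China"},
    {"USA", "Germany", "China"},
    {"China", "South Korea"},
    {"USA", "China"},
    {"Japan"},
]

# Count papers per country and joint papers per (sorted) country pair.
solo = Counter()
joint = Counter()
for countries in papers:
    solo.update(countries)
    for a, b in combinations(sorted(countries), 2):
        joint[(a, b)] += 1

# Salton's measure: joint output normalized by the geometric mean of outputs.
intensity = {
    (a, b): n / math.sqrt(solo[a] * solo[b])
    for (a, b), n in joint.items()
}
```

Computing this measure over successive 3- to 4-year windows, as the paper does, then shows how the collaboration network evolves over time.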

  7. On energy conservation and energy research. Om energioekonomisering og energiforskning

    Energy Technology Data Exchange (ETDEWEB)

    1989-01-01

    This report to the Storting (Parliament) is the third on energy conservation during the last 10 years. As before, the report mainly treats the use of energy for stationary purposes. The background for this report is, above all, the increased environmental requirements on energy policy attached to the use of fossil fuels. The economic energy conservation potential of Norway is estimated on the basis of present energy prices and available technology. For stationary energy use it amounts to about 23 TWh, of which 16 TWh refers to electric power and 7 TWh to oil. Among the measures taken by the authorities to realize this potential, information about energy economy and energy technology is one of the most important. Other important measures are research and development activities as well as temporary arrangements for economic support. Energy conservation efforts and efforts for a better environment should often be considered together, because higher energy efficiency in general can result in important positive environmental impacts. In the long term, the global environmental problems may be the strongest driving force for an increased effort in energy conservation. 38 figs., 22 tabs.

  8. Research opportunities to advance solar energy utilization.

    Science.gov (United States)

    Lewis, Nathan S

    2016-01-22

    Major developments, as well as remaining challenges and the associated research opportunities, are evaluated for three technologically distinct approaches to solar energy utilization: solar electricity, solar thermal, and solar fuels technologies. Much progress has been made, but research opportunities are still present for all approaches. Both evolutionary and revolutionary technology development, involving foundational research, applied research, learning by doing, demonstration projects, and deployment at scale will be needed to continue this technology-innovation ecosystem. Most of the approaches still offer the potential to provide much higher efficiencies, much lower costs, improved scalability, and new functionality, relative to the embodiments of solar energy-conversion systems that have been developed to date. Copyright © 2016, American Association for the Advancement of Science.

  9. Department of Defense energy policy and research: A framework to support strategy

    International Nuclear Information System (INIS)

    Strakos, Joshua K.; Quintanilla, Jose A.; Huscroft, Joseph R.

    2016-01-01

    The Department of Defense (DOD) is the major consumer of energy within the Federal government, and it has been directed to implement cost cutting measures related to energy dependence through numerous Executive Orders and Congressional legislation. As a result, the DOD released an Energy Strategy which outlines ways to reduce energy requirements in order to meet both Presidential and Congressional mandates for energy security. With this research, we provide a historical review (1973–2014) of energy policy, legislation, and research. Additionally we identify gaps between strategy and research. The results show that DOD energy research lacks a unifying structure and guiding framework. We propose a knowledge management framework to unify and guide research efforts in direct support of the DOD Energy Strategy. - Highlights: •Unification of effort is needed to support strategic goals. •Provides the current state of DOD energy research. •Proposes a framework to guide DOD energy research. •Frames the DOD energy research context and landscape. •Promotes a unifying structure for DOD energy research.

  10. A Strategy for Nuclear Energy Research and Development

    International Nuclear Information System (INIS)

    Bennett, Ralph G.

    2008-01-01

    The United States is facing unprecedented challenges in climate change and energy security. President-elect Obama has called for a reduction of CO2 emissions to 1990 levels by 2020, with a further 80% reduction by 2050. Meeting these aggressive goals while gradually increasing the overall energy supply requires that all non-emitting technologies be advanced. The development and deployment of nuclear energy can, in fact, help the United States meet several key challenges: (1) increase the electricity generated by non-emitting sources to mitigate climate change, (2) foster the safe and proliferation-resistant use of nuclear energy throughout the world, (3) reduce the transportation sector's dependence on imported fossil fuels, and (4) reduce the demand on natural gas for process heat and hydrogen production. However, because of the scale, cost, and time horizons involved, increasing nuclear energy's share will require a coordinated research effort, combining the efforts of industry and government, supported by innovation from the research community. This report outlines the significant nuclear energy research and development (R and D) necessary to create options that will allow government and industrial decision-makers to set policies and create nuclear energy initiatives that are decisive and sustainable. The nuclear energy R and D strategy described in this report adopts the following vision: safe and economical nuclear energy in the United States will expand to address future electric and non-electric needs, significantly reduce greenhouse gas emissions and provide energy diversity, while providing leadership for the safe, secure and responsible expansion of nuclear energy internationally

  11. Energy Technology Division research summary 2004

    International Nuclear Information System (INIS)

    Poeppel, R. B.; Shack, W. J.

    2004-01-01

    The Energy Technology (ET) Division provides materials and engineering technology support to a wide range of programs important to the US Department of Energy (DOE). The Division's capabilities are generally applied to technical issues associated with energy systems, biomedical engineering, transportation, and homeland security. Research related to the operational safety of commercial light water nuclear reactors (LWRs) for the US Nuclear Regulatory Commission (NRC) remains another significant area of interest for the Division. The pie chart below summarizes the ET sources of funding for FY 2004

  12. Council of Energy Engineering Research. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Goldstein, Richard J.

    2003-08-22

    The Engineering Research Program, a component program of the DOE Office of Basic Energy Sciences (BES), was established in 1979 to aid in resolving the numerous engineering issues arising from efforts to meet U.S. energy needs. The major product of the program became part of the body of knowledge and data upon which the applied energy technologies are founded; the product is knowledge relevant to energy exploration, production, conversion and use.

  13. Research for the energy turnaround. Phase transitions actively shape. Contributions

    International Nuclear Information System (INIS)

    Szczepanski, Petra; Wunschick, Franziska; Martin, Niklas

    2015-01-01

    The Annual Conference 2014 of the Renewable Energy Research Association was held in Berlin on 6 and 7 November 2014. This book documents the conference contributions on research for the energy turnaround and on actively shaping phase transitions. After an introduction and two contributions on the political framework, the contributions address the stages of development of the energy system: the economic phases of the energy transition, the phases of the electricity transition, the phases of the social energy transition, the stages of the heat transition (Waermewende), and the stages of the mobility transition. Finally, the Renewable Energy Research Association is briefly presented. [de]

  14. New energy technologies. Research program proposition

    International Nuclear Information System (INIS)

    2005-02-01

    This document presents the most promising research and development program propositions and the public financing needed for their realization. The technologies concerned are: hydrogen and the fuel cell (PAN-H program), the separation and storage of CO2, photovoltaic solar electricity, the PREBAT program on energy in buildings, and the bio-energies. (A.L.B.)

  15. Assessment Report on the national research strategy for energy

    International Nuclear Information System (INIS)

    2009-01-01

    This report was issued in 2009 by the French Parliament commission in charge of evaluating the scientific and technological choices of France's research in the field of energy. With environmental, economical and national independence concerns in view, the objective of the report is to assess the national research strategy for energy and to propose some directions for its future development. The scientific priority given in France to nuclear energy, petroleum, photovoltaic energy, second generation bio fuels and energy storage should be maintained. Mass energy storage should be considered as an essential condition for the development of renewable energies, such as offshore wind farms and storage systems

  16. Energy research projects in the Nordic countries - catalogue 1983

    International Nuclear Information System (INIS)

    1983-01-01

    The Nordic energy ministers at their meeting of February 9, 1982 agreed upon a working plan for Nordic energy cooperation. As part of this plan, a contact group was established to maintain coordination and cooperation within the area of energy research and development. In April 1982 this group decided to establish a catalogue of energy research projects in the Nordic countries. A pilot catalogue was published in June 1982. The 1983 catalogue gives an up-to-date survey of energy research and development projects in the Nordic countries. About 2125 projects are described, and information is given on investigator(s), performing organization, financing body, funds, and period. The catalogue is prepared by the Nordic energy libraries through their cooperation in the Nordic Atomic Libraries Joint Secretariat. The information is also included in the database Nordic Energy Index (NEI), which is online accessible at I/S Datacentralen, Copenhagen, via EURONET, SCANNET, TYMNET, and TELENET. (BP)

  17. The Austrian Research Centers activities in energy risks

    International Nuclear Information System (INIS)

    Sdouz, Gert

    1998-01-01

    Among the institutions involved in energy analyses in Austria, the risk context is treated by three different entities: the Energy Consumption Agency, internationally known as EVA; the Federal Environmental Protection Agency (Umweltbundesamt), assessing mainly the environmental risks involved; and the Austrian Research Centers, working on safety and risk evaluation. The Austrian Research Center Seibersdorf draws from its proficiency in reactor safety and fusion research, two fields of experience it has been involved in since its foundation some 40 years ago. Nuclear energy is not well accepted by the Austrian population. Therefore, only energy systems with an advanced safety level might be accepted in Austria in the far future, which makes the development of methods to compare risks an important task. The characteristics of energy systems featuring advanced safety levels are a very low hazard potential and a focus on deterministic rather than probabilistic safety, i.e. reliance on inherently safe physical concepts confirmed by probabilistic safety evaluation results. This can be achieved by adequate design of fusion reactors, advanced fission reactors and the various renewable sources of energy.

  18. Advanced Energy Projects: FY 1993, Research summaries

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    AEP has been supporting research on novel materials for energy technology, renewable and biodegradable materials, new uses for scientific discoveries, alternate pathways to energy efficiency, alternative energy sources, innovative approaches to waste treatment and reduction, etc. The summaries are grouped according to projects active in FY 1993, Phase I SBIR projects, and Phase II SBIR projects. Investigator and institutional indexes are included.

  19. Advanced Energy Projects: FY 1993, Research summaries

    International Nuclear Information System (INIS)

    1993-09-01

    AEP has been supporting research on novel materials for energy technology, renewable and biodegradable materials, new uses for scientific discoveries, alternate pathways to energy efficiency, alternative energy sources, innovative approaches to waste treatment and reduction, etc. The summaries are grouped according to projects active in FY 1993, Phase I SBIR projects, and Phase II SBIR projects. Investigator and institutional indexes are included

  20. Energy Harvesting Research: The Road from Single Source to Multisource.

    Science.gov (United States)

    Bai, Yang; Jantunen, Heli; Juuti, Jari

    2018-06-07

    Energy harvesting technology may be considered an ultimate solution to replace batteries and provide a long-term power supply for wireless sensor networks. Looking back at its research history, individual energy harvesters converting a single energy source into electricity were developed first, followed by hybrid counterparts designed for use with multiple energy sources. Very recently, the concept of a truly multisource energy harvester built from only a single piece of material as the energy conversion component was proposed. This review explains the field in detail from the perspective of materials and device configurations, giving a broad overview of energy harvesting research. It covers single-source devices including solar, thermal, kinetic and other types of energy harvesters; hybrid energy harvesting configurations for both single and multiple energy sources; and single-material multisource energy harvesters. It also covers the energy conversion principles of photovoltaic, electromagnetic, piezoelectric, triboelectric, electrostatic, electrostrictive, thermoelectric, pyroelectric, magnetostrictive, and dielectric devices. This is one of the most comprehensive reviews conducted to date, focusing on the entire energy harvesting research scene and providing a guide to deeper and more specific research references and resources from every corner of the scientific community. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
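Of the conversion principles the review lists, the thermoelectric one admits a simple back-of-the-envelope estimate: the open-circuit voltage of a generator is the Seebeck coefficient times the temperature difference, summed over the couples in series. A minimal sketch with illustrative values (the numbers below are assumptions for order-of-magnitude purposes, not figures from the review):

```python
# Illustrative thermoelectric harvester estimate: V_oc = N * S * dT
# for N thermocouples electrically in series across a temperature drop dT.
seebeck = 200e-6   # V/K, a typical order of magnitude for a Bi2Te3 couple (assumed)
delta_t = 50.0     # K, temperature difference across the module (assumed)
couples = 100      # number of thermocouples in series (assumed)

open_circuit_voltage = couples * seebeck * delta_t  # volts
```

With these assumed values a hundred-couple module yields about a volt open-circuit, which is why practical harvesters stack many couples and add power conditioning.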

  1. Nuclear energy research in Indonesia

    International Nuclear Information System (INIS)

    Supadi, S.; Soentono, S.; Djokolelono, M.

    1988-01-01

    Indonesia's National Atomic Energy Authority, BATAN (Badan Tenaga Atom Nasional), was founded to implement, regulate and monitor the development and launching of programs for the peaceful uses of nuclear power. These programs constitute part of the efforts made to change to a more industrialized level the largely agricultural society of Indonesia. BATAN elaborated extensive nuclear research and development programs in a variety of fields, such as medicine, the industrial uses of isotopes and radiation, the nuclear fuel cycle, nuclear technology and power generation, and in fundamental research. The Puspiptek Nuclear Research Center has been equipped with a multi-purpose research reactor and will also have a fuel element fabrication plant, a facility for treating radioactive waste, a radiometallurgical laboratory, and laboratories for working with radioisotopes and for radiopharmaceutical research. (orig.) [de

  2. Energy-related indoor environmental quality research: A priority agenda

    Energy Technology Data Exchange (ETDEWEB)

    Fisk, W.J.; Brager, G.; Burge, H.; Cummings, J.; Levin, H.; Loftness, V.; Mendell, M.J.; Persily, A.; Taylor, S.; Zhang, J.S.

    2002-08-01

    A multidisciplinary team of IEQ and energy researchers has defined a program of priority energy-related IEQ research. This paper describes the methods employed to develop the agenda, and 35 high-priority research and development (R&D) project areas related to four broad goals: (1) identifying IEQ problems and opportunities; (2) developing and evaluating energy-efficient technologies for improving IEQ; (3) developing and evaluating energy-efficient practices for improving IEQ; and (4) encouraging or assisting the implementation of technologies or practices for improving IEQ. The identified R&D priorities reflect a strong need to benchmark IEQ conditions in small commercial buildings, schools, and residences. They also reflect the need to better understand how people are affected by IEQ conditions and by the related building characteristics and operation and maintenance practices. The associated research findings will provide a clearer definition of acceptable IEQ, which is required to guide the development of technologies, practices, standards, and guidelines. Quantifying the effects of building characteristics and practices on IEQ conditions, in order to provide the basis for development of energy-efficient and effective IEQ control measures, was also considered a priority. The development or advancement of a broad range of IEQ tools, technologies, and practices is also a major component of the priority research agenda. Consistent with the focus on 'energy-related' research priorities, building ventilation and heating, ventilating and air conditioning (HVAC) systems and processes are very prominent in the agenda. Research related to moisture and microbiological problems, particularly within hot and humid climates, is also prominent. The agenda tends to emphasize research on residences, small commercial buildings, and schools because these types of buildings have been underrepresented in prior research. Most of

  3. Accelerator Center for Energy Research (ACER)

    Data.gov (United States)

    Federal Laboratory Consortium — The Accelerator Center for Energy Research (ACER) exploits radiation chemistry techniques to study chemical reactions (and other phenomena) by subjecting samples to...

  4. Research challenges for energy data management (panel)

    DEFF Research Database (Denmark)

    Pedersen, Torben Bach; Lehner, Wolfgang

    2013-01-01

    This panel paper aims at initiating discussion at the Second International Workshop on Energy Data Management (EnDM 2013) about the important research challenges within Energy Data Management. The authors are the panel organizers, extra panelists will be recruited before the workshop...

  5. A review study of the current research on energy hub for energy positive neighborhoods

    NARCIS (Netherlands)

    Walker, S.W.W.; Labeodan, T.; Maassen, W.H.; Zeiler, W.

    2017-01-01

    Energy positive neighborhoods and cities are emerging concepts aimed at addressing the current energy and environmental sustainability challenges. In this paper, the concept and current research on energy hubs relating to energy positive neighborhoods are presented. In addition to discussing

  6. The Swiss Federal Energy Research Concept for the Years 2000-2003

    International Nuclear Information System (INIS)

    1999-05-01

    The Swiss Federal Energy Research Concept provides details within the framework set by the Swiss Parliament and the Swiss Federal Council (Government). It maps out how publicly supported research shall be used to achieve politically decided energy goals. Information is provided on the manner in which energy education, research and technology developments will be supported during the period from 2000-2003. The Concept facilitates coordination among federal and cantonal decision makers as well as municipal authorities. Swiss energy research is dedicated to sustainable development, including the massive reduction of CO2 emissions. This is also implicit in the concept of the '2000 W society'. A two-pronged approach strives to reduce pollution by energy systems and increase system efficiencies. Technical progress is buttressed by socio-economic measures. Priorities for publicly funded energy research have been set in the context of long-term perspectives, harmonized with European and worldwide goals. Swiss energy research must be high-level research and this requires adequate means being made available to assure both quality and continuity. It is important that the attractiveness and competitiveness of Switzerland as a home for science and technology be maintained, indeed strengthened. It has been proved worldwide that energy research needs public funding. Particularly favored is application-oriented research, including pilot and demonstration projects. (author)

  8. Research Facilities for the Future of Nuclear Energy

    International Nuclear Information System (INIS)

    Ait Abderrahim, H.

    1996-01-01

    The proceedings of the ENS Class 1 Topical Meeting on Research facilities for the Future of Nuclear Energy include contributions on large research facilities, designed for tests in the field of nuclear energy production. In particular, issues related to facilities supporting research and development programmes in connection to the operation of nuclear power plants as well as the development of new concepts in material testing, nuclear data measurement, code validation, fuel cycle, reprocessing, and waste disposal are discussed. The proceedings contain 63 papers

  9. University of Maryland Energy Research Center

    Science.gov (United States)

    breakthroughs into commercial, clean energy solutions.

  10. California Energy Commission Public Interest Energy Research / Energy System Integration -- Transmission-Planning Research & Development Scoping Project

    Energy Technology Data Exchange (ETDEWEB)

    Eto, Joseph H.; Lesieutre, Bernard; Widergren, Steven

    2004-07-01

    The objective of this Public Interest Energy Research (PIER) scoping project is to identify options for public-interest research and development (R&D) to improve transmission-planning tools, techniques, and methods. The information presented was gathered through a review of current California utility, California Independent System Operator (ISO), and related western-states electricity transmission-planning activities and emerging needs. This report presents the project team's findings organized under six topic areas and identifies 17 distinct R&D activities to improve transmission planning in California and the West. The findings in this report are intended for use, along with other materials, by PIER staff to facilitate discussions with stakeholders that will ultimately lead to the development of a portfolio of transmission-planning R&D activities for the PIER program.

  11. Nuclear methods in environmental and energy research

    Energy Technology Data Exchange (ETDEWEB)

    Vogt, J. R. [ed.

    1977-01-01

    The topics considered in the seven sessions were nuclear methods in atmospheric research; nuclear and atomic methodology; nuclear methods in tracer applications; energy exploration, production, and utilization; nuclear methods in environmental monitoring; nuclear methods in water research; and nuclear methods in biological research. Individual abstracts were prepared for each paper. (JSR)

  12. Developing Research Capabilities in Energy Biosciences

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Donald D.

    2008-01-01

    Scientists founded the Life Sciences Research Foundation (LSRF) in 1983 as a non-profit pass-through foundation that awards postdoctoral fellowships in all areas of the life sciences. LSRF scientists review hundreds of applications each year from PhDs seeking support. This year, our 26th, we received 800 applications, and our peer review committee will choose about 50 finalists who are eligible for these awards. We have no endowment, so we solicit sponsors each year. The fellowships are sponsored by research-oriented companies, foundations, philanthropists, the Howard Hughes Medical Institute, and other organizations who believe in the value of awarding fellowships to the best and brightest young scientists. Our web site has a complete listing of all details about LSRF (http://www.lsrf.org/). In the late 1980s the Division of Bioscience in the Office of Basic Energy Science, a granting agency of the Department of Energy, joined this partnership. Bioscience's mandate was to support non-medical microbiology and the plant sciences. LSRF received a series of 5-year grants from DOE to award fellowships to our top applicants in these fields of research. We began to support DOE-Energy Bioscience postdoctoral fellows in 1989. From 1989 through 2004, when DOE funding ended, our partnership awarded 41 DOE-Energy Bioscience Fellows of the Life Sciences Research Foundation, each a three-year fellowship. DOE-Energy Biosciences was well matched with LSRF. Our extensive peer review screened applicants in all areas of the life sciences. Most LSRF sponsors are interested in supporting fellows who work on diseases. At the time we began our partnership with DOE we had no sponsors willing to support plant biology and non-medical microbiology. For 15 years DOE played a major role in the training of the very best young scientists in these important fields of research simply through its support of LSRF postdoctoral fellows. Young scientists interested in

  13. Tomorrow the energy. Words of researchers

    International Nuclear Information System (INIS)

    Metenier, Beatrice; Huret, Christophe; Bordenave, Aurelie; Tourrasse, Corinne; Nourry, Didier; Bellet, Daniel; Blanquet, Elisabeth; Bonjour, Jocelyn; Brochier, Elisabeth; Fave, Alain; Grunenwald, Perrine; Herri, Jean-Michel; Menanteau, Philippe; Normand, Bernard; Raison, Bertrand; Stutz, Benoit

    2015-01-01

    Based on interviews with researchers in various disciplines and areas, this book proposes a prospective vision of energy. It starts with the points of view of a philosopher, a climatologist, an economist and a scientist on the definition of energy transition. The second part addresses how to commit to energy efficiency by saving energy in buildings (towards inter-seasonal storage and active energy management), in transport (a change of behaviours, lighter materials), and in industry (optimised air conditioning, a more efficient industry). The next part discusses how to diversify resources: hydraulic resources (where the main challenge is a more flexible production and storage), nuclear energy (to improve safety and to develop technologies towards the use of extreme materials), solar energy (to capture this energy at a reduced cost by using highly efficient cells), fossil energies (to optimize exploitation and to decrease emissions by capturing CO2), and biomass (to assess the resource). The last chapter discusses the challenges related to energy storage and distribution: how to store energy and for which use (towards solid hydrogen storage), and how to adapt the grid to the emergence of renewable energies (towards a self-healing grid)

  14. Annual report of the Japan Atomic Energy Research Institute for fiscal 2000

    International Nuclear Information System (INIS)

    2001-01-01

    The Japan Atomic Energy Research Institute (JAERI), as Japan's central organization for nuclear energy R and D, promotes research in neutron science, light quantum/synchrotron radiation science, radiation applications, materials science, advanced basic research, and related fields, contributing to the general development of science and technology in line with the 'Long-term program on research, development and application of nuclear energy' established in June 1994. As R and D on advanced energy systems that could bring breakthroughs in nuclear energy technology, JAERI also promotes research on future energy systems, R and D on nuclear fusion, and testing research on high-temperature engineering. Furthermore, JAERI pursues research on safety and health physics, fields that span both general nuclear science and nuclear energy. In addition, through interdisciplinary cooperation within Japan as well as versatile international cooperation, various research-support activities and effective R and D are promoted. This report describes in detail, for fiscal year 2000, 6 items of neutron science research (SR), 13 items of light quantum/synchrotron radiation SR, 13 items of radiation application SR, 6 items of matter SR, 3 items of environment SR, 19 items of advanced basic SR, and so on. (G.K.)

  15. The BlueGene/L Supercomputer and Quantum ChromoDynamics

    International Nuclear Information System (INIS)

    Vranas, P; Soltz, R

    2006-01-01

    In summary, our update contains: (1) Perfect speedup, sustaining 19.3% of peak for the Wilson D-slash Dirac operator. (2) Measurements of the full Conjugate Gradient (CG) inverter that inverts the Dirac operator. The CG inverter contains two global sums over the entire machine. Nevertheless, our measurements retain perfect speedup scaling, demonstrating the robustness of our methods. (3) We ran on the largest BG/L system, the LLNL 64-rack BG/L supercomputer, and obtained a sustained speed of 59.1 TFlops. Furthermore, the speedup scaling of the Dirac operator and of the CG inverter is perfect all the way up to the full size of the machine, 131,072 cores (please see Figure II). The local lattice is rather small (4 x 4 x 4 x 16), while the total lattice (128 x 128 x 256 x 32 sites) has long been a goal of lattice QCD thermodynamic studies. This speed is about five times larger than the speed we quoted in our submission. As we have pointed out in our paper, QCD is notoriously sensitive to network and memory latencies, has a relatively high communication-to-computation ratio that cannot be overlapped in BG/L in virtual node mode, and as an application is in a class of its own. The above results are thrilling to us and fulfill a 30-year dream for lattice QCD
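The core counts and lattice sizes quoted in this abstract are internally consistent, and the check is simple arithmetic. The sketch below (a reader's sanity check, not code from the paper) divides the global lattice by the per-core local lattice in each dimension:

```python
# Quick arithmetic check of the abstract's figures: dividing the global
# lattice by the local per-core lattice should reproduce the quoted
# 131,072 cores of the 64-rack BG/L system.

global_lattice = (128, 128, 256, 32)   # total sites: x, y, z, t
local_lattice = (4, 4, 4, 16)          # sites held by each core

cores = 1
for g, l in zip(global_lattice, local_lattice):
    assert g % l == 0                  # the decomposition must tile evenly
    cores *= g // l

print(cores)  # 131072
```

The per-dimension factors are 32 x 32 x 64 x 2, which multiply out to exactly the 131,072 cores the text cites.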

  16. Modeling radiative transport in ICF plasmas on an IBM SP2 supercomputer

    International Nuclear Information System (INIS)

    Johansen, J.A.; MacFarlane, J.J.; Moses, G.A.

    1995-01-01

    At the University of Wisconsin-Madison the authors have integrated a collisional-radiative-equilibrium model into their CONRAD radiation-hydrodynamics code. This integrated package allows them to accurately simulate the transport processes involved in ICF plasmas, including the important effects of self-absorption of line radiation. However, as they increase the amount of atomic structure used in their transport models, the computational demands grow nonlinearly. To meet this increased computational demand, they have recently embarked on an effort to parallelize the CONRAD program. The parallel CONRAD development is being performed on an IBM SP2 supercomputer. The parallelism is based on a message-passing paradigm and is being implemented using PVM. At present they have determined that approximately 70% of the sequential program can be executed in parallel. Accordingly, they expect the parallel version to yield a speedup on the order of three times that of the sequential version. This translates into only 10 hours of execution time for the parallel version, whereas the sequential version required 30 hours
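The quoted "speedup on the order of three" follows directly from Amdahl's law given the 70% parallel fraction stated in the abstract. A minimal back-of-the-envelope check (the parallel fraction is from the text; the node count is a free parameter, not something the abstract states):

```python
# Amdahl's law: with parallel fraction p on n workers, the speedup is
# bounded by 1 / ((1 - p) + p / n); as n grows it approaches 1 / (1 - p).

def amdahl_speedup(p, n):
    """Upper bound on speedup when a fraction p of the work runs on n workers."""
    return 1.0 / ((1.0 - p) + p / n)

p = 0.70                                    # parallel fraction from the abstract
print(round(amdahl_speedup(p, 16), 2))      # e.g. 16 nodes -> 2.91x
print(round(amdahl_speedup(p, 10**9), 2))   # asymptotic limit -> 3.33x
print(round(30 * (1.0 - p), 1))             # serial floor of a 30 h run: 9.0 h
```

With p = 0.7 the asymptotic speedup is 1/0.3 = 3.33, so the abstract's "order of three times" and the drop from 30 hours to roughly 10 are mutually consistent.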

  17. Mantle Convection on Modern Supercomputers

    Science.gov (United States)

    Weismüller, J.; Gmeiner, B.; Huber, M.; John, L.; Mohr, M.; Rüde, U.; Wohlmuth, B.; Bunge, H. P.

    2015-12-01

    Mantle convection is the cause of plate tectonics, the formation of mountains and oceans, and the main driving mechanism behind earthquakes. The convection process is modeled by a system of partial differential equations describing the conservation of mass, momentum and energy. Characteristic to mantle flow is the vast disparity of length scales from global to microscopic, turning mantle convection simulations into a challenging application for high-performance computing. As system size and technical complexity of the simulations continue to increase, design and implementation of simulation models for next generation large-scale architectures is handled successfully only in an interdisciplinary context. A new priority program - named SPPEXA - by the German Research Foundation (DFG) addresses this issue, and brings together computer scientists, mathematicians and application scientists around grand challenges in HPC. Here we report from the TERRA-NEO project, which is part of the high visibility SPPEXA program, and a joint effort of four research groups. TERRA-NEO develops algorithms for future HPC infrastructures, focusing on high computational efficiency and resilience in next generation mantle convection models. We present software that can resolve the Earth's mantle with up to 10^12 grid points and scales efficiently to massively parallel hardware with more than 50,000 processors. We use our simulations to explore the dynamic regime of mantle convection and assess the impact of small scale processes on global mantle flow.

  18. Socio-economic research for innovative energy technologies

    Energy Technology Data Exchange (ETDEWEB)

    Ogawa, Yuichi [Tokyo Univ., High Temperature Plasma Center, Kashiwa, Chiba (Japan); Okano, Kunihiko [Central Research Inst. of Electric Power Industry, Tokyo (Japan)

    2006-10-15

    In the 21st century, global environment and energy issues have become very important; they are long-term (on the scale of a few tens of years) and world-wide issues. In addition, their future prospects may be quite uncertain, and scientific prediction can be very difficult. Vigorous research and various efforts have addressed these issues from many aspects, e.g., world-wide discussion such as COP3 in Kyoto, promotion of energy-saving technology, and so on. Development of environment-friendly energy has been promoted, and new innovative technologies are being explored. Nuclear fusion is, of course, a promising candidate. At the same time, there is criticism of nuclear fusion from the socio-economic side, e.g., that developing a fusion reactor would take a long time and a huge cost. Other innovative energy technologies face criticisms of their own as well. Therefore, socio-economic research is indispensable for future energy resources. We first selected six characteristics that may be important for future energy resources: energy resource, environmental load, economics, reliability/stability, flexibility of operation, and safety/security. Among innovative energy technologies, we nominated seven candidates: advanced coal technology with a CO2 recovery system, SOFC top combined cycle, solar power, wind power, space solar power station, advanced fission, and fusion. Based on questionnaires given to ordinary people and to fusion scientists, we assessed fusion energy development in comparison with other innovative energy technologies. (author)

  19. National Renewable Energy Laboratory 2002 Research Review (Booklet)

    Energy Technology Data Exchange (ETDEWEB)

    Cook, G.; Epstein, K.; Brown, H.

    2002-07-01

    America is making a long transition to a future in which conventional, fossil fuel technologies will be displaced by new renewable energy and energy efficiency technologies. This first biannual research review describes NREL's R&D in seven technology areas--biorefineries, transportation, hydrogen, solar electricity, distributed energy, energy-efficient buildings, and low-wind-speed turbines.

  20. Integrated modelling of ecosystem services and energy systems research

    Science.gov (United States)

    Agarwala, Matthew; Lovett, Andrew; Bateman, Ian; Day, Brett; Agnolucci, Paolo; Ziv, Guy

    2016-04-01

    The UK Government is formally committed to reducing carbon emissions and protecting and improving natural capital and the environment. However, actually delivering on these objectives requires an integrated approach to addressing two parallel challenges: de-carbonising future energy system pathways; and safeguarding natural capital to ensure the continued flow of ecosystem services. Although both emphasise benefiting from natural resources, efforts to connect natural capital and energy systems research have been limited, meaning opportunities to improve management of natural resources and meet society's energy needs could be missed. The ecosystem services paradigm provides a consistent conceptual framework that applies in multiple disciplines across the natural and economic sciences, and facilitates collaboration between them. At the forefront of the field, integrated ecosystem service - economy models have guided public- and private-sector decision making at all levels. Models vary in sophistication from simple spreadsheet tools to complex software packages integrating biophysical, GIS and economic models and draw upon many fields, including ecology, hydrology, geography, systems theory, economics and the social sciences. They also differ in their ability to value changes in natural capital and ecosystem services at various spatial and temporal scales. Despite these differences, current models share a common feature: their treatment of energy systems is superficial at best. In contrast, energy systems research has no widely adopted, unifying conceptual framework that organises thinking about key system components and interactions. Instead, the literature is organised around modelling approaches, including life cycle analyses, econometric investigations, linear programming and computable general equilibrium models. However, some consistencies do emerge. First, models often contain a linear set of steps, from exploration to resource supply, fuel processing, conversion

  1. Annual report of the Japan Atomic Energy Research Institute, for fiscal 1988

    International Nuclear Information System (INIS)

    1989-01-01

    Half a century has now elapsed since the discovery of nuclear fission, and atomic energy has already taken its place as a basic energy source; the development and utilization of atomic energy is therefore very important as an energy source that can supply energy economically and stably over the long term. Along the long-term plan for atomic energy development and utilization decided in 1987, the Japan Atomic Energy Research Institute (JAERI) advanced research and development, bearing its role as the core general research institute in atomic energy fields. It has worked to obtain the understanding and trust of the nation on atomic energy, and has promoted pioneering project research, such as safety research, high-temperature engineering test and research, research and development of nuclear fusion, research on radiation utilization, and research and development of nuclear-powered ships. In the safety research, in order to contribute to the further rooting of LWRs and the establishment of the nuclear fuel cycle, research on the engineering safety of nuclear facilities and environmental safety has been advanced. The activities in the respective research fields are summarized. International cooperation with the USA, FRG, China and others was also carried out smoothly. (K.I.)

  2. Energy research and technology in Bavaria; Energieforschung und -technologie in Bayern

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-04-15

    The intensification of research and development of new energy conversion technologies contributes significantly to the energy supply. In particular, research and development in the fields of power generation, energy supply, energy conservation and the efficient use of energy in buildings and production processes, innovation in grids and infrastructure, and improved innovative storage technologies are reported in detail. This brochure shows how research and development make an important contribution to the success of the energy policy turnaround in Bavaria.

  3. Report on the national strategy of research in the energy domain

    International Nuclear Information System (INIS)

    2007-05-01

    This report presents the energy situation in France and the place of research in energy policy. It discusses the political and legal context, the strategic orientations, energy efficiency, renewable energies, fossil energies, nuclear energy and socio-economic factors. The actors of energy research are detailed. (A.L.B.)

  4. Collaborative International Research on the Water-Energy Nexus: Lessons Learned from the Clean Energy Research Center - Water Energy Technologies (CERC-WET)

    Science.gov (United States)

    Remick, C.

    2017-12-01

    The U.S.-China Clean Energy Research Center - Water and Energy Technologies (CERC-WET) is a global research partnership focused on developing and deploying technologies that allow the U.S. and China to thrive in a future with constrained energy and water resources in a changing global climate. This presentation outlines and addresses the opportunities and challenges for international research collaboration on the so-called "water-energy nexus", with a focus on industrial partnership, market readiness, and intellectual property. The U.S. Department of Energy created the CERC program as a research and development partnership between the United States and China to accelerate the development and deployment of advanced clean energy technologies. The United States and China are not only the world's largest economies; they are also the world's largest energy producers and energy consumers. Together, they account for about 40% of annual global greenhouse gas emissions. The bilateral investment in CERC-WET will total $50 million over five years and will target emerging issues and cutting-edge research on (1) water use reduction at thermoelectric plants; (2) treatment and management of non-traditional waters; (3) improvements in sustainable hydropower design and operation; (4) climate impact modeling, methods, and scenarios to support improved understanding of energy and water systems; and (5) data and analysis to inform planning and policy.

  5. Preface: photosynthesis and hydrogen energy research for sustainability.

    Science.gov (United States)

    Tomo, Tatsuya; Allakhverdiev, Suleyman I

    2017-09-01

    Energy supply, climate change, and global food security are among the main challenges facing humanity in the twenty-first century. Although global energy demand continues to increase, the availability of low-cost energy is decreasing. Together with the urgent problem of climate change due to CO2 release from the combustion of fossil fuels, there is a strong need to develop clean and renewable energy systems for hydrogen production. Solar fuel, biofuel, and hydrogen energy production have gained great possibility and feasibility thanks to an understanding of the detailed structures of photosynthetic systems. This special issue contains selected papers on photosynthetic and biomimetic hydrogen production presented at the International Conference "Photosynthesis Research for Sustainability-2016", held in Pushchino (Russia) during June 19-25, 2016, with the sponsorship of the International Society of Photosynthesis Research (ISPR) and the International Association for Hydrogen Energy (IAHE). This issue is intended to provide our readers with recent information on photosynthetic and biohydrogen production.

  6. Solar energy in progress and future research trends

    Energy Technology Data Exchange (ETDEWEB)

    Sen, Zekai [Istanbul Technical Univ., Dept. of Meteorology, Istanbul (Turkey)

    2004-07-01

    Extensive fossil fuel consumption in almost all human activities has led to undesirable phenomena, such as atmospheric and environmental pollution, that had not been experienced before in known human history. Consequently, the terms global warming, greenhouse effect, climate change, ozone layer depletion and acid rain began to appear frequently in the literature. Since 1970 it has been understood scientifically, through experiments and research, that these phenomena are closely related to fossil fuel use, because fossil fuels emit greenhouse gases such as carbon dioxide (CO{sub 2}) and methane (CH{sub 4}) which hinder long-wave terrestrial radiation from escaping into space, and consequently the earth's troposphere becomes warmer. To avoid further impacts of these phenomena, the two main alternatives are either to improve fossil fuel quality, with reductions in harmful emissions into the atmosphere, or, more significantly, to replace fossil fuel usage as much as possible with environmentally friendly, clean and renewable energy sources. Among these sources, solar energy comes at the top of the list due to its abundance and its more even distribution in nature than any other renewable energy type such as wind, geothermal, hydro, wave and tidal energies. It must be the main and common purpose of humanity to sustain the environment for the betterment of future generations through sustainable energy development. On the other hand, the known limits of fossil fuels compel the societies of the world in the long run to work jointly for their gradual replacement by renewable energy alternatives rather than the quality improvement of fossil sources. Solar radiation is an integral part of different renewable energy resources. It is the main and continuous input variable from the practically inexhaustible sun. Solar energy is expected to play a very significant role in the future, especially in developing countries, but it has also potential prospects for developed

  7. Energy-Performance-Based Design-Build Process: Strategies for Procuring High-Performance Buildings on Typical Construction Budgets: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Scheib, J.; Pless, S.; Torcellini, P.

    2014-08-01

    NREL experienced a significant increase in employees and facilities on our 327-acre main campus in Golden, Colorado over the past five years. To support this growth, researchers developed and demonstrated a new building acquisition method that successfully integrates energy efficiency requirements into the design-build requests for proposals and contracts. We piloted this energy performance based design-build process with our first new construction project in 2008. We have since replicated and evolved the process for large office buildings, a smart grid research laboratory, a supercomputer, a parking structure, and a cafeteria. Each project incorporated aggressive efficiency strategies using contractual energy use requirements in the design-build contracts, all on typical construction budgets. We have found that when energy efficiency is a core project requirement as defined at the beginning of a project, innovative design-build teams can integrate the most cost effective and high performance efficiency strategies on typical construction budgets. When the design-build contract includes measurable energy requirements and is set up to incentivize design-build teams to focus on achieving high performance in actual operations, owners can now expect their facilities to perform. As NREL completed the new construction in 2013, we have documented our best practices in training materials and a how-to guide so that other owners and owner's representatives can replicate our successes and learn from our experiences in attaining market viable, world-class energy performance in the built environment.

  8. Swiss Federal Energy Research Commission - Annual report 2008

    International Nuclear Information System (INIS)

    Maus, K.

    2009-01-01

    This annual report presents a review of the activities carried out by the Swiss Federal Energy Research Commission CORE in the year 2008. Main points of interest were the definition of a new CORE vision, a review of all research programmes, co-operation and co-ordination with public and private institutes, active consultancy, recommendations for further education and training, improved international information exchange and good communication with business, politics and the general public. The definition of a concept for Swiss energy research for the period 2012 to 2016 is mentioned. The annual report also reports on an internal visit made to various laboratories of the Swiss Federal Institute of Technology in Lausanne and the Energy Center in Zurich. The focussing of CORE activities on particular themes is discussed

  9. Research and Development Financing in the Renewable Energy Industry in Brazil

    Directory of Open Access Journals (Sweden)

    Muriel de Oliveira Gavira

    2014-09-01

    In the last decades, the Brazilian government has put many public policies in place in order to create a favourable environment to promote energy efficiency and clean energy. In this paper we discuss the use of research and development financing support by the clean energy industry in Brazil. To do so, we carried out empirical research analysing secondary data from legislation, literature case studies, and public and industry reports in order to determine whether companies in the clean energy industry have public financial support for research and development. Our ongoing research shows that, despite incentives to stimulate the dissemination of clean energy, the participation of some clean energy sources is very small (especially solar). We believe that the contributions of this study will assist policy makers, and the whole industry, to improve clean energy research and development investments in Brazil.

  10. Energy Smart Schools--Applied Research, Field Testing, and Technology Integration

    Energy Technology Data Exchange (ETDEWEB)

    Nebiat Solomon; Robin Vieira; William L. Manz; Abby Vogen; Claudia Orlando; Kimberlie A. Schryer

    2004-12-01

    The National Association of State Energy Officials (NASEO) in conjunction with the California Energy Commission, the Energy Center of Wisconsin, the Florida Solar Energy Center, the New York State Energy Research and Development Authority, and the Ohio Department of Development's Office of Energy Efficiency conducted a four-year, cost-share project with the U.S. Department of Energy (USDOE), Office of Energy Efficiency and Renewable Energy to focus on energy efficiency and high-performance technologies in our nation's schools. NASEO was the program lead for the MOU-State Schools Working Group, established in conjunction with the USDOE Memorandum of Understanding process for collaboration among state and federal energy research and demonstration offices and organizations. The MOU-State Schools Working Group included State Energy Offices and other state energy research organizations from all regions of the country. Through surveys and analyses, the Working Group determined the school-related energy priorities of the states and established a set of tasks to be accomplished, including the installation and evaluation of microturbines, advanced daylighting research, testing of schools and classrooms, and integrated school building technologies. The Energy Smart Schools project resulted in the adoption of advanced energy efficiency technologies in both the renovation of existing schools and the building of new ones; the education of school administrators, architects, engineers, and manufacturers nationwide about the energy-saving, economic, and environmental benefits of energy efficiency technologies; and an improved learning environment for the nation's students through the use of better temperature controls, improvements in air quality, and increased daylighting in classrooms. It also provided an opportunity for states to share and replicate successful projects to increase their energy efficiency while driving down their energy costs.

  11. 10 CFR 605.5 - The Office of Energy Research Financial Assistance Program.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false The Office of Energy Research Financial Assistance Program. 605.5 Section 605.5 Energy DEPARTMENT OF ENERGY (CONTINUED) ASSISTANCE REGULATIONS THE OFFICE OF ENERGY RESEARCH FINANCIAL ASSISTANCE PROGRAM § 605.5 The Office of Energy Research Financial Assistance Program. (a) DOE may issue, under the Office o...

  12. Fiscal 1998 research report on the basic research on energy saving for Huta Katowice, Poland; 1998 nendo Poland Katowice seitetsusho sho energy kihon chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    To reduce greenhouse gas emissions through a Japan-Poland joint project, research was conducted on an improvement plan for Huta Katowice, a typical ironworks in Poland. The applicability of energy-saving facilities was studied for the coke factory, sintering factory, blast furnace and converter, which consume the most energy at Huta Katowice, from the viewpoints of the amount of energy saved, CO{sub 2} reduction, and investment effect. Research was also made on the power plant, the flow of by-product gas, generated energy and purchased energy, in order to obtain the total energy consumption. From the energy-saving research results for every process, priorities for the improvement plan were prepared. The proposed plan showed a huge reduction of energy use and greenhouse gas emissions. Although the plan is difficult to justify at current low energy costs in Poland, it becomes reasonable from the viewpoint of investment effect if a lower-interest fund loan is granted. (NEDO)

  13. Solving sparse linear least squares problems on some supercomputers by using large dense blocks

    DEFF Research Database (Denmark)

    Hansen, Per Christian; Ostromsky, T; Sameh, A

    1997-01-01

    Efficient subroutines for dense matrix computations have recently been developed and are available on many high-speed computers. On some computers the speed of many dense matrix operations is near to the peak performance. For sparse matrices, storage and operations can be saved by operating on and storing only the nonzero elements. However, the price is a great degradation of the speed of computations on supercomputers (due to the use of indirect addresses, to the need to insert new nonzeros in the sparse storage scheme, to the lack of data locality, etc.). On many high-speed computers a dense matrix technique is therefore preferable to a sparse matrix technique when the matrices are not large, because the high computational speed fully compensates for the disadvantages of using more arithmetic operations and more storage. For very large matrices the computations must be organized as a sequence of tasks in each ...
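The trade-off the abstract describes, storing only nonzeros at the cost of indirect addressing, is what a compressed sparse row (CSR) layout looks like in practice. A minimal illustrative sketch (pure Python, not the paper's code; the matrix is invented):

```python
# CSR storage keeps only the nonzeros plus two index arrays -- the very
# indirect addressing the abstract blames for slowdowns on supercomputers.

def to_csr(dense):
    """Convert a dense row-major matrix (list of lists) to CSR arrays."""
    vals, cols, rowptr = [], [], [0]
    for row in dense:
        for j, a in enumerate(row):
            if a != 0:
                vals.append(a)
                cols.append(j)
        rowptr.append(len(vals))
    return vals, cols, rowptr

def csr_matvec(vals, cols, rowptr, x):
    """y = A @ x touching only the stored nonzeros (indirect addressing)."""
    y = []
    for i in range(len(rowptr) - 1):
        y.append(sum(vals[k] * x[cols[k]] for k in range(rowptr[i], rowptr[i + 1])))
    return y

A = [[4, 0, 0],
     [0, 0, 2],
     [1, 0, 3]]
vals, cols, rowptr = to_csr(A)
print(csr_matvec(vals, cols, rowptr, [1, 1, 1]))  # [4, 2, 4]
```

The 3 x 3 matrix needs only 4 stored values instead of 9, but every access to `x` goes through the `cols` index array, which is exactly the data-locality penalty that makes dense kernels competitive for matrices that are not large.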

  14. Incentives for research. Three projects awarded the 'BP Energy Research Prize'

    Energy Technology Data Exchange (ETDEWEB)

    1980-07-01

    Three projects are described that have been awarded the BP-energy-research prize. These are: absorption heat pumps with a high heat ratio, fuels from sewage sludge, chemical heat storage of solar energy.

  15. Australian Atomic Energy Commission: A new energy research establishment at Lucas Heights

    Energy Technology Data Exchange (ETDEWEB)

    Moyal, A [Sydney Univ. (Australia). Dept. of Government and Public Administration

    1980-02-01

    A review of the role of the Atomic Energy Commission has recommended that the Lucas Heights establishment should engage in research on energy sources in general, rather than nuclear only as at present, and that certain of its present functions (regulatory and manufacturing) should be handled by other organisations.

  16. HEP Computing Tools, Grid and Supercomputers for Genome Sequencing Studies

    Science.gov (United States)

    De, K.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Novikov, A.; Poyda, A.; Tertychnyy, I.; Wenaus, T.

    2017-10-01

    PanDA - the Production and Distributed Analysis workload management system - has been developed to address the data processing and analysis challenges of the ATLAS experiment at the LHC. Recently PanDA has been extended to run HEP scientific applications on Leadership Class Facilities and supercomputers. The success of projects using PanDA beyond HEP and the Grid has drawn attention from other compute-intensive sciences such as bioinformatics. Recent advances in Next Generation Genome Sequencing (NGS) technology have led to increasing streams of sequencing data that need to be processed, analysed and made available to bioinformaticians worldwide. Analysis of genome sequencing data using the popular software pipeline PALEOMIX can take a month even on a powerful computing resource. In this paper we describe the adaptation of the PALEOMIX pipeline to run in a distributed computing environment powered by PanDA. To run the pipeline we split the input files into chunks, which are processed separately on different nodes as separate PALEOMIX inputs, and finally merge the output files; this is very similar to how ATLAS processes and simulates data. We dramatically decreased the total wall time thanks to automated job (re)submission and brokering within PanDA. Using software tools developed initially for HEP and the Grid can reduce payload execution time for mammoth DNA samples from weeks to days.
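The split-process-merge pattern the authors describe is a classic scatter-gather. A minimal illustrative sketch (the chunk size, the record data and the per-chunk "process" step are all invented here; the real system dispatches PALEOMIX jobs through PanDA rather than calling a local function):

```python
# Scatter-gather sketch of the paper's approach: cut the input into
# independent chunks, run each chunk as its own job, merge the outputs.

def split_into_chunks(records, chunk_size):
    """Scatter: cut the input into independent chunks for separate jobs."""
    return [records[i:i + chunk_size] for i in range(0, len(records), chunk_size)]

def process_chunk(chunk):
    """Stand-in for one PALEOMIX job running on one worker node."""
    return [rec.upper() for rec in chunk]

def merge_outputs(results):
    """Gather: concatenate per-chunk outputs into the final result."""
    return [rec for chunk in results for rec in chunk]

reads = ["acgt", "ttga", "ccat", "gatc", "aacc"]
chunks = split_into_chunks(reads, 2)             # 3 independent jobs instead of 1
merged = merge_outputs(process_chunk(c) for c in chunks)
print(merged)  # ['ACGT', 'TTGA', 'CCAT', 'GATC', 'AACC']
```

Because each chunk is independent, a failed job can be resubmitted on its own, which is the property PanDA's automated (re)submission and brokering exploits to cut the total wall time.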

  17. Green energy and hydrogen research at University of Waterloo

    International Nuclear Information System (INIS)

    Fowler, M.

    2006-01-01

    This paper summarises green energy and hydrogen research at the University of Waterloo in Canada. Green energy includes solar, wind, biofuels, the hydrogen economy, and conventional energy sources with carbon dioxide sequestration

  18. National Renewable Energy Laboratory 2005 Research Review

    Energy Technology Data Exchange (ETDEWEB)

    Brown, H.; Gwinner, D.; Miller, M.; Pitchford, P.

    2006-06-01

    Science and technology are at the heart of everything we do at the National Renewable Energy Laboratory, as we pursue innovative, robust, and sustainable ways to produce energy--and as we seek to understand and illuminate the physics, chemistry, biology, and engineering behind alternative energy technologies. This year's Research Review highlights the Lab's work in the areas of alternative fuels and vehicles, high-performing commercial buildings, and high-efficiency inverted, semi-mismatched solar cells.

  19. ANCRE - Progress report 2011, coordination of research on energy

    International Nuclear Information System (INIS)

    Bigot, Bernard; Fuchs, Alain; Ouabdesselam, Farid; Appert, Olivier; Freyssinet, Philippe; Moisan, Francois

    2011-11-01

    This document offers an assessment of the work performed by ANCRE (the French national alliance for the coordination of energy research) during its first two years of existence. Its main objective is to prepare the energy transition by boosting French research in the field of energy. The report presents ANCRE's organization as an efficient one, based on strong relationships with the industrial sector. It describes the various thematic working groups (five 'energy sources' groups and three 'usages' groups) and their different objectives, and comments on the contribution to European research. These works and activities are discussed by several senior representatives of the alliance. A second part proposes an overview of the current status of the different energy-source groups (biomass, fossil and geothermal, nuclear, solar, sea, hydraulic and wind) and usage groups (transport, buildings, industry and agriculture). It also presents the different actions related to coordination, programmatic synergies and prospective studies

  20. Netherlands Energy Research Foundation Annual Report 1987

    International Nuclear Information System (INIS)

    1988-06-01

    This Annual Report includes a brief survey of the nuclear research activities of the Netherlands Energy Research Center (ECN) in Petten during 1987. They cover the following subjects: reactor safety; processing, storage and disposal of radioactive waste; advanced nuclear reactors; radiation protection; nuclear analysis; and contributions to European thermonuclear-fusion research. (H.W.). 20 figs.; 18 photos; 1 tab

  1. Experimental program to stimulate competitive energy research in North Dakota: Summary and significance of DOE Trainee research

    Energy Technology Data Exchange (ETDEWEB)

    Boudjouk, Philip

    1999-07-01

    The general goals of the North Dakota DOE/EPSCoR Program are to enhance the capabilities of North Dakota's researchers to conduct nationally competitive energy-related research and to develop science and engineering human resources to meet current and future needs in energy-related areas. Doctoral students were trained and energy research was conducted.

  2. Residential Energy Efficiency Research Planning Meeting Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2012-02-01

    This report summarizes key findings and outcomes from the U.S. Department of Energy's Building America Residential Energy Efficiency Research Planning meeting, held on October 28-29, 2011, in Washington, D.C.

  3. Program for Energy Research and Technologies 1977--1980. Annual report 1977 on efficient uses of energy fossil sources of primary energy new sources of energy

    Energy Technology Data Exchange (ETDEWEB)

    1978-01-01

    The main objectives within the policy of the Federal Government Program for Energy Research and Technologies 1977--1980 can be summarized as follows: guaranteeing the continuity of energy supply in the Federal Republic in the medium to long term at economically favourable costs, while meeting the requirements for the protection of the environment and the population. Financial support is provided under the general headings of Development of Energy Resources, Energy Conservation, and Efficient Use of Energy. An additional aspect of the support policy is the development of technologies that are of importance for other countries, specifically for developing countries. Support for a project takes the form of a research and development grant from the Federal Government, which can range from less than 50% to 100%. In return, the Government receives an irrevocable, free-of-charge and non-exclusive right to make use of the research and development results; in special cases full repayment is agreed, subject to commercial success. Based on agreements signed by the Federal Minister of Research and Technology and the Federal Minister of Economic Affairs on the one hand and the Juelich Nuclear Research Establishment (KFA) on the other, the Project Management for Energy Research (PLE) at KFA Juelich acts on behalf of these Ministries. The Project Management's activities in non-nuclear energy research in general (for the Federal Ministry of Research and Technology) and in development and innovation in coal mining and preparation (for the Federal Ministry of Economic Affairs) have the following general objectives: to improve the efficiency of Government support; to ensure that projects are handled efficiently; and to reduce the workload of the Ministries. The individual projects are listed and described briefly.

  4. Fusion Energy Postdoctoral Research Program, Professional Development Program: FY 1987 annual report

    International Nuclear Information System (INIS)

    1988-01-01

    In FY 1986, Oak Ridge Associated Universities (ORAU) initiated two programs for the US Department of Energy (DOE), Office of Fusion Energy (OFE): the Fusion Energy Postdoctoral Research Program and the Fusion Energy Professional Development Program. These programs provide opportunities to conduct collaborative research in magnetic fusion energy research and development programs at DOE laboratories and contractor sites. Participants become trained in advanced fusion energy research, interact with outstanding professionals, and become familiar with energy-related national issues while making personal contributions to the search for solutions to scientific problems. Both programs enhance the national fusion energy research and development effort by providing channels for the exchange of scientists and engineers, the diffusion of ideas and knowledge, and the transfer of relevant technologies. These programs, along with the Magnetic Fusion Energy Science and Technology Fellowship Programs, compose the fusion energy manpower development programs administered by ORAU for DOE/OFE

  5. Fiscal year 2013 energy department budget: Proposed investments in clean energy research

    Science.gov (United States)

    Balcerak, Ernie

    2012-03-01

    Energy and environmental research programs generally fared well in President Barack Obama's proposed budget for the Department of Energy (DOE) for fiscal year (FY) 2013. In his State of the Union address, Obama called for the United States to pursue an "all of the above" energy strategy that includes fossil fuels as well as a variety of renewable sources of energy. The DOE budget request supports that strategy, Energy Secretary Steven Chu said in a 13 February press briefing announcing the budget proposal. The proposed budget gives DOE $27.2 billion overall, a 3.2% increase from the FY 2012 enacted budget (see Table 1). This budget "reflects some tough choices," Chu said. The proposed budget would cut $4 billion in subsidies for oil and gas companies; many Republican members of Congress have already indicated that they oppose such cuts, suggesting that congressional approval of this budget may run into stumbling blocks. The budget would also cut funding for research and development projects that are already attracting private-sector investment or that are not working, and would reduce some of the department's operational costs.
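    As a quick check of the figures quoted above, a 3.2% increase to $27.2 billion implies an FY 2012 enacted level of roughly $26.4 billion:

```python
fy2013_request = 27.2   # billions of dollars, FY 2013 request
increase = 0.032        # 3.2% over the FY 2012 enacted budget

# Invert the percentage increase to recover the FY 2012 baseline.
fy2012_enacted = fy2013_request / (1 + increase)
print(round(fy2012_enacted, 2))  # → 26.36
```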

  6. Long tracks Energy: Views on ten years of research on green energy; Lange spor Energi: Blikk paa ti aar med forskning paa miljoevennlig energi

    Energy Technology Data Exchange (ETDEWEB)

    Coldevin, Grete Haakonsen

    2011-07-01

    RENERGI is the Research Council of Norway's strategic research programme aimed at the energy sector. The programme has provided, and continues to provide, key knowledge about the technologies, solutions, policies and instruments that can help solve the energy and climate challenges and support Norwegian business opportunities for value creation. The programme period extends from 2004 to 2013. This booklet presents an analysis by the Research Council showing the development of the research the programme has funded from its beginning to the present. The analysis of RENERGI, which we call 'Long Track', follows selected projects and project portfolios over several years to track the effect Research Council funding has had. And what are the key findings? Experience from RENERGI and from the precursors to the programme shows that increased funding for research in this field triggers innovative research and innovation. The programme's instruments are adapted to the characteristics of the industry: energy requires strong research communities, and the programme has helped build such communities over time. These strong communities provided the breeding ground for the establishment of the Centres for Environment-friendly Energy Research (FME), the Research Council's newest instrument in this area. Analyses of research over time are important because they show the effect of investing public funds in research. When research is initiated, there are great expectations of short-term results, but we know it can take time before results come; what yields returns today is often based on research from many years ago. The Research Council uses these analyses as part of its knowledge base for future priorities and in its dialogue with the ministries that fund research. Research groups can use the results to summarize their own activities over time. Businesses can benefit greatly from long-term analyses showing that energy research has opened the way for business start-ups and given established companies new products from which they can create value. 'Long Track' emphasizes the importance of thinking long-term: when a research effort starts up, it

  8. Energy Frontier Research Centers: Impact Report, January 2017

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2017-01-31

    Since its inception in 2009, the U.S. Department of Energy's Energy Frontier Research Center (EFRC) program has become an important research modality in the Department's portfolio, enabling high-impact research that addresses key scientific challenges for energy technologies. Funded by the Office of Science's Basic Energy Sciences program, the EFRCs are located across the United States and are led by universities, national laboratories, and private research institutions. These multi-investigator, multidisciplinary centers bring together world-class teams of researchers, often from multiple institutions, to tackle the toughest scientific challenges preventing advances in energy technologies. The EFRCs' fundamental scientific advances are having a significant impact that is being translated to industry. In 2009, five-year awards were made to 46 EFRCs, including 16 that were fully funded by the American Recovery and Reinvestment Act (ARRA). An open recompetition of the program in 2014 resulted in four-year awards to 32 centers, 22 of which are renewals of existing EFRCs and 10 of which are new EFRCs. In 2016, DOE added four new centers to accelerate the scientific breakthroughs needed to support the Department's environmental management and nuclear cleanup mission, bringing the total number of active EFRCs to 36. The impact reports in this document describe some of the many scientific accomplishments and broader impacts of the class of 2009-2018 EFRCs and early outcomes from a few of the class of 2014-2018 EFRCs.

  9. A preliminary assessment of the potential for 'team science' in DOE Energy Innovation Hubs and Energy Frontier Research Centers

    International Nuclear Information System (INIS)

    Boardman, Craig; Ponomariov, Branco

    2011-01-01

    President Obama has called for the development of new energy technologies to address our national energy needs and restore US economic competitiveness. In response, the Department of Energy has established new modalities for energy research and development designed to facilitate collaboration across disciplinary, institutional, and sectoral boundaries. In this research note, we provide a preliminary assessment of whether two new modalities at the DOE - Energy Innovation Hubs and Energy Frontier Research Centers - possess the essential mechanisms for coordinated problem solving among diverse actors. - Highlights: → Energy Frontier Research Centers may lack the basic mechanisms for coordinating diverse actors. → Divergent goals across diverse actors may hinder coordination in Energy Innovation Hubs. → The implementation of these and similar energy policies requires further investigation.

  10. Which research for tomorrow's energy? 2012 Energy Colloquium 2. release

    International Nuclear Information System (INIS)

    Antonini, Gerard; Arrif, Teddy; Bain, Pascal; Beguin, Francois; Bruneaux, Gilles; Cetin, Derya; Czernichowski, Isabelle; Escudie, Dany; Folacci, Marie-Ange; Gosse, Kevin; Hareux, Sylvie; Metaye, Romain; Morel, Herve; Odru, Pierre; Oukacine, Linda; Pons, Liz; Tournier, Aline; Corgier, David; Thollin, Jacques; Barret, Mickael; Mosdale, Renaut; Hervouet, Veronique; Pourcelly, Gerald; Brousse, Thierry; Lincot, Daniel; Schmidt-Laine, Claudine; Artero, Vincent; Robinson, Darren; Bigot, Bernard; Salha, Bernard; Minster, Jean-Francois; Hauet, Bertrand

    2012-01-01

    This huge publication gathers the interventions and contributions of a colloquium which notably addressed the following issues: bio-energies, hydrogen and fuel cells, energy storage, photovoltaic solar energy, energy efficiency in buildings, transport and industry, and CO2 capture and storage. On the first day, after two interventions on Energy Programmes at the ANR and an overview of worldwide R and D challenges regarding energy, the contributions addressed the above-mentioned issues. During the next day, besides these issues, contributions addressed challenges for tomorrow's society and perspectives for research. Thematic sessions addressed bio-energies (optimized production of cellulosic ethanol, the third generation, technological deadlocks for the thermo-chemical route), photovoltaic solar energy (new concepts, massive crystalline silicon and photovoltaic thin layers), high energy efficiency buildings, energetic equipment and climate engineering, CO2 storage, CO2 capture, fuel cells, hydrogen production, transport and storage, electrochemical and non-electrochemical storage of energy, and transport (internal combustion engines and power units, electric transport)

  11. Energy research program: energy in buildings for the years 2008-2011; Energieforschungsprogramm. Energie in Gebaeuden fuer die Jahre 2008-2011

    Energy Technology Data Exchange (ETDEWEB)

    Filleux, Ch.

    2009-08-15

    In Switzerland, existing buildings account for approximately 50% of primary energy consumption. Climate change, as well as demands on supply, require that Swiss construction practices be adapted without delay. For new buildings, innovative technologies are now widely available; however, their integration into new construction is still too slow because current construction practice lacks a holistic approach. Practical solutions for the renovation of existing buildings are also lacking today. The great challenge for research and development is therefore the 1.5 million existing buildings, which will dictate energy consumption for decades to come. The Federal Energy Research Commission (CORE) has recognized this situation and considered these issues in its 2008 - 2011 concept for federal energy research. The present research programme Energy in Buildings of the Swiss Federal Office of Energy focuses on the long-term objectives of CORE. This results in the following actions in the building sector: (a) reducing energy consumption and improving energy efficiency; (b) integration of renewable energy sources; (c) reduction of CO{sub 2} emissions through the use of improved technologies. The research programme is therefore focused on concepts and technologies with long-term objectives, without neglecting short- and medium-term goals. The objectives for the period 2008 - 2011 are: (i) concepts for buildings and housing developments concerning the development of construction methods that are compatible with the goal of a 2,000-watt society (preservation of architectural diversity, use of passive solar energy and daylight); (ii) concepts, technologies and planning tools for the improvement of energy systems in buildings; (iii) heating, cooling and ventilation systems in buildings that are compatible with the goal of a 2,000-watt society (efficient cooling systems, heat pumps, etc.); (iv) increase in the efficient use of electricity in

  12. Energy Efficient Community Development in California: Chula Vista Research Project

    Energy Technology Data Exchange (ETDEWEB)

    Gas Technology Institute

    2009-03-31

    In 2007, the U.S. Department of Energy joined the California Energy Commission in funding a project to begin examining the technical, economic, and institutional (policy and regulatory) aspects of energy-efficient community development. The research project was known as the Chula Vista Research Project, after the host California community that co-sponsored the initiative. The research demonstrated that the strategic integration of selected, economically viable building energy-efficiency (EE) measures, photovoltaics (PV), distributed generation (DG), and district cooling can produce significant reductions in aggregate energy consumption, peak demand, and emissions compared with the developer/builder's proposed baseline approach. However, while the EE-DG option would reduce central power plant emissions, it would increase local air emissions. The electric and natural gas utility infrastructure impacts associated with the EE and EE-PV options were deemed relatively insignificant, while the EE-DG option would significantly reduce the electric distribution facilities needed to serve a large-scale development project. The results of the Chula Vista project are detailed in three separate documents: (1) the Energy-Efficient Community Development in California: Chula Vista Research Project report contains a detailed description of the research effort and findings, including the methodologies and tools used and the analysis of the efficiency, economic, and emissions impacts of alternative energy technology and community design options for two development sites. Research topics covered included: (a) energy supply, demand, and control technologies and related strategies for structures; (b) application of locally available renewable energy resources, including solar thermal and PV technology and on-site power generation with heat recovery; (c) integration of local energy resources into district energy systems and existing

  13. Energy Savings Potential and Research & Development Opportunities for Commercial Refrigeration

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2009-09-01

    This study documented the energy consumption of commercial refrigeration equipment (CRE) in the U.S. and evaluated the energy savings potential of various technologies and energy efficiency measures that could be applied to such equipment. The study provided an overview of CRE applications, assessed the energy-savings potential of CRE in the U.S., outlined key barriers to the adoption of energy-saving technologies, and recommended opportunities for advanced energy-saving technology research. The study was modeled after an earlier 1996 report by Arthur D. Little, Inc., updating key information, examining more equipment types, and outlining long-term research and development opportunities.

  14. Nuclear methods in environmental and energy research

    Energy Technology Data Exchange (ETDEWEB)

    Vogt, J.R. [ed.]

    1980-01-01

    A total of 75 papers were presented on nuclear methods for analysis of environmental and biological samples. Sessions were devoted to software and mathematical methods; nuclear methods in atmospheric and water research; nuclear and atomic methodology; nuclear methods in biology and medicine; and nuclear methods in energy research.

  15. Nuclear methods in environmental and energy research

    International Nuclear Information System (INIS)

    Vogt, J.R.

    1980-01-01

    A total of 75 papers were presented on nuclear methods for analysis of environmental and biological samples. Sessions were devoted to software and mathematical methods; nuclear methods in atmospheric and water research; nuclear and atomic methodology; nuclear methods in biology and medicine; and nuclear methods in energy research

  16. Fiscal 1974 research report. General research on hydrogen energy subsystems; 1974 nendo suiso riyo subsystem sogoteki kento hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1975-03-01

    Based on the contract research 'General research on hydrogen energy subsystems and their peripheral technologies' with the Agency of Industrial Science and Technology, each of 7 organizations including Denki Kagaku Kyokai (Electrochemical Association) promoted research on, respectively, hydrogen energy subsystems, combustion, fuel cells, car engines, aircraft engines, gas turbines and chemical energy. This report summarizes the research results of the former of the 2 committees on hydrogen energy and peripheral technologies run by Denki Kagaku Kyokai. The first part describes, for every form of hydrogen use, the merits, demerits, domestic and overseas R and D status, technical problems, and future research issues. This part also outlines the short-, medium- and long-term prospects for the use of hydrogen and oxygen energy, and describes the overall future research agenda. The second part summarizes the content of each committee report. Although the original reports of each committee should be read for details, this report is useful for obtaining an outline of the utilization of hydrogen energy. (NEDO)

  17. Final Report. Research in Theoretical High Energy Physics

    Energy Technology Data Exchange (ETDEWEB)

    Greensite, Jeffrey P. [San Francisco State Univ., CA (United States); Golterman, Maarten F.L. [San Francisco State Univ., CA (United States)

    2015-04-30

    Grant-supported research in theoretical high-energy physics conducted in the period 1992-2015 is briefly described, and a full listing of the published articles resulting from those research activities is supplied.

  18. The potential for quantitative sociological research on residential energy consumption in Denmark

    DEFF Research Database (Denmark)

    Hansen, Anders Rhiger

    2013-01-01

    In this paper, I begin with a description of how a sociological perspective can be employed to understand energy consumption, taking into account that energy consumption is embedded in everyday social practices. Next, I describe how newly available data enhances the potential of quantitative sociological analysis of energy consumption, enabling researchers in Denmark to use information on energy consumption derived from the energy-supply companies. Furthermore, I present a preliminary research design that employs both a quantitative sociological perspective and the newly available data on actual energy consumption. The research design contains a descriptive analysis of how energy demand differs between different types of households. In my conclusion, I claim that quantitative sociological research on energy consumption has great potential for obtaining more knowledge on energy consumption.

  19. Department of Energy - Office of Science Early Career Research Program

    Science.gov (United States)

    Horwitz, James

    The Department of Energy (DOE) Office of Science Early Career Program began in FY 2010. The program's objectives are to support the development of individual research programs of outstanding scientists early in their careers and to stimulate research careers in the disciplines supported by the DOE Office of Science. Both university and DOE national laboratory early-career scientists are eligible. Applicants must be within 10 years of receiving their PhD. For universities, the PI must be an untenured Assistant Professor or Associate Professor on the tenure track; DOE laboratory applicants must be full-time, non-postdoctoral employees. University awards are at least $150,000 per year for 5 years for summer salary and expenses. DOE laboratory awards are at least $500,000 per year for 5 years for full annual salary and expenses. The program is managed by the Office of the Deputy Director for Science Programs and supports research in the following offices: Advanced Scientific Computing Research, Biological and Environmental Research, Basic Energy Sciences, Fusion Energy Sciences, High Energy Physics, and Nuclear Physics. A new Funding Opportunity Announcement is issued each year with a detailed description of the topical areas encouraged for early career proposals. Preproposals are required. This talk will introduce the DOE Office of Science Early Career Research program and describe opportunities for research relevant to the condensed matter physics community. http://science.energy.gov/early-career/

  20. Road map for renewable energy research and development in Egypt

    Directory of Open Access Journals (Sweden)

    Adel K. Khalil

    2010-01-01

    Egypt possesses excellent potential for renewable energy (RE) including solar, wind and biomass energy. Renewable energy technologies (RETs) and systems have different needs for support in terms of research and development, demonstration and market development. For this purpose, the Energy Research Center (ERC) at Cairo University has carried out a study with the ultimate goal of formulating a national development strategy and action plan for the local manufacture of renewable energy systems (RESs) and components. The present study positions the different RETs and RESs and identifies the research and development needs for each technology. The study also suggests how to establish a competitive market for RET. For this purpose it builds and analyses a set of likely scenarios, and proposes a practical development strategy and a detailed action plan for achieving it.

  1. EDF's experience with supercomputing and challenges ahead - towards multi-physics and multi-scale approaches

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M.; Banner, D. [Electricite de France (EDF)- R and D Division, 92 - Clamart (France)

    2003-07-01

    Nuclear power plants are a major asset of the EDF company. For them to remain so, particularly in a context of deregulation, three conditions must be met: competitiveness, safety, and public acceptance. These stakes apply both to existing plants and to future reactors. The purpose of the presentation is to explain how supercomputing can help EDF satisfy these requirements. Three examples are described in detail: ensuring optimal use of nuclear fuel under wholly safe conditions, understanding and simulating material deterioration mechanisms, and advancing numerical simulation for the performance of EDF's activities. In conclusion, a broader vision of EDF's long-term R and D in the field of numerical simulation is given, in particular five challenges taken up by EDF together with its industrial and scientific partners. (author)

  2. Performance Evaluation of an Intel Haswell- and Ivy Bridge-Based Supercomputer Using Scientific and Engineering Applications

    Science.gov (United States)

    Saini, Subhash; Hood, Robert T.; Chang, Johnny; Baron, John

    2016-01-01

    We present a performance evaluation, conducted on a production supercomputer, of the Intel Xeon Processor E5-2680v3, a twelve-core implementation of the fourth-generation Haswell architecture, and compare it with the Intel Xeon Processor E5-2680v2, an Ivy Bridge implementation of the third-generation Sandy Bridge architecture. Several new architectural features have been incorporated in Haswell, including improvements at all levels of the memory hierarchy as well as improvements to vector instructions and power management. We critically evaluate these new features of Haswell and compare with Ivy Bridge using several low-level benchmarks, including a subset of HPCC and HPCG, and four full-scale scientific and engineering applications. We also present a model that predicts the performance of HPCG and Cart3D to within 5% accuracy, and of Overflow to within 10%.
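    The accuracy claims above ("within 5%", "within 10%") amount to bounding the relative error between predicted and measured runtimes. A minimal sketch of that check, using illustrative numbers rather than the paper's data:

```python
def within_tolerance(predicted, measured, tol):
    """True if the model's relative error |predicted - measured| / measured
    does not exceed the stated tolerance (e.g. 0.05 for 'within 5%')."""
    return abs(predicted - measured) / measured <= tol

# Hypothetical predicted vs. measured runtimes in seconds.
ok = within_tolerance(predicted=103.0, measured=100.0, tol=0.05)      # 3% error
too_far = within_tolerance(predicted=112.0, measured=100.0, tol=0.10)  # 12% error
print(ok, too_far)  # → True False
```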

  3. Office of Energy Research collaborative research programs administered by Oak Ridge Associated Universities: Annual report, FY 1987

    International Nuclear Information System (INIS)

    1988-02-01

    The US Department of Energy's (DOE) Office of Energy Research (OER) sponsors programs designed to encourage and support interaction between US colleges and universities and DOE research facilities. Faculty members, graduate students, undergraduates, and recent postgraduates participate in research and receive advanced training at DOE laboratories. Staff members from DOE laboratories visit campuses to deliver energy-related lectures and participate in seminars and classroom discussions. Oak Ridge Associated Universities (ORAU) has been involved in the development and administration of these collaborative research programs since their inception. During FY 1987, ORAU administered appointments for the Office of Energy Research under the following two umbrella programs: University/DOE Laboratory Cooperative Program (Lab Co-op); Science and Engineering Research Semester (SERS). In addition, ORAU participated in a project to collect and assess information from individuals who had held research appointments as undergraduate students during a four-year period from 1979 to 1982. All of these activities are summarized in this report

  4. 369 TFlop/s molecular dynamics simulations on the Roadrunner general-purpose heterogeneous supercomputer

    Energy Technology Data Exchange (ETDEWEB)

    Swaminarayan, Sriram [Los Alamos National Laboratory; Germann, Timothy C [Los Alamos National Laboratory; Kadau, Kai [Los Alamos National Laboratory; Fossum, Gordon C [IBM CORPORATION

    2008-01-01

    The authors present timing and performance numbers for a short-range parallel molecular dynamics (MD) code, SPaSM, that has been rewritten for the heterogeneous Roadrunner supercomputer. Each Roadrunner compute node consists of two AMD Opteron dual-core microprocessors and four PowerXCell 8i enhanced Cell microprocessors, so that there are four MPI ranks per node, each with one Opteron and one Cell. The interatomic forces are computed on the Cells (each with one PPU and eight SPU cores), while the Opterons are used to direct inter-rank communication and perform I/O-heavy periodic analysis, visualization, and checkpointing tasks. The performance measured for their initial implementation of a standard Lennard-Jones pair potential benchmark reached a peak of 369 TFlop/s double-precision floating-point performance on the full Roadrunner system (27.7% of peak), corresponding to 124 MFlop/s per watt at a price of approximately 3.69 MFlop/s per dollar. They demonstrate an initial target application, the jetting and ejection of material from a shocked surface.

  5. French National Alliance for Energy Research Coordination - Ancre, Activity Report 2015-2016

    International Nuclear Information System (INIS)

    Alazard-Toux, Nathalie; Allard, Francis; Becue, Thierry; Bernard, Herve; Bourgoin, Jean-Philippe; Brault, Pascal; Carre, Franck; Chabrelie, Marie-Francoise; Charrue, Herve; Colonna, Paul; Compere, Chantal; Criqui, Patrick; David, Sylvain; Devezeaux, Jean-Guy; Dollet, Alain; Duplan, Jean-Luc; Fabre, Francoise; Ferrant, Pierre; Flamant, Gilles; Forti, Laurent; Gentier, Sylvie; Gouy, Jean-Philippe; Hadj-Said, Nouredine; Lacour, Jean-Jacques; Latroche, Michel; Legrand, Jack; Lemoine, Fabrice; Le Net, Elisabeth; Le Thiez, Pierre; Lhomme-Maublanc, Julie; Lucchese, Paul; Malbranche, Philippe; Mermilliod, Nicole; Most, Jean-Michel; Rondot, Yolande; Tilagone, Richard; Touboul, Francoise; Uster, Guillaume; Vidal, Olivier

    2017-01-01

    Created on 17 July 2009, ANCRE (French National Alliance for Energy Research Coordination) brings together 19 research and innovation bodies and higher-education institution consortia in the field of energy. Its missions, carried out in liaison with competitiveness clusters and funding agencies, are to: - reinforce synergies and partnerships between research bodies, universities and companies, - identify scientific and technical challenges hampering industrial development, - propose research and innovation programs and approaches to their implementation, - contribute to the development of the national research strategy in the field of energy, as well as to funding agency program development. Its 2 main societal challenges are: Clean, secure and efficient energy, and Sustainable mobility and urban systems. ANCRE mobilizes 200 scientists involved in 10 programmatic groups (1 - Energy from biomass, 2 - Fossil energy, geothermal energy, critical metals, 3 - Nuclear energy, 4 - Solar energy, 5 - Ocean, hydraulic and wind energy, 6 - Transport, 7 - Buildings, 8 - Industries and agriculture, 9 - Energy forecasting and economics, 10 - Energy networks and associated storage) and 2 cross-disciplinary groups (Strategy, Europe and international). This activity report presents ANCRE's 2015-2016 highlights, its future challenges, its contribution to public policy-making, its close cooperation with the French national research agency and active participation in European programs, its work to mobilize, structure and unite research communities, and its knowledge production and dissemination

  6. Energy Technologies Research and Education Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Ghassemi, Abbas [New Mexico State Univ., Las Cruces, NM (United States); Ranade, Satish [New Mexico State Univ., Las Cruces, NM (United States)

    2014-12-31

    For this project, the intended goal of the microgrid component was to investigate issues in policy and technology that would drive higher penetration of renewable energy, and to demonstrate implementation in a utility system. The work accomplished on modeling the dynamics of photovoltaic (PV) penetration can be expanded for practical application. Using such a tool, those involved in public policy can examine what the effect of a particular policy initiative, e.g., renewable portfolio standards (RPS) requirements, might be in terms of the desired targets. The work in the area of microgrid design, protection, and operation is fundamental to the development of microgrids. In particular, the “Energy Delivery” paradigm provides new opportunities and business models for utilities. Ultimately, Energy Delivery could accrue significant benefits in terms of costs and resiliency. The experimental microgrid will support continued research and allow the demonstration of technology for better integration of renewables. The algal biofuels component of the project was developed to enhance the test facility and to investigate the technical and economic feasibility of a commercial-scale geothermal algal biofuels operation for replication elsewhere in the arid Southwest. The project was housed at New Mexico State University’s (NMSU’s) Geothermal Aquaculture Facility (GAF), and a design for the inoculation train and algae grow-out process was developed. The facility was upgraded with modifications to existing electrical, plumbing and structural components of the GAF and surrounding grounds. Research was conducted on biomass processing, harvesting, dewatering, and extraction. Additionally, research was conducted to determine the viability of using low-cost wastewater from municipal treatment plants in the cultivation units as make-up water and as a source of nutrients, including nitrogen and soluble phosphorus. Data were collected on inputs and outputs, growth evaluation and

  7. Research activities on dosimetry for high energy neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Yamaguchi, Yasuhiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2003-03-01

    The external dosimetry research group of JAERI has been calculating dose conversion coefficients for high-energy radiations using particle transport simulation codes. The group has also been developing radiation dose measurement techniques for high-energy neutrons in collaboration with some university groups. (author)

  8. Japanese Strategy for Nuclear Energy Research and Development For the Future

    Energy Technology Data Exchange (ETDEWEB)

    Ihara, Yoshinori [Japan Atomic Energy Research Institute, Tokyo (Japan)

    1988-04-15

    As for the research and development of nuclear energy, the future is, I believe, very broad, deep and promising, and there are still unnoticed frontiers whose development will give rise to the evolution of human society. In order to cultivate these frontiers we should have the insight to distinguish what is fundamental and essential from what is not. We should also have a fighting spirit to pursue our dream. The Japan Atomic Energy Research Institute truly wishes to become a place where many scientists and engineers from abroad meet and work with us with insight and a pioneering spirit. About thirty years ago, the first version of the Japanese 'Long-Term Program for Development and Utilization of Nuclear Energy' was drawn up by the Atomic Energy Commission for the first time. Since then, the Long-Term Program has been revised once every five years. The research, development and utilization of nuclear energy in Japan have been guided by the Long-Term Program, and it has clearly shown the Japanese strategy for nuclear energy R and D for the future at each stage of that history. The latest version of the Long-Term Program was published in June 1987. It defines the outline of the philosophy and the scheme for promoting the basic measures related to the research, development and utilization of nuclear energy up to the year 2000, based on the long-range nuclear energy policy towards the 21st century. This Long-Term Program was drawn up taking into consideration the essential changes in the environment surrounding nuclear energy during recent years from the viewpoints of the supply and demand for energy, the rise of public concern for nuclear safety, the role of nuclear research and development in the advancement of science and technology, and international nuclear energy issues. In this article, the author would like to describe the basic

  9. Japanese Strategy for Nuclear Energy Research and Development For the Future

    International Nuclear Information System (INIS)

    Ihara, Yoshinori

    1988-01-01

    As for the research and development of nuclear energy, the future is, I believe, very broad, deep and promising, and there are still unnoticed frontiers whose development will give rise to the evolution of human society. In order to cultivate these frontiers we should have the insight to distinguish what is fundamental and essential from what is not. We should also have a fighting spirit to pursue our dream. The Japan Atomic Energy Research Institute truly wishes to become a place where many scientists and engineers from abroad meet and work with us with insight and a pioneering spirit. About thirty years ago, the first version of the Japanese 'Long-Term Program for Development and Utilization of Nuclear Energy' was drawn up by the Atomic Energy Commission for the first time. Since then, the Long-Term Program has been revised once every five years. The research, development and utilization of nuclear energy in Japan have been guided by the Long-Term Program, and it has clearly shown the Japanese strategy for nuclear energy R and D for the future at each stage of that history. The latest version of the Long-Term Program was published in June 1987. It defines the outline of the philosophy and the scheme for promoting the basic measures related to the research, development and utilization of nuclear energy up to the year 2000, based on the long-range nuclear energy policy towards the 21st century. This Long-Term Program was drawn up taking into consideration the essential changes in the environment surrounding nuclear energy during recent years from the viewpoints of the supply and demand for energy, the rise of public concern for nuclear safety, the role of nuclear research and development in the advancement of science and technology, and international nuclear energy issues. In this article, the author would like to describe the basic

  10. Decentralized energy studies: compendium of international studies and research

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, C.

    1980-03-01

    The purpose of the compendium is to provide information about research activities in decentralized energy systems to researchers, government officials, and interested citizens. The compendium lists and briefly describes a number of studies in other industrialized nations that involve decentralized energy systems. A contact person is given for each of the activities listed so that interested readers can obtain more information.

  11. USGS research on energy resources, 1986; program and abstracts

    Science.gov (United States)

    Carter, Lorna M.H.

    1986-01-01

    The extended abstracts in this volume are summaries of the papers presented orally and as posters in the second V. E. McKelvey Forum on Mineral and Energy Resources, entitled "USGS Research on Energy Resources-1986." The Forum has been established to improve communication between the USGS and the earth science community by presenting the results of current USGS research on nonrenewable resources in a timely fashion and by providing an opportunity for individuals from other organizations to meet informally with USGS scientists and managers. It is our hope that the McKelvey Forum will help to make USGS programs more responsive to the needs of the earth science community, particularly the mining and petroleum industries, and will foster closer cooperation between organizations and individuals. The Forum was named after former Director Vincent E. McKelvey in recognition of his lifelong contributions to research, development, and administration in mineral and energy resources, as a scientist, as Chief Geologist, and as Director of the U.S. Geological Survey. The Forum will be an annual event, and its subject matter will alternate between mineral and energy resources. We expect that the format will change somewhat from year to year as various approaches are tried, but its primary purpose will remain the same: to encourage direct communication between USGS scientists and the representatives of other earth-science related organizations. Energy programs of the USGS include oil and gas, coal, geothermal, uranium-thorium, and oil shale; work in these programs spans the national domain, including surveys of the offshore Exclusive Economic Zone. The topics selected for presentation at this McKelvey Forum represent an overview of the scientific breadth of USGS research on energy resources. They include aspects of petroleum occurrence in Eastern United States rift basins, the origin of magnetic anomalies over oil fields, accreted terranes and energy-resource implications, coal

  12. Latin American research and development in the energy field

    International Nuclear Information System (INIS)

    Torres, J.E.

    1984-08-01

    This report is divided into six main sections. The first outlines the conceptual framework and methodology stressing the limitations that impede greater depth of analysis. The second, on the types and directions of research and development (R and D) activities in Latin America, is divided into three subsections, covering New and Renewable Sources of Energy (NRSE); conventional energy (including nuclear energy); and integrated energy resource R and D (primarily energy conservation and substitution, as well as energy policy and planning studies). In each subsection, I endeavoured to describe and critically assess R and D activities, achievements, and failures within the context of the limitations. Conclusions and recommendations in each case are implicitly or explicitly made depending on the field. In the third section, the state of science and technology policy on energy resources is presented. The fourth section draws together the conclusions and recommendations on further work to be done. The fifth section is a bibliography of 64 annotated and 52 unannotated items and the sixth, an appendix, is a directory of people working in the field of energy research

  13. Fiscal 1976 Sunshine Project research report. Interim report (hydrogen energy); 1976 nendo chukan hokokushoshu. Suiso energy

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1976-11-01

    This report summarizes the Sunshine Project research interim reports on hydrogen energy from each of the organizations involved. The report covers the research items; the laboratories, institutes and enterprises concerned; the research targets; the research plans; and progress to date. The research items are as follows. (1) Hydrogen production technology (electrolysis, high-temperature high-pressure water electrolysis, 4 kinds of thermochemical techniques, direct thermolysis). (2) Hydrogen transport and storage technology (2 kinds of solidification techniques). (3) Hydrogen use technology (combustion technology, fuel cell, solid electrolyte fuel cell, fuel cell power system, hydrogen fuel engine). (4) Hydrogen safety measures technology (disaster preventive technology for gaseous and liquid hydrogen, preventing materials from embrittlement due to hydrogen, hydrogen refining, transport and storage systems, and their safety technology). (5) Hydrogen energy system (hydrogen energy system, hydrogen use subsystems, peripheral technologies). (NEDO)

  14. The law for the Japan Atomic Energy Research Institute

    International Nuclear Information System (INIS)

    1977-01-01

    The law establishes the Japan Atomic Energy Research Institute in accordance with the Basic Act on Atomic Energy as a government corporation for the purpose of promoting R and D and utilization of atomic energy (first chapter). The second chapter concerns the directors, advisers and personnel of the institute, namely a chairman of the board of directors, a vice-chairman, not more than seven directors, and not more than two auditors. The chairman represents and supervises the institute, and is appointed by the prime minister with the agreement of the Atomic Energy Commission. The vice-chairman and other directors are nominated by the chairman with the approval of the prime minister, while the auditors are appointed by the prime minister with the advice of the Atomic Energy Commission. Their terms of office are 4 years for directors and 2 years for auditors. The third chapter defines the scope of activities of the institute as follows: basic and applied research on atomic energy; design, construction and operation of nuclear reactors; training of researchers and technicians; and import, production and distribution of radioisotopes. These activities should be carried out in accordance with the basic plans for the development and utilization of atomic energy established by the prime minister with the determination of the Atomic Energy Commission. The fourth chapter provides for the finance and accounting of the institute, and the fifth chapter requires the supervision of the institute by the prime minister. (Matsushima, A.)

  15. Proceedings of Nova Scotia's 2006 energy research and development forum

    International Nuclear Information System (INIS)

    2006-01-01

    The Nova Scotia 2006 energy research and development forum provided a venue for experts from industry, research institutions and government to discuss how research and development will shape the future of energy in the province. The forum was divided into 3 sessions: (1) building knowledge about the marine environment, (2) building knowledge about geoscience, and (3) building knowledge about sustainable energy. A wide range of issues related to the Nova Scotia region was discussed, including whale identification; fisheries mapping; the commercialization of hydrocarbon discoveries; carbon capture and storage; and petroleum system analysis and prospect evaluation. Keynote addresses were presented on produced water in Norway; deepwater exploration in Morocco; renewable energy; and Canada's role as an energy superpower. The conference featured more than 57 presentations, of which 4 have been catalogued separately for inclusion in this database. refs., tabs., figs

  16. Energy and Environmental Systems Division 1981 research review

    International Nuclear Information System (INIS)

    1982-04-01

    To effectively manage the nation's energy and natural resources, government and industry leaders need accurate information regarding the performance and economics of advanced energy systems and the costs and benefits of public-sector initiatives. The Energy and Environmental Systems Division (EES) of Argonne National Laboratory conducts applied research and development programs that provide such information through systems analysis, geophysical field research, and engineering studies. During 1981, the division: analyzed the production economics of specific energy resources, such as biomass and tight sands gas; developed and transferred to industry economically efficient techniques for addressing energy-related resource management and environmental protection problems, such as the reclamation of strip-mined land; determined the engineering performance and cost of advanced energy-supply and pollution-control systems; analyzed future markets for district heating systems and other emerging energy technologies; determined, in strategic planning studies, the availability of resources needed for new energy technologies, such as the imported metals used in advanced electric-vehicle batteries; evaluated the effectiveness of strategies for reducing scarce-fuel consumption in the transportation sector; identified the costs and benefits of measures designed to stabilize the financial condition of US electric utilities; estimated the costs of nuclear reactor shutdowns and evaluated geologic conditions at potential sites for permanent underground storage of nuclear waste; evaluated the cost-effectiveness of environmental regulations, particularly those affecting coal combustion; and identified the environmental effects of energy technologies and transportation systems

  17. Research Note on the Energy Infrastructure Attack Database (EIAD)

    Directory of Open Access Journals (Sweden)

    Jennifer Giroux

    2013-12-01

    The January 2013 attack on the In Amenas natural gas facility drew international attention. However, this attack is part of a pattern of energy infrastructure targeting by non-state actors that spans the globe. Data drawn from the Energy Infrastructure Attack Database (EIAD) show that in the last decade there were, on average, nearly 400 annual attacks carried out by armed non-state actors on energy infrastructure worldwide, a figure that was well under 200 prior to 1999. These data reveal a global picture whereby violent non-state actors target energy infrastructure to air grievances, communicate with governments, impact state economic interests, or capture revenue in the form of hijackings, kidnapping ransoms, or theft. And, for politically motivated groups, such as those engaged in insurgencies, attacking industry assets garners media coverage, serving as a facilitator for international attention. This research note introduces EIAD and positions its utility within various research areas where the targeting of energy infrastructure, or more broadly energy infrastructure vulnerability, has been addressed, either directly or indirectly. We also provide a snapshot of an initial analysis of the data from 1980 to 2011, noting specific temporal and spatial trends, and then conclude with a brief discussion of the contribution of EIAD, highlighting future research trajectories.

  18. Energy research and development projects in the Nordic countries. Directory 1987

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    This is the fifth directory of research, development and demonstration projects in the Nordic countries within the field of energy. The directory includes projects running in 1987. 2378 projects are described, all of them financed through special public funds (i.e. external funding). The energy research organisation in each Nordic country is briefly reviewed in the appendixes, and a list of relevant newsletters is given. The directory is published at the request of the Nordic Council of Ministers and a special Energy Research Committee set up by the Nordic energy ministers in order to coordinate and promote Nordic information sharing in the energy field. (author)

  19. Summaries of research in high energy physics

    International Nuclear Information System (INIS)

    1987-11-01

    The compilation of summaries of research and technology R and D efforts contained in this volume is intended to present a detailed narrative description of the scope and nature of the HEP activities funded by the Department of Energy in the FY 1985/FY 1986 time period. Topic areas covered include the following: experimental research using the accelerators and particle detector facilities and other related research; theoretical research; conception, design, construction, and operation of particle accelerators and detectors facilities; and research and development programs intended to advance accelerator technology, particle detector technology, and data analysis capabilities

  20. Basic Energy Sciences FY 2011 Research Summaries

    Energy Technology Data Exchange (ETDEWEB)

    None

    2011-01-01

    This report provides a collection of research abstracts for more than 1,300 research projects funded by the Office of Basic Energy Sciences (BES) in Fiscal Year 2011 at some 180 institutions across the U.S. This volume is organized along the three BES divisions: Materials Sciences and Engineering; Chemical Sciences, Geosciences, and Biosciences; and Scientific User Facilities.

  1. Should we quit our jobs? Challenges, barriers and recommendations for interdisciplinary energy research

    International Nuclear Information System (INIS)

    Schuitema, Geertje; Sintov, Nicole D.

    2017-01-01

    Many plead for a better integration of the social sciences in energy research, which would imply more comprehensive interdisciplinary energy research. We argue that in order to achieve this, institutional barriers and research challenges need to be recognised and addressed. We identify six challenges and barriers, and provide recommendations for working towards solutions. We conclude that engaging in interdisciplinary research implies extra costs and fewer rewards for all researchers, particularly early- and mid-career academics. We propose a new conceptualisation of practices and incentive structures among academic institutions, funding agencies, and publication outlets, and urge all energy researchers to join this debate. - Highlights: • Interdisciplinary energy research currently does not reach its full potential. • Social sciences are underutilised in energy research. • Barriers and challenges need to be addressed to stimulate interdisciplinary energy research. • High costs and small rewards for interdisciplinary (early- and mid-career) researchers.

  2. High Energy Astrophysics Science Archive Research Center

    Data.gov (United States)

    National Aeronautics and Space Administration — The High Energy Astrophysics Science Archive Research Center (HEASARC) is the primary archive for NASA missions dealing with extremely energetic phenomena, from...

  3. Annual report of the Japan Atomic Energy Research Institute for fiscal 1999

    International Nuclear Information System (INIS)

    2000-01-01

    Along the lines of the 'Long-term plan on nuclear research, development and application' established in June 1994, the Japan Atomic Energy Research Institute (JAERI) has promoted research contributing to the general development of science and technology based on nuclear research and development, such as neutron science research, light quantum and radiation beam science research, radiation application research, high-level computational science research, and advanced basic research. It has also pursued research and development on advanced energy systems intended to bring about breakthroughs in nuclear technology, such as studies of future-type energy systems, nuclear fusion research and development, and high-temperature engineering test research. In addition, safety research and health physics research, which span both general nuclear science and nuclear energy, were promoted. Furthermore, effective research and development has been carried out through domestic cooperation with industry, universities and institutes, the promotion of diverse international cooperation, and various research support activities. This report describes in detail the research on neutron science, light quantum and radiation beam science, radiation application, material science, environmental science, advanced basic research, high-level computational science, nuclear fusion, future-type energy systems, high-temperature engineering tests, safety, and related topics, as well as operation and safety management, related technology and outside contract work, and construction arrangements. (G.K.)

  4. Status of Avian Research at the National Renewable Energy Laboratory

    International Nuclear Information System (INIS)

    Sinclair, K.

    2001-01-01

    As the use of wind energy expands across the United States, concerns about the impacts of commercial wind farms on bird and bat populations are frequently raised. Two primary areas of concern are (1) possible litigation resulting from the killing of even one bird if it is protected by the Migratory Bird Treaty Act, the Endangered Species Act, or both; and (2) the effect of avian mortality on bird populations. To properly address these concerns, the U.S. Department of Energy's National Renewable Energy Laboratory (NREL) supports scientifically based avian/wind power interaction research. In this paper I describe NREL's field-based research projects and summarize the status of the research. I also summarize NREL's other research activities, including lab-based vision research to increase the visibility of moving turbine blades and avian acoustic research, as well as our collaborative efforts with the National Wind Coordinating Committee's Avian Subcommittee

  5. Research on energy supply, demand and economy forecasting in Japan

    International Nuclear Information System (INIS)

    Shiba, Tsuyoshi; Kamezaki, Hiroshi; Yuyama, Tomonori; Suzuki, Atsushi

    1999-10-01

    This project aims to forecast the energy demand structure and the electricity generation cost of each type of power plant in Japan in the 21st century, with a view to constructing a successful FBR scenario. In the course of researching forecasts of the energy demand structure in Japan, documents published by organizations inside and outside of Japan were collected. These documents include prospects for the economic growth rate, forecasts of energy supply and demand, the maximum amount of new energy resources that can be introduced, CO2 regulation, and evaluation of the best energy mix. Organizations in Japan such as the Economic Council and the Japan Energy Economic Research Institute have provided long-term forecasts up to the early 21st century. Meanwhile, organizations overseas have provided forecasts of the economic structure and of energy demand and supply in the OECD and East Asia, including Japan. In connection with forecasts of the electricity generation cost of each type of power plant, views on the ultimate reserves and cost of resources are reviewed in this report. According to some views on oil reserves, under assumptions based on the reserves/production ratio, the maximum length of time that oil reserves will last is 150 years. In addition, this report provides summaries of the cost and potential role of various resources, including solar energy and wind energy, and views on waste, safety, energy security-related externality costs, and the price of transferring CO2 emission rights. (author)

  6. KEK (High Energy Accelerator Research Organization) annual report, 2005

    International Nuclear Information System (INIS)

    2006-01-01

    This report summarizes the research activities of KEK (High Energy Accelerator Research Organization) in the fiscal year 2005. Two years have passed since KEK was reorganized as an inter-university research institute corporation, and KEK continues to facilitate a wide range of research programs based on high-energy accelerators for users from universities. KEK consists of two research institutes, the Institute of Particle and Nuclear Studies (IPNS) and the Institute of Materials Structure Science (IMSS), and two laboratories, the Accelerator Laboratory and the Applied Research Laboratory. KEK has been operating four major accelerator facilities in Tsukuba: the 12 GeV Proton Synchrotron (PS), the KEK B-factory (KEKB), the Photon Factory (PF), and the Electron/Positron Injector Linac. We are now engaged in the construction of the Japan Proton Accelerator Research Complex (J-PARC) in Tokai in cooperation with the Japan Atomic Energy Agency (JAEA). The J-PARC Center was established in February 2006 to take full responsibility for the operation of J-PARC. With the progress of construction, the PS ceased operation at the end of March 2006 after a history of 26 years. The task of KEK is to play a key role in the fields of elementary particle physics, nuclear physics, materials science, and life science as one of the leading research facilities in the world. The fiscal year 2005 activities of both KEK employees and visiting researchers yielded excellent outcomes in these research fields. (J.P.N.)

  7. Swiss energy research in 2008; Energie-Forschung 2008 - Ueberblicksberichte der Programmleiter / Recherche energetique 2008 - Rapports de synthese des chefs de programme

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2009-06-15

    This comprehensive document published by the Swiss Federal Office of Energy (SFOE) reports on Swiss energy research in the year 2008. The overview reports made by the programme leaders are presented. In the area of efficient energy use, programme reports are presented for the following areas: Energy in buildings, traffic, electricity technologies and their usage, networks, heat-pumps and combined heat and power, combustion technologies, power station 2020 and carbon capture and storage, fuel cells and hydrogen as well as process engineering. In the renewables sector, work in the following areas is reported on: Solar thermal energy and storage, photovoltaics, industrial use of solar energy, biomass and wood energy, hydropower, geothermal energy and wind energy. Research in the area of nuclear energy and nuclear safety is reported on, as is research in the areas of regulatory safety, fusion and nuclear wastes. Finally, a report on energy-economics research is presented. The report is completed with a list of projects and an appendix containing details on the Swiss Energy Research Commission CORE and a list of those responsible for the various research programmes.

  8. Research for the energy transition. The organization of the energy systems; Forschung fuer die Energiewende. Die Gestaltung des Energiesystems

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2017-03-15

    The volume on research for the energy transition includes contributions to the FVEE annual meeting 2016 concerning the following issues: status and perspectives of the energy transition, key technologies for the energy transition, political boundary conditions, development trends in photovoltaics, components for the energy supply (wind energy, hydrogen technologies, smart bioenergy concept, contribution of the geosphere), grids and storage systems for the energy transition, research network renewable energies.

  9. The national strategy synthesis on the research in the energy domain

    International Nuclear Information System (INIS)

    2007-01-01

    The energy research strategy takes into account two main orientations: the identification, design, and industrial validation of new technologies generating no or fewer greenhouse gases, and progress relative to today's technologies in order to decrease energy consumption. The report discusses the following axes of research: low-greenhouse-gas-emission technologies and alternative energy resources, nuclear energy for electric power production, biomass, photovoltaic energy through the development of less expensive technologies, CO2 capture and storage, energy efficiency, energy storage, the transport sector, and the development of fuel cells. (A.L.B.)

  10. Summary of international energy research and development activities, 1974--1976

    International Nuclear Information System (INIS)

    1977-11-01

    This directory includes information covering 3017 ongoing and recently completed energy research projects conducted in Canada, Italy, the Federal Republic of Germany, France, The Netherlands, the United Kingdom, Denmark, Sweden, Israel, and 18 other countries. This information was registered with the Smithsonian Science Information Exchange (SSIE) by supporting organizations in the nine countries listed and by international organizations such as the International Atomic Energy Agency. All narrative information presented in the directory and, in some cases, organization names were translated into English. In addition to the title and text of project summaries, the directory contains the following indexes: Subject Index, Investigator Index, Performing Organization Index, and Supporting Organization Index. To reflect particular facets of energy research, the Subject Index is cross-referenced. The Subject Index is based upon the SSIE classification system, which organizes index terms in hierarchies to relate groups of narrow subject areas within broad areas. The following types of energy information are included: organic sources of energy (gas and oil; coal; peat, hydrocarbons, and nonfossil organic sources); thermonuclear energy and plasma physics; fission sources and energy production (reactor fuels assemblies and fuel management; reactor materials; reactor components; reactor thermodynamics, thermohydraulics, and mechanics; reactor safety and control; reactor testing, operations, and analysis; reactor and nuclear physics; uranium exploration and mining; reactors--general); geophysical energy sources (geothermal, hydro, solar, wave, and wind); conversion technology; environmental aspects of energy conversion and use; transport and transmission of energy; energy utilization and conservation; and energy systems and other energy research

  11. Academic Design Of Canada's Energy Systems And Nuclear Science Research Centre

    International Nuclear Information System (INIS)

    Bereznai, G.; Perera, S.

    2010-01-01

    The University of Ontario Institute of Technology (UOIT) is at the forefront of alternative energy and nuclear research that focuses on the energy challenges faced by the province of Ontario, the industrial heartland of Canada. While the university was established as recently as 2002 and opened its doors to its first students in 2003, it has already developed a comprehensive set of undergraduate and graduate programs, and a reputation for research intensiveness. UOIT offers dedicated programs in nuclear engineering and energy systems engineering to ensure a continued supply of trained employees in these fields. The ability to provide talented and skilled personnel to the energy sector has emerged as a critical requirement for ensuring Ontario's energy future, and to meet this need UOIT requires additional teaching and research space in order to offer its energy-related programs. The Governments of Canada and of the Province of Ontario recognized UOIT's achievements and contributions to post-secondary education in the field of clean energy in general and nuclear power in particular, and as part of the economic stimuli funded by both levels of government, approved $45M CAD for the construction of a 10,000 m² 'Energy Systems and Nuclear Science Research Centre' at UOIT. The building is scheduled to be ready for occupancy in the summer of 2011. The paper presents the key considerations that led to the design of the building, and gives details of the education and research programs that were key in determining the design and layout of the research centre. (authors)

  12. Research and development conference: California Institute for Energy Efficiency (CIEE) program

    Energy Technology Data Exchange (ETDEWEB)

    1991-01-01

    CIEE's first Research and Development Conference will introduce you to some of the results achieved to date through CIEE-sponsored multiyear research performed in three programs: building energy efficiency, air quality impacts of energy efficiency, and end-use resource planning. Results from scoping studies, Director's discretionary research, and exploratory research will also be featured.

  13. New Mexico High School Supercomputing Challenge, 1990--1995: Five years of making a difference to students, teachers, schools, and communities. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Foster, M.; Kratzer, D.

    1996-02-01

    The New Mexico High School Supercomputing Challenge is an academic program dedicated to increasing interest in science and math among high school students by introducing them to high performance computing. This report provides a summary and evaluation of the first five years of the program, describes the program and shows the impact that it has had on high school students, their teachers, and their communities. Goals and objectives are reviewed and evaluated, growth and development of the program are analyzed, and future directions are discussed.

  14. Experimental Research of a New Wave Energy Conversion Device

    Science.gov (United States)

    Lu, Zhongyue; Shang, Jianzhong; Luo, Zirong; Sun, Chongfei; Chen, Gewei

    2018-01-01

    With the increasing strain on contemporary energy supplies, the development and utilization of renewable energy has become an important direction of development. As an important part of renewable energy, wave energy is environmentally friendly and abundant in reserves, attracting growing investment and research. To address the energy supply problem of small marine equipment, this paper proposes a micro wave energy conversion device based on the heaving motion of ocean waves. The new type of power output device designed in this paper can solve the micro wave energy conversion problem.

  15. Advanced energy projects FY 1997 research summaries

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-09-01

    The mission of the Advanced Energy Projects (AEP) program is to explore the scientific feasibility of novel energy-related concepts that are high risk, in terms of scientific feasibility, yet have a realistic potential for a high technological payoff. The concepts supported by the AEP are typically at an early stage of scientific development. They often arise from advances in basic research and are premature for consideration by applied research or technology development programs. Some are based on discoveries of new scientific phenomena or involve exploratory ideas that span multiple scientific and technical disciplines and do not fit into an existing DOE program area. In all cases, the objective is to support evaluation of the scientific or technical feasibility of the novel concepts involved. Following AEP support, it is expected that each concept will be sufficiently developed to attract further funding from other sources to realize its full potential. Projects that involve evolutionary research or technology development and demonstration are not supported by AEP. Furthermore, research projects more appropriate for another existing DOE research program are not encouraged. There were 65 projects in the AEP research portfolio during Fiscal Year 1997. Eighteen projects were initiated during that fiscal year. This document consists of short summaries of projects active in FY 1997. Further information on a specific project may be obtained by contacting the principal investigator.

  16. Consumer energy management: policy implications of research. 2 Vols

    Energy Technology Data Exchange (ETDEWEB)

    McDougall, G.H.G.; Ritchie, J.R.B.

    1982-12-01

    This report provides a framework for understanding the practical implications of consumer energy conservation research in Canada. A review of such research was undertaken to determine its implications for increasing the effectiveness of Canadian conservation policies and programs. The major conclusions and recommendations were as follows. Conservation has been acknowledged as the single most important element in solving Canada's petroleum shortfall in the 1980s. An analytic approach to the formulation of energy policies and the design of conservation programs will be essential if meaningful energy savings in the consumer sector are to be realized. Prior to designing any conservation program, it is essential that the components of consumer energy policy be understood. In order to assess the effectiveness of conservation efforts, it is necessary to assign relative priorities to the criteria of probable energy savings, cost effectiveness, impact by fuel type, impact on consumers, enforceability, and institutional considerations. Conservation efforts aimed at consumers must be based on understanding the basic processes which underlie how they perceive and respond to various types of conservation initiatives. This understanding is gained through consumer impact analysis and program research. The latter action attempts to analyze the effectiveness and acceptability of programs involving information, financial incentives, energy standards, and energy usage restrictions. Conservation programs must ensure that barriers to adoption, such as lack of time and knowledge, financial resources, and lifestyle impacts, will be minimized. 93 refs., 3 figs., 13 tabs.

  17. Basic Energy Sciences FY 2012 Research Summaries

    Energy Technology Data Exchange (ETDEWEB)

    None

    2012-01-01

    This report provides a collection of research abstracts and highlights for more than 1,400 research projects funded by the Office of Basic Energy Sciences (BES) in Fiscal Year 2012 at some 180 institutions across the U.S. This volume is organized along the three BES Divisions: Materials Sciences and Engineering; Chemical Sciences, Geosciences, and Biosciences; and Scientific User Facilities.

  18. Basic Energy Sciences FY 2014 Research Summaries

    Energy Technology Data Exchange (ETDEWEB)

    None

    2014-01-01

    This report provides a collection of research abstracts and highlights for more than 1,200 research projects funded by the Office of Basic Energy Sciences (BES) in Fiscal Year 2014 at some 200 institutions across the U.S. This volume is organized along the three BES Divisions: Materials Sciences and Engineering; Chemical Sciences, Geosciences, and Biosciences; and Scientific User Facilities.

  19. Parallel simulation of tsunami inundation on a large-scale supercomputer

    Science.gov (United States)

    Oishi, Y.; Imamura, F.; Sugawara, D.

    2013-12-01

    An accurate prediction of tsunami inundation is important for disaster mitigation purposes. One approach is to approximate the tsunami wave source through an instant inversion analysis using real-time observation data (e.g., Tsushima et al., 2009) and then use the resulting wave source data in an instant tsunami inundation simulation. A bottleneck of this approach, however, is the large computational cost of the non-linear inundation simulation, and the computational power of recent massively parallel supercomputers is needed to enable faster-than-real-time execution of a tsunami inundation simulation. Parallel computers have become approximately 1000 times faster in 10 years (www.top500.org), so very fast parallel computers are expected to become more and more prevalent in the near future. It is therefore important to investigate how to conduct a tsunami simulation efficiently on parallel computers. In this study, we target very fast tsunami inundation simulations on the K computer, currently the fastest Japanese supercomputer, which has a theoretical peak performance of 11.2 PFLOPS. One computing node of the K computer consists of 1 CPU with 8 cores that share memory, and the nodes are connected through a high-performance torus-mesh network. The K computer is designed for distributed-memory parallel computation, so we have developed a parallel tsunami model. Our model is based on the TUNAMI-N2 model of Tohoku University, which uses a leap-frog finite difference method. A grid nesting scheme is employed to apply high-resolution grids only in the coastal regions. To balance the computational load across CPUs in the parallelization, CPUs are first allocated to each nested layer in proportion to the number of grid points of that layer. Using the CPUs allocated to each layer, 1-D domain decomposition is performed on the layer. 
In the parallel computation, three types of communication are necessary: (1) communication to adjacent neighbours for the
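The load-balancing scheme described in this abstract — CPUs allocated to each nested grid layer in proportion to its grid-point count, then a 1-D domain decomposition within each layer — can be sketched roughly as below. This is a minimal illustration, not the authors' actual code: the function names, the rounding strategy, and the adjustment loop are assumptions.

```python
def allocate_cpus(grid_points, total_cpus):
    """Assign CPUs to nested layers in proportion to grid-point counts."""
    total = sum(grid_points)
    # Proportional share, rounded, with at least one CPU per layer
    alloc = [max(1, round(total_cpus * n / total)) for n in grid_points]
    # Adjust so the allocation sums exactly to total_cpus
    while sum(alloc) > total_cpus:
        # Take a CPU back from the largest allocation
        i = max(range(len(alloc)), key=lambda k: alloc[k])
        alloc[i] -= 1
    while sum(alloc) < total_cpus:
        # Give a CPU to the layer with the most grid points per CPU
        i = max(range(len(alloc)), key=lambda k: grid_points[k] / alloc[k])
        alloc[i] += 1
    return alloc

def decompose_1d(n_rows, n_cpus):
    """1-D domain decomposition: split n_rows into contiguous strips,
    one (start, end) pair per CPU, sizes differing by at most one."""
    base, extra = divmod(n_rows, n_cpus)
    strips, start = [], 0
    for r in range(n_cpus):
        size = base + (1 if r < extra else 0)
        strips.append((start, start + size))
        start += size
    return strips
```

For example, a 10-CPU run over three nested layers with 1000, 3000, and 6000 grid points would get 1, 3, and 6 CPUs respectively, and each layer's rows would then be split into contiguous strips across its CPUs.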

  20. Grassroots Supercomputing

    CERN Multimedia

    Buchanan, Mark

    2005-01-01

    What started out as a way for SETI to plow through its piles of radio-signal data from deep space has turned into a powerful research tool as computer users across the globe donate their screen-saver time to projects as diverse as climate-change prediction, gravitational-wave searches, and protein folding (4 pages)