WorldWideScience

Sample records for future supercomputer environments

  1. Supercomputers Of The Future

    Science.gov (United States)

    Peterson, Victor L.; Kim, John; Holst, Terry L.; Deiwert, George S.; Cooper, David M.; Watson, Andrew B.; Bailey, F. Ron

    1992-01-01

    Report evaluates supercomputer needs of five key disciplines: turbulence physics, aerodynamics, aerothermodynamics, chemistry, and mathematical modeling of human vision. Predicts these fields will require computer speeds greater than 10^18 floating-point operations per second (FLOPS) and memory capacities greater than 10^15 words. Also predicts that new parallel computer architectures and new structured numerical methods will make the necessary speed and capacity available.

  2. Supercomputing - Use Cases, Advances, The Future (2/2)

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Supercomputing has become a staple of science and the poster child for aggressive developments in silicon technology, energy efficiency and programming. In this series we examine the key components of supercomputing setups and the various advances – recent and past – that made headlines and delivered bigger and bigger machines. We also take a closer look at the future prospects of supercomputing, and the extent of its overlap with high throughput computing, in the context of main use cases ranging from oil exploration to market simulation. On the second day, we will focus on software and software paradigms driving supercomputers, workloads that need supercomputing treatment, advances in technology and possible future developments. Lecturer's short bio: Andrzej Nowak has 10 years of experience in computing technologies, primarily from CERN openlab and Intel. At CERN, he managed a research lab collaborating with Intel and was part of the openlab Chief Technology Office. Andrzej also worked closely and i...

  3. Cooperative visualization and simulation in a supercomputer environment

    International Nuclear Information System (INIS)

    Ruehle, R.; Lang, U.; Wierse, A.

    1993-01-01

    The article takes a closer look at the requirements imposed by the goal of integrating all components into a homogeneous software environment. To this end, several methods for distributing applications according to problem type are discussed. The methods currently available at the University of Stuttgart Computer Center for distributing applications are then explained. Finally, the aims and characteristics of a European-sponsored project called PAGEIN are described; it fits squarely into the line of developments at RUS. The aim of the project is to experiment with future cooperative working modes of aerospace scientists in a high-speed distributed supercomputing environment. Project results will have an impact on the development of real future scientific application environments. (orig./DG)

  4. Supercomputing - Use Cases, Advances, The Future (1/2)

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Supercomputing has become a staple of science and the poster child for aggressive developments in silicon technology, energy efficiency and programming. In this series we examine the key components of supercomputing setups and the various advances – recent and past – that made headlines and delivered bigger and bigger machines. We also take a closer look at the future prospects of supercomputing, and the extent of its overlap with high throughput computing, in the context of main use cases ranging from oil exploration to market simulation. On the first day, we will focus on the history and theory of supercomputing, the top500 list and the hardware that makes supercomputers tick. Lecturer's short bio: Andrzej Nowak has 10 years of experience in computing technologies, primarily from CERN openlab and Intel. At CERN, he managed a research lab collaborating with Intel and was part of the openlab Chief Technology Office. Andrzej also worked closely and initiated projects with the private sector (e.g. HP an...

  5. Supercomputers and the future of computational atomic scattering physics

    International Nuclear Information System (INIS)

    Younger, S.M.

    1989-01-01

    The advent of the supercomputer has opened new vistas for the computational atomic physicist. Problems of hitherto unparalleled complexity are now being examined using these new machines, and important connections with other fields of physics are being established. This talk briefly reviews some of the most important trends in computational scattering physics and suggests some exciting possibilities for the future. 7 refs., 2 figs

  6. Argonne Leadership Computing Facility 2011 annual report : Shaping future supercomputing.

    Energy Technology Data Exchange (ETDEWEB)

    Papka, M.; Messina, P.; Coffey, R.; Drugan, C. (LCF)

    2012-08-16

    The ALCF's Early Science Program aims to prepare key applications for the architecture and scale of Mira and to solidify libraries and infrastructure that will pave the way for other future production applications. Two billion core-hours have been allocated to 16 Early Science projects on Mira. The projects, in addition to promising delivery of exciting new science, are all based on state-of-the-art, petascale, parallel applications. The project teams, in collaboration with ALCF staff and IBM, have undertaken intensive efforts to adapt their software to take advantage of Mira's Blue Gene/Q architecture, which, in a number of ways, is a precursor to future high-performance-computing architectures. The Argonne Leadership Computing Facility (ALCF) enables transformative science that solves some of the most difficult challenges in biology, chemistry, energy, climate, materials, physics, and other scientific realms. Users partnering with ALCF staff have reached research milestones previously unattainable, thanks to the ALCF's world-class supercomputing resources and expertise in computational science. In 2011, the ALCF's commitment to providing outstanding science and leadership-class resources was honored with several prestigious awards. Research on multiscale brain blood flow simulations was named a Gordon Bell Prize finalist. Intrepid, the ALCF's BG/P system, ranked No. 1 on the Graph 500 list for the second consecutive year. The next-generation BG/Q prototype again topped the Green500 list. Skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. The Catalyst Team matches project PIs with experienced computational scientists to maximize and accelerate research in their specific scientific domains. The Performance Engineering Team facilitates the effective use of applications on the Blue Gene system by assessing and improving the algorithms used by applications and the techniques used to

  7. Supercomputational science

    CERN Document Server

    Wilson, S

    1990-01-01

    In contemporary research, the supercomputer now ranks, along with radio telescopes, particle accelerators and the other apparatus of "big science", as an expensive resource which is nevertheless essential for state-of-the-art research. Supercomputers are usually provided as shared central facilities. However, unlike telescopes and accelerators, they find a wide range of applications extending across a broad spectrum of research activity. The difference in performance between a "good" and a "bad" computer program on a traditional serial computer may be a factor of two or three, but on a contemporary supercomputer it can easily be a factor of one hundred or even more! Furthermore, this factor is likely to increase with future generations of machines. In keeping with the large capital and recurrent costs of these machines, it is appropriate to devote effort to training and familiarization so that supercomputers are employed to best effect. This volume records the lectures delivered at a Summer School ...

  8. The ETA10 supercomputer system

    International Nuclear Information System (INIS)

    Swanson, C.D.

    1987-01-01

    The ETA Systems, Inc. ETA 10 is a next-generation supercomputer featuring multiprocessing, a large hierarchical memory system, high performance input/output, and network support for both batch and interactive processing. Advanced technology used in the ETA 10 includes liquid nitrogen cooled CMOS logic with 20,000 gates per chip, a single printed circuit board for each CPU, and high density static and dynamic MOS memory chips. Software for the ETA 10 includes an underlying kernel that supports multiple user environments, a new ETA FORTRAN compiler with an advanced automatic vectorizer, a multitasking library and debugging tools. Possible developments for future supercomputers from ETA Systems are discussed. (orig.)

  9. KAUST Supercomputing Laboratory

    KAUST Repository

    Bailey, April Renee; Kaushik, Dinesh; Winfer, Andrew

    2011-01-01

    KAUST has partnered with IBM to establish a Supercomputing Research Center. KAUST is hosting the Shaheen supercomputer, named after the Arabian falcon famed for its swiftness of flight. This 16-rack IBM Blue Gene/P system is equipped with 4 gigabytes of memory per node and is capable of 222 teraflops, making the KAUST campus the site of one of the world’s fastest supercomputers in an academic environment. KAUST is targeting petaflop capability within 3 years.

  10. KAUST Supercomputing Laboratory

    KAUST Repository

    Bailey, April Renee

    2011-11-15

    KAUST has partnered with IBM to establish a Supercomputing Research Center. KAUST is hosting the Shaheen supercomputer, named after the Arabian falcon famed for its swiftness of flight. This 16-rack IBM Blue Gene/P system is equipped with 4 gigabytes of memory per node and is capable of 222 teraflops, making the KAUST campus the site of one of the world’s fastest supercomputers in an academic environment. KAUST is targeting petaflop capability within 3 years.
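
    As a rough cross-check of the quoted peak figure, and assuming the standard Blue Gene/P configuration of 1,024 quad-core nodes per rack at 13.6 gigaflops per node (figures not stated in the record itself), the arithmetic is:

        \[ 16 \times 1024 = 16\,384 \ \text{nodes}, \qquad 16\,384 \times 13.6\ \mathrm{GF/node} \approx 222.8\ \mathrm{TF}, \]
        \[ 16\,384 \times 4\ \mathrm{GB/node} \approx 64\ \mathrm{TB}\ \text{of aggregate memory}. \]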

  11. Visualization environment of the large-scale data of JAEA's supercomputer system

    Energy Technology Data Exchange (ETDEWEB)

    Sakamoto, Kensaku [Japan Atomic Energy Agency, Center for Computational Science and e-Systems, Tokai, Ibaraki (Japan); Hoshi, Yoshiyuki [Research Organization for Information Science and Technology (RIST), Tokai, Ibaraki (Japan)

    2013-11-15

    In research and development across various fields of nuclear energy, visualization of calculated data is especially useful for understanding simulation results in an intuitive way. Many researchers who run simulations on the supercomputer at the Japan Atomic Energy Agency (JAEA) usually transfer the calculated data files from the supercomputer to their local PCs for visualization. In recent years, as the size of calculated data has grown with improvements in supercomputer performance, reducing visualization processing time and using the JAEA network efficiently have become necessary. As a solution, we introduced a remote visualization system that can utilize the parallel processors of the supercomputer and reduce network usage by transferring data from an intermediate stage of the visualization process. This paper reports a study on the performance of image processing with the remote visualization system. The visualization processing time is measured and the influence of network speed is evaluated by varying the drawing mode, the size of the visualization data and the number of processors. Based on this study, a guideline is provided to show how the remote visualization system can be used effectively. An upgrade policy for the next system is also presented. (author)
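
    The measurement procedure described above is essentially a parameter sweep over drawing mode, data size and processor count. The sketch below only illustrates that kind of sweep in Python; render_remote() is a hypothetical stand-in for the actual remote visualization request, which the record does not specify.

        import time
        from itertools import product

        def render_remote(mode, data_size_gb, nprocs):
            """Hypothetical stand-in for one remote visualization request."""
            # A real study would submit a rendering job to the supercomputer-side
            # visualization servers and wait for the resulting image.
            time.sleep(0.01 * data_size_gb / max(nprocs, 1))

        drawing_modes = ["surface", "volume"]     # assumed example modes
        data_sizes_gb = [1, 10, 100]              # size of visualization data
        processor_counts = [16, 64, 256]

        for mode, size, nprocs in product(drawing_modes, data_sizes_gb, processor_counts):
            start = time.perf_counter()
            render_remote(mode, size, nprocs)
            elapsed = time.perf_counter() - start
            print(f"{mode:8s} {size:4d} GB  {nprocs:4d} procs  {elapsed:.3f} s")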

  12. The ETA systems plans for supercomputers

    International Nuclear Information System (INIS)

    Swanson, C.D.

    1987-01-01

    The ETA Systems ETA 10 is a Class VII supercomputer featuring multiprocessing, a large hierarchical memory system, high performance input/output, and network support for both batch and interactive processing. Advanced technology used in the ETA 10 includes liquid nitrogen cooled CMOS logic with 20,000 gates per chip, a single printed circuit board for each CPU, and high density static and dynamic MOS memory chips. Software for the ETA 10 includes an underlying kernel that supports multiple user environments, a new ETA FORTRAN compiler with an advanced automatic vectorizer, a multitasking library and debugging tools. Possible developments for future supercomputers from ETA Systems are discussed

  13. Advanced parallel processing with supercomputer architectures

    International Nuclear Information System (INIS)

    Hwang, K.

    1987-01-01

    This paper investigates advanced parallel processing techniques and innovative hardware/software architectures that can be applied to boost the performance of supercomputers. Critical issues in architectural choices, parallel languages, compiling techniques, resource management, concurrency control, programming environments, parallel algorithms, and performance enhancement methods are examined and the best answers are presented. The authors cover advanced processing techniques suitable for supercomputers, high-end mainframes, minisupers, and array processors. The coverage emphasizes vectorization, multitasking, multiprocessing, and distributed computing. In order to achieve these operation modes, parallel languages, smart compilers, synchronization mechanisms, load balancing methods, the mapping of parallel algorithms, operating system functions, application libraries, and multidiscipline interactions are investigated to ensure high performance. At the end, they assess the potential of optical and neural technologies for developing future supercomputers

  14. UbiWorld: An environment integrating virtual reality, supercomputing, and design

    Energy Technology Data Exchange (ETDEWEB)

    Disz, T.; Papka, M.E.; Stevens, R. [Argonne National Lab., IL (United States). Mathematics and Computer Science Div.

    1997-07-01

    UbiWorld is a concept being developed by the Futures Laboratory group at Argonne National Laboratory that ties together the notion of ubiquitous computing (Ubicomp) with that of using virtual reality for rapid prototyping. The goal is to develop an environment where one can explore Ubicomp-type concepts without having to build real Ubicomp hardware. The basic notion is to extend object models in a virtual world by using distributed wide area heterogeneous computing technology to provide complex networking and processing capabilities to virtual reality objects.

  15. Enabling department-scale supercomputing

    Energy Technology Data Exchange (ETDEWEB)

    Greenberg, D.S.; Hart, W.E.; Phillips, C.A.

    1997-11-01

    The Department of Energy (DOE) national laboratories have one of the longest and most consistent histories of supercomputer use. The authors summarize the architecture of DOE's new supercomputers that are being built for the Accelerated Strategic Computing Initiative (ASCI). The authors then argue that in the near future scaled-down versions of these supercomputers with petaflop-per-weekend capabilities could become widely available to hundreds of research and engineering departments. The availability of such computational resources will allow simulation of physical phenomena to become a full-fledged third branch of scientific exploration, along with theory and experimentation. They describe the ASCI and other supercomputer applications at Sandia National Laboratories, and discuss which lessons learned from Sandia's long history of supercomputing can be applied in this new setting.

  16. A visualization environment for supercomputing-based applications in computational mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Pavlakos, C.J.; Schoof, L.A.; Mareda, J.F.

    1993-06-01

    In this paper, we characterize a visualization environment that has been designed and prototyped for a large community of scientists and engineers, with an emphasis on supercomputing-based computational mechanics. The proposed environment makes use of a visualization server concept to provide effective, interactive visualization to the user's desktop. Benefits of using the visualization server approach are discussed. Some thoughts regarding desirable features for visualization server hardware architectures are also addressed. A brief discussion of the software environment is included. The paper concludes by summarizing certain observations which we have made regarding the implementation of such visualization environments.

  17. Introduction to Reconfigurable Supercomputing

    CERN Document Server

    Lanzagorta, Marco; Rosenberg, Robert

    2010-01-01

    This book covers technologies, applications, tools, languages, procedures, advantages, and disadvantages of reconfigurable supercomputing using Field Programmable Gate Arrays (FPGAs). The target audience is the community of users of High Performance Computers (HPC) who may benefit from porting their applications into a reconfigurable environment. As such, this book is intended to guide the HPC user through the many algorithmic considerations, hardware alternatives, usability issues, programming languages, and design tools that need to be understood before embarking on the creation of reconfigur

  18. Computational Dimensionalities of Global Supercomputing

    Directory of Open Access Journals (Sweden)

    Richard S. Segall

    2013-12-01

    This Invited Paper pertains to the subject of my Plenary Keynote Speech at the 17th World Multi-Conference on Systemics, Cybernetics and Informatics (WMSCI 2013), held in Orlando, Florida on July 9-12, 2013. The title of my Plenary Keynote Speech was: "Dimensionalities of Computation: from Global Supercomputing to Data, Text and Web Mining", but this Invited Paper will focus only on the "Computational Dimensionalities of Global Supercomputing" and is based upon a summary of the contents of several individual articles that have been previously written with myself as lead author and published in [75], [76], [77], [78], [79], [80] and [11]. The topics of the Plenary Speech included Overview of Current Research in Global Supercomputing [75], Open-Source Software Tools for Data Mining Analysis of Genomic and Spatial Images using High Performance Computing [76], Data Mining Supercomputing with SAS™ JMP® Genomics ([77], [79], [80]), and Visualization by Supercomputing Data Mining [81]. ______________________ [11.] Committee on the Future of Supercomputing, National Research Council (2003), The Future of Supercomputing: An Interim Report, ISBN-13: 978-0-309-09016-2, http://www.nap.edu/catalog/10784.html [75.] Segall, Richard S.; Zhang, Qingyu and Cook, Jeffrey S. (2013), "Overview of Current Research in Global Supercomputing", Proceedings of the Forty-Fourth Meeting of the Southwest Decision Sciences Institute (SWDSI), Albuquerque, NM, March 12-16, 2013. [76.] Segall, Richard S. and Zhang, Qingyu (2010), "Open-Source Software Tools for Data Mining Analysis of Genomic and Spatial Images using High Performance Computing", Proceedings of the 5th INFORMS Workshop on Data Mining and Health Informatics, Austin, TX, November 6, 2010. [77.] Segall, Richard S., Zhang, Qingyu and Pierce, Ryan M. (2010), "Data Mining Supercomputing with SAS™ JMP® Genomics: Research-in-Progress", Proceedings of the 2010 Conference on Applied Research in Information Technology, sponsored by

  19. What is supercomputing ?

    International Nuclear Information System (INIS)

    Asai, Kiyoshi

    1992-01-01

    Supercomputing means high speed computation using a supercomputer. Supercomputers and the technical term ''supercomputing'' have spread over the past ten years. The performance of the main computers installed so far at the Japan Atomic Energy Research Institute is compared. There are two ways to increase computing speed using existing circuit elements: parallel processor systems and vector processor systems. CRAY-1 was the first successful vector computer. Supercomputing technology was first applied to meteorological organizations in foreign countries, and to aviation and atomic energy research institutes in Japan. Supercomputing for atomic energy follows the trend of technical development in atomic energy, and its contents divide into increasing the computing speed of existing simulation calculations and accelerating new technical developments in atomic energy. Examples of supercomputing at the Japan Atomic Energy Research Institute are reported. (K.I.)

  20. DCE. Future IHEP's computing environment

    International Nuclear Information System (INIS)

    Zheng Guorui; Liu Xiaoling

    1995-01-01

    IHEP's computing environment consists of several different computing environments established on the IHEP computer networks, of which the BES environment supporting HEP computing is the main part. In connection with the improvement and extension of the BES environment, the authors outline the development of these computing environments from the viewpoint of establishing a high energy physics (HEP) environment. The direction of developing the IHEP computing environment toward distributed computing, based on current trends in distributed computing, is presented

  1. Performance Analysis and Scaling Behavior of the Terrestrial Systems Modeling Platform TerrSysMP in Large-Scale Supercomputing Environments

    Science.gov (United States)

    Kollet, S. J.; Goergen, K.; Gasper, F.; Shresta, P.; Sulis, M.; Rihani, J.; Simmer, C.; Vereecken, H.

    2013-12-01

    In studies of the terrestrial hydrologic, energy and biogeochemical cycles, integrated multi-physics simulation platforms take a central role in characterizing non-linear interactions, variances and uncertainties of system states and fluxes in reciprocity with observations. Recently developed integrated simulation platforms attempt to honor the complexity of the terrestrial system across multiple time and space scales, from the deeper subsurface including groundwater dynamics into the atmosphere. Technically, this requires the coupling of atmospheric, land surface, and subsurface-surface flow models in supercomputing environments, while ensuring a high degree of efficiency in the utilization of, e.g., standard Linux clusters and massively parallel resources. A systematic performance analysis including profiling and tracing in such an application is crucial for understanding the runtime behavior, identifying optimum model settings, and is an efficient way to spot potential parallel deficiencies. On sophisticated leadership-class supercomputers, such as the 28-rack 5.9 petaFLOP IBM Blue Gene/Q 'JUQUEEN' of the Jülich Supercomputing Centre (JSC), this is a challenging task, but all the more important when complex coupled component models are to be analysed. Here we present our experience from coupling, application tuning (e.g., a 5-times speedup through compiler optimizations), parallel scaling and performance monitoring of the parallel Terrestrial Systems Modeling Platform TerrSysMP. The modeling platform consists of the weather prediction system COSMO of the German Weather Service; the Community Land Model, CLM, of NCAR; and the variably saturated surface-subsurface flow code ParFlow. The model system relies on the Multiple Program Multiple Data (MPMD) execution model, where the external Ocean-Atmosphere-Sea-Ice-Soil coupler (OASIS3) links the component models. TerrSysMP has been instrumented with the performance analysis tool Scalasca and analyzed
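
    In an MPMD job of this kind, each executable (here COSMO, CLM and ParFlow) runs on its own group of MPI ranks while the coupler exchanges fields between the groups. The fragment below is only a schematic of how component groups can be carved out of a single MPI world with a communicator split, written in Python with mpi4py for illustration; it is not the actual OASIS3 coupling interface, and the rank partition is an assumption.

        from mpi4py import MPI

        world = MPI.COMM_WORLD
        rank, size = world.Get_rank(), world.Get_size()

        # Illustrative partition of the job's ranks among the three components.
        if rank < size // 2:
            component = "cosmo"      # atmosphere
        elif rank < 3 * size // 4:
            component = "clm"        # land surface
        else:
            component = "parflow"    # variably saturated subsurface-surface flow

        color = {"cosmo": 0, "clm": 1, "parflow": 2}[component]
        local = world.Split(color, rank)   # component-local communicator

        # Each component time-steps on its own communicator; in TerrSysMP the
        # OASIS3 coupler exchanges boundary fields between the groups.
        print(f"world rank {rank}: {component}, local rank {local.Get_rank()} of {local.Get_size()}")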

  2. ATLAS Software Installation on Supercomputers

    CERN Document Server

    Undrus, Alexander; The ATLAS collaboration

    2018-01-01

    PowerPC and high performance computers (HPC) are important resources for computing in the ATLAS experiment. The future LHC data processing will require more resources than Grid computing, currently using approximately 100,000 cores at well over 100 sites, can provide. Supercomputers are extremely powerful as they use the resources of hundreds of thousands of CPUs joined together. However, their architectures have different instruction sets. ATLAS binary software distributions for x86 chipsets do not fit these architectures, as emulation of these chipsets results in a huge performance loss. This presentation describes the methodology of ATLAS software installation from source code on supercomputers. The installation procedure includes downloading the ATLAS code base as well as the source of about 50 external packages, such as ROOT and Geant4, followed by compilation, and rigorous unit and integration testing. The presentation reports the application of this procedure at Titan HPC and Summit PowerPC at Oak Ridge Computin...

  3. NASA Advanced Supercomputing Facility Expansion

    Science.gov (United States)

    Thigpen, William W.

    2017-01-01

    The NASA Advanced Supercomputing (NAS) Division enables advances in high-end computing technologies and in modeling and simulation methods to tackle some of the toughest science and engineering challenges facing NASA today. The name "NAS" has long been associated with leadership and innovation throughout the high-end computing (HEC) community. We play a significant role in shaping HEC standards and paradigms, and provide leadership in the areas of large-scale InfiniBand fabrics, Lustre open-source filesystems, and hyperwall technologies. We provide an integrated high-end computing environment to accelerate NASA missions and make revolutionary advances in science. Pleiades, a petaflop-scale supercomputer, is used by scientists throughout the U.S. to support NASA missions, and is ranked among the most powerful systems in the world. One of our key focus areas is in modeling and simulation to support NASA's real-world engineering applications and make fundamental advances in modeling and simulation methods.

  4. Operating Environment of the Future

    National Research Council Canada - National Science Library

    Hanson, Matthew

    1997-01-01

    ...), the Smart Surgical System (SSS), and the Intelligent Virtual Patient Environment (IVPE). The project is one of several targeting reduction in mortality and morbidity of the wounded soldier through improved far-forward combat casualty care...

  5. Japanese supercomputer technology

    International Nuclear Information System (INIS)

    Buzbee, B.L.; Ewald, R.H.; Worlton, W.J.

    1982-01-01

    In February 1982, computer scientists from the Los Alamos National Laboratory and Lawrence Livermore National Laboratory visited several Japanese computer manufacturers. The purpose of these visits was to assess the state of the art of Japanese supercomputer technology and to advise Japanese computer vendors of the needs of the US Department of Energy (DOE) for more powerful supercomputers. The Japanese foresee a domestic need for large-scale computing capabilities for nuclear fusion, image analysis for the Earth Resources Satellite, meteorological forecast, electrical power system analysis (power flow, stability, optimization), structural and thermal analysis of satellites, and very large scale integrated circuit design and simulation. To meet this need, Japan has launched an ambitious program to advance supercomputer technology. This program is described

  6. ASCI's Vision for supercomputing future

    International Nuclear Information System (INIS)

    Nowak, N.D.

    2003-01-01

    The full text of publication follows. Advanced Simulation and Computing (ASC, formerly Accelerated Strategic Computing Initiative [ASCI]) was established in 1995 to help Defense Programs shift from test-based confidence to simulation-based confidence. Specifically, ASC is a focused and balanced program that is accelerating the development of simulation capabilities needed to analyze and predict the performance, safety, and reliability of nuclear weapons and certify their functionality - far exceeding what might have been achieved in the absence of a focused initiative. To realize its vision, ASC is creating simulation and prototyping capabilities, based on advanced weapon codes and high-performance computing

  7. Supercomputers to transform Science

    CERN Multimedia

    2006-01-01

    "New insights into the structure of space and time, climate modeling, and the design of novel drugs, are but a few of the many research areas that will be transforned by the installation of three supercomputers at the Unversity of Bristol." (1/2 page)

  8. Desktop supercomputer: what can it do?

    Science.gov (United States)

    Bogdanov, A.; Degtyarev, A.; Korkhov, V.

    2017-12-01

    The paper addresses the issues of solving complex problems that require using supercomputers or multiprocessor clusters available for most researchers nowadays. Efficient distribution of high performance computing resources according to actual application needs has been a major research topic since high-performance computing (HPC) technologies became widely introduced. At the same time, comfortable and transparent access to these resources was a key user requirement. In this paper we discuss approaches to build a virtual private supercomputer available at user's desktop: a virtual computing environment tailored specifically for a target user with a particular target application. We describe and evaluate possibilities to create the virtual supercomputer based on light-weight virtualization technologies, and analyze the efficiency of our approach compared to traditional methods of HPC resource management.

  9. Desktop supercomputer: what can it do?

    International Nuclear Information System (INIS)

    Bogdanov, A.; Degtyarev, A.; Korkhov, V.

    2017-01-01

    The paper addresses the issues of solving complex problems that require using supercomputers or multiprocessor clusters available for most researchers nowadays. Efficient distribution of high performance computing resources according to actual application needs has been a major research topic since high-performance computing (HPC) technologies became widely introduced. At the same time, comfortable and transparent access to these resources was a key user requirement. In this paper we discuss approaches to build a virtual private supercomputer available at user's desktop: a virtual computing environment tailored specifically for a target user with a particular target application. We describe and evaluate possibilities to create the virtual supercomputer based on light-weight virtualization technologies, and analyze the efficiency of our approach compared to traditional methods of HPC resource management.

  10. Future gripper needs in nuclear environments

    International Nuclear Information System (INIS)

    Ham, A.C. van der; Holweg, E.G.M.; Jongkind, W.

    1993-01-01

    This paper is concerned with the requirements of teleoperated grippers for work in hazardous situations and nuclear environments. A survey among users in the nuclear industry was performed by means of questionnaires, covering both the grippers currently in use and future gripper needs. The survey covers reliability, tasks to be done, object properties, accuracy, environmental requirements, required grasps, and mechanical and sensory requirements. The paper presents a proposal for a future gripper. (author)

  11. Adaptability of supercomputers to nuclear computations

    International Nuclear Information System (INIS)

    Asai, Kiyoshi; Ishiguro, Misako; Matsuura, Toshihiko.

    1983-01-01

    Recently, in the field of scientific and technical calculation, the usefulness of supercomputers represented by the CRAY-1 has been recognized, and they are utilized in various countries. The rapid computation of supercomputers is based on vector processing. The authors have investigated the adaptability to vector computation of about 40 typical atomic energy codes over the past six years. Based on the results of this investigation, the adaptability of the vector-computation capability of supercomputers to atomic energy codes, problems regarding its utilization, and future prospects are explained. The adaptability of individual calculation codes to vector computation is largely dependent on the algorithm and program structure used in the codes. The speedup obtained with pipelined vector systems, the investigation at the Japan Atomic Energy Research Institute and its results, and examples of expressing codes for atomic energy, environmental safety and nuclear fusion in vector form are reported. The speedup factors for the 40 examples ranged from 1.5 to 9.0. It can be said that the adaptability of supercomputers to atomic energy codes is fairly good. (Kako, I.)
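
    The point that adaptability depends on algorithm and program structure can be illustrated with a simple analogy, sketched below in Python/NumPy rather than for a vectorizing Fortran compiler (so treat it only as an analogy): an elementwise loop maps directly onto a single vector operation, whereas a first-order recurrence does not, because each element depends on the previous result.

        import numpy as np

        n = 1_000_000
        a, b = np.random.rand(n), np.random.rand(n)

        # Vector-friendly: every iteration is independent, so the whole loop
        # collapses into one elementwise vector operation.
        c = a * b + 1.0

        # Vector-hostile: the recurrence x[i] = 0.5 * x[i-1] + b[i].
        # Each element needs the previous one, so it cannot be written as a
        # single elementwise operation and, on a vector machine, would run at
        # scalar speed unless the algorithm itself is restructured.
        x = np.empty(n)
        x[0] = b[0]
        for i in range(1, n):
            x[i] = 0.5 * x[i - 1] + b[i]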

  12. OpenMP Performance on the Columbia Supercomputer

    Science.gov (United States)

    Haoqiang, Jin; Hood, Robert

    2005-01-01

    This presentation discusses the Columbia supercomputer, one of the world's fastest supercomputers, providing 61 TFLOPs (as of 10/20/04). It was conceived, designed, built, and deployed in just 120 days. Columbia is a 20-node supercomputer built on proven 512-processor nodes and is the largest SGI system in the world, with over 10,000 Intel Itanium 2 processors; it provides the largest node size incorporating commodity parts (512 processors) and the largest shared-memory environment (2048 processors) with 88% efficiency, and tops the scalar systems on the Top500 list.
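
    The record does not define the 88% figure; one common reading is parallel efficiency, the achieved speedup divided by the number of processors, which under that assumption would correspond to

        \[ E = \frac{S(N)}{N}, \qquad E = 0.88,\; N = 2048 \;\Rightarrow\; S \approx 0.88 \times 2048 \approx 1802. \]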

  13. Ultrascalable petaflop parallel supercomputer

    Science.gov (United States)

    Blumrich, Matthias A [Ridgefield, CT; Chen, Dong [Croton On Hudson, NY; Chiu, George [Cross River, NY; Cipolla, Thomas M [Katonah, NY; Coteus, Paul W [Yorktown Heights, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Hall, Shawn [Pleasantville, NY; Haring, Rudolf A [Cortlandt Manor, NY; Heidelberger, Philip [Cortlandt Manor, NY; Kopcsay, Gerard V [Yorktown Heights, NY; Ohmacht, Martin [Yorktown Heights, NY; Salapura, Valentina [Chappaqua, NY; Sugavanam, Krishnan [Mahopac, NY; Takken, Todd [Brewster, NY

    2010-07-20

    A massively parallel supercomputer of petaOPS-scale includes node architectures based upon System-On-a-Chip technology, where each processing node comprises a single Application Specific Integrated Circuit (ASIC) having up to four processing elements. The ASIC nodes are interconnected by multiple independent networks that optimally maximize the throughput of packet communications between nodes with minimal latency. The multiple networks may include three high-speed networks for parallel algorithm message passing including a Torus, collective network, and a Global Asynchronous network that provides global barrier and notification functions. These multiple independent networks may be collaboratively or independently utilized according to the needs or phases of an algorithm for optimizing algorithm processing performance. The use of a DMA engine is provided to facilitate message passing among the nodes without the expenditure of processing resources at the node.

  14. The Pawsey Supercomputer geothermal cooling project

    Science.gov (United States)

    Regenauer-Lieb, K.; Horowitz, F.; Western Australian Geothermal Centre Of Excellence, T.

    2010-12-01

    The Australian Government has funded the Pawsey supercomputer in Perth, Western Australia, providing computational infrastructure intended to support the future operations of the Australian Square Kilometre Array radiotelescope and to boost next-generation computational geosciences in Australia. Supplementary funds have been directed to the development of a geothermal exploration well to research the potential for direct heat use applications at the Pawsey Centre site. Cooling the Pawsey supercomputer may be achieved by geothermal heat exchange rather than by conventional electrical power cooling, thus reducing the carbon footprint of the Pawsey Centre and demonstrating an innovative green technology that is widely applicable in industry and urban centres across the world. The exploration well is scheduled to be completed in 2013, with drilling due to commence in the third quarter of 2011. One year is allocated to finalizing the design of the exploration, monitoring and research well. Success in the geothermal exploration and research program will result in an industrial-scale geothermal cooling facility at the Pawsey Centre, and will provide a world-class student training environment in geothermal energy systems. A similar system is partially funded and in advanced planning to provide base-load air-conditioning for the main campus of the University of Western Australia. Both systems are expected to draw ~80-95 degrees C water from aquifers lying between 2000 and 3000 meters depth from naturally permeable rocks of the Perth sedimentary basin. The geothermal water will be run through absorption chilling devices, which only require heat (as opposed to mechanical work) to power a chilled water stream adequate to meet the cooling requirements. Once the heat has been removed from the geothermal water, licensing issues require the water to be re-injected back into the aquifer system. These systems are intended to demonstrate the feasibility of powering large-scale air

  15. Graphics supercomputer for computational fluid dynamics research

    Science.gov (United States)

    Liaw, Goang S.

    1994-11-01

    The objective of this project is to purchase a state-of-the-art graphics supercomputer to improve the Computational Fluid Dynamics (CFD) research capability at Alabama A & M University (AAMU) and to support the Air Force research projects. A cutting-edge graphics supercomputer system, Onyx VTX, from Silicon Graphics Computer Systems (SGI), was purchased and installed. Other equipment including a desktop personal computer, a PC-486 DX2 with a built-in 10-BaseT Ethernet card, a 10-BaseT hub, an Apple Laser Printer Select 360, and a notebook computer from Zenith were also purchased. A reading room has been converted to a research computer lab by adding some furniture and an air conditioning unit in order to provide an appropriate working environment for researchers and the purchased equipment. All the purchased equipment was successfully installed and is fully functional. Several research projects, including two existing Air Force projects, are being performed using these facilities.

  16. Supercomputer debugging workshop 1991 proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.

    1991-01-01

    This report discusses the following topics on supercomputer debugging: Distributed debugging; user interface to debugging tools and standards; debugging optimized codes; debugging parallel codes; and debugger performance and interface as analysis tools. (LSP)

  17. Supercomputer debugging workshop 1991 proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.

    1991-12-31

    This report discusses the following topics on supercomputer debugging: Distributed debugging; user interface to debugging tools and standards; debugging optimized codes; debugging parallel codes; and debugger performance and interface as analysis tools. (LSP)

  18. World's fastest supercomputer opens up to users

    Science.gov (United States)

    Xin, Ling

    2016-08-01

    China's latest supercomputer - Sunway TaihuLight - has claimed the crown as the world's fastest computer according to the latest TOP500 list, released at the International Supercomputer Conference in Frankfurt in late June.

  19. Supercomputing and related national projects in Japan

    International Nuclear Information System (INIS)

    Miura, Kenichi

    1985-01-01

    Japanese supercomputer development activities in the industry and research projects are outlined. Architecture, technology, software, and applications of Fujitsu's Vector Processor Systems are described as an example of Japanese supercomputers. Applications of supercomputers to high energy physics are also discussed. (orig.)

  20. Mistral Supercomputer Job History Analysis

    OpenAIRE

    Zasadziński, Michał; Muntés-Mulero, Victor; Solé, Marc; Ludwig, Thomas

    2018-01-01

    In this technical report, we show insights and results of operational data analysis from the petascale supercomputer Mistral, which is ranked as the 42nd most powerful in the world as of January 2018. Data sources include hardware monitoring data, job scheduler history, topology, and hardware information. We explore job state sequences, spatial distribution, and electric power patterns.

  1. Supercomputers and quantum field theory

    International Nuclear Information System (INIS)

    Creutz, M.

    1985-01-01

    A review is given of why recent simulations of lattice gauge theories have resulted in substantial demands from particle theorists for supercomputer time. These calculations have yielded first principle results on non-perturbative aspects of the strong interactions. An algorithm for simulating dynamical quark fields is discussed. 14 refs

  2. Computational plasma physics and supercomputers

    International Nuclear Information System (INIS)

    Killeen, J.; McNamara, B.

    1984-09-01

    The Supercomputers of the 80's are introduced. They are 10 to 100 times more powerful than today's machines. The range of physics modeling in the fusion program is outlined. New machine architecture will influence particular codes, but parallel processing poses new coding difficulties. Increasing realism in simulations will require better numerics and more elaborate mathematics

  3. Algorithms for supercomputers

    International Nuclear Information System (INIS)

    Alder, B.J.

    1986-01-01

    Better numerical procedures, improved computational power and additional physical insights have contributed significantly to progress in dealing with classical and quantum statistical mechanics problems. Past developments are discussed and future possibilities outlined

  4. Algorithms for supercomputers

    International Nuclear Information System (INIS)

    Alder, B.J.

    1985-12-01

    Better numerical procedures, improved computational power and additional physical insights have contributed significantly to progress in dealing with classical and quantum statistical mechanics problems. Past developments are discussed and future possibilities outlined

  5. An assessment of worldwide supercomputer usage

    Energy Technology Data Exchange (ETDEWEB)

    Wasserman, H.J.; Simmons, M.L.; Hayes, A.H.

    1995-01-01

    This report provides a comparative study of advanced supercomputing usage in Japan and the United States as of Spring 1994. It is based on the findings of a group of US scientists whose careers have centered on programming, evaluating, and designing high-performance supercomputers for over ten years. The report is a follow-on to an assessment of supercomputing technology in Europe and Japan that was published in 1993. Whereas the previous study focused on supercomputer manufacturing capabilities, the primary focus of the current work was to compare where and how supercomputers are used. Research for this report was conducted through both literature studies and field research in Japan.

  6. Environment, energy, economy. A sustainable future

    International Nuclear Information System (INIS)

    Luise, A.; Borrello, L.; Calef, D.; Cialani, C.; Di Majo, V.; Federio, A.; Lovisolo, G.; Musmeci, F.

    1998-01-01

    This paper is organized in five parts: 1. sustainable development from a global point of view; 2. global problems and international instruments; 3. sustainable management of resources in economic systems; 4. forecasting and methods: models and indices; 5. future urban areas.

  7. Robotizing workforce in future built environments

    NARCIS (Netherlands)

    Maas, G.J.; Gassel, van F.J.M.; Lee, Junbok; Han, Chang-Soo

    2011-01-01

    The aim of this paper is to define challenges for Automation and Robotics in construction (A+R) to enhance client and social value. Construction contributes to a positive living environment for society and is the largest sector of Europe’s economy with a size of around 2,500 billion Euros. Ten

  8. China Debates the Future Security Environment

    Science.gov (United States)

    2000-01-01

    Bike, Zhongguo da qushi (China megatrends) (Beijing: Hualing chubanshe, 1996). For warnings on the need to conceal increasing national power, see Ma...became Japan’s prime minister in 1957. Troop 731, which had engaged in biological warfare experiments, was exempted from trial. In March 1950, all...gongye chubanshe, 1998. Lu Hui, He hua sheng wuqi de lishi yu weilai (The history and future of nuclear, chemical, and biological weapons). Beijing

  9. Exploration and production environment. Preserving the future our responsibility

    International Nuclear Information System (INIS)

    2004-01-01

    This document presents the Total Group's commitments to manage natural resources in a rational way, to preserve biodiversity for future generations and to protect the environment. It contains the health, safety, environment and quality charter of Total, the 12 exploration and production health, safety and environment rules, and the exploration and production environmental policy. (A.L.B.)

  10. Dynamic Optical Networks for Future Internet Environments

    Science.gov (United States)

    Matera, Francesco

    2014-05-01

    This article gives an overview of the evolution of the optical network scenario, taking into account the exponential growth of connected devices, big data, and cloud computing that is driving a concrete transformation of the information and communication technology world. This hyper-connected scenario is deeply affecting relationships between individuals, enterprises, citizens, and public administrations, fostering innovative use cases in practically any environment and market, and introducing new opportunities and new challenges. The successful realization of this hyper-connected scenario depends on different elements of the ecosystem. In particular, it builds on connectivity and functionalities allowed by converged next-generation networks and their capacity to support and integrate with the Internet of Things, machine-to-machine communication, and cloud computing. This article aims to provide some insight into this scenario and to contribute to the analysis of its impact on optical system and network issues and requirements. In particular, the role of the software-defined network is investigated by taking into account scenarios regarding data centers, cloud computing, and machine-to-machine communication, and by illustrating the advantages that could be introduced by advanced optical communications.

  11. Status of supercomputers in the US

    International Nuclear Information System (INIS)

    Fernbach, S.

    1985-01-01

    Current supercomputers, that is, the Class VI machines which first became available in 1976, are being delivered in greater quantity than ever before. In addition, manufacturers are busily working on Class VII machines to be ready for delivery in CY 1987. Mainframes are being modified or designed to take on some features of the supercomputers, and new companies are springing up everywhere with the intent of either competing directly in the supercomputer arena or providing entry-level systems from which to graduate to supercomputers. Even well-founded organizations like IBM and CDC are adding machines with vector instructions to their repertoires. Japanese-manufactured supercomputers are also being introduced into the U.S. Will these begin to compete with those of U.S. manufacture? Are they truly competitive? It turns out that from both the hardware and software points of view they may be superior. We may be facing the same problems in supercomputers that we faced in video systems

  12. TOP500 Supercomputers for June 2004

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2004-06-23

    23rd Edition of TOP500 List of World's Fastest Supercomputers Released: Japan's Earth Simulator Enters Third Year in Top Position MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a closely watched event in the world of high-performance computing, the 23rd edition of the TOP500 list of the world's fastest supercomputers was released today (June 23, 2004) at the International Supercomputer Conference in Heidelberg, Germany.

  13. Supercomputer applications in nuclear research

    International Nuclear Information System (INIS)

    Ishiguro, Misako

    1992-01-01

    The utilization of supercomputers at the Japan Atomic Energy Research Institute is mainly reported. The fields of atomic energy research which use supercomputers frequently and the contents of their computations are outlined. Vectorization is briefly explained, and nuclear fusion, nuclear reactor physics, the thermal-hydraulic safety of nuclear reactors, the parallelism inherent in atomic energy computations such as fluid calculations, algorithms for vector processing, and the speedups obtained by vectorization are discussed. At present, the Japan Atomic Energy Research Institute uses two FACOM VP 2600/10 systems and three M-780 systems. The subjects of computation have changed from criticality calculations around 1970, through the analysis of LOCA after the TMI accident, to nuclear fusion research, the design of new reactor types and reactor safety assessment at present. The way computers are used has also advanced from batch processing to time-sharing, from one-dimensional to three-dimensional computation, from steady linear to unsteady nonlinear computation, from experimental analysis to numerical simulation, and so on. (K.I.)

  14. INTEL: Intel based systems move up in supercomputing ranks

    CERN Multimedia

    2002-01-01

    "The TOP500 supercomputer rankings released today at the Supercomputing 2002 conference show a dramatic increase in the number of Intel-based systems being deployed in high-performance computing (HPC) or supercomputing areas" (1/2 page).

  15. Requirements for user interaction support in future CACE environments

    DEFF Research Database (Denmark)

    Ravn, Ole; Szymkat, M.

    1994-01-01

    Based on a review of user interaction modes and the specific needs of the CACE domain, the paper describes requirements for user interaction in future CACE environments. Taking another look at the design process in CACE, key areas in need of more user interaction support are pointed out. Three...

  16. Energy-water-environment nexus underpinning future desalination sustainability

    KAUST Repository

    Shahzad, Muhammad Wakil

    2017-03-11

    The energy-water-environment nexus is very important for attaining the COP21 goal of keeping the rise in global temperature below 2°C, but unfortunately two thirds of the allowable CO2 emissions have already been used and the remainder will be exhausted by 2050. A number of technological developments in the power and desalination sectors have improved their efficiencies to save energy and carbon emissions, but they are still operating at 35% and 10% of their thermodynamic limits, respectively. Research in desalination processes contributes to supplying the world's population with water for an improved living standard, to reducing specific energy consumption, and to protecting the environment. Recently developed, highly efficient nature-inspired membranes (aquaporin and graphene) and the trend toward hybridization of thermally driven cycles could potentially lower the energy requirement for water purification. This paper presents a state-of-the-art review of the energy-water-environment interconnection and of future energy-efficient desalination possibilities to save energy and protect the environment.

  17. Integration of Titan supercomputer at OLCF with ATLAS production system

    CERN Document Server

    Panitkin, Sergey; The ATLAS collaboration

    2016-01-01

    The PanDA (Production and Distributed Analysis) workload management system was developed to meet the scale and complexity of distributed computing for the ATLAS experiment. PanDA-managed resources are distributed worldwide, on hundreds of computing sites, with thousands of physicists accessing hundreds of petabytes of data, and the rate of data processing already exceeds an exabyte per year. While PanDA currently uses more than 200,000 cores at well over 100 Grid sites, future LHC data-taking runs will require more resources than Grid computing can possibly provide. Additional computing and storage resources are required. Therefore ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. In this talk we will describe a project aimed at the integration of the ATLAS Production System with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF). The current approach utilizes a modified PanDA Pilot framework for job...

  18. Integration of Titan supercomputer at OLCF with ATLAS Production System

    CERN Document Server

    AUTHOR|(SzGeCERN)643806; The ATLAS collaboration; De, Kaushik; Klimentov, Alexei; Nilsson, Paul; Oleynik, Danila; Padolski, Siarhei; Panitkin, Sergey; Wenaus, Torre

    2017-01-01

    The PanDA (Production and Distributed Analysis) workload management system was developed to meet the scale and complexity of distributed computing for the ATLAS experiment. PanDA-managed resources are distributed worldwide, on hundreds of computing sites, with thousands of physicists accessing hundreds of petabytes of data, and the rate of data processing already exceeds an exabyte per year. While PanDA currently uses more than 200,000 cores at well over 100 Grid sites, future LHC data-taking runs will require more resources than Grid computing can possibly provide. Additional computing and storage resources are required. Therefore ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. In this paper we will describe a project aimed at the integration of the ATLAS Production System with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF). The current approach utilizes a modified PanDA Pilot framework for jo...

  19. Extending ATLAS Computing to Commercial Clouds and Supercomputers

    CERN Document Server

    Nilsson, P; The ATLAS collaboration; Filipcic, A; Klimentov, A; Maeno, T; Oleynik, D; Panitkin, S; Wenaus, T; Wu, W

    2014-01-01

    The Large Hadron Collider will resume data collection in 2015 with substantially increased computing requirements relative to its first 2009-2013 run. A near doubling of the energy and the data rate, a high level of event pile-up, and detector upgrades will mean the number and complexity of events to be analyzed will increase dramatically. A naive extrapolation of the Run 1 experience would suggest that a 5-6 fold increase in computing resources is needed - impossible within the anticipated flat computing budgets in the near future. Consequently ATLAS is engaged in an ambitious program to expand its computing to all available resources, notably including opportunistic use of commercial clouds and supercomputers. Such resources present new challenges in managing heterogeneity, supporting data flows, parallelizing workflows, provisioning software, and other aspects of distributed computing, all while minimizing operational load. We will present the ATLAS experience to date with clouds and supercomputers, and des...

  20. TOP500 Supercomputers for June 2005

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2005-06-22

    25th Edition of TOP500 List of World's Fastest Supercomputers Released: DOE/LLNL BlueGene/L and IBM gain Top Positions MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a closely watched event in the world of high-performance computing, the 25th edition of the TOP500 list of the world's fastest supercomputers was released today (June 22, 2005) at the 20th International Supercomputing Conference (ISC2005) in Heidelberg, Germany.

  1. TOP500 Supercomputers for November 2003

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2003-11-16

    22nd Edition of TOP500 List of World's Fastest Supercomputers Released MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a much-anticipated event in the world of high-performance computing, the 22nd edition of the TOP500 list of the world's fastest supercomputers was released today (November 16, 2003). The Earth Simulator supercomputer retains the number one position with its Linpack benchmark performance of 35.86 Tflop/s (''teraflops'' or trillions of calculations per second). It was built by NEC and installed last year at the Earth Simulator Center in Yokohama, Japan.

  2. Federal Market Information Technology in the Post Flash Crash Era: Roles for Supercomputing

    Energy Technology Data Exchange (ETDEWEB)

    Bethel, E. Wes; Leinweber, David; Ruebel, Oliver; Wu, Kesheng

    2011-09-16

    This paper describes collaborative work between active traders, regulators, economists, and supercomputing researchers to replicate and extend investigations of the Flash Crash and other market anomalies in a National Laboratory HPC environment. Our work suggests that supercomputing tools and methods will be valuable to market regulators in achieving the goal of market safety, stability, and security. Research results using high frequency data and analytics are described, and directions for future development are discussed. Currently the key mechanism for preventing catastrophic market action is the “circuit breaker.” We believe a more graduated approach, similar to the “yellow light” approach in motorsports to slow down traffic, might be a better way to achieve the same goal. To enable this objective, we study a number of indicators that could foresee hazards in market conditions and explore options to confirm such predictions. Our tests confirm that Volume Synchronized Probability of Informed Trading (VPIN) and a version of the volume Herfindahl-Hirschman Index (HHI) for measuring market fragmentation can indeed give strong signals ahead of the Flash Crash event on May 6, 2010. This is a preliminary step toward a full-fledged early-warning system for unusual market conditions.
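
    As a rough illustration of the fragmentation measure mentioned above (an illustrative sketch only, not the authors' code; the venue names and volumes are invented), a volume-based Herfindahl-Hirschman Index is simply the sum of squared volume shares across trading venues:

      # Hypothetical sketch: volume-based HHI across trading venues.
      # Values near 1 mean trading is concentrated in a single venue;
      # values near 1/N indicate a highly fragmented market.
      def volume_hhi(venue_volumes):
          total = sum(venue_volumes.values())
          return sum((v / total) ** 2 for v in venue_volumes.values())

      volumes = {"NYSE": 4.1e6, "NASDAQ": 3.7e6, "BATS": 1.2e6, "ARCA": 0.9e6}  # made-up volumes
      print(round(volume_hhi(volumes), 3))  # ~0.33 here; an equal four-way split would give 0.25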

  3. Advanced Architectures for Astrophysical Supercomputing

    Science.gov (United States)

    Barsdell, B. R.; Barnes, D. G.; Fluke, C. J.

    2010-12-01

    Astronomers have come to rely on the increasing performance of computers to reduce, analyze, simulate and visualize their data. In this environment, faster computation can mean more science outcomes or the opening up of new parameter spaces for investigation. If we are to avoid major issues when implementing codes on advanced architectures, it is important that we have a solid understanding of our algorithms. A recent addition to the high-performance computing scene that highlights this point is the graphics processing unit (GPU). Hardware originally designed for speeding up graphics rendering in video games is now achieving speed-ups of O(100×) in general-purpose computation - performance that cannot be ignored. We are using a generalized approach, based on the analysis of astronomy algorithms, to identify the optimal problem types and techniques for taking advantage of both current GPU hardware and future developments in computing architectures.

  4. Environment issues and the future of the transport industry

    Energy Technology Data Exchange (ETDEWEB)

    Shiller, J.W. [Ford Motor Company, Dearborn, MI (USA)]

    1992-01-01

    The motor vehicle industry must make the necessary investment in products and technology to meet the competitive and environmental challenges of the future. Discussion is presented of: the history of motor vehicles, the relationship of motor vehicles to the environment, the state of climate change knowledge, future economic development and the transport sector, the changing structure of the motor vehicle fleet, traffic congestion, alternative fuels, investments in transport, the European Energy Charter, The US Energy Strategy, the North American free trade agreement, and the economics of the automobile industry in Japan/South East Asia and the developing countries. 61 refs., 29 figs., 28 tabs.

  5. TOP500 Supercomputers for November 2004

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2004-11-08

    24th Edition of TOP500 List of World's Fastest Supercomputers Released: DOE/IBM BlueGene/L and NASA/SGI's Columbia gain Top Positions MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a closely watched event in the world of high-performance computing, the 24th edition of the TOP500 list of the world's fastest supercomputers was released today (November 8, 2004) at the SC2004 Conference in Pittsburgh, Pa.

  6. Quantum Hamiltonian Physics with Supercomputers

    International Nuclear Information System (INIS)

    Vary, James P.

    2014-01-01

    The vision of solving the nuclear many-body problem in a Hamiltonian framework with fundamental interactions tied to QCD via Chiral Perturbation Theory is gaining support. The goals are to preserve the predictive power of the underlying theory, to test fundamental symmetries with the nucleus as laboratory and to develop new understandings of the full range of complex quantum phenomena. Advances in theoretical frameworks (renormalization and many-body methods) as well as in computational resources (new algorithms and leadership-class parallel computers) signal a new generation of theory and simulations that will yield profound insights into the origins of nuclear shell structure, collective phenomena and complex reaction dynamics. Fundamental discovery opportunities also exist in such areas as physics beyond the Standard Model of Elementary Particles, the transition between hadronic and quark–gluon dominated dynamics in nuclei and signals that characterize dark matter. I will review some recent achievements and present ambitious consensus plans along with their challenges for a coming decade of research that will build new links between theory, simulations and experiment. Opportunities for graduate students to embark upon careers in the fast-developing field of supercomputer simulations are also discussed.

  7. Plasma turbulence calculations on supercomputers

    International Nuclear Information System (INIS)

    Carreras, B.A.; Charlton, L.A.; Dominguez, N.; Drake, J.B.; Garcia, L.; Leboeuf, J.N.; Lee, D.K.; Lynch, V.E.; Sidikman, K.

    1991-01-01

    Although the single-particle picture of magnetic confinement is helpful in understanding some basic physics of plasma confinement, it does not give a full description. Collective effects dominate plasma behavior. Any analysis of plasma confinement requires a self-consistent treatment of the particles and fields. The general picture is further complicated because the plasma, in general, is turbulent. The study of fluid turbulence is a rather complex field by itself. In addition to the difficulties of classical fluid turbulence, plasma turbulence studies face the problems caused by the induced magnetic turbulence, which couples back to the fluid. Since the fluid is not a perfect conductor, this turbulence can lead to changes in the topology of the magnetic field structure, causing the magnetic field lines to wander radially. Because the plasma fluid flows along the field lines, the wandering lines carry the particles with them, and this enhances the losses caused by collisions. The changes in topology are critical for plasma confinement. The study of plasma turbulence and the concomitant transport is a challenging problem. Because of the importance of solving the plasma turbulence problem for controlled thermonuclear research, the high complexity of the problem, and the necessity of attacking the problem with supercomputers, the study of plasma turbulence in magnetic confinement devices is a Grand Challenge problem.

  8. Quantum Hamiltonian Physics with Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Vary, James P.

    2014-06-15

    The vision of solving the nuclear many-body problem in a Hamiltonian framework with fundamental interactions tied to QCD via Chiral Perturbation Theory is gaining support. The goals are to preserve the predictive power of the underlying theory, to test fundamental symmetries with the nucleus as laboratory and to develop new understandings of the full range of complex quantum phenomena. Advances in theoretical frameworks (renormalization and many-body methods) as well as in computational resources (new algorithms and leadership-class parallel computers) signal a new generation of theory and simulations that will yield profound insights into the origins of nuclear shell structure, collective phenomena and complex reaction dynamics. Fundamental discovery opportunities also exist in such areas as physics beyond the Standard Model of Elementary Particles, the transition between hadronic and quark–gluon dominated dynamics in nuclei and signals that characterize dark matter. I will review some recent achievements and present ambitious consensus plans along with their challenges for a coming decade of research that will build new links between theory, simulations and experiment. Opportunities for graduate students to embark upon careers in the fast-developing field of supercomputer simulations are also discussed.

  9. Japanese issues on the future behavior of the geological environment

    International Nuclear Information System (INIS)

    Aoki, Kaz; Nakatsuka, Noboru; Ishimaru, Tsuneari

    1994-01-01

    Comprehending and predicting the future states of the geological environment is very important in ensuring a safe geological disposal of high level radioactive wastes (HLW). This paper is one in a series of studies required to ascertain the existence of a geologically stable area in Japan over the long term. In particular, interest is focussed on the aspect of accumulating data on behavior patterns of selected natural phenomena which will enable predictions of future behavior of geological processes and finding of areas of long term stability. While this paper limits itself to the second and part of the third step, the overall flow-chart of study on natural processes and events which may perturb the geological environment entails three major steps. They include: (i) identification of natural processes and events relevant to long term stability of geological environment to be evaluated; (ii) characterization of the identified natural processes and events; and (iii) prediction of the probability of occurrence, magnitude and influence of the natural processes and events which may perturb the geological environment. (J.P.N)

  10. A training program for scientific supercomputing users

    Energy Technology Data Exchange (ETDEWEB)

    Hanson, F.; Moher, T.; Sabelli, N.; Solem, A.

    1988-01-01

    There is a need for a mechanism to transfer supercomputing technology into the hands of scientists and engineers in such a way that they will acquire a foundation of knowledge that will permit integration of supercomputing as a tool in their research. Most computing center training emphasizes computer-specific information about how to use a particular computer system; most academic programs teach concepts to computer scientists. Only a few brief courses and new programs are designed for computational scientists. This paper describes an eleven-week training program aimed principally at graduate and postdoctoral students in computationally-intensive fields. The program is designed to balance the specificity of computing center courses, the abstractness of computer science courses, and the personal contact of traditional apprentice approaches. It is based on the experience of computer scientists and computational scientists, and consists of seminars and clinics given by many visiting and local faculty. It covers a variety of supercomputing concepts, issues, and practices related to architecture, operating systems, software design, numerical considerations, code optimization, graphics, communications, and networks. Its research component encourages understanding of scientific computing and supercomputer hardware issues. Flexibility in thinking about computing needs is emphasized by the use of several different supercomputer architectures, such as the Cray X-MP/48 at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, the IBM 3090-600E/VF at the Cornell National Supercomputer Facility, and the Alliant FX/8 at the Advanced Computing Research Facility at Argonne National Laboratory. 11 refs., 6 tabs.

  11. The pan-European environment: glimpses into an uncertain future

    International Nuclear Information System (INIS)

    2007-01-01

    The rapidly changing nature of and increasing inter-linkages between many socio-economic phenomena - population growth and migration, globalisation and trade, personal consumption patterns and use of natural resources - are reflected in many of today's environment policy priorities: minimising and adapting to climate change; loss of biodiversity and ecosystem services; the degradation of such natural resources as land, freshwater and oceans; and the impacts of a wide range of pollutants on our environment and our health. The challenges that environmental policy makers are facing in this century are already very different from those of the last. Given the rapid change in socio-economic trends, both designing and implementing actions are becoming much more complex, and the way in which such policies deliver effective outcomes seems to be becoming increasingly uncertain. Alongside this, the time-lags between policy demands and institutional responses are often lengthening, with the institutional structures charged with designing and implementing agreed actions needing to change in order to keep up with this process. This report aims to contribute to the discussion about plausible future developments relevant to the wider European region and to stimulate medium to long-term thinking in policy-making circles. It does so by sketching some of the key environmental concerns for the pan-European region based on the EEA's Europe's environment - The fourth assessment, and by highlighting some of the many uncertainties the future holds. (au)

  12. TOP500 Supercomputers for June 2003

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2003-06-23

    21st Edition of TOP500 List of World's Fastest Supercomputers Released MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a much-anticipated event in the world of high-performance computing, the 21st edition of the TOP500 list of the world's fastest supercomputers was released today (June 23, 2003). The Earth Simulator supercomputer built by NEC and installed last year at the Earth Simulator Center in Yokohama, Japan, with its Linpack benchmark performance of 35.86 Tflop/s (teraflops or trillions of calculations per second), retains the number one position. The number 2 position is held by the re-measured ASCI Q system at Los Alamos National Laboratory. With 13.88 Tflop/s, it is the second system ever to exceed the 10 Tflop/s mark. ASCI Q was built by Hewlett-Packard and is based on the AlphaServer SC computer system.

  13. TOP500 Supercomputers for June 2002

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2002-06-20

    19th Edition of TOP500 List of World's Fastest Supercomputers Released MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a much-anticipated event in the world of high-performance computing, the 19th edition of the TOP500 list of the world's fastest supercomputers was released today (June 20, 2002). The recently installed Earth Simulator supercomputer at the Earth Simulator Center in Yokohama, Japan, is as expected the clear new number 1. Its performance of 35.86 Tflop/s (trillions of calculations per second) running the Linpack benchmark is almost five times higher than the performance of the now No. 2 IBM ASCI White system at Lawrence Livermore National Laboratory (7.2 Tflop/s). Such powerful leapfrogging to the top by a system so much faster than the previous top system is unparalleled in the history of the TOP500.

  14. Status reports of supercomputing astrophysics in Japan

    International Nuclear Information System (INIS)

    Nakamura, Takashi; Nagasawa, Mikio

    1990-01-01

    The Workshop on Supercomputing Astrophysics was held at the National Laboratory for High Energy Physics (KEK, Tsukuba) from August 31 to September 2, 1989. More than 40 physicists and astronomers attended and discussed many topics in an informal atmosphere. The main purpose of this workshop was to focus on theoretical activities in computational astrophysics in Japan. It also aimed to promote effective collaboration among numerical experimentalists working on supercomputing techniques. The various subjects of the presented papers - hydrodynamics, plasma physics, gravitating systems, radiative transfer and general relativity - are all stimulating. In fact, these numerical calculations have now become possible in Japan owing to the power of Japanese supercomputers such as the HITAC S820, Fujitsu VP400E and NEC SX-2. (J.P.N.)

  15. The future regulatory environment - a South African perspective

    International Nuclear Information System (INIS)

    Van der Woude, S.; Leaver, J.; Metcalf, P.E.

    2000-01-01

    The South African nuclear regulatory authority, the National Nuclear Regulator, regulates nuclear fuel cycle facilities as well as a large variety of mining and minerals processing activities. The future political, social, economic and technological environment within which these facilities operate will present numerous challenges to those who will be regulating them. In our presentation the challenges to be met in discharging the regulatory function are discussed, particularly in the context of a country with a small nuclear programme and a substantial developing component. Amongst the challenges discussed are: As part of the growing internationalization, the need to harmonize standards applied in different countries and the need to balance standards and practice applied in developed countries with resources available in developing countries; The need to consider the impact on the environment and not only on human beings; The impact of rapid advances in information technology on regulation; The maintenance and development of the appropriate expertise in the face of uncertainties regarding the future of the nuclear industry; Public involvement; The demands by society for greater standards of safety but at the same time for more effective and cost-effective regulation; The need for regulators to match customer demands on operators in terms of quality, speed, flexibility and costs; The privatization of nuclear fuel cycle facilities; The increased trend for larger facilities to outsource work to smaller companies; and, The need to balance good practice considerations with quantitatively determined risks in regulatory decision-making. (author)

  16. The future regulatory environment - a South African perspective

    Energy Technology Data Exchange (ETDEWEB)

    Van der Woude, S.; Leaver, J.; Metcalf, P.E. [National Nuclear Regulator, Centurion (South Africa)

    2000-07-01

    The South African nuclear regulatory authority, the National Nuclear Regulator, regulates nuclear fuel cycle facilities as well as a large variety of mining and minerals processing activities. The future political, social, economic and technological environment within which these facilities operate will present numerous challenges to those who will be regulating them. In our presentation the challenges to be met in discharging the regulatory function are discussed, particularly in the context of a country with a small nuclear programme and a substantial developing component. Amongst the challenges discussed are: As part of the growing internationalization, the need to harmonize standards applied in different countries and the need to balance standards and practice applied in developed countries with resources available in developing countries; The need to consider the impact on the environment and not only on human beings; The impact of rapid advances in information technology on regulation; The maintenance and development of the appropriate expertise in the face of uncertainties regarding the future of the nuclear industry; Public involvement; The demands by society for greater standards of safety but at the same time for more effective and cost-effective regulation; The need for regulators to match customer demands on operators in terms of quality, speed, flexibility and costs; The privatization of nuclear fuel cycle facilities; The increased trend for larger facilities to outsource work to smaller companies; and, The need to balance good practice considerations with quantitatively determined risks in regulatory decision-making. (author)

  17. Global environment outlook GEO5. Environment for the future we want

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-05-15

    The main goal of UNEP's Global Environment Outlook (GEO) is to keep governments and stakeholders informed of the state and trends of the global environment. Over the past 15 years, the GEO reports have examined a wealth of data, information and knowledge about the global environment; identified potential policy responses; and provided an outlook for the future. The assessments, and their consultative and collaborative processes, have worked to bridge the gap between science and policy by turning the best available scientific knowledge into information relevant for decision makers. The GEO-5 report is made up of 17 chapters organized into three distinct but linked parts. Part 1 - State and trends of the global environment; Part 2 - Policy options from the regions; Part 3 - Opportunities for a global response.

  18. Global environment outlook GEO5. Environment for the future we want

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-05-15

    The main goal of UNEP's Global Environment Outlook (GEO) is to keep governments and stakeholders informed of the state and trends of the global environment. Over the past 15 years, the GEO reports have examined a wealth of data, information and knowledge about the global environment; identified potential policy responses; and provided an outlook for the future. The assessments, and their consultative and collaborative processes, have worked to bridge the gap between science and policy by turning the best available scientific knowledge into information relevant for decision makers. The GEO-5 report is made up of 17 chapters organized into three distinct but linked parts. Part 1 - State and trends of the global environment; Part 2 - Policy options from the regions; Part 3 - Opportunities for a global response.

  19. Role of nuclear fusion in future energy systems and the environment under future uncertainties

    International Nuclear Information System (INIS)

    Tokimatsu, Koji; Fujino, Jun'ichi; Konishi, Satoshi; Ogawa, Yuichi; Yamaji, Kenji

    2003-01-01

    Debates about whether or not to invest heavily in nuclear fusion as a future innovative energy option have been made within the context of energy technology development strategies. This is because the prospects for nuclear fusion are quite uncertain and the investments therefore carry the risk of quite large regrets, even though investment is needed in order to develop the technology. The timeframe by which nuclear fusion could become competitive in the energy market has not been adequately studied, nor have the roles of nuclear fusion in energy systems and the environment. The present study has two objectives. One is to reveal the conditions under which nuclear fusion could be introduced economically (hereafter, we refer to such introductory conditions as breakeven prices) in future energy systems. The other objective is to evaluate the future roles of nuclear fusion in energy systems and in the environment. Here we identify three roles that nuclear fusion will take on when breakeven prices are achieved: (i) a portion of the electricity market in 2100, (ii) reduction of the annual global total energy systems cost, and (iii) mitigation of the carbon tax (shadow price of carbon) under CO2 constraints. Future uncertainties are key issues in evaluating nuclear fusion. Here we treated the following uncertainties: energy demand scenarios, the introduction timeframe for nuclear fusion, capacity projections for nuclear fusion, the CO2 target in 2100, the capacity utilization ratio of options in energy/environment technologies, and utility discount rates. From our investigations, we conclude that the presently designed nuclear fusion reactors may be ready for economical introduction into energy systems beginning around 2050-2060, and we can confirm that the favorable introduction of the reactors would reduce both the annual energy systems cost and the carbon tax (the shadow price of carbon) under a CO2 concentration constraint.

  20. Towards a future robotic home environment: a survey.

    Science.gov (United States)

    Güttler, Jörg; Georgoulas, Christos; Linner, Thomas; Bock, Thomas

    2015-01-01

    Demographic change has resulted in an increase in the number of elderly people, while at the same time the number of actively working people is falling. In the future, there will be less of the caretaking that is necessary to support the aging population. To enable the aged population to live in dignity, they should be able to perform activities of daily living (ADLs) as independently as possible. The aim of this paper is to describe several solutions and concepts that can support elderly people in their ADLs in a way that allows them to stay self-sufficient for as long as possible. To reach this goal, the Building Realization and Robotics Lab is conducting research in the field of ambient assisted living. The idea is to implement robots and sensors in the home environment so as to efficiently support the inhabitants in their ADLs and ultimately increase their independence. By embedding vital sensors into furniture and using ICT technologies, the health status of elderly people can be remotely evaluated by a physician or family members. By investigating ergonomic aspects specific to elderly people (e.g. via an age-simulation suit), it is possible to develop and test new concepts and novel applications that offer innovative solutions. Through the introduction of mechatronics and robotics, the home environment can be made to interact seamlessly with the inhabitant through gestures, vocal commands, and visual recognition algorithms. Several solutions have already been developed that address how to build a smart home environment in order to create an ambient assisted environment. This article describes how these concepts were developed. The approach for each concept proposed in this article was as follows: (1) research of needs, (2) definition of requirements, (3) identification of necessary technology and processes, (4) building of initial concepts, (5) experiments in a real environment, and (6) development of the final concepts. To keep these concepts

  1. Computational plasma physics and supercomputers. Revision 1

    International Nuclear Information System (INIS)

    Killeen, J.; McNamara, B.

    1985-01-01

    The Supercomputers of the 80's are introduced. They are 10 to 100 times more powerful than today's machines. The range of physics modeling in the fusion program is outlined. New machine architecture will influence particular models, but parallel processing poses new programming difficulties. Increasing realism in simulations will require better numerics and more elaborate mathematical models

  2. Toward a Proof of Concept Cloud Framework for Physics Applications on Blue Gene Supercomputers

    International Nuclear Information System (INIS)

    Dreher, Patrick; Scullin, William; Vouk, Mladen

    2015-01-01

    Traditional high performance supercomputers are capable of delivering large sustained state-of-the-art computational resources to physics applications over extended periods of time using batch processing mode operating environments. However, today there is an increasing demand for more complex workflows that involve large fluctuations in the levels of HPC physics computational requirements during the simulations. Some of the workflow components may also require a richer set of operating system features and schedulers than normally found in a batch oriented HPC environment. This paper reports on progress toward a proof of concept design that implements a cloud framework onto BG/P and BG/Q platforms at the Argonne Leadership Computing Facility. The BG/P implementation utilizes the Kittyhawk utility and the BG/Q platform uses an experimental heterogeneous FusedOS operating system environment. Both platforms use the Virtual Computing Laboratory as the cloud computing system embedded within the supercomputer. This proof of concept design allows a cloud to be configured so that it can capitalize on the specialized infrastructure capabilities of a supercomputer and the flexible cloud configurations without resorting to virtualization. Initial testing of the proof of concept system is done using the lattice QCD MILC code. These types of user reconfigurable environments have the potential to deliver experimental schedulers and operating systems within a working HPC environment for physics computations that may be different from the native OS and schedulers on production HPC supercomputers. (paper)

  3. Personal Supercomputing for Monte Carlo Simulation Using a GPU

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Jae-Yong; Koo, Yang-Hyun; Lee, Byung-Ho [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2008-05-15

    Since the usability, accessibility, and maintenance of a personal computer (PC) are very good, a PC is a useful computer simulation tool for researchers. With the improved performance of a PC's CPU, it has enough calculation power to simulate a small-scale system. However, if a system is large or involves long time scales, a cluster computer or supercomputer is needed. Recently, great changes have occurred in the PC calculation environment. A graphics processing unit (GPU) on a graphics card, formerly used only to compute display data, now has calculation capability superior to a PC's CPU; its performance matches that of a supercomputer from around 2000. Although it has such great calculation potential, it is not easy to program a simulation code for a GPU because of the difficult programming techniques needed to convert a calculation matrix into a 3D rendering image using graphics APIs. In 2006, NVIDIA provided a Software Development Kit (SDK) establishing a programming environment for its graphics cards, called the Compute Unified Device Architecture (CUDA), which makes programming on the GPU easy without knowledge of the graphics APIs. This paper describes the basic architectures of NVIDIA's GPU and CUDA, and carries out a performance benchmark for Monte Carlo simulation.
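
    As a minimal illustration of the kind of embarrassingly parallel workload that maps well onto a GPU (a plain CPU-side sketch only, not the CUDA benchmark code described in this record), a Monte Carlo estimate of pi draws random points and counts the fraction that fall inside the unit circle:

      # Hypothetical sketch of a Monte Carlo pi estimate.
      # Each sample is independent, which is why the same computation
      # parallelizes so well across thousands of GPU threads in CUDA.
      import numpy as np

      def estimate_pi(n_samples: int, seed: int = 0) -> float:
          rng = np.random.default_rng(seed)
          x = rng.random(n_samples)
          y = rng.random(n_samples)
          inside = np.count_nonzero(x * x + y * y <= 1.0)
          return 4.0 * inside / n_samples

      print(estimate_pi(10_000_000))  # approaches 3.14159... as the sample count grows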

  4. Personal Supercomputing for Monte Carlo Simulation Using a GPU

    International Nuclear Information System (INIS)

    Oh, Jae-Yong; Koo, Yang-Hyun; Lee, Byung-Ho

    2008-01-01

    Since the usability, accessibility, and maintenance of a personal computer (PC) are very good, a PC is a useful computer simulation tool for researchers. With the improved performance of a PC's CPU, it has enough calculation power to simulate a small-scale system. However, if a system is large or involves long time scales, a cluster computer or supercomputer is needed. Recently, great changes have occurred in the PC calculation environment. A graphics processing unit (GPU) on a graphics card, formerly used only to compute display data, now has calculation capability superior to a PC's CPU; its performance matches that of a supercomputer from around 2000. Although it has such great calculation potential, it is not easy to program a simulation code for a GPU because of the difficult programming techniques needed to convert a calculation matrix into a 3D rendering image using graphics APIs. In 2006, NVIDIA provided a Software Development Kit (SDK) establishing a programming environment for its graphics cards, called the Compute Unified Device Architecture (CUDA), which makes programming on the GPU easy without knowledge of the graphics APIs. This paper describes the basic architectures of NVIDIA's GPU and CUDA, and carries out a performance benchmark for Monte Carlo simulation.

  5. Centralized supercomputer support for magnetic fusion energy research

    International Nuclear Information System (INIS)

    Fuss, D.; Tull, G.G.

    1984-01-01

    High-speed computers with large memories are vital to magnetic fusion energy research. Magnetohydrodynamic (MHD), transport, equilibrium, Vlasov, particle, and Fokker-Planck codes that model plasma behavior play an important role in designing experimental hardware and interpreting the resulting data, as well as in advancing plasma theory itself. The size, architecture, and software of supercomputers to run these codes are often the crucial constraints on the benefits such computational modeling can provide. Hence, vector computers such as the CRAY-1 offer a valuable research resource. To meet the computational needs of the fusion program, the National Magnetic Fusion Energy Computer Center (NMFECC) was established in 1974 at the Lawrence Livermore National Laboratory. Supercomputers at the central computing facility are linked to smaller computer centers at each of the major fusion laboratories by a satellite communication network. In addition to providing large-scale computing, the NMFECC environment stimulates collaboration and the sharing of computer codes and data among the many fusion researchers in a cost-effective manner

  6. FPS scientific computers and supercomputers in chemistry

    International Nuclear Information System (INIS)

    Curington, I.J.

    1987-01-01

    FPS Array Processors, scientific computers, and highly parallel supercomputers are used in nearly all aspects of compute-intensive computational chemistry. A survey is made of work utilizing this equipment, both published and current research. The relationship of the computer architecture to computational chemistry is discussed, with specific reference to Molecular Dynamics, Quantum Monte Carlo simulations, and Molecular Graphics applications. Recent installations of the FPS T-Series are highlighted, and examples of Molecular Graphics programs running on the FPS-5000 are shown

  7. Problem solving in nuclear engineering using supercomputers

    International Nuclear Information System (INIS)

    Schmidt, F.; Scheuermann, W.; Schatz, A.

    1987-01-01

    The availability of supercomputers enables the engineer to formulate new strategies for problem solving. One such strategy is the Integrated Planning and Simulation System (IPSS). With the integrated systems, simulation models with greater consistency and good agreement with actual plant data can be effectively realized. In the present work some of the basic ideas of IPSS are described as well as some of the conditions necessary to build such systems. Hardware and software characteristics as realized are outlined. (orig.)

  8. Mantle Convection on Modern Supercomputers

    Science.gov (United States)

    Weismüller, J.; Gmeiner, B.; Huber, M.; John, L.; Mohr, M.; Rüde, U.; Wohlmuth, B.; Bunge, H. P.

    2015-12-01

    Mantle convection is the cause for plate tectonics, the formation of mountains and oceans, and the main driving mechanism behind earthquakes. The convection process is modeled by a system of partial differential equations describing the conservation of mass, momentum and energy. Characteristic to mantle flow is the vast disparity of length scales from global to microscopic, turning mantle convection simulations into a challenging application for high-performance computing. As system size and technical complexity of the simulations continue to increase, design and implementation of simulation models for next generation large-scale architectures is handled successfully only in an interdisciplinary context. A new priority program - named SPPEXA - by the German Research Foundation (DFG) addresses this issue, and brings together computer scientists, mathematicians and application scientists around grand challenges in HPC. Here we report from the TERRA-NEO project, which is part of the high visibility SPPEXA program, and a joint effort of four research groups. TERRA-NEO develops algorithms for future HPC infrastructures, focusing on high computational efficiency and resilience in next generation mantle convection models. We present software that can resolve the Earth's mantle with up to 10^12 grid points and scales efficiently to massively parallel hardware with more than 50,000 processors. We use our simulations to explore the dynamic regime of mantle convection and assess the impact of small scale processes on global mantle flow.
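
    For reference, a commonly used non-dimensional Boussinesq, infinite-Prandtl-number form of the conservation equations mentioned above is sketched below (an illustration of the standard formulation only, not necessarily the exact system solved by the TERRA-NEO code):

      \nabla \cdot \mathbf{u} = 0                                                              % conservation of mass
      -\nabla p + \nabla \cdot \bigl(\eta\,(\nabla \mathbf{u} + \nabla \mathbf{u}^{T})\bigr) + \mathrm{Ra}\,T\,\hat{\mathbf{e}}_r = 0   % conservation of momentum (Stokes flow)
      \frac{\partial T}{\partial t} + \mathbf{u} \cdot \nabla T = \nabla^{2} T + H              % conservation of energy

    Here \mathbf{u} is velocity, p pressure, \eta viscosity, T temperature, \mathrm{Ra} the Rayleigh number, H internal heating, and \hat{\mathbf{e}}_r the radial unit vector.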

  9. Radiation Environments for Future Human Exploration Throughout the Solar System.

    Science.gov (United States)

    Schwadron, N.; Gorby, M.; Linker, J.; Riley, P.; Torok, T.; Downs, C.; Spence, H. E.; Desai, M. I.; Mikic, Z.; Joyce, C. J.; Kozarev, K. A.; Townsend, L. W.; Wimmer-Schweingruber, R. F.

    2016-12-01

    Acute space radiation hazards pose one of the most serious risks to future human and robotic exploration. The ability to predict when and where large events will occur is necessary in order to mitigate their hazards. The largest events are usually associated with complex sunspot groups (also known as active regions) that harbor strong, stressed magnetic fields. Highly energetic protons accelerated very low in the corona by the passage of coronal mass ejection (CME)-driven compressions or shocks and from flares travel near the speed of light, arriving at Earth minutes after the eruptive event. Whether these particles actually reach Earth, the Moon, Mars (or any other point) depends on their transport in the interplanetary magnetic field and their magnetic connection to the shock. Recent contemporaneous observations during the largest events in almost a decade show the unique longitudinal distributions of this ionizing radiation broadly distributed from sources near the Sun and yet highly isolated during the passage of CME shocks. Over the last decade, we have observed space weather events as the solar wind exhibits extremely low densities and magnetic field strengths, representing states that have never been observed during the space age. The highly abnormal solar activity during cycles 23 and 24 has caused the longest solar minimum in over 80 years and continues into the unusually small solar maximum of cycle 24. As a result of the remarkably weak solar activity, we have also observed the highest fluxes of galactic cosmic rays in the space age and relatively small particle radiation events. We have used observations from LRO/CRaTER to examine the implications of these highly unusual solar conditions for human space exploration throughout the inner solar system. While these conditions are not a show-stopper for long-duration missions (e.g., to the Moon, an asteroid, or Mars), galactic cosmic ray radiation remains a significant and worsening factor that limits

  10. Energy-water-environment nexus underpinning future desalination sustainability

    KAUST Repository

    Shahzad, Muhammad Wakil; Burhan, Muhammad; Ang, Li; Ng, Kim Choon

    2017-01-01

    The energy-water-environment nexus is central to attaining the COP21 goal of keeping the global temperature increase below 2°C, but unfortunately two thirds of the permissible CO2 emissions have already been used and the remainder will be exhausted by 2050. A

  11. Cloud based spectrum manager for future wireless regulatory environment

    CSIR Research Space (South Africa)

    Masonta, MT

    2015-12-01

    Full Text Available The regulatory environment in radio frequency spectrum management lags the advancement of wireless technologies, especially in the area of cognitive radio and dynamic spectrum access. In this paper we argue that the solution towards spectrum Pareto...

  12. Energy, society and environment. Technology for a sustainable future

    International Nuclear Information System (INIS)

    Elliott, D.

    1997-04-01

    Energy, Society and Environment examines energy and energy use, and the interactions between technology, society and the environment. The book is clearly structured to examine: key environmental issues and the harmful impacts of energy use; new technological solutions to environmental problems; implementation of possible solutions; and implications for society in developing a sustainable approach to energy use. Social processes and strategic solutions to problems are located within a clear technological context, with topical case studies. (UK)

  13. Urban warming in Tokyo area and counterplan to improve future environment

    International Nuclear Information System (INIS)

    Saitoh, T.S.; Hoshi, H.

    1993-01-01

    The rapid progress in industrialization and the concentration of economic and social functions in urban areas have stimulated a consistent increase in population and energy consumption. The sudden urbanization of modern cities has caused environmental problems, including alteration of the local climate. This phenomenon is peculiar to urban areas and is characterized by a consistent rise in the temperature of the urban atmosphere, an increase in air pollutants, a decrease in relative humidity, and so on. The phenomenon characterized by a noticeable temperature rise in the urban atmosphere is called the urban heat island and has been analyzed by both observational and numerical approaches. Numerical models can be classified into two types: mechanical models and energy balance models. Since Howard reported on the urban heat island in London, there have been a number of observational studies and numerical studies based on two-dimensional modeling. Recently, three-dimensional studies have been reported, made possible by the great advancement of the supercomputer. The present paper reports the results of field observations by automobile in the Tokyo metropolitan area and also the results of a three-dimensional simulation of urban warming in Tokyo at present and in the future, around 2030. Further, the authors also present the results of a simulation of the effect of tree planting and vegetation.

  14. A workbench for tera-flop supercomputing

    International Nuclear Information System (INIS)

    Resch, M.M.; Kuester, U.; Mueller, M.S.; Lang, U.

    2003-01-01

    Supercomputers currently reach a peak performance in the range of TFlop/s. With but one exception - the Japanese Earth Simulator - none of these systems has so far been able to also show a level of sustained performance for a variety of applications that comes close to the peak performance. Sustained TFlop/s are therefore rarely seen. The reasons are manifold and are well known: Bandwidth and latency both for main memory and for the internal network are the key internal technical problems. Cache hierarchies with large caches can bring relief but are no remedy to the problem. However, there are not only technical problems that inhibit the full exploitation by scientists of the potential of modern supercomputers. More and more organizational issues come to the forefront. This paper shows the approach of the High Performance Computing Center Stuttgart (HLRS) to deliver a sustained performance of TFlop/s for a wide range of applications from a large group of users spread over Germany. The core of the concept is the role of the data. Around this we design a simulation workbench that hides the complexity of interacting computers, networks and file systems from the user. (authors)

  15. Multi-petascale highly efficient parallel supercomputer

    Science.gov (United States)

    Asaad, Sameh; Bellofatto, Ralph E.; Blocksome, Michael A.; Blumrich, Matthias A.; Boyle, Peter; Brunheroto, Jose R.; Chen, Dong; Cher, Chen -Yong; Chiu, George L.; Christ, Norman; Coteus, Paul W.; Davis, Kristan D.; Dozsa, Gabor J.; Eichenberger, Alexandre E.; Eisley, Noel A.; Ellavsky, Matthew R.; Evans, Kahn C.; Fleischer, Bruce M.; Fox, Thomas W.; Gara, Alan; Giampapa, Mark E.; Gooding, Thomas M.; Gschwind, Michael K.; Gunnels, John A.; Hall, Shawn A.; Haring, Rudolf A.; Heidelberger, Philip; Inglett, Todd A.; Knudson, Brant L.; Kopcsay, Gerard V.; Kumar, Sameer; Mamidala, Amith R.; Marcella, James A.; Megerian, Mark G.; Miller, Douglas R.; Miller, Samuel J.; Muff, Adam J.; Mundy, Michael B.; O'Brien, John K.; O'Brien, Kathryn M.; Ohmacht, Martin; Parker, Jeffrey J.; Poole, Ruth J.; Ratterman, Joseph D.; Salapura, Valentina; Satterfield, David L.; Senger, Robert M.; Smith, Brian; Steinmacher-Burow, Burkhard; Stockdell, William M.; Stunkel, Craig B.; Sugavanam, Krishnan; Sugawara, Yutaka; Takken, Todd E.; Trager, Barry M.; Van Oosten, James L.; Wait, Charles D.; Walkup, Robert E.; Watson, Alfred T.; Wisniewski, Robert W.; Wu, Peng

    2015-07-14

    A Multi-Petascale Highly Efficient Parallel Supercomputer of 100 petaOPS-scale computing, at decreased cost, power and footprint, that allows for a maximum packaging density of processing nodes from an interconnect point of view. The supercomputer exploits technological advances in VLSI that enable a computing model where many processors can be integrated into a single Application Specific Integrated Circuit (ASIC). Each ASIC computing node comprises a system-on-chip ASIC utilizing four or more processors integrated into one die, each having full access to all system resources. This enables adaptive partitioning of the processors to functions such as compute or messaging I/O on an application-by-application basis and, preferably, adaptive partitioning of functions in accordance with various algorithmic phases within an application; if I/O or other processors are underutilized, they can participate in computation or communication. Nodes are interconnected by a five-dimensional torus network with DMA that optimally maximizes the throughput of packet communications between nodes and minimizes latency.
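
    To illustrate why a torus interconnect keeps hop counts, and hence latency, low (an illustrative sketch only, not IBM's routing logic; the node coordinates and ring lengths below are invented), the minimum number of hops between two nodes is the sum over dimensions of the wrap-around distances, so each dimension contributes at most half its ring length:

      # Hypothetical sketch: minimum hop count on a d-dimensional torus.
      def torus_hops(a, b, dims):
          """a, b: node coordinates; dims: ring length in each dimension."""
          return sum(min(abs(x - y), n - abs(x - y)) for x, y, n in zip(a, b, dims))

      # Two nodes on a made-up 5-D torus of shape 4 x 4 x 4 x 8 x 2.
      dims = (4, 4, 4, 8, 2)
      print(torus_hops((0, 0, 0, 0, 0), (3, 2, 1, 7, 1), dims))  # 1 + 2 + 1 + 1 + 1 = 6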

  16. Creating Food Futures. Trade, Ethics and the Environment

    NARCIS (Netherlands)

    Farnworth, C.R.; Jiggins, J.L.S.; Thomas, E.V.

    2008-01-01

    A global transformation in food supply and consumption is placing our food security at risk. What changes need to be made to the ways we trade, process and purchase our food if everyone in the world is going to have enough wholesome food to eat? Is there genuine scope for creating food futures that

  17. The Millennial Generation: Developing Leaders for the Future Security Environment

    Science.gov (United States)

    2011-02-15

    While Millennials possess a number of admirable and positive traits that posture them well for the future, there are also some challenges with this...why the military isn't producing more of them. The article concluded that the most beneficial experiences were, “sustained international experience

  18. Language Learning in Virtual Reality Environments: Past, Present, and Future

    Science.gov (United States)

    Lin, Tsun-Ju; Lan, Yu-Ju

    2015-01-01

    This study investigated the research trends in language learning in a virtual reality environment by conducting a content analysis of findings published in the literature from 2004 to 2013 in four top ranked computer-assisted language learning journals: "Language Learning & Technology," "CALICO Journal," "Computer…

  19. The Future Role of Librarians in the Virtual Library Environment.

    Science.gov (United States)

    Burke, Liz

    2002-01-01

    Considers the role of librarians in a virtual library environment. Highlights include providing intellectual access to information in any format; evaluating available sources of information; organizing information; ensuring the preservation of information; providing specialized staff to help meet information needs; and the economic impact of…

  20. The future of levies in a digital environment: final report

    NARCIS (Netherlands)

    Hugenholtz, P.B.; Guibault, L.; van Geffen, S.

    2003-01-01

    Copyright levy systems have been premised on the assumption that private copying of protected works cannot be controlled and exploited individually. With the advent of digital rights management (DRM), this assumption must be re-examined. In the digital environment, technical protection measures and

  1. PNNL supercomputer to become largest computing resource on the Grid

    CERN Multimedia

    2002-01-01

    Hewlett Packard announced that the US DOE Pacific Northwest National Laboratory will connect a 9.3-teraflop HP supercomputer to the DOE Science Grid. This will be the largest supercomputer attached to a computer grid anywhere in the world (1 page).

  2. Value of the future: Discounting in random environments

    Science.gov (United States)

    Farmer, J. Doyne; Geanakoplos, John; Masoliver, Jaume; Montero, Miquel; Perelló, Josep

    2015-05-01

    We analyze how to value future costs and benefits when they must be discounted relative to the present. We introduce the subject for the nonspecialist and take into account the randomness of the economic evolution by studying the discount function of three widely used processes for the dynamics of interest rates: Ornstein-Uhlenbeck, Feller, and log-normal. Besides obtaining exact expressions for the discount function and simple asymptotic approximations, we show that historical average interest rates overestimate long-run discount rates and that this effect can be large. In other words, long-run discount rates should be substantially less than the average rate observed in the past, otherwise any cost-benefit calculation would be biased in favor of the present and against interventions that may protect the future.
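
    For concreteness (a standard textbook definition consistent with the abstract, not a formula quoted from the paper itself), the discount function for a stochastic short-term rate r(s) and the induced long-run discount rate can be written as

      D(t) = \mathbb{E}\!\left[\exp\!\left(-\int_{0}^{t} r(s)\,\mathrm{d}s\right)\right],
      \qquad
      r_{\infty} = -\lim_{t \to \infty} \frac{\ln D(t)}{t}.

    By Jensen's inequality, paths with persistently low rates dominate the expectation, so r_{\infty} lies at or below the time-averaged expected rate - consistent with the abstract's point that historical average rates overestimate long-run discount rates.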

  3. ASPECTS OF THE MANAGER ACTIVITIES WITHIN THE FUTURE COMPETITIVE ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    GHEORGHE FLORIN BUŞE

    2012-01-01

    Full Text Available The first decade of this century was unsettled with regard to management concepts and instruments. From the points of view of total quality projects, product development time, product power, adaptive management, behaviors and values, teams, networks and alliances, this uncertainty reflects a continuing search for ways to deal with significant competitive discontinuities. Although every initiative may contain important elements that go to the essence of things, there has so far been no consensus on the changing nature of management. The sole conclusion from these studies is that managerial work will be different in the future. This paper underlines the most important competitive discontinuities and outlines a model of future managerial work.

  4. Past successes and future challenges: Improving the urban environment

    Energy Technology Data Exchange (ETDEWEB)

    Gade, M.

    1994-12-31

    The author discusses issues related to the Chicago urban environment from her perspective at the Illinois Environmental Protection Agency. Understanding of the ozone air pollution problem in the Chicago area has undergone significant changes in the past three years, and there is still more to be understood about the complex factors which contribute to ozone pollution over urban areas such as Chicago. The ability to bring these problems within present clean air standards is not yet in hand. The author asserts that information, and the ability of governmental agencies to ingest and respond to that information in a timely manner, is key to improving the environment in urban areas within reasonable time spans. In addition, cost and price information on environmental control and protection needs to be presented more clearly to the public so they can understand the difficult choices which must be made in addressing these environmental problems.

  5. Past successes and future challenges: Improving the urban environment

    International Nuclear Information System (INIS)

    Gade, M.

    1994-01-01

    The author discusses issues related to the Chicago urban environment from her perspective at the Illinois Environmental Protection Agency. Understanding of the ozone air pollution problem in the Chicago area has undergone significant changes in the past three years, and there is still more to be understood about the complex factors which contribute to ozone pollution over urban areas such as Chicago. The ability to bring these problems within present clean air standards is not yet in hand. The author asserts that information, and the ability of governmental agencies to ingest and respond to that information in a timely manner, is key to improving the environment in urban areas within reasonable time spans. In addition, cost and price information on environmental control and protection needs to be presented more clearly to the public so they can understand the difficult choices which must be made in addressing these environmental problems.

  6. 10. Symposium energy and environment - responsibility for the future

    International Nuclear Information System (INIS)

    2003-01-01

    The symposium discussed important aspects of the subject of energy and the environment. The detailed and well-founded lectures and statements were received with great interest by the 120 attendees. The discussion focused on problems of power generation and consumption, increased shares of renewable energy sources, and ethical and theological questions. The symposium received funds from the Deutsche Bundesstiftung Umwelt and was well received by the press.

  7. Scoping the future: a model for integrating learning environments

    OpenAIRE

    Honeychurch, Sarah; Barr, Niall

    2013-01-01

    The Virtual Learning Environment (VLE) has become synonymous with online learning in HE. However, with the rise of Web 2.0 technologies, social networking tools and cloud computing, the architecture of the current VLEs is increasingly anachronistic. This paper suggests an alternative to the traditional VLE: one which allows for flexibility and adaptation to the needs of individual teachers, while remaining resilient and providing students with a seamless experience. We present a prototype of our vi...

  8. HPL and STREAM Benchmarks on SANAM Supercomputer

    KAUST Repository

    Bin Sulaiman, Riman A.

    2017-01-01

    SANAM supercomputer was jointly built by KACST and FIAS in 2012, ranking second that year in the Green500 list with a power efficiency of 2.3 GFLOPS/W (Rohr et al., 2014). It is a heterogeneous accelerator-based HPC system that has 300 compute nodes. Each node includes two Intel Xeon E5-2650 CPUs, two AMD FirePro S10000 dual GPUs and 128 GiB of main memory. In this work, the seven benchmarks of HPCC were installed and configured to reassess the performance of SANAM, as part of an unpublished master thesis, after it was reassembled in the Kingdom of Saudi Arabia. We present here detailed results of HPL and STREAM benchmarks.
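
    As a rough illustration of what the STREAM benchmark measures (a NumPy sketch of the "triad" kernel, not the official STREAM code and not the configuration used on SANAM), sustained memory bandwidth can be estimated by timing a = b + scalar * c over arrays large enough to exceed the caches:

      # Hypothetical sketch of a STREAM-triad-style bandwidth measurement.
      import time
      import numpy as np

      n = 20_000_000                      # array length; large enough to exceed CPU caches
      b = np.random.rand(n)
      c = np.random.rand(n)
      scalar = 3.0

      start = time.perf_counter()
      a = b + scalar * c                  # triad: two loads and one store per element
      elapsed = time.perf_counter() - start

      bytes_moved = 3 * n * 8             # three float64 arrays touched
      print(f"triad bandwidth ~ {bytes_moved / elapsed / 1e9:.1f} GB/s")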

  9. HPL and STREAM Benchmarks on SANAM Supercomputer

    KAUST Repository

    Bin Sulaiman, Riman A.

    2017-03-13

    SANAM supercomputer was jointly built by KACST and FIAS in 2012 ranking second that year in the Green500 list with a power efficiency of 2.3 GFLOPS/W (Rohr et al., 2014). It is a heterogeneous accelerator-based HPC system that has 300 compute nodes. Each node includes two Intel Xeon E5?2650 CPUs, two AMD FirePro S10000 dual GPUs and 128 GiB of main memory. In this work, the seven benchmarks of HPCC were installed and configured to reassess the performance of SANAM, as part of an unpublished master thesis, after it was reassembled in the Kingdom of Saudi Arabia. We present here detailed results of HPL and STREAM benchmarks.

  10. Supercomputing Centers and Electricity Service Providers

    DEFF Research Database (Denmark)

    Patki, Tapasya; Bates, Natalie; Ghatikar, Girish

    2016-01-01

    Supercomputing Centers (SCs) have high and variable power demands, which increase the challenges of the Electricity Service Providers (ESPs) with regards to efficient electricity distribution and reliable grid operation. High penetration of renewable energy generation further exacerbates this problem. In order to develop a symbiotic relationship between the SCs and their ESPs and to support effective power management at all levels, it is critical to understand and analyze how the existing relationships were formed and how these are expected to evolve. In this paper, we first present results from a detailed, quantitative survey-based analysis and compare the perspectives of the European grid and SCs to the ones of the United States (US). We then show that contrary to the expectation, SCs in the US are more open toward cooperating and developing demand-management strategies with their ESPs...

  11. Multi-petascale highly efficient parallel supercomputer

    Science.gov (United States)

    Asaad, Sameh; Bellofatto, Ralph E.; Blocksome, Michael A.; Blumrich, Matthias A.; Boyle, Peter; Brunheroto, Jose R.; Chen, Dong; Cher, Chen-Yong; Chiu, George L.; Christ, Norman; Coteus, Paul W.; Davis, Kristan D.; Dozsa, Gabor J.; Eichenberger, Alexandre E.; Eisley, Noel A.; Ellavsky, Matthew R.; Evans, Kahn C.; Fleischer, Bruce M.; Fox, Thomas W.; Gara, Alan; Giampapa, Mark E.; Gooding, Thomas M.; Gschwind, Michael K.; Gunnels, John A.; Hall, Shawn A.; Haring, Rudolf A.; Heidelberger, Philip; Inglett, Todd A.; Knudson, Brant L.; Kopcsay, Gerard V.; Kumar, Sameer; Mamidala, Amith R.; Marcella, James A.; Megerian, Mark G.; Miller, Douglas R.; Miller, Samuel J.; Muff, Adam J.; Mundy, Michael B.; O'Brien, John K.; O'Brien, Kathryn M.; Ohmacht, Martin; Parker, Jeffrey J.; Poole, Ruth J.; Ratterman, Joseph D.; Salapura, Valentina; Satterfield, David L.; Senger, Robert M.; Steinmacher-Burow, Burkhard; Stockdell, William M.; Stunkel, Craig B.; Sugavanam, Krishnan; Sugawara, Yutaka; Takken, Todd E.; Trager, Barry M.; Van Oosten, James L.; Wait, Charles D.; Walkup, Robert E.; Watson, Alfred T.; Wisniewski, Robert W.; Wu, Peng

    2018-05-15

    A Multi-Petascale Highly Efficient Parallel Supercomputer of 100 petaflop-scale includes node architectures based upon System-On-a-Chip technology, where each processing node comprises a single Application Specific Integrated Circuit (ASIC). The ASIC nodes are interconnected by a five-dimensional torus network that maximizes the throughput of packet communications between nodes and minimizes latency. The network implements a collective network and a global asynchronous network that provide global barrier and notification functions. Integrated into the node design is a list-based prefetcher. The memory system implements transactional memory, thread-level speculation, and a multiversioning cache that improves the soft error rate while supporting DMA functionality for parallel message passing.
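
    The five-dimensional torus mentioned in the abstract gives every node exactly ten nearest neighbors (one hop in the plus and minus direction of each dimension, with wraparound links closing the torus). A small illustrative sketch of that neighbor calculation follows; the per-dimension extents are hypothetical and not the actual machine layout.

        # Illustrative sketch: nearest neighbors of a node in a 5-D torus.
        # The dimension extents below are assumptions, not the real topology.
        DIMS = (4, 4, 4, 8, 2)

        def torus_neighbors(coord, dims=DIMS):
            """Return the 2 * len(dims) neighbors reached by one hop, with wraparound."""
            neighbors = []
            for axis, extent in enumerate(dims):
                for step in (-1, +1):
                    nbr = list(coord)
                    nbr[axis] = (nbr[axis] + step) % extent   # torus wraparound
                    neighbors.append(tuple(nbr))
            return neighbors

        print(torus_neighbors((0, 0, 0, 0, 0)))   # ten neighbors in five dimensions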

  12. Distributed computing environments for future space control systems

    Science.gov (United States)

    Viallefont, Pierre

    1993-01-01

    The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.

  13. Environment and future of the nuclear energy in France

    International Nuclear Information System (INIS)

    Lebas, G.

    1999-01-01

    This work presents the problem of the renewal of the French electro-nuclear park with respect to its energy, economic, environmental, political and ethical aspects. The theoretical framework chosen for this analysis is that of sustainable development, because of the uncertainty, irreversibility and equity aspects characterizing this choice. Thus, this work evaluates the capacity of nuclear technology to ensure the simultaneous reproduction of the economic sphere, the human sphere and the biosphere. The past, present and future energy situation of France is analyzed in the first chapter, together with the characteristics of the nuclear choice. In the second chapter, the analysis of the different possible energy options leads to the conclusion that the nuclear option remains the most suitable for reconciling economy and ecology, but that a diversification of reactor technologies is necessary to take advantage of the efficiency of each technology with respect to its use. The nuclear choice has the advantage of limiting the trade-offs between economic, ecological, political and human stakes. Realizing the diversification project presupposes keeping all energy options open and preparing for the replacement of the present-day power plants by 2010-2020. The success of this policy will depend on the risk-management and public-information efforts that public authorities and nuclear industry actors carry out to avoid any social opposition to nuclear energy. (J.S.)

  14. Supercomputer and cluster performance modeling and analysis efforts:2004-2006.

    Energy Technology Data Exchange (ETDEWEB)

    Sturtevant, Judith E.; Ganti, Anand; Meyer, Harold (Hal) Edward; Stevenson, Joel O.; Benner, Robert E., Jr. (.,; .); Goudy, Susan Phelps; Doerfler, Douglas W.; Domino, Stefan Paul; Taylor, Mark A.; Malins, Robert Joseph; Scott, Ryan T.; Barnette, Daniel Wayne; Rajan, Mahesh; Ang, James Alfred; Black, Amalia Rebecca; Laub, Thomas William; Vaughan, Courtenay Thomas; Franke, Brian Claude

    2007-02-01

    This report describes efforts by the Performance Modeling and Analysis Team to investigate performance characteristics of Sandia's engineering and scientific applications on the ASC capability and advanced architecture supercomputers, and Sandia's capacity Linux clusters. Efforts to model various aspects of these computers are also discussed. The goals of these efforts are to quantify and compare Sandia's supercomputer and cluster performance characteristics; to reveal strengths and weaknesses in such systems; and to predict performance characteristics of, and provide guidelines for, future acquisitions and follow-on systems. Described herein are the results obtained from running benchmarks and applications to extract performance characteristics and comparisons, as well as modeling efforts, obtained during the time period 2004-2006. The format of the report, with hypertext links to numerous additional documents, purposefully minimizes the document size needed to disseminate the extensive results from our research.

  15. BSMBench: a flexible and scalable supercomputer benchmark from computational particle physics

    CERN Document Server

    Bennett, Ed; Del Debbio, Luigi; Jordan, Kirk; Patella, Agostino; Pica, Claudio; Rago, Antonio

    2016-01-01

    Benchmarking plays a central role in the evaluation of High Performance Computing architectures. Several benchmarks have been designed that allow users to stress various components of supercomputers. In order for the figures they provide to be useful, benchmarks need to be representative of the most common real-world scenarios. In this work, we introduce BSMBench, a benchmarking suite derived from Monte Carlo code used in computational particle physics. The advantage of this suite (which can be freely downloaded from http://www.bsmbench.org/) over others is the capacity to vary the relative importance of computation and communication. This enables the tests to simulate various practical situations. To showcase BSMBench, we perform a wide range of tests on various architectures, from desktop computers to state-of-the-art supercomputers, and discuss the corresponding results. Possible future directions of development of the benchmark are also outlined.

  16. Building more powerful less expensive supercomputers using Processing-In-Memory (PIM) LDRD final report.

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, Richard C.

    2009-09-01

    This report details the accomplishments of the 'Building More Powerful Less Expensive Supercomputers Using Processing-In-Memory (PIM)' LDRD ('PIM LDRD', number 105809) for FY07-FY09. Latency dominates all levels of supercomputer design. Within a node, increasing memory latency, relative to processor cycle time, limits CPU performance. Between nodes, the same increase in relative latency impacts scalability. Processing-In-Memory (PIM) is an architecture that directly addresses this problem using enhanced chip fabrication technology and machine organization. PIMs combine high-speed logic and dense, low-latency, high-bandwidth DRAM, and lightweight threads that tolerate latency by performing useful work during memory transactions. This work examines the potential of PIM-based architectures to support mission-critical Sandia applications and an emerging class of more data-intensive informatics applications. This work has resulted in a stronger architecture/implementation collaboration between 1400 and 1700. Additionally, key technology components have impacted vendor roadmaps, and we are in the process of pursuing these new collaborations. This work has the potential to impact future supercomputer design and construction, reducing power and increasing performance. This final report is organized as follows: this summary chapter discusses the impact of the project (Section 1), provides an enumeration of publications and other public discussion of the work (Section 1), and concludes with a discussion of future work and impact from the project (Section 1). The appendix contains reprints of the refereed publications resulting from this work.

  17. 'Create the future': an environment for excellence in teaching future-oriented Industrial Design Engineering

    NARCIS (Netherlands)

    Eger, Arthur O.; Lutters, Diederick; van Houten, Frederikus J.A.M.

    2004-01-01

    In 2001, the University of Twente started a new course on Industrial Design Engineering. This paper describes the insights that have been employed in developing the curriculum, and in developing the environment in which the educational activities are facilitated. The University of Twente has a broad

  18. Integration of Titan supercomputer at OLCF with ATLAS Production System

    Science.gov (United States)

    Barreiro Megino, F.; De, K.; Jha, S.; Klimentov, A.; Maeno, T.; Nilsson, P.; Oleynik, D.; Padolski, S.; Panitkin, S.; Wells, J.; Wenaus, T.; ATLAS Collaboration

    2017-10-01

    The PanDA (Production and Distributed Analysis) workload management system was developed to meet the scale and complexity of distributed computing for the ATLAS experiment. PanDA managed resources are distributed worldwide, on hundreds of computing sites, with thousands of physicists accessing hundreds of Petabytes of data, and the rate of data processing already exceeds an exabyte per year. While PanDA currently uses more than 200,000 cores at well over 100 Grid sites, future LHC data taking runs will require more resources than Grid computing can possibly provide. Additional computing and storage resources are required. Therefore ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. In this paper we will describe a project aimed at the integration of the ATLAS Production System with the Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF). The current approach utilizes a modified PanDA Pilot framework for job submission to Titan’s batch queues and local data management, with lightweight MPI wrappers to run single node workloads in parallel on Titan’s multi-core worker nodes. It provides for running standard ATLAS production jobs on unused resources (backfill) on Titan. The system has already allowed ATLAS to collect millions of core-hours per month on Titan and execute hundreds of thousands of jobs, while simultaneously improving Titan’s utilization efficiency. We will discuss the details of the implementation, current experience with running the system, as well as future plans aimed at improvements in scalability and efficiency. Notice: This manuscript has been authored, by employees of Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy. The publisher by accepting the manuscript for publication acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to

  19. The Potential of Simulated Environments in Teacher Education: Current and Future Possibilities

    Science.gov (United States)

    Dieker, Lisa A.; Rodriguez, Jacqueline A.; Lignugaris/Kraft, Benjamin; Hynes, Michael C.; Hughes, Charles E.

    2014-01-01

    The future of virtual environments is evident in many fields but is just emerging in the field of teacher education. In this article, the authors provide a summary of the evolution of simulation in the field of teacher education and three factors that need to be considered as these environments further develop. The authors provide a specific…

  20. Design of multiple sequence alignment algorithms on parallel, distributed memory supercomputers.

    Science.gov (United States)

    Church, Philip C; Goscinski, Andrzej; Holt, Kathryn; Inouye, Michael; Ghoting, Amol; Makarychev, Konstantin; Reumann, Matthias

    2011-01-01

    The challenge of comparing two or more genomes that have undergone recombination and substantial amounts of segmental loss and gain has recently been addressed for small numbers of genomes. However, datasets of hundreds of genomes are now common and their sizes will only increase in the future. Multiple sequence alignment of hundreds of genomes remains an intractable problem due to quadratic increases in compute time and memory footprint. To date, most alignment algorithms are designed for commodity clusters without parallelism. Hence, we propose the design of a multiple sequence alignment algorithm on massively parallel, distributed memory supercomputers to enable research into comparative genomics on large data sets. Following the methodology of the sequential progressiveMauve algorithm, we design data structures including sequences and sorted k-mer lists on the IBM Blue Gene/P supercomputer (BG/P). Preliminary results show that we can reduce the memory footprint so that we can potentially align over 250 bacterial genomes on a single BG/P compute node. We verify our results on a dataset of E.coli, Shigella and S.pneumoniae genomes. Our implementation returns results matching those of the original algorithm but in 1/2 the time and with 1/4 the memory footprint for scaffold building. In this study, we have laid the basis for multiple sequence alignment of large-scale datasets on a massively parallel, distributed memory supercomputer, thus enabling comparison of hundreds instead of a few genome sequences within reasonable time.
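
    One of the data structures ported to the BG/P in this work is a sorted k-mer list per sequence. The following is a hedged, purely sequential sketch of that basic construction; the authors' distributed, memory-optimized implementation is of course far more elaborate.

        # Simplified sketch: build a lexicographically sorted k-mer list with
        # source offsets. The distributed BG/P data structures are more involved.
        def sorted_kmer_list(sequence, k):
            """Return (k-mer, offset) pairs sorted by k-mer."""
            kmers = [(sequence[i:i + k], i) for i in range(len(sequence) - k + 1)]
            kmers.sort(key=lambda pair: pair[0])
            return kmers

        genome = "GATTACAGATTACA"       # toy stand-in for a bacterial genome
        for kmer, offset in sorted_kmer_list(genome, k=4):
            print(offset, kmer)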

  1. Symbolic simulation of engineering systems on a supercomputer

    International Nuclear Information System (INIS)

    Ragheb, M.; Gvillo, D.; Makowitz, H.

    1986-01-01

    Model-Based Production-Rule systems for analysis are developed for the symbolic simulation of Complex Engineering systems on a CRAY X-MP Supercomputer. The Fault-Tree and Event-Tree Analysis methodologies from Systems-Analysis are used for problem representation and are coupled to the Rule-Based System Paradigm from Knowledge Engineering to provide modelling of engineering devices. Modelling is based on knowledge of the structure and function of the device rather than on human expertise alone. To implement the methodology, we developed a production-Rule Analysis System that uses both backward-chaining and forward-chaining: HAL-1986. The inference engine uses an Induction-Deduction-Oriented antecedent-consequent logic and is programmed in Portable Standard Lisp (PSL). The inference engine is general and can accommodate general modifications and additions to the knowledge base. The methodologies used will be demonstrated using a model for the identification of faults, and subsequent recovery from abnormal situations in Nuclear Reactor Safety Analysis. The use of the exposed methodologies for the prognostication of future device responses under operational and accident conditions using coupled symbolic and procedural programming is discussed

  2. Analyzing the Interplay of Failures and Workload on a Leadership-Class Supercomputer

    Energy Technology Data Exchange (ETDEWEB)

    Meneses, Esteban [University of Pittsburgh; Ni, Xiang [University of Illinois at Urbana-Champaign; Jones, Terry R [ORNL; Maxwell, Don E [ORNL

    2015-01-01

    The unprecedented computational power of current supercomputers now makes possible the exploration of complex problems in many scientific fields, from genomic analysis to computational fluid dynamics. Modern machines are powerful because they are massive: they assemble millions of cores and a huge quantity of disks, cards, routers, and other components. But it is precisely the size of these machines that clouds the future of supercomputing. A system that comprises many components has a high chance of failing, and failing often. In order to make the next generation of supercomputers usable, it is imperative to use some type of fault tolerance platform to run applications on large machines. Most fault tolerance strategies can be optimized for the peculiarities of each system and boost efficacy by keeping the system productive. In this paper, we aim to understand how failure characterization can improve resilience in several layers of the software stack: applications, runtime systems, and job schedulers. We examine the Titan supercomputer, one of the fastest systems in the world. We analyze a full year of Titan in production and distill the failure patterns of the machine. By looking into Titan's log files and using the criteria of experts, we provide a detailed description of the types of failures. In addition, we inspect the job submission files and describe how the system is used. Using those two sources, we cross-correlate failures in the machine to executing jobs and provide a picture of how failures affect the user experience. We believe such characterization is fundamental in developing appropriate fault tolerance solutions for Cray systems similar to Titan.
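
    The cross-correlation of failures with executing jobs described above amounts to an interval-overlap test between failure timestamps and job start/end times. A schematic sketch is given below; the record layouts are invented for illustration and do not reflect Titan's actual RAS or scheduler log formats.

        # Schematic sketch of correlating failure events with running jobs by time.
        # Field names and values are hypothetical, not real Titan log formats.
        from datetime import datetime

        failures = [                                # (timestamp, component)
            (datetime(2014, 3, 1, 12, 5), "GPU"),
            (datetime(2014, 3, 1, 18, 40), "LUSTRE"),
        ]
        jobs = [                                    # (job_id, start, end)
            ("job_001", datetime(2014, 3, 1, 11, 0), datetime(2014, 3, 1, 13, 0)),
            ("job_002", datetime(2014, 3, 1, 14, 0), datetime(2014, 3, 1, 15, 0)),
        ]

        def jobs_hit_by(failure_time):
            """Return the jobs whose execution window contains the failure time."""
            return [jid for (jid, start, end) in jobs if start <= failure_time <= end]

        for when, component in failures:
            print(when, component, "->", jobs_hit_by(when) or "no running job")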

  3. HEP Computing Tools, Grid and Supercomputers for Genome Sequencing Studies

    Science.gov (United States)

    De, K.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Novikov, A.; Poyda, A.; Tertychnyy, I.; Wenaus, T.

    2017-10-01

    PanDA, the Production and Distributed Analysis workload management system, has been developed to address the data processing and analysis challenges of the ATLAS experiment at the LHC. Recently PanDA has been extended to run HEP scientific applications on Leadership Class Facilities and supercomputers. The success of the projects using PanDA beyond HEP and the Grid has drawn attention from other compute-intensive sciences such as bioinformatics. Recent advances in Next Generation Genome Sequencing (NGS) technology have led to increasing streams of sequencing data that need to be processed, analysed and made available for bioinformaticians worldwide. Analysis of genome sequencing data using the popular software pipeline PALEOMIX can take a month even when run on a powerful computing resource. In this paper we describe the adaptation of the PALEOMIX pipeline to run in a distributed computing environment powered by PanDA. To run the pipeline we split the input files into chunks, which are processed separately on different nodes as separate PALEOMIX inputs, and finally merge the output files; this is very similar to how ATLAS processes and simulates data. We dramatically decreased the total wall time thanks to automated job (re)submission and brokering within PanDA. Using software tools initially developed for HEP and the Grid can reduce the payload execution time for mammoth DNA samples from weeks to days.
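
    The split-and-merge scheme described in this abstract, in which input files are cut into chunks that run as independent jobs and whose outputs are merged afterwards, can be pictured roughly as follows. The chunk size and file naming are assumptions for illustration; the real PALEOMIX/PanDA integration additionally handles staging, brokering and automatic resubmission.

        # Rough sketch of the split / process-per-chunk / merge pattern.
        # Paths and the chunk size are made up for illustration.
        from pathlib import Path

        def split_into_chunks(path, lines_per_chunk=4_000_000):
            """Split a large text input into chunk files; return their paths."""
            chunks, buf = [], []
            with open(path) as fh:
                for line in fh:
                    buf.append(line)
                    if len(buf) >= lines_per_chunk:
                        chunks.append(write_chunk(path, len(chunks), buf))
                        buf = []
            if buf:
                chunks.append(write_chunk(path, len(chunks), buf))
            return chunks

        def write_chunk(path, index, lines):
            out = Path(f"{path}.chunk{index:04d}")
            out.write_text("".join(lines))
            return out

        def merge_outputs(chunk_outputs, merged_path):
            """Concatenate the per-chunk results back into one output file."""
            with open(merged_path, "w") as out:
                for part in chunk_outputs:
                    out.write(Path(part).read_text())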

  4. Storage-Intensive Supercomputing Benchmark Study

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, J; Dossa, D; Gokhale, M; Hysom, D; May, J; Pearce, R; Yoo, A

    2007-10-30

    Critical data science applications requiring frequent access to storage perform poorly on today's computing architectures. This project addresses efficient computation of data-intensive problems in national security and basic science by exploring, advancing, and applying a new form of computing called storage-intensive supercomputing (SISC). Our goal is to enable applications that simply cannot run on current systems, and, for a broad range of data-intensive problems, to deliver an order of magnitude improvement in price/performance over today's data-intensive architectures. This technical report documents much of the work done under LDRD 07-ERD-063 Storage Intensive Supercomputing during the period 05/07-09/07. The following chapters describe: (1) a new file I/O monitoring tool iotrace developed to capture the dynamic I/O profiles of Linux processes; (2) an out-of-core graph benchmark for level-set expansion of scale-free graphs; (3) an entity extraction benchmark consisting of a pipeline of eight components; and (4) an image resampling benchmark drawn from the SWarp program in the LSST data processing pipeline. The performance of the graph and entity extraction benchmarks was measured in three different scenarios: data sets residing on the NFS file server and accessed over the network; data sets stored on local disk; and data sets stored on the Fusion I/O parallel NAND Flash array. The image resampling benchmark compared the performance of a software-only implementation with a GPU-accelerated one. In addition to the work reported here, a text processing application was developed that used an FPGA to accelerate n-gram profiling for language classification. The n-gram application will be presented at SC07 at the High Performance Reconfigurable Computing Technologies and Applications Workshop. The graph and entity extraction benchmarks were run on a Supermicro server housing the 40GB NAND Flash parallel disk array, the Fusion-io. The Fusion system specs are as follows

  5. Retail food environments research: Promising future with more work to be done.

    Science.gov (United States)

    Fuller, Daniel; Engler-Stringer, Rachel; Muhajarine, Nazeem

    2016-06-09

    As members of the scientific committee for the Food Environments in Canada conference, we reflect on the current state of food environments research in Canada. We are very encouraged that the field is growing and there have been many collaborative efforts to link researchers in Canada, including the 2015 Food Environments in Canada Symposium and Workshop. We believe there are 5 key challenges the field will need to collectively address: theory and causality; replication and extension; consideration of rural, northern and vulnerable populations; policy analysis; and intervention research. In addressing the challenges, we look forward to working together to conduct more sophisticated, complex and community-driven food environments research in the future.

  6. Performance analysis of job scheduling policies in parallel supercomputing environments

    Energy Technology Data Exchange (ETDEWEB)

    Naik, V.K.; Squillante, M.S. [IBM T.J. Watson Research Center, Yorktown Heights, NY (United States); Setia, S.K. [George Mason Univ., Fairfax, VA (United States). Dept. of Computer Science

    1993-12-31

    In this paper the authors analyze three general classes of scheduling policies under a workload typical of large-scale scientific computing. These policies differ in the manner in which processors are partitioned among the jobs as well as the way in which jobs are prioritized for execution on the partitions. Their results indicate that existing static schemes do not perform well under varying workloads. Adaptive policies tend to make better scheduling decisions, but their ability to adjust to workload changes is limited. Dynamic partitioning policies, on the other hand, yield the best performance and can be tuned to provide desired performance differences among jobs with varying resource demands.
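
    As a toy illustration of the policy classes compared in this paper, the sketch below contrasts a static partitioning (fixed partition size chosen ahead of time) with a dynamic equipartitioning that redistributes processors as the number of active jobs changes; all numbers are invented, and the policies analyzed by the authors are considerably richer.

        # Toy comparison of static vs. dynamic processor partitioning.
        # Parameters are invented; the paper's policies and workloads are richer.
        TOTAL_PROCS = 128

        def static_partitions(active_jobs, partition_size=32):
            """Fixed-size partitions: jobs beyond the capacity must queue."""
            runnable = min(active_jobs, TOTAL_PROCS // partition_size)
            return [partition_size] * runnable

        def dynamic_equipartition(active_jobs):
            """Each active job receives an equal share of all processors."""
            if active_jobs == 0:
                return []
            return [TOTAL_PROCS // active_jobs] * active_jobs

        for n in (1, 3, 6, 10):
            print(n, "jobs:", static_partitions(n), dynamic_equipartition(n))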

  7. Plastics, the environment and human health: current consensus and future trends

    OpenAIRE

    Thompson, Richard C.; Moore, Charles J.; vom Saal, Frederick S.; Swan, Shanna H.

    2009-01-01

    Plastics have transformed everyday life; usage is increasing and annual production is likely to exceed 300 million tonnes by 2010. In this concluding paper to the Theme Issue on Plastics, the Environment and Human Health, we synthesize current understanding of the benefits and concerns surrounding the use of plastics and look to future priorities, challenges and opportunities. It is evident that plastics bring many societal benefits and offer future technological and medical advances. However...

  8. Re-inventing electromagnetics - Supercomputing solution of Maxwell's equations via direct time integration on space grids

    International Nuclear Information System (INIS)

    Taflove, A.

    1992-01-01

    This paper summarizes the present state and future directions of applying finite-difference and finite-volume time-domain techniques for Maxwell's equations on supercomputers to model complex electromagnetic wave interactions with structures. Applications so far have been dominated by radar cross section technology, but by no means are limited to this area. In fact, the gains we have made place us on the threshold of being able to make tremendous contributions to non-defense electronics and optical technology. Some of the most interesting research in these commercial areas is summarized. 47 refs
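
    For reference, the heart of the finite-difference time-domain approach summarized here is a leapfrog update of Maxwell's curl equations on a staggered grid. In one dimension, and in a generic textbook form (signs and normalizations vary between formulations, and this is not necessarily the exact discretization used in the paper), the update reads

        H_y^{n+1/2}(k+\tfrac{1}{2}) = H_y^{n-1/2}(k+\tfrac{1}{2}) - \frac{\Delta t}{\mu\,\Delta z}\,\bigl[E_x^{n}(k+1) - E_x^{n}(k)\bigr],

        E_x^{n+1}(k) = E_x^{n}(k) - \frac{\Delta t}{\varepsilon\,\Delta z}\,\bigl[H_y^{n+1/2}(k+\tfrac{1}{2}) - H_y^{n+1/2}(k-\tfrac{1}{2})\bigr],

    with the electric and magnetic fields offset by half a cell in space and half a step in time; the stability of the explicit scheme is governed by the Courant condition c\,\Delta t \le \Delta z.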

  9. JINR supercomputer of the module type for event parallel analysis

    International Nuclear Information System (INIS)

    Kolpakov, I.F.; Senner, A.E.; Smirnov, V.A.

    1987-01-01

    A model of a supercomputer performing 50 million operations per second is suggested. Its realization allows one to solve JINR data analysis problems for large spectrometers (in particular for the DELPHI collaboration). The suggested modular supercomputer is based on 32-bit commercially available microprocessors with a processing rate of about 1 MFLOPS. The processors are combined by means of standard VME buses. A MicroVAX-II host computer organizes the operation of the system. Data input and output are realized via the MicroVAX-II peripherals. Users' software is based on FORTRAN-77. The supercomputer is connected to a JINR network port and all JINR users get access to the suggested system

  10. The Future of the Brigade Combat Team: Air-Ground Integration and the Operating Environment

    Science.gov (United States)

    2017-06-09

    ... coordinate, and control joint and multinational aircraft during CAS situations in combat and training. The current system which the CAS mission falls ... current system, experiences from Vietnam, Operation Desert Storm, Afghanistan and Iraq help to identify future challenges to the operating environment ... multinational partners. Subject terms: Air Ground Integration, Theater Air Ground System, Theater Air Control System, Army Air Ground System, Joint

  11. 2009 Community College Futures Assembly Focus: Leading Change--Leading in an Uncertain Environment

    Science.gov (United States)

    Campbell, Dale F.; Morris, Phillip A.

    2009-01-01

    The Community College Futures Assembly has served as a national, independent policy thinktank since 1995. Its purpose is to articulate the critical issues facing American community colleges and recognize innovative programs. Convening annually in January in Orlando, Florida, the Assembly offers a learning environment where tough questions are…

  12. Exploiting Thread Parallelism for Ocean Modeling on Cray XC Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Sarje, Abhinav [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jacobsen, Douglas W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Samuel W. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ringler, Todd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-05-01

    The incorporation of increasing core counts in modern processors used to build state-of-the-art supercomputers is driving application development towards exploitation of thread parallelism, in addition to distributed memory parallelism, with the goal of delivering efficient high-performance codes. In this work we describe the exploitation of threading and our experiences with it with respect to a real-world ocean modeling application code, MPAS-Ocean. We present detailed performance analysis and comparisons of various approaches and configurations for threading on the Cray XC series supercomputers.

  13. What about Place? Considering the Role of Physical Environment on Youth Imagining of Future Possible Selves

    Science.gov (United States)

    Prince, Dana

    2013-01-01

    Identity research indicates that development of well elaborated cognitions about oneself in the future, or one's possible selves, is consequential for youths' developmental trajectories, influencing a range of social, health, and educational outcomes. Although the theory of possible selves considers the role of social contexts in identity development, the potential influence of the physical environment is understudied. At the same time, a growing body of work spanning multiple disciplines points to the salience of place, or the meaningful physical environments of people's everyday lives, as an active contributor to self-identity. Bridging these two lines of inquiry, I provide evidence to show how place-based experiences, such as belonging, aversion, and entrapment, may be internalized and encoded into possible selves, thus producing emplaced future self-concept. I suggest that for young people, visioning self in the future is inextricably bound with place; place is an active contributor both in the present development of future self-concept and in enabling young people to envision different future possible places. Implications for practice and future research include place-making interventions and conceptualizing place beyond “neighborhood effects.” PMID:25642137

  14. Comments on the parallelization efficiency of the Sunway TaihuLight supercomputer

    OpenAIRE

    Végh, János

    2016-01-01

    In the world of supercomputers, the large number of processors requires minimizing the inefficiencies of parallelization, which appear as a sequential part of the program from the point of view of Amdahl's law. The recently suggested new figure of merit is applied to the recently presented supercomputer, and the timeline of "Top 500" supercomputers is scrutinized using the metric. It is demonstrated that, in addition to the computing performance and power consumption, the new supercomputer i...
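
    For context, the Amdahl's-law bound referred to above, for a parallel fraction p (sequential fraction 1 - p) and N processors, is

        S(N) = \frac{1}{(1 - p) + p/N}, \qquad \lim_{N \to \infty} S(N) = \frac{1}{1 - p},

    so even a small effective sequential fraction caps the speedup attainable on a machine with a very large processor count, which motivates the figure of merit discussed in this preprint.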

  15. Convex unwraps its first grown-up supercomputer

    Energy Technology Data Exchange (ETDEWEB)

    Manuel, T.

    1988-03-03

    Convex Computer Corp.'s new supercomputer family is even more of an industry blockbuster than its first system. At a tenfold jump in performance, it's far from just an incremental upgrade over its first minisupercomputer, the C-1. The heart of the new family, the new C-2 processor, churning at 50 million floating-point operations/s, spawns a group of systems whose performance could pass for some fancy supercomputers, namely those of the Cray Research Inc. family. When added to the C-1, Convex's five new supercomputers create the C series, a six-member product group offering a performance range from 20 to 200 Mflops. They mark an important transition for Convex from a one-product high-tech startup to a multinational company with a wide-ranging product line. It's a tough transition but the Richardson, Texas, company seems to be doing it. The extended product line propels Convex into the upper end of the minisupercomputer class and nudges it into the low end of the big supercomputers. It positions Convex in an uncrowded segment of the market in the $500,000 to $1 million range offering 50 to 200 Mflops of performance. The company is making this move because the minisuper area, which it pioneered, quickly became crowded with new vendors, causing prices and gross margins to drop drastically.

  16. QCD on the BlueGene/L Supercomputer

    International Nuclear Information System (INIS)

    Bhanot, G.; Chen, D.; Gara, A.; Sexton, J.; Vranas, P.

    2005-01-01

    In June 2004 QCD was simulated for the first time at sustained speed exceeding 1 TeraFlops in the BlueGene/L supercomputer at the IBM T.J. Watson Research Lab. The implementation and performance of QCD in the BlueGene/L is presented

  17. QCD on the BlueGene/L Supercomputer

    Science.gov (United States)

    Bhanot, G.; Chen, D.; Gara, A.; Sexton, J.; Vranas, P.

    2005-03-01

    In June 2004 QCD was simulated for the first time at sustained speed exceeding 1 TeraFlops in the BlueGene/L supercomputer at the IBM T.J. Watson Research Lab. The implementation and performance of QCD in the BlueGene/L is presented.

  18. Mathematical methods and supercomputing in nuclear applications. Proceedings. Vol. 2

    International Nuclear Information System (INIS)

    Kuesters, H.; Stein, E.; Werner, W.

    1993-04-01

    All papers of the two volumes are separately indexed in the data base. Main topics are: Progress in advanced numerical techniques, fluid mechanics, on-line systems, artificial intelligence applications, nodal methods reactor kinetics, reactor design, supercomputer architecture, probabilistic estimation of risk assessment, methods in transport theory, advances in Monte Carlo techniques, and man-machine interface. (orig.)

  19. Mathematical methods and supercomputing in nuclear applications. Proceedings. Vol. 1

    International Nuclear Information System (INIS)

    Kuesters, H.; Stein, E.; Werner, W.

    1993-04-01

    All papers of the two volumes are separately indexed in the data base. Main topics are: Progress in advanced numerical techniques, fluid mechanics, on-line systems, artificial intelligence applications, nodal methods reactor kinetics, reactor design, supercomputer architecture, probabilistic estimation of risk assessment, methods in transport theory, advances in Monte Carlo techniques, and man-machine interface. (orig.)

  20. Role of supercomputers in magnetic fusion and energy research programs

    International Nuclear Information System (INIS)

    Killeen, J.

    1985-06-01

    The importance of computer modeling in magnetic fusion (MFE) and energy research (ER) programs is discussed. The need for the most advanced supercomputers is described, and the role of the National Magnetic Fusion Energy Computer Center in meeting these needs is explained

  1. Flux-Level Transit Injection Experiments with NASA Pleiades Supercomputer

    Science.gov (United States)

    Li, Jie; Burke, Christopher J.; Catanzarite, Joseph; Seader, Shawn; Haas, Michael R.; Batalha, Natalie; Henze, Christopher; Christiansen, Jessie; Kepler Project, NASA Advanced Supercomputing Division

    2016-06-01

    Flux-Level Transit Injection (FLTI) experiments are executed with NASA's Pleiades supercomputer for the Kepler Mission. The latest release (9.3, January 2016) of the Kepler Science Operations Center Pipeline is used in the FLTI experiments. Their purpose is to validate the Analytic Completeness Model (ACM), which can be computed for all Kepler target stars, thereby enabling exoplanet occurrence rate studies. Pleiades, a facility of NASA's Advanced Supercomputing Division, is one of the world's most powerful supercomputers and represents NASA's state-of-the-art technology. We discuss the details of implementing the FLTI experiments on the Pleiades supercomputer. For example, taking into account that ~16 injections are generated by one core of the Pleiades processors in an hour, the “shallow” FLTI experiment, in which ~2000 injections are required per target star, can be done for 16% of all Kepler target stars in about 200 hours. Stripping down the transit search to bare bones, i.e. only searching adjacent high/low periods at high/low pulse durations, makes the computationally intensive FLTI experiments affordable. The design of the FLTI experiments and the analysis of the resulting data are presented in “Validating an Analytic Completeness Model for Kepler Target Stars Based on Flux-level Transit Injection Experiments” by Catanzarite et al. (#2494058).Kepler was selected as the 10th mission of the Discovery Program. Funding for the Kepler Mission has been provided by the NASA Science Mission Directorate.
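
    The per-star cost quoted above follows from simple arithmetic:

        \frac{2000 \ \text{injections per star}}{16 \ \text{injections per core-hour}} \approx 125 \ \text{core-hours per star},

    so processing 16% of the Kepler target stars within roughly 200 wall-clock hours implies spreading the work across many thousands of Pleiades cores in parallel (the exact allocation is not stated in the abstract).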

  2. Integration of Panda Workload Management System with supercomputers

    Science.gov (United States)

    De, K.; Jha, S.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Nilsson, P.; Novikov, A.; Oleynik, D.; Panitkin, S.; Poyda, A.; Read, K. F.; Ryabinkin, E.; Teslyuk, A.; Velikhov, V.; Wells, J. C.; Wenaus, T.

    2016-09-01

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 140 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250000 cores with a peak performance of 0.3+ petaFLOPS, the next LHC data taking runs will require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at the integration of the PanDA WMS with supercomputers in the United States, Europe and Russia (in particular with the Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF), the Supercomputer at the National Research Center "Kurchatov Institute", IT4 in Ostrava, and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on Titan's multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads
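
    The light-weight MPI wrapper idea described in this and the related records, in which many independent single-threaded payloads are fanned out across the cores of a supercomputer's worker nodes, can be sketched with mpi4py as below. This is a hedged illustration of the pattern only, not the actual PanDA pilot wrapper; the payload executable and input naming are hypothetical.

        # Hedged sketch of an MPI fan-out wrapper: every rank runs one
        # independent single-threaded payload. Not the actual PanDA pilot code;
        # the payload command and input file names are hypothetical.
        import subprocess
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        size = comm.Get_size()

        inputs = [f"events_{i:05d}.in" for i in range(size)]   # one input per rank

        # Each MPI rank launches its own serial payload on its own core.
        result = subprocess.run(["./run_payload", inputs[rank]],
                                capture_output=True, text=True)

        # Collect the return codes on rank 0 so the wrapper can report success.
        codes = comm.gather(result.returncode, root=0)
        if rank == 0:
            print("payloads finished, return codes:", codes)

    Launched with the site's MPI job launcher (for example, something like aprun -n 16 python wrapper.py on a Cray system, with the launcher flags and core binding being site-specific assumptions), each rank occupies one core, which is the essence of running single-threaded workloads in parallel on multi-core worker nodes.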

  3. Behaviour and control of radionuclides in the environment: present state of knowledge and future needs

    International Nuclear Information System (INIS)

    Myttenaere, C.

    1983-01-01

    The Radiation Protection Programme of the European Communities is discussed in the context of the behaviour and control of radionuclides in the environment with reference to the aims of the programme, the results of current research activities and requirements for future studies. The summarised results of the radioecological research activities for 1976 - 1980 include the behaviour of α-emitters (Pu, Am, Cm), 99Tc, 137Cs, 144Ce, 106Ru and 125Sb in marine environments; atmospheric dispersion of radionuclides; and the transport of radionuclides in components of freshwater and terrestrial ecosystems. (U.K.)

  4. Integration of PanDA workload management system with Titan supercomputer at OLCF

    Science.gov (United States)

    De, K.; Klimentov, A.; Oleynik, D.; Panitkin, S.; Petrosyan, A.; Schovancova, J.; Vaniachine, A.; Wenaus, T.

    2015-12-01

    The PanDA (Production and Distributed Analysis) workload management system (WMS) was developed to meet the scale and complexity of LHC distributed computing for the ATLAS experiment. While PanDA currently distributes jobs to more than 100,000 cores at well over 100 Grid sites, the future LHC data taking runs will require more resources than Grid computing can possibly provide. To alleviate these challenges, ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF). The current approach utilizes a modified PanDA pilot framework for job submission to Titan's batch queues and local data management, with light-weight MPI wrappers to run single threaded workloads in parallel on Titan's multicore worker nodes. It also gives PanDA new capability to collect, in real time, information about unused worker nodes on Titan, which allows precise definition of the size and duration of jobs submitted to Titan according to available free resources. This capability significantly reduces PanDA job wait time while improving Titan's utilization efficiency. This implementation was tested with a variety of Monte-Carlo workloads on Titan and is being tested on several other supercomputing platforms. Notice: This manuscript has been authored, by employees of Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy. The publisher by accepting the manuscript for publication acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes.

  5. Design Rework Prediction in Concurrent Design Environment: Current Trends and Future Research Directions

    OpenAIRE

    Arundachawat, Panumas; Roy, Rajkumar; Al-Ashaab, Ahmed; Shehab, Essam

    2009-01-01

    Organised by: Cranfield University This paper aims to present state-of-the-art and formulate future research areas on design rework in concurrent design environment. Related literatures are analysed to extract the key factors which impact design rework. Design rework occurs due to changes from upstream design activities and/or by feedbacks from downstream design activities. Design rework is considered as negative iteration; therefore, value in design activities will be increase...

  6. Perspectives on Advanced Learning Technologies and Learning Networks and Future Aerospace Workforce Environments

    Science.gov (United States)

    Noor, Ahmed K. (Compiler)

    2003-01-01

    An overview of the advanced learning technologies is given in this presentation along with a brief description of their impact on future aerospace workforce development. The presentation is divided into five parts (see Figure 1). In the first part, a brief historical account of the evolution of learning technologies is given. The second part describes the current learning activities. The third part describes some of the future aerospace systems, as examples of high-tech engineering systems, and lists their enabling technologies. The fourth part focuses on future aerospace research, learning and design environments. The fifth part lists the objectives of the workshop and some of the sources of information on learning technologies and learning networks.

  7. Perspectives on Emerging/Novel Computing Paradigms and Future Aerospace Workforce Environments

    Science.gov (United States)

    Noor, Ahmed K.

    2003-01-01

    The accelerating pace of computing technology development shows no signs of abating. Computing power is likely to reach 100 Tflop/s by 2004 and 1 Pflop/s (10^15 Flop/s) by 2007. The fundamental physical limits of computation, including information storage limits, communication limits and computation rate limits will likely be reached by the middle of the present millennium. To overcome these limits, novel technologies and new computing paradigms will be developed. An attempt is made in this overview to put the diverse activities related to new computing paradigms in perspective and to set the stage for the succeeding presentations. The presentation is divided into five parts. In the first part, a brief historical account is given of the development of computer and networking technologies. The second part provides brief overviews of the three emerging computing paradigms: grid, ubiquitous and autonomic computing. The third part lists future computing alternatives and the characteristics of the future computing environment. The fourth part describes future aerospace workforce research, learning and design environments. The fifth part lists the objectives of the workshop and some of the sources of information on future computing paradigms.

  8. Anaesthesia in austere environments: literature review and considerations for future space exploration missions.

    Science.gov (United States)

    Komorowski, Matthieu; Fleming, Sarah; Mawkin, Mala; Hinkelbein, Jochen

    2018-01-01

    Future space exploration missions will take humans far beyond low Earth orbit and require complete crew autonomy. The ability to provide anaesthesia will be important given the expected risk of severe medical events requiring surgery. Knowledge and experience of such procedures during space missions is currently extremely limited. Austere and isolated environments (such as polar bases or submarines) have been used extensively as test beds for spaceflight to probe hazards, train crews, develop clinical protocols and countermeasures for prospective space missions. We have conducted a literature review on anaesthesia in austere environments relevant to distant space missions. In each setting, we assessed how the problems related to the provision of anaesthesia (e.g., medical kit and skills) are dealt with or prepared for. We analysed how these factors could be applied to the unique environment of a space exploration mission. The delivery of anaesthesia will be complicated by many factors including space-induced physiological changes and limitations in skills and equipment. The basic principles of a safe anaesthesia in an austere environment (appropriate training, presence of minimal safety and monitoring equipment, etc.) can be extended to the context of a space exploration mission. Skills redundancy is an important safety factor, and basic competency in anaesthesia should be part of the skillset of several crewmembers. The literature suggests that safe and effective anaesthesia could be achieved by a physician during future space exploration missions. In a life-or-limb situation, non-physicians may be able to conduct anaesthetic procedures, including simplified general anaesthesia.

  9. The Future of Coral Reefs Subject to Rapid Climate Change: Lessons from Natural Extreme Environments

    Directory of Open Access Journals (Sweden)

    Emma F. Camp

    2018-02-01

    Full Text Available Global climate change and localized anthropogenic stressors are driving rapid declines in coral reef health. In vitro experiments have been fundamental in providing insight into how reef organisms will potentially respond to future climates. However, such experiments are inevitably limited in their ability to reproduce the complex interactions that govern reef systems. Studies examining coral communities that already persist under naturally-occurring extreme and marginal physicochemical conditions have therefore become increasingly popular to advance ecosystem scale predictions of future reef form and function, although no single site provides a perfect analog to future reefs. Here we review the current state of knowledge that exists on the distribution of corals in marginal and extreme environments, and geographic sites at the latitudinal extremes of reef growth, as well as a variety of shallow reef systems and reef-neighboring environments (including upwelling and CO2 vent sites). We also conduct a synthesis of the abiotic data that have been collected at these systems, to provide the first collective assessment on the range of extreme conditions under which corals currently persist. We use the review and data synthesis to increase our understanding of the biological and ecological mechanisms that facilitate survival and success under sub-optimal physicochemical conditions. This comprehensive assessment can begin to: (i) highlight the extent of extreme abiotic scenarios under which corals can persist, (ii) explore whether there are commonalities in coral taxa able to persist in such extremes, (iii) provide evidence for key mechanisms required to support survival and/or persistence under sub-optimal environmental conditions, and (iv) evaluate the potential of current sub-optimal coral environments to act as potential refugia under changing environmental conditions. Such a collective approach is critical to better understand the future survival of

  10. Research on biomass energy and environment from the past to the future: A bibliometric analysis.

    Science.gov (United States)

    Mao, Guozhu; Huang, Ning; Chen, Lu; Wang, Hongmei

    2018-09-01

    The development and utilization of biomass energy can help to change the ways of energy production and consumption and establish a sustainable energy system that can effectively promote the development of the national economy and strengthen the protection of the environment. Here, we perform a bibliometric analysis of 9514 literature reports in the Web of Science Core Collection searched with the key words "Biomass energy" and "Environment*", dating from 1998 to 2017; hot topics in the research and development of biomass energy utilization, as well as the status and development trends of biomass energy utilization and the environment, were analyzed based on content analysis and bibliometrics. The interaction between biomass energy and the environment began to become a major concern as the research progressively deepened. This work is of great significance for the development and utilization of biomass energy, putting forward specific suggestions and strategies based on the analysis and demonstration of relationships and interactions between biomass energy utilization and the environment. It is also useful to researchers for selecting future research topics. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Tryton Supercomputer Capabilities for Analysis of Massive Data Streams

    Directory of Open Access Journals (Sweden)

    Krawczyk Henryk

    2015-09-01

    Full Text Available The recently deployed supercomputer Tryton, located in the Academic Computer Center of Gdansk University of Technology, provides great means for massive parallel processing. Moreover, the status of the Center as one of the main network nodes in the PIONIER network enables the fast and reliable transfer of data produced by miscellaneous devices scattered in the area of the whole country. The typical examples of such data are streams containing radio-telescope and satellite observations. Their analysis, especially with real-time constraints, can be challenging and requires the usage of dedicated software components. We propose a solution for such parallel analysis using the supercomputer, supervised by the KASKADA platform, which, in conjunction with immersive 3D visualization techniques, can be used to solve problems such as pulsar detection and chronometric or oil-spill simulation on the sea surface.

  12. Computational fluid dynamics research at the United Technologies Research Center requiring supercomputers

    Science.gov (United States)

    Landgrebe, Anton J.

    1987-01-01

    An overview of research activities at the United Technologies Research Center (UTRC) in the area of Computational Fluid Dynamics (CFD) is presented. The requirement and use of various levels of computers, including supercomputers, for the CFD activities is described. Examples of CFD directed toward applications to helicopters, turbomachinery, heat exchangers, and the National Aerospace Plane are included. Helicopter rotor codes for the prediction of rotor and fuselage flow fields and airloads were developed with emphasis on rotor wake modeling. Airflow and airload predictions and comparisons with experimental data are presented. Examples are presented of recent parabolized Navier-Stokes and full Navier-Stokes solutions for hypersonic shock-wave/boundary layer interaction, and hydrogen/air supersonic combustion. In addition, other examples of CFD efforts in turbomachinery Navier-Stokes methodology and separated flow modeling are presented. A brief discussion of the 3-tier scientific computing environment is also presented, in which the researcher has access to workstations, mid-size computers, and supercomputers.

  13. Exploration and production environment. Preserving the future our responsibility; Exploration et production environnement. Preserver l'avenir: notre responsabilite

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    This document presents the Total Group commitments to manage natural resources in a rational way, to preserve biodiversity for future generations and protect the environment. It contains the health, safety, environment and quality charter of Total, the 12 exploration and production health, safety and environment rules and the exploration and production environmental policy. (A.L.B.)

  14. Exploration and production environment. Preserving the future our responsibility; Exploration et production environnement. Preserver l'avenir: notre responsabilite

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    This document presents the Total Group commitments to manage natural resources in a rational way, to preserve biodiversity for future generations and protect the environment. It contains the health, safety, environment and quality charter of Total, the 12 exploration and production health, safety and environment rules and the exploration and production environmental policy. (A.L.B.)

  15. Visualizing quantum scattering on the CM-2 supercomputer

    International Nuclear Information System (INIS)

    Richardson, J.L.

    1991-01-01

    We implement parallel algorithms for solving the time-dependent Schroedinger equation on the CM-2 supercomputer. These methods are unconditionally stable as well as unitary at each time step and have the advantage of being spatially local and explicit. We show how to visualize the dynamics of quantum scattering using techniques for visualizing complex wave functions. Several scattering problems are solved to demonstrate the use of these methods. (orig.)
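
    The unconditionally stable and unitary time stepping referred to here is typically achieved with a Cayley (Crank-Nicolson) form of the propagator. In a generic textbook formulation, not necessarily the exact discretization of this paper, the update is

        \left(1 + \frac{i\,\Delta t}{2\hbar} H\right) \psi^{\,n+1} = \left(1 - \frac{i\,\Delta t}{2\hbar} H\right) \psi^{\,n},

    whose update operator (1 + i\Delta t H/2\hbar)^{-1}(1 - i\Delta t H/2\hbar) is unitary for Hermitian H, so the norm of the wave function is preserved at every step regardless of the step size.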

  16. Development of seismic tomography software for hybrid supercomputers

    Science.gov (United States)

    Nikitin, Alexandr; Serdyukov, Alexandr; Duchkov, Anton

    2015-04-01

    Seismic tomography is a technique used for computing velocity model of geologic structure from first arrival travel times of seismic waves. The technique is used in processing of regional and global seismic data, in seismic exploration for prospecting and exploration of mineral and hydrocarbon deposits, and in seismic engineering for monitoring the condition of engineering structures and the surrounding host medium. As a consequence of development of seismic monitoring systems and increasing volume of seismic data, there is a growing need for new, more effective computational algorithms for use in seismic tomography applications with improved performance, accuracy and resolution. To achieve this goal, it is necessary to use modern high performance computing systems, such as supercomputers with hybrid architecture that use not only CPUs, but also accelerators and co-processors for computation. The goal of this research is the development of parallel seismic tomography algorithms and software package for such systems, to be used in processing of large volumes of seismic data (hundreds of gigabytes and more). These algorithms and software package will be optimized for the most common computing devices used in modern hybrid supercomputers, such as Intel Xeon CPUs, NVIDIA Tesla accelerators and Intel Xeon Phi co-processors. In this work, the following general scheme of seismic tomography is utilized. Using the eikonal equation solver, arrival times of seismic waves are computed based on assumed velocity model of geologic structure being analyzed. In order to solve the linearized inverse problem, tomographic matrix is computed that connects model adjustments with travel time residuals, and the resulting system of linear equations is regularized and solved to adjust the model. The effectiveness of parallel implementations of existing algorithms on target architectures is considered. During the first stage of this work, algorithms were developed for execution on
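
    The linearized inverse step outlined in this abstract, which connects travel-time residuals to model adjustments through the tomographic matrix and then regularizes the resulting system, is commonly written (in a generic, hedged notation rather than the authors' exact formulation) as

        G\,\Delta m = \Delta t, \qquad \Delta m = \arg\min_{\Delta m} \; \lVert G\,\Delta m - \Delta t \rVert_2^2 + \lambda^2 \lVert L\,\Delta m \rVert_2^2,

    where G holds the ray-path sensitivities computed from the eikonal solution, \Delta t the residuals between observed and computed first-arrival times, and the second term the regularization with smoothing operator L and weight \lambda.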

  17. High Performance Networks From Supercomputing to Cloud Computing

    CERN Document Server

    Abts, Dennis

    2011-01-01

    Datacenter networks provide the communication substrate for large parallel computer systems that form the ecosystem for high performance computing (HPC) systems and modern Internet applications. The design of new datacenter networks is motivated by an array of applications ranging from communication intensive climatology, complex material simulations and molecular dynamics to such Internet applications as Web search, language translation, collaborative Internet applications, streaming video and voice-over-IP. For both Supercomputing and Cloud Computing the network enables distributed applicati

  18. Intelligent Personal Supercomputer for Solving Scientific and Technical Problems

    Directory of Open Access Journals (Sweden)

    Khimich, O.M.

    2016-09-01

    A new domestic intelligent personal supercomputer of hybrid architecture, Inparkom_pg, was developed for the mathematical modeling of processes in the defense industry, engineering, construction, etc. Intelligent software was designed for the automatic investigation of problems of computational mathematics with approximate data of different structures. Applied software was implemented to support mathematical modeling problems in construction, welding and filtration processes.

  19. Cellular-automata supercomputers for fluid-dynamics modeling

    International Nuclear Information System (INIS)

    Margolus, N.; Toffoli, T.; Vichniac, G.

    1986-01-01

    We report recent developments in the modeling of fluid dynamics, and give experimental results (including dynamical exponents) obtained using cellular automata machines. Because of their locality and uniformity, cellular automata lend themselves to an extremely efficient physical realization; with a suitable architecture, an amount of hardware resources comparable to that of a home computer can achieve (in the simulation of cellular automata) the performance of a conventional supercomputer
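
    For orientation, the sketch below implements the simplest lattice-gas cellular automaton of this family (the HPP model) on a periodic grid; its local collision and streaming rules illustrate why such updates map so naturally onto uniform, local hardware. It is a generic example, not the rule set or machine used in the work above.

```python
import numpy as np

# One HPP lattice-gas step on a periodic grid.
def hpp_step(cells):
    # cells: boolean array (4, H, W); channels 0=E, 1=W, 2=N, 3=S
    e, w, n, s = cells
    # head-on collisions: an E+W pair with no N/S becomes N+S, and vice versa
    ew = e & w & ~n & ~s
    ns = n & s & ~e & ~w
    e, w = (e & ~ew) | ns, (w & ~ew) | ns
    n, s = (n & ~ns) | ew, (s & ~ns) | ew
    # streaming: every particle moves one site along its channel direction
    return np.stack([np.roll(e, 1, axis=1), np.roll(w, -1, axis=1),
                     np.roll(n, -1, axis=0), np.roll(s, 1, axis=0)])

rng = np.random.default_rng(0)
lattice = rng.random((4, 128, 128)) < 0.2   # random initial occupation
for _ in range(100):
    lattice = hpp_step(lattice)
print("particle count conserved:", int(lattice.sum()))
```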

  20. Porting Ordinary Applications to Blue Gene/Q Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Maheshwari, Ketan C.; Wozniak, Justin M.; Armstrong, Timothy; Katz, Daniel S.; Binkowski, T. Andrew; Zhong, Xiaoliang; Heinonen, Olle; Karpeyev, Dmitry; Wilde, Michael

    2015-08-31

    Efficiently porting ordinary applications to Blue Gene/Q supercomputers is a significant challenge. Codes are often originally developed without considering advanced architectures and related tool chains. Science needs frequently lead users to want to run large numbers of relatively small jobs (often called many-task computing, an ensemble, or a workflow), which can conflict with supercomputer configurations. In this paper, we discuss techniques developed to execute ordinary applications over leadership-class supercomputers. We use the high-performance Swift parallel scripting framework and build two workflow execution techniques: sub-jobs and main-wrap. The sub-jobs technique, built on top of the IBM Blue Gene/Q resource manager Cobalt's sub-block jobs, lets users submit multiple, independent, repeated smaller jobs within a single larger resource block. The main-wrap technique is a scheme that enables C/C++ programs to be defined as functions that are wrapped by a high-performance Swift wrapper and that are invoked as a Swift script. We discuss the needs, benefits, technicalities, and current limitations of these techniques. We further discuss the real-world science enabled by these techniques and the results obtained.
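
    As a hedged, generic illustration of the many-task pattern described above (many small independent runs packed into one allocation), the sketch below fans a set of parameter cases out over a local process pool; it does not use Swift or Cobalt sub-block jobs, and the per-case workload is a placeholder computation.

```python
from concurrent.futures import ProcessPoolExecutor

# Each call stands in for one small, independent simulation or analysis run;
# a real ensemble would launch an application binary or call into its library.
def run_case(case_id):
    return case_id, sum(i * i for i in range(10000 + case_id)) % 97

if __name__ == "__main__":
    cases = range(64)                                  # 64 small tasks
    with ProcessPoolExecutor(max_workers=8) as pool:   # one "resource block"
        results = dict(pool.map(run_case, cases))
    print(len(results), "cases completed")
```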

  1. Proceedings of the first energy research power supercomputer users symposium

    International Nuclear Information System (INIS)

    1991-01-01

    The Energy Research Power Supercomputer Users Symposium was arranged to showcase the richness of science that has been pursued and accomplished in this program through the use of supercomputers and now high performance parallel computers over the last year: this report is the collection of the presentations given at the Symposium. "Power users" were invited by the ER Supercomputer Access Committee to show that the use of these computational tools and the associated data communications network, ESNet, go beyond merely speeding up computations. Today the work often directly contributes to the advancement of the conceptual developments in their fields and the computational and network resources form the very infrastructure of today's science. The Symposium also provided an opportunity, which is rare in this day of network access to computing resources, for the invited users to compare and discuss their techniques and approaches with those used in other ER disciplines. The significance of new parallel architectures was highlighted by the interesting evening talk given by Dr. Stephen Orszag of Princeton University

  2. Extracting the Textual and Temporal Structure of Supercomputing Logs

    Energy Technology Data Exchange (ETDEWEB)

    Jain, S; Singh, I; Chandra, A; Zhang, Z; Bronevetsky, G

    2009-05-26

    Supercomputers are prone to frequent faults that adversely affect their performance, reliability and functionality. System logs collected on these systems are a valuable resource of information about their operational status and health. However, their massive size, complexity, and lack of standard format makes it difficult to automatically extract information that can be used to improve system management. In this work we propose a novel method to succinctly represent the contents of supercomputing logs, by using textual clustering to automatically find the syntactic structures of log messages. This information is used to automatically classify messages into semantic groups via an online clustering algorithm. Further, we describe a methodology for using the temporal proximity between groups of log messages to identify correlated events in the system. We apply our proposed methods to two large, publicly available supercomputing logs and show that our technique features nearly perfect accuracy for online log-classification and extracts meaningful structural and temporal message patterns that can be used to improve the accuracy of other log analysis techniques.
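
    The sketch below illustrates, on a few made-up log lines, the two ingredients described above: recovering a syntactic template for each message by masking variable fields, and flagging groups whose occurrences fall close together in time. The masking rules, window width and sample messages are assumptions for illustration, not the paper's algorithm.

```python
import re
from collections import defaultdict

def template(msg):
    # mask variable fields so syntactically identical messages collapse together
    msg = re.sub(r"0x[0-9a-fA-F]+", "<HEX>", msg)
    return re.sub(r"\d+", "<NUM>", msg)

log = [  # (timestamp in seconds, raw message) -- made-up examples
    (10.0, "node 12 memory error at 0x3fa2"),
    (10.2, "node 47 memory error at 0x99b0"),
    (10.3, "link 3 retransmit count 250"),
    (55.0, "node 12 memory error at 0x3fa4"),
]

groups = defaultdict(list)
for t, msg in log:
    groups[template(msg)].append(t)

# two message groups are "correlated" if any of their events are within 1 s
window, templates = 1.0, list(groups)
for i, a in enumerate(templates):
    for b in templates[i + 1:]:
        if any(abs(ta - tb) <= window for ta in groups[a] for tb in groups[b]):
            print("correlated:", a, "<->", b)
```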

  3. Planning, Implementation and Optimization of Future Space Missions using an Immersive Visualization Environment (IVE) Machine

    Science.gov (United States)

    Harris, E.

    Planning, Implementation and Optimization of Future Space Missions using an Immersive Visualization Environment (IVE) Machine E. N. Harris, Lockheed Martin Space Systems, Denver, CO and George.W. Morgenthaler, U. of Colorado at Boulder History: A team of 3-D engineering visualization experts at the Lockheed Martin Space Systems Company have developed innovative virtual prototyping simulation solutions for ground processing and real-time visualization of design and planning of aerospace missions over the past 6 years. At the University of Colorado, a team of 3-D visualization experts are developing the science of 3-D visualization and immersive visualization at the newly founded BP Center for Visualization, which began operations in October, 2001. (See IAF/IAA-01-13.2.09, "The Use of 3-D Immersive Visualization Environments (IVEs) to Plan Space Missions," G. A. Dorn and G. W. Morgenthaler.) Progressing from Today's 3-D Engineering Simulations to Tomorrow's 3-D IVE Mission Planning, Simulation and Optimization Techniques: 3-D (IVEs) and visualization simulation tools can be combined for efficient planning and design engineering of future aerospace exploration and commercial missions. This technology is currently being developed and will be demonstrated by Lockheed Martin in the (IVE) at the BP Center using virtual simulation for clearance checks, collision detection, ergonomics and reach-ability analyses to develop fabrication and processing flows for spacecraft and launch vehicle ground support operations and to optimize mission architecture and vehicle design subject to realistic constraints. Demonstrations: Immediate aerospace applications to be demonstrated include developing streamlined processing flows for Reusable Space Transportation Systems and Atlas Launch Vehicle operations and Mars Polar Lander visual work instructions. Long-range goals include future international human and robotic space exploration missions such as the development of a Mars

  4. The radioactive risk - the future of radionuclides in the environment and their impacts on health

    International Nuclear Information System (INIS)

    Amiard, Jean-Claude

    2013-01-01

    This document contains a brief presentation and the table of contents of a book in which the author proposes a large synthesis of present knowledge on main radioactive pollutants (uranium, transuranic elements, caesium, strontium, iodine, tritium, carbon radioactive isotopes, and so on), their behaviour and their future in the various physical components of the environment and living organisms (including mankind). He presents the fundamentals of nuclear physics and chemistry, as well as their applications in different fields (military, energy, medicine, industry, etc.). He also addresses the important ecological and genetic notions, and recalls the anthropogenic origins of radionuclides in the environment: principles of radio-ecology, main radioactive risks, main drawbacks of the use of nuclear energy (wastes and their management), and nuclear accidents and their impact

  5. Robots, multi-user virtual environments and healthcare: synergies for future directions.

    Science.gov (United States)

    Moon, Ajung; Grajales, Francisco J; Van der Loos, H F Machiel

    2011-01-01

    The adoption of technology in healthcare over the last twenty years has steadily increased, particularly as it relates to medical robotics and Multi-User Virtual Environments (MUVEs) such as Second Life. Both disciplines have been shown to improve the quality of care and have evolved, for the most part, in isolation from each other. In this paper, we present four synergies between medical robotics and MUVEs that have the potential to decrease resource utilization and improve the quality of healthcare delivery. We conclude with some foreseeable barriers and future research directions for researchers in these fields.

  6. The present and future of microplastic pollution in the marine environment

    International Nuclear Information System (INIS)

    Ivar do Sul, Juliana A.; Costa, Monica F.

    2014-01-01

    Recently, research examining the occurrence of microplastics in the marine environment has substantially increased. Field and laboratory work regularly provide new evidence on the fate of microplastic debris. This debris has been observed within every marine habitat. In this study, at least 101 peer-reviewed papers investigating microplastic pollution were critically analysed (Supplementary material). Microplastics are commonly studied in relation to (1) plankton samples, (2) sandy and muddy sediments, (3) vertebrate and invertebrate ingestion, and (4) chemical pollutant interactions. All of the marine organism groups are at imminent risk of interacting with microplastics according to the available literature. Dozens of works on other relevant issues (i.e., polymer decay at sea, new sampling and laboratory methods, emerging sources, externalities) were also analysed and discussed. This paper provides the first in-depth exploration of the effects of microplastics on the marine environment and biota. The number of scientific publications will increase in response to present and projected plastic uses and discard patterns. Therefore, new themes and important approaches for future work are proposed. Highlights: • >100 works on microplastic marine pollution were reviewed and discussed. • Microplastics (fibres, fragments, pellets) are widespread in oceans and sediments. • Microplastics interact with POPs and contaminate the marine biota when ingested. • The marine food web might be affected by microplastic biomagnification. • Urgently needed integrated approaches are suggested to different stakeholders. -- Microplastics, which are ubiquitous in marine habitats, affect all facets of the environment and continuously cause unexpected consequences for the environment and its biota

  7. Strategic planning for future learning environments: an exploration of interpersonal, interprofessional and political factors.

    Science.gov (United States)

    Schmidt, Cathrine

    2013-09-01

    This article, written from the stance of a public planner and a policy maker, explores the challenges and potential in creating future learning environments through the concept of a new learning landscape. It is based on the belief that physical planning can support the strategic goals of universities. In Denmark, a political focus on education as a means to improve national capacity for innovation and growth is redefining the universities' role in society. This is in turn changing the circumstances for the physical planning. Drawing on examples of physical initiatives in three different scales--city, building and room scale, the paper highlights how space and place matter on an interpersonal, an interprofessional and a political level. The article suggests that a wider understanding of how new learning landscapes are created--both as a material reality and a political discourse--can help frame an emerging community of practice. This involves university leaders, faculty and students, architects, designers and urban planners, citizens and policy makers with the common goal of creating future learning environments today.

  8. IQ, the Urban Environment, and Their Impact on Future Schizophrenia Risk in Men.

    Science.gov (United States)

    Toulopoulou, Timothea; Picchioni, Marco; Mortensen, Preben Bo; Petersen, Liselotte

    2017-09-01

    Exposure to an urban environment during early life and low IQ are 2 well-established risk factors for schizophrenia. It is not known, however, how these factors might relate to one another. Data were pooled from the North Jutland regional draft board IQ assessments and the Danish Conscription Registry for men born between 1955 and 1993. Excluding those who were followed up for less than 1 year after the assessment yielded a final cohort of 153,170 men, of whom 578 later developed a schizophrenia spectrum disorder. We found significant effects of having an urban birth, and also experiencing an increase in urbanicity before the age of 10 years, on adult schizophrenia risk. The effect of urban birth was independent of IQ. However, there was a significant interaction between childhood changes in urbanization in the first 10 years and IQ level on the future adult schizophrenia risk. In short, those subjects who moved to more or less urban areas before their 10th birthday lost the protective effect of IQ. When thinking about adult schizophrenia risk, the critical time window of childhood sensitivity to changes in urbanization seems to be linked to IQ. Given the prediction that by 2050, over 80% of the developed world's population will live in an urban environment, this represents a major future public health issue. © The Author 2017. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  9. Debating the future of comfort: environmental sustainability, energy consumption and the indoor environment

    Energy Technology Data Exchange (ETDEWEB)

    Chappells, H.; Shove, E.

    2005-02-01

    Vast quantities of energy are consumed in heating and cooling to provide what are now regarded as acceptable standards of thermal comfort. In the UK as in a number of other countries, there is a real danger that responses in anticipation of global warming and climate change - including growing reliance on air-conditioning - will increase energy demand and CO2 emissions even further. This is an appropriate moment to reflect on the history and future of comfort, both as an idea and as a material reality. Based on interviews and discussions with UK policy makers and building practitioners involved in specifying and constructing what will become the indoor environments of the future, four possible scenarios are identified, each with different implications for energy and resource consumption. By actively promoting debate about the indoor environment and associated ways of life, it may yet be possible to avoid becoming locked into social and technical trajectories that are ultimately unsustainable. The aim of this paper is to inspire and initiate just such a discussion through demonstrating that comfort is a highly negotiable socio-cultural construct. (author)

  10. Predicting the future impact of droughts on ungulate populations in arid and semi-arid environments.

    Directory of Open Access Journals (Sweden)

    Clare Duncan

    Droughts can have a severe impact on the dynamics of animal populations, particularly in semi-arid and arid environments where herbivore populations are strongly limited by resource availability. Increased drought intensity under projected climate change scenarios can be expected to reduce the viability of such populations, yet this impact has seldom been quantified. In this study, we aim to fill this gap and assess how the predicted worsening of droughts over the 21st century is likely to impact the population dynamics of twelve ungulate species occurring in arid and semi-arid habitats. Our results provide support to the hypotheses that more sedentary, grazing and mixed feeding species will be put at high risk from future increases in drought intensity, suggesting that management intervention under these conditions should be targeted towards species possessing these traits. Predictive population models for all sedentary, grazing or mixed feeding species in our study show that their probability of extinction dramatically increases under future emissions scenarios, and that this extinction risk is greater for smaller populations than larger ones. Our study highlights the importance of quantifying the current and future impacts of increasing extreme natural events on populations and species in order to improve our ability to mitigate predicted biodiversity loss under climate change.

  11. Passive BCI in Operational Environments: Insights, Recent Advances, and Future Trends.

    Science.gov (United States)

    Arico, Pietro; Borghini, Gianluca; Di Flumeri, Gianluca; Sciaraffa, Nicolina; Colosimo, Alfredo; Babiloni, Fabio

    2017-07-01

    This minireview aims to highlight recent important aspects to consider and evaluate when passive brain-computer interface (pBCI) systems are developed and used in operational environments, and outlines future directions for their applications. Electroencephalography (EEG) based pBCI has become an important tool for real-time analysis of brain activity since it could potentially provide, covertly (without distracting the user from the main task) and objectively (not affected by the subjective judgment of an observer or of the user), information about the operator's cognitive state. Different examples of pBCI applications in operational environments and new adaptive interface solutions have been presented and described. In addition, a general overview regarding the correct use of machine learning techniques (e.g., which algorithm to use, common pitfalls to avoid, etc.) in the pBCI field has been provided. Despite recent innovations in algorithms and neurotechnology, pBCI systems are not completely ready to enter the market yet, mainly due to limitations of EEG electrode technology and to the reliability and capability of the algorithms in real settings. High-complexity and safety-critical systems (e.g., airplanes, ATM interfaces) should adapt their behaviors and functionality according to the user's actual mental state. Thus, technologies (i.e., pBCIs) able to measure the user's mental states in real time would prove very useful in such "high risk" environments to enhance human-machine interaction, and so increase the overall safety.

  12. Applications of supercomputing and the utility industry: Calculation of power transfer capabilities

    International Nuclear Information System (INIS)

    Jensen, D.D.; Behling, S.R.; Betancourt, R.

    1990-01-01

    Numerical models and iterative simulation using supercomputers can furnish cost-effective answers to utility industry problems that are all but intractable using conventional computing equipment. An example of the use of supercomputers by the utility industry is the determination of power transfer capability limits for power transmission systems. This work has the goal of markedly reducing the run time of transient stability codes used to determine power distributions following major system disturbances. To date, run times of several hours on a conventional computer have been reduced to several minutes on state-of-the-art supercomputers, with further improvements anticipated to reduce run times to less than a minute. In spite of the potential advantages of supercomputers, few utilities have sufficient need for a dedicated in-house supercomputing capability. This problem is resolved using a supercomputer center serving a geographically distributed user base coupled via high speed communication networks
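
    A transient stability run boils down to integrating the machines' swing equations through a disturbance and checking that rotor angles stay bounded; the toy single-machine, infinite-bus sketch below shows the idea. All parameter values, the fault model and the simple Euler integrator are illustrative assumptions, far removed from a production stability code.

```python
import numpy as np

# Swing equation for one machine against an infinite bus:
#   M * domega/dt = Pm - Pmax * sin(delta) - D * omega,   ddelta/dt = omega
M, D, Pm = 0.1, 0.05, 0.8          # inertia, damping, mechanical power (p.u.)
Pmax_fault, Pmax_post = 0.3, 1.2   # electrical power limit during / after fault
t_clear, dt, t_end = 0.15, 0.001, 5.0

delta, omega = np.arcsin(Pm / Pmax_post), 0.0   # pre-disturbance equilibrium
for step in range(int(t_end / dt)):
    t = step * dt
    Pmax = Pmax_fault if t < t_clear else Pmax_post
    domega = (Pm - Pmax * np.sin(delta) - D * omega) / M
    delta += dt * omega
    omega += dt * domega

print("stable" if abs(delta) < np.pi else "unstable", f"(final angle {delta:.2f} rad)")
```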

  13. Reliability Lessons Learned From GPU Experience With The Titan Supercomputer at Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Gallarno, George [Christian Brothers University]; Rogers, James H [ORNL]; Maxwell, Don E [ORNL]

    2015-01-01

    The high computational capability of graphics processing units (GPUs) is enabling and driving the scientific discovery process at large scale. The world's second fastest supercomputer for open science, Titan, has more than 18,000 GPUs that computational scientists use to perform scientific simulations and data analysis. Understanding of GPU reliability characteristics, however, is still in its nascent stage since GPUs have only recently been deployed at large scale. This paper presents a detailed study of GPU errors and their impact on system operations and applications, describing experiences with the 18,688 GPUs on the Titan supercomputer as well as lessons learned in the process of efficient operation of GPUs at scale. These experiences are helpful to HPC sites which already have large-scale GPU clusters or plan to deploy GPUs in the future.

  14. EDF's experience with supercomputing and challenges ahead - towards multi-physics and multi-scale approaches

    International Nuclear Information System (INIS)

    Delbecq, J.M.; Banner, D.

    2003-01-01

    Nuclear power plants are a major asset of the EDF company. To remain so, in particular in a context of deregulation, competitiveness, safety and public acceptance are three conditions. These stakes apply both to existing plants and to future reactors. The purpose of the presentation is to explain how supercomputing can help EDF to satisfy these requirements. Three examples are described in detail: ensuring optimal use of nuclear fuel under wholly safe conditions, understanding and simulating the material deterioration mechanisms and moving forward with numerical simulation for the performance of EDF's activities. In conclusion, a broader vision of EDF long term R and D in the field of numerical simulation is given and especially of five challenges taken up by EDF together with its industrial and scientific partners. (author)

  15. Dust modelling and forecasting in the Barcelona Supercomputing Center: Activities and developments

    Energy Technology Data Exchange (ETDEWEB)

    Perez, C; Baldasano, J M; Jimenez-Guerrero, P; Jorba, O; Haustein, K; Basart, S [Earth Sciences Department. Barcelona Supercomputing Center. Barcelona (Spain)]; Cuevas, E [Izaña Atmospheric Research Center. Agencia Estatal de Meteorologia, Tenerife (Spain)]; Nickovic, S [Atmospheric Research and Environment Branch, World Meteorological Organization, Geneva (Switzerland)], E-mail: carlos.perez@bsc.es

    2009-03-01

    The Barcelona Supercomputing Center (BSC) is the National Supercomputer Facility in Spain, hosting MareNostrum, one of the most powerful Supercomputers in Europe. The Earth Sciences Department of BSC operates daily regional dust and air quality forecasts and conducts intensive modelling research for short-term operational prediction. This contribution summarizes the latest developments and current activities in the field of sand and dust storm modelling and forecasting.

  16. Dust modelling and forecasting in the Barcelona Supercomputing Center: Activities and developments

    International Nuclear Information System (INIS)

    Perez, C; Baldasano, J M; Jimenez-Guerrero, P; Jorba, O; Haustein, K; Basart, S; Cuevas, E; Nickovic, S

    2009-01-01

    The Barcelona Supercomputing Center (BSC) is the National Supercomputer Facility in Spain, hosting MareNostrum, one of the most powerful Supercomputers in Europe. The Earth Sciences Department of BSC operates daily regional dust and air quality forecasts and conducts intensive modelling research for short-term operational prediction. This contribution summarizes the latest developments and current activities in the field of sand and dust storm modelling and forecasting.

  17. Japan Environment and Children's Study: backgrounds, activities, and future directions in global perspectives.

    Science.gov (United States)

    Ishitsuka, Kazue; Nakayama, Shoji F; Kishi, Reiko; Mori, Chisato; Yamagata, Zentaro; Ohya, Yukihiro; Kawamoto, Toshihiro; Kamijima, Michihiro

    2017-07-14

    There is worldwide concern about the effects of environmental factors on children's health and development. The Miami Declaration was signed at the G8 Environment Ministers Meeting in 1997 to promote children's environmental health research. The following ministerial meetings continued to emphasize the need to foster children's research. In response to such a worldwide movement, the Ministry of the Environment, Japan (MOE), launched a nationwide birth cohort study with 100,000 pairs of mothers and children, namely, the Japan Environment and Children's Study (JECS), in 2010. Other countries have also started or planned large-scale studies focusing on children's environmental health issues. The MOE initiated dialogue among those countries and groups to discuss and share the various processes, protocols, knowledge, and techniques for future harmonization and data pooling among such studies. The MOE formed the JECS International Liaison Committee in 2011, which plays a primary role in promoting the international collaboration between JECS and the other children's environmental health research projects and partnership with other countries. This review article aims to present activities that JECS has developed. As one of the committee's activities, a workshop and four international symposia were held between 2011 and 2015 in Japan. In these conferences, international researchers and government officials, including those from the World Health Organization, have made presentations on their own birth cohort studies and health policies. In 2015, the MOE hosted the International Advisory Board meeting and received constructive comments and recommendations from the board. JECS is a founding member of the Environment and Child Health International Birth Cohort Group, and has discussed harmonization of exposure and outcome measurements with member parties, which will make it possible to compare and further combine data from different studies, considering the diversity in the

  18. Palacios and Kitten : high performance operating systems for scalable virtualized and native supercomputing.

    Energy Technology Data Exchange (ETDEWEB)

    Widener, Patrick (University of New Mexico); Jaconette, Steven (Northwestern University); Bridges, Patrick G. (University of New Mexico); Xia, Lei (Northwestern University); Dinda, Peter (Northwestern University); Cui, Zheng.; Lange, John (Northwestern University); Hudson, Trammell B.; Levenhagen, Michael J.; Pedretti, Kevin Thomas Tauke; Brightwell, Ronald Brian

    2009-09-01

    Palacios and Kitten are new open source tools that enable applications, whether ported or not, to achieve scalable high performance on large machines. They provide a thin layer over the hardware to support both full-featured virtualized environments and native code bases. Kitten is an OS under development at Sandia that implements a lightweight kernel architecture to provide predictable behavior and increased flexibility on large machines, while also providing Linux binary compatibility. Palacios is a VMM that is under development at Northwestern University and the University of New Mexico. Palacios, which can be embedded into Kitten and other OSes, supports existing, unmodified applications and operating systems by using virtualization that leverages hardware technologies. We describe the design and implementation of both Kitten and Palacios. Our benchmarks show that they provide near native, scalable performance. Palacios and Kitten provide an incremental path to using supercomputer resources that is not performance-compromised.

  19. Research to application: Supercomputing trends for the 90's - Opportunities for interdisciplinary computations

    International Nuclear Information System (INIS)

    Shankar, V.

    1991-01-01

    The progression of supercomputing is reviewed from the point of view of computational fluid dynamics (CFD), and multidisciplinary problems impacting the design of advanced aerospace configurations are addressed. The application of full potential and Euler equations to transonic and supersonic problems in the 70s and early 80s is outlined, along with Navier-Stokes computations widespread during the late 80s and early 90s. Multidisciplinary computations currently in progress are discussed, including CFD and aeroelastic coupling for both static and dynamic flexible computations, CFD, aeroelastic, and controls coupling for flutter suppression and active control, and the development of a computational electromagnetics technology based on CFD methods. Attention is given to computational challenges standing in the way of establishing a computational environment including many technologies. 40 refs

  20. Cyber warfare and electronic warfare integration in the operational environment of the future: cyber electronic warfare

    Science.gov (United States)

    Askin, Osman; Irmak, Riza; Avsever, Mustafa

    2015-05-01

    For states with advanced technology, the effective use of electronic warfare and cyber warfare will be the main determining factor in winning a war in the operational environment of the future. Thanks to high technology, developed states will be able to conclude the struggles they have entered with a minimum of human casualties and at minimum cost. Considering the increasing number of world economic problems and the development of human rights and humanitarian law, it is easy to understand the importance of minimum cost and minimum loss of human life. In this paper, cyber warfare and electronic warfare concepts are examined in conjunction with their historical development, and the relationship between them is explained. Finally, assessments were carried out about the use of cyber electronic warfare in the coming years.

  1. New computing systems, future computing environment, and their implications on structural analysis and design

    Science.gov (United States)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  2. Protecting the environment for future generations. Principles and actors in international environmental law

    Energy Technology Data Exchange (ETDEWEB)

    Proelss, Alexander (ed.) [Trier Univ. (Germany). Inst. of Environmental and Technology Law]

    2017-08-01

    This book compiles the written versions of presentations held at the occasion of an international symposium entitled "Protecting the Environment for Future Generations - Principles and Actors in International Environmental Law". The symposium was organized by the Institute of Environmental and Technology Law of Trier University (IUTR) on the basis of a cooperation scheme with the Environmental Law Institute of the Johannes Kepler University Linz, Austria, and took place in Trier on 29-30 October 2015. It brought together a distinguished group of experts from Europe and abroad to address current issues of international and European environmental law. The main objective of the symposium was to take stock of the actors and principles of international and European environmental law, and to analyze how and to what extent these principles have been implemented on the supranational and domestic legal levels.

  3. Supercomputers and the mathematical modeling of high complexity problems

    International Nuclear Information System (INIS)

    Belotserkovskii, Oleg M

    2010-01-01

    This paper is a review of many works carried out by members of our scientific school in past years. The general principles of constructing numerical algorithms for high-performance computers are described. Several techniques are highlighted and these are based on the method of splitting with respect to physical processes and are widely used in computing nonlinear multidimensional processes in fluid dynamics, in studies of turbulence and hydrodynamic instabilities and in medicine and other natural sciences. The advances and developments related to the new generation of high-performance supercomputing in Russia are presented.

  4. Performance Evaluation of Supercomputers using HPCC and IMB Benchmarks

    Science.gov (United States)

    Saini, Subhash; Ciotti, Robert; Gunney, Brian T. N.; Spelce, Thomas E.; Koniges, Alice; Dossa, Don; Adamidis, Panagiotis; Rabenseifner, Rolf; Tiyyagura, Sunil R.; Mueller, Matthias

    2006-01-01

    The HPC Challenge (HPCC) benchmark suite and the Intel MPI Benchmark (IMB) are used to compare and evaluate the combined performance of processor, memory subsystem and interconnect fabric of five leading supercomputers - SGI Altix BX2, Cray X1, Cray Opteron Cluster, Dell Xeon cluster, and NEC SX-8. These five systems use five different networks (SGI NUMALINK4, Cray network, Myrinet, InfiniBand, and NEC IXS). The complete set of HPCC benchmarks is run on each of these systems. Additionally, we present Intel MPI Benchmarks (IMB) results to study the performance of 11 MPI communication functions on these systems.
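
    For a feel of what one HPCC component measures, the sketch below times a STREAM-style triad on a single node to estimate sustainable memory bandwidth; it is a rough, hedged illustration only (NumPy's temporary array inflates the true traffic) and says nothing about the MPI interconnect tests that HPCC and IMB add.

```python
import time
import numpy as np

n = 20_000_000                       # ~160 MB per array, well beyond cache
b, c = np.random.rand(n), np.random.rand(n)
a = np.empty(n)

t0 = time.perf_counter()
np.add(b, 2.5 * c, out=a)            # triad: a = b + alpha * c
elapsed = time.perf_counter() - t0

# STREAM counts 3 * 8 bytes moved per element (read b, read c, write a)
print(f"triad bandwidth ~ {3 * n * 8 / elapsed / 1e9:.1f} GB/s")
```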

  5. A fast random number generator for the Intel Paragon supercomputer

    Science.gov (United States)

    Gutbrod, F.

    1995-06-01

    A pseudo-random number generator is presented which makes optimal use of the architecture of the i860 microprocessor and which is expected to have a very long period. It is therefore a good candidate for use on the parallel supercomputer Paragon XP. In the assembler version, it needs 6.4 cycles for a real*4 random number. There is a FORTRAN routine which yields identical numbers up to rare and minor rounding discrepancies, and it needs 28 cycles. The FORTRAN performance on other microprocessors is somewhat better. Arguments for the quality of the generator and some numerical tests are given.
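
    The abstract does not spell out the algorithm, so the sketch below shows a generic additive lagged-Fibonacci generator of the kind often favoured on early RISC microprocessors for its speed and long period; it is an illustration of the idea, not the i860 generator of the paper, and the lags and seeding scheme are assumptions.

```python
# Additive lagged-Fibonacci generator: x_n = (x_{n-55} + x_{n-24}) mod 2^32.
class LaggedFibonacci:
    P, Q, MOD = 55, 24, 2**32

    def __init__(self, seed=12345):
        # fill the lag table with a simple linear congruential sequence
        state, self.buf = seed, []
        for _ in range(self.P):
            state = (1664525 * state + 1013904223) % self.MOD
            self.buf.append(state)
        self.i = 0

    def next_uniform(self):
        j = (self.i - self.Q) % self.P
        self.buf[self.i] = (self.buf[self.i] + self.buf[j]) % self.MOD
        value = self.buf[self.i]
        self.i = (self.i + 1) % self.P
        return value / self.MOD          # real number in [0, 1)

rng = LaggedFibonacci()
print([round(rng.next_uniform(), 4) for _ in range(4)])
```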

  6. Review of the ASDEX upgrade data acquisition environment - present operation and future requirements

    International Nuclear Information System (INIS)

    Behler, K.; Blank, H.; Buhler, A.; Drube, R.; Friedrich, H.; Foerster, K.; Hallatschek, K.; Heimann, P.; Hertweck, F.; Maier, J.; Merkel, R.; Pacco-Duechs, M.-G.; Raupp, G.; Reuter, H.; Schneider-Maxon, U.; Tisma, R.; Zilker, M.

    1999-01-01

    The data acquisition environment of the ASDEX upgrade fusion experiment was designed in the late 1980s to handle a predicted quantity of 8 Mbytes of data per discharge. After 7 years of operation a review of the whole data acquisition and analysis environment shows what remains of the original design ideas. Comparing the original 15 diagnostics with the present set of 250 diagnostic datasets generated per shot shows how the system has grown. Although now a vast accumulation of functional parts, the system still works in a stable manner and is maintainable. The underlying concepts affirming these qualities are modularity and compatibility. Modularity ensures that most parts of the system can be modified without affecting others. Standards for data structures and interfaces between components and methods are the prerequisites which make modularity work. The experience of the last few years shows that, besides the standards achieved, new, mainly real-time, features are needed: real-time event recognition allowing reaction to complex changing conditions; real-time wavelet analysis allowing adapted sampling rates; real-time data exchange between diagnostics and control; real-time networks allowing flexible computer coupling to permit interplay between different components; object-oriented programming concepts and databases are required for readily adaptable software modules. A final assessment of our present data processing situation and future requirements shows that modern information technology methods have to be applied more intensively to provide the most flexible means to improve the interaction of all components on a large fusion device. (orig.)

  7. The present and future of microplastic pollution in the marine environment.

    Science.gov (United States)

    Ivar do Sul, Juliana A; Costa, Monica F

    2014-02-01

    Recently, research examining the occurrence of microplastics in the marine environment has substantially increased. Field and laboratory work regularly provide new evidence on the fate of microplastic debris. This debris has been observed within every marine habitat. In this study, at least 101 peer-reviewed papers investigating microplastic pollution were critically analysed (Supplementary material). Microplastics are commonly studied in relation to (1) plankton samples, (2) sandy and muddy sediments, (3) vertebrate and invertebrate ingestion, and (4) chemical pollutant interactions. All of the marine organism groups are at imminent risk of interacting with microplastics according to the available literature. Dozens of works on other relevant issues (i.e., polymer decay at sea, new sampling and laboratory methods, emerging sources, externalities) were also analysed and discussed. This paper provides the first in-depth exploration of the effects of microplastics on the marine environment and biota. The number of scientific publications will increase in response to present and projected plastic uses and discard patterns. Therefore, new themes and important approaches for future work are proposed. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. ASPECTS OF THE PROFESSIONAL DEVELOPMENT OF A FUTURE TEACHER OF PHYSICAL CULTURE IN THE INFORMATIONAL AND EDUCATIONAL ENVIRONMENT OF A HIGHER EDUCATIONAL ESTABLISHMENT

    OpenAIRE

    Yuriy V. Dragnev

    2011-01-01

    The article examines aspects of the professional development of future teachers of physical culture in the informational and educational environment of a higher educational establishment. It shows the importance of introducing information and telecommunication technologies in the sphere of higher education, presents the components of the informational and educational environment, and explains the concepts of "professional development" and "informational and educational environment". It is specified that informative su...

  9. A future data environment - reusability vs. citability and synchronisation vs. ingestion

    Science.gov (United States)

    Fleischer, D.

    2012-04-01

    During the last decades, data managers dedicated their work to the pursuit of importable data. In recent years this chase seems to be coming to an end, as funding organisations assume that the approach of data publications with citable data sets will eliminate scientists' reluctance to commit their data. But is this true for all the problems we are facing at the edge of a data avalanche and data-intensive science? The concept of citable data is a logical consequence of connecting these points. In the past, potential data providers usually complained about the lack of credit assignment for data providers, and they still do. The selected way of DOI-captured data sets fits perfectly into the credit system of publisher-driven publications with countable citations. This system has been well known to scientists for approximately 400 years now. Unfortunately, there is a double-bind situation between citability and reusability. While cooperations between publishers and data archives are coming into existence, it is necessary to get one question clear: "Is it really worthwhile in the twenty-first century to force data into the publication process of the seventeenth century?" Data publications enable easy citability, but do not support easy data reusability for future users. Additional problems occur in such an environment when taking into account the possibilities of collaborative data corrections in the institutional repository. A future with huge amounts of data connected with publications makes reconsideration towards a more integrated approach reasonable. In the past, data archives were the only infrastructures taking care of long-term data retrievability and availability. Nevertheless, they were never a part of the scientific process from data creation through analysis and interpretation to publication. Data archives were regarded as isolated islands in the sea of scientific data. Accordingly, scientists considered data publications a stumbling stone in their daily routines and

  10. Plane-wave electronic structure calculations on a parallel supercomputer

    International Nuclear Information System (INIS)

    Nelson, J.S.; Plimpton, S.J.; Sears, M.P.

    1993-01-01

    The development of iterative solutions of Schrodinger's equation in a plane-wave (pw) basis over the last several years has coincided with great advances in the computational power available for performing the calculations. These dual developments have enabled many new and interesting condensed matter phenomena to be studied from a first-principles approach. The authors present a detailed description of the implementation on a parallel supercomputer (hypercube) of the first-order equation-of-motion solution to Schrodinger's equation, using plane-wave basis functions and ab initio separable pseudopotentials. By distributing the plane-waves across the processors of the hypercube many of the computations can be performed in parallel, resulting in decreases in the overall computation time relative to conventional vector supercomputers. This partitioning also provides ample memory for large Fast Fourier Transform (FFT) meshes and the storage of plane-wave coefficients for many hundreds of energy bands. The usefulness of the parallel techniques is demonstrated by benchmark timings for both the FFT's and iterations of the self-consistent solution of Schrodinger's equation for different sized Si unit cells of up to 512 atoms
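
    The core kernel such codes distribute is the application of the Hamiltonian to a vector of plane-wave coefficients: kinetic energy is diagonal in reciprocal space, while a local potential is applied on the real-space FFT grid. The 1D toy below sketches that operation in serial; atomic units are used, pseudopotentials and the parallel data layout are omitted, and the grid and potential are arbitrary choices.

```python
import numpy as np

# Apply H = -1/2 d^2/dx^2 + V(x) to a wavefunction stored as plane-wave
# coefficients: kinetic term in reciprocal space, local potential on the
# real-space FFT grid.
N, L = 256, 10.0
g = 2 * np.pi * np.fft.fftfreq(N, d=L / N)       # plane-wave vectors
x = np.linspace(0, L, N, endpoint=False)
V = np.cos(2 * np.pi * x / L)                    # illustrative local potential

def apply_H(c):
    # c: plane-wave coefficients of the wavefunction
    kinetic = 0.5 * g**2 * c
    psi_real = np.fft.ifft(c)                    # to the real-space grid
    potential = np.fft.fft(V * psi_real)         # back to reciprocal space
    return kinetic + potential

c = np.zeros(N, complex)
c[1] = 1.0                                       # single plane wave as a test
print(np.vdot(c, apply_H(c)).real)               # quadratic form <c|H|c>
```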

  11. Supercomputer algorithms for reactivity, dynamics and kinetics of small molecules

    International Nuclear Information System (INIS)

    Lagana, A.

    1989-01-01

    Even for small systems, the accurate characterization of reactive processes is so demanding of computer resources as to suggest the use of supercomputers having vector and parallel facilities. The full advantages of vector and parallel architectures can sometimes be obtained by simply modifying existing programs, vectorizing the manipulation of vectors and matrices, and requiring the parallel execution of independent tasks. More often, however, a significant time saving can be obtained only when the computer code undergoes a deeper restructuring, requiring a change in the computational strategy or, more radically, the adoption of a different theoretical treatment. This book discusses supercomputer strategies based upon exact and approximate methods aimed at calculating the electronic structure and the reactive properties of small systems. The book shows how, in recent years, intense design activity has led to the ability to calculate accurate electronic structures for reactive systems, exact and high-level approximations to three-dimensional reactive dynamics, and to efficient directive and declaratory software for the modelling of complex systems

  12. The TeraGyroid Experiment – Supercomputing 2003

    Directory of Open Access Journals (Sweden)

    R.J. Blake

    2005-01-01

    Amphiphiles are molecules with hydrophobic tails and hydrophilic heads. When dispersed in solvents, they self-assemble into complex mesophases including the beautiful cubic gyroid phase. The goal of the TeraGyroid experiment was to study defect pathways and dynamics in these gyroids. The UK's supercomputing and USA's TeraGrid facilities were coupled together, through a dedicated high-speed network, into a single computational Grid for research work that peaked around the Supercomputing 2003 conference. The gyroids were modeled using lattice Boltzmann methods with parameter spaces explored using many 128³ and 3grid point simulations, this data being used to inform the world's largest three-dimensional time-dependent simulation with 1024³ grid points. The experiment generated some 2 TBytes of useful data. In terms of Grid technology the project demonstrated the migration of simulations (using Globus middleware) to and fro across the Atlantic exploiting the availability of resources. Integration of the systems accelerated the time to insight. Distributed visualisation of the output datasets enabled the parameter space of the interactions within the complex fluid to be explored from a number of sites, informed by discourse over the Access Grid. The project was sponsored by EPSRC (UK) and NSF (USA), with trans-Atlantic optical bandwidth provided by British Telecommunications.

  13. The Future of the Global Environment: A Model-based Analysis Supporting UNEP's First Global Environment Outlook

    NARCIS (Netherlands)

    Bakkes JA; Woerden JW van; Alcamo J; Berk MM; Bol P; Born GJ van den; Brink BJE ten; Hettelingh JP; Langeweg F; Niessen LW; Swart RJ; United Nations Environment; MNV

    1997-01-01

    This report documents the scenario analysis in UNEP's first Global Environment Outlook, published at the same time as the scenario analysis. This Outlook provides a pilot assessment of developments in the environment, both global and regional, between now and 2015, with a further projection to

  14. Harnessing Petaflop-Scale Multi-Core Supercomputing for Problems in Space Science

    Science.gov (United States)

    Albright, B. J.; Yin, L.; Bowers, K. J.; Daughton, W.; Bergen, B.; Kwan, T. J.

    2008-12-01

    The particle-in-cell kinetic plasma code VPIC has been migrated successfully to the world's fastest supercomputer, Roadrunner, a hybrid multi-core platform built by IBM for the Los Alamos National Laboratory. How this was achieved will be described and examples of state-of-the-art calculations in space science, in particular, the study of magnetic reconnection, will be presented. With VPIC on Roadrunner, we have performed, for the first time, plasma PIC calculations with over one trillion particles, >100× larger than calculations considered "heroic" by community standards. This allows examination of physics at unprecedented scale and fidelity. Roadrunner is an example of an emerging paradigm in supercomputing: the trend toward multi-core systems with deep hierarchies and where memory bandwidth optimization is vital to achieving high performance. Getting VPIC to perform well on such systems is a formidable challenge: the core algorithm is memory bandwidth limited with low compute-to-data ratio and requires random access to memory in its inner loop. That we were able to get VPIC to perform and scale well, achieving >0.374 Pflop/s and linear weak scaling on real physics problems on up to the full 12240-core Roadrunner machine, bodes well for harnessing these machines for our community's needs in the future. Many of the design considerations encountered commute to other multi-core and accelerated (e.g., via GPU) platforms and we modified VPIC with flexibility in mind. These will be summarized and strategies for how one might adapt a code for such platforms will be shared. Work performed under the auspices of the U.S. DOE by the LANS LLC Los Alamos National Laboratory. Dr. Bowers is a LANL Guest Scientist; he is presently at D. E. Shaw Research LLC, 120 W 45th Street, 39th Floor, New York, NY 10036.
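
    The memory-access pattern referred to above comes from the particle-grid scatter/gather in the PIC inner loop; the 1D charge-deposition sketch below shows how each particle writes into data-dependent grid locations with linear weights. It is a generic illustration, not VPIC's deposition kernel, and the grid size and particle count are arbitrary.

```python
import numpy as np

# Linear-weighting (cloud-in-cell) charge deposition on a periodic 1D grid.
def deposit_charge(x, q, n_cells, dx):
    rho = np.zeros(n_cells)
    cell = np.floor(x / dx).astype(int) % n_cells   # cell index per particle
    frac = x / dx - np.floor(x / dx)                # position within the cell
    np.add.at(rho, cell, q * (1.0 - frac))          # scatter to left node
    np.add.at(rho, (cell + 1) % n_cells, q * frac)  # scatter to right node
    return rho / dx

rng = np.random.default_rng(0)
n_cells, dx = 64, 1.0
x = rng.uniform(0, n_cells * dx, size=100000)       # particle positions
rho = deposit_charge(x, q=1.0, n_cells=n_cells, dx=dx)
print("total charge:", rho.sum() * dx)              # equals number of particles
```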

  15. MEDIA ENVIRONMENT AS A FACTOR IN THE REALIZATION OF THE CREATIVE POTENTIAL OF FUTURE TEACHERS IN THE MOUNTAIN SCHOOLS OF THE UKRAINIAN CARPATHIANS

    Directory of Open Access Journals (Sweden)

    Alla Lebedieva

    2015-04-01

    The article examines the "media environment" as a factor in realizing the creative potential of future teachers in the mountainous schools of the Ukrainian Carpathians. The main point of this research is the problem of using the media environment as a factor in future teachers' creative potential in these schools and the ways of optimizing it. It highlights ways to modernize the social and professional orientation of student training, where the creative process is situated in the informational and educational environment of the higher school. We consider the causal link between the use of the media environment as a factor in future teachers' creative potential and the complexity of the teacher's work in the mountainous schools of the Ukrainian Carpathians. The basic functions of the media environment are extensity and the instrumental, communicative, interactive and multimedia functions. Some aspects of training students for a creatively active teaching process are revealed, and we describe the subjects that offer objective possibilities for forming the professional skills of future teachers and that directly affect the realization of creative potential: "Ukrainian folk art", "Basic recitation and rhetoric" and "The basis of pedagogical creativity". Creating a full-fledged media environment in higher education is an important condition of successful education and an important factor enabling the efficient realization of the creative potential of future teachers in the mountainous schools of the Ukrainian Carpathians.

  16. Plastics, the environment and human health: current consensus and future trends

    Science.gov (United States)

    Thompson, Richard C.; Moore, Charles J.; vom Saal, Frederick S.; Swan, Shanna H.

    2009-01-01

    Plastics have transformed everyday life; usage is increasing and annual production is likely to exceed 300 million tonnes by 2010. In this concluding paper to the Theme Issue on Plastics, the Environment and Human Health, we synthesize current understanding of the benefits and concerns surrounding the use of plastics and look to future priorities, challenges and opportunities. It is evident that plastics bring many societal benefits and offer future technological and medical advances. However, concerns about usage and disposal are diverse and include accumulation of waste in landfills and in natural habitats, physical problems for wildlife resulting from ingestion or entanglement in plastic, the leaching of chemicals from plastic products and the potential for plastics to transfer chemicals to wildlife and humans. However, perhaps the most important overriding concern, which is implicit throughout this volume, is that our current usage is not sustainable. Around 4 per cent of world oil production is used as a feedstock to make plastics and a similar amount is used as energy in the process. Yet over a third of current production is used to make items of packaging, which are then rapidly discarded. Given our declining reserves of fossil fuels, and finite capacity for disposal of waste to landfill, this linear use of hydrocarbons, via packaging and other short-lived applications of plastic, is simply not sustainable. There are solutions, including material reduction, design for end-of-life recyclability, increased recycling capacity, development of bio-based feedstocks, strategies to reduce littering, the application of green chemistry life-cycle analyses and revised risk assessment approaches. Such measures will be most effective through the combined actions of the public, industry, scientists and policymakers. There is some urgency, as the quantity of plastics produced in the first 10 years of the current century is likely to approach the quantity produced in the

  17. Plastics, the environment and human health: current consensus and future trends.

    Science.gov (United States)

    Thompson, Richard C; Moore, Charles J; vom Saal, Frederick S; Swan, Shanna H

    2009-07-27

    Plastics have transformed everyday life; usage is increasing and annual production is likely to exceed 300 million tonnes by 2010. In this concluding paper to the Theme Issue on Plastics, the Environment and Human Health, we synthesize current understanding of the benefits and concerns surrounding the use of plastics and look to future priorities, challenges and opportunities. It is evident that plastics bring many societal benefits and offer future technological and medical advances. However, concerns about usage and disposal are diverse and include accumulation of waste in landfills and in natural habitats, physical problems for wildlife resulting from ingestion or entanglement in plastic, the leaching of chemicals from plastic products and the potential for plastics to transfer chemicals to wildlife and humans. However, perhaps the most important overriding concern, which is implicit throughout this volume, is that our current usage is not sustainable. Around 4 per cent of world oil production is used as a feedstock to make plastics and a similar amount is used as energy in the process. Yet over a third of current production is used to make items of packaging, which are then rapidly discarded. Given our declining reserves of fossil fuels, and finite capacity for disposal of waste to landfill, this linear use of hydrocarbons, via packaging and other short-lived applications of plastic, is simply not sustainable. There are solutions, including material reduction, design for end-of-life recyclability, increased recycling capacity, development of bio-based feedstocks, strategies to reduce littering, the application of green chemistry life-cycle analyses and revised risk assessment approaches. Such measures will be most effective through the combined actions of the public, industry, scientists and policymakers. There is some urgency, as the quantity of plastics produced in the first 10 years of the current century is likely to approach the quantity produced in the

  18. KfK seminar series on supercomputing and visualization from May till September 1992

    International Nuclear Information System (INIS)

    Hohenhinnebusch, W.

    1993-05-01

    During the period of May 1992 to September 1992 a series of seminars was held at KfK on several topics of supercomputing in different fields of application. The aim was to demonstrate the importance of supercomputing and visualization in numerical simulations of complex physical and technical phenomena. This report contains the collection of all submitted seminar papers. (orig./HP) [de]

  19. Comprehensive efficiency analysis of supercomputer resource usage based on system monitoring data

    Science.gov (United States)

    Mamaeva, A. A.; Shaykhislamov, D. I.; Voevodin, Vad V.; Zhumatiy, S. A.

    2018-03-01

    One of the main problems of modern supercomputers is the low efficiency of their usage, which leads to the significant idle time of computational resources, and, in turn, to the decrease in speed of scientific research. This paper presents three approaches to study the efficiency of supercomputer resource usage based on monitoring data analysis. The first approach performs an analysis of computing resource utilization statistics, which allows to identify different typical classes of programs, to explore the structure of the supercomputer job flow and to track overall trends in the supercomputer behavior. The second approach is aimed specifically at analyzing off-the-shelf software packages and libraries installed on the supercomputer, since efficiency of their usage is becoming an increasingly important factor for the efficient functioning of the entire supercomputer. Within the third approach, abnormal jobs – jobs with abnormally inefficient behavior that differs significantly from the standard behavior of the overall supercomputer job flow – are being detected. For each approach, the results obtained in practice in the Supercomputer Center of Moscow State University are demonstrated.
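
    As a toy illustration of the third approach (detecting jobs whose behaviour deviates strongly from the overall job flow), the sketch below applies a simple z-score rule to synthetic per-job utilization metrics; the metrics, data and threshold are placeholders rather than the criteria used at the MSU centre.

```python
import numpy as np

# Each job is summarized by two utilization metrics (e.g., CPU load and
# memory-bandwidth use); jobs far from the bulk of the distribution are flagged.
rng = np.random.default_rng(1)
jobs = rng.normal(loc=[0.7, 0.5], scale=[0.1, 0.1], size=(500, 2))
jobs[:5] = [[0.02, 0.01]] * 5                # a few near-idle (abnormal) jobs

z = (jobs - jobs.mean(axis=0)) / jobs.std(axis=0)
abnormal = np.where(np.abs(z).max(axis=1) > 3.0)[0]
print("abnormal job indices:", abnormal)
```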

  20. The Future of the Global Environment: A Model-based Analysis Supporting UNEP's First Global Environment Outlook

    OpenAIRE

    Bakkes JA; Woerden JW van; Alcamo J; Berk MM; Bol P; Born GJ van den; Brink BJE ten; Hettelingh JP; Langeweg F; Niessen LW; Swart RJ; United Nations Environment Programme (UNEP), Nairobi, Kenya; MNV

    1997-01-01

    This report documents the scenario analysis in UNEP's first Global Environment Outlook, which was published at the same time. This Outlook provides a pilot assessment of developments in the environment, both global and regional, between now and 2015, with a further projection to 2050. The study was carried out in support of the Agenda 21 interim evaluation, five years after 'Rio' and ten years after 'Brundtland'. The scenario analysis is based on only one scenario, Conventional...

  1. Automatic discovery of the communication network topology for building a supercomputer model

    Science.gov (United States)

    Sobolev, Sergey; Stefanov, Konstantin; Voevodin, Vadim

    2016-10-01

    The Research Computing Center of Lomonosov Moscow State University is developing the Octotron software suite for automatic monitoring and mitigation of emergency situations in supercomputers so as to maximize hardware reliability. The suite is based on a software model of the supercomputer. The model uses a graph to describe the computing system components and their interconnections. One of the most complex components of a supercomputer that needs to be included in the model is its communication network. This work describes the proposed approach for automatically discovering the Ethernet communication network topology in a supercomputer and its description in terms of the Octotron model. This suite automatically detects computing nodes and switches, collects information about them and identifies their interconnections. The application of this approach is demonstrated on the "Lomonosov" and "Lomonosov-2" supercomputers.
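
    As an illustration of the kind of model such a discovery step produces, the sketch below assembles detected devices and links into a simple graph structure. The neighbor-table input, naming scheme and attributes are assumptions for illustration only and do not reproduce the actual Octotron description format.

```python
# Minimal sketch: assemble discovered hosts, switches and links into a simple
# graph model. The neighbor-table input, naming scheme and attributes are
# illustrative assumptions, not the actual Octotron description format.
def build_topology(neighbor_tables):
    """neighbor_tables: {device: [(local_port, remote_device), ...]}"""
    nodes, edges = {}, set()
    for device, neighbors in neighbor_tables.items():
        nodes[device] = {"type": "switch" if device.startswith("sw") else "compute_node"}
        for _port, remote in neighbors:
            nodes.setdefault(remote, {"type": "unknown"})
            edges.add(tuple(sorted((device, remote))))   # undirected link, stored once
    return {"nodes": nodes, "edges": edges}

tables = {
    "sw-eth-01": [("ge-0/0/1", "node-001"), ("ge-0/0/2", "node-002")],
    "node-001": [("eth0", "sw-eth-01")],
}
topo = build_topology(tables)
print(len(topo["nodes"]), "devices,", len(topo["edges"]), "links")   # 3 devices, 2 links
```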

  2. TRAINING OF FUTURE TEACHER OF INFORMATICS TO WORK IN MODERN INFORMATION AND EDUCATIONAL ENVIRONMENT OF SCHOOL

    Directory of Open Access Journals (Sweden)

    V. Shovkun

    2015-05-01

    Full Text Available The article analyzes the impact of new information and communication technologies on trends shaping change in the education system. An important factor in responding to these trends and satisfying the educational needs of students is the creation of an information and communication environment (ICE) in the school. This requires that educational institutions have specialists able to advise management on the choice of hardware and software, and to design, implement and configure programs, maintain teaching aids, and so on. An anonymous survey of Informatics teachers in the Kherson region revealed that in most cases these functions are performed by the Informatics teachers themselves. Only a few schools have dedicated staff, or turn to specialists or companies that provide such services. Therefore, special importance attaches to preparing future teachers of Informatics to continuously track trends in educational technologies, master new services and applications on their own, find ways to introduce them into the educational process of the school, advise colleagues, and conduct outreach work with parents. The survey also assessed the level of equipment and the working conditions of Informatics teachers at school and at home.

  3. The Future of Nonproliferation in a Changed and Changing Environment: A Workshop Summary

    International Nuclear Information System (INIS)

    Dreicer, M.

    2016-01-01

    The Center for Global Security Research and Global Security Principal Directorate at Lawrence Livermore National Laboratory convened a workshop in July 2016 to consider "The Future of Nonproliferation in a Changed and Changing Security Environment." We took a broad view of nonproliferation, encompassing not just the treaty regime but also arms control, threat reduction, counter-proliferation, and countering nuclear terrorism. We gathered a group of approximately 60 experts from the technical, academic, political, defense and think tank communities and asked them what and how much can reasonably be accomplished in each of these areas in the 5 to 10 years ahead. Discussion was on a not-for-attribution basis. This document provides a summary of key insights and lessons learned, and is provided to help stimulate broader public discussion of these issues. It is a collection of ideas as informally discussed and debated among a group of experts. The ideas reported here are the personal views of individual experts and should not be attributed to Lawrence Livermore National Laboratory.

  4. The Future of Nonproliferation in a Changed and Changing Environment: A Workshop Summary

    Energy Technology Data Exchange (ETDEWEB)

    Dreicer, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-08-30

    The Center for Global Security Research and Global Security Principal Directorate at Lawrence Livermore National Laboratory convened a workshop in July 2016 to consider “The Future of Nonproliferation in a Changed and Changing Security Environment.” We took a broad view of nonproliferation, encompassing not just the treaty regime but also arms control, threat reduction, counter-proliferation, and countering nuclear terrorism. We gathered a group of approximately 60 experts from the technical, academic, political, defense and think tank communities and asked them what, and how much, can reasonably be accomplished in each of these areas in the 5 to 10 years ahead. Discussion was on a not-for-attribution basis. This document provides a summary of key insights and lessons learned, and is provided to help stimulate broader public discussion of these issues. It is a collection of ideas as informally discussed and debated among a group of experts. The ideas reported here are the personal views of individual experts and should not be attributed to Lawrence Livermore National Laboratory.

  5. Gene x environment interactions in conduct disorder: Implications for future treatments.

    Science.gov (United States)

    Holz, Nathalie E; Zohsel, Katrin; Laucht, Manfred; Banaschewski, Tobias; Hohmann, Sarah; Brandeis, Daniel

    2016-08-18

    Conduct disorder (CD) causes high financial and social costs, not only in affected families but across society, with only moderately effective treatments so far. There is consensus that CD is likely caused by the convergence of many different factors, including genetic and adverse environmental factors. There is ample evidence of gene-environment (GxE) interactions in the etiology of CD at the behavioral level, from genetically sensitive designs and candidate gene-driven approaches, most prominently and consistently represented by MAOA. However, conclusive indications of causal GxE patterns are largely lacking. Inconsistent findings, lack of replication and methodological limitations remain a major challenge. Likewise, research addressing the identification of affected brain pathways which reflect plausible biological mechanisms underlying GxE is still very sparse. Future research will have to take multilevel approaches into account, which combine genetic, environmental, epigenetic, personality, neural and hormone perspectives. A better understanding of relevant GxE patterns in the etiology of CD might enable researchers to design customized treatment options (e.g. biofeedback interventions) for specific subgroups of patients. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. A 2-layer and P2P-based architecture on resource location in future grid environment

    International Nuclear Information System (INIS)

    Pei Erming; Sun Gongxin; Zhang Weiyi; Pang Yangguang; Gu Ming; Ma Nan

    2004-01-01

    Grid and Peer-to-Peer computing are two distributed resource sharing environments that have developed rapidly in recent years. The final objective of Grid, as well as that of P2P technology, is to pool large sets of resources effectively so they can be used in a more convenient, fast and transparent way. We can speculate that, though many differences exist, Grid and P2P environments will converge into a large scale resource sharing environment that combines the characteristics of the two: large diversity, high heterogeneity (of resources), dynamism, and lack of central control. Resource discovery in this future Grid environment is a basic but important problem. In this article we propose a two-layer and P2P-based architecture for resource discovery and design a detailed algorithm for resource request propagation in the computing environment discussed above. (authors)
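
    A generic sketch of request propagation in such a two-layer overlay is shown below: leaf resources are indexed at super-peers, and a query is forwarded between super-peers with a hop limit. The data layout, naming and hop-limit scheme are illustrative assumptions, not the authors' specific algorithm.

```python
# Minimal sketch of request propagation in a two-layer overlay: leaf nodes
# register resources with a super-peer; a query is flooded between
# super-peers with a hop limit. A generic illustration of the idea only,
# not the specific algorithm proposed in the paper.
def propagate(super_peers, start, resource, ttl=2, visited=None):
    """super_peers: {name: {"index": {resource: [leaf, ...]}, "links": [names]}}"""
    visited = visited if visited is not None else set()
    if start in visited or ttl < 0:
        return []
    visited.add(start)
    hits = list(super_peers[start]["index"].get(resource, []))
    for neighbor in super_peers[start]["links"]:
        hits += propagate(super_peers, neighbor, resource, ttl - 1, visited)
    return hits

overlay = {
    "SP-A": {"index": {"cpu:64": ["leaf-3"]}, "links": ["SP-B"]},
    "SP-B": {"index": {}, "links": ["SP-A", "SP-C"]},
    "SP-C": {"index": {"cpu:64": ["leaf-9"]}, "links": ["SP-B"]},
}
print(propagate(overlay, "SP-B", "cpu:64"))   # -> ['leaf-3', 'leaf-9']
```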

  7. SUPERCOMPUTER SIMULATION OF CRITICAL PHENOMENA IN COMPLEX SOCIAL SYSTEMS

    Directory of Open Access Journals (Sweden)

    Petrus M.A. Sloot

    2014-09-01

    Full Text Available The paper describes the problem of computer simulation of critical phenomena in complex social systems on petascale computing systems within the framework of a complex networks approach. A three-layer system of nested complex-network models is proposed, including an aggregated analytical model to identify critical phenomena, a detailed model of individualized network dynamics and a model to adjust the topological structure of a complex network. A scalable parallel algorithm covering all layers of the complex network simulation is proposed. The performance of the algorithm is studied on different supercomputing systems. The issues of the software and information infrastructure for complex network simulation are discussed, including the organization of distributed calculations, crawling data in social networks and visualization of results. Applications of the developed methods and technologies are considered, including simulation of criminal network disruption, fast rumor spreading in social networks, evolution of financial networks and epidemic spreading.
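
    As a toy stand-in for the individualized network-dynamics layer mentioned above, the sketch below runs an independent-cascade style rumor-spreading process on a small random network. The network construction, probabilities and sizes are illustrative assumptions only, not the models used in the paper.

```python
# Toy stand-in for the individualized network-dynamics layer: independent-
# cascade style rumor spreading on a small random network. Sizes and
# probabilities are illustrative assumptions only.
import random

def spread_rumor(n=1000, k=8, p=0.2, seed=1):
    rng = random.Random(seed)
    # crude random network: each node gets k randomly chosen neighbours
    neighbors = [[rng.randrange(n) for _ in range(k)] for _ in range(n)]
    informed, frontier, rounds = {0}, [0], 0
    while frontier:
        next_frontier = []
        for node in frontier:
            for nb in neighbors[node]:
                if nb not in informed and rng.random() < p:
                    informed.add(nb)          # neighbour hears the rumor
                    next_frontier.append(nb)
        frontier = next_frontier
        rounds += 1
    return len(informed), rounds

print(spread_rumor())   # (number of informed nodes, rounds until spreading stops)
```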

  8. Lectures in Supercomputational Neurosciences Dynamics in Complex Brain Networks

    CERN Document Server

    Graben, Peter beim; Thiel, Marco; Kurths, Jürgen

    2008-01-01

    Computational Neuroscience is a burgeoning field of research where only the combined effort of neuroscientists, biologists, psychologists, physicists, mathematicians, computer scientists, engineers and other specialists, e.g. from linguistics and medicine, seems able to expand the limits of our knowledge. The present volume is an introduction, largely from the physicists' perspective, to the subject matter with in-depth contributions by system neuroscientists. A conceptual model for complex networks of neurons is introduced that incorporates many important features of the real brain, such as various types of neurons, various brain areas, inhibitory and excitatory coupling and the plasticity of the network. The computational implementation on supercomputers, which is introduced and discussed in detail in this book, will enable the readers to modify and adapt the algorithm for their own research. Worked-out examples of applications are presented for networks of Morris-Lecar neurons to model the cortical co...

  9. Development of a Cloud Resolving Model for Heterogeneous Supercomputers

    Science.gov (United States)

    Sreepathi, S.; Norman, M. R.; Pal, A.; Hannah, W.; Ponder, C.

    2017-12-01

    A cloud resolving climate model is needed to reduce major systematic errors in climate simulations due to structural uncertainty in numerical treatments of convection - such as convective storm systems. This research describes the porting effort to enable the SAM (System for Atmospheric Modeling) cloud resolving model on heterogeneous supercomputers using GPUs (Graphics Processing Units). We have isolated a standalone configuration of SAM that is targeted to be integrated into the DOE ACME (Accelerated Climate Modeling for Energy) Earth System model. We have identified key computational kernels from the model and offloaded them to a GPU using the OpenACC programming model. Furthermore, we are investigating various optimization strategies intended to enhance GPU utilization, including loop fusion/fission, coalesced data access and loop refactoring to a higher abstraction level. We will present early performance results and lessons learned, as well as optimization strategies. The computational platform used in this study is the Summitdev system, an early testbed that is one generation removed from Summit, the next leadership class supercomputer at Oak Ridge National Laboratory. The system contains 54 nodes, each with 2 IBM POWER8 CPUs and 4 NVIDIA Tesla P100 GPUs. This work is part of a larger project, the ACME-MMF component of the U.S. Department of Energy (DOE) Exascale Computing Project. The ACME-MMF approach addresses structural uncertainty in cloud processes by replacing traditional parameterizations with cloud resolving "superparameterization" within each grid cell of the global climate model. Superparameterization dramatically increases arithmetic intensity, making the MMF approach an ideal strategy to achieve good performance on emerging exascale computing architectures. The goal of the project is to integrate superparameterization into ACME, and explore its full potential to scientifically and computationally advance climate simulation and prediction.

  10. A supercomputing application for reactors core design and optimization

    International Nuclear Information System (INIS)

    Hourcade, Edouard; Gaudier, Fabrice; Arnaud, Gilles; Funtowiez, David; Ammar, Karim

    2010-01-01

    Advanced nuclear reactor design is often an intuition-driven process in which designers first develop or use simplified simulation tools for each physical phenomenon involved. As the project develops, complexity in each discipline increases, and implementation of the chaining/coupling capabilities needed for a supercomputing optimization process is often postponed to a later step, so the task becomes increasingly challenging. In the context of renewed interest in reactor designs, first-of-a-kind projects are often run in parallel with advanced design studies, although they depend strongly on the final options. As a consequence, tools are needed to globally assess and optimize reactor core features with the accuracy of the on-going design methods. This should be possible within reasonable simulation time and without requiring advanced computing skills at the project management level. These tools should also easily accommodate modeling progress in each discipline throughout the project lifetime. An early-stage development of a multi-physics package adapted to supercomputing is presented. The URANIE platform, developed at CEA and based on the data analysis framework ROOT, is very well suited to this approach. It provides diversified sampling techniques (SRS, LHS, qMC), fitting tools (neural networks, ...) and optimization techniques (genetic algorithms). Database management and visualization are also made very easy. In this paper we present the various implementation steps of this core physics tool, in which neutronics, thermal-hydraulics and fuel mechanics codes are run simultaneously. A relevant example of optimization of nuclear reactor safety characteristics is presented. The flexibility of the URANIE tool is also illustrated with several approaches to improving the quality of the Pareto front. (author)
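
    For readers unfamiliar with the sampling step, the sketch below shows what Latin hypercube sampling (LHS) of a small design space looks like. URANIE itself is a C++/ROOT platform; the Python code and the parameter names and ranges are purely illustrative assumptions.

```python
# Minimal sketch of Latin hypercube sampling for a design-of-experiments step
# like the one URANIE performs (URANIE is a C++/ROOT platform; parameter
# names and ranges here are purely illustrative).
import random

def latin_hypercube(bounds, n_samples, seed=0):
    """bounds: {param: (low, high)}; returns a list of sample dicts."""
    rng = random.Random(seed)
    samples = [{} for _ in range(n_samples)]
    for name, (low, high) in bounds.items():
        # one stratified value per interval, then shuffle the stratum order
        strata = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(strata)
        for sample, u in zip(samples, strata):
            sample[name] = low + u * (high - low)
    return samples

design = latin_hypercube(
    {"fuel_enrichment_pct": (3.0, 5.0), "coolant_flow_kg_s": (800.0, 1200.0)},
    n_samples=5,
)
for point in design:
    print(point)
```

    Each sampled point would then typically be fed to the coupled physics chain, with the results used to train surrogate models or to drive the genetic algorithm.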

  11. All Possible Wars? Toward a Consensus View of the Future Security Environment, 2001-2025

    Science.gov (United States)

    2000-01-01

    ... technology that the truly unanticipated seems to be crowded out. Predictions range from “our future as post-modern cyborgs” to “the future of God”. (Hables Grey, “Our Future as Post-Modern Cyborgs,” in Didsbury, 20–40; Robert B. Mellert, “The Future of God,” in Didsbury, 76–82.)

  12. Futures

    DEFF Research Database (Denmark)

    Pedersen, Michael Haldrup

    2017-01-01

    Currently both design thinking and critical social science experience an increased interest in speculating in alternative future scenarios. This interest is not least related to the challenges that issues of global sustainability present for politics, ethics and design. This paper explores the potentials of speculative thinking in relation to design and social and cultural studies, arguing that both offer valuable insights for creating a speculative space for new emergent criticalities challenging current assumptions of the relations between power and design. It does so by tracing out discussions of ‘futurity’ and ‘futuring’ in design as well as social and cultural studies: firstly, by discussing futurist and speculative approaches in design thinking; secondly, by engaging with ideas of scenario thinking and utopianism in current social and cultural studies; and thirdly, by showing how the articulation...

  13. Suitability of Agent Technology for Military Command and Control in the Future Combat System Environment

    Energy Technology Data Exchange (ETDEWEB)

    Potok, TE

    2003-02-13

    The U.S. Army is faced with the challenge of dramatically improving its war fighting capability through advanced technologies. Any new technology must provide significant improvement over existing technologies, yet be reliable enough to provide a fielded system. The focus of this paper is to assess the novelty and maturity of agent technology for use in the Future Combat System (FCS). The FCS concept represents the U.S. Army's "mounted" form of the Objective Force. This concept of vehicles, communications, and weaponry is viewed as a "system of systems" which includes net-centric command and control (C²) capabilities. This networked C² is an important transformation from the historically centralized, or platform-based, C² function, since a centralized command architecture may become a decision-making and execution bottleneck, particularly as the pace of war accelerates. A mechanism to ensure an effective network-centric C² capacity (combining intelligence gathering and analysis available at lower levels in the military hierarchy) is needed. Achieving a networked C² capability will require breakthroughs in current software technology. Many have proposed the use of agent technology as a potential solution. Agents are an emerging technology, and it is not yet clear whether they are suitable for addressing the networked C² challenge, particularly in satisfying battlespace scalability, mobility, and security expectations. We have developed a set of software requirements for FCS based on military requirements for this system. We have then evaluated these software requirements against current computer science technology. This analysis provides a set of limitations in the current technology when applied to the FCS challenge. Agent technology is compared against this set of limitations to provide a means of assessing the novelty of agent technology in an FCS environment. From this analysis we...

  14. The future of the global environment. A model-based analysis supporting UNEP's first global environment outlook

    International Nuclear Information System (INIS)

    Bakkes, J.; Van Woerden, J.; Alcamo, J.; Berk, M.; Bol, P.; Van den Born, G.J.; Ten Brink, B.; Hettelingh, J.P.; Niessen, L.; Langeweg, F.; Swart, R.

    1997-01-01

    Integrated assessments in support of environmental policy have been applied to a number of countries and regions, and to international negotiations. UNEP's first Global Environment Outlook (GEO-1) can be seen as a step towards making the tool of integrated assessment more widely available as a means for focusing action. This technical report documents RIVM's contribution to the GEO-1 report, focusing on the subject 'looking ahead'. It is illustrated that a 'what if' analysis helps to look beyond the delays in environmental and resource processes. This report illustrates that integrated assessment and modelling techniques can be excellent tools for environment and development policy-setting. The methodology, however, will need to be further developed and adapted to the realities and expectations of diverse regions, incorporating alternative policy strategies and development scenarios. This report focuses primarily on the period 1970-2015, because reliable historical data are often only generally available from 1970 onwards and the year 2015 is believed to match the time perspective of decision-makers. The findings of the analysis are reported in terms of six regions, corresponding with the division of the UNEP regional offices. Questions asked are: how will socioeconomic driving forces affect freshwater and land resources, and how will these changes mutually interact, and why are these changes important for society? Chapter 2 deals with the development of the social and economic driving forces. In the Chapters 3 and 4 it is discussed how this pressure influences selected aspects of the environment. Chapter 3 alone addresses the importance of selected elements of the interacting global element cycles for environmental quality, while Chapter 4 addresses land resources, their potential for food production and associated dependence on freshwater resources. The impacts on selected components of natural areas (Chapter 5) and society (Chapter 6) are subsequently addressed

  15. Emerging and Future Computing Paradigms and Their Impact on the Research, Training, and Design Environments of the Aerospace Workforce

    Science.gov (United States)

    Noor, Ahmed K. (Compiler)

    2003-01-01

    The document contains the proceedings of the training workshop on Emerging and Future Computing Paradigms and their Impact on the Research, Training and Design Environments of the Aerospace Workforce. The workshop was held at NASA Langley Research Center, Hampton, Virginia, March 18 and 19, 2003. The workshop was jointly sponsored by Old Dominion University and NASA. Workshop attendees came from NASA, other government agencies, industry and universities. The objectives of the workshop were to (a) provide broad overviews of the diverse activities related to new computing paradigms, including grid computing, pervasive computing, high-productivity computing, and the IBM-led autonomic computing; and (b) identify future directions for research that have high potential for future aerospace workforce environments. The format of the workshop included twenty-one half-hour overview-type presentations and three exhibits by vendors.

  16. 75 FR 44303 - The Future of Aviation Advisory Committee (FAAC) Environment Subcommittee; Notice of Meeting

    Science.gov (United States)

    2010-07-28

    ... economy. The Environment Subcommittee is charged with examining steps and strategies that can be taken by... to promote effective international actions through the International Civil Aviation Organization...

  17. Aging Well and the Environment: Toward an Integrative Model and Research Agenda for the Future

    Science.gov (United States)

    Wahl, Hans-Werner; Iwarsson, Susanne; Oswald, Frank

    2012-01-01

    Purpose of the Study: The effects of the physical-spatial-technical environment on aging well have been overlooked both conceptually and empirically. In the spirit of M. Powell Lawton's seminal work on aging and environment, this article attempts to rectify this situation by suggesting a new model of how older people interact with their…

  18. NORTH-EAST ROMANIA AS A FUTURE SOURCE OF TREES FOR URBAN PAVED ENVIRONMENTS IN NORTH-WEST EUROPE

    Directory of Open Access Journals (Sweden)

    SJÖMAN HENRIK

    2009-12-01

    Full Text Available Trees are an important feature of the urban environment. The problem today lies not in finding a wide range of well-adapted tree species for park environments, but in finding species suitable for urban paved sites. In terms of north-west Europe, it is unlikely that the limited native dendroflora will provide a large variety of tree species with high tolerance to the environmental stresses characterising urban paved sites in the region. However, other regions with a comparable climate but with a rich dendroflora can potentially provide new tree species and genera well suited to the growing conditions at urban sites in north-west Europe. This paper examines the potential of a geographical area extending over north-east Romania and the Republic of Moldova to supply suitable tree species for urban paved sites in Central and Northern Europe (CNE). The study involved comparing the temperature, precipitation, evapotranspiration and water runoff in the woodland area of Iasi, Romania, with those of the current inner-city climate of Copenhagen, Denmark, and those predicted for Copenhagen in 2100. The latter included urban heat island effects and predicted global climate change. The results revealed a similar pattern in summer water deficit and temperature between the natural woodlands of Iasi and the inner-city environment of Copenhagen today. On the other hand, there is a weak match between Iasi and the future Copenhagen. Matching the future scenario for Copenhagen with the present situation in Iasi will require an early understanding that the solution depends not only on suitable tree species, but also on technical solutions developed to sustain trees in paved environments in the future. On the basis of precipitation and temperature data, natural woodlands in north-east Romania have the potential to be a source of suitable trees for urban paved environments in the CNE region, even for a future climate, if other aspects in the planning of trees...

  19. Novel Supercomputing Approaches for High Performance Linear Algebra Using FPGAs, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Supercomputing plays a major role in many areas of science and engineering, and it has had tremendous impact for decades in areas such as aerospace, defense, energy,...

  20. The Future of Deterrent Capability for Medium-Sized Western Powers in the New Environment

    International Nuclear Information System (INIS)

    Quinlan, Michael

    2001-01-01

    What should be the longer-term future for the nuclear-weapons capabilities of France and the United Kingdom? I plan to tackle the subject in concrete terms. My presentation will be divided into three parts, and, though they are distinct rather than separate, they interact extensively. The first and largest part will relate to strategic context and concept: what aims, justifications and limitations should guide the future, or the absence of a future, for our capabilities? The second part, a good deal briefer, will be the practical content and character of the capabilities: what questions for decision will arise, and in what timescale, about the preservation, improvement or adjustment of the present capabilities? And the third part, still more briefly, will concern the political and institutional framework into which their future should or might be fitted. (author)

  1. Toward a Holistic Federated Future Internet Experimentation Environment: The Experience of NOVI Research and Experimentation

    NARCIS (Netherlands)

    Maglaris, V.; Papagianni, C.; Androulidakis, G.; Grammatikou, M.; Grosso, P.; van der Ham, J.; de Laat, C.; Pietrzak, B.; Belter, B.; Steger, J.; Laki, S.; Campanella, M.; Sallent, S.

    This article presents the design and pilot implementation of a suite of intelligent methods, algorithms, and tools for federating heterogeneous experimental platforms (domains) toward a holistic Future Internet experimentation ecosystem. The proposed framework developed within the NOVI research and

  2. Sustainability - What are the Odds? Guessing the Future of our Environment, Economy, and Society

    Science.gov (United States)

    This article examines the concept of sustainability from a global perspective, describing how alternative futures might develop in the environmental, economic, and social dimensions. The alternatives to sustainability appear to be (a) a catastrophic failure of life support, econo...

  3. SUPERCOMPUTERS FOR AIDING ECONOMIC PROCESSES WITH REFERENCE TO THE FINANCIAL SECTOR

    Directory of Open Access Journals (Sweden)

    Jerzy Balicki

    2014-12-01

    Full Text Available The article discusses the use of supercomputers to support business processes, with particular emphasis on the financial sector. Reference is made to selected projects that support economic development. In particular, we propose the use of supercomputers to run artificial intelligence methods in banking. The proposed methods, combined with modern technology, enable a significant increase in the competitiveness of enterprises and banks by adding new functionality.

  4. Adventures in supercomputing: An innovative program for high school teachers

    Energy Technology Data Exchange (ETDEWEB)

    Oliver, C.E.; Hicks, H.R.; Summers, B.G. [Oak Ridge National Lab., TN (United States); Staten, D.G. [Wartburg Central High School, TN (United States)

    1994-12-31

    Within the realm of education, seldom does an innovative program become available with the potential to change an educator's teaching methodology. Adventures in Supercomputing (AiS), sponsored by the U.S. Department of Energy (DOE), is such a program. It is a program for high school teachers that changes the teacher paradigm from a teacher-directed approach to teaching to a student-centered approach. "A student-centered classroom offers better opportunities for development of internal motivation, planning skills, goal setting and perseverance than does the traditional teacher-directed mode". Not only is the process of teaching changed, but the cross-curricula integration within the AiS materials is remarkable. Written from a teacher's perspective, this paper will describe the AiS program and its effects on teachers and students, primarily at Wartburg Central High School in Wartburg, Tennessee. The AiS program in Tennessee is sponsored by Oak Ridge National Laboratory (ORNL).

  5. Accelerating Science Impact through Big Data Workflow Management and Supercomputing

    Directory of Open Access Journals (Sweden)

    De K.

    2016-01-01

    Full Text Available The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. ATLAS, one of the largest collaborations ever assembled in the history of science, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment relies on a heterogeneous distributed computational infrastructure. To manage the workflow for all data processing on hundreds of data centers, the PanDA (Production and Distributed Analysis) Workload Management System is used. An ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF), is being realized within the BigPanDA and megaPanDA projects. These projects are now exploring how PanDA might be used for managing computing jobs that run on supercomputers, including OLCF's Titan and NRC-KI HPC2. The main idea is to reuse, as much as possible, existing components of the PanDA system that are already deployed on the LHC Grid for analysis of physics data. The next generation of PanDA will allow many data-intensive sciences employing a variety of computing platforms to benefit from ATLAS experience and proven tools in highly scalable processing.

  6. Micro-mechanical Simulations of Soils using Massively Parallel Supercomputers

    Directory of Open Access Journals (Sweden)

    David W. Washington

    2004-06-01

    Full Text Available In this research a computer program, Trubal version 1.51, based on the Discrete Element Method, was converted to run on a Connection Machine (CM-5), a massively parallel supercomputer with 512 nodes, to expedite the computational times of simulating geotechnical boundary value problems. The dynamic memory algorithm in the Trubal program did not perform efficiently on the CM-2 machine with its Single Instruction Multiple Data (SIMD) architecture. This was due to the communication overhead involving global array reductions, global array broadcasts and random data movement. Therefore, the dynamic memory algorithm in Trubal was converted to a static memory arrangement, and the program was successfully ported to run on CM-5 machines. The converted program was called "TRUBAL for Parallel Machines" (TPM). Simulating two physical triaxial experiments and comparing the simulation results with Trubal simulations validated the TPM program. With a 512-node CM-5 machine, TPM produced a nine-fold speedup, demonstrating the inherent parallelism within algorithms based on the Discrete Element Method.
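
    The core of a Discrete Element Method code such as Trubal/TPM is a per-contact force calculation repeated over very many particle pairs. The sketch below shows a linear normal-spring contact between two disks; the stiffness value and particle data are illustrative assumptions, not Trubal's actual contact law or units.

```python
# Minimal sketch of the per-contact calculation at the heart of a Discrete
# Element Method code: a linear normal spring between two disks. Stiffness
# and particle data are illustrative assumptions.
import math

def normal_contact_force(p1, p2, k_n=1.0e6):
    """p1, p2: dicts with 'x', 'y', 'r'. Returns (fx, fy) acting on p1."""
    dx, dy = p1["x"] - p2["x"], p1["y"] - p2["y"]
    dist = math.hypot(dx, dy)
    overlap = p1["r"] + p2["r"] - dist
    if overlap <= 0.0 or dist == 0.0:
        return (0.0, 0.0)          # no contact
    nx, ny = dx / dist, dy / dist  # unit normal pointing from p2 towards p1
    return (k_n * overlap * nx, k_n * overlap * ny)

a = {"x": 0.0, "y": 0.0, "r": 0.01}
b = {"x": 0.018, "y": 0.0, "r": 0.01}
print(normal_contact_force(a, b))  # repulsive force pushing a away from b
```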

  7. A visual analytics system for optimizing the performance of large-scale networks in supercomputing systems

    Directory of Open Access Journals (Sweden)

    Takanori Fujiwara

    2018-03-01

    Full Text Available The overall efficiency of an extreme-scale supercomputer largely relies on the performance of its network interconnects. Several state-of-the-art supercomputers use networks based on the increasingly popular Dragonfly topology. It is crucial to study the behavior and performance of different parallel applications running on Dragonfly networks in order to make optimal system configurations and design choices, such as job scheduling and routing strategies. However, to study this temporal network behavior, we need a tool to analyze and correlate numerous sets of multivariate time-series data collected from the Dragonfly's multi-level hierarchies. This paper presents such a tool: a visual analytics system for investigating the temporal behavior and optimizing the communication performance of a supercomputer that uses a Dragonfly network. We coupled interactive visualization with time-series analysis methods to help reveal hidden patterns in the network behavior with respect to different parallel applications and system configurations. Our system also provides multiple coordinated views for connecting behaviors observed at different levels of the network hierarchies, which effectively helps visual analysis tasks. We demonstrate the effectiveness of the system with a set of case studies. Our system and findings can help improve not only the communication performance of supercomputing applications, but also the network performance of next-generation supercomputers. Keywords: Supercomputing, Parallel communication network, Dragonfly networks, Time-series data, Performance analysis, Visual analytics
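
    One simple analysis step behind such a tool is correlating per-link utilization time series to surface links that load up together, for example under the same parallel job. The sketch below computes pairwise Pearson correlations; the link names and sample data are illustrative assumptions, not the system's actual analysis pipeline.

```python
# Minimal sketch: correlate link-utilization time series from different
# routers to find links that load up together. Names and data are made up.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

link_util = {                       # utilization samples per link over time
    "router03:port1": [0.2, 0.7, 0.9, 0.8, 0.3],
    "router03:port2": [0.1, 0.6, 0.8, 0.9, 0.2],
    "router11:port0": [0.5, 0.5, 0.4, 0.5, 0.5],
}
names = list(link_util)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        r = pearson(link_util[names[i]], link_util[names[j]])
        print(f"{names[i]} ~ {names[j]}: r = {r:.2f}")
```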

  8. The Future Security Environment: Why the U.S. Army Must Differentiate and Grow Millennial Officer Talent

    Science.gov (United States)

    2015-09-01

    ... and M. Epstein, “Millennials and the World of Work: An Organizational and Management Perspective,” Journal of Business and Psychology, Vol. 25, 2010. For this and other publications, visit http://www.carlisle.army.mil.

  9. What are the factors that could influence the future of work with regard to energy systems and the built environment?

    International Nuclear Information System (INIS)

    Pratt, Andy C.

    2008-01-01

    The aim of this paper is to examine which factors in energy systems and the built environment could influence the future of work. In addition, it looks at trends in relation to corporate demands for space and its specifications, and considers what the scope is for integrating business and industry within the dwelling landscape. It seeks to consider these questions on a 50-year time horizon. The paper begins by discussing the challenge of prediction of future trends, especially in a field apparently so reliant upon technological change and innovation. Because of these problems, the paper concerns itself not with picking technologies but rather with questions about the social adoption of technologies and their applications. It highlights a spectrum of coordinating mechanisms in society that are likely to be critical in shaping the future implications of built environment forms and the consequential use of energy. The scenarios discussed arise from the intersection of two tendencies: concentration versus dispersal, and local versus globally focused growth of city regions. The challenges identified in this report are associated with 'lock-in' to past governance modes of the built environment, exacerbated by rapidly changing demand structures. Demand is not simply changing in volume but also in character. The shifts that will need to be dealt with concern a fundamental issue: how activities are coordinated in society

  10. A user-friendly web portal for T-Coffee on supercomputers

    Directory of Open Access Journals (Sweden)

    Koetsier Jos

    2011-05-01

    Full Text Available Abstract Background Parallel T-Coffee (PTC) was the first parallel implementation of the T-Coffee multiple sequence alignment tool. It is based on MPI and RMA mechanisms. Its purpose is to reduce the execution time of large-scale sequence alignments. It can be run on distributed memory clusters, allowing users to align data sets consisting of hundreds of proteins within a reasonable time. However, most of the potential users of this tool are not familiar with the use of grids or supercomputers. Results In this paper we show how PTC can be easily deployed and controlled on a supercomputer architecture using a web portal developed using Rapid. Rapid is a tool for efficiently generating standardized portlets for a wide range of applications, and the approach described here is generic enough to be applied to other applications, or to deploy PTC on different HPC environments. Conclusions The PTC portal allows users to upload a large number of sequences to be aligned by the parallel version of T-Coffee that cannot be aligned by a single machine due to memory and execution time constraints. The web portal provides a user-friendly solution.

  11. Sustainability—What Are the Odds? Envisioning the Future of Our Environment, Economy and Society

    Directory of Open Access Journals (Sweden)

    Stephen J. Jordan

    2013-03-01

    Full Text Available This article examines the concept of sustainability from a global perspective, describing how alternative futures might develop in the environmental, economic, and social dimensions. The alternatives to sustainability appear to be (a) a catastrophic failure of life support, economies, and societies, or (b) a radical technological revolution (singularity). The case is made that solutions may be found by developing a global vision of the future, estimating the probabilities of possible outcomes from multiple indicators, and looking holistically for the most likely paths to sustainability. Finally, an intuitive vision of these paths is offered as a starting point for discussion.

  12. Identification of glacial meltwater runoff in a karstic environment and its implication for present and future water availability

    Directory of Open Access Journals (Sweden)

    D. Finger

    2013-08-01

    Full Text Available Glaciers all over the world are expected to continue to retreat due to global warming throughout the 21st century. Consequently, future seasonal water availability might become scarce once glacier areas have declined below a certain threshold, affecting future water management strategies. Particular attention should be paid to glaciers located in a karstic environment, as parts of the meltwater can be drained by underlying karst systems, making it difficult to assess water availability. In this study tracer experiments, karst modeling and glacier melt modeling are combined in order to identify flow paths in a high alpine, glacierized, karstic environment (Glacier de la Plaine Morte, Switzerland) and to investigate current and predict future downstream water availability. Flow paths through the karst underground were determined with natural and fluorescent tracers. Subsequently, geologic information and the findings from tracer experiments were assembled in a karst model. Finally, glacier melt projections driven with a climate scenario were performed to discuss future water availability in the area surrounding the glacier. The results suggest that during late summer glacier meltwater is rapidly drained through well-developed channels at the glacier bottom to the north of the glacier, while during the low flow season meltwater enters the karst and is drained to the south. Climate change projections with the glacier melt model reveal that by the end of the century glacier melt will be significantly reduced in summer, jeopardizing water availability in glacier-fed karst springs.
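
    Glacier melt projections of this kind are often driven by a temperature-index (degree-day) model. The sketch below shows the basic calculation; the degree-day factor and the temperature series are illustrative assumptions, not the calibrated values used for Glacier de la Plaine Morte.

```python
# Minimal sketch of a temperature-index (degree-day) melt model, a common
# basis for glacier melt projections. The degree-day factor and temperature
# series are illustrative assumptions.
def degree_day_melt(daily_temp_c, ddf_mm_per_degday=6.0, threshold_c=0.0):
    """Return total melt in mm water equivalent for a series of daily means."""
    return sum(
        ddf_mm_per_degday * (t - threshold_c)
        for t in daily_temp_c
        if t > threshold_c
    )

july_temps = [2.5, 4.0, 5.5, 3.0, -0.5, 1.0, 6.0]   # daily means at the glacier
print(f"melt: {degree_day_melt(july_temps):.1f} mm w.e.")
```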

  13. STEPS OF THE DESIGN OF CLOUD ORIENTED LEARNING ENVIRONMENT IN THE STUDY OF DATABASES FOR FUTURE TEACHERS OF INFORMATICS

    Directory of Open Access Journals (Sweden)

    Oleksandr M. Kryvonos

    2018-02-01

    Full Text Available The article describes the introduction of cloud services into the teaching of the discipline «Databases» to future teachers of informatics, and the design of a cloud oriented learning environment on their basis. An analysis of the domestic experience of forming cloud oriented learning environments in educational institutions is carried out, and interpretations are given of the concepts «cloud oriented distance learning system», «cloud oriented learning environment in the study of databases» and «the design of the cloud oriented learning environment in the study of databases for future teachers of informatics». The following stages of designing the COLE (cloud oriented learning environment) are selected and described: targeted, conceptual, meaningful, component, introductory, and appraisal-generalization. The structure of the educational interaction of subjects in the study of databases in the conditions of the COLE is developed by means of the cloud oriented distance learning system Canvas, and consists of tools for communication, joint work, planning of educational events, and cloud storage.

  14. The Social Semantic Web in Intelligent Learning Environments: State of the Art and Future Challenges

    Science.gov (United States)

    Jovanovic, Jelena; Gasevic, Dragan; Torniai, Carlo; Bateman, Scott; Hatala, Marek

    2009-01-01

    Today's technology-enhanced learning practices cater to students and teachers who use many different learning tools and environments and are used to a paradigm of interaction derived from open, ubiquitous, and socially oriented services. In this context, a crucial issue for education systems in general, and for Intelligent Learning Environments…

  15. The ultraviolet environment of Mars: biological implications past, present, and future

    Science.gov (United States)

    Cockell, C. S.; Catling, D. C.; Davis, W. L.; Snook, K.; Kepner, R. L.; Lee, P.; McKay, C. P.

    2000-01-01

    A radiative transfer model is used to quantitatively investigate aspects of the martian ultraviolet radiation environment, past and present. Biological action spectra for DNA inactivation and chloroplast (photosystem) inhibition are used to estimate biologically effective irradiances for the martian surface under cloudless skies. Over time Mars has probably experienced an increasingly inhospitable photobiological environment, with present instantaneous DNA weighted irradiances 3.5-fold higher than they may have been on early Mars. This is in contrast to the surface of Earth, which experienced an ozone amelioration of the photobiological environment during the Proterozoic and now has DNA weighted irradiances almost three orders of magnitude lower than early Earth. Although the present-day martian UV flux is similar to that of early Earth and thus may not be a critical limitation to life in the evolutionary context, it is a constraint to an unadapted biota and will rapidly kill spacecraft-borne microbes not covered by a martian dust layer. Microbial strategies for protection against UV radiation are considered in the light of martian photobiological calculations, past and present. Data are also presented for the effects of hypothetical planetary atmospheric manipulations on the martian UV radiation environment with estimates of the biological consequences of such manipulations.
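
    The biologically effective irradiance referred to above is obtained by weighting the modelled surface spectrum with a biological action spectrum and integrating over wavelength. The sketch below shows that calculation with made-up numbers; the spectra and units are illustrative assumptions, not the paper's model output.

```python
# Minimal sketch: a biologically effective irradiance is the spectral
# irradiance weighted by an action spectrum (e.g. for DNA inactivation) and
# integrated over wavelength. All numbers below are made up for illustration.
def weighted_irradiance(wavelengths_nm, spectral_irradiance, action_spectrum):
    """Trapezoidal integration of E(lambda) * S(lambda) over wavelength."""
    total = 0.0
    for i in range(len(wavelengths_nm) - 1):
        w0 = spectral_irradiance[i] * action_spectrum[i]
        w1 = spectral_irradiance[i + 1] * action_spectrum[i + 1]
        total += 0.5 * (w0 + w1) * (wavelengths_nm[i + 1] - wavelengths_nm[i])
    return total  # units of spectral irradiance times nm

wl = [200, 250, 300, 350, 400]           # nm
irr = [0.0, 0.5, 2.0, 10.0, 25.0]        # W m-2 nm-1 (illustrative)
dna = [1.0, 0.3, 0.01, 1e-4, 1e-6]       # relative DNA action spectrum (illustrative)
print(f"DNA-weighted irradiance: {weighted_irradiance(wl, irr, dna):.2f} W m-2")
```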

  16. The ultraviolet environment of Mars: biological implications past, present, and future.

    Science.gov (United States)

    Cockell, C S; Catling, D C; Davis, W L; Snook, K; Kepner, R L; Lee, P; McKay, C P

    2000-08-01

    A radiative transfer model is used to quantitatively investigate aspects of the martian ultraviolet radiation environment, past and present. Biological action spectra for DNA inactivation and chloroplast (photosystem) inhibition are used to estimate biologically effective irradiances for the martian surface under cloudless skies. Over time Mars has probably experienced an increasingly inhospitable photobiological environment, with present instantaneous DNA weighted irradiances 3.5-fold higher than they may have been on early Mars. This is in contrast to the surface of Earth, which experienced an ozone amelioration of the photobiological environment during the Proterozoic and now has DNA weighted irradiances almost three orders of magnitude lower than early Earth. Although the present-day martian UV flux is similar to that of early Earth and thus may not be a critical limitation to life in the evolutionary context, it is a constraint to an unadapted biota and will rapidly kill spacecraft-borne microbes not covered by a martian dust layer. Microbial strategies for protection against UV radiation are considered in the light of martian photobiological calculations, past and present. Data are also presented for the effects of hypothetical planetary atmospheric manipulations on the martian UV radiation environment with estimates of the biological consequences of such manipulations.

  17. Beyond the Personal Learning Environment: Attachment and Control in the Classroom of the Future

    Science.gov (United States)

    Johnson, Mark William; Sherlock, David

    2014-01-01

    The Personal Learning Environment (PLE) has been presented in a number of guises over a period of 10 years as an intervention which seeks the reorganisation of educational technology through shifting the "locus of control" of technology towards the learner. In the intervening period to the present, a number of initiatives have attempted…

  18. Protecting the Environment for the Sake of Our Common Future. Special Report 4.

    Science.gov (United States)

    Born, Sigrid, Ed.

    In June 1992, representatives of more than 170 countries met in Rio de Janeiro, at the United Nations Conference on Environment and Development, to consider international cooperation aimed at preserving the sources of human life. This report presents Germany's involvement in that cooperative effort. The report is presented in six sections: (1) an…

  19. Fast and Accurate Simulation of the Cray XMT Multithreaded Supercomputer

    Energy Technology Data Exchange (ETDEWEB)

    Villa, Oreste; Tumeo, Antonino; Secchi, Simone; Manzano Franco, Joseph B.

    2012-12-31

    Irregular applications, such as data mining and analysis or graph-based computations, show unpredictable memory/network access patterns and control structures. Highly multithreaded architectures with large processor counts, like the Cray MTA-1, MTA-2 and XMT, appear to address their requirements better than commodity clusters. However, the research on highly multithreaded systems is currently limited by the lack of adequate architectural simulation infrastructures due to issues such as size of the machines, memory footprint, simulation speed, accuracy and customization. At the same time, Shared-memory MultiProcessors (SMPs) with multi-core processors have become an attractive platform to simulate large scale machines. In this paper, we introduce a cycle-level simulator of the highly multithreaded Cray XMT supercomputer. The simulator runs unmodified XMT applications. We discuss how we tackled the challenges posed by its development, detailing the techniques introduced to make the simulation as fast as possible while maintaining a high accuracy. By mapping XMT processors (ThreadStorm with 128 hardware threads) to host computing cores, the simulation speed remains constant as the number of simulated processors increases, up to the number of available host cores. The simulator supports zero-overhead switching among different accuracy levels at run-time and includes a network model that takes into account contention. On a modern 48-core SMP host, our infrastructure simulates a large set of irregular applications 500 to 2000 times slower than real time when compared to a 128-processor XMT, while remaining within 10% of accuracy. Emulation is only from 25 to 200 times slower than real time.

  20. The future context of work in the business environment in South Africa: Some empirical evidence

    OpenAIRE

    PS Nel; AJ Du Plessis; AE Marx

    2014-01-01

    The future is uncertain, but management needs to determine and also be informed about possible change trends. This research, however, reports on empirical results of the views of South African HRM practitioners to identify and prioritise business change trends for 2002 and 2010 in terms of the “hard” or “soft” HRM debate in the literature. All organisations employing HRM practitioners were included, and a total of 1640 questionnaires were distributed, resulting in 207 useable responses. ...

  1. Environment

    DEFF Research Database (Denmark)

    Valentini, Chiara

    2017-01-01

    The term environment refers to the internal and external context in which organizations operate. For some scholars, environment is defined as an arrangement of political, economic, social and cultural factors existing in a given context that have an impact on organizational processes and structures. For others, environment is a generic term describing a large variety of stakeholders and how these interact and act upon organizations. Organizations and their environment are mutually interdependent, and organizational communications are highly affected by the environment. This entry examines the origin and development of organization-environment interdependence, the nature of the concept of environment and its relevance for communication scholarship and activities.

  2. EDF's experience with supercomputing and challenges ahead - towards multi-physics and multi-scale approaches

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M.; Banner, D. [Electricite de France (EDF)- R and D Division, 92 - Clamart (France)

    2003-07-01

    Nuclear power plants are a major asset of the EDF company. To remain so, in particular in a context of deregulation, competitiveness, safety and public acceptance are three conditions. These stakes apply both to existing plants and to future reactors. The purpose of the presentation is to explain how supercomputing can help EDF to satisfy these requirements. Three examples are described in detail: ensuring optimal use of nuclear fuel under wholly safe conditions, understanding and simulating the material deterioration mechanisms and moving forward with numerical simulation for the performance of EDF's activities. In conclusion, a broader vision of EDF long term R and D in the field of numerical simulation is given and especially of five challenges taken up by EDF together with its industrial and scientific partners. (author)

  3. Central automatic control or distributed occupant control for better indoor environment quality in the future

    DEFF Research Database (Denmark)

    Toftum, Jørn

    2010-01-01

    The database was composed of 1272 responses obtained in 24 buildings, of which 15 had mechanical ventilation (997 responses) and 9 had natural ventilation (275 responses). The number of occupant-reported control opportunities was higher in buildings with natural ventilation, with a discrepancy in the degree of perceived control. The degree of control, as perceived by occupants, seemed more important for the prevalence of adverse symptoms and building-related symptoms than the ventilation mode per se. This result indicates that even though the development and application of new indoor environment sensors and HVAC control systems may allow for fully automated IEQ control, such systems should not compromise occupants' perception of having some degree of control of their indoor environment.

  4. The Value of Biomedical Simulation Environments to Future Human Space Flight Missions

    Science.gov (United States)

    Mulugeta, Lealem; Myers, Jerry G.; Lewandowski, Beth; Platts, Steven H.

    2011-01-01

    Mars and NEO missions will expose astronauts to extended durations of reduced gravity, isolation and higher radiation. These new operational conditions pose health risks that are not well understood and perhaps unanticipated. Advanced computational simulation environments can beneficially augment research to predict, assess and mitigate potential hazards to astronaut health. The NASA Digital Astronaut Project (DAP), within the NASA Human Research Program, strives to achieve this goal.

  5. Emerging pollutants in the environment: present and future challenges in biomonitoring, ecological risks and bioremediation

    OpenAIRE

    Gavrilescu, M.; Demnerová, K.; Aamand, J.; Agathos, S.; Fava, F.

    2015-01-01

    Emerging pollutants reach the environment from various anthropogenic sources and are distributed throughout environmental matrices. Although great advances have been made in the detection and analysis of trace pollutants during recent decades, due to the continued development and refinement of specific techniques, a wide array of undetected contaminants of emerging environmental concern need to be identified and quantified in various environmental components and biological tissues. These poll...

  6. CosmoBon for studying wood formation under exotic gravitational environment for future space agriculture

    Science.gov (United States)

    Tomita-Yokotani, Kaori; Baba, Keiichi; Suzuki, Toshisada; Funada, Ryo; Nakamura, Teruko; Hashimoto, Hirofumi; Yamashita, Masamichi; Cosmobon, Jstwg

    We are proposing to raise woody plants in space for several applications and for plant science. The Japanese flowering cherry tree is one candidate for these studies. The mechanism behind sensing gravity and controlling the shape of a tree has been studied quite extensively. Although the molecular mechanism of the plant response to gravity has been investigated intensively for various species, woody plants have been left behind. The morphology of woody branch growth differs from that of stem growth in herbs; tree morphology is strongly dominated by secondary xylem formation. Nobody knows what shape a tree would take when grown in the space environment. If whole trees could be brought into space as research material, they might provide important scientific knowledge. Furthermore, trees produce excess oxygen and wooden materials for living cabins, and provide biomass for cultivating mushrooms and insects for space agriculture. Excellent tree shapes, which are deeply related to wood formation, would improve the quality of life in the stressful environment of outer space. The serious problem would be their size. Bonsai is one of the traditional Japanese arts. Using bonsai, we can study secondary xylem formation, that is, wood formation, under an exotic gravitational environment. "CosmoBon" is the small bonsai tree for our space experiment. It has been recognized that reaction wood in CosmoBon forms similarly to that in natural trees. Our goal is to examine the feasibility of growing various species of trees in space as a bioresource for space agriculture.

  7. Human-Automation Cooperation for Separation Assurance in Future NextGen Environments

    Science.gov (United States)

    Mercer, Joey; Homola, Jeffrey; Cabrall, Christopher; Martin, Lynne; Morey, Susan; Gomez, Ashley; Prevot, Thomas

    2014-01-01

    A 2012 Human-In-The-Loop air traffic control simulation investigated a gradual paradigm-shift in the allocation of functions between operators and automation. Air traffic controllers staffed five adjacent high-altitude en route sectors, and during the course of a two-week experiment, worked traffic under different function-allocation approaches aligned with four increasingly mature NextGen operational environments. These NextGen time-frames ranged from near current-day operations to nearly fully-automated control, in which the ground system's automation was responsible for detecting conflicts, issuing strategic and tactical resolutions, and alerting the controller to exceptional circumstances. Results indicate that overall performance was best in the most automated NextGen environment. Safe operations were achieved in this environment for twice today's peak airspace capacity, while being rated by the controllers as highly acceptable. However, results show that sector operations were not always safe; separation violations did in fact occur. This paper will describe in detail the simulation conducted, as well as discuss important results and their implications.

  8. Remote physiological monitoring in an austere environment: a future for battlefield care provision?

    Science.gov (United States)

    Smyth, Matthew J; Round, J A; Mellor, A J

    2018-05-14

    Wearable technologies are making considerable advances into the mainstream as they become smaller and more user friendly. The global market for such devices is forecast to be worth over US$5 billion in 2018, with one in six people owning a device. Many professional sporting teams use self-monitoring to assess physiological parameters and work rate on the pitch, highlighting the potential utility for military command chains. As device size reduces and sensitivity improves, coupled with remote connectivity technology, integration into the military environment could be relatively seamless. Remote monitoring of personnel on the ground, giving live updates on their physiological status, would allow commanders or medical officers to manage their soldiers appropriately and improve combat effectiveness. This paper explores a proof of concept for the use of a self-monitoring system in the austere high-altitude environment of the Nepalese Himalayas, akin to those experienced by modern militaries fighting in remote locations. It also reviews, in part, the historical development of remote monitoring technologies. The system allowed physiological recordings, plotted against GPS position, to be remotely monitored in Italy. Examples of the data recorded are given and the performance of the system is discussed, including limitations, potential areas of development and how systems like this one could be integrated into the military environment.

  9. Securing a better future for all: Nuclear techniques for global development and environmental protection. NA factsheet on environment laboratories: Protecting the environment

    International Nuclear Information System (INIS)

    2012-01-01

    According to the Millennium Development Goals, managing the environment is considered an integral part of the global development process. The main purpose of the IAEA's environment laboratories is to provide Member States with reliable information on environmental issues and facilitate decision making on protection of the environment. An increasingly important feature of this work is to assess the impact of climate change on environmental sustainability and natural resources. The IAEA's environment laboratories use nuclear techniques, radionuclides, isotopic tracers and stable isotopes to gain a better understanding of the various marine processes, including locating the sources of pollutants and their fate, their transport pathways and their ultimate accumulation in sediments. Radioisotopes are also used to study bioaccumulation in organisms and the food chain, as well as to track signals of climate change throughout history. Natural and artificial radionuclides are used to track ocean currents in key regions. They are also used to validate models designed to predict the future impact of climate change and ocean acidification. The laboratories study the fate and impact of contamination on a variety of ecosystems in order to provide effective preventative diagnostic and remediation strategies. They enhance the capability of Member States to use nuclear techniques to understand and assess changes in their own terrestrial and atmospheric environments, and adopt suitable and sustainable remediation measures when needed. Since 1995, the IAEA environment laboratories have coordinated the international network of Analytical Laboratories for the Measurement of Environmental Radioactivity, providing accurate analysis in the event of an accident or an intentional release of radioactivity. In addition, the laboratories work alongside other organizations, such as UNESCO, the IOC, UNEP and the EC. The laboratories collaborate with Member States through direct involvement with

  10. Learning in the e-environment: new media and learning for the future

    Directory of Open Access Journals (Sweden)

    Milan Matijević

    2015-03-01

    We live in times of rapid change in all areas of science, technology, communication and social life. Every day we are asked to what extent school prepares us for these changes and for life in a new, multimedia environment. Children and adolescents spend less time at school or in other settings of learning than they do outdoors or within other social communities (family, clubs, societies, religious institutions and the like). Experts must constantly inquire about what exactly influences learning and development in our rich media environment. The list of the most important life competences has significantly changed and expanded since the last century. Educational experts are attempting to predict changes in the content and methodology of learning at the beginning of the 21st century. Answers are sought to key questions such as: what should one learn; how should one learn; where should one learn; why should one learn; and how do these answers relate to the new learning environment? In his examination of the way children and young people learn and grow up, the author places special attention on the relationship between personal and non-personal communication (e.g. the internet, mobile phones and different types of e-learning). He deals with today's questions by looking back to some of the more prominent authors and studies of the past fifty years that tackled identical or similar questions (Alvin Toffler, Ivan Illich, George Orwell, and the members of the Club of Rome). The conclusion reached is that in today's world of rapid and continuous change, it is much more crucial than in the last century both to be able to learn and to adapt to learning with the help of new media.

  11. Emerging pollutants in the environment: present and future challenges in biomonitoring, ecological risks and bioremediation.

    Science.gov (United States)

    Gavrilescu, Maria; Demnerová, Kateřina; Aamand, Jens; Agathos, Spiros; Fava, Fabio

    2015-01-25

    Emerging pollutants reach the environment from various anthropogenic sources and are distributed throughout environmental matrices. Although great advances have been made in the detection and analysis of trace pollutants during recent decades, due to the continued development and refinement of specific techniques, a wide array of undetected contaminants of emerging environmental concern need to be identified and quantified in various environmental components and biological tissues. These pollutants may be mobile and persistent in air, water, soil, sediments and ecological receptors even at low concentrations. Robust data on their fate and behaviour in the environment, as well as on threats to ecological and human health, are still lacking. Moreover, the ecotoxicological significance of some emerging micropollutants remains largely unknown, because satisfactory data to determine their risk often do not exist. This paper discusses the fate, behaviour, (bio)monitoring, environmental and health risks associated with emerging chemical (pharmaceuticals, endocrine disruptors, hormones, toxins, among others) and biological (bacteria, viruses) micropollutants in soils, sediments, groundwater, industrial and municipal wastewaters, aquaculture effluents, and freshwater and marine ecosystems, and highlights new horizons for their (bio)removal. Our study aims to demonstrate the imperative need to boost research and innovation for new and cost-effective treatment technologies, in line with the uptake, mode of action and consequences of each emerging contaminant. We also address the topic of innovative tools for the evaluation of the effects of toxicity on human health and for the prediction of microbial availability and degradation in the environment. Additionally, we consider the development of (bio)sensors to perform environmental monitoring in real-time mode. This needs to address multiple species, along with a more effective exploitation of specialised microbes or enzymes

  12. FUTURE FOREIGN LANGUAGE TEACHERS' SOCIAL AND COGNITIVE COLLABORATION IN AN ONLINE ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Nike Arnold

    2006-01-01

    Discussion boards provide an interactive venue where new and future language teachers can reflect, evaluate, solve problems or simply exchange ideas (e.g., Bonk, Hansen, Grabner-Hagen, Lazar, & Mirabelli, 1996; DeWert, Babinski, & Jones, 2003; Kumari, 2001; Pawan, Paulus, Yalcin, & Chang, 2003). In addition, encouraging future teachers to learn with technology before teaching with it allows them to become comfortable using various computer applications. This article examines transcripts from a semester-long asynchronous discussion between foreign language methodology classes at two different universities. Social and cognitive presence in the discussions was analyzed using Garrison, Anderson, and Archer's Framework of a Community of Inquiry (2001). The results indicate that students engaged in a high degree of interactivity as well as all types of social and cognitive presence. These findings indicate that students not only progressed in their cognitive understanding of the pedagogical topics, but also employed social presence, the more dominant of the two, to aid their discussions. The topics seemed to play an important role in the type of cognitive activity evident in the discussions. These results differ from those of studies which found that students did not engage in interactivity (Henri, 1995; Pena-Shaff & Nicholls, 2004) and others which noted low levels of social presence (Garrison et al., 2001; Meyer, 2003).

  13. Svalbard as a study model of future High Arctic coastal environments in a warming world

    Directory of Open Access Journals (Sweden)

    Jacek Piskozub

    2017-10-01

    The Svalbard archipelago, a high-latitude area in a region undergoing rapid climate change, is relatively easily accessible for field research. This makes the fjords of Spitsbergen, its largest island, some of the best studied Arctic coastal areas. This paper aims at answering the question of how climatically diverse the fjords are, and how representative they are of the expected future Arctic with its diminishing range of seasonal sea-ice. This study uses a meteorological reanalysis, sea surface temperature climatology, and the results of a recent one-year meteorological campaign in Spitsbergen to determine the seasonal differences between different Spitsbergen fjords, as well as the sea water temperature and ice ranges around Svalbard in recent years. The results show that Spitsbergen fjords have diverse seasonal patterns of air temperature due to differences in the SST of the adjacent ocean, and different cloudiness. The sea water temperatures and ice concentrations around Svalbard in recent years are similar to what is expected for most of the Arctic coastal areas in the second half of this century. This makes Spitsbergen a unique field study model of the conditions expected in the future warmer High Arctic.

  14. Mineral formation on metallic copper in a 'future repository site environment'

    International Nuclear Information System (INIS)

    Amcoff, Oe.; Holenyi, K.

    1996-04-01

    Since reducing conditions are expected, much effort has been concentrated on Cu-sulfides and CuFe-sulfides. However, oxidizing conditions are also discussed. A list of copper minerals is included. It is concluded that mineral formation and mineral transitions on the copper canister surface will be governed by kinetics and metastabilities rather than by stability relations. The sulfides formed are less likely to form a passivating layer, and the rate of sulfide growth will probably be governed by the rate of transport of reacting species to the canister surface. A series of tests is recommended, in an environment resembling the initial repository site conditions. 82 refs, 8 figs

  15. Mineral formation on metallic copper in a `future repository site environment`

    Energy Technology Data Exchange (ETDEWEB)

    Amcoff, Oe; Holenyi, K

    1996-04-01

    Since reducing conditions are expected, much effort has been concentrated on Cu-sulfides and CuFe-sulfides. However, oxidizing conditions are also discussed. A list of copper minerals is included. It is concluded that mineral formation and mineral transitions on the copper canister surface will be governed by kinetics and metastabilities rather than by stability relations. The sulfides formed are less likely to form a passivating layer, and the rate of sulfide growth will probably be governed by the rate of transport of reacting species to the canister surface. A series of tests is recommended, in an environment resembling the initial repository site conditions. 82 refs, 8 figs.

  16. The challenge of monitoring the cryosphere in alpine environments: Prepare the present for the future

    Science.gov (United States)

    Fischer, Andrea; Helfricht, Kay; Seiser, Bernd; Stocker-Waldhuber, Martin; Hartl, Lea; Wiesenegger, Hans

    2017-04-01

    Understanding the interaction of mountain glaciers and permafrost with weather and climate is essential for the interpretation of past states of the cryosphere in terms of climate change. Most of the glaciers and rock glaciers in Eastern Alpine terrain are subject to strong gradients in climatic forcing, and the persistence of these gradients under past climatic conditions is, more or less, unknown. Thus a key challenge of monitoring the cryosphere is to define the demands on a monitoring strategy for capturing essential processes and their potential changes. For example, the effects of orographic precipitation and local shading vary with general circulation patterns and the amount of solar radiation during the melt(ing) season. Recent investigations based on the Austrian glacier inventories have shown that glacier distribution is closely linked to topography and climatic situation, and that these two parameters imply also different sensitivities of the specific glaciers to progressing climate change. This leads to the need to develop a monitoring system capturing past, but also fairly unknown future ensembles of climatic state and sensitivities. As a first step, the Austrian glacier monitoring network has been analyzed from the beginning of the records onwards. Today's monitoring network bears the imprints of past research interests, but also past funding policies and personal/institutional engagements. As a limitation for long term monitoring in general, today's monitoring strategies have to cope with being restricted to these historical commitments to preserve the length of the time series, but at the same time expanding the measurements to fulfil present and future scientific and societal demands. The decision on cryospheric benchmark sites has an additional uncertainty: the ongoing disintegration of glaciers, their increasing debris cover as well as the potential low ice content and relatively unknown reaction of rock glaciers in the course of climate change

  17. Central automatic control or distributed occupant control for better indoor environment quality in the future

    Energy Technology Data Exchange (ETDEWEB)

    Toftum, Joern [International Centre for Indoor Environment and Energy, Department of Civil Engineering, Technical University of Denmark, DTU, Building 402, DK-2800 Lyngby (Denmark)

    2010-01-15

    Based on a database accumulated from several recent surveys of office buildings located in a temperate climate (Denmark), the effect on occupant perceptions and symptom prevalence was compared in buildings with natural and with mechanical ventilation in which earlier studies have shown a discrepancy in the degree of perceived control. The database was composed of 1272 responses obtained in 24 buildings of which 15 had mechanical ventilation (997 responses) and nine had natural ventilation (275 responses). The number of occupant-reported control opportunities was higher in buildings with natural ventilation. Analysis of occupant responses, after grouping according to categories determined by the degree of satisfaction with the perceived control, showed that it was more likely the degree of control satisfaction that affected the prevalence of adverse perceptions and symptoms. Thus, the degree of control, as perceived by occupants, seemed more important for the prevalence of adverse symptoms and building-related symptoms than the ventilation mode per se. This result indicates that even though the development and application of new indoor environment sensors and HVAC control systems may allow for fully automated IEQ control, such systems should not compromise occupants' perception of having some degree of control of their indoor environment. (author)

  18. Early social environment affects the endogenous oxytocin system: a review and future directions

    Directory of Open Access Journals (Sweden)

    Emily eAlves

    2015-03-01

    Endogenous oxytocin plays an important role in a wide range of human functions including birth, milk ejection during lactation and facilitation of social interaction. There is increasing evidence that both variations in the oxytocin receptor (OXTR) and concentrations of oxytocin are associated with differences in these functions. The causes for the differences that have been observed in tonic and stimulated oxytocin release remain unclear. Previous reviews have suggested that across the life course, these differences may be due to individual factors, e.g. genetic variation (of the OXTR), age or sex, or be the result of early environmental influences such as social experiences, stress or trauma, partly by inducing epigenetic changes. This review has three aims. First, we briefly discuss the endogenous oxytocin system, including physiology, development, individual differences and function. Secondly, current models describing the relationship between the early life environment and the development of the oxytocin system in humans and animals are discussed. Finally, we describe research designs that can be used to investigate the effects of the early environment on the oxytocin system, identifying specific areas of research that need further attention.

  19. The Value of Biomedical Simulation Environments to Future Human Space Flight Missions

    Science.gov (United States)

    Mulugeta, Lealem; Myers, Jerry G.; Skytland, Nicholas G.; Platts, Steven H.

    2010-01-01

    With the ambitious goals to send manned missions to asteroids and on to Mars, substantial work will be required to ensure the well-being of the men and women who will undertake these difficult missions. Unlike current International Space Station or Shuttle missions, astronauts will be required to endure long-term exposure to higher levels of radiation, isolation and reduced gravity. These new operating conditions will pose health risks that are currently not well understood and perhaps unanticipated. Therefore, it is essential to develop and apply advanced tools to predict, assess and mitigate potential hazards to astronaut health. NASA's Digital Astronaut Project (DAP) is working to develop and apply computational models of physiologic response to space flight operating conditions over various time periods and environmental circumstances. The collective application and integration of well-vetted models assessing physiology, biomechanics and anatomy is referred to as the Digital Astronaut. The Digital Astronaut simulation environment will serve as a practical working tool for use by NASA in operational activities such as the prediction of biomedical risks and functional capabilities of astronauts. In addition to space flight operating conditions, DAP's work has direct applicability to terrestrial biomedical research by providing virtual environments for hypothesis testing and experiment design, and to reduce animal/human testing. A practical application of the DA to assess pre- and post-flight responses to exercise is illustrated, and the difficulty in matching true physiological responses is discussed.

  20. [Environment and health in Gela (Sicily): present knowledge and prospects for future studies].

    Science.gov (United States)

    Musmeci, Loredana; Bianchi, Fabrizio; Carere, Mario; Cori, Liliana

    2009-01-01

    The study area includes the Municipalities of Gela, Niscemi and Butera, located in the south of Sicily, Italy. In 1990 it was declared an Area at High Risk of Environmental Crisis. In 2000 part of it was designated as the Gela Reclamation Site of National Interest (RSNI). The site includes a private industrial area and public and marine areas, for a total of 51 km². The population of Gela in 2008 was 77,145 (54,774 in 1961). Elevation above sea level: 46 m. Total area: 276 km². Grid reference: 37°4'0" N, 14°15'0" E. Niscemi and Butera border Gela, with populations of 26,541 and 5,063 and elevations above sea level of 332 m and 402 m, respectively. Close to the city of Gela, the industrial area, operating since 1962, includes chemical production plants, a power station and an oil refinery, one of the largest in Europe, refining 5 million tons of crude per year. From the beginning, the workforce decreased from 7,000 to the current 3,000 units. Over the years, these industrial activities have been a major source of environmental pollution. Extremely high levels of toxic, persistent and bio-accumulating chemical pollutants have been documented. Many relevant environmental and health data are available. Prior to the studies described in the present publication, their use in order to identify environmental pressures on health has been limited. Nevertheless, for several years different epidemiological studies have provided evidence of the occurrence of health outcomes significantly higher than in neighbouring areas and compared to regional data. In 2007 a Multidisciplinary Working Group was established to analyze the existing data on pollution, exposure and effect and to complete current knowledge on the cycle of pollutants, from migration in the environment to health impact. The present publication is a collection of contributions from this group of experts, supported by the following projects: Evaluation of environmental health impact and estimation of economic costs

  1. Comparative assessment for future prediction of urban water environment using WEAP model: A case study of Kathmandu, Manila and Jakarta

    Science.gov (United States)

    Kumar, Pankaj; Yoshifumi, Masago; Ammar, Rafieiemam; Mishra, Binaya; Fukushi, Ken

    2017-04-01

    Uncontrolled release of pollutants, increasingly extreme weather conditions, rapid urbanization and poor governance pose a serious threat to sustainable water resource management in developing urban spaces. Considering that half of the world's mega-cities are in Asia and the Pacific, where 1.7 billion people do not have access to improved water and sanitation, water security through proper management is both an increasing concern and a critical need. This research work gives a brief glimpse of the predicted future water environment in the Bagmati, Pasig and Ciliwung rivers in three different cities, viz. Kathmandu, Manila and Jakarta, respectively. A hydrological model is used here to foresee the collective impacts of rapid population growth due to urbanization, as well as climate change, on unmet demand and water quality by 2030. All three rivers are major sources of water for different uses, viz. domestic, industrial, agricultural and recreational, but uncontrolled withdrawal and sewage disposal have caused deterioration of the water environment in the recent past. The Water Evaluation and Planning (WEAP) model was used to simulate future river water quality scenarios using four indicators, i.e. Dissolved Oxygen (DO), Biochemical Oxygen Demand (BOD), Chemical Oxygen Demand (COD) and Nitrate (NO3). The simulated water quality and unmet demand for the year 2030, when compared with the reference year, clearly indicate that not only does water quality deteriorate but unmet demand also increases over time. This suggests that current initiatives and policies for water resource management are not sufficient, and hence immediate and inclusive action through transdisciplinary research is needed.
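
    A minimal illustration of the kind of reference-versus-scenario comparison described above (this is not the WEAP API; the indicator names follow the abstract, while the data structure, function name and any values a user would pass in are hypothetical):

        # Sketch: compare simulated water-quality indicators and unmet demand
        # between a reference year and a future (e.g. 2030) scenario.
        # A rise in BOD/COD/NO3 or unmet demand, or a drop in DO, signals deterioration.
        from dataclasses import dataclass

        @dataclass
        class ScenarioResult:
            do_mg_l: float            # Dissolved Oxygen
            bod_mg_l: float           # Biochemical Oxygen Demand
            cod_mg_l: float           # Chemical Oxygen Demand
            no3_mg_l: float           # Nitrate
            unmet_demand_mcm: float   # unmet water demand, million m3

        def relative_change(reference: ScenarioResult, future: ScenarioResult) -> dict:
            """Fractional change of each indicator from the reference year to the scenario year."""
            def rel(ref, fut):
                return (fut - ref) / ref if ref else float("nan")
            return {
                "DO": rel(reference.do_mg_l, future.do_mg_l),
                "BOD": rel(reference.bod_mg_l, future.bod_mg_l),
                "COD": rel(reference.cod_mg_l, future.cod_mg_l),
                "NO3": rel(reference.no3_mg_l, future.no3_mg_l),
                "unmet_demand": rel(reference.unmet_demand_mcm, future.unmet_demand_mcm),
            }

        # Example with made-up values:
        # relative_change(ScenarioResult(6.5, 12, 40, 2.1, 5), ScenarioResult(4.8, 21, 70, 3.4, 12))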

  2. Greening Internet of Things for Smart Everythings with A Green-Environment Life: A Survey and Future Prospects

    OpenAIRE

    Alsamhi, S. H.; Ma, Ou; Ansari, M. Samar; Meng, Qingliang

    2018-01-01

    Tremendous technology development in the field of the Internet of Things (IoT) has changed the way we work and live. Although the numerous advantages of IoT are enriching our society, it should be remembered that the IoT also consumes energy and generates toxic pollution and e-waste. These place new stress on the environment and the smart world. In order to increase the benefits and reduce the harm of IoT, there is an increasing desire to move toward green IoT. Green IoT is seen as the future of IoT that ...

  3. The future is in the numbers: the power of predictive analysis in the biomedical educational environment

    Directory of Open Access Journals (Sweden)

    Charles A. Gullo

    2016-07-01

    Biomedical programs have a potential treasure trove of data they can mine to assist admissions committees in the identification of students who are likely to do well, and to help educational committees identify students who are likely to do poorly on standardized national exams and who may need remediation. In this article, we provide a step-by-step approach that schools can utilize to generate data that are useful when predicting the future performance of current students in any given program. We discuss the use of linear regression analysis as the means of generating those data and highlight some of the limitations. Finally, we lament how the combination of these institution-specific data sets is not being fully utilized at the national level, where these data could greatly assist programs at large.
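
    As a rough sketch of the linear-regression approach the authors describe, the following uses scikit-learn on a small, entirely hypothetical set of admissions metrics and exam scores (the feature names, numbers and remediation threshold are illustrative assumptions, not data from the article):

        # Fit a linear regression on admissions-stage predictors, then flag current
        # students whose predicted national-exam score falls below a remediation
        # threshold. All data below are made-up placeholders.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        # Columns: prior GPA, interview score, aptitude-test percentile (hypothetical)
        X_past = np.array([[3.4, 78, 62], [3.9, 85, 90], [2.8, 60, 40], [3.1, 72, 55],
                           [3.7, 90, 80], [2.5, 55, 35], [3.2, 70, 65], [3.8, 88, 85]])
        y_past = np.array([210, 245, 180, 200, 238, 170, 205, 242])  # exam scores

        model = LinearRegression().fit(X_past, y_past)

        X_current = np.array([[3.0, 65, 50], [3.6, 82, 75]])  # current students (hypothetical)
        predicted = model.predict(X_current)

        REMEDIATION_THRESHOLD = 195
        for student, score in zip(("student A", "student B"), predicted):
            flag = "consider remediation" if score < REMEDIATION_THRESHOLD else "on track"
            print(f"{student}: predicted {score:.0f} -> {flag}")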

  4. Recent research activities and future subjects on stable- and radio-isotopes of chlorine in environment

    International Nuclear Information System (INIS)

    Kushita, Kouhei

    2001-12-01

    This report reviews the recent studies on the stable- and radio-isotopes of chlorine from a viewpoint of environmental science, partly including historic references on this element. First, general properties, occurrence, and utilization of chlorine are described. Secondly, current status and research works on chlorine-compounds, which attract special attention in recent years as environmentally hazardous materials, are reported. Thirdly, research works on stable chlorine isotopes, 35Cl and 37Cl, are described with a focus laid on the newly-developed techniques; isotopic ratio mass spectrometry (IRMS) and thermal ionization mass spectrometry (TIMS). Fourthly, recent research works on chlorine radioisotopes, 36Cl etc., are described, focusing on the development of accelerator mass spectrometry (AMS) and its application to geochemistry and others. Finally, taking account of the above-mentioned recent works on Cl isotopes, possible future research subjects are discussed. (author)

  5. The future context of work in the business environment in South Africa: Some empirical evidence

    Directory of Open Access Journals (Sweden)

    PS Nel

    2014-10-01

    The future is uncertain, but management needs to determine, and also be informed about, possible change trends. This research reports on empirical results of the views of South African HRM practitioners to identify and prioritise business change trends for 2002 and 2010 in terms of the “hard” or “soft” HRM debate in the literature. All organisations employing HRM practitioners were included, and a total of 1640 questionnaires were distributed, resulting in 207 useable responses. The results highlight trends such as increased international competition, globalisation and inadequate skills, in different rankings for 2002 and 2010. It is concluded that HRM practitioners are influenced by the “hard” or “soft” approach when they participate in a strategic management context in organisations.

  6. Gene–Environment Interactions in Preventive Medicine: Current Status and Expectations for the Future

    Directory of Open Access Journals (Sweden)

    Hiroto Narimatsu

    2017-01-01

    The progression of many common disorders involves a complex interplay of multiple factors, including numerous different genes and environmental factors. Gene–environment cohort studies focus on the identification of risk factors that cannot be discovered by conventional epidemiological methodologies. Such epidemiological methodologies preclude precise predictions, because the exact risk factors can be revealed only after detailed analyses of the interactions among multiple factors, that is, between genes and environmental factors. To date, these cohort studies have reported some promising results. However, the findings do not yet have sufficient clinical significance for the development of precise, personalized preventive medicine. In particular, some promising preliminary studies have been conducted on the prevention of obesity. Large-scale validation studies of those preliminary studies, using a prospective cohort design and long follow-ups, will produce useful and practical evidence for the development of preventive medicine in the future.

  7. Recent research activities and future subjects on stable- and radio-isotopes of chlorine in environment

    Energy Technology Data Exchange (ETDEWEB)

    Kushita, Kouhei [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-12-01

    This report reviews the recent studies on the stable- and radio-isotopes of chlorine from a viewpoint of environmental science, partly including historic references on this element. First, general properties, occurrence, and utilization of chlorine are described. Secondly, current status and research works on chlorine-compounds, which attract special attention in recent years as environmentally hazardous materials, are reported. Thirdly, research works on stable chlorine isotopes, 35Cl and 37Cl, are described with a focus laid on the newly-developed techniques; isotopic ratio mass spectrometry (IRMS) and thermal ionization mass spectrometry (TIMS). Fourthly, recent research works on chlorine radioisotopes, 36Cl etc., are described, focusing on the development of accelerator mass spectrometry (AMS) and its application to geochemistry and others. Finally, taking account of the above-mentioned recent works on Cl isotopes, possible future research subjects are discussed. (author)

  8. The future is in the numbers: the power of predictive analysis in the biomedical educational environment

    Science.gov (United States)

    Gullo, Charles A.

    2016-01-01

    Biomedical programs have a potential treasure trove of data they can mine to assist admissions committees in the identification of students who are likely to do well, and to help educational committees identify students who are likely to do poorly on standardized national exams and who may need remediation. In this article, we provide a step-by-step approach that schools can utilize to generate data that are useful when predicting the future performance of current students in any given program. We discuss the use of linear regression analysis as the means of generating those data and highlight some of the limitations. Finally, we lament how the combination of these institution-specific data sets is not being fully utilized at the national level, where these data could greatly assist programs at large. PMID:27374246

  9. Microbial fuel cells in saline and hypersaline environments: Advancements, challenges and future perspectives.

    Science.gov (United States)

    Grattieri, Matteo; Minteer, Shelley D

    2018-04-01

    This review reports on the possibility of utilizing microbial fuel cells for the treatment of saline and hypersaline solutions. An introduction to the issues related to the biological treatment of saline and hypersaline wastewater is given, discussing the limitations that characterize classical aerobic and anaerobic digestion. The microbial fuel cell (MFC) technology, and the possibility of applying it in the presence of high salinity, is discussed before reviewing the most recent advancements in the development of MFCs operating in saline and hypersaline conditions, with their different and interesting applications. Specifically, the research performed in the last 5 years will be the main focus of this review. Finally, the future perspectives for this technology, together with the most urgent research needs, are presented.

  10. NASA's Climate in a Box: Desktop Supercomputing for Open Scientific Model Development

    Science.gov (United States)

    Wojcik, G. S.; Seablom, M. S.; Lee, T. J.; McConaughy, G. R.; Syed, R.; Oloso, A.; Kemp, E. M.; Greenseid, J.; Smith, R.

    2009-12-01

    NASA's High Performance Computing Portfolio in cooperation with its Modeling, Analysis, and Prediction program intends to make its climate and earth science models more accessible to a larger community. A key goal of this effort is to open the model development and validation process to the scientific community at large such that a natural selection process is enabled and results in a more efficient scientific process. One obstacle to others using NASA models is the complexity of the models and the difficulty in learning how to use them. This situation applies not only to scientists who regularly use these models but also non-typical users who may want to use the models such as scientists from different domains, policy makers, and teachers. Another obstacle to the use of these models is that access to high performance computing (HPC) accounts, from which the models are implemented, can be restrictive with long wait times in job queues and delays caused by an arduous process of obtaining an account, especially for foreign nationals. This project explores the utility of using desktop supercomputers in providing a complete ready-to-use toolkit of climate research products to investigators and on demand access to an HPC system. One objective of this work is to pre-package NASA and NOAA models so that new users will not have to spend significant time porting the models. In addition, the prepackaged toolkit will include tools, such as workflow, visualization, social networking web sites, and analysis tools, to assist users in running the models and analyzing the data. The system architecture to be developed will allow for automatic code updates for each user and an effective means with which to deal with data that are generated. We plan to investigate several desktop systems, but our work to date has focused on a Cray CX1. Currently, we are investigating the potential capabilities of several non-traditional development environments. While most NASA and NOAA models are

  11. Biofuels are (Not the Future! Legitimation Strategies of Sustainable Ventures in Complex Institutional Environments

    Directory of Open Access Journals (Sweden)

    Neil A. Thompson

    2018-04-01

    Sustainable ventures often lack legitimacy (being perceived as desirable and appropriate) because various stakeholder groups use contradictory institutions (rules and norms) to make their judgements, which leads to fewer resources being available and higher failure rates. Using an institutional theory framework and a multi-case research design with 15 biofuel ventures operating in the Netherlands, this study asks how sustainable entrepreneurs attempt to gain legitimacy in these circumstances. Analysis reveals that the entrepreneurs use a combination of rhetorical, reconciliatory and institutional change strategies to obtain legitimacy from different stakeholder groups. These findings further our understanding of sustainable entrepreneurial behavior by revealing how and why different legitimation strategies are used in complex institutional environments.

  12. A low-carbon future: Spatial planning's role in enhancing technological innovation in the built environment

    International Nuclear Information System (INIS)

    Crawford, Jenny; French, Will

    2008-01-01

    The scope of spatial planning activity includes issues of governance, corporate organisation, policy integration, statutory and regulatory frameworks, and technical analysis and design. The nature of its potential contribution to achieving low-carbon built environments will vary according to the resolution of tensions between pressures for leadership, consistent decision making and speed of change and the value placed on diversity, flexibility and innovation. A planning system that can support technological innovation will be characterised by high levels of organisational and institutional capacity and high-quality knowledge systems that support a focus on delivering place-based objectives. The paper reflects on further aspects of such a system and the issues that spatial planning needs to address in delivering low-carbon energy systems

  13. The SEDIBUD (Sediment Budgets in Cold Environments) Programme: Current activities and future key tasks

    Science.gov (United States)

    Beylich, A. A.; Lamoureux, S. F.; Decaulne, A.

    2012-04-01

    Projected climate change in cold regions is expected to alter melt season duration and intensity, along with the number of extreme rainfall events, total annual precipitation and the balance between snowfall and rainfall. Similarly, changes to the thermal balance are expected to reduce the extent of permafrost and seasonal ground frost and increase active layer depths. These effects will undoubtedly change surface environments in cold regions and alter the fluxes of sediments, nutrients and solutes, but the absence of quantitative data and coordinated process monitoring and analysis to understand the sensitivity of the Earth surface environment is acute in cold climate environments. The International Association of Geomorphologists (I.A.G./A.I.G.)SEDIBUD (Sediment Budgets in Cold Environments) Programme was formed in 2005 to address this existing key knowledge gap. SEDIBUD currently has about 400 members worldwide and the Steering Committee of this international programme is composed of ten scientists from eight different countries: Achim A. Beylich (Chair) (Norway), Armelle Decaulne (Secretary) (France), John C. Dixon (USA), Scott F. Lamoureux (Vice-Chair) (Canada), John F. Orwin (Canada), Jan-Christoph Otto (Austria), Irina Overeem (USA), Thorsteinn Saemundsson (Iceland), Jeff Warburton (UK), Zbigniew Zwolinski (Poland). The central research question of this global group of scientists is to: Assess and model the contemporary sedimentary fluxes in cold climates, with emphasis on both particulate and dissolved components. Initially formed as European Science Foundation (ESF) Network SEDIFLUX (2004-2006), SEDIBUD has further expanded to a global group of researchers with field research sites located in polar and alpine regions in the northern and southern hemisphere. Research carried out at each of the close to 50 defined SEDIBUD key test sites varies by programme, logistics and available resources, but typically represent interdisciplinary collaborations of

  14. Campus Retrofitting (CARE) Methodology: A Way to Co-Create Future Learning Environments

    DEFF Research Database (Denmark)

    Nenonen, Suvi; Eriksson, Robert; Niemi, Olli

    2016-01-01

    ... of resources in form of both teachers and university facilities is challenged by development of integration of learning, teaching and the spaces where it takes place. The challenges are shared among users and owners of campus, where retrofitting is needed too. This paper aims to describe the Campus Retrofitting (CARE) methodology for user-centric and co-creative campus retrofitting processes. The campus development research in Nordic countries and co-creation in retrofitting processes are discussed. The campus retrofitting cases in different countries are described by emphasising especially the methods they used. Based on the analysis of the methods, the framework for the Campus Retrofitting (CARE) methodology is presented and discussed. The CARE methodology is a tool to capture new logic for learning environment design. It has three key activities: co-creating, co-financing and co-evaluating. The integrated...

  15. Central automatic control or distributed occupant control for better indoor environment quality in the future

    DEFF Research Database (Denmark)

    Toftum, Jørn

    2008-01-01

    ... in the degree of perceived control. The database was composed of 1353 responses obtained in 25 buildings, of which 15 had mechanical ventilation (997 responses) and 9 had natural ventilation (275 responses). Analysis of occupant responses, after grouping according to categories determined by the degree of satisfaction with the perceived control, showed that the degree of control satisfaction, but rarely building category (natural vs. mechanical ventilation), affected the prevalence of adverse perceptions and symptoms. Thus, the degree of control, as perceived by occupants, was more important for the prevalence of adverse symptoms and building-related symptoms than the ventilation mode per se. This result indicates that even though the development and application of new indoor environment sensors and HVAC control systems may allow for fully automated IEQ control, such systems should not compromise occupants...

  16. New Mexico High School Supercomputing Challenge, 1990--1995: Five years of making a difference to students, teachers, schools, and communities. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Foster, M.; Kratzer, D.

    1996-02-01

    The New Mexico High School Supercomputing Challenge is an academic program dedicated to increasing interest in science and math among high school students by introducing them to high performance computing. This report provides a summary and evaluation of the first five years of the program, describes the program and shows the impact that it has had on high school students, their teachers, and their communities. Goals and objectives are reviewed and evaluated, growth and development of the program are analyzed, and future directions are discussed.

  17. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures

    Science.gov (United States)

    Rock, Adam J.; Coventry, William L.; Morgan, Methuen I.; Loi, Natasha M.

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology. PMID:27014147
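
    A sketch of the kind of hands-on eLearning activity mentioned above, assuming a hypothetical online notebook exercise in which students run a factor analysis on simulated questionnaire items (the data, loadings and class setup are invented for illustration and are not from the article):

        # Simulate five questionnaire items driven by two latent factors, then let
        # students recover the loadings with factor analysis and discuss the output.
        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(seed=0)
        latent = rng.normal(size=(200, 2))                         # two hidden factors
        true_loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.1, 0.7],
                                  [0.0, 0.9], [0.4, 0.4]])         # item-by-factor loadings
        items = latent @ true_loadings.T + 0.3 * rng.normal(size=(200, 5))

        fa = FactorAnalysis(n_components=2, random_state=0).fit(items)
        print(np.round(fa.components_, 2))   # estimated loadings, compared with true_loadings in class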

  18. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures.

    Science.gov (United States)

    Rock, Adam J; Coventry, William L; Morgan, Methuen I; Loi, Natasha M

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology.

  19. Hydropower's future, the environment, and global electricity systems

    Energy Technology Data Exchange (ETDEWEB)

    Sternberg, R. [Department of Earth and Environmental Studies, Montclair State University, 1 Normal Ave, Montclair, NJ 07043-1624 (United States)

    2010-02-15

    Hydropower is a well established electricity system on the global scene. Global electricity needs by far exceed the amount of electricity that hydrosystems can provide to meet global electricity needs. Much of the world's hydropower remains to be brought into production. Improved technology, better calibrated environmental parameters for large projects have become the norm in the past 15 years. How and why does hydropower retain a prominent role in electricity production? How and why does hydropower find social acceptance in diverse social systems? How does hydropower project planning address issues beyond electricity generation? How does the systems approach to hydropower installations further analysis of comparative energy sources powering electricity systems? Attention to the environmental impact of hydropower facilities forms an integral part of systems analysis. Similarly, the technical, political and economic variables call for balanced analysis to identify the viability status of hydro projects. Economic competition among energy systems requires in context assessments as these shape decision making in planning of hydropower systems. Moreover, technological change has to be given a time frame during which the sector advances in productivity and share in expanding electricity generation. The low production costs per kWh assure hydropower at this juncture, 2009, a very viable future. (author)

  20. Transportation Energy Futures Series: Effects of the Built Environment on Transportation: Energy Use, Greenhouse Gas Emissions, and Other Factors

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C. D.; Brown, A.; Dunphy, R. T.; Vimmerstedt, L.

    2013-03-01

    Planning initiatives in many regions and communities aim to reduce transportation energy use, decrease emissions, and achieve related environmental benefits by changing land use. This report reviews and summarizes findings from existing literature on the relationship between the built environment and transportation energy use and greenhouse gas emissions, identifying results trends as well as potential future actions. The indirect influence of federal transportation and housing policies, as well as the direct impact of municipal regulation on land use are examined for their effect on transportation patterns and energy use. Special attention is given to the 'four D' factors of density, diversity, design and accessibility. The report concludes that policy-driven changes to the built environment could reduce transportation energy and GHG emissions from less than 1% to as much as 10% by 2050, the equivalent of 16%-18% of present-day urban light-duty-vehicle travel. This is one of a series of reports produced as a result of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  1. Transportation Energy Futures Series. Effects of the Built Environment on Transportation. Energy Use, Greenhouse Gas Emissions, and Other Factors

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C. D. [National Renewable Energy Lab. (NREL) and Cambridge Systematics, Inc., Golden, CO (United States); Brown, A. [National Renewable Energy Lab. (NREL) and Cambridge Systematics, Inc., Golden, CO (United States); Dunphy, R. T. [National Renewable Energy Lab. (NREL) and Cambridge Systematics, Inc., Golden, CO (United States); Vimmerstedt, L. [National Renewable Energy Lab. (NREL) and Cambridge Systematics, Inc., Golden, CO (United States)

    2013-03-15

    Planning initiatives in many regions and communities aim to reduce transportation energy use, decrease emissions, and achieve related environmental benefits by changing land use. This report reviews and summarizes findings from existing literature on the relationship between the built environment and transportation energy use and greenhouse gas emissions, identifying results trends as well as potential future actions. The indirect influence of federal transportation and housing policies, as well as the direct impact of municipal regulation on land use are examined for their effect on transportation patterns and energy use. Special attention is given to the 'four D' factors of density, diversity, design and accessibility. The report concludes that policy-driven changes to the built environment could reduce transportation energy and GHG emissions from less than 1% to as much as 10% by 2050, the equivalent of 16%-18% of present-day urban light-duty-vehicle travel. This is one of a series of reports produced as a result of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  2. Parallel simulation of tsunami inundation on a large-scale supercomputer

    Science.gov (United States)

    Oishi, Y.; Imamura, F.; Sugawara, D.

    2013-12-01

    An accurate prediction of tsunami inundation is important for disaster mitigation purposes. One approach is to approximate the tsunami wave source through an instant inversion analysis using real-time observation data (e.g., Tsushima et al., 2009) and then use the resulting wave source data in an instant tsunami inundation simulation. However, a bottleneck of this approach is the large computational cost of the non-linear inundation simulation and the computational power of recent massively parallel supercomputers is helpful to enable faster than real-time execution of a tsunami inundation simulation. Parallel computers have become approximately 1000 times faster in 10 years (www.top500.org), and so it is expected that very fast parallel computers will be more and more prevalent in the near future. Therefore, it is important to investigate how to efficiently conduct a tsunami simulation on parallel computers. In this study, we are targeting very fast tsunami inundation simulations on the K computer, currently the fastest Japanese supercomputer, which has a theoretical peak performance of 11.2 PFLOPS. One computing node of the K computer consists of 1 CPU with 8 cores that share memory, and the nodes are connected through a high-performance torus-mesh network. The K computer is designed for distributed-memory parallel computation, so we have developed a parallel tsunami model. Our model is based on TUNAMI-N2 model of Tohoku University, which is based on a leap-frog finite difference method. A grid nesting scheme is employed to apply high-resolution grids only at the coastal regions. To balance the computation load of each CPU in the parallelization, CPUs are first allocated to each nested layer in proportion to the number of grid points of the nested layer. Using CPUs allocated to each layer, 1-D domain decomposition is performed on each layer. In the parallel computation, three types of communication are necessary: (1) communication to adjacent neighbours for the
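
    The load-balancing idea described above (CPUs first allocated to nested grid layers in proportion to their grid points, then a 1-D domain decomposition within each layer) can be sketched as follows; the grid sizes and CPU count are hypothetical, and this is only an illustration of the allocation logic, not the authors' K computer implementation:

        # Allocate CPUs to nested grid layers proportionally to their number of
        # grid points, then split each layer's rows among its CPUs (1-D decomposition).
        def allocate_cpus(points_per_layer, total_cpus):
            total = sum(points_per_layer)
            alloc = [max(1, round(total_cpus * p / total)) for p in points_per_layer]
            # Nudge the largest allocation until the total matches exactly.
            while sum(alloc) > total_cpus:
                alloc[alloc.index(max(alloc))] -= 1
            while sum(alloc) < total_cpus:
                alloc[alloc.index(max(alloc))] += 1
            return alloc

        def decompose_rows(n_rows, n_cpus):
            """Split n_rows as evenly as possible; returns (first_row, row_count) per CPU."""
            base, extra = divmod(n_rows, n_cpus)
            counts = [base + (1 if i < extra else 0) for i in range(n_cpus)]
            starts = [sum(counts[:i]) for i in range(n_cpus)]
            return list(zip(starts, counts))

        layer_points = [1200 * 900, 600 * 450, 300 * 240]   # hypothetical nested-grid sizes
        cpu_share = allocate_cpus(layer_points, 64)
        print(cpu_share)                          # CPUs assigned to each layer
        print(decompose_rows(900, cpu_share[0]))  # row ranges for the first layer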

  3. Present and future thermal environments available to Sharp-tailed Grouse in an intact grassland.

    Directory of Open Access Journals (Sweden)

    Edward J Raynor

    Better understanding animal ecology in terms of thermal habitat use has become a focus of ecological studies, in large part due to the predicted temperature increases associated with global climate change. To further our knowledge on how ground-nesting endotherms respond to thermal landscapes, we examined the thermal ecology of Sharp-tailed Grouse (Tympanuchus phasianellus) during the nesting period. We measured site-specific iButton temperatures (TiB) and vegetation characteristics at nest sites, nearby random sites, and landscape sites to assess thermal patterns at scales relevant to nesting birds. We asked if microhabitat vegetation characteristics at nest sites matched the characteristics that directed macrohabitat nest-site selection. Grouse selected sites sheltered by dense vegetation for nesting that moderated TiB on average up to 2.7°C more than available landscape sites. Successful nests were positioned in a way that reduced exposure to thermal extremes by as much as 4°C relative to failed nests, with an overall mean daytime difference (±SE) of 0.4 ±0.03°C. We found that macrohabitat nest-site selection was guided by dense vegetation cover and minimal bare ground, as also seen at the microhabitat scale. Global climate projections for 2080 suggest that TiB at nest sites may approach temperatures currently avoided on the landscape, emphasizing a need for future conservation plans that acknowledge fine-scale thermal space in climate change scenarios. These data show that features of grassland landscapes can buffer organisms from unfavorable microclimatic conditions and highlight how thermal heterogeneity at the individual level can drive decisions guiding nest site selection.

  4. Present and future thermal environments available to Sharp-tailed Grouse in an intact grassland.

    Science.gov (United States)

    Raynor, Edward J; Powell, Larkin A; Schacht, Walter H

    2018-01-01

    Better understanding animal ecology in terms of thermal habitat use has become a focus of ecological studies, in large part due to the predicted temperature increases associated with global climate change. To further our knowledge on how ground-nesting endotherms respond to thermal landscapes, we examined the thermal ecology of Sharp-tailed Grouse (Tympanuchus phasianellus) during the nesting period. We measured site-specific iButton temperatures (TiB) and vegetation characteristics at nest sites, nearby random sites, and landscape sites to assess thermal patterns at scales relevant to nesting birds. We asked if microhabitat vegetation characteristics at nest sites matched the characteristics that directed macrohabitat nest-site selection. Grouse selected sites sheltered by dense vegetation for nesting that moderated TiB on average up to 2.7°C more than available landscape sites. Successful nests were positioned in a way that reduced exposure to thermal extremes by as much as 4°C relative to failed nests with an overall mean daytime difference (±SE) of 0.4 ±0.03°C. We found that macrohabitat nest-site selection was guided by dense vegetation cover and minimal bare ground as also seen at the microhabitat scale. Global climate projections for 2080 suggest that TiB at nest sites may approach temperatures currently avoided on the landscape, emphasizing a need for future conservation plans that acknowledge fine-scale thermal space in climate change scenarios. These data show that features of grassland landscapes can buffer organisms from unfavorable microclimatic conditions and highlight how thermal heterogeneity at the individual-level can drive decisions guiding nest site selection.

  5. Climate change, renewable energy and population impact on future energy demand for Burkina Faso build environment

    Science.gov (United States)

    Ouedraogo, B. I.

    This research addresses the dual challenge faced by Burkina Faso engineers to design sustainable low-energy-cost public buildings and domestic dwellings while still providing the required thermal comfort under warmer temperature conditions caused by climate change. It was found, based on climate change SRES scenario A2, that the predicted mean temperature in Burkina Faso will increase by 2°C between 2010 and 2050. Therefore, in order to maintain a thermally comfortable 25°C inside public buildings, the projected annual energy consumption for cooling load will increase by 15%, 36% and 100% respectively for the periods 2020-2039, 2040-2059 and 2070-2089 when compared to the control case. It has also been found that a 1% increase in population growth will result in a 1.38% and 2.03% increase in carbon emission from primary energy consumption and future electricity consumption respectively. Furthermore, this research has investigated possible solutions for adaptation to the severe climate change and population growth impact on energy demand in Burkina Faso. Shading devices could potentially reduce the cooling load by up to 40%. Computer simulation programming of building energy consumption and a field study have shown that adobe houses have the potential of significantly reducing energy demand for cooling and offer a formidable method for climate change adaptation. Based on the Net Present Cost, a hybrid photovoltaic (PV) and diesel generator energy production configuration is the most cost-effective local electricity supply system for areas without electricity at present, with a payback time of 8 years when compared to a diesel generator stand-alone configuration. It is therefore a viable solution to increase electricity access to the majority of the population.
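    The arithmetic behind these projections can be illustrated with a minimal sketch (the percentage figures are taken from the abstract above, but the baseline value and the simple multiplicative treatment are assumptions for illustration only; the underlying study uses detailed building-energy simulation):

        # Hypothetical baseline annual cooling energy for a public building (MWh).
        baseline_cooling_mwh = 100.0

        # Cooling-load increases reported for scenario A2, relative to the control case.
        cooling_increase = {"2020-2039": 0.15, "2040-2059": 0.36, "2070-2089": 1.00}
        for period, pct in cooling_increase.items():
            print(f"{period}: {baseline_cooling_mwh * (1 + pct):.0f} MWh/year")

        # Reported elasticities: a 1% rise in population growth raises carbon emissions
        # from primary energy by 1.38% and future electricity consumption by 2.03%.
        pop_growth_increase_pct = 1.0
        print("extra primary-energy emissions: %.2f%%" % (1.38 * pop_growth_increase_pct))
        print("extra electricity consumption: %.2f%%" % (2.03 * pop_growth_increase_pct))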

  6. Holocene Paleoceanographic Environments at the Chukchi-Alaskan Margin: Implications for Future Changes

    Science.gov (United States)

    Polyak, L.; Nam, S. I.; Dipre, G.; Kim, S. Y.; Ortiz, J. D.; Darby, D. A.

    2017-12-01

    The impacts of the North Pacific oceanic and atmospheric system on the Arctic Ocean result in accelerated sea-ice retreat and related changes in hydrography and biota in the western Arctic. Paleoclimatic records from the Pacific sector of the Arctic are key for understanding the long-term history of these interactions. As opposed to stratigraphically long but strongly compressed sediment cores recovered from the deep Arctic Ocean, sediment depocenters on the Chukchi-Alaskan margin yield continuous, medium to high resolution records formed since the last deglaciation. While early Holocene conditions were non-analogous to modern environments due to the effects of prolonged deglaciation and insufficiently high sea levels, mid to late Holocene sediments are more relevant for recent and modern climate variability. Notably, a large depocenter at the Alaskan margin has sedimentation rates estimated as high as a few millimeters per year, thus providing a decadal to near-annual resolution. This high accumulation can be explained by sediment delivery via the Alaskan Coastal Current originating from the Bering Sea and supposedly controlled by the Aleutian Low pressure center. Preliminary results from sediment cores recovering the last several centuries, along with a comparison with other paleoclimatic proxy records from the Arctic-North Pacific region, indicate a persistent role of the Aleutian Low in the Bering Strait inflow and attendant deposition. More proxy studies are underway to reconstruct the history of this circulation system and its relationship with sea ice extent. The expected results will improve our understanding of natural variability in oceanic and atmospheric conditions at the Chukchi-Alaskan margin, a critical area for modulating the Arctic climate change.

  7. Back to the future: virtualization of the computing environment at the W. M. Keck Observatory

    Science.gov (United States)

    McCann, Kevin L.; Birch, Denny A.; Holt, Jennifer M.; Randolph, William B.; Ward, Josephine A.

    2014-07-01

    Over its two decades of science operations, the W.M. Keck Observatory computing environment has evolved to contain a distributed hybrid mix of hundreds of servers, desktops and laptops of multiple different hardware platforms, O/S versions and vintages. Supporting the growing computing capabilities to meet the observatory's diverse, evolving computing demands within fixed budget constraints presents many challenges. This paper describes the significant role that virtualization is playing in addressing these challenges while improving the level and quality of service as well as realizing significant savings across many cost areas. Starting in December 2012, the observatory embarked on an ambitious plan to incrementally test and deploy a migration to virtualized platforms to address a broad range of specific opportunities. Implementation to date has been surprisingly glitch free, progressing well and yielding tangible benefits much faster than many expected. We describe here the general approach, starting with the initial identification of some low-hanging fruit which also provided an opportunity to gain experience and build confidence among both the implementation team and the user community. We describe the range of challenges, opportunities and cost savings potential. Very significant among these was the substantial power savings, which resulted in strong broad support for moving forward. We go on to describe the phasing plan, the evolving scalable architecture, some of the specific technical choices, as well as some of the individual technical issues encountered along the way. The phased implementation spans Windows and Unix servers for scientific, engineering and business operations, virtualized desktops for typical office users as well as the more demanding graphics-intensive CAD users. Other areas discussed in this paper include staff training, load balancing, redundancy, scalability, remote access, disaster readiness and recovery.

  8. Integration Of PanDA Workload Management System With Supercomputers for ATLAS and Data Intensive Science

    Energy Technology Data Exchange (ETDEWEB)

    De, K [University of Texas at Arlington; Jha, S [Rutgers University; Klimentov, A [Brookhaven National Laboratory (BNL); Maeno, T [Brookhaven National Laboratory (BNL); Nilsson, P [Brookhaven National Laboratory (BNL); Oleynik, D [University of Texas at Arlington; Panitkin, S [Brookhaven National Laboratory (BNL); Wells, Jack C [ORNL; Wenaus, T [Brookhaven National Laboratory (BNL)

    2016-01-01

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited with the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with supercomputers in the United States, Europe and Russia (in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), the MIRA supercomputer at the Argonne Leadership Computing Facility (ALCF), the supercomputer at the National Research Center Kurchatov Institute, IT4 in Ostrava and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes. This implementation
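    The light-weight MPI wrapper idea described above can be sketched in a few lines (a minimal illustration, not the actual PanDA pilot code; the payload command and task indexing are hypothetical): each MPI rank launches one single-threaded payload on its own core, so the batch system sees a single parallel job.

        # Minimal sketch of a light-weight MPI wrapper for single-threaded payloads.
        import subprocess
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()

        # Hypothetical payload: one independent single-threaded task per rank.
        cmd = ["./run_payload.sh", f"--task-index={rank}"]
        result = subprocess.run(cmd)

        # Collect return codes on rank 0 so the wrapper can report overall status.
        codes = comm.gather(result.returncode, root=0)
        if rank == 0:
            print("payload return codes:", codes)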

  9. Guide to dataflow supercomputing basic concepts, case studies, and a detailed example

    CERN Document Server

    Milutinovic, Veljko; Trifunovic, Nemanja; Giorgi, Roberto

    2015-01-01

    This unique text/reference describes an exciting and novel approach to supercomputing in the DataFlow paradigm. The major advantages and applications of this approach are clearly described, and a detailed explanation of the programming model is provided using simple yet effective examples. The work is developed from a series of lecture courses taught by the authors in more than 40 universities across more than 20 countries, and from research carried out by Maxeler Technologies, Inc. Topics and features: presents a thorough introduction to DataFlow supercomputing for big data problems; revie

  10. Radioactivity in the aquatic environment. A review of UK research 1994-1997 and recommendations for future work

    International Nuclear Information System (INIS)

    1998-07-01

    The national Radioactivity Research and Environmental Monitoring Committee (RADREM) provides a forum for liaison on UK research and monitoring in the radioactive substances and radioactive waste management fields. The committee aims to ensure that there is no unnecessary overlap between, or significant omission from, the research programmes of the various parts of Government, the regulatory bodies or industry. This report has been produced by the Aquatic Environment Sub-Committee (AESC) of RADREM. AESC is responsible for providing RADREM with scientific advice in the field of research relating to radionuclides in the aquatic environment, for reporting on the progress of research in this field and on future research requirements. The objectives of this report are presented in Section 2, and the membership of AESC given in Section 3. This report describes a review of research undertaken in the field of radioactivity in aquatic systems over the last three years (Section 4). The review updates previous reviews, the most recent of which was in 1993 (AESC, 1994). Future research requirements have been identified by AESC, considering past work and work in progress, and are presented in Section 5. Specific research requirements are discussed in Section 5, whilst Section 6 summarises the main areas where future research is identified as a priority. These areas are as follows: the movement and uptake of 99Tc and 14C in aquatic systems and biota; geochemical processes; off-shore sediments; non-equilibrium systems; radiation exposure during civil engineering works; further work on movement of radionuclides in salt marshes; development and validation of models. The specific objectives of this report are as follows: 1. To provide a summary of research undertaken in this field over the last three years. 2. To identify future research requirements. 3. To attach priorities to the future research requirements. It should be noted that the purpose of the report is to identify

  11. Nature, nurture, and capital punishment: How evidence of a genetic-environment interaction, future dangerousness, and deliberation affect sentencing decisions.

    Science.gov (United States)

    Gordon, Natalie; Greene, Edie

    2018-01-01

    Research has shown that the low-activity MAOA genotype in conjunction with a history of childhood maltreatment increases the likelihood of violent behaviors. This genetic-environment (G × E) interaction has been introduced as mitigation during the sentencing phase of capital trials, yet there is scant data on its effectiveness. This study addressed that issue. In a factorial design that varied mitigating evidence offered by the defense [environmental (i.e., childhood maltreatment), genetic, G × E, or none] and the likelihood of the defendant's future dangerousness (low or high), 600 mock jurors read sentencing phase evidence in a capital murder trial, rendered individual verdicts, and half deliberated as members of a jury to decide a sentence of death or life imprisonment. The G × E evidence had little mitigating effect on sentencing preferences: participants who received the G × E evidence were no less likely to sentence the defendant to death than those who received evidence of childhood maltreatment or a control group that received neither genetic nor maltreatment evidence. Participants with evidence of a G × E interaction were more likely to sentence the defendant to death when there was a high risk of future dangerousness than when there was a low risk. Sentencing preferences were more lenient after deliberation than before. We discuss limitations and future directions. Copyright © 2017 John Wiley & Sons, Ltd.

  12. Effects of the Extraterrestrial Environment on Plants: Recommendations for Future Space Experiments for the MELiSSA Higher Plant Compartment

    Directory of Open Access Journals (Sweden)

    Silje A. Wolff

    2014-05-01

    Full Text Available Due to logistical challenges, long-term human space exploration missions require a life support system capable of regenerating all the essentials for survival. Higher plants can be utilized to provide a continuous supply of fresh food, atmosphere revitalization, and clean water for humans. Plants can adapt to extreme environments on Earth, and model plants have been shown to grow and develop through a full life cycle in microgravity. However, more knowledge about the long term effects of the extraterrestrial environment on plant growth and development is necessary. The European Space Agency (ESA) has developed the Micro-Ecological Life Support System Alternative (MELiSSA) program to develop a closed regenerative life support system, based on micro-organisms and higher plant processes, with continuous recycling of resources. In this context, a literature review to analyze the impact of the space environments on higher plants, with focus on gravity levels, magnetic fields and radiation, has been performed. This communication presents a roadmap giving directions for future scientific activities within space plant cultivation. The roadmap aims to identify the research activities required before higher plants can be included in regenerative life support systems in space.

  13. Decadal analysis of impact of future climate on wheat production in dry Mediterranean environment: A case of Jordan.

    Science.gov (United States)

    Dixit, Prakash N; Telleria, Roberto; Al Khatib, Amal N; Allouzi, Siham F

    2018-01-01

    Different aspects of climate change, such as increased temperature, changed rainfall and higher atmospheric CO₂ concentration, all have different effects on crop yields. Process-based crop models are the most widely used tools for estimating future crop yield responses to climate change. We applied APSIM crop simulation model in a dry Mediterranean climate with Jordan as sentinel site to assess impact of climate change on wheat production at decadal level considering two climate change scenarios of representative concentration pathways (RCP) viz., RCP4.5 and RCP8.5. Impact of climatic variables alone was negative on grain yield but this adverse effect was negated when elevated atmospheric CO₂ concentrations were also considered in the simulations. Crop cycle of wheat was reduced by a fortnight for RCP4.5 scenario and by a month for RCP8.5 scenario at the approach of end of the century. On an average, a grain yield increase of 5 to 11% in near future i.e., 2010s-2030s decades, 12 to 16% in mid future i.e., 2040s-2060s decades and 9 to 16% in end of century period can be expected for moderate climate change scenario (RCP4.5) and 6 to 15% in near future, 13 to 19% in mid future and 7 to 20% increase in end of century period for a drastic climate change scenario (RCP8.5) based on different soils. Positive impact of elevated CO₂ is more pronounced in soils with lower water holding capacity with moderate increase in temperatures. Elevated CO₂ had greater positive effect on transpiration use efficiency (TUE) than negative effect of elevated mean temperatures. The change in TUE was in near perfect direct relationship with elevated CO₂ levels (R² > 0.99) and every 100-ppm atmospheric CO₂ increase resulted in TUE increase by 2 kg ha⁻¹ mm⁻¹. Thereby, in this environment yield gains are expected in future and farmers can benefit from growing wheat. Copyright © 2017 Elsevier B.V. All rights reserved.
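    The reported near-linear relation between CO₂ and transpiration use efficiency is simple enough to state as a one-line function (an illustrative sketch of the quoted slope only, not the APSIM model itself):

        def tue_gain(delta_co2_ppm, slope_per_100ppm=2.0):
            """Extra transpiration use efficiency (kg/ha/mm) for a given CO2 increase (ppm)."""
            return slope_per_100ppm * delta_co2_ppm / 100.0

        # Example: an increase of 150 ppm would add about 3 kg/ha/mm to TUE.
        print(tue_gain(150))  # 3.0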

  14. Highly parallel machines and future of scientific computing

    International Nuclear Information System (INIS)

    Singh, G.S.

    1992-01-01

    The computing requirements of large-scale scientific computing have always been ahead of what the state-of-the-art hardware of the day could supply in the form of supercomputers. For any single-processor system, the limit to increases in computing power was recognized a few years ago. Now, with the advent of parallel computing systems, the availability of machines with the required computing power seems a reality. In this paper the author tries to visualize the future of large-scale scientific computing in the penultimate decade of the present century. The author summarizes trends in parallel computers and emphasizes the need for a better programming environment and software tools for optimal performance. The author concludes the paper with a critique of parallel architectures, software tools and algorithms. (author). 10 refs., 2 tabs

  15. Preliminary design of CERN Future Circular Collider tunnel: first evaluation of the radiation environment in critical areas for electronics

    Science.gov (United States)

    Infantino, Angelo; Alía, Rubén García; Besana, Maria Ilaria; Brugger, Markus; Cerutti, Francesco

    2017-09-01

    As part of its post-LHC high energy physics program, CERN is conducting a study for a new proton-proton collider, called Future Circular Collider (FCC-hh), running at center-of-mass energies of up to 100 TeV in a new 100 km tunnel. The study includes a 90-350 GeV lepton collider (FCC-ee) as well as a lepton-hadron option (FCC-he). In this work, FLUKA Monte Carlo simulation was extensively used to perform a first evaluation of the radiation environment in critical areas for electronics in the FCC-hh tunnel. The model of the tunnel was created based on the original civil engineering studies already performed and further integrated in the existing FLUKA models of the beam line. The radiation levels in critical areas, such as the racks for electronics and cables, power converters, service areas, and local tunnel extensions, were evaluated.

  16. Preliminary design of CERN Future Circular Collider tunnel: first evaluation of the radiation environment in critical areas for electronics

    Directory of Open Access Journals (Sweden)

    Infantino Angelo

    2017-01-01

    Full Text Available As part of its post-LHC high energy physics program, CERN is conducting a study for a new proton-proton collider, called Future Circular Collider (FCC-hh), running at center-of-mass energies of up to 100 TeV in a new 100 km tunnel. The study includes a 90-350 GeV lepton collider (FCC-ee) as well as a lepton-hadron option (FCC-he). In this work, FLUKA Monte Carlo simulation was extensively used to perform a first evaluation of the radiation environment in critical areas for electronics in the FCC-hh tunnel. The model of the tunnel was created based on the original civil engineering studies already performed and further integrated in the existing FLUKA models of the beam line. The radiation levels in critical areas, such as the racks for electronics and cables, power converters, service areas, and local tunnel extensions, were evaluated.

  17. Interactive real-time nuclear plant simulations on a UNIX based supercomputer

    International Nuclear Information System (INIS)

    Behling, S.R.

    1990-01-01

    Interactive real-time nuclear plant simulations are critically important to train nuclear power plant engineers and operators. In addition, real-time simulations can be used to test the validity and timing of plant technical specifications and operational procedures. To accurately and confidently simulate a nuclear power plant transient in real-time, sufficient computer resources must be available. Since some important transients cannot be simulated using preprogrammed responses or non-physical models, commonly used simulation techniques may not be adequate. However, the power of a supercomputer allows one to accurately calculate the behavior of nuclear power plants even during very complex transients. Many of these transients can be calculated in real-time or quicker on the fastest supercomputers. The concept of running interactive real-time nuclear power plant transients on a supercomputer has been tested. This paper describes the architecture of the simulation program, the techniques used to establish real-time synchronization, and other issues related to the use of supercomputers in a new and potentially very important area. (author)
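    A common building block for such real-time synchronization is a pacing loop that sleeps away whatever part of each time step the computation did not use; the sketch below is a generic illustration (the advance_plant_model step is a placeholder, not the simulator described in the abstract):

        import time

        def advance_plant_model(dt):
            # Placeholder for one physics time step of the plant model.
            pass

        def run_real_time(dt=0.1, steps=100):
            """Advance the model in dt-second steps, paced to wall-clock time."""
            deadline = time.monotonic()
            for _ in range(steps):
                advance_plant_model(dt)
                deadline += dt
                slack = deadline - time.monotonic()
                if slack > 0:
                    time.sleep(slack)      # finished early: real-time requirement met
                else:
                    print(f"fell behind real time by {-slack:.3f} s")

        run_real_time()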

  18. Performance modeling of hybrid MPI/OpenMP scientific applications on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu; Taylor, Valerie

    2013-01-01

    In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore supercomputers: IBM POWER4, POWER5+ and BlueGene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks and Intel's MPI benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore supercomputers because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application: Gyrokinetic Toroidal Code (GTC) in magnetic fusion to validate our performance model of the hybrid application on these multicore supercomputers. The validation results for our performance modeling method show less than 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore supercomputers. © 2013 Elsevier Inc.
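    In outline, such a model predicts the per-step time as the larger of a memory-bandwidth-limited term (with the node bandwidth shared among the cores) and a compute term, plus a communication term; the sketch below is a heavily simplified illustration with invented parameter values, not the parameterization used in the paper:

        def predicted_step_time(bytes_per_core, flops_per_core, cores_per_node,
                                node_bw_gbs, core_gflops, msg_bytes, latency_s, net_bw_gbs):
            """Toy hybrid MPI/OpenMP step-time model (rates in G-units, sizes in bytes)."""
            t_mem = bytes_per_core / (node_bw_gbs * 1e9 / cores_per_node)  # bandwidth contention
            t_cpu = flops_per_core / (core_gflops * 1e9)
            t_comm = latency_s + msg_bytes / (net_bw_gbs * 1e9)
            return max(t_mem, t_cpu) + t_comm

        # Example: doubling the cores sharing a node doubles the memory-contention term.
        print(predicted_step_time(2e8, 1e8, 4, 10.0, 2.0, 1e6, 5e-6, 1.0))
        print(predicted_step_time(2e8, 1e8, 8, 10.0, 2.0, 1e6, 5e-6, 1.0))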

  19. Argonne National Lab deploys Force10 networks' massively dense ethernet switch for supercomputing cluster

    CERN Multimedia

    2003-01-01

    "Force10 Networks, Inc. today announced that Argonne National Laboratory (Argonne, IL) has successfully deployed Force10 E-Series switch/routers to connect to the TeraGrid, the world's largest supercomputing grid, sponsored by the National Science Foundation (NSF)" (1/2 page).

  20. Design and performance characterization of electronic structure calculations on massively parallel supercomputers

    DEFF Research Database (Denmark)

    Romero, N. A.; Glinsvad, Christian; Larsen, Ask Hjorth

    2013-01-01

    Density function theory (DFT) is the most widely employed electronic structure method because of its favorable scaling with system size and accuracy for a broad range of molecular and condensed-phase systems. The advent of massively parallel supercomputers has enhanced the scientific community...

  1. Performance modeling of hybrid MPI/OpenMP scientific applications on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu

    2013-12-01

    In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore supercomputers: IBM POWER4, POWER5+ and BlueGene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks and Intel's MPI benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore supercomputers because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application: Gyrokinetic Toroidal Code (GTC) in magnetic fusion to validate our performance model of the hybrid application on these multicore supercomputers. The validation results for our performance modeling method show less than 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore supercomputers. © 2013 Elsevier Inc.

  2. An efficient implementation of a backpropagation learning algorithm on quadrics parallel supercomputer

    International Nuclear Information System (INIS)

    Taraglio, S.; Massaioli, F.

    1995-08-01

    A parallel implementation of a library to build and train Multi Layer Perceptrons via the Back Propagation algorithm is presented. The target machine is the SIMD massively parallel supercomputer Quadrics. Performance measures are provided on three different machines with different numbers of processors, for two network examples. A sample source code is given
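    For reference, the serial arithmetic that such a library parallelizes is just the standard backpropagation update; a minimal NumPy sketch of one gradient step for a single-hidden-layer perceptron (illustrative only, unrelated to the Quadrics implementation details):

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.standard_normal((8, 4))          # 8 samples, 4 inputs
        y = rng.standard_normal((8, 1))          # regression targets
        W1, b1 = rng.standard_normal((4, 5)), np.zeros(5)
        W2, b2 = rng.standard_normal((5, 1)), np.zeros(1)
        lr = 0.01

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        # Forward pass
        h = sigmoid(X @ W1 + b1)
        out = h @ W2 + b2
        loss = np.mean((out - y) ** 2)

        # Backward pass (gradients of the mean squared error)
        d_out = 2 * (out - y) / len(X)
        dW2, db2 = h.T @ d_out, d_out.sum(axis=0)
        d_h = (d_out @ W2.T) * h * (1 - h)
        dW1, db1 = X.T @ d_h, d_h.sum(axis=0)

        # Gradient-descent update
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2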

  3. Radioactivity in the terrestrial environment; review of UK research 1993-1996 and recommendations for future work

    International Nuclear Information System (INIS)

    1997-03-01

    The national Radioactivity Research and Environmental Monitoring Committee (RADREM) provides a forum for liaison on UK research and monitoring in the radioactive substances and radioactive waste management fields. It is subscribed to by Government departments, national regulatory bodies, the UK nuclear industry and other bodies with relevant research sponsorship and monitoring interests. A key function of the RADREM committee is to ensure that there is no unnecessary overlap between or significant omission from the research sponsored by the organisations represented upon it. To this end, periodic reviews of research sector programmes are carried out. This report covers a review which was carried out by the Terrestrial Environment Sub-Committee (TESC) of RADREM for the period 1993-1996. In particular, possible future research requirements are considered and evaluated. Such omissions as are identified do not reflect Sub-Committee views on the adequacy of any individual organisation's research programme. Rather they should be seen as areas where gaps in knowledge may exist, which all organisations are free to consider and prioritise in the formulation of their future research requirements. (author)

  4. Future integrated design environments

    DEFF Research Database (Denmark)

    Christiansson, Per; Svidt, Kjeld; Sørensen, Kristian Birch

    2009-01-01

    and modeling of explicit and implicit end-user needs and requirements on both the building to be designed and the supporting design tools. The paper provides grounds to higher success rate in capture of explicit and implicit end user needs and requirements on functional performance in use and re...

  5. Technology - environment - future

    International Nuclear Information System (INIS)

    1980-01-01

    This volume contains the materials of the meeting 'Scientific-technical progress and sociological alternatives' organized in March 1980 by the Institute for Marxist Studies and Research (IMSF). The goal of the meeting was to give a view of the present level of knowledge and discussion among the Federal Republic's Marxists on the direction and the social and ecological consequences of the development of science and technology under the conditions of capitalism. Special attention was paid to the debate with bourgeois views of the relation between technology and society, as well as to the discussion of alternative social concepts. (HSCH) [de

  6. Measured and Modeled Downwelling Far-Infrared Radiances in Very Dry Environments and Calibration Requirements for Future Experiments

    Science.gov (United States)

    Mast, J. C.; Mlynczak, M. G.; Cageao, R.; Kratz, D. P.; Latvakoski, H.; Johnson, D. G.; Mlawer, E. J.; Turner, D. D.

    2016-12-01

    Downwelling radiances measured by the Far-Infrared Spectroscopy of the Troposphere (FIRST) instrument in an environment with integrated precipitable water as low as 0.03 cm are compared with calculated spectra in the far-infrared and mid-infrared. In its current ground-based configuration FIRST was deployed to 5.38 km on Cerro Toco, a mountain in the Atacama Desert of Chile, from August to October 2009. There FIRST took part in the Radiative Heating in Unexplored Bands Campaign Part 2. Water vapor and temperature profiles from an optimal-estimation-based physical retrieval algorithm (using simultaneous radiosonde and multichannel 183 GHz microwave radiometer measurements) are input to the AER Line-by-Line Radiative Transfer Model (LBLRTM) to compute radiances for comparison with FIRST. The AER v3.4 line parameter database is used. The low water vapor amounts and relatively cold atmosphere result in extremely small far-IR radiances (1.5 mW/m²/sr/cm⁻¹) with corresponding brightness temperatures of 120 K. The residual LBLRTM minus FIRST is calculated to assess agreement between the measured and modeled spectra. Uncertainties in both the measured and modeled radiances are accounted for in the comparison. A goal of the deployment and subsequent analysis is the assessment of water vapor spectroscopy in the far-infrared and mid-infrared. While agreement is found between measured and modeled radiances within the combined uncertainties across all spectra, uncertainties in the measured water vapor profiles and from the laboratory calibration exceed those associated with water vapor spectroscopy in this very low radiance environment. Consequently, no improvement in water vapor spectroscopy is afforded by these measurements. However, we use these results to place requirements on instrument calibration accuracy and water vapor profile accuracy for future campaigns to similarly dry environments. Instrument calibration uncertainty needs to be at 2% (1-sigma) of measured radiance
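    The ~120 K brightness temperatures quoted above follow from inverting the Planck function at a given wavenumber; a minimal sketch with radiance in mW/(m² sr cm⁻¹) (rounded radiation constants; not the FIRST calibration code, and the 650 cm⁻¹ example wavenumber is only an assumption):

        import math

        C1 = 1.191042e-8   # 2*h*c^2 in W m^-2 sr^-1 (cm^-1)^-4
        C2 = 1.438777      # h*c/k in cm K

        def brightness_temperature(radiance_mw, wavenumber_cm):
            """Invert the Planck function; radiance in mW/(m^2 sr cm^-1), result in K."""
            radiance_w = radiance_mw * 1e-3
            return C2 * wavenumber_cm / math.log(1.0 + C1 * wavenumber_cm**3 / radiance_w)

        # Example: 1.5 mW/(m^2 sr cm^-1) near 650 cm^-1 corresponds to roughly 120 K.
        print(round(brightness_temperature(1.5, 650.0), 1))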

  7. Technologies for the people: a future in the making

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, D.C.

    2004-09-01

    India's post-independence policy of using science and technology for national development, and investment in research and development infrastructure resulted in success in space, atomic energy, missile development and supercomputing. Use of space technology has impacted directly or indirectly the vast majority of India's billion plus population. Developments in a number of emerging technologies in recent years hold the promise of impacting the future of ordinary Indians in significant ways, if a proper policy and enabling environment are provided. New telecom technologies - a digital rural exchange and a wireless access system - are beginning to touch the lives of common people. Development of a low-cost handheld computing device, use of hybrid telemedicine systems to extend modern healthcare to the unreached, and other innovative uses of IT at the grassroots also hold promise for the future. Biotechnology too has the potential to deliver cost-effective vaccines and drugs, but the future of GM crops is uncertain due to growing opposition. Some of these emerging technologies hold promise for the future, provided a positive policy and enabling environment is in place. (author)

  8. Serving two purposes: Plans for a MOOC and a World Campus course called Energy, the Environment, and Our Future (Invited)

    Science.gov (United States)

    Bralower, T. J.; Alley, R. B.; Blumsack, S.; Keller, K.; Feineman, M. D.

    2013-12-01

    We are in the final stages of developing a Massive Open Online Course entitled Energy, the Environment, and Our Future. The course is a broad overview of the implications of the current energy options on Earth's climate and the choices for more sustainable energy sources in the future. The course is founded in concepts explored in the book and PBS series Earth: The Operators' Manual, but it includes more in-depth treatment of renewable energy as well as the ethical issues surrounding energy choices. One of the key aspects of the course is that it is being designed to be taught in two formats, the first, an eight week MOOC through Coursera in Fall semester 2013, and the second, a 16 week online course developed as part of the NSF Geo-STEP InTeGrate program and offered through the Penn State World Campus. The advantage of the MOOC format is the ability to reach out to thousands of students worldwide, exposing them to the science behind important issues that may have a direct impact on the lifestyle decisions they make, while the World Campus course allows us to explore deeper levels of cognition through application of carefully designed pedagogies. The principal difference between the two versions of the course will be assessment. The MOOC will have embedded assessment between pages and end of module quizzes. The InTeGrate course will have a range of assessments that are directly linked to the goals and objectives of the course. These will include active learning exercises built around energy and climate data. Both of the versions are works in progress and we anticipate modifying them regularly based on student feedback.

  9. Integration Of PanDA Workload Management System With Supercomputers for ATLAS and Data Intensive Science

    Science.gov (United States)

    Klimentov, A.; De, K.; Jha, S.; Maeno, T.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Wells, J.; Wenaus, T.

    2016-10-01

    The LHC, operating at CERN, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than the Grid can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with supercomputers in the United States, in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility. The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads on several supercomputing platforms for the ALICE and ATLAS experiments and has been in full production for ATLAS since September 2015. We will present our current accomplishments with running PanDA at supercomputers and demonstrate our ability to use PanDA as a portal independent of the

  10. Integration Of PanDA Workload Management System With Supercomputers for ATLAS and Data Intensive Science

    International Nuclear Information System (INIS)

    Klimentov, A; Maeno, T; Nilsson, P; Panitkin, S; Wenaus, T; De, K; Oleynik, D; Jha, S; Wells, J

    2016-01-01

    The LHC, operating at CERN, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than the Grid can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with supercomputers in the United States, in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility. The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads on several supercomputing platforms for the ALICE and ATLAS experiments and has been in full production for ATLAS since September 2015. We will present our current accomplishments with running PanDA at supercomputers and demonstrate our ability to use PanDA as a portal independent of the

  11. Parallel Earthquake Simulations on Large-Scale Multicore Supercomputers

    KAUST Repository

    Wu, Xingfu

    2011-01-01

    Earthquakes are one of the most destructive natural hazards on our planet Earth. Huge earthquakes striking offshore may cause devastating tsunamis, as evidenced by the 11 March 2011 Japan (moment magnitude Mw9.0) and the 26 December 2004 Sumatra (Mw9.1) earthquakes. Earthquake prediction (in terms of the precise time, place, and magnitude of a coming earthquake) is arguably unfeasible in the foreseeable future. To mitigate seismic hazards from future earthquakes in earthquake-prone areas, such as California and Japan, scientists have been using numerical simulations to study earthquake rupture propagation along faults and seismic wave propagation in the surrounding media on ever-advancing modern computers over the past several decades. In particular, ground motion simulations for past and future (possible) significant earthquakes have been performed to understand factors that affect ground shaking in populated areas, and to provide ground shaking characteristics and synthetic seismograms for emergency preparation and design of earthquake-resistant structures. These simulation results can guide the development of more rational seismic provisions, leading to safer, more efficient, and economical structures in earthquake-prone regions.

  12. Human–environment interactions in urban green spaces — A systematic review of contemporary issues and prospects for future research

    International Nuclear Information System (INIS)

    Kabisch, Nadja; Qureshi, Salman; Haase, Dagmar

    2015-01-01

    Scientific papers on landscape planning underline the importance of maintaining and developing green spaces because of their multiple environmental and social benefits for city residents. However, a general understanding of contemporary human–environment interaction issues in urban green space is still incomplete and lacks orientation for urban planners. This review examines 219 publications to (1) provide an overview of the current state of research on the relationship between humans and urban green space, (2) group the different research approaches by identifying the main research areas, methods, and target groups, and (3) highlight important future prospects in urban green space research. - Highlights: • Reviewed literature on urban green pins down a dearth of comparative studies. • Case studies in Africa and Russia are marginalized – the Europe and US dominate. • Questionnaires are used as major tool followed by GIS and quantitative approaches. • Developing countries should contribute in building an urban green space agenda. • Interdisciplinary, adaptable and pluralistic approaches can satiate a knowledge gap

  13. Human–environment interactions in urban green spaces — A systematic review of contemporary issues and prospects for future research

    Energy Technology Data Exchange (ETDEWEB)

    Kabisch, Nadja, E-mail: nadja.kabisch@geo.hu-berlin.de [Institute of Geography, Humboldt-University Berlin, Unter den Linden 6, 10099 Berlin (Germany); Department of Urban and Environmental Sociology, Helmholtz Centre for Environmental Research — UFZ, 04318 Leipzig (Germany); Qureshi, Salman [Institute of Geography, Humboldt-University Berlin, Unter den Linden 6, 10099 Berlin (Germany); School of Architecture, Birmingham Institute of Art and Design, Birmingham City University, The Parkside Building, 5 Cardigan Street, Birmingham B4 7BD (United Kingdom); Haase, Dagmar [Institute of Geography, Humboldt-University Berlin, Unter den Linden 6, 10099 Berlin (Germany); Department of Computational Landscape Ecology, Helmholtz Centre for Environmental Research — UFZ, 04318 Leipzig (Germany)

    2015-01-15

    Scientific papers on landscape planning underline the importance of maintaining and developing green spaces because of their multiple environmental and social benefits for city residents. However, a general understanding of contemporary human–environment interaction issues in urban green space is still incomplete and lacks orientation for urban planners. This review examines 219 publications to (1) provide an overview of the current state of research on the relationship between humans and urban green space, (2) group the different research approaches by identifying the main research areas, methods, and target groups, and (3) highlight important future prospects in urban green space research. - Highlights: • Reviewed literature on urban green pins down a dearth of comparative studies. • Case studies in Africa and Russia are marginalized – the Europe and US dominate. • Questionnaires are used as major tool followed by GIS and quantitative approaches. • Developing countries should contribute in building an urban green space agenda. • Interdisciplinary, adaptable and pluralistic approaches can satiate a knowledge gap.

  14. The future of water quality and the regulatory environment for the oil sands and coalbed methane development

    International Nuclear Information System (INIS)

    Kasperski, K.; Mikula, R.

    2004-01-01

    The use of consolidated tailings in recent years for the surface mined oil sands bitumen extraction process has resulted in major improvements in water consumption because materials are transported more efficiently in a slurry form. Water storage requirements will be reduced as the cost of handling tailings in the conventional manner becomes clearer. Future improvements may be in the form of mine face sand rejection, more advanced tailings treatment, or the use of clays for continuous reclamation. Sand filtering or stacking technologies can improve tailings properties and reduce the amount of water needed per unit of bitumen. It was noted that although the technologies will minimize land disturbance and fresh water consumption, water chemistries will be driven to the point where extraction recovery is impaired and water treatment will be required. The volumes and quality of water that are pumped out to produce coalbed methane (CBM) were also discussed with reference to the origin of water in coal beds, water resource depletion, water disposal, direct land applications, and surface evaporation. The Alberta Energy and Utilities Board and Alberta Environment are responsible for regulating CBM water issues in the province, including water disposal from CBM production. 41 refs., 6 tabs., 8 figs

  15. Direct exploitation of a top 500 Supercomputer for Analysis of CMS Data

    International Nuclear Information System (INIS)

    Cabrillo, I; Cabellos, L; Marco, J; Fernandez, J; Gonzalez, I

    2014-01-01

    The Altamira Supercomputer hosted at the Instituto de Fisica de Cantabria (IFCA) entered operation in summer 2012. Its last-generation FDR InfiniBand network, used for message passing in parallel jobs, supports the connection to General Parallel File System (GPFS) servers, enabling efficient simultaneous processing of multiple data-demanding jobs. Sharing a common GPFS system and a single LDAP-based identification with the existing Grid clusters at IFCA allows CMS researchers to exploit the large instantaneous capacity of this supercomputer to execute analysis jobs. The detailed experience of this opportunistic use for skimming and final analysis of CMS 2012 data for a specific physics channel, resulting in an order of magnitude reduction of the waiting time, is presented.

  16. ParaBTM: A Parallel Processing Framework for Biomedical Text Mining on Supercomputers.

    Science.gov (United States)

    Xing, Yuting; Wu, Chengkun; Yang, Xi; Wang, Wei; Zhu, En; Yin, Jianping

    2018-04-27

    A prevailing way of extracting valuable information from biomedical literature is to apply text mining methods on unstructured texts. However, the massive amount of literature that needs to be analyzed poses a big data challenge to the processing efficiency of text mining. In this paper, we address this challenge by introducing parallel processing on a supercomputer. We developed paraBTM, a runnable framework that enables parallel text mining on the Tianhe-2 supercomputer. It employs a low-cost yet effective load balancing strategy to maximize the efficiency of parallel processing. We evaluated the performance of paraBTM on several datasets, utilizing three types of named entity recognition tasks as demonstration. Results show that, in most cases, the processing efficiency can be greatly improved with parallel processing, and the proposed load balancing strategy is simple and effective. In addition, our framework can be readily applied to other tasks of biomedical text mining besides NER.
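    The load-balancing idea can be illustrated with a generic greedy scheme that always hands the next-largest document to the least-loaded worker (a sketch under the assumption that per-document cost can be estimated up front, e.g. from text length; paraBTM's actual strategy may differ):

        import heapq

        def balance(doc_costs, n_workers):
            """Greedy longest-processing-time assignment of documents to workers."""
            heap = [(0.0, w) for w in range(n_workers)]       # (current load, worker id)
            heapq.heapify(heap)
            assignment = {w: [] for w in range(n_workers)}
            for doc, cost in sorted(doc_costs.items(), key=lambda kv: -kv[1]):
                load, w = heapq.heappop(heap)
                assignment[w].append(doc)
                heapq.heappush(heap, (load + cost, w))
            return assignment

        # Example: estimated costs could be the text lengths of the documents to mine.
        print(balance({"d1": 9.0, "d2": 4.0, "d3": 3.0, "d4": 8.0}, 2))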

  17. Explaining the gap between theoretical peak performance and real performance for supercomputer architectures

    International Nuclear Information System (INIS)

    Schoenauer, W.; Haefner, H.

    1993-01-01

    The basic architectures of vector and parallel computers and their properties are presented. Then the memory size and the arithmetic operations in the context of memory bandwidth are discussed. For an exemplary discussion of a single operation, micro-measurements of the vector triad for the IBM 3090 VF and the CRAY Y-MP/8 are presented. They reveal the details of the losses for a single operation. We then analyze the global performance of a whole supercomputer by identifying reduction factors that bring down the theoretical peak performance to the poor real performance. The responsibilities of the manufacturer and of the user for these losses are discussed. Then the price-performance ratio for different architectures in a snapshot of January 1991 is briefly mentioned. Finally, some remarks on a user-friendly architecture for a supercomputer are made. (orig.)
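    The reduction-factor view amounts to multiplying the theoretical peak by a chain of factors, each at most one; a small illustrative calculation (the factor values below are invented for illustration, not measurements from the paper):

        peak_gflops = 2.7                 # hypothetical theoretical peak of one vector CPU
        reduction_factors = {
            "memory bandwidth": 0.50,
            "vector length / startup": 0.70,
            "non-vectorizable code": 0.60,
            "load imbalance": 0.90,
        }
        real = peak_gflops
        for name, factor in reduction_factors.items():
            real *= factor
        print(f"estimated real performance: {real:.2f} GFLOPS of {peak_gflops} GFLOPS peak")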

  18. Information Environment is an Integral Element of Informational Space in the Process of Professional Development of Future Teacher of Physical Culture

    Directory of Open Access Journals (Sweden)

    Yuri V. Dragnev

    2012-04-01

    Full Text Available The article examines the information environment as an integral element of information space in the process of professional development of the future teacher of physical culture. It notes that the strategic objective of the system of higher education is the training of a competent future teacher of physical culture in the field of information technologies, since information competence and information culture are major components of professionalism in the modern information-oriented society.

  19. Enabling Diverse Software Stacks on Supercomputers using High Performance Virtual Clusters.

    Energy Technology Data Exchange (ETDEWEB)

    Younge, Andrew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pedretti, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grant, Ryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brightwell, Ron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-05-01

    While large-scale simulations have been the hallmark of the High Performance Computing (HPC) community for decades, Large Scale Data Analytics (LSDA) workloads are gaining attention within the scientific community not only as a processing component to large HPC simulations, but also as standalone scientific tools for knowledge discovery. With the path towards Exascale, new HPC runtime systems are also emerging in a way that differs from classical distributed computing models. However, system software for such capabilities on the latest extreme-scale DOE supercomputers needs to be enhanced to more appropriately support these types of emerging software ecosystems. In this paper, we propose the use of Virtual Clusters on advanced supercomputing resources to enable systems to support not only HPC workloads, but also emerging big data stacks. Specifically, we have deployed the KVM hypervisor within Cray's Compute Node Linux on an XC-series supercomputer testbed. We also use libvirt and QEMU to manage and provision VMs directly on compute nodes, leveraging Ethernet-over-Aries network emulation. To our knowledge, this is the first known use of KVM on a true MPP supercomputer. We investigate the overhead of our solution using HPC benchmarks, evaluating both single-node performance and weak scaling of a 32-node virtual cluster. Overall, we find single-node performance of our solution using KVM on a Cray is very efficient with near-native performance. However, overhead increases by up to 20% as virtual cluster size increases, due to limitations of the Ethernet-over-Aries bridged network. Furthermore, we deploy Apache Spark with large data analysis workloads in a Virtual Cluster, effectively demonstrating how diverse software ecosystems can be supported by High Performance Virtual Clusters.
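    Provisioning a VM through libvirt and QEMU, as mentioned above, can be sketched with the libvirt Python bindings (a generic illustration with a hypothetical domain name, sizing and image path; the Cray-specific pieces such as the Ethernet-over-Aries bridge configuration are not shown):

        import libvirt

        # Minimal KVM domain definition; name, sizing and image path are hypothetical.
        domain_xml = """
        <domain type='kvm'>
          <name>vnode0</name>
          <memory unit='GiB'>4</memory>
          <vcpu>4</vcpu>
          <os><type arch='x86_64'>hvm</type></os>
          <devices>
            <disk type='file' device='disk'>
              <driver name='qemu' type='qcow2'/>
              <source file='/var/lib/libvirt/images/vnode0.qcow2'/>
              <target dev='vda' bus='virtio'/>
            </disk>
          </devices>
        </domain>
        """

        conn = libvirt.open("qemu:///system")   # connect to the local QEMU/KVM driver
        dom = conn.defineXML(domain_xml)        # register the domain definition
        dom.create()                            # boot the virtual compute node
        print(dom.name(), "running:", dom.isActive() == 1)
        conn.close()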

  20. Application of Supercomputer Technologies for Simulation Of Socio-Economic Systems

    Directory of Open Access Journals (Sweden)

    Vladimir Valentinovich Okrepilov

    2015-06-01

    Full Text Available To date, extensive experience has been accumulated in the investigation of problems related to quality, the assessment of management systems, and the modeling of economic system sustainability. These studies have created a basis for the development of a new research area — Economics of Quality. Its tools make it possible to use model simulation for the construction of mathematical models that adequately reflect the role of quality in the natural, technical and social regularities governing complex socio-economic systems. Extensive application and development of models, together with system modeling using supercomputer technologies, will, in our firm belief, bring research on socio-economic systems to an essentially new level. Moreover, the current research makes a significant contribution to the model simulation of multi-agent social systems and, no less important, belongs to the priority areas in the development of science and technology in our country. This article is devoted to the application of supercomputer technologies in the social sciences, first of all to the technical realization of large-scale agent-focused models (AFM). The essence of this tool is that, owing to the increase in computer power, it has become possible to describe the behavior of many separate fragments of a complex system, such as a socio-economic system. The article also deals with the experience of foreign scientists and practitioners in running AFMs on supercomputers, as well as an example of an AFM developed at CEMI RAS; the stages and methods of efficiently mapping the computational kernel of a multi-agent system onto the architecture of a modern supercomputer are analyzed. Experiments based on model simulation for forecasting the population of St. Petersburg according to three scenarios, as one of the major factors influencing the development of the socio-economic system and the quality of life of the population, are presented in the
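    As a toy illustration of the scenario-based population forecasting mentioned above (invented growth rates and starting value; the CEMI RAS agent-based model aggregates individual agent behaviour rather than applying a single growth rate):

        def project(pop0, annual_rate, years):
            """Compound a single annual growth rate over a number of years."""
            pop = pop0
            for _ in range(years):
                pop *= 1.0 + annual_rate
            return pop

        scenarios = {"low": -0.003, "baseline": 0.001, "high": 0.005}
        for name, rate in scenarios.items():
            print(name, round(project(5_000_000, rate, 15)))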

  1. Heat dissipation computations of a HVDC ground electrode using a supercomputer

    International Nuclear Information System (INIS)

    Greiss, H.; Mukhedkar, D.; Lagace, P.J.

    1990-01-01

    This paper reports on the temperature of the soil surrounding a High Voltage Direct Current (HVDC) toroidal ground electrode of practical dimensions, in both homogeneous and non-homogeneous soils, computed at incremental points in time using finite difference methods on a supercomputer. Curves of the response were computed and plotted at several locations within the soil in the vicinity of the ground electrode for various values of the soil parameters.
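    The kind of finite-difference update iterated by such codes can be sketched for the plain 2-D heat equation (a generic illustration; the HVDC study adds the Joule-heating source, toroidal electrode geometry and layered soil properties, none of which are modeled here):

        import numpy as np

        def step_heat_2d(T, alpha, dx, dt):
            """One explicit finite-difference update of the 2-D heat equation.

            Stable for dt <= dx**2 / (4 * alpha).
            """
            Tn = T.copy()
            Tn[1:-1, 1:-1] = T[1:-1, 1:-1] + alpha * dt / dx**2 * (
                T[2:, 1:-1] + T[:-2, 1:-1] + T[1:-1, 2:] + T[1:-1, :-2] - 4 * T[1:-1, 1:-1]
            )
            return Tn

        # Example: 20 C soil with a hot patch standing in for the electrode region.
        T = np.full((50, 50), 20.0)
        T[24:26, 24:26] = 60.0
        for _ in range(100):
            T = step_heat_2d(T, alpha=1e-6, dx=0.1, dt=1000.0)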

  2. Environment

    International Nuclear Information System (INIS)

    McIntyre, A.D.; Turnbull, R.G.H.

    1992-01-01

    The development of the hydrocarbon resources of the North Sea has resulted in both offshore and onshore environmental repercussions, involving the existing physical attributes of the sea and seabed, the coastline and adjoining land. The social and economic repercussions of the industry were equally widespread. The dramatic and speedy impact of the exploration and exploitation of the northern North Sea resources on the physical resources of Scotland in the early 1970s was quickly realised, together with the concern that any environmental and social damage to the physical and social fabric should be kept to a minimum. To this end, a wide range of research and other activities by central and local government, and other interested agencies was undertaken to extend existing knowledge on the marine and terrestrial environments that might be affected by the oil and gas industry. The outcome of these activities is summarized in this paper. The topics covered include a survey of the marine ecosystems of the North Sea, the fishing industry, the impact of oil pollution on seabirds and fish stocks, the ecology of the Scottish coastline and the impact of the petroleum industry on a selection of particular sites. (author)

  3. High Performance Simulation of Large-Scale Red Sea Ocean Bottom Seismic Data on the Supercomputer Shaheen II

    KAUST Repository

    Tonellot, Thierry; Etienne, Vincent; Gashawbeza, Ewenet; Curiel, Emesto Sandoval; Khan, Azizur; Feki, Saber; Kortas, Samuel

    2017-01-01

    three days. After careful optimization of the finite difference kernel, each gather was computed at 184 gigaflops, on average. Up to 6,103 nodes could be used during the computation, resulting in a peak computation speed greater than 1.11 petaflops. The synthetic seismic data using the planned survey geometry was available one month before the actual acquisition, allowing for early real scale validation of our processing and imaging workflows. Moreover, the availability of a massive supercomputer such as Shaheen II enables fast reverse time migration (RTM) and full waveform inversion, and therefore, a more accurate velocity model estimation for future work.

  4. High Performance Simulation of Large-Scale Red Sea Ocean Bottom Seismic Data on the Supercomputer Shaheen II

    KAUST Repository

    Tonellot, Thierry

    2017-02-27

    than three days. After careful optimization of the finite difference kernel, each gather was computed at 184 gigaflops, on average. Up to 6,103 nodes could be used during the computation, resulting in a peak computation speed greater than 1.11 petaflops. The synthetic seismic data using the planned survey geometry was available one month before the actual acquisition, allowing for early real scale validation of our processing and imaging workflows. Moreover, the availability of a massive supercomputer such as Shaheen II enables fast reverse time migration (RTM) and full waveform inversion, and therefore, a more accurate velocity model estimation for future work.

  5. Analysis of the interrelationship of energy, economy, and environment: A model of a sustainable energy future for Korea

    Science.gov (United States)

    Boo, Kyung-Jin

    The primary purpose of this dissertation is to provide the groundwork for a sustainable energy future in Korea. For this purpose, a conceptual framework of sustainable energy development was developed to provide a deeper understanding of interrelationships between energy, the economy, and the environment (E 3). Based on this theoretical work, an empirical simulation model was developed to investigate the ways in which E3 interact. This dissertation attempts to develop a unified concept of sustainable energy development by surveying multiple efforts to integrate various definitions of sustainability. Sustainable energy development should be built on the basis of three principles: ecological carrying capacity, economic efficiency, and socio-political equity. Ecological carrying capacity delineates the earth's resource constraints as well as its ability to assimilate wastes. Socio-political equity implies an equitable distribution of the benefits and costs of energy consumption and an equitable distribution of environmental burdens. Economic efficiency dictates efficient allocation of scarce resources. The simulation model is composed of three modules: an energy module, an environmental module and an economic module. Because the model is grounded on economic structural behaviorism, the dynamic nature of the current economy is effectively depicted and simulated through manipulating exogenous policy variables. This macro-economic model is used to simulate six major policy intervention scenarios. Major findings from these policy simulations were: (1) carbon taxes are the most effective means of reducing air-pollutant emissions; (2) sustainable energy development can be achieved through reinvestment of carbon taxes into energy efficiency and renewable energy programs; and (3) carbon taxes would increase a nation's welfare if reinvested in relevant areas. The policy simulation model, because it is based on neoclassical economics, has limitations such that it cannot fully

  6. Combining density functional theory calculations, supercomputing, and data-driven methods to design new materials (Conference Presentation)

    Science.gov (United States)

    Jain, Anubhav

    2017-04-01

    Density functional theory (DFT) simulations solve for the electronic structure of materials starting from the Schrödinger equation. Many case studies have now demonstrated that researchers can often use DFT to design new compounds in the computer (e.g., for batteries, catalysts, and hydrogen storage) before synthesis and characterization in the lab. In this talk, I will focus on how DFT calculations can be executed on large supercomputing resources in order to generate very large data sets on new materials for functional applications. First, I will briefly describe the Materials Project, an effort at LBNL that has virtually characterized over 60,000 materials using DFT and has shared the results with over 17,000 registered users. Next, I will talk about how such data can help discover new materials, describing how preliminary computational screening led to the identification and confirmation of a new family of bulk AMX2 thermoelectric compounds with measured zT reaching 0.8. I will outline future plans for how such data-driven methods can be used to better understand the factors that control thermoelectric behavior, e.g., for the rational design of electronic band structures, in ways that are different from conventional approaches.

  7. Visualization on supercomputing platform level II ASC milestone (3537-1B) results from Sandia.

    Energy Technology Data Exchange (ETDEWEB)

    Geveci, Berk (Kitware, Inc., Clifton Park, NY); Fabian, Nathan; Marion, Patrick (Kitware, Inc., Clifton Park, NY); Moreland, Kenneth D.

    2010-09-01

    This report provides documentation for the completion of the Sandia portion of the ASC Level II Visualization on the platform milestone. This ASC Level II milestone is a joint milestone between Sandia National Laboratories and Los Alamos National Laboratories. This milestone contains functionality required for performing visualization directly on a supercomputing platform, which is necessary for peta-scale visualization. Sandia's contribution concerns in-situ visualization, running a visualization in tandem with a solver. Visualization and analysis of petascale data are limited by several factors which must be addressed as ACES delivers the Cielo platform. Two primary difficulties are: (1) Performance of interactive rendering, which is the most computationally intensive portion of the visualization process. For terascale platforms, commodity clusters with graphics processors (GPUs) have been used for interactive rendering. For petascale platforms, visualization and rendering may be able to run efficiently on the supercomputer platform itself. (2) I/O bandwidth, which limits how much information can be written to disk. If we simply analyze the sparse information that is saved to disk we miss the opportunity to analyze the rich information produced every timestep by the simulation. For the first issue, we are pursuing in-situ analysis, in which simulations are coupled directly with analysis libraries at runtime. This milestone will evaluate the visualization and rendering performance of current and next generation supercomputers in contrast to GPU-based visualization clusters, and evaluate the performance of common analysis libraries coupled with the simulation that analyze and write data to disk during a running simulation. This milestone will explore, evaluate and advance the maturity level of these technologies and their applicability to problems of interest to the ASC program. Scientific simulation on parallel supercomputers is traditionally performed in four

  8. ATLAS FTK a - very complex - custom parallel supercomputer

    CERN Document Server

    Kimura, Naoki; The ATLAS collaboration

    2016-01-01

    In the ever-increasing pile-up LHC environment, advanced techniques of analysing the data are implemented in order to increase the rate of relevant physics processes with respect to background processes. The Fast TracKer (FTK) is a track finding implementation at hardware level that is designed to deliver full-scan tracks with $p_{T}$ above 1GeV to the ATLAS trigger system for every L1 accept (at a maximum rate of 100kHz). In order to achieve this performance a highly parallel system was designed and it is now under installation in ATLAS. At the beginning of 2016 it will provide tracks for the trigger system in a region covering the central part of the ATLAS detector, and during the year its coverage will be extended to the full detector. The system relies on matching hits coming from the silicon tracking detectors against 1 billion patterns stored in specially designed ASIC chips (Associative Memory - AM06). In a first stage, coarse-resolution hits are matched against the patterns and the accepted h...
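
    The associative-memory matching step described above can be illustrated with a toy sketch: coarse-resolution hits from several detector layers are compared against a small pattern bank, allowing a limited number of missing layers. The bank contents and layer layout below are invented for illustration and are not the FTK configuration.

```python
# Toy sketch of associative-memory style pattern matching: coarse-resolution
# hits from several detector layers are compared against a small pattern bank.
# Bank contents and layer layout are invented for illustration only.
from collections import defaultdict

pattern_bank = {
    0: (12, 7, 33, 21),   # pattern id -> expected coarse hit per layer
    1: (12, 8, 34, 22),
    2: (40, 41, 42, 43),
}

def match_roads(coarse_hits, max_missing=1):
    """Return pattern ids whose expected hits appear in enough layers."""
    matched = []
    for pid, expected in pattern_bank.items():
        missing = sum(1 for layer, h in enumerate(expected)
                      if h not in coarse_hits.get(layer, set()))
        if missing <= max_missing:
            matched.append(pid)
    return matched

event_hits = defaultdict(set, {0: {12, 40}, 1: {7}, 2: {33}, 3: {21, 22}})
print(match_roads(event_hits))   # -> [0]
```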

  9. Evaluating Satellite and Supercomputing Technologies for Improved Coastal Ecosystem Assessments

    Science.gov (United States)

    McCarthy, Matthew James

    Water quality and wetlands represent two vital elements of a healthy coastal ecosystem. Both experienced substantial declines in the U.S. during the 20th century. Overall coastal wetland cover decreased over 50% in the 20th century due to coastal development and water pollution. Management and legislative efforts have successfully addressed some of the problems and threats, but recent research indicates that the diffuse impacts of climate change and non-point source pollution may be the primary drivers of current and future water-quality and wetland stress. In order to respond to these pervasive threats, traditional management approaches need to adopt modern technological tools for more synoptic, frequent and fine-scale monitoring and assessment. In this dissertation, I explored some of the applications possible with new, commercial satellite imagery to better assess the status of coastal ecosystems. Large-scale land-cover change influences the quality of adjacent coastal water. Satellite imagery has been used to derive land-cover maps since the 1960's. It provides multiple data points with which to evaluate the effects of land-cover change on water quality. The objective of the first chapter of this research was to determine how 40 years of land-cover change in the Tampa Bay watershed (6,500 km2) may have affected turbidity and chlorophyll concentration - two proxies for coastal water quality. Land cover classes were evaluated along with precipitation and wind stress as explanatory variables. Results varied between analyses for the entire estuary and those of segments within the bay. Changes in developed land percent cover best explained the turbidity and chlorophyll-concentration time series for the entire bay (R2 > 0.75, p Ocean-color satellite imagery was used to derive proxies for coastal water with near-daily satellite observations since 2000. The goal of chapter two was to identify drivers of turbidity variability for 11 National Estuary Program water bodies

  10. The genesis of neurosurgery and the evolution of the neurosurgical operative environment: part II--concepts for future development, 2003 and beyond.

    Science.gov (United States)

    Liu, Charles Y; Spicer, Mark; Apuzzo, Michael L J

    2003-01-01

    The future development of the neurosurgical operative environment is driven principally by concurrent development in science and technology. In the new millennium, these developments are taking on a Jules Verne quality, with the ability to construct and manipulate the human organism and its surroundings at the level of atoms and molecules seemingly at hand. Thus, an examination of currents in technology advancement from the neurosurgical perspective can provide insight into the evolution of the neurosurgical operative environment. In the future, the optimal design solution for the operative environment requirements of specialized neurosurgery may take the form of composites of venues that are currently mutually distinct. Advances in microfabrication technology and laser optical manipulators are expanding the scope and role of robotics, with novel opportunities for bionic integration. Assimilation of biosensor technology into the operative environment promises to provide neurosurgeons of the future with a vastly expanded set of physiological data, which will require concurrent simplification and optimization of analysis and presentation schemes to facilitate practical usefulness. Nanotechnology derivatives are shattering the maximum limits of resolution and magnification allowed by conventional microscopes. Furthermore, quantum computing and molecular electronics promise to greatly enhance computational power, allowing the emerging reality of simulation and virtual neurosurgery for rehearsal and training purposes. Progressive minimalism is evident throughout, leading ultimately to a paradigm shift as the nanoscale is approached. At the interface between the old and new technological paradigms, issues related to integration may dictate the ultimate emergence of the products of the new paradigm. Once initiated, however, history suggests that the process of change will proceed rapidly and dramatically, with the ultimate neurosurgical operative environment of the future

  11. Watson will see you now: a supercomputer to help clinicians make informed treatment decisions.

    Science.gov (United States)

    Doyle-Lindrud, Susan

    2015-02-01

    IBM has collaborated with several cancer care providers to develop and train the IBM supercomputer Watson to help clinicians make informed treatment decisions. When a patient is seen in clinic, the oncologist can input all of the clinical information into the computer system. Watson will then review all of the data and recommend treatment options based on the latest evidence and guidelines. Once the oncologist makes the treatment decision, this information can be sent directly to the insurance company for approval. Watson has the ability to standardize care and accelerate the approval process, a benefit to the healthcare provider and the patient.

  12. Grassroots Supercomputing

    CERN Multimedia

    Buchanan, Mark

    2005-01-01

    What started out as a way for SETI to plow through its piles of radio-signal data from deep space has turned into a powerful research tool as computer users across the globe donate their screen-saver time to projects as diverse as climate-change prediction, gravitational-wave searches, and protein folding (4 pages)

  13. Evaluation of existing and proposed computer architectures for future ground-based systems

    Science.gov (United States)

    Schulbach, C.

    1985-01-01

    Parallel processing architectures and techniques used in current supercomputers are described and projections are made of future advances. Presently, the von Neumann sequential processing pattern has been accelerated by having separate I/O processors, interleaved memories, wide memories, independent functional units and pipelining. Recent supercomputers have featured single-instruction, multiple-data-stream architectures, which have different processors for performing various operations (vector or pipeline processors). Multiple-instruction, multiple-data-stream machines have also been developed. Data flow techniques, wherein program instructions are activated only when data are available, are expected to play a large role in future supercomputers, along with increased parallel processor arrays. The enhanced operational speeds are essential for adequately treating data from future spacecraft remote sensing instruments such as the Thematic Mapper.

  14. Frequently updated noise threat maps created with use of supercomputing grid

    Directory of Open Access Journals (Sweden)

    Szczodrak Maciej

    2014-09-01

    Full Text Available Innovative supercomputing grid services devoted to noise threat evaluation are presented. The services described in this paper concern two issues: the first is related to noise mapping, while the second one focuses on assessment of the noise dose and its influence on the human hearing system. The discussed services were developed within the PL-Grid Plus Infrastructure, which brings together Polish academic supercomputer centers. Selected experimental results achieved by using the proposed services are presented. The assessment of environmental noise threats includes creation of noise maps using either offline or online data acquired through a grid of monitoring stations. A concept of estimating the source model parameters from the measured sound levels for the purpose of creating frequently updated noise maps is presented. Connecting the noise mapping grid service with a distributed sensor network makes it possible to automatically update noise maps for a specified time period. Moreover, a unique attribute of the developed software is the estimation of the auditory effects evoked by the exposure to noise. The estimation method uses a modified psychoacoustic model of hearing and is based on the calculated noise level values and on the given exposure period. Potential use scenarios of the grid services for research or educational purposes are introduced. Presentation of the predicted hearing threshold shift caused by exposure to excessive noise can raise public awareness of noise threats.

  15. Integration of PanDA workload management system with Titan supercomputer at OLCF

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00300320; Klimentov, Alexei; Oleynik, Danila; Panitkin, Sergey; Petrosyan, Artem; Vaniachine, Alexandre; Wenaus, Torre; Schovancova, Jaroslava

    2015-01-01

    The PanDA (Production and Distributed Analysis) workload management system (WMS) was developed to meet the scale and complexity of LHC distributed computing for the ATLAS experiment. While PanDA currently distributes jobs to more than 100,000 cores at well over 100 Grid sites, next LHC data taking run will require more resources than Grid computing can possibly provide. To alleviate these challenges, ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF). Current approach utilizes modified PanDA pilot framework for job submission to Titan's batch queues and local data management, with light-weight MPI wrappers to run single threaded workloads in parallel on Titan's multi-core worker nodes. It also gives PanDA new capability to collect, in real time, information about unused...
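
    The light-weight MPI wrappers mentioned in the record fan independent single-threaded payloads out over the cores of a node. A minimal sketch of that general idea with mpi4py (assumed to be installed) follows; it is not the actual PanDA pilot code, and the payload command and work list are placeholders.

```python
# Minimal sketch of a light-weight MPI wrapper that fans independent,
# single-threaded payload commands out to MPI ranks (mpi4py assumed installed).
# The payload command and work list are placeholders, not PanDA pilot internals.
import subprocess
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

work_items = [f"input_{i:04d}.dat" for i in range(128)]   # hypothetical inputs

# Static round-robin assignment: rank r handles items r, r + size, r + 2*size, ...
for item in work_items[rank::size]:
    # Each rank runs an ordinary serial executable on its own slice of work.
    subprocess.run(["echo", "processing", item], check=True)

comm.Barrier()
if rank == 0:
    print("all payload tasks finished")
```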

  16. Integration of PanDA workload management system with Titan supercomputer at OLCF

    CERN Document Server

    Panitkin, Sergey; The ATLAS collaboration; Klimentov, Alexei; Oleynik, Danila; Petrosyan, Artem; Schovancova, Jaroslava; Vaniachine, Alexandre; Wenaus, Torre

    2015-01-01

    The PanDA (Production and Distributed Analysis) workload management system (WMS) was developed to meet the scale and complexity of LHC distributed computing for the ATLAS experiment. While PanDA currently uses more than 100,000 cores at well over 100 Grid sites with a peak performance of 0.3 petaFLOPS, next LHC data taking run will require more resources than Grid computing can possibly provide. To alleviate these challenges, ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF). Current approach utilizes modified PanDA pilot framework for job submission to Titan's batch queues and local data management, with light-weight MPI wrappers to run single threaded workloads in parallel on Titan's multi-core worker nodes. It also gives PanDA new capability to collect, in real tim...

  17. Feynman diagrams sampling for quantum field theories on the QPACE 2 supercomputer

    Energy Technology Data Exchange (ETDEWEB)

    Rappl, Florian

    2016-08-01

    This work discusses the application of Feynman diagram sampling in quantum field theories. The method uses a computer simulation to sample the diagrammatic space obtained in a series expansion. For running large physical simulations powerful computers are obligatory, effectively splitting the thesis in two parts. The first part deals with the method of Feynman diagram sampling. Here the theoretical background of the method itself is discussed. Additionally, important statistical concepts and the theory of the strong force, quantum chromodynamics, are introduced. This sets the context of the simulations. We create and evaluate a variety of models to estimate the applicability of diagrammatic methods. The method is then applied to sample the perturbative expansion of the vertex correction. In the end we obtain the value for the anomalous magnetic moment of the electron. The second part looks at the QPACE 2 supercomputer. This includes a short introduction to supercomputers in general, as well as a closer look at the architecture and the cooling system of QPACE 2. Guiding benchmarks of the InfiniBand network are presented. At the core of this part, a collection of best practices and useful programming concepts are outlined, which enables the development of efficient, yet easily portable, applications for the QPACE 2 system.

  18. Use of high performance networks and supercomputers for real-time flight simulation

    Science.gov (United States)

    Cleveland, Jeff I., II

    1993-01-01

    In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations must be consistent in processing time and be completed in as short a time as possible. These operations include simulation mathematical model computation and data input/output to the simulators. In 1986, in response to increased demands for flight simulation performance, NASA's Langley Research Center (LaRC), working with the contractor, developed extensions to the Computer Automated Measurement and Control (CAMAC) technology which resulted in a factor of ten increase in the effective bandwidth and reduced latency of modules necessary for simulator communication. This technology extension is being used by more than 80 leading technological developers in the United States, Canada, and Europe. Included among the commercial applications are nuclear process control, power grid analysis, process monitoring, real-time simulation, and radar data acquisition. Personnel at LaRC are completing the development of the use of supercomputers for mathematical model computation to support real-time flight simulation. This includes the development of a real-time operating system and development of specialized software and hardware for the simulator network. This paper describes the data acquisition technology and the development of supercomputing for flight simulation.

  19. Communication Characterization and Optimization of Applications Using Topology-Aware Task Mapping on Large Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Sreepathi, Sarat [ORNL; D' Azevedo, Eduardo [ORNL; Philip, Bobby [ORNL; Worley, Patrick H [ORNL

    2016-01-01

    On large supercomputers, the job scheduling systems may assign a non-contiguous node allocation for user applications depending on available resources. With parallel applications using MPI (Message Passing Interface), the default process ordering does not take into account the actual physical node layout available to the application. This contributes to non-locality in terms of physical network topology and impacts communication performance of the application. In order to mitigate such performance penalties, this work describes techniques to identify suitable task mapping that takes the layout of the allocated nodes as well as the application's communication behavior into account. During the first phase of this research, we instrumented and collected performance data to characterize communication behavior of critical US DOE (United States - Department of Energy) applications using an augmented version of the mpiP tool. Subsequently, we developed several reordering methods (spectral bisection, neighbor join tree etc.) to combine node layout and application communication data for optimized task placement. We developed a tool called mpiAproxy to facilitate detailed evaluation of the various reordering algorithms without requiring full application executions. This work presents a comprehensive performance evaluation (14,000 experiments) of the various task mapping techniques in lowering communication costs on Titan, the leadership class supercomputer at Oak Ridge National Laboratory.
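
    A toy version of the task-mapping idea described above: place the most heavily communicating task pairs on nearby nodes. The communication and distance matrices below are synthetic, and the greedy heuristic is only a stand-in for the reordering methods (spectral bisection, neighbor join tree, etc.) evaluated in the paper.

```python
# Toy greedy task mapping: place the most heavily communicating tasks on the
# closest nodes. Both matrices are synthetic, and this greedy heuristic stands
# in for the reordering methods evaluated in the paper.
import numpy as np

rng = np.random.default_rng(0)
n = 8
traffic = rng.integers(0, 100, (n, n))
traffic = traffic + traffic.T
np.fill_diagonal(traffic, 0)
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))   # 1-D "hop" distance

def mapping_cost(perm):
    """Total traffic-weighted hop count when task i runs on node perm[i]."""
    return sum(traffic[i, j] * dist[perm[i], perm[j]]
               for i in range(n) for j in range(i + 1, n))

# Greedy: order tasks by total traffic and give each the free node that
# minimises cost against the tasks already placed.
order = np.argsort(-traffic.sum(axis=1))
placement, free = {}, set(range(n))
for t in order:
    best = min(free, key=lambda node: sum(traffic[t, u] * dist[node, placement[u]]
                                          for u in placement))
    placement[t] = best
    free.remove(best)

perm = np.array([placement[i] for i in range(n)])
print("default (identity) cost:", mapping_cost(np.arange(n)))
print("greedy mapping cost    :", mapping_cost(perm))
```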

  20. Unique Methodologies for Nano/Micro Manufacturing Job Training Via Desktop Supercomputer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, Clyde [Northern Illinois Univ., DeKalb, IL (United States); Karonis, Nicholas [Northern Illinois Univ., DeKalb, IL (United States); Lurio, Laurence [Northern Illinois Univ., DeKalb, IL (United States); Piot, Philippe [Northern Illinois Univ., DeKalb, IL (United States); Xiao, Zhili [Northern Illinois Univ., DeKalb, IL (United States); Glatz, Andreas [Northern Illinois Univ., DeKalb, IL (United States); Pohlman, Nicholas [Northern Illinois Univ., DeKalb, IL (United States); Hou, Minmei [Northern Illinois Univ., DeKalb, IL (United States); Demir, Veysel [Northern Illinois Univ., DeKalb, IL (United States); Song, Jie [Northern Illinois Univ., DeKalb, IL (United States); Duffin, Kirk [Northern Illinois Univ., DeKalb, IL (United States); Johns, Mitrick [Northern Illinois Univ., DeKalb, IL (United States); Sims, Thomas [Northern Illinois Univ., DeKalb, IL (United States); Yin, Yanbin [Northern Illinois Univ., DeKalb, IL (United States)

    2012-11-21

    This project establishes an initiative in high speed (Teraflop)/large-memory desktop supercomputing for modeling and simulation of dynamic processes important for energy and industrial applications. It provides a training ground for employment of current students in an emerging field with skills necessary to access the large supercomputing systems now present at DOE laboratories. It also provides a foundation for NIU faculty to quantum leap beyond their current small cluster facilities. The funding extends faculty and student capability to a new level of analytic skills with concomitant publication avenues. The components of the Hewlett Packard computer obtained by the DOE funds create a hybrid combination of a Graphics Processing System (12 GPU/Teraflops) and a Beowulf CPU system (144 CPU), the first expandable via the NIU GAEA system to ~60 Teraflops integrated with a 720 CPU Beowulf system. The software is based on access to the NVIDIA/CUDA library and the ability through MATLAB multiple licenses to create additional local programs. A number of existing programs are being transferred to the CPU Beowulf Cluster. Since the expertise necessary to create the parallel processing applications has recently been obtained at NIU, this effort for software development is in an early stage. The educational program has been initiated via formal tutorials and classroom curricula designed for the coming year. Specifically, the cost focus was on hardware acquisitions and appointment of graduate students for a wide range of applications in engineering, physics and computer science.

  1. Computational Science with the Titan Supercomputer: Early Outcomes and Lessons Learned

    Science.gov (United States)

    Wells, Jack

    2014-03-01

    Modeling and simulation with petascale computing has supercharged the process of innovation and understanding, dramatically accelerating time-to-insight and time-to-discovery. This presentation will focus on early outcomes from the Titan supercomputer at the Oak Ridge National Laboratory. Titan has over 18,000 hybrid compute nodes consisting of both CPUs and GPUs. In this presentation, I will discuss the lessons we have learned in deploying Titan and preparing applications to move from conventional CPU architectures to a hybrid machine. I will present early results of materials applications running on Titan and the implications for the research community as we prepare for exascale supercomputers in the next decade. Lastly, I will provide an overview of user programs at the Oak Ridge Leadership Computing Facility with specific information on how researchers may apply for allocations of computing resources. This research used resources of the Oak Ridge Leadership Computing Facility at the Oak Ridge National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC05-00OR22725.

  2. Parallel Multivariate Spatio-Temporal Clustering of Large Ecological Datasets on Hybrid Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Sreepathi, Sarat [ORNL; Kumar, Jitendra [ORNL; Mills, Richard T. [Argonne National Laboratory; Hoffman, Forrest M. [ORNL; Sripathi, Vamsi [Intel Corporation; Hargrove, William Walter [United States Department of Agriculture (USDA), United States Forest Service (USFS)

    2017-09-01

    A proliferation of data from vast networks of remote sensing platforms (satellites, unmanned aircraft systems (UAS), airborne etc.), observational facilities (meteorological, eddy covariance etc.), state-of-the-art sensors, and simulation models offers unprecedented opportunities for scientific discovery. Unsupervised classification is a widely applied data mining approach to derive insights from such data. However, classification of very large data sets is a complex computational problem that requires efficient numerical algorithms and implementations on high performance computing (HPC) platforms. Additionally, increasing power, space, cooling and efficiency requirements have led to the deployment of hybrid supercomputing platforms with complex architectures and memory hierarchies like the Titan system at Oak Ridge National Laboratory. The advent of such accelerated computing architectures offers new challenges and opportunities for big data analytics in general and specifically, large scale cluster analysis in our case. Although there is an existing body of work on parallel cluster analysis, those approaches do not fully meet the needs imposed by the nature and size of our large data sets. Moreover, they had scaling limitations and were mostly limited to traditional distributed memory computing platforms. We present a parallel Multivariate Spatio-Temporal Clustering (MSTC) technique based on k-means cluster analysis that can target hybrid supercomputers like Titan. We developed a hybrid MPI, CUDA and OpenACC implementation that can utilize both CPU and GPU resources on computational nodes. We describe performance results on Titan that demonstrate the scalability and efficacy of our approach in processing large ecological data sets.
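
    Stripped of the MPI/CUDA/OpenACC parallelism that is the point of the paper, the core of the approach is k-means cluster analysis on multivariate samples. A serial scikit-learn sketch on synthetic data follows; the variable meanings and cluster structure are assumptions.

```python
# Serial k-means sketch on synthetic "ecological" samples: each row is a grid
# cell, each column a variable (e.g. temperature, precipitation, greenness);
# the parallel MPI/CUDA/OpenACC machinery of the paper is out of scope here.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
n_cells, n_vars = 10_000, 3
X = np.vstack([
    rng.normal(loc=m, scale=1.0, size=(n_cells // 2, n_vars))
    for m in (0.0, 5.0)                      # two synthetic "ecoregions"
])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print("cluster sizes:", np.bincount(km.labels_))
print("centroids:\n", km.cluster_centers_)
```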

  3. An Interface for Biomedical Big Data Processing on the Tianhe-2 Supercomputer.

    Science.gov (United States)

    Yang, Xi; Wu, Chengkun; Lu, Kai; Fang, Lin; Zhang, Yong; Li, Shengkang; Guo, Guixin; Du, YunFei

    2017-12-01

    Big data, cloud computing, and high-performance computing (HPC) are at the verge of convergence. Cloud computing is already playing an active part in big data processing with the help of big data frameworks like Hadoop and Spark. The recent upsurge of high-performance computing in China provides extra possibilities and capacity to address the challenges associated with big data. In this paper, we propose Orion-a big data interface on the Tianhe-2 supercomputer-to enable big data applications to run on Tianhe-2 via a single command or a shell script. Orion supports multiple users, and each user can launch multiple tasks. It minimizes the effort needed to initiate big data applications on the Tianhe-2 supercomputer via automated configuration. Orion follows the "allocate-when-needed" paradigm, and it avoids the idle occupation of computational resources. We tested the utility and performance of Orion using a big genomic dataset and achieved a satisfactory performance on Tianhe-2 with very few modifications to existing applications that were implemented in Hadoop/Spark. In summary, Orion provides a practical and economical interface for big data processing on Tianhe-2.

  4. MODERN INFORMATIONAL AND EDUCATIONAL ENVIRONMENT AS A FACTOR OF IMPROVEMENT OF FUTURE UNIVERSITY TEACHER’S PROFESSIONAL TRAINING

    Directory of Open Access Journals (Sweden)

    Наталія Гунька

    2014-10-01

    Full Text Available The article presents the analysis of the notion of “informational and educational environment”. The difference between the “informational environment” and the “informational and educational environment” has been shown. The main functions of the informational and educational environment and its facilities to enhance the quality of education have been revealed. There have been defined major components of the informational and educational environment. A connection has been detected between the informational and educational environment and the formation of the foundations of pedagogical skills. The research also presents a description of the electronic system “Socrates” of Vinnytsia State Agrarian University, and shows possible ways of its use in the educational process with an aim of the formation and improvement of higher educational institution teacher’s professional skills.

  5. Fire evolution in the radioactive forests of Ukraine and Belarus: future risks for the population and the environment

    Science.gov (United States)

    N. Evangeliou; Y. Balkanski; A. Cozic; WeiMin Hao; F. Mouillot; K. Thonicke; R. Paugam; S. Zibtsev; T. A. Mousseau; R. Wang; B. Poulter; A. Petkov; C. Yue; P. Cadule; B. Koffi; J. W. Kaiser; A. P. Moller

    2015-01-01

    In this paper, we analyze the current and future status of forests in Ukraine and Belarus that were contaminated after the nuclear disaster in 1986. Using several models, together with remote-sensing data and observations, we studied how climate change in these forests may affect fire regimes. We investigated the possibility of 137Cs displacement over Europe...

  6. RIIHI. Radical innovations for combatting climate change. Results from Futures Clinique by the Ministry of the Environment; RIIHI. Radikaalit innovaatiot ilmastonmuutoksen hillitsemiseksi. RIIHI-tulevaisuusklinikan tulokset

    Energy Technology Data Exchange (ETDEWEB)

    Heinonen, S.; Keskinen, A.; Ruotsalainen, J.

    2011-07-01

    This report presents the starting poits, implementations and the results of the Futures Clinique commissioned by the Ministry of the Environment and conducted by Finland Futures Research Centre. The theme for the clinique was 'Radical innovations to constrain the climate change by the year 2050'. The focus was set on the households, since the attention has previously been on the industry and production. The time frame from the present to 2050 was chosen because achieving the goals set by EU's climate policy requires emission cuts of 80% by the year 2050. The carbon footprint of households was examined from the perspectives of energy, traffic, food and water and leisure, entertainment and communications. As desirable goals regarding these aspects the participants studied economic efficiency, sustainability and durability, safety and security, healthinesss and comfortable living. The groups discussed also on converging NBIC technologies (nano, bio, information and cognitive technologies) that could enable these goals. According to the concept of Futures Clinique the work was conducted in groups, in which topics attuned by pre-tasks done by the participants were worked on by the methods of the Futures Wheel and the Futures Table. From each group's working resulted various both technological and socio-cultural radical innovations. (orig.)

  7. Healthy and sustainable diets: Community concern about the effect of the future food environments and support for government regulating sustainable food supplies in Western Australia.

    Science.gov (United States)

    Harray, Amelia J; Meng, Xingqiong; Kerr, Deborah A; Pollard, Christina M

    2018-06-01

    To determine the level of community concern about future food supplies and perception of the importance placed on government regulation over the supply of environmentally friendly food and identify dietary and other factors associated with these beliefs in Western Australia. Data from the 2009 and 2012 Nutrition Monitoring Survey Series computer-assisted telephone interviews were pooled. Level of concern about the effect of the environment on future food supplies and importance of government regulating the supply of environmentally friendly food were measured. Multivariate regression analysed potential associations with sociodemographic variables, dietary health consciousness, weight status and self-reported intake of eight foods consistent with a sustainable diet. Western Australia. Community-dwelling adults aged 18-64 years (n = 2832). Seventy-nine per cent of Western Australians were 'quite' or 'very' concerned about the effect of the environment on future food supplies. Respondents who paid less attention to the health aspects of their diet were less likely than those who were health conscious ('quite' or 'very' concerned) (OR = 0.53, 95% CI [0.35, 0.8] and 0.38 [0.17, 0.81] respectively). The majority of respondents (85.3%) thought it was 'quite' or 'very' important that government had regulatory control over an environmentally friendly food supply. Females were more likely than males to rate regulatory control as 'quite' or 'very' important (OR = 1.63, 95% CI [1.09, 2.44], p = .02). Multiple regression modeling found that no other factors predicted concern or importance. There is a high level of community concern about the impact of the environment on future food supplies and most people believe it is important that the government regulates the issue. These attitudes dominate regardless of sociodemographic characteristics, weight status or sustainable dietary behaviours. Copyright © 2018 Elsevier Ltd. All rights reserved.
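
    The odds ratios quoted in the record are the kind of output produced by a multivariate logistic regression on survey responses. A generic statsmodels sketch on synthetic data follows; it does not use the Nutrition Monitoring Survey Series dataset, and the variable names and effect sizes are invented.

```python
# Generic sketch of estimating odds ratios from survey-style data with a
# logistic regression (statsmodels); variables and data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2832
df = pd.DataFrame({
    "female": rng.integers(0, 2, n),
    "health_conscious": rng.integers(0, 2, n),
})
# Simulate the binary outcome "concerned about future food supplies".
logit_p = -0.5 + 0.5 * df["female"] + 0.7 * df["health_conscious"]
df["concerned"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(df[["female", "health_conscious"]])
fit = sm.Logit(df["concerned"], X).fit(disp=False)

odds_ratios = np.exp(fit.params)          # exponentiated coefficients = ORs
conf_int = np.exp(fit.conf_int())         # 95% confidence intervals
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```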

  8. Focal relationships and the environment of project marketing. A literature review with suggestions for practitioners and future research

    DEFF Research Database (Denmark)

    Skaates, Maria Anne; Tikkanen, Henrikki

    2000-01-01

    Project marketing is an important mode of business-to-business marketing today. This paper assesses recent project marketing contributions, including predominantly those of members of the (mainly European) International Network for Project Marketing and Systems Selling (INPM). The emphasis of the review is upon the connection between focal relationships and the wider environment in which project marketing and systems selling takes place. First, several common definitions of projects and project marketing are presented and discussed. Second, the implications of three specific features of project business - discontinuity, uniqueness, and complexity - for the focal relationship and the broader marketing environment are considered at the level of multiple projects. Third, three overlapping types of postures that project-selling firms can adopt in relation to their focal relationships...

  9. Ignalina NPP its environment, safety and future, prospects of the energetic, ethnic and cultural situation: expert evaluation

    International Nuclear Information System (INIS)

    Morkunas, Z. V.; Ciuzas, A.; Jonaitis, V.; Sutiniene, I.

    1995-01-01

    According to the tasks defined in the 'Atomic Energy and the Environment' programme, an expert evaluation survey concerning the Ignalina NPP, its consequences and its prospects was carried out for the first time in Lithuania, following the concept that had been prepared. The results of the survey analysis, performed by Lithuanian experts, are presented. The investigation covered the following problems: evolution of the technical state, safety, use and prospects of the nuclear power plant; evaluation of the activities of governmental and social institutions in connection with the nuclear power plant; the Ignalina NPP and the environment; the effect of the nuclear power plant on agricultural activities and development; evolution of the ethnic and cultural situation; and conclusions and recommendations for regulation of those areas. (author). 2 refs., 11 figs

  10. PSYCHOLOGICAL STRATEGY OF COOPERATION, MOTIVATIONAL, INFORMATION AND TECHNOLOGICAL COMPONENTS OF FUTURE HUMANITARIAN TEACHER READINESS FOR PROFESSIONAL ACTIVITY IN POLYSUBJECTIVE LEARNING ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Y. Spivakovska

    2014-04-01

    Full Text Available The redefinition of modern information and communication technologies (ICT) from teaching aids into subjects of the teaching process, and the continuous growth of their subjectivity, demands appropriate knowledge and skills, an appropriate attitude to the didactic capabilities of ICT, and the ability to cooperate with them and to build pupils' learning activity so as to form and develop self-organization and self-development skills and to promote the pupils' subjective position in obtaining education; together these constitute the readiness of a modern teacher to organize effective professional activity in a polysubjective learning environment (PLE). The new tasks of the humanitarian teacher, related to the selection and design of educational content as well as to the modeling of the learning process when choosing among the virtualized alternatives of a PLE, impose special requirements on professionally important personality qualities of the teacher, or rather on his or her readiness to carry out effective professional work in such conditions. This article substantiates the essence of the concept of the future humanitarian teacher's readiness for professional activity in a polysubjective educational environment. The structure of this readiness is analyzed. The psychological strategy of cooperation and the reflective, motivational and informational components are substantiated and characterized as components of the future humanitarian teacher's readiness for professional activity in a polysubjective educational environment.

  11. Developing a General Decision Tool for Future Cancer Care: Getting Feedback from Users in Busy Hospital Environments

    DEFF Research Database (Denmark)

    Dankl, Kathrina; Akoglu, Canan; Dahl Steffensen, Karina

    2017-01-01

    a specific challenge for involving all stakeholders in the design development process and therefore require the development of methods that work in busy healthcare environments. Based on this perspective, the abstract presents an ongoing research collaboration (started in 2014) between The Patients Cancer...... and patients choose the same design proposals - 90% went for the identical design suggestion, commenting on it as being friendly, easy to understand and positive. In one oncology department’s staff room an alternative proposal received the majority of votes. Conclusions: Human-centered design in healthcare...

  12. The I.A.G. / A.I.G. SEDIBUD (Sediment Budgets in Cold Environments) Programme: Current and future activities

    Science.gov (United States)

    Beylich, Achim A.; Lamoureux, Scott; Decaulne, Armelle

    2013-04-01

    Projected climate change in cold regions is expected to alter melt season duration and intensity, along with the number of extreme rainfall events, total annual precipitation and the balance between snowfall and rainfall. Similarly, changes to the thermal balance are expected to reduce the extent of permafrost and seasonal ground frost and increase active layer depths. These effects will undoubtedly change surface environments in cold regions and alter the fluxes of sediments, nutrients and solutes, but the absence of quantitative data and coordinated geomorphic process monitoring and analysis to understand the sensitivity of the Earth surface environment is acute in cold climate environments. The International Association of Geomorphologists (I.A.G. / A.I.G. ) SEDIBUD (Sediment Budgets in Cold Environments) Programme was formed in 2005 to address this existing key knowledge gap. SEDIBUD currently has about 400 members worldwide and the Steering Committee of this international programme is composed of ten scientists from eight different countries: Achim A. Beylich (Chair) (Norway), Armelle Decaulne (Secretary) (France), John C. Dixon (USA), Scott F. Lamoureux (Vice-Chair) (Canada), John F. Orwin (Canada), Jan-Christoph Otto (Austria), Irina Overeem (USA), Thorsteinn Sæmundsson (Iceland), Jeff Warburton (UK) and Zbigniew Zwolinski (Poland). The central research question of this global group of scientists is to: Assess and model the contemporary sedimentary fluxes in cold climates, with emphasis on both particulate and dissolved components. Initially formed as European Science Foundation (ESF) Network SEDIFLUX (Sedimentary Source-to-Sink Fluxes in Cold Environments) (2004 - ), SEDIBUD has further expanded to a global group of researchers with field research sites located in polar and alpine regions in the northern and southern hemisphere. Research carried out at each of the close to 50 defined SEDIBUD key test sites varies by programme, logistics and available

  13. Wavelet transform-vector quantization compression of supercomputer ocean model simulation output

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, J N; Brislawn, C M

    1992-11-12

    We describe a new procedure for efficient compression of digital information for storage and transmission purposes. The algorithm involves a discrete wavelet transform subband decomposition of the data set, followed by vector quantization of the wavelet transform coefficients using application-specific vector quantizers. The new vector quantizer design procedure optimizes the assignment of both memory resources and vector dimensions to the transform subbands by minimizing an exponential rate-distortion functional subject to constraints on both overall bit-rate and encoder complexity. The wavelet-vector quantization method, which originates in digital image compression, is applicable to the compression of other multidimensional data sets possessing some degree of smoothness. In this paper we discuss the use of this technique for compressing the output of supercomputer simulations of global climate models. The data presented here comes from Semtner-Chervin global ocean models run at the National Center for Atmospheric Research and at the Los Alamos Advanced Computing Laboratory.
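
    A small sketch of the wavelet-transform/vector-quantization idea described above, applied to a toy 2-D field rather than ocean model output. It assumes PyWavelets and SciPy are available; the wavelet choice, codebook size and vector length are illustrative, not the values used in the report.

```python
# Sketch of wavelet-transform / vector-quantization compression on a toy 2-D
# field (PyWavelets and SciPy assumed available). Wavelet, codebook size and
# vector length are illustrative assumptions.
import numpy as np
import pywt
from scipy.cluster.vq import kmeans, vq

rng = np.random.default_rng(0)
x = np.linspace(0, 4 * np.pi, 128)
field = np.sin(x)[:, None] * np.cos(x)[None, :] + 0.05 * rng.standard_normal((128, 128))

# 1) Discrete wavelet transform: one approximation band plus detail subbands.
coeffs = pywt.wavedec2(field, "db4", level=2)

# 2) Vector-quantise each detail subband with its own small codebook; the
#    approximation band is kept exact (a common design choice).
def quantise(band, vec_len=4, n_codes=32):
    flat = band.ravel().astype(float)
    usable = (flat.size // vec_len) * vec_len
    vecs = flat[:usable].reshape(-1, vec_len)
    codebook, _ = kmeans(vecs, n_codes)
    indices, _ = vq(vecs, codebook)
    out = flat.copy()
    out[:usable] = codebook[indices].ravel()   # leftover tail kept verbatim
    return out.reshape(band.shape)

recon_coeffs = [coeffs[0]] + [tuple(quantise(b) for b in level) for level in coeffs[1:]]
recon = pywt.waverec2(recon_coeffs, "db4")

print("RMS reconstruction error:", np.sqrt(np.mean((recon[:128, :128] - field) ** 2)))
```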

  14. Large scale simulations of lattice QCD thermodynamics on Columbia Parallel Supercomputers

    International Nuclear Information System (INIS)

    Ohta, Shigemi

    1989-01-01

    The Columbia Parallel Supercomputer project aims at the construction of a parallel processing, multi-gigaflop computer optimized for numerical simulations of lattice QCD. The project has three stages: a 16-node, 1/4 GF machine completed in April 1985, a 64-node, 1 GF machine completed in August 1987, and a 256-node, 16 GF machine now under construction. The machines all share a common architecture: a two-dimensional torus formed from a rectangular array of N1 x N2 independent and identical processors. A processor is capable of operating in a multi-instruction multi-data mode, except for periods of synchronous interprocessor communication with its four nearest neighbors. Here the thermodynamics simulations on the two working machines are reported. (orig./HSI)

  15. Use of QUADRICS supercomputer as embedded simulator in emergency management systems

    International Nuclear Information System (INIS)

    Bove, R.; Di Costanzo, G.; Ziparo, A.

    1996-07-01

    The experience related to the implementation of the MRBT atmospheric dispersion model for short-duration releases is reported. This model was implemented on a QUADRICS-Q1 supercomputer. A description of the MRBT model is given first: it is an analytical model for studying the spreading of light gases released into the atmosphere by accidental releases. The solution of the diffusion equation is Gaussian-like and yields the concentration of the released pollutant as a function of space and time. The QUADRICS architecture is then introduced and the implementation of the model is described. Finally, the integration of the QUADRICS-based model as a simulator in an emergency management system is considered
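
    The MRBT solution is described as Gaussian-like. The following sketch implements a generic textbook Gaussian-plume concentration formula rather than the exact MRBT equations; the dispersion coefficients, wind speed and release height are assumed values.

```python
# Generic textbook Gaussian-plume concentration sketch; the actual MRBT model
# for short-duration light-gas releases differs in detail.
import numpy as np

def plume_concentration(x, y, z, Q=1.0, u=3.0, H=10.0):
    """Steady-state concentration downwind of a continuous point source.

    Q: release rate (kg/s), u: wind speed (m/s), H: effective release height (m).
    """
    sigma_y = 0.08 * x ** 0.9            # assumed power-law dispersion coefficients
    sigma_z = 0.06 * x ** 0.9
    lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2.0 * sigma_z**2)) +
                np.exp(-(z + H)**2 / (2.0 * sigma_z**2)))   # ground reflection term
    return Q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level centreline concentration at a few downwind distances (m):
for x in (100.0, 500.0, 1000.0):
    print(x, plume_concentration(x, y=0.0, z=0.0))
```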

  16. Reactive flow simulations in complex geometries with high-performance supercomputing

    International Nuclear Information System (INIS)

    Rehm, W.; Gerndt, M.; Jahn, W.; Vogelsang, R.; Binninger, B.; Herrmann, M.; Olivier, H.; Weber, M.

    2000-01-01

    In this paper, we report on a modern field code cluster consisting of state-of-the-art reactive Navier-Stokes- and reactive Euler solvers that has been developed on vector- and parallel supercomputers at the research center Juelich. This field code cluster is used for hydrogen safety analyses of technical systems, for example, in the field of nuclear reactor safety and conventional hydrogen demonstration plants with fuel cells. Emphasis is put on the assessment of combustion loads, which could result from slow, fast or rapid flames, including transition from deflagration to detonation. As a sample of proof tests, the special tools have been tested for specific tasks, based on the comparison of experimental and numerical results, which are in reasonable agreement. (author)

  17. Affordable and accurate large-scale hybrid-functional calculations on GPU-accelerated supercomputers

    Science.gov (United States)

    Ratcliff, Laura E.; Degomme, A.; Flores-Livas, José A.; Goedecker, Stefan; Genovese, Luigi

    2018-03-01

    Performing high accuracy hybrid functional calculations for condensed matter systems containing a large number of atoms is at present computationally very demanding or even out of reach if high quality basis sets are used. We present a highly optimized multiple graphics processing unit implementation of the exact exchange operator which allows one to perform fast hybrid functional density-functional theory (DFT) calculations with systematic basis sets without additional approximations for up to a thousand atoms. With this method hybrid DFT calculations of high quality become accessible on state-of-the-art supercomputers within a time-to-solution that is of the same order of magnitude as traditional semilocal-GGA functionals. The method is implemented in a portable open-source library.

  18. MILC Code Performance on High End CPU and GPU Supercomputer Clusters

    Science.gov (United States)

    DeTar, Carleton; Gottlieb, Steven; Li, Ruizi; Toussaint, Doug

    2018-03-01

    With recent developments in parallel supercomputing architecture, many core, multi-core, and GPU processors are now commonplace, resulting in more levels of parallelism, memory hierarchy, and programming complexity. It has been necessary to adapt the MILC code to these new processors starting with NVIDIA GPUs, and more recently, the Intel Xeon Phi processors. We report on our efforts to port and optimize our code for the Intel Knights Landing architecture. We consider performance of the MILC code with MPI and OpenMP, and optimizations with QOPQDP and QPhiX. For the latter approach, we concentrate on the staggered conjugate gradient and gauge force. We also consider performance on recent NVIDIA GPUs using the QUDA library.
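
    At its core, the staggered conjugate gradient mentioned in the record is the standard CG iteration applied to a large sparse operator. A plain serial NumPy sketch on a small symmetric positive-definite system (not the MILC implementation) is shown below.

```python
# Plain serial conjugate-gradient sketch on a small symmetric positive-definite
# system; the staggered-fermion CG in MILC applies the same iteration to a much
# larger sparse operator distributed over MPI ranks and GPUs.
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)        # SPD test matrix
b = rng.standard_normal(50)
x = conjugate_gradient(A, b)
print("residual norm:", np.linalg.norm(b - A @ x))
```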

  19. MILC Code Performance on High End CPU and GPU Supercomputer Clusters

    Directory of Open Access Journals (Sweden)

    DeTar Carleton

    2018-01-01

    Full Text Available With recent developments in parallel supercomputing architecture, many core, multi-core, and GPU processors are now commonplace, resulting in more levels of parallelism, memory hierarchy, and programming complexity. It has been necessary to adapt the MILC code to these new processors starting with NVIDIA GPUs, and more recently, the Intel Xeon Phi processors. We report on our efforts to port and optimize our code for the Intel Knights Landing architecture. We consider performance of the MILC code with MPI and OpenMP, and optimizations with QOPQDP and QPhiX. For the latter approach, we concentrate on the staggered conjugate gradient and gauge force. We also consider performance on recent NVIDIA GPUs using the QUDA library.

  20. Solving sparse linear least squares problems on some supercomputers by using large dense blocks

    DEFF Research Database (Denmark)

    Hansen, Per Christian; Ostromsky, T; Sameh, A

    1997-01-01

    Efficient subroutines for dense matrix computations have recently been developed and are available on many high-speed computers. On some computers the speed of many dense matrix operations is near to the peak-performance. For sparse matrices storage and operations can be saved by operating only... and storing only nonzero elements. However, the price is a great degradation of the speed of computations on supercomputers (due to the use of indirect addresses, to the need to insert new nonzeros in the sparse storage scheme, to the lack of data locality, etc.). On many high-speed computers a dense matrix technique is preferable to sparse matrix technique when the matrices are not large, because the high computational speed compensates fully the disadvantages of using more arithmetic operations and more storage. For very large matrices the computations must be organized as a sequence of tasks in each...
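
    A small SciPy/NumPy illustration of the dense-versus-sparse trade-off discussed above: the same least-squares problem solved with a dense LAPACK routine and with the sparse iterative solver LSQR. The problem size and density are arbitrary choices for the sketch.

```python
# The same least-squares problem solved with a dense LAPACK routine and with
# the sparse iterative solver LSQR; problem size and density are arbitrary.
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)
A = sparse_random(2000, 500, density=0.01, format="csr", random_state=0)
b = rng.standard_normal(2000)

x_dense, *_ = np.linalg.lstsq(A.toarray(), b, rcond=None)   # dense technique
x_sparse = lsqr(A, b)[0]                                    # sparse technique

print("dense residual :", np.linalg.norm(A @ x_dense - b))
print("sparse residual:", np.linalg.norm(A @ x_sparse - b))
```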

  1. An Optimized Parallel FDTD Topology for Challenging Electromagnetic Simulations on Supercomputers

    Directory of Open Access Journals (Sweden)

    Shugang Jiang

    2015-01-01

    Full Text Available It may not be a challenge to run a Finite-Difference Time-Domain (FDTD) code for electromagnetic simulations on a supercomputer with more than 10 thousand CPU cores; however, to make the FDTD code work with the highest efficiency is a challenge. In this paper, the performance of parallel FDTD is optimized through MPI (message passing interface) virtual topology, based on which a communication model is established. The general rules of optimal topology are presented according to the model. The performance of the method is tested and analyzed on three high performance computing platforms with different architectures in China. Simulations including an airplane with a 700-wavelength wingspan, and a complex microstrip antenna array with nearly 2000 elements are performed very efficiently using a maximum of 10240 CPU cores.
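
    The optimization in the record is built on MPI virtual (Cartesian) topologies. A minimal mpi4py sketch of creating such a topology and finding halo-exchange neighbours follows; it is not the authors' FDTD code.

```python
# Minimal mpi4py sketch of an MPI Cartesian virtual topology, the mechanism on
# which the communication model above is built (this is not the FDTD code).
from mpi4py import MPI

comm = MPI.COMM_WORLD
dims = MPI.Compute_dims(comm.Get_size(), [0, 0, 0])      # balanced 3-D grid
cart = comm.Create_cart(dims, periods=[False, False, False], reorder=True)

coords = cart.Get_coords(cart.Get_rank())
src, dst = cart.Shift(0, 1)      # neighbours along x for halo exchange
print(f"rank {cart.Get_rank()} at {coords}: x-neighbours src={src} dst={dst}")
```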

  2. Development of a high performance eigensolver on the peta-scale next generation supercomputer system

    International Nuclear Information System (INIS)

    Imamura, Toshiyuki; Yamada, Susumu; Machida, Masahiko

    2010-01-01

    For present supercomputer systems, multicore and multisocket processors are necessary to build a system, and the choice of interconnection is essential. In addition, for effective development of a new code, high performance, scalable, and reliable numerical software is one of the key items. ScaLAPACK and PETSc are well-known software packages for distributed memory parallel computer systems. It is needless to say that highly tuned software targeting new architectures like many-core processors must be chosen for real computation. In this study, we present a high-performance and highly scalable eigenvalue solver for the next-generation supercomputer system, the so-called 'K-computer' system. We have developed two versions, the standard version (eigen s) and the enhanced performance version (eigen sx), which were developed on the T2K cluster system housed at the University of Tokyo. Eigen s employs the conventional algorithms: Householder tridiagonalization, the divide and conquer (DC) algorithm, and Householder back-transformation. They are carefully implemented with a blocking technique and flexible two-dimensional data distribution to reduce the overhead of memory traffic and data transfer, respectively. Eigen s performs excellently on the T2K system with 4096 cores (theoretical peak is 37.6 TFLOPS), and it shows a fine performance of 3.0 TFLOPS with a two-hundred-thousand-dimensional matrix. The enhanced version, eigen sx, uses more advanced algorithms: the narrow-band reduction algorithm, DC for band matrices, and the block Householder back-transformation with WY-representation. Even though this version is still at a test stage, it shows 4.7 TFLOPS with a matrix of the same dimension as that used for eigen s. (author)
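
    The first stage of the solver described above is Householder tridiagonalization. A compact, unblocked NumPy sketch of that reduction is given below; production codes such as eigen s use blocked, distributed-memory variants of the same idea.

```python
# Unblocked Householder tridiagonalisation of a symmetric matrix, the first
# stage of the eigensolver described above (production codes use blocked,
# distributed-memory variants of the same reduction).
import numpy as np

def householder_tridiagonal(A):
    T = A.astype(float).copy()
    n = T.shape[0]
    for k in range(n - 2):
        x = T[k + 1:, k]
        v = x.copy()
        v[0] += np.sign(x[0] if x[0] != 0 else 1.0) * np.linalg.norm(x)
        norm_v = np.linalg.norm(v)
        if norm_v == 0:
            continue
        v /= norm_v
        H = np.eye(n - k - 1) - 2.0 * np.outer(v, v)   # Householder reflector
        T[k + 1:, k:] = H @ T[k + 1:, k:]              # apply from the left
        T[:, k + 1:] = T[:, k + 1:] @ H                # and from the right
    return T

rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = M + M.T
T = householder_tridiagonal(A)
print("eigenvalues agree:",
      np.allclose(np.linalg.eigvalsh(A), np.linalg.eigvalsh(T)))
```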

  3. High Temporal Resolution Mapping of Seismic Noise Sources Using Heterogeneous Supercomputers

    Science.gov (United States)

    Paitz, P.; Gokhberg, A.; Ermert, L. A.; Fichtner, A.

    2017-12-01

    The time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems like earthquake fault zones, volcanoes, geothermal and hydrocarbon reservoirs. We present results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service providing seismic noise source maps for Central Europe with high temporal resolution. We use source imaging methods based on the cross-correlation of seismic noise records from all seismic stations available in the region of interest. The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept to provide the interested researchers worldwide with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for the generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise source mapping itself rests on the measurement of logarithmic amplitude ratios in suitably pre-processed noise correlations, and the use of simplified sensitivity kernels. During the implementation we addressed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service-oriented architecture for coordination of various sub-systems, and
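
    The mapping rests on cross-correlating noise records and measuring logarithmic amplitude ratios. A toy two-station sketch with NumPy on synthetic data is shown below; real processing adds spectral whitening, stacking over many windows and instrument corrections.

```python
# Toy two-station sketch: cross-correlate two synthetic noise records and
# compare the energy of the causal and acausal branches as a logarithmic
# amplitude ratio. Sampling rate, record length and delay are assumptions.
import numpy as np

rng = np.random.default_rng(0)
fs, n = 100.0, 6_000                  # 1 minute at 100 Hz
source = rng.standard_normal(n)
delay = 150                           # samples of propagation between stations
station_a = source + 0.1 * rng.standard_normal(n)
station_b = np.roll(source, delay) + 0.1 * rng.standard_normal(n)

xcorr = np.correlate(station_a - station_a.mean(),
                     station_b - station_b.mean(), mode="full")
lags = np.arange(-n + 1, n)

causal = xcorr[lags > 0]
acausal = xcorr[lags < 0]
log_ratio = 0.5 * np.log(np.sum(causal**2) / np.sum(acausal**2))
print("lag of correlation peak (s):", lags[np.argmax(np.abs(xcorr))] / fs)
print("log amplitude ratio (causal/acausal):", log_ratio)
```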

  4. Efficient development of memory bounded geo-applications to scale on modern supercomputers

    Science.gov (United States)

    Räss, Ludovic; Omlin, Samuel; Licul, Aleksandar; Podladchikov, Yuri; Herman, Frédéric

    2016-04-01

    Numerical modeling is currently a key tool in the geosciences. The challenge is to solve multi-physics problems for which the length scale and the place of occurrence might not be known in advance. Also, the spatial extent of the investigated domain might vary strongly in size, ranging from millimeters for reactive transport to kilometers for glacier erosion dynamics. An efficient way to proceed is to develop simple but robust algorithms that perform well and scale on modern supercomputers and therefore permit very high-resolution simulations. We propose an efficient approach to solve memory-bounded real-world applications on modern supercomputer architectures. We optimize the software to run on our newly acquired state-of-the-art GPU cluster "octopus". Our approach shows promising preliminary results on important geodynamical and geomechanical problems: we have developed a Stokes solver for glacier flow and a poromechanical solver including complex rheologies for nonlinear waves in stressed porous rocks. We solve the system of partial differential equations on a regular Cartesian grid and use an iterative finite difference scheme with preconditioning of the residuals. The MPI communication happens only locally (point-to-point); this method is known to scale linearly by construction. The "octopus" GPU cluster, which we use for the computations, has been designed to achieve maximal data transfer throughput at minimal hardware cost. It is composed of twenty compute nodes, each hosting four Nvidia Titan X GPU accelerators. These high-density nodes are interconnected with a parallel (dual-rail) FDR InfiniBand network. Our efforts show promising preliminary results for the different physics investigated. The glacier flow solver achieves good accuracy in the relevant benchmarks, and the coupled poromechanical solver makes it possible to explain previously unresolvable focused fluid flow as a natural outcome of the porosity setup. In both cases
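
    As a rough illustration of the iterative scheme mentioned above (an explicit pseudo-transient update with damped residuals on a regular grid), the following sketch solves a 1-D Poisson problem; the grid size, damping factor and pseudo time step are illustrative choices, not the values used by the authors.

        # Rough sketch of the iterative scheme: explicit pseudo-transient updates
        # with damped ("preconditioned") residuals on a regular grid, here for the
        # 1-D Poisson problem u'' = f with u = 0 at both boundaries.
        import numpy as np

        nx = 256
        dx = 1.0 / (nx - 1)
        u = np.zeros(nx)
        f = np.ones(nx)                      # source term
        res_old = np.zeros(nx - 2)
        damp, dtau = 0.99, 0.45 * dx ** 2    # damping factor and pseudo time step

        for it in range(20_000):
            res = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx ** 2 - f[1:-1]
            res += damp * res_old            # damp the residual with its previous value
            u[1:-1] += dtau * res            # pseudo-transient update of the interior
            res_old = res
            if np.max(np.abs(res)) < 1e-6:
                break
        print(f"stopped after {it + 1} iterations, max residual {np.max(np.abs(res)):.1e}")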

  5. Environmental radiation monitoring during 2013 and 2014 in the environment of the future site of the Spanish centralized temporary storage facility (ATC)

    International Nuclear Information System (INIS)

    Pujol, L.; Perez Zabaleta, E.; Pablo, M. A. de; Rodríguez Arévalo, J.; Nieva, A.

    2014-01-01

    During 2013 and 2014, samples were taken of representative surface and ground waters near the quality checkpoints currently available at the ATC site, in the Guadiana River Basin aquifers. CEDEX determined the following radiological parameters: total alpha activity index, gross beta, residual beta, tritium activity concentration, and gamma spectrometry. Specific determinations of some alpha emitters (radium and uranium isotopes) were also performed. The average values of the total alpha, gross beta, and residual beta activity indices show an increase in waters downstream along the Záncara River, in the immediate environment of the facility. The total alpha activity index is consistent with the sum of the alpha emitters determined, and the principal contribution is due to uranium isotopes, probably contributed by leaching of the geological materials of the area. (Author)

  6. DESIGNING OF ARCHITECTURE OF CLOUD-ORIENTED INFORMATION-EDUCATIONAL ENVIRONMENT TO PREPARE FUTURE IT-PROFESSIONALS

    Directory of Open Access Journals (Sweden)

    Olena H. Glazunova

    2014-12-01

    In this article, the author substantiates the architecture of an information-educational environment for the modern university, built on the basis of cloud technologies. A number of software and technology solutions based on virtualization, clustering and management of virtual resources, which can be implemented on the institution's own infrastructure, are proposed. A model for the provision of educational services to students of IT specialties is substantiated; it provides students with access to teaching environment resources: e-learning courses, institutional repository resources, the digital library, the video portal, the wiki portal, as well as a virtual desktop with the required set of software packages for laboratory and project work, all through a single account in the e-learning system. A scheme of student access to virtual learning resources, including the virtual desktop, directly through the web interface and via links from laboratory-work resources in e-learning courses, is proposed.

  7. Radio-Isotopes Section, Radiation Safety Division, Ministry of the Environment, Israel: A General Review and Future Developments

    International Nuclear Information System (INIS)

    Ben-Zion, S.

    1999-01-01

    The Radio-Isotopes Section of the Ministry of the Environment is responsible for preventing environmental hazards from radio-isotopes 'from cradle to grave'. The management and supervision of radioactive materials covers about 350 institutes in Israel. We deal with the implementation and enforcement of the environmental regulations and safety standards, and with licensing for each institution and installation. Among our tasks are the following: follow-up of the import, transportation and distribution, usage, storage and disposal of radio-isotopes, as well as legislation, risk assessments, inspection, and education. We also participate in committees and working groups discussing specific topics: radioactive stores, low-level radioactive waste (RW) disposal, Y2K, GIS, the charging of penalties, transportation and more.

  8. The heat is on: Australia's greenhouse future. Report to the Senate Environment, Communications, Information Technology and the Arts References Committee

    International Nuclear Information System (INIS)

    2000-11-01

    On 11 August 1999, the Senate referred matters pertaining to global warming to the Environment, Communications, Information Technology and the Arts References Committee for inquiry. The Committee reports on the progress and adequacy of Australian policies to reduce global warming, in light of Australia's commitments under the Framework Convention on Climate Change. It also critically evaluates the effectiveness of Australian government programs and policies, both state and federal, in particular those aiming to provide for the development of emerging renewable energies, energy efficiency industries and the more efficient use of energy sources, and the extent to which the Government's relations with industry under the Greenhouse Challenge Program are accountable and transparent. The projected effects of climate change on Australia's ecosystems and the potential introduction of a national system of emissions trading within Australia are also examined.

  9. Norfolk, Virginia—Planning to be the Coastal Community of the Future in a rising water environment

    Science.gov (United States)

    Homewood, G. M.

    2017-12-01

    Norfolk VA is the second most at-risk population center in North America from sea level rise while also being home to the world's largest naval base and one of the 3 largest east coast ports. Norfolk is one of the original cohort of cities in the 100 Resilient Cities effort pioneered by the Rockefeller Foundation and has changed its sea level adaptation strategy from "keep the water out" to "living with water" through a ground-breaking community visioning process. In Norfolk, this means, among other goals, finding co-benefits in public and private investments and interventions—these can be environmental, economic, social, recreational or other things we have not yet thought about—and it is in this area that the geosciences can benefit Norfolk's planning for a rising water environment.

  10. The future implications of some long-lived fission product nuclides discharged to the environment in fuel reprocessing wastes

    International Nuclear Information System (INIS)

    Bryant, P.M.; Jones, J.A.

    1972-12-01

    Current reprocessing practice leads to the discharge to the environment of virtually all the krypton-85 and tritium, and a large fraction of the iodine-129, formed as fission products in reactor fuel. As nuclear power programmes expand the global inventory of these long-lived nuclides is increasing. The radiological significance of these discharges is assessed in terms of radiation exposure of various population groups during the next few decades. The results of this assessment show that krypton-85 will give higher dose rates than tritium or iodine-129, but that on conventional radiological protection criteria these do not justify taking action to remove krypton-85 from reprocessing plant effluents before the 21st century. (author)

  11. Models everywhere. How a fully integrated model-based test environment can enable progress in the future

    Energy Technology Data Exchange (ETDEWEB)

    Ben Gaid, Mongi; Lebas, Romain; Fremovici, Morgan; Font, Gregory; Le Solliec, Gunael [IFP Energies nouvelles, Rueil-Malmaison (France)]; Albrecht, Antoine [D2T Powertrain Engineering, Rueil-Malmaison (France)]

    2011-07-01

    The aim of this paper is to demonstrate how advanced modelling approaches coupled with powerful tools make it possible to set up a complete and coherent test environment suite. Based on a real study focused on the development of a Euro 6 hybrid powertrain with a Euro 5 turbocharged diesel engine, the authors present how a diesel engine simulator, including an in-cylinder phenomenological approach to predict the raw emissions, can be coupled with a DOC and DPF after-treatment system and embedded in the complete hybrid powertrain to be used in various test environments: - coupled with the control software in a multi-model, multi-core simulation platform with test automation features, allowing the simulation to run faster than real time; - exported to a real-time hardware-in-the-loop platform with the ECU and hardware actuators; - embedded at the experimental engine test bed to perform driving cycles such as NEDC or FTP cycles with the hybrid powertrain management. Thanks to this complete and versatile test platform suite, xMOD/Morphee, all the key issues of a full hybrid powertrain can be addressed efficiently and at low cost compared to experimental powertrain prototypes: consumption minimisation, energy optimisation, thermal exhaust management, NOx/soot trade-offs, NO/NO2 ratios, etc. Having a good balance between the versatility and compliance of model-oriented test platforms such as the one presented in this paper is the best way to take maximum benefit of the models developed at each stage of powertrain development. (orig.)

  12. Onset and stability of gas hydrates under permafrost in an environment of surface climatic change : past and future

    International Nuclear Information System (INIS)

    Majorowicz, J.A.; Osadetz, K.; Safanda, J.

    2008-01-01

    This paper presented a model designed to simulate permafrost and gas hydrate formation in a changing surface temperature environment in the Beaufort-Mackenzie Basin (BMB). The numerical model simulated surface forcing due to general cooling trends that began in the late Miocene era. This study modelled the onset of permafrost formation and subsequent gas hydrate formation in the changing surface temperature environment for the BMB. Paleoclimatic data were used. The 1-D model was constrained by deep heat flow from well bottom-hole temperatures, conductivity, permafrost thickness, and the thickness of the gas hydrates. The model used latent heat effects for the ice-bearing permafrost and hydrate intervals. Surface temperatures for glacial and interglacial histories for the last 14 million years were considered. The model also used a detailed Holocene temperature history as well as a scenario in which atmospheric carbon dioxide (CO2) levels were twice as high as current levels. Two scenarios were considered: (1) the formation of gas hydrates from gas entrapped under geological seals; and (2) the formation of gas hydrates from gas located in free pore spaces simultaneously with permafrost formation. Results of the study showed that gas hydrates may have formed at a depth of 0.9 km only 1 million years ago. Results of the other modelling scenarios suggested that the hydrates formed 6 million years ago, when temperature changes caused the gas hydrate layer to expand both downward and upward. Detailed models of more recent glacial and interglacial histories showed that the gas hydrate zones will persist under the thick body of the BMB permafrost through current interglacial warming as well as in scenarios where atmospheric CO2 is doubled. 28 refs., 13 figs
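
    A schematic of the kind of 1-D conduction model described above, with latent heat mimicked by an apparent heat capacity in the freezing interval, is sketched below; all material properties, boundary conditions and run lengths are invented for illustration and are not the BMB parameters used in the study.

        # Schematic 1-D ground temperature model: explicit conduction with an
        # "apparent heat capacity" spike near 0 degC standing in for the latent
        # heat of the ice-bearing interval. All values are illustrative.
        import numpy as np

        nz, dz = 200, 10.0                     # 2 km column, 10 m cells
        k = 2.5                                # thermal conductivity, W/m/K
        rho_c = 2.4e6                          # volumetric heat capacity, J/m^3/K
        latent = 1.5e8                         # crude apparent-latent-heat term, J/m^3/K
        q_base = 0.06                          # basal heat flow, W/m^2
        T = np.linspace(2.0, 2.0 + q_base / k * nz * dz, nz)   # initial geotherm
        dt = 0.2 * rho_c * dz ** 2 / k         # stable explicit time step (seconds)

        def capacity(temp):
            # apparent heat capacity: extra term in the freezing interval [-0.5, 0] degC
            return rho_c + latent * ((temp > -0.5) & (temp < 0.0))

        for step in range(200_000):            # march forward under a colder surface
            T[0] = -10.0                       # step change in surface temperature
            T[-1] = T[-2] + q_base * dz / k    # prescribed basal heat flow
            lap = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dz ** 2
            T[1:-1] += dt * k * lap / capacity(T[1:-1])

        permafrost_base = np.argmax(T > 0.0) * dz   # depth of first cell above 0 degC
        print(f"modelled permafrost thickness: {permafrost_base:.0f} m")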

  13. Teaching Research Methods and Statistics in eLearning Environments:Pedagogy, Practical Examples and Possible Futures

    Directory of Open Access Journals (Sweden)

    Adam John Rock

    2016-03-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal, Ginsburg, & Schau, 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof, Ceroni, Jeong, & Moghaddam, 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology.

  14. Mineral formation on metallic copper in a 'Future repository site environment': Textural considerations based on natural analogs

    Energy Technology Data Exchange (ETDEWEB)

    Amcoff, Oe. [Uppsala Univ. (Sweden). Inst. of Earth Sciences]

    1998-01-01

    Copper mineral formation in the Swedish 'repository site environment' is discussed. Special attention is given to ore mineral textures (=the spatial relation among minerals), with examples given from nature. It is concluded: By analogy with observations from natural occurrences, an initial coating of Cu-oxide on the canister surface (because of entrapped air during construction) will probably not hinder a later sulphidation process. Early formation of Cu-sulphides on the canister surface may be accompanied by formation of CuFe-sulphides. The latter phase(s) may form through replacement of the Cu-sulphides or, alternatively, by means of reaction between dissolved copper and fine-grained iron sulphide (pyrite) in the surrounding bentonite. Should for some reason the bentonite barrier fail and the conditions become strongly oxidizing, we can expect crustifications and rhythmic growths of Cu(II)-phases, like malachite (Cu2(OH)2CO3). A presence of Fe2+ in the clay minerals making up the bentonite might prove to have an adverse effect on the canister stability, since, in this case, the bentonite might be expected to act as a sink for dissolved copper. The mode of mineral growth along the copper - bentonite interface remains an open question.

  15. Mineral formation on metallic copper in a 'Future repository site environment': Textural considerations based on natural analogs

    International Nuclear Information System (INIS)

    Amcoff, Oe.

    1998-01-01

    Copper mineral formation in the Swedish 'repository site environment' is discussed. Special attention is given to ore mineral textures (=the spatial relation among minerals), with examples given from nature. It is concluded: By analogy with observations from natural occurrences, an initial coating of Cu-oxide on the canister surface (because of entrapped air during construction) will probably not hinder a later sulphidation process. Early formation of Cu-sulphides on the canister surface may be accompanied by formation of CuFe-sulphides. The latter phase(s) may form through replacement of the Cu-sulphides or, alternatively, by means of reaction between dissolved copper and fine-grained iron sulphide (pyrite) in the surrounding bentonite. Should for some reason the bentonite barrier fail and the conditions become strongly oxidizing, we can expect crustifications and rhythmic growths of Cu(II)-phases, like malachite (Cu2(OH)2CO3). A presence of Fe2+ in the clay minerals making up the bentonite might prove to have an adverse effect on the canister stability, since, in this case, the bentonite might be expected to act as a sink for dissolved copper. The mode of mineral growth along the copper - bentonite interface remains an open question.

  16. Arthritis symptoms, the work environment, and the future: measuring perceived job strain among employed persons with arthritis.

    Science.gov (United States)

    Gignac, Monique A M; Sutton, Deborah; Badley, Elizabeth M

    2007-06-15

    To develop a measure of job strain related to differing aspects of working with arthritis and to examine the demographic, illness, work context, and psychosocial variables associated with it. Study participants were 292 employed individuals with osteoarthritis or inflammatory arthritis. Participants were from wave 3 of a 4-wave longitudinal study examining coping and adaptation efforts used to remain employed. Participants completed an interview-administered structured questionnaire, including a Chronic Illness Job Strain Scale (CIJSS) and questions on demographic (e.g., age, sex), illness and disability (e.g., disease type, pain, activity limitations), work context (e.g., job type, job control), and psychosocial variables (e.g., arthritis-work spillover, coworker/managerial support, job perceptions). Principal component analysis and multiple linear regression were used to analyze the data. A single factor solution emerged for the CIJSS. The scale had an internal reliability of 0.95. Greater job strain was reported for future uncertainty, balancing multiple roles, and difficulties accepting the disease than for current workplace conditions. Participants with inflammatory arthritis, more frequent severe pain, greater workplace activity limitations, fewer hours of work, less coworker support, and greater arthritis-work spillover reported greater job strain. The findings underscore the diverse areas that contribute to perceptions of job strain and suggest that existing models of job strain do not adequately capture the stress experienced by individuals working with chronic illnesses or the factors associated with job strain. Measures similar to the CIJSS can enhance the tools researchers and clinicians have available to examine the impact of arthritis in individuals' lives.

  17. High temporal resolution mapping of seismic noise sources using heterogeneous supercomputers

    Science.gov (United States)

    Gokhberg, Alexey; Ermert, Laura; Paitz, Patrick; Fichtner, Andreas

    2017-04-01

    Time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems. Significant interest in seismic noise source maps with high temporal resolution (days) is expected to come from a number of domains, including natural resources exploration, analysis of active earthquake fault zones and volcanoes, as well as geothermal and hydrocarbon reservoir monitoring. Currently, knowledge of noise sources is insufficient for high-resolution subsurface monitoring applications. Near-real-time seismic data, as well as advanced imaging methods to constrain seismic noise sources have recently become available. These methods are based on the massive cross-correlation of seismic noise records from all available seismic stations in the region of interest and are therefore very computationally intensive. Heterogeneous massively parallel supercomputing systems introduced in the recent years combine conventional multi-core CPU with GPU accelerators and provide an opportunity for manifold increase and computing performance. Therefore, these systems represent an efficient platform for implementation of a noise source mapping solution. We present the first results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service that provides seismic noise source maps for Central Europe with high temporal resolution (days to few weeks depending on frequency and data availability). The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept in order to provide the interested external researchers the regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for

  18. Radiation Environment at LEO in the frame of Space Monitoring Data Center at Moscow State University - recent, current and future missions

    Science.gov (United States)

    Myagkova, Irina; Kalegaev, Vladimir; Panasyuk, Mikhail; Svertilov, Sergey; Bogomolov, Vitaly; Bogomolov, Andrey; Barinova, Vera; Barinov, Oleg; Bobrovnikov, Sergey; Dolenko, Sergey; Mukhametdinova, Ludmila; Shiroky, Vladimir; Shugay, Julia

    2016-04-01

    The radiation environment of near-Earth space is one of the most important factors of space weather. The Space Monitoring Data Center of Moscow State University provides operational control of radiation conditions at Low Earth Orbit (LEO) using data from recent (Vernov, CORONAS series), current (Meteor-M, Electro-L series) and future (Lomonosov) space missions. The Internet portal of the Space Monitoring Data Center of the Skobeltsyn Institute of Nuclear Physics of Lomonosov Moscow State University (SINP MSU), http://swx.sinp.msu.ru/, provides possibilities to control and analyze space radiation conditions in real time, together with geomagnetic and solar activity, including the hard X-ray and gamma-emission of solar flares. Operational data obtained from space missions at L1, GEO and LEO and from the Earth's magnetic stations are used to represent the radiation and geomagnetic state of the near-Earth environment. Models of the space environment that use space measurements from different orbits have been created. Interactive analysis and operational neural network forecast services are based on these models. These systems can automatically generate alerts on particle flux enhancements above threshold values, both for SEP and for relativistic electrons of the outer Earth's radiation belt, using data from GEO and LEO as input. As an example of LEO data we consider data from the Vernov mission, which was launched into a sun-synchronous orbit (altitude 640-830 km, inclination 98.4°, orbital period about 100 min) on July 8, 2014 and began to return scientific information on July 20, 2014. The Vernov mission has provided studies of the Earth's radiation belt relativistic electron precipitation and its possible connection with atmospheric transient luminous events, as well as solar hard X-ray and gamma-emission measurements. Radiation and electromagnetic environment monitoring in near-Earth space, which is very important for space weather studies, was also realised

  19. De Novo Ultrascale Atomistic Simulations On High-End Parallel Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Nakano, A; Kalia, R K; Nomura, K; Sharma, A; Vashishta, P; Shimojo, F; van Duin, A; Goddard, III, W A; Biswas, R; Srivastava, D; Yang, L H

    2006-09-04

    We present a de novo hierarchical simulation framework for first-principles based predictive simulations of materials and their validation on high-end parallel supercomputers and geographically distributed clusters. In this framework, high-end chemically reactive and non-reactive molecular dynamics (MD) simulations explore a wide solution space to discover microscopic mechanisms that govern macroscopic material properties, into which highly accurate quantum mechanical (QM) simulations are embedded to validate the discovered mechanisms and quantify the uncertainty of the solution. The framework includes an embedded divide-and-conquer (EDC) algorithmic framework for the design of linear-scaling simulation algorithms with minimal bandwidth complexity and tight error control. The EDC framework also enables adaptive hierarchical simulation with automated model transitioning assisted by graph-based event tracking. A tunable hierarchical cellular decomposition parallelization framework then maps the O(N) EDC algorithms onto Petaflops computers, while achieving performance tunability through a hierarchy of parameterized cell data/computation structures, as well as its implementation using hybrid Grid remote procedure call + message passing + threads programming. High-end computing platforms such as IBM BlueGene/L, SGI Altix 3000 and the NSF TeraGrid provide excellent test grounds for the framework. On these platforms, we have achieved unprecedented scales of quantum-mechanically accurate and well validated, chemically reactive atomistic simulations--1.06 billion-atom fast reactive force-field MD and 11.8 million-atom (1.04 trillion grid points) quantum-mechanical MD in the framework of the EDC density functional theory on adaptive multigrids--in addition to 134 billion-atom non-reactive space-time multiresolution MD, with parallel efficiency as high as 0.998 on 65,536 dual-processor BlueGene/L nodes. We have also achieved an automated execution of hierarchical QM

  20. Research center Juelich to install Germany's most powerful supercomputer new IBM System for science and research will achieve 5.8 trillion computations per second

    CERN Multimedia

    2002-01-01

    "The Research Center Juelich, Germany, and IBM today announced that they have signed a contract for the delivery and installation of a new IBM supercomputer at the Central Institute for Applied Mathematics" (1/2 page).

  1. Earth and environmental science in the 1980's: Part 1: Environmental data systems, supercomputer facilities and networks

    Science.gov (United States)

    1986-01-01

    Overview descriptions of on-line environmental data systems, supercomputer facilities, and networks are presented. Each description addresses the concepts of content, capability, and user access relevant to the point of view of potential utilization by the Earth and environmental science community. The information on similar systems or facilities is presented in parallel fashion to encourage and facilitate intercomparison. In addition, summary sheets are given for each description, and a summary table precedes each section.

  2. The BlueGene/L Supercomputer and Quantum ChromoDynamics

    International Nuclear Information System (INIS)

    Vranas, P; Soltz, R

    2006-01-01

    In summary, our update contains: (1) Perfect speedup sustaining 19.3% of peak for the Wilson D-slash Dirac operator. (2) Measurements of the full Conjugate Gradient (CG) inverter that inverts the Dirac operator. The CG inverter contains two global sums over the entire machine. Nevertheless, our measurements retain perfect speedup scaling, demonstrating the robustness of our methods. (3) We ran on the largest BG/L system, the LLNL 64-rack BG/L supercomputer, and obtained a sustained speed of 59.1 TFlops. Furthermore, the speedup scaling of the Dirac operator and of the CG inverter is perfect all the way up to the full size of the machine, 131,072 cores (please see Figure II). The local lattice is rather small (4 x 4 x 4 x 16) while the total lattice has long been a lattice QCD vision for thermodynamic studies (a total of 128 x 128 x 256 x 32 lattice sites). This speed is about five times larger than the speed we quoted in our submission. As we have pointed out in our paper, QCD is notoriously sensitive to network and memory latencies, has a relatively high communication-to-computation ratio which cannot be overlapped in BG/L in virtual node mode, and as an application is in a class of its own. The above results are thrilling to us and a 30-year-long dream for lattice QCD
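
    The structure of the CG inverter, and in particular the two global sums per iteration that the update refers to, can be seen in the following serial sketch (plain NumPy on a toy symmetric positive-definite system, not the lattice Dirac operator); on a machine like BG/L each dot product becomes an all-reduce over all ranks.

        # Schematic conjugate-gradient inverter highlighting the two global
        # reductions (dot products) per iteration. Serial toy problem only.
        import numpy as np

        def cg(apply_a, b, tol=1e-10, max_iter=1000):
            x = np.zeros_like(b)
            r = b - apply_a(x)
            p = r.copy()
            rr = np.dot(r, r)                  # global sum (all-reduce on a parallel machine)
            for _ in range(max_iter):
                ap = apply_a(p)
                alpha = rr / np.dot(p, ap)     # global sum #1 of the iteration
                x += alpha * p
                r -= alpha * ap
                rr_new = np.dot(r, r)          # global sum #2 of the iteration
                if rr_new < tol ** 2:
                    break
                p = r + (rr_new / rr) * p
                rr = rr_new
            return x

        # Tiny SPD test problem standing in for the (much larger) Dirac normal equations.
        n = 500
        m = np.random.rand(n, n)
        a = m @ m.T + n * np.eye(n)
        x = cg(lambda v: a @ v, np.ones(n))
        print("residual norm:", np.linalg.norm(a @ x - np.ones(n)))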

  3. Parallel supercomputing: Advanced methods, algorithms, and software for large-scale linear and nonlinear problems

    Energy Technology Data Exchange (ETDEWEB)

    Carey, G.F.; Young, D.M.

    1993-12-31

    The program outlined here is directed to research on methods, algorithms, and software for distributed parallel supercomputers. Of particular interest are finite element methods and finite difference methods together with sparse iterative solution schemes for scientific and engineering computations of very large-scale systems. Both linear and nonlinear problems will be investigated. In the nonlinear case, applications with bifurcation to multiple solutions will be considered using continuation strategies. The parallelizable numerical methods of particular interest are a family of partitioning schemes embracing domain decomposition, element-by-element strategies, and multi-level techniques. The methods will be further developed incorporating parallel iterative solution algorithms with associated preconditioners in parallel computer software. The schemes will be implemented on distributed memory parallel architectures such as the CRAY MPP, Intel Paragon, the NCUBE3, and the Connection Machine. We will also consider other new architectures such as the Kendall-Square (KSQ) and proposed machines such as the TERA. The applications will focus on large-scale three-dimensional nonlinear flow and reservoir problems with strong convective transport contributions. These are legitimate grand-challenge-class computational fluid dynamics (CFD) problems of significant practical interest to DOE. The methods and algorithms developed will, however, be of wider interest.

  4. 369 TFlop/s molecular dynamics simulations on the Roadrunner general-purpose heterogeneous supercomputer

    Energy Technology Data Exchange (ETDEWEB)

    Swaminarayan, Sriram [Los Alamos National Laboratory; Germann, Timothy C [Los Alamos National Laboratory; Kadau, Kai [Los Alamos National Laboratory; Fossum, Gordon C [IBM CORPORATION]

    2008-01-01

    The authors present timing and performance numbers for a short-range parallel molecular dynamics (MD) code, SPaSM, that has been rewritten for the heterogeneous Roadrunner supercomputer. Each Roadrunner compute node consists of two AMD Opteron dual-core microprocessors and four PowerXCell 8i enhanced Cell microprocessors, so that there are four MPI ranks per node, each with one Opteron and one Cell. The interatomic forces are computed on the Cells (each with one PPU and eight SPU cores), while the Opterons are used to direct inter-rank communication and perform I/O-heavy periodic analysis, visualization, and checkpointing tasks. The performance measured for our initial implementation of a standard Lennard-Jones pair potential benchmark reached a peak of 369 TFlop/s double-precision floating-point performance on the full Roadrunner system (27.7% of peak), corresponding to 124 MFlop/s per watt at a price of approximately 3.69 MFlop/s per dollar. The authors demonstrate an initial target application, the jetting and ejection of material from a shocked surface.
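
    For orientation, the following is a minimal serial version of a Lennard-Jones pair-potential kernel of the type benchmarked above; SPaSM performs the equivalent computation distributed over Cell SPUs with the Opterons handling communication and I/O, and the reduced-unit parameters below are illustrative only.

        # Minimal Lennard-Jones pair kernel with minimum-image periodic boundaries.
        import numpy as np

        def lj_energy_forces(pos, box, rcut=2.5, eps=1.0, sigma=1.0):
            n = len(pos)
            forces = np.zeros_like(pos)
            energy = 0.0
            for i in range(n - 1):
                d = pos[i + 1:] - pos[i]
                d -= box * np.round(d / box)               # minimum-image convention
                r2 = np.sum(d * d, axis=1)
                mask = r2 < rcut ** 2
                inv_r6 = (sigma ** 2 / r2[mask]) ** 3
                energy += np.sum(4.0 * eps * (inv_r6 ** 2 - inv_r6))
                # force magnitude expressed through r^2 to avoid square roots
                f_scalar = 24.0 * eps * (2.0 * inv_r6 ** 2 - inv_r6) / r2[mask]
                fij = d[mask] * f_scalar[:, None]
                forces[i] -= np.sum(fij, axis=0)           # Newton's third law pairing
                forces[i + 1:][mask] += fij
            return energy, forces

        rng = np.random.default_rng(1)
        box = 10.0
        positions = rng.uniform(0.0, box, size=(200, 3))
        e, f = lj_energy_forces(positions, box)
        print(f"potential energy: {e:.3f}, net force magnitude: {np.linalg.norm(f.sum(axis=0)):.2e}")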

  5. A Parallel Supercomputer Implementation of a Biological Inspired Neural Network and its use for Pattern Recognition

    International Nuclear Information System (INIS)

    De Ladurantaye, Vincent; Lavoie, Jean; Bergeron, Jocelyn; Parenteau, Maxime; Lu Huizhong; Pichevar, Ramin; Rouat, Jean

    2012-01-01

    A parallel implementation of a large spiking neural network is proposed and evaluated. The neural network implements the binding-by-synchrony process using the Oscillatory Dynamic Link Matcher (ODLM). Scalability, speed and performance are compared for two implementations: Message Passing Interface (MPI) and Compute Unified Device Architecture (CUDA), running on clusters of multicore supercomputers and on NVIDIA graphical processing units, respectively. A global spiking list that represents at each instant the state of the neural network is described. This list indexes each neuron that fires during the current simulation time so that the influence of their spikes is processed simultaneously on all computing units. Our implementation shows good scalability for very large networks. A complex and large spiking neural network has been implemented in parallel with success, thus paving the road towards real-life applications based on networks of spiking neurons. MPI offers better scalability than CUDA, while the CUDA implementation on a GeForce GTX 285 gives the best cost-to-performance ratio. When running the neural network on the GTX 285, the processing speed is comparable to the MPI implementation on RQCHP's Mammouth parallel cluster with 64 nodes (128 cores).
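
    The "global spiking list" idea can be sketched in a few lines: at each step the indices of neurons that crossed threshold are gathered into one list, and their influence is applied to all units at once. The toy below uses leaky integrate-and-fire dynamics and invented constants rather than the ODLM model itself.

        # Toy global spiking list: collect firing neurons each step and apply
        # their influence to every unit simultaneously.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 1000
        weights = rng.uniform(0.0, 0.02, size=(n, n))   # dense toy connectivity
        v = rng.uniform(0.0, 1.0, size=n)               # membrane potentials
        threshold, leak = 1.0, 0.98

        for step in range(200):
            v = leak * v + rng.uniform(0.0, 0.03, size=n)   # leak + external drive
            spike_list = np.flatnonzero(v >= threshold)     # the global spiking list
            if spike_list.size:
                # influence of every spike processed at once on all units
                v += weights[:, spike_list].sum(axis=1)
                v[spike_list] = 0.0                         # reset neurons that fired
        print("neurons spiking in final step:", spike_list.size)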

  6. Modeling radiative transport in ICF plasmas on an IBM SP2 supercomputer

    International Nuclear Information System (INIS)

    Johansen, J.A.; MacFarlane, J.J.; Moses, G.A.

    1995-01-01

    At the University of Wisconsin-Madison the authors have integrated a collisional-radiative-equilibrium model into their CONRAD radiation-hydrodynamics code. This integrated package allows them to accurately simulate the transport processes involved in ICF plasmas; including the important effects of self-absorption of line-radiation. However, as they increase the amount of atomic structure utilized in their transport models, the computational demands increase nonlinearly. In an attempt to meet this increased computational demand, they have recently embarked on a mission to parallelize the CONRAD program. The parallel CONRAD development is being performed on an IBM SP2 supercomputer. The parallelism is based on a message passing paradigm, and is being implemented using PVM. At the present time they have determined that approximately 70% of the sequential program can be executed in parallel. Accordingly, they expect that the parallel version will yield a speedup on the order of three times that of the sequential version. This translates into only 10 hours of execution time for the parallel version, whereas the sequential version required 30 hours
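
    The quoted speedup estimate is consistent with Amdahl's law: with roughly 70% of the program parallelizable, the speedup is bounded by 1 / ((1 - p) + p/n), about 3.3x for large processor counts, so a 30-hour serial run drops to roughly 10 hours. A quick check (illustrative only):

        # Back-of-the-envelope Amdahl's-law check of the ~3x speedup estimate.
        def amdahl_speedup(parallel_fraction: float, n_procs: int) -> float:
            return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_procs)

        serial_hours = 30.0
        for n in (4, 16, 64, 1_000_000):
            s = amdahl_speedup(0.70, n)
            print(f"{n:>8} procs: speedup {s:4.2f}x -> {serial_hours / s:4.1f} h")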

  7. Portable implementation model for CFD simulations. Application to hybrid CPU/GPU supercomputers

    Science.gov (United States)

    Oyarzun, Guillermo; Borrell, Ricard; Gorobets, Andrey; Oliva, Assensi

    2017-10-01

    Nowadays, high performance computing (HPC) systems experience a disruptive moment with a variety of novel architectures and frameworks, without any clarity of which one is going to prevail. In this context, the portability of codes across different architectures is of major importance. This paper presents a portable implementation model based on an algebraic operational approach for direct numerical simulation (DNS) and large eddy simulation (LES) of incompressible turbulent flows using unstructured hybrid meshes. The strategy proposed consists of representing the whole time-integration algorithm using only three basic algebraic operations: the sparse matrix-vector product, a linear combination of vectors, and the dot product. The main idea is based on decomposing the nonlinear operators into a concatenation of two SpMV operations. This provides high modularity and portability. An exhaustive analysis of the proposed implementation for hybrid CPU/GPU supercomputers has been conducted with tests using up to 128 GPUs. The main objective is to understand the challenges of implementing CFD codes on new architectures.
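
    The algebraic operational approach can be illustrated by expressing one explicit time-integration step through only the three primitives named above. The sketch below uses a toy 1-D diffusion operator and invented parameters; it is not the authors' code, but it shows how SpMV, an axpy-style linear combination and a dot product suffice.

        # One explicit step expressed through the three algebraic primitives:
        # SpMV, linear combination of vectors, and dot product.
        import numpy as np
        import scipy.sparse as sp

        n = 10_000
        lap = sp.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n), format="csr")  # SpMV operator
        u = np.random.rand(n)                     # transported field
        dt, nu = 0.2, 1.0

        def step(u):
            au = lap.dot(u)                       # primitive 1: sparse matrix-vector product
            u_new = u + dt * nu * au              # primitive 2: linear combination (axpy)
            residual = np.dot(au, au) ** 0.5      # primitive 3: dot product (monitoring)
            return u_new, residual

        for it in range(100):
            u, res = step(u)
        print(f"residual norm after 100 steps: {res:.3e}")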

  8. Assessment techniques for a learning-centered curriculum: evaluation design for adventures in supercomputing

    Energy Technology Data Exchange (ETDEWEB)

    Helland, B. [Ames Lab., IA (United States); Summers, B.G. [Oak Ridge National Lab., TN (United States)

    1996-09-01

    As the classroom paradigm shifts from being teacher-centered to being learner-centered, student assessments are evolving from typical paper and pencil testing to other methods of evaluation. Students should be probed for understanding, reasoning, and critical thinking abilities rather than their ability to return memorized facts. The assessment of the Department of Energy's pilot program, Adventures in Supercomputing (AiS), offers one example of assessment techniques developed for learner-centered curricula. This assessment has employed a variety of methods to collect student data. Methods of assessment used were traditional testing, performance testing, interviews, short questionnaires via email, and student presentations of projects. The data obtained from these sources have been analyzed by a professional assessment team at the Center for Children and Technology. The results have been used to improve the AiS curriculum and establish the quality of the overall AiS program. This paper will discuss the various methods of assessment used and the results.

  9. Visualization at supercomputing centers: the tale of little big iron and the three skinny guys.

    Science.gov (United States)

    Bethel, E W; van Rosendale, J; Southard, D; Gaither, K; Childs, H; Brugger, E; Ahern, S

    2011-01-01

    Supercomputing centers are unique resources that aim to enable scientific knowledge discovery by employing large computational resources (the "Big Iron"). Design, acquisition, installation, and management of the Big Iron are carefully planned and monitored. Because these Big Iron systems produce a tsunami of data, it's natural to colocate the visualization and analysis infrastructure. This infrastructure consists of hardware (Little Iron) and staff (Skinny Guys). Our collective experience suggests that design, acquisition, installation, and management of the Little Iron and Skinny Guys don't receive the same level of treatment as that of the Big Iron. This article explores the following questions about the Little Iron: How should we size the Little Iron to adequately support visualization and analysis of data coming off the Big Iron? What sort of capabilities must it have? Related questions concern the size of visualization support staff: How big should a visualization program be, that is, how many Skinny Guys should it have? What should the staff do? How much of the visualization should be provided as a support service, and how much should applications scientists be expected to do on their own?

  10. PFLOTRAN: Reactive Flow & Transport Code for Use on Laptops to Leadership-Class Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Hammond, Glenn E.; Lichtner, Peter C.; Lu, Chuan; Mills, Richard T.

    2012-04-18

    PFLOTRAN, a next-generation reactive flow and transport code for modeling subsurface processes, has been designed from the ground up to run efficiently on machines ranging from leadership-class supercomputers to laptops. Based on an object-oriented design, the code is easily extensible to incorporate additional processes. It can interface seamlessly with Fortran 9X, C and C++ codes. Domain decomposition parallelism is employed, with the PETSc parallel framework used to manage parallel solvers, data structures and communication. Features of the code include a modular input file, implementation of high-performance I/O using parallel HDF5, ability to perform multiple realization simulations with multiple processors per realization in a seamless manner, and multiple modes for multiphase flow and multicomponent geochemical transport. Chemical reactions currently implemented in the code include homogeneous aqueous complexing reactions and heterogeneous mineral precipitation/dissolution, ion exchange, surface complexation and a multirate kinetic sorption model. PFLOTRAN has demonstrated petascale performance using 2^17 processor cores with over 2 billion degrees of freedom. Accomplishments achieved to date include applications to the Hanford 300 Area and modeling CO2 sequestration in deep geologic formations.

  11. Benchmarking Further Single Board Computers for Building a Mini Supercomputer for Simulation of Telecommunication Systems

    Directory of Open Access Journals (Sweden)

    Gábor Lencse

    2016-01-01

    Parallel Discrete Event Simulation (PDES) with the conservative synchronization method can be efficiently used for the performance analysis of telecommunication systems because of their good lookahead properties. For PDES, a cost-effective execution platform may be built by using single board computers (SBCs), which offer relatively high computation capacity compared to their price or power consumption and especially to the space they take up. A benchmarking method is proposed and its operation is demonstrated by benchmarking ten different SBCs, namely Banana Pi, Beaglebone Black, Cubieboard2, Odroid-C1+, Odroid-U3+, Odroid-XU3 Lite, Orange Pi Plus, Radxa Rock Lite, Raspberry Pi Model B+, and Raspberry Pi 2 Model B+. Their benchmarking results are compared to find out which one should be used for building a mini supercomputer for parallel discrete-event simulation of telecommunication systems. The SBCs are also used to build a heterogeneous cluster and the performance of the cluster is tested, too.

  12. Personality, perceived environment, and behavior systems related to future smoking intentions among youths: an application of problem-behavior theory in Shanghai, China.

    Science.gov (United States)

    Cai, Yong; Li, Rui; Zhu, Jingfen; Na, Li; He, Yaping; Redmon, Pam; Qiao, Yun; Ma, Jin

    2015-01-01

    Smoking among youths is a worldwide problem, particularly in China. Many endogenous and environmental factors influence smokers' intentions to smoke; therefore, a comprehensive model is needed to understand the significance and relationship of predictors. This study aimed to develop a prediction model based on problem-behavior theory (PBT) to interpret intentions to smoke among Chinese youths. We conducted a cross-sectional study of 26,675 adolescents from junior, senior, and vocational high schools in Shanghai, China. Data on smoking status, smoking knowledge, attitude toward smoking, parents' and peers' smoking, and media exposure to smoking were collected from students. A structural equation model was used to assess the developed prediction model. The experimental smoking rate and current smoking rate among the students were 11.0% and 3%, respectively. Our constructed model showed an acceptable fit to the data (comparative fit index = 0.987, root-mean-square error of approximation = 0.034). Intention to smoke was predicted by perceived environment (β = 0.455, P < 0.001), including peer smoking (β = 0.599, P < 0.001), but not by the personality system (P > 0.05), which consisted of acceptance of tobacco use (β = 0.668, P < 0.001) and academic performance (β = 0.171, P < 0.001). The PBT-based model we developed provides a good understanding of the predictors of intentions to smoke, and it suggests that future interventions among youths should focus on components in the perceived environment and behavior systems, and take into account the moderating effects of the personality system.

  13. Personality, perceived environment, and behavior systems related to future smoking intentions among youths: an application of problem-behavior theory in Shanghai, China.

    Directory of Open Access Journals (Sweden)

    Yong Cai

    Smoking among youths is a worldwide problem, particularly in China. Many endogenous and environmental factors influence smokers' intentions to smoke; therefore, a comprehensive model is needed to understand the significance and relationship of predictors. This study aimed to develop a prediction model based on problem-behavior theory (PBT) to interpret intentions to smoke among Chinese youths. We conducted a cross-sectional study of 26,675 adolescents from junior, senior, and vocational high schools in Shanghai, China. Data on smoking status, smoking knowledge, attitude toward smoking, parents' and peers' smoking, and media exposure to smoking were collected from students. A structural equation model was used to assess the developed prediction model. The experimental smoking rate and current smoking rate among the students were 11.0% and 3%, respectively. Our constructed model showed an acceptable fit to the data (comparative fit index = 0.987, root-mean-square error of approximation = 0.034). Intention to smoke was predicted by perceived environment (β = 0.455, P < 0.001), including peer smoking (β = 0.599, P < 0.001), but not by the personality system (P > 0.05), which consisted of acceptance of tobacco use (β = 0.668, P < 0.001) and academic performance (β = 0.171, P < 0.001). The PBT-based model we developed provides a good understanding of the predictors of intentions to smoke, and it suggests that future interventions among youths should focus on components in the perceived environment and behavior systems, and take into account the moderating effects of the personality system.

  14. High tolerance to temperature and salinity change should enable scleractinian coral Platygyra acuta from marginal environments to persist under future climate change.

    Directory of Open Access Journals (Sweden)

    Apple Pui Yi Chui

    With projected changes in the marine environment under global climate change, the effects of single stressors on corals have been relatively well studied. However, more focus should be placed on the interactive effects of multiple stressors if their impacts upon corals are to be assessed more realistically. Elevation of sea surface temperature is projected under global climate change, and future increases in precipitation extremes related to the monsoon are also expected. Thus, the lowering of salinity could become a more common phenomenon and its impact on corals could be significant, as extreme precipitation usually occurs during the coral spawning season. Here, we investigated the interactive effects of temperature [24, 27 (ambient), 30, 32°C] and salinity [33 psu (ambient), 30, 26, 22, 18, 14 psu] on larval settlement, post-settlement survival and early growth of the dominant coral Platygyra acuta from Hong Kong, a marginal environment for coral growth. The results indicate that elevated temperatures (+3°C and +5°C above ambient) did not have any significant effects on larval settlement success and post-settlement survival for up to 56 days of prolonged exposure. Such thermal tolerance was markedly higher than that reported in the literature for other coral species. Moreover, there was a positive effect of these elevated temperatures in reducing the negative effects of lowered salinity (26 psu) on settlement success. The enhanced settlement success brought about by elevated temperatures, together with the high post-settlement survival recorded up to 44 and 8 days of exposure at +3°C and +5°C above ambient, respectively, resulted in overall positive effects of elevated temperatures on recruitment success. These results suggest that projected elevation in temperature over the next century should not pose any major problem for the recruitment success of P. acuta. The combined effects of higher temperatures and lowered salinity (26 psu) could

  15. HeNCE: A Heterogeneous Network Computing Environment

    Directory of Open Access Journals (Sweden)

    Adam Beguelin

    1994-01-01

    Network computing seeks to utilize the aggregate resources of many networked computers to solve a single problem. In so doing it is often possible to obtain supercomputer performance from an inexpensive local area network. The drawback is that network computing is complicated and error prone when done by hand, especially if the computers have different operating systems and data formats and are thus heterogeneous. The Heterogeneous Network Computing Environment (HeNCE) is an integrated graphical environment for creating and running parallel programs over a heterogeneous collection of computers. It is built on a lower-level package called Parallel Virtual Machine (PVM). The HeNCE philosophy of parallel programming is to have the programmer graphically specify the parallelism of a computation and to automate, as much as possible, the tasks of writing, compiling, executing, debugging, and tracing the network computation. Key to HeNCE is a graphical language based on directed graphs that describe the parallelism and data dependencies of an application. Nodes in the graphs represent conventional Fortran or C subroutines and the arcs represent data and control flow. This article describes the present state of HeNCE, its capabilities, limitations, and areas of future research.
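
    The directed-graph programming model that HeNCE provides can be mimicked in miniature with a task graph whose independent nodes run concurrently. The sketch below uses a Python thread pool instead of PVM and an invented four-node graph, purely to illustrate the idea of arcs as data dependencies.

        # Toy directed-graph execution: nodes are routines, arcs are data
        # dependencies, and nodes with satisfied dependencies run concurrently.
        from concurrent.futures import ThreadPoolExecutor

        graph = {                    # node -> list of nodes it depends on
            "load": [],
            "filter_a": ["load"],
            "filter_b": ["load"],
            "combine": ["filter_a", "filter_b"],
        }
        work = {
            "load": lambda: list(range(10)),
            "filter_a": lambda xs: [x for x in xs if x % 2 == 0],
            "filter_b": lambda xs: [x * x for x in xs],
            "combine": lambda a, b: sum(a) + sum(b),
        }

        results = {}
        with ThreadPoolExecutor() as pool:
            pending = dict(graph)
            while pending:
                ready = [n for n, deps in pending.items() if all(d in results for d in deps)]
                futures = {n: pool.submit(work[n], *(results[d] for d in graph[n])) for n in ready}
                for n, fut in futures.items():   # nodes in the same batch run in parallel
                    results[n] = fut.result()
                    del pending[n]
        print("combine ->", results["combine"])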

  16. Update on the Worsening Particle Radiation Environment Observed by CRaTER and Implications for Future Human Deep-Space Exploration

    Science.gov (United States)

    Schwadron, N. A.; Rahmanifard, F.; Wilson, J.; Jordan, A. P.; Spence, H. E.; Joyce, C. J.; Blake, J. B.; Case, A. W.; de Wet, W.; Farrell, W. M.; Kasper, J. C.; Looper, M. D.; Lugaz, N.; Mays, L.; Mazur, J. E.; Niehof, J.; Petro, N.; Smith, C. W.; Townsend, L. W.; Winslow, R.; Zeitlin, C.

    2018-03-01

    Over the last decade, the solar wind has exhibited low densities and magnetic field strengths, representing anomalous states that have never been observed during the space age. As discussed by Schwadron, Blake, et al. (2014, https://doi.org/10.1002/2014SW001084), the cycle 23-24 solar activity led to the longest solar minimum in more than 80 years and continued into the "mini" solar maximum of cycle 24. During this weak activity, we observed galactic cosmic ray fluxes that exceeded previously observed levels, together with relatively small solar energetic particle events. Here we provide an update to the Schwadron, Blake, et al. (2014, https://doi.org/10.1002/2014SW001084) observations from the Cosmic Ray Telescope for the Effects of Radiation (CRaTER) on the Lunar Reconnaissance Orbiter. The Schwadron, Blake, et al. (2014, https://doi.org/10.1002/2014SW001084) study examined the evolution of the interplanetary magnetic field and utilized a previously published study by Goelzer et al. (2013, https://doi.org/10.1002/2013JA019404) projecting out the interplanetary magnetic field strength based on the evolution of sunspots as a proxy for the rate that the Sun releases coronal mass ejections. This led to a projection of dose rates from galactic cosmic rays on the lunar surface, which suggested a ˜20% increase of dose rates from one solar minimum to the next and indicated that the radiation environment in space may be a worsening factor important for consideration in future planning of human space exploration. We compare the predictions of Schwadron, Blake, et al. (2014, https://doi.org/10.1002/2014SW001084) with the actual dose rates observed by CRaTER in the last 4 years. The observed dose rates exceed the predictions by ˜10%, showing that the radiation environment is worsening more rapidly than previously estimated. Much of this increase is attributable to relatively low-energy ions, which can be effectively shielded. Despite the continued paucity of solar activity, one of the hardest solar events in

  17. The Past and Future Trends of Heat Stress Based On Wet Bulb Globe Temperature Index in Outdoor Environment of Tehran City, Iran.

    Science.gov (United States)

    Habibi Mohraz, Majid; Ghahri, Asghar; Karimi, Mehrdad; Golbabaei, Farideh

    2016-06-01

    Workers in open, warm environments are at risk from the health effects of climate and heat changes, and this risk is expected to increase with global warming. This study aimed to investigate past changes of the Wet Bulb Globe Temperature (WBGT) index and to predict the trend of its future changes in Tehran, the capital of Iran. Meteorological data recorded in Tehran, Iran during the statistical period between 1961 and 2009 were obtained from the Iran Meteorological Organization, and based on them the WBGT index was calculated and processed using the Mann-Kendall correlation test. The results of the Mann-Kendall correlation test showed that the trend of changes of the annual mean WBGT during the statistical period under study (1961-2009) has been significantly increasing. In addition, the proposed predictive model estimated that an increase of about 1.55 degrees in the WBGT index will be seen over the 40 years from 2009 to 2050 in Tehran. Climate change in Tehran has had an effect on persons' exposure to heat stress, consistent with global warming.
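
    For reference, the outdoor WBGT index combines the natural wet-bulb, globe, and dry-bulb temperatures with fixed weights (as in ISO 7243); the helper below illustrates the definition with example inputs, whereas the study estimates these component temperatures from routine meteorological records.

        # Standard outdoor (solar-exposed) WBGT weighting; inputs are examples only.
        def wbgt_outdoor(t_nwb: float, t_globe: float, t_dry: float) -> float:
            """Wet Bulb Globe Temperature (degC) for outdoor work with solar load."""
            return 0.7 * t_nwb + 0.2 * t_globe + 0.1 * t_dry

        print(f"WBGT = {wbgt_outdoor(t_nwb=24.0, t_globe=45.0, t_dry=35.0):.1f} degC")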

  18. Physicochemical interactions resulting from the use of a SiC/SiC composite material in typical environments of future nuclear reactors

    International Nuclear Information System (INIS)

    Braun, James

    2014-01-01

    The development of high-purity SiC fibers during the nineties has led to their consideration as nuclear reactor components through the use of SiC/SiC composites. SiC and SiC/SiC composites are considered as core materials of future nuclear reactors (SFR, GFR) and as a potential replacement for the zirconium cladding of PWRs. Therefore, the thermochemical compatibility of these materials with typical environments of those nuclear reactors has been studied. The composition and the growth kinetics of the reaction zone of SiC towards niobium and tantalum (considered as materials to ensure the leak-tightness of a SiC/SiC cladding for GFR) have been studied between 1050 and 1500 °C. High-temperature heat treatments in open and closed systems between SiC and UO2 have shown a significant reactivity over 1200 °C characterized by the formation of CO and uranium silicides. Moreover, a liquid phase has been detected between 1500 and 1650 °C. The exposure of SiC/SiC to liquid sodium (550 °C, up to 2000 h) has been studied as a function of the oxygen concentration dissolved in the liquid sodium. An improvement of the mechanical properties of the composites elaborated for this study (increase of the tensile strength and strain at failure) has been highlighted after immersion in the liquid sodium, independently of its oxygen concentration. It is believed that this phenomenon is due to the presence of residual sodium in the material. (author) [fr

  19. Scalable geocomputation: evolving an environmental model building platform from single-core to supercomputers

    Science.gov (United States)

    Schmitz, Oliver; de Jong, Kor; Karssenberg, Derek

    2017-04-01

    There is an increasing demand to run environmental models on a big scale: simulations over large areas at high resolution. The heterogeneity of available computing hardware such as multi-core CPUs, GPUs or supercomputers potentially provides significant computing power to fulfil this demand. However, this requires detailed knowledge of the underlying hardware, parallel algorithm design and the implementation thereof in an efficient system programming language. Domain scientists such as hydrologists or ecologists often lack this specific software engineering knowledge; their emphasis is (and should be) on exploratory building and analysis of simulation models. As a result, models constructed by domain specialists mostly do not take full advantage of the available hardware. A promising solution is to separate the model building activity from software engineering by offering domain specialists a model building framework with pre-programmed building blocks that they combine to construct a model. The model building framework, consequently, needs to have built-in capabilities to make full use of the available hardware. Developing such a framework that provides understandable code for domain scientists while being runtime efficient at the same time poses several challenges for its developers. For example, optimisations can be performed on individual operations or on the whole model, or tasks need to be generated for a well-balanced execution without explicitly knowing the complexity of the domain problem provided by the modeller. Ideally, a modelling framework supports the optimal use of available hardware whichever combination of model building blocks scientists use. We demonstrate our ongoing work on developing parallel algorithms for spatio-temporal modelling and demonstrate 1) PCRaster, an environmental software framework (http://www.pcraster.eu) providing spatio-temporal model building blocks and 2) parallelisation of about 50 of these building blocks using

  20. Non-sectarian scenario experiments in socio-ecological knowledge building for multi-use marine environments: Insights from New Zealand's Marine Futures project

    KAUST Repository

    Le Heron, Richard

    2016-01-29

    The challenges of managing marine ecosystems for multiple users, while well recognised, have not led to clear strategies, principles or practice. The paper uses novel workshop-based thought-experiments to address these concerns. These took the form of trans-disciplinary Non-Sectarian Scenario Experiments (NSSE), involving participants who agreed to put aside their disciplinary interests and commercial and institutional obligations. The NSSE form of co-production of knowledge is a distinctive addition to the participatory and scenario literatures in marine resource management (MRM). Set in the context of resource use conflicts in New Zealand, the workshops assembled diverse participants in the marine economy to co-develop and co-explore the making of socio-ecological knowledge and identify the capability required for a new generation of multi-use oriented resource management. The thought-experiments assumed that non-sectarian navigation of scenarios will resource a step-change in marine management by facilitating new connections, relationships, and understandings of potential marine futures. Two questions guided workshop interactions: what science needs spring from pursuing imaginable possibilities and directions in a field of scenarios, and what kinds of institutions would aid the generation of science knowledge and its application to policy and management solutions. The effectiveness of the thought-experiments helped identify ways of dealing with core problems in multi-use marine management, such as the urgent need to cope with ecological and socio-economic surprise, and to define and address cumulative impacts. Discussion focuses on how the workshops offered fresh perspectives and insights into a number of challenges. These challenges include building relations of trust and collective organisation, showing the importance of values-means-ends pathways, developing facilitative legislation to enable initiatives, and the utility of the NSSEs in informing new governance and

  1. Simulation of x-rays in refractive structure by the Monte Carlo method using the supercomputer SKIF

    International Nuclear Information System (INIS)

    Yaskevich, Yu.R.; Kravchenko, O.I.; Soroka, I.I.; Chembrovskij, A.G.; Kolesnik, A.S.; Serikova, N.V.; Petrov, P.V.; Kol'chevskij, N.N.

    2013-01-01

    Software 'Xray-SKIF' for the simulation of X-rays in refractive structures by the Monte Carlo method using the supercomputer SKIF BSU has been developed. The program generates a large number of rays propagated from a source to the refractive structure. Ray trajectories are calculated under the assumption of geometrical optics, and absorption is computed for each ray inside the refractive structure. Dynamic arrays are used to store the calculated ray parameters, which allows the X-ray field distribution to be restored very quickly for different detector positions. It was found that increasing the number of processors leads to a proportional decrease in calculation time: the simulation of 10^8 X-rays took 3 hours with 1 processor and 6 minutes with 30 processors. 10^9 X-rays were calculated with 'Xray-SKIF', which allows the X-ray field behind the refractive structure to be reconstructed with a spatial resolution of 1 micron. (authors)
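
    The scheme described above is embarrassingly parallel: each ray is traced independently, which is why the run time falls roughly in proportion to the number of processors. The following sketch illustrates that structure with a toy geometry, an assumed absorption coefficient and Beer-Lambert attenuation; it is not the 'Xray-SKIF' code itself.

      """Sketch of the embarrassingly parallel Monte Carlo ray scheme described above:
      rays are independent, so batches are distributed over processes. Geometry,
      absorption coefficient and detector binning are illustrative assumptions."""
      from multiprocessing import Pool

      import numpy as np

      MU = 50.0        # linear absorption coefficient of the lens material, 1/cm (assumed)
      THICKNESS = 0.1  # path length through the refractive structure, cm (assumed)


      def trace_batch(n_rays, seed):
          """Trace one batch of rays: sample source angles, attenuate, bin at the detector."""
          rng = np.random.default_rng(seed)
          angles = rng.normal(0.0, 1e-4, n_rays)               # small angular spread
          weights = np.exp(-MU * THICKNESS / np.cos(angles))   # Beer-Lambert absorption
          x_det = 100.0 * np.tan(angles)                       # position on a detector 100 cm away
          hist, _ = np.histogram(x_det, bins=200, range=(-0.05, 0.05), weights=weights)
          return hist


      if __name__ == "__main__":
          n_proc, rays_per_batch, n_batches = 8, 10**6, 32
          with Pool(n_proc) as pool:
              hists = pool.starmap(trace_batch, [(rays_per_batch, s) for s in range(n_batches)])
          field = np.sum(hists, axis=0)   # reconstructed detector field distribution
          print(field.max())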

  2. Coherent 40 Gb/s SP-16QAM and 80 Gb/s PDM-16QAM in an Optimal Supercomputer Optical Switch Fabric

    DEFF Research Database (Denmark)

    Karinou, Fotini; Borkowski, Robert; Zibar, Darko

    2013-01-01

    We demonstrate, for the first time, the feasibility of using 40 Gb/s SP-16QAM and 80 Gb/s PDM-16QAM in an optimized cell-switching supercomputer optical interconnect architecture based on semiconductor optical amplifiers as ON/OFF gates.

  3. Compiler and Runtime Support for Programming in Adaptive Parallel Environments

    Science.gov (United States)

    1998-10-15

    no other job is waiting for resources, and use a smaller number of processors when other jobs need resources. Setia et al. [15, 20] have shown that such... [15] Vijay K. Naik, Sanjeev Setia, and Mark Squillante. Performance analysis of job scheduling policies in parallel supercomputing environments. In... on networks of heterogeneous workstations. Technical Report CSE-94-012, Oregon Graduate Institute of Science and Technology, 1994. [20] Sanjeev Setia

  4. MaMiCo: Transient multi-instance molecular-continuum flow simulation on supercomputers

    Science.gov (United States)

    Neumann, Philipp; Bian, Xin

    2017-11-01

    We present extensions of the macro-micro-coupling tool MaMiCo, which was designed to couple continuum fluid dynamics solvers with discrete particle dynamics. To enable local extraction of smooth flow field quantities especially on rather short time scales, sampling over an ensemble of molecular dynamics simulations is introduced. We provide details on these extensions including the transient coupling algorithm, open boundary forcing, and multi-instance sampling. Furthermore, we validate the coupling in Couette flow using different particle simulation software packages and particle models, i.e. molecular dynamics and dissipative particle dynamics. Finally, we demonstrate the parallel scalability of the molecular-continuum simulations by using up to 65 536 compute cores of the supercomputer Shaheen II located at KAUST. Program Files doi:http://dx.doi.org/10.17632/w7rgdrhb85.1 Licensing provisions: BSD 3-clause Programming language: C, C++ External routines/libraries: For compiling: SCons, MPI (optional) Subprograms used: ESPResSo, LAMMPS, ls1 mardyn, waLBerla For installation procedures of the MaMiCo interfaces, see the README files in the respective code directories located in coupling/interface/impl. Journal reference of previous version: P. Neumann, H. Flohr, R. Arora, P. Jarmatz, N. Tchipev, H.-J. Bungartz. MaMiCo: Software design for parallel molecular-continuum flow simulations, Computer Physics Communications 200: 324-335, 2016 Does the new version supersede the previous version?: Yes. The functionality of the previous version is completely retained in the new version. Nature of problem: Coupled molecular-continuum simulation for multi-resolution fluid dynamics: parts of the domain are resolved by molecular dynamics or another particle-based solver whereas large parts are covered by a mesh-based CFD solver, e.g. a lattice Boltzmann automaton. Solution method: We couple existing MD and CFD solvers via MaMiCo (macro-micro coupling tool). Data exchange and
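
    The central extension described above is multi-instance sampling: averaging an instantaneous, thermally noisy quantity over an ensemble of independent particle-simulation instances to obtain a smooth field for the continuum solver. The toy sketch below illustrates the idea in plain NumPy; the profile, noise level and ensemble size are illustrative assumptions, not MaMiCo's C++ interface.

      """Toy illustration of multi-instance sampling: average a noisy per-cell velocity
      field over an ensemble of independent particle-simulation instances to obtain a
      smooth quantity for the continuum solver. All numbers are illustrative."""
      import numpy as np


      def md_instance_velocity(n_cells, true_profile, thermal_noise=0.5, seed=None):
          """Stand-in for one molecular-dynamics instance: true flow plus thermal noise."""
          rng = np.random.default_rng(seed)
          return true_profile + thermal_noise * rng.standard_normal(n_cells)


      n_cells, n_instances = 32, 256
      couette = np.linspace(0.0, 1.0, n_cells)   # linear Couette-like profile

      samples = np.stack([md_instance_velocity(n_cells, couette, seed=i)
                          for i in range(n_instances)])
      ensemble_mean = samples.mean(axis=0)        # smooth field handed to the CFD solver
      std_error = samples.std(axis=0) / np.sqrt(n_instances)

      print(f"max deviation from true profile: {np.abs(ensemble_mean - couette).max():.3f}")
      print(f"typical standard error: {std_error.mean():.3f}")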

  5. Three-dimensional kinetic simulations of whistler turbulence in solar wind on parallel supercomputers

    Science.gov (United States)

    Chang, Ouliang

    The objective of this dissertation is to study the physics of whistler turbulence evolution and its role in energy transport and dissipation in the solar wind plasmas through computational and theoretical investigations. This dissertation presents the first fully three-dimensional (3D) particle-in-cell (PIC) simulations of whistler turbulence forward cascade in a homogeneous, collisionless plasma with a uniform background magnetic field B_0, and the first 3D PIC simulation of whistler turbulence with both forward and inverse cascades. Such computationally demanding research is made possible through the use of massively parallel, high performance electromagnetic PIC simulations on state-of-the-art supercomputers. Simulations are carried out to study characteristic properties of whistler turbulence under variable solar wind fluctuation amplitude (ε_e) and electron beta (β_e), relative contributions to energy dissipation and electron heating in whistler turbulence from the quasilinear scenario and the intermittency scenario, and whistler turbulence preferential cascading direction and wavevector anisotropy. The 3D simulations of whistler turbulence exhibit a forward cascade of fluctuations into a broadband, anisotropic, turbulent spectrum at shorter wavelengths with wavevectors preferentially quasi-perpendicular to B_0. The overall electron heating yields T_∥ > T_⊥ for all ε_e and β_e values, indicating the primary linear wave-particle interaction is Landau damping. But linear wave-particle interactions play a minor role in shaping the wavevector spectrum, whereas nonlinear wave-wave interactions are overall stronger and faster processes, and ultimately determine the wavevector anisotropy. Simulated magnetic energy spectra as a function of wavenumber show a spectral break to steeper slopes, which scales as k_⊥ λ_e ≃ 1 independent of β_e values, where λ_e is the electron inertial length, qualitatively similar to solar wind observations. Specific

  6. Harmonized Constraints in Software Engineering and Acquisition Process Management Requirements are the Clue to Meet Future Performance Goals Successfully in an Environment of Scarce Resources

    National Research Council Canada - National Science Library

    Reich, Holger

    2008-01-01

    This MBA project investigates the importance of correctly deriving requirements from the capability gap and operational environment, and translating them into the processes of contracting, software...

  7. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    Science.gov (United States)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enables accurate climate projections. NCI makes available 10+ PB of major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected, large-scale, high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated, reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will

  8. Car2x with software defined networks, network functions virtualization and supercomputers: technical and scientific preparations for the Amsterdam Arena telecoms fieldlab

    NARCIS (Netherlands)

    Meijer R.J.; Cushing R.; De Laat C.; Jackson P.; Klous S.; Koning R.; Makkes M.X.; Meerwijk A.

    2015-01-01

    In the invited talk 'Car2x with SDN, NFV and supercomputers' we report on how our past work with SDN [1, 2] enables the design of a smart mobility fieldlab in the huge parking lot of the Amsterdam Arena. We explain how we can engineer and test software that handles the complex conditions of the Car2X

  9. MEGADOCK 4.0: an ultra-high-performance protein-protein docking software for heterogeneous supercomputers.

    Science.gov (United States)

    Ohue, Masahito; Shimoda, Takehiro; Suzuki, Shuji; Matsuzaki, Yuri; Ishida, Takashi; Akiyama, Yutaka

    2014-11-15

    The application of protein-protein docking in large-scale interactome analysis is a major challenge in structural bioinformatics and requires huge computing resources. In this work, we present MEGADOCK 4.0, an FFT-based docking software that makes extensive use of recent heterogeneous supercomputers and shows powerful, scalable performance of >97% strong scaling. MEGADOCK 4.0 is written in C++ with OpenMPI and NVIDIA CUDA 5.0 (or later) and is freely available to all academic and non-profit users at: http://www.bi.cs.titech.ac.jp/megadock. akiyama@cs.titech.ac.jp Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
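
    MEGADOCK belongs to the family of FFT-based docking codes, in which a translational scan of a ligand grid against a receptor grid is evaluated for all shifts at once via the convolution theorem. The sketch below shows that core idea with toy grids and an assumed scoring weight; it is not MEGADOCK's actual scoring function or GPU kernel.

      """Minimal sketch of an FFT-based rigid-docking translational scan: correlate a
      receptor grid with a ligand grid for all translations at once. The grids and
      scoring weights here are toy assumptions, not MEGADOCK's actual score."""
      import numpy as np

      N = 64
      receptor = np.zeros((N, N, N))
      ligand = np.zeros((N, N, N))

      # Toy shapes: a receptor block whose core is penalized (reward surface contact,
      # punish deep overlap) and a small cubic ligand (assumed geometry).
      receptor[20:44, 20:44, 20:44] = 1.0
      receptor[26:38, 26:38, 26:38] = -9.0
      ligand[0:10, 0:10, 0:10] = 1.0

      # Cross-correlation over all translations via the convolution theorem.
      score = np.real(np.fft.ifftn(np.fft.fftn(receptor) * np.conj(np.fft.fftn(ligand))))

      best = np.unravel_index(np.argmax(score), score.shape)
      print("best translation (grid units):", best, "score:", score[best])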

  10. A criticality safety analysis code using a vectorized Monte Carlo method on the HITAC S-810 supercomputer

    International Nuclear Information System (INIS)

    Morimoto, Y.; Maruyama, H.

    1987-01-01

    A vectorized Monte Carlo criticality safety analysis code has been developed on the vector supercomputer HITAC S-810. In this code, a multi-particle tracking algorithm was adopted for effective utilization of the vector processor. A flight analysis with pseudo-scattering was developed to reduce the computational time needed for the flight analysis, which represents the bulk of the computation. This new algorithm realized a speed-up factor of 1.5 over the conventional flight analysis. The code also adopted a multigroup cross-section constants library of the Bondarenko type with 190 groups, 132 for the fast and epithermal regions and 58 for the thermal region. Evaluation work showed that this code reproduces the experimental results to an accuracy of about 1% for the effective neutron multiplication factor. (author)
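
    The multi-particle tracking idea mentioned above keeps a vector processor busy by advancing a whole bank of histories with array operations instead of tracking one particle at a time. The sketch below mimics that structure in NumPy for a one-group slab problem with assumed cross sections; the real code uses the 190-group library and criticality tallies described in the abstract.

      """Toy illustration of multi-particle (vectorized) Monte Carlo tracking: a whole
      bank of neutrons is advanced per step with array operations. One-group slab
      geometry and cross sections are illustrative assumptions."""
      import numpy as np

      rng = np.random.default_rng(1)

      SIG_T, SIG_A = 1.0, 0.4          # total and absorption macroscopic cross sections (1/cm), assumed
      SLAB = 10.0                      # slab thickness (cm), assumed
      N = 1_000_000                    # particles in the bank

      x = np.zeros(N)                  # positions, all born at the left face
      mu = rng.uniform(0.0, 1.0, N)    # initial direction cosines, forward-directed
      alive = np.ones(N, dtype=bool)
      absorbed = leaked = 0

      while alive.any():
          idx = np.nonzero(alive)[0]
          step = -np.log(rng.random(idx.size)) / SIG_T      # sampled flight distances
          x[idx] += mu[idx] * step
          out = (x[idx] < 0.0) | (x[idx] > SLAB)            # escaped the slab
          leaked += out.sum()
          collided = idx[~out]
          absorb = rng.random(collided.size) < SIG_A / SIG_T
          absorbed += absorb.sum()
          survivors = collided[~absorb]
          mu[survivors] = rng.uniform(-1.0, 1.0, survivors.size)   # isotropic scatter
          alive[idx[out]] = False
          alive[collided[absorb]] = False

      print(f"absorbed fraction {absorbed/N:.3f}, leaked fraction {leaked/N:.3f}")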

  11. Using the LANSCE irradiation facility to predict the number of fatal soft errors in one of the world's fastest supercomputers

    International Nuclear Information System (INIS)

    Michalak, S.E.; Harris, K.W.; Hengartner, N.W.; Takala, B.E.; Wender, S.A.

    2005-01-01

    Los Alamos National Laboratory (LANL) is home to the Los Alamos Neutron Science Center (LANSCE). LANSCE is a unique facility because its neutron spectrum closely mimics the neutron spectrum at terrestrial and aircraft altitudes, but is many times more intense. Thus, LANSCE provides an ideal setting for accelerated testing of semiconductor and other devices that are susceptible to cosmic ray induced neutrons. Many industrial companies use LANSCE to estimate device susceptibility to cosmic ray induced neutrons, and it has also been used to test parts from one of LANL's supercomputers, the ASC (Advanced Simulation and Computing Program) Q. This paper discusses our use of the LANSCE facility to study components in Q including a comparison with failure data from Q

  12. Evaluating the networking characteristics of the Cray XC-40 Intel Knights Landing-based Cori supercomputer at NERSC

    Energy Technology Data Exchange (ETDEWEB)

    Doerfler, Douglas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Austin, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cook, Brandon [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Deslippe, Jack [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kandalla, Krishna [Cray Inc, Bloomington, MN (United States); Mendygral, Peter [Cray Inc, Bloomington, MN (United States)

    2017-09-12

    There are many potential issues associated with deploying the Intel Xeon Phi™ (code named Knights Landing [KNL]) manycore processor in a large-scale supercomputer. One in particular is the ability to fully utilize the high-speed communications network, given that the serial performance of a Xeon Phi™ core is a fraction of that of a Xeon® core. In this paper, we take a look at the trade-offs associated with allocating enough cores to fully utilize the Aries high-speed network versus cores dedicated to computation, e.g., the trade-off between MPI and OpenMP. In addition, we evaluate new features of Cray MPI in support of KNL, such as internode optimizations. We also evaluate one-sided programming models such as Unified Parallel C. We quantify the impact of the above trade-offs and features using a suite of National Energy Research Scientific Computing Center applications.

  13. Performance Evaluation of an Intel Haswell- and Ivy Bridge-Based Supercomputer Using Scientific and Engineering Applications

    Science.gov (United States)

    Saini, Subhash; Hood, Robert T.; Chang, Johnny; Baron, John

    2016-01-01

    We present a performance evaluation conducted on a production supercomputer of the Intel Xeon Processor E5-2680v3, a twelve-core implementation of the fourth-generation Haswell architecture, and compare it with the Intel Xeon Processor E5-2680v2, an Ivy Bridge implementation of the third-generation Sandy Bridge architecture. Several new architectural features have been incorporated in Haswell, including improvements in all levels of the memory hierarchy as well as improvements to vector instructions and power management. We critically evaluate these new features of Haswell and compare with Ivy Bridge using several low-level benchmarks, including a subset of HPCC and HPCG, and four full-scale scientific and engineering applications. We also present a model to predict the performance of HPCG and Cart3D within 5%, and Overflow within 10%, accuracy.

  14. Detection and Characterization of Engineered Nanomaterials in the Environment: Current State-of-the-art and Future Directions Report, Annotated Bibliography, and Image Library

    Science.gov (United States)

    The increasing manufacture and implementation of engineered nanomaterials (ENMs) will continue to lead to the release of these materials into the environment. Reliably assessing the environmental exposure risk of ENMs will depend highly on the ability to quantify and characterize...

  15. Informatics and Nursing in a Post-Nursing Informatics World: Future Directions for Nurses in an Automated, Artificially Intelligent, Social-Networked Healthcare Environment.

    Science.gov (United States)

    Booth, Richard G

    2016-01-01

    The increased adoption and use of technology within healthcare and society has influenced the nursing informatics specialty in a multitude of fashions. Namely, the nursing informatics specialty currently faces a range of important decisions related to its knowledge base, established values and future directions - all of which are in need of development and future-proofing. In light of the increased use of automation, artificial intelligence and big data in healthcare, the specialty must also reconceptualize the roles of both nurses and informaticians to ensure that the nursing profession is ready to operate within future digitalized healthcare ecosystems. To explore these goals, the author of this manuscript outlines an examination of technological advancements currently taking place within healthcare, and also proposes implications for the nursing role and the nursing informatics specialty. Finally, recommendations and insights towards how the roles of nurses and informaticians might evolve or be shaped in the growing post-nursing informatics era are presented. Copyright © 2016 Longwoods Publishing.

  16. Lisbon: Supercomputer for Portugal financed from 'CERN Fund'

    International Nuclear Information System (INIS)

    Anon.

    1990-01-01

    A powerful new computer is now in use at the Portuguese National Foundation for Scientific Computation (FCCN, Lisbon), which was set up in 1987 to help fund university computing, to anticipate future requirements and to provide a fast computer at the National Civil Engineering Laboratory (LNEC) as a central node for remote access by major research institutes.

  17. The Relationships among Students' Future-Oriented Goals and Subgoals, Perceived Task Instrumentality, and Task-Oriented Self-Regulation Strategies in an Academic Environment

    Science.gov (United States)

    Tabachnick, Sharon E.; Miller, Raymond B.; Relyea, George E.

    2008-01-01

    The authors performed path analysis, followed by a bootstrap procedure, to test the predictions of a model explaining the relationships among students' distal future goals (both extrinsic and intrinsic), their adoption of a middle-range subgoal, their perceptions of task instrumentality, and their proximal task-oriented self-regulation strategies.…

  18. Assessing and Managing the Current and Future Pest Risk from Water Hyacinth (Eichhornia crassipes), an Invasive Aquatic Plant Threatening the Environment and Water Security.

    Science.gov (United States)

    Kriticos, Darren J; Brunel, Sarah

    2016-01-01

    Understanding and managing the biological invasion threats posed by aquatic plants under current and future climates is a growing challenge for biosecurity and land management agencies worldwide. Eichhornia crassipes is one of the world's worst aquatic weeds. Presently, it threatens aquatic ecosystems, and hinders the management and delivery of freshwater services in both developed and developing parts of the world. A niche model was fitted using CLIMEX, to estimate the potential distribution of E. crassipes under historical and future climate scenarios. Under two future greenhouse gas emission scenarios for 2080 simulated with three Global Climate Models, the area with a favourable temperature regime appears set to shift polewards. The greatest potential for future range expansion lies in Europe. Elsewhere in the northern hemisphere temperature gradients are too steep for significant geographical range expansion under the climate scenarios explored here. In the Southern Hemisphere, the southern range boundary for E. crassipes is set to expand southwards in Argentina, Australia and New Zealand; under current climate conditions it is already able to invade the southern limits of Africa. The opportunity exists to prevent its spread into the islands of Tasmania in Australia and the South Island of New Zealand, both of which depend upon hydroelectric facilities that would be threatened by the presence of E. crassipes. In Europe, efforts to slow or stop the spread of E. crassipes will face the challenge of limited internal biosecurity capacity. The modelling technique demonstrated here is the first application of niche modelling for an aquatic weed under historical and projected future climates. It provides biosecurity agencies with a spatial tool to foresee and manage the emerging invasion threats in a manner that can be included in the international standard for pest risk assessments. It should also support more detailed local and regional management.

  19. Assessing and Managing the Current and Future Pest Risk from Water Hyacinth (Eichhornia crassipes), an Invasive Aquatic Plant Threatening the Environment and Water Security.

    Directory of Open Access Journals (Sweden)

    Darren J Kriticos

    Full Text Available Understanding and managing the biological invasion threats posed by aquatic plants under current and future climates is a growing challenge for biosecurity and land management agencies worldwide. Eichhornia crassipes is one of the world's worst aquatic weeds. Presently, it threatens aquatic ecosystems, and hinders the management and delivery of freshwater services in both developed and developing parts of the world. A niche model was fitted using CLIMEX, to estimate the potential distribution of E. crassipes under historical and future climate scenarios. Under two future greenhouse gas emission scenarios for 2080 simulated with three Global Climate Models, the area with a favourable temperature regime appears set to shift polewards. The greatest potential for future range expansion lies in Europe. Elsewhere in the northern hemisphere temperature gradients are too steep for significant geographical range expansion under the climate scenarios explored here. In the Southern Hemisphere, the southern range boundary for E. crassipes is set to expand southwards in Argentina, Australia and New Zealand; under current climate conditions it is already able to invade the southern limits of Africa. The opportunity exists to prevent its spread into the islands of Tasmania in Australia and the South Island of New Zealand, both of which depend upon hydroelectric facilities that would be threatened by the presence of E. crassipes. In Europe, efforts to slow or stop the spread of E. crassipes will face the challenge of limited internal biosecurity capacity. The modelling technique demonstrated here is the first application of niche modelling for an aquatic weed under historical and projected future climates. It provides biosecurity agencies with a spatial tool to foresee and manage the emerging invasion threats in a manner that can be included in the international standard for pest risk assessments. It should also support more detailed local and regional

  20. Our World, Our Future: Bilingual Activities on Population and the Environment = Nuestro Mundo, Nuestro Futuro: Actividades Bilingues Acerca de la Poblacion y el Medio Ambiente.

    Science.gov (United States)

    Zero Population Growth, Inc., Washington, DC.

    This bilingual activity guide helps to develop students' understandings of the interdependence of people and the environment. Interdisciplinary resources are provided featuring environmental education lessons with applications to the social studies, science, math, and family life education curricula. It is designed for the middle school level, but…

  1. Supercomputer methods for the solution of fundamental problems of particle physics

    International Nuclear Information System (INIS)

    Moriarty, K.J.M.; Rebbi, C.

    1990-01-01

    The authors present motivation and methods for computer investigations in particle theory. They illustrate the computational formulation of quantum chromodynamics and selected applications to the calculation of hadronic properties. They discuss possible extensions of the methods developed for particle theory to different areas of application, such as cosmology and solid-state physics, that share common methods. Because of the commonality of methodology, advances in one area stimulate advances in other areas. They also outline future plans of research.

  2. LDRD final report: a lightweight operating system for multi-core capability class supercomputers.

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, Suzanne Marie; Hudson, Trammell B. (OS Research); Ferreira, Kurt Brian; Bridges, Patrick G. (University of New Mexico); Pedretti, Kevin Thomas Tauke; Levenhagen, Michael J.; Brightwell, Ronald Brian

    2010-09-01

    The two primary objectives of this LDRD project were to create a lightweight kernel (LWK) operating system (OS) designed to take maximum advantage of multi-core processors, and to leverage the virtualization capabilities in modern multi-core processors to create a more flexible and adaptable LWK environment. The most significant technical accomplishments of this project were the development of the Kitten lightweight kernel, the co-development of the SMARTMAP intra-node memory mapping technique, and the development and demonstration of a scalable virtualization environment for HPC. Each of these topics is presented in this report by the inclusion of a published or submitted research paper. The results of this project are being leveraged by several ongoing and new research projects.

  3. Water use efficiency and crop water balance of rainfed wheat in a semi-arid environment: sensitivity of future changes to projected climate changes and soil type

    Science.gov (United States)

    Yang, Yanmin; Liu, De Li; Anwar, Muhuddin Rajin; O'Leary, Garry; Macadam, Ian; Yang, Yonghui

    2016-02-01

    Wheat production is expected to be affected by climate change through changes in components of the crop water balance such as rainfall, evapotranspiration (ET), runoff and drainage. We used the Agricultural Production Systems Simulator (APSIM)-wheat model to simulate the potential impact of climate change on the field water balance, ET and water use efficiency (WUE) under the SRES A2 emissions scenario. We ran APSIM with daily climate data statistically downscaled from 18 Global Circulation Models (GCMs). Twelve soil types of varying plant available water holding capacity (PAWC) at six sites across semi-arid southeastern Australia were considered. The GCM-simulated climate data were bias-corrected against observations for the 1961-1999 baseline period. However, biases in the APSIM output data relative to APSIM simulations forced with climate observations remained, so a secondary bias correction was performed on the APSIM outputs. Bias-corrected APSIM outputs for a future period (2021-2040) were compared with APSIM outputs generated using observations for the baseline period to obtain future changes. The results show that effective rainfall decreased at all sites due to decreased growing-season rainfall. ET decreased through reduced soil evaporation and crop transpiration. There were no significant changes in runoff at any site. The variation in deep drainage between sites was much greater than for runoff, ranging from less than a few millimetres at the drier sites to over 100 mm at the wetter ones. In general, however, drainage averaged over different soil types was not significantly different between the baseline (1961-1999) and the future period 2021-2040 (P > 0.05). For the wetter sites, the variations in the future changes in drainage and runoff between the 18 GCMs were larger than at the drier sites. At the dry sites, the variation in drainage decreased as PAWC increased. Overall, water use efficiency based on transpiration (WUE
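
    The secondary bias correction mentioned above can be illustrated with a simple mean-and-variance scaling of the GCM-driven model output against the observation-driven baseline run. The sketch below shows that generic recipe with synthetic numbers; the paper's actual correction procedure may differ in detail.

      """Generic sketch of a secondary bias correction of model outputs: rescale the
      GCM-driven simulation so its baseline mean and variance match the simulation
      driven by observed climate. Synthetic data; the paper's exact method may differ."""
      import numpy as np


      def bias_correct(gcm_baseline, gcm_future, obs_baseline):
          """Mean/variance scaling: map GCM-driven output onto the observed-driven scale."""
          scale = np.std(obs_baseline) / np.std(gcm_baseline)
          return np.mean(obs_baseline) + scale * (gcm_future - np.mean(gcm_baseline))


      rng = np.random.default_rng(42)
      obs_et = 300 + 40 * rng.standard_normal(39)          # APSIM ET driven by observations, 1961-1999 (synthetic)
      gcm_et_base = 320 + 55 * rng.standard_normal(39)     # APSIM ET driven by one GCM, same period (synthetic)
      gcm_et_future = 310 + 55 * rng.standard_normal(20)   # APSIM ET for 2021-2040 (synthetic)

      corrected_future = bias_correct(gcm_et_base, gcm_et_future, obs_et)
      change = corrected_future.mean() - obs_et.mean()
      print(f"projected change in mean ET: {change:+.1f} mm")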

  4. Computational Solutions for Today’s Navy: New Methods are Being Employed to Meet the Navy’s Changing Software-Development Environment

    Science.gov (United States)

    2008-03-01

    software-development environment. Frank W. Bentrem, Ph.D., John T. Sample, Ph.D., and Michael M. Harris. The Naval Research Laboratory (NRL) is the... sonars (Through-the-Sensor technology), supercomputer-generated numerical models, and historical/climatological databases. It uses a variety of

  5. An investigation into the challenges facing the future provision of continuing professional development for allied health professionals in a changing healthcare environment

    International Nuclear Information System (INIS)

    Gibbs, Vivien

    2011-01-01

    This paper outlines current challenges facing healthcare providers and education providers in trying to ensure Allied Health Professionals (AHPs) are fit for practice, in a climate driven by financial constraints and service improvement directives from the Department of Health (DH). Research was undertaken in 2009 to investigate the current provision of Continuing Professional Development (CPD) in the southwest region of England. The purpose was to define exactly what problems existed with this provision, and to propose changes which could be implemented in order to ensure that the provision meets the needs of stakeholders in future years.

  6. SOFTWARE FOR SUPERCOMPUTER SKIF “ProLit-lC” and “ProNRS-lC” FOR FOUNDRY AND METALLURGICAL PRODUCTIONS

    Directory of Open Access Journals (Sweden)

    A. N. Chichko

    2008-01-01

    Full Text Available The results of modelling the technological process of mold filling on the SKIF supercomputer system by means of the computer system 'ProLIT-lc', and also of modelling the steel pouring process by means of 'ProNRS-lc', are presented. The influence of the number of processors of the multinuclear SKIF computer system on the speed-up and time of modelling of the technological processes connected with the production of castings and slugs is shown.

  7. A complete implementation of the conjugate gradient algorithm on a reconfigurable supercomputer

    International Nuclear Information System (INIS)

    Dubois, David H.; Dubois, Andrew J.; Connor, Carolyn M.; Boorman, Thomas M.; Poole, Stephen W.

    2008-01-01

    The conjugate gradient is a prominent iterative method for solving systems of sparse linear equations. Large-scale scientific applications often utilize a conjugate gradient solver at their computational core. In this paper we present a field programmable gate array (FPGA) based implementation of a double precision, non-preconditioned, conjugate gradient solver for finite-element or finite-difference methods. Our work utilizes the SRC Computers, Inc. MAPStation hardware platform along with the 'Carte' software programming environment to ease the programming workload when working with the hybrid (CPU/FPGA) environment. The implementation is designed to handle large sparse matrices of up to order N x N where N <= 116,394, with up to 7 non-zero, 64-bit elements per sparse row. This implementation utilizes an optimized sparse matrix-vector multiply operation, which is critical for obtaining high performance. Direct parallel implementations of loop unrolling and loop fusion are utilized to extract performance from the various vector/matrix operations. Rather than utilize the FPGA devices as function off-load accelerators, our implementation uses the FPGAs to implement the core conjugate gradient algorithm. Measured run-time performance data is presented comparing the FPGA implementation to a software-only version, showing that the FPGA can outperform processors running at up to 30x the clock rate. In conclusion we take a look at the new SRC-7 system and estimate the performance of this algorithm on that architecture.
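
    For reference, the algorithm implemented on the FPGA is the standard non-preconditioned conjugate gradient iteration, whose cost is dominated by the sparse matrix-vector product highlighted above. A software sketch is given below, with a 1D Poisson matrix standing in for the finite-difference system (an assumption for illustration, not the authors' test case).

      """Software reference for the kernel described above: a non-preconditioned
      conjugate gradient solver whose cost is dominated by the sparse matrix-vector
      product. A 1D Poisson matrix stands in for the finite-difference system."""
      import numpy as np
      import scipy.sparse as sp


      def conjugate_gradient(A, b, tol=1e-10, max_iter=20_000):
          x = np.zeros_like(b)
          r = b - A @ x                 # residual
          p = r.copy()                  # search direction
          rs_old = r @ r
          for _ in range(max_iter):
              Ap = A @ p                # sparse matrix-vector multiply: the hot spot
              alpha = rs_old / (p @ Ap)
              x += alpha * p
              r -= alpha * Ap
              rs_new = r @ r
              if np.sqrt(rs_new) < tol:
                  break
              p = r + (rs_new / rs_old) * p
              rs_old = rs_new
          return x


      n = 1_000
      A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
      b = np.ones(n)
      x = conjugate_gradient(A, b)
      print("residual norm:", np.linalg.norm(b - A @ x))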

  8. E-health systems for management of MDR-TB in resource-poor environments: a decade of experience and recommendations for future work.

    Science.gov (United States)

    Fraser, Hamish S F; Habib, Ali; Goodrich, Mark; Thomas, David; Blaya, Joaquin A; Fils-Aime, Joseph Reginald; Jazayeri, Darius; Seaton, Michael; Khan, Aamir J; Choi, Sharon S; Kerrison, Foster; Falzon, Dennis; Becerra, Mercedes C

    2013-01-01

    Multi-drug resistant TB (MDR-TB) is a complex infectious disease that is a growing threat to global health. It requires lengthy treatment with multiple drugs and specialized laboratory testing. To effectively scale up treatment to thousands of patients requires good information systems to support clinical care, reporting, drug forecasting, supply chain management and monitoring. Over the last decade we have developed the PIH-EMR electronic medical record system, and subsequently OpenMRS-TB, to support the treatment of MDR-TB in Peru, Haiti, Pakistan, and other resource-poor environments. We describe here the experience with implementing these systems and evaluating many aspects of their performance, and review other systems for MDR-TB management. We recommend a new approach to information systems to address the barriers to scale up MDR-TB treatment, particularly access to the appropriate drugs and lab data. We propose moving away from fragmented, vertical systems to focus on common platforms, addressing all stages of TB care, support for open data standards and interoperability, care for a wide range of diseases including HIV, integration with mHealth applications, and ability to function in resource-poor environments.

  9. The road towards an energy-efficient future. Report to the Ministerial Conference 'Environment for Europe', Kiev, Ukraine, May 21-23, 2003

    International Nuclear Information System (INIS)

    2003-01-01

    The report gives a comprehensive overview of progress made in improving energy efficiency by UN-ECE and PEEREA participating states since the last 'Environment for Europe' Ministerial in Aarhus, Denmark in 1998. Energy efficiency is assessed in the report from the angle of its contribution to addressing climate change, increasing the security of supply, and supporting restructuring in transition economies. In particular, the report assesses the changing environment in which governments are now required to pursue energy efficiency objectives, within the context of energy market liberalisation in many parts of the UN-ECE constituency, and draws some conclusions as to priority sectors of the economy that should be focused on in terms of energy efficiency gains. The role of the Energy Charter Protocol on Energy Efficiency and Related Environmental Aspects (PEEREA) as a vehicle for sharing best practice recommendations and advice on energy efficiency policies among governments in the Eurasian area is underlined, and was also explicitly recognised in the Statement on Energy Efficiency adopted by the Kiev Ministerial Conference

  10. Performance characteristics of hybrid MPI/OpenMP implementations of NAS parallel benchmarks SP and BT on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu; Taylor, Valerie

    2011-01-01

    The NAS Parallel Benchmarks (NPB) are well-known applications with fixed algorithms for evaluating parallel systems and tools. Multicore supercomputers provide a natural programming paradigm for hybrid programs, whereby OpenMP can be used for data sharing among the cores that comprise a node and MPI can be used for communication between nodes. In this paper, we use the SP and BT benchmarks of MPI NPB 3.3 as a basis for a comparative approach to implement hybrid MPI/OpenMP versions of SP and BT. In particular, we compare the performance of the hybrid SP and BT with their MPI counterparts on large-scale multicore supercomputers. Our performance results indicate that the hybrid SP outperforms the MPI SP by up to 20.76%, and the hybrid BT outperforms the MPI BT by up to 8.58%, on up to 10,000 cores on BlueGene/P at Argonne National Laboratory and Jaguar (Cray XT4/5) at Oak Ridge National Laboratory. We also use performance tools and MPI trace libraries available on these supercomputers to further investigate the performance characteristics of the hybrid SP and BT.
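
    The hybrid pattern evaluated here places MPI ranks for inter-node halo exchange and uses shared-memory threads within a node. A rough Python analogue is sketched below with mpi4py and a thread pool standing in for OpenMP (NumPy releases the GIL during array operations, so threads can overlap work); the 1D decomposition and smoothing kernel are illustrative assumptions, not the NPB SP/BT kernels.

      """Rough analogue of the hybrid MPI+threads pattern discussed above.
      Run with e.g.: mpirun -n 4 python hybrid_sketch.py
      The kernel and decomposition are illustrative assumptions."""
      from concurrent.futures import ThreadPoolExecutor

      import numpy as np
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      local = np.full(1_000_000, float(rank))   # this rank's slice of a periodic 1D field


      def smooth_chunk(chunk):
          """Per-thread work: a 3-point average on the interior of the chunk."""
          out = chunk.copy()
          out[1:-1] = (chunk[:-2] + chunk[1:-1] + chunk[2:]) / 3.0
          return out


      for _ in range(10):
          # "MPI between nodes": exchange boundary values with neighbouring ranks.
          left, right = (rank - 1) % size, (rank + 1) % size
          halo_left = comm.sendrecv(local[-1], dest=right, source=left)   # left neighbour's last cell
          halo_right = comm.sendrecv(local[0], dest=left, source=right)   # right neighbour's first cell

          # "OpenMP within a node": threads smooth independent chunks of the local slice.
          chunks = np.array_split(local, 8)
          with ThreadPoolExecutor(max_workers=8) as pool:
              local = np.concatenate(list(pool.map(smooth_chunk, chunks)))

          # Blend the received halo values into the slice ends.
          local[0] = (halo_left + local[0] + local[1]) / 3.0
          local[-1] = (local[-2] + local[-1] + halo_right) / 3.0

      print(rank, local.mean())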

  11. Supercomputations and big-data analysis in strong-field ultrafast optical physics: filamentation of high-peak-power ultrashort laser pulses

    Science.gov (United States)

    Voronin, A. A.; Panchenko, V. Ya; Zheltikov, A. M.

    2016-06-01

    High-intensity ultrashort laser pulses propagating in gas media or in condensed matter undergo complex nonlinear spatiotemporal evolution where temporal transformations of optical field waveforms are strongly coupled to an intricate beam dynamics and ultrafast field-induced ionization processes. At the level of laser peak powers orders of magnitude above the critical power of self-focusing, the beam exhibits modulation instabilities, producing random field hot spots and breaking up into multiple noise-seeded filaments. This problem is described by a (3+1)-dimensional nonlinear field evolution equation, which needs to be solved jointly with the equation for ultrafast ionization of a medium. Analysis of this problem, which is equivalent to solving a billion-dimensional evolution problem, is only possible by means of supercomputer simulations augmented with coordinated big-data processing of large volumes of information acquired through theory-guiding experiments and supercomputations. Here, we review the main challenges of supercomputations and big-data processing encountered in strong-field ultrafast optical physics and discuss strategies to confront these challenges.
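
    The field evolution equations used in such filamentation studies are typically solved with split-step spectral methods: the linear (diffraction/dispersion) part is advanced in Fourier space and the nonlinear part in real space. The sketch below shows a heavily reduced, one-dimensional version with Kerr self-focusing only and assumed dimensionless parameters; the full (3+1)-dimensional problem coupled to ionization is what pushes it to supercomputer scale.

      """Heavily reduced illustration of the field-evolution solvers described above:
      a 1D split-step (Fourier) scheme for an NLSE-type envelope equation with Kerr
      self-focusing only. All parameters are illustrative, dimensionless assumptions."""
      import numpy as np

      nx, dx, dz, n_steps = 1024, 0.05, 1e-3, 500
      kerr = 1.0                                   # dimensionless Kerr coefficient (assumed)

      x = (np.arange(nx) - nx // 2) * dx
      kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=dx)
      field = np.exp(-x**2) * (1.0 + 0j)           # Gaussian beam envelope

      half_diffraction = np.exp(-0.5j * kx**2 * dz / 2.0)   # half-step linear propagator

      for _ in range(n_steps):
          # Strang splitting: half diffraction, full nonlinearity, half diffraction.
          field = np.fft.ifft(half_diffraction * np.fft.fft(field))
          field *= np.exp(1j * kerr * np.abs(field) ** 2 * dz)
          field = np.fft.ifft(half_diffraction * np.fft.fft(field))

      print("peak intensity after propagation:", np.abs(field).max() ** 2)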

  12. Performance characteristics of hybrid MPI/OpenMP implementations of NAS parallel benchmarks SP and BT on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu

    2011-03-29

    The NAS Parallel Benchmarks (NPB) are well-known applications with fixed algorithms for evaluating parallel systems and tools. Multicore supercomputers provide a natural programming paradigm for hybrid programs, whereby OpenMP can be used for data sharing among the cores that comprise a node and MPI can be used for communication between nodes. In this paper, we use the SP and BT benchmarks of MPI NPB 3.3 as a basis for a comparative approach to implement hybrid MPI/OpenMP versions of SP and BT. In particular, we compare the performance of the hybrid SP and BT with their MPI counterparts on large-scale multicore supercomputers. Our performance results indicate that the hybrid SP outperforms the MPI SP by up to 20.76%, and the hybrid BT outperforms the MPI BT by up to 8.58%, on up to 10,000 cores on BlueGene/P at Argonne National Laboratory and Jaguar (Cray XT4/5) at Oak Ridge National Laboratory. We also use performance tools and MPI trace libraries available on these supercomputers to further investigate the performance characteristics of the hybrid SP and BT.

  13. The ASCI Network for SC '99: A Step on the Path to a 100 Gigabit Per Second Supercomputing Network

    Energy Technology Data Exchange (ETDEWEB)

    PRATT,THOMAS J.; TARMAN,THOMAS D.; MARTINEZ,LUIS M.; MILLER,MARC M.; ADAMS,ROGER L.; CHEN,HELEN Y.; BRANDT,JAMES M.; WYCKOFF,PETER S.

    2000-07-24

    This document highlights the DISCOM2 distance computing and communication team's activities at the 1999 Supercomputing conference in Portland, Oregon. This conference is sponsored by the IEEE and ACM. Sandia, Lawrence Livermore and Los Alamos National Laboratories have participated in this conference for eleven years. For the last four years the three laboratories have come together at the conference under the DOE's ASCI (Accelerated Strategic Computing Initiative) rubric. Communication support for the ASCI exhibit is provided by the ASCI DISCOM2 project. The DISCOM2 communication team uses this forum to demonstrate and focus communication and networking developments within the community. At SC 99, DISCOM built a prototype of the next-generation ASCI network, demonstrated remote clustering techniques, demonstrated the capabilities of the emerging terabit router products, demonstrated the latest technologies for delivering visualization data to scientific users, and demonstrated the latest in encryption methods including IP VPN technologies and ATM encryption research. The authors also coordinated the other production networking activities within the booth and between their demonstration partners on the exhibit floor. This paper documents those accomplishments, discusses the details of their implementation, and describes how these demonstrations support Sandia's overall strategies in ASCI networking.

  14. Getting To Exascale: Applying Novel Parallel Programming Models To Lab Applications For The Next Generation Of Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Dube, Evi [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Shereda, Charles [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Nau, Lee [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Harris, Lance [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2010-09-27

    As supercomputing moves toward exascale, node architectures will change significantly. CPU core counts on nodes will increase by an order of magnitude or more. Heterogeneous architectures will become more commonplace, with GPUs or FPGAs providing additional computational power. Novel programming models may make better use of on-node parallelism in these new architectures than do current models. In this paper we examine several of these novel models – UPC, CUDA, and OpenCL – to determine their suitability to LLNL scientific application codes. Our study consisted of several phases: we conducted interviews with code teams and selected two codes to port; we learned how to program in the new models and ported the codes; we debugged and tuned the ported applications; and we measured results and documented our findings. We conclude that UPC is a challenge for porting code, Berkeley UPC is not very robust, and UPC is not suitable as a general alternative to OpenMP for a number of reasons. CUDA is well supported and robust but is a proprietary NVIDIA standard, while OpenCL is an open standard. Both are well suited to a specific set of application problems that can be run on GPUs, but some problems are not suited to GPUs. Further study of the landscape of novel models is recommended.

  15. AUTODYN - an interactive non-linear dynamic analysis program for microcomputers through supercomputers

    International Nuclear Information System (INIS)

    Birnbaum, N.K.; Cowler, M.S.; Itoh, M.; Katayama, M.; Obata, H.

    1987-01-01

    AUTODYN uses a two-dimensional coupled finite-difference approach similar to the one described by Cowler and Hancock (1979). Both translational and axial symmetry are treated. The scheme allows alternative numerical processors to be selectively used to model different components/regions of a problem. Finite-difference grids operated on by these processors can be coupled together in space and time to efficiently compute structural (or fluid-structure) interactions. AUTODYN currently includes a Lagrange processor for modeling solid continua and structures, an Euler processor for modeling fluids and the large distortion of solids, an ALE (Arbitrary Lagrange Euler) processor for specialized flow models and a shell processor for modeling thin structures. At present, all four processors use explicit time integration but implicit options will be added to the Lagrange and ALE processors in the near future. Material models are included for solids, liquids and gases (including HE detonation products). (orig.)

  16. Advances in Supercomputing for the Modeling of Atomic Processes in Plasmas

    International Nuclear Information System (INIS)

    Ludlow, J. A.; Ballance, C. P.; Loch, S. D.; Lee, T. G.; Pindzola, M. S.; Griffin, D. C.; McLaughlin, B. M.; Colgan, J.

    2009-01-01

    An overview will be given of recent atomic and molecular collision methods developed to take advantage of modern massively parallel computers. The focus will be on direct solutions of the time-dependent Schroedinger equation for simple systems using large numerical lattices, as found in the time-dependent close-coupling method, and for configuration interaction solutions of the time-independent Schroedinger equation for more complex systems using large numbers of basis functions, as found in the R-matrix with pseudo-states method. Results from these large scale calculations are extremely useful in benchmarking less accurate theoretical methods and experimental data. To take full advantage of future petascale and exascale computing resources, it appears that even finer grain parallelism will be needed.
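
    The lattice approach mentioned above can be illustrated, at a vastly reduced scale, by propagating a single-particle wave packet on a one-dimensional grid with the unitary Crank-Nicolson scheme. The sketch below assumes a soft-core potential and atomic units; the actual close-coupling calculations are multi-electron and require the massively parallel resources described.

      """One-dimensional, single-particle illustration of direct time-dependent
      Schroedinger propagation on a numerical lattice (Crank-Nicolson), in the spirit
      of the lattice methods described above. Potential and parameters are assumed."""
      import numpy as np
      import scipy.sparse as sp
      import scipy.sparse.linalg as spla

      nx, dx, dt, n_steps = 2000, 0.1, 0.01, 500
      x = (np.arange(nx) - nx // 2) * dx

      # Hamiltonian H = -0.5 d^2/dx^2 + V(x) on the lattice (finite differences).
      V = -1.0 / np.sqrt(x**2 + 2.0)                     # soft-core Coulomb potential (assumed)
      lap = sp.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(nx, nx)) / dx**2
      H = -0.5 * lap + sp.diags(V)

      # Crank-Nicolson propagator: (I + i dt H / 2) psi_new = (I - i dt H / 2) psi_old
      I = sp.identity(nx, dtype=complex)
      A = (I + 0.5j * dt * H).tocsc()
      B = (I - 0.5j * dt * H).tocsr()
      solve = spla.factorized(A)                         # pre-factorize the sparse system

      psi = np.exp(-(x - 5.0) ** 2 + 1j * 1.0 * x)       # Gaussian wave packet with momentum k = 1
      psi /= np.linalg.norm(psi)

      for _ in range(n_steps):
          psi = solve(B @ psi)

      print("norm conservation check:", np.linalg.norm(psi))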

  17. Recent CFD Simulations of turbulent reactive flows with supercomputing for hydrogen safety

    International Nuclear Information System (INIS)

    Rehm, W.

    2001-01-01

    This paper describes the R and D work performed within the scope of joint project activities concerning the numerical simulation of reacting flow in complex geometries. The aim is the refinement of numerical methods used in computational fluid dynamics (CFD) by introducing high-performance computations (HPC) to analyse explosion processes in technical systems in more detail. Application examples concern conventional and nuclear energy systems, especially the safety aspects of future hydrogen technology. The project work is mainly focused on the modelling of the accident-related behaviour of hydrogen in safety enclosures regarding the distribution and combustion of burnable gas mixtures, ranging from slow to fast or even rapid flames. For fire and explosion protection, special models and criteria are being developed for the assessment of adequate safety measures to control deflagration-to-detonation transition (DDT) processes. Therefore, the physical mixing concept with dilution and inertization media is studied and recommended. (orig.) [de]

  18. Changing the Learning Environment in the College of Engineering and Applied Science: The impact of Educational Training on Future Faculty and Student- Centered Pedagogy on Undergraduate Students

    Science.gov (United States)

    Gaskins, Whitney

    Over the past 20 years there have been many changes to the primary and secondary educational system that have impacted students, teachers, and post-secondary institutions across the United States of America. One of the most important is the large number of standardized tests students are required to take to show adequate performance in school. Students think differently because they are taught differently due to this focus on standardized testing, thus changing the skill sets students acquire in secondary school. This presents a critical problem for colleges and universities, as they now are using practices for and have expectations of these students that are unrealistic for the changing times. High dropout rates in the College of Engineering have been attributed to the cultural atmosphere of the institution. Students have reported a low sense of belonging and low relatability to course material. This study developed a "preparing the future" faculty program that gave graduate students at the University of Cincinnati a unique training experience that helped them understand the students they will educate. They received educational training, developed from a future educator's curriculum that covered classroom management, standards, and pedagogy. Graduate students who participated in the training program reported increases in self-efficacy and student understanding. To reduce negative experiences and increase motivation, Challenge Based Learning (CBL) was introduced in an undergraduate Basic Electric Circuits (BEC) course. CBL is a structured model for course content with a foundation in problem-based learning. CBL offers general concepts from which students derive the challenges they will address. Results show an improved classroom experience for students who were taught with CBL.

  19. Assessing mobile food vendors (a.k.a. street food vendors)--methods, challenges, and lessons learned for future food-environment research.

    Science.gov (United States)

    Lucan, S C; Varona, M; Maroko, A R; Bumol, J; Torrens, L; Wylie-Rosett, J

    2013-08-01

    Mobile food vendors (also known as street food vendors) may be important sources of food, particularly in minority and low-income communities. Unfortunately, there are no good data sources on where, when, or what vendors sell. The lack of a published assessment method may contribute to the relative exclusion of mobile food vendors from existing food-environment research. A goal of this study was to develop, pilot, and refine a method to assess mobile food vendors. Cross-sectional assessment of mobile food vendors through direct observations and brief interviews. Using printed maps, investigators canvassed all streets in Bronx County, NY (excluding highways but including entrance and exit ramps) in 2010, looking for mobile food vendors. For each vendor identified, researchers recorded a unique identifier, the vendor's location, and direct observations. Investigators also recorded vendors' answers to where, when, and what they sold. Of 372 identified vendors, 38% did not answer brief-interview questions (19% were 'in transit', 15% refused; others were absent from their carts/trucks/stands or with customers). About 7% of vendors who ultimately answered questions were reluctant to engage with researchers. Some vendors expressed concerns about regulatory authority; only 34% of vendors had visible permits or licenses and many vendors had improvised illegitimate-appearing set-ups. The majority of vendors (75% of those responding) felt most comfortable speaking Spanish; 5% preferred other non-English languages. Nearly a third of vendors changed selling locations (streets, neighbourhoods, boroughs) day-to-day or even within a given day. There was considerable variability in times (hours, days, months) in which vendors reported doing business; for 86% of vendors, weather was a deciding factor. Mobile food vendors have a variable and fluid presence in an urban environment. Variability in hours and locations, having most comfort with languages other than English, and reluctance

  20. Assessing mobile food vendors (a.k.a. street food vendors)—methods, challenges, and lessons learned for future food-environment research

    Science.gov (United States)

    Lucan, Sean C.; Varona, Monica; Maroko, Andrew R.; Bumol, Joel; Torrens, Luis; Wylie-Rosett, Judith

    2013-01-01

    OBJECTIVES Mobile food vendors (also known as street food vendors) may be important sources of food, particularly in minority and low-income communities. Unfortunately, there are no good data sources on where, when, or what vendors sell. The lack of a published assessment method may contribute to the relative exclusion of mobile food vendors from existing food-environment research. A goal of this study was to develop, pilot, and troubleshoot a method to assess mobile food vendors. STUDY DESIGN Cross-sectional assessment of mobile food vendors through direct observations and brief interviews. METHODS Using printed maps, investigators canvassed all streets in Bronx County, NY (excluding highways but including entrance and exit ramps) in 2010, looking for mobile food vendors. For each vendor identified, researchers recorded a unique identifier, the vendor’s location, and direct observations. Investigators also recorded vendors’ answers to where, when, and what they sold. RESULTS Of 372 identified vendors, 38% did not answer brief-interview questions (19% were “in transit”, 15% refused; others were absent from their carts/trucks/stands or with customers). About 7% of vendors who ultimately answered questions were reluctant to engage with researchers. Some vendors expressed concerns about regulatory authority; only 34% of vendors had visible permits or licenses and many vendors had improvised illegitimate-appearing set-ups. The majority of vendors (75% of those responding) felt most comfortable speaking Spanish; 5% preferred other non-English languages. Nearly a third of vendors changed selling locations (streets, neighborhoods, boroughs) day-to-day or even within a given day. There was considerable variability in times (hours, days, months) in which vendors reported doing business; for 86% of vendors, weather was a deciding factor. CONCLUSIONS Mobile food vendors have a variable and fluid presence in an urban environment. Variability in hours and locations, having

  1. Results of induced atmosphere measurements from the Apollo program. [possible effects of the induced environment in the vicinity of manned spacecraft on future manned laboratory experiments

    Science.gov (United States)

    Naumann, R. J.

    1974-01-01

    Experiments on Apollo missions 15, 16, and 17 were utilized in an attempt to learn about the induced environment in the vicinity of manned spacecraft. Photographic sequences were examined to obtain scattered light data from the spacecraft-generated particulates during quiescence periods and after liquid dumps. The results allowed estimates of the obscuration factor and the clearing times after dumps. It was found that the clearing times were substantially longer than anticipated. The mass spectrometer detected a high molecular flux in lunar orbit which was induced by the spacecraft. It is shown that this is most likely caused by small ice crystals being continually produced in lunar orbit. Other data from the ultraviolet spectrometer and the stellar camera are also analyzed, and estimated values or upper limits are placed on the total scattering background, the size and number of particles generated, the velocity range, and the column density.

  2. Public (Q)SAR Services, Integrated Modeling Environments, and Model Repositories on the Web: State of the Art and Perspectives for Future Development.

    Science.gov (United States)

    Tetko, Igor V; Maran, Uko; Tropsha, Alexander

    2017-03-01

    Thousands of (Quantitative) Structure-Activity Relationships (Q)SAR models have been described in peer-reviewed publications; however, this way of sharing seldom makes models available for use by the research community outside the developer's laboratory. Conversely, on-line models allow broad dissemination and application, representing the most effective way of sharing scientific knowledge. Approaches for sharing and providing on-line access to models range from web services created by individual users and laboratories to integrated modeling environments and model repositories. This emerging transition from the descriptive and informative, but "static", and for the most part, non-executable print format to interactive, transparent and functional delivery of "living" models is expected to have a transformative effect on modern experimental research in areas of scientific and regulatory use of (Q)SAR models. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Defining the cellular environment in the organ of Corti following extensive hair cell loss: a basis for future sensory cell replacement in the Cochlea.

    Directory of Open Access Journals (Sweden)

    Ruth R Taylor

    Full Text Available BACKGROUND: Following the loss of hair cells from the mammalian cochlea, the sensory epithelium repairs to close the lesions but no new hair cells arise and hearing impairment ensues. For any cell replacement strategy to be successful, the cellular environment of the injured tissue has to be able to nurture new hair cells. This study defines characteristics of the auditory sensory epithelium after hair cell loss. METHODOLOGY/PRINCIPAL FINDINGS: Studies were conducted in C57BL/6 and CBA/Ca mice. Treatment with an aminoglycoside-diuretic combination produced loss of all outer hair cells within 48 hours in both strains. The subsequent progressive tissue re-organisation was examined using immunohistochemistry and electron microscopy. There was no evidence of significant de-differentiation of the specialised columnar supporting cells. Kir4.1 was down regulated but KCC4, GLAST, microtubule bundles, connexin expression patterns and pathways of intercellular communication were retained. The columnar supporting cells became covered with non-specialised cells migrating from the outermost region of the organ of Corti. Eventually non-specialised, flat cells replaced the columnar epithelium. Flat epithelium developed in distributed patches interrupting regions of columnar epithelium formed of differentiated supporting cells. Formation of the flat epithelium was initiated within a few weeks post-treatment in C57BL/6 mice but not for several months in CBA/Ca's, suggesting genetic background influences the rate of re-organisation. CONCLUSIONS/SIGNIFICANCE: The lack of dedifferentiation amongst supporting cells and their replacement by cells from the outer side of the organ of Corti are factors that may need to be considered in any attempt to promote endogenous hair cell regeneration. The variability of the cellular environment along an individual cochlea arising from patch-like generation of flat epithelium, and the possible variability between individuals

  4. Future directions in shielding methods and analysis

    International Nuclear Information System (INIS)

    Goldstein, H.

    1987-01-01

    Over the nearly half-century history of shielding against reactor radiation, there has been a see-saw battle between theory and measurement. During that period the capability and accuracy of calculational methods have been enormously improved. The microscopic cross sections needed as input to the theoretical computations are now also known to adequate accuracy (with certain exceptions). Nonetheless, there remain substantial classes of shielding problems not yet accessible to satisfactory computational methods, particularly where three-dimensional geometries are involved. This paper discusses promising avenues to approach such problems, especially in the light of recent and expected advances in supercomputers. In particular, it seems that Monte Carlo methods should be much more advantageous in the new computer environment than they have been in the past.
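
    The abstract's closing point about Monte Carlo methods can be made concrete with a minimal sketch: the toy Python program below follows photon histories through a one-dimensional slab shield, sampling exponential free paths and isotropic scatters to estimate a transmission probability. It is not taken from the paper; the cross section, absorption fraction, slab thickness and history count are assumed, illustrative values.

      import math
      import random

      # Toy Monte Carlo slab-transmission estimate (illustrative parameters only).
      SIGMA_T = 0.5            # total macroscopic cross section, 1/cm (assumed)
      ABSORB_FRACTION = 0.3    # probability that a collision is an absorption (assumed)
      THICKNESS = 10.0         # slab thickness, cm (assumed)
      N_HISTORIES = 100_000

      def transmitted(rng: random.Random) -> bool:
          """Follow one photon history; True if it leaks out of the far face."""
          x, mu = 0.0, 1.0                                        # position and direction cosine
          while True:
              x += mu * (-math.log(1.0 - rng.random()) / SIGMA_T)  # sampled free path
              if x >= THICKNESS:
                  return True                                     # escaped through the far face
              if x < 0.0:
                  return False                                    # backscattered out of the near face
              if rng.random() < ABSORB_FRACTION:
                  return False                                    # absorbed inside the shield
              mu = 2.0 * rng.random() - 1.0                       # isotropic scatter: new direction

      rng = random.Random(42)
      leaked = sum(transmitted(rng) for _ in range(N_HISTORIES))
      print(f"transmission probability ~ {leaked / N_HISTORIES:.4e}")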

  5. Uranium, its impact on the national and global energy mix; and its history, distribution, production, nuclear fuel-cycle, future, and relation to the environment

    Science.gov (United States)

    Finch, Warren Irvin

    1997-01-01

    The many aspects of uranium, a heavy radioactive metal used to generate electricity throughout the world, are briefly described in relatively simple terms intended for the lay reader. An adequate glossary of unfamiliar terms is given. Uranium is a new source of electrical energy developed since 1950, and how we harness energy from it is explained. It competes with the organic coal, oil, and gas fuels as shown graphically. Uranium resources and production for the world are tabulated and discussed by country and for various energy regions in the United States. Locations of major uranium deposits and power reactors in the United States are mapped. The nuclear fuel-cycle of uranium for a typical light-water reactor is illustrated at the front end (beginning with its natural geologic occurrence in rocks through discovery, mining, and milling; separation of the scarce isotope U-235; its enrichment; and manufacture into fuel rods for power reactors to generate electricity) and at the back end (the reprocessing and handling of the spent fuel). Environmental concerns with the entire fuel cycle are addressed. The future of the use of uranium in new, simplified, 'passively safe' reactors for the utility industry is examined. The present resource assessment of uranium in the United States is out of date, and a new assessment could aid the domestic uranium industry.

  6. Health-promoting compounds of broccoli (Brassica oleracea L. var. italica) plants as affected by nitrogen fertilisation in projected future climatic change environments.

    Science.gov (United States)

    Zaghdoud, Chokri; Carvajal, Micaela; Moreno, Diego A; Ferchichi, Ali; Del Carmen Martínez-Ballesta, María

    2016-01-30

    The complex interactions between CO2 increase and salinity were investigated in relation to decreased N supply, in order to determine the nutritional quality of broccoli (Brassica oleracea L. var. italica) plants under these conditions. Three different decreased N fertilisation regimes (NO3(-)/NH4(+) ratios of 100:0, 50:50 and 0:100, respectively) were combined with ambient (380 ppm) and elevated (800 ppm) [CO2] under non-saline (0 mmol L(-1) NaCl) and saline (80 mmol L(-1) NaCl) conditions. Nutrients (minerals, soluble protein and total amino acids) and natural antioxidants (glucosinolates, phenolic acids, flavonoids and vitamin C) were determined. In NH4(+)-fed broccoli plants, a marked growth reduction was shown and a redistribution of amino acids to cope with NH4(+) toxicity resulted in higher levels of indolic glucosinolate and total phenolic compounds. However, the positive effect of the higher [CO2] (ameliorating adverse effects of salinity) was only observed when N was supplied as NO3(-). Under reduced N fertilisation, the total glucosinolates were increased by a decreased NO3(-)/NH4(+) ratio and elevated [CO2] but were unaffected by salinity. Under future climatic challenges, such as increased salinity and elevated [CO2], a clear genotypic dependence of S metabolism was observed in broccoli plants. In addition, an influence of the form in which N was supplied on plant nutritional quality was observed; a combined NO3(-)/NH4(+) (50:50) supply allowed broccoli plants not only to deal with NH4(+) toxicity but also to modify their glucosinolate content and profile. Thus, for different modes of N fertilisation, the interaction with climatic factors must be considered in the search for an optimal balance between yield and nutritional quality. © 2015 Society of Chemical Industry.

  7. Finger Millet: A "Certain" Crop for an "Uncertain" Future and a Solution to Food Insecurity and Hidden Hunger under Stressful Environments.

    Science.gov (United States)

    Gupta, Sanjay Mohan; Arora, Sandeep; Mirza, Neelofar; Pande, Anjali; Lata, Charu; Puranik, Swati; Kumar, J; Kumar, Anil

    2017-01-01

    Crop growth and productivity have largely been vulnerable to various abiotic and biotic stresses that are only set to be compounded due to global climate change. Therefore, developing improved varieties and designing newer approaches for crop improvement against stress tolerance have become a priority nowadays. However, most of the crop improvement strategies are directed toward staple cereals such as rice, wheat, maize etc., whereas attention on minor cereals such as finger millet [Eleusine coracana (L.) Gaertn.] lags far behind. It is an important staple in several semi-arid and tropical regions of the world with excellent nutraceutical properties as well as ensuring food security in these areas even in harsh environments. This review highlights the importance of finger millet as a model nutraceutical crop. Progress and prospects in genetic manipulation for the development of abiotic and biotic stress-tolerant varieties are also discussed. Although limited studies have been conducted for genetic improvement of finger millets, its nutritional significance in providing minerals, calories and protein makes it an ideal model for nutrition-agriculture research. Therefore, improved genetic manipulation of finger millets for resistance to both abiotic and biotic stresses, as well as for enhancing nutrient content, will be very effective in millet improvement. Key message: Apart from the excellent nutraceutical value of finger millet, its ability to tolerate various abiotic stresses and resist pathogens makes it an excellent model for exploring the vast genetic and genomic potential of this crop, which provides us with a wide choice for developing strategies for making climate-resilient staple crops.

  8. Finger Millet: A “Certain” Crop for an “Uncertain” Future and a Solution to Food Insecurity and Hidden Hunger under Stressful Environments

    Science.gov (United States)

    Gupta, Sanjay Mohan; Arora, Sandeep; Mirza, Neelofar; Pande, Anjali; Lata, Charu; Puranik, Swati; Kumar, J.; Kumar, Anil

    2017-01-01

    Crop growth and productivity have largely been vulnerable to various abiotic and biotic stresses that are only set to be compounded due to global climate change. Therefore, developing improved varieties and designing newer approaches for crop improvement against stress tolerance have become a priority nowadays. However, most of the crop improvement strategies are directed toward staple cereals such as rice, wheat, maize etc., whereas attention on minor cereals such as finger millet [Eleusine coracana (L.) Gaertn.] lags far behind. It is an important staple in several semi-arid and tropical regions of the world with excellent nutraceutical properties as well as ensuring food security in these areas even in harsh environments. This review highlights the importance of finger millet as a model nutraceutical crop. Progress and prospects in genetic manipulation for the development of abiotic and biotic stress-tolerant varieties are also discussed. Although limited studies have been conducted for genetic improvement of finger millets, its nutritional significance in providing minerals, calories and protein makes it an ideal model for nutrition-agriculture research. Therefore, improved genetic manipulation of finger millets for resistance to both abiotic and biotic stresses, as well as for enhancing nutrient content, will be very effective in millet improvement. Key message: Apart from the excellent nutraceutical value of finger millet, its ability to tolerate various abiotic stresses and resist pathogens makes it an excellent model for exploring the vast genetic and genomic potential of this crop, which provides us with a wide choice for developing strategies for making climate-resilient staple crops. PMID:28487720

  9. Finger Millet: A “Certain” Crop for an “Uncertain” Future and a Solution to Food Insecurity and Hidden Hunger under Stressful Environments

    Directory of Open Access Journals (Sweden)

    Anil Kumar

    2017-04-01

    Full Text Available Crop growth and productivity have largely been vulnerable to various abiotic and biotic stresses that are only set to be compounded due to global climate change. Therefore, developing improved varieties and designing newer approaches for crop improvement against stress tolerance have become a priority nowadays. However, most of the crop improvement strategies are directed toward staple cereals such as rice, wheat, maize etc., whereas attention on minor cereals such as finger millet [Eleusine coracana (L.) Gaertn.] lags far behind. It is an important staple in several semi-arid and tropical regions of the world with excellent nutraceutical properties as well as ensuring food security in these areas even in harsh environments. This review highlights the importance of finger millet as a model nutraceutical crop. Progress and prospects in genetic manipulation for the development of abiotic and biotic stress-tolerant varieties are also discussed. Although limited studies have been conducted for genetic improvement of finger millets, its nutritional significance in providing minerals, calories and protein makes it an ideal model for nutrition-agriculture research. Therefore, improved genetic manipulation of finger millets for resistance to both abiotic and biotic stresses, as well as for enhancing nutrient content, will be very effective in millet improvement. Key message: Apart from the excellent nutraceutical value of finger millet, its ability to tolerate various abiotic stresses and resist pathogens makes it an excellent model for exploring the vast genetic and genomic potential of this crop, which provides us with a wide choice for developing strategies for making climate-resilient staple crops.

  10. Gas, a decisive pillar of the sustainable future of the world: Contribution of the gas industry to fight against climate change and for sustainable development. LPG: a beneficial solution for the environment

    International Nuclear Information System (INIS)

    Le Gourrierec, Meline

    2015-01-01

    Gas has a crucial role to play in developing energy that is less carbon-intensive and more respectful of the environment. Recognised as being the cleanest fossil energy, its use in different sectors of activity leads to a significant reduction in greenhouse gas emissions. In addition, gas contributes to the development of renewable energies and becomes itself renewable through biomethane production. The development of this green gas based on a circular economy has opened up new prospects for the use of gas in all parts of the world. The 21st Conference of the Parties of the United Nations Framework Convention, to be held in Paris from 30 November to 11 December 2015 (COP21), is a decisive step in the negotiation of the future international agreement on the climate that will enter into force in 2020. The target is ambitious: restricting global warming to below the critical threshold of 2 deg. C by 2100. Aware of the climate challenge and the essential role of all the economic actors, the gas industry has embarked on a series of measures contributing to keeping to this global target and facilitating sustainable development through access to energy that is less carbon-intensive and more respectful of the environment. Changing from a solid fuel to a liquid or gaseous fuel provides modern domestic energy with beneficial effects on the environment and on the quality of life. The World LPG Association has the ambition that a billion people making this transition

  11. New generation of docking programs: Supercomputer validation of force fields and quantum-chemical methods for docking.

    Science.gov (United States)

    Sulimov, Alexey V; Kutov, Danil C; Katkova, Ekaterina V; Ilin, Ivan S; Sulimov, Vladimir B

    2017-11-01

    Discovery of new inhibitors of the protein associated with a given disease is the initial and most important stage of the whole process of the rational development of new pharmaceutical substances. New inhibitors block the active site of the target protein and the disease is cured. Computer-aided molecular modeling can considerably increase the effectiveness of new inhibitor development. Reliable prediction of target protein inhibition by a small molecule (a ligand) is determined by the accuracy of docking programs. Such programs position a ligand in the target protein and estimate the protein-ligand binding energy. Positioning accuracy of modern docking programs is satisfactory. However, the accuracy of binding energy calculations is too low to predict good inhibitors. For effective application of docking programs to new inhibitor development, the accuracy of binding energy calculations should be better than 1 kcal/mol. Reasons for the limited accuracy of modern docking programs are discussed. One of the most important aspects limiting this accuracy is the imperfection of protein-ligand energy calculations. Results of supercomputer validation of several force fields and quantum-chemical methods for docking are presented. The validation was performed by quasi-docking as follows. First, the low energy minima spectra of 16 protein-ligand complexes were found by exhaustive minima search in the MMFF94 force field. Second, the energies of the lowest 8192 minima were recalculated with the CHARMM force field and the PM6-D3H4X and PM7 quantum-chemical methods for each complex. The analysis of minima energies reveals that the docking positioning accuracies of the PM7 and PM6-D3H4X quantum-chemical methods and the CHARMM force field are close to one another and better than the positioning accuracy of the MMFF94 force field. Copyright © 2017 Elsevier Inc. All rights reserved.
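
    A minimal sketch of the re-scoring step in the quasi-docking procedure described above, assuming each ligand-pose minimum is stored with its RMSD to the native pose and one energy per scoring method; the data values, the 2 Angstrom success cutoff and the helper names are hypothetical and not taken from the paper.

      from dataclasses import dataclass
      from typing import Dict, List

      @dataclass
      class Minimum:
          rmsd_to_native: float          # Angstroms from the crystallographic pose
          energies: Dict[str, float]     # scoring method name -> energy, kcal/mol

      def positioning_success(minima: List[Minimum], method: str, cutoff: float = 2.0) -> bool:
          """True if the lowest-energy minimum under `method` is a near-native pose."""
          best = min(minima, key=lambda m: m.energies[method])
          return best.rmsd_to_native <= cutoff

      # Toy re-scored minima for one hypothetical protein-ligand complex.
      minima = [
          Minimum(0.8, {"MMFF94": -45.0, "PM7": -52.0}),
          Minimum(4.1, {"MMFF94": -47.5, "PM7": -49.0}),
          Minimum(6.3, {"MMFF94": -44.0, "PM7": -48.0}),
      ]
      for method in ("MMFF94", "PM7"):
          print(f"{method}: native-like pose ranked lowest -> {positioning_success(minima, method)}")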

  12. Energy futures

    International Nuclear Information System (INIS)

    Treat, J.E.

    1990-01-01

    In this updated text, fifteen of the futures industry's leading authorities provide broader background in both the theory and practice of energy futures trading. The authors review the history of the futures market and the fundamentals of trading, hedging, and technical analysis; then they update you on the newest trends in energy futures trading: natural gas futures, options, regulations, and new information services. The appendices outline examples of possible contracts and their construction.

  13. Futuring for Future Ready Librarians

    Science.gov (United States)

    Figueroa, Miguel A.

    2018-01-01

    Futurists and foresight professionals offer several guiding principles for thinking about the future. These principles can help people to think about the future and become more powerful players in shaping the preferred futures they want for themselves and their communities. The principles also fit in well as strategies to support the Future Ready…

  14. Superconductivity and the environment: a Roadmap

    International Nuclear Information System (INIS)

    Nishijima, Shigehiro; Eckroad, Steven; Marian, Adela; Choi, Kyeongdal; Kim, Woo Seok; Terai, Motoaki; Deng, Zigang; Zheng, Jun; Wang, Jiasu; Umemoto, Katsuya; Du, Jia; Keenan, Shane; Foley, Cathy P; Febvre, Pascal; Mukhanov, Oleg; Cooley, Lance D; Hassenzahl, William V; Izumi, Mitsuru

    2013-01-01

    disasters will be helped by future supercomputer technologies that support huge amounts of data and sophisticated modeling, and with the aid of superconductivity these systems might not require the energy of a large city. We present different sections on applications that could address (or are addressing) a range of environmental issues. The Roadmap covers water purification, power distribution and storage, low-environmental impact transport, environmental sensing (particularly for the removal of unexploded munitions), monitoring the Earth’s magnetic fields for earthquakes and major solar activity, and, finally, developing a petaflop supercomputer that only requires 3% of the current supercomputer power provision while being 50 times faster. Access to fresh water. With only 2.5% of the water on Earth being fresh and climate change modeling forecasting that many areas will become drier, the ability to recycle water and achieve compact water recycling systems for sewage or ground water treatment is critical. The first section (by Nishijima) points to the potential of superconducting magnetic separation to enable water recycling and reuse. Energy. The Equinox Summit held in Waterloo Canada 2011 (2011 Equinox Summit: Energy 2030 http://wgsi.org/publications-resources) identified electricity use as humanity’s largest contributor to greenhouse gas emissions. Our appetite for electricity is growing faster than for any other form of energy. The communiqué from the summit said ‘Transforming the ways we generate, distribute and store electricity is among the most pressing challenges facing society today…. If we want to stabilize CO 2 levels in our atmosphere at 550 parts per million, all of that growth needs to be met by non-carbon forms of energy’ (2011 Equinox Summit: Energy 2030 http://wgsi.org/publications-resources). Superconducting technologies can provide the energy efficiencies to achieve, in the European Union alone, 33–65% of the required reduction in

  15. Superconductivity and the environment: a Roadmap

    Science.gov (United States)

    Nishijima, Shigehiro; Eckroad, Steven; Marian, Adela; Choi, Kyeongdal; Kim, Woo Seok; Terai, Motoaki; Deng, Zigang; Zheng, Jun; Wang, Jiasu; Umemoto, Katsuya; Du, Jia; Febvre, Pascal; Keenan, Shane; Mukhanov, Oleg; Cooley, Lance D.; Foley, Cathy P.; Hassenzahl, William V.; Izumi, Mitsuru

    2013-11-01

    disasters will be helped by future supercomputer technologies that support huge amounts of data and sophisticated modeling, and with the aid of superconductivity these systems might not require the energy of a large city. We present different sections on applications that could address (or are addressing) a range of environmental issues. The Roadmap covers water purification, power distribution and storage, low-environmental impact transport, environmental sensing (particularly for the removal of unexploded munitions), monitoring the Earth’s magnetic fields for earthquakes and major solar activity, and, finally, developing a petaflop supercomputer that only requires 3% of the current supercomputer power provision while being 50 times faster. Access to fresh water. With only 2.5% of the water on Earth being fresh and climate change modeling forecasting that many areas will become drier, the ability to recycle water and achieve compact water recycling systems for sewage or ground water treatment is critical. The first section (by Nishijima) points to the potential of superconducting magnetic separation to enable water recycling and reuse. Energy. The Equinox Summit held in Waterloo Canada 2011 (2011 Equinox Summit: Energy 2030 http://wgsi.org/publications-resources) identified electricity use as humanity’s largest contributor to greenhouse gas emissions. Our appetite for electricity is growing faster than for any other form of energy. The communiqué from the summit said ‘Transforming the ways we generate, distribute and store electricity is among the most pressing challenges facing society today…. If we want to stabilize CO2 levels in our atmosphere at 550 parts per million, all of that growth needs to be met by non-carbon forms of energy’ (2011 Equinox Summit: Energy 2030 http://wgsi.org/publications-resources). Superconducting technologies can provide the energy efficiencies to achieve, in the European Union alone, 33-65% of the required reduction in greenhouse

  16. Cities and environment. Indicators of environmental performance in the 'Cities of the future'; Byer og miljoe : indikatorer for miljoeutviklingen i 'Framtidens byer'

    Energy Technology Data Exchange (ETDEWEB)

    Haagensen, Trine

    2012-07-15

    This report contains selected indicators and statistics that describe the urban environmental status and development in 13 of the largest municipalities in Norway. These cities are part of the program 'Cities of the Future' agreed upon between 13 cities, the private sector and the state, led by the Ministry of the Environment. Cities of the Future had about 1.7 million inhabitants (as of 1 January 2010), equivalent to about 1/3 of the population in Norway. In 2009 the population growth in these municipalities was about 49 per cent of the total population growth. Some of the greatest challenges to combine urban development with environmental considerations are therefore found here. The white paper no. 26 (2006-2007) The government's environmental policy and the state of the environment in Norway, has also added to the importance of the urban environment with a comprehensive description of the land use and transport policy. Good land use management contains indicators related to the density of land use and construction activities within urban settlements. Within urban settlements, the area per inhabitant decreased both within the Cities of the Future and in all municipalities in Norway (2000-2009). The coalescing within the urban settlements decreased per inhabitant (2004-2009), which means that new buildings have been built outside already established urban settlements in this period. Too high density of built-up areas may be at the expense of access to playgrounds, recreational areas or touring grounds, indicators of the population's access to these areas show that there has been a reduction in access in the Cities of the Future as for the municipalities in Norway. Within transport, the focus is on the degree to which the inhabitants choose to use environmentally-friendly transportation instead of cars. Only Oslo has more than 50 per cent of daily travel by environmentally-friendly transportation. Among the Cities of the Future, the use of

  17. Future accelerators (?)

    Energy Technology Data Exchange (ETDEWEB)

    John Womersley

    2003-08-21

    I describe the future accelerator facilities that are currently foreseen for electroweak scale physics, neutrino physics, and nuclear structure. I will explore the physics justification for these machines, and suggest how the case for future accelerators can be made.

  18. Global Learning and Observation to Benefit the Environment (GLOBE) Mission EARTH (GME) program delivers climate change science content, pedagogy, and data resources to K12 educators, future teachers, and professional development providers.

    Science.gov (United States)

    Ostrom, T.

    2017-12-01

    This presentation will include a series of visuals that discuss how hands-on learning activities and field investigations from the Global Learning and Observation to Benefit the Environment (GLOBE) Mission EARTH (GME) program deliver climate change science content, pedagogy, and data resources to K12 educators, future teachers, and professional development providers. The GME program poster presentation will also show how teachers strengthen student preparation for Science, Technology, Engineering, Art and Mathematics (STEAM)-related careers while promoting diversity in the future STEM workforce. In addition to engaging students in scientific inquiry, the GME program poster will show how career exploration and preparation experiences are accomplished through direct connection to scientists and real science practices. The poster will show the hands-on learning activities that are being implemented in more than 30,000 schools worldwide, with over a million students, teachers, and scientists collecting environmental measurements using the GLOBE scientific protocols. This poster will also include how Next Generation Science Standards connect to GME learning progressions by grade strands. The poster will present the first year of results from the implementation of the GME program. Data are currently being aggregated by the east, midwest, and western regional operations.

  19. Future food.

    Science.gov (United States)

    Wahlqvist, Mark L

    2016-12-01

    Food systems have changed markedly with human settlement and agriculture, industrialisation, trade, migration and now the digital age. Throughout these transitions, there has been a progressive population explosion and net ecosystem loss and degradation. Climate change now gathers pace, exacerbated by ecological dysfunction. Our health status has been challenged by a developing people-environment mismatch. We have regarded ecological conquest and innovative technology as solutions, but have not understood how ecologically dependent and integrated we are. We are ecological creatures interfaced by our sensoriness, microbiomes, shared regulatory (endocrine) mechanisms, immune system, biorhythms and nutritional pathways. Many of us are 'nature-deprived'. We now suffer what might be termed ecological health disorders (EHD). If there were less of us, nature's resilience might cope, but more than 9 billion people by 2050 is probably an intolerable demand on the planet. Future food must increasingly take into account the pressures on ecosystem-dependent food systems, with foods probably less biodiverse, although eating in this way allows optimal health; energy dysequilibrium with less physical activity and foods inappropriately energy dense; and less socially-conducive food habits. 'Personalised Nutrition', with extensive and resource-demanding nutrigenomic, metabolomic and microbiomic data may provide partial health solutions in clinical settings, but not be justified for ethical, risk management or sustainability reasons in public health. The globally prevalent multidimensional malnutritional problems of food insecurity, quality and equity require local, regional and global action to prevent further ecosystem degradation as well as to educate, provide sustainable livelihoods and encourage respectful social discourse and practice about the role of food.

  20. Super-computer architecture

    CERN Document Server

    Hockney, R W

    1977-01-01

    This paper examines the design of the top-of-the-range, scientific, number-crunching computers. The market for such computers is not as large as that for smaller machines, but on the other hand it is by no means negligible. The present work-horse machines in this category are the CDC 7600 and IBM 360/195, and over fifty of the former machines have been sold. The types of installation that form the market for such machines are not only the major scientific research laboratories in the major countries (such as Los Alamos, CERN, and the Rutherford laboratory) but also major universities or university networks. It is also true that, as with sports cars, innovations made to satisfy the top of the market today often become the standard for the medium-scale computer of tomorrow. Hence there is considerable interest in examining present developments in this area. (0 refs).

  1. The GF11 supercomputer

    International Nuclear Information System (INIS)

    Beetem, J.; Weingarten, D.

    1986-01-01

    GF11 is a parallel computer currently under construction at the IBM Yorktown Research Center. The machine incorporates 576 floating-point processors arranged in a modified SIMD architecture. Each has space for 2 Mbytes of memory and is capable of 20 Mflops, giving the total machine a peak of 1.125 Gbytes of memory and 11.52 Gflops. The floating-point processors are interconnected by a dynamically reconfigurable non-blocking switching network. At each machine cycle any of 1024 pre-selected permutations of data can be realized among the processors. The main intended application of GF11 is a class of calculations arising from quantum chromodynamics.

  2. The GF11 supercomputer

    International Nuclear Information System (INIS)

    Beetem, J.; Denneau, M.; Weingarten, D.

    1985-01-01

    GF11 is a parallel computer currently under construction at the IBM Yorktown Research Center. The machine incorporates 576 floating-point processors arranged in a modified SIMD architecture. Each has space for 2 Mbytes of memory and is capable of 20 Mflops, giving the total machine a peak of 1.125 Gbytes of memory and 11.52 Gflops. The floating-point processors are interconnected by a dynamically reconfigurable nonblocking switching network. At each machine cycle any of 1024 pre-selected permutations of data can be realized among the processors. The main intended application of GF11 is a class of calculations arising from quantum chromodynamics.

  3. Supercomputer debugging workshop '92

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.S.

    1993-02-01

    This report contains papers or viewgraphs on the following topics: The ABCs of Debugging in the 1990s; Cray Computer Corporation; Thinking Machines Corporation; Cray Research, Incorporated; Sun Microsystems, Inc; Kendall Square Research; The Effects of Register Allocation and Instruction Scheduling on Symbolic Debugging; Debugging Optimized Code: Currency Determination with Data Flow; A Debugging Tool for Parallel and Distributed Programs; Analyzing Traces of Parallel Programs Containing Semaphore Synchronization; Compile-time Support for Efficient Data Race Detection in Shared-Memory Parallel Programs; Direct Manipulation Techniques for Parallel Debuggers; Transparent Observation of XENOOPS Objects; A Parallel Software Monitor for Debugging and Performance Tools on Distributed Memory Multicomputers; Profiling Performance of Inter-Processor Communications in an iWarp Torus; The Application of Code Instrumentation Technology in the Los Alamos Debugger; and CXdb: The Road to Remote Debugging.

  4. The GF11 supercomputer

    International Nuclear Information System (INIS)

    Beetem, J.; Denneau, M.; Weingarten, D.

    1985-01-01

    GF11 is a parallel computer currently under construction at the Yorktown Research Center. The machine incorporates 576 floating-point processors arranged in a modified SIMD architecture. Each processor has space for 2 Mbytes of memory and is capable of 20 MFLOPS, giving the total machine a peak of 1.125 Gbytes of memory and 11.52 GFLOPS. The floating-point processors are interconnected by a dynamically reconfigurable non-blocking switching network. At each machine cycle any of 1024 pre-selected permutations of data can be realized among the processors. The main intended application of GF11 is a class of calculations arising from quantum chromodynamics, a proposed theory of the elementary particles which participate in nuclear interactions

  5. Future batteries will be environment-friendly

    International Nuclear Information System (INIS)

    Larcher, D.; Tarascon, J.M.

    2012-01-01

    Since the beginning of the nineties, efficient batteries have been built thanks to lithium. The use of nano-materials for the electrodes has recently opened the way to cheaper and more environmentally friendly technologies, such as lithium-iron-phosphate (LiFePO4) batteries instead of classical lithium-ion batteries. Nano-materials enable the batteries to use the electrodes more efficiently and to store more energy. Sustainable development requires the elaboration of clean processes to produce nano-materials; it appears that micro-organisms might be able to produce nano-metric minerals through bio-mineralisation, which is particularly true for FePO4 because iron and phosphates are abundant biological components. (A.C.)

  6. Dynamic Training Environments of the Future

    Science.gov (United States)

    2008-03-13

    Fragments recovered from the presentation: virtual worlds used for cyber attacks, espionage, and command and control; MMORPGs used to generate revenue through the sale of in-game items for real-world currency; the video game as the new medium in cyberspace, one that can be used by us and will be used against us; the MMORPG as the emergence of a new society.

  7. Performance Characteristics of Hybrid MPI/OpenMP Scientific Applications on a Large-Scale Multithreaded BlueGene/Q Supercomputer

    KAUST Repository

    Wu, Xingfu; Taylor, Valerie

    2013-01-01

    In this paper, we investigate the performance characteristics of five hybrid MPI/OpenMP scientific applications (two NAS Parallel benchmarks Multi-Zone SP-MZ and BT-MZ, an earthquake simulation PEQdyna, an aerospace application PMLB and a 3D particle-in-cell application GTC) on a large-scale multithreaded Blue Gene/Q supercomputer at Argonne National Laboratory, and quantify the performance gap resulting from using different numbers of threads per node. We use performance tools and MPI profile and trace libraries available on the supercomputer to analyze and compare the performance of these hybrid scientific applications as the number of OpenMP threads per node increases, and find that increasing the number of threads to some extent saturates or worsens performance of these hybrid applications. For the strong-scaling hybrid scientific applications such as SP-MZ, BT-MZ, PEQdyna and PMLB, using 32 threads per node results in much better application efficiency than using 64 threads per node, and as the number of threads per node increases, the FPU (Floating Point Unit) percentage decreases, and the MPI percentage (except PMLB) and IPC (Instructions per cycle) per core (except BT-MZ) increase. For the weak-scaling hybrid scientific application such as GTC, the performance trend (relative speedup) is very similar with increasing numbers of threads per node, no matter how many nodes (32, 128, 512) are used. © 2013 IEEE.

  8. Performance Characteristics of Hybrid MPI/OpenMP Scientific Applications on a Large-Scale Multithreaded BlueGene/Q Supercomputer

    KAUST Repository

    Wu, Xingfu

    2013-07-01

    In this paper, we investigate the performance characteristics of five hybrid MPI/OpenMP scientific applications (two NAS Parallel benchmarks Multi-Zone SP-MZ and BT-MZ, an earthquake simulation PEQdyna, an aerospace application PMLB and a 3D particle-in-cell application GTC) on a large-scale multithreaded Blue Gene/Q supercomputer at Argonne National Laboratory, and quantify the performance gap resulting from using different numbers of threads per node. We use performance tools and MPI profile and trace libraries available on the supercomputer to analyze and compare the performance of these hybrid scientific applications as the number of OpenMP threads per node increases, and find that increasing the number of threads to some extent saturates or worsens performance of these hybrid applications. For the strong-scaling hybrid scientific applications such as SP-MZ, BT-MZ, PEQdyna and PMLB, using 32 threads per node results in much better application efficiency than using 64 threads per node, and as the number of threads per node increases, the FPU (Floating Point Unit) percentage decreases, and the MPI percentage (except PMLB) and IPC (Instructions per cycle) per core (except BT-MZ) increase. For the weak-scaling hybrid scientific application such as GTC, the performance trend (relative speedup) is very similar with increasing numbers of threads per node, no matter how many nodes (32, 128, 512) are used. © 2013 IEEE.

  9. Computational fluid dynamics: complex flows requiring supercomputers. January 1975-July 1988 (Citations from the INSPEC: Information Services for the Physics and Engineering Communities data base). Report for January 1975-July 1988

    International Nuclear Information System (INIS)

    1988-08-01

    This bibliography contains citations concerning computational fluid dynamics (CFD), a new method in computational science to perform complex flow simulations in three dimensions. Applications include aerodynamic design and analysis for aircraft, rockets, and missiles, and automobiles; heat-transfer studies; and combustion processes. Included are references to supercomputers, array processors, and parallel processors where needed for complete, integrated design. Also included are software packages and grid-generation techniques required to apply CFD numerical solutions. Numerical methods for fluid dynamics, not requiring supercomputers, are found in a separate published search. (Contains 83 citations fully indexed and including a title list.)

  10. Future Educators' Explaining Voices

    Science.gov (United States)

    de Oliveira, Janaina Minelli; Caballero, Pablo Buenestado; Camacho, Mar

    2013-01-01

    Teacher education programs must offer pre-service students innovative technology-supported learning environments, guiding them in the revision of their preconceptions on literacy and technology. This paper presents a case study that uses podcasts to inquire into future educators' views on technology and the digital age. Results show future…

  11. Clinical experimentation with aerosol antibiotics: current and future methods of administration

    Directory of Open Access Journals (Sweden)

    Zarogoulidis P

    2013-10-01

    Full Text Available Paul Zarogoulidis,1,2 Ioannis Kioumis,1 Konstantinos Porpodis,1 Dionysios Spyratos,1 Kosmas Tsakiridis,3 Haidong Huang,4 Qiang Li,4 J Francis Turner,5 Robert Browning,6 Wolfgang Hohenforst-Schmidt,7 Konstantinos Zarogoulidis1 1Pulmonary Department, G Papanikolaou General Hospital, Aristotle University of Thessaloniki, Thessaloniki, Greece; 2Department of Interventional Pneumology, Ruhrlandklinik, West German Lung Center, University Hospital, University Duisburg-Essen, Essen, Germany; 3Cardiothoracic Surgery Department, Saint Luke Private Hospital of Health Excellence, Thessaloniki, Greece; 4Department of Respiratory Diseases, Shanghai Hospital/First Affiliated Hospital of the Second Military Medical University, Shanghai, People’s Republic of China; 5Pulmonary Medicine, University of Nevada School of Medicine, National Supercomputing Center for Energy and the Environment University of Nevada, Las Vegas, NV, USA; 6Pulmonary and Critical Care Medicine, Interventional Pulmonology, National Naval Medical Center, Walter Reed Army Medical Center, Bethesda, MD, USA; 7II Medical Department, Regional Clinic of Coburg, University of Wuerzburg, Coburg, Germany Abstract: Currently almost all antibiotics are administered by the intravenous route. Since several systems and situations require more efficient methods of administration, investigation and experimentation in drug design has produced local treatment modalities. Administration of antibiotics in aerosol form is one of the treatment methods of increasing interest. As the field of drug nanotechnology grows, new molecules have been produced and combined with aerosol production systems. In the current review, we discuss the efficiency of aerosol antibiotic studies along with aerosol production systems. The different parts of the aerosol antibiotic methodology are presented. Additionally, information regarding the drug molecules used is presented and future applications of this method are discussed

  12. Future Textiles

    DEFF Research Database (Denmark)

    Hansen, Anne-Louise Degn; Jensen, Hanne Troels Fusvad; Hansen, Martin

    2011-01-01

    The magazine Future Textiles gathers the results of the Future Textiles project, which promotes the field of intelligent textiles. In the magazine, readers can learn about trends, driving forces, and challenges, and find ideas for new products within intelligent textiles. Areas such as sustainability and customisation...

  13. Futures Brokerages Face uncertain Future

    Institute of Scientific and Technical Information of China (English)

    WANG PEI

    2006-01-01

    2005 was a quiet year for China's futures market. After four new trading products, including cotton, fuel oil and corn, were launched on the market in 2004, the development of the market seemed to stagnate. The trade value of the futures market totaled 13.4 trillion yuan (US$ 1.67 trillion) in 2005, down 8.5 percent year-on-year. Although the decrease is quite small and the trade value was still the second highest in the market's history, the majority of futures brokerage firms were running in the red. In some areas, up to 80 percent of futures companies made losses.

  14. Heuristic Scheduling in Grid Environments: Reducing the Operational Energy Demand

    Science.gov (United States)

    Bodenstein, Christian

    In a world where more and more businesses seem to trade in an online market, the supply of online services to the ever-growing demand could quickly reach its capacity limits. Online service providers may find themselves maxed out at peak operation levels during high-traffic timeslots but face too little demand during low-traffic timeslots, although the latter is becoming less frequent. At this point, deciding which user is allocated what level of service becomes essential. The concept of Grid computing could offer a meaningful alternative to conventional super-computing centres. Not only can Grids reach the same computing speeds as some of the fastest supercomputers, but distributed computing harbors great energy-saving potential. When scheduling projects in such a Grid environment, however, simply assigning one process to a system becomes so complex in calculation that schedules are often too late to execute, rendering their optimizations useless. Current schedulers attempt to maximize the utility, given some sort of constraint, often reverting to heuristics. This optimization often comes at the cost of environmental impact, in this case CO2 emissions. This work proposes an alternate model of energy-efficient scheduling while keeping a respectable amount of economic incentives untouched. Using this model, it is possible to reduce the total energy consumed by a Grid environment using 'just-in-time' flowtime management, paired with ranking nodes by efficiency.
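
    A minimal sketch of one form such energy-aware scheduling can take, assuming nodes are ranked by work delivered per joule and each job is greedily placed on the most efficient node that can still meet its deadline; the node and job parameters below are hypothetical and this is not the author's actual model.

      from dataclasses import dataclass
      from typing import List, Optional

      @dataclass
      class Node:
          name: str
          speed: float               # work units per second
          power: float               # watts
          busy_until: float = 0.0    # time at which the node becomes free again

          @property
          def efficiency(self) -> float:
              return self.speed / self.power   # work units per joule

      @dataclass
      class Job:
          name: str
          work: float                # work units
          deadline: float            # seconds from now

      def dispatch(job: Job, nodes: List[Node], now: float = 0.0) -> Optional[Node]:
          """Place the job on the most energy-efficient node that still meets its deadline."""
          for node in sorted(nodes, key=lambda n: n.efficiency, reverse=True):
              start = max(now, node.busy_until)
              finish = start + job.work / node.speed
              if finish <= job.deadline:
                  node.busy_until = finish
                  return node
          return None   # no node can meet the deadline

      nodes = [Node("fast", speed=100.0, power=400.0), Node("green", speed=40.0, power=90.0)]
      for job in (Job("render", work=800.0, deadline=30.0), Job("batch", work=600.0, deadline=120.0)):
          chosen = dispatch(job, nodes)
          print(job.name, "->", chosen.name if chosen else "rejected")

    In this toy run both jobs land on the slower but more efficient node because their deadlines leave enough slack, which is the 'just-in-time' idea in miniature.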

  15. Visualization system for grid environment in the nuclear field

    International Nuclear Information System (INIS)

    Suzuki, Yoshio; Matsumoto, Nobuko; Idomura, Yasuhiro; Tani, Masayuki

    2006-01-01

    An innovative scientific visualization system is needed to visualize, in an integrated manner, the large amounts of data generated at distributed remote locations as a result of large-scale numerical simulations in a grid environment. One important function of such a visualization system is parallel visualization, which enables data to be visualized using multiple CPUs of a supercomputer. The other is distributed visualization, which enables visualization processes to be executed across a local client computer and remote computers. We have developed a toolkit including these functions in cooperation with the commercial visualization software AVS/Express, called Parallel Support Toolkit (PST). PST can execute visualization processes with three kinds of parallelism (data parallelism, task parallelism and pipeline parallelism) using local and remote computers. We have evaluated PST on a large amount of data generated by a nuclear fusion simulation, using two supercomputers installed at JAEA, an Altix3700Bx2 and a Prism. The evaluation shows that PST has the potential to efficiently visualize large amounts of data in a grid environment. (author)
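
    A minimal sketch of the data-parallelism mode mentioned above (this is not PST or AVS/Express code): a volume is split into slabs, each worker reduces its slab to a partial maximum-intensity projection, and the partial images are composited on the client with an elementwise maximum; the array sizes and worker count are arbitrary.

      import numpy as np
      from multiprocessing import Pool

      def project_slab(slab: np.ndarray) -> np.ndarray:
          """Reduce one slab of the volume to a partial image (maximum-intensity projection)."""
          return slab.max(axis=0)

      def render(volume: np.ndarray, n_workers: int = 4) -> np.ndarray:
          slabs = np.array_split(volume, n_workers, axis=0)   # data parallelism: one slab per worker
          with Pool(n_workers) as pool:
              partials = pool.map(project_slab, slabs)
          return np.maximum.reduce(partials)                  # composite the partial images on the client

      if __name__ == "__main__":
          volume = np.random.default_rng(0).random((128, 64, 64))   # stand-in for simulation output
          image = render(volume)
          print(image.shape, float(image.max()))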

  16. The past, present, and future of test and research reactor physics

    International Nuclear Information System (INIS)

    Ryskamp, J.M.

    1992-01-01

    Reactor physics calculations have been performed on research reactors since the first one was built 50 yr ago under the University of Chicago stadium. Since then, reactor physics calculations have evolved from Fermi-age theory calculations performed with slide rules to three-dimensional, continuous-energy, coupled neutron-photon Monte Carlo computations performed with supercomputers and workstations. Such enormous progress in reactor physics leads us to believe that the next 50 years will be just as exciting. This paper reviews this transition from the past to the future.

  17. Sustainable Futures

    Science.gov (United States)

    Sustainable Futures is a voluntary program that encourages industry to use predictive models to screen new chemicals early in the development process and offers incentives to companies subject to TSCA section 5.

  18. Summaries of research and development activities by using supercomputer system of JAEA in FY2015. April 1, 2015 - March 31, 2016

    International Nuclear Information System (INIS)

    2017-01-01

    Japan Atomic Energy Agency (JAEA) conducts research and development (R and D) in various fields related to nuclear power as a comprehensive institution of nuclear energy R and Ds, and utilizes computational science and technology in many activities. As shown in the fact that about 20 percent of papers published by JAEA are concerned with R and D using computational science, the supercomputer system of JAEA has become an important infrastructure to support computational science and technology. In FY2015, the system was used for R and D aiming to restore Fukushima (nuclear plant decommissioning and environmental restoration) as a priority issue, as well as for JAEA's major projects such as Fast Reactor Cycle System, Fusion R and D and Quantum Beam Science. This report presents a great number of R and D results accomplished by using the system in FY2015, as well as user support, operational records and overviews of the system, and so on. (author)

  19. Summaries of research and development activities by using supercomputer system of JAEA in FY2014. April 1, 2014 - March 31, 2015

    International Nuclear Information System (INIS)

    2016-02-01

    Japan Atomic Energy Agency (JAEA) conducts research and development (R and D) in various fields related to nuclear power as a comprehensive institution of nuclear energy R and Ds, and utilizes computational science and technology in many activities. As shown in the fact that about 20 percent of papers published by JAEA are concerned with R and D using computational science, the supercomputer system of JAEA has become an important infrastructure to support computational science and technology. In FY2014, the system was used for R and D aiming to restore Fukushima (nuclear plant decommissioning and environmental restoration) as a priority issue, as well as for JAEA's major projects such as Fast Reactor Cycle System, Fusion R and D and Quantum Beam Science. This report presents a great number of R and D results accomplished by using the system in FY2014, as well as user support, operational records and overviews of the system, and so on. (author)

  20. Summaries of research and development activities by using supercomputer system of JAEA in FY2013. April 1, 2013 - March 31, 2014

    International Nuclear Information System (INIS)

    2015-02-01

    Japan Atomic Energy Agency (JAEA) conducts research and development (R and D) in various fields related to nuclear power as a comprehensive institution of nuclear energy R and Ds, and utilizes computational science and technology in many activities. As about 20 percent of the papers published by JAEA are concerned with R and D using computational science, the supercomputer system of JAEA has become an important infrastructure to support computational science and technology utilization. In FY2013, the system was used not only for JAEA's major projects such as Fast Reactor Cycle System, Fusion R and D and Quantum Beam Science, but also for R and D aiming to restore Fukushima (nuclear plant decommissioning and environmental restoration) as a priority issue. This report presents a great amount of R and D results accomplished by using the system in FY2013, as well as user support, operational records and overviews of the system, and so on. (author)

  1. The design and implementation of cost-effective algorithms for direct solution of banded linear systems on the vector processor system 32 supercomputer

    Science.gov (United States)

    Samba, A. S.

    1985-01-01

    The problem of solving banded linear systems by direct (non-iterative) techniques on the Vector Processor System (VPS) 32 supercomputer is considered. Two efficient direct methods for solving banded linear systems on the VPS 32 are described. The vector cyclic reduction (VCR) algorithm is discussed in detail. The performance of the VCR on a three-parameter model problem is also illustrated. The VCR is an adaptation of the conventional point cyclic reduction algorithm. The second direct method is the 'Customized Reduction of Augmented Triangles' (CRAT). CRAT has the dominant characteristics of an efficient VPS 32 algorithm. CRAT is tailored to the pipeline architecture of the VPS 32 and as a consequence the algorithm is implicitly vectorizable.
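
    A minimal sketch of the conventional point cyclic reduction algorithm that VCR adapts, assuming a tridiagonal system with n = 2**q - 1 unknowns; the banded, vectorised VPS-32 variants described in the abstract are not reproduced here. A quick comparison against a dense solver checks the recurrences on a small diagonally dominant system.

      import numpy as np

      def cyclic_reduction(a, b, c, d):
          """Solve a tridiagonal system (sub-diagonal a, diagonal b, super-diagonal c,
          right-hand side d) by point cyclic reduction; assumes n = 2**q - 1 unknowns,
          with a[0] and c[-1] ignored."""
          a, b, c, d = (np.array(v, dtype=float) for v in (a, b, c, d))
          n = len(b)
          assert n & (n + 1) == 0, "sketch assumes n = 2**q - 1"
          a[0] = 0.0
          c[-1] = 0.0

          # Forward reduction: at stride s, every (2s)-th equation absorbs its two
          # neighbours, so the surviving equations couple unknowns 2s apart.
          s = 1
          while s < (n + 1) // 2:
              for i in range(2 * s - 1, n, 2 * s):
                  k1 = a[i] / b[i - s]
                  k2 = c[i] / b[i + s]
                  b[i] -= k1 * c[i - s] + k2 * a[i + s]
                  d[i] -= k1 * d[i - s] + k2 * d[i + s]
                  a[i] = -k1 * a[i - s]
                  c[i] = -k2 * c[i + s]
              s *= 2

          # Back substitution: solve the middle unknown first, then fill in the rest
          # level by level with decreasing stride (out-of-range neighbours count as 0).
          x = np.zeros(n)
          while s >= 1:
              for i in range(s - 1, n, 2 * s):
                  left = x[i - s] if i - s >= 0 else 0.0
                  right = x[i + s] if i + s < n else 0.0
                  x[i] = (d[i] - a[i] * left - c[i] * right) / b[i]
              s //= 2
          return x

      # Quick check against a dense solver on a 7-unknown diagonally dominant system.
      n = 7
      rng = np.random.default_rng(1)
      a, c = rng.random(n), rng.random(n)
      b = 4.0 + rng.random(n)
      d = rng.random(n)
      A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
      print(np.allclose(cyclic_reduction(a, b, c, d), np.linalg.solve(A, d)))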

  2. Summaries of research and development activities by using supercomputer system of JAEA in FY2012. April 1, 2012 - March 31, 2013

    International Nuclear Information System (INIS)

    2014-01-01

    Japan Atomic Energy Agency (JAEA) conducts research and development (R and D) in various fields related to nuclear power as a comprehensive institution of nuclear energy R and Ds, and utilizes computational science and technology in many activities. As more than 20 percent of papers published by JAEA are concerned with R and D using computational science, the supercomputer system of JAEA has become an important infrastructure to support computational science and technology utilization. In FY2012, the system was used not only for JAEA's major projects such as Fast Reactor Cycle System, Fusion R and D and Quantum Beam Science, but also for R and D aiming to restore Fukushima (nuclear plant decommissioning and environmental restoration) as a priority issue. This report presents a great amount of R and D results accomplished by using the system in FY2012, as well as user support, operational records and overviews of the system, and so on. (author)

  3. Summaries of research and development activities by using supercomputer system of JAEA in FY2011. April 1, 2011 - March 31, 2012

    International Nuclear Information System (INIS)

    2013-01-01

    Japan Atomic Energy Agency (JAEA) conducts research and development (R and D) in various fields related to nuclear power as a comprehensive institution of nuclear energy R and Ds, and utilizes computational science and technology in many activities. As more than 20 percent of papers published by JAEA are concerned with R and D using computational science, the supercomputer system of JAEA has become an important infrastructure to support computational science and technology utilization. In FY2011, the system was used for analyses of the accident at the Fukushima Daiichi Nuclear Power Station and establishment of radioactive decontamination plan, as well as the JAEA's major projects such as Fast Reactor Cycle System, Fusion R and D and Quantum Beam Science. This report presents a great amount of R and D results accomplished by using the system in FY2011, as well as user support structure, operational records and overviews of the system, and so on. (author)

  4. Use of QUADRICS supercomputer as embedded simulator in emergency management systems; Utilizzo del calcolatore QUADRICS come simulatore in linea in un sistema di gestione delle emergenze

    Energy Technology Data Exchange (ETDEWEB)

    Bove, R.; Di Costanzo, G.; Ziparo, A. [ENEA, Centro Ricerche Casaccia, Rome (Italy). Dip. Energia

    1996-07-01

    The experience gained in implementing MRBT, an atmospheric spreading model for short-duration releases, is reported. The model was implemented on a QUADRICS-Q1 supercomputer. A description of the MRBT model is given first: it is an analytical model for studying the spreading of light gases released into the atmosphere by accidental releases. The solution of the diffusion equation is Gaussian-like and yields the concentration of the released pollutant as a function of space and time. The QUADRICS architecture is then introduced and the implementation of the model is described. Finally, the integration of the QUADRICS-based model as an on-line simulator in an emergency management system is considered.
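
    A minimal sketch of the kind of Gaussian-like analytical solution the abstract describes: the concentration from an instantaneous point release advected by a uniform wind and spread by constant eddy diffusivities. The source strength, wind speed and diffusivities are placeholder values, not the MRBT parameterisation.

      import numpy as np

      def puff_concentration(x, y, z, t, Q=1.0, u=2.0, kx=5.0, ky=5.0, kz=1.0):
          """Concentration at (x, y, z) a time t after an instantaneous release of mass Q
          at the origin, advected by wind speed u along x with eddy diffusivities kx, ky, kz."""
          sx, sy, sz = (np.sqrt(2.0 * k * t) for k in (kx, ky, kz))
          norm = Q / ((2.0 * np.pi) ** 1.5 * sx * sy * sz)
          return norm * np.exp(-((x - u * t) ** 2) / (2.0 * sx ** 2)
                               - (y ** 2) / (2.0 * sy ** 2)
                               - (z ** 2) / (2.0 * sz ** 2))

      # Ground-level concentration along the plume axis five minutes after the release.
      x = np.linspace(0.0, 2000.0, 5)     # metres downwind
      print(puff_concentration(x, y=0.0, z=0.0, t=300.0))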

  5. Sandia's network for Supercomputing '94: Linking the Los Alamos, Lawrence Livermore, and Sandia National Laboratories using switched multimegabit data service

    Energy Technology Data Exchange (ETDEWEB)

    Vahle, M.O.; Gossage, S.A.; Brenkosh, J.P. [Sandia National Labs., Albuquerque, NM (United States). Advanced Networking Integration Dept.

    1995-01-01

    Supercomputing '94, a high-performance computing and communications conference, was held November 14th through 18th, 1994 in Washington DC. For the past four years, Sandia National Laboratories has used this conference to showcase and focus its communications and networking endeavors. At the 1994 conference, Sandia built a Switched Multimegabit Data Service (SMDS) network running at 44.736 megabits per second linking its private SMDS network between its facilities in Albuquerque, New Mexico and Livermore, California to the convention center in Washington, D.C. For the show, the network was also extended from Sandia, New Mexico to Los Alamos National Laboratory and from Sandia, California to Lawrence Livermore National Laboratory. This paper documents and describes this network and how it was used at the conference.

  6. The future of energy

    CERN Document Server

    Towler, Brian F

    2014-01-01

    Using the principle that extracting energy from the environment always involves some type of impact on the environment, The Future of Energy discusses the sources, technologies, and tradeoffs involved in meeting the world's energy needs. A historical, scientific, and technical background sets the stage for discussions on a wide range of energy sources, including conventional fossil fuels like oil, gas, and coal, as well as emerging renewable sources like solar, wind, geothermal, and biofuels. Readers will learn that there are no truly "green" energy sources - all energy usage involves some trad

  7. Future perspectives

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    International involvement in particle physics is what the International Committee for Future Accelerators (ICFA) is all about. At the latest Future Perspectives meeting at Brookhaven from 5-10 October (after a keynote speech by doyen Viktor Weisskopf, who regretted the emergence of 'a nationalistic trend'), ICFA reviewed progress and examined its commitments in the light of the evolving world particle physics scene. Particular aims were to review worldwide accelerator achievements and plans, to survey the work of the four panels, and to discuss ICFA's special role in future cooperation in accelerator construction and use, and in research and development work for both accelerators and for detectors

  8. Future Contingents

    DEFF Research Database (Denmark)

    Øhrstrøm, Peter; Hasle, Per F. V.

    2015-01-01

    contingent statements. The problem of future contingents is interwoven with a number of issues in theology, philosophy, logic, semantics of natural language, computer science, and applied mathematics. The theological issue of how to reconcile the assumption of God's foreknowledge with the freedom and moral...... accountability of human beings has been a main impetus to the discussion and a major inspiration to the development of various logical models of time and future contingents. This theological issue is connected with the general philosophical question of determinism versus indeterminism. Within logic, the relation...... about the future. Finally, it should be mentioned that temporal logic has found a remarkable application in computer science and applied mathematics. In the late 1970s the first computer scientists realised the relevance of temporal logic for the purposes of computer science (see Hasle and Øhrstrøm 2004)....

  9. Future Contingents

    DEFF Research Database (Denmark)

    Øhrstrøm, Peter; Hasle, Per F. V.

    2011-01-01

    contingent statements. The problem of future contingents is interwoven with a number of issues in theology, philosophy, logic, semantics of natural language, computer science, and applied mathematics. The theological issue of how to reconcile the assumption of God's foreknowledge with the freedom and moral...... accountability of human beings has been a main impetus to the discussion and a major inspiration to the development of various logical models of time and future contingents. This theological issue is connected with the general philosophical question of determinism versus indeterminism. Within logic, the relation...... about the future. Finally, it should be mentioned that temporal logic has found a remarkable application in computer science and applied mathematics. In the late 1970s the first computer scientists realised the relevance of temporal logic for the purposes of computer science (see Hasle and Øhrstrøm 2004)....
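
    Both records above mention logical models of time and the application of temporal logic in computer science without showing the formalism. As a minimal illustration (standard Priorean/Ockhamist tense logic, not taken from the cited work), the future operator F can be given branching-time semantics along the lines of

        \mathcal{M}, t, h \models Fp \iff \exists t' \in h\,\bigl(t' > t \;\wedge\; \mathcal{M}, t', h \models p\bigr)

    where h is a history (a maximal linearly ordered set of moments) passing through t. On the Peircean reading, "it will be that p" is true at t only if Fp holds on every history through t, whereas the Ockhamist reading evaluates it relative to one designated history; this is one formal way the determinism versus indeterminism issue surfaces.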

  10. Future Savvy

    DEFF Research Database (Denmark)

    Gordon, Adam

    There's no shortage of predictions available to organizations looking to anticipate and profit from future events or trends. Apparently helpful forecasts are ubiquitous in everyday communications such as newspapers and business magazines, and in specialized sources such as government and think-tank forecasts, consultant reports, and stock-market guides. These resources are crucial, but they are also of very mixed quality. How can decision-makers know which predictions to take seriously, which to be wary of, and which to throw out entirely? Future Savvy provides analytical filters to judging predictive...... systematic "forecast filtering" to reveal strengths and weaknesses in the predictions they face. Future Savvy empowers both business and policy/government decision-makers to use forecasts wisely and so improve their judgment in anticipating opportunities, avoiding threats, and managing uncertainty....

  11. Energy Futures

    DEFF Research Database (Denmark)

    Davies, Sarah Rachael; Selin, Cynthia

    2012-01-01

    foresight and public and stakeholder engagement are used to reflect on - and direct - the impacts of new technology. In this essay we draw on our experience of anticipatory governance, in the shape of the 'NanoFutures' project on energy futures, to present a reflexive analysis of engagement and deliberation. We...... draw out five tensions of the practice of deliberation on energy technologies. Through tracing the lineages of these dilemmas, we discuss some of the implications of these tensions for the practice of civic engagement and deliberation in a set of questions for this community of practitioner-scholars....

  12. Games and Entertainment in Ambient Intelligence Environments

    NARCIS (Netherlands)

    Nijholt, Antinus; Reidsma, Dennis; Poppe, Ronald Walter; Aghajan, H.; López-Cózar Delgado, R.; Augusto, J.C.

    2009-01-01

    In future ambient intelligence (AmI) environments we assume intelligence embedded in the environment and its objects (floors, furniture, mobile robots). These environments support their human inhabitants in their activities and interactions by perceiving them through sensors (proximity sensors,

  13. Navy Telemedicine: Current Research and Future Directions

    National Research Council Canada - National Science Library

    Reed, Cheryl

    2002-01-01

    .... This report reviews military and civilian models for evaluating telemedicine systems in order to determine future directions for Navy telemedicine research within the current funding environment...

  14. Capabilities of Future Training Support Packages

    National Research Council Canada - National Science Library

    Burnside, Billy

    2004-01-01

    .... This report identifies and analyzes five key capabilities needed in future TSPs: rapid tailoring or modification, reach, simulated operating environment, performance measurement, and pretests/selection criteria...

  15. Iraq's future

    International Nuclear Information System (INIS)

    Henderson, S.

    1998-01-01

    The large oil reserves of Iraq make it an important long-term player in world energy politics. This article briefly reviews the oil industry's development and current status in Iraq and discusses the planned oil and gas field development. Finally, there is a political discussion regarding the future of Iraq in terms of religion, race and neighbouring countries. (UK)

  16. Bitcoin futures

    DEFF Research Database (Denmark)

    Brøgger, Søren Bundgaard

    2018-01-01

    With the introduction of a futures market, Bitcoin exposure has become available to a broader group of investors who have so far been unable or unwilling to access the underlying Bitcoin market. The article finds that, on the face of it, the contracts favour speculators at the expense of hedgers and...

  17. Toward sustainable energy futures

    Energy Technology Data Exchange (ETDEWEB)

    Pasztor, J. (United Nations Environment Programme, Nairobi (Kenya))

    1990-01-01

    All energy systems have adverse as well as beneficial impacts on the environment. These vary in quality and quantity, in time and in space. Environmentally sensitive energy management tries to minimize the adverse impacts in an equitable manner between different groups and in the most cost-effective ways. Many of the environmental impacts of energy continue to be externalized. Consequently, those energy systems that can externalize their impacts more easily are favoured, while others remain relatively expensive. The lack of full integration of environmental factors into energy policy and planning is the overriding problem to be resolved before a transition towards sustainable energy futures can take place. The most pressing problem in the developing countries relates to the unsustainable and inefficient use of biomass resources, while in the industrialized countries the major energy-environment problems arise from the continued intensive use of fossil fuel resources. Both of these resource issues play a role in climate change. Although there has been considerable improvement in pollution control in a number of situations, most of the adverse impacts will undoubtedly increase in the future. Population growth will lead to increased demand, and there will also be greater use of lower-grade fuels. Climate change and the crisis in the biomass resource base of the developing countries are the most critical energy-environment issues to be resolved in the immediate future. In both cases, international cooperation is an essential requirement for successful resolution. 26 refs.

  18. Parliamentarians and environment

    International Nuclear Information System (INIS)

    Boy, D.

    2004-01-01

    The data presented in this report come from a survey carried out by Sofres between March 5 and April 23, 2003, on a sample of 200 parliamentarians (122 deputies and 78 senators) who described their attitudes towards environmental questions. The questionnaire covers five main dimensions: the relative importance of the environment as an issue, attitudes towards past, present and future environmental policies, attitudes towards specific issues (energy, wastes), attitudes towards some problems of conservation of the natural heritage, and attitudes towards public participation in some environment-related decisions. (J.S.)

  19. Mining the Home Environment

    Science.gov (United States)

    Cook, Diane J.; Krishnan, Narayanan

    2014-01-01

    Individuals spend a majority of their time in their home or workplace and for many, these places are our sanctuaries. As society and technology advance there is a growing interest in improving the intelligence of the environments in which we live and work. By filling home environments with sensors and collecting data during daily routines, researchers can gain insights on human daily behavior and the impact of behavior on the residents and their environments. In this article we provide an overview of the data mining opportunities and challenges that smart environments provide for researchers and offer some suggestions for future work in this area. PMID:25506128

  20. Mining the Home Environment.

    Science.gov (United States)

    Cook, Diane J; Krishnan, Narayanan

    2014-12-01

    Individuals spend a majority of their time in their home or workplace and for many, these places are our sanctuaries. As society and technology advance there is a growing interest in improving the intelligence of the environments in which we live and work. By filling home environments with sensors and collecting data during daily routines, researchers can gain insights on human daily behavior and the impact of behavior on the residents and their environments. In this article we provide an overview of the data mining opportunities and challenges that smart environments provide for researchers and offer some suggestions for future work in this area.
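
    The two records above describe instrumenting homes with sensors and mining the resulting event streams, but they give no concrete data representation. The sketch below is purely illustrative; the event schema, sensor identifiers and the hour-of-day activity profile are assumptions, not the authors' method.

      # Minimal sketch: summarising simulated smart-home sensor events into an
      # hour-of-day activity profile per room. Schema and data are invented.
      from collections import Counter
      from dataclasses import dataclass
      from datetime import datetime

      @dataclass
      class SensorEvent:
          timestamp: datetime   # when the sensor fired
          sensor_id: str        # e.g. a motion-sensor identifier
          room: str             # location label attached to the sensor

      def hourly_profile(events):
          """Count sensor firings per (room, hour of day) as a crude activity profile."""
          profile = Counter()
          for ev in events:
              profile[(ev.room, ev.timestamp.hour)] += 1
          return profile

      events = [
          SensorEvent(datetime(2014, 6, 1, 7, 15), "M001", "kitchen"),
          SensorEvent(datetime(2014, 6, 1, 7, 40), "M002", "kitchen"),
          SensorEvent(datetime(2014, 6, 1, 23, 5), "M010", "bedroom"),
      ]

      for (room, hour), count in sorted(hourly_profile(events).items()):
          print(f"{room:8s} hour {hour:02d}: {count} events")

    Real smart-home datasets are of course far larger, and the mining step would go beyond simple counting (for example, activity recognition or anomaly detection), but the per-event structure is similar.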