WorldWideScience

Sample records for future supercomputer environments

  1. Supercomputing - Use Cases, Advances, The Future (2/2)

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Supercomputing has become a staple of science and the poster child for aggressive developments in silicon technology, energy efficiency and programming. In this series we examine the key components of supercomputing setups and the various advances – recent and past – that made headlines and delivered bigger and bigger machines. We also take a closer look at the future prospects of supercomputing, and the extent of its overlap with high throughput computing, in the context of main use cases ranging from oil exploration to market simulation. On the second day, we will focus on software and software paradigms driving supercomputers, workloads that need supercomputing treatment, advances in technology and possible future developments. Lecturer's short bio: Andrzej Nowak has 10 years of experience in computing technologies, primarily from CERN openlab and Intel. At CERN, he managed a research lab collaborating with Intel and was part of the openlab Chief Technology Office. Andrzej also worked closely and i...

  2. Supercomputing - Use Cases, Advances, The Future (1/2)

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Supercomputing has become a staple of science and the poster child for aggressive developments in silicon technology, energy efficiency and programming. In this series we examine the key components of supercomputing setups and the various advances – recent and past – that made headlines and delivered bigger and bigger machines. We also take a closer look at the future prospects of supercomputing, and the extent of its overlap with high throughput computing, in the context of main use cases ranging from oil exploration to market simulation. On the first day, we will focus on the history and theory of supercomputing, the top500 list and the hardware that makes supercomputers tick. Lecturer's short bio: Andrzej Nowak has 10 years of experience in computing technologies, primarily from CERN openlab and Intel. At CERN, he managed a research lab collaborating with Intel and was part of the openlab Chief Technology Office. Andrzej also worked closely and initiated projects with the private sector (e.g. HP an...

  3. Supercomputers and the future of computational atomic scattering physics

    International Nuclear Information System (INIS)

    Younger, S.M.

    1989-01-01

    The advent of the supercomputer has opened new vistas for the computational atomic physicist. Problems of hitherto unparalleled complexity are now being examined using these new machines, and important connections with other fields of physics are being established. This talk briefly reviews some of the most important trends in computational scattering physics and suggests some exciting possibilities for the future. 7 refs., 2 figs

  4. Argonne Leadership Computing Facility 2011 annual report : Shaping future supercomputing.

    Energy Technology Data Exchange (ETDEWEB)

    Papka, M.; Messina, P.; Coffey, R.; Drugan, C. (LCF)

    2012-08-16

The ALCF's Early Science Program aims to prepare key applications for the architecture and scale of Mira and to solidify libraries and infrastructure that will pave the way for other future production applications. Two billion core-hours have been allocated to 16 Early Science projects on Mira. The projects, in addition to promising delivery of exciting new science, are all based on state-of-the-art, petascale, parallel applications. The project teams, in collaboration with ALCF staff and IBM, have undertaken intensive efforts to adapt their software to take advantage of Mira's Blue Gene/Q architecture, which, in a number of ways, is a precursor to future high-performance-computing architecture. The Argonne Leadership Computing Facility (ALCF) enables transformative science that solves some of the most difficult challenges in biology, chemistry, energy, climate, materials, physics, and other scientific realms. Users partnering with ALCF staff have reached research milestones previously unattainable, due to the ALCF's world-class supercomputing resources and expertise in computational science. In 2011, the ALCF's commitment to providing outstanding science and leadership-class resources was honored with several prestigious awards. Research on multiscale brain blood flow simulations was named a Gordon Bell Prize finalist. Intrepid, the ALCF's BG/P system, ranked No. 1 on the Graph 500 list for the second consecutive year. The next-generation BG/Q prototype again topped the Green500 list. Skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. The Catalyst Team matches project PIs with experienced computational scientists to maximize and accelerate research in their specific scientific domains. The Performance Engineering Team facilitates the effective use of applications on the Blue Gene system by assessing and improving the algorithms used by applications and the techniques used to

  5. Programming Environment for a High-Performance Parallel Supercomputer with Intelligent Communication

    Directory of Open Access Journals (Sweden)

    A. Gunzinger

    1996-01-01

Full Text Available At the Electronics Laboratory of the Swiss Federal Institute of Technology (ETH) in Zürich, the high-performance parallel supercomputer MUSIC (MUlti processor System with Intelligent Communication) has been developed. As applications like neural network simulation and molecular dynamics show, the performance of the Electronics Laboratory supercomputer is absolutely on par with that of conventional supercomputers, but its electric power requirements are reduced by a factor of 1,000, its weight by a factor of 400, and its price by a factor of 100. Software development is a key issue of such parallel systems. This article focuses on the programming environment of the MUSIC system and on its applications.

  6. Supercomputational science

    CERN Document Server

    Wilson, S

    1990-01-01

In contemporary research, the supercomputer now ranks, along with radio telescopes, particle accelerators and the other apparatus of "big science", as an expensive resource, which is nevertheless essential for state of the art research. Supercomputers are usually provided as shared central facilities. However, unlike telescopes and accelerators, they find a wide range of applications which extends across a broad spectrum of research activity. The difference in performance between a "good" and a "bad" computer program on a traditional serial computer may be a factor of two or three, but on a contemporary supercomputer it can easily be a factor of one hundred or even more! Furthermore, this factor is likely to increase with future generations of machines. In keeping with the large capital and recurrent costs of these machines, it is appropriate to devote effort to training and familiarization so that supercomputers are employed to best effect. This volume records the lectures delivered at a Summer School ...

  7. KAUST Supercomputing Laboratory

    KAUST Repository

    Bailey, April Renee

    2011-11-15

KAUST has partnered with IBM to establish a Supercomputing Research Center. KAUST is hosting the Shaheen supercomputer, named after the Arabian falcon famed for its swiftness of flight. This 16-rack IBM Blue Gene/P system is equipped with 4 gigabytes of memory per node and is capable of 222 teraflops, making the KAUST campus the site of one of the world’s fastest supercomputers in an academic environment. KAUST is targeting petaflop capability within 3 years.

  8. Emerging supercomputer architectures

    Energy Technology Data Exchange (ETDEWEB)

    Messina, P.C.

    1987-01-01

This paper will examine the current and near future trends for commercially available high-performance computers with architectures that differ from the mainstream "supercomputer" systems in use for the last few years. These emerging supercomputer architectures are just beginning to have an impact on the field of high performance computing. 7 refs., 1 tab.

  9. LLMapReduce: Multi-Lingual Map-Reduce for Supercomputing Environments

    Science.gov (United States)

    2015-11-20

1990s. Popularized by Google [36] and Apache Hadoop [37], map-reduce has become a staple technology of the ever-growing big data community...script. In addition, supercomputing schedulers have the advantage of being programming language agnostic. The Apache Hadoop map-reduce implementation...supercomputer systems [26], using the Lustre central storage system instead of the Apache Hadoop distributed filesystem (HDFS) [37]. LLMapReduce can
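
    The record above is a search-snippet fragment, so as a reminder of the map-reduce pattern it refers to, here is a minimal, self-contained Python sketch of the canonical word-count example. It illustrates only the map and reduce phases; the function names are illustrative and this is not the LLMapReduce API.

        # Minimal word-count map-reduce sketch (illustrative only; not the LLMapReduce API).
        from collections import defaultdict
        from itertools import chain

        def map_phase(document):
            # Emit one (word, 1) pair per word in the input document.
            return [(word, 1) for word in document.split()]

        def reduce_phase(pairs):
            # Group the emitted pairs by key and sum the values: the classic word-count reducer.
            counts = defaultdict(int)
            for word, value in pairs:
                counts[word] += value
            return dict(counts)

        if __name__ == "__main__":
            documents = ["map reduce on supercomputers", "reduce data with map reduce"]
            mapped = chain.from_iterable(map_phase(d) for d in documents)  # map step
            print(reduce_phase(mapped))  # e.g. {'map': 2, 'reduce': 3, ...}

    In a scheduler-driven, LLMapReduce-style setting the map calls would run as independent batch jobs and the reduce would run once their outputs land on shared storage such as Lustre; the single-process version above only shows the data flow.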

  10. Future leisure environments

    Science.gov (United States)

    Elwood L. Shafer; George H. Moeller; Russell E. Getty

    1974-01-01

    As an aid to policy- and decision-making about future environmental problems, a panel of experts was asked to predict the probabilities of future events associated with natural-resource management, wildland-recreation management, environmental pollution, population-workforce-leisure, and urban environments. Though some of the predictions projected to the year 2050 may...

  11. Supercomputers Today

    Indian Academy of Sciences (India)

Supercomputers Today - Parallelism is the Name of the Game. V Rajaraman. General Article, Resonance – Journal of Science Education, Volume 3, Issue 11, November 1998, pp 54-68.

  12. UbiWorld: An environment integrating virtual reality, supercomputing, and design

    Energy Technology Data Exchange (ETDEWEB)

    Disz, T.; Papka, M.E.; Stevens, R. [Argonne National Lab., IL (United States). Mathematics and Computer Science Div.

    1997-07-01

    UbiWorld is a concept being developed by the Futures Laboratory group at Argonne National Laboratory that ties together the notion of ubiquitous computing (Ubicomp) with that of using virtual reality for rapid prototyping. The goal is to develop an environment where one can explore Ubicomp-type concepts without having to build real Ubicomp hardware. The basic notion is to extend object models in a virtual world by using distributed wide area heterogeneous computing technology to provide complex networking and processing capabilities to virtual reality objects.

  13. Enabling department-scale supercomputing

    Energy Technology Data Exchange (ETDEWEB)

    Greenberg, D.S.; Hart, W.E.; Phillips, C.A.

    1997-11-01

The Department of Energy (DOE) national laboratories have one of the longest and most consistent histories of supercomputer use. The authors summarize the architecture of DOE's new supercomputers that are being built for the Accelerated Strategic Computing Initiative (ASCI). The authors then argue that in the near future scaled-down versions of these supercomputers with petaflop-per-weekend capabilities could become widely available to hundreds of research and engineering departments. The availability of such computational resources will allow simulation of physical phenomena to become a full-fledged third branch of scientific exploration, along with theory and experimentation. They describe the ASCI and other supercomputer applications at Sandia National Laboratories, and discuss which lessons learned from Sandia's long history of supercomputing can be applied in this new setting.

  14. The airborne supercomputer

    Science.gov (United States)

    Rhea, John

    1990-05-01

A new class of airborne supercomputer designated RH-32 is being developed at USAF research facilities, capable of performing the critical battle management function for any future antiballistic missile system that emerges from the SDI. This research is also aimed at applications for future tactical aircraft and retrofit into the supercomputers of the ATF. The computers are based on a system architecture known as multi-interlock pipe stages, developed by DARPA. Fiber-optic data buses appear to be the only communications media likely to match the speed of the processors, and they have the added advantage of being inherently radiation resistant. The RH-32 itself, being the product of a basic research effort, may never see operational use. However, the technologies that emerge from this major R&D program will set the standards for airborne computers well into the next century.

  15. A visualization environment for supercomputing-based applications in computational mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Pavlakos, C.J.; Schoof, L.A.; Mareda, J.F.

    1993-06-01

In this paper, we characterize a visualization environment that has been designed and prototyped for a large community of scientists and engineers, with an emphasis on supercomputing-based computational mechanics. The proposed environment makes use of a visualization server concept to provide effective, interactive visualization to the user's desktop. Benefits of using the visualization server approach are discussed. Some thoughts regarding desirable features for visualization server hardware architectures are also addressed. A brief discussion of the software environment is included. The paper concludes by summarizing certain observations which we have made regarding the implementation of such visualization environments.

  16. Introduction to Reconfigurable Supercomputing

    CERN Document Server

    Lanzagorta, Marco; Rosenberg, Robert

    2010-01-01

This book covers technologies, applications, tools, languages, procedures, advantages, and disadvantages of reconfigurable supercomputing using Field Programmable Gate Arrays (FPGAs). The target audience is the community of users of High Performance Computers (HPC) who may benefit from porting their applications into a reconfigurable environment. As such, this book is intended to guide the HPC user through the many algorithmic considerations, hardware alternatives, usability issues, programming languages, and design tools that need to be understood before embarking on the creation of reconfigur...

  17. Future integrated design environments

    DEFF Research Database (Denmark)

    Christiansson, Per; Svidt, Kjeld; Sørensen, Kristian Birch

    2009-01-01

on the development. Among the most important are missing ontologies on both business and Web/Internet service levels as well as their interrelations, poor user involvement in the formulation of needs and requirements for new ICT tools, as well as in continuous user involvement in design and evaluation of new user...... to be increased. The paper presents a roadmap for development of future Integrated Building Design Systems (IBDS) with end-user participation. Methods for development of tools supporting creative and innovative building design with end-user participation are taken into account, including methods for capture...

  18. Computational Dimensionalities of Global Supercomputing

    Directory of Open Access Journals (Sweden)

    Richard S. Segall

    2013-12-01

Full Text Available This Invited Paper pertains to the subject of my Plenary Keynote Speech at the 17th World Multi-Conference on Systemics, Cybernetics and Informatics (WMSCI 2013), held in Orlando, Florida on July 9-12, 2013. The title of my Plenary Keynote Speech was: "Dimensionalities of Computation: from Global Supercomputing to Data, Text and Web Mining", but this Invited Paper will focus only on the "Computational Dimensionalities of Global Supercomputing" and is based upon a summary of the contents of several individual articles previously written with myself as lead author and published in [75], [76], [77], [78], [79], [80] and [11]. The topics of the Plenary Speech included Overview of Current Research in Global Supercomputing [75], Open-Source Software Tools for Data Mining Analysis of Genomic and Spatial Images using High Performance Computing [76], Data Mining Supercomputing with SAS™ JMP® Genomics ([77], [79], [80]), and Visualization by Supercomputing Data Mining [81]. ______________________ [11.] Committee on the Future of Supercomputing, National Research Council (2003), The Future of Supercomputing: An Interim Report, ISBN-13: 978-0-309-09016-2, http://www.nap.edu/catalog/10784.html [75.] Segall, Richard S.; Zhang, Qingyu and Cook, Jeffrey S. (2013), "Overview of Current Research in Global Supercomputing", Proceedings of Forty-Fourth Meeting of Southwest Decision Sciences Institute (SWDSI), Albuquerque, NM, March 12-16, 2013. [76.] Segall, Richard S. and Zhang, Qingyu (2010), "Open-Source Software Tools for Data Mining Analysis of Genomic and Spatial Images using High Performance Computing", Proceedings of 5th INFORMS Workshop on Data Mining and Health Informatics, Austin, TX, November 6, 2010. [77.] Segall, Richard S., Zhang, Qingyu and Pierce, Ryan M. (2010), "Data Mining Supercomputing with SAS™ JMP® Genomics: Research-in-Progress", Proceedings of 2010 Conference on Applied Research in Information Technology, sponsored by

  19. Comparison of supercomputers and mini-supercomputers for computational fluid dynamics calculations

    International Nuclear Information System (INIS)

    Gentzsch, W.

    1988-01-01

    Computational fluid dynamics (CFD) is a powerful tool for the simulation of complex fluid dynamics problems. In the future, the progress in CFD will depend on efficient algorithms as well as on the power and storage capacity of the computers available. A careful study and comparison of these supercomputers, therefore, is necessary. The following paper presents a short description of the Engineering and Scientific Model Benchmark, the supercomputers and mini-supercomputers under consideration, and a discussion of the benchmark results

  20. DCE. Future IHEP's computing environment

    International Nuclear Information System (INIS)

    Zheng Guorui; Liu Xiaoling

    1995-01-01

IHEP's computing environment consists of several different computing environments established on IHEP computer networks, of which the BES environment supporting HEP computing is the main part. In connection with the improvement and extension of the BES environment, the authors outline the development of these computing environments from the viewpoint of establishing a high energy physics (HEP) environment. The direction of evolving the IHEP computing environment toward distributed computing, based on current trends in distributed computing, is presented

  1. Japanese supercomputer technology

    International Nuclear Information System (INIS)

    Buzbee, B.L.; Ewald, R.H.; Worlton, W.J.

    1982-01-01

    In February 1982, computer scientists from the Los Alamos National Laboratory and Lawrence Livermore National Laboratory visited several Japanese computer manufacturers. The purpose of these visits was to assess the state of the art of Japanese supercomputer technology and to advise Japanese computer vendors of the needs of the US Department of Energy (DOE) for more powerful supercomputers. The Japanese foresee a domestic need for large-scale computing capabilities for nuclear fusion, image analysis for the Earth Resources Satellite, meteorological forecast, electrical power system analysis (power flow, stability, optimization), structural and thermal analysis of satellites, and very large scale integrated circuit design and simulation. To meet this need, Japan has launched an ambitious program to advance supercomputer technology. This program is described

  2. GREEN SUPERCOMPUTING IN A DESKTOP BOX

    Energy Technology Data Exchange (ETDEWEB)

    HSU, CHUNG-HSING [Los Alamos National Laboratory; FENG, WU-CHUN [NON LANL; CHING, AVERY [NON LANL

    2007-01-17

    The computer workstation, introduced by Sun Microsystems in 1982, was the tool of choice for scientists and engineers as an interactive computing environment for the development of scientific codes. However, by the mid-1990s, the performance of workstations began to lag behind high-end commodity PCs. This, coupled with the disappearance of BSD-based operating systems in workstations and the emergence of Linux as an open-source operating system for PCs, arguably led to the demise of the workstation as we knew it. Around the same time, computational scientists started to leverage PCs running Linux to create a commodity-based (Beowulf) cluster that provided dedicated computer cycles, i.e., supercomputing for the rest of us, as a cost-effective alternative to large supercomputers, i.e., supercomputing for the few. However, as the cluster movement has matured, with respect to cluster hardware and open-source software, these clusters have become much more like their large-scale supercomputing brethren - a shared (and power-hungry) datacenter resource that must reside in a machine-cooled room in order to operate properly. Consequently, the above observations, when coupled with the ever-increasing performance gap between the PC and cluster supercomputer, provide the motivation for a 'green' desktop supercomputer - a turnkey solution that provides an interactive and parallel computing environment with the approximate form factor of a Sun SPARCstation 1 'pizza box' workstation. In this paper, they present the hardware and software architecture of such a solution as well as its prowess as a developmental platform for parallel codes. In short, imagine a 12-node personal desktop supercomputer that achieves 14 Gflops on Linpack but sips only 185 watts of power at load, resulting in a performance-power ratio that is over 300% better than their reference SMP platform.
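
    As a quick back-of-the-envelope check of the figures quoted above (a hedged illustration, not text from the paper), the performance-power ratio of the 12-node desktop system works out to roughly 76 Mflops per watt:

        # Performance-per-watt check for the figures quoted in the abstract above.
        linpack_gflops = 14.0   # sustained Linpack performance of the 12-node desktop system
        power_watts = 185.0     # measured power draw at load

        mflops_per_watt = linpack_gflops * 1000.0 / power_watts
        print(f"{mflops_per_watt:.1f} Mflops/W")  # about 75.7 Mflops/W

        # "Over 300% better" than the reference SMP platform implies that platform delivered
        # only about a quarter to a third of this ratio, depending on how the percentage is
        # read; the abstract does not quote the reference number, so none is assumed here.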

  3. ASCI's Vision for supercomputing future

    International Nuclear Information System (INIS)

    Nowak, N.D.

    2003-01-01

The full text of publication follows. Advanced Simulation and Computing (ASC, formerly the Accelerated Strategic Computing Initiative [ASCI]) was established in 1995 to help Defense Programs shift from test-based confidence to simulation-based confidence. Specifically, ASC is a focused and balanced program that is accelerating the development of simulation capabilities needed to analyze and predict the performance, safety, and reliability of nuclear weapons and certify their functionality - far exceeding what might have been achieved in the absence of a focused initiative. To realize its vision, ASC is creating simulation and prototyping capabilities, based on advanced weapon codes and high-performance computing

  4. Supercomputers to transform Science

    CERN Multimedia

    2006-01-01

    "New insights into the structure of space and time, climate modeling, and the design of novel drugs, are but a few of the many research areas that will be transforned by the installation of three supercomputers at the Unversity of Bristol." (1/2 page)

  5. Desktop supercomputer: what can it do?

    Science.gov (United States)

    Bogdanov, A.; Degtyarev, A.; Korkhov, V.

    2017-12-01

    The paper addresses the issues of solving complex problems that require using supercomputers or multiprocessor clusters available for most researchers nowadays. Efficient distribution of high performance computing resources according to actual application needs has been a major research topic since high-performance computing (HPC) technologies became widely introduced. At the same time, comfortable and transparent access to these resources was a key user requirement. In this paper we discuss approaches to build a virtual private supercomputer available at user's desktop: a virtual computing environment tailored specifically for a target user with a particular target application. We describe and evaluate possibilities to create the virtual supercomputer based on light-weight virtualization technologies, and analyze the efficiency of our approach compared to traditional methods of HPC resource management.

  6. Supercomputing and nuclear safety

    International Nuclear Information System (INIS)

    Livolant, M.; Durin, M.; Micaelli, J.C.

    2003-01-01

Safety is essential for nuclear installations: it is necessary to avoid the release of radioactive materials outside them. They are therefore designed, built and operated in a way that prevents accidents, keeps the system in a safe state even if the largest accident considered in the design happens, and protects the population from harm in case of a beyond-design accident. Limiting the analysis to light water reactors, we can consider the interest of supercomputing in the following domains: - Primary circuit loss of coolant accident; - Computational Fluid Dynamics safety studies; - Treatment of uncertainties; - Simulators; - Severe accidents. The first topic, the primary circuit loss of coolant accident, has for many years driven the development of high-level codes to compute the behaviour of water-steam mixtures in a ruptured high-pressure circuit, the objective being to maintain the cooling of the core in spite of the continued heat source of the fission products. Well known codes like RELAP and CATHARE are largely in use. Research is in progress to improve the physical and numerical models and to extend the scope of calculations. The second topic is in progress, and a large variety of applications are in use or foreseen, at least for research or exploratory studies. Direct use of those techniques for proving the safety of systems will require a large validation effort, including comparison with experiments, and significant improvements in ease of use and speed of calculation. Typically, complex safety calculations are made with the best models and the best values of parameters, but there are uncertainties in the models and the parameters, and a safety analysis has to consider the worst conditions. There are no fully satisfactory methods to take care of the uncertainties, especially in the models, and, whichever methods are used, they multiply the calculation time by a factor between ten and one hundred. Operational simulators exist, at least

  7. Orbital Debris and Future Environment Remediation

    Science.gov (United States)

    Liou, Jer-Chyi

    2011-01-01

    This slide presentation is an overview of the historical and current orbital debris environment. Included is information about: Projected growth of the future debris population, The need for active debris removal (ADR), A grand challenge for the 21st century and The forward path

  8. Ultrascalable petaflop parallel supercomputer

    Science.gov (United States)

    Blumrich, Matthias A [Ridgefield, CT; Chen, Dong [Croton On Hudson, NY; Chiu, George [Cross River, NY; Cipolla, Thomas M [Katonah, NY; Coteus, Paul W [Yorktown Heights, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Hall, Shawn [Pleasantville, NY; Haring, Rudolf A [Cortlandt Manor, NY; Heidelberger, Philip [Cortlandt Manor, NY; Kopcsay, Gerard V [Yorktown Heights, NY; Ohmacht, Martin [Yorktown Heights, NY; Salapura, Valentina [Chappaqua, NY; Sugavanam, Krishnan [Mahopac, NY; Takken, Todd [Brewster, NY

    2010-07-20

    A massively parallel supercomputer of petaOPS-scale includes node architectures based upon System-On-a-Chip technology, where each processing node comprises a single Application Specific Integrated Circuit (ASIC) having up to four processing elements. The ASIC nodes are interconnected by multiple independent networks that optimally maximize the throughput of packet communications between nodes with minimal latency. The multiple networks may include three high-speed networks for parallel algorithm message passing including a Torus, collective network, and a Global Asynchronous network that provides global barrier and notification functions. These multiple independent networks may be collaboratively or independently utilized according to the needs or phases of an algorithm for optimizing algorithm processing performance. The use of a DMA engine is provided to facilitate message passing among the nodes without the expenditure of processing resources at the node.
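
    The interconnect described above is a torus, so every compute node has the same number of nearest neighbours regardless of where it sits in the machine. The short sketch below (an illustration of the topology only, not code from the patent; the dimensions are arbitrary example values) computes the six neighbours of a node in a 3-D torus with wrap-around links.

        # Nearest neighbours of a node in a 3-D torus with wrap-around links
        # (topology illustration only; the dimensions are arbitrary example values).
        def torus_neighbors(coord, dims):
            x, y, z = coord
            dx, dy, dz = dims
            return [
                ((x + 1) % dx, y, z), ((x - 1) % dx, y, z),  # +/- X neighbours
                (x, (y + 1) % dy, z), (x, (y - 1) % dy, z),  # +/- Y neighbours
                (x, y, (z + 1) % dz), (x, y, (z - 1) % dz),  # +/- Z neighbours
            ]

        if __name__ == "__main__":
            # Even a node on the "edge" of an 8 x 8 x 16 torus has six neighbours,
            # because the links wrap around each dimension.
            print(torus_neighbors((0, 0, 15), (8, 8, 16)))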

  9. The Pawsey Supercomputer geothermal cooling project

    Science.gov (United States)

    Regenauer-Lieb, K.; Horowitz, F.; Western Australian Geothermal Centre Of Excellence, T.

    2010-12-01

    The Australian Government has funded the Pawsey supercomputer in Perth, Western Australia, providing computational infrastructure intended to support the future operations of the Australian Square Kilometre Array radiotelescope and to boost next-generation computational geosciences in Australia. Supplementary funds have been directed to the development of a geothermal exploration well to research the potential for direct heat use applications at the Pawsey Centre site. Cooling the Pawsey supercomputer may be achieved by geothermal heat exchange rather than by conventional electrical power cooling, thus reducing the carbon footprint of the Pawsey Centre and demonstrating an innovative green technology that is widely applicable in industry and urban centres across the world. The exploration well is scheduled to be completed in 2013, with drilling due to commence in the third quarter of 2011. One year is allocated to finalizing the design of the exploration, monitoring and research well. Success in the geothermal exploration and research program will result in an industrial-scale geothermal cooling facility at the Pawsey Centre, and will provide a world-class student training environment in geothermal energy systems. A similar system is partially funded and in advanced planning to provide base-load air-conditioning for the main campus of the University of Western Australia. Both systems are expected to draw ~80-95 degrees C water from aquifers lying between 2000 and 3000 meters depth from naturally permeable rocks of the Perth sedimentary basin. The geothermal water will be run through absorption chilling devices, which only require heat (as opposed to mechanical work) to power a chilled water stream adequate to meet the cooling requirements. Once the heat has been removed from the geothermal water, licensing issues require the water to be re-injected back into the aquifer system. These systems are intended to demonstrate the feasibility of powering large-scale air

  10. Supercomputer optimizations for stochastic optimal control applications

    Science.gov (United States)

    Chung, Siu-Leung; Hanson, Floyd B.; Xu, Huihuang

    1991-01-01

Supercomputer optimizations for a computational method of solving stochastic, multibody, dynamic programming problems are presented. The computational method is valid for a general class of optimal control problems that are nonlinear, multibody dynamical systems, perturbed by general Markov noise in continuous time, i.e., nonsmooth Gaussian as well as jump Poisson random white noise. Optimization techniques for vector multiprocessors or vectorizing supercomputers include advanced data structures, loop restructuring, loop collapsing, blocking, and compiler directives. These advanced computing techniques and supercomputing hardware help alleviate Bellman's curse of dimensionality in dynamic programming computations, by permitting the solution of large multibody problems. Possible applications include lumped flight dynamics models for uncertain environments, such as large scale and background random aerospace fluctuations.
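
    To make the blocking technique named above concrete, the following NumPy sketch shows a blocked (tiled) matrix multiply. It only illustrates the traversal pattern that keeps a working tile in fast memory, under an assumed tile size of 64; it is not code from the paper.

        import numpy as np

        def blocked_matmul(a, b, block=64):
            """Tiled matrix multiply: work on block x block sub-matrices so each tile
            is reused while it is resident in fast memory (illustration of blocking,
            not the paper's implementation)."""
            n, k = a.shape
            k2, m = b.shape
            assert k == k2, "inner dimensions must match"
            c = np.zeros((n, m), dtype=a.dtype)
            for i0 in range(0, n, block):
                for j0 in range(0, m, block):
                    for k0 in range(0, k, block):
                        c[i0:i0 + block, j0:j0 + block] += (
                            a[i0:i0 + block, k0:k0 + block] @ b[k0:k0 + block, j0:j0 + block]
                        )
            return c

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            a, b = rng.random((200, 150)), rng.random((150, 100))
            assert np.allclose(blocked_matmul(a, b), a @ b)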

  11. Supercomputer debugging workshop 1991 proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.

    1991-12-31

This report discusses the following topics on supercomputer debugging: Distributed debugging; user interface to debugging tools and standards; debugging optimized codes; debugging parallel codes; and debugger performance and interface as analysis tools. (LSP)

  12. Supercomputer debugging workshop 1991 proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.

    1991-01-01

This report discusses the following topics on supercomputer debugging: Distributed debugging; user interface to debugging tools and standards; debugging optimized codes; debugging parallel codes; and debugger performance and interface as analysis tools. (LSP)

  13. Population, resources, environment: an uncertain future.

    Science.gov (United States)

    Repetto, R

    1987-07-01

    This issue analyzes the economic and environmental consequences of rapid population growth in developing countries (LDC), the population decline in developed countries, the limits that life on a finite planet impose on economic and demographic expansion and progress, and the proper governmental response to promote the welfare of its current and future citizens. The links between population growth, resource use, and environmental quality are too complex to permit straightforward generalizations about direct causal relationships. However, rapid population growth has increased the number of poor people in LDC, thus contributing to degradation of the environment and the renewable resources of land, water, and nonhuman species on which humans depend. Demands of the rich industrial countries have also generated environmental pressures and have been foremost in consumption of the nonrenewable resources of fossil fuels, metals, and nonmetallic minerals. On the other hand, population and economic growth have also stimulated technological and management changes that help supply and use resources more effectively. Wide variations in the possible ultimate size of world population and accelerating technological change make future interrelationships of population, resources, and the environment uncertain as well as complex. Those interrelationships are mediated largely by government policies. Responsible governments can bring about a sustainable balance in the population/resource/environment equation by adopting population and development policies that experience has shown could reduce future population numbers in LDC below the additional 5 billion indicated in current UN medium projections. This coupled with proven management programs in both LDC and developed countries could brake and reverse the depletion and degradation of natural resources.

  14. The natural radiation environment: future perspective

    International Nuclear Information System (INIS)

    Steinhaeusler, F.

    1992-01-01

The need to control the exposure of man to the natural radiation environment (NRE) is increasingly recognised. The main NRE sources and exposure situations warranting intensified efforts in the future are: exposure to radiation in space (astronaut: ≤ 1 mSv per day), technologically enhanced natural radiation (TENR; global impact: 400,000 man.Sv per year) and populations living in high background radiation areas (resident: ≤ 360 mGy per year). Data on NRE-TENR-induced biological effects are scarce and inconclusive, such as increased frequency of chromosome aberrations and mental retardation from environmental gamma radiation, but there are contradictory results for thorium and radon exposure induced lung cancer risk. Four coordinated actions are proposed, i.e. international standardisation of methods, coordination of multidisciplinary health effect studies, development of principles for NRE/TENR control, and establishment of an international clearing house for all NRE-related topics. (Author)

  15. Power-constrained supercomputing

    Science.gov (United States)

    Bailey, Peter E.

    As we approach exascale systems, power is turning from an optimization goal to a critical operating constraint. With power bounds imposed by both stakeholders and the limitations of existing infrastructure, achieving practical exascale computing will therefore rely on optimizing performance subject to a power constraint. However, this requirement should not add to the burden of application developers; optimizing the runtime environment given restricted power will primarily be the job of high-performance system software. In this dissertation, we explore this area and develop new techniques that extract maximum performance subject to a particular power constraint. These techniques include a method to find theoretical optimal performance, a runtime system that shifts power in real time to improve performance, and a node-level prediction model for selecting power-efficient operating points. We use a linear programming (LP) formulation to optimize application schedules under various power constraints, where a schedule consists of a DVFS state and number of OpenMP threads for each section of computation between consecutive message passing events. We also provide a more flexible mixed integer-linear (ILP) formulation and show that the resulting schedules closely match schedules from the LP formulation. Across four applications, we use our LP-derived upper bounds to show that current approaches trail optimal, power-constrained performance by up to 41%. This demonstrates limitations of current systems, and our LP formulation provides future optimization approaches with a quantitative optimization target. We also introduce Conductor, a run-time system that intelligently distributes available power to nodes and cores to improve performance. The key techniques used are configuration space exploration and adaptive power balancing. Configuration exploration dynamically selects the optimal thread concurrency level and DVFS state subject to a hardware-enforced power bound
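
    As a hedged sketch of the kind of linear-programming formulation described above (not the dissertation's actual model), the example below uses scipy.optimize.linprog to choose, for each code section between message-passing events, what fraction of its work runs in each (DVFS state, thread count) configuration so that total runtime is minimized while average power stays under a cap. The time and power table is invented for illustration.

        # Toy power-constrained schedule LP (illustrative formulation; the numbers are invented).
        import numpy as np
        from scipy.optimize import linprog

        # time (s) and average power (W) of three configurations for each of three sections
        t = np.array([[4.0, 3.0, 2.5],
                      [6.0, 4.5, 4.0],
                      [3.0, 2.2, 1.8]])
        p = np.array([[60.0, 80.0, 110.0],
                      [55.0, 75.0, 105.0],
                      [65.0, 85.0, 115.0]])
        power_cap = 90.0  # watts

        n_sec, n_cfg = t.shape
        # x[s, c] = fraction of section s run in configuration c (fractional relaxation)
        c_obj = t.ravel()                              # minimize total runtime
        A_ub = (t * (p - power_cap)).ravel()[None, :]  # energy <= cap * runtime, i.e. avg power <= cap
        b_ub = [0.0]
        A_eq = np.kron(np.eye(n_sec), np.ones(n_cfg))  # each section is fully scheduled
        b_eq = np.ones(n_sec)

        res = linprog(c_obj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, 1))
        print("total runtime (s):", round(res.fun, 3))
        print("schedule fractions:\n", res.x.reshape(n_sec, n_cfg).round(3))

    Feeding the same data to an ILP solver with binary variables would give single-configuration-per-section schedules of the kind the abstract mentions; the LP relaxation above bounds them from below.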

  16. World's fastest supercomputer opens up to users

    Science.gov (United States)

    Xin, Ling

    2016-08-01

    China's latest supercomputer - Sunway TaihuLight - has claimed the crown as the world's fastest computer according to the latest TOP500 list, released at the International Supercomputer Conference in Frankfurt in late June.

  17. Mistral Supercomputer Job History Analysis

    OpenAIRE

    Zasadziński, Michał; Muntés-Mulero, Victor; Solé, Marc; Ludwig, Thomas

    2018-01-01

    In this technical report, we show insights and results of operational data analysis from petascale supercomputer Mistral, which is ranked as 42nd most powerful in the world as of January 2018. Data sources include hardware monitoring data, job scheduler history, topology, and hardware information. We explore job state sequences, spatial distribution, and electric power patterns.

  18. Computational plasma physics and supercomputers

    International Nuclear Information System (INIS)

    Killeen, J.; McNamara, B.

    1984-09-01

    The Supercomputers of the 80's are introduced. They are 10 to 100 times more powerful than today's machines. The range of physics modeling in the fusion program is outlined. New machine architecture will influence particular codes, but parallel processing poses new coding difficulties. Increasing realism in simulations will require better numerics and more elaborate mathematics

  19. Supercomputing and related national projects in Japan

    International Nuclear Information System (INIS)

    Miura, Kenichi

    1985-01-01

    Japanese supercomputer development activities in the industry and research projects are outlined. Architecture, technology, software, and applications of Fujitsu's Vector Processor Systems are described as an example of Japanese supercomputers. Applications of supercomputers to high energy physics are also discussed. (orig.)

  20. An assessment of worldwide supercomputer usage

    Energy Technology Data Exchange (ETDEWEB)

    Wasserman, H.J.; Simmons, M.L.; Hayes, A.H.

    1995-01-01

    This report provides a comparative study of advanced supercomputing usage in Japan and the United States as of Spring 1994. It is based on the findings of a group of US scientists whose careers have centered on programming, evaluating, and designing high-performance supercomputers for over ten years. The report is a follow-on to an assessment of supercomputing technology in Europe and Japan that was published in 1993. Whereas the previous study focused on supercomputer manufacturing capabilities, the primary focus of the current work was to compare where and how supercomputers are used. Research for this report was conducted through both literature studies and field research in Japan.

  1. China Debates the Future Security Environment

    Science.gov (United States)

    2000-01-01

eternal verities of geopolitics and worst case scenarios. The Warring States era as a guide to the future is a rich subject, but it is never spelled...1995): 8-18, in Michael Pillsbury, Chinese Views of Future Warfare (Washington: National Defense University Press), 317-326. Hart Shengmin, ed...up, surrounded, and even destroyed by the other. • Establish sound economic structure. A Chinese military Research Fellow, Hart Ren, pointed

  2. Seismic signal processing on heterogeneous supercomputers

    Science.gov (United States)

    Gokhberg, Alexey; Ermert, Laura; Fichtner, Andreas

    2015-04-01

    The processing of seismic signals - including the correlation of massive ambient noise data sets - represents an important part of a wide range of seismological applications. It is characterized by large data volumes as well as high computational input/output intensity. Development of efficient approaches towards seismic signal processing on emerging high performance computing systems is therefore essential. Heterogeneous supercomputing systems introduced in the recent years provide numerous computing nodes interconnected via high throughput networks, every node containing a mix of processing elements of different architectures, like several sequential processor cores and one or a few graphical processing units (GPU) serving as accelerators. A typical representative of such computing systems is "Piz Daint", a supercomputer of the Cray XC 30 family operated by the Swiss National Supercomputing Center (CSCS), which we used in this research. Heterogeneous supercomputers provide an opportunity for manifold application performance increase and are more energy-efficient, however they have much higher hardware complexity and are therefore much more difficult to program. The programming effort may be substantially reduced by the introduction of modular libraries of software components that can be reused for a wide class of seismology applications. The ultimate goal of this research is design of a prototype for such library suitable for implementing various seismic signal processing applications on heterogeneous systems. As a representative use case we have chosen an ambient noise correlation application. Ambient noise interferometry has developed into one of the most powerful tools to image and monitor the Earth's interior. Future applications will require the extraction of increasingly small details from noise recordings. To meet this demand, more advanced correlation techniques combined with very large data volumes are needed. This poses new computational problems that
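
    As a hedged, single-node illustration of the correlation kernel at the heart of the workload described above (a conceptual sketch, not the prototype library), the snippet below cross-correlates two synthetic noise traces with NumPy's FFT and recovers the lag between them; on the heterogeneous nodes discussed, this FFT-based kernel is the kind of operation that would be offloaded to the GPU.

        # FFT-based cross-correlation of two synthetic ambient-noise traces
        # (conceptual sketch only; not the library described above).
        import numpy as np

        def cross_correlate(a, b):
            """Cross-correlation of two equal-length traces via the FFT."""
            n = len(a) + len(b) - 1
            nfft = 1 << (n - 1).bit_length()  # next power of two for the FFT length
            spec = np.fft.rfft(a, nfft) * np.conj(np.fft.rfft(b, nfft))
            return np.fft.irfft(spec, nfft)[:n]

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            common = rng.standard_normal(10_000)                    # shared noise source
            station_a = np.roll(common, 250) + 0.1 * rng.standard_normal(10_000)
            station_b = common + 0.1 * rng.standard_normal(10_000)
            cc = cross_correlate(station_a, station_b)
            print("lag of correlation peak:", int(np.argmax(cc)))   # about 250 samples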

  3. Environment, energy, economy. A sustainable future

    International Nuclear Information System (INIS)

    Luise, A.; Borrello, L.; Calef, D.; Cialani, C.; Di Majo, V.; Federio, A.; Lovisolo, G.; Musmeci, F.

    1998-01-01

    This paper is organized in five parts: 1. sustainable development from global point of view; 2. global problems and international instruments; 3. sustainable management of resources in economic systems; 4. forecasting and methods: models and index; 5. future urban areas [it

  4. TOP500 Supercomputers for June 2004

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2004-06-23

23rd Edition of TOP500 List of World's Fastest Supercomputers Released: Japan's Earth Simulator Enters Third Year in Top Position MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a closely watched event in the world of high-performance computing, the 23rd edition of the TOP500 list of the world's fastest supercomputers was released today (June 23, 2004) at the International Supercomputer Conference in Heidelberg, Germany.

  5. Requirements for supercomputing in energy research: The transition to massively parallel computing

    Energy Technology Data Exchange (ETDEWEB)

    1993-02-01

    This report discusses: The emergence of a practical path to TeraFlop computing and beyond; requirements of energy research programs at DOE; implementation: supercomputer production computing environment on massively parallel computers; and implementation: user transition to massively parallel computing.

  6. Exploration and production environment. Preserving the future our responsibility

    International Nuclear Information System (INIS)

    2004-01-01

    This document presents the Total Group commitments to manage natural resources in a rational way, to preserve biodiversity for future generations and protect the environment. It contains the health, safety, environment and quality charter of Total, the 12 exploration and production health, safety and environment rules and the exploration and production environmental policy. (A.L.B.)

  7. Supercomputer applications in nuclear research

    International Nuclear Information System (INIS)

    Ishiguro, Misako

    1992-01-01

The utilization of supercomputers at the Japan Atomic Energy Research Institute is reported. The fields of atomic energy research that use supercomputers frequently and the contents of their computations are outlined. Vectorization is briefly explained, and nuclear fusion, nuclear reactor physics, the thermal-hydraulic safety of nuclear reactors, the parallelism inherent in atomic energy computations of fluids and other problems, algorithms suited to vector processing, and the speed-up obtained by vectorization are discussed. At present the Japan Atomic Energy Research Institute uses two FACOM VP 2600/10 systems and three M-780 systems. The contents of computation changed from criticality computation around 1970, through the analysis of LOCA after the TMI accident, to nuclear fusion research, the design of new reactor types and reactor safety assessment at present. The method of using computers also advanced from batch processing to time-sharing processing, from one-dimensional to three-dimensional computation, from steady, linear to unsteady, nonlinear computation, from experimental analysis to numerical simulation, and so on. (K.I.)

  8. Comparing clusters and supercomputers for lattice QCD

    International Nuclear Information System (INIS)

    Gottlieb, Steven

    2001-01-01

    Since the development of the Beowulf project to build a parallel computer from commodity PC components, there have been many such clusters built. The MILC QCD code has been run on a variety of clusters and supercomputers. Key design features are identified, and the cost effectiveness of clusters and supercomputers are compared

  9. Low Cost Supercomputer for Applications in Physics

    Science.gov (United States)

    Ahmed, Maqsood; Ahmed, Rashid; Saeed, M. Alam; Rashid, Haris; Fazal-e-Aleem

    2007-02-01

Using parallel processing techniques and commodity hardware, Beowulf supercomputers can be built at a much lower cost. Research organizations and educational institutions are using this technique to build their own high performance clusters. In this paper we discuss the architecture and design of the Beowulf supercomputer and our own experience of building the BURRAQ cluster.

  10. Dynamic Optical Networks for Future Internet Environments

    Science.gov (United States)

    Matera, Francesco

    2014-05-01

    This article reports an overview on the evolution of the optical network scenario taking into account the exponential growth of connected devices, big data, and cloud computing that is driving a concrete transformation impacting the information and communication technology world. This hyper-connected scenario is deeply affecting relationships between individuals, enterprises, citizens, and public administrations, fostering innovative use cases in practically any environment and market, and introducing new opportunities and new challenges. The successful realization of this hyper-connected scenario depends on different elements of the ecosystem. In particular, it builds on connectivity and functionalities allowed by converged next-generation networks and their capacity to support and integrate with the Internet of Things, machine-to-machine, and cloud computing. This article aims at providing some hints of this scenario to contribute to analyze impacts on optical system and network issues and requirements. In particular, the role of the software-defined network is investigated by taking into account all scenarios regarding data centers, cloud computing, and machine-to-machine and trying to illustrate all the advantages that could be introduced by advanced optical communications.

  11. TOP500 Supercomputers for November 2003

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2003-11-16

22nd Edition of TOP500 List of World's Fastest Supercomputers Released MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a much-anticipated event in the world of high-performance computing, the 22nd edition of the TOP500 list of the world's fastest supercomputers was released today (November 16, 2003). The Earth Simulator supercomputer retains the number one position with its Linpack benchmark performance of 35.86 Tflop/s ("teraflops" or trillions of calculations per second). It was built by NEC and installed last year at the Earth Simulator Center in Yokohama, Japan.

  12. TOP500 Supercomputers for June 2005

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2005-06-22

25th Edition of TOP500 List of World's Fastest Supercomputers Released: DOE/LLNL BlueGene/L and IBM gain Top Positions MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a closely watched event in the world of high-performance computing, the 25th edition of the TOP500 list of the world's fastest supercomputers was released today (June 22, 2005) at the 20th International Supercomputing Conference (ISC2005) in Heidelberg, Germany.

  13. Integration of Titan supercomputer at OLCF with ATLAS Production System

    CERN Document Server

    AUTHOR|(SzGeCERN)643806; The ATLAS collaboration; De, Kaushik; Klimentov, Alexei; Nilsson, Paul; Oleynik, Danila; Padolski, Siarhei; Panitkin, Sergey; Wenaus, Torre

    2017-01-01

The PanDA (Production and Distributed Analysis) workload management system was developed to meet the scale and complexity of distributed computing for the ATLAS experiment. PanDA managed resources are distributed worldwide, on hundreds of computing sites, with thousands of physicists accessing hundreds of petabytes of data, and the rate of data processing already exceeds an exabyte per year. While PanDA currently uses more than 200,000 cores at well over 100 Grid sites, future LHC data taking runs will require more resources than Grid computing can possibly provide. Additional computing and storage resources are required. Therefore ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. In this paper we will describe a project aimed at integration of the ATLAS Production System with the Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF). The current approach utilizes a modified PanDA Pilot framework for jo...

  14. Extending ATLAS Computing to Commercial Clouds and Supercomputers

    CERN Document Server

    Nilsson, P; The ATLAS collaboration; Filipcic, A; Klimentov, A; Maeno, T; Oleynik, D; Panitkin, S; Wenaus, T; Wu, W

    2014-01-01

The Large Hadron Collider will resume data collection in 2015 with substantially increased computing requirements relative to its first 2009-2013 run. A near doubling of the energy and the data rate, a high level of event pile-up, and detector upgrades will mean the number and complexity of events to be analyzed will increase dramatically. A naive extrapolation of the Run 1 experience would suggest that a 5-6 fold increase in computing resources is needed - impossible within the anticipated flat computing budgets in the near future. Consequently ATLAS is engaged in an ambitious program to expand its computing to all available resources, notably including opportunistic use of commercial clouds and supercomputers. Such resources present new challenges in managing heterogeneity, supporting data flows, parallelizing workflows, provisioning software, and other aspects of distributed computing, all while minimizing operational load. We will present the ATLAS experience to date with clouds and supercomputers, and des...

  15. Integration of Titan supercomputer at OLCF with ATLAS production system

    CERN Document Server

    Panitkin, Sergey; The ATLAS collaboration

    2016-01-01

The PanDA (Production and Distributed Analysis) workload management system was developed to meet the scale and complexity of distributed computing for the ATLAS experiment. PanDA managed resources are distributed worldwide, on hundreds of computing sites, with thousands of physicists accessing hundreds of petabytes of data, and the rate of data processing already exceeds an exabyte per year. While PanDA currently uses more than 200,000 cores at well over 100 Grid sites, future LHC data taking runs will require more resources than Grid computing can possibly provide. Additional computing and storage resources are required. Therefore ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. In this talk we will describe a project aimed at integration of the ATLAS Production System with the Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF). The current approach utilizes a modified PanDA Pilot framework for job...

  16. Advanced Architectures for Astrophysical Supercomputing

    Science.gov (United States)

    Barsdell, B. R.; Barnes, D. G.; Fluke, C. J.

    2010-12-01

    Astronomers have come to rely on the increasing performance of computers to reduce, analyze, simulate and visualize their data. In this environment, faster computation can mean more science outcomes or the opening up of new parameter spaces for investigation. If we are to avoid major issues when implementing codes on advanced architectures, it is important that we have a solid understanding of our algorithms. A recent addition to the high-performance computing scene that highlights this point is the graphics processing unit (GPU). The hardware originally designed for speeding-up graphics rendering in video games is now achieving speed-ups of O(100×) in general-purpose computation - performance that cannot be ignored. We are using a generalized approach, based on the analysis of astronomy algorithms, to identify the optimal problem-types and techniques for taking advantage of both current GPU hardware and future developments in computing architectures.

  17. Federal Market Information Technology in the Post Flash Crash Era: Roles for Supercomputing

    Energy Technology Data Exchange (ETDEWEB)

    Bethel, E. Wes; Leinweber, David; Ruebel, Oliver; Wu, Kesheng

    2011-09-16

This paper describes collaborative work between active traders, regulators, economists, and supercomputing researchers to replicate and extend investigations of the Flash Crash and other market anomalies in a National Laboratory HPC environment. Our work suggests that supercomputing tools and methods will be valuable to market regulators in achieving the goal of market safety, stability, and security. Research results using high frequency data and analytics are described, and directions for future development are discussed. Currently the key mechanisms for preventing catastrophic market action are “circuit breakers.” We believe a more graduated approach, similar to the “yellow light” approach in motorsports to slow down traffic, might be a better way to achieve the same goal. To enable this objective, we study a number of indicators that could foresee hazards in market conditions and explore options to confirm such predictions. Our tests confirm that Volume Synchronized Probability of Informed Trading (VPIN) and a version of the volume Herfindahl-Hirschman Index (HHI) for measuring market fragmentation can indeed give strong signals ahead of the Flash Crash event on May 6, 2010. This is a preliminary step toward a full-fledged early-warning system for unusual market conditions.
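
    As a hedged illustration of the fragmentation indicator mentioned above, the snippet below computes a volume-based Herfindahl-Hirschman Index (HHI) from per-venue trade volumes. The volumes are invented, and the VPIN and HHI variants studied in the paper involve details (such as volume bucketing) that are not shown here.

        # Volume-based Herfindahl-Hirschman Index (HHI) as a market-fragmentation gauge
        # (illustrative sketch; the venue volumes below are invented).
        def hhi(volumes):
            """Sum of squared volume shares: 1.0 means all volume on one venue,
            1/len(volumes) means volume spread evenly across all venues."""
            total = sum(volumes)
            shares = [v / total for v in volumes]
            return sum(s * s for s in shares)

        if __name__ == "__main__":
            concentrated = [900, 50, 30, 20]    # most volume on a single venue
            fragmented = [260, 250, 250, 240]   # volume spread across four venues
            print(f"concentrated market HHI: {hhi(concentrated):.3f}")  # about 0.81
            print(f"fragmented market HHI:   {hhi(fragmented):.3f}")    # about 0.25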

  18. £16 million investment for 'virtual supercomputer'

    CERN Multimedia

    Holland, C

    2003-01-01

    "The Particle Physics and Astronomy Research Council is to spend 16million [pounds] to create a massive computing Grid, equivalent to the world's second largest supercomputer after Japan's Earth Simulator computer" (1/2 page)

  19. Plasma turbulence calculations on supercomputers

    International Nuclear Information System (INIS)

    Carreras, B.A.; Charlton, L.A.; Dominguez, N.; Drake, J.B.; Garcia, L.; Leboeuf, J.N.; Lee, D.K.; Lynch, V.E.; Sidikman, K.

    1991-01-01

    Although the single-particle picture of magnetic confinement is helpful in understanding some basic physics of plasma confinement, it does not give a full description. Collective effects dominate plasma behavior. Any analysis of plasma confinement requires a self-consistent treatment of the particles and fields. The general picture is further complicated because the plasma, in general, is turbulent. The study of fluid turbulence is a rather complex field by itself. In addition to the difficulties of classical fluid turbulence, plasma turbulence studies face the problems caused by the induced magnetic turbulence, which couples back to the fluid. Since the fluid is not a perfect conductor, this turbulence can lead to changes in the topology of the magnetic field structure, causing the magnetic field lines to wander radially. Because the plasma fluid flows along field lines, they carry the particles with them, and this enhances the losses caused by collisions. The changes in topology are critical for the plasma confinement. The study of plasma turbulence and the concomitant transport is a challenging problem. Because of the importance of solving the plasma turbulence problem for controlled thermonuclear research, the high complexity of the problem, and the necessity of attacking the problem with supercomputers, the study of plasma turbulence in magnetic confinement devices is a Grand Challenge problem.

  20. TOP500 Supercomputers for November 2004

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2004-11-08

    24th Edition of TOP500 List of World's Fastest Supercomputers Released: DOE/IBM BlueGene/L and NASA/SGI's Columbia gain Top Positions MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a closely watched event in the world of high-performance computing, the 24th edition of the TOP500 list of the world's fastest supercomputers was released today (November 8, 2004) at the SC2004 Conference in Pittsburgh, Pa.

  1. Energy-water-environment nexus underpinning future desalination sustainability

    KAUST Repository

    Shahzad, Muhammad Wakil

    2017-03-11

    The energy-water-environment nexus is central to attaining the COP21 goal of keeping the rise in global temperature below 2°C, but two thirds of the permissible CO2 emission budget has already been used and the remainder will be exhausted by 2050. A number of technological developments in the power and desalination sectors have improved their efficiencies to save energy and reduce carbon emissions, but these sectors are still operating at only 35% and 10% of their respective thermodynamic limits. Research in desalination processes aims to supply a growing world population with water for improved living standards while reducing specific energy consumption and protecting the environment. Recently developed, highly efficient nature-inspired membranes (aquaporin and graphene) and the trend toward hybridization of thermally driven cycles could potentially lower the energy requirement for water purification. This paper presents a state-of-the-art review of the interconnection between energy, water and the environment, and of future energy-efficient desalination possibilities to save energy and protect the environment.

  2. TOP500 Supercomputers for June 2002

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2002-06-20

    19th Edition of TOP500 List of World's Fastest Supercomputers Released MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a much-anticipated event in the world of high-performance computing, the 19th edition of the TOP500 list of the world's fastest supercomputers was released today (June 20, 2002). The recently installed Earth Simulator supercomputer at the Earth Simulator Center in Yokohama, Japan, is, as expected, the clear new number 1. Its performance of 35.86 Tflop/s (trillions of calculations per second) running the Linpack benchmark is almost five times higher than the performance of the now No. 2 IBM ASCI White system at Lawrence Livermore National Laboratory (7.2 Tflop/s). This powerful leapfrogging to the top by a system so much faster than the previous top system is unparalleled in the history of the TOP500.

  3. Status reports of supercomputing astrophysics in Japan

    International Nuclear Information System (INIS)

    Nakamura, Takashi; Nagasawa, Mikio

    1990-01-01

    The Workshop on Supercomputing Astrophysics was held at the National Laboratory for High Energy Physics (KEK, Tsukuba) from August 31 to September 2, 1989. More than 40 physicists and astronomers attended and discussed many topics in an informal atmosphere. The main purpose of this workshop was to survey the theoretical activities in computational astrophysics in Japan. It also aimed to promote effective collaboration among the numerical experimentalists working on supercomputing techniques. The presented papers covered stimulating subjects in hydrodynamics, plasma physics, gravitating systems, radiative transfer and general relativity. These numerical calculations have now become possible in Japan owing to the power of Japanese supercomputers such as the HITAC S820, Fujitsu VP400E and NEC SX-2. (J.P.N.)

  4. TOP500 Supercomputers for November 2002

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2002-11-15

    20th Edition of TOP500 List of World's Fastest Supercomputers Released MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a much-anticipated event in the world of high-performance computing, the 20th edition of the TOP500 list of the world's fastest supercomputers was released today (November 15, 2002). The Earth Simulator supercomputer, installed earlier this year at the Earth Simulator Center in Yokohama, Japan, retains the number one position with its Linpack benchmark performance of 35.86 Tflop/s (trillions of calculations per second). The No. 2 and No. 3 positions are held by two new, identical ASCI Q systems at Los Alamos National Laboratory (7.73 Tflop/s each). These systems are built by Hewlett-Packard and based on the AlphaServer SC computer system.

  5. TOP500 Supercomputers for June 2003

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack; Simon, Horst D.

    2003-06-23

    21st Edition of TOP500 List of World's Fastest Supercomputers Released MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a much-anticipated event in the world of high-performance computing, the 21st edition of the TOP500 list of the world's fastest supercomputers was released today (June 23, 2003). The Earth Simulator supercomputer built by NEC and installed last year at the Earth Simulator Center in Yokohama, Japan, with its Linpack benchmark performance of 35.86 Tflop/s (teraflops or trillions of calculations per second), retains the number one position. The number 2 position is held by the re-measured ASCI Q system at Los Alamos National Laboratory. With 13.88 Tflop/s, it is the second system ever to exceed the 10 Tflop/s mark. ASCI Q was built by Hewlett-Packard and is based on the AlphaServer SC computer system.

  6. Future Evolution of Virtual Worlds as Communication Environments

    Science.gov (United States)

    Prisco, Giulio

    Extensive experience creating locations and activities inside virtual worlds provides the basis for contemplating their future. Users of virtual worlds are diverse in their goals for these online environments; for example, immersionists want them to be alternative realities disconnected from real life, whereas augmentationists want them to be communication media supporting real-life activities. As the technology improves, the diversity of virtual worlds will increase along with their significance. Many will incorporate more advanced virtual reality, or serve as major media for long-distance collaboration, or become the venues for futurist social movements. Key issues are how people can create their own virtual worlds, travel across worlds, and experience a variety of multimedia immersive environments. This chapter concludes by noting the view among some computer scientists that future technologies will permit uploading human personalities to artificial intelligence avatars, thereby enhancing human beings and rendering the virtual worlds entirely real.

  7. Requirements for user interaction support in future CACE environments

    DEFF Research Database (Denmark)

    Ravn, Ole; Szymkat, M.

    1994-01-01

    Based on a review of user interaction modes and the specific needs of the CACE domain, the paper describes requirements for user interaction in future CACE environments. Taking another look at the design process in CACE, key areas in need of more user interaction support are pointed out. Three...... concepts are described through examples: dynamic data access, parallel evaluation and active documentation. The features of existing tools are summarized. The problem of how easily or 'naturally' the novel concepts are integrated is stressed...

  8. Computational plasma physics and supercomputers. Revision 1

    International Nuclear Information System (INIS)

    Killeen, J.; McNamara, B.

    1985-01-01

    The Supercomputers of the 80's are introduced. They are 10 to 100 times more powerful than today's machines. The range of physics modeling in the fusion program is outlined. New machine architecture will influence particular models, but parallel processing poses new programming difficulties. Increasing realism in simulations will require better numerics and more elaborate mathematical models

  9. The pan-European environment: glimpses into an uncertain future

    International Nuclear Information System (INIS)

    2007-01-01

    The rapidly changing nature of and increasing inter-linkages between many socio-economic phenomena - population growth and migration, globalisation and trade, personal consumption patterns and use of natural resources - are reflected in many of today's environment policy priorities: minimising and adapting to climate change; loss of biodiversity and ecosystem services; the degradation of such natural resources as land, freshwater and oceans; and the impacts of a wide range of pollutants on our environment and our health. The challenges that environmental policy makers are facing in this century are already very different from those of the last. Given the rapid change in socio-economic trends, both designing and implementing actions are becoming much more complex, and the way in which such policies deliver effective outcomes seems to be becoming increasingly uncertain. Alongside this, the time-lags between policy demands and institutional responses are often lengthening, with the institutional structures charged with designing and implementing agreed actions needing to change in order to keep up with this process. This report aims to contribute to the discussion about plausible future developments relevant to the wider European region and to stimulate medium- to long-term thinking in policy-making circles. It does so by sketching some of the key environmental concerns for the pan-European region based on the EEA's Europe's environment - The fourth assessment, and by highlighting some of the many uncertainties the future holds. (au)

  10. Toward a Proof of Concept Cloud Framework for Physics Applications on Blue Gene Supercomputers

    International Nuclear Information System (INIS)

    Dreher, Patrick; Scullin, William; Vouk, Mladen

    2015-01-01

    Traditional high performance supercomputers are capable of delivering large sustained state-of-the-art computational resources to physics applications over extended periods of time using batch processing mode operating environments. However, today there is an increasing demand for more complex workflows that involve large fluctuations in the levels of HPC physics computational requirements during the simulations. Some of the workflow components may also require a richer set of operating system features and schedulers than normally found in a batch oriented HPC environment. This paper reports on progress toward a proof of concept design that implements a cloud framework onto BG/P and BG/Q platforms at the Argonne Leadership Computing Facility. The BG/P implementation utilizes the Kittyhawk utility and the BG/Q platform uses an experimental heterogeneous FusedOS operating system environment. Both platforms use the Virtual Computing Laboratory as the cloud computing system embedded within the supercomputer. This proof of concept design allows a cloud to be configured so that it can capitalize on the specialized infrastructure capabilities of a supercomputer and the flexible cloud configurations without resorting to virtualization. Initial testing of the proof of concept system is done using the lattice QCD MILC code. These types of user reconfigurable environments have the potential to deliver experimental schedulers and operating systems within a working HPC environment for physics computations that may be different from the native OS and schedulers on production HPC supercomputers. (paper)

  11. Toward a Proof of Concept Cloud Framework for Physics Applications on Blue Gene Supercomputers

    Science.gov (United States)

    Dreher, Patrick; Scullin, William; Vouk, Mladen

    2015-09-01

    Traditional high performance supercomputers are capable of delivering large sustained state-of-the-art computational resources to physics applications over extended periods of time using batch processing mode operating environments. However, today there is an increasing demand for more complex workflows that involve large fluctuations in the levels of HPC physics computational requirements during the simulations. Some of the workflow components may also require a richer set of operating system features and schedulers than normally found in a batch oriented HPC environment. This paper reports on progress toward a proof of concept design that implements a cloud framework onto BG/P and BG/Q platforms at the Argonne Leadership Computing Facility. The BG/P implementation utilizes the Kittyhawk utility and the BG/Q platform uses an experimental heterogeneous FusedOS operating system environment. Both platforms use the Virtual Computing Laboratory as the cloud computing system embedded within the supercomputer. This proof of concept design allows a cloud to be configured so that it can capitalize on the specialized infrastructure capabilities of a supercomputer and the flexible cloud configurations without resorting to virtualization. Initial testing of the proof of concept system is done using the lattice QCD MILC code. These types of user reconfigurable environments have the potential to deliver experimental schedulers and operating systems within a working HPC environment for physics computations that may be different from the native OS and schedulers on production HPC supercomputers.

  12. Developments in the simulation of compressible inviscid and viscous flow on supercomputers

    International Nuclear Information System (INIS)

    Steger, J.L.; Buning, P.G.; Tel Aviv Univ., Israel)

    1985-01-01

    In anticipation of future supercomputers, finite difference codes are rapidly being extended to simulate three-dimensional compressible flow about complex configurations. Some of these developments are reviewed. The importance of computational flow visualization and diagnostic methods to three-dimensional flow simulation is also briefly discussed. 46 references

  13. Global environment outlook GEO5. Environment for the future we want

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-05-15

    The main goal of UNEP's Global Environment Outlook (GEO) is to keep governments and stakeholders informed of the state and trends of the global environment. Over the past 15 years, the GEO reports have examined a wealth of data, information and knowledge about the global environment; identified potential policy responses; and provided an outlook for the future. The assessments, and their consultative and collaborative processes, have worked to bridge the gap between science and policy by turning the best available scientific knowledge into information relevant for decision makers. The GEO-5 report is made up of 17 chapters organized into three distinct but linked parts. Part 1 - State and trends of the global environment; Part 2 - Policy options from the regions; Part 3 - Opportunities for a global response.

  14. Adventures in Supercomputing: An innovative program

    Energy Technology Data Exchange (ETDEWEB)

    Summers, B.G.; Hicks, H.R.; Oliver, C.E.

    1995-06-01

    Within the realm of education, seldom does an innovative program become available with the potential to change an educator's teaching methodology and serve as a spur to systemic reform. The Adventures in Supercomputing (AiS) program, sponsored by the Department of Energy, is such a program. Adventures in Supercomputing is a program for high school and middle school teachers. It has helped to change the teaching paradigm of many of the teachers involved in the program from a teacher-centered classroom to a student-centered classroom. "A student-centered classroom offers better opportunities for development of internal motivation, planning skills, goal setting and perseverance than does the traditional teacher-directed mode". Not only is the process of teaching changed, but evidences of systemic reform are beginning to surface. After describing the program, the authors discuss the teaching strategies being used and the evidences of systemic change in many of the AiS schools in Tennessee.

  15. Supercomputing Centers and Electricity Service Providers

    DEFF Research Database (Denmark)

    Patki, Tapasya; Bates, Natalie; Ghatikar, Girish

    2016-01-01

    Supercomputing Centers (SCs) have high and variable power demands, which increase the challenges of the Electricity Service Providers (ESPs) with regards to efficient electricity distribution and reliable grid operation. High penetration of renewable energy generation further exacerbates this pro...... (LRZ). We conclude that perspectives on demand management are dependent on the electricity market and pricing in the geographical region and on the degree of control that a particular SC has in terms of power-purchase negotiation....

  16. Towards a future robotic home environment: a survey.

    Science.gov (United States)

    Güttler, Jörg; Georgoulas, Christos; Linner, Thomas; Bock, Thomas

    2015-01-01

    Demographic change has resulted in an increase of elderly people, while at the same time the number of active working people is falling. In the future, there will be less caretaking, which is necessary to support the aging population. In order to enable the aged population to live in dignity, they should be able to perform activities of daily living (ADLs) as independently as possible. The aim of this paper is to describe several solutions and concepts that can support elderly people in their ADLs in a way that allows them to stay self-sufficient for as long as possible. To reach this goal, the Building Realization and Robotics Lab is researching in the field of ambient assisted living. The idea is to implement robots and sensors in the home environment so as to efficiently support the inhabitants in their ADLs and eventually increase their independence. Through embedding vital sensors into furniture and using ICT technologies, the health status of elderly people can be remotely evaluated by a physician or family members. By investigating ergonomic aspects specific to elderly people (e.g. via an age-simulation suit), it is possible to develop and test new concepts and novel applications, which will offer innovative solutions. Via the introduction of mechatronics and robotics, the home environment can be made able to seamlessly interact with the inhabitant through gestures, vocal commands, and visual recognition algorithms. Meanwhile, several solutions have been developed that address how to build a smart home environment in order to create an ambient assisted environment. This article describes how these concepts were developed. The approach for each concept, proposed in this article, was performed as follows: (1) research of needs, (2) creating definitions of requirements, (3) identification of necessary technology and processes, (4) building initial concepts, (5) experiments in a real environment, and (6) development of the final concepts. To keep these concepts

  17. Centralized supercomputer support for magnetic fusion energy research

    International Nuclear Information System (INIS)

    Fuss, D.; Tull, G.G.

    1984-01-01

    High-speed computers with large memories are vital to magnetic fusion energy research. Magnetohydrodynamic (MHD), transport, equilibrium, Vlasov, particle, and Fokker-Planck codes that model plasma behavior play an important role in designing experimental hardware and interpreting the resulting data, as well as in advancing plasma theory itself. The size, architecture, and software of supercomputers to run these codes are often the crucial constraints on the benefits such computational modeling can provide. Hence, vector computers such as the CRAY-1 offer a valuable research resource. To meet the computational needs of the fusion program, the National Magnetic Fusion Energy Computer Center (NMFECC) was established in 1974 at the Lawrence Livermore National Laboratory. Supercomputers at the central computing facility are linked to smaller computer centers at each of the major fusion laboratories by a satellite communication network. In addition to providing large-scale computing, the NMFECC environment stimulates collaboration and the sharing of computer codes and data among the many fusion researchers in a cost-effective manner

  18. A workbench for tera-flop supercomputing

    International Nuclear Information System (INIS)

    Resch, M.M.; Kuester, U.; Mueller, M.S.; Lang, U.

    2003-01-01

    Supercomputers currently reach a peak performance in the range of TFlop/s. With but one exception - the Japanese Earth Simulator - none of these systems has so far been able to also show a level of sustained performance for a variety of applications that comes close to the peak performance. Sustained TFlop/s are therefore rarely seen. The reasons are manifold and are well known: Bandwidth and latency both for main memory and for the internal network are the key internal technical problems. Cache hierarchies with large caches can bring relief but are no remedy to the problem. However, there are not only technical problems that inhibit the full exploitation by scientists of the potential of modern supercomputers. More and more organizational issues come to the forefront. This paper shows the approach of the High Performance Computing Center Stuttgart (HLRS) to deliver a sustained performance of TFlop/s for a wide range of applications from a large group of users spread over Germany. The core of the concept is the role of the data. Around this we design a simulation workbench that hides the complexity of interacting computers, networks and file systems from the user. (authors)

  19. PNNL supercomputer to become largest computing resource on the Grid

    CERN Multimedia

    2002-01-01

    Hewlett Packard announced that the US DOE Pacific Northwest National Laboratory will connect a 9.3-teraflop HP supercomputer to the DOE Science Grid. This will be the largest supercomputer attached to a computer grid anywhere in the world (1 page).

  20. HPL and STREAM Benchmarks on SANAM Supercomputer

    KAUST Repository

    Bin Sulaiman, Riman A.

    2017-03-13

    SANAM supercomputer was jointly built by KACST and FIAS in 2012 ranking second that year in the Green500 list with a power efficiency of 2.3 GFLOPS/W (Rohr et al., 2014). It is a heterogeneous accelerator-based HPC system that has 300 compute nodes. Each node includes two Intel Xeon E5-2650 CPUs, two AMD FirePro S10000 dual GPUs and 128 GiB of main memory. In this work, the seven benchmarks of HPCC were installed and configured to reassess the performance of SANAM, as part of an unpublished master thesis, after it was reassembled in the Kingdom of Saudi Arabia. We present here detailed results of HPL and STREAM benchmarks.
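    The STREAM benchmark mentioned here measures sustainable memory bandwidth with four simple vector kernels; the most commonly quoted is the triad, a[i] = b[i] + q*c[i]. Below is a rough NumPy sketch of a triad-style measurement, intended only as an illustration of what the benchmark times; it is not the official STREAM code, and the array size is an arbitrary assumption.

```python
# Rough, illustrative STREAM-triad-style bandwidth estimate with NumPy.
# Not the official STREAM benchmark; array size is an arbitrary assumption.
import time
import numpy as np

N = 20_000_000                       # elements per array (assumption)
q = 3.0                              # triad scalar
a = np.zeros(N)
b = np.random.rand(N)
c = np.random.rand(N)

best = float("inf")
for _ in range(5):                   # keep the fastest of several trials
    t0 = time.perf_counter()
    a[:] = b + q * c                 # triad kernel: a[i] = b[i] + q*c[i]
    best = min(best, time.perf_counter() - t0)

# STREAM counts 24 bytes per element for triad (read b, read c, write a).
# NumPy's temporary for q*c adds extra traffic, so treat this as a rough figure.
bytes_moved = 24 * N
print(f"Approximate triad bandwidth: {bytes_moved / best / 1e9:.1f} GB/s")
```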

  1. Urban warming in Tokyo area and counterplan to improve future environment

    International Nuclear Information System (INIS)

    Saitoh, T.S.; Hoshi, H.

    1993-01-01

    The rapid progress in industrialization and concentration of economic and social functions in urban areas has stimulated a consistent increase in population and energy consumption. The sudden urbanization in modern cities has caused environmental problems including alteration of the local climate. This is a phenomenon peculiar to the urban areas, and is characterized by a consistent rise in the temperature of the urban atmosphere, an increase in air pollutants, a decrease in relative humidity, and so on. The phenomenon characterized by a noticeable temperature rise in the urban atmosphere has been called the urban heat island and analyzed by both observational and numerical approaches. The numerical model can be classified into two ways: the mechanical model and energy balance model. Since Howard reported on the urban heat island in London, there have been a number of observational studies and numerical studies based on the two-dimensional modeling. Recently, three-dimensional studies have been reported simultaneously with the great advancement of supercomputers. The present paper reports the results of the field observation by automobiles in the Tokyo metropolitan area and also the results of the three-dimensional simulation for urban warming in Tokyo at present and in the future around 2030. Further, the authors also present the results of a simulation for the effect of tree planting and vegetation.

  2. Man and environment-problems of the future

    CERN Document Server

    Zaviskii, E K; Okun, Lev Borisovich; Smirnov, B M

    1977-01-01

    The depletion of natural resources, against the background of increasing numbers of population, is discussed. Man's effect on his environment with emphasis on the reduction in stratospheric ozone is considered. (0 refs).

  3. Human requirements in future air-conditioned environments

    DEFF Research Database (Denmark)

    Fanger, Povl Ole

    1999-01-01

    Although air-conditioning has played a positive role for economic development in warm climates, its image is globally mixed. Field studies demonstrate that there are substantial numbers of dissatisfied people in many buildings, among them those suffering from Sick Building Syndrome (SBS) symptoms......, even though existing standards and guidelines are met. A paradigm shift from rather mediocre to excellent indoor environments is foreseen in the 21st century. Based on existing information and on new research results, five principles are suggested as elements behind a new philosophy of excellence...... control of the thermal environment should be provided. These principles of excellence are compatible with energy efficiency and sustainability....

  4. Human requirements in future air-conditioned environments

    DEFF Research Database (Denmark)

    Fanger, Povl Ole

    2001-01-01

    Although air-conditioning has played a positive role for economic development in warm climates, its image is globally mixed. Field studies demonstrate that there are substantial numbers of dissatisfied people in many buildings, among them those suffering from Sick Building Syndrome (SBS) symptoms......, even though existing standards and guidelines are met. A paradigm shift from rather mediocre to excellent indoor environments is foreseen in the 21st century. Based on existing information and on new research results, five principles are suggested as elements behind a new philosophy of excellence...... individual; individual control of the thermal environment should be provided. These principles of excellence are compatible with energy efficiency and sustainability....

  5. Language Learning in Virtual Reality Environments: Past, Present, and Future

    Science.gov (United States)

    Lin, Tsun-Ju; Lan, Yu-Ju

    2015-01-01

    This study investigated the research trends in language learning in a virtual reality environment by conducting a content analysis of findings published in the literature from 2004 to 2013 in four top ranked computer-assisted language learning journals: "Language Learning & Technology," "CALICO Journal," "Computer…

  6. Valuing the environment: Economics for a sustainable future | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2010-10-06

    Cao Jing, Assistant Professor at Tsinghua University in Beijing, describes her research on the joint benefits of reducing greenhouse gas emissions while also achieving the domestic benefits of pollution control. Her work on the topic, sponsored by the Economy and Environment Program ...

  7. The future of levies in a digital environment: final report

    NARCIS (Netherlands)

    Hugenholtz, P.B.; Guibault, L.; van Geffen, S.

    2003-01-01

    Copyright levy systems have been premised on the assumption that private copying of protected works cannot be controlled and exploited individually. With the advent of digital rights management (DRM), this assumption must be re-examined. In the digital environment, technical protection measures and

  8. BSMBench: a flexible and scalable supercomputer benchmark from computational particle physics

    CERN Document Server

    Bennett, Ed; Del Debbio, Luigi; Jordan, Kirk; Patella, Agostino; Pica, Claudio; Rago, Antonio

    2016-01-01

    Benchmarking plays a central role in the evaluation of High Performance Computing architectures. Several benchmarks have been designed that allow users to stress various components of supercomputers. In order for the figures they provide to be useful, benchmarks need to be representative of the most common real-world scenarios. In this work, we introduce BSMBench, a benchmarking suite derived from Monte Carlo code used in computational particle physics. The advantage of this suite (which can be freely downloaded from http://www.bsmbench.org/) over others is the capacity to vary the relative importance of computation and communication. This enables the tests to simulate various practical situations. To showcase BSMBench, we perform a wide range of tests on various architectures, from desktop computers to state-of-the-art supercomputers, and discuss the corresponding results. Possible future directions of development of the benchmark are also outlined.
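    The suite's distinguishing feature, the ability to vary the relative importance of computation and communication, can be illustrated with a toy example. The mpi4py sketch below is not BSMBench (which is derived from lattice Monte Carlo code); it only shows how a single ratio parameter can shift the balance between local arithmetic and a global reduction, with all sizes chosen arbitrarily.

```python
# Toy illustration (not BSMBench): shift the balance between local compute
# and global communication with a single "ratio" parameter.
# Run with e.g.:  mpirun -n 4 python compute_comm_toy.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

ratio = 8          # compute "sweeps" per communication step (assumption)
n = 1_000_000      # local vector length (assumption)
steps = 10

x = np.random.rand(n)
t0 = MPI.Wtime()
for _ in range(steps):
    for _ in range(ratio):                      # local, communication-free work
        x = 0.5 * x + 0.25                      # simple streaming arithmetic
    total = comm.allreduce(x.sum(), op=MPI.SUM) # global communication
t1 = MPI.Wtime()

if rank == 0:
    print(f"ratio={ratio}  elapsed={t1 - t0:.3f}s  checksum={total:.3e}")
```

Raising the ratio makes the run compute-bound; lowering it stresses the interconnect, which is the knob the benchmark suite exposes.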

  9. Building more powerful less expensive supercomputers using Processing-In-Memory (PIM) LDRD final report.

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, Richard C.

    2009-09-01

    This report details the accomplishments of the 'Building More Powerful Less Expensive Supercomputers Using Processing-In-Memory (PIM)' LDRD ('PIM LDRD', number 105809) for FY07-FY09. Latency dominates all levels of supercomputer design. Within a node, increasing memory latency, relative to processor cycle time, limits CPU performance. Between nodes, the same increase in relative latency impacts scalability. Processing-In-Memory (PIM) is an architecture that directly addresses this problem using enhanced chip fabrication technology and machine organization. PIMs combine high-speed logic and dense, low-latency, high-bandwidth DRAM, and lightweight threads that tolerate latency by performing useful work during memory transactions. This work examines the potential of PIM-based architectures to support mission critical Sandia applications and an emerging class of more data intensive informatics applications. This work has resulted in a stronger architecture/implementation collaboration between 1400 and 1700. Additionally, key technology components have impacted vendor roadmaps, and we are in the process of pursuing these new collaborations. This work has the potential to impact future supercomputer design and construction, reducing power and increasing performance. This final report is organized as follows: this summary chapter discusses the impact of the project (Section 1), provides an enumeration of publications and other public discussion of the work (Section 1), and concludes with a discussion of future work and impact from the project (Section 1). The appendix contains reprints of the refereed publications resulting from this work.

  10. Integration of Titan supercomputer at OLCF with ATLAS Production System

    Science.gov (United States)

    Barreiro Megino, F.; De, K.; Jha, S.; Klimentov, A.; Maeno, T.; Nilsson, P.; Oleynik, D.; Padolski, S.; Panitkin, S.; Wells, J.; Wenaus, T.; ATLAS Collaboration

    2017-10-01

    The PanDA (Production and Distributed Analysis) workload management system was developed to meet the scale and complexity of distributed computing for the ATLAS experiment. PanDA-managed resources are distributed worldwide across hundreds of computing sites, with thousands of physicists accessing hundreds of petabytes of data, and the rate of data processing already exceeds an exabyte per year. While PanDA currently uses more than 200,000 cores at well over 100 Grid sites, future LHC data-taking runs will require more resources than Grid computing can possibly provide. Additional computing and storage resources are required. Therefore ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. In this paper we will describe a project aimed at integration of the ATLAS Production System with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF). The current approach utilizes a modified PanDA Pilot framework for job submission to Titan’s batch queues and local data management, with lightweight MPI wrappers to run single-node workloads in parallel on Titan’s multi-core worker nodes. It provides for running standard ATLAS production jobs on unused resources (backfill) on Titan. The system has already allowed ATLAS to collect millions of core-hours per month on Titan and to execute hundreds of thousands of jobs, while simultaneously improving Titan's utilization efficiency. We will discuss the details of the implementation, current experience with running the system, as well as future plans aimed at improvements in scalability and efficiency. Notice: This manuscript has been authored by employees of Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy. The publisher by accepting the manuscript for publication acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to
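    The lightweight MPI wrapper idea described here, using one batch allocation to run many independent single-node payloads in parallel, can be sketched as follows. This is a hedged illustration under stated assumptions, not the actual PanDA Pilot or OLCF wrapper; the payload command, directory naming and launch line are hypothetical.

```python
# Hedged sketch of a lightweight MPI wrapper: each MPI rank runs one
# independent single-node payload from its own working directory.
# Not the actual PanDA/Titan wrapper; payload command is hypothetical.
# Launch with e.g.:  aprun -n <nodes> -N 1 python mpi_wrapper.py
import os
import subprocess
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

workdir = f"worker_{rank:05d}"          # hypothetical per-rank sandbox
os.makedirs(workdir, exist_ok=True)

# Hypothetical payload: stands in for an ATLAS simulation job.
payload = ["python", "-c", f"print('payload on rank {rank} done')"]

with open(os.path.join(workdir, "payload.log"), "w") as log:
    ret = subprocess.run(payload, cwd=workdir, stdout=log,
                         stderr=subprocess.STDOUT).returncode

# Gather return codes on rank 0 so the wrapper can report overall success.
codes = comm.gather(ret, root=0)
if rank == 0:
    failed = [i for i, c in enumerate(codes) if c != 0]
    print(f"{len(codes)} payloads finished, {len(failed)} failed: {failed}")
```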

  11. Past successes and future challenges: Improving the urban environment

    International Nuclear Information System (INIS)

    Gade, M.

    1994-01-01

    The author discusses issues related to the Chicago urban environment from her perspective in the Illinois Environmental Protection Agency. Understanding of the ozone air pollution problem in the Chicago area has undergone significant changes in the past three years, and there is still more to be understood about the complex factors which contribute to ozone pollution over urban areas such as Chicago. Ability to address these problems to present clean air standards is not in hand at present. The author asserts that information, and the ability of governmental agencies to ingest and respond to that information in a timely manner is a key to improvement of the environment in urban areas in reasonable time spans. In addition cost and price information on environmental control and protection needs to be more clearly presented to the people so they can understand the difficult choices which must be made in addressing these environmental problems

  12. Past successes and future challenges: Improving the urban environment

    Energy Technology Data Exchange (ETDEWEB)

    Gade, M.

    1994-12-31

    The author discusses issues related to the Chicago urban environment from her perspective in the Illinois Environmental Protection Agency. Understanding of the ozone air pollution problem in the Chicago area has undergone significant changes in the past three years, and there is still more to be understood about the complex factors which contribute to ozone pollution over urban areas such as Chicago. Ability to address these problems to present clean air standards is not in hand at present. The author asserts that information, and the ability of governmental agencies to ingest and respond to that information in a timely manner is a key to improvement of the environment in urban areas in reasonable time spans. In addition cost and price information on environmental control and protection needs to be more clearly presented to the people so they can understand the difficult choices which must be made in addressing these environmental problems.

  13. Human requirements in future air-conditioned environments

    DEFF Research Database (Denmark)

    Fanger, Povl Ole

    2002-01-01

    Air-conditioning of buildings has played a very positive role for economic development in warm climates. Still its image is globally mixed. Field studies demonstrate that there are substantial numbers of dissatisfied people in many buildings, among them those suffering from SBS symptoms, even...... though existing standards and guidelines are met. A paradigm shift from rather mediocre to excellent indoor environments is foreseen in buildings in the 21st century. Based on existing information and on new research results, five principles are suggested as elements behind a new philosophy of excellence...... in the built environment: better indoor air quality increases productivity and decreases SBS symptoms; unnecessary indoor pollution sources should be avoided; the air should be served cool and dry to the occupants; personalized ventilation, i.e. small amounts of clean air, should be provided gently, close...

  14. Future Environments for Europe: Some Implications of Alternative Development Paths

    OpenAIRE

    Stigliani, W.M.; Brouwer, F.; Munn, R.E.; Shaw, R.W.; Antonovsky, M.Y.

    1989-01-01

    With the completion of the Report of the World Commission on Environment and Development (popularly known also as the Brundtland Commission Report) in 1987, and the subsequent worldwide attention given to that study, the concept of "ecologically sustainable development" has gained broad recognition. It is now commonly acknowledged that economic development and ecological sustainability are not contradictory goals. To the contrary, they are interdependent -- the economy of a country cannot gro...

  15. The Millennial Generation: Developing Leaders for the Future Security Environment

    Science.gov (United States)

    2011-02-15

    have been told and shown continually how special they are, and by being raised in an environment of instant gratification. Whether the immediate...to parents and friends through technology. They are the first generation to use e-mail, cell-phones, and instant messaging in childhood and...technological advances and want to be a part of technological solutions to problems. Roughly 95 percent of young adults (18-29) use the Internet, and

  16. International Space Environment Service: Current Activities and Future Plans

    Science.gov (United States)

    Boteler, D. H.; Lundstedt, H.; Kunches, J.; Coffey, H.; Hilgers, A.; Patterson, G.; van der Linden, R.; Lam, H.-L.; Wang, H.; Buresova, D.; et al.

    The International Space Environment Service (ISES) is a permanent service of the Federation of Astronomical and Geophysical Data Analysis Services (FAGS) with the mission to encourage and facilitate near-real-time international monitoring and prediction of the space environment. This is done through the work of Regional Warning Centres (RWCs) around the world, who collaborate in the exploitation of a wide range of space-based and ground-based data. Rapid exchange of information about the space environment is facilitated through the use of standard URSIgram codes. RWCs also collaborate in sharing expertise in particular areas of specialty. ISES also prepares the International Geophysical Calendar (IGC) each year, giving a list of World Days during which scientists are encouraged to carry out their experiments, and the monthly Spacewarn Bulletins, which summarize the status of satellites in earth orbit and in the interplanetary medium. ISES has its origins in the former URSI Central Committee of URSIgrams, which initiated rapid international data interchange services in 1928. The modern system of regional warning centres was set up during the International Geophysical Year, and centres now exist in every populated continent except Africa and South America; ISES, as part of its IGY 50 activities, is working to develop RWCs in those continents. ISES is also involved in developing new multi-national space weather services, for example for trans-polar flights. New space-based data on space weather activity will require extensive collaboration if it is to be

  17. Sample environments at IPNS: present and future capabilities

    International Nuclear Information System (INIS)

    Faber, J. Jr.

    1984-02-01

    Argonne's Intense Pulsed Neutron Source, IPNS, was dedicated as a major user-oriented neutron scattering facility two years ago. Most instruments are now equipped to provide for sample environments in the temperature range 1.5 < T < 1300K. A special facility provides T < 1mK, and another provides pressures to 30kbar. Several environmental equipment designs are described that emphasize time-of-flight technique. Methods for achieving time-resolved experiments which take advantage of the IPNS pulsed source characteristics are discussed. 6 references, 7 figures

  18. Prospects for Boiling of Subcooled Dielectric Liquids for Supercomputer Cooling

    Science.gov (United States)

    Zeigarnik, Yu. A.; Vasil'ev, N. V.; Druzhinin, E. A.; Kalmykov, I. V.; Kosoi, A. S.; Khodakov, K. A.

    2018-02-01

    It is shown experimentally that forced-convection boiling of a dielectric coolant (Novec 649 refrigerant) subcooled relative to the saturation temperature makes it possible to remove heat fluxes of up to 100 W/cm2 from a modern supercomputer chip interface. This creates prerequisites for the application of dielectric liquids in the cooling systems of modern supercomputers with increased requirements for operating reliability.

  19. Distributed computing environments for future space control systems

    Science.gov (United States)

    Viallefont, Pierre

    1993-01-01

    The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.

  20. The future of seawater desalination: energy, technology, and the environment.

    Science.gov (United States)

    Elimelech, Menachem; Phillip, William A

    2011-08-05

    In recent years, numerous large-scale seawater desalination plants have been built in water-stressed countries to augment available water resources, and construction of new desalination plants is expected to increase in the near future. Despite major advancements in desalination technologies, seawater desalination is still more energy intensive compared to conventional technologies for the treatment of fresh water. There are also concerns about the potential environmental impacts of large-scale seawater desalination plants. Here, we review the possible reductions in energy demand by state-of-the-art seawater desalination technologies, the potential role of advanced materials and innovative technologies in improving performance, and the sustainability of desalination as a technological solution to global water shortages.

  1. Ab initio prediction of nano-structured materials using supercomputer

    International Nuclear Information System (INIS)

    Kumar, V.; Kawazoe, Y.

    2003-01-01

    Full text: Nano-structured materials are currently attracting great attention due to their promise in future nano-technologies. In the scale of a nanometer, properties of matter are sensitive to the atomic details that are often difficult to obtain from experiments. Impurities could change the properties very significantly. Predictive computer simulations based on ab initio methods are playing a very important role in not only supporting and explaining the experimental findings but also suggesting new possibilities. We shall present a brief overview of the current research done in our group using the supercomputing facilities of the IMR in designing and predicting nano-structured materials. These include the areas of molecular electronics, carbon fullerenes and nanotubes, super-structures on surfaces, multilayers, clusters and nanowires using calculational approaches such as all electron mixed basis, augmented plane wave, localized basis and pseudopotential plane wave methods. More accurate descriptions based on GW and QMC methods are also used. The possibilities of doing large scale calculations are also allowing the study of biological systems such as DNA. We shall discuss in more detail our recent predictions of novel metal encapsulated silicon fullerenes and nanotubes that offer new possibilities in developing silicon based technologies at the nano-scale

  2. Symbolic simulation of engineering systems on a supercomputer

    International Nuclear Information System (INIS)

    Ragheb, M.; Gvillo, D.; Makowitz, H.

    1986-01-01

    Model-Based Production-Rule systems for analysis are developed for the symbolic simulation of Complex Engineering systems on a CRAY X-MP Supercomputer. The Fault-Tree and Event-Tree Analysis methodologies from Systems-Analysis are used for problem representation and are coupled to the Rule-Based System Paradigm from Knowledge Engineering to provide modelling of engineering devices. Modelling is based on knowledge of the structure and function of the device rather than on human expertise alone. To implement the methodology, we developed a production-Rule Analysis System that uses both backward-chaining and forward-chaining: HAL-1986. The inference engine uses an Induction-Deduction-Oriented antecedent-consequent logic and is programmed in Portable Standard Lisp (PSL). The inference engine is general and can accommodate general modifications and additions to the knowledge base. The methodologies used will be demonstrated using a model for the identification of faults, and subsequent recovery from abnormal situations in Nuclear Reactor Safety Analysis. The use of the exposed methodologies for the prognostication of future device responses under operational and accident conditions using coupled symbolic and procedural programming is discussed
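    The coupling of fault-tree and event-tree models to a rule-based system with both backward and forward chaining follows the classic production-rule pattern. A minimal forward-chaining sketch in Python is shown below; the rules and facts are invented for illustration and have no connection to the actual HAL-1986 knowledge base, which was written in Portable Standard Lisp.

```python
# Minimal forward-chaining production-rule engine (illustration only;
# the rules and facts below are invented, not the HAL-1986 knowledge base).

# A rule fires when all of its antecedent facts are known, asserting its consequent.
RULES = [
    ({"coolant_flow_low", "pump_power_on"}, "pump_degraded"),
    ({"pump_degraded"}, "reduced_heat_removal"),
    ({"reduced_heat_removal", "reactor_at_power"}, "core_temperature_rising"),
]

def forward_chain(facts, rules):
    """Repeatedly fire rules until no new facts can be derived (fixpoint)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if antecedents <= facts and consequent not in facts:
                facts.add(consequent)
                changed = True
    return facts

if __name__ == "__main__":
    observed = {"coolant_flow_low", "pump_power_on", "reactor_at_power"}
    print(sorted(forward_chain(observed, RULES)))
```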

  3. Design of multiple sequence alignment algorithms on parallel, distributed memory supercomputers.

    Science.gov (United States)

    Church, Philip C; Goscinski, Andrzej; Holt, Kathryn; Inouye, Michael; Ghoting, Amol; Makarychev, Konstantin; Reumann, Matthias

    2011-01-01

    The challenge of comparing two or more genomes that have undergone recombination and substantial amounts of segmental loss and gain has recently been addressed for small numbers of genomes. However, datasets of hundreds of genomes are now common and their sizes will only increase in the future. Multiple sequence alignment of hundreds of genomes remains an intractable problem due to quadratic increases in compute time and memory footprint. To date, most alignment algorithms are designed for commodity clusters without parallelism. Hence, we propose the design of a multiple sequence alignment algorithm on massively parallel, distributed memory supercomputers to enable research into comparative genomics on large data sets. Following the methodology of the sequential progressiveMauve algorithm, we design data structures including sequences and sorted k-mer lists on the IBM Blue Gene/P supercomputer (BG/P). Preliminary results show that we can reduce the memory footprint so that we can potentially align over 250 bacterial genomes on a single BG/P compute node. We verify our results on a dataset of E.coli, Shigella and S.pneumoniae genomes. Our implementation returns results matching those of the original algorithm but in 1/2 the time and with 1/4 the memory footprint for scaffold building. In this study, we have laid the basis for multiple sequence alignment of large-scale datasets on a massively parallel, distributed memory supercomputer, thus enabling comparison of hundreds instead of a few genome sequences within reasonable time.
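    One of the data structures named here, a sorted k-mer list, is easy to illustrate. The sketch below builds and queries such a list for a toy sequence; it shows only the idea, not the Blue Gene/P implementation or progressiveMauve's actual anchoring logic, and the sequence and k value are invented.

```python
# Illustrative sorted k-mer list: the core index used by seed-and-extend
# aligners. Toy sequence and k value; not the BG/P implementation.
from bisect import bisect_left

def build_kmer_list(seq, k):
    """Return a list of (kmer, position) pairs sorted by kmer."""
    return sorted((seq[i:i + k], i) for i in range(len(seq) - k + 1))

def find_kmer(kmer_list, kmer):
    """Return all positions at which `kmer` occurs, via binary search."""
    i = bisect_left(kmer_list, (kmer, -1))
    hits = []
    while i < len(kmer_list) and kmer_list[i][0] == kmer:
        hits.append(kmer_list[i][1])
        i += 1
    return hits

if __name__ == "__main__":
    genome = "ACGTACGTGACGTT"
    index = build_kmer_list(genome, k=4)
    print(find_kmer(index, "ACGT"))   # -> [0, 4, 9]
```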

  4. Environment and future of the nuclear energy in France

    International Nuclear Information System (INIS)

    Lebas, G.

    1999-01-01

    This work presents the problem of the renewal of the French electro-nuclear park with respect to the energetic, economical, environmental, political and ethical aspects. The theoretical framework chosen for this analysis is the one of sustainable development because of the uncertainty, irreversibility and equity aspects characterizing this choice. Thus, this work evaluates the capacity of the nuclear technology to ensure the simultaneous reproduction of the economical sphere, of the human sphere and of the biosphere. The past, present and future energy situation of France is analyzed in the first chapter together with the characteristics of the nuclear choice. In the second chapter, the analysis of the different possible energy options leads to the conclusion that the nuclear option remains the most suitable for a conciliation between economy and ecology, but that a diversification of the reactor technologies is necessary to take advantage of the efficiency of each technology with respect to its use. The nuclear choice has the advantage to limit the arbitration between the economical, ecological, political and human stakes. The realization of the diversification project supposes to leave opened all energy options and to be prepared to the replacement of the present day power plants by 2010-2020. The success of this policy will depend on the risk mastery and information efforts that public authorities and nuclear industry actors will carry on to avoid any social opposition with respect to nuclear energy. (J.S.)

  5. HEP Computing Tools, Grid and Supercomputers for Genome Sequencing Studies

    Science.gov (United States)

    De, K.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Novikov, A.; Poyda, A.; Tertychnyy, I.; Wenaus, T.

    2017-10-01

    PanDA, the Production and Distributed Analysis workload management system, has been developed to address the data processing and analysis challenges of the ATLAS experiment at the LHC. Recently PanDA has been extended to run HEP scientific applications on Leadership Class Facilities and supercomputers. The success of the projects using PanDA beyond HEP and the Grid has drawn attention from other compute-intensive sciences such as bioinformatics. Recent advances in Next Generation Genome Sequencing (NGS) technology have led to increasing streams of sequencing data that need to be processed, analysed and made available to bioinformaticians worldwide. Analysis of genome sequencing data using the popular software pipeline PALEOMIX can take a month even when run on a powerful computing resource. In this paper we describe the adaptation of the PALEOMIX pipeline to run in a distributed computing environment powered by PanDA. To run the pipeline we split the input files into chunks, which are processed separately on different nodes as separate PALEOMIX inputs, and finally merge the output files; this is very similar to how ATLAS processes and simulates its data. We dramatically decreased the total walltime through automated job (re)submission and brokering within PanDA. Using software tools developed initially for HEP and the Grid reduced the payload execution time for mammoth DNA samples from weeks to days.
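    The chunking strategy described, splitting the input so that each chunk becomes an independent job and merging the per-chunk outputs afterwards, can be sketched generically. The Python code below is a simplified illustration that assumes FASTQ input (four lines per read); it does not invoke PALEOMIX or PanDA, and the file names are hypothetical.

```python
# Simplified scatter/gather sketch for the chunking strategy described:
# split a FASTQ file into N chunks (one per job), then merge per-chunk
# outputs. Does not call PALEOMIX or PanDA; file names are hypothetical.
import itertools

def split_fastq(path, n_chunks):
    """Write reads round-robin into n_chunks files; return their names."""
    names = [f"{path}.chunk{i:03d}" for i in range(n_chunks)]
    outs = [open(n, "w") for n in names]
    with open(path) as fq:
        # Each FASTQ record is exactly four lines.
        for idx, read in enumerate(iter(lambda: list(itertools.islice(fq, 4)), [])):
            outs[idx % n_chunks].writelines(read)
    for o in outs:
        o.close()
    return names

def merge_outputs(chunk_outputs, merged_path):
    """Concatenate per-chunk result files into one merged output."""
    with open(merged_path, "w") as out:
        for part in chunk_outputs:
            with open(part) as f:
                out.write(f.read())

if __name__ == "__main__":
    chunks = split_fastq("sample.fastq", n_chunks=8)        # hypothetical input
    # ... each chunk would be submitted here as a separate job ...
    merge_outputs([c + ".out" for c in chunks], "sample.merged.out")
```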

  6. Analyzing the Interplay of Failures and Workload on a Leadership-Class Supercomputer

    Energy Technology Data Exchange (ETDEWEB)

    Meneses, Esteban [University of Pittsburgh; Ni, Xiang [University of Illinois at Urbana-Champaign; Jones, Terry R [ORNL; Maxwell, Don E [ORNL

    2015-01-01

    The unprecedented computational power of current supercomputers now makes possible the exploration of complex problems in many scientific fields, from genomic analysis to computational fluid dynamics. Modern machines are powerful because they are massive: they assemble millions of cores and a huge quantity of disks, cards, routers, and other components. But it is precisely the size of these machines that clouds the future of supercomputing. A system that comprises many components has a high chance of failing, and failing often. In order to make the next generation of supercomputers usable, it is imperative to use some type of fault tolerance platform to run applications on large machines. Most fault tolerance strategies can be optimized for the peculiarities of each system and boost efficacy by keeping the system productive. In this paper, we aim to understand how failure characterization can improve resilience in several layers of the software stack: applications, runtime systems, and job schedulers. We examine the Titan supercomputer, one of the fastest systems in the world. We analyze a full year of Titan in production and distill the failure patterns of the machine. By looking into Titan's log files and using the criteria of experts, we provide a detailed description of the types of failures. In addition, we inspect the job submission files and describe how the system is used. Using those two sources, we cross-correlate failures in the machine to executing jobs and provide a picture of how failures affect the user experience. We believe such characterization is fundamental in developing appropriate fault tolerance solutions for Cray systems similar to Titan.
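    Cross-correlating failures with executing jobs, as described here, reduces to an interval-overlap query: for each failure timestamp, find the jobs whose execution window contains it. The small self-contained sketch below illustrates that step only; the timestamps and job records are fabricated and are not Titan log data.

```python
# Toy interval-overlap sketch: which jobs were running when each failure
# occurred? Timestamps and job records are fabricated, not Titan logs.
from datetime import datetime

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

# (job_id, start, end) -- fabricated job scheduler records
jobs = [
    ("job-101", parse("2014-03-01 08:00"), parse("2014-03-01 14:00")),
    ("job-102", parse("2014-03-01 10:30"), parse("2014-03-01 11:45")),
    ("job-103", parse("2014-03-01 13:00"), parse("2014-03-01 20:00")),
]

# Fabricated failure events (e.g. lost node heartbeats)
failures = [parse("2014-03-01 11:00"), parse("2014-03-01 15:30")]

def jobs_hit_by(failure_time, jobs):
    """Return ids of jobs whose [start, end] interval contains the failure."""
    return [jid for jid, start, end in jobs if start <= failure_time <= end]

for f in failures:
    print(f, "->", jobs_hit_by(f, jobs))
# 11:00 hits job-101 and job-102; 15:30 hits only job-103.
```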

  7. Storage-Intensive Supercomputing Benchmark Study

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, J; Dossa, D; Gokhale, M; Hysom, D; May, J; Pearce, R; Yoo, A

    2007-10-30

    Critical data science applications requiring frequent access to storage perform poorly on today's computing architectures. This project addresses efficient computation of data-intensive problems in national security and basic science by exploring, advancing, and applying a new form of computing called storage-intensive supercomputing (SISC). Our goal is to enable applications that simply cannot run on current systems, and, for a broad range of data-intensive problems, to deliver an order of magnitude improvement in price/performance over today's data-intensive architectures. This technical report documents much of the work done under LDRD 07-ERD-063 Storage Intensive Supercomputing during the period 05/07-09/07. The following chapters describe: (1) a new file I/O monitoring tool iotrace developed to capture the dynamic I/O profiles of Linux processes; (2) an out-of-core graph benchmark for level-set expansion of scale-free graphs; (3) an entity extraction benchmark consisting of a pipeline of eight components; and (4) an image resampling benchmark drawn from the SWarp program in the LSST data processing pipeline. The performance of the graph and entity extraction benchmarks was measured in three different scenarios: data sets residing on the NFS file server and accessed over the network; data sets stored on local disk; and data sets stored on the Fusion I/O parallel NAND Flash array. The image resampling benchmark compared performance of software-only to GPU-accelerated. In addition to the work reported here, an additional text processing application was developed that used an FPGA to accelerate n-gram profiling for language classification. The n-gram application will be presented at SC07 at the High Performance Reconfigurable Computing Technologies and Applications Workshop. The graph and entity extraction benchmarks were run on a Supermicro server housing the NAND Flash 40GB parallel disk array, the Fusion-io. The Fusion system specs are as follows

  8. Modern Reality of Professional Development of a Future Teacher of Physical Culture: Informative-Educational Environment

    Directory of Open Access Journals (Sweden)

    Yury V. Dragnev

    2012-05-01

    Full Text Available The contribution focuses on the informative-educational environment as a reality of the professional development of a future teacher of physical culture, and states that the basis of the strategic direction in education informatization of a future teacher of physical culture lies in the access of students and teaching staff to high-quality networked educational information resources.

  9. Modern Reality of Professional Development of a Future Teacher of Physical Culture: Informative-Educational Environment

    OpenAIRE

    Yury V. Dragnev

    2012-01-01

    The contribution focuses on the informative-educational environment as a reality of the professional development of a future teacher of physical culture, and states that the basis of the strategic direction in education informatization of a future teacher of physical culture lies in the access of students and teaching staff to high-quality networked educational information resources.

  10. The Potential of Simulated Environments in Teacher Education: Current and Future Possibilities

    Science.gov (United States)

    Dieker, Lisa A.; Rodriguez, Jacqueline A.; Lignugaris/Kraft, Benjamin; Hynes, Michael C.; Hughes, Charles E.

    2014-01-01

    The future of virtual environments is evident in many fields but is just emerging in the field of teacher education. In this article, the authors provide a summary of the evolution of simulation in the field of teacher education and three factors that need to be considered as these environments further develop. The authors provide a specific…

  11. Exploiting Thread Parallelism for Ocean Modeling on Cray XC Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Sarje, Abhinav [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jacobsen, Douglas W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Samuel W. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ringler, Todd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-05-01

    The incorporation of increasing core counts in modern processors used to build state-of-the-art supercomputers is driving application development towards exploitation of thread parallelism, in addition to distributed memory parallelism, with the goal of delivering efficient high-performance codes. In this work we describe the exploitation of threading and our experiences with it in the context of a real-world ocean modeling application code, MPAS-Ocean. We present detailed performance analysis and comparisons of various approaches and configurations for threading on the Cray XC series supercomputers.
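
    The block-and-thread decomposition idea can be sketched very schematically: cells are split into blocks and updated concurrently by a thread pool. The code below is not MPAS-Ocean code; the per-cell update rule and block layout are invented for brevity.

```python
# Schematic of the block-and-thread decomposition idea only; this is not
# MPAS-Ocean code. Cells are split into blocks and updated concurrently by
# a thread pool; the per-cell update rule is invented for brevity.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def update_block(block):
    """Toy per-cell update standing in for real tendency computations."""
    return block * 0.99 + 0.01 * block.mean()

def threaded_step(cells, nthreads=4):
    """Split the cell array into blocks and update them in a thread pool."""
    blocks = np.array_split(cells, nthreads)
    with ThreadPoolExecutor(max_workers=nthreads) as pool:
        return np.concatenate(list(pool.map(update_block, blocks)))

cells = np.random.rand(1_000_000)
cells = threaded_step(cells)
print(cells.mean())
```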

  12. Retail food environments research: Promising future with more work to be done.

    Science.gov (United States)

    Fuller, Daniel; Engler-Stringer, Rachel; Muhajarine, Nazeem

    2016-06-09

    As members of the scientific committee for the Food Environments in Canada conference, we reflect on the current state of food environments research in Canada. We are very encouraged that the field is growing and there have been many collaborative efforts to link researchers in Canada, including the 2015 Food Environments in Canada Symposium and Workshop. We believe there are 5 key challenges the field will need to collectively address: theory and causality; replication and extension; consideration of rural, northern and vulnerable populations; policy analysis; and intervention research. In addressing the challenges, we look forward to working together to conduct more sophisticated, complex and community-driven food environments research in the future.

  13. Comments on the parallelization efficiency of the Sunway TaihuLight supercomputer

    OpenAIRE

    Végh, János

    2016-01-01

    In the world of supercomputers, the large number of processors makes it necessary to minimize the inefficiencies of parallelization, which appear as a sequential part of the program from the point of view of Amdahl's law. A recently suggested figure of merit is applied to the recently presented supercomputer, and the timeline of "Top 500" supercomputers is scrutinized using the metric. It is demonstrated that, in addition to the computing performance and power consumption, the new supercomputer i...
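
    For reference, the classical Amdahl's-law speedup the abstract alludes to can be computed directly, as in the short sketch below; the paper's own figure of merit is not reproduced here.

```python
# Classical Amdahl's-law speedup the abstract refers to: with parallel
# fraction p and N processors, speedup = 1 / ((1 - p) + p / N). The paper's
# own figure of merit is not reproduced here.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.95, 0.99, 0.999):
    print(p, [round(amdahl_speedup(p, n), 1) for n in (10, 1_000, 100_000)])
```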

  14. Plastics, the environment and human health: current consensus and future trends

    OpenAIRE

    Thompson, Richard C.; Moore, Charles J.; vom Saal, Frederick S.; Swan, Shanna H.

    2009-01-01

    Plastics have transformed everyday life; usage is increasing and annual production is likely to exceed 300 million tonnes by 2010. In this concluding paper to the Theme Issue on Plastics, the Environment and Human Health, we synthesize current understanding of the benefits and concerns surrounding the use of plastics and look to future priorities, challenges and opportunities. It is evident that plastics bring many societal benefits and offer future technological and medical advances. However...

  15. Future primary teachers' professional training for organizing a health-focused educational environment: theoretical models

    Directory of Open Access Journals (Sweden)

    Larysa Slyvka

    2017-03-01

    Full Text Available On the basis of an analysis of the psychological and pedagogical literature, the essence of the notions "health-focused environment", "primary teachers' training to create a health-focused environment" and "primary teachers' readiness to create a health-focused environment" is specified. A model for preparing students of the speciality "Primary Education" to organize a health-focused environment is offered. The body of knowledge necessary for solving professionally important tasks and carrying out health-preserving activity is presented, and the skills and abilities that form the basis for the health-preserving activity of future teachers in secondary educational establishments are stressed. Key words: health, healthy way of life, younger school-children, future primary teachers, health-focused environment.

  16. Recent results from the Swinburne supercomputer software correlator

    Science.gov (United States)

    Tingay, Steven; et al.

    I will describe the development of software correlators on the Swinburne Beowulf supercomputer and recent work using the Cray XD-1 machine. I will also describe recent Australian and global VLBI experiments that have been processed on the Swinburne software correlator, along with imaging results from these data. The role of the software correlator in Australia's eVLBI project will be discussed.

  17. QCD on the BlueGene/L Supercomputer

    Science.gov (United States)

    Bhanot, G.; Chen, D.; Gara, A.; Sexton, J.; Vranas, P.

    2005-03-01

    In June 2004 QCD was simulated for the first time at sustained speed exceeding 1 TeraFlops in the BlueGene/L supercomputer at the IBM T.J. Watson Research Lab. The implementation and performance of QCD in the BlueGene/L is presented.

  18. QCD on the BlueGene/L Supercomputer

    International Nuclear Information System (INIS)

    Bhanot, G.; Chen, D.; Gara, A.; Sexton, J.; Vranas, P.

    2005-01-01

    In June 2004 QCD was simulated for the first time at sustained speed exceeding 1 TeraFlops in the BlueGene/L supercomputer at the IBM T.J. Watson Research Lab. The implementation and performance of QCD in the BlueGene/L is presented

  19. Intelligent Personal Supercomputer for Solving Scientific and Technical Problems

    OpenAIRE

    Khimich, O.M.; Molchanov, I.M.; Mova, V.І.; Nikolaichuk, О.O.; Popov, O.V.; Chistjakova, Т.V.; Yakovlev, M.F.; Tulchinsky, V.G.; Yushchenko, R.А.

    2016-01-01

    A new domestic intelligent personal supercomputer of hybrid architecture, Inparkom_pg, was developed for the mathematical modeling of processes in the defense industry, engineering, construction, etc. Intelligent software for the automatic investigation of computational mathematics tasks with approximate data of different structures was designed. Applied software supporting mathematical modeling problems in construction, welding and filtration processes was implemented.

  20. Role of supercomputers in magnetic fusion and energy research programs

    International Nuclear Information System (INIS)

    Killeen, J.

    1985-06-01

    The importance of computer modeling in magnetic fusion (MFE) and energy research (ER) programs is discussed. The need for the most advanced supercomputers is described, and the role of the National Magnetic Fusion Energy Computer Center in meeting these needs is explained

  1. DOE's NERSC center deploys 10 Teraflops per second IBM supercomputer

    CERN Multimedia

    2003-01-01

    The National Energy Research Scientific Computing Center, funded by the USA DOE's Energy's Office of Science, put its 10 trillion calculations per second IBM supercomputer into service last week, providing researchers across the country with the most powerful computer for unclassified research in the United States (1 page).

  2. Climate Change Implications to the Global Security Environment, U.S. Interests, and Future Naval Operations

    Science.gov (United States)

    2011-03-14

    Title: Climate Change Implications to the Global Security Environment, U.S. Interests, and Future Naval Operations. Thesis: This paper aims to... United States over the next 20 years. This is because it will aggravate existing problems such as poverty tensions, environmental degradation... Implications on the Global Security Environment: As discussed above, the physical effects of climate change - rising sea levels, rising temperatures

  3. INTEGRATION OF PANDA WORKLOAD MANAGEMENT SYSTEM WITH SUPERCOMPUTERS

    Energy Technology Data Exchange (ETDEWEB)

    De, K [University of Texas at Arlington; Jha, S [Rutgers University; Maeno, T [Brookhaven National Laboratory (BNL); Mashinistov, R. [Russian Research Center, Kurchatov Institute, Moscow, Russia; Nilsson, P [Brookhaven National Laboratory (BNL); Novikov, A. [Russian Research Center, Kurchatov Institute, Moscow, Russia; Oleynik, D [University of Texas at Arlington; Panitkin, S [Brookhaven National Laboratory (BNL); Poyda, A. [Russian Research Center, Kurchatov Institute, Moscow, Russia; Ryabinkin, E. [Russian Research Center, Kurchatov Institute, Moscow, Russia; Teslyuk, A. [Russian Research Center, Kurchatov Institute, Moscow, Russia; Tsulaia, V. [Lawrence Berkeley National Laboratory (LBNL); Velikhov, V. [Russian Research Center, Kurchatov Institute, Moscow, Russia; Wen, G. [University of Wisconsin, Madison; Wells, Jack C [ORNL; Wenaus, T [Brookhaven National Laboratory (BNL)

    2016-01-01

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 140 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250000 cores with a peak performance of 0.3+ petaFLOPS, next LHC data taking runs will require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with supercomputers in United States, Europe and Russia (in particular with Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF), Supercomputer at the National Research Center "Kurchatov Institute", IT4 in Ostrava, and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on Titan's multi-core worker nodes. This implementation was tested with a variety of

  4. Integration of Panda Workload Management System with supercomputers

    Science.gov (United States)

    De, K.; Jha, S.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Nilsson, P.; Novikov, A.; Oleynik, D.; Panitkin, S.; Poyda, A.; Read, K. F.; Ryabinkin, E.; Teslyuk, A.; Velikhov, V.; Wells, J. C.; Wenaus, T.

    2016-09-01

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 140 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250000 cores with a peak performance of 0.3+ petaFLOPS, next LHC data taking runs will require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with supercomputers in United States, Europe and Russia (in particular with Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF), Supercomputer at the National Research Center "Kurchatov Institute", IT4 in Ostrava, and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers batch queues and local data management, with light-weight MPI wrappers to run singlethreaded workloads in parallel on Titan's multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads

  5. Integration of PanDA workload management system with Titan supercomputer at OLCF

    Science.gov (United States)

    De, K.; Klimentov, A.; Oleynik, D.; Panitkin, S.; Petrosyan, A.; Schovancova, J.; Vaniachine, A.; Wenaus, T.

    2015-12-01

    The PanDA (Production and Distributed Analysis) workload management system (WMS) was developed to meet the scale and complexity of LHC distributed computing for the ATLAS experiment. While PanDA currently distributes jobs to more than 100,000 cores at well over 100 Grid sites, the future LHC data taking runs will require more resources than Grid computing can possibly provide. To alleviate these challenges, ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF). The current approach utilizes a modified PanDA pilot framework for job submission to Titan's batch queues and local data management, with light-weight MPI wrappers to run single threaded workloads in parallel on Titan's multicore worker nodes. It also gives PanDA new capability to collect, in real time, information about unused worker nodes on Titan, which allows precise definition of the size and duration of jobs submitted to Titan according to available free resources. This capability significantly reduces PanDA job wait time while improving Titan's utilization efficiency. This implementation was tested with a variety of Monte-Carlo workloads on Titan and is being tested on several other supercomputing platforms. Notice: This manuscript has been authored, by employees of Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy. The publisher by accepting the manuscript for publication acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes.
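
    The light-weight MPI wrapper pattern described in these records can be sketched roughly as below: each rank runs one single-threaded payload and rank 0 gathers the results. This is not the actual PanDA pilot code; the payload commands are placeholders, mpi4py is assumed to be available, and the launcher invocation is shown only as an illustration.

```python
# Rough sketch of a light-weight MPI wrapper that fans single-threaded
# payloads out across ranks, in the spirit of the PanDA/Titan approach
# described above. Not the actual pilot code: the payload commands are
# placeholders, mpi4py is assumed to be available, and a launcher such as
# `aprun -n <ranks> python wrapper.py` is implied (illustrative only).
import subprocess
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# One single-threaded payload per rank (hypothetical commands).
payloads = [["echo", f"simulate event batch {i}"] for i in range(size)]

result = subprocess.run(payloads[rank], capture_output=True, text=True)
outputs = comm.gather(result.stdout.strip(), root=0)

if rank == 0:
    for line in outputs:
        print(line)
```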

  6. ASCI Red -- Experiences and lessons learned with a massively parallel teraFLOP supercomputer

    Energy Technology Data Exchange (ETDEWEB)

    Christon, M.A.; Crawford, D.A.; Hertel, E.S.; Peery, J.S.; Robinson, A.C. [Sandia National Labs., Albuquerque, NM (United States). Computational Physics R and D Dept.

    1997-06-01

    The Accelerated Strategic Computing Initiative (ASCI) program involves Sandia, Los Alamos and Lawrence Livermore National Laboratories. At Sandia National Laboratories, ASCI applications include large deformation transient dynamics, shock propagation, electromechanics, and abnormal thermal environments. In order to resolve important physical phenomena in these problems, it is estimated that meshes ranging from 10{sup 6} to 10{sup 9} grid points will be required. The ASCI program is relying on the use of massively parallel supercomputers initially capable of delivering over 1 TFLOPs to perform such demanding computations. The ASCI Red machine at Sandia National Laboratories consists of over 4,500 computational nodes with a peak computational rate of 1.8 TFLOPs, 567 GBytes of memory, and 2 TBytes of disk storage. Regardless of the peak FLOP rate, there are many issues surrounding the use of massively parallel supercomputers in a production environment. These issues include parallel I/O, mesh generation, visualization, archival storage, high-bandwidth networking and the development of parallel algorithms. In order to illustrate these issues and their solution with respect to ASCI Red, demonstration calculations of time-dependent buoyancy-dominated plumes, electromechanics, and shock propagation will be presented.

  7. Emerging CAE technologies and their role in Future Ambient Intelligence Environments

    Science.gov (United States)

    Noor, Ahmed

    2011-03-01

    Dramatic improvements are on the horizon in Computer Aided Engineering (CAE) and various simulation technologies. The improvements are due, in part, to the developments in a number of leading-edge technologies and their synergistic combinations/convergence. The technologies include ubiquitous, cloud, and petascale computing; ultra high-bandwidth networks, pervasive wireless communication; knowledge based engineering; networked immersive virtual environments and virtual worlds; novel human-computer interfaces; and powerful game engines and facilities. This paper describes the frontiers and emerging simulation technologies, and their role in the future virtual product creation and learning/training environments. The environments will be ambient intelligence environments, incorporating a synergistic combination of novel agent-supported visual simulations (with cognitive learning and understanding abilities); immersive 3D virtual world facilities; development chain management systems and facilities (incorporating a synergistic combination of intelligent engineering and management tools); nontraditional methods; intelligent, multimodal and human-like interfaces; and mobile wireless devices. The Virtual product creation environment will significantly enhance the productivity and will stimulate creativity and innovation in future global virtual collaborative enterprises. The facilities in the learning/training environment will provide timely, engaging, personalized/collaborative and tailored visual learning.

  8. Supercomputing with TOUGH2 family codes for coupled multi-physics simulations of geologic carbon sequestration

    Science.gov (United States)

    Yamamoto, H.; Nakajima, K.; Zhang, K.; Nanai, S.

    2015-12-01

    Powerful numerical codes that are capable of modeling complex coupled processes of physics and chemistry have been developed for predicting the fate of CO2 in reservoirs as well as its potential impacts on groundwater and subsurface environments. However, they are often computationally demanding for solving highly non-linear models at sufficient spatial and temporal resolutions. Geological heterogeneity and uncertainties further increase the challenges in modeling work. Two-phase flow simulations in heterogeneous media usually require much longer computational time than those in homogeneous media. Uncertainties in reservoir properties may necessitate stochastic simulations with multiple realizations. Recently, massively parallel supercomputers with many thousands of processors have become available in scientific and engineering communities. Such supercomputers may attract attention from geoscientists and reservoir engineers for solving large, non-linear models at higher resolutions within a reasonable time. However, to make them a useful tool, it is essential to tackle several practical obstacles to utilizing large numbers of processors effectively in general-purpose reservoir simulators. We have implemented massively-parallel versions of two TOUGH2 family codes (a multi-phase flow simulator TOUGH2 and a chemically reactive transport simulator TOUGHREACT) on two different types (vector- and scalar-type) of supercomputers with a thousand to tens of thousands of processors. After completing implementation and extensive tune-up on the supercomputers, the computational performance was measured for three simulations with multi-million grid models, including a simulation of the dissolution-diffusion-convection process that requires high spatial and temporal resolutions to simulate the growth of small convective fingers of CO2-dissolved water to larger ones at reservoir scale. The performance measurement confirmed that both simulators exhibit excellent

  9. Work group IV: Future directions for measures of the food and physical activity environments.

    Science.gov (United States)

    Story, Mary; Giles-Corti, Billie; Yaroch, Amy Lazarus; Cummins, Steven; Frank, Lawrence Douglas; Huang, Terry T-K; Lewis, LaVonna Blair

    2009-04-01

    Much progress has been made in the past 5 to 10 years in measuring and understanding the impact of the food and physical activity environments on behavioral outcomes. Nevertheless, this research is in its infancy. A work group was convened to identify current evidence gaps and barriers in food and physical activity environments and policy research measures, and develop recommendations to guide future directions for measurement and methodologic research efforts. A nominal group process was used to determine six priority areas for food and physical activity environments and policy measures to move the field forward by 2015, including: (1) identify relevant factors in the food and physical activity environments to measure, including those most amenable to change; (2) improve understanding of mechanisms for relationships between the environment and physical activity, diet, and obesity; (3) develop simplified measures that are sensitive to change, valid for different population groups and settings, and responsive to changing trends; (4) evaluate natural experiments to improve understanding of food and physical activity environments and their impact on behaviors and weight; (5) establish surveillance systems to predict and track change over time; and (6) develop standards for adopting effective health-promoting changes to the food and physical activity environments. The recommendations emanating from the work group highlight actions required to advance policy-relevant research related to food and physical activity environments.

  10. A Parametric Study on Using Active Debris Removal to Stabilize the Future LEO Debris Environment

    Science.gov (United States)

    Liou, J.C.

    2010-01-01

    Recent analyses of the instability of the orbital debris population in the low Earth orbit (LEO) region and the collision between Iridium 33 and Cosmos 2251 have reignited the interest in using active debris removal (ADR) to remediate the environment. There are; however, monumental technical, resources, operational, legal, and political challenges in making economically viable ADR a reality. Before a consensus on the need for ADR can be reached, a careful analysis of the effectiveness of ADR must be conducted. The goal is to demonstrate the feasibility of using ADR to preserve the future environment and to guide its implementation to maximize the benefit-cost ratio. This paper describes a comprehensive sensitivity study on using ADR to stabilize the future LEO debris environment. The NASA long-term, orbital debris evolutionary model, LEGEND, is used to quantify the effects of many key parameters. These parameters include (1) the starting epoch of ADR implementation, (2) various target selection criteria, (3) the benefits of collision avoidance maneuvers, (4) the consequence of targeting specific inclination or altitude regimes, (5) the consequence of targeting specific classes of vehicles, and (6) the timescale of removal. Additional analyses on the importance of postmission disposal and how future launches might affect the requirements to stabilize the environment are also included.

  11. Eigenmodes of superconducting cavities calculated on APE- supercomputers

    CERN Document Server

    Neugebauer, F

    2000-01-01

    The calculation of eigenmodes in superconducting cavities treated as fully three-dimensional is problematic on usual high-end workstations due to the large amount of memory needed and the large number of floating point operations to be performed. Therefore the present approach uses a parallel SIMD supercomputer (APE-100) to deal with the task of finding the eigenvalues and associated eigenvectors of a large sparse matrix. The matrix is built up by the commercial software tool MAFIA and then sent to the nodes of the supercomputer, where the package MAXQ solves the eigenvalue problem. The result of the diagonalization procedure is then read back to the MAFIA host, where further data analysis and visualization can be done. (5 refs).
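
    The sketch below sets up a stand-in sparse eigenproblem and extracts the lowest few modes with SciPy, standing in for the MAFIA/MAXQ tool chain on the APE-100; the matrix is a generic one-dimensional discrete Laplacian, not a real cavity discretization.

```python
# Stand-in for the large sparse eigenproblem described above, using SciPy in
# place of the MAFIA/MAXQ tool chain on the APE-100. The matrix is a generic
# 1D discrete Laplacian, not a real cavity discretization.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import eigsh

n = 2000
laplacian = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1],
                  shape=(n, n), format="csc")

# Shift-invert around zero picks out the smallest eigenvalues, analogous to
# extracting the lowest cavity modes.
eigenvalues, eigenvectors = eigsh(laplacian, k=5, sigma=0, which="LM")
print(np.sort(eigenvalues))
```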

  12. Tryton Supercomputer Capabilities for Analysis of Massive Data Streams

    Directory of Open Access Journals (Sweden)

    Krawczyk Henryk

    2015-09-01

    Full Text Available The recently deployed supercomputer Tryton, located in the Academic Computer Center of Gdansk University of Technology, provides great means for massive parallel processing. Moreover, the status of the Center as one of the main network nodes in the PIONIER network enables the fast and reliable transfer of data produced by miscellaneous devices scattered in the area of the whole country. The typical examples of such data are streams containing radio-telescope and satellite observations. Their analysis, especially with real-time constraints, can be challenging and requires the usage of dedicated software components. We propose a solution for such parallel analysis using the supercomputer, supervised by the KASKADA platform, which, in conjunction with immersive 3D visualization techniques, can be used to solve problems such as pulsar detection and chronometry, or oil-spill simulation on the sea surface.

  13. High Performance Networks From Supercomputing to Cloud Computing

    CERN Document Server

    Abts, Dennis

    2011-01-01

    Datacenter networks provide the communication substrate for large parallel computer systems that form the ecosystem for high performance computing (HPC) systems and modern Internet applications. The design of new datacenter networks is motivated by an array of applications ranging from communication intensive climatology, complex material simulations and molecular dynamics to such Internet applications as Web search, language translation, collaborative Internet applications, streaming video and voice-over-IP. For both Supercomputing and Cloud Computing the network enables distributed applicati

  14. Intelligent Personal Supercomputer for Solving Scientific and Technical Problems

    Directory of Open Access Journals (Sweden)

    Khimich, O.M.

    2016-09-01

    Full Text Available A new domestic intelligent personal supercomputer of hybrid architecture, Inparkom_pg, was developed for the mathematical modeling of processes in the defense industry, engineering, construction, etc. Intelligent software for the automatic investigation of computational mathematics tasks with approximate data of different structures was designed. Applied software supporting mathematical modeling problems in construction, welding and filtration processes was implemented.

  15. Cellular-automata supercomputers for fluid-dynamics modeling

    International Nuclear Information System (INIS)

    Margolus, N.; Toffoli, T.; Vichniac, G.

    1986-01-01

    We report recent developments in the modeling of fluid dynamics, and give experimental results (including dynamical exponents) obtained using cellular automata machines. Because of their locality and uniformity, cellular automata lend themselves to an extremely efficient physical realization; with a suitable architecture, an amount of hardware resources comparable to that of a home computer can achieve (in the simulation of cellular automata) the performance of a conventional supercomputer
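
    To make the locality and uniformity argument concrete, here is a tiny synchronous cellular-automaton kernel. The rule is Conway's Game of Life rather than a lattice-gas fluid rule, but the update structure (local neighbourhood, uniform rule, synchronous step) is the kind that maps well onto cellular automata machines.

```python
# Tiny synchronous cellular-automaton kernel illustrating the locality and
# uniformity the abstract relies on. The rule is Conway's Game of Life, not
# a lattice-gas fluid rule, but the update structure (local neighbourhood,
# uniform rule, synchronous step) is the kind that maps onto CA machines.
import numpy as np

def step(grid):
    """One synchronous update of a periodic 2D grid of 0/1 cells."""
    neighbours = sum(np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                     if (dy, dx) != (0, 0))
    return ((neighbours == 3) | ((grid == 1) & (neighbours == 2))).astype(grid.dtype)

grid = np.random.randint(0, 2, size=(64, 64))
for _ in range(10):
    grid = step(grid)
print(grid.sum(), "cells alive after 10 steps")
```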

  16. Scientists turn to supercomputers for knowledge about universe

    CERN Multimedia

    White, G

    2003-01-01

    The DOE is funding the computers at the Center for Astrophysical Thermonuclear Flashes which is based at the University of Chicago and uses supercomputers at the nation's weapons labs to study explosions in and on certain stars. The DOE is picking up the project's bill in the hope that the work will help the agency learn to better simulate the blasts of nuclear warheads (1 page).

  17. Computational fluid dynamics research at the United Technologies Research Center requiring supercomputers

    Science.gov (United States)

    Landgrebe, Anton J.

    1987-01-01

    An overview of research activities at the United Technologies Research Center (UTRC) in the area of Computational Fluid Dynamics (CFD) is presented. The requirement and use of various levels of computers, including supercomputers, for the CFD activities is described. Examples of CFD directed toward applications to helicopters, turbomachinery, heat exchangers, and the National Aerospace Plane are included. Helicopter rotor codes for the prediction of rotor and fuselage flow fields and airloads were developed with emphasis on rotor wake modeling. Airflow and airload predictions and comparisons with experimental data are presented. Examples are presented of recent parabolized Navier-Stokes and full Navier-Stokes solutions for hypersonic shock-wave/boundary layer interaction, and hydrogen/air supersonic combustion. In addition, other examples of CFD efforts in turbomachinery Navier-Stokes methodology and separated flow modeling are presented. A brief discussion of the 3-tier scientific computing environment is also presented, in which the researcher has access to workstations, mid-size computers, and supercomputers.

  18. Porting Ordinary Applications to Blue Gene/Q Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Maheshwari, Ketan C.; Wozniak, Justin M.; Armstrong, Timothy; Katz, Daniel S.; Binkowski, T. Andrew; Zhong, Xiaoliang; Heinonen, Olle; Karpeyev, Dmitry; Wilde, Michael

    2015-08-31

    Efficiently porting ordinary applications to Blue Gene/Q supercomputers is a significant challenge. Codes are often originally developed without considering advanced architectures and related tool chains. Science needs frequently lead users to want to run large numbers of relatively small jobs (often called many-task computing, an ensemble, or a workflow), which can conflict with supercomputer configurations. In this paper, we discuss techniques developed to execute ordinary applications over leadership class supercomputers. We use the high-performance Swift parallel scripting framework and build two workflow execution techniques-sub-jobs and main-wrap. The sub-jobs technique, built on top of the IBM Blue Gene/Q resource manager Cobalt's sub-block jobs, lets users submit multiple, independent, repeated smaller jobs within a single larger resource block. The main-wrap technique is a scheme that enables C/C++ programs to be defined as functions that are wrapped by a high-performance Swift wrapper and that are invoked as a Swift script. We discuss the needs, benefits, technicalities, and current limitations of these techniques. We further discuss the real-world science enabled by these techniques and the results obtained.
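
    The many-task pattern that sub-jobs and main-wrap address can be caricatured as draining a queue of small independent tasks inside a single large allocation; the sketch below does this with a plain process pool and placeholder commands, without reproducing the Swift or Cobalt specifics.

```python
# Caricature of the "many small tasks inside one large allocation" pattern
# that the sub-jobs and main-wrap techniques address. Swift and Cobalt
# specifics are not reproduced; a plain process pool simply drains a queue
# of independent, hypothetical task commands.
import subprocess
from concurrent.futures import ProcessPoolExecutor

def run_task(cmd):
    """Run one small independent task and return its exit code."""
    return subprocess.run(cmd, capture_output=True).returncode

if __name__ == "__main__":
    tasks = [["echo", f"task {i}"] for i in range(256)]  # placeholder workloads
    with ProcessPoolExecutor(max_workers=16) as pool:
        codes = list(pool.map(run_task, tasks))
    print(f"{codes.count(0)}/{len(tasks)} tasks succeeded")
```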

  19. Proceedings of the first energy research power supercomputer users symposium

    Energy Technology Data Exchange (ETDEWEB)

    1991-01-01

    The Energy Research Power Supercomputer Users Symposium was arranged to showcase the richness of science that has been pursued and accomplished in this program through the use of supercomputers and now high performance parallel computers over the last year: this report is the collection of the presentations given at the Symposium. ''Power users'' were invited by the ER Supercomputer Access Committee to show that the use of these computational tools and the associated data communications network, ESNet, go beyond merely speeding up computations. Today the work often directly contributes to the advancement of the conceptual developments in their fields and the computational and network resources form the very infrastructure of today's science. The Symposium also provided an opportunity, which is rare in this day of network access to computing resources, for the invited users to compare and discuss their techniques and approaches with those used in other ER disciplines. The significance of new parallel architectures was highlighted by the interesting evening talk given by Dr. Stephen Orszag of Princeton University.

  20. Proceedings of the first energy research power supercomputer users symposium

    International Nuclear Information System (INIS)

    1991-01-01

    The Energy Research Power Supercomputer Users Symposium was arranged to showcase the richness of science that has been pursued and accomplished in this program through the use of supercomputers and now high performance parallel computers over the last year: this report is the collection of the presentations given at the Symposium. ''Power users'' were invited by the ER Supercomputer Access Committee to show that the use of these computational tools and the associated data communications network, ESNet, go beyond merely speeding up computations. Today the work often directly contributes to the advancement of the conceptual developments in their fields and the computational and network resources form the very infrastructure of today's science. The Symposium also provided an opportunity, which is rare in this day of network access to computing resources, for the invited users to compare and discuss their techniques and approaches with those used in other ER disciplines. The significance of new parallel architectures was highlighted by the interesting evening talk given by Dr. Stephen Orszag of Princeton University

  1. Extracting the Textual and Temporal Structure of Supercomputing Logs

    Energy Technology Data Exchange (ETDEWEB)

    Jain, S; Singh, I; Chandra, A; Zhang, Z; Bronevetsky, G

    2009-05-26

    Supercomputers are prone to frequent faults that adversely affect their performance, reliability and functionality. System logs collected on these systems are a valuable resource of information about their operational status and health. However, their massive size, complexity, and lack of standard format makes it difficult to automatically extract information that can be used to improve system management. In this work we propose a novel method to succinctly represent the contents of supercomputing logs, by using textual clustering to automatically find the syntactic structures of log messages. This information is used to automatically classify messages into semantic groups via an online clustering algorithm. Further, we describe a methodology for using the temporal proximity between groups of log messages to identify correlated events in the system. We apply our proposed methods to two large, publicly available supercomputing logs and show that our technique features nearly perfect accuracy for online log-classification and extracts meaningful structural and temporal message patterns that can be used to improve the accuracy of other log analysis techniques.
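
    A much reduced sketch of the template-extraction idea follows: variable fields such as numbers and hex identifiers are masked so that messages differing only in those fields collapse into one syntactic group. The masking rules and sample messages are invented for illustration and do not reproduce the paper's clustering algorithm.

```python
# Much reduced sketch of the template-extraction idea: variable fields such
# as numbers and hex identifiers are masked so that messages differing only
# in those fields collapse into one syntactic group. The masking rules and
# sample messages are invented for illustration.
import re
from collections import defaultdict

def template(message):
    """Mask variable tokens to expose the fixed skeleton of a log message."""
    masked = re.sub(r"0x[0-9a-fA-F]+", "<HEX>", message)
    return re.sub(r"\d+", "<NUM>", masked)

def group_messages(lines):
    """Bucket raw log lines by their syntactic template."""
    groups = defaultdict(list)
    for line in lines:
        groups[template(line)].append(line)
    return groups

logs = [
    "node 1024 link error on port 3",
    "node 2201 link error on port 7",
    "kernel panic at 0xdeadbeef",
]
for tpl, members in group_messages(logs).items():
    print(len(members), tpl)
```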

  2. The future of marketing: an appropriate response to the environment changes

    Directory of Open Access Journals (Sweden)

    Victor DANCIU

    2013-05-01

    Full Text Available The future landscape of business worldwide will be driven by the evolution of marketing. This evolution will be the response to changes in the business and marketing environment. The paper aims to analyze both the key trends that are shaping the macro environment, markets and consumers, and their impact on marketing at the business level. First, these issues are presented as they result from theoretical and applied research performed by numerous international and national organizations, universities, consulting and global companies, scholars and authors. This research is then read from the author's scientific point of view, and some original considerations are revealed. They are found mainly in a systematic approach to the marketing paradigm and practices around the world, intended to keep organizations successful. These organizations should give a properly skillful response to each of the main future challenges of marketing.

  3. Perspectives on Advanced Learning Technologies and Learning Networks and Future Aerospace Workforce Environments

    Science.gov (United States)

    Noor, Ahmed K. (Compiler)

    2003-01-01

    An overview of the advanced learning technologies is given in this presentation along with a brief description of their impact on future aerospace workforce development. The presentation is divided into five parts (see Figure 1). In the first part, a brief historical account of the evolution of learning technologies is given. The second part describes the current learning activities. The third part describes some of the future aerospace systems, as examples of high-tech engineering systems, and lists their enabling technologies. The fourth part focuses on future aerospace research, learning and design environments. The fifth part lists the objectives of the workshop and some of the sources of information on learning technologies and learning networks.

  4. QMachine: commodity supercomputing in web browsers

    Science.gov (United States)

    2014-01-01

    Background Ongoing advancements in cloud computing provide novel opportunities in scientific computing, especially for distributed workflows. Modern web browsers can now be used as high-performance workstations for querying, processing, and visualizing genomics’ “Big Data” from sources like The Cancer Genome Atlas (TCGA) and the International Cancer Genome Consortium (ICGC) without local software installation or configuration. The design of QMachine (QM) was driven by the opportunity to use this pervasive computing model in the context of the Web of Linked Data in Biomedicine. Results QM is an open-sourced, publicly available web service that acts as a messaging system for posting tasks and retrieving results over HTTP. The illustrative application described here distributes the analyses of 20 Streptococcus pneumoniae genomes for shared suffixes. Because all analytical and data retrieval tasks are executed by volunteer machines, few server resources are required. Any modern web browser can submit those tasks and/or volunteer to execute them without installing any extra plugins or programs. A client library provides high-level distribution templates including MapReduce. This stark departure from the current reliance on expensive server hardware running “download and install” software has already gathered substantial community interest, as QM received more than 2.2 million API calls from 87 countries in 12 months. Conclusions QM was found adequate to deliver the sort of scalable bioinformatics solutions that computation- and data-intensive workflows require. Paradoxically, the sandboxed execution of code by web browsers was also found to enable them, as compute nodes, to address critical privacy concerns that characterize biomedical environments. PMID:24913605

  5. QMachine: commodity supercomputing in web browsers.

    Science.gov (United States)

    Wilkinson, Sean R; Almeida, Jonas S

    2014-06-09

    Ongoing advancements in cloud computing provide novel opportunities in scientific computing, especially for distributed workflows. Modern web browsers can now be used as high-performance workstations for querying, processing, and visualizing genomics' "Big Data" from sources like The Cancer Genome Atlas (TCGA) and the International Cancer Genome Consortium (ICGC) without local software installation or configuration. The design of QMachine (QM) was driven by the opportunity to use this pervasive computing model in the context of the Web of Linked Data in Biomedicine. QM is an open-sourced, publicly available web service that acts as a messaging system for posting tasks and retrieving results over HTTP. The illustrative application described here distributes the analyses of 20 Streptococcus pneumoniae genomes for shared suffixes. Because all analytical and data retrieval tasks are executed by volunteer machines, few server resources are required. Any modern web browser can submit those tasks and/or volunteer to execute them without installing any extra plugins or programs. A client library provides high-level distribution templates including MapReduce. This stark departure from the current reliance on expensive server hardware running "download and install" software has already gathered substantial community interest, as QM received more than 2.2 million API calls from 87 countries in 12 months. QM was found adequate to deliver the sort of scalable bioinformatics solutions that computation- and data-intensive workflows require. Paradoxically, the sandboxed execution of code by web browsers was also found to enable them, as compute nodes, to address critical privacy concerns that characterize biomedical environments.

  6. Anaesthesia in austere environments: literature review and considerations for future space exploration missions.

    Science.gov (United States)

    Komorowski, Matthieu; Fleming, Sarah; Mawkin, Mala; Hinkelbein, Jochen

    2018-01-01

    Future space exploration missions will take humans far beyond low Earth orbit and require complete crew autonomy. The ability to provide anaesthesia will be important given the expected risk of severe medical events requiring surgery. Knowledge and experience of such procedures during space missions is currently extremely limited. Austere and isolated environments (such as polar bases or submarines) have been used extensively as test beds for spaceflight to probe hazards, train crews, develop clinical protocols and countermeasures for prospective space missions. We have conducted a literature review on anaesthesia in austere environments relevant to distant space missions. In each setting, we assessed how the problems related to the provision of anaesthesia (e.g., medical kit and skills) are dealt with or prepared for. We analysed how these factors could be applied to the unique environment of a space exploration mission. The delivery of anaesthesia will be complicated by many factors including space-induced physiological changes and limitations in skills and equipment. The basic principles of a safe anaesthesia in an austere environment (appropriate training, presence of minimal safety and monitoring equipment, etc.) can be extended to the context of a space exploration mission. Skills redundancy is an important safety factor, and basic competency in anaesthesia should be part of the skillset of several crewmembers. The literature suggests that safe and effective anaesthesia could be achieved by a physician during future space exploration missions. In a life-or-limb situation, non-physicians may be able to conduct anaesthetic procedures, including simplified general anaesthesia.

  7. The Future of Coral Reefs Subject to Rapid Climate Change: Lessons from Natural Extreme Environments

    Directory of Open Access Journals (Sweden)

    Emma F. Camp

    2018-02-01

    Full Text Available Global climate change and localized anthropogenic stressors are driving rapid declines in coral reef health. In vitro experiments have been fundamental in providing insight into how reef organisms will potentially respond to future climates. However, such experiments are inevitably limited in their ability to reproduce the complex interactions that govern reef systems. Studies examining coral communities that already persist under naturally-occurring extreme and marginal physicochemical conditions have therefore become increasingly popular to advance ecosystem scale predictions of future reef form and function, although no single site provides a perfect analog to future reefs. Here we review the current state of knowledge that exists on the distribution of corals in marginal and extreme environments, and geographic sites at the latitudinal extremes of reef growth, as well as a variety of shallow reef systems and reef-neighboring environments (including upwelling and CO2 vent sites). We also conduct a synthesis of the abiotic data that have been collected at these systems, to provide the first collective assessment on the range of extreme conditions under which corals currently persist. We use the review and data synthesis to increase our understanding of the biological and ecological mechanisms that facilitate survival and success under sub-optimal physicochemical conditions. This comprehensive assessment can begin to: (i) highlight the extent of extreme abiotic scenarios under which corals can persist, (ii) explore whether there are commonalities in coral taxa able to persist in such extremes, (iii) provide evidence for key mechanisms required to support survival and/or persistence under sub-optimal environmental conditions, and (iv) evaluate the potential of current sub-optimal coral environments to act as potential refugia under changing environmental conditions. Such a collective approach is critical to better understand the future survival of

  8. The challenge of health & environment: profiling risks & strategic priorities for now & the future.

    Science.gov (United States)

    Narain, Jai P

    2012-08-01

    A substantial burden of communicable and non-communicable diseases in the developing countries is attributable to environmental risk factors. WHO estimates that the environmental factors are responsible for an estimated 24 per cent of the global burden of disease in terms of healthy life years lost and 23 per cent of all deaths; children being the worst sufferers. Given that the environment is linked with most of the Millennium Development Goals (MDGs), without proper attention to the environmental risk factors and their management, it will be difficult to achieve many MDGs by 2015. The impact of environmental degradation on health may continue well into the future and the situation in fact, is likely to get worse. In order to address this challenge, two facts are worth noting. First, that much of the environmental disease burden is attributable to a few critical risk factors which include unsafe water and sanitation, exposure to indoor smoke from cooking fuel, outdoor air pollution, exposure to chemicals such as arsenic, and climate change. Second, that environment and health aspects must become, as a matter of urgency, a national priority, both in terms of policy and resources allocation. To meet the challenge of health and environment now and in the future, the following strategic approaches must be considered which include conducting environmental and health impact assessments; strengthening national environmental health policy and infrastructure; fostering inter-sectoral co-ordination and partnerships; mobilizing public participation; and enhancing the leadership role of health in advocacy, stewardship and capacity building.

  9. EDF's experience with supercomputing and challenges ahead - towards multi-physics and multi-scale approaches

    International Nuclear Information System (INIS)

    Delbecq, J.M.; Banner, D.

    2003-01-01

    Nuclear power plants are a major asset of the EDF company. To remain so, in particular in a context of deregulation, competitiveness, safety and public acceptance are three conditions. These stakes apply both to existing plants and to future reactors. The purpose of the presentation is to explain how supercomputing can help EDF to satisfy these requirements. Three examples are described in detail: ensuring optimal use of nuclear fuel under wholly safe conditions, understanding and simulating the material deterioration mechanisms and moving forward with numerical simulation for the performance of EDF's activities. In conclusion, a broader vision of EDF long term R and D in the field of numerical simulation is given and especially of five challenges taken up by EDF together with its industrial and scientific partners. (author)

  10. Reliability Lessons Learned From GPU Experience With The Titan Supercomputer at Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Gallarno, George [Christian Brothers University; Rogers, James H [ORNL; Maxwell, Don E [ORNL

    2015-01-01

    The high computational capability of graphics processing units (GPUs) is enabling and driving the scientific discovery process at large-scale. The world's second fastest supercomputer for open science, Titan, has more than 18,000 GPUs that computational scientists use to perform scientific simulations and data analysis. Understanding of GPU reliability characteristics, however, is still in its nascent stage since GPUs have only recently been deployed at large-scale. This paper presents a detailed study of GPU errors and their impact on system operations and applications, describing experiences with the 18,688 GPUs on the Titan supercomputer as well as lessons learned in the process of efficient operation of GPUs at scale. These experiences are helpful to HPC sites which already have large-scale GPU clusters or plan to deploy GPUs in the future.

  11. Dust modelling and forecasting in the Barcelona Supercomputing Center: Activities and developments

    Energy Technology Data Exchange (ETDEWEB)

    Perez, C; Baldasano, J M; Jimenez-Guerrero, P; Jorba, O; Haustein, K; Basart, S [Earth Sciences Department. Barcelona Supercomputing Center. Barcelona (Spain); Cuevas, E [Izaña Atmospheric Research Center. Agencia Estatal de Meteorologia, Tenerife (Spain); Nickovic, S [Atmospheric Research and Environment Branch, World Meteorological Organization, Geneva (Switzerland)], E-mail: carlos.perez@bsc.es

    2009-03-01

    The Barcelona Supercomputing Center (BSC) is the National Supercomputer Facility in Spain, hosting MareNostrum, one of the most powerful Supercomputers in Europe. The Earth Sciences Department of BSC operates daily regional dust and air quality forecasts and conducts intensive modelling research for short-term operational prediction. This contribution summarizes the latest developments and current activities in the field of sand and dust storm modelling and forecasting.

  12. Exploration and production environment. Preserving the future: our responsibility; Exploration et production environnement. Preserver l'avenir: notre responsabilite

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    This document presents the Total Group commitments to manage natural resources in a rational way, to preserve biodiversity for future generations and protect the environment. It contains the health, safety, environment and quality charter of Total, the 12 exploration and production health, safety and environment rules and the exploration and production environmental policy. (A.L.B.)

  13. Palacios and Kitten : high performance operating systems for scalable virtualized and native supercomputing.

    Energy Technology Data Exchange (ETDEWEB)

    Widener, Patrick (University of New Mexico); Jaconette, Steven (Northwestern University); Bridges, Patrick G. (University of New Mexico); Xia, Lei (Northwestern University); Dinda, Peter (Northwestern University); Cui, Zheng.; Lange, John (Northwestern University); Hudson, Trammell B.; Levenhagen, Michael J.; Pedretti, Kevin Thomas Tauke; Brightwell, Ronald Brian

    2009-09-01

    Palacios and Kitten are new open source tools that enable applications, whether ported or not, to achieve scalable high performance on large machines. They provide a thin layer over the hardware to support both full-featured virtualized environments and native code bases. Kitten is an OS under development at Sandia that implements a lightweight kernel architecture to provide predictable behavior and increased flexibility on large machines, while also providing Linux binary compatibility. Palacios is a VMM that is under development at Northwestern University and the University of New Mexico. Palacios, which can be embedded into Kitten and other OSes, supports existing, unmodified applications and operating systems by using virtualization that leverages hardware technologies. We describe the design and implementation of both Kitten and Palacios. Our benchmarks show that they provide near native, scalable performance. Palacios and Kitten provide an incremental path to using supercomputer resources that is not performance-compromised.

  14. Research to application: Supercomputing trends for the 90's - Opportunities for interdisciplinary computations

    International Nuclear Information System (INIS)

    Shankar, V.

    1991-01-01

    The progression of supercomputing is reviewed from the point of view of computational fluid dynamics (CFD), and multidisciplinary problems impacting the design of advanced aerospace configurations are addressed. The application of full potential and Euler equations to transonic and supersonic problems in the 70s and early 80s is outlined, along with Navier-Stokes computations widespread during the late 80s and early 90s. Multidisciplinary computations currently in progress are discussed, including CFD and aeroelastic coupling for both static and dynamic flexible computations, CFD, aeroelastic, and controls coupling for flutter suppression and active control, and the development of a computational electromagnetics technology based on CFD methods. Attention is given to computational challenges standing in the way of establishing a computational environment that encompasses many technologies. 40 refs

  15. A fast random number generator for the Intel Paragon supercomputer

    Science.gov (United States)

    Gutbrod, F.

    1995-06-01

    A pseudo-random number generator is presented which makes optimal use of the architecture of the i860-microprocessor and which is expected to have a very long period. It is therefore a good candidate for use on the parallel supercomputer Paragon XP. In the assembler version, it needs 6.4 cycles for a real*4 random number. There is a FORTRAN routine which yields identical numbers up to rare and minor rounding discrepancies, and it needs 28 cycles. The FORTRAN performance on other microprocessors is somewhat better. Arguments for the quality of the generator and some numerical tests are given.
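
    The abstract does not give the generator's recurrence, so the sketch below shows a generic additive lagged-Fibonacci generator of the long-period family it alludes to; the lags (24, 55), the modulus and the seeding scheme are assumptions for illustration, not the Paragon implementation.

```python
# Generic additive lagged-Fibonacci generator of the long-period family the
# abstract alludes to. The lags (24, 55), the 2**32 modulus and the seeding
# scheme are assumptions for illustration, not the Paragon implementation.
class LaggedFibonacci:
    def __init__(self, seed=12345, lag_short=24, lag_long=55, mod=2**32):
        self.lag_short, self.lag_long, self.mod = lag_short, lag_long, mod
        # Fill the initial state with a simple linear congruential sequence.
        state, x = [], seed
        for _ in range(lag_long):
            x = (1103515245 * x + 12345) % mod
            state.append(x)
        self.state = state

    def next_uniform(self):
        """Return the next pseudo-random real in [0, 1)."""
        x = (self.state[-self.lag_short] + self.state[-self.lag_long]) % self.mod
        self.state.append(x)
        self.state.pop(0)
        return x / self.mod

rng = LaggedFibonacci()
print([round(rng.next_uniform(), 6) for _ in range(5)])
```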

  16. New Mexico Supercomputing Challenge 1993 evaluation report. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Trainor, M.; Eker, P.; Kratzer, D.; Foster, M.; Anderson, M.

    1993-11-01

    This report provides the evaluation of the third year (1993) of the New Mexico High School Supercomputing Challenge. It includes data to determine whether we met the program objectives, measures participation, and compares progress from the first to the third years. This year's report is a more complete assessment than last year's, providing both formative and summative evaluation data. Data indicates that the 1993 Challenge significantly changed many students' career plans and attitudes toward science, provided professional development for teachers, and caused some changes in computer offerings in several participating schools.

  17. The radioactive risk - the future of radionuclides in the environment and their impacts on health

    International Nuclear Information System (INIS)

    Amiard, Jean-Claude

    2013-01-01

    This document contains a brief presentation and the table of contents of a book in which the author proposes a large synthesis of present knowledge on main radioactive pollutants (uranium, transuranic elements, caesium, strontium, iodine, tritium, carbon radioactive isotopes, and so on), their behaviour and their future in the various physical components of the environment and living organisms (including mankind). He presents the fundamentals of nuclear physics and chemistry, as well as their applications in different fields (military, energy, medicine, industry, etc.). He also addresses the important ecological and genetic notions, and recalls the anthropogenic origins of radionuclides in the environment: principles of radio-ecology, main radioactive risks, main drawbacks of the use of nuclear energy (wastes and their management), and nuclear accidents and their impact

  18. The present and future of microplastic pollution in the marine environment

    International Nuclear Information System (INIS)

    Ivar do Sul, Juliana A.; Costa, Monica F.

    2014-01-01

    Recently, research examining the occurrence of microplastics in the marine environment has substantially increased. Field and laboratory work regularly provide new evidence on the fate of microplastic debris. This debris has been observed within every marine habitat. In this study, at least 101 peer-reviewed papers investigating microplastic pollution were critically analysed (Supplementary material). Microplastics are commonly studied in relation to (1) plankton samples, (2) sandy and muddy sediments, (3) vertebrate and invertebrate ingestion, and (4) chemical pollutant interactions. All of the marine organism groups are at imminent risk of interacting with microplastics according to the available literature. Dozens of works on other relevant issues (i.e., polymer decay at sea, new sampling and laboratory methods, emerging sources, externalities) were also analysed and discussed. This paper provides the first in-depth exploration of the effects of microplastics on the marine environment and biota. The number of scientific publications will increase in response to present and projected plastic uses and discard patterns. Therefore, new themes and important approaches for future work are proposed. Highlights: • >100 works on microplastic marine pollution were reviewed and discussed. • Microplastics (fibres, fragments, pellets) are widespread in oceans and sediments. • Microplastics interact with POPs and contaminate the marine biota when ingested. • The marine food web might be affected by microplastic biomagnification. • Urgently needed integrated approaches are suggested to different stakeholders. -- Microplastics, which are ubiquitous in marine habitats, affect all facets of the environment and continuously cause unexpected consequences for the environment and its biota

  19. [Perception of their social environment and their future in institutionalized school-age children].

    Science.gov (United States)

    Remetić, Mirjana; Tahirović, Husref; Loga, Slobodan

    2005-01-01

    Family homes and institutions for children without parental care are the rearing environments in which, from the early years, the whole of human development takes place. It is known today that, despite the recognized importance of inborn traits, the influence of child-rearing environments dominates current models of development. The aim of the study was to investigate institutionalized school-aged children's satisfaction with their rearing environment, their dominant feelings, and whether institutionalization affects their optimism about the present and the future. The study was conducted in two institutions in Bosnia and Herzegovina that share the same care model, imitating traditional Bosnian families in which the older children care for the younger siblings. The sample comprised 30 institutionalized children aged 8-12, with a control group of 60 children matched by age and sex. Parents, children and teachers who gave their informed consent answered the questionnaires. The study confirmed that children without parental care are a vulnerable group at great risk who need the urgent help of a professional multidisciplinary team from their close and broader environment. Lack of social support causes withdrawal and suffering and can sooner or later lead to problematic behaviour.

  20. Strategic planning for future learning environments: an exploration of interpersonal, interprofessional and political factors.

    Science.gov (United States)

    Schmidt, Cathrine

    2013-09-01

    This article, written from the stance of a public planner and policy maker, explores the challenges and potential in creating future learning environments through the concept of a new learning landscape. It is based on the belief that physical planning can support the strategic goals of universities. In Denmark, a political focus on education as a means to improve national capacity for innovation and growth is redefining the universities' role in society. This is in turn changing the circumstances for physical planning. Drawing on examples of physical initiatives at three different scales (city, building and room), the paper highlights how space and place matter on an interpersonal, an interprofessional and a political level. The article suggests that a wider understanding of how new learning landscapes are created, both as a material reality and as a political discourse, can help frame an emerging community of practice. This involves university leaders, faculty and students, architects, designers and urban planners, citizens and policy makers with the common goal of creating future learning environments today.

  1. Perspective: Environment, biodiversity, and the education of the physician of the future.

    Science.gov (United States)

    Gómez, Andrés; Balsari, Satchit; Nusbaum, Julie; Heerboth, Aaron; Lemery, Jay

    2013-02-01

    Ours is an age of unprecedented levels of environmental alteration and biodiversity loss. Beyond the exposure to environmental hazards, conditions such as environmental degradation, biotic impoverishment, climate change, and the loss of ecosystem services create important health threats by changing the ecology of many pathogens and increasing the incidence and/or severity of certain noncommunicable conditions. They also threaten health in the future by weakening the Earth's life support systems. Although physicians remain one of the most often accessed and most trusted sources of information about the environment, there is currently little emphasis on educating medical professionals about these environmental issues. This lack of training reduces the ability of most physicians to be efficient science-public interfaces and makes them ineffective at contributing to addressing the fundamental causes of environmental problems or participating in substantive environmental policy discussions. This is an important challenge facing medical education today. To turn medical students into effective physician-citizens, an already-overwhelmed medical school curriculum must make way for a thoughtful exploration of environmental stressors and their impacts on human health. The overarching question before medical educators is how to develop the competencies, standards, and curricula for this educational endeavor. To this end, the authors highlight some of the critical linkages between health and the environment and suggest a subset of key practical issues that need to be addressed in order to create environmental education standards for the physician of the future.

  2. IQ, the Urban Environment, and Their Impact on Future Schizophrenia Risk in Men.

    Science.gov (United States)

    Toulopoulou, Timothea; Picchioni, Marco; Mortensen, Preben Bo; Petersen, Liselotte

    2017-09-01

    Exposure to an urban environment during early life and low IQ are 2 well-established risk factors for schizophrenia. It is not known, however, how these factors might relate to one another. Data were pooled from the North Jutland regional draft board IQ assessments and the Danish Conscription Registry for men born between 1955 and 1993. Excluding those who were followed up for less than 1 year after the assessment yielded a final cohort of 153,170 men, of whom 578 later developed a schizophrenia spectrum disorder. We found significant effects of having an urban birth, and also of experiencing an increase in urbanicity before the age of 10 years, on adult schizophrenia risk. The effect of urban birth was independent of IQ. However, there was a significant interaction between childhood changes in urbanization in the first 10 years and IQ level on the future adult schizophrenia risk. In short, those subjects who moved to more or less urban areas before their 10th birthday lost the protective effect of IQ. When thinking about adult schizophrenia risk, the critical time window of childhood sensitivity to changes in urbanization seems to be linked to IQ. Given the prediction that by 2050, over 80% of the developed world's population will live in an urban environment, this represents a major future public health issue. © The Author 2017. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  3. The TeraGyroid Experiment – Supercomputing 2003

    Directory of Open Access Journals (Sweden)

    R.J. Blake

    2005-01-01

    Full Text Available Amphiphiles are molecules with hydrophobic tails and hydrophilic heads. When dispersed in solvents, they self assemble into complex mesophases including the beautiful cubic gyroid phase. The goal of the TeraGyroid experiment was to study defect pathways and dynamics in these gyroids. The UK's supercomputing and USA's TeraGrid facilities were coupled together, through a dedicated high-speed network, into a single computational Grid for research work that peaked around the Supercomputing 2003 conference. The gyroids were modeled using lattice Boltzmann methods, with parameter spaces explored using many 128³-grid-point simulations, this data being used to inform the world's largest three-dimensional time-dependent simulation, with 1024³ grid points. The experiment generated some 2 TBytes of useful data. In terms of Grid technology the project demonstrated the migration of simulations (using Globus middleware) to and fro across the Atlantic, exploiting the availability of resources. Integration of the systems accelerated the time to insight. Distributed visualisation of the output datasets enabled the parameter space of the interactions within the complex fluid to be explored from a number of sites, informed by discourse over the Access Grid. The project was sponsored by EPSRC (UK) and NSF (USA), with trans-Atlantic optical bandwidth provided by British Telecommunications.
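
    For readers unfamiliar with the method named in the record, the following minimal C sketch shows a single-site D2Q9 BGK collision step, the basic building block of lattice Boltzmann simulations; the amphiphilic-fluid model actually used in TeraGyroid carries additional species, interaction forces and coupling terms that are not reproduced here, and all numerical values below are illustrative.

        /*
         * Minimal single-component D2Q9 BGK collision step, as an illustration of
         * the lattice Boltzmann method; the TeraGyroid amphiphile model adds
         * further species and forcing terms not shown here.
         */
        #include <stdio.h>

        #define Q 9
        static const int    ex[Q] = { 0, 1, 0,-1, 0, 1,-1,-1, 1 };
        static const int    ey[Q] = { 0, 0, 1, 0,-1, 1, 1,-1,-1 };
        static const double w[Q]  = { 4.0/9.0,
                                      1.0/9.0, 1.0/9.0, 1.0/9.0, 1.0/9.0,
                                      1.0/36.0,1.0/36.0,1.0/36.0,1.0/36.0 };

        /* Relax the distributions f[] of one lattice site toward equilibrium. */
        static void bgk_collide(double f[Q], double tau)
        {
            double rho = 0.0, ux = 0.0, uy = 0.0;

            /* Macroscopic moments: density and momentum. */
            for (int i = 0; i < Q; i++) {
                rho += f[i];
                ux  += f[i] * ex[i];
                uy  += f[i] * ey[i];
            }
            ux /= rho;
            uy /= rho;

            /* BGK relaxation toward the second-order equilibrium distribution. */
            double usq = ux * ux + uy * uy;
            for (int i = 0; i < Q; i++) {
                double eu  = ex[i] * ux + ey[i] * uy;
                double feq = w[i] * rho * (1.0 + 3.0*eu + 4.5*eu*eu - 1.5*usq);
                f[i] += (feq - f[i]) / tau;
            }
        }

        int main(void)
        {
            /* Start from a slightly perturbed rest state and collide once. */
            double f[Q];
            for (int i = 0; i < Q; i++) f[i] = w[i];
            f[1] += 0.01;                     /* small momentum in +x */

            bgk_collide(f, 1.0);              /* tau = 1 relaxes fully to equilibrium */
            for (int i = 0; i < Q; i++) printf("f[%d] = %.6f\n", i, f[i]);
            return 0;
        }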

  4. Plane-wave electronic structure calculations on a parallel supercomputer

    International Nuclear Information System (INIS)

    Nelson, J.S.; Plimpton, S.J.; Sears, M.P.

    1993-01-01

    The development of iterative solutions of Schrödinger's equation in a plane-wave (pw) basis over the last several years has coincided with great advances in the computational power available for performing the calculations. These dual developments have enabled many new and interesting condensed matter phenomena to be studied from a first-principles approach. The authors present a detailed description of the implementation on a parallel supercomputer (hypercube) of the first-order equation-of-motion solution to Schrödinger's equation, using plane-wave basis functions and ab initio separable pseudopotentials. By distributing the plane waves across the processors of the hypercube, many of the computations can be performed in parallel, resulting in decreases in the overall computation time relative to conventional vector supercomputers. This partitioning also provides ample memory for large Fast Fourier Transform (FFT) meshes and the storage of plane-wave coefficients for many hundreds of energy bands. The usefulness of the parallel techniques is demonstrated by benchmark timings for both the FFTs and iterations of the self-consistent solution of Schrödinger's equation for different-sized Si unit cells of up to 512 atoms
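
    A rough idea of the data decomposition described above can be given in a few lines: each processor holds only its own slice of plane-wave coefficients, and a global reduction combines the partial sums. The C/MPI sketch below is an assumption-laden illustration (the original hypercube code predates MPI, and the G-vectors and coefficients here are placeholders), not the authors' implementation.

        /*
         * Sketch of distributing plane-wave coefficients over MPI ranks and
         * forming a band kinetic energy with a global reduction; an illustration
         * of the data decomposition idea, not the original hypercube code.
         * Units and the G-vector set here are placeholders.
         */
        #include <mpi.h>
        #include <math.h>
        #include <stdio.h>
        #include <stdlib.h>

        int main(int argc, char **argv)
        {
            MPI_Init(&argc, &argv);
            int rank, nprocs;
            MPI_Comm_rank(MPI_COMM_WORLD, &rank);
            MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

            const int npw_total = 100000;                 /* total plane waves   */
            int npw_local = npw_total / nprocs
                          + (rank < npw_total % nprocs);  /* block distribution  */

            /* Each rank stores only its slice of coefficients c(G) and |G|^2. */
            double *c  = malloc(npw_local * sizeof *c);
            double *g2 = malloc(npw_local * sizeof *g2);
            for (int i = 0; i < npw_local; i++) {
                g2[i] = (double)(rank + 1) * (i + 1) * 1.0e-3; /* fake |G|^2      */
                c[i]  = exp(-g2[i]);                           /* fake coefficient */
            }

            /* Local contribution to the kinetic energy 0.5 * sum_G |G|^2 |c(G)|^2. */
            double ekin_local = 0.0;
            for (int i = 0; i < npw_local; i++)
                ekin_local += 0.5 * g2[i] * c[i] * c[i];

            /* Global reduction combines the partial sums from every processor. */
            double ekin = 0.0;
            MPI_Allreduce(&ekin_local, &ekin, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);

            if (rank == 0)
                printf("kinetic energy (arbitrary units) = %f\n", ekin);

            free(c);
            free(g2);
            MPI_Finalize();
            return 0;
        }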

  5. Studying social interactions through immersive virtual environment technology: Virtues, pitfalls, and future challenges

    Directory of Open Access Journals (Sweden)

    Dario eBombari

    2015-06-01

    Full Text Available The goal of the present review is to explain how Immersive Virtual Environment Technology (IVET) can be used for the study of social interactions and how the use of virtual humans in immersive virtual environments can advance research and application in many different fields. Researchers studying individual differences in social interactions are typically interested in keeping the behavior and the appearance of the interaction partner constant across participants. With IVET researchers have full control over the interaction partners and can standardize them while still keeping the simulation realistic. Virtual simulations are valid: growing evidence shows that studies conducted with IVET can indeed replicate some well-known findings of social psychology. Moreover, IVET allows researchers to subtly manipulate characteristics of the environment (e.g., visual cues to prime participants) or of the social partner (e.g., his/her race) to investigate their influences on participants' behavior and cognition. Furthermore, manipulations that would be difficult or impossible in real life (e.g., changing participants' height) can be easily obtained with IVET. Beside the advantages for theoretical research, we explore the most recent training and clinical applications of IVET, its integration with other technologies (e.g., social sensing) and future challenges for researchers (e.g., making the communication between virtual humans and participants smoother).

  6. Studying social interactions through immersive virtual environment technology: virtues, pitfalls, and future challenges.

    Science.gov (United States)

    Bombari, Dario; Schmid Mast, Marianne; Canadas, Elena; Bachmann, Manuel

    2015-01-01

    The goal of the present review is to explain how immersive virtual environment technology (IVET) can be used for the study of social interactions and how the use of virtual humans in immersive virtual environments can advance research and application in many different fields. Researchers studying individual differences in social interactions are typically interested in keeping the behavior and the appearance of the interaction partner constant across participants. With IVET researchers have full control over the interaction partners, can standardize them while still keeping the simulation realistic. Virtual simulations are valid: growing evidence shows that indeed studies conducted with IVET can replicate some well-known findings of social psychology. Moreover, IVET allows researchers to subtly manipulate characteristics of the environment (e.g., visual cues to prime participants) or of the social partner (e.g., his/her race) to investigate their influences on participants' behavior and cognition. Furthermore, manipulations that would be difficult or impossible in real life (e.g., changing participants' height) can be easily obtained with IVET. Beside the advantages for theoretical research, we explore the most recent training and clinical applications of IVET, its integration with other technologies (e.g., social sensing) and future challenges for researchers (e.g., making the communication between virtual humans and participants smoother).

  7. Passive BCI in Operational Environments: Insights, Recent Advances, and Future Trends.

    Science.gov (United States)

    Arico, Pietro; Borghini, Gianluca; Di Flumeri, Gianluca; Sciaraffa, Nicolina; Colosimo, Alfredo; Babiloni, Fabio

    2017-07-01

    This minireview aims to highlight recent important aspects to consider and evaluate when passive brain-computer interface (pBCI) systems are developed and used in operational environments, and outlines future directions for their applications. Electroencephalography (EEG) based pBCI has become an important tool for real-time analysis of brain activity since it can potentially provide, covertly (without distracting the user from the main task) and objectively (unaffected by the subjective judgment of an observer or of the user), information about the operator's cognitive state. Different examples of pBCI applications in operational environments and new adaptive interface solutions are presented and described. In addition, a general overview is provided regarding the correct use of machine learning techniques in the pBCI field (e.g., which algorithm to use, common pitfalls to avoid, etc.). Despite recent innovations in algorithms and neurotechnology, pBCI systems are not yet completely ready to enter the market, mainly due to limitations of EEG electrode technology and to the reliability and capability of algorithms in real settings. High-complexity and safety-critical systems (e.g., airplanes, ATM interfaces) should adapt their behavior and functionality according to the user's actual mental state. Thus, technologies (i.e., pBCIs) able to measure the user's mental state in real time would prove very useful in such "high risk" environments to enhance human-machine interaction and so increase overall safety.

  8. Harnessing Petaflop-Scale Multi-Core Supercomputing for Problems in Space Science

    Science.gov (United States)

    Albright, B. J.; Yin, L.; Bowers, K. J.; Daughton, W.; Bergen, B.; Kwan, T. J.

    2008-12-01

    The particle-in-cell kinetic plasma code VPIC has been migrated successfully to the world's fastest supercomputer, Roadrunner, a hybrid multi-core platform built by IBM for the Los Alamos National Laboratory. How this was achieved will be described and examples of state-of-the-art calculations in space science, in particular, the study of magnetic reconnection, will be presented. With VPIC on Roadrunner, we have performed, for the first time, plasma PIC calculations with over one trillion particles, >100× larger than calculations considered "heroic" by community standards. This allows examination of physics at unprecedented scale and fidelity. Roadrunner is an example of an emerging paradigm in supercomputing: the trend toward multi-core systems with deep hierarchies and where memory bandwidth optimization is vital to achieving high performance. Getting VPIC to perform well on such systems is a formidable challenge: the core algorithm is memory bandwidth limited with low compute-to-data ratio and requires random access to memory in its inner loop. That we were able to get VPIC to perform and scale well, achieving >0.374 Pflop/s and linear weak scaling on real physics problems on up to the full 12240-core Roadrunner machine, bodes well for harnessing these machines for our community's needs in the future. Many of the design considerations encountered carry over to other multi-core and accelerated (e.g., via GPU) platforms, and we modified VPIC with flexibility in mind. These will be summarized and strategies for how one might adapt a code for such platforms will be shared. Work performed under the auspices of the U.S. DOE by the LANS LLC Los Alamos National Laboratory. Dr. Bowers is a LANL Guest Scientist; he is presently at D. E. Shaw Research LLC, 120 W 45th Street, 39th Floor, New York, NY 10036.
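
    The memory-access pattern that makes such codes bandwidth-limited can be seen in a toy example: every particle performs an indirect read of grid data in the inner loop. The C sketch below is a deliberately simplified 1D electrostatic push with linear interpolation, not VPIC's relativistic electromagnetic algorithm; all sizes and field values are placeholders.

        /*
         * Minimal 1D electrostatic particle push with a linear (CIC) field gather.
         * Illustrates the memory-access pattern the abstract refers to: every
         * particle performs an indirect, effectively random read of grid data in
         * the inner loop.  This is not VPIC's algorithm.
         */
        #include <stdio.h>
        #include <stdlib.h>

        #define NG 1024              /* grid points */
        #define NP 100000            /* particles   */

        static double E[NG];         /* electric field on grid          */
        static double x[NP], v[NP];  /* particle positions / velocities */

        int main(void)
        {
            const double L = 1.0, dx = L / NG, dt = 1.0e-3, qm = -1.0;

            /* Simple initial condition: uniform field, random positions. */
            for (int g = 0; g < NG; g++) E[g] = 0.01;
            for (int p = 0; p < NP; p++) {
                x[p] = (double)rand() / RAND_MAX * L;
                v[p] = 0.0;
            }

            for (int step = 0; step < 10; step++) {
                for (int p = 0; p < NP; p++) {
                    /* Gather: indirect grid access indexed by the particle position. */
                    double s  = x[p] / dx;
                    int    g  = (int)s;
                    double wR = s - g;                    /* linear weights */
                    int    gR = (g + 1) % NG;
                    double Ep = (1.0 - wR) * E[g] + wR * E[gR];

                    /* Push: leapfrog velocity and position update. */
                    v[p] += qm * Ep * dt;
                    x[p] += v[p] * dt;

                    /* Periodic boundary. */
                    if (x[p] >= L) x[p] -= L;
                    if (x[p] <  0) x[p] += L;
                }
            }
            printf("x[0] = %f, v[0] = %f\n", x[0], v[0]);
            return 0;
        }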

  9. How to place your bet on the future coastal environment (Invited)

    Science.gov (United States)

    Plant, N. G.; Thieler, E. R.; Gutierrez, B. T.

    2010-12-01

    The future forms and functions of many coastal areas are unknown due to uncertainty in the future forces that will drive evolution (e.g., rate and magnitude of sea-level rise, changes in storminess) and due to limitations in our understanding of some of the evolutionary processes (e.g., coastal erosion, habitat response, human response). Anyone or anything (e.g., home owners, government officials, flora and fauna) making a decision that depends on the future coastal environment is essentially “betting” on one or more possible scenarios. Ideally, they should be able to evaluate the probability of winning (e.g., successfully protecting homes from storm surge, attracting tourists to beaches, preventing habitat loss). The USGS is integrating scientific understanding and uncertainty with coastal management problems in order to provide better assessments of existing bets and to evaluate the risk associated with some future sea-level rise scenarios. Here, we describe a focused study at Assateague Island National Seashore that includes both infrastructure and habitat that are sensitive to sea-level rise and climate change. Some coastal management bets are not overly sensitive to environmental uncertainty. This is because it may only be necessary to know whether environmental thresholds have been crossed (e.g., shoreline or water level intersects specific infrastructure or habitat boundaries) rather than knowing exact values for environmental parameters. Other bets are less clearly framed. Thus, coastal management questions provide the limits of integration of a probabilistic problem and allow complicated scientific understanding to be expressed as the risk of a particular bet. Nature holds the cards in this game, and investment in robust prediction is our ante. Who’s in?

  10. KfK seminar series on supercomputing and visualization from May to September 1992

    International Nuclear Information System (INIS)

    Hohenhinnebusch, W.

    1993-05-01

    During the period from May 1992 to September 1992 a series of seminars was held at KfK on several topics of supercomputing in different fields of application. The aim was to demonstrate the importance of supercomputing and visualization in numerical simulations of complex physical and technical phenomena. This report contains the collection of all submitted seminar papers. (orig./HP)

  11. Japan Environment and Children's Study: backgrounds, activities, and future directions in global perspectives.

    Science.gov (United States)

    Ishitsuka, Kazue; Nakayama, Shoji F; Kishi, Reiko; Mori, Chisato; Yamagata, Zentaro; Ohya, Yukihiro; Kawamoto, Toshihiro; Kamijima, Michihiro

    2017-07-14

    There is worldwide concern about the effects of environmental factors on children's health and development. The Miami Declaration was signed at the G8 Environment Ministers Meeting in 1997 to promote children's environmental health research. The following ministerial meetings continued to emphasize the need to foster children's research. In response to such a worldwide movement, the Ministry of the Environment, Japan (MOE), launched a nationwide birth cohort study with 100,000 pairs of mothers and children, namely, the Japan Environment and Children's Study (JECS), in 2010. Other countries have also started or planned large-scale studies focusing on children's environmental health issues. The MOE initiated dialogue among those countries and groups to discuss and share the various processes, protocols, knowledge, and techniques for future harmonization and data pooling among such studies. The MOE formed the JECS International Liaison Committee in 2011, which plays a primary role in promoting the international collaboration between JECS and the other children's environmental health research projects and partnership with other countries. This review article aims to present activities that JECS has developed. As one of the committee's activities, a workshop and four international symposia were held between 2011 and 2015 in Japan. In these conferences, international researchers and government officials, including those from the World Health Organization, have made presentations on their own birth cohort studies and health policies. In 2015, the MOE hosted the International Advisory Board meeting and received constructive comments and recommendations from the board. JECS is a founding member of the Environment and Child Health International Birth Cohort Group, and has discussed harmonization of exposure and outcome measurements with member parties, which will make it possible to compare and further combine data from different studies, considering the diversity in the

  12. Futurism.

    Science.gov (United States)

    Foy, Jane Loring

    The objectives of this research report are to gain insight into the main problems of the future and to ascertain the attitudes that the general population has toward the treatment of these problems. In the first section of this report the future is explored socially, psychologically, and environmentally. The second section describes the techniques…

  13. Protecting the environment for future generations. Principles and actors in international environmental law

    Energy Technology Data Exchange (ETDEWEB)

    Proelss, Alexander (ed.) [Trier Univ. (Germany). Inst. of Environmental and Technology Law]

    2017-08-01

    This book compiles the written versions of presentations delivered on the occasion of an international symposium entitled "Protecting the Environment for Future Generations - Principles and Actors in International Environmental Law". The symposium was organized by the Institute of Environmental and Technology Law of Trier University (IUTR) on the basis of a cooperation scheme with the Environmental Law Institute of the Johannes Kepler University Linz, Austria, and took place in Trier on 29-30 October 2015. It brought together a distinguished group of experts from Europe and abroad to address current issues of international and European environmental law. The main objective of the symposium was to take stock of the actors and principles of international and European environmental law, and to analyze how and to what extent these principles have been implemented on the supranational and domestic legal levels.

  14. Cities of the future-bionic systems of new urban environment.

    Science.gov (United States)

    Krzemińska, Alicja Edyta; Zaręba, Anna Danuta; Dzikowska, Anna; Jarosz, Katarzyna Rozalia

    2017-12-07

    The concepts of the cities we know nowadays, and which we are accustomed to, are changing at a very rapid pace, as is the philosophy of their design. Future design will be based on new standards, entering a completely different, futuristic dimension. This stage is related to changes in the perception of space and location and to a lack of belonging to definite national or cultural structures. Cities of the future are above all intelligent, zero-energy, zero-waste, environmentally sustainable cities, self-sufficient in terms of both organic food production and symbiosis between the environment and industry. New cities will be able to adopt new organisational structures: either city states or apolitical, jigsaw-like structures that can change their position, as in the case of the city of Artisanopolis, designed as a floating city close to the land and reminiscent of the legendary Atlantis. This paper focuses on the main issues connected with contemporary city planning. The purpose of the research was to identify existing technological solutions whose aim is to use solar energy and urban greenery. The studies were based on literature related to future city development and on futuristic projects by architects and city planners. In the paper, the following issues have been examined: futuristic cities and districts, and original bionic buildings, both residential and industrial. The results of the analysis are presented in tabular form.

  15. Behavioral studies on anxiety and depression in a drug discovery environment: keys to a successful future.

    Science.gov (United States)

    Bouwknecht, J Adriaan

    2015-04-15

    The review describes a personal journey through 25 years of animal research with a focus on the contribution of rodent models of anxiety and depression to the development of new medicines in a drug discovery environment. Several classic acute models for mood disorders are briefly described, as well as chronic stress and disease-induction models. The paper highlights a variety of factors that influence the quality and consistency of behavioral data in a laboratory setting. The importance of meta-analysis techniques for study validation (tolerance interval) and assay sensitivity (Monte Carlo modeling) is demonstrated by examples that use historic data. It is essential for the successful discovery of new potential drugs to maintain a high level of control in animal research and to bridge knowledge across in silico modeling, and in vitro and in vivo assays. Today, drug discovery is a highly dynamic environment in search of new types of treatments and new animal models, which should be guided by enhanced two-way translation between bench and bedside. Although productivity has been disappointing in the search for new and better medicines in psychiatry over the past decades, there has been and will always be an important role for in vivo models in-between preclinical discovery and clinical development. The right balance between good science and proper judgment versus a decent level of innovation, assay development and two-way translation will open the doors to a very bright future. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Review of the ASDEX upgrade data acquisition environment - present operation and future requirements

    International Nuclear Information System (INIS)

    Behler, K.; Blank, H.; Buhler, A.; Drube, R.; Friedrich, H.; Foerster, K.; Hallatschek, K.; Heimann, P.; Hertweck, F.; Maier, J.; Heimann, R.; Merkel, R.; Pacco-Duechs, M.-G.; Raupp, G.; Reuter, H.; Schneider-Maxon, U.; Tisma, R.; Zilker, M.

    1999-01-01

    The data acquisition environment of the ASDEX upgrade fusion experiment was designed in the late 1980s to handle a predicted quantity of 8 Mbytes of data per discharge. After 7 years of operation a review of the whole data acquisition and analysis environment shows what remains of the original design ideas. Comparing the original 15 diagnostics with the present set of 250 diagnostic datasets generated per shot shows how the system has grown. Although now a vast accumulation of functional parts, the system still works in a stable manner and is maintainable. The underlying concepts affirming these qualities are modularity and compatibility. Modularity ensures that most parts of the system can be modified without affecting others. Standards for data structures and interfaces between components and methods are the prerequisites which make modularity work. The experience of the last few years shows that, besides the standards achieved, new, mainly real-time, features are needed: real-time event recognition allowing reaction to complex changing conditions; real-time wavelet analysis allowing adapted sampling rates; real-time data exchange between diagnostics and control; real-time networks allowing flexible computer coupling to permit interplay between different components. Object-oriented programming concepts and databases are required for readily adaptable software modules. A final assessment of our present data processing situation and future requirements shows that modern information technology methods have to be applied more intensively to provide the most flexible means to improve the interaction of all components on a large fusion device. (orig.)

  17. The present and future of microplastic pollution in the marine environment.

    Science.gov (United States)

    Ivar do Sul, Juliana A; Costa, Monica F

    2014-02-01

    Recently, research examining the occurrence of microplastics in the marine environment has substantially increased. Field and laboratory work regularly provide new evidence on the fate of microplastic debris. This debris has been observed within every marine habitat. In this study, at least 101 peer-reviewed papers investigating microplastic pollution were critically analysed (Supplementary material). Microplastics are commonly studied in relation to (1) plankton samples, (2) sandy and muddy sediments, (3) vertebrate and invertebrate ingestion, and (4) chemical pollutant interactions. All of the marine organism groups are at imminent risk of interacting with microplastics according to the available literature. Dozens of works on other relevant issues (i.e., polymer decay at sea, new sampling and laboratory methods, emerging sources, externalities) were also analysed and discussed. This paper provides the first in-depth exploration of the effects of microplastics on the marine environment and biota. The number of scientific publications will increase in response to present and projected plastic uses and discard patterns. Therefore, new themes and important approaches for future work are proposed. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. SUPERCOMPUTER SIMULATION OF CRITICAL PHENOMENA IN COMPLEX SOCIAL SYSTEMS

    Directory of Open Access Journals (Sweden)

    Petrus M.A. Sloot

    2014-09-01

    Full Text Available The paper describes the problem of computer simulation of critical phenomena in complex social systems on petascale computing systems within the framework of the complex networks approach. A three-layer system of nested complex network models is proposed, including an aggregated analytical model to identify critical phenomena, a detailed model of individualized network dynamics, and a model to adjust the topological structure of a complex network. A scalable parallel algorithm covering all layers of the complex network simulation is proposed. Performance of the algorithm is studied on different supercomputing systems. The issues of software and information infrastructure for complex network simulation are discussed, including the organization of distributed calculations, crawling of social network data, and visualization of results. Applications of the developed methods and technologies are considered, including simulation of criminal network disruption, fast rumor spreading in social networks, evolution of financial networks, and epidemic spreading.
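
    As a minimal, serial illustration of the kind of individualized network dynamics the paper simulates at petascale, the C sketch below runs an SI-type rumor/epidemic spreading process over a small random graph; the graph construction, transmission probability and network size are arbitrary assumptions and do not reflect the paper's actual three-layer implementation.

        /*
         * Serial sketch of an SI-type spreading process (rumor/epidemic) on a
         * random network stored as an adjacency list.  A toy stand-in for the
         * "individualized network dynamics" layer; the petascale framework in the
         * paper is far more elaborate.
         */
        #include <stdio.h>
        #include <stdlib.h>

        #define N      2000          /* nodes                         */
        #define DEGREE 8             /* edges added per node (approx) */

        static int *adj[N];          /* adjacency lists               */
        static int  deg[N];
        static int  state[N];        /* 0 = susceptible, 1 = informed */

        static void add_edge(int a, int b)
        {
            adj[a] = realloc(adj[a], (deg[a] + 1) * sizeof(int));
            adj[b] = realloc(adj[b], (deg[b] + 1) * sizeof(int));
            adj[a][deg[a]++] = b;
            adj[b][deg[b]++] = a;
        }

        int main(void)
        {
            srand(42);

            /* Build a simple random graph. */
            for (int i = 0; i < N; i++)
                for (int k = 0; k < DEGREE / 2; k++)
                    add_edge(i, rand() % N);

            state[0] = 1;                          /* seed the rumor at one node    */
            const double beta = 0.2;               /* per-contact transmission prob. */

            for (int t = 0; t < 50; t++) {
                int informed = 0;
                for (int i = 0; i < N; i++) {
                    if (state[i] != 1) continue;
                    for (int k = 0; k < deg[i]; k++) {
                        int j = adj[i][k];
                        if (state[j] == 0 && (double)rand() / RAND_MAX < beta)
                            state[j] = 2;          /* mark as newly informed */
                    }
                }
                for (int i = 0; i < N; i++) {
                    if (state[i] == 2) state[i] = 1;
                    if (state[i] == 1) informed++;
                }
                printf("step %2d: informed = %d\n", t, informed);
            }
            return 0;
        }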

  19. Lectures in Supercomputational Neurosciences Dynamics in Complex Brain Networks

    CERN Document Server

    Graben, Peter beim; Thiel, Marco; Kurths, Jürgen

    2008-01-01

    Computational Neuroscience is a burgeoning field of research where only the combined effort of neuroscientists, biologists, psychologists, physicists, mathematicians, computer scientists, engineers and other specialists, e.g. from linguistics and medicine, seem to be able to expand the limits of our knowledge. The present volume is an introduction, largely from the physicists' perspective, to the subject matter with in-depth contributions by system neuroscientists. A conceptual model for complex networks of neurons is introduced that incorporates many important features of the real brain, such as various types of neurons, various brain areas, inhibitory and excitatory coupling and the plasticity of the network. The computational implementation on supercomputers, which is introduced and discussed in detail in this book, will enable the readers to modify and adapt the algorithm for their own research. Worked-out examples of applications are presented for networks of Morris-Lecar neurons to model the cortical co...
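
    Since the worked examples in the book are built on Morris-Lecar neurons, a single such neuron integrated with forward Euler is sketched below in C; the parameter values are common textbook choices used here purely for illustration, and the synaptic coupling and plasticity that turn single cells into the book's cortical networks are not shown.

        /*
         * Single Morris-Lecar neuron integrated with forward Euler.  Parameter
         * values are illustrative textbook choices, not taken from the book.
         * Compile with the math library (-lm).
         */
        #include <math.h>
        #include <stdio.h>

        int main(void)
        {
            /* Illustrative parameters (units: mV, ms, uA/cm^2, mS/cm^2). */
            const double C = 20.0, gL = 2.0, VL = -60.0, gCa = 4.4, VCa = 120.0;
            const double gK = 8.0, VK = -84.0, V1 = -1.2, V2 = 18.0;
            const double V3 = 2.0, V4 = 30.0, phi = 0.04, Iapp = 90.0;

            double V = -60.0, w = 0.0;      /* membrane potential, K+ gating variable */
            const double dt = 0.05;         /* time step in ms                        */

            for (int step = 0; step <= 4000; step++) {
                double minf = 0.5 * (1.0 + tanh((V - V1) / V2));
                double winf = 0.5 * (1.0 + tanh((V - V3) / V4));
                double tauw = 1.0 / cosh((V - V3) / (2.0 * V4));

                double dV = (Iapp - gL * (V - VL) - gCa * minf * (V - VCa)
                             - gK * w * (V - VK)) / C;
                double dw = phi * (winf - w) / tauw;

                V += dt * dV;
                w += dt * dw;

                if (step % 200 == 0)
                    printf("t = %6.1f ms  V = %8.3f mV  w = %.4f\n", step * dt, V, w);
            }
            return 0;
        }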

  20. Development of a Cloud Resolving Model for Heterogeneous Supercomputers

    Science.gov (United States)

    Sreepathi, S.; Norman, M. R.; Pal, A.; Hannah, W.; Ponder, C.

    2017-12-01

    A cloud resolving climate model is needed to reduce major systematic errors in climate simulations due to structural uncertainty in numerical treatments of convection - such as convective storm systems. This research describes the porting effort to enable the SAM (System for Atmosphere Modeling) cloud resolving model to run on heterogeneous supercomputers using GPUs (Graphical Processing Units). We have isolated a standalone configuration of SAM that is targeted to be integrated into the DOE ACME (Accelerated Climate Modeling for Energy) Earth System model. We have identified key computational kernels from the model and offloaded them to a GPU using the OpenACC programming model. Furthermore, we are investigating various optimization strategies intended to enhance GPU utilization, including loop fusion/fission, coalesced data access and loop refactoring to a higher abstraction level. We will present early performance results and lessons learned, as well as optimization strategies. The computational platform used in this study is the Summitdev system, an early testbed that is one generation removed from Summit, the next leadership-class supercomputer at Oak Ridge National Laboratory. The system contains 54 nodes, each with 2 IBM POWER8 CPUs and 4 NVIDIA Tesla P100 GPUs. This work is part of a larger project, the ACME-MMF component of the U.S. Department of Energy (DOE) Exascale Computing Project. The ACME-MMF approach addresses structural uncertainty in cloud processes by replacing traditional parameterizations with cloud resolving "superparameterization" within each grid cell of the global climate model. Superparameterization dramatically increases arithmetic intensity, making the MMF approach an ideal strategy to achieve good performance on emerging exascale computing architectures. The goal of the project is to integrate superparameterization into ACME, and explore its full potential to scientifically and computationally advance climate simulation and prediction.
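
    The directive pattern referred to above (keeping data resident on the device, fusing loops, and collapsing a loop nest into one kernel) can be illustrated with a small OpenACC example. The sketch below is written in C for brevity, whereas the actual SAM kernels are Fortran and far larger; the array sizes, the physics updates and the compile command mentioned in the comment are all assumptions.

        /*
         * Minimal OpenACC offload of a fused loop nest, illustrating the
         * programming model described in the abstract.  Only the directive
         * pattern is meaningful here; the "physics" is made up.  Assumed compile
         * command with an OpenACC compiler: "nvc -acc example.c".
         */
        #include <stdio.h>

        #define NX 256
        #define NY 256

        int main(void)
        {
            static double t[NX][NY], q[NX][NY];

            for (int i = 0; i < NX; i++)
                for (int j = 0; j < NY; j++) { t[i][j] = 300.0; q[i][j] = 0.01; }

            /* Keep the arrays resident on the device across the whole region. */
            #pragma acc data copy(t, q)
            {
                for (int step = 0; step < 100; step++) {
                    /* Two updates fused into one kernel to raise arithmetic
                       intensity and avoid an extra pass over memory. */
                    #pragma acc parallel loop collapse(2)
                    for (int i = 0; i < NX; i++) {
                        for (int j = 0; j < NY; j++) {
                            t[i][j] += 0.001 * (q[i][j] - 0.01);   /* heating term   */
                            q[i][j] *= 0.9999;                     /* moisture decay */
                        }
                    }
                }
            }
            printf("t[0][0] = %f  q[0][0] = %f\n", t[0][0], q[0][0]);
            return 0;
        }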

  1. Solving global shallow water equations on heterogeneous supercomputers.

    Directory of Open Access Journals (Sweden)

    Haohuan Fu

    Full Text Available The scientific demand for more accurate modeling of the climate system calls for more computing power to support higher resolutions, inclusion of more component models, more complicated physics schemes, and larger ensembles. As recent improvements in computing power mostly come from the increasing number of nodes in a system and the integration of heterogeneous accelerators, how to scale the computing problems onto more nodes and various kinds of accelerators has become a challenge for model development. This paper describes our efforts on developing a highly scalable framework for performing global atmospheric modeling on heterogeneous supercomputers equipped with various accelerators, such as GPU (Graphic Processing Unit), MIC (Many Integrated Core), and FPGA (Field Programmable Gate Array) cards. We propose a generalized partition scheme of the problem domain, so as to keep a balanced utilization of both CPU resources and accelerator resources. With optimizations on both computing and memory access patterns, we manage to achieve around 8 to 20 times speedup when comparing one hybrid GPU or MIC node with one CPU node with 12 cores. Using customized FPGA-based data-flow engines, we see the potential to gain another 5 to 8 times improvement in performance. On heterogeneous supercomputers, such as Tianhe-1A and Tianhe-2, our framework is capable of achieving ideally linear scaling efficiency, and sustained double-precision performance of 581 Tflops on Tianhe-1A (using 3750 nodes) and 3.74 Pflops on Tianhe-2 (using 8644 nodes). Our study also provides an evaluation of the programming paradigms of various accelerator architectures (GPU, MIC, FPGA) for performing global atmospheric simulation, to form a picture of both the potential performance benefits and the programming efforts involved.
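
    The core idea of the generalized partition scheme, giving each resource a share of the domain proportional to its measured throughput so that CPU and accelerator finish a step at roughly the same time, can be sketched in a few lines of C. The throughput numbers below are invented, and the real scheme additionally handles multiple accelerator types, halo exchanges and re-balancing.

        /*
         * Toy CPU/accelerator work-partition calculation: split the columns of a
         * 2-D domain in proportion to measured throughputs so that both resources
         * finish a time step at about the same time.  All rates are placeholders.
         */
        #include <stdio.h>

        int main(void)
        {
            const int total_cols = 4096;

            /* Measured columns-per-second for each resource (made-up values). */
            double rate_cpu = 120.0;
            double rate_acc = 1450.0;

            /* Give each resource a share proportional to its rate. */
            int cols_acc = (int)(total_cols * rate_acc / (rate_cpu + rate_acc) + 0.5);
            int cols_cpu = total_cols - cols_acc;

            /* Predicted per-step times; ideally these are nearly equal. */
            double t_cpu = cols_cpu / rate_cpu;
            double t_acc = cols_acc / rate_acc;

            printf("CPU columns: %d (%.2f s/step)\n", cols_cpu, t_cpu);
            printf("ACC columns: %d (%.2f s/step)\n", cols_acc, t_acc);
            printf("load imbalance: %.1f %%\n",
                   100.0 * (t_cpu > t_acc ? t_cpu - t_acc : t_acc - t_cpu)
                         / (t_cpu > t_acc ? t_cpu : t_acc));
            return 0;
        }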

  2. The Future of the Global Environment: A Model-based Analysis Supporting UNEP's First Global Environment Outlook

    NARCIS (Netherlands)

    Bakkes JA; Woerden JW van; Alcamo J; Berk MM; Bol P; Born GJ van den; Brink BJE ten; Hettelingh JP; Langeweg F; Niessen LW; Swart RJ; United Nations Environment; MNV

    1997-01-01

    This report documents the scenario analysis in UNEP's first Global Environment Outlook, published at the same time as the scenario analysis. This Outlook provides a pilot assessment of developments in the environment, both global and regional, between now and 2015, with a further projection to 2050.

  3. MEDIA ENVIRONMENT AS FACTOR OF REALIZATION OF CREATIVE POTENTIAL OF FUTURE TEACHERS` IN THE MOUNTAIN SCHOOLS OF THE UKRAINIAN CARPATHIANS

    Directory of Open Access Journals (Sweden)

    Alla Lebedieva

    2015-04-01

    Full Text Available The article examines the “media environment” as a factor in realizing the creative potential of future teachers in the mountainous schools of the Ukrainian Carpathians. The central problem of the research is how the media environment can be used as a factor in developing future teachers' creative potential in these schools and how its use can be optimized. The article highlights ways to modernize the social and professional orientation of student training, situated within the informational and educational environment of higher education. We consider the causal link between the use of the media environment as a factor of future teachers' creative potential and the complexity of the teacher's work in the mountainous schools of the Ukrainian Carpathians. The basic functions of the media environment are extensity, instrumentality, communication, interactivity and multimedia. The article also reveals aspects of preparing students for creatively active teaching and describes subjects that offer objective possibilities for forming the professional skills of future teachers and that directly affect the realization of creative potential: “Ukrainian folk art”, “Basics of recitation and rhetoric” and “The basis of pedagogical creativity”. Creating a full-fledged media environment in higher education is an important condition for successful education and a factor that enables future teachers in the mountainous schools of the Ukrainian Carpathians to realize their creative potential.

  4. Plastics, the environment and human health: current consensus and future trends.

    Science.gov (United States)

    Thompson, Richard C; Moore, Charles J; vom Saal, Frederick S; Swan, Shanna H

    2009-07-27

    Plastics have transformed everyday life; usage is increasing and annual production is likely to exceed 300 million tonnes by 2010. In this concluding paper to the Theme Issue on Plastics, the Environment and Human Health, we synthesize current understanding of the benefits and concerns surrounding the use of plastics and look to future priorities, challenges and opportunities. It is evident that plastics bring many societal benefits and offer future technological and medical advances. However, concerns about usage and disposal are diverse and include accumulation of waste in landfills and in natural habitats, physical problems for wildlife resulting from ingestion or entanglement in plastic, the leaching of chemicals from plastic products and the potential for plastics to transfer chemicals to wildlife and humans. However, perhaps the most important overriding concern, which is implicit throughout this volume, is that our current usage is not sustainable. Around 4 per cent of world oil production is used as a feedstock to make plastics and a similar amount is used as energy in the process. Yet over a third of current production is used to make items of packaging, which are then rapidly discarded. Given our declining reserves of fossil fuels, and finite capacity for disposal of waste to landfill, this linear use of hydrocarbons, via packaging and other short-lived applications of plastic, is simply not sustainable. There are solutions, including material reduction, design for end-of-life recyclability, increased recycling capacity, development of bio-based feedstocks, strategies to reduce littering, the application of green chemistry life-cycle analyses and revised risk assessment approaches. Such measures will be most effective through the combined actions of the public, industry, scientists and policymakers. There is some urgency, as the quantity of plastics produced in the first 10 years of the current century is likely to approach the quantity produced in the

  5. Plastics, the environment and human health: current consensus and future trends

    Science.gov (United States)

    Thompson, Richard C.; Moore, Charles J.; vom Saal, Frederick S.; Swan, Shanna H.

    2009-01-01

    Plastics have transformed everyday life; usage is increasing and annual production is likely to exceed 300 million tonnes by 2010. In this concluding paper to the Theme Issue on Plastics, the Environment and Human Health, we synthesize current understanding of the benefits and concerns surrounding the use of plastics and look to future priorities, challenges and opportunities. It is evident that plastics bring many societal benefits and offer future technological and medical advances. However, concerns about usage and disposal are diverse and include accumulation of waste in landfills and in natural habitats, physical problems for wildlife resulting from ingestion or entanglement in plastic, the leaching of chemicals from plastic products and the potential for plastics to transfer chemicals to wildlife and humans. However, perhaps the most important overriding concern, which is implicit throughout this volume, is that our current usage is not sustainable. Around 4 per cent of world oil production is used as a feedstock to make plastics and a similar amount is used as energy in the process. Yet over a third of current production is used to make items of packaging, which are then rapidly discarded. Given our declining reserves of fossil fuels, and finite capacity for disposal of waste to landfill, this linear use of hydrocarbons, via packaging and other short-lived applications of plastic, is simply not sustainable. There are solutions, including material reduction, design for end-of-life recyclability, increased recycling capacity, development of bio-based feedstocks, strategies to reduce littering, the application of green chemistry life-cycle analyses and revised risk assessment approaches. Such measures will be most effective through the combined actions of the public, industry, scientists and policymakers. There is some urgency, as the quantity of plastics produced in the first 10 years of the current century is likely to approach the quantity produced in the

  6. The Future of the Global Environment: A Model-based Analysis Supporting UNEP's First Global Environment Outlook

    OpenAIRE

    Bakkes JA; Woerden JW van; Alcamo J; Berk MM; Bol P; Born GJ van den; Brink BJE ten; Hettelingh JP; Langeweg F; Niessen LW; Swart RJ; United Nations Environment Programme (UNEP), Nairobi, Kenia; MNV

    1997-01-01

    This report documents the scenario analysis in UNEP's first Global Environment Outlook, published at the same time as the scenario analysis. This Outlook provides a pilot assessment of developments in the environment, both global and regional, between now and 2015, with a further projection to 2050. The study was carried out in support of the Agenda 21 interim evaluation, five years after 'Rio' and ten years after 'Brundtland'. The scenario analysis is based on only one scenario, Conventional...

  7. Gene x environment interactions in conduct disorder: Implications for future treatments.

    Science.gov (United States)

    Holz, Nathalie E; Zohsel, Katrin; Laucht, Manfred; Banaschewski, Tobias; Hohmann, Sarah; Brandeis, Daniel

    2016-08-18

    Conduct disorder (CD) causes high financial and social costs, not only in affected families but across society, with only moderately effective treatments so far. There is consensus that CD is likely caused by the convergence of many different factors, including genetic and adverse environmental factors. There is ample evidence of gene-environment interactions (GxE) in the etiology of CD at the behavioral level, from both genetically sensitive designs and candidate gene-driven approaches, most prominently and consistently represented by MAOA. However, conclusive indications of causal GxE patterns are largely lacking. Inconsistent findings, lack of replication and methodological limitations remain a major challenge. Likewise, research addressing the identification of affected brain pathways which reflect plausible biological mechanisms underlying GxE is still very sparse. Future research will have to take multilevel approaches into account, which combine genetic, environmental, epigenetic, personality, neural and hormone perspectives. A better understanding of relevant GxE patterns in the etiology of CD might enable researchers to design customized treatment options (e.g. biofeedback interventions) for specific subgroups of patients. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. TRAINING OF FUTURE TEACHER OF INFORMATICS TO WORK IN MODERN INFORMATION AND EDUCATIONAL ENVIRONMENT OF SCHOOL

    Directory of Open Access Journals (Sweden)

    V. Shovkun

    2015-05-01

    Full Text Available The article analyzes the impact of new information and communication technologies on trends shaping changes in the education system. An important factor in responding to these trends and satisfying the educational needs of students is the creation of an information and communication environment (ICE) in the school. This requires that educational institutions have specialists able to advise management on the choice of hardware and software, and to design, implement and configure programs, maintain teaching aids, and so on. An anonymous survey of Informatics teachers in the Kherson region revealed that in most cases these functions are performed by the Informatics teachers themselves. Only a few schools have dedicated staff or turn to external workers or companies that provide such services. Therefore, special importance attaches to preparing future Informatics teachers to continuously track trends in educational technologies, independently master new services and applications, find ways to implement them in the school's educational process, consult colleagues, and conduct explanatory work with parents. The survey also determined the level of equipment and the working conditions of Informatics teachers at school and at home.

  9. The Future of Nonproliferation in a Changed and Changing Environment: A Workshop Summary

    Energy Technology Data Exchange (ETDEWEB)

    Dreicer, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-08-30

    The Center for Global Security Research and Global Security Principal Directorate at Lawrence Livermore National Laboratory convened a workshop in July 2016 to consider “The Future of Nonproliferation in a Changed and Changing Security Environment.” We took a broad view of nonproliferation, encompassing not just the treaty regime but also arms control, threat reduction, counter-proliferation, and countering nuclear terrorism. We gathered a group of approximately 60 experts from the technical, academic, political, defense and think tank communities and asked them what, and how much, can reasonably be accomplished in each of these areas in the 5 to 10 years ahead. Discussion was on a not-for-attribution basis. This document provides a summary of key insights and lessons learned, and is provided to help stimulate broader public discussion of these issues. It is a collection of ideas as informally discussed and debated among a group of experts. The ideas reported here are the personal views of individual experts and should not be attributed to Lawrence Livermore National Laboratory.

  10. A 2-layer and P2P-based architecture on resource location in future grid environment

    International Nuclear Information System (INIS)

    Pei Erming; Sun Gongxin; Zhang Weiyi; Pang Yangguang; Gu Ming; Ma Nan

    2004-01-01

    Grid and Peer-to-Peer computing are two distributed resource sharing environments that have developed rapidly in recent years. The final objective of Grid, as well as that of P2P technology, is to pool large sets of resources effectively so that they can be used in a more convenient, fast and transparent way. We can speculate that, though many differences exist, Grid and P2P environments will converge into a large-scale resource sharing environment that combines the characteristics of the two environments: large diversity, high heterogeneity (of resources), dynamism, and lack of central control. Resource discovery in this future Grid environment is a basic, yet important, problem. In this article, we propose a two-layer, P2P-based architecture for resource discovery and design a detailed algorithm for resource request propagation in the computing environment discussed above. (authors)
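
    A minimal illustration of request propagation in the P2P layer is a TTL-limited flooding search over a peer graph, sketched below in C; the peer topology, the resource placement and the TTL value are made-up examples, and the article's two-layer architecture adds an aggregation layer on top of this that is not modelled here.

        /*
         * TTL-limited flooding of a resource request over a small peer graph, as a
         * minimal illustration of P2P-style request propagation.  The topology and
         * resource placement are arbitrary.
         */
        #include <stdio.h>
        #include <string.h>

        #define NPEERS 8

        /* Static peer graph: neighbours[i][j] != 0 means i and j are connected. */
        static const int neighbours[NPEERS][NPEERS] = {
            {0,1,1,0,0,0,0,0},
            {1,0,0,1,0,0,0,0},
            {1,0,0,1,1,0,0,0},
            {0,1,1,0,0,1,0,0},
            {0,0,1,0,0,0,1,0},
            {0,0,0,1,0,0,0,1},
            {0,0,0,0,1,0,0,0},
            {0,0,0,0,0,1,0,0},
        };

        /* Which peer holds the requested resource (placeholder). */
        static const int has_resource[NPEERS] = {0,0,0,0,0,0,0,1};

        static int visited[NPEERS];

        /* Recursively forward the request to neighbours until the TTL runs out. */
        static void propagate(int peer, int ttl)
        {
            if (visited[peer]) return;
            visited[peer] = 1;

            if (has_resource[peer]) {
                printf("resource located at peer %d\n", peer);
                return;
            }
            if (ttl == 0) return;

            for (int n = 0; n < NPEERS; n++)
                if (neighbours[peer][n])
                    propagate(n, ttl - 1);
        }

        int main(void)
        {
            memset(visited, 0, sizeof visited);
            propagate(0, 4);     /* peer 0 issues a request with TTL = 4 */
            return 0;
        }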

  11. The Future of the Global Environment: A Model-based Analysis Supporting UNEP's First Global Environment Outlook

    NARCIS (Netherlands)

    Bakkes JA; van Woerden JW; Alcamo J; Berk MM; Bol P; van den Born GJ; ten Brink BJE; Hettelingh JP; Langeweg F; Niessen LW; Swart RJ; MNV

    1997-01-01

    This report contains the details of the scenario analysis in the simultaneously published first Global Environment Outlook, produced under the auspices of UNEP. It is a pilot assessment of global environmental developments up to 2015, with a further projection to 2050. The study was carried out in support of the interim evaluation of

  12. Novel Supercomputing Approaches for High Performance Linear Algebra Using FPGAs Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Supercomputing plays a major role in many areas of science and engineering, and it has had tremendous impact for decades in areas such as aerospace, defense, energy,...

  13. Novel Supercomputing Approaches for High Performance Linear Algebra Using FPGAs, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Supercomputing plays a major role in many areas of science and engineering, and it has had tremendous impact for decades in areas such as aerospace, defense, energy,...

  14. Micro-mechanical Simulations of Soils using Massively Parallel Supercomputers

    Directory of Open Access Journals (Sweden)

    David W. Washington

    2004-06-01

    Full Text Available In this research a computer program, Trubal version 1.51, based on the Discrete Element Method, was converted to run on a Connection Machine (CM-5), a massively parallel supercomputer with 512 nodes, to expedite the computational times of simulating geotechnical boundary value problems. The dynamic memory algorithm in the Trubal program did not perform efficiently on the CM-2 machine with its Single Instruction Multiple Data (SIMD) architecture. This was due to the communication overhead involving global array reductions, global array broadcasts and random data movement. Therefore, the dynamic memory algorithm in the Trubal program was converted to a static memory arrangement, and the program was successfully converted to run on CM-5 machines. The converted program was called "TRUBAL for Parallel Machines (TPM)." Simulating two physical triaxial experiments and comparing the simulation results with Trubal simulations validated the TPM program. With a 512-node CM-5 machine, TPM produced a nine-fold speedup, demonstrating the inherent parallelism within algorithms based on the Discrete Element Method.
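
    The flavor of the converted code, fixed-size arrays and a simple pairwise contact loop, can be conveyed with a short C sketch of a discrete-element normal-contact calculation; the linear spring contact law, particle data and stiffness used below are illustrative stand-ins for Trubal's actual contact model and neighbour lists.

        /*
         * Minimal discrete-element normal-contact loop over spheres, using
         * fixed-size (static) arrays in the spirit of the static memory
         * arrangement described in the abstract.  A simple linear spring stands
         * in for Trubal's actual contact model.  Compile with -lm.
         */
        #include <math.h>
        #include <stdio.h>

        #define NP 4                    /* number of particles (compile-time fixed) */

        static double px[NP] = {0.0, 0.9, 2.5, 3.3};
        static double py[NP] = {0.0, 0.1, 0.0, 0.0};
        static double radius[NP] = {0.5, 0.5, 0.5, 0.5};
        static double fx[NP], fy[NP];

        int main(void)
        {
            const double k_n = 1.0e5;   /* normal spring stiffness */

            /* All-pairs contact detection; production DEM codes use neighbour lists. */
            for (int i = 0; i < NP; i++)
                for (int j = i + 1; j < NP; j++) {
                    double dx = px[j] - px[i], dy = py[j] - py[i];
                    double dist = sqrt(dx * dx + dy * dy);
                    double overlap = radius[i] + radius[j] - dist;
                    if (overlap <= 0.0 || dist == 0.0) continue;

                    /* Linear repulsive force along the line of centres. */
                    double fn = k_n * overlap;
                    double nx = dx / dist, ny = dy / dist;
                    fx[i] -= fn * nx;  fy[i] -= fn * ny;
                    fx[j] += fn * nx;  fy[j] += fn * ny;
                }

            for (int i = 0; i < NP; i++)
                printf("particle %d: fx = %10.2f  fy = %10.2f\n", i, fx[i], fy[i]);
            return 0;
        }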

  15. ALCF Data Science Program: Productive Data-centric Supercomputing

    Science.gov (United States)

    Romero, Nichols; Vishwanath, Venkatram

    The ALCF Data Science Program (ADSP) is targeted at big data science problems that require leadership computing resources. The goal of the program is to explore and improve a variety of computational methods that will enable data-driven discoveries across all scientific disciplines. The projects will focus on data science techniques covering a wide area of discovery including but not limited to uncertainty quantification, statistics, machine learning, deep learning, databases, pattern recognition, image processing, graph analytics, data mining, real-time data analysis, and complex and interactive workflows. Project teams will be among the first to access Theta, ALCF's forthcoming 8.5 petaflops Intel/Cray system. The program will transition to the 200 petaflop/s Aurora supercomputing system when it becomes available. In 2016, four projects were selected to kick off the ADSP. The selected projects span experimental and computational sciences and range from modeling the brain to discovering new materials for solar-powered windows to simulating collision events at the Large Hadron Collider (LHC). The program will have a regular call for proposals, with the next call expected in Spring 2017 (http://www.alcf.anl.gov/alcf-data-science-program). This research used resources of the ALCF, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.

  16. FUTURES

    DEFF Research Database (Denmark)

    Pedersen, Michael Haldrup

    2017-01-01

    Currently both design thinking and critical social science experience an increased interest in speculating in alternative future scenarios. This interest is not least related to the challenges that issues of global sustainability present for politics, ethics and design. This paper explores the potentials of speculative thinking in relation to design and social and cultural studies, arguing that both offer valuable insights for creating a speculative space for new emergent criticalities challenging current assumptions of the relations between power and design. It does so by tracing out discussions of ‘futurity’ and ‘futuring’ in design as well as social and cultural studies: firstly, by discussing futurist and speculative approaches in design thinking; secondly, by engaging with ideas of scenario thinking and utopianism in current social and cultural studies; and thirdly, by showing how the articulation...

  17. SUPERCOMPUTERS FOR AIDING ECONOMIC PROCESSES WITH REFERENCE TO THE FINANCIAL SECTOR

    Directory of Open Access Journals (Sweden)

    Jerzy Balicki

    2014-12-01

    Full Text Available The article discusses the use of supercomputers to support business processes, with particular emphasis on the financial sector. Reference is made to selected projects that support economic development. In particular, we propose the use of supercomputers to perform artificial intelligence methods in banking. The proposed methods, combined with modern technology, enable a significant increase in the competitiveness of enterprises and banks by adding new functionality.

  18. Suitability of Agent Technology for Military Command and Control in the Future Combat System Environment

    Energy Technology Data Exchange (ETDEWEB)

    Potok, TE

    2003-02-13

    The U.S. Army is faced with the challenge of dramatically improving its war fighting capability through advanced technologies. Any new technology must provide significant improvement over existing technologies, yet be reliable enough to provide a fielded system. The focus of this paper is to assess the novelty and maturity of agent technology for use in the Future Combat System (FCS). The FCS concept represents the U.S. Army's "mounted" form of the Objective Force. This concept of vehicles, communications, and weaponry is viewed as a "system of systems" which includes net-centric command and control (C²) capabilities. This networked C² is an important transformation from the historically centralized, or platform-based, C² function since a centralized command architecture may become a decision-making and execution bottleneck, particularly as the pace of war accelerates. A mechanism to ensure an effective network-centric C² capacity (combining intelligence gathering and analysis available at lower levels in the military hierarchy) is needed. Achieving a networked C² capability will require breakthroughs in current software technology. Many have proposed the use of agent technology as a potential solution. Agents are an emerging technology, and it is not yet clear whether it is suitable for addressing the networked C² challenge, particularly in satisfying battlespace scalability, mobility, and security expectations. We have developed a set of software requirements for FCS based on military requirements for this system. We have then evaluated these software requirements against current computer science technology. This analysis provides a set of limitations in the current technology when applied to the FCS challenge. Agent technology is compared against this set of limitations to provide a means of assessing the novelty of agent technology in an FCS environment. From this analysis we

  19. Gene-environment interactions and alcohol use and dependence: current status and future challenges

    NARCIS (Netherlands)

    Zwaluw, C.S. van der; Engels, R.C.M.E.

    2009-01-01

    To discuss the current status of gene-environment interaction research with regard to alcohol use and dependence. Further, we highlight the difficulties concerning gene-environment studies. Overview of the current evidence for gene-environment interactions in alcohol outcomes, and of the associated

  20. Toward a Sustainable Future: The Role of Student Affairs in Creating Healthy Environments, Social Justice, and Strong Economies

    Science.gov (United States)

    ACPA College Student Educators International, 2008

    2008-01-01

    "Toward a Sustainable Future: The Role of Student Affairs in Creating Healthy Environments, Social Justice, and Strong Economies" is a call to action for college student educators, articulating the crucial role they play in the international sustainability movement. It contains valuable information about educating self, educating students, and…

  1. Effects of Self-Efficacy, Emotional Intelligence, and Perceptions of Future Work Environment on Preservice Teacher Commitment

    Science.gov (United States)

    Chesnut, Steven R.; Cullen, Theresa A.

    2014-01-01

    This study was designed to examine the effects of self-efficacy, expectations of future work environment, and emotional intelligence on preservice teacher commitment to the teaching profession on a sample of 209 preservice teachers. The purpose of the study was to add to the existing knowledge surrounding preservice teacher commitment and promote…

  2. TOPICAL REVIEW: Simulating functional magnetic materials on supercomputers

    Science.gov (United States)

    Gruner, Markus Ernst; Entel, Peter

    2009-07-01

    The recent passing of the petaflop per second landmark by the Roadrunner project at the Los Alamos National Laboratory marks a preliminary peak of an impressive world-wide development in the high-performance scientific computing sector. Also, purely academic state-of-the-art supercomputers such as the IBM Blue Gene/P at Forschungszentrum Jülich allow us nowadays to investigate large systems of the order of 10³ spin polarized transition metal atoms by means of density functional theory. Three applications will be presented where large-scale ab initio calculations contribute to the understanding of key properties emerging from a close interrelation between structure and magnetism. The first two examples discuss the size dependent evolution of equilibrium structural motifs in elementary iron and binary Fe-Pt and Co-Pt transition metal nanoparticles, which are currently discussed as promising candidates for ultra-high-density magnetic data storage media. However, the preference for multiply twinned morphologies at smaller cluster sizes counteracts the formation of a single-crystalline L1₀ phase, which alone provides the required hard magnetic properties. The third application is concerned with the magnetic shape memory effect in the Ni-Mn-Ga Heusler alloy, which is a technologically relevant candidate for magnetomechanical actuators and sensors. In this material strains of up to 10% can be induced by external magnetic fields due to the field induced shifting of martensitic twin boundaries, requiring an extremely high mobility of the martensitic twin boundaries, but also the selection of the appropriate martensitic structure from the rich phase diagram.

  3. 75 FR 57546 - The Future of Aviation Advisory Committee (FAAC) Environment Subcommittee; Notice of Meeting

    Science.gov (United States)

    2010-09-21

    ..., and opportunities of the global economy. The Environment Subcommittee is charged with examining steps... consideration of potential approaches to promote effective international actions through the International Civil...

  4. 75 FR 44303 - The Future of Aviation Advisory Committee (FAAC) Environment Subcommittee; Notice of Meeting

    Science.gov (United States)

    2010-07-28

    ... economy. The Environment Subcommittee is charged with examining steps and strategies that can be taken by... to promote effective international actions through the International Civil Aviation Organization...

  5. Aging Well and the Environment: Toward an Integrative Model and Research Agenda for the Future

    Science.gov (United States)

    Wahl, Hans-Werner; Iwarsson, Susanne; Oswald, Frank

    2012-01-01

    Purpose of the Study: The effects of the physical-spatial-technical environment on aging well have been overlooked both conceptually and empirically. In the spirit of M. Powell Lawton's seminal work on aging and environment, this article attempts to rectify this situation by suggesting a new model of how older people interact with their…

  6. Sustainability - What are the Odds? Guessing the Future of our Environment, Economy, and Society

    Science.gov (United States)

    This article examines the concept of sustainability from a global perspective, describing how alternative futures might develop in the environmental, economic, and social dimensions. The alternatives to sustainability appear to be (a) a catastrophic failure of life support, econo...

  7. The Future of Deterrent Capability for Medium-Sized Western Powers in the New Environment

    International Nuclear Information System (INIS)

    Quinlan, Michael

    2001-01-01

    What should be the longer-term future for the nuclear-weapons capabilities of France and the United Kingdom? I plan to tackle the subject in concrete terms. My presentation will be divided into three parts, and, though they are distinct rather than separate, they interact extensively. The first and largest part will relate to strategic context and concept: what aims, justifications and limitations should guide the future, or the absence of a future, for our capabilities? The second part, a good deal briefer, will be the practical content and character of the capabilities: what questions for decision will arise, and in what timescale, about the preservation, improvement or adjustment of the present capabilities? And the third part, still more briefly, will concern the political and institutional framework into which their future should or might be fitted. (author)

  8. Carry-over effects of the social environment on future divorce probability in a wild bird population.

    Science.gov (United States)

    Culina, Antica; Hinde, Camilla A; Sheldon, Ben C

    2015-10-22

    Initial mate choice and re-mating strategies (infidelity and divorce) influence individual fitness. Both of these should be influenced by the social environment, which determines the number and availability of potential partners. While most studies looking at this relationship take a population-level approach, individual-level responses to variation in the social environment remain largely unstudied. Here, we explore carry-over effects on future mating decisions of the social environment in which the initial mating decision occurred. Using detailed data on the winter social networks of great tits, we tested whether the probability of subsequent divorce, a year later, could be predicted by measures of the social environment at the time of pairing. We found that males that had a lower proportion of female associates, and whose partner ranked lower among these, as well as inexperienced breeders, were more likely to divorce after breeding. We found no evidence that a female's social environment influenced the probability of divorce. Our findings highlight the importance of the social environment that individuals experience during initial pair formation on later pairing outcomes, and demonstrate that such effects can be delayed. Exploring these extended effects of the social environment can yield valuable insights into processes and selective pressures acting upon the mating strategies that individuals adopt. © 2015 The Author(s).

  9. Cyberdyn supercomputer - a tool for imaging geodynamic processes

    Science.gov (United States)

    Pomeran, Mihai; Manea, Vlad; Besutiu, Lucian; Zlagnean, Luminita

    2014-05-01

    More and more physical processes that develop within the deep interior of our planet, but have a significant impact on the Earth's shape and structure, are becoming subject to numerical modelling using high performance computing facilities. Nowadays, an increasing number of research centers worldwide decide to make use of such powerful and fast computers for simulating complex phenomena involving fluid dynamics and to get deeper insight into intricate problems of Earth's evolution. With the CYBERDYN cybernetic infrastructure (CCI), the Solid Earth Dynamics Department in the Institute of Geodynamics of the Romanian Academy boldly steps into the 21st century by entering the research area of computational geodynamics. The project that made this advancement possible has been jointly supported by the EU and the Romanian Government through the Structural and Cohesion Funds. It lasted for about three years, ending in October 2013. CCI is basically a modern high performance Beowulf-type supercomputer (HPCC), combined with a high performance visualization cluster (HPVC) and a GeoWall. The infrastructure is mainly structured around 1344 cores and 3 TB of RAM. The high speed interconnect is provided by a Qlogic InfiniBand switch, able to transfer up to 40 Gbps. The CCI storage component is a 40 TB Panasas NAS. The operating system is Linux (CentOS). For control and maintenance, the Bright Cluster Manager package is used. The SGE job scheduler manages the job queues. CCI has been designed for a theoretical peak performance of up to 11.2 TFlops. Speed tests showed that a high resolution numerical model (256 × 256 × 128 FEM elements) could be resolved at a mean computational speed of one time step per 30 seconds, employing only a fraction of the computing power (20%). After passing the mandatory tests, the CCI has been involved in numerical modelling of various scenarios related to the East Carpathians tectonic and geodynamic evolution, including the Neogene magmatic activity, and the intriguing
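
    Taking the quoted figures at face value, the short sketch below (a rough calculation, assuming the theoretical peak is spread evenly across the 1344 cores) relates the per-core peak to the roughly 20% of the machine used in the quoted speed test:

```python
# Rough arithmetic on the CCI figures quoted in the record; assumes the
# theoretical peak is distributed evenly across cores.

total_cores = 1344
peak_flops = 11.2e12          # 11.2 TFlops theoretical peak

per_core_gflops = peak_flops / total_cores / 1e9
print(f"Per-core peak: ~{per_core_gflops:.1f} GFlops")      # ~8.3 GFlops/core

fraction_used = 0.20          # the speed test employed ~20% of the machine
cores_used = round(total_cores * fraction_used)
print(f"Cores used in the speed test: ~{cores_used}")       # ~269 cores

seconds_per_step = 30         # one 256 x 256 x 128 FEM time step per 30 s
steps_per_hour = 3600 / seconds_per_step
print(f"Throughput: {steps_per_hour:.0f} time steps per hour")
```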

  10. What are the factors that could influence the future of work with regard to energy systems and the built environment?

    International Nuclear Information System (INIS)

    Pratt, Andy C.

    2008-01-01

    The aim of this paper is to examine which factors in energy systems and the built environment could influence the future of work. In addition, it looks at trends in relation to corporate demands for space and its specifications, and considers what the scope is for integrating business and industry within the dwelling landscape. It seeks to consider these questions on a 50-year time horizon. The paper begins by discussing the challenge of prediction of future trends, especially in a field apparently so reliant upon technological change and innovation. Because of these problems, the paper concerns itself not with picking technologies but rather with questions about the social adoption of technologies and their applications. It highlights a spectrum of coordinating mechanisms in society that are likely to be critical in shaping the future implications of built environment forms and the consequential use of energy. The scenarios discussed arise from the intersection of two tendencies: concentration versus dispersal, and local versus globally focused growth of city regions. The challenges identified in this report are associated with 'lock-in' to past governance modes of the built environment, exacerbated by rapidly changing demand structures. Demand is not simply changing in volume but also in character. The shifts that will need to be dealt with concern a fundamental issue: how activities are coordinated in society

  11. Sustainability—What Are the Odds? Envisioning the Future of Our Environment, Economy and Society

    Directory of Open Access Journals (Sweden)

    Stephen J. Jordan

    2013-03-01

    Full Text Available This article examines the concept of sustainability from a global perspective, describing how alternative futures might develop in the environmental, economic, and social dimensions. The alternatives to sustainability appear to be (a) a catastrophic failure of life support, economies, and societies, or (b) a radical technological revolution (singularity). The case is made that solutions may be found by developing a global vision of the future, estimating the probabilities of possible outcomes from multiple indicators, and looking holistically for the most likely paths to sustainability. Finally, an intuitive vision of these paths is offered as a starting point for discussion.

  12. Scenario analysis for the San Pedro River, analyzing hydrological consequences of a future environment.

    Science.gov (United States)

    Kepner, William G; Semmens, Darius J; Bassett, Scott D; Mouat, David A; Goodrich, David C

    2004-06-01

    Studies of future management and policy options based on different assumptions provide a mechanism to examine possible outcomes and especially their likely benefits and consequences. The San Pedro River in Arizona and Sonora, Mexico is an area that has undergone rapid changes in land use and cover, and subsequently is facing keen environmental crises related to water resources. It is the location of a number of studies that have dealt with change analysis, watershed condition, and most recently, alternative futures analysis. The previous work has dealt primarily with resources of habitat, visual quality, and groundwater related to urban development patterns and preferences. In the present study, previously defined future scenarios, in the form of land-use/land-cover grids, were examined relative to their impact on surface-water conditions (e.g., surface runoff and sediment yield). These hydrological outputs were estimated for the baseline year of 2000 and predicted twenty years in the future as a demonstration of how new geographic information system-based hydrologic modeling tools can be used to evaluate the spatial impacts of urban growth patterns on surface-water hydrology.

  13. Comparison of a personal computer oil spill model with a supercomputer model

    International Nuclear Information System (INIS)

    Cekirge, H.M.; Convery, K.; Koch, M.; Long, C.; Giammona, C.P.; Jamail, R.

    1994-01-01

    There are a number of oil spill trajectory and fate models operating on personal computers, mainframes and supercomputers. These models are used for rapid response, training and contingency planning. The purpose of this study is to compare the results of oil spill models operating on personal computers to those operating on supercomputers when they are used for emergency response. Specific attention is focused on the three-dimensional hydrodynamic, oil fate and transport model that must be run on a supercomputer. The SUPER-SLIK C model was chosen for the comparative study because it is the only such model. A PC model, FSU-SLIK C, was selected for the study because it contains all the features of PC-based oil spill models and was considered representative of PC-based models. The predictions of these two models were compared to observations with the aim of assessing their relative accuracy. 32 refs., 9 figs., 2 tabs

  14. New Mexico High School Supercomputing Challenge, 1990--1995: Five years of making a difference to students, teachers, schools, and communities. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Foster, M.; Kratzer, D.

    1996-02-01

    The New Mexico High School Supercomputing Challenge is an academic program dedicated to increasing interest in science and math among high school students by introducing them to high performance computing. This report provides a summary and evaluation of the first five years of the program, describes the program and shows the impact that it has had on high school students, their teachers, and their communities. Goals and objectives are reviewed and evaluated, growth and development of the program are analyzed, and future directions are discussed.

  15. Campus Retrofitting (CARE) Methodology: A Way to Co-Create Future Learning Environments

    DEFF Research Database (Denmark)

    Nenonen, Suvi; Eriksson, Robert; Niemi, Olli

    2016-01-01

    they used. Based on the analysis of the methods, the framework for the Campus Retrofitting (CARE) methodology is presented and discussed. The CARE methodology is a tool to capture a new logic of learning environment design. It has three key activities: co-creating, co-financing and co-evaluating. The integrated...

  16. Beyond the Personal Learning Environment: Attachment and Control in the Classroom of the Future

    Science.gov (United States)

    Johnson, Mark William; Sherlock, David

    2014-01-01

    The Personal Learning Environment (PLE) has been presented in a number of guises over a period of 10 years as an intervention which seeks the reorganisation of educational technology through shifting the "locus of control" of technology towards the learner. In the intervening period to the present, a number of initiatives have attempted…

  17. The Security Environment in Central and Eastern Europe: Current Status, Future Prospects

    Directory of Open Access Journals (Sweden)

    Georgeta Chirleşan

    2012-05-01

    Full Text Available The paper aims at presenting the main features of the current security environment within Central and Eastern Europe. It tries to build on previous approaches regarding Euro-Atlantic security, with a focus on the specific security environment in Central and Eastern Europe. It operates with concepts of the European Security Strategy and with the NATO Alliance security principles, which do not entirely overlap. The present research is based on deductive and inductive analysis, comparative analysis and case study. The research findings have revealed that European and Euro-Atlantic security are inter-laced. Collective security arrangements are necessary and able to ensure peace and stability in Europe. Still, security is a controversial concept in terms of perception at the level of political elites and public opinion. This paper is of joint interest to academics and researchers working in this sensitive field of security, providing them the possibility to gain a better knowledge and understanding of the security environment within Central and Eastern Europe. The value of this paper resides in the original approach and in the research methods that have been used in order to deeply analyse the security environment from an inside perspective of an Eastern country.

  18. Integration Of PanDA Workload Management System With Supercomputers for ATLAS and Data Intensive Science

    Energy Technology Data Exchange (ETDEWEB)

    De, K [University of Texas at Arlington; Jha, S [Rutgers University; Klimentov, A [Brookhaven National Laboratory (BNL); Maeno, T [Brookhaven National Laboratory (BNL); Nilsson, P [Brookhaven National Laboratory (BNL); Oleynik, D [University of Texas at Arlington; Panitkin, S [Brookhaven National Laboratory (BNL); Wells, Jack C [ORNL; Wenaus, T [Brookhaven National Laboratory (BNL)

    2016-01-01

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of the PanDA WMS with supercomputers in the United States, Europe and Russia (in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), the Mira supercomputer at the Argonne Leadership Computing Facility (ALCF), the supercomputer at the National Research Center Kurchatov Institute, IT4 in Ostrava, and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes. This implementation

  19. Optimization of large matrix calculations for execution on the Cray X-MP vector supercomputer

    Science.gov (United States)

    Hornfeck, William A.

    1988-01-01

    A considerable volume of large computational computer codes was developed for NASA over the past twenty-five years. This code represents algorithms developed for machines of an earlier generation. With the emergence of the vector supercomputer as a viable, commercially available machine, an opportunity exists to evaluate optimization strategies to improve the efficiency of existing software, primarily because of architectural differences between the latest generation of large-scale machines and the earlier, mostly uniprocessor, machines. A software package being used by NASA to perform computations on large matrices is described, and a strategy for conversion to the Cray X-MP vector supercomputer is also described.
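
    The record does not reproduce the NASA code itself; purely as an illustration of the kind of restructuring such a conversion involves, the hedged Python/NumPy sketch below contrasts an explicit element-by-element loop with a whole-array (vectorized) matrix-vector product, the same trade-off, in spirit, that vector hardware such as the X-MP rewards. The original codes were of course Fortran, so this is an analogy rather than a reconstruction.

```python
import time
import numpy as np

# Illustration only: contrasts scalar loops with a bulk (vectorized) operation.

n = 1000
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
x = rng.standard_normal(n)

def matvec_loop(A, x):
    """Scalar, element-by-element inner loops (a poor fit for vector units)."""
    y = np.zeros(A.shape[0])
    for i in range(A.shape[0]):
        s = 0.0
        for j in range(A.shape[1]):
            s += A[i, j] * x[j]
        y[i] = s
    return y

t0 = time.perf_counter()
y_loop = matvec_loop(A, x)
t1 = time.perf_counter()
y_vec = A @ x                     # whole-array operation handled by BLAS
t2 = time.perf_counter()

assert np.allclose(y_loop, y_vec)
print(f"loop: {t1 - t0:.3f} s, vectorized: {t2 - t1:.5f} s")
```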

  20. Guide to dataflow supercomputing basic concepts, case studies, and a detailed example

    CERN Document Server

    Milutinovic, Veljko; Trifunovic, Nemanja; Giorgi, Roberto

    2015-01-01

    This unique text/reference describes an exciting and novel approach to supercomputing in the DataFlow paradigm. The major advantages and applications of this approach are clearly described, and a detailed explanation of the programming model is provided using simple yet effective examples. The work is developed from a series of lecture courses taught by the authors in more than 40 universities across more than 20 countries, and from research carried out by Maxeler Technologies, Inc. Topics and features: presents a thorough introduction to DataFlow supercomputing for big data problems; revie

  1. Environment

    DEFF Research Database (Denmark)

    Valentini, Chiara

    2017-01-01

    The term environment refers to the internal and external context in which organizations operate. For some scholars, environment is defined as an arrangement of political, economic, social and cultural factors existing in a given context that have an impact on organizational processes and structures. For others, environment is a generic term describing a large variety of stakeholders and how these interact and act upon organizations. Organizations and their environment are mutually interdependent and organizational communications are highly affected by the environment. This entry examines the origin and development of organization-environment interdependence, the nature of the concept of environment and its relevance for communication scholarships and activities.

  2. Human-Automation Cooperation for Separation Assurance in Future NextGen Environments

    Science.gov (United States)

    Mercer, Joey; Homola, Jeffrey; Cabrall, Christopher; Martin, Lynne; Morey, Susan; Gomez, Ashley; Prevot, Thomas

    2014-01-01

    A 2012 Human-In-The-Loop air traffic control simulation investigated a gradual paradigm-shift in the allocation of functions between operators and automation. Air traffic controllers staffed five adjacent high-altitude en route sectors, and during the course of a two-week experiment, worked traffic under different function-allocation approaches aligned with four increasingly mature NextGen operational environments. These NextGen time-frames ranged from near current-day operations to nearly fully-automated control, in which the ground system's automation was responsible for detecting conflicts, issuing strategic and tactical resolutions, and alerting the controller to exceptional circumstances. Results indicate that overall performance was best in the most automated NextGen environment. Safe operations were achieved in this environment for twice today's peak airspace capacity, while being rated by the controllers as highly acceptable. However, results show that sector operations were not always safe; separation violations did in fact occur. This paper will describe in detail the simulation conducted, as well as discuss important results and their implications.

  3. CosmoBon for studying wood formation under exotic gravitational environment for future space agriculture

    Science.gov (United States)

    Tomita-Yokotani, Kaori; Baba, Keiichi; Suzuki, Toshisada; Funada, Ryo; Nakamura, Teruko; Hashimoto, Hirofumi; Yamashita, Masamichi; Cosmobon, Jstwg

    We are proposing to raise woody plants in space for several applications and for plant science. The Japanese flowering cherry tree is one candidate for these studies. The mechanisms behind sensing gravity and controlling tree shape have been studied quite extensively. Although the molecular mechanisms of plant responses to gravity have been investigated intensively for various species, woody plants have been left behind. The morphology of woody branch growth is different from that of stem growth in herbs: the morphology of a tree is strongly dominated by secondary xylem formation. Nobody knows what shape a tree would take when grown in the space environment. If a whole tree could be brought up to space as research material, it might provide important scientific knowledge. Furthermore, trees produce excess oxygen and wooden materials for living cabins, and provide biomass for cultivating mushrooms and insects for space agriculture. Excellent tree shapes, which would be deeply related to wood formation, improve the quality of life in the stressful environment of outer space. The serious problem would be their size. Bonsai is one of the Japanese traditional arts. We can study secondary xylem formation, i.e. wood formation, under exotic gravitational environments using Bonsai. "CosmoBon" is the small tree Bonsai for our space experiment. It has been recognized that reaction wood in CosmoBon forms similarly to that in natural trees. Our goal is to examine the feasibility of growing various species of trees in space as a bioresource for space agriculture.

  4. Parallel workflow manager for non-parallel bioinformatic applications to solve large-scale biological problems on a supercomputer.

    Science.gov (United States)

    Suplatov, Dmitry; Popova, Nina; Zhumatiy, Sergey; Voevodin, Vladimir; Švedas, Vytas

    2016-04-01

    Rapid expansion of online resources providing access to genomic, structural, and functional information associated with biological macromolecules opens an opportunity to gain a deeper understanding of the mechanisms of biological processes through systematic analysis of large datasets. This, however, requires novel strategies to optimally utilize computer processing power. Some methods in bioinformatics and molecular modeling require extensive computational resources. Other algorithms have fast implementations which take at most several hours to analyze a common input on a modern desktop station; however, due to multiple invocations for a large number of subtasks, the full task requires significant computing power. Therefore, an efficient computational solution to large-scale biological problems requires both a wise parallel implementation of resource-hungry methods and a smart workflow to manage multiple invocations of relatively fast algorithms. In this work, a new software package, mpiWrapper, has been developed to accommodate non-parallel implementations of scientific algorithms within the parallel supercomputing environment. The Message Passing Interface has been implemented to exchange information between nodes. Two specialized threads - one for task management and communication, and another for subtask execution - are invoked on each processing unit to avoid deadlock while using blocking calls to MPI. The mpiWrapper can be used to launch all conventional Linux applications without the need to modify their original source codes and supports resubmission of subtasks on node failure. We show that this approach can be used to process huge amounts of biological data efficiently by running non-parallel programs in parallel mode on a supercomputer. The C++ source code and documentation are available from http://biokinet.belozersky.msu.ru/mpiWrapper.
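
    The mpiWrapper package itself is distributed from the URL above; the sketch below is not that code, but a minimal mpi4py analogue of the master-worker pattern the abstract describes, in which one rank hands out command lines for unmodified serial programs and the remaining ranks execute them. The command list and `my_serial_tool` are hypothetical, and the two-thread design and failure resubmission of the real tool are omitted. Run with, for example, `mpiexec -n 8 python wrapper_sketch.py`.

```python
"""Minimal master-worker sketch (not the authors' mpiWrapper) for running
unmodified serial programs in parallel under MPI. Requires mpi4py."""
import subprocess
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()
TASK, STOP = 1, 2  # message tags

if rank == 0:
    # Master: hand one subtask (a shell command) to each idle worker.
    tasks = [f"my_serial_tool input_{i}.dat" for i in range(100)]  # hypothetical
    status = MPI.Status()
    active = size - 1
    while active > 0:
        comm.recv(source=MPI.ANY_SOURCE, tag=MPI.ANY_TAG, status=status)
        worker = status.Get_source()
        if tasks:
            comm.send(tasks.pop(), dest=worker, tag=TASK)
        else:
            comm.send(None, dest=worker, tag=STOP)
            active -= 1
else:
    # Worker: announce idleness, run the serial program it receives, repeat.
    status = MPI.Status()
    while True:
        comm.send(None, dest=0)
        cmd = comm.recv(source=0, tag=MPI.ANY_TAG, status=status)
        if status.Get_tag() == STOP:
            break
        subprocess.run(cmd, shell=True, check=False)
```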

  5. Securing a better future for all: Nuclear techniques for global development and environmental protection. NA factsheet on environment laboratories: Protecting the environment

    International Nuclear Information System (INIS)

    2012-01-01

    According to the Millennium Development Goals, managing the environment is considered an integral part of the global development process. The main purpose of the IAEA's environment laboratories is to provide Member States with reliable information on environmental issues and facilitate decision making on protection of the environment. An increasingly important feature of this work is to assess the impact of climate change on environmental sustainability and natural resources. The IAEA's environment laboratories use nuclear techniques, radionuclides, isotopic tracers and stable isotopes to gain a better understanding of the various marine processes, including locating the sources of pollutants and their fate, their transport pathways and their ultimate accumulation in sediments. Radioisotopes are also used to study bioaccumulation in organisms and the food chain, as well as to track signals of climate change throughout history. Natural and artificial radionuclides are used to track ocean currents in key regions. They are also used to validate models designed to predict the future impact of climate change and ocean acidification. The laboratories study the fate and impact of contamination on a variety of ecosystems in order to provide effective preventative diagnostic and remediation strategies. They enhance the capability of Member States to use nuclear techniques to understand and assess changes in their own terrestrial and atmospheric environments, and adopt suitable and sustainable remediation measures when needed. Since 1995, the IAEA environment laboratories have coordinated the international network of Analytical Laboratories for the Measurement of Environmental Radioactivity, providing accurate analysis in the event of an accident or an intentional release of radioactivity. In addition, the laboratories work alongside other organizations, such as UNESCO, the IOC, UNEP and the EC. The laboratories collaborate with Member States through direct involvement with

  6. Svalbard as a study model of future High Arctic coastal environments in a warming world

    Directory of Open Access Journals (Sweden)

    Jacek Piskozub

    2017-10-01

    Full Text Available The Svalbard archipelago, a high latitude area in a region undergoing rapid climate change, is relatively easily accessible for field research. This makes the fjords of Spitsbergen, its largest island, some of the best studied Arctic coastal areas. This paper aims at answering the question of how climatically diverse the fjords are, and how representative they are of the diminishing range of seasonal sea ice expected in the future Arctic. This study uses a meteorological reanalysis, sea surface temperature climatology, and the results of a recent one-year meteorological campaign in Spitsbergen to determine the seasonal differences between different Spitsbergen fjords, as well as the sea water temperatures and ice ranges around Svalbard in recent years. The results show that Spitsbergen fjords have diverse seasonal patterns of air temperature due to differences in the SST of the adjacent ocean and different cloudiness. The sea water temperatures and ice concentrations around Svalbard in recent years are similar to what is expected for most of the Arctic coastal areas in the second half of this century. This makes Spitsbergen a unique field study model of the conditions expected in the future warmer High Arctic.

  7. Learning in the e-environment: new media and learning for the future

    Directory of Open Access Journals (Sweden)

    Milan Matijević

    2015-03-01

    Full Text Available We live in times of rapid change in all areas of science, technology, communication and social life. Every day we are asked to what extent school prepares us for these changes and for life in a new, multimedia environment. Children and adolescents spend less time at school or in other settings of learning than they do outdoors or within other social communities (family, clubs, societies, religious institutions and the like). Experts must constantly inquire about what exactly influences learning and development in our rich media environment. The list of the most important life competences has significantly changed and expanded since the last century. Educational experts are attempting to predict changes in the content and methodology of learning at the beginning of the 21st century. Answers are sought to key questions such as: what should one learn; how should one learn; where should one learn; why should one learn; and how do these answers relate to the new learning environment? In his examination of the way children and young people learn and grow up, the author places special attention on the relationship between personal and non-personal communication (e.g. the internet, mobile phones and different types of e-learning). He deals with today's questions by looking back to some of the more prominent authors and studies of the past fifty years that tackled identical or similar questions (Alvin Toffler, Ivan Illich, George Orwell, and the members of the Club of Rome). The conclusion reached is that in today's world of rapid and continuous change, it is much more crucial than in the last century both to be able to learn and to adapt to learning with the help of new media.

  8. Mineral formation on metallic copper in a 'future repository site environment'

    International Nuclear Information System (INIS)

    Amcoff, Oe.; Holenyi, K.

    1996-04-01

    Since reducing conditions are expected, much effort has been concentrated on Cu-sulfides and CuFe-sulfides. However, oxidizing conditions are also discussed. A list of copper minerals is included. It is concluded that mineral formation and mineral transitions on the copper canister surface will be governed by kinetics and metastabilities rather than by stability relations. The sulfides formed are less likely to form a passivating layer, and the rate of sulfide growth will probably be governed by the rate of transport of reacting species to the canister surface. A series of tests is recommended, in an environment resembling the initial repository site conditions. 82 refs, 8 figs

  9. Impact of family environment on future mental health professionals' attitudes toward lesbians and gay men.

    Science.gov (United States)

    Kissinger, Daniel B; Lee, Sang Min; Twitty, Lisa; Kisner, Harrison

    2009-01-01

    This study explored the relationship between dimensions of functioning in the family of origin of graduate students in helping profession programs and their attitudes toward lesbians and gay men. One hundred forty-three participants completed the Family Environment Scale (FES-R: Moos & Moos, 1986), the Attitudes Toward Lesbians and Gay Men scale (ATLG: Herek, 1994), and demographic questions. Results suggest that three family dimensions (conflict, intellectual-cultural orientation, and moral-religious emphasis) significantly predicted attitudes toward lesbians and gay men. The results also revealed that younger students held more negative attitudes toward lesbians and gay men than their older peers. Implications for educators, researchers, and practitioners are discussed.

  10. The challenge of monitoring the cryosphere in alpine environments: Prepare the present for the future

    Science.gov (United States)

    Fischer, Andrea; Helfricht, Kay; Seiser, Bernd; Stocker-Waldhuber, Martin; Hartl, Lea; Wiesenegger, Hans

    2017-04-01

    Understanding the interaction of mountain glaciers and permafrost with weather and climate is essential for the interpretation of past states of the cryosphere in terms of climate change. Most of the glaciers and rock glaciers in Eastern Alpine terrain are subject to strong gradients in climatic forcing, and the persistence of these gradients under past climatic conditions is, more or less, unknown. Thus a key challenge of monitoring the cryosphere is to define the demands on a monitoring strategy for capturing essential processes and their potential changes. For example, the effects of orographic precipitation and local shading vary with general circulation patterns and the amount of solar radiation during the melt(ing) season. Recent investigations based on the Austrian glacier inventories have shown that glacier distribution is closely linked to topography and climatic situation, and that these two parameters also imply different sensitivities of the specific glaciers to progressing climate change. This leads to the need to develop a monitoring system that captures not only past but also the fairly unknown future ensembles of climatic states and sensitivities. As a first step, the Austrian glacier monitoring network has been analyzed from the beginning of the records onwards. Today's monitoring network bears the imprints of past research interests, but also past funding policies and personal/institutional engagements. As a limitation for long term monitoring in general, today's monitoring strategies have to cope with being restricted to these historical commitments to preserve the length of the time series, while at the same time expanding the measurements to fulfil present and future scientific and societal demands. The decision on cryospheric benchmark sites has an additional uncertainty: the ongoing disintegration of glaciers, their increasing debris cover as well as the potential low ice content and relatively unknown reaction of rock glaciers in the course of climate change

  11. Early social environment affects the endogenous oxytocin system: a review and future directions

    Directory of Open Access Journals (Sweden)

    Emily eAlves

    2015-03-01

    Full Text Available Endogenous oxytocin plays an important role in a wide range of human functions including birth, milk ejection during lactation and facilitation of social interaction. There is increasing evidence that both variations in the oxytocin receptor (OXTR) and concentrations of oxytocin are associated with differences in these functions. The causes for the differences that have been observed in tonic and stimulated oxytocin release remain unclear. Previous reviews have suggested that across the life course, these differences may be due to individual factors, e.g. genetic variation (of the OXTR), age or sex, or be the result of early environmental influences such as social experiences, stress or trauma, partly by inducing epigenetic changes. This review has three aims. First, we briefly discuss the endogenous oxytocin system, including physiology, development, individual differences and function. Secondly, current models describing the relationship between the early life environment and the development of the oxytocin system in humans and animals are discussed. Finally, we describe research designs that can be used to investigate the effects of the early environment on the oxytocin system, identifying specific areas of research that need further attention.

  12. The Value of Biomedical Simulation Environments to Future Human Space Flight Missions

    Science.gov (United States)

    Mulugeta, Lealem; Myers, Jerry G.; Skytland, Nicholas G.; Platts, Steven H.

    2010-01-01

    With the ambitious goals of sending manned missions to asteroids and on to Mars, substantial work will be required to ensure the well-being of the men and women who will undertake these difficult missions. Unlike current International Space Station or Shuttle missions, astronauts will be required to endure long-term exposure to higher levels of radiation, isolation and reduced gravity. These new operation conditions will pose health risks that are currently not well understood and perhaps unanticipated. Therefore, it is essential to develop and apply advanced tools to predict, assess and mitigate potential hazards to astronaut health. NASA's Digital Astronaut Project (DAP) is working to develop and apply computational models of physiologic response to space flight operation conditions over various time periods and environmental circumstances. The collective application and integration of well-vetted models assessing the physiology, biomechanics and anatomy is referred to as the Digital Astronaut. The Digital Astronaut simulation environment will serve as a practical working tool for use by NASA in operational activities such as the prediction of biomedical risks and functional capabilities of astronauts. In addition to space flight operation conditions, DAP's work has direct applicability to terrestrial biomedical research by providing virtual environments for hypothesis testing and experiment design, and by reducing animal/human testing. A practical application of the DA to assess pre- and post-flight responses to exercise is illustrated, and the difficulty in matching true physiological responses is discussed.

  13. [Environment and health in Gela (Sicily): present knowledge and prospects for future studies].

    Science.gov (United States)

    Musmeci, Loredana; Bianchi, Fabrizio; Carere, Mario; Cori, Liliana

    2009-01-01

    The study area includes the Municipalities of Gela, Niscemi and Butera, located in the South of Sicily, Italy. In 1990 it was declared an Area at High Risk of Environmental Crisis. In 2000 part of it was designated as the Gela Reclamation Site of National Interest, RSNI. The site includes a private industrial area, public and marine areas, for a total of 51 km². Gela population in 2008 was 77,145 (54,774 in 1961). Sea level: 46 m. Total area: 276 km². Grid reference: 37° 4' 0" N, 14° 15' 0" E. Niscemi and Butera border on Gela. Populations are respectively 26,541 and 5,063. Sea level respectively: 332 m and 402 m. Close to the city of Gela, the industrial area, operating since 1962, includes chemical production plants, a power station and an oil refinery plant, one of the largest in Europe, refining 5 million tons of crude per year. Since the beginning, the workforce has decreased from 7,000 to the current 3,000 units. Over the years, these industrial activities have been a major source of environmental pollution. Extremely high levels of toxic, persistent and bio-accumulating chemical pollutants have been documented. Many relevant environmental and health data are available. Prior to the studies described in the present publication, their use in order to identify environmental pressures on health has been limited. Nevertheless, over several years different epidemiological studies have provided evidence of the occurrence of health outcomes significantly higher than in neighbouring areas and compared to regional data. In 2007 a Multidisciplinary Working Group was established, to analyze the existing data on pollution-exposure-effect and to complete current knowledge on the cycle of pollutants, from migration in the environment to health impact. The present publication is a collection of contributions from this group of experts, supported by the following projects: Evaluation of environmental health impact and estimation of economic costs of

  14. Comparative assessment for future prediction of urban water environment using WEAP model: A case study of Kathmandu, Manila and Jakarta

    Science.gov (United States)

    Kumar, Pankaj; Yoshifumi, Masago; Ammar, Rafieiemam; Mishra, Binaya; Fukushi, Ken

    2017-04-01

    Uncontrolled release of pollutants, increasingly extreme weather, rapid urbanization and poor governance pose a serious threat to sustainable water resource management in developing urban spaces. Considering that half of the world's mega-cities are in Asia and the Pacific, and that 1.7 billion people do not have access to improved water and sanitation, water security through proper management is both an increasing concern and a critical need. This research work gives a brief glimpse of the predicted future water environment of the Bagmati, Pasig and Ciliwung rivers in three different cities, viz. Kathmandu, Manila and Jakarta respectively. A hydrological model is used here to foresee the collective impacts of rapid population growth driven by urbanization, as well as climate change, on unmet demand and water quality in the near future, by 2030. All three rivers are major sources of water for different uses, viz. domestic, industrial, agricultural and recreational, but uncontrolled withdrawal and sewage disposal have caused deterioration of the water environment in the recent past. The Water Evaluation and Planning (WEAP) model was used to simulate future river water quality scenarios using four indicators, i.e. dissolved oxygen (DO), biochemical oxygen demand (BOD), chemical oxygen demand (COD) and nitrate (NO3). Results for simulated water quality as well as unmet demand in 2030, when compared with the reference year, clearly indicate that not only does water quality deteriorate but unmet demand also increases over time. This also suggests that current initiatives and policies for water resource management are not sufficient, and hence immediate and inclusive action through transdisciplinary research is needed.
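
    WEAP is a licensed scenario-planning package, so no attempt is made here to reproduce its engine; as a generic illustration of how two of the indicators named above (BOD and DO) behave downstream of an effluent discharge, the sketch below implements the classical Streeter-Phelps oxygen-sag equations with purely hypothetical coefficients.

```python
import numpy as np

# Classical Streeter-Phelps oxygen-sag model: a generic illustration of
# BOD/DO indicator dynamics, not the WEAP engine used in the study.
# All parameter values below are hypothetical.

k_d = 0.35      # BOD decay rate (1/day)
k_a = 0.70      # re-aeration rate (1/day)
L0 = 18.0       # BOD just below the outfall (mg/L)
D0 = 1.5        # initial DO deficit (mg/L)
DO_sat = 8.0    # saturation DO at the river temperature (mg/L)

t = np.linspace(0.0, 10.0, 101)          # travel time downstream (days)

# BOD decays first-order; the DO deficit follows the Streeter-Phelps solution.
L = L0 * np.exp(-k_d * t)
D = (k_d * L0 / (k_a - k_d)) * (np.exp(-k_d * t) - np.exp(-k_a * t)) + D0 * np.exp(-k_a * t)
DO = DO_sat - D

# Critical (minimum-DO) travel time from the standard closed-form expression.
t_crit = np.log((k_a / k_d) * (1 - D0 * (k_a - k_d) / (k_d * L0))) / (k_a - k_d)
print(f"Critical travel time: {t_crit:.2f} days")
print(f"Minimum DO along the reach: {DO.min():.2f} mg/L")
```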

  15. The future context of work in the business environment in South Africa: Some empirical evidence

    Directory of Open Access Journals (Sweden)

    PS Nel

    2014-10-01

    Full Text Available The future is uncertain, but management needs to determine, and also be informed about, possible change trends. This research, however, reports on empirical results of the views of South African HRM practitioners in order to identify and prioritise business change trends for 2002 and 2010 in terms of the "hard" versus "soft" HRM debate in the literature. All organisations employing HRM practitioners were included, and a total of 1640 questionnaires were distributed, resulting in 207 useable responses. The results highlight trends such as increased international competition, globalisation and inadequate skills, with different rankings for 2002 and 2010. It is concluded that HRM practitioners are influenced by the "hard" or "soft" approach when they participate in a strategic management context in organisations.

  16. Microbial fuel cells in saline and hypersaline environments: Advancements, challenges and future perspectives.

    Science.gov (United States)

    Grattieri, Matteo; Minteer, Shelley D

    2018-04-01

    This review is aimed at reporting on the possibility of utilizing microbial fuel cells for the treatment of saline and hypersaline solutions. An introduction to the issues related to the biological treatment of saline and hypersaline wastewater is given, discussing the limitations that characterize classical aerobic and anaerobic digestion. The microbial fuel cell (MFC) technology, and the possibility of applying it in the presence of high salinity, are discussed before reviewing the most recent advancements in the development of MFCs operating in saline and hypersaline conditions, with their different and interesting applications. Specifically, the research performed in the last 5 years will be the main focus of this review. Finally, the future perspectives for this technology, together with the most urgent research needs, are presented. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Gene–Environment Interactions in Preventive Medicine: Current Status and Expectations for the Future

    Directory of Open Access Journals (Sweden)

    Hiroto Narimatsu

    2017-01-01

    Full Text Available The progression of many common disorders involves a complex interplay of multiple factors, including numerous different genes and environmental factors. Gene–environmental cohort studies focus on the identification of risk factors that cannot be discovered by conventional epidemiological methodologies. Such epidemiological methodologies preclude precise predictions, because the exact risk factors can be revealed only after detailed analyses of the interactions among multiple factors, that is, between genes and environmental factors. To date, these cohort studies have reported some promising results. However, the findings do not yet have sufficient clinical significance for the development of precise, personalized preventive medicine. In particular, some promising preliminary studies have been conducted on the prevention of obesity. Large-scale validation of those preliminary studies, using a prospective cohort design and long follow-ups, will produce useful and practical evidence for the development of preventive medicine in the future.

  18. Recent research activities and future subjects on stable- and radio-isotopes of chlorine in environment

    International Nuclear Information System (INIS)

    Kushita, Kouhei

    2001-12-01

    This report reviews the recent studies on the stable- and radio-isotopes of chlorine from a viewpoint of environmental science, partly including historic references on this element. First, general properties, occurrence, and utilization of chlorine are described. Secondly, current status and research works on chlorine-compounds, which attract special attention in recent years as environmentally hazardous materials, are reported. Thirdly, research works on stable chlorine isotopes, ³⁵Cl and ³⁷Cl, are described with a focus laid on the newly-developed techniques; isotopic ratio mass spectrometry (IRMS) and thermal ionization mass spectrometry (TIMS). Fourthly, recent research works on chlorine radioisotopes, ³⁶Cl etc., are described, focusing on the development of accelerator mass spectrometry (AMS) and its application to geochemistry and others. Finally, taking account of the above-mentioned recent works on Cl isotopes, possible future research subjects are discussed. (author)

  19. Recent research activities and future subjects on stable- and radio-isotopes of chlorine in environment

    Energy Technology Data Exchange (ETDEWEB)

    Kushita, Kouhei [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-12-01

    This report reviews the recent studies on the stable- and radio-isotopes of chlorine from a viewpoint of environmental science, partly including historic references on this element. First, general properties, occurrence, and utilization of chlorine are described. Secondly, current status and research works on chlorine-compounds, which attract special attention in recent years as environmentally hazardous materials, are reported. Thirdly, research works on stable chlorine isotopes, ³⁵Cl and ³⁷Cl, are described with a focus laid on the newly-developed techniques; isotopic ratio mass spectrometry (IRMS) and thermal ionization mass spectrometry (TIMS). Fourthly, recent research works on chlorine radioisotopes, ³⁶Cl etc., are described, focusing on the development of accelerator mass spectrometry (AMS) and its application to geochemistry and others. Finally, taking account of the above-mentioned recent works on Cl isotopes, possible future research subjects are discussed. (author)

  20. Future Connections: The potential of Web service and Portal technologies for the historic environment

    Directory of Open Access Journals (Sweden)

    Stewart Waller

    2005-09-01

    Full Text Available Where the other papers in this special edition of Internet Archaeology look at the experiences and achievements of the ARENA project, this paper looks to the future. Portal technologies like those used by the ARENA portal are already moving into new and exciting areas, such as web services and portlet technologies in particular. This paper considers the work of the ADS in projects such as ARENA, but also HEIRPORT and CREE, work that has revealed new and exciting paths towards data sharing on a European scale. Are these the paths that any new ARENA project will have to follow to sustain the dream of interoperable data sharing for European archaeology?

  1. The future is in the numbers: the power of predictive analysis in the biomedical educational environment

    Science.gov (United States)

    Gullo, Charles A.

    2016-01-01

    Biomedical programs have a potential treasure trove of data they can mine to assist admissions committees in the identification of students who are likely to do well, and to help educational committees in the identification of students who are likely to do poorly on standardized national exams and who may need remediation. In this article, we provide a step-by-step approach that schools can utilize to generate data that are useful when predicting the future performance of current students in any given program. We discuss the use of linear regression analysis as the means of generating those data and highlight some of the limitations. Finally, we lament that these institution-specific data sets are not being fully utilized at the national level, where they could greatly assist programs at large. PMID:27374246

  2. The future is in the numbers: the power of predictive analysis in the biomedical educational environment

    Directory of Open Access Journals (Sweden)

    Charles A. Gullo

    2016-07-01

    Full Text Available Biomedical programs have a potential treasure trove of data they can mine to assist admissions committees in the identification of students who are likely to do well, and to help educational committees in the identification of students who are likely to do poorly on standardized national exams and who may need remediation. In this article, we provide a step-by-step approach that schools can utilize to generate data that are useful when predicting the future performance of current students in any given program. We discuss the use of linear regression analysis as the means of generating those data and highlight some of the limitations. Finally, we lament that these institution-specific data sets are not being fully utilized at the national level, where they could greatly assist programs at large.
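
    As a hedged illustration of the kind of analysis described above (not the authors' code), the sketch below fits an ordinary least-squares regression to hypothetical historical admissions metrics and past exam scores, then predicts a current student's score. The column meanings (GPA, admissions test score) and all numbers are assumptions for demonstration only.

      import numpy as np

      # Hypothetical historical records: columns are [GPA, admissions test score]
      X = np.array([[3.2, 505.], [3.6, 512.], [3.9, 518.], [3.4, 508.], [3.8, 515.]])
      y = np.array([212., 228., 241., 219., 236.])   # past standardized exam scores

      # Fit ordinary least squares with an intercept column
      A = np.hstack([np.ones((X.shape[0], 1)), X])
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)

      # Predict the exam score of a current (hypothetical) student
      new_student = np.array([1.0, 3.5, 510.])
      print(f"predicted exam score: {new_student @ coef:.1f}")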

  3. Future improvements and implementation of animal care practices within the animal testing regulatory environment.

    Science.gov (United States)

    Guittin, Pierre; Decelle, Thierry

    2002-01-01

    Animal welfare is an increasingly important concern when considering biomedical experimentation. Many of the emerging regulations and guidelines specifically address animal welfare in laboratory animal care and use. The current revision of the appendix of the European Convention, ETS123 (Council of Europe), updates and improves on the current animal care standardization in Europe. New guidelines from the Organisation for Economic Co-operation and Development and the European Federation of Pharmaceutical Industries Association focus specifically on safety testing. These guidelines will affect the way toxicity studies are conducted and therefore the global drug development process. With the 3Rs principles taken into account, consideration regarding animal welfare will demand changes in animal care practices in regulatory safety testing. The most significant future improvements in animal care and use practices are likely to be environmental enrichment, management of animal pain and distress, and improved application of humane endpoints. Our challenge is to implement the respective guidelines based on scientific data and animal welfare, through a complex interplay of regulatory objectives and public opinion. The current goal is to work toward solutions that continue to provide relevant animal models for risk assessment in drug development and that are science based. In this way, future improvements in animal care and use practices can be founded on facts, scientific results, and analysis. Some of these improvements have already become common practice in some countries. International harmonization can facilitate the development and practical application of "best scientific practices" through the consensus development process that harmonization requires. Since the implementation of good laboratory practice (GLP) standards in safety testing, these new regulations and recommendations represent a new way forward for animal safety studies.

  4. Performance modeling of hybrid MPI/OpenMP scientific applications on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu

    2013-12-01

    In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore supercomputers: IBM POWER4, POWER5+ and BlueGene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks and Intel's MPI benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore supercomputers because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application: Gyrokinetic Toroidal Code (GTC) in magnetic fusion to validate our performance model of the hybrid application on these multicore supercomputers. The validation results for our performance modeling method show less than 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore supercomputers. © 2013 Elsevier Inc.
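
    The following toy sketch illustrates the general shape of such a model (it is not the paper's framework): the predicted time is split into a compute term, a memory term in which cores on a node contend for shared bandwidth, and a parameterized communication term. Every constant below is a made-up placeholder.

      import math

      def predicted_time(cores_per_node, nodes, work_per_core_flops=1.0e11,
                         flop_rate=8.5e9,          # sustained flop/s per core (assumed)
                         bytes_per_core=4.0e9,     # memory traffic per core (assumed)
                         node_bandwidth=25.0e9,    # shared node memory bandwidth, B/s (assumed)
                         latency=2.0e-6, msg_bytes=1.0e6, net_bandwidth=1.5e9):
          t_compute = work_per_core_flops / flop_rate
          # cores on a node contend for the shared memory bus
          t_memory = bytes_per_core / (node_bandwidth / cores_per_node)
          # parameterized communication term: latency plus bandwidth cost, growing
          # logarithmically with the number of nodes (collective-like pattern)
          t_comm = (latency + msg_bytes / net_bandwidth) * math.log2(max(nodes, 2))
          return t_compute + t_memory + t_comm

      for n in (1, 4, 16, 64):                      # weak scaling: work per core fixed
          print(f"{n:3d} nodes: {predicted_time(4, n):.3f} s")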

  5. An efficient implementation of a backpropagation learning algorithm on quadrics parallel supercomputer

    International Nuclear Information System (INIS)

    Taraglio, S.; Massaioli, F.

    1995-08-01

    A parallel implementation of a library to build and train Multi Layer Perceptrons via the Back Propagation algorithm is presented. The target machine is the SIMD massively parallel supercomputer Quadrics. Performance measures are provided on three different machines with different numbers of processors, for two network examples. Sample source code is given.
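
    For orientation, a minimal serial sketch of backpropagation for a one-hidden-layer perceptron is given below; it is illustrative only and unrelated to the Quadrics SIMD implementation described above. All data are synthetic.

      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.standard_normal((64, 4))                       # toy inputs
      y = (X.sum(axis=1, keepdims=True) > 0).astype(float)   # toy targets

      W1 = rng.standard_normal((4, 8)) * 0.1; b1 = np.zeros((1, 8))
      W2 = rng.standard_normal((8, 1)) * 0.1; b2 = np.zeros((1, 1))
      lr = 0.5
      sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

      for epoch in range(200):
          # forward pass
          h = sigmoid(X @ W1 + b1)
          out = sigmoid(h @ W2 + b2)
          # backward pass (mean squared error loss, sigmoid derivatives)
          d_out = (out - y) * out * (1 - out)
          d_h = (d_out @ W2.T) * h * (1 - h)
          # gradient descent updates
          W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0, keepdims=True)
          W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0, keepdims=True)

      print("final mean error:", float(np.abs(out - y).mean()))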

  6. Interactive real-time nuclear plant simulations on a UNIX based supercomputer

    International Nuclear Information System (INIS)

    Behling, S.R.

    1990-01-01

    Interactive real-time nuclear plant simulations are critically important to train nuclear power plant engineers and operators. In addition, real-time simulations can be used to test the validity and timing of plant technical specifications and operational procedures. To accurately and confidently simulate a nuclear power plant transient in real-time, sufficient computer resources must be available. Since some important transients cannot be simulated using preprogrammed responses or non-physical models, commonly used simulation techniques may not be adequate. However, the power of a supercomputer allows one to accurately calculate the behavior of nuclear power plants even during very complex transients. Many of these transients can be calculated in real-time or quicker on the fastest supercomputers. The concept of running interactive real-time nuclear power plant transients on a supercomputer has been tested. This paper describes the architecture of the simulation program, the techniques used to establish real-time synchronization, and other issues related to the use of supercomputers in a new and potentially very important area. (author)
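
    One generic way to establish the real-time synchronization mentioned above is to pace each simulation step against the wall clock, sleeping whenever the model runs faster than real time. The sketch below shows that pattern under this assumption; it is not the simulator's actual mechanism.

      import time

      def run_realtime(step_fn, dt_sim=0.1, duration=3.0):
          start = time.monotonic()
          t_sim = 0.0
          while t_sim < duration:
              step_fn(t_sim, dt_sim)              # advance the model by dt_sim seconds
              t_sim += dt_sim
              lag = (start + t_sim) - time.monotonic()
              if lag > 0:
                  time.sleep(lag)                 # ran faster than real time: wait
              # if lag <= 0 the step overran its real-time budget; a production
              # simulator would have to log or otherwise handle this condition

      run_realtime(lambda t, dt: None)            # trivial stand-in model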

  7. The impact of the U.S. supercomputing initiative will be global

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, Dona [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-15

    Last July, President Obama issued an executive order that created a coordinated federal strategy for HPC research, development, and deployment called the U.S. National Strategic Computing Initiative (NSCI). This bold, necessary step toward building the next generation of supercomputers has inaugurated a new era for U.S. high performance computing (HPC).

  8. Argonne National Lab deploys Force10 networks' massively dense ethernet switch for supercomputing cluster

    CERN Multimedia

    2003-01-01

    "Force10 Networks, Inc. today announced that Argonne National Laboratory (Argonne, IL) has successfully deployed Force10 E-Series switch/routers to connect to the TeraGrid, the world's largest supercomputing grid, sponsored by the National Science Foundation (NSF)" (1/2 page).

  9. A low-carbon future: Spatial planning's role in enhancing technological innovation in the built environment

    International Nuclear Information System (INIS)

    Crawford, Jenny; French, Will

    2008-01-01

    The scope of spatial planning activity includes issues of governance, corporate organisation, policy integration, statutory and regulatory frameworks, and technical analysis and design. The nature of its potential contribution to achieving low-carbon built environments will vary according to the resolution of tensions between pressures for leadership, consistent decision making and speed of change and the value placed on diversity, flexibility and innovation. A planning system that can support technological innovation will be characterised by high levels of organisational and institutional capacity and high-quality knowledge systems that support a focus on delivering place-based objectives. The paper reflects on further aspects of such a system and the issues that spatial planning needs to address in delivering low-carbon energy systems

  10. Biofuels are (Not) the Future! Legitimation Strategies of Sustainable Ventures in Complex Institutional Environments

    Directory of Open Access Journals (Sweden)

    Neil A. Thompson

    2018-04-01

    Full Text Available Sustainable ventures often lack legitimacy (perceived to be desirable and appropriate) because various stakeholder groups use contradictory institutions (rules and norms) to make their judgements, which leads to there being fewer resources available and higher failure rates. Using an institutional theory framework and a multi-case research design with 15 biofuel ventures operating in the Netherlands, this study asks how sustainable entrepreneurs attempt to gain legitimacy in these circumstances. Analysis reveals that the entrepreneurs use a combination of rhetorical, reconciliatory and institutional change strategies to obtain legitimacy from different stakeholder groups. These findings further our understanding of sustainable entrepreneurial behavior by revealing how and why different legitimation strategies are used in complex institutional environments.

  11. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures

    Science.gov (United States)

    Rock, Adam J.; Coventry, William L.; Morgan, Methuen I.; Loi, Natasha M.

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology. PMID:27014147

  12. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures.

    Science.gov (United States)

    Rock, Adam J; Coventry, William L; Morgan, Methuen I; Loi, Natasha M

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology.

  13. Transportation Energy Futures Series: Effects of the Built Environment on Transportation: Energy Use, Greenhouse Gas Emissions, and Other Factors

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C. D.; Brown, A.; Dunphy, R. T.; Vimmerstedt, L.

    2013-03-01

    Planning initiatives in many regions and communities aim to reduce transportation energy use, decrease emissions, and achieve related environmental benefits by changing land use. This report reviews and summarizes findings from the existing literature on the relationship between the built environment and transportation energy use and greenhouse gas emissions, identifying trends in results as well as potential future actions. The indirect influence of federal transportation and housing policies, as well as the direct impact of municipal regulation on land use, are examined for their effect on transportation patterns and energy use. Special attention is given to the 'four D' factors of density, diversity, design and accessibility. The report concludes that policy-driven changes to the built environment could reduce transportation energy and GHG emissions by between less than 1% and as much as 10% by 2050, the equivalent of 16%-18% of present-day urban light-duty-vehicle travel. This is one of a series of reports produced as a result of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  14. Transportation Energy Futures Series. Effects of the Built Environment on Transportation. Energy Use, Greenhouse Gas Emissions, and Other Factors

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C. D. [National Renewable Energy Lab. (NREL) and Cambridge Systematics, Inc., Golden, CO (United States); Brown, A. [National Renewable Energy Lab. (NREL) and Cambridge Systematics, Inc., Golden, CO (United States); Dunphy, R. T. [National Renewable Energy Lab. (NREL) and Cambridge Systematics, Inc., Golden, CO (United States); Vimmerstedt, L. [National Renewable Energy Lab. (NREL) and Cambridge Systematics, Inc., Golden, CO (United States)

    2013-03-15

    Planning initiatives in many regions and communities aim to reduce transportation energy use, decrease emissions, and achieve related environmental benefits by changing land use. This report reviews and summarizes findings from the existing literature on the relationship between the built environment and transportation energy use and greenhouse gas emissions, identifying trends in results as well as potential future actions. The indirect influence of federal transportation and housing policies, as well as the direct impact of municipal regulation on land use, are examined for their effect on transportation patterns and energy use. Special attention is given to the 'four D' factors of density, diversity, design and accessibility. The report concludes that policy-driven changes to the built environment could reduce transportation energy and GHG emissions by between less than 1% and as much as 10% by 2050, the equivalent of 16%-18% of present-day urban light-duty-vehicle travel. This is one of a series of reports produced as a result of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  15. Clown knifefish (Chitala ornata) oxygen uptake and its partitioning in present and future temperature environments.

    Science.gov (United States)

    Tuong, Dang Diem; Ngoc, Tran Bao; Huynh, Vo Thi Nhu; Huong, Do Thi Thanh; Phuong, Nguyen Thanh; Hai, Tran Ngoc; Wang, Tobias; Bayley, Mark

    2018-02-01

    It has been argued that tropical ectotherms are more vulnerable to the projected temperature increases than their temperate relatives, because they already live closer to their upper temperature limit. Here we examine the effects of an increase in environmental temperature to 6°C above the present-day median temperature (27°C) on aspects of the respiratory physiology of the freshwater air-breathing fish Chitala ornata, in both normoxia and hypoxia. We found no evidence of respiratory impairment with elevated temperature. The standard metabolic rate (SMR) and routine metabolic rate (RMR) at the two temperatures in normoxia and hypoxia increased with Q10 values between 2.3 and 2.9, while the specific dynamic action (SDA) and its coefficient increased, the latter from 7.8% at 27°C to 14.7% at 33°C. In addition, Chitala ornata exhibited significantly improved growth at the elevated temperature in both hypoxic and normoxic water. While projected temperature increases may negatively impact other essential aspects of this animal's environment, we see no evidence of a negative impact on this species itself. Copyright © 2017 Elsevier Inc. All rights reserved.
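
    The Q10 temperature coefficient quoted above relates rates measured at two temperatures, Q10 = (R2/R1)^(10/(T2-T1)). A small sketch with illustrative numbers only (they are not the study's measurements):

      def q10(rate1, t1, rate2, t2):
          # Q10 = (R2 / R1) ** (10 / (T2 - T1))
          return (rate2 / rate1) ** (10.0 / (t2 - t1))

      # e.g. a rate rising from 100 to 165 (arbitrary units) between 27 and 33 °C
      print(round(q10(100.0, 27.0, 165.0, 33.0), 2))   # ~2.3, within the reported range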

  16. Ageing, genes, environment and epigenetics: what twin studies tell us now, and in the future.

    Science.gov (United States)

    Steves, Claire Joanne; Spector, Timothy D; Jackson, Stephen H D

    2012-09-01

    Compared with younger people, older people are much more variable in their organ function, and these large individual differences contribute to the complexity of geriatric medicine. What determines this variability? Is it due to the accumulation of different life experiences, or because of the variation in the genes we are born with, or an interaction of both? This paper reviews key findings from ageing twin cohorts probing these questions. Twin studies are the perfect natural experiment to dissect out genes and life experiences. We discuss the paradox that ageing is strongly determined by heritable factors (an influence that often gets stronger with time), yet longevity and lifespan seem not to be so heritable. We then focus on the intriguing question of why DNA sequence-identical twins might age differently. Animal studies are increasingly showing that epigenetic modifications occurring in early development and adulthood, might be key to ageing phenomena but this is difficult to investigate longitudinally in human populations, due to ethical problems of intervention and long lifespan. We propose that identical twin studies using new and existing cohorts may be useful human models in which to investigate the interaction between the environment and genetics, mediated by epigenetic modifications.

  17. Climate change, renewable energy and population impact on future energy demand for Burkina Faso built environment

    Science.gov (United States)

    Ouedraogo, B. I.

    This research addresses the dual challenge faced by Burkina Faso engineers to design sustainable, low-energy-cost public buildings and domestic dwellings while still providing the required thermal comfort under the warmer temperature conditions caused by climate change. It was found, based on climate change SRES scenario A2, that the predicted mean temperature in Burkina Faso will increase by 2°C between 2010 and 2050. Therefore, in order to maintain a thermally comfortable 25°C inside public buildings, the projected annual energy consumption for cooling load will increase by 15%, 36% and 100% respectively for the periods 2020 to 2039, 2040 to 2059 and 2070 to 2089 when compared to the control case. It has also been found that a 1% increase in population growth will result in a 1.38% and 2.03% increase in carbon emissions from primary energy consumption and future electricity consumption respectively. Furthermore, this research has investigated possible solutions for adaptation to the severe impact of climate change and population growth on energy demand in Burkina Faso. Shading devices could potentially reduce the cooling load by up to 40%. Computer simulation of building energy consumption and a field study have shown that adobe houses have the potential to significantly reduce energy demand for cooling and offer a formidable method for climate change adaptation. Based on the Net Present Cost, a hybrid photovoltaic (PV) and diesel generator energy production configuration is the most cost-effective local electricity supply system for areas currently without electricity, with a payback time of 8 years when compared to a diesel-generator-only configuration. It is therefore a viable solution to increase electricity access for the majority of the population.

  18. Present and future thermal environments available to Sharp-tailed Grouse in an intact grassland.

    Directory of Open Access Journals (Sweden)

    Edward J Raynor

    Full Text Available Better understanding animal ecology in terms of thermal habitat use has become a focus of ecological studies, in large part due to the predicted temperature increases associated with global climate change. To further our knowledge of how ground-nesting endotherms respond to thermal landscapes, we examined the thermal ecology of Sharp-tailed Grouse (Tympanuchus phasianellus) during the nesting period. We measured site-specific iButton temperatures (TiB) and vegetation characteristics at nest sites, nearby random sites, and landscape sites to assess thermal patterns at scales relevant to nesting birds. We asked if microhabitat vegetation characteristics at nest sites matched the characteristics that directed macrohabitat nest-site selection. Grouse selected sites sheltered by dense vegetation for nesting that moderated TiB on average by up to 2.7°C relative to available landscape sites. Successful nests were positioned in a way that reduced exposure to thermal extremes by as much as 4°C relative to failed nests, with an overall mean daytime difference (±SE) of 0.4 ± 0.03°C. We found that macrohabitat nest-site selection was guided by dense vegetation cover and minimal bare ground, as also seen at the microhabitat scale. Global climate projections for 2080 suggest that TiB at nest sites may approach temperatures currently avoided on the landscape, emphasizing a need for future conservation plans that acknowledge fine-scale thermal space in climate change scenarios. These data show that features of grassland landscapes can buffer organisms from unfavorable microclimatic conditions and highlight how thermal heterogeneity at the individual level can drive decisions guiding nest-site selection.

  19. Back to the future: virtualization of the computing environment at the W. M. Keck Observatory

    Science.gov (United States)

    McCann, Kevin L.; Birch, Denny A.; Holt, Jennifer M.; Randolph, William B.; Ward, Josephine A.

    2014-07-01

    Over its two decades of science operations, the W.M. Keck Observatory computing environment has evolved to contain a distributed hybrid mix of hundreds of servers, desktops and laptops of multiple different hardware platforms, O/S versions and vintages. Supporting the growing computing capabilities to meet the observatory's diverse, evolving computing demands within fixed budget constraints presents many challenges. This paper describes the significant role that virtualization is playing in addressing these challenges while improving the level and quality of service as well as realizing significant savings across many cost areas. Starting in December 2012, the observatory embarked on an ambitious plan to incrementally test and deploy a migration to virtualized platforms to address a broad range of specific opportunities. Implementation to date has been surprisingly glitch free, progressing well and yielding tangible benefits much faster than many expected. We describe here the general approach, starting with the initial identification of some low-hanging fruit, which also provided an opportunity to gain experience and build confidence among both the implementation team and the user community. We describe the range of challenges, opportunities and cost savings potential. Very significant among these were the substantial power savings, which resulted in strong, broad support for moving forward. We go on to describe the phasing plan, the evolving scalable architecture, some of the specific technical choices, as well as some of the individual technical issues encountered along the way. The phased implementation spans Windows and Unix servers for scientific, engineering and business operations, and virtualized desktops for typical office users as well as the more demanding graphics-intensive CAD users. Other areas discussed in this paper include staff training, load balancing, redundancy, scalability, remote access, disaster readiness and recovery.

  20. Holocene Paleoceanographic Environments at the Chukchi-Alaskan Margin: Implications for Future Changes

    Science.gov (United States)

    Polyak, L.; Nam, S. I.; Dipre, G.; Kim, S. Y.; Ortiz, J. D.; Darby, D. A.

    2017-12-01

    The impacts of the North Pacific oceanic and atmospheric system on the Arctic Ocean result in accelerated sea-ice retreat and related changes in hydrography and biota in the western Arctic. Paleoclimatic records from the Pacific sector of the Arctic are key for understanding the long-term history of these interactions. As opposed to stratigraphically long but strongly compressed sediment cores recovered from the deep Arctic Ocean, sediment depocenters on the Chukchi-Alaskan margin yield continuous, medium to high resolution records formed since the last deglaciation. While early Holocene conditions were non-analogous to modern environments due to the effects of prolonged deglaciation and insufficiently high sea levels, mid to late Holocene sediments are more relevant for recent and modern climate variability. Notably, a large depocenter at the Alaskan margin has sedimentation rates estimated as high as a few millimeters per year, thus providing a decadal to near-annual resolution. This high accumulation can be explained by sediment delivery via the Alaskan Coastal Current originating from the Bering Sea and supposedly controlled by the Aleutian Low pressure center. Preliminary results from sediment cores recovering the last several centuries, along with a comparison with other paleoclimatic proxy records from the Arctic-North Pacific region, indicate a persistent role of the Aleutian Low in the Bering Strait inflow and attendant deposition. More proxy studies are underway to reconstruct the history of this circulation system and its relationship with sea ice extent. The expected results will improve our understanding of natural variability in oceanic and atmospheric conditions at the Chukchi-Alaskan margin, a critical area for modulating the Arctic climate change.

  1. Radioactivity in the aquatic environment. A review of UK research 1994-1997 and recommendations for future work

    International Nuclear Information System (INIS)

    1998-07-01

    The national Radioactivity Research and Environmental Monitoring Committee (RADREM) provides a forum for liaison on UK research and monitoring in the radioactive substances and radioactive waste management fields. The committee aims to ensure that there is no unnecessary overlap between, or significant omission from, the research programmes of the various parts of Government, the regulatory bodies or industry. This report has been produced by the Aquatic Environment Sub-Committee (AESC) of RADREM. AESC is responsible for providing RADREM with scientific advice in the field of research relating to radionuclides in the aquatic environment, for reporting on the progress of research in this field and on future research requirements. The objectives of this report are presented in Section 2, and the membership of AESC given in Section 3. This report describes a review of research undertaken in the field of radioactivity in aquatic systems over the last three years (Section 4). The review updates previous reviews, the most recent being in 1993 (AESC, 1994). Future research requirements have been identified by AESC, considering past work and work in progress, and are presented in Section 5. Specific research requirements are discussed in Section 5, whilst Section 6 summarises the main areas where future research is identified as a priority. These areas are as follows: the movement and uptake of 99 Tc and 14 C in aquatic systems and biota; geochemical processes; off-shore sediments; non-equilibrium systems; radiation exposure during civil engineering works; further work on movement of radionuclides in salt marshes; development and validation of models. The specific objectives of this report are as follows: 1. To provide a summary of research undertaken in this field over the last three years. 2. To identify future research requirements. 3. To attach priorities to the future research requirements. It should be noted that the purpose of the report is to identify

  2. Nature, nurture, and capital punishment: How evidence of a genetic-environment interaction, future dangerousness, and deliberation affect sentencing decisions.

    Science.gov (United States)

    Gordon, Natalie; Greene, Edie

    2018-01-01

    Research has shown that the low-activity MAOA genotype in conjunction with a history of childhood maltreatment increases the likelihood of violent behaviors. This genetic-environment (G × E) interaction has been introduced as mitigation during the sentencing phase of capital trials, yet there is scant data on its effectiveness. This study addressed that issue. In a factorial design that varied mitigating evidence offered by the defense [environmental (i.e., childhood maltreatment), genetic, G × E, or none] and the likelihood of the defendant's future dangerousness (low or high), 600 mock jurors read sentencing phase evidence in a capital murder trial, rendered individual verdicts, and half deliberated as members of a jury to decide a sentence of death or life imprisonment. The G × E evidence had little mitigating effect on sentencing preferences: participants who received the G × E evidence were no less likely to sentence the defendant to death than those who received evidence of childhood maltreatment or a control group that received neither genetic nor maltreatment evidence. Participants with evidence of a G × E interaction were more likely to sentence the defendant to death when there was a high risk of future dangerousness than when there was a low risk. Sentencing preferences were more lenient after deliberation than before. We discuss limitations and future directions. Copyright © 2017 John Wiley & Sons, Ltd.

  3. Effects of the Extraterrestrial Environment on Plants: Recommendations for Future Space Experiments for the MELiSSA Higher Plant Compartment

    Directory of Open Access Journals (Sweden)

    Silje A. Wolff

    2014-05-01

    Full Text Available Due to logistical challenges, long-term human space exploration missions require a life support system capable of regenerating all the essentials for survival. Higher plants can be utilized to provide a continuous supply of fresh food, atmosphere revitalization, and clean water for humans. Plants can adapt to extreme environments on Earth, and model plants have been shown to grow and develop through a full life cycle in microgravity. However, more knowledge about the long term effects of the extraterrestrial environment on plant growth and development is necessary. The European Space Agency (ESA has developed the Micro-Ecological Life Support System Alternative (MELiSSA program to develop a closed regenerative life support system, based on micro-organisms and higher plant processes, with continuous recycling of resources. In this context, a literature review to analyze the impact of the space environments on higher plants, with focus on gravity levels, magnetic fields and radiation, has been performed. This communication presents a roadmap giving directions for future scientific activities within space plant cultivation. The roadmap aims to identify the research activities required before higher plants can be included in regenerative life support systems in space.

  4. Effects of the Extraterrestrial Environment on Plants: Recommendations for Future Space Experiments for the MELiSSA Higher Plant Compartment.

    Science.gov (United States)

    Wolff, Silje A; Coelho, Liz H; Karoliussen, Irene; Jost, Ann-Iren Kittang

    2014-05-05

    Due to logistical challenges, long-term human space exploration missions require a life support system capable of regenerating all the essentials for survival. Higher plants can be utilized to provide a continuous supply of fresh food, atmosphere revitalization, and clean water for humans. Plants can adapt to extreme environments on Earth, and model plants have been shown to grow and develop through a full life cycle in microgravity. However, more knowledge about the long term effects of the extraterrestrial environment on plant growth and development is necessary. The European Space Agency (ESA) has developed the Micro-Ecological Life Support System Alternative (MELiSSA) program to develop a closed regenerative life support system, based on micro-organisms and higher plant processes, with continuous recycling of resources. In this context, a literature review to analyze the impact of the space environments on higher plants, with focus on gravity levels, magnetic fields and radiation, has been performed. This communication presents a roadmap giving directions for future scientific activities within space plant cultivation. The roadmap aims to identify the research activities required before higher plants can be included in regenerative life support systems in space.

  5. NASA's Planetary Science Summer School: Training Future Mission Leaders in a Concurrent Engineering Environment

    Science.gov (United States)

    Mitchell, K. L.; Lowes, L. L.; Budney, C. J.; Sohus, A.

    2014-12-01

    NASA's Planetary Science Summer School (PSSS) is an intensive program for postdocs and advanced graduate students in science and engineering fields with a keen interest in planetary exploration. The goal is to train the next generation of planetary science mission leaders in a hands-on environment involving a wide range of engineers and scientists. It was established in 1989, and has undergone several incarnations. Initially a series of seminars, it became a more formal mission design experience in 1999. Admission is competitive, with participants given financial support. The competitively selected trainees develop an early mission concept study in teams of 15-17, responsive to a typical NASA Science Mission Directorate Announcement of Opportunity. They select the mission concept from options presented by the course sponsors, based on high-priority missions as defined by the Decadal Survey, prepare a presentation for a proposal authorization review, present it to a senior review board and receive critical feedback. Each participant assumes multiple roles, on science, instrument and project teams. A series of reading assignments and webinars helps trainees develop an understanding of top-level science requirements and instrument priorities in advance. Then, during the five-day session at the Jet Propulsion Laboratory, they work closely with concurrent engineers including JPL's Advanced Projects Design Team ("Team X"), a cross-functional multidisciplinary team of engineers that utilizes concurrent engineering methodologies to complete rapid design, analysis and evaluation of mission concept designs. All are mentored and assisted directly by Team X members and course tutors in their assigned project roles. There is a strong emphasis on making difficult trades, simulating a real mission design process as accurately as possible. The process is intense and at times dramatic, with fast-paced design sessions and late evening study sessions. A survey of PSSS alumni

  6. Environment

    Science.gov (United States)

    2005-01-01

    biodiversity. Consequently, the major environmental challenges facing us in the 21st century include: global climate change, energy, population and food... technological prowess, and security interests. Challenges: Global Climate Change – Evidence shows that our environment and the global climate... urbanization will continue to pressure the regional environment. Although most countries have environmental protection ministries or agencies, a lack of

  7. Final Scientific Report: A Scalable Development Environment for Peta-Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Karbach, Carsten [Julich Research Center (Germany); Frings, Wolfgang [Julich Research Center (Germany)

    2013-02-22

    resources form the user display of LLview. These monitoring features have to be integrated into the development environment. Besides showing the current status, PTP's monitoring also needs to allow for submitting and canceling user jobs. Monitoring peta-scale systems especially deals with presenting the large amount of status data in a useful manner. Users need to be able to select arbitrary levels of detail. The monitoring views have to provide a quick overview of the system state, but also need to allow for zooming into the specific parts of the system in which the user is interested. At present, the major batch systems running on supercomputers are PBS, TORQUE, ALPS and LoadLeveler, which have to be supported by both the monitoring and the job-controlling component. Finally, PTP needs to be designed as generically as possible, so that it can be extended for future batch systems.
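
    As a hedged sketch of the job-control side described above, the snippet below submits and cancels a job on a PBS/TORQUE-style batch system via the standard qsub and qdel commands; the job script path is hypothetical, and other batch systems (ALPS, LoadLeveler) use different commands.

      import subprocess

      def submit(script_path):
          # qsub prints the new job identifier on stdout
          result = subprocess.run(["qsub", script_path],
                                  capture_output=True, text=True, check=True)
          return result.stdout.strip()

      def cancel(job_id):
          subprocess.run(["qdel", job_id], check=True)

      # usage (assumes a PBS/TORQUE cluster and a job script at this hypothetical path):
      # job_id = submit("my_job.pbs")
      # cancel(job_id)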

  8. Decadal analysis of impact of future climate on wheat production in dry Mediterranean environment: A case of Jordan.

    Science.gov (United States)

    Dixit, Prakash N; Telleria, Roberto; Al Khatib, Amal N; Allouzi, Siham F

    2018-01-01

    Different aspects of climate change, such as increased temperature, changed rainfall and higher atmospheric CO2 concentration, all have different effects on crop yields. Process-based crop models are the most widely used tools for estimating future crop yield responses to climate change. We applied the APSIM crop simulation model in a dry Mediterranean climate, with Jordan as sentinel site, to assess the impact of climate change on wheat production at the decadal level, considering two climate change scenarios of representative concentration pathways (RCP), viz. RCP4.5 and RCP8.5. The impact of climatic variables alone on grain yield was negative, but this adverse effect was negated when elevated atmospheric CO2 concentrations were also considered in the simulations. The crop cycle of wheat was reduced by a fortnight for the RCP4.5 scenario and by a month for the RCP8.5 scenario at the approach of the end of the century. On average, a grain yield increase of 5 to 11% in the near future (2010s-2030s decades), 12 to 16% in the mid future (2040s-2060s decades) and 9 to 16% in the end-of-century period can be expected for the moderate climate change scenario (RCP4.5), and a 6 to 15% increase in the near future, 13 to 19% in the mid future and 7 to 20% in the end-of-century period for the drastic climate change scenario (RCP8.5), based on different soils. The positive impact of elevated CO2 is more pronounced in soils with lower water holding capacity and moderate increases in temperature. Elevated CO2 had a greater positive effect on transpiration use efficiency (TUE) than the negative effect of elevated mean temperatures. The change in TUE was in a near-perfect direct relationship with elevated CO2 levels (R2 > 0.99), and every 100-ppm atmospheric CO2 increase resulted in a TUE increase of 2 kg ha-1 mm-1. Thus, in this environment yield gains are expected in the future and farmers can benefit from growing wheat. Copyright © 2017 Elsevier B.V. All rights reserved.
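
    A one-line sketch of the reported near-linear TUE response (about +2 kg ha-1 mm-1 per 100 ppm CO2); the baseline TUE and reference CO2 values used here are made-up placeholders, not figures from the study:

      def tue(co2_ppm, co2_ref=380.0, tue_ref=20.0, slope_per_100ppm=2.0):
          # tue_ref and co2_ref are placeholders; the slope is the reported ~2 kg/ha/mm per 100 ppm
          return tue_ref + slope_per_100ppm * (co2_ppm - co2_ref) / 100.0

      print(tue(480.0))   # +100 ppm over the reference -> TUE rises by 2 kg ha-1 mm-1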

  9. Parallel Earthquake Simulations on Large-Scale Multicore Supercomputers

    KAUST Repository

    Wu, Xingfu

    2011-01-01

    Earthquakes are one of the most destructive natural hazards on our planet Earth. Huge earthquakes striking offshore may cause devastating tsunamis, as evidenced by the 11 March 2011 Japan (moment magnitude Mw9.0) and the 26 December 2004 Sumatra (Mw9.1) earthquakes. Earthquake prediction (in terms of the precise time, place, and magnitude of a coming earthquake) is arguably unfeasible in the foreseeable future. To mitigate seismic hazards from future earthquakes in earthquake-prone areas, such as California and Japan, scientists have been using numerical simulations to study earthquake rupture propagation along faults and seismic wave propagation in the surrounding media on ever-advancing modern computers over the past several decades. In particular, ground motion simulations for past and future (possible) significant earthquakes have been performed to understand factors that affect ground shaking in populated areas, and to provide ground shaking characteristics and synthetic seismograms for emergency preparation and design of earthquake-resistant structures. These simulation results can guide the development of more rational seismic provisions, leading to safer, more efficient, and economical structures in earthquake-prone regions.

  10. Integration Of PanDA Workload Management System With Supercomputers for ATLAS and Data Intensive Science

    International Nuclear Information System (INIS)

    Klimentov, A; Maeno, T; Nilsson, P; Panitkin, S; Wenaus, T; De, K; Oleynik, D; Jha, S; Wells, J

    2016-01-01

    The LHC, operating at CERN, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than the grid can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of the PanDA WMS with supercomputers in the United States, in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility. The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes. This implementation was tested with a variety of Monte Carlo workloads on several supercomputing platforms for the ALICE and ATLAS experiments and has been in full production for ATLAS since September 2015. We will present our current accomplishments with running PanDA at supercomputers and demonstrate our ability to use PanDA as a portal independent of the

  11. Integration Of PanDA Workload Management System With Supercomputers for ATLAS and Data Intensive Science

    Science.gov (United States)

    Klimentov, A.; De, K.; Jha, S.; Maeno, T.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Wells, J.; Wenaus, T.

    2016-10-01

    The LHC, operating at CERN, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than the grid can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of the PanDA WMS with supercomputers in the United States, in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility. The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes. This implementation was tested with a variety of Monte Carlo workloads on several supercomputing platforms for the ALICE and ATLAS experiments and has been in full production for ATLAS since September 2015. We will present our current accomplishments with running PanDA at supercomputers and demonstrate our ability to use PanDA as a portal independent of the
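
    A hedged sketch of a light-weight MPI wrapper in the spirit described above: each MPI rank launches one copy of a single-threaded payload on its own input, so many independent tasks run side by side on a node's cores. It uses mpi4py; the payload executable and the input-file naming are hypothetical placeholders, not PanDA components.

      from mpi4py import MPI
      import subprocess

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()

      # hypothetical per-rank input file, e.g. events_0.dat, events_1.dat, ...
      input_file = f"events_{rank}.dat"
      cmd = ["./single_threaded_payload", input_file]   # placeholder executable

      ret = subprocess.call(cmd)                 # run one serial task on this core
      codes = comm.gather(ret, root=0)           # collect exit codes for bookkeeping
      if rank == 0:
          print("payload exit codes:", codes)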

  12. A review on emerging contaminants in wastewaters and the environment: current knowledge, understudied areas and recommendations for future monitoring.

    Science.gov (United States)

    Petrie, Bruce; Barden, Ruth; Kasprzyk-Hordern, Barbara

    2015-04-01

    This review identifies understudied areas of emerging contaminant (EC) research in wastewaters and the environment, and recommends direction for future monitoring. Non-regulated trace organic ECs including pharmaceuticals, illicit drugs and personal care products are focused on due to ongoing policy initiatives and the expectant broadening of environmental legislation. These ECs are ubiquitous in the aquatic environment, mainly derived from the discharge of municipal wastewater effluents. Their presence is of concern due to the possible ecological impact (e.g., endocrine disruption) to biota within the environment. To better understand their fate in wastewaters and in the environment, a standardised approach to sampling is needed. This ensures representative data is attained and facilitates a better understanding of spatial and temporal trends of EC occurrence. During wastewater treatment, there is a lack of suspended particulate matter analysis due to further preparation requirements and a lack of good analytical approaches. This results in the under-reporting of several ECs entering wastewater treatment works (WwTWs) and the aquatic environment. Also, sludge can act as a concentrating medium for some chemicals during wastewater treatment. The majority of treated sludge is applied directly to agricultural land without analysis for ECs. As a result there is a paucity of information on the fate of ECs in soils and consequently, there has been no driver to investigate the toxicity to exposed terrestrial organisms. Therefore a more holistic approach to environmental monitoring is required, such that the fate and impact of ECs in all exposed environmental compartments are studied. The traditional analytical approach of applying targeted screening with low resolution mass spectrometry (e.g., triple quadrupoles) results in numerous chemicals such as transformation products going undetected. These can exhibit similar toxicity to the parent EC, demonstrating the necessity

  13. Preliminary design of CERN Future Circular Collider tunnel: first evaluation of the radiation environment in critical areas for electronics

    Directory of Open Access Journals (Sweden)

    Infantino Angelo

    2017-01-01

    Full Text Available As part of its post-LHC high energy physics program, CERN is conducting a study for a new proton-proton collider, called the Future Circular Collider (FCC-hh), running at center-of-mass energies of up to 100 TeV in a new 100 km tunnel. The study also includes a 90-350 GeV lepton collider (FCC-ee) as well as a lepton-hadron option (FCC-he). In this work, FLUKA Monte Carlo simulation was extensively used to perform a first evaluation of the radiation environment in critical areas for electronics in the FCC-hh tunnel. The model of the tunnel was created based on the original civil engineering studies already performed and further integrated into the existing FLUKA models of the beam line. The radiation levels in critical areas, such as the racks for electronics and cables, power converters, service areas, and local tunnel extensions, were evaluated.

  14. Radioactivity in the terrestrial environment; review of UK research 1993-1996 and recommendations for future work

    International Nuclear Information System (INIS)

    1997-03-01

    The national Radioactivity Research and Environmental Monitoring Committee (RADREM) provides a forum for liaison on UK research and monitoring in the radioactive substances and radioactive waste management fields. It is subscribed to by Government departments, national regulatory bodies, the UK nuclear industry and other bodies with relevant research sponsorship and monitoring interests. A key function of the RADREM committee is to ensure that there is no unnecessary overlap between, or significant omission from, the research sponsored by the organisations represented upon it. To this end, periodic reviews of research sector programmes are carried out. This report covers a review which was carried out by the Terrestrial Environment Sub-Committee (TESC) of RADREM for the period 1993-1996. In particular, possible future research requirements are considered and evaluated. Such omissions as are identified do not reflect Sub-Committee views on the adequacy of any individual organisation's research programme. Rather, they should be seen as areas where gaps in knowledge may exist, which all organisations are free to consider and prioritise in the formulation of their future research requirements. (author)

  15. Hazardous waste, impact on health and environment for development of better waste management strategies in future in India.

    Science.gov (United States)

    Misra, Virendra; Pandey, S D

    2005-04-01

    Industry has become an essential part of modern society, and waste production is an inevitable outcome of developmental activities. A material becomes waste when it is discarded without expecting to be compensated for its inherent value. These wastes may pose a potential hazard to human health or the environment (soil, air, water) when improperly treated, stored, transported, disposed of or otherwise managed. Currently in India, even though hazardous wastes, emanations and effluents are regulated, solid wastes are often disposed of indiscriminately, posing health and environmental risks. In view of this, the management of hazardous wastes, including their disposal in an environmentally friendly and economically viable way, is very important, and therefore suggestions are made for developing better strategies. Of the various categories of waste, solid waste contributes the major share of environmental degradation. The present paper outlines the nature of the wastes, waste-generating industries, waste characterization, health and environmental implications of waste management practices, steps towards planning, design and development of models for effective hazardous waste management, treatment, and approaches and regulations for the disposal of hazardous waste. An appraisal of the whole situation with reference to the Indian scenario is attempted so that better, cost-effective strategies for waste management can be evolved in the future.

  16. Impact of nanoparticles on human and environment: review of toxicity factors, exposures, control strategies, and future prospects.

    Science.gov (United States)

    Sajid, Muhammad; Ilyas, Muhammad; Basheer, Chanbasha; Tariq, Madiha; Daud, Muhammad; Baig, Nadeem; Shehzad, Farrukh

    2015-03-01

    Nanotechnology has revolutionized the world through the introduction of a unique class of materials and consumer products in many arenas. It has led to the production of innovative materials and devices. Despite their unique advantages and applications in domestic and industrial sectors, the use of materials with dimensions in nanometers has raised the issue of safety for workers, consumers, and the human environment. Because of their small size and other unique characteristics, nanoparticles have the ability to harm humans and wildlife by interacting through various mechanisms. We have reviewed the characteristics of nanoparticles which form the basis of their toxicity. This paper also reviews possible routes of exposure of nanoparticles to the human body. Dermal contact, inhalation, and ingestion are discussed in detail. As very limited data are available for long-term human exposures, there is a pressing need to develop methods which can determine the short- and long-term effects of nanoparticles on humans and the environment. We also discuss in brief the strategies which can help to control human exposure to toxic nanoparticles. We have outlined the current status of toxicological studies dealing with nanoparticles, accomplishments, weaknesses, and future challenges.

  17. 17th Edition of TOP500 List of World's Fastest Supercomputers Released

    Energy Technology Data Exchange (ETDEWEB)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack J.; Simon,Horst D.

    2001-06-21

    17th Edition of TOP500 List of World's Fastest Supercomputers Released MANNHEIM, GERMANY; KNOXVILLE, TENN.; BERKELEY, CALIF. In what has become a much-anticipated event in the world of high-performance computing, the 17th edition of the TOP500 list of the world's fastest supercomputers was released today (June 21). The latest edition of the twice-yearly ranking finds IBM as the leader in the field, with 40 percent in terms of installed systems and 43 percent in terms of total performance of all the installed systems. In second place in terms of installed systems is Sun Microsystems with 16 percent, while Cray Inc. retained second place in terms of performance (13 percent). SGI Inc. was third both with respect to systems with 63 (12.6 percent) and performance (10.2 percent).

  18. Direct exploitation of a top 500 Supercomputer for Analysis of CMS Data

    Science.gov (United States)

    Cabrillo, I.; Cabellos, L.; Marco, J.; Fernandez, J.; Gonzalez, I.

    2014-06-01

    The Altamira Supercomputer hosted at the Instituto de Fisica de Cantabria (IFCA) entered operation in summer 2012. Its latest-generation FDR InfiniBand network, used for message passing in parallel jobs, also supports the connection to General Parallel File System (GPFS) servers, enabling efficient simultaneous processing of multiple data-demanding jobs. Sharing a common GPFS system and a single LDAP-based identification with the existing Grid clusters at IFCA allows CMS researchers to exploit the large instantaneous capacity of this supercomputer to execute analysis jobs. The detailed experience of this opportunistic use for skimming and final analysis of CMS 2012 data for a specific physics channel, resulting in an order-of-magnitude reduction of the waiting time, is presented.

  19. Direct exploitation of a top 500 Supercomputer for Analysis of CMS Data

    International Nuclear Information System (INIS)

    Cabrillo, I; Cabellos, L; Marco, J; Fernandez, J; Gonzalez, I

    2014-01-01

    The Altamira Supercomputer hosted at the Instituto de Fisica de Cantabria (IFCA) entered operation in summer 2012. Its latest-generation FDR InfiniBand network, used for message passing in parallel jobs, also supports the connection to General Parallel File System (GPFS) servers, enabling efficient simultaneous processing of multiple data-demanding jobs. Sharing a common GPFS system and a single LDAP-based identification with the existing Grid clusters at IFCA allows CMS researchers to exploit the large instantaneous capacity of this supercomputer to execute analysis jobs. The detailed experience of this opportunistic use for skimming and final analysis of CMS 2012 data for a specific physics channel, resulting in an order-of-magnitude reduction of the waiting time, is presented.

  20. Explaining the gap between theoretical peak performance and real performance for supercomputer architectures

    International Nuclear Information System (INIS)

    Schoenauer, W.; Haefner, H.

    1993-01-01

    The basic architectures of vector and parallel computers and their properties are presented. Then the memory size and the arithmetic operations in the context of memory bandwidth are discussed. For the exemplary discussion of a single operation, micro-measurements of the vector triad for the IBM 3090 VF and the CRAY Y-MP/8 are presented. They reveal the details of the losses for a single operation. We then analyze the global performance of a whole supercomputer by identifying reduction factors that bring the theoretical peak performance down to the poor real performance. The responsibilities of the manufacturer and of the user for these losses are discussed. The price-performance ratio for different architectures, in a snapshot of January 1991, is briefly mentioned. Finally, some remarks on a user-friendly architecture for a supercomputer are made. (orig.)
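    For readers less familiar with the vector triad benchmark mentioned above, the sketch below shows its structure in modern terms: a = b + c*d over long arrays, timed and converted to MFLOPS. This is a hedged illustration using NumPy; the array sizes, repeat counts and timing choices are assumptions for clarity, not the measurement setup used on the IBM 3090 VF or CRAY Y-MP/8.

```python
# Vector-triad micro-benchmark sketch (a = b + c*d); sizes and timing are illustrative.
import time
import numpy as np

def vector_triad_mflops(n, repeat=10):
    b, c, d = (np.random.rand(n) for _ in range(3))
    a = np.empty(n)
    best = float("inf")
    for _ in range(repeat):
        t0 = time.perf_counter()
        np.add(b, c * d, out=a)        # a[i] = b[i] + c[i] * d[i]
        best = min(best, time.perf_counter() - t0)
    flops = 2 * n                      # one multiply and one add per element
    return flops / best / 1e6          # MFLOPS

for n in (10_000, 1_000_000, 10_000_000):
    print(f"n={n:>9}: {vector_triad_mflops(n):10.1f} MFLOPS")
```

As in the record's discussion, the measured rate drops sharply once the arrays no longer fit in cache, which is exactly the kind of memory-bandwidth loss the reduction factors describe.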

  1. Design and performance characterization of electronic structure calculations on massively parallel supercomputers

    DEFF Research Database (Denmark)

    Romero, N. A.; Glinsvad, Christian; Larsen, Ask Hjorth

    2013-01-01

    Density functional theory (DFT) is the most widely employed electronic structure method because of its favorable scaling with system size and accuracy for a broad range of molecular and condensed-phase systems. The advent of massively parallel supercomputers has enhanced the scientific community's ability to study larger system sizes. Ground-state DFT calculations on ~10^3 valence electrons using traditional O(N^3) algorithms can be routinely performed on present-day supercomputers. The performance characteristics of these massively parallel DFT codes on >10^4 computer cores are not well understood. The GPAW code was ported and optimized for the Blue Gene/P architecture. We present our algorithmic parallelization strategy and interpret the results for a number of benchmark test cases.

  2. Technology - environment - future

    International Nuclear Information System (INIS)

    1980-01-01

    This volume contains the materials of the meeting 'Scientific-technical progress and sociological alternatives', organized in March 1980 by the Institute for Marxist Studies and Research (IMSF). The goal of the meeting was to give a view of the present level of knowledge and discussion among the Federal Republic's Marxists on the direction and the social and ecological consequences of the development of science and technology under the conditions of capitalism. Special attention was paid to the debate with bourgeois views on the relation between technology and society, as well as to the discussion of alternative social concepts. (HSCH) [de

  3. Enabling Diverse Software Stacks on Supercomputers using High Performance Virtual Clusters.

    Energy Technology Data Exchange (ETDEWEB)

    Younge, Andrew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pedretti, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grant, Ryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brightwell, Ron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-05-01

    While large-scale simulations have been the hallmark of the High Performance Computing (HPC) community for decades, Large Scale Data Analytics (LSDA) workloads are gaining attention within the scientific community not only as a processing component of large HPC simulations, but also as standalone scientific tools for knowledge discovery. With the path towards Exascale, new HPC runtime systems are also emerging in a way that differs from classical distributed computing models. However, system software for such capabilities on the latest extreme-scale DOE supercomputers needs to be enhanced to more appropriately support these types of emerging software ecosystems. In this paper, we propose the use of Virtual Clusters on advanced supercomputing resources to enable systems to support not only HPC workloads, but also emerging big data stacks. Specifically, we have deployed the KVM hypervisor within Cray's Compute Node Linux on an XC-series supercomputer testbed. We also use libvirt and QEMU to manage and provision VMs directly on compute nodes, leveraging Ethernet-over-Aries network emulation. To our knowledge, this is the first known use of KVM on a true MPP supercomputer. We investigate the overhead of our solution using HPC benchmarks, evaluating both single-node performance and weak scaling of a 32-node virtual cluster. Overall, we find that single-node performance of our solution using KVM on a Cray is very efficient, with near-native performance. However, overhead increases by up to 20% as virtual cluster size increases, due to limitations of the Ethernet-over-Aries bridged network. Furthermore, we deploy Apache Spark with large data analysis workloads in a Virtual Cluster, effectively demonstrating how diverse software ecosystems can be supported by High Performance Virtual Clusters.

  4. US Department of Energy High School Student Supercomputing Honors Program: A follow-up assessment

    Energy Technology Data Exchange (ETDEWEB)

    1987-01-01

    The US DOE High School Student Supercomputing Honors Program was designed to recognize high school students with superior skills in mathematics and computer science and to provide them with formal training and experience with advanced computer equipment. This document reports on the participants who attended the first such program, which was held at the National Magnetic Fusion Energy Computer Center at the Lawrence Livermore National Laboratory (LLNL) during August 1985.

  5. Intricacies of modern supercomputing illustrated with recent advances in simulations of strongly correlated electron systems

    Science.gov (United States)

    Schulthess, Thomas C.

    2013-03-01

    The continued thousand-fold improvement in sustained application performance per decade on modern supercomputers keeps opening new opportunities for scientific simulations. But supercomputers have become very complex machines, built with thousands or tens of thousands of complex nodes consisting of multiple CPU cores or, most recently, a combination of CPU and GPU processors. Efficient simulations on such high-end computing systems require tailored algorithms that optimally map numerical methods to particular architectures. These intricacies will be illustrated with simulations of strongly correlated electron systems, where the development of quantum cluster methods, Monte Carlo techniques, as well as their optimal implementation by means of algorithms with improved data locality and high arithmetic density have gone hand in hand with evolving computer architectures. The present work would not have been possible without continued access to computing resources at the National Center for Computational Science of Oak Ridge National Laboratory, which is funded by the Facilities Division of the Office of Advanced Scientific Computing Research, and the Swiss National Supercomputing Center (CSCS) that is funded by ETH Zurich.

  6. Application of super-computer to field of plasma physics and nuclear fusion research

    International Nuclear Information System (INIS)

    In the fields of plasma physics and nuclear fusion research, carrying out the various numerical simulations of plasma requires an enormous amount of computation; accordingly, computers that are as fast as possible are necessary. The Electronic Computer Center attached to the Institute for Plasma Physics, Nagoya University, which is a national common utilization facility, adopted the full-scale Japanese-made super-computer VP-100, and utmost effort has been exerted to increase the speed of large-scale computation as well as numerical simulation. In this paper, super-computers are briefly explained, and the performance achieved at the Computer Center is described as an example. In practice, the scale of simulation is limited by the capability of the computers. The performance obtained by individual research codes and the average performance obtained over a wide range of codes on the VP-100 are discussed. Present fast computers deliver about 10 MFLOPS; therefore, computers whose performance surpasses 100 MFLOPS may be regarded as super-computers. The features of vector computers, the performance related to application programs and other aspects are reported. (Kako, I.)

  7. Technologies for the people: a future in the making

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, D.C.

    2004-09-01

    India's post-independence policy of using science and technology for national development, and its investment in research and development infrastructure, resulted in successes in space, atomic energy, missile development and supercomputing. The use of space technology has directly or indirectly impacted the vast majority of India's billion-plus population. Developments in a number of emerging technologies in recent years hold the promise of shaping the future of ordinary Indians in significant ways, if a proper policy and enabling environment are provided. New telecom technologies - a digital rural exchange and a wireless access system - are beginning to touch the lives of common people. The development of a low-cost handheld computing device, the use of hybrid telemedicine systems to extend modern healthcare to the unreached, and other innovative uses of IT at the grassroots also hold promise for the future. Biotechnology too has the potential to deliver cost-effective vaccines and drugs, but the future of GM crops is uncertain due to growing opposition. Some of these emerging technologies hold promise for the future, provided a positive policy and enabling environment are in place. (author)

  8. High Performance Simulation of Large-Scale Red Sea Ocean Bottom Seismic Data on the Supercomputer Shaheen II

    KAUST Repository

    Tonellot, Thierry

    2017-02-27

    than three days. After careful optimization of the finite difference kernel, each gather was computed at 184 gigaflops, on average. Up to 6,103 nodes could be used during the computation, resulting in a peak computation speed greater than 1.11 petaflops. The synthetic seismic data using the planned survey geometry was available one month before the actual acquisition, allowing for early real scale validation of our processing and imaging workflows. Moreover, the availability of a massive supercomputer such as Shaheen II enables fast reverse time migration (RTM) and full waveform inversion, and therefore, a more accurate velocity model estimation for future work.

  9. ATLAS FTK a - very complex - custom parallel supercomputer

    CERN Document Server

    Kimura, Naoki; The ATLAS collaboration

    2016-01-01

    In the ever-increasing pile-up LHC environment, advanced techniques for analysing the data are implemented in order to increase the rate of relevant physics processes with respect to background processes. The Fast TracKer (FTK) is a track-finding implementation at the hardware level that is designed to deliver full-scan tracks with $p_{T}$ above 1 GeV to the ATLAS trigger system for every L1 accept (at a maximum rate of 100 kHz). In order to achieve this performance a highly parallel system was designed, and it is now under installation in ATLAS. At the beginning of 2016 it will provide tracks for the trigger system in a region covering the central part of the ATLAS detector, and during the year its coverage will be extended to the full detector. The system relies on matching hits coming from the silicon tracking detectors against 1 billion patterns stored in specially designed ASIC chips (Associative Memory - AM06). In a first stage coarse-resolution hits are matched against the patterns and the accepted h...

  10. Combining density functional theory calculations, supercomputing, and data-driven methods to design new materials (Conference Presentation)

    Science.gov (United States)

    Jain, Anubhav

    2017-04-01

    Density functional theory (DFT) simulations solve for the electronic structure of materials starting from the Schrödinger equation. Many case studies have now demonstrated that researchers can often use DFT to design new compounds in the computer (e.g., for batteries, catalysts, and hydrogen storage) before synthesis and characterization in the lab. In this talk, I will focus on how DFT calculations can be executed on large supercomputing resources in order to generate very large data sets on new materials for functional applications. First, I will briefly describe the Materials Project, an effort at LBNL that has virtually characterized over 60,000 materials using DFT and has shared the results with over 17,000 registered users. Next, I will talk about how such data can help discover new materials, describing how preliminary computational screening led to the identification and confirmation of a new family of bulk AMX2 thermoelectric compounds with measured zT reaching 0.8. I will outline future plans for how such data-driven methods can be used to better understand the factors that control thermoelectric behavior, e.g., for the rational design of electronic band structures, in ways that are different from conventional approaches.

  11. Serving two purposes: Plans for a MOOC and a World Campus course called Energy, the Environment, and Our Future (Invited)

    Science.gov (United States)

    Bralower, T. J.; Alley, R. B.; Blumsack, S.; Keller, K.; Feineman, M. D.

    2013-12-01

    We are in the final stages of developing a Massive Open Online Course entitled Energy, the Environment, and Our Future. The course is a broad overview of the implications of the current energy options for Earth's climate and the choices for more sustainable energy sources in the future. The course is founded on concepts explored in the book and PBS series Earth: The Operators' Manual, but it includes more in-depth treatment of renewable energy as well as the ethical issues surrounding energy choices. One of the key aspects of the course is that it is being designed to be taught in two formats: the first, an eight-week MOOC through Coursera in Fall semester 2013, and the second, a 16-week online course developed as part of the NSF Geo-STEP InTeGrate program and offered through the Penn State World Campus. The advantage of the MOOC format is the ability to reach out to thousands of students worldwide, exposing them to the science behind important issues that may have a direct impact on the lifestyle decisions they make, while the World Campus course allows us to explore deeper levels of cognition through the application of carefully designed pedagogies. The principal difference between the two versions of the course will be assessment. The MOOC will have embedded assessment between pages and end-of-module quizzes. The InTeGrate course will have a range of assessments that are directly linked to the goals and objectives of the course. These will include active-learning exercises built around energy and climate data. Both versions are works in progress and we anticipate modifying them regularly based on student feedback.

  12. Evaluating Satellite and Supercomputing Technologies for Improved Coastal Ecosystem Assessments

    Science.gov (United States)

    McCarthy, Matthew James

    Water quality and wetlands represent two vital elements of a healthy coastal ecosystem. Both experienced substantial declines in the U.S. during the 20th century. Overall coastal wetland cover decreased over 50% in the 20th century due to coastal development and water pollution. Management and legislative efforts have successfully addressed some of the problems and threats, but recent research indicates that the diffuse impacts of climate change and non-point source pollution may be the primary drivers of current and future water-quality and wetland stress. In order to respond to these pervasive threats, traditional management approaches need to adopt modern technological tools for more synoptic, frequent and fine-scale monitoring and assessment. In this dissertation, I explored some of the applications possible with new, commercial satellite imagery to better assess the status of coastal ecosystems. Large-scale land-cover change influences the quality of adjacent coastal water. Satellite imagery has been used to derive land-cover maps since the 1960's. It provides multiple data points with which to evaluate the effects of land-cover change on water quality. The objective of the first chapter of this research was to determine how 40 years of land-cover change in the Tampa Bay watershed (6,500 km2) may have affected turbidity and chlorophyll concentration - two proxies for coastal water quality. Land cover classes were evaluated along with precipitation and wind stress as explanatory variables. Results varied between analyses for the entire estuary and those of segments within the bay. Changes in developed land percent cover best explained the turbidity and chlorophyll-concentration time series for the entire bay (R2 > 0.75, p metrics were evaluated against atmospheric, meteorological, and oceanographic variables including precipitation, wind speed, U and V wind vectors, river discharge, and water level over weekly, monthly, seasonal and annual time steps. Climate

  13. Human–environment interactions in urban green spaces — A systematic review of contemporary issues and prospects for future research

    Energy Technology Data Exchange (ETDEWEB)

    Kabisch, Nadja, E-mail: nadja.kabisch@geo.hu-berlin.de [Institute of Geography, Humboldt-University Berlin, Unter den Linden 6, 10099 Berlin (Germany); Department of Urban and Environmental Sociology, Helmholtz Centre for Environmental Research — UFZ, 04318 Leipzig (Germany); Qureshi, Salman [Institute of Geography, Humboldt-University Berlin, Unter den Linden 6, 10099 Berlin (Germany); School of Architecture, Birmingham Institute of Art and Design, Birmingham City University, The Parkside Building, 5 Cardigan Street, Birmingham B4 7BD (United Kingdom); Haase, Dagmar [Institute of Geography, Humboldt-University Berlin, Unter den Linden 6, 10099 Berlin (Germany); Department of Computational Landscape Ecology, Helmholtz Centre for Environmental Research — UFZ, 04318 Leipzig (Germany)

    2015-01-15

    Scientific papers on landscape planning underline the importance of maintaining and developing green spaces because of their multiple environmental and social benefits for city residents. However, a general understanding of contemporary human–environment interaction issues in urban green space is still incomplete and lacks orientation for urban planners. This review examines 219 publications to (1) provide an overview of the current state of research on the relationship between humans and urban green space, (2) group the different research approaches by identifying the main research areas, methods, and target groups, and (3) highlight important future prospects in urban green space research. - Highlights: • The reviewed literature on urban green space pins down a dearth of comparative studies. • Case studies in Africa and Russia are marginalized; Europe and the US dominate. • Questionnaires are the major tool used, followed by GIS and quantitative approaches. • Developing countries should contribute to building an urban green space agenda. • Interdisciplinary, adaptable and pluralistic approaches can satiate a knowledge gap.

  14. Information Environment is an Integral Element of Informational Space in the Process of Professional Development of Future Teacher of Physical Culture

    Directory of Open Access Journals (Sweden)

    Yuri V. Dragnev

    2012-04-01

    Full Text Available The article examines the information environment as an integral element of the information space in the process of professional development of a future teacher of physical culture. It notes that the strategic objective of the system of higher education is the training of a competent future teacher of physical culture in the field of information technologies, since information competence and information culture are major components of professionalism in the modern information-oriented society.

  15. Conceptual Provisions of the Educational System of Professional Training of a Future Teacher of Physical Culture in Terms of Informational and Educational Environment

    Directory of Open Access Journals (Sweden)

    Yurii V. Dragnev

    2013-01-01

    Full Text Available The article argues that the systems approach to the problem of professional development of a future teacher of physical culture in terms of an informational and educational environment is the basic trend of scientific cognition, and determines that the logic of the current modernization of higher athletic education in general is the initial precondition defining the establishment of the concept of professional development of a future teacher of physical culture.

  16. Environment

    International Nuclear Information System (INIS)

    McIntyre, A.D.; Turnbull, R.G.H.

    1992-01-01

    The development of the hydrocarbon resources of the North Sea has resulted in both offshore and onshore environmental repercussions, involving the existing physical attributes of the sea and seabed, the coastline and adjoining land. The social and economic repercussions of the industry were equally widespread. The dramatic and speedy impact of the exploration and exploitation of the northern North Sea resources in the early 1970s on the physical resources of Scotland was quickly realised, together with the concern that any environmental and social damage to the physical and social fabric should be kept to a minimum. To this end, a wide range of research and other activities by central and local government, and other interested agencies, was undertaken to extend existing knowledge on the marine and terrestrial environments that might be affected by the oil and gas industry. The outcome of these activities is summarized in this paper. The topics covered include a survey of the marine ecosystems of the North Sea, the fishing industry, the impact of oil pollution on seabirds and fish stocks, the ecology of the Scottish coastline and the impact of the petroleum industry on a selection of particular sites. (author)

  17. Traditional foods and practices of Spanish-speaking Latina mothers influence the home food environment: implications for future interventions.

    Science.gov (United States)

    Evans, Alexandra; Chow, Sherman; Jennings, Rose; Dave, Jayna; Scoblick, Kathryn; Sterba, Katherine Regan; Loyo, Jennifer

    2011-07-01

    This study aimed to obtain in-depth information from low-income, Spanish-speaking Latino families with young children to guide the development of culturally appropriate nutrition interventions. Focus groups were used to assess parents' knowledge about healthful eating, the home food environment, perceived influences on children's eating habits, food purchasing practices, and commonly used strategies to promote healthful eating among their children. Thirty-four Latino parents (33 women; 27 born in Mexico; 21 food-insecure) of preschool-aged children participated in four focus group discussions conducted in Spanish by a trained moderator. The focus groups were audiotaped, transcribed, translated, and coded by independent raters. Results suggest that, in general, parents were very knowledgeable about healthful eating and cited both parents and school as significant factors influencing children's eating habits; at home, most families had more traditional Mexican foods available than American foods; cost and familiarity with foods were the most influential factors affecting food purchasing; many parents had rules regarding sugar intake; and parents cited role modeling, reinforcement, and creative food preparation as ways to encourage children's healthful eating habits. Finally, parents generated ideas on how to best assist Latino families through interventions. Parents indicated that future interventions should be community based and teach skills to purchase and prepare meals that include low-cost and traditional Mexican ingredients, using hands-on activities. In addition, interventions could encourage and reinforce healthy food-related practices that Latino families bring from their native countries. Copyright © 2011 American Dietetic Association. Published by Elsevier Inc. All rights reserved.

  18. STAMPS: software tool for automated MRI post-processing on a supercomputer

    OpenAIRE

    Bigler, Don C.; Aksu, Yaman; Yang, Qing X.

    2009-01-01

    This paper describes a Software Tool for Automated MRI Post-processing (STAMP) of multiple types of brain MRIs on a workstation and for parallel processing on a supercomputer (STAMPS). This software tool enables the automation of nonlinear registration for a large image set and for multiple MR image types. The tool uses standard brain MRI post-processing tools (such as SPM, FSL, and HAMMER) for multiple MR image types in a pipeline fashion. It also contains novel MRI post-processing features....

  19. Watson will see you now: a supercomputer to help clinicians make informed treatment decisions.

    Science.gov (United States)

    Doyle-Lindrud, Susan

    2015-02-01

    IBM has collaborated with several cancer care providers to develop and train the IBM supercomputer Watson to help clinicians make informed treatment decisions. When a patient is seen in clinic, the oncologist can input all of the clinical information into the computer system. Watson will then review all of the data and recommend treatment options based on the latest evidence and guidelines. Once the oncologist makes the treatment decision, this information can be sent directly to the insurance company for approval. Watson has the ability to standardize care and accelerate the approval process, a benefit to the healthcare provider and the patient.

  20. Scalable parallel programming for high performance seismic simulation on petascale heterogeneous supercomputers

    Science.gov (United States)

    Zhou, Jun

    The 1994 Northridge earthquake in Los Angeles, California, killed 57 people, injured over 8,700 and caused an estimated $20 billion in damage. Petascale simulations are needed in California and elsewhere to provide society with a better understanding of the rupture and wave dynamics of the largest earthquakes at the shaking frequencies required to engineer safe structures. As heterogeneous supercomputing infrastructures become more common, numerical developments in earthquake system research are particularly challenged by the dependence on accelerator elements to enable "the Big One" simulations with higher frequency and finer resolution. Reducing time to solution and power consumption are the two primary focus areas today for the enabling technology of fault rupture dynamics and seismic wave propagation in realistic 3D models of the crust's heterogeneous structure. This dissertation presents scalable parallel programming techniques for high performance seismic simulation running on petascale heterogeneous supercomputers. A real-world earthquake simulation code, AWP-ODC, one of the most advanced earthquake codes to date, was chosen as the base code in this research, and the testbed is based on Titan at Oak Ridge National Laboratory, the world's largest heterogeneous supercomputer. The research work is primarily related to architecture study, computation performance tuning and software system scalability. An earthquake simulation workflow has also been developed to support the efficient production of sets of simulations. The highlights of the technical development are an aggressive performance optimization focusing on data locality and a notable data communication model that hides the data communication latency. This development results in optimal computation efficiency and throughput for the 13-point stencil code on heterogeneous systems, which can be extended to general high-order stencil codes. Started from scratch, the hybrid CPU/GPU version of AWP
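    The 13-point stencil mentioned in the abstract corresponds, in a fourth-order 3-D finite-difference code, to the centre point plus four neighbours along each axis. The NumPy sketch below is a hedged illustration of such a stencil applied as a Laplacian; the coefficients are the standard fourth-order weights and are not taken from AWP-ODC itself.

```python
# Hedged sketch of a 13-point (fourth-order, 3-D) finite-difference stencil.
import numpy as np

C0, C1, C2 = -15.0 / 2.0, 4.0 / 3.0, -1.0 / 12.0   # -5/2 per axis at the centre

def laplacian_13pt(u, h):
    """Apply the 13-point Laplacian to the interior of a 3-D array u with spacing h."""
    lap = np.zeros_like(u)
    i = slice(2, -2)
    lap[i, i, i] = (
        C0 * u[i, i, i]
        + C1 * (u[3:-1, i, i] + u[1:-3, i, i] + u[i, 3:-1, i]
                + u[i, 1:-3, i] + u[i, i, 3:-1] + u[i, i, 1:-3])
        + C2 * (u[4:, i, i] + u[:-4, i, i] + u[i, 4:, i]
                + u[i, :-4, i] + u[i, i, 4:] + u[i, i, :-4])
    )
    return lap / h**2

u = np.random.rand(64, 64, 64)
print(laplacian_13pt(u, h=10.0).shape)   # (64, 64, 64); the two-cell border stays zero
```

The data-locality and communication-hiding optimizations highlighted in the abstract concern how arrays like u are tiled, staged in GPU memory and halo-exchanged around updates of this kind.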

  1. Grassroots Supercomputing

    CERN Multimedia

    Buchanan, Mark

    2005-01-01

    What started out as a way for SETI to plow through its piles of radio-signal data from deep space has turned into a powerful research tool as computer users across the globe donate their screen-saver time to projects as diverse as climate-change prediction, gravitational-wave searches, and protein folding (4 pages)

  2. Integration of PanDA workload management system with Titan supercomputer at OLCF

    CERN Document Server

    Panitkin, Sergey; The ATLAS collaboration; Klimentov, Alexei; Oleynik, Danila; Petrosyan, Artem; Schovancova, Jaroslava; Vaniachine, Alexandre; Wenaus, Torre

    2015-01-01

    The PanDA (Production and Distributed Analysis) workload management system (WMS) was developed to meet the scale and complexity of LHC distributed computing for the ATLAS experiment. While PanDA currently uses more than 100,000 cores at well over 100 Grid sites with a peak performance of 0.3 petaFLOPS, the next LHC data-taking run will require more resources than Grid computing can possibly provide. To alleviate these challenges, ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We describe a project aimed at the integration of the PanDA WMS with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF). The current approach utilizes a modified PanDA pilot framework for job submission to Titan's batch queues and local data management, with lightweight MPI wrappers to run single-threaded workloads in parallel on Titan's multi-core worker nodes. It also gives PanDA the new capability to collect, in real tim...
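    The "lightweight MPI wrappers" mentioned above can be pictured as follows: MPI is used only to partition a list of independent, single-threaded payloads across ranks, so that a whole multi-core node allocation stays busy with serial jobs. The sketch below is a hedged illustration using mpi4py; the trivial one-liner payloads are hypothetical stand-ins and are not PanDA's actual interfaces.

```python
# Hedged sketch of an MPI wrapper running serial payloads in parallel, one set per rank.
from mpi4py import MPI
import subprocess
import sys

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Hypothetical serial work units; here each "payload" is a trivial Python one-liner.
tasks = [[sys.executable, "-c", f"print('processing event {i}')"] for i in range(4 * size)]

# Round-robin partitioning: rank r takes tasks r, r+size, r+2*size, ...
for task in tasks[rank::size]:
    subprocess.run(task, check=False)   # the payload itself is single threaded

comm.Barrier()
if rank == 0:
    print(f"{len(tasks)} serial payloads dispatched across {size} ranks")
```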

  3. Communication Characterization and Optimization of Applications Using Topology-Aware Task Mapping on Large Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Sreepathi, Sarat [ORNL; D' Azevedo, Eduardo [ORNL; Philip, Bobby [ORNL; Worley, Patrick H [ORNL

    2016-01-01

    On large supercomputers, the job scheduling systems may assign a non-contiguous node allocation for user applications depending on available resources. With parallel applications using MPI (Message Passing Interface), the default process ordering does not take into account the actual physical node layout available to the application. This contributes to non-locality in terms of physical network topology and impacts communication performance of the application. In order to mitigate such performance penalties, this work describes techniques to identify suitable task mapping that takes the layout of the allocated nodes as well as the application's communication behavior into account. During the first phase of this research, we instrumented and collected performance data to characterize communication behavior of critical US DOE (United States - Department of Energy) applications using an augmented version of the mpiP tool. Subsequently, we developed several reordering methods (spectral bisection, neighbor join tree etc.) to combine node layout and application communication data for optimized task placement. We developed a tool called mpiAproxy to facilitate detailed evaluation of the various reordering algorithms without requiring full application executions. This work presents a comprehensive performance evaluation (14,000 experiments) of the various task mapping techniques in lowering communication costs on Titan, the leadership class supercomputer at Oak Ridge National Laboratory.
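    To make the mapping problem concrete, the sketch below shows a deliberately simple, traffic-weighted placement: ranks that communicate the most are assigned to the most central allocated nodes. It illustrates the shape of the problem only; the paper's actual reordering methods (spectral bisection, neighbor-join trees, evaluation via mpiAproxy) are considerably more sophisticated.

```python
# Illustrative traffic-weighted task mapping; not the paper's algorithms.
import numpy as np

def traffic_weighted_mapping(comm_volume, node_distance):
    """comm_volume[i, j]: traffic between ranks i and j;
    node_distance[a, b]: hop count between allocated nodes a and b."""
    rank_load = comm_volume.sum(axis=1)            # total traffic per rank
    node_centrality = node_distance.sum(axis=1)    # lower = closer to all other nodes
    ranks = np.argsort(-rank_load)                 # heaviest communicators first
    nodes = np.argsort(node_centrality)            # most central nodes first
    return {int(r): int(n) for r, n in zip(ranks, nodes)}

rng = np.random.default_rng(0)
vol = rng.integers(0, 100, (8, 8))
vol = (vol + vol.T) // 2                           # symmetric traffic matrix
dist = np.abs(np.arange(8)[:, None] - np.arange(8)[None, :])  # nodes on a 1-D line
print(traffic_weighted_mapping(vol, dist))         # rank -> node assignment
```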

  4. Computational Science with the Titan Supercomputer: Early Outcomes and Lessons Learned

    Science.gov (United States)

    Wells, Jack

    2014-03-01

    Modeling and simulation with petascale computing has supercharged the process of innovation and understanding, dramatically accelerating time-to-insight and time-to-discovery. This presentation will focus on early outcomes from the Titan supercomputer at the Oak Ridge National Laboratory. Titan has over 18,000 hybrid compute nodes consisting of both CPUs and GPUs. In this presentation, I will discuss the lessons we have learned in deploying Titan and preparing applications to move from conventional CPU architectures to a hybrid machine. I will present early results of materials applications running on Titan and the implications for the research community as we prepare for exascale supercomputers in the next decade. Lastly, I will provide an overview of user programs at the Oak Ridge Leadership Computing Facility with specific information on how researchers may apply for allocations of computing resources. This research used resources of the Oak Ridge Leadership Computing Facility at the Oak Ridge National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC05-00OR22725.

  5. Unique Methodologies for Nano/Micro Manufacturing Job Training Via Desktop Supercomputer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, Clyde [Northern Illinois Univ., DeKalb, IL (United States); Karonis, Nicholas [Northern Illinois Univ., DeKalb, IL (United States); Lurio, Laurence [Northern Illinois Univ., DeKalb, IL (United States); Piot, Philippe [Northern Illinois Univ., DeKalb, IL (United States); Xiao, Zhili [Northern Illinois Univ., DeKalb, IL (United States); Glatz, Andreas [Northern Illinois Univ., DeKalb, IL (United States); Pohlman, Nicholas [Northern Illinois Univ., DeKalb, IL (United States); Hou, Minmei [Northern Illinois Univ., DeKalb, IL (United States); Demir, Veysel [Northern Illinois Univ., DeKalb, IL (United States); Song, Jie [Northern Illinois Univ., DeKalb, IL (United States); Duffin, Kirk [Northern Illinois Univ., DeKalb, IL (United States); Johns, Mitrick [Northern Illinois Univ., DeKalb, IL (United States); Sims, Thomas [Northern Illinois Univ., DeKalb, IL (United States); Yin, Yanbin [Northern Illinois Univ., DeKalb, IL (United States)

    2012-11-21

    This project establishes an initiative in high speed (Teraflop)/large-memory desktop supercomputing for modeling and simulation of dynamic processes important for energy and industrial applications. It provides a training ground for employment of current students in an emerging field with skills necessary to access the large supercomputing systems now present at DOE laboratories. It also provides a foundation for NIU faculty to quantum leap beyond their current small cluster facilities. The funding extends faculty and student capability to a new level of analytic skills with concomitant publication avenues. The components of the Hewlett Packard computer obtained by the DOE funds create a hybrid combination of a Graphics Processing System (12 GPU/Teraflops) and a Beowulf CPU system (144 CPU), the first expandable via the NIU GAEA system to ~60 Teraflops integrated with a 720 CPU Beowulf system. The software is based on access to the NVIDIA/CUDA library and the ability through MATLAB multiple licenses to create additional local programs. A number of existing programs are being transferred to the CPU Beowulf Cluster. Since the expertise necessary to create the parallel processing applications has recently been obtained at NIU, this effort for software development is in an early stage. The educational program has been initiated via formal tutorials and classroom curricula designed for the coming year. Specifically, the cost focus was on hardware acquisitions and appointment of graduate students for a wide range of applications in engineering, physics and computer science.

  6. PREFACE: HITES 2012: 'Horizons of Innovative Theories, Experiments, and Supercomputing in Nuclear Physics'

    Science.gov (United States)

    Hecht, K. T.

    2012-12-01

    This volume contains the contributions of the speakers of an international conference in honor of Jerry Draayer's 70th birthday, entitled 'Horizons of Innovative Theories, Experiments and Supercomputing in Nuclear Physics'. The list of contributors includes not only international experts in these fields, but also many former collaborators, former graduate students, and former postdoctoral fellows of Jerry Draayer, stressing innovative theories such as special symmetries and supercomputing, both of particular interest to Jerry. The organizers of the conference intended to honor Jerry Draayer not only for his seminal contributions in these fields, but also for his administrative skills at departmental, university, national and international level. Signed: Ted Hecht, University of Michigan. Conference photograph. Scientific Advisory Committee: Ani Aprahamian (University of Notre Dame), Baha Balantekin (University of Wisconsin), Bruce Barrett (University of Arizona), Umit Catalyurek (Ohio State University), David Dean (Oak Ridge National Laboratory), Jutta Escher (Chair, Lawrence Livermore National Laboratory), Jorge Hirsch (UNAM, Mexico), David Rowe (University of Toronto), Brad Sherill (Michigan State University), Joel Tohline (Louisiana State University), Edward Zganjar (Louisiana State University). Organizing Committee: Jeff Blackmon (Louisiana State University), Mark Caprio (University of Notre Dame), Tomas Dytrych (Louisiana State University), Ana Georgieva (INRNE, Bulgaria), Kristina Launey (Co-chair, Louisiana State University), Gabriella Popa (Ohio University Zanesville), James Vary (Co-chair, Iowa State University). Local Organizing Committee: Laura Linhardt (Louisiana State University), Charlie Rasco (Louisiana State University), Karen Richard (Coordinator, Louisiana State University).

  7. An Interface for Biomedical Big Data Processing on the Tianhe-2 Supercomputer.

    Science.gov (United States)

    Yang, Xi; Wu, Chengkun; Lu, Kai; Fang, Lin; Zhang, Yong; Li, Shengkang; Guo, Guixin; Du, YunFei

    2017-12-01

    Big data, cloud computing, and high-performance computing (HPC) are on the verge of convergence. Cloud computing is already playing an active part in big data processing with the help of big data frameworks like Hadoop and Spark. The recent upsurge of high-performance computing in China provides extra possibilities and capacity to address the challenges associated with big data. In this paper, we propose Orion, a big data interface on the Tianhe-2 supercomputer, to enable big data applications to run on Tianhe-2 via a single command or a shell script. Orion supports multiple users, and each user can launch multiple tasks. It minimizes the effort needed to initiate big data applications on the Tianhe-2 supercomputer via automated configuration. Orion follows the "allocate-when-needed" paradigm, and it avoids the idle occupation of computational resources. We tested the utility and performance of Orion using a big genomic dataset and achieved satisfactory performance on Tianhe-2 with very few modifications to existing applications that were implemented in Hadoop/Spark. In summary, Orion provides a practical and economical interface for big data processing on Tianhe-2.

  8. Graph visualization for the analysis of the structure and dynamics of extreme-scale supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Berkbigler, K. P. (Kathryn P.); Bush, B. W. (Brian W.); Davis, Kei,; Hoisie, A. (Adolfy); Smith, S. A. (Steve A.)

    2002-01-01

    We are exploring the development and application of information visualization techniques for the analysis of new extreme-scale supercomputer architectures. Modern supercomputers typically comprise very large clusters of commodity SMPs interconnected by possibly dense and often nonstandard networks. The scale, complexity, and inherent nonlocality of the structure and dynamics of this hardware, and of the systems and applications distributed over it, challenge traditional analysis methods. As part of the a la carte team at Los Alamos National Laboratory, which is simulating these advanced architectures, we are exploring advanced visualization techniques and creating tools to provide intuitive exploration, discovery, and analysis of these simulations. This work complements existing and emerging algorithmic analysis tools. Here we give background on the problem domain, a description of a prototypical computer architecture of interest (on the order of 10,000 processors connected by a quaternary fat-tree network), and presentations of several visualizations of the simulation data that make clear the flow of data in the interconnection network.

  9. Parallel Multivariate Spatio-Temporal Clustering of Large Ecological Datasets on Hybrid Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Sreepathi, Sarat [ORNL; Kumar, Jitendra [ORNL; Mills, Richard T. [Argonne National Laboratory; Hoffman, Forrest M. [ORNL; Sripathi, Vamsi [Intel Corporation; Hargrove, William Walter [United States Department of Agriculture (USDA), United States Forest Service (USFS)

    2017-09-01

    A proliferation of data from vast networks of remote sensing platforms (satellites, unmanned aircraft systems (UAS), airborne etc.), observational facilities (meteorological, eddy covariance etc.), state-of-the-art sensors, and simulation models offers unprecedented opportunities for scientific discovery. Unsupervised classification is a widely applied data mining approach to derive insights from such data. However, classification of very large data sets is a complex computational problem that requires efficient numerical algorithms and implementations on high performance computing (HPC) platforms. Additionally, increasing power, space, cooling and efficiency requirements have led to the deployment of hybrid supercomputing platforms with complex architectures and memory hierarchies like the Titan system at Oak Ridge National Laboratory. The advent of such accelerated computing architectures offers new challenges and opportunities for big data analytics in general and specifically, in our case, for large-scale cluster analysis. Although there is an existing body of work on parallel cluster analysis, those approaches do not fully meet the needs imposed by the nature and size of our large data sets. Moreover, they had scaling limitations and were mostly limited to traditional distributed memory computing platforms. We present a parallel Multivariate Spatio-Temporal Clustering (MSTC) technique based on k-means cluster analysis that can target hybrid supercomputers like Titan. We developed a hybrid MPI, CUDA and OpenACC implementation that can utilize both CPU and GPU resources on computational nodes. We describe performance results on Titan that demonstrate the scalability and efficacy of our approach in processing large ecological data sets.
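    As a point of reference for the clustering kernel named above, the sketch below shows one plain k-means loop over a multivariate data set. It is illustrative only: the ORNL MSTC implementation distributes the observations across MPI ranks and offloads the distance computations to GPUs via CUDA/OpenACC, which is not reproduced here.

```python
# Minimal serial k-means sketch (NumPy); the parallel MSTC code is far more elaborate.
import numpy as np

def kmeans(X, k, n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Squared Euclidean distance of every observation to every centroid.
        d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
        labels = d.argmin(axis=1)
        for c in range(k):
            if np.any(labels == c):
                centroids[c] = X[labels == c].mean(axis=0)
    return labels, centroids

# Toy stand-in for multivariate spatio-temporal records: rows = grid cells, columns = variables.
X = np.random.default_rng(1).standard_normal((10_000, 6))
labels, _ = kmeans(X, k=8)
print(np.bincount(labels))   # cluster sizes
```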

  10. Frequently updated noise threat maps created with use of supercomputing grid

    Directory of Open Access Journals (Sweden)

    Szczodrak Maciej

    2014-09-01

    Full Text Available Innovative supercomputing grid services devoted to noise threat evaluation are presented. The services described in this paper concern two issues: the first is related to noise mapping, while the second focuses on assessment of the noise dose and its influence on the human hearing system. The discussed services were developed within the PL-Grid Plus Infrastructure, which brings together Polish academic supercomputer centers. Selected experimental results achieved by using the proposed services are presented. The assessment of environmental noise threats includes the creation of noise maps using either offline or online data, acquired through a grid of monitoring stations. A concept for estimating the source model parameters based on the measured sound level, for the purpose of creating frequently updated noise maps, is presented. Connecting the noise mapping grid service with a distributed sensor network makes it possible to automatically update noise maps for a specified time period. Moreover, a unique attribute of the developed software is the estimation of the auditory effects evoked by the exposure to noise. The estimation method uses a modified psychoacoustic model of hearing and is based on the calculated noise level values and on the given exposure period. Potential use scenarios of the grid services for research or educational purposes are introduced. Presentation of the results of predicted hearing threshold shift caused by exposure to excessive noise can raise public awareness of noise threats.

  11. Integration of PanDA workload management system with Titan supercomputer at OLCF

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00300320; Klimentov, Alexei; Oleynik, Danila; Panitkin, Sergey; Petrosyan, Artem; Vaniachine, Alexandre; Wenaus, Torre; Schovancova, Jaroslava

    2015-01-01

    The PanDA (Production and Distributed Analysis) workload management system (WMS) was developed to meet the scale and complexity of LHC distributed computing for the ATLAS experiment. While PanDA currently distributes jobs to more than 100,000 cores at well over 100 Grid sites, the next LHC data-taking run will require more resources than Grid computing can possibly provide. To alleviate these challenges, ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We describe a project aimed at the integration of the PanDA WMS with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF). The current approach utilizes a modified PanDA pilot framework for job submission to Titan's batch queues and local data management, with lightweight MPI wrappers to run single-threaded workloads in parallel on Titan's multi-core worker nodes. It also gives PanDA new capability to collect, in real time, information about unused...

  12. Research Review: Gene-Environment Interaction Research in Youth Depression--A Systematic Review with Recommendations for Future Research

    Science.gov (United States)

    Dunn, Erin C.; Uddin, Monica; Subramanian, S. V.; Smoller, Jordan W.; Galea, Sandro; Koenen, Karestan C.

    2011-01-01

    Background: Depression is a major public health problem among youth, currently estimated to affect as many as 9% of US children and adolescents. The recognition that both genes (nature) and environments (nurture) are important for understanding the etiology of depression has led to a rapid growth in research exploring gene-environment interactions…

  13. The genesis of neurosurgery and the evolution of the neurosurgical operative environment: part II--concepts for future development, 2003 and beyond.

    Science.gov (United States)

    Liu, Charles Y; Spicer, Mark; Apuzzo, Michael L J

    2003-01-01

    The future development of the neurosurgical operative environment is driven principally by concurrent development in science and technology. In the new millennium, these developments are taking on a Jules Verne quality, with the ability to construct and manipulate the human organism and its surroundings at the level of atoms and molecules seemingly at hand. Thus, an examination of currents in technology advancement from the neurosurgical perspective can provide insight into the evolution of the neurosurgical operative environment. In the future, the optimal design solution for the operative environment requirements of specialized neurosurgery may take the form of composites of venues that are currently mutually distinct. Advances in microfabrication technology and laser optical manipulators are expanding the scope and role of robotics, with novel opportunities for bionic integration. Assimilation of biosensor technology into the operative environment promises to provide neurosurgeons of the future with a vastly expanded set of physiological data, which will require concurrent simplification and optimization of analysis and presentation schemes to facilitate practical usefulness. Nanotechnology derivatives are shattering the maximum limits of resolution and magnification allowed by conventional microscopes. Furthermore, quantum computing and molecular electronics promise to greatly enhance computational power, allowing the emerging reality of simulation and virtual neurosurgery for rehearsal and training purposes. Progressive minimalism is evident throughout, leading ultimately to a paradigm shift as the nanoscale is approached. At the interface between the old and new technological paradigms, issues related to integration may dictate the ultimate emergence of the products of the new paradigm. Once initiated, however, history suggests that the process of change will proceed rapidly and dramatically, with the ultimate neurosurgical operative environment of the future

  14. Large scale simulations of lattice QCD thermodynamics on Columbia Parallel Supercomputers

    International Nuclear Information System (INIS)

    Ohta, Shigemi

    1989-01-01

    The Columbia Parallel Supercomputer project aims at the construction of a parallel-processing, multi-gigaflop computer optimized for numerical simulations of lattice QCD. The project has three stages: a 16-node, 1/4 GF machine completed in April 1985; a 64-node, 1 GF machine completed in August 1987; and a 256-node, 16 GF machine now under construction. The machines all share a common architecture: a two-dimensional torus formed from a rectangular array of N1 x N2 independent and identical processors. A processor is capable of operating in a multi-instruction multi-data mode, except for periods of synchronous interprocessor communication with its four nearest neighbors. Here the thermodynamics simulations on the two working machines are reported. (orig./HSI)
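    A minimal sketch of the communication pattern described above, assuming nothing about the Columbia machines beyond what the record states: each processor on the N1 x N2 torus talks only to its four nearest neighbours, with wraparound at the edges.

```python
# Four nearest neighbours of processor (i, j) on an N1 x N2 two-dimensional torus.
def torus_neighbors(i, j, n1, n2):
    return [((i - 1) % n1, j), ((i + 1) % n1, j),   # up/down with wraparound
            (i, (j - 1) % n2), (i, (j + 1) % n2)]   # left/right with wraparound

# Example: on a 4 x 4 torus, node (0, 0) wraps to row 3 and column 3.
print(torus_neighbors(0, 0, 4, 4))   # [(3, 0), (1, 0), (0, 3), (0, 1)]
```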

  15. Affordable and accurate large-scale hybrid-functional calculations on GPU-accelerated supercomputers

    Science.gov (United States)

    Ratcliff, Laura E.; Degomme, A.; Flores-Livas, José A.; Goedecker, Stefan; Genovese, Luigi

    2018-03-01

    Performing high accuracy hybrid functional calculations for condensed matter systems containing a large number of atoms is at present computationally very demanding or even out of reach if high quality basis sets are used. We present a highly optimized multiple graphics processing unit implementation of the exact exchange operator which allows one to perform fast hybrid functional density-functional theory (DFT) calculations with systematic basis sets without additional approximations for up to a thousand atoms. With this method hybrid DFT calculations of high quality become accessible on state-of-the-art supercomputers within a time-to-solution that is of the same order of magnitude as traditional semilocal-GGA functionals. The method is implemented in a portable open-source library.

  16. Wavelet transform-vector quantization compression of supercomputer ocean model simulation output

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, J N; Brislawn, C M

    1992-11-12

    We describe a new procedure for efficient compression of digital information for storage and transmission purposes. The algorithm involves a discrete wavelet transform subband decomposition of the data set, followed by vector quantization of the wavelet transform coefficients using application-specific vector quantizers. The new vector quantizer design procedure optimizes the assignment of both memory resources and vector dimensions to the transform subbands by minimizing an exponential rate-distortion functional subject to constraints on both overall bit-rate and encoder complexity. The wavelet-vector quantization method, which originates in digital image compression, is applicable to the compression of other multidimensional data sets possessing some degree of smoothness. In this paper we discuss the use of this technique for compressing the output of supercomputer simulations of global climate models. The data presented here come from Semtner-Chervin global ocean models run at the National Center for Atmospheric Research and at the Los Alamos Advanced Computing Laboratory.
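    The two stages named above can be pictured with the hedged sketch below: a single-level Haar transform stands in for the paper's tailored subband decomposition, and a fixed random codebook stands in for its rate-distortion-optimized vector quantizers.

```python
# Wavelet subband decomposition + vector quantization, in heavily simplified form.
import numpy as np

def haar_1level(x):
    """One level of a 2-D Haar transform: returns (LL, LH, HL, HH) subbands."""
    a = (x[::2] + x[1::2]) / 2.0          # row averages
    d = (x[::2] - x[1::2]) / 2.0          # row details
    return ((a[:, ::2] + a[:, 1::2]) / 2.0, (a[:, ::2] - a[:, 1::2]) / 2.0,
            (d[:, ::2] + d[:, 1::2]) / 2.0, (d[:, ::2] - d[:, 1::2]) / 2.0)

def vector_quantize(band, codebook, dim=4):
    """Map non-overlapping length-`dim` coefficient vectors to nearest-codeword indices."""
    vecs = band.reshape(-1, dim)
    dists = ((vecs[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    return dists.argmin(axis=1)           # these indices are what would be stored

rng = np.random.default_rng(1)
field = rng.standard_normal((64, 64))     # stand-in for an ocean-model field
ll, lh, hl, hh = haar_1level(field)
codebook = rng.standard_normal((256, 4))  # 8-bit codebook of 4-vectors
print(vector_quantize(hh, codebook).shape)   # (256,) codeword indices for the HH band
```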

  17. Use of QUADRICS supercomputer as embedded simulator in emergency management systems

    International Nuclear Information System (INIS)

    Bove, R.; Di Costanzo, G.; Ziparo, A.

    1996-07-01

    The experience related to the implementation of MRBT, an atmospheric dispersion model for short-duration releases, is reported. The model was implemented on a QUADRICS-Q1 supercomputer. First, a description of the MRBT model is given: it is an analytical model for studying the spreading of light gases released into the atmosphere by accidental releases. The solution of the diffusion equation is Gaussian-like and yields the concentration of the released pollutant substance as a function of space and time. The QUADRICS architecture is then introduced and the implementation of the model is described. Finally, the integration of the QUADRICS-based model as a simulator in an emergency management system is considered.
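    For orientation, the formula family referred to above is the Gaussian puff solution of the advection-diffusion equation for an instantaneous release. The sketch below is generic and hedged: MRBT's actual parameterization of the dispersion coefficients and any reflection terms are not reproduced.

```python
# Generic Gaussian puff concentration for an instantaneous release at the origin.
import numpy as np

def gaussian_puff(x, y, z, t, Q=1.0, u=2.0, sx=10.0, sy=10.0, sz=5.0):
    """Concentration at (x, y, z) a time t after releasing mass Q, with wind speed u
    along x and dispersion sigmas sx, sy, sz (all parameter values here are illustrative)."""
    norm = Q / ((2.0 * np.pi) ** 1.5 * sx * sy * sz)
    return (norm
            * np.exp(-((x - u * t) ** 2) / (2.0 * sx ** 2))
            * np.exp(-(y ** 2) / (2.0 * sy ** 2))
            * np.exp(-(z ** 2) / (2.0 * sz ** 2)))

# Concentration 100 m downwind, on the puff axis, 50 s after the release.
print(f"{gaussian_puff(100.0, 0.0, 0.0, 50.0):.3e}")
```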

  18. An Optimized Parallel FDTD Topology for Challenging Electromagnetic Simulations on Supercomputers

    Directory of Open Access Journals (Sweden)

    Shugang Jiang

    2015-01-01

    Full Text Available It may not be a challenge to run a Finite-Difference Time-Domain (FDTD) code for electromagnetic simulations on a supercomputer with more than 10 thousand CPU cores; however, making the FDTD code work with the highest efficiency is a challenge. In this paper, the performance of parallel FDTD is optimized through the MPI (Message Passing Interface) virtual topology, based on which a communication model is established. The general rules of optimal topology are presented according to the model. The performance of the method is tested and analyzed on three high performance computing platforms with different architectures in China. Simulations including an airplane with a 700-wavelength wingspan and a complex microstrip antenna array with nearly 2000 elements are performed very efficiently using a maximum of 10,240 CPU cores.
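    The MPI virtual topology mentioned above amounts to arranging the ranks in a Cartesian process grid so that each FDTD subdomain exchanges boundary fields only with its grid neighbours. The mpi4py sketch below shows the mechanism; the grid dimensions here are chosen by MPI itself rather than by the optimized topology rules derived in the paper.

```python
# Minimal MPI Cartesian (virtual) topology sketch for an FDTD-style domain decomposition.
from mpi4py import MPI

comm = MPI.COMM_WORLD
dims = MPI.Compute_dims(comm.Get_size(), [0, 0, 0])       # factor the ranks into a 3-D grid
cart = comm.Create_cart(dims, periods=[False] * 3, reorder=True)

coords = cart.Get_coords(cart.Get_rank())
# (source, dest) ranks for halo exchange along each axis; MPI.PROC_NULL at the edges.
neighbors = {axis: cart.Shift(axis, 1) for axis in range(3)}

print(f"rank {cart.Get_rank()} at {coords}, x-axis neighbours {neighbors[0]}")
```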

  19. Supercomputing in the Age of Discovering Superearths, Earths and Exoplanet Systems

    Science.gov (United States)

    Jenkins, Jon M.

    2015-01-01

    NASA's Kepler Mission was launched in March 2009 as NASA's first mission capable of finding Earth-size planets orbiting in the habitable zone of Sun-like stars, that range of distances for which liquid water would pool on the surface of a rocky planet. Kepler has discovered over 1000 planets and over 4600 candidates, many of them as small as the Earth. Today, Kepler's amazing success seems to be a fait accompli to those unfamiliar with her history. But twenty years ago, there were no planets known outside our solar system, and few people believed it was possible to detect tiny Earth-size planets orbiting other stars. Motivating NASA to select Kepler for launch required a confluence of the right detector technology, advances in signal processing and algorithms, and the power of supercomputing.

  20. MILC Code Performance on High End CPU and GPU Supercomputer Clusters

    Directory of Open Access Journals (Sweden)

    DeTar Carleton

    2018-01-01

    Full Text Available With recent developments in parallel supercomputing architecture, many-core, multi-core, and GPU processors are now commonplace, resulting in more levels of parallelism, memory hierarchy, and programming complexity. It has been necessary to adapt the MILC code to these new processors, starting with NVIDIA GPUs and, more recently, the Intel Xeon Phi processors. We report on our efforts to port and optimize our code for the Intel Knights Landing architecture. We consider performance of the MILC code with MPI and OpenMP, and optimizations with QOPQDP and QPhiX. For the latter approach, we concentrate on the staggered conjugate gradient and gauge force. We also consider performance on recent NVIDIA GPUs using the QUDA library.

  1. Efficient development of memory bounded geo-applications to scale on modern supercomputers

    Science.gov (United States)

    Räss, Ludovic; Omlin, Samuel; Licul, Aleksandar; Podladchikov, Yuri; Herman, Frédéric

    2016-04-01

    Numerical modeling is a key tool in the geosciences. The current challenge is to solve problems that are multi-physics and for which the length scale and the place of occurrence might not be known in advance. Also, the spatial extent of the investigated domain may vary strongly in size, ranging from millimeters for reactive transport to kilometers for glacier erosion dynamics. An efficient way to proceed is to develop simple but robust algorithms that perform well and scale on modern supercomputers and therefore permit very high-resolution simulations. We propose an efficient approach to solve memory-bounded real-world applications on modern supercomputer architectures. We optimize the software to run on our newly acquired state-of-the-art GPU cluster "octopus". Our approach shows promising preliminary results on important geodynamical and geomechanical problems: we have developed a Stokes solver for glacier flow and a poromechanical solver including complex rheologies for nonlinear waves in stressed porous rocks. We solve the system of partial differential equations on a regular Cartesian grid and use an iterative finite difference scheme with preconditioning of the residuals. The MPI communication happens only locally (point-to-point); this method is known to scale linearly by construction. The "octopus" GPU cluster, which we use for the computations, has been designed to achieve maximal data transfer throughput at minimal hardware cost. It is composed of twenty compute nodes, each hosting four Nvidia Titan X GPU accelerators. These high-density nodes are interconnected with a parallel (dual-rail) FDR InfiniBand network. Our efforts show promising preliminary results for the different physics investigated. The glacier flow solver achieves good accuracy in the relevant benchmarks and the coupled poromechanical solver makes it possible to explain previously unresolvable focused fluid flow as a natural outcome of the porosity setup. In both cases
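
    A minimal single-process sketch of the iterative finite-difference approach described above (pseudo-transient relaxation of the residuals of a Poisson-type equation on a regular Cartesian grid); the grid size, damping factor and right-hand side are assumptions, and the GPU kernels and point-to-point MPI halo exchanges of the real solvers are omitted.

        import numpy as np

        nx, ny, dx = 128, 128, 1.0
        p = np.zeros((nx, ny))                       # unknown field (e.g. pressure)
        rhs = np.zeros((nx, ny)); rhs[nx // 2, ny // 2] = 1.0
        dtau, damping = 0.2 * dx ** 2, 0.9           # pseudo-time step, residual damping
        dpdtau = np.zeros((nx, ny))

        for it in range(20000):
            lap = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
                   - 4.0 * p[1:-1, 1:-1]) / dx ** 2
            residual = lap - rhs[1:-1, 1:-1]
            # Damped residual update as a simple stand-in for residual preconditioning.
            dpdtau[1:-1, 1:-1] = residual + damping * dpdtau[1:-1, 1:-1]
            p[1:-1, 1:-1] += dtau * dpdtau[1:-1, 1:-1]
            if np.max(np.abs(residual)) < 1e-8:
                break

        print("iterations:", it, "max residual:", np.max(np.abs(residual)))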

  2. High Temporal Resolution Mapping of Seismic Noise Sources Using Heterogeneous Supercomputers

    Science.gov (United States)

    Paitz, P.; Gokhberg, A.; Ermert, L. A.; Fichtner, A.

    2017-12-01

    The time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems like earthquake fault zones, volcanoes, geothermal and hydrocarbon reservoirs. We present results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service providing seismic noise source maps for Central Europe with high temporal resolution. We use source imaging methods based on the cross-correlation of seismic noise records from all seismic stations available in the region of interest. The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept to provide the interested researchers worldwide with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for the generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise source mapping itself rests on the measurement of logarithmic amplitude ratios in suitably pre-processed noise correlations, and the use of simplified sensitivity kernels. During the implementation we addressed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service-oriented architecture for coordination of various sub-systems, and
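
    The sketch below illustrates, under strong simplifications, the two measurements named above: cross-correlation of pre-processed noise records from a station pair and the logarithmic ratio of causal to acausal correlation energy, which reflects the asymmetry of the noise-source distribution. The synthetic records, sampling rate and window length are assumptions, not parameters of the actual processing chain.

        import numpy as np
        from scipy.signal import correlate

        fs = 10.0                                       # 10 Hz records, one-hour window
        n = int(fs * 3600)
        rng = np.random.default_rng(0)
        source = rng.standard_normal(n)
        sta_a = source + 0.3 * rng.standard_normal(n)
        sta_b = np.roll(source, 25) + 0.3 * rng.standard_normal(n)   # ~2.5 s travel time

        # Noise cross-correlation function and its lag axis in seconds.
        ccf = correlate(sta_a, sta_b, mode="full") / n
        lags = np.arange(-n + 1, n) / fs

        # Logarithmic amplitude ratio of causal (positive-lag) vs. acausal energy.
        log_ratio = np.log(np.sum(ccf[lags > 0] ** 2) / np.sum(ccf[lags < 0] ** 2))
        print("peak lag [s]:", lags[np.argmax(np.abs(ccf))], "log energy ratio:", log_ratio)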

  3. Fire evolution in the radioactive forests of Ukraine and Belarus: future risks for the population and the environment

    Science.gov (United States)

    N. Evangeliou; Y. Balkanski; A. Cozic; WeiMin Hao; F. Mouillot; K. Thonicke; R. Paugam; S. Zibtsev; T. A. Mousseau; R. Wang; B. Poulter; A. Petkov; C. Yue; P. Cadule; B. Koffi; J. W. Kaiser; A. P. Moller

    2015-01-01

    In this paper, we analyze the current and future status of forests in Ukraine and Belarus that were contaminated after the nuclear disaster in 1986. Using several models, together with remote-sensing data and observations, we studied how climate change in these forests may affect fire regimes. We investigated the possibility of 137Cs displacement over Europe...

  4. The Future Security Environment: Why the U.S. Army Must Differentiate and Grow Millennial Officer Talent

    Science.gov (United States)

    2015-09-01

    USAWC students is available to Army and Department of Defense leaders, the Strategic Studies Institute publishes selected papers in its “Carlisle Papers”...a security environment that now, more than ever, demands innovation, entrepreneurship, and adaptive leadership in an economy that struggles to

  5. Carry-over effects of the social environment on future divorce probability in a wild bird population

    NARCIS (Netherlands)

    Culina, Antica; Hinde, Camilla; Sheldon, B.C.

    2015-01-01

    Initial mate choice and re-mating strategies (infidelity and divorce) influence individual fitness. Both of these should be influenced by the social environment, which determines the number and availability of potential partners. While most studies looking at this relationship take a

  6. Healthy and sustainable diets: Community concern about the effect of the future food environments and support for government regulating sustainable food supplies in Western Australia.

    Science.gov (United States)

    Harray, Amelia J; Meng, Xingqiong; Kerr, Deborah A; Pollard, Christina M

    2018-02-03

    To determine the level of community concern about future food supplies and the perceived importance of government regulation over the supply of environmentally friendly food, and to identify dietary and other factors associated with these beliefs in Western Australia. Data from the 2009 and 2012 Nutrition Monitoring Survey Series computer-assisted telephone interviews were pooled. Level of concern about the effect of the environment on future food supplies and importance of government regulating the supply of environmentally friendly food were measured. Multivariate regression analysed potential associations with sociodemographic variables, dietary health consciousness, weight status and self-reported intake of eight foods consistent with a sustainable diet. Western Australia. Community-dwelling adults aged 18-64 years (n = 2832). Seventy-nine per cent of Western Australians were 'quite' or 'very' concerned about the effect of the environment on future food supplies. Respondents who paid less attention to the health aspects of their diet were less likely than those who were health conscious to be 'quite' or 'very' concerned (OR = 0.53, 95% CI [0.35, 0.8] and OR = 0.38, 95% CI [0.17, 0.81], respectively). The majority of respondents (85.3%) thought it was 'quite' or 'very' important that government had regulatory control over an environmentally friendly food supply. Females were more likely than males to rate regulatory control as 'quite' or 'very' important (OR = 1.63, 95% CI [1.09, 2.44], p = .02). Multiple regression modeling found that no other factors predicted concern or importance. There is a high level of community concern about the impact of the environment on future food supplies and most people believe it is important that the government regulates the issue. These attitudes dominate regardless of sociodemographic characteristics, weight status or sustainable dietary behaviours. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Focal relationships and the environment of project marketing. A literature review with suggestions for practitioners and future research

    DEFF Research Database (Denmark)

    Skaates, Maria Anne; Tikkanen, Henrikki

    2000-01-01

    Project marketing is an important mode of business-to-business marketing today. This paper assesses recent project marketing contributions, including predominantly those of members of the (mainly European) International Network for Project Marketing and Systems Selling (INPM). The emphasis...... of the review is upon the connection between focal relationships and the wider environment in which project marketing and systems selling takes place. First, several common definitions of projects and project marketing are presented and discussed. Second, the implications of three specific features of project...... business - discontinuity, uniqueness, and complexity - for the focal relationship and the broader marketing environment are considered at the level of multiple projects. Third, three overlapping types of postures that project-selling firms can adopt in relation to their focal relationships...

  8. Ignalina NPP its environment, safety and future, prospects of the energetic, ethnic and cultural situation: expert evaluation

    International Nuclear Information System (INIS)

    Morkunas, Z. V.; Ciuzas, A.; Jonaitis, V.; Sutiniene, I.

    1995-01-01

    According to the tasks defined in the 'Atomic Energy and the Environment' program, an expert evaluation survey concerning the Ignalina NPP, its consequences and its prospects was carried out for the first time in Lithuania, following the concept that had been prepared. The results of the survey analysis, performed by Lithuanian experts, are presented. The investigation covered the following problems: evolution of the technical state, safety, use and prospects of the nuclear power plant; evaluation of the activities of governmental and social institutions in connection with the nuclear power plant; Ignalina NPP and the environment; the effect of the nuclear power plant on agricultural activities and development; evolution of the ethnic and cultural situation; and conclusions and recommendations for regulation of those areas. (author). 2 refs., 11 figs

  9. State of the environment reporting (SOER) and the policy process in South Africa: Learning for the future

    CSIR Research Space (South Africa)

    Will, C

    2006-01-01

    Full Text Available State of the Environment Report 2005 (Year One). Department of Environmental Affairs and Development Planning, Provincial Government of the Western Cape. Dryzek, J. S (1997) The Politics of the Earth: Environmental Discourses. Oxford University Press...) The Politics of Environmental Discourse. Ecological Modernisation and the Policy Process. Oxford University Press, Oxford. Hajer, M. A and Wagenaar, H. (eds) (2004) Deliberative Policy Analysis: Understanding Governance in the Network Society, Cambridge...

  10. PSYCHOLOGICAL STRATEGY OF COOPERATION, MOTIVATIONAL, INFORMATION AND TECHNOLOGICAL COMPONENTS OF FUTURE HUMANITARIAN TEACHER READINESS FOR PROFESSIONAL ACTIVITY IN POLYSUBJECTIVE LEARNING ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Y. Spivakovska

    2014-04-01

    Full Text Available The redefinition of modern information and communication technologies (ICT) from teaching aids into subjects of the teaching process, and the continuous growth of their subjectivity, demand appropriate knowledge and skills, an appropriate attitude to the didactic capabilities of ICT, and the ability to cooperate with them and to build pupils' learning activity aimed at the formation and development of self-organization and self-development skills and at promoting pupils' subjective position in getting an education; together these constitute the readiness of the modern teacher to organize effective professional activity in a polysubjective learning environment (PLE). The new tasks of the humanitarian teacher, related to the selection and design of educational content as well as to the modeling of the learning process under conditions of choice among virtualized PLE alternatives, impose special requirements on professionally important personality qualities of the teacher, or rather on his or her readiness to carry out effective professional work in such conditions. In this article the essence of the concept of the future humanitarian teacher's readiness for professional activity in a polysubjective educational environment is substantiated. The structure of this readiness is analyzed. The psychological strategy of cooperation and the reflective, motivational and informational components are substantiated and characterized as components of the future humanitarian teacher's readiness for professional activities in a polysubjective educational environment.

  11. High temporal resolution mapping of seismic noise sources using heterogeneous supercomputers

    Science.gov (United States)

    Gokhberg, Alexey; Ermert, Laura; Paitz, Patrick; Fichtner, Andreas

    2017-04-01

    Time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems. Significant interest in seismic noise source maps with high temporal resolution (days) is expected to come from a number of domains, including natural resources exploration, analysis of active earthquake fault zones and volcanoes, as well as geothermal and hydrocarbon reservoir monitoring. Currently, knowledge of noise sources is insufficient for high-resolution subsurface monitoring applications. Near-real-time seismic data, as well as advanced imaging methods to constrain seismic noise sources, have recently become available. These methods are based on the massive cross-correlation of seismic noise records from all available seismic stations in the region of interest and are therefore very computationally intensive. Heterogeneous massively parallel supercomputing systems introduced in recent years combine conventional multi-core CPUs with GPU accelerators and provide an opportunity for a manifold increase in computing performance. Therefore, these systems represent an efficient platform for implementation of a noise source mapping solution. We present the first results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service that provides seismic noise source maps for Central Europe with high temporal resolution (days to few weeks depending on frequency and data availability). The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept in order to provide interested external researchers with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for

  12. Developing a General Decision Tool for Future Cancer Care: Getting Feedback from Users in Busy Hospital Environments

    DEFF Research Database (Denmark)

    Dankl, Kathrina; Akoglu, Canan; Dahl Steffensen, Karina

    2017-01-01

    Background and Aims: During the last decade more and more design researchers and practitioners have been collaborating with clinicians, patients and relatives in order to improve healthcare systems. Shared decision making has thereby been one field, attracting increased attention. Hospitals pose...... Hospital in Vejle and Design School Kolding in Denmark with the main objective of creating a general decision aid for future cancer care. The aims of the collaborative design development are twofold: to enhance clarity and understanding of the decision aid via communication design and to decrease barriers...... for successful implementation via extensive involvement of patients, relatives and clinicians in the design process. The present abstract puts forward a participatory method of getting feedback from different stakeholder groups on the illustration design of the decision aid. Methods: The decision aid...

  13. DESIGNING OF ARCHITECTURE OF CLOUD-ORIENTED INFORMATION-EDUCATIONAL ENVIRONMENT TO PREPARE FUTURE IT-PROFESSIONALS

    Directory of Open Access Journals (Sweden)

    Olena H. Glazunova

    2014-12-01

    Full Text Available In the article the author substantiates the architecture of an information-educational environment for a modern university, built on the basis of cloud technologies. A number of software and technology solutions based on virtualization, clustering and management of virtual resources, which can be implemented on the institution's own infrastructure, are proposed. A model for the provision of educational services to students of IT specialties is substantiated, which gives students access to the learning resources of the environment: e-learning courses, resources of the institutional repository, digital library, video portal and wiki portal, as well as a virtual desktop with the required set of software packages for laboratory and project work, all through a single account in the e-learning system. A scheme of student access to the virtual learning resources, including a virtual desktop accessed directly through the web interface and by reference from the resources for laboratory work in e-learning courses, is proposed.

  14. The future implications of some long-lived fission product nuclides discharged to the environment in fuel reprocessing wastes

    International Nuclear Information System (INIS)

    Bryant, P.M.; Jones, J.A.

    1972-12-01

    Current reprocessing practice leads to the discharge to the environment of virtually all the krypton-85 and tritium, and a large fraction of the iodine-129, formed as fission products in reactor fuel. As nuclear power programmes expand the global inventory of these long-lived nuclides is increasing. The radiological significance of these discharges is assessed in terms of radiation exposure of various population groups during the next few decades. The results of this assessment show that krypton-85 will give higher dose rates than tritium or iodine-129, but that on conventional radiological protection criteria these do not justify taking action to remove krypton-85 from reprocessing plant effluents before the 21st century. (author)

  15. Radio-Isotopes Section, radiation Safety Division, Ministry Of The Environment, Israel: A General Review, And Future Developments

    International Nuclear Information System (INIS)

    Ben-Zion, S.

    1999-01-01

    The Radio-Isotopes Section of the Ministry of the Environment is responsible for preventing environmental hazards from radio-isotopes 'from cradle to grave'. The management and supervision of radioactive materials covers about 350 institutes in Israel. We deal with the implementation and enforcement of the environmental regulations and safety standards, and with licensing for each institution and installation. Among our tasks are the following: follow-up of the import, transportation and distribution, usage, storage and disposal of radio-isotopes, as well as legislation, risk assessments, inspection and education. We also participate in committees/working groups discussing specific topics: radioactive stores, low RW disposal, Y2K, GIS, charging of penalties, transportation and more

  16. Norfolk, Virginia—Planning to be the Coastal Community of the Future in a rising water environment

    Science.gov (United States)

    Homewood, G. M.

    2017-12-01

    Norfolk VA is the second most at-risk population center in North America from sea level rise while also being home to the world's largest naval base and one of the 3 largest east coast ports. Norfolk is one of the original cohort of cities in the 100 Resilient Cities effort pioneered by the Rockefeller Foundation and has changed its sea level adaptation strategy from "keep the water out" to "living with water" through a ground-breaking community visioning process. In Norfolk, this means, among other goals, finding co-benefits in public and private investments and interventions—these can be environmental, economic, social, recreational or other things we have not yet thought about—and it is in this area that the geosciences can benefit Norfolk's planning for a rising water environment.

  17. The heat is on: Australia's greenhouse future. Report to the Senate Environment, Communications, Information Technology and the Arts References Committee

    International Nuclear Information System (INIS)

    2000-11-01

    On 11 August 1999, the Senate referred matters pertaining to global warming to the Environment, Communications, Information Technology and the Arts References Committee for inquiry. The Committee reports on the progress and adequacy of Australian policies to reduce global warming, in light of Australia's commitments under the Framework Convention on Climate Change. It also critically evaluates the effectiveness of Australian Government programs and policies, both state and Federal, in particular those aiming to provide for the development of emerging renewable energies, energy efficiency industries and the more efficient use of energy sources, and the extent to which the Government's relations with industry under the Greenhouse Challenge Program are accountable and transparent. The projected effects of climate change on Australia's ecosystems and the potential introduction of a national system of emissions trading within Australia are also examined

  18. Environmental radiation monitoring during 2013 and 2014 in the environment of the future site of the Spanish centralized temporary storage facility (ATC)

    International Nuclear Information System (INIS)

    Pujol, L.; Perez Zabaleta, E.; Pablo, M. A. de; Rodrguez Arevalo, J.; Nieva, A.

    2014-01-01

    During 2013 and 2014, samples of representative surface and ground waters were taken near the quality control points currently available at the ATC site, in aquifers of the Guadiana River Basin. CEDEX determined the following radiological parameters: gross alpha activity index, gross beta, residual beta, tritium activity concentration and gamma spectrometry. Specific activities of some alpha emitters (radium and uranium isotopes) were also determined. The average values of the gross alpha, gross beta and residual beta activity indices show an increase in waters downstream of the Zancara river, in the immediate environment of the facility. The gross alpha activity index is consistent with the total amount of alpha emitters determined, and the principal contribution is due to uranium isotopes, probably contributed by leaching of the geological materials of the area. (Author)

  19. Onset and stability of gas hydrates under permafrost in an environment of surface climatic change : past and future

    International Nuclear Information System (INIS)

    Majorowicz, J.A.; Osadetz, K.; Safanda, J.

    2008-01-01

    This paper presented a model designed to simulate permafrost and gas hydrate formation in a changing surface temperature environment in the Beaufort-Mackenzie Basin (BMB). The numerical model simulated surface forcing due to general cooling trends that began in the late Miocene era. This study modelled the onset of permafrost formation and subsequent gas hydrate formation in the changing surface temperature environment for the BMB. Paleoclimatic data were used. The 1-D model was constrained by deep heat flow from well bottom hole temperatures; conductivity; permafrost thickness; and the thickness of the gas hydrates. The model used latent heat effects for the ice-bearing permafrost and hydrate intervals. Surface temperatures for glacial and interglacial histories for the last 14 million years were considered. The model also used a detailed Holocene temperature history as well as a scenario in which atmospheric carbon dioxide (CO2) levels were twice as high as current levels. Two scenarios were considered: (1) the formation of gas hydrates from gas entrapped under geological seals; and (2) the formation of gas hydrates from gas located in free pore spaces simultaneously with permafrost formation. Results of the study showed that gas hydrates may have formed at a depth of 0.9 km only 1 million years ago. Results of the other modelling scenarios suggested that the hydrates formed 6 million years ago, when temperature changes caused the gas hydrate layer to expand both downward and upward. Detailed models of more recent glacial and interglacial histories showed that the gas hydrate zones will persist under the thick body of the BMB permafrost through current interglacial warming as well as in scenarios where atmospheric CO2 is doubled. 28 refs., 13 figs

  20. The company's mainframes join CERN's openlab for DataGrid apps and are pivotal in a new $22 million Supercomputer in the U.K.

    CERN Multimedia

    2002-01-01

    Hewlett-Packard has installed a supercomputer system valued at more than $22 million at the Wellcome Trust Sanger Institute (WTSI) in the U.K. HP has also joined the CERN openlab for DataGrid applications (1 page).

  1. Research center Juelich to install Germany's most powerful supercomputer new IBM System for science and research will achieve 5.8 trillion computations per second

    CERN Multimedia

    2002-01-01

    "The Research Center Juelich, Germany, and IBM today announced that they have signed a contract for the delivery and installation of a new IBM supercomputer at the Central Institute for Applied Mathematics" (1/2 page).

  2. MODELS AND ALGORITHMS FOR COMPUTER-AIDED DESIGN OF TECHNOLOGICAL PROCESS OF CASTINGS PRODUCTION, DIRECTED FOR SUPERCOMPUTER SKIF

    Directory of Open Access Journals (Sweden)

    A. N. Chichko

    2009-01-01

    Full Text Available A schema for the development of the technological process of castings production on the SKIF supercomputer, which allows the time spent on designing gating systems to be reduced, is offered. A model for appraisal of the variants of the castings production technology is developed, which allows the modeled technological processes to be ranked and the optimal regimes and gating system to be chosen.

  3. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples and Possible Futures

    Directory of Open Access Journals (Sweden)

    Adam John Rock

    2016-03-01

    Full Text Available Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal, Ginsburg, & Schau, 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof, Ceroni, Jeong, & Moghaddam, 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology.

  4. 369 TFlop/s molecular dynamics simulations on the Roadrunner general-purpose heterogeneous supercomputer

    Energy Technology Data Exchange (ETDEWEB)

    Swaminarayan, Sriram [Los Alamos National Laboratory; Germann, Timothy C [Los Alamos National Laboratory; Kadau, Kai [Los Alamos National Laboratory; Fossum, Gordon C [IBM CORPORATION

    2008-01-01

    The authors present timing and performance numbers for a short-range parallel molecular dynamics (MD) code, SPaSM, that has been rewritten for the heterogeneous Roadrunner supercomputer. Each Roadrunner compute node consists of two AMD Opteron dual-core microprocessors and four PowerXCell 8i enhanced Cell microprocessors, so that there are four MPI ranks per node, each with one Opteron and one Cell. The interatomic forces are computed on the Cells (each with one PPU and eight SPU cores), while the Opterons are used to direct inter-rank communication and perform I/O-heavy periodic analysis, visualization, and checkpointing tasks. The performance measured for our initial implementation of a standard Lennard-Jones pair potential benchmark reached a peak of 369 Tflop/s double-precision floating-point performance on the full Roadrunner system (27.7% of peak), corresponding to 124 MFlop/Watt/s at a price of approximately 3.69 MFlops/dollar. They demonstrate an initial target application, the jetting and ejection of material from a shocked surface.
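
    A quick check of the arithmetic implied by the figures quoted above: a sustained 369 Tflop/s at 27.7% of peak corresponds to a theoretical peak of roughly 1.3 Pflop/s for the full Roadrunner system (the machine's exact peak is not restated here).

        sustained_tflops = 369.0
        fraction_of_peak = 0.277
        print("implied peak, Pflop/s:", round(sustained_tflops / fraction_of_peak / 1000.0, 2))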

  5. A Parallel Supercomputer Implementation of a Biological Inspired Neural Network and its use for Pattern Recognition

    International Nuclear Information System (INIS)

    De Ladurantaye, Vincent; Lavoie, Jean; Bergeron, Jocelyn; Parenteau, Maxime; Lu Huizhong; Pichevar, Ramin; Rouat, Jean

    2012-01-01

    A parallel implementation of a large spiking neural network is proposed and evaluated. The neural network implements the binding by synchrony process using the Oscillatory Dynamic Link Matcher (ODLM). Scalability, speed and performance are compared for 2 implementations: Message Passing Interface (MPI) and Compute Unified Device Architecture (CUDA) running on clusters of multicore supercomputers and NVIDIA graphical processing units respectively. A global spiking list that represents at each instant the state of the neural network is described. This list indexes each neuron that fires during the current simulation time so that the influence of their spikes is simultaneously processed on all computing units. Our implementation shows a good scalability for very large networks. A complex and large spiking neural network has been implemented in parallel with success, thus paving the road towards real-life applications based on networks of spiking neurons. MPI offers a better scalability than CUDA, while the CUDA implementation on a GeForce GTX 285 gives the best cost to performance ratio. When running the neural network on the GTX 285, the processing speed is comparable to the MPI implementation on RQCHP's Mammouth parallel machine with 64 nodes (128 cores).
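
    A toy sketch of the "global spiking list" idea described above: at each time step the indices of neurons that fired are gathered once, and only those spikes are propagated to the rest of the network in a single vectorized operation. The network size, random weights and leaky integrate-and-fire dynamics are illustrative assumptions and do not reproduce the ODLM.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 1000
        weights = rng.uniform(0.0, 0.05, size=(n, n))            # dense synaptic weights
        potential = rng.uniform(0.0, 1.0, size=n)
        threshold, leak, drive = 1.0, 0.98, 0.03

        for step in range(200):
            spiking_list = np.flatnonzero(potential >= threshold)   # the global spike list
            potential[spiking_list] = 0.0                           # reset fired neurons
            # All listed spikes influence every neuron at once.
            potential += weights[:, spiking_list].sum(axis=1)
            potential = potential * leak + drive                    # leak plus constant drive

        print("neurons firing in the last step:", len(spiking_list))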

  6. Parallel supercomputing: Advanced methods, algorithms, and software for large-scale linear and nonlinear problems

    Energy Technology Data Exchange (ETDEWEB)

    Carey, G.F.; Young, D.M.

    1993-12-31

    The program outlined here is directed to research on methods, algorithms, and software for distributed parallel supercomputers. Of particular interest are finite element methods and finite difference methods together with sparse iterative solution schemes for scientific and engineering computations of very large-scale systems. Both linear and nonlinear problems will be investigated. In the nonlinear case, applications with bifurcation to multiple solutions will be considered using continuation strategies. The parallelizable numerical methods of particular interest are a family of partitioning schemes embracing domain decomposition, element-by-element strategies, and multi-level techniques. The methods will be further developed incorporating parallel iterative solution algorithms with associated preconditioners in parallel computer software. The schemes will be implemented on distributed memory parallel architectures such as the CRAY MPP, Intel Paragon, the NCUBE3, and the Connection Machine. We will also consider other new architectures such as the Kendall-Square (KSQ) and proposed machines such as the TERA. The applications will focus on large-scale three-dimensional nonlinear flow and reservoir problems with strong convective transport contributions. These are legitimate grand challenge class computational fluid dynamics (CFD) problems of significant practical interest to DOE. The methods developed and algorithms will, however, be of wider interest.

  7. PFLOTRAN: Reactive Flow & Transport Code for Use on Laptops to Leadership-Class Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Hammond, Glenn E.; Lichtner, Peter C.; Lu, Chuan; Mills, Richard T.

    2012-04-18

    PFLOTRAN, a next-generation reactive flow and transport code for modeling subsurface processes, has been designed from the ground up to run efficiently on machines ranging from leadership-class supercomputers to laptops. Based on an object-oriented design, the code is easily extensible to incorporate additional processes. It can interface seamlessly with Fortran 9X, C and C++ codes. Domain decomposition parallelism is employed, with the PETSc parallel framework used to manage parallel solvers, data structures and communication. Features of the code include a modular input file, implementation of high-performance I/O using parallel HDF5, ability to perform multiple realization simulations with multiple processors per realization in a seamless manner, and multiple modes for multiphase flow and multicomponent geochemical transport. Chemical reactions currently implemented in the code include homogeneous aqueous complexing reactions and heterogeneous mineral precipitation/dissolution, ion exchange, surface complexation and a multirate kinetic sorption model. PFLOTRAN has demonstrated petascale performance using 2^17 processor cores with over 2 billion degrees of freedom. Accomplishments achieved to date include applications to the Hanford 300 Area and modeling CO2 sequestration in deep geologic formations.

  8. The BlueGene/L Supercomputer and Quantum ChromoDynamics

    International Nuclear Information System (INIS)

    Vranas, P; Soltz, R

    2006-01-01

    In summary our update contains: (1) Perfect speedup sustaining 19.3% of peak for the Wilson D D-slash Dirac operator. (2) Measurements of the full Conjugate Gradient (CG) inverter that inverts the Dirac operator. The CG inverter contains two global sums over the entire machine. Nevertheless, our measurements retain perfect speedup scaling demonstrating the robustness of our methods. (3) We ran on the largest BG/L system, the LLNL 64 rack BG/L supercomputer, and obtained a sustained speed of 59.1 TFlops. Furthermore, the speedup scaling of the Dirac operator and of the CG inverter are perfect all the way up to the full size of the machine, 131,072 cores (please see Figure II). The local lattice is rather small (4 x 4 x 4 x 16) while the total lattice has been a lattice QCD vision for thermodynamic studies (a total of 128 x 128 x 256 x 32 lattice sites). This speed is about five times larger compared to the speed we quoted in our submission. As we have pointed out in our paper QCD is notoriously sensitive to network and memory latencies, has a relatively high communication to computation ratio which can not be overlapped in BGL in virtual node mode, and as an application is in a class of its own. The above results are thrilling to us and a 30 year long dream for lattice QCD

  9. Adapting NBODY4 with a GRAPE-6a Supercomputer for Web Access, Using NBodyLab

    Science.gov (United States)

    Johnson, V.; Aarseth, S.

    2006-07-01

    A demonstration site has been developed by the authors that enables researchers and students to experiment with the capabilities and performance of NBODY4 running on a GRAPE-6a over the web. NBODY4 is a sophisticated open-source N-body code for high accuracy simulations of dense stellar systems (Aarseth 2003). In 2004, NBODY4 was successfully tested with a GRAPE-6a, yielding an unprecedentedly low-cost tool for astrophysical research. The GRAPE-6a is a supercomputer card developed by astrophysicists to accelerate high accuracy N-body simulations with a cluster or a desktop PC (Fukushige et al. 2005, Makino & Taiji 1998). The GRAPE-6a card became commercially available in 2004, runs at 125 Gflops peak, has a standard PCI interface and costs less than $10,000. Researchers running the widely used NBODY6 (which does not require GRAPE hardware) can compare their own PC or laptop performance with simulations run on http://www.NbodyLab.org. Such comparisons may help justify acquisition of a GRAPE-6a. For workgroups such as university physics or astronomy departments, the demonstration site may be replicated or serve as a model for a shared computing resource. The site was constructed using an NBodyLab server-side framework.

  10. Assessment techniques for a learning-centered curriculum: evaluation design for adventures in supercomputing

    Energy Technology Data Exchange (ETDEWEB)

    Helland, B. [Ames Lab., IA (United States); Summers, B.G. [Oak Ridge National Lab., TN (United States)

    1996-09-01

    As the classroom paradigm shifts from being teacher-centered to being learner-centered, student assessments are evolving from typical paper and pencil testing to other methods of evaluation. Students should be probed for understanding, reasoning, and critical thinking abilities rather than their ability to return memorized facts. The assessment of the Department of Energy's pilot program, Adventures in Supercomputing (AiS), offers one example of assessment techniques developed for learner-centered curricula. This assessment has employed a variety of methods to collect student data. Methods of assessment used were traditional testing, performance testing, interviews, short questionnaires via email, and student presentations of projects. The data obtained from these sources have been analyzed by a professional assessment team at the Center for Children and Technology. The results have been used to improve the AiS curriculum and establish the quality of the overall AiS program. This paper will discuss the various methods of assessment used and the results.

  11. Modeling radiative transport in ICF plasmas on an IBM SP2 supercomputer

    International Nuclear Information System (INIS)

    Johansen, J.A.; MacFarlane, J.J.; Moses, G.A.

    1995-01-01

    At the University of Wisconsin-Madison the authors have integrated a collisional-radiative-equilibrium model into their CONRAD radiation-hydrodynamics code. This integrated package allows them to accurately simulate the transport processes involved in ICF plasmas; including the important effects of self-absorption of line-radiation. However, as they increase the amount of atomic structure utilized in their transport models, the computational demands increase nonlinearly. In an attempt to meet this increased computational demand, they have recently embarked on a mission to parallelize the CONRAD program. The parallel CONRAD development is being performed on an IBM SP2 supercomputer. The parallelism is based on a message passing paradigm, and is being implemented using PVM. At the present time they have determined that approximately 70% of the sequential program can be executed in parallel. Accordingly, they expect that the parallel version will yield a speedup on the order of three times that of the sequential version. This translates into only 10 hours of execution time for the parallel version, whereas the sequential version required 30 hours
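
    The expected factor-of-three speedup follows from Amdahl's law: if a fraction p of the run time can be executed in parallel on N processors, the speedup is 1 / ((1 - p) + p / N), and with p = 0.7 the bound approaches 1 / 0.3, or about 3.3. The processor counts below are arbitrary examples, not the SP2 configuration used.

        def amdahl_speedup(p, n):
            return 1.0 / ((1.0 - p) + p / n)

        for n in (4, 8, 16, 10**6):
            print(f"N = {n:>7}: speedup = {amdahl_speedup(0.7, n):.2f}")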

  12. Benchmarking Further Single Board Computers for Building a Mini Supercomputer for Simulation of Telecommunication Systems

    Directory of Open Access Journals (Sweden)

    Gábor Lencse

    2016-01-01

    Full Text Available Parallel Discrete Event Simulation (PDES with the conservative synchronization method can be efficiently used for the performance analysis of telecommunication systems because of their good lookahead properties. For PDES, a cost effective execution platform may be built by using single board computers (SBCs, which offer relatively high computation capacity compared to their price or power consumption and especially to the space they take up. A benchmarking method is proposed and its operation is demonstrated by benchmarking ten different SBCs, namely Banana Pi, Beaglebone Black, Cubieboard2, Odroid-C1+, Odroid-U3+, Odroid-XU3 Lite, Orange Pi Plus, Radxa Rock Lite, Raspberry Pi Model B+, and Raspberry Pi 2 Model B+. Their benchmarking results are compared to find out which one should be used for building a mini supercomputer for parallel discrete-event simulation of telecommunication systems. The SBCs are also used to build a heterogeneous cluster and the performance of the cluster is tested, too.

  13. Investigation on future perspective of nuclear power generation. Countermeasures to global environment problems and role of stable energy supply

    International Nuclear Information System (INIS)

    Sikami, Yutaka

    1995-01-01

    This investigation is concerned with long-term energy demand and supply in the world; it was carried out by the Institute of Energy Economics, Japan for the purpose of contributing to the deliberations of the Atomic Energy Commission of Japan. This perspective on demand and supply takes the ultra-long period up to 2100 as its object, and includes two points: the newest information on energy resources and the greenhouse-effect problem due to carbon dioxide. The model used for the simulation was a modified Edmonds-Reilly model. Energy consumption was estimated from per-capita consumption and the population classified into nine districts. The assumed conditions for energy demand and supply are explained. The simulation of energy demand and supply was carried out for a basic case in which the present state continues, a carbon dioxide restriction case, and a restriction plus plutonium utilization case. The results of the simulation on energy demand and supply, the effect on the environment and the problems of resources are reported. Energy consumption in the world will continue to increase, centering on developing countries, and in 2100 a primary energy supply more than three times that of 1990 becomes necessary. Unless the release of carbon dioxide is restricted, the resolution of environmental problems becomes difficult. Nuclear power generation is affected by uranium resource depletion around 2100, and early countermeasures are necessary. (K.I.)

  14. The Right Amount of Glue: Technologies and Standards Relevant to a Future Solar-Terrestrial Data Environment

    Science.gov (United States)

    Gurman, J. B.; Dimitoglou, G.; Bogart, R.; Tian, K. Q.; Hill, F.; Wampler, S.; Martens, P. C.; Davey, A. R.

    2002-01-01

    In order to meet the challenge of developing a new system science, we will need to employ technology that enables researchers to access data from fields with which they are at least initially unfamiliar as well as from sources they use more regularly. At the same time, the quantity of data to be obtained by missions such as the Solar Dynamics Observatory demands ease and simplicity of data access. These competing demands must in turn fit within severely constrained funding for data analysis in such projects. Based on experience in only a single discipline but with a diversity of data types and sources, we will give examples of technology that have made a significant difference in the way people do science. Similarly, we will show how adoption of a well-documented data format has made it easier for one community to search, reduce, and analyze data. We will also describe a community-supported data reduction and analysis software tree with useful features. We will attempt to generalize the lessons learned in these instances to features the broader, solar-terrestrial community might find compelling, while avoiding overdesign of a common data environment.

  15. Mineral formation on metallic copper in a 'Future repository site environment': Textural considerations based on natural analogs

    Energy Technology Data Exchange (ETDEWEB)

    Amcoff, Oe. [Uppsala Univ. (Sweden). Inst. of Earth Sciences

    1998-01-01

    Copper mineral formation in the Swedish 'repository site environment' is discussed. Special attention is given to ore mineral textures (=the spatial relation among minerals), with examples given from nature. It is concluded: By analogy with observations from natural occurrences, an initial coating of Cu-oxide on the canister surface (because of entrapped air during construction) will probably not hinder a later sulphidation process. Early formation of Cu-sulphides on the canister surface may be accompanied by formation of CuFe-sulphides. The latter phase(s) may form through replacement of the Cu-sulphides or, alternatively, by means of reaction between dissolved copper and fine-grained iron sulphide (pyrite) in the surrounding bentonite. Should for some reason the bentonite barrier fail and the conditions become strongly oxidizing, we can expect crustifications and rhythmic growths of Cu(II)-phases, like malachite (Cu2(OH)2CO3). A presence of Fe2+ in the clay minerals making up the bentonite might prove to have an adverse effect on the canister stability, since, in this case, the bentonite might be expected to act as a sink for dissolved copper. The mode of mineral growth along the copper-bentonite interface remains an open question.

  16. The Future of Futures

    DEFF Research Database (Denmark)

    Frankel, Christian; Ossandón, José

    2013-01-01

    Review of Elena Esposito: The Future of Futures. The Time of Money in Financing and Society. Cheltenham: Edward Elgar, 2011.

  17. Personality, perceived environment, and behavior systems related to future smoking intentions among youths: an application of problem-behavior theory in Shanghai, China.

    Science.gov (United States)

    Cai, Yong; Li, Rui; Zhu, Jingfen; Na, Li; He, Yaping; Redmon, Pam; Qiao, Yun; Ma, Jin

    2015-01-01

    Smoking among youths is a worldwide problem, particularly in China. Many endogenous and environmental factors influence smokers' intentions to smoke; therefore, a comprehensive model is needed to understand the significance and relationship of predictors. This study aimed to develop a prediction model based on problem-behavior theory (PBT) to interpret intentions to smoke among Chinese youths. We conducted a cross-sectional study of 26,675 adolescents from junior, senior, and vocational high schools in Shanghai, China. Data on smoking status, smoking knowledge, attitude toward smoking, parents' and peers' smoking, and media exposure to smoking were collected from students. A structural equation model was used to assess the developed prediction model. The experimental smoking rate and current smoking rate among the students were 11.0% and 3%, respectively. Our constructed model showed an acceptable fit to the data (comparative fit index = 0.987, root-mean-square error of approximation = 0.034). Intention to smoke was predicted by perceived environment (β = 0.455, P < 0.05) which consisted of acceptance of tobacco use (β = 0.668, P < 0.001) and academic performance (β = 0.171, P < 0.001). The PBT-based model we developed provides a good understanding of the predictors of intentions to smoke and it suggests future interventions among youths should focus on components in perceived environment and behavior systems, and take into account the moderating effects of the personality system.

  18. HeNCE: A Heterogeneous Network Computing Environment

    Directory of Open Access Journals (Sweden)

    Adam Beguelin

    1994-01-01

    Full Text Available Network computing seeks to utilize the aggregate resources of many networked computers to solve a single problem. In so doing it is often possible to obtain supercomputer performance from an inexpensive local area network. The drawback is that network computing is complicated and error prone when done by hand, especially if the computers have different operating systems and data formats and are thus heterogeneous. The heterogeneous network computing environment (HeNCE is an integrated graphical environment for creating and running parallel programs over a heterogeneous collection of computers. It is built on a lower level package called parallel virtual machine (PVM. The HeNCE philosophy of parallel programming is to have the programmer graphically specify the parallelism of a computation and to automate, as much as possible, the tasks of writing, compiling, executing, debugging, and tracing the network computation. Key to HeNCE is a graphical language based on directed graphs that describe the parallelism and data dependencies of an application. Nodes in the graphs represent conventional Fortran or C subroutines and the arcs represent data and control flow. This article describes the present state of HeNCE, its capabilities, limitations, and areas of future research.

  19. Scalable geocomputation: evolving an environmental model building platform from single-core to supercomputers

    Science.gov (United States)

    Schmitz, Oliver; de Jong, Kor; Karssenberg, Derek

    2017-04-01

    There is an increasing demand to run environmental models on a big scale: simulations over large areas at high resolution. The heterogeneity of available computing hardware such as multi-core CPUs, GPUs or supercomputer potentially provides significant computing power to fulfil this demand. However, this requires detailed knowledge of the underlying hardware, parallel algorithm design and the implementation thereof in an efficient system programming language. Domain scientists such as hydrologists or ecologists often lack this specific software engineering knowledge, their emphasis is (and should be) on exploratory building and analysis of simulation models. As a result, models constructed by domain specialists mostly do not take full advantage of the available hardware. A promising solution is to separate the model building activity from software engineering by offering domain specialists a model building framework with pre-programmed building blocks that they combine to construct a model. The model building framework, consequently, needs to have built-in capabilities to make full usage of the available hardware. Developing such a framework providing understandable code for domain scientists and being runtime efficient at the same time poses several challenges on developers of such a framework. For example, optimisations can be performed on individual operations or the whole model, or tasks need to be generated for a well-balanced execution without explicitly knowing the complexity of the domain problem provided by the modeller. Ideally, a modelling framework supports the optimal use of available hardware whichsoever combination of model building blocks scientists use. We demonstrate our ongoing work on developing parallel algorithms for spatio-temporal modelling and demonstrate 1) PCRaster, an environmental software framework (http://www.pcraster.eu) providing spatio-temporal model building blocks and 2) parallelisation of about 50 of these building blocks using

  20. Simulation of x-rays in refractive structure by the Monte Carlo method using the supercomputer SKIF

    International Nuclear Information System (INIS)

    Yaskevich, Yu.R.; Kravchenko, O.I.; Soroka, I.I.; Chembrovskij, A.G.; Kolesnik, A.S.; Serikova, N.V.; Petrov, P.V.; Kol'chevskij, N.N.

    2013-01-01

    Software 'Xray-SKIF' for the simulation of X-rays in refractive structures by the Monte Carlo method using the supercomputer SKIF BSU is developed. The program generates a large number of rays propagating from a source to the refractive structure. The ray trajectory is calculated under the assumption of geometrical optics. Absorption is calculated for each ray inside the refractive structure. Dynamic arrays are used for the results of the ray-parameter calculations; they allow the X-ray field distributions at different detector positions to be restored very quickly. It was found that increasing the number of processors leads to a proportional decrease of the calculation time: simulating 10^8 X-rays on the supercomputer with 1 and 30 processors took 3 hours and 6 minutes, respectively. 10^9 X-rays were calculated with 'Xray-SKIF', which allows the X-ray field after the refractive structure to be reconstructed with a spatial resolution of 1 micron. (authors)
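
    A sketch of the embarrassingly parallel structure of such a calculation: each ray is traced independently and its transmission follows the Beer-Lambert law over the path it travels inside the refractive structure. The lens geometry, attenuation coefficient and ray statistics are assumptions for the example, not parameters of the 'Xray-SKIF' code.

        import numpy as np

        rng = np.random.default_rng(42)
        n_rays = 10**6
        mu = 5.0                     # linear attenuation coefficient [1/cm], assumed
        half_aperture = 0.05         # lens half-aperture [cm], assumed

        # Sample ray heights uniformly across the aperture; for a biconcave parabolic
        # X-ray lens the material thickness grows quadratically with ray height.
        y = rng.uniform(-half_aperture, half_aperture, n_rays)
        thickness = 0.1 * (y / half_aperture) ** 2            # path in material [cm]
        transmission = np.exp(-mu * thickness)                # Beer-Lambert per ray

        # "Detector": histogram of transmitted intensity versus ray height.
        intensity, edges = np.histogram(y, bins=100, weights=transmission)
        print("mean transmission:", transmission.mean())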

  1. The Past and Future Trends of Heat Stress Based On Wet Bulb Globe Temperature Index in Outdoor Environment of Tehran City, Iran.

    Science.gov (United States)

    Habibi Mohraz, Majid; Ghahri, Asghar; Karimi, Mehrdad; Golbabaei, Farideh

    2016-06-01

    Workers who work in open, warm environments are at risk of the health effects of climate and heat changes, and this risk is expected to increase with global warming. This study aimed to investigate past changes of the Wet Bulb Globe Temperature (WBGT) index and to predict the trend of its future changes in Tehran, the capital of Iran. The meteorological data recorded in Tehran, Iran during the statistical period between 1961 and 2009 were obtained from the Iran Meteorological Organization; based on them, the WBGT index was calculated and processed using the Mann-Kendall correlation test. The results of the Mann-Kendall correlation test showed that the trend of changes of the annual mean WBGT during the statistical period under study (1961-2009) has been significantly increasing. In addition, the proposed predictive model estimated that an increase of about 1.55 degrees in the WBGT index will be seen over the 40 years from 2009 to 2050 in Tehran. Climate change in Tehran has had an effect on people's exposure to heat stress, consistent with global warming.
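
    For reference, the outdoor WBGT index discussed above is a weighted sum of the natural wet-bulb (Tnw), globe (Tg) and dry-bulb air (Ta) temperatures; the coefficients below are the conventional outdoor weights (as in ISO 7243), and the example temperatures are assumptions, not values from the Tehran data set.

        def wbgt_outdoor(t_nw: float, t_g: float, t_a: float) -> float:
            """Outdoor (solar load) WBGT in degrees Celsius."""
            return 0.7 * t_nw + 0.2 * t_g + 0.1 * t_a

        print(wbgt_outdoor(t_nw=24.0, t_g=40.0, t_a=33.0))   # -> 28.1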

  2. Coherent 40 Gb/s SP-16QAM and 80 Gb/s PDM-16QAM in an Optimal Supercomputer Optical Switch Fabric

    DEFF Research Database (Denmark)

    Karinou, Fotini; Borkowski, Robert; Zibar, Darko

    2013-01-01

    We demonstrate, for the first time, the feasibility of using 40 Gb/s SP-16QAM and 80 Gb/s PDM-16QAM in an optimized cell switching supercomputer optical interconnect architecture based on semiconductor optical amplifiers as ON/OFF gates.

  3. MaMiCo: Transient multi-instance molecular-continuum flow simulation on supercomputers

    Science.gov (United States)

    Neumann, Philipp; Bian, Xin

    2017-11-01

    We present extensions of the macro-micro-coupling tool MaMiCo, which was designed to couple continuum fluid dynamics solvers with discrete particle dynamics. To enable local extraction of smooth flow field quantities especially on rather short time scales, sampling over an ensemble of molecular dynamics simulations is introduced. We provide details on these extensions including the transient coupling algorithm, open boundary forcing, and multi-instance sampling. Furthermore, we validate the coupling in Couette flow using different particle simulation software packages and particle models, i.e. molecular dynamics and dissipative particle dynamics. Finally, we demonstrate the parallel scalability of the molecular-continuum simulations by using up to 65 536 compute cores of the supercomputer Shaheen II located at KAUST. Program Files doi:http://dx.doi.org/10.17632/w7rgdrhb85.1 Licensing provisions: BSD 3-clause Programming language: C, C++ External routines/libraries: For compiling: SCons, MPI (optional) Subprograms used: ESPResSo, LAMMPS, ls1 mardyn, waLBerla For installation procedures of the MaMiCo interfaces, see the README files in the respective code directories located in coupling/interface/impl. Journal reference of previous version: P. Neumann, H. Flohr, R. Arora, P. Jarmatz, N. Tchipev, H.-J. Bungartz. MaMiCo: Software design for parallel molecular-continuum flow simulations, Computer Physics Communications 200: 324-335, 2016 Does the new version supersede the previous version?: Yes. The functionality of the previous version is completely retained in the new version. Nature of problem: Coupled molecular-continuum simulation for multi-resolution fluid dynamics: parts of the domain are resolved by molecular dynamics or another particle-based solver whereas large parts are covered by a mesh-based CFD solver, e.g. a lattice Boltzmann automaton. Solution method: We couple existing MD and CFD solvers via MaMiCo (macro-micro coupling tool). Data exchange and

  4. Non-sectarian scenario experiments in socio-ecological knowledge building for multi-use marine environments: Insights from New Zealand's Marine Futures project

    KAUST Repository

    Le Heron, Richard

    2016-01-29

    The challenges of managing marine ecosystems for multiple users, while well recognised, have not led to clear strategies, principles or practice. The paper uses novel workshop-based thought-experiments to address these concerns. These took the form of trans-disciplinary Non-Sectarian Scenario Experiments (NSSE), involving participants who agreed to put aside their disciplinary interests and commercial and institutional obligations. The NSSE form of co-production of knowledge is a distinctive addition to the participatory and scenario literatures in marine resource management (MRM). Set in the context of resource use conflicts in New Zealand, the workshops assembled diverse participants in the marine economy to co-develop and co-explore the making of socio-ecological knowledge and identify capability required for a new generation of multi-use oriented resource management. The thought-experiments assumed that non-sectarian navigation of scenarios will resource a step-change in marine management by facilitating new connections, relationships, and understandings of potential marine futures. Two questions guided workshop interactions: what science needs spring from pursuing imaginable possibilities and directions in a field of scenarios, and what kinds of institutions would aid the generation of science knowledge and its application to policy and management solutions. The effectiveness of the thought-experiments helped identify ways of dealing with core problems in multi-use marine management, such as the urgent need to cope with ecological and socio-economic surprise, and to define and address cumulative impacts. Discussion focuses on how the workshops offered fresh perspectives and insights into a number of challenges. These challenges include building relations of trust and collective organisation, showing the importance of values-means-ends pathways, developing facilitative legislation to enable initiatives, and the utility of the NSSEs in informing new governance and

  5. Car2x with software defined networks, network functions virtualization and supercomputers: technical and scientific preparations for the Amsterdam Arena telecoms fieldlab

    NARCIS (Netherlands)

    Meijer R.J.; Cushing R.; De Laat C.; Jackson P.; Klous S.; Koning R.; Makkes M.X.; Meerwijk A.

    2015-01-01

    In the invited talk 'Car2x with SDN, NFV and supercomputers' we report on how our past work with SDN [1, 2] allows the design of a smart mobility fieldlab in the huge parking lot of the Amsterdam Arena. We explain how we can engineer and test software that handles the complex conditions of the Car2X

  6. A criticality safety analysis code using a vectorized Monte Carlo method on the HITAC S-810 supercomputer

    International Nuclear Information System (INIS)

    Morimoto, Y.; Maruyama, H.

    1987-01-01

    A vectorized Monte Carlo criticality safety analysis code has been developed on the vector supercomputer HITAC S-810. In this code, a multi-particle tracking algorithm was adopted for effective utilization of the vector processor. A flight analysis with pseudo-scattering was developed to reduce the computational time needed for flight analysis, which represents the bulk of the computational time. This new algorithm realized a speed-up by a factor of 1.5 over the conventional flight analysis. The code also adopted a multigroup cross section constants library of the Bondarenko type with 190 groups: 132 groups for the fast and epithermal regions and 58 groups for the thermal region. Evaluation work showed that this code reproduces the experimental results to an accuracy of about 1% for the effective neutron multiplication factor. (author)
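
    The abstract does not spell out the multi-particle tracking algorithm; as a rough, self-contained sketch of the general idea (advancing a whole bank of particles stage by stage so each inner loop becomes long and vectorizable), with toy one-group physics rather than the 190-group library described above:

```c
/* Rough sketch of multi-particle (event-based) tracking: instead of following
 * one history at a time, a bank of particles is advanced stage by stage so
 * each inner loop is long and vectorizable. The physics is a toy one-group,
 * infinite-medium model, not the code described in the abstract. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define BANK 4096            /* particles processed together */

int main(void)
{
    double x[BANK];          /* positions, tracked only to show the flight stage */
    int    alive[BANK];      /* 1 while the history continues */
    double sigma_t  = 1.0;   /* toy total cross section (1/cm) */
    double absorb_p = 0.3;   /* toy absorption probability per collision */
    long   collisions = 0;

    srand(42);
    for (int i = 0; i < BANK; ++i) { x[i] = 0.0; alive[i] = 1; }

    int remaining = BANK;
    while (remaining > 0) {
        /* Stage 1: sample flight distances for all live particles. */
        for (int i = 0; i < BANK; ++i) {
            if (!alive[i]) continue;
            double u = ((double)rand() + 1.0) / ((double)RAND_MAX + 2.0);
            x[i] += -log(u) / sigma_t;
        }

        /* Stage 2: collision analysis for all live particles. */
        for (int i = 0; i < BANK; ++i) {
            if (!alive[i]) continue;
            ++collisions;
            if ((double)rand() / RAND_MAX < absorb_p) {
                alive[i] = 0;
                --remaining;
            }
        }
    }
    printf("mean collisions per history: %.2f\n", (double)collisions / BANK);
    return 0;
}
```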

  7. Evaluating the networking characteristics of the Cray XC-40 Intel Knights Landing-based Cori supercomputer at NERSC

    Energy Technology Data Exchange (ETDEWEB)

    Doerfler, Douglas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Austin, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cook, Brandon [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Deslippe, Jack [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kandalla, Krishna [Cray Inc, Bloomington, MN (United States); Mendygral, Peter [Cray Inc, Bloomington, MN (United States)

    2017-09-12

    There are many potential issues associated with deploying the Intel Xeon Phi™ (code named Knights Landing [KNL]) manycore processor in a large-scale supercomputer. One in particular is the ability to fully utilize the high-speed communications network, given that the serial performance of a Xeon Phi™ core is a fraction of a Xeon® core. In this paper, we take a look at the trade-offs associated with allocating enough cores to fully utilize the Aries high-speed network versus cores dedicated to computation, e.g., the trade-off between MPI and OpenMP. In addition, we evaluate new features of Cray MPI in support of KNL, such as internode optimizations. We also evaluate one-sided programming models such as Unified Parallel C. We quantify the impact of the above trade-offs and features using a suite of National Energy Research Scientific Computing Center applications.

  8. Performance Evaluation of an Intel Haswell- and Ivy Bridge-Based Supercomputer Using Scientific and Engineering Applications

    Science.gov (United States)

    Saini, Subhash; Hood, Robert T.; Chang, Johnny; Baron, John

    2016-01-01

    We present a performance evaluation conducted on a production supercomputer of the Intel Xeon Processor E5-2680v3, a twelve-core implementation of the fourth-generation Haswell architecture, and compare it with the Intel Xeon Processor E5-2680v2, an Ivy Bridge implementation of the third-generation Sandy Bridge architecture. Several new architectural features have been incorporated in Haswell, including improvements in all levels of the memory hierarchy as well as improvements to vector instructions and power management. We critically evaluate these new features of Haswell and compare with Ivy Bridge using several low-level benchmarks, including a subset of HPCC and HPCG, and four full-scale scientific and engineering applications. We also present a model to predict the performance of HPCG and Cart3D within 5%, and Overflow within 10% accuracy.

  9. Center for Supercomputing Research and Development: Quarterly report, First quarter, 1987

    Energy Technology Data Exchange (ETDEWEB)

    1987-06-01

    This paper discusses progress on hardware and applications of supercomputer design. The topics covered are: hardware development, architecture research, operating system research and development, Cedar Fortran, symbolic processing, compiler research, scientific workstation environment, and numerical library. (LSP)

  10. Lisbon: Supercomputer for Portugal financed from 'CERN Fund'

    International Nuclear Information System (INIS)

    Anon.

    1990-01-01

    A powerful new computer is now in use at the Portuguese National Foundation for Scientific Computation (FCCN Lisbon), set up in 1987 to help fund university computing, to anticipate future requirements and to provide a fast computer at the National Civil Engineering Laboratory (LNEC) as a central node for remote access by major research institutes

  11. Harmonized Constraints in Software Engineering and Acquisition Process Management Requirements are the Clue to Meet Future Performance Goals Successfully in an Environment of Scarce Resources

    National Research Council Canada - National Science Library

    Reich, Holger

    2008-01-01

    This MBA project investigates the importance of correctly deriving requirements from the capability gap and operational environment, and translating them into the processes of contracting, software...

  12. A Sixty-Year Timeline of the Air Force Maui Optical and Supercomputing Site

    Science.gov (United States)

    2013-01-01

    [Fragmentary timeline excerpts: international participation (Bulgaria, Canada, France, Japan, Russia, Switzerland, Taiwan, China, and Italy), support from the Maui Economic Development Board, Inc., and a 2011 entry that begins "Nine months after the removal of the AEOS broken azimuth drive motor, a new ..."]

  13. LDRD final report : a lightweight operating system for multi-core capability class supercomputers.

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, Suzanne Marie; Hudson, Trammell B. (OS Research); Ferreira, Kurt Brian; Bridges, Patrick G. (University of New Mexico); Pedretti, Kevin Thomas Tauke; Levenhagen, Michael J.; Brightwell, Ronald Brian

    2010-09-01

    The two primary objectives of this LDRD project were to create a lightweight kernel (LWK) operating system (OS) designed to take maximum advantage of multi-core processors, and to leverage the virtualization capabilities in modern multi-core processors to create a more flexible and adaptable LWK environment. The most significant technical accomplishments of this project were the development of the Kitten lightweight kernel, the co-development of the SMARTMAP intra-node memory mapping technique, and the development and demonstration of a scalable virtualization environment for HPC. Each of these topics is presented in this report by the inclusion of a published or submitted research paper. The results of this project are being leveraged by several ongoing and new research projects.

  14. Detection and Characterization of Engineered Nanomaterials in the Environment: Current State-of-the-art and Future Directions Report, Annotated Bibliography, and Image Library

    Science.gov (United States)

    The increasing manufacture and implementation of engineered nanomaterials (ENMs) will continue to lead to the release of these materials into the environment. Reliably assessing the environmental exposure risk of ENMs will depend highly on the ability to quantify and characterize...

  15. Analyzing Future Complex National Security Challenges within the Joint, Interagency, Intergovernmental, and Multinational Environment. Proteus Futures Academic Workshop Held in Carlisle Barracks, Pennsylvania on 22-24 August 2006

    Science.gov (United States)

    2006-08-01

    [Fragmentary excerpts from the workshop proceedings: a session on "Sensor-to-Shooter and Phenomenology: Macro Infrastructures of Intelligence Targets / Micro: Signatures that Sensors Detect", and Dr. Werther's discussion of the "change process" as a dynamically fluid, contextually nuanced interplay of "song" (harmonies) and "dance" (mutual actions).]

  16. Informatics and Nursing in a Post-Nursing Informatics World: Future Directions for Nurses in an Automated, Artificially Intelligent, Social-Networked Healthcare Environment.

    Science.gov (United States)

    Booth, Richard G

    2016-01-01

    The increased adoption and use of technology within healthcare and society has influenced the nursing informatics specialty in a multitude of fashions. Namely, the nursing informatics specialty currently faces a range of important decisions related to its knowledge base, established values and future directions - all of which are in need of development and future-proofing. In light of the increased use of automation, artificial intelligence and big data in healthcare, the specialty must also reconceptualize the roles of both nurses and informaticians to ensure that the nursing profession is ready to operate within future digitalized healthcare ecosystems. To explore these goals, the author of this manuscript outlines an examination of technological advancements currently taking place within healthcare, and also proposes implications for the nursing role and the nursing informatics specialty. Finally, recommendations and insights towards how the roles of nurses and informaticians might evolve or be shaped in the growing post-nursing informatics era are presented. Copyright © 2016 Longwoods Publishing.

  17. A complete implementation of the conjugate gradient algorithm on a reconfigurable supercomputer

    Energy Technology Data Exchange (ETDEWEB)

    Dubois, David H [Los Alamos National Laboratory]; Dubois, Andrew J [Los Alamos National Laboratory]; Connor, Carolyn M [Los Alamos National Laboratory]; Boorman, Thomas M [Los Alamos National Laboratory]; Poole, Stephen W [ORNL]

    2008-01-01

    The conjugate gradient is a prominent iterative method for solving systems of sparse linear equations. Large-scale scientific applications often utilize a conjugate gradient solver at their computational core. In this paper we present a field programmable gate array (FPGA) based implementation of a double precision, non-preconditioned, conjugate gradient solver for finite-element or finite-difference methods. Our work utilizes the SRC Computers, Inc. MAPStation hardware platform along with the 'Carte' software programming environment to ease the programming workload when working with the hybrid (CPU/FPGA) environment. The implementation is designed to handle large sparse matrices of up to order N x N where N <= 116,394, with up to 7 non-zero, 64-bit elements per sparse row. This implementation utilizes an optimized sparse matrix-vector multiply operation which is critical for obtaining high performance. Direct parallel implementations of loop unrolling and loop fusion are utilized to extract performance from the various vector/matrix operations. Rather than utilize the FPGA devices as function off-load accelerators, our implementation uses the FPGAs to implement the core conjugate gradient algorithm. Measured run-time performance data is presented comparing the FPGA implementation to a software-only version, showing that the FPGA can outperform processors running at up to 30x the clock rate. In conclusion we take a look at the new SRC-7 system and estimate the performance of this algorithm on that architecture.
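
    For reference on the algorithm itself (not the FPGA implementation described above), a plain-CPU sketch of a non-preconditioned conjugate gradient solver built around a CSR sparse matrix-vector product might look like this, using a small hand-built symmetric positive definite matrix:

```c
/* Reference sketch of a non-preconditioned conjugate gradient solver with a
 * CSR sparse matrix-vector product. Plain-CPU illustration of the algorithm,
 * not the SRC MAPStation/FPGA code described in the abstract. */
#include <stdio.h>
#include <math.h>

#define N 4

/* CSR storage of a small symmetric positive definite (tridiagonal) matrix. */
static const int    row_ptr[N + 1] = {0, 2, 5, 8, 10};
static const int    col_idx[10]    = {0, 1, 0, 1, 2, 1, 2, 3, 2, 3};
static const double val[10]        = {4, -1, -1, 4, -1, -1, 4, -1, -1, 4};

static void spmv(const double *x, double *y)          /* y = A * x */
{
    for (int i = 0; i < N; ++i) {
        y[i] = 0.0;
        for (int k = row_ptr[i]; k < row_ptr[i + 1]; ++k)
            y[i] += val[k] * x[col_idx[k]];
    }
}

static double dot(const double *a, const double *b)
{
    double s = 0.0;
    for (int i = 0; i < N; ++i) s += a[i] * b[i];
    return s;
}

int main(void)
{
    double b[N] = {1, 2, 3, 4}, x[N] = {0}, r[N], p[N], Ap[N];

    spmv(x, Ap);                        /* r = b - A*x, p = r */
    for (int i = 0; i < N; ++i) { r[i] = b[i] - Ap[i]; p[i] = r[i]; }

    double rsold = dot(r, r);
    for (int it = 0; it < 100 && sqrt(rsold) > 1e-10; ++it) {
        spmv(p, Ap);
        double alpha = rsold / dot(p, Ap);
        for (int i = 0; i < N; ++i) { x[i] += alpha * p[i]; r[i] -= alpha * Ap[i]; }
        double rsnew = dot(r, r);
        for (int i = 0; i < N; ++i) p[i] = r[i] + (rsnew / rsold) * p[i];
        rsold = rsnew;
    }
    for (int i = 0; i < N; ++i) printf("x[%d] = %.6f\n", i, x[i]);
    return 0;
}
```

    The sparse matrix-vector product dominates the cost per iteration, which is why the abstract singles it out as the operation that must be optimized for high performance.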

  18. AUTODYN - an interactive non-linear dynamic analysis program for microcomputers through supercomputers

    International Nuclear Information System (INIS)

    Birnbaum, N.K.; Cowler, M.S.; Itoh, M.; Katayama, M.; Obata, H.

    1987-01-01

    AUTODYN uses a two dimensional coupled finite difference approach similar to the one described by Cowler and Hancock (1979). Both translational and axial symmetry are treated. The scheme allows alternative numerical processors to be selectively used to model different components/regions of a problem. Finite difference grids operated on by these processors can be coupled together in space and time to efficiently compute structural (or fluid-structure) interactions. AUTODYN currently includes a Lagrange processor for modeling solid continua and structures, an Euler processor for modeling fluids and the large distortion of solids, an ALE (Arbitrary Lagrange Euler) processor for specialized flow models and a shell processor for modeling thin structures. At present, all four processors use explicit time integration but implicit options will be added to the Lagrange and ALE processors in the near future. Material models are included for solids, liquids and gases (including HE detonation products). (orig.)

  19. Science Driven Supercomputing Architectures: Analyzing Architectural Bottlenecks with Applications and Benchmark Probes

    Energy Technology Data Exchange (ETDEWEB)

    Kamil, S.; Yelick, K.; Kramer, W.T.; Oliker, L.; Shalf, J.; Shan,H.; Strohmaier, E.

    2005-09-26

    There is a growing gap between the peak speed of parallel computing systems and the actual delivered performance for scientific applications. In general this gap is caused by inadequate architectural support for the requirements of modern scientific applications, as commercial applications, and the much larger market they represent, have driven the evolution of computer architectures. This gap has raised the importance of developing better benchmarking methodologies to characterize and to understand the performance requirements of scientific applications, and to communicate them efficiently to influence the design of future computer architectures. This improved understanding of the performance behavior of scientific applications will allow improved performance predictions, development of adequate benchmarks for identification of hardware and application features that work well or poorly together, and a more systematic performance evaluation in procurement situations. The Berkeley Institute for Performance Studies has developed a three-level approach to evaluating the design of high end machines and the software that runs on them: (1) A suite of representative applications; (2) A set of application kernels; and (3) Benchmarks to measure key system parameters. The three levels yield different types of information, all of which are useful in evaluating systems, and enable NSF and DOE centers to select computer architectures more suited for scientific applications. The analysis will further allow the centers to engage vendors in discussion of strategies to alleviate the present architectural bottlenecks using quantitative information. These may include small hardware changes or larger ones that may be outside the interest of non-scientific workloads. Providing quantitative models to the vendors allows them to assess the benefits of technology alternatives using their own internal cost-models in the broader marketplace, ideally facilitating the development of future computer

  20. Sandia's research network for Supercomputing '93: A demonstration of advanced technologies for building high-performance networks

    Energy Technology Data Exchange (ETDEWEB)

    Gossage, S.A.; Vahle, M.O.

    1993-12-01

    Supercomputing '93, a high-performance computing and communications conference, was held November 15th through 19th, 1993 in Portland, Oregon. For the past two years, Sandia National Laboratories has used this conference to showcase and focus its communications and networking endeavors. At the 1993 conference, the results of Sandia's efforts in exploring and utilizing Asynchronous Transfer Mode (ATM) and Synchronous Optical Network (SONET) technologies were vividly demonstrated by building and operating three distinct networks. The networks encompassed a Switched Multimegabit Data Service (SMDS) network running at 44.736 megabits per second, an ATM network running on a SONET circuit at the Optical Carrier (OC) rate of 155.52 megabits per second, and a High Performance Parallel Interface (HIPPI) network running over a 622.08 megabits per second SONET circuit. The SMDS and ATM networks extended from Albuquerque, New Mexico to the showroom floor, while the HIPPI/SONET network extended from Beaverton, Oregon to the showroom floor. This paper documents and describes these networks.

  1. Getting To Exascale: Applying Novel Parallel Programming Models To Lab Applications For The Next Generation Of Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Dube, Evi [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Shereda, Charles [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Nau, Lee [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Harris, Lance [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2010-09-27

    As supercomputing moves toward exascale, node architectures will change significantly. CPU core counts on nodes will increase by an order of magnitude or more. Heterogeneous architectures will become more commonplace, with GPUs or FPGAs providing additional computational power. Novel programming models may make better use of on-node parallelism in these new architectures than do current models. In this paper we examine several of these novel models – UPC, CUDA, and OpenCL – to determine their suitability to LLNL scientific application codes. Our study consisted of several phases: we conducted interviews with code teams and selected two codes to port; we learned how to program in the new models and ported the codes; we debugged and tuned the ported applications; we measured results, and documented our findings. We conclude that UPC is a challenge for porting code, Berkeley UPC is not very robust, and UPC is not suitable as a general alternative to OpenMP for a number of reasons. CUDA is well supported and robust but is a proprietary NVIDIA standard, while OpenCL is an open standard. Both are well suited to a specific set of application problems that can be run on GPUs, but some problems are not suited to GPUs. Further study of the landscape of novel models is recommended.

  2. The ASCI Network for SC '99: A Step on the Path to a 100 Gigabit Per Second Supercomputing Network

    Energy Technology Data Exchange (ETDEWEB)

    PRATT,THOMAS J.; TARMAN,THOMAS D.; MARTINEZ,LUIS M.; MILLER,MARC M.; ADAMS,ROGER L.; CHEN,HELEN Y.; BRANDT,JAMES M.; WYCKOFF,PETER S.

    2000-07-24

    This document highlights the DISCOM² Distance computing and communication team's activities at the 1999 Supercomputing conference in Portland, Oregon. This conference is sponsored by the IEEE and ACM. Sandia, Lawrence Livermore and Los Alamos National laboratories have participated in this conference for eleven years. For the last four years the three laboratories have come together at the conference under the DOE's ASCI (Accelerated Strategic Computing Initiative) rubric. Communication support for the ASCI exhibit is provided by the ASCI DISCOM² project. The DISCOM² communication team uses this forum to demonstrate and focus communication and networking developments within the community. At SC 99, DISCOM built a prototype of the next generation ASCI network, demonstrated remote clustering techniques, demonstrated the capabilities of the emerging terabit router products, demonstrated the latest technologies for delivering visualization data to scientific users, and demonstrated the latest in encryption methods, including IP VPN technologies and ATM encryption research. The authors also coordinated the other production networking activities within the booth and between their demonstration partners on the exhibit floor. This paper documents those accomplishments, discusses the details of their implementation, and describes how these demonstrations support Sandia's overall strategies in ASCI networking.

  3. Porting, parallelization and performance evaluation experiences with massively parallel supercomputing system based on transputer

    Energy Technology Data Exchange (ETDEWEB)

    Fruscione, M.; Stofella, P.; Cleri, F.; Mazzeo, M.; Ornelli, P.; Schiano, P.

    1991-02-01

    This paper describes the most important aspects and results obtained from the porting and parallelization of two programs, VPMC and EULERO, on a Meiko multiprocessor 'Computing Surface' system. The VPMC program was developed by ENEA (the Italian Agency for Energy, New Technologies and the Environment) to simulate travelling electrons. EULERO is a fluid dynamics simulation program, owned by CIRA (Centro Italiano di Ricerche Aerospaziali), which uses it for its aerospace components projects. This report gives short descriptions of the two programs and their parallelization methodologies, and provides a performance evaluation of the Meiko 'Computing Surface' system. Moreover, these performance data are compared with corresponding data obtained with IBM 3090, CRAY and other computers by ENEA and CIRA in their research and development activities.

  4. Performance characteristics of hybrid MPI/OpenMP implementations of NAS parallel benchmarks SP and BT on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu

    2011-03-29

    The NAS Parallel Benchmarks (NPB) are well-known applications with fixed algorithms for evaluating parallel systems and tools. Multicore supercomputers provide a natural programming paradigm for hybrid programs, whereby OpenMP can be used for data sharing among the cores that comprise a node and MPI can be used for communication between nodes. In this paper, we use the SP and BT benchmarks of MPI NPB 3.3 as a basis for a comparative approach to implement hybrid MPI/OpenMP versions of SP and BT. In particular, we compare the performance of the hybrid SP and BT with their MPI counterparts on large-scale multicore supercomputers. Our performance results indicate that the hybrid SP outperforms the MPI SP by up to 20.76%, and the hybrid BT outperforms the MPI BT by up to 8.58%, on up to 10,000 cores on BlueGene/P at Argonne National Laboratory and Jaguar (Cray XT4/5) at Oak Ridge National Laboratory. We also use performance tools and MPI trace libraries available on these supercomputers to further investigate the performance characteristics of the hybrid SP and BT.
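
    The SP and BT benchmarks themselves are established NPB codes; purely as a generic illustration of the hybrid pattern evaluated here (MPI ranks between nodes, OpenMP threads within a node), a minimal sketch might look like the following:

```c
/* Minimal hybrid MPI/OpenMP sketch: MPI ranks map to nodes, OpenMP threads to
 * the cores of each node. Generic illustration only, not the NPB SP/BT code
 * discussed in the abstract.
 * Build (typical, may vary by platform): mpicc -fopenmp hybrid.c -o hybrid */
#include <mpi.h>
#include <omp.h>
#include <stdio.h>

#define N 1000000

int main(int argc, char **argv)
{
    int provided, rank, size;
    MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Each rank owns a slice of a global sum; threads share the slice. */
    long lo = (long)rank * N / size, hi = (long)(rank + 1) * N / size;
    double local = 0.0;

    #pragma omp parallel for reduction(+:local)
    for (long i = lo; i < hi; ++i)
        local += 1.0 / (1.0 + (double)i);   /* arbitrary per-element work */

    double global = 0.0;
    MPI_Reduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("sum = %.6f (ranks=%d, threads/rank=%d)\n",
               global, size, omp_get_max_threads());

    MPI_Finalize();
    return 0;
}
```

    The trade-off the paper measures is essentially how much of the parallelism to express as ranks versus threads for a fixed core count, since fewer ranks reduce MPI communication and memory overhead while more threads increase intra-node sharing.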

  5. Assessing and Managing the Current and Future Pest Risk from Water Hyacinth, (Eichhornia crassipes), an Invasive Aquatic Plant Threatening the Environment and Water Security.

    Science.gov (United States)

    Kriticos, Darren J; Brunel, Sarah

    2016-01-01

    Understanding and managing the biological invasion threats posed by aquatic plants under current and future climates is a growing challenge for biosecurity and land management agencies worldwide. Eichhornia crassipes is one of the world's worst aquatic weeds. Presently, it threatens aquatic ecosystems, and hinders the management and delivery of freshwater services in both developed and developing parts of the world. A niche model was fitted using CLIMEX, to estimate the potential distribution of E. crassipes under historical and future climate scenarios. Under two future greenhouse gas emission scenarios for 2080 simulated with three Global Climate Models, the area with a favourable temperature regime appears set to shift polewards. The greatest potential for future range expansion lies in Europe. Elsewhere in the northern hemisphere temperature gradients are too steep for significant geographical range expansion under the climate scenarios explored here. In the Southern Hemisphere, the southern range boundary for E. crassipes is set to expand southwards in Argentina, Australia and New Zealand; under current climate conditions it is already able to invade the southern limits of Africa. The opportunity exists to prevent its spread into the islands of Tasmania in Australia and the South Island of New Zealand, both of which depend upon hydroelectric facilities that would be threatened by the presence of E. crassipes. In Europe, efforts to slow or stop the spread of E. crassipes will face the challenge of limited internal biosecurity capacity. The modelling technique demonstrated here is the first application of niche modelling for an aquatic weed under historical and projected future climates. It provides biosecurity agencies with a spatial tool to foresee and manage the emerging invasion threats in a manner that can be included in the international standard for pest risk assessments. It should also support more detailed local and regional management.

  6. Assessing and Managing the Current and Future Pest Risk from Water Hyacinth, (Eichhornia crassipes), an Invasive Aquatic Plant Threatening the Environment and Water Security.

    Directory of Open Access Journals (Sweden)

    Darren J Kriticos

    Full Text Available Understanding and managing the biological invasion threats posed by aquatic plants under current and future climates is a growing challenge for biosecurity and land management agencies worldwide. Eichhornia crassipes is one of the world's worst aquatic weeds. Presently, it threatens aquatic ecosystems, and hinders the management and delivery of freshwater services in both developed and developing parts of the world. A niche model was fitted using CLIMEX, to estimate the potential distribution of E. crassipes under historical and future climate scenarios. Under two future greenhouse gas emission scenarios for 2080 simulated with three Global Climate Models, the area with a favourable temperature regime appears set to shift polewards. The greatest potential for future range expansion lies in Europe. Elsewhere in the northern hemisphere temperature gradients are too steep for significant geographical range expansion under the climate scenarios explored here. In the Southern Hemisphere, the southern range boundary for E. crassipes is set to expand southwards in Argentina, Australia and New Zealand; under current climate conditions it is already able to invade the southern limits of Africa. The opportunity exists to prevent its spread into the islands of Tasmania in Australia and the South Island of New Zealand, both of which depend upon hydroelectric facilities that would be threatened by the presence of E. crassipes. In Europe, efforts to slow or stop the spread of E. crassipes will face the challenge of limited internal biosecurity capacity. The modelling technique demonstrated here is the first application of niche modelling for an aquatic weed under historical and projected future climates. It provides biosecurity agencies with a spatial tool to foresee and manage the emerging invasion threats in a manner that can be included in the international standard for pest risk assessments. It should also support more detailed local and regional

  7. Our World, Our Future: Bilingual Activities on Population and the Environment = Nuestro Mundo, Nuestro Futuro: Actividades Bilingues Acerca de la Poblacion y el Medio Ambiente.

    Science.gov (United States)

    Zero Population Growth, Inc., Washington, DC.

    This bilingual activity guide helps to develop students' understandings of the interdependence of people and the environment. Interdisciplinary resources are provided featuring environmental education lessons with applications to the social studies, science, math, and family life education curricula. It is designed for the middle school level, but…

  8. A design methodology for domain-optimized power-efficient supercomputing

    Energy Technology Data Exchange (ETDEWEB)

    Mohiyuddin, Marghoob [Univ. of California, Berkeley, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Murphy, Mark [Univ. of California, Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Shalf, John [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wawrzynek, John [Univ. of California, Berkeley, CA (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2009-01-01

    As power has become the pre-eminent design constraint for future HPC systems, computational efficiency is being emphasized over simply peak performance. Recently, static benchmark codes have been used to find a power-efficient architecture. Unfortunately, because compilers generate sub-optimal code, benchmark performance can be a poor indicator of the performance potential of architecture design points. Therefore, we present hardware/software co-tuning as a novel approach for system design, in which traditional architecture space exploration is tightly coupled with software auto-tuning for delivering substantial improvements in area and power efficiency. We demonstrate the proposed methodology by exploring the parameter space of a Tensilica-based multi-processor running three of the most heavily used kernels in scientific computing, each with widely varying micro-architectural requirements: sparse matrix-vector multiplication, stencil-based computations, and general matrix-matrix multiplication. Results demonstrate that co-tuning significantly improves hardware area and energy efficiency - a key driver for the next generation of HPC system design.

  9. Computational Solutions for Today’s Navy: New Methods are Being Employed to Meet the Navy’s Changing Software-Development Environment

    Science.gov (United States)

    2008-03-01

    [Fragmentary excerpt: Frank W. Bentrem, Ph.D., John T. Sample, Ph.D., and Michael M. Harris describe how the Naval Research Laboratory (NRL) meets the Navy's changing software-development environment by drawing on shipboard sonars (Through-the-Sensor technology), supercomputer-generated numerical models, and historical/climatological databases.]

  10. Science Driven Supercomputing Architectures: Analyzing Architectural Bottlenecks with Applications and Benchmark Probes

    Energy Technology Data Exchange (ETDEWEB)

    Kamil, S.; Yelick, K.; Kramer, W.T.; Oliker, L.; Shalf, J.; Shan,H.; Strohmaier, E.

    2005-09-26

    There is a growing gap between the peak speed of parallel computing systems and the actual delivered performance for scientific applications. In general this gap is caused by inadequate architectural support for the requirements of modern scientific applications, as commercial applications, and the much larger market they represent, have driven the evolution of computer architectures. This gap has raised the importance of developing better benchmarking methodologies to characterize and to understand the performance requirements of scientific applications, and to communicate them efficiently to influence the design of future computer architectures. This improved understanding of the performance behavior of scientific applications will allow improved performance predictions, development of adequate benchmarks for identification of hardware and application features that work well or poorly together, and a more systematic performance evaluation in procurement situations. The Berkeley Institute for Performance Studies has developed a three-level approach to evaluating the design of high end machines and the software that runs on them: 1) A suite of representative applications; 2) A set of application kernels; and 3) Benchmarks to measure key system parameters. The three levels yield different types of information, all of which are useful in evaluating systems, and enable NSF and DOE centers to select computer architectures more suited for scientific applications. The analysis will further allow the centers to engage vendors in discussion of strategies to alleviate the present architectural bottlenecks using quantitative information. These may include small hardware changes or larger ones that may be outside the interest of non-scientific workloads. Providing quantitative models to the vendors allows them to assess the benefits of technology alternatives using their own internal cost-models in the broader marketplace, ideally facilitating the development of future computer architectures more suited for scientific

  11. Early establishment response of different Pinus nigra ssp. salzmanii seed sources on contrasting environments: Implications for future reforestation programs and assisted population migration.

    Science.gov (United States)

    Taïbi, K; del Campo, A D; Aguado, A; Mulet, J M

    2016-04-15

    Forest restoration constitutes an important issue within adaptive environmental management for climate change at the global scale. However, effective implementation of these programs can only be achieved by revising current seed transfer guidelines, as they lack the inherent spatial and temporal dynamics associated with climate change. In this sense, provenance trials may provide key information on the relative performance of different populations and/or genotypes under changing ecological conditions. This study addresses a methodological approach to evaluate early plantation performance and the consequent phenotypic plasticity and pattern of adaptation of different seed sources in contrasting environments. To this end, six seed sources of Salzmann pine were tested at three contrasting trial sites, testing a hypothetical assisted population migration. Adaptation at each site was assessed through Joint Regression and Additive Main effects and Multiplicative Interaction (AMMI) models. Most of the observed variation was attributed to the environment (above 90% for all traits), even so genotype and genotype by environment interaction (GxE) were significant. Seedlings out-planted under better site conditions did not differ in survival but did differ in height growth. However, on sites with higher constraints, survival differed among seed sources and diameter growth was high. The adaptation analyses (AMMI) indicated that the cold-continental seed source 'Soria' performed as a generalist seed source, whereas 'Cordilleras Béticas', the southernmost seed source, was more adapted to harsh environments (frost and drought) in terms of survival. The results partially supported the hypothesis that assisted migration of seed sources makes sense within limited transfer distances, and this was reinforced by the GxE results. The present study could be valuable for addressing the adaptive transfer of seedlings in ecological restoration and for determining suitable seed sources for reforestation programs

  12. Linking the fine-scale social environment to mating decisions: a future direction for the study of extra-pair paternity.

    Science.gov (United States)

    Maldonado-Chaparro, Adriana A; Montiglio, Pierre-Olivier; Forstmeier, Wolfgang; Kempenaers, Bart; Farine, Damien R

    2018-03-13

    Variation in extra-pair paternity (EPP) among individuals of the same population could result from stochastic demography or from individual differences in mating strategies. Although the adaptive value of EPP has been widely studied, much less is known about the characteristics of the social environment that drive the observed patterns of EPP. Here, we demonstrate how concepts and well-developed tools for the study of social behaviour (such as social network analysis) can enhance the study of extra-pair mating decisions (focussing in particular on avian mating systems). We present several hypotheses that describe how characteristics of the social environment in which individuals are embedded might influence the levels of EPP in a socially monogamous population. We use a multi-level social approach (Hinde, 1976) to achieve a detailed description of the social structure and social dynamics of individuals in a group. We propose that the pair-bond, the direct (local) social environment and the indirect (extended) social environment, can contribute in different ways to the variation observed in the patterns of EPP, at both the individual and the population level. A strength of this approach is that it integrates into the analysis (indirect) interactions with all potential mates in a population, thus extending the current framework to study extra-pair mating behaviour. We also encourage the application of social network methods such as temporal dynamic analysis to depict temporal changes in the patterns of interactions among individuals in a group, and to study how this affects mating behaviour. We argue that this new framework will contribute to a better understanding of the proximate mechanisms that drive variation in EPP within populations in socially monogamous species, and might ultimately provide insights into the evolution and maintenance of mating systems. © 2018 Cambridge Philosophical Society.

  13. Multi-decadal changes in tundra environments and ecosystems: Synthesis of the International Polar Year-Back to the Future Project (IPY-BTF)

    DEFF Research Database (Denmark)

    Callaghan, Terry V.; Tweedie, Craig E.; Åkerman, Jonas

    2011-01-01

    opportunity for such research through the Back to the Future (BTF) project (IPY project #512). This article synthesizes the results from 13 papers within this Ambio Special Issue. Abiotic changes include glacial recession in the Altai Mountains, Russia; increased snow depth and hardness, permafrost warming......, and increased growing season length in sub-arctic Sweden; drying of ponds in Greenland; increased nutrient availability in Alaskan tundra ponds, and warming at most locations studied. Biotic changes ranged from relatively minor plant community change at two sites in Greenland to moderate change in the Yukon...

  14. An investigation into the challenges facing the future provision of continuing professional development for allied health professionals in a changing healthcare environment

    International Nuclear Information System (INIS)

    Gibbs, Vivien

    2011-01-01

    This paper outlines current challenges facing healthcare providers and education providers in trying to ensure Allied Health Professionals (AHPs) are fit for practice, in a climate driven by financial constraints and service improvement directives from the Department of Health (DH). Research was undertaken in 2009 to investigate the current provision of Continuing Professional Development (CPD) in the southwest region of England. The purpose was to define exactly what problems existed with this provision, and to propose changes which could be implemented in order to ensure that the provision meets the needs of stakeholders in future years.

  15. Sharing visualization experiences among remote virtual environments

    Energy Technology Data Exchange (ETDEWEB)

    Disz, T.L.; Papka, M.E.; Pellegrino, M.; Stevens, R. [Argonne National Lab., IL (United States). Mathematics and Computer Science Div.]

    1995-12-31

    Virtual reality has become an increasingly familiar part of the science of visualization and communication of information. This, combined with the increase in connectivity of remote sites via high-speed networks, allows for the development of a collaborative distributed virtual environment. Such an environment enables the development of supercomputer simulations with virtual reality visualizations that can be displayed at multiple sites, with each site interacting, viewing, and communicating about the results being discovered. The early results of an experimental collaborative virtual reality environment are discussed in this paper. The issues that need to be addressed in the implementation, as well as preliminary results, are covered. Also provided are a discussion of plans and a generalized application programmer's interface for CAVE to CAVE.

  16. Oil-particle interactions and submergence from crude oil spills in marine and freshwater environments: review of the science and future research needs

    Science.gov (United States)

    Fitzpatrick, Faith A.; Boufadel, Michael C.; Johnson, Rex; Lee, Kenneth W.; Graan, Thomas P.; Bejarano, Adriana C.; Zhu, Zhenduo; Waterman, David; Capone, Daniel M.; Hayter, Earl; Hamilton, Stephen K.; Dekker, Timothy; Garcia, Marcelo H.; Hassan, Jacob S.

    2015-01-01

    Oil-particle interactions and oil submergence are of much interest to oil spill responders and scientists, especially as transportation of light and heavy crude oils increases in North America’s coastal marine and freshwater environments. This report contains an up-to-date review of the state of the science for oil-particle aggregates (OPAs), in terms of their formation and stability which may alter the transport, fate, and toxicity of the residual oil and, hence, its level of ecological risk. Operational considerations—detection, containment, and recovery—are discussed.

  17. Futurism in Education: Methodologies.

    Science.gov (United States)

    Hencley, Stephen P.; Yates, James R.

    This book is one expression of the trend to achieve a more systematic study of the future within the specific context of educational futures and their environments. It is intended to bring to educational leaders in a practical manner many of the technological forecasting techniques previously familiar only to science, the military, and industry.…

  18. E-health systems for management of MDR-TB in resource-poor environments: a decade of experience and recommendations for future work.

    Science.gov (United States)

    Fraser, Hamish S F; Habib, Ali; Goodrich, Mark; Thomas, David; Blaya, Joaquin A; Fils-Aime, Joseph Reginald; Jazayeri, Darius; Seaton, Michael; Khan, Aamir J; Choi, Sharon S; Kerrison, Foster; Falzon, Dennis; Becerra, Mercedes C

    2013-01-01

    Multi-drug resistant TB (MDR-TB) is a complex infectious disease that is a growing threat to global health. It requires lengthy treatment with multiple drugs and specialized laboratory testing. To effectively scale up treatment to thousands of patients requires good information systems to support clinical care, reporting, drug forecasting, supply chain management and monitoring. Over the last decade we have developed the PIH-EMR electronic medical record system, and subsequently OpenMRS-TB, to support the treatment of MDR-TB in Peru, Haiti, Pakistan, and other resource-poor environments. We describe here the experience with implementing these systems and evaluating many aspects of their performance, and review other systems for MDR-TB management. We recommend a new approach to information systems to address the barriers to scale up MDR-TB treatment, particularly access to the appropriate drugs and lab data. We propose moving away from fragmented, vertical systems to focus on common platforms, addressing all stages of TB care, support for open data standards and interoperability, care for a wide range of diseases including HIV, integration with mHealth applications, and ability to function in resource-poor environments.

  19. Scalable coherent interface: Links to the future

    Science.gov (United States)

    Gustavson, D. B.; Kristiansen, E.

    1991-11-01

    The Scalable Coherent Interface (SCI) was developed to support closely coupled multiprocessors and their caches in a distributed shared-memory environment, but its scalability and the efficient generality of its architecture make it work very well over a wide range of applications. It can replace a local area network for connecting workstations on a campus. It can be a powerful I/O channel for a supercomputer. It can be the processor, cache-memory I/O connection in a highly parallel computer. It can gather data from enormous particle detectors and distribute it among thousands of processors. It can connect a desktop microprocessor to memory chips a few millimeters away, disk drives a few meters away, and servers a few kilometers away.

  20. Future directions in shielding methods and analysis

    International Nuclear Information System (INIS)

    Goldstein, H.

    1987-01-01

    Over the nearly half century history of shielding against reactor radiation, there has been a see-saw battle between theory and measurement. During that period the capability and accuracy of calculational methods have been enormously improved. The microscopic cross sections needed as input to the theoretical computations are now also known to adequate accuracy (with certain exceptions). Nonetheless, there remain substantial classes of shielding problems not yet accessible to satisfactory computational methods, particularly where three-dimensional geometries are involved. This paper discusses promising avenues to approach such problems, especially in the light of recent and expected advances in supercomputers. In particular, it seems that Monte Carlo methods should be much more advantageous in the new computer environment than they have been in the past

  1. Changing the Learning Environment in the College of Engineering and Applied Science: The impact of Educational Training on Future Faculty and Student-Centered Pedagogy on Undergraduate Students

    Science.gov (United States)

    Gaskins, Whitney

    Over the past 20 years there have been many changes to the primary and secondary educational system that have impacted students, teachers, and post-secondary institutions across the United States of America. One of the most important is the large number of standardized tests students are required to take to show adequate performance in school. Students think differently because they are taught differently due to this focus on standardized testing, thus changing the skill sets students acquire in secondary school. This presents a critical problem for colleges and universities, as they now are using practices for and have expectations of these students that are unrealistic for the changing times. High dropout rates in the College of Engineering have been attributed to the cultural atmosphere of the institution. Students have reported a low sense of belonging and low relatability to course material. This study developed a "preparing the future" faculty program that gave graduate students at the University of Cincinnati a unique training experience that helped them understand the students they will educate. They received educational training, developed from a future educator's curriculum that covered classroom management, standards, and pedagogy. Graduate students who participated in the training program reported increases in self-efficacy and student understanding. To reduce negative experiences and increase motivation, Challenge Based Learning (CBL) was introduced in an undergraduate Basic Electric Circuits (BEC) course. CBL is a structured model for course content with a foundation in problem-based learning. CBL offers general concepts from which students derive the challenges they will address. Results show an improved classroom experience for students who were taught with CBL.

  2. Assessing mobile food vendors (a.k.a. street food vendors)--methods, challenges, and lessons learned for future food-environment research.

    Science.gov (United States)

    Lucan, S C; Varona, M; Maroko, A R; Bumol, J; Torrens, L; Wylie-Rosett, J

    2013-08-01

    Mobile food vendors (also known as street food vendors) may be important sources of food, particularly in minority and low-income communities. Unfortunately, there are no good data sources on where, when, or what vendors sell. The lack of a published assessment method may contribute to the relative exclusion of mobile food vendors from existing food-environment research. A goal of this study was to develop, pilot, and refine a method to assess mobile food vendors. Cross-sectional assessment of mobile food vendors through direct observations and brief interviews. Using printed maps, investigators canvassed all streets in Bronx County, NY (excluding highways but including entrance and exit ramps) in 2010, looking for mobile food vendors. For each vendor identified, researchers recorded a unique identifier, the vendor's location, and direct observations. Investigators also recorded vendors' answers to where, when, and what they sold. Of 372 identified vendors, 38% did not answer brief-interview questions (19% were 'in transit', 15% refused; others were absent from their carts/trucks/stands or with customers). About 7% of vendors who ultimately answered questions were reluctant to engage with researchers. Some vendors expressed concerns about regulatory authority; only 34% of vendors had visible permits or licenses and many vendors had improvised illegitimate-appearing set-ups. The majority of vendors (75% of those responding) felt most comfortable speaking Spanish; 5% preferred other non-English languages. Nearly a third of vendors changed selling locations (streets, neighbourhoods, boroughs) day-to-day or even within a given day. There was considerable variability in times (hours, days, months) in which vendors reported doing business; for 86% of vendors, weather was a deciding factor. Mobile food vendors have a variable and fluid presence in an urban environment. Variability in hours and locations, having most comfort with languages other than English, and reluctance

  3. Public (Q)SAR Services, Integrated Modeling Environments, and Model Repositories on the Web: State of the Art and Perspectives for Future Development.

    Science.gov (United States)

    Tetko, Igor V; Maran, Uko; Tropsha, Alexander

    2017-03-01

    Thousands of (Quantitative) Structure-Activity Relationships (Q)SAR models have been described in peer-reviewed publications; however, this way of sharing seldom makes models available for the use by the research community outside of the developer's laboratory. Conversely, on-line models allow broad dissemination and application representing the most effective way of sharing the scientific knowledge. Approaches for sharing and providing on-line access to models range from web services created by individual users and laboratories to integrated modeling environments and model repositories. This emerging transition from the descriptive and informative, but "static", and for the most part, non-executable print format to interactive, transparent and functional delivery of "living" models is expected to have a transformative effect on modern experimental research in areas of scientific and regulatory use of (Q)SAR models. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. The seasonal carbon and water balances of the Cerrado environment of Brazil: Past, present, and future influences of land cover and land use

    Science.gov (United States)

    Arantes, Arielle Elias; Ferreira, Laerte G.; Coe, Michael T.

    2016-07-01

    The Brazilian savanna (known as Cerrado) is an upland biome made up of various vegetation types from herbaceous to arboreal. In this paper, MODIS remote sensing vegetation greenness from the Enhanced Vegetation Index (EVI) and evapotranspiration (ET) data for the 2000-2012 period were analyzed to understand the differences in the net primary productivity (NPP-proxy), carbon, and the evaporative flux of the major Cerrado natural and anthropic landscapes. The understanding of the carbon and evaporative fluxes of the main natural and anthropic vegetation types is of fundamental importance in studies regarding the impacts of land cover and land use changes in the regional and global climate. The seasonal dynamics of EVI and ET of the main natural and anthropic vegetation types of the Cerrado biome were analyzed using a total of 35 satellite-based samples distributed over representative Cerrado landscapes. Carbon and water fluxes were estimated for different scenarios, such as, a hypothetical unconverted Cerrado, 2002 and 2050 scenarios based on values derived from literature and on the PROBIO land cover and land use map for the Cerrado. The total growing season biomass for 2002 in the Cerrado region was estimated to be 28 gigatons of carbon and the evapotranspiration was 1336 gigatons of water. The mean estimated growing season evapotranspiration and biomass for 2002 was 576 Gt of water and 12 Gt of carbon for pasture and croplands compared to 760 Gt of water and 15 Gt of carbon for the Cerrado natural vegetation. In a modeled future scenario for the year 2050, the ET flux from natural Cerrado vegetation was 394 Gt less than in 2002 and 991 Gt less than in an unconverted scenario, with only natural vegetation, while the carbon was 8 Gt less than in 2002 and 21 Gt less than in this hypothetical pre-conversion Cerrado. On the other hand, the sum of the pasture and cropland ET flux increased by 405 Gt in 2050 relative to 2002 and the carbon by 11 Gt of carbon. Given the

  5. Uranium, its impact on the national and global energy mix; and its history, distribution, production, nuclear fuel-cycle, future, and relation to the environment

    Science.gov (United States)

    Finch, Warren Irvin

    1997-01-01

    The many aspects of uranium, a heavy radioactive metal used to generate electricity throughout the world, are briefly described in relatively simple terms intended for the lay reader. An adequate glossary of unfamiliar terms is given. Uranium is a new source of electrical energy developed since 1950, and how we harness energy from it is explained. It competes with the organic coal, oil, and gas fuels as shown graphically. Uranium resources and production for the world are tabulated and discussed by country and for various energy regions in the United States. Locations of major uranium deposits and power reactors in the United States are mapped. The nuclear fuel-cycle of uranium for a typical light-water reactor is illustrated at the front end-beginning with its natural geologic occurrence in rocks through discovery, mining, and milling; separation of the scarce isotope U-235, its enrichment, and manufacture into fuel rods for power reactors to generate electricity-and at the back end-the reprocessing and handling of the spent fuel. Environmental concerns with the entire fuel cycle are addressed. The future of the use of uranium in new, simplified, 'passively safe' reactors for the utility industry is examined. The present resource assessment of uranium in the United States is out of date, and a new assessment could aid the domestic uranium industry.

  6. Genetics, lifestyle and environment. UK Biobank is an open access resource following the lives of 500,000 participants to improve the health of future generations.

    Science.gov (United States)

    Trehearne, Andrew

    2016-03-01

    UK Biobank is a long-term prospective epidemiology study that has recruited and is now following the lives of 500,000 people in England, Scotland and Wales, aged 40-69 years when they joined the study (Sudlow et al., PLoS Med 12(3):e1001779, 2015). Participants were recruited by letter and asked to attend one of 22 assessment centres in towns and cities across Britain, where they provided consent, answered detailed questions about their health and lifestyle, had body measures taken and donated blood, urine and saliva. Participants provided consent for the long-term follow-up of their health via medical records, such as general practice and hospital records, cancer and death records. Samples are being stored long term for a wide range of analyses, including genetic. The resource is open to all bona fide scientists from the UK and overseas, from academia and industry, who register via its access management system. A summary of UK Biobank data can be viewed via its Data Showcase, and the resource will be strengthened over time as the results of new analyses and studies are returned, health record linkages accumulate, and participants provide additional information about themselves. Some will attend full repeat assessment visits. UK Biobank is open for business, and it hopes researchers will find it a valuable tool to improve the health of future generations.

  7. Compound-specific stable isotope analysis of organic contaminants in natural environments: a critical review of the state of the art, prospects, and future challenges

    International Nuclear Information System (INIS)

    Schmidt, Torsten C.; Haderlein, Stefan B.; Zwank, Luc; Elsner, Martin; Berg, Michael; Meckenstock, Rainer U.

    2004-01-01

    Compound-specific stable isotope analysis (CSIA) using gas chromatography-isotope ratio mass spectrometry (GC/IRMS) has developed into a mature analytical method in many application areas over the last decade. This is particularly true for carbon isotope analysis, whereas measurements of the other elements amenable to CSIA (hydrogen, nitrogen, oxygen) are much less routine. In environmental sciences, successful applications to date include (i) the allocation of contaminant sources on a local, regional, and global scale, (ii) the identification and quantification of (bio)transformation reactions on scales ranging from batch experiments to contaminated field sites, and (iii) the characterization of elementary reaction mechanisms that govern product formation. These three application areas are discussed in detail. The investigated spectrum of compounds comprises mainly n-alkanes, monoaromatics such as benzene and toluene, methyl tert-butyl ether (MTBE), polycyclic aromatic hydrocarbons (PAHs), and chlorinated hydrocarbons such as tetrachloromethane, trichloroethylene, and polychlorinated biphenyls (PCBs). Future research directions are primarily set by the state of the art in analytical instrumentation and method development. Approaches to utilize HPLC separation in CSIA, the enhancement of the sensitivity of CSIA to allow field investigations in the μg L-1 range, and the development of methods for CSIA of other elements are reviewed. Furthermore, an alternative scheme to evaluate isotope data is outlined that would enable estimates of position-specific kinetic isotope effects and, thus, allow one to extract mechanistic chemical and biochemical information. (orig.)

  8. Dam Mycobacterium avium subspecies paratuberculosis (MAP) infection status does not predetermine calves for future shedding when raised in a contaminated environment: a cohort study.

    Science.gov (United States)

    Eisenberg, Susanne W F; Rutten, Victor P M G; Koets, Ad P

    2015-06-19

    Uptake of Mycobacterium avium subsp. paratuberculosis (MAP) by calves in the first days of life from colostrum, milk and faeces is regarded as an important moment of transmission. The objective of this study was to quantify the association between the MAP status of dams, as determined by the presence of MAP DNA and antibody in colostrum and of DNA in faeces and the environment, and subsequent MAP shedding by their daughters. A cohort of 117 dam-daughter pairs giving birth/being born on eight commercial dairy farms with endemic paratuberculosis was followed, and colostrum, faecal and environmental samples (dust) were analysed for the presence of MAP using an IS900 real-time PCR. Antibodies in colostrum were measured by ELISA. Analysis of dust samples showed that on all farms environmental MAP exposure occurred continuously. MAP DNA was detected in significantly more colostrum samples (48%) than faecal samples (37%). MAP-specific antibodies were present in 34% of the colostrum samples. In total, MAP DNA was present in faecal samples of 41% of the daughters at least once during the sampling period. The association between faecal shedding in the offspring and the dam MAP status, defined by MAP PCR on colostrum, MAP PCR on faeces or ELISA on colostrum, was determined by an exact Cox regression analysis for discrete data. The model indicated that the hazard of faecal shedding in daughters born to MAP-positive dams was not significantly different from that of daughters born to MAP-negative dams. When born to a dam with DNA-positive faeces the HR was 1.05 (CI 0.6; 1.8) and with DNA-positive colostrum the HR was 1.17 (CI 0.6; 2.3). When dam status was defined by a combination of both PCR outcomes (faeces and colostrum) and the ELISA outcome the HR was 1.26 (CI 0.9; 1.9). Therefore, this study indicates that neither the presence of MAP DNA in colostrum, MAP DNA in faeces nor the presence of MAP antibodies in colostrum of the dam significantly influences the hazard of

  9. A report documenting the completion of the Los Alamos National Laboratory portion of the ASC level II milestone "Visualization on the supercomputing platform"

    Energy Technology Data Exchange (ETDEWEB)

    Ahrens, James P [Los Alamos National Laboratory]; Patchett, John M [Los Alamos National Laboratory]; Lo, Li-Ta [Los Alamos National Laboratory]; Mitchell, Christopher [Los Alamos National Laboratory]; DeMarle, David [KITWARE INC.]; Brownlee, Carson [UNIV OF UTAH]

    2011-01-24

    This report provides documentation for the completion of the Los Alamos portion of the ASC Level II 'Visualization on the Supercomputing Platform' milestone. This ASC Level II milestone is a joint milestone between Sandia National Laboratory and Los Alamos National Laboratory. The milestone text is shown in Figure 1 with the Los Alamos portions highlighted in boldfaced text. Visualization and analysis of petascale data is limited by several factors which must be addressed as ACES delivers the Cielo platform. Two primary difficulties are: (1) Performance of interactive rendering, which is the most computationally intensive portion of the visualization process. For terascale platforms, commodity clusters with graphics processors (GPUs) have been used for interactive rendering. For petascale platforms, visualization and rendering may be able to run efficiently on the supercomputer platform itself. (2) I/O bandwidth, which limits how much information can be written to disk. If we simply analyze the sparse information that is saved to disk we miss the opportunity to analyze the rich information produced every timestep by the simulation. For the first issue, we are pursuing in-situ analysis, in which simulations are coupled directly with analysis libraries at runtime. This milestone will evaluate the visualization and rendering performance of current and next generation supercomputers in contrast to GPU-based visualization clusters, and evaluate the performance of common analysis libraries coupled with the simulation that analyze and write data to disk during a running simulation. This milestone will explore, evaluate and advance the maturity level of these technologies and their applicability to problems of interest to the ASC program. In conclusion, we improved CPU-based rendering performance by a factor of 2-10 times in our tests. In addition, we evaluated CPU- and GPU-based rendering performance. We encourage production visualization experts to consider

  10. Health-promoting compounds of broccoli (Brassica oleracea L. var. italica) plants as affected by nitrogen fertilisation in projected future climatic change environments.

    Science.gov (United States)

    Zaghdoud, Chokri; Carvajal, Micaela; Moreno, Diego A; Ferchichi, Ali; Del Carmen Martínez-Ballesta, María

    2016-01-30

    The complex interactions between CO2 increase and salinity were investigated in relation to decreased N supply, in order to determine the nutritional quality of broccoli (Brassica oleracea L. var. italica) plants under these conditions. Three different decreased N fertilisation regimes (NO3(-)/NH4(+) ratios of 100:0, 50:50 and 0:100, respectively) were combined with ambient (380 ppm) and elevated (800 ppm) [CO2] under non-saline (0 mmol L(-1) NaCl) and saline (80 mmol L(-1) NaCl) conditions. Nutrients (minerals, soluble protein and total amino acids) and natural antioxidants (glucosinolates, phenolic acids, flavonoids and vitamin C) were determined. In NH4(+)-fed broccoli plants, a marked growth reduction was shown and a redistribution of amino acids to cope with NH4(+) toxicity resulted in higher levels of indolic glucosinolate and total phenolic compounds. However, the positive effect of the higher [CO2] in ameliorating the adverse effects of salinity was only observed when N was supplied as NO3(-). Under reduced N fertilisation, the total glucosinolates were increased by a decreased NO3(-)/NH4(+) ratio and elevated [CO2] but were unaffected by salinity. Under future climatic challenges, such as increased salinity and elevated [CO2], a clear genotypic dependence of S metabolism was observed in broccoli plants. In addition, an influence of the form in which N was supplied on plant nutritional quality was observed; a combined NO3(-)/NH4(+) (50:50) supply allowed broccoli plants not only to deal with NH4(+) toxicity but also to modify their glucosinolate content and profile. Thus, for different modes of N fertilisation, the interaction with climatic factors must be considered in the search for an optimal balance between yield and nutritional quality. © 2015 Society of Chemical Industry.

  11. Finger Millet: A "Certain" Crop for an "Uncertain" Future and a Solution to Food Insecurity and Hidden Hunger under Stressful Environments.

    Science.gov (United States)

    Gupta, Sanjay Mohan; Arora, Sandeep; Mirza, Neelofar; Pande, Anjali; Lata, Charu; Puranik, Swati; Kumar, J; Kumar, Anil

    2017-01-01

    Crop growth and productivity have largely been vulnerable to various abiotic and biotic stresses that are only set to be compounded by global climate change. Therefore, developing improved varieties and designing newer approaches for crop improvement for stress tolerance have become a priority nowadays. However, most crop improvement strategies are directed toward staple cereals such as rice, wheat and maize, whereas attention to minor cereals such as finger millet [Eleusine coracana (L.) Gaertn.] lags far behind. It is an important staple in several semi-arid and tropical regions of the world with excellent nutraceutical properties, ensuring food security in these areas even in harsh environments. This review highlights the importance of finger millet as a model nutraceutical crop. Progress and prospects in genetic manipulation for the development of abiotic and biotic stress tolerant varieties are also discussed. Although limited studies have been conducted on the genetic improvement of finger millet, its nutritional significance in providing minerals, calories and protein makes it an ideal model for nutrition-agriculture research. Therefore, improved genetic manipulation of finger millet for resistance to both abiotic and biotic stresses, as well as for enhanced nutrient content, will be very effective in millet improvement. Key message: Apart from the excellent nutraceutical value of finger millet, its ability to tolerate various abiotic stresses and resist pathogens makes it an excellent model for exploring the vast genetic and genomic potential of this crop, which provides a wide choice of strategies for developing climate-resilient staple crops.

  12. Finger Millet: A “Certain” Crop for an “Uncertain” Future and a Solution to Food Insecurity and Hidden Hunger under Stressful Environments

    Directory of Open Access Journals (Sweden)

    Anil Kumar

    2017-04-01

    Full Text Available Crop growth and productivity have largely been vulnerable to various abiotic and biotic stresses that are only set to be compounded by global climate change. Therefore, developing improved varieties and designing newer approaches for crop improvement for stress tolerance have become a priority nowadays. However, most crop improvement strategies are directed toward staple cereals such as rice, wheat and maize, whereas attention to minor cereals such as finger millet [Eleusine coracana (L.) Gaertn.] lags far behind. It is an important staple in several semi-arid and tropical regions of the world with excellent nutraceutical properties, ensuring food security in these areas even in harsh environments. This review highlights the importance of finger millet as a model nutraceutical crop. Progress and prospects in genetic manipulation for the development of abiotic and biotic stress tolerant varieties are also discussed. Although limited studies have been conducted on the genetic improvement of finger millet, its nutritional significance in providing minerals, calories and protein makes it an ideal model for nutrition-agriculture research. Therefore, improved genetic manipulation of finger millet for resistance to both abiotic and biotic stresses, as well as for enhanced nutrient content, will be very effective in millet improvement. Key message: Apart from the excellent nutraceutical value of finger millet, its ability to tolerate various abiotic stresses and resist pathogens makes it an excellent model for exploring the vast genetic and genomic potential of this crop, which provides a wide choice of strategies for developing climate-resilient staple crops.

  13. Associative Memories for Supercomputers

    Science.gov (United States)

    1992-12-01

    The abstract text for this record is garbled OCR of a French-language source on holographic optical-disc storage for associative memories. The recoverable fragments cite 'Application des hologrammes synthétiques au stockage sur disque optique' (Application of synthetic holograms to optical-disc storage), P. Marchand et al., The Hague, The Netherlands, March 1991, and discuss eliminating all mechanical movement of the read head above the disc surface, with data stored on the disc as holograms of varying size.

  14. Super-computer architecture

    CERN Document Server

    Hockney, R W

    1977-01-01

    This paper examines the design of the top-of-the-range, scientific, number-crunching computers. The market for such computers is not as large as that for smaller machines, but on the other hand it is by no means negligible. The present work-horse machines in this category are the CDC 7600 and IBM 360/195, and over fifty of the former machines have been sold. The types of installation that form the market for such machines are not only the major scientific research laboratories in the major countries-such as Los Alamos, CERN, Rutherford laboratory-but also major universities or university networks. It is also true that, as with sports cars, innovations made to satisfy the top of the market today often become the standard for the medium-scale computer of tomorrow. Hence there is considerable interest in examining present developments in this area. (0 refs).

  15. Supercomputer debugging workshop '92

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.S.

    1993-02-01

    This report contains papers or viewgraphs on the following topics: The ABCs of Debugging in the 1990s; Cray Computer Corporation; Thinking Machines Corporation; Cray Research, Incorporated; Sun Microsystems, Inc; Kendall Square Research; The Effects of Register Allocation and Instruction Scheduling on Symbolic Debugging; Debugging Optimized Code: Currency Determination with Data Flow; A Debugging Tool for Parallel and Distributed Programs; Analyzing Traces of Parallel Programs Containing Semaphore Synchronization; Compile-time Support for Efficient Data Race Detection in Shared-Memory Parallel Programs; Direct Manipulation Techniques for Parallel Debuggers; Transparent Observation of XENOOPS Objects; A Parallel Software Monitor for Debugging and Performance Tools on Distributed Memory Multicomputers; Profiling Performance of Inter-Processor Communications in an iWarp Torus; The Application of Code Instrumentation Technology in the Los Alamos Debugger; and CXdb: The Road to Remote Debugging.

  16. The GF11 supercomputer

    International Nuclear Information System (INIS)

    Beetem, J.; Weingarten, D.

    1986-01-01

    GF11 is a parallel computer currently under construction at the IBM Yorktown Research Center. The machine incorporates 576 floating-point processors arranged in a modified SIMD architecture. Each has space for 2 Mbytes of memory and is capable of 20 Mflops, giving the total machine 1.125 Gbytes of memory and a peak of 11.52 Gflops. The floating-point processors are interconnected by a dynamically reconfigurable non-blocking switching network. At each machine cycle any of 1024 pre-selected permutations of data can be realized among the processors. The main intended application of GF11 is a class of calculations arising from quantum chromodynamics
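
    As a quick consistency check on the aggregate figures quoted above, the per-processor numbers do reproduce the machine totals:

    $$576 \times 20\ \text{Mflops} = 11\,520\ \text{Mflops} = 11.52\ \text{Gflops}, \qquad 576 \times 2\ \text{Mbytes} = 1152\ \text{Mbytes} = 1.125\ \text{Gbytes}\ (\text{at } 1024\ \text{Mbytes per Gbyte}).$$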

  17. The GF11 supercomputer

    International Nuclear Information System (INIS)

    Beetem, J.; Denneau, M.; Weingarten, D.

    1985-01-01

    GF11 is a parallel computer currently under construction at the IBM Yorktown Research Center. The machine incorporates 576 floating-point processors arranged in a modified SIMD architecture. Each has space for 2 Mbytes of memory and is capable of 20 Mflops, giving the total machine 1.125 Gbytes of memory and a peak of 11.52 Gflops. The floating-point processors are interconnected by a dynamically reconfigurable nonblocking switching network. At each machine cycle any of 1024 pre-selected permutations of data can be realized among the processors. The main intended application of GF11 is a class of calculations arising from quantum chromodynamics

  18. Unraveling the patterns of late Holocene debris-flow activity on a cone in the Swiss Alps: Chronology, environment and implications for the future

    Science.gov (United States)

    Stoffel, Markus; Conus, Delphine; Grichting, Michael A.; Lièvre, Igor; Maître, Gilles

    2008-02-01

    , which not only shifted from June and July to August and September over the 20th century, but also seemed to be initiated primarily by persistent precipitation rather than summer thunderstorms. From the reconstructions, based on RCM simulations, there are indications that debris-flow frequencies might continue to decrease in the future, as precipitation events are projected to occur less frequently in summer but become more common in spring or autumn.

  19. Superconductivity and the environment: a Roadmap

    Science.gov (United States)

    Nishijima, Shigehiro; Eckroad, Steven; Marian, Adela; Choi, Kyeongdal; Kim, Woo Seok; Terai, Motoaki; Deng, Zigang; Zheng, Jun; Wang, Jiasu; Umemoto, Katsuya; Du, Jia; Febvre, Pascal; Keenan, Shane; Mukhanov, Oleg; Cooley, Lance D.; Foley, Cathy P.; Hassenzahl, William V.; Izumi, Mitsuru

    2013-11-01

    disasters will be helped by future supercomputer technologies that support huge amounts of data and sophisticated modeling, and with the aid of superconductivity these systems might not require the energy of a large city. We present different sections on applications that could address (or are addressing) a range of environmental issues. The Roadmap covers water purification, power distribution and storage, low-environmental impact transport, environmental sensing (particularly for the removal of unexploded munitions), monitoring the Earth’s magnetic fields for earthquakes and major solar activity, and, finally, developing a petaflop supercomputer that only requires 3% of the current supercomputer power provision while being 50 times faster. Access to fresh water. With only 2.5% of the water on Earth being fresh and climate change modeling forecasting that many areas will become drier, the ability to recycle water and achieve compact water recycling systems for sewage or ground water treatment is critical. The first section (by Nishijima) points to the potential of superconducting magnetic separation to enable water recycling and reuse. Energy. The Equinox Summit held in Waterloo Canada 2011 (2011 Equinox Summit: Energy 2030 http://wgsi.org/publications-resources) identified electricity use as humanity’s largest contributor to greenhouse gas emissions. Our appetite for electricity is growing faster than for any other form of energy. The communiqué from the summit said ‘Transforming the ways we generate, distribute and store electricity is among the most pressing challenges facing society today…. If we want to stabilize CO2 levels in our atmosphere at 550 parts per million, all of that growth needs to be met by non-carbon forms of energy’ (2011 Equinox Summit: Energy 2030 http://wgsi.org/publications-resources). Superconducting technologies can provide the energy efficiencies to achieve, in the European Union alone, 33-65% of the required reduction in greenhouse

  20. Superconductivity and the environment: a Roadmap

    International Nuclear Information System (INIS)

    Nishijima, Shigehiro; Eckroad, Steven; Marian, Adela; Choi, Kyeongdal; Kim, Woo Seok; Terai, Motoaki; Deng, Zigang; Zheng, Jun; Wang, Jiasu; Umemoto, Katsuya; Du, Jia; Keenan, Shane; Foley, Cathy P; Febvre, Pascal; Mukhanov, Oleg; Cooley, Lance D; Hassenzahl, William V; Izumi, Mitsuru

    2013-01-01

    disasters will be helped by future supercomputer technologies that support huge amounts of data and sophisticated modeling, and with the aid of superconductivity these systems might not require the energy of a large city. We present different sections on applications that could address (or are addressing) a range of environmental issues. The Roadmap covers water purification, power distribution and storage, low-environmental impact transport, environmental sensing (particularly for the removal of unexploded munitions), monitoring the Earth’s magnetic fields for earthquakes and major solar activity, and, finally, developing a petaflop supercomputer that only requires 3% of the current supercomputer power provision while being 50 times faster. Access to fresh water. With only 2.5% of the water on Earth being fresh and climate change modeling forecasting that many areas will become drier, the ability to recycle water and achieve compact water recycling systems for sewage or ground water treatment is critical. The first section (by Nishijima) points to the potential of superconducting magnetic separation to enable water recycling and reuse. Energy. The Equinox Summit held in Waterloo Canada 2011 (2011 Equinox Summit: Energy 2030 http://wgsi.org/publications-resources) identified electricity use as humanity’s largest contributor to greenhouse gas emissions. Our appetite for electricity is growing faster than for any other form of energy. The communiqué from the summit said ‘Transforming the ways we generate, distribute and store electricity is among the most pressing challenges facing society today…. If we want to stabilize CO 2 levels in our atmosphere at 550 parts per million, all of that growth needs to be met by non-carbon forms of energy’ (2011 Equinox Summit: Energy 2030 http://wgsi.org/publications-resources). Superconducting technologies can provide the energy efficiencies to achieve, in the European Union alone, 33–65% of the required reduction in

  1. Performance Characteristics of Hybrid MPI/OpenMP Scientific Applications on a Large-Scale Multithreaded BlueGene/Q Supercomputer

    KAUST Repository

    Wu, Xingfu

    2013-07-01

    In this paper, we investigate the performance characteristics of five hybrid MPI/OpenMP scientific applications (two NAS Parallel Benchmarks Multi-Zone codes, SP-MZ and BT-MZ; an earthquake simulation, PEQdyna; an aerospace application, PMLB; and a 3D particle-in-cell application, GTC) on a large-scale multithreaded Blue Gene/Q supercomputer at Argonne National Laboratory, and quantify the performance gap resulting from using different numbers of threads per node. We use performance tools and MPI profile and trace libraries available on the supercomputer to analyze and compare the performance of these hybrid scientific applications as the number of OpenMP threads per node increases, and find that increasing the number of threads beyond a certain point saturates or worsens the performance of these hybrid applications. For the strong-scaling hybrid scientific applications SP-MZ, BT-MZ, PEQdyna and PMLB, using 32 threads per node results in much better application efficiency than using 64 threads per node; as the number of threads per node increases, the FPU (floating point unit) percentage decreases, while the MPI percentage (except for PMLB) and the IPC (instructions per cycle) per core (except for BT-MZ) increase. For the weak-scaling hybrid scientific application GTC, the performance trend (relative speedup) is very similar with increasing numbers of threads per node, no matter how many nodes (32, 128, 512) are used. © 2013 IEEE.
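
    To make the hybrid MPI/OpenMP model studied above concrete, the following minimal sketch shows the usual pattern of one MPI rank per node spawning a configurable number of OpenMP threads; the thread count per rank is exactly the parameter whose effect the paper measures. This is a generic illustration, not code from the benchmarked applications.

    /* Minimal hybrid MPI/OpenMP sketch: each rank computes a partial sum
       with its OpenMP threads, then the ranks combine results with MPI.
       Build (typical):  mpicc -fopenmp hybrid.c -o hybrid
       Run  (example):   OMP_NUM_THREADS=32 mpirun -np 4 ./hybrid        */
    #include <mpi.h>
    #include <omp.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int provided;
        /* FUNNELED is enough when only the master thread of each rank
           makes MPI calls. */
        MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);

        int rank, size;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        const long n = 1000000;                 /* elements per rank */
        double local = 0.0;

        /* Thread-level parallelism inside the rank. */
        #pragma omp parallel for reduction(+:local)
        for (long i = 0; i < n; i++)
            local += (double)(rank * n + i);

        double global = 0.0;
        MPI_Reduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

        if (rank == 0)
            printf("ranks=%d threads/rank=%d sum=%.0f\n",
                   size, omp_get_max_threads(), global);

        MPI_Finalize();
        return 0;
    }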

  2. Cities and environment. Indicators of environmental performance in the 'Cities of the future'; Byer og miljoe : indikatorer for miljoeutviklingen i 'Framtidens byer'

    Energy Technology Data Exchange (ETDEWEB)

    Haagensen, Trine

    2012-07-15

    This report contains selected indicators and statistics that describe the urban environmental status and development in 13 of the largest municipalities in Norway. These cities are part of the program 'Cities of the Future', agreed upon between the 13 cities, the private sector and the state, and led by the Ministry of the Environment. The Cities of the Future had about 1.7 million inhabitants (as of 1 January 2010), equivalent to about one third of the population of Norway. In 2009 the population growth in these municipalities accounted for about 49 per cent of the country's total population growth. Some of the greatest challenges in combining urban development with environmental considerations are therefore found here. White paper no. 26 (2006-2007), The government's environmental policy and the state of the environment in Norway, has also underlined the importance of the urban environment with a comprehensive description of land use and transport policy. Good land use management is covered by indicators related to the density of land use and construction activity within urban settlements. Within urban settlements, the area per inhabitant decreased both in the Cities of the Future and in all municipalities in Norway (2000-2009). Coalescing within the urban settlements decreased per inhabitant (2004-2009), which means that new buildings were built outside already established urban settlements in this period. Too high a density of built-up areas may come at the expense of access to playgrounds, recreational areas or touring grounds; indicators of the population's access to these areas show a reduction in access in the Cities of the Future as well as in the municipalities of Norway. Within transport, the focus is on the degree to which inhabitants choose environmentally-friendly transportation instead of cars. Only Oslo has more than 50 per cent of daily travel by environmentally-friendly transportation. Among the Cities of the Future, the use of

  3. Computational fluid dynamics: complex flows requiring supercomputers. January 1975-July 1988 (Citations from the INSPEC: Information Services for the Physics and Engineering Communities data base). Report for January 1975-July 1988

    International Nuclear Information System (INIS)

    1988-08-01

    This bibliography contains citations concerning computational fluid dynamics (CFD), a new method in computational science to perform complex flow simulations in three dimensions. Applications include aerodynamic design and analysis for aircraft, rockets, and missiles, and automobiles; heat-transfer studies; and combustion processes. Included are references to supercomputers, array processors, and parallel processors where needed for complete, integrated design. Also included are software packages and grid-generation techniques required to apply CFD numerical solutions. Numerical methods for fluid dynamics, not requiring supercomputers, are found in a separate published search. (Contains 83 citations fully indexed and including a title list.)

  4. Computational fluid dynamics: Complex flows requiring supercomputers. January 1975-December 1989 (Citations from the INSPEC: Information Services for the Physics and Engineering Communities data base). Report for January 1975-December 1989

    International Nuclear Information System (INIS)

    1990-01-01

    This bibliography contains citations concerning computational fluid dynamics (CFD), a new method in computational science to perform complex flow simulations in three dimensions. Applications include aerodynamic design and analysis for aircraft, rockets and missiles, and automobiles; heat-transfer studies; and combustion processes. Included are references to supercomputers, array processors, and parallel processors where needed for complete, integrated design. Also included are software packages and grid-generation techniques required to apply CFD numerical solutions. Numerical methods for fluid dynamics, not requiring supercomputers, are found in a separate Published Search. (This updated bibliography contains 132 citations, 49 of which are new entries to the previous edition.)

  5. Global Learning and Observation to Benefit the Environment (GLOBE) Mission EARTH (GME) program delivers climate change science content, pedagogy, and data resources to K12 educators, future teachers, and professional development providers.

    Science.gov (United States)

    Ostrom, T.

    2017-12-01

    This presentation will include a series of visuals that discuss how hands-on learning activities and field investigations from the Global Learning and Observation to Benefit the Environment (GLOBE) Mission EARTH (GME) program deliver climate change science content, pedagogy, and data resources to K12 educators, future teachers, and professional development providers. The GME program poster presentation will also show how teachers strengthen student preparation for Science, Technology, Engineering, Art and Mathematics (STEAM)-related careers while promoting diversity in the future STEM workforce. In addition to engaging students in scientific inquiry, the GME program poster will show how career exploration and preparation experiences are accomplished through direct connection to scientists and real science practices. The poster will show the hands-on learning activities being implemented in more than 30,000 schools worldwide, with over a million students, teachers, and scientists collecting environmental measurements using the GLOBE scientific protocols. This poster will also include how Next Generation Science Standards connect to GME learning progressions by grade strands. The poster will present the first year of results from the implementation of the GME program. Data are currently being aggregated by the east, midwest and western regional operations.

  6. Future food.

    Science.gov (United States)

    Wahlqvist, Mark L

    2016-12-01

    Food systems have changed markedly with human settlement and agriculture, industrialisation, trade, migration and now the digital age. Throughout these transitions, there has been a progressive population explosion and net ecosystem loss and degradation. Climate change now gathers pace, exacerbated by ecological dysfunction. Our health status has been challenged by a developing people-environment mismatch. We have regarded ecological conquest and innovative technology as solutions, but have not understood how ecologically dependent and integrated we are. We are ecological creatures interfaced by our sensoriness, microbiomes, shared regulatory (endocrine) mechanisms, immune system, biorhythms and nutritional pathways. Many of us are 'nature-deprived'. We now suffer what might be termed ecological health disorders (EHD). If there were less of us, nature's resilience might cope, but more than 9 billion people by 2050 is probably an intolerable demand on the planet. Future food must increasingly take into account the pressures on ecosystem-dependent food systems, with foods probably less biodiverse, although eating in this way allows optimal health; energy dysequilibrium with less physical activity and foods inappropriately energy dense; and less socially-conducive food habits. 'Personalised Nutrition', with extensive and resource-demanding nutrigenomic, metabolomic and microbiomic data may provide partial health solutions in clinical settings, but not be justified for ethical, risk management or sustainability reasons in public health. The globally prevalent multidimensional malnutritional problems of food insecurity, quality and equity require local, regional and global action to prevent further ecosystem degradation as well as to educate, provide sustainable livelihoods and encourage respectful social discourse and practice about the role of food.

  7. Future Contingents

    DEFF Research Database (Denmark)

    Øhrstrøm, Peter; Hasle., Per F. V.

    2011-01-01

    Future contingents are contingent statements about the future — such as future events, actions, states etc. To qualify as contingent the predicted event, state, action or whatever is at stake must neither be impossible nor inevitable. Statements such as “My mother shall go to London” or “There...... will be a sea-battle tomorrow” could serve as standard examples. What could be called the problem of future contingents concerns how to ascribe truth-values to such statements. If there are several possible decisions out of which one is going to be made freely tomorrow, can there be a truth now about which one......, ‘future contingents’ could also refer to future contingent objects. A statement like “The first astronaut to go to Mars will have a unique experience” could be analyzed as referring to an object not yet existing, supposing that one day in the distant future some person will indeed travel to Mars...

  8. Future Contingents

    DEFF Research Database (Denmark)

    Øhrstrøm, Peter; Hasle., Per F. V.

    2015-01-01

    Future contingents are contingent statements about the future — such as future events, actions, states etc. To qualify as contingent the predicted event, state, action or whatever is at stake must neither be impossible nor inevitable. Statements such as “My mother shall go to London” or “There...... will be a sea-battle tomorrow” could serve as standard examples. What could be called the problem of future contingents concerns how to ascribe truth-values to such statements. If there are several possible decisions out of which one is going to be made freely tomorrow, can there be a truth now about which one......, ‘future contingents’ could also refer to future contingent objects. A statement like “The first astronaut to go to Mars will have a unique experience” could be analyzed as referring to an object not yet existing, supposing that one day in the distant future some person will indeed travel to Mars...

  9. Discounting Future Green: Money versus the Environment

    Science.gov (United States)

    Hardisty, David J.; Weber, Elke U.

    2009-01-01

    In 3 studies, participants made choices between hypothetical financial, environmental, and health gains and losses that took effect either immediately or with a delay of 1 or 10 years. In all 3 domains, choices indicated that gains were discounted more than losses. There were no significant differences in the discounting of monetary and…
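
    For readers unfamiliar with temporal discounting, the standard exponential formulation below illustrates what 'discounted more' means; it is the textbook model, not necessarily the exact model fitted in the cited studies. A delayed outcome of magnitude $A$ received after delay $t$ has present value

    $$V = A\,e^{-kt},$$

    so the finding that gains are discounted more steeply than losses corresponds to a larger discount rate $k$ for gains than for losses.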

  10. Clinical experimentation with aerosol antibiotics: current and future methods of administration

    Directory of Open Access Journals (Sweden)

    Zarogoulidis P

    2013-10-01

    Full Text Available Paul Zarogoulidis,1,2 Ioannis Kioumis,1 Konstantinos Porpodis,1 Dionysios Spyratos,1 Kosmas Tsakiridis,3 Haidong Huang,4 Qiang Li,4 J Francis Turner,5 Robert Browning,6 Wolfgang Hohenforst-Schmidt,7 Konstantinos Zarogoulidis1 1Pulmonary Department, G Papanikolaou General Hospital, Aristotle University of Thessaloniki, Thessaloniki, Greece; 2Department of Interventional Pneumology, Ruhrlandklinik, West German Lung Center, University Hospital, University Duisburg-Essen, Essen, Germany; 3Cardiothoracic Surgery Department, Saint Luke Private Hospital of Health Excellence, Thessaloniki, Greece; 4Department of Respiratory Diseases, Shanghai Hospital/First Affiliated Hospital of the Second Military Medical University, Shanghai, People’s Republic of China; 5Pulmonary Medicine, University of Nevada School of Medicine, National Supercomputing Center for Energy and the Environment University of Nevada, Las Vegas, NV, USA; 6Pulmonary and Critical Care Medicine, Interventional Pulmonology, National Naval Medical Center, Walter Reed Army Medical Center, Bethesda, MD, USA; 7II Medical Department, Regional Clinic of Coburg, University of Wuerzburg, Coburg, Germany Abstract: Currently almost all antibiotics are administered by the intravenous route. Since several systems and situations require more efficient methods of administration, investigation and experimentation in drug design has produced local treatment modalities. Administration of antibiotics in aerosol form is one of the treatment methods of increasing interest. As the field of drug nanotechnology grows, new molecules have been produced and combined with aerosol production systems. In the current review, we discuss the efficiency of aerosol antibiotic studies along with aerosol production systems. The different parts of the aerosol antibiotic methodology are presented. Additionally, information regarding the drug molecules used is presented and future applications of this method are discussed

  11. Summaries of research and development activities by using supercomputer system of JAEA in FY2012. April 1, 2012 - March 31, 2013

    International Nuclear Information System (INIS)

    2014-01-01

    Japan Atomic Energy Agency (JAEA) conducts research and development (R and D) in various fields related to nuclear power as a comprehensive institution of nuclear energy R and Ds, and utilizes computational science and technology in many activities. As more than 20 percent of papers published by JAEA are concerned with R and D using computational science, the supercomputer system of JAEA has become an important infrastructure to support computational science and technology utilization. In FY2012, the system was used not only for JAEA's major projects such as Fast Reactor Cycle System, Fusion R and D and Quantum Beam Science, but also for R and D aiming to restore Fukushima (nuclear plant decommissioning and environmental restoration) as a priority issue. This report presents a great amount of R and D results accomplished by using the system in FY2012, as well as user support, operational records and overviews of the system, and so on. (author)

  12. Summaries of research and development activities by using supercomputer system of JAEA in FY2015. April 1, 2015 - March 31, 2016

    International Nuclear Information System (INIS)

    2017-01-01

    Japan Atomic Energy Agency (JAEA) conducts research and development (R and D) in various fields related to nuclear power as a comprehensive institution of nuclear energy R and Ds, and utilizes computational science and technology in many activities. As shown in the fact that about 20 percent of papers published by JAEA are concerned with R and D using computational science, the supercomputer system of JAEA has become an important infrastructure to support computational science and technology. In FY2015, the system was used for R and D aiming to restore Fukushima (nuclear plant decommissioning and environmental restoration) as a priority issue, as well as for JAEA's major projects such as Fast Reactor Cycle System, Fusion R and D and Quantum Beam Science. This report presents a great number of R and D results accomplished by using the system in FY2015, as well as user support, operational records and overviews of the system, and so on. (author)

  13. Summaries of research and development activities by using supercomputer system of JAEA in FY2014. April 1, 2014 - March 31, 2015

    International Nuclear Information System (INIS)

    2016-02-01

    Japan Atomic Energy Agency (JAEA) conducts research and development (R and D) in various fields related to nuclear power as a comprehensive institution of nuclear energy R and Ds, and utilizes computational science and technology in many activities. As shown in the fact that about 20 percent of papers published by JAEA are concerned with R and D using computational science, the supercomputer system of JAEA has become an important infrastructure to support computational science and technology. In FY2014, the system was used for R and D aiming to restore Fukushima (nuclear plant decommissioning and environmental restoration) as a priority issue, as well as for JAEA's major projects such as Fast Reactor Cycle System, Fusion R and D and Quantum Beam Science. This report presents a great number of R and D results accomplished by using the system in FY2014, as well as user support, operational records and overviews of the system, and so on. (author)

  14. Summaries of research and development activities by using supercomputer system of JAEA in FY2013. April 1, 2013 - March 31, 2014

    International Nuclear Information System (INIS)

    2015-02-01

    Japan Atomic Energy Agency (JAEA) conducts research and development (R and D) in various fields related to nuclear power as a comprehensive institution of nuclear energy R and Ds, and utilizes computational science and technology in many activities. About 20 percent of papers published by JAEA are concerned with R and D using computational science, the supercomputer system of JAEA has become an important infrastructure to support computational science and technology utilization. In FY2013, the system was used not only for JAEA's major projects such as Fast Reactor Cycle System, Fusion R and D and Quantum Beam Science, but also for R and D aiming to restore Fukushima (nuclear plant decommissioning and environmental restoration) as a priority issue. This report presents a great amount of R and D results accomplished by using the system in FY2013, as well as user support, operational records and overviews of the system, and so on. (author)

  15. Future Textiles

    DEFF Research Database (Denmark)

    Hansen, Anne-Louise Degn; Jensen, Hanne Troels Fusvad; Hansen, Martin

    2011-01-01

    The magazine Future Textiles gathers the results of the Future Textiles project, which promotes the field of intelligent textiles. In the magazine one can read about trends, driving forces and challenges, and get ideas for new products within intelligent textiles. Areas such as sustainability and customisation...

  16. Heuristic Scheduling in Grid Environments: Reducing the Operational Energy Demand

    Science.gov (United States)

    Bodenstein, Christian

    In a world where more and more businesses seem to trade in an online market, the supply of online services could quickly reach its capacity limits in the face of ever-growing demand. Online service providers may find themselves maxed out at peak operation levels during high-traffic timeslots, yet face too little demand during low-traffic timeslots, although the latter is becoming less frequent. At this point, deciding which user is allocated what level of service becomes essential. The concept of Grid computing could offer a meaningful alternative to conventional supercomputing centres. Not only can Grids reach the same computing speeds as some of the fastest supercomputers, but distributed computing also harbors a great energy-saving potential. When scheduling jobs in such a Grid environment, however, computing an optimal assignment of processes to systems becomes so complex that schedules are often ready too late to execute, rendering their optimizations useless. Current schedulers attempt to maximize utility given some sort of constraint, often reverting to heuristics. This optimization often comes at the cost of environmental impact, in this case CO2 emissions. This work proposes an alternative model of energy-efficient scheduling while keeping a respectable amount of economic incentives untouched. Using this model, it is possible to reduce the total energy consumed by a Grid environment using 'just-in-time' flowtime management, paired with ranking nodes by efficiency.
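
    A minimal sketch of the kind of heuristic the record describes: nodes are ranked once by an energy-efficiency score (here, flops per watt) and incoming jobs are assigned greedily to the most efficient node that still has spare capacity, deferring the rest to a later timeslot. The data layout, node names and scoring are illustrative assumptions, not the model proposed in the cited work.

    /* Greedy energy-aware scheduling sketch (illustrative only). */
    #include <stdio.h>
    #include <stdlib.h>

    typedef struct {
        const char *name;
        double flops;       /* peak compute rate                 */
        double watts;       /* power draw at load                */
        double free_share;  /* fraction of node still free, 0..1 */
    } Node;

    /* Sort helper: descending flops-per-watt. */
    static int by_efficiency_desc(const void *a, const void *b) {
        double ea = ((const Node *)a)->flops / ((const Node *)a)->watts;
        double eb = ((const Node *)b)->flops / ((const Node *)b)->watts;
        return (ea < eb) - (ea > eb);
    }

    int main(void) {
        Node nodes[] = {
            { "gpu-island",  5e12, 900.0, 1.0 },
            { "old-cluster", 1e12, 700.0, 1.0 },
            { "low-power",   6e11, 150.0, 1.0 },
        };
        const int n_nodes = sizeof nodes / sizeof nodes[0];
        double job_demand[] = { 0.4, 0.5, 0.3, 0.6 };  /* node fraction per job */
        const int n_jobs = sizeof job_demand / sizeof job_demand[0];

        /* Rank nodes once by energy efficiency. */
        qsort(nodes, n_nodes, sizeof(Node), by_efficiency_desc);

        /* Greedy placement: most efficient node with enough room wins. */
        for (int j = 0; j < n_jobs; j++) {
            int placed = 0;
            for (int i = 0; i < n_nodes && !placed; i++) {
                if (nodes[i].free_share >= job_demand[j]) {
                    nodes[i].free_share -= job_demand[j];
                    printf("job %d -> %s\n", j, nodes[i].name);
                    placed = 1;
                }
            }
            if (!placed)
                printf("job %d deferred to a later timeslot\n", j);
        }
        return 0;
    }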

  17. Visualization system for grid environment in the nuclear field

    International Nuclear Information System (INIS)

    Suzuki, Yoshio; Matsumoto, Nobuko; Idomura, Yasuhiro; Tani, Masayuki

    2006-01-01

    An innovative scientific visualization system is needed to visualize, in an integrated manner, the large amounts of data generated at distributed remote locations as a result of large-scale numerical simulations in a grid environment. One of the important functions of such a visualization system is parallel visualization, which makes it possible to visualize data using multiple CPUs of a supercomputer. The other is distributed visualization, which makes it possible to execute visualization processes using a local client computer and remote computers. We have developed a toolkit including these functions in cooperation with the commercial visualization software AVS/Express, called the Parallel Support Toolkit (PST). PST can execute visualization processes with three kinds of parallelism (data parallelism, task parallelism and pipeline parallelism) using local and remote computers. We have evaluated PST on a large amount of data generated by a nuclear fusion simulation. Here, two supercomputers installed at JAEA, an Altix3700Bx2 and a Prism, were used. From the evaluation, it can be seen that PST has the potential to efficiently visualize large amounts of data in a grid environment. (author)

  18. Full-Particle Simulations on Electrostatic Plasma Environment near Lunar Vertical Holes

    Science.gov (United States)

    Miyake, Y.; Nishino, M. N.

    2015-12-01

    The Kaguya satellite and the Lunar Reconnaissance Orbiter have observed a number of vertical holes on the terrestrial Moon [Haruyama et al., GRL, 2009; Robinson et al., PSS, 2012], which have spatial scales of tens of meters and are possible lava tube skylights. The hole structure has recently received particular attention, because it gives an important clue to the complex volcanic history of the Moon. The holes also have high potential as locations for constructing future lunar bases, because fewer extra-lunar rays/particles and micrometeorites reach the hole bottoms. In this sense, these holes are not only interesting for selenology, but are also significant from the viewpoint of electrostatic environments. The subject can also be an interesting topic of research in comparative planetary science, because hole structures have been found on other solar system bodies such as Mars. The lunar dayside electrostatic environment is governed by electrodynamic interactions among the solar wind plasma, photoelectrons, and the charged lunar surface, providing topologically complex boundaries to the plasma. We use the three-dimensional, massively parallelized, particle-in-cell simulation code EMSES [Miyake and Usui, POP, 2009] to simulate the near-hole plasma environment on the Moon [Miyake and Nishino, Icarus, 2015]. We took into account the solar wind plasma downflow, photoelectron emission from the sunlit part of the lunar surface, and plasma charge deposition on the surface. The simulation domain consists of 400×400×2000 grid points and contains about 25 billion plasma macro-particles. Thus, we need to use supercomputers for the simulations. The vertical wall of the hole introduces a new boundary for both photo and solar wind electrons. The current balance condition established at a hole bottom is altered by the limited solar wind electron penetration into the hole and complex photoelectron current paths inside the hole. The self
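
    As a rough sense of the quoted problem size (arithmetic implied by the numbers above, not an additional figure from the paper), the stated grid and particle counts correspond to roughly 80 macro-particles per grid point,

    $$400 \times 400 \times 2000 = 3.2\times 10^{8}\ \text{grid points}, \qquad \frac{2.5\times 10^{10}\ \text{macro-particles}}{3.2\times 10^{8}} \approx 78\ \text{macro-particles per grid point},$$

    which is consistent with the authors' remark that runs of this size require supercomputers.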

  19. Sustainable Futures

    Science.gov (United States)

    Sustainable Futures is a voluntary program that encourages industry to use predictive models to screen new chemicals early in the development process and offers incentives to companies subject to TSCA section 5.

  20. Environment and energy

    International Nuclear Information System (INIS)

    Jung, Sun Ho; Choi, Seong Bu; Im, Sang Hun; Kim, Gwang Yeol

    1998-04-01

    This book introduces environment and energy technologies, describing their current status and plans to promote their development. It covers solar heat and light, bioenergy, wind power, environmental utilization technologies such as hydrogen energy and solar furnaces, and ocean development such as tidal energy, wave-power energy and new ocean energy technologies, together with development plans for ocean technology, including their direction, purpose, fields and importance, as well as future living-space environments, architecture, technology and natural lighting.

  1. Directors of the Future

    DEFF Research Database (Denmark)

    Buur, Jacob; Arnal, Luis; Have, Claus

    For the panel at EPIC 2008 we invited three prominent ethnographers, from consulting, corporate and academic environments, to stop thinking about the past and present (usually the realm of ethnography) and to "play" with a future vision of ethnography. Ahead of the conference the panelists engaged...... in a storytelling process with actors from the Dacapo Theatre to create a concrete scenario of what the future might hold. Theatre has the capacity to speak directly to personal experiences and emotions. With this panel we wanted to move beyond the slightly distanced, reflective stance that ethnographers may take...

  2. Future-Proofing Nursing Education

    OpenAIRE

    Nicholas Ralph; Melanie Birks; Ysanne Chapman; Karen Francis

    2014-01-01

    The relevance of pre-registration programs of nursing education to current and emerging trends in healthcare and society could have a significant future impact on the nursing profession. In this article, we use a PESTEL (politics, economics, society, technology, environment, and law) framework to identify significant current and future priorities in Australian healthcare. Following the PESTEL analysis, we conduct a rev...

  3. The future of energy

    CERN Document Server

    Towler, Brian F

    2014-01-01

    Using the principle that extracting energy from the environment always involves some type of impact on the environment, The Future of Energy discusses the sources, technologies, and tradeoffs involved in meeting the world's energy needs. A historical, scientific, and technical background sets the stage for discussions on a wide range of energy sources, including conventional fossil fuels like oil, gas, and coal, as well as emerging renewable sources like solar, wind, geothermal, and biofuels. Readers will learn that there are no truly "green" energy sources - all energy usage involves some trad

  4. Future perspectives

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    International involvement in particle physics is what the International Committee for Future Accelerators (ICFA) is all about. At the latest Future Perspectives meeting at Brookhaven from 5-10 October (after a keynote speech by doyen Viktor Weisskopf, who regretted the emergence of 'a nationalistic trend'), ICFA reviewed progress and examined its commitments in the light of the evolving world particle physics scene. Particular aims were to review worldwide accelerator achievements and plans, to survey the work of the four panels, and to discuss ICFA's special role in future cooperation in accelerator construction and use, and in research and development work for both accelerators and for detectors

  5. Future Savvy

    DEFF Research Database (Denmark)

    Gordon, Adam

    There's no shortage of predictions available to organizations looking to anticipate and profit from future events or trends. Apparently helpful forecasts are ubiquitous in everyday communications such as newspapers and business magazines, and in specialized sources such as government and think-tank forecasts, consultant reports, and stock-market guides. These resources are crucial, but they are also of very mixed quality. How can decision-makers know which predictions to take seriously, which to be wary of, and which to throw out entirely? Future Savvy provides analytical filters to judging predictive material of all types, including providing a battery of critical tests to apply to any forecast to assess its validity, and judge how to fit it into everyday management thinking. The book synthesizes information assessment skills and future studies tools into a single template that allows managers to apply

  6. Energy Futures

    DEFF Research Database (Denmark)

    Davies, Sarah Rachael; Selin, Cynthia

    2012-01-01

    foresight and public and stakeholder engagement are used to reflect on, and direct, the impacts of new technology. In this essay we draw on our experience of anticipatory governance, in the shape of the 'NanoFutures' project on energy futures, to present a reflexive analysis of engagement and deliberation. We draw out five tensions of the practice of deliberation on energy technologies. Through tracing the lineages of these dilemmas, we discuss some of the implications of these tensions for the practice of civic engagement and deliberation in a set of questions for this community of practitioner-scholars.

  7. Navy Telemedicine: Current Research and Future Directions

    National Research Council Canada - National Science Library

    Reed, Cheryl

    2002-01-01

    .... This report reviews military and civilian models for evaluating telemedicine systems in order to determine future directions for Navy telemedicine research within the current funding environment...

  8. Bitcoin futures

    DEFF Research Database (Denmark)

    Brøgger, Søren Bundgaard

    2018-01-01

    With the introduction of a futures market, Bitcoin exposure has become available to a broader group of investors who until now have been unable or unwilling to access the underlying market for Bitcoin. The article finds that, on the face of it, the contracts favour speculators at the expense of hedgers and

  9. Iraq's future

    International Nuclear Information System (INIS)

    Henderson, S.

    1998-01-01

    The large oil reserves of Iraq make it an important player in the long-term political energy world. This article briefly reviews the oil industry's development and current status in Iraq and discusses the planned oil and gas field development. Finally there is a political discussion regarding the future of Iraq in terms of religion, race and neighbouring countries. (UK)

  10. Games and Entertainment in Ambient Intelligence Environments

    NARCIS (Netherlands)

    Nijholt, Antinus; Reidsma, Dennis; Poppe, Ronald Walter; Aghajan, H.; López-Cózar Delgado, R.; Augusto, J.C.

    2009-01-01

    In future ambient intelligence (AmI) environments we assume intelligence embedded in the environment and its objects (floors, furniture, mobile robots). These environments support their human inhabitants in their activities and interactions by perceiving them through sensors (proximity sensors,

  11. Workflows in a secure environment

    Energy Technology Data Exchange (ETDEWEB)

    Klasky, Scott A [ORNL; Podhorszki, Norbert [ORNL

    2008-01-01

    Petascale simulations on the largest supercomputers in the US require advanced data management techniques in order to optimize the application scientist's time, and to optimize the time spent on the supercomputers. Researchers in such problems are starting to require workflow automation during their simulations in order to monitor the simulations, and in order to automate much of the complex analysis which must take place on the data that is generated from these simulations. Scientific workflows are being used to monitor simulations running on these supercomputers by applying a series of complex analyses, and finally producing images and movies from the variables produced in the simulation, or from the derived quantities produced by the analysis. The typical scenario is one where the large calculation runs on the supercomputer, and the auxiliary diagnostics/monitors are run on resources which are either on the local area network of the supercomputer, or over the wide area network. The supercomputers at one of the largest centers are highly secure, and the only method to log into the center is interactive authentication using One Time Passwords (OTP) that are generated by a security device and expire in half a minute. Therefore, grid certificates are not a current option on these machines in the Department of Energy at Oak Ridge National Laboratory. In this paper we describe how we have extended the Kepler scientific workflow management system to be able to run operations on these supercomputers, how workflows themselves can be executed as batch jobs, and finally, how external data-transfer operations can be utilized when they need to perform authentication of their own as well.

  12. Toward sustainable energy futures

    Energy Technology Data Exchange (ETDEWEB)

    Pasztor, J. (United Nations Environment Programme, Nairobi (Kenya))

    1990-01-01

    All energy systems have adverse as well as beneficial impacts on the environment. They vary in quality, in quantity, in time and in space. Environmentally sensitive energy management tries to minimize the adverse impacts in an equitable manner between different groups in the most cost-effective ways. Many of the environmental impacts of energy continue to be externalized. Consequently, those energy systems which can externalize their impacts more easily are favoured, while others remain relatively expensive. The lack of full integration of environmental factors into energy policy and planning is the overriding problem to be resolved before a transition towards sustainable energy futures can take place. The most pressing problem in the developing countries relates to the unsustainable and inefficient use of biomass resources, while in the industrialized countries the major energy-environment problems arise out of the continued intensive use of fossil fuel resources. Both of these resource issues have their role to play in climate change. Although there has been considerable improvement in pollution control in a number of situations, most of the adverse impacts will undoubtedly increase in the future. Population growth will lead to increased demand, and there will also be greater use of lower grade fuels. Climate change and the crisis in the biomass resource base in the developing countries are the most critical energy-environment issues to be resolved in the immediate future. In both cases, international cooperation is an essential requirement for successful resolution. 26 refs.

  13. Multi-scale atmospheric environment modelling for urban areas

    Directory of Open Access Journals (Sweden)

    A. A. Baklanov

    2009-04-01

    Modern supercomputers allow realising multi-scale systems for assessment and forecasting of urban meteorology, air pollution and emergency preparedness, and allow considering nesting with obstacle-resolved models. A multi-scale modelling system with downscaling from the regional to the city scale with the Environment – HIgh Resolution Limited Area Model (Enviro-HIRLAM) and further to the micro-scale with the obstacle-resolved Micro-scale Model for Urban Environment (M2UE) is suggested and demonstrated. The M2UE validation results versus the Mock Urban Setting Trial (MUST) experiment indicate satisfactory quality of the model. Necessary conditions for the choice of nested models, building descriptions, areas and resolutions of nested models are analysed. Two-way nesting (up- and down-scaling), in which effects propagate in both directions (from the meso-scale to the micro-scale and from the micro-scale to the meso-scale), is also discussed.
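    As a toy illustration of the one-way (downscaling) part of such nesting, the sketch below interpolates a coarse meso-scale field onto a finer micro-scale grid, which is roughly the step that supplies boundary and initial conditions to the inner model. It is a generic bilinear-interpolation stand-in, not the actual Enviro-HIRLAM to M2UE coupling; the grid spacings and the synthetic temperature field are made up for the example.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def downscale_field(coarse_field, coarse_x, coarse_y, fine_x, fine_y):
    """One-way downscaling sketch: bilinearly interpolate a meso-scale 2D field
    onto a finer micro-scale grid. A toy stand-in for a real nesting step."""
    interp = RegularGridInterpolator((coarse_y, coarse_x), coarse_field)
    fy, fx = np.meshgrid(fine_y, fine_x, indexing="ij")
    points = np.stack([fy.ravel(), fx.ravel()], axis=-1)
    return interp(points).reshape(fy.shape)

# Toy example: a 10 km meso-scale grid downscaled to a 1 km micro-scale grid.
coarse_x = np.arange(0, 100, 10.0)            # km
coarse_y = np.arange(0, 100, 10.0)
coarse_T = 285.0 + 0.05 * coarse_x[None, :]   # synthetic temperature field (K)
coarse_T = np.repeat(coarse_T, len(coarse_y), axis=0)
fine_x = np.arange(0, 90, 1.0)
fine_y = np.arange(0, 90, 1.0)
fine_T = downscale_field(coarse_T, coarse_x, coarse_y, fine_x, fine_y)
```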

  14. Parliamentarians and environment

    International Nuclear Information System (INIS)

    Boy, D.

    2004-01-01

    The data presented in this report come from an inquiry carried out by Sofres between March 5 and April 23, 2003, with a sample of 200 parliamentarians (122 deputies and 78 senators) who explained their attitude with respect to the question of environment. The questionnaire comprises 5 main dimensions dealing with: the relative importance of the environment stake, the attitudes with respect to past, present and future environment policies, the attitude with respect to specific stakes (energy, wastes), the attitude with respect to some problems of conservation of the natural heritage, and the attitude with respect to the participation of the public in some environment-related decisions. (J.S.)

  15. Robot Futures

    DEFF Research Database (Denmark)

    Christoffersen, Anja; Grindsted Nielsen, Sally; Jochum, Elizabeth Ann

    Robots are increasingly used in health care settings, e.g., as homecare assistants and personal companions. One challenge for personal robots in the home is acceptance. We describe an innovative approach to influencing the acceptance of care robots using theatrical performance. Live performance is a useful testbed for developing and evaluating what makes robots expressive; it is also a useful platform for designing robot behaviors and dialogue that result in believable characters. Therefore theatre is a valuable testbed for studying human-robot interaction (HRI). We investigate how audiences perceive social robots interacting with humans in a future care scenario through a scripted performance. We discuss our methods and initial findings, and outline future work.

  16. Drawing Futures

    OpenAIRE

    2016-01-01

    Drawing Futures brings together international designers and artists for speculations in contemporary drawing for art and architecture. Despite numerous developments in technological manufacture and computational design that provide new grounds for designers, the act of drawing still plays a central role as a vehicle for speculation. There is a rich and long history of drawing tied to innovations in technology as well as to revolutions in our philosophical understanding of the world. In re...

  17. Future directions

    International Nuclear Information System (INIS)

    Lutz, R.J. Jr.

    2004-01-01

    Topics presented concerning future developments in risk analysis are: safety goals, the US severe accident policy, code developments, research programs, analyses and operational actions, and linking with deterministic analyses. The principal consideration in risk is defined as protection of both the general population and nearby residents. The principal goals should be consistency with the risk of other man-made activities, cost-benefit considerations once minimum safety levels are achieved, and proportionality to the benefits to be gained.

  18. SMILEI: A collaborative, open-source, multi-purpose PIC code for the next generation of super-computers

    Science.gov (United States)

    Grech, Mickael; Derouillat, J.; Beck, A.; Chiaramello, M.; Grassi, A.; Niel, F.; Perez, F.; Vinci, T.; Fle, M.; Aunai, N.; Dargent, J.; Plotnikov, I.; Bouchard, G.; Savoini, P.; Riconda, C.

    2016-10-01

    Over the last decades, Particle-In-Cell (PIC) codes have been central tools for plasma simulations. Today, new trends in High-Performance Computing (HPC) are emerging, dramatically changing HPC-relevant software design and putting some - if not most - legacy codes far behind the level of performance expected on the new and future massively-parallel supercomputers. SMILEI is a new open-source PIC code co-developed by plasma physicists and HPC specialists, and applied to a wide range of physics studies: from laser-plasma interaction to astrophysical plasmas. It benefits from an innovative parallelization strategy that relies on a super-domain decomposition allowing for enhanced cache use and efficient dynamic load balancing. Beyond these HPC-related developments, SMILEI also benefits from additional physics modules allowing it to deal with binary collisions, field and collisional ionization, and radiation back-reaction. This poster presents the SMILEI project and its HPC capabilities, and illustrates some of the physics problems tackled with SMILEI.
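    For readers unfamiliar with the PIC method, the sketch below shows the kind of kernel such codes spend their time in: gather the field at particle positions, push velocities and positions, and apply boundary conditions. It is a deliberately simple 1D electrostatic toy, not SMILEI's implementation, and it omits charge deposition, the field solve, and the domain decomposition discussed above.

```python
import numpy as np

def push_particles(x, v, E_grid, dx, dt, qm, L):
    """One step of a 1D electrostatic PIC particle push (toy sketch): gather E
    at particle positions with linear (CIC) interpolation, advance velocities
    and positions, and apply periodic wrap-around."""
    idx = np.floor(x / dx).astype(int)
    w = x / dx - idx                                            # linear weights
    E_p = (1.0 - w) * E_grid[idx] + w * E_grid[(idx + 1) % E_grid.size]
    v_new = v + qm * E_p * dt                                   # kick
    x_new = (x + v_new * dt) % L                                # drift, periodic box
    return x_new, v_new

# Tiny usage example: 1000 particles on a 64-cell periodic grid.
rng = np.random.default_rng(0)
L, n_cells = 1.0, 64
dx = L / n_cells
x = rng.uniform(0, L, 1000)
v = np.zeros(1000)
E = 0.01 * np.sin(2 * np.pi * np.arange(n_cells) * dx / L)
x, v = push_particles(x, v, E, dx, dt=1e-3, qm=-1.0, L=L)
```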

  19. Future trends in reprocessing

    International Nuclear Information System (INIS)

    Rouyer, H.

    1994-01-01

    This paper about future trends in reprocessing essentially reflects French experience and points of view, as an example of countries which, like England and Japan, consider that reprocessing is the best solution for the back end of the fuel cycle. In order to know what the future will be, it is necessary to look back at the past and try to find what the main reasons for evolution have been in that period. For reprocessing, it appears that these motivations have been safety and economics, and they will remain the motivations for the future. In addition, new motivations for development are starting to appear which are still imprecise but can be expressed as follows: what guarantees will public opinion require in order to be convinced that the solutions for waste management proposed by specialists will ensure that a healthy environment is preserved for future generations? Consequently the paper examines successively the evolution of reprocessing in the recent past, what the immediate future could be, and finally what will be necessary in the long term. (Author)

  20. Future climate

    International Nuclear Information System (INIS)

    La Croce, A.

    1991-01-01

    According to George Woodwell, founder of the Woods Hole Research Center, due to the combustion of fossil fuels, deforestation and accelerated respiration, the net annual addition of carbon, in the form of carbon dioxide, to the 750 billion tonnes already present in the earth's atmosphere is in the order of 3 to 5 billion tonnes. Around the world, scientists investigating the probable effects of this increase on the earth's future climate are now formulating coupled atmosphere-ocean circulation models which take account of the temperature- and salinity-dependent carbon dioxide exchange mechanisms acting between the atmosphere and the deep layers of ocean waters.

  1. High-resolution RCMs as pioneers for future GCMs

    Science.gov (United States)

    Schar, C.; Ban, N.; Arteaga, A.; Charpilloz, C.; Di Girolamo, S.; Fuhrer, O.; Hoefler, T.; Leutwyler, D.; Lüthi, D.; Piaget, N.; Ruedisuehli, S.; Schlemmer, L.; Schulthess, T. C.; Wernli, H.

    2017-12-01

    Currently large efforts are underway to refine the horizontal resolution of global and regional climate models to O(1 km), with the intent to represent convective clouds explicitly rather than using semi-empirical parameterizations. This refinement will move the governing equations closer to first principles and is expected to reduce the uncertainties of climate models. High resolution is particularly attractive in order to better represent critical cloud feedback processes (e.g. related to global climate sensitivity and extratropical summer convection) and extreme events (such as heavy precipitation events, floods, and hurricanes). The presentation will be illustrated using decade-long simulations at 2 km horizontal grid spacing, some of these covering the European continent on a computational mesh with 1536x1536x60 grid points. To accomplish such simulations, use is made of emerging heterogeneous supercomputing architectures, using a version of the COSMO limited-area weather and climate model that is able to run entirely on GPUs. Results show that kilometer-scale resolution dramatically improves the simulation of precipitation in terms of the diurnal cycle and short-term extremes. The modeling framework is used to address changes of precipitation scaling with climate change. It is argued that already today, modern supercomputers would in principle enable global atmospheric convection-resolving climate simulations, provided appropriately refactored codes were available, and provided solutions were found to cope with the rapidly growing output volume. A discussion will be provided of key challenges affecting the design of future high-resolution climate models. It is suggested that km-scale RCMs should be exploited to pioneer this terrain, at a time when GCMs are not yet available at such resolutions. Areas of interest include the development of new parameterization schemes adequate for km-scale resolution, the exploration of new validation methodologies and data
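    The "rapidly growing output volume" mentioned above can be made concrete with a back-of-envelope estimate for the 1536x1536x60 mesh cited in the abstract. The variable count, output frequency, and precision used below are illustrative assumptions, not figures from the study.

```python
# Back-of-envelope output-volume estimate for a 1536 x 1536 x 60 mesh
# (the European-scale 2 km setup mentioned above). The field count,
# output frequency and decade length are illustrative assumptions.
nx, ny, nz = 1536, 1536, 60
bytes_per_value = 4            # single precision
n_3d_fields = 10               # e.g. wind components, T, humidity, cloud species
snapshots_per_day = 24         # hourly 3D output
days = 3650                    # one decade-long simulation

per_snapshot = nx * ny * nz * n_3d_fields * bytes_per_value
total = per_snapshot * snapshots_per_day * days
print(f"per snapshot: {per_snapshot / 1e9:.1f} GB")   # ~5.7 GB
print(f"per decade:   {total / 1e15:.1f} PB")         # ~0.5 PB
```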

  2. Future Talks,

    Directory of Open Access Journals (Sweden)

    Catherine Defeyt

    2010-11-01

    The conservation of modern materials and the difficulties that characterize it were the subject of the international conference Future Talks, organized by Die Neue Sammlung, The International Design Museum, on 22 and 23 October 2009 in Munich. Specialized conservator-restorers, representatives of the most prestigious museum institutions of Europe and from across the Atlantic, as well as researchers in the applied sciences, presented their work and research there. In the field of design and modern art...

  3. Developing Students' Futures Thinking in Science Education

    Science.gov (United States)

    Jones, Alister; Buntting, Cathy; Hipkins, Rose; McKim, Anne; Conner, Lindsey; Saunders, Kathy

    2012-01-01

    Futures thinking involves a structured exploration into how society and its physical and cultural environment could be shaped in the future. In science education, an exploration of socio-scientific issues offers significant scope for including such futures thinking. Arguments for doing so include increasing student engagement, developing students'…

  4. Future Dead

    DEFF Research Database (Denmark)

    Sabra, Jakob Borrits

    Today the dying and the bereaved attend memorialization both online and offline. Cemeteries, urns, coffins, graves, memorials, monuments, websites, social network sites, applications and software services form technologies that are influenced by discourse, culture, and public, professional and economic power. They constitute parts of an intricately weaved and interrelated network of practices and designs dealing with death, mourning, memorialization and remembrance. The paper presents findings from two research projects: the 2015 exhibition Death: The Human Experience at Bristol Museum and Art Gallery (bristolmuseums.org.uk) and the Future Cemetery Design Competition 2016 held by the Centre for Death and Society and Arnos Vale Cemetery in Bristol (futurecemetery.org). Grounded in sociological theory on death and memorialization technologies, ethnographic fieldwork and survey results (n=348...

  5. Preservation of Built Environments

    DEFF Research Database (Denmark)

    Pilegaard, Marie Kirstine

    When built environments, and recently also cultural environments, are to be preserved, the historic and architectural values are identified as the key motivations. In Denmark the SAVE system is used as a tool to identify architectural values, but in recent years it has been criticized for having a too narrow aesthetic goal, especially when it comes to the evaluation of built environments as a whole. Architectural value has therefore been perceived as a different concept than aesthetic value, primarily related to a static and unchanging expression. This fact creates problems in relation to current conservation tasks, which today include more and more untraditional built environments, including cultural environments. Architectural value must in this case rather be associated with development and ongoing processes, and allow room for future change. The Danish architect Johannes Exner defines

  6. A High Throughput Workflow Environment for Cosmological Simulations

    Science.gov (United States)

    Brandon, Erickson; Evrard, A. E.; Singh, R.; Marru, S.; Pierce, M.; Becker, M. R.; Kravtsov, A.; Busha, M. T.; Wechsler, R. H.; Ricker, P. M.; DES Simulations Working Group

    2013-01-01

    The Simulation Working Group (SimWG) of the Dark Energy Survey (DES) is collaborating with an XSEDE science gateway team to develop a distributed workflow management layer for the production of wide-area synthetic galaxy catalogs from large N-body simulations. We use the suite of tools in Airavata, an Apache Incubator project, to generate and archive multiple 10^10-particle N-body simulations of nested volumes on XSEDE supercomputers. Lightcone outputs are moved via Globus Online to SLAC, where they are transformed into multi-band, catalog-level descriptions of gravitationally lensed galaxies covering 10,000 sq deg to high redshift. We outline the method and discuss efficiency and provenance improvements brought about in N-body production. Plans to automate data movement and post-processing within the workflow are sketched, as are risks associated with working in an environment of constantly evolving services.
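    The sketch below illustrates, in generic form, the dependency-driven chaining of stages (N-body run, lightcone extraction, data movement, catalog building) that such a workflow layer manages. The stage names and commands are placeholders, and the real system relies on Apache Airavata and Globus Online rather than this toy topological scheduler.

```python
from collections import deque

# A minimal stand-in for a workflow DAG chaining N-body production,
# lightcone post-processing and data movement. All commands are placeholders.
stages = {
    "run_nbody":         {"deps": [],                    "cmd": "run nbody ..."},
    "extract_lightcone": {"deps": ["run_nbody"],         "cmd": "postprocess ..."},
    "transfer_to_slac":  {"deps": ["extract_lightcone"], "cmd": "transfer ..."},
    "make_galaxy_cat":   {"deps": ["transfer_to_slac"],  "cmd": "build catalog ..."},
}

def topological_order(stages):
    """Return stage names in an order that respects the declared dependencies."""
    indeg = {s: len(v["deps"]) for s, v in stages.items()}
    users = {s: [] for s in stages}
    for s, v in stages.items():
        for d in v["deps"]:
            users[d].append(s)
    ready = deque(s for s, n in indeg.items() if n == 0)
    order = []
    while ready:
        s = ready.popleft()
        order.append(s)
        for u in users[s]:
            indeg[u] -= 1
            if indeg[u] == 0:
                ready.append(u)
    return order

for stage in topological_order(stages):
    print(f"would submit: {stages[stage]['cmd']}")
```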

  7. Innovation and future in Westinghouse

    International Nuclear Information System (INIS)

    Congedo, T.; Dulloo, A.; Goosen, J.; Llovet, R.

    2007-01-01

    For the past six years, Westinghouse has used a Road Map process to direct technology development in a way that integrates the efforts of our businesses to address the needs of our customers and respond to significant drivers in the evolving business environment. As the nuclear industry experiences a resurgence, it is ever more necessary that we increase our planning horizon to 10-15 years in the future so as to meet the expectations of our customers. In the Future Point process, driven by the methods of Design for Six Sigma (DFSS), Westinghouse considers multiple possible future scenarios to plan long-term evolutionary and revolutionary development that can reliably create the major products and services of the future market. The products and services of the future stretch the imagination beyond what we provide today. However, the journey to these stretch targets prompts key development milestones that will help deliver ideas useful for nearer-term products. (Author) 1 refs

  8. Collaborating CPU and GPU for large-scale high-order CFD simulations with complex grids on the TianHe-1A supercomputer

    International Nuclear Information System (INIS)

    Xu, Chuanfu; Deng, Xiaogang; Zhang, Lilun; Fang, Jianbin; Wang, Guangxue; Jiang, Yi; Cao, Wei; Che, Yonggang; Wang, Yongxian; Wang, Zhenghua; Liu, Wei; Cheng, Xinghua

    2014-01-01

    Programming and optimizing complex, real-world CFD codes on current many-core accelerated HPC systems is very challenging, especially when collaborating CPUs and accelerators to fully tap the potential of heterogeneous systems. In this paper, with a tri-level hybrid and heterogeneous programming model using MPI + OpenMP + CUDA, we port and optimize our high-order multi-block structured CFD software HOSTA on the GPU-accelerated TianHe-1A supercomputer. HOSTA adopts two self-developed high-order compact finite difference schemes, WCNS and HDCS, that can simulate flows with complex geometries. We present a dual-level parallelization scheme for efficient multi-block computation on GPUs and perform particular kernel optimizations for high-order CFD schemes. The GPU-only approach achieves a speedup of about 1.3 when comparing one Tesla M2050 GPU with two Xeon X5670 CPUs. To achieve a greater speedup, we collaborate CPU and GPU for HOSTA instead of using a naive GPU-only approach. We present a novel scheme to balance the loads between the memory-poor GPU and the memory-rich CPU. Taking CPU and GPU load balance into account, we improve the maximum simulation problem size per TianHe-1A node for HOSTA by 2.3×, while the collaborative approach improves performance by around 45% compared to the GPU-only approach. Further, to scale HOSTA on TianHe-1A, we propose a gather/scatter optimization to minimize PCI-e data transfer times for the ghost and singularity data of 3D grid blocks, and we overlap the collaborative computation and communication as far as possible using advanced CUDA and MPI features. Scalability tests show that HOSTA can achieve a parallel efficiency of above 60% on 1024 TianHe-1A nodes. With our method, we have successfully simulated an EET high-lift airfoil configuration containing 800M cells and China's large civil airplane configuration containing 150M cells. To our best knowledge, these are the largest-scale CPU–GPU collaborative simulations
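    A minimal sketch of the general idea behind this kind of CPU-GPU load balancing is given below: split the cells of a block in proportion to measured device throughput, capped by what fits in GPU memory. It is a generic illustration under assumed numbers (the 1.3 throughput ratio echoes the speedup quoted above), not HOSTA's actual balancing scheme.

```python
def partition_cells(total_cells, gpu_tput, cpu_tput, gpu_mem_cells, cpu_mem_cells):
    """Split a block of grid cells between GPU and CPU so that (a) each device's
    share is proportional to its measured throughput and (b) neither share
    exceeds what fits in that device's memory. A generic sketch of
    throughput-and-capacity-aware static load balancing."""
    gpu_share = int(total_cells * gpu_tput / (gpu_tput + cpu_tput))
    gpu_share = min(gpu_share, gpu_mem_cells)     # GPU memory is the scarce resource
    cpu_share = total_cells - gpu_share
    if cpu_share > cpu_mem_cells:
        raise ValueError("problem too large for this node")
    return gpu_share, cpu_share

# Example: GPU ~1.3x faster than the two CPUs combined, but with far less memory.
gpu_cells, cpu_cells = partition_cells(
    total_cells=40_000_000, gpu_tput=1.3, cpu_tput=1.0,
    gpu_mem_cells=18_000_000, cpu_mem_cells=60_000_000)
```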

  9. Supercomputer debugging workshop '92

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.S.

    1993-01-01

    This report contains papers or viewgraphs on the following topics: The ABCs of Debugging in the 1990s; Cray Computer Corporation; Thinking Machines Corporation; Cray Research, Incorporated; Sun Microsystems, Inc; Kendall Square Research; The Effects of Register Allocation and Instruction Scheduling on Symbolic Debugging; Debugging Optimized Code: Currency Determination with Data Flow; A Debugging Tool for Parallel and Distributed Programs; Analyzing Traces of Parallel Programs Containing Semaphore Synchronization; Compile-time Support for Efficient Data Race Detection in Shared-Memory Parallel Programs; Direct Manipulation Techniques for Parallel Debuggers; Transparent Observation of XENOOPS Objects; A Parallel Software Monitor for Debugging and Performance Tools on Distributed Memory Multicomputers; Profiling Performance of Inter-Processor Communications in an iWarp Torus; The Application of Code Instrumentation Technology in the Los Alamos Debugger; and CXdb: The Road to Remote Debugging.

  10. [Teacher enhancement at Supercomputing `96

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-02-13

    The SC'96 Education Program provided a three-day professional development experience for middle and high school science, mathematics, and computer technology teachers. The program theme was Computers at Work in the Classroom, and a majority of the sessions were presented by classroom teachers who have had several years of experience in using these technologies with their students. The teachers who attended the program were introduced to classroom applications of computing and networking technologies and were provided, to the greatest extent possible, with lesson plans, sample problems, and other resources that could immediately be used in their own classrooms. The attached At a Glance Schedule and Session Abstracts describe in detail the three-day SC'96 Education Program. Also included are the SC'96 Education Program evaluation report and the financial report.

  11. Advanced architectures for astrophysical supercomputing

    Science.gov (United States)

    Barsdell, B. R.

    2012-01-01

    This thesis explores the substantial benefits offered to astronomy research by advanced 'many-core' computing architectures, which can provide up to ten times more computing power than traditional processors. It begins by analysing the computations that are best suited to massively parallel computing and advocates a powerful, general approach to the use of many-core devices. These concepts are then put into practice to develop a fast data processing pipeline, with which new science outcomes are achieved in the field of pulsar astronomy, including the discovery of a new star. The work demonstrates how technology originally developed for the consumer market can now be used to accelerate the rate of scientific discovery.

  12. Future nuclear power generation

    International Nuclear Information System (INIS)

    Mosbah, D.S.; Nasreddine, M.

    2006-01-01

    The book includes an introduction and then discusses options for securing sources of energy; the nuclear power option; nuclear plants for generating energy, including light-water reactors (LWR), heavy-water reactors (HWR), advanced gas-cooled reactors (AGR) and fast breeder reactors (FBR); developments in the manufacture of reactors; fuel; uranium in the world; the current status of nuclear power generation; the economics of nuclear power; nuclear power and the environment; and nuclear power in the Arab world. The conclusion at the end of the book suggests that the increasing demand for energy in the industrialized countries, and in a number of countries enjoying rapid economic growth such as China and India, pushes the world to search for different energy sources to meet current and anticipated demand in the near and long term, in light of both pessimistic and optimistic outlooks for energy in the future. This requires states to carry out a scientific and objective analysis of the currently available data as a springboard for future plans to secure the energy required to support their economies and welfare.

  13. Build Less Code, Deliver More Science: An Experience Report on Composing Scientific Environments using Component-based and Commodity Software Platforms

    Energy Technology Data Exchange (ETDEWEB)

    Gorton, Ian [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). Computational Sciences and Math Division; Liu, Yan [Concordia University Montreal, Quebec, (Canada).; Lansing, Carina S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). Computational Sciences and Math Division; Elsethagen, Todd O. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). Computational Sciences and Math Division; Kleese van Dam, Kerstin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). Computational Sciences and Math Division

    2013-07-17

    Modern scientific software is daunting in its diversity and complexity. From massively parallel simulations running on the world’s largest supercomputers, to visualizations and user support environments that manage ever growing complex data collections, the challenges for software engineers are plentiful. While high performance simulators are necessarily specialized codes to maximize performance on specific supercomputer architectures, we argue the vast majority of supporting infrastructure, data management and analysis tools can leverage commodity open source and component-based technologies. This approach can significantly drive down the effort and costs of building complex, collaborative scientific user environments, as well as increase their reliability and extensibility. In this paper we describe our experiences in creating an initial user environment for scientists involved in modeling the detailed effects of climate change on the environment of selected geographical regions. Our approach composes the user environment using the Velo scientific knowledge management platform and the MeDICi Integration Framework for scientific workflows. These established platforms leverage component-based technologies and extend commodity open source platforms with abstractions and capabilities that make them amenable for broad use in science. Using this approach we were able to deliver an operational user environment capable of running thousands of simulations in a 7 month period, and achieve significant software reuse.

  14. Computational mechanics - Advances and trends; Proceedings of the Session - Future directions of Computational Mechanics of the ASME Winter Annual Meeting, Anaheim, CA, Dec. 7-12, 1986

    Science.gov (United States)

    Noor, Ahmed K. (Editor)

    1986-01-01

    The papers contained in this volume provide an overview of the advances made in a number of aspects of computational mechanics, identify some of the anticipated industry needs in this area, discuss the opportunities provided by new hardware and parallel algorithms, and outline some of the current government programs in computational mechanics. Papers are included on advances and trends in parallel algorithms, supercomputers for engineering analysis, material modeling in nonlinear finite-element analysis, the Navier-Stokes computer, and future finite-element software systems.

  15. Energy, Environment and IMCC

    DEFF Research Database (Denmark)

    Mogensen, Mogens Bjerg

    2012-01-01

    This paper gives a brief description of the important role that the ionic and mixed conducting ceramics (IMCC) type of materials will play in the R&D of energy and environment technologies of the - presumably - near future. IMCC materials based technologies for energy harvesting, conversion...... and storage as well as for monitoring and protection of our environment are exemplified. The strong impact of the international IMCC research on development of devices based on such materials is illustrated, and some recent trends in the scientific exploration of IMCC are highlighted. Important groups...

  16. Global environment and cogeneration

    International Nuclear Information System (INIS)

    Miyahara, Atsushi

    1992-01-01

    Environmental problems on a global scale have been highlighted in addition to local problems, due to the rapid increase of population, the increase of energy demand and so on. The global environment summit was held in Brazil. Global environment problems are now problems for all mankind, and their importance seems set to increase toward the 21st century. In such circumstances, cogeneration can reduce carbon dioxide emission in addition to conserving energy, and it has therefore attracted attention as a countermeasure for the global environment. The background of global environment problems is explained. As to the effectiveness of cogeneration for the global environment, the suitability of city gas to the environment, energy conservation, and the reduction of carbon dioxide and nitrogen oxide emissions are discussed. As for the state of the spread of cogeneration, as of March 1992, systems totalling 2250 MW of power generation capacity had been installed in Japan. It is forecast that cogeneration will increase hereafter. As future systems of cogeneration, the city and industry energy center conception, industrial repowering, multiple-house cogeneration and fuel cells are described. (K.I.)

  17. Encapsulated environment

    NARCIS (Netherlands)

    McLellan, Tom M.; Daanen, Hein A M; Cheung, Stephen S.

    2013-01-01

    In many occupational settings, clothing must be worn to protect individuals from hazards in their work environment. However, personal protective clothing (PPC) restricts heat exchange with the environment due to high thermal resistance and low water vapor permeability. As a consequence, individuals

  18. Iowa's Environment.

    Science.gov (United States)

    Ruth, Amy, Ed.

    1994-01-01

    This theme issue explores the changes in Iowa's environment. When Native Americans lived in Iowa hundreds of years ago, the land was rich in tall grasslands, fertile soil, wildlife, wetlands, and unpolluted waters. When European-American pioneers settled Iowa in 1833, they changed the environment in order to survive. The first article in this…

  19. Security Guards for the Future Web

    National Research Council Canada - National Science Library

    Reed, Nancy; Bryson, Dave; Garriss, James; Gosnell, Steve; Heaton, Brook; Huber, Gary; Jacobs, David; Pulvermacher, Mary; Semy, Salim; Smith, Chad; Standard, John

    2004-01-01

    .... Guard technology needs to keep pace with the evolving Web environment. The authors conjectured that a family of security guard services would be needed to provide the full range of functionality necessary to support the future Web...

  20. The future is 'ambient'

    Science.gov (United States)

    Lugmayr, Artur

    2006-02-01

    The research field of ambient media is starting to spread rapidly and the first applications for consumer homes are on the way. Ambient media is the logical continuation of research around media. Media have been evolving from old media (e.g. print media), to integrated presentation in one form (multimedia - or new media), to generating a synthetic world (virtual reality), to media where the natural environment is the user interface (ambient media), and will evolve towards real/synthetic indistinguishable media (bio-media or bio-multimedia). After the IT bubble burst, multimedia lacked a vision of potential future scenarios and applications. Within this research paper the potentials, applications, and market-available solutions of mobile ambient multimedia are studied. The different features of ambient mobile multimedia are manifold and include wearable computers, adaptive software, context awareness, ubiquitous computers, middleware, and wireless networks. The paper especially focuses on algorithms and methods that can be utilized to realize modern mobile ambient systems.

  1. Technology for the future

    International Nuclear Information System (INIS)

    1994-01-01

    Sixteen research centres in the Federal German Republic are associated in the "Working Pool of Research Centres" (AGF). As national research centres these institutions engage in scientific-technical and biological-medical research and development based on interdisciplinary cooperation and intensive deployment of personnel, capital, and technical equipment. They make substantial contributions to state-promoted programmes in the following areas: energy research and technology; basic nuclear research; transport and traffic systems; aerospace research and polar research; data processing and applied computer science; environment protection and health; biology and medicine; and marine engineering and geosciences. The authors of this new volume of AGF topics deal with so-called key technologies, i.e., developments determining the direction of future activities. Topics relevant to energy are solar research and fusion research. (orig./UA)

  2. Preservation of Built Environments

    DEFF Research Database (Denmark)

    Pilegaard, Marie Kirstine

    When built environments, and recently also cultural environments, are to be preserved, the historic and architectural values are identified as the key motivations. In Denmark the SAVE system is used as a tool to identify architectural values, but in recent years it has been criticized for having a too narrow aesthetic goal. The Danish architect Johannes Exner instead defines architectural value in preservation work as a matter of maintaining the buildings - as keeping them "alive" and allowing this to continue in the future. The predominantly aesthetic preservation approach will stop the buildings' life process, which is the same as "letting them die". Finnebyen in Aarhus is an example of a residential area where the planning authority currently has presented a preservational district plan, following guidelines from the SAVE method. The purpose is to protect the area's architectural values in the future. The predominantly aesthetic approach is here used coupled to the concept

  3. Advanced space system concepts and their orbital support needs (1980 - 2000). Volume 3: Detailed data. Part 1: Catalog of initiatives, functional options, and future environments and goals. [for the U.S. space program

    Science.gov (United States)

    Bekey, I.; Mayer, H. L.; Wolfe, M. G.

    1976-01-01

    The following areas were discussed in relation to a study of the commonality of space vehicle applications to future national needs: (1) index of initiatives (civilian observation, communication, support), brief illustrated description of each initiative, time periods (from 1980 to 2000+) for implementation of these initiatives; (2) data bank of functional system options, presented in the form of data sheets, one for each of the major functions, with the system option for near-term, midterm, and far-term space projects applicable to each subcategory of functions to be fulfilled; (3) table relating initiatives and desired goals (public service and humanistic, materialistic, scientific and intellectual); and (4) data on size, weight and cost estimations.

  4. Are Cloud Environments Ready for Scientific Applications?

    Science.gov (United States)

    Mehrotra, P.; Shackleford, K.

    2011-12-01

    Cloud computing environments are becoming widely available both in the commercial and government sectors. They provide flexibility to rapidly provision resources in order to meet dynamic and changing computational needs without the customers incurring capital expenses and/or requiring technical expertise. Clouds also provide reliable access to resources even though the end-user may not have in-house expertise for acquiring or operating such resources. Consolidation and pooling in a cloud environment allow organizations to achieve economies of scale in provisioning or procuring computing resources and services. Because of these and other benefits, many businesses and organizations are migrating their business applications (e.g., websites, social media, and business processes) to cloud environments-evidenced by the commercial success of offerings such as the Amazon EC2. In this paper, we focus on the feasibility of utilizing cloud environments for scientific workloads and workflows particularly of interest to NASA scientists and engineers. There is a wide spectrum of such technical computations. These applications range from small workstation-level computations to mid-range computing requiring small clusters to high-performance simulations requiring supercomputing systems with high bandwidth/low latency interconnects. Data-centric applications manage and manipulate large data sets such as satellite observational data and/or data previously produced by high-fidelity modeling and simulation computations. Most of the applications are run in batch mode with static resource requirements. However, there do exist situations that have dynamic demands, particularly ones with public-facing interfaces providing information to the general public, collaborators and partners, as well as to internal NASA users. In the last few months we have been studying the suitability of cloud environments for NASA's technical and scientific workloads. We have ported several applications to

  5. Cielo Computational Environment Usage Model With Mappings to ACE Requirements for the General Availability User Environment Capabilities Release Version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Vigil,Benny Manuel [Los Alamos National Laboratory; Ballance, Robert [SNL; Haskell, Karen [SNL

    2012-08-09

    Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, are included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.

  6. Futurism: Gaining a Toehold in Public Policy

    Science.gov (United States)

    Holden, Constance

    1975-01-01

    What has come to be known as applied futurism or futuristics, as a mode of thought, has been emerging from the academic environment into the realm of public policy. Insights noted at the Second General Assembly of the World Future Society are presented. (EB)

  7. Robotic environments

    NARCIS (Netherlands)

    Bier, H.H.

    2011-01-01

    Technological and conceptual advances in fields such as artificial intelligence, robotics, and material science have enabled robotic architectural environments to be implemented and tested in the last decade in virtual and physical prototypes. These prototypes are incorporating sensing-actuating

  8. Healthy Environments

    OpenAIRE

    2012-01-01

    This issue of Early Childhood in Focus draws attention to some key global challenges in providing healthy environments for young children. Section 1 recognises that multisectoral policy responses are needed to ensure adequate housing and improved water and sanitation, as well as recreational spaces. For young children, physical spaces are closely intertwined with emotional security and feelings of well-being. Section 2 explores the opportunities and challenges of living in urban environments....

  9. Securing a better future for all: Nuclear techniques for global development and environmental protection. NA factsheet on radioisotope production and radiation technology contributing to better health care and a cleaner environment

    International Nuclear Information System (INIS)

    2012-01-01

    Radioisotope and radiation technology finds numerous applications in a wide variety of fields, most importantly in medicine, industry, agriculture and the environment. However, in order to take full advantage of the benefits offered by this technology, it is essential to provide the necessary infrastructure as well as qualified personnel. The IAEA strives to promote worldwide availability of products and facilities in order to offer the benefits of radioisotope products and radiation technology to developing countries. In particular, the IAEA helps Member States to achieve self-sufficiency in the production of radioisotopes and radiopharmaceuticals, strengthen quality assurance practices and regulatory compliance as well as facilitate human resources development. The multipronged need based approach includes providing advice, assistance and capacity building support for: Development, production and quality assurance of reactor and accelerator based medical isotopes and radiopharmaceuticals for both the diagnosis and treatment of diseases, especially cancer; Establishment of irradiation facilities and utilization of gamma radiation, electron beam and X ray technology for varied applications, including tackling pollutants, wastewater treatment, sterilization of medical products, disinfestation of food grains, and synthesis and characterization of advanced materials; Application of radiation and isotopes in industrial process management.

  10. Synthetic environments

    Science.gov (United States)

    Lukes, George E.; Cain, Joel M.

    1996-02-01

    The Advanced Distributed Simulation (ADS) Synthetic Environments Program seeks to create robust virtual worlds from operational terrain and environmental data sources of sufficient fidelity and currency to interact with the real world. While some applications can be met by direct exploitation of standard digital terrain data, more demanding applications -- particularly those supporting operations 'close to the ground' -- are well-served by emerging capabilities for 'value-adding' by the user working with controlled imagery. For users to rigorously refine and exploit controlled imagery within functionally different workstations they must have a shared framework to allow interoperability within and between these environments in terms of passing image and object coordinates and other information using a variety of validated sensor models. The Synthetic Environments Program is now being expanded to address rapid construction of virtual worlds with research initiatives in digital mapping, softcopy workstations, and cartographic image understanding. The Synthetic Environments Program is also participating in a joint initiative for a sensor model applications programmer's interface (API) to ensure that a common controlled imagery exploitation framework is available to all researchers, developers and users. This presentation provides an introduction to ADS and the associated requirements for synthetic environments to support synthetic theaters of war. It provides a technical rationale for exploring applications of image understanding technology to automated cartography in support of ADS and related programs benefitting from automated analysis of mapping, earth resources and reconnaissance imagery. And it provides an overview and status of the joint initiative for a sensor model API.

  11. FutureCoast: "Listen to your futures"

    Science.gov (United States)

    Pfirman, S. L.; Eklund, K.; Thacher, S.; Orlove, B. S.; Diane Stovall-Soto, G.; Brunacini, J.; Hernandez, T.

    2014-12-01

    Two science-arts approaches are emerging as effective means to convey "futurethinking" to learners: systems gaming and experiential futures. FutureCoast exemplifies the latter: by engaging participants with voicemails supposedly leaking from the cloud of possible futures, the storymaking game frames the complexities of climate science in relatable contexts. Because participants make the voicemails themselves, FutureCoast opens up creative ways for people to think about possibly climate-changed futures and personal ways to talk about them. FutureCoast is a project of the PoLAR Partnership with a target audience of informal adult learners primarily reached via mobile devices and online platforms. Scientists increasingly use scenarios and storylines as ways to explore the implications of environmental change and societal choices. Stories help people make connections across experiences and disciplines and link large-scale events to personal consequences. By making the future seem real today, FutureCoast's framework helps people visualize and plan for future climate changes. The voicemails contributed to FutureCoast are spread through the game's intended timeframe (2020 through 2065). Based on initial content analysis of voicemail text, common themes include ecosystems and landscapes, weather, technology, societal issues, governance and policy. Other issues somewhat less frequently discussed include security, food, industry and business, health, energy, infrastructure, water, economy, and migration. Further voicemail analysis is examining: temporal dimensions (salient time frames, short vs. long term issues, intergenerational, etc.), content (adaptation vs. mitigation, challenges vs. opportunities, etc.), and emotion (hopeful, resigned, etc. and overall emotional context). FutureCoast also engaged audiences through facilitated in-person experiences, geocaching events, and social media (Tumblr, Twitter, Facebook, YouTube). Analysis of the project suggests story

  12. Enacting Environments

    DEFF Research Database (Denmark)

    Lippert, Ingmar

    2013-01-01

    Enacting Environments is an ethnography of the midst of the encounter between corporations, sustainable development and climate change. At this intersection 'environmental management' and 'carbon accounting' are put into practice. Purportedly, these practices green capitalism. Drawing on fieldwork of day-to-day practices of corporate environmental accountants and managers, Ingmar Lippert reconstructs their work as producing a reality of environment that is simultaneously stable and flexible enough for a particular corporate project: to stage the company, and in consequence capitalism, as in control of its relations to an antecedent environment. Not confined to mere texts or meetings between shiny stakeholders co-governing the corporation – among them some of the world's biggest auditing firms, an environmental non-governmental organisation (NGO) and standards – control is found

  13. Educational Environments.

    Science.gov (United States)

    Yee, Roger, Ed.

    This book presents examples of the United States' most innovative new educational facilities for decision makers developing educational facilities of the future. It showcases some of the most recent and significant institutional projects from a number of the United States' top architecture and design firms. The architecture and interior design…

  14. Troubled waters. The future of the oceans. Human activity is polluting the marine environment and the economic livelihoods of millions who fish the seas. Science can help change the picture

    International Nuclear Information System (INIS)

    McIntyre, A.D.

    2003-01-01

    -fished, and 9% are depleted. In the light of this review, what can we say about the future of the oceans? One issue closely watched is global climate change. The major drivers of this are thought to be anthropogenic carbon dioxide and aerosols released by humans into the air. Climate warming will cause ocean temperatures to rise and its volume to expand, as well as melting of land-based ice that will add fresh water to the oceans. As a consequence, the sea level will rise. Unfortunately, we do not yet have sufficient understanding of the many processes at work in the ocean-atmosphere system to make accurate predictions about the physical changes that will certainly occur - nor can we be clear about the biological effects of changes in level and temperature of the oceans

  15. Observing environments

    DEFF Research Database (Denmark)

    Alrøe, Hugo Fjelsted; Noe, Egon

    2012-01-01

    in different ways. The aim of this paper is to clarify the conceptions of environment in constructivist approaches, and thereby to assist the sciences of complex systems and complex environmental problems. Method: We describe the terms used for “the environment” in von Uexküll, Maturana & Varela, and Luhmann...

  16. African Environment

    African Journals Online (AJOL)

    African Environment (Environmental Studies and Regional Planning Bulletin) is published in French and English, and, for some issues, in Arabic. Only the issue below has been received by AJOL: Vol 10, No 3 (1999).

  17. Architecture & Environment

    Science.gov (United States)

    Erickson, Mary; Delahunt, Michael

    2010-01-01

    Most art teachers would agree that architecture is an important form of visual art, but they do not always include it in their curriculums. In this article, the authors share core ideas from "Architecture and Environment," a teaching resource that they developed out of a long-term interest in teaching architecture and their fascination with the…

  18. Scenarios for the future

    International Nuclear Information System (INIS)

    Haegermark, H.; Bergmark, M.

    1995-06-01

    This project aims primarily to give a basis for the joint R and D program for the Swedish electric utility industry, in the form of pictures of the future up to 2020. The work was performed during four seminars in a group of managers and R and D planners. The four scenarios differ mainly in the assumptions of high or low economic growth and on market or political rule. Assumptions on essential uncertainties about the future have been combined in a consistent manner, e.g. on the structure of the utility industry, the role of nuclear power, the importance of the greenhouse gas issue, the influence of new technology developments and on changes of values in society. Certain other developments appear in all scenarios, e.g. the impact of information technology throughout society, the internationalization of business in general and industrial production in particular, considerations for the environment and care for natural resources. The four scenarios are: 'Technology on the throne' (market rule/high growth); 'Intense competition' (market rule/low growth); 'Monopoly takes over' (political rule/high growth); and 'Green local society' (political rule/low growth). Some of the important factors pointed out by the study are: Increased customer mobility between regions and countries; The impact of information technology; Societal value changes; Sustainable development as an important driving force; Structure of the utility industry. Diversifying into new services. New players; Access to knowledge and competence; Ways for handling the greenhouse gas problem; Preparedness for nuclear power phase-out. 12 figs, 6 tabs

  19. Some Aspects of Futurism

    Science.gov (United States)

    Sangchai, Samporn

    1975-01-01

    The article, an overview, surveys various schools of futures research with reference to futurism's dimensions (methodologies, typologies, and distance in time); planning for alternative futures; orientations; and inner-future orientations (mysticism vs. science). Developing nations are advised to adapt developed nations' learnings selectively, and…

  20. A Marketing Approach to Commodity Futures Exchanges : A Case Study of the Dutch Hog Industry

    NARCIS (Netherlands)

    Meulenberg, M.T.G.; Pennings, J.M.E.

    2002-01-01

    This paper proposes a strategic marketing approach to commodity futures exchanges to optimise the (hedging) services offered. First, the environment of commodity futures exchanges is examined. Second, the threats and opportunities facing commodity futures exchanges are analysed. Our analysis