WorldWideScience

Sample records for scientific computing mcqmc98

  1. Scientific computing

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

  2. Scientific computer simulation review

    International Nuclear Information System (INIS)

    Kaizer, Joshua S.; Heller, A. Kevin; Oberkampf, William L.

    2015-01-01

    Before the results of a scientific computer simulation are used for any purpose, it should be determined if those results can be trusted. Answering that question of trust is the domain of scientific computer simulation review. There is limited literature that focuses on simulation review, and most is specific to the review of a particular type of simulation. This work is intended to provide a foundation for a common understanding of simulation review. This is accomplished through three contributions. First, scientific computer simulation review is formally defined. This definition identifies the scope of simulation review and provides the boundaries of the review process. Second, maturity assessment theory is developed. This development clarifies the concepts of maturity criteria, maturity assessment sets, and maturity assessment frameworks, which are essential for performing simulation review. Finally, simulation review is described as the application of a maturity assessment framework. This is illustrated through evaluating a simulation review performed by the U.S. Nuclear Regulatory Commission. In making these contributions, this work provides a means for a more objective assessment of a simulation’s trustworthiness and takes the next step in establishing scientific computer simulation review as its own field. - Highlights: • We define scientific computer simulation review. • We develop maturity assessment theory. • We formally define a maturity assessment framework. • We describe simulation review as the application of a maturity framework. • We provide an example of a simulation review using a maturity framework

  3. Practical scientific computing

    CERN Document Server

    Muhammad, A

    2011-01-01

    Scientific computing is about developing mathematical models, numerical methods and computer implementations to study and solve real problems in science, engineering, business and even social sciences. Mathematical modelling requires deep understanding of classical numerical methods. This essential guide provides the reader with sufficient foundations in these areas to venture into more advanced texts. The first section of the book presents numEclipse, an open source tool for numerical computing based on the notion of MATLAB®. numEclipse is implemented as a plug-in for Eclipse, a leading integ

  4. High-End Scientific Computing

    Science.gov (United States)

    EPA uses high-end scientific computing, geospatial services and remote sensing/imagery analysis to support EPA's mission. The Center for Environmental Computing (CEC) assists the Agency's program offices and regions to meet staff needs in these areas.

  5. Visualization in scientific computing

    National Research Council Canada - National Science Library

    Nielson, Gregory M; Shriver, Bruce D; Rosenblum, Lawrence J

    1990-01-01

    The purpose of this text is to provide a reference source to scientists, engineers, and students who are new to scientific visualization or who are interested in expanding their knowledge in this subject...

  6. Computers and Computation. Readings from Scientific American.

    Science.gov (United States)

    Fenichel, Robert R.; Weizenbaum, Joseph

    A collection of articles from "Scientific American" magazine has been put together at this time because the current period in computer science is one of consolidation rather than innovation. A few years ago, computer science was moving so swiftly that even the professional journals were more archival than informative; but today it is…

  7. Scientific applications of symbolic computation

    International Nuclear Information System (INIS)

    Hearn, A.C.

    1976-02-01

    The use of symbolic computation systems for problem solving in scientific research is reviewed. The nature of the field is described, and particular examples are considered from celestial mechanics, quantum electrodynamics and general relativity. Symbolic integration and some more recent applications of algebra systems are also discussed.
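
    As a small illustration of the kind of computation such systems perform, here is a hedged sketch of symbolic integration using the modern Python library SymPy (the algebra systems reviewed here offered analogous facilities):

      # Illustrative only: symbolic integration in SymPy, a modern analogue of
      # the algebra systems surveyed in this review.
      import sympy as sp

      x = sp.symbols('x')
      expr = x**2 * sp.exp(-x)                        # a typical integrand
      antiderivative = sp.integrate(expr, x)          # -(x**2 + 2*x + 2)*exp(-x)
      definite = sp.integrate(expr, (x, 0, sp.oo))    # evaluates to 2
      print(antiderivative, definite)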

  8. Scientific Computing in Electrical Engineering

    CERN Document Server

    Amrhein, Wolfgang; Zulehner, Walter

    2018-01-01

    This collection of selected papers presented at the 11th International Conference on Scientific Computing in Electrical Engineering (SCEE), held in St. Wolfgang, Austria, in 2016, showcases the state of the art in SCEE. The aim of the SCEE 2016 conference was to bring together scientists from academia and industry, mathematicians, electrical engineers, computer scientists, and physicists, and to promote intensive discussions on industrially relevant mathematical problems, with an emphasis on the modeling and numerical simulation of electronic circuits and devices, electromagnetic fields, and coupled problems. The focus in methodology was on model order reduction and uncertainty quantification. This extensive reference work is divided into six parts: Computational Electromagnetics, Circuit and Device Modeling and Simulation, Coupled Problems and Multi‐Scale Approaches in Space and Time, Mathematical and Computational Methods Including Uncertainty Quantification, Model Order Reduction, and Industrial Applicat...

  9. Mastering scientific computing with R

    CERN Document Server

    Gerrard, Paul

    2015-01-01

    If you want to learn how to quantitatively answer scientific questions for practical purposes using the powerful R language and the open source R tool ecosystem, this book is ideal for you. It is ideally suited for scientists who understand scientific concepts, know a little R, and want to start applying R to answer empirical scientific questions. Some R exposure is helpful, but not compulsory.

  10. Computer application in scientific investigations

    International Nuclear Information System (INIS)

    Govorun, N.N.

    1981-01-01

    A short review of computer development, application, and software at JINR over the last 15 years is presented. The main trends of studies on computer application in experimental and theoretical investigations are enumerated: software for computers and their systems, software for data-processing systems, the design of automatic and automated systems for measuring track-detector images, the development of techniques for carrying out experiments on-line with a computer, and packages of applied computer codes and specialized systems. On-line techniques have been successfully used in investigations of nuclear processes at relativistic energies. A new trend is the development of television methods of data output and their recording by computer.

  11. FPS scientific computers and supercomputers in chemistry

    International Nuclear Information System (INIS)

    Curington, I.J.

    1987-01-01

    FPS Array Processors, scientific computers, and highly parallel supercomputers are used in nearly all aspects of compute-intensive computational chemistry. A survey is made of work utilizing this equipment, both published and current research. The relationship of the computer architecture to computational chemistry is discussed, with specific reference to Molecular Dynamics, Quantum Monte Carlo simulations, and Molecular Graphics applications. Recent installations of the FPS T-Series are highlighted, and examples of Molecular Graphics programs running on the FPS-5000 are shown.

  12. The 12-th INS scientific computational programs

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-06-01

    This issue is a collection of papers on INS scientific computational programs. Separate abstracts were presented for 3 of the papers in this report. The remaining 5 were considered outside the subject scope of INIS. (J.P.N.)

  13. Numerical and symbolic scientific computing

    CERN Document Server

    Langer, Ulrich

    2011-01-01

    The book presents the state of the art and results and also includes articles pointing to future developments. Most of the articles center around the theme of linear partial differential equations. Major aspects are fast solvers in elastoplasticity, symbolic analysis for boundary problems, symbolic treatment of operators, computer algebra, and finite element methods, a symbolic approach to finite difference schemes, cylindrical algebraic decomposition and local Fourier analysis, and white noise analysis for stochastic partial differential equations. Further numerical-symbolic topics range from

  14. Pascal-SC a computer language for scientific computation

    CERN Document Server

    Bohlender, Gerd; von Gudenberg, Jürgen Wolff; Rheinboldt, Werner; Siewiorek, Daniel

    1987-01-01

    Perspectives in Computing, Vol. 17: Pascal-SC: A Computer Language for Scientific Computation focuses on the application of Pascal-SC, a programming language developed as an extension of standard Pascal, in scientific computation. The publication first elaborates on the introduction to Pascal-SC, a review of standard Pascal, and real floating-point arithmetic. Discussions focus on optimal scalar product, standard functions, real expressions, program structure, simple extensions, real floating-point arithmetic, vector and matrix arithmetic, and dynamic arrays. The text then examines functions a

  15. Software Defects, Scientific Computation and the Scientific Method

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    Computation has rapidly grown in the last 50 years so that in many scientific areas it is the dominant partner in the practice of science. Unfortunately, unlike the experimental sciences, it does not adhere well to the principles of the scientific method as espoused by, for example, the philosopher Karl Popper. Such principles are built around the notions of deniability and reproducibility. Although much research effort has been spent on measuring the density of software defects, much less has been spent on the more difficult problem of measuring their effect on the output of a program. This talk explores these issues with numerous examples suggesting how this situation might be improved to match the demands of modern science. Finally it develops a theoretical model based on an amalgam of statistical mechanics and Hartley/Shannon information theory which suggests that software systems have strong implementation independent behaviour and supports the widely observed phenomenon that defects clust...

  16. Exploring HPCS languages in scientific computing

    International Nuclear Information System (INIS)

    Barrett, R F; Alam, S R; Almeida, V F d; Bernholdt, D E; Elwasif, W R; Kuehn, J A; Poole, S W; Shet, A G

    2008-01-01

    As computers scale up dramatically to tens and hundreds of thousands of cores, develop deeper computational and memory hierarchies, and exhibit increased heterogeneity, developers of scientific software are increasingly challenged to express complex parallel simulations effectively and efficiently. In this paper, we explore the three languages developed under the DARPA High-Productivity Computing Systems (HPCS) program to help address these concerns: Chapel, Fortress, and X10. These languages provide a variety of features not found in currently popular HPC programming environments and make it easier to express powerful computational constructs, leading to new ways of thinking about parallel programming. Though the languages and their implementations are not yet mature enough for a comprehensive evaluation, we discuss some of the important features, and provide examples of how they can be used in scientific computing. We believe that these characteristics will be important to the future of high-performance scientific computing, whether the ultimate language of choice is one of the HPCS languages or something else.

  17. Exploring HPCS languages in scientific computing

    Science.gov (United States)

    Barrett, R. F.; Alam, S. R.; Almeida, V. F. d.; Bernholdt, D. E.; Elwasif, W. R.; Kuehn, J. A.; Poole, S. W.; Shet, A. G.

    2008-07-01

    As computers scale up dramatically to tens and hundreds of thousands of cores, develop deeper computational and memory hierarchies, and exhibit increased heterogeneity, developers of scientific software are increasingly challenged to express complex parallel simulations effectively and efficiently. In this paper, we explore the three languages developed under the DARPA High-Productivity Computing Systems (HPCS) program to help address these concerns: Chapel, Fortress, and X10. These languages provide a variety of features not found in currently popular HPC programming environments and make it easier to express powerful computational constructs, leading to new ways of thinking about parallel programming. Though the languages and their implementations are not yet mature enough for a comprehensive evaluation, we discuss some of the important features, and provide examples of how they can be used in scientific computing. We believe that these characteristics will be important to the future of high-performance scientific computing, whether the ultimate language of choice is one of the HPCS languages or something else.

  18. Scientific Computing and Apple's Intel Transition

    CERN Document Server

    CERN. Geneva

    2006-01-01

    Intel's published processor roadmap and how it may affect the future of personal and scientific computing About the speaker: Eric Albert is Senior Software Engineer in Apple's Core Technologies group. During Mac OS X's transition to Intel processors he has worked on almost every part of the operating system, from the OS kernel and compiler tools to appli...

  19. Compiler Technology for Parallel Scientific Computation

    Directory of Open Access Journals (Sweden)

    Can Özturan

    1994-01-01

    There is a need for compiler technology that, given the source program, will generate efficient parallel codes for different architectures with minimal user involvement. Parallel computation is becoming indispensable in solving large-scale problems in science and engineering. Yet, the use of parallel computation is limited by the high costs of developing the needed software. To overcome this difficulty we advocate a comprehensive approach to the development of scalable architecture-independent software for scientific computation based on our experience with equational programming language (EPL). Our approach is based on a program decomposition, parallel code synthesis, and run-time support for parallel scientific computation. The program decomposition is guided by the source program annotations provided by the user. The synthesis of parallel code is based on configurations that describe the overall computation as a set of interacting components. Run-time support is provided by the compiler-generated code that redistributes computation and data during object program execution. The generated parallel code is optimized using techniques of data alignment, operator placement, wavefront determination, and memory optimization. In this article we discuss annotations, configurations, parallel code generation, and run-time support suitable for parallel programs written in the functional parallel programming language EPL and in Fortran.

  20. Scientific Computing Kernels on the Cell Processor

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Samuel W.; Shalf, John; Oliker, Leonid; Kamil, Shoaib; Husbands, Parry; Yelick, Katherine

    2007-04-04

    The slowing pace of commodity microprocessor performance improvements combined with ever-increasing chip power demands has become of utmost concern to computational scientists. As a result, the high performance computing community is examining alternative architectures that address the limitations of modern cache-based designs. In this work, we examine the potential of using the recently-released STI Cell processor as a building block for future high-end computing systems. Our work contains several novel contributions. First, we introduce a performance model for Cell and apply it to several key scientific computing kernels: dense matrix multiply, sparse matrix vector multiply, stencil computations, and 1D/2D FFTs. The difficulty of programming Cell, which requires assembly level intrinsics for the best performance, makes this model useful as an initial step in algorithm design and evaluation. Next, we validate the accuracy of our model by comparing results against published hardware results, as well as our own implementations on a 3.2GHz Cell blade. Additionally, we compare Cell performance to benchmarks run on leading superscalar (AMD Opteron), VLIW (Intel Itanium2), and vector (Cray X1E) architectures. Our work also explores several different mappings of the kernels and demonstrates a simple and effective programming model for Cell's unique architecture. Finally, we propose modest microarchitectural modifications that could significantly increase the efficiency of double-precision calculations. Overall results demonstrate the tremendous potential of the Cell architecture for scientific computations in terms of both raw performance and power efficiency.
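
    For readers unfamiliar with these kernels, the sketch below shows two of them in Python/NumPy. This is illustrative only: the paper's implementations use Cell SIMD intrinsics and explicit memory orchestration, not NumPy.

      import numpy as np

      def spmv(indptr, indices, data, x):
          # Sparse matrix-vector product y = A @ x, with A stored in CSR form.
          y = np.zeros(len(indptr) - 1)
          for row in range(len(y)):
              lo, hi = indptr[row], indptr[row + 1]
              y[row] = data[lo:hi] @ x[indices[lo:hi]]
          return y

      def jacobi_sweep(u):
          # One sweep of the 5-point Laplacian stencil over the grid interior.
          v = u.copy()
          v[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                                  u[1:-1, :-2] + u[1:-1, 2:])
          return v

      # 2x2 identity in CSR form applied to [3, 4] gives [3, 4].
      print(spmv(np.array([0, 1, 2]), np.array([0, 1]),
                 np.array([1.0, 1.0]), np.array([3.0, 4.0])))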

  1. RXY/DRXY-a postprocessing graphical system for scientific computation

    International Nuclear Information System (INIS)

    Jin Qijie

    1990-01-01

    Scientific computing requires computer graphics for the visualization of its results. The design goals and functions of a postprocessing graphical system for scientific computation are described, together with a brief account of its implementation.

  2. HPCToolkit: performance tools for scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Tallent, N; Mellor-Crummey, J; Adhianto, L; Fagan, M; Krentel, M [Department of Computer Science, Rice University, Houston, TX 77005 (United States)

    2008-07-15

    As part of the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC) program, science teams are tackling problems that require simulation and modeling on petascale computers. As part of activities associated with the SciDAC Center for Scalable Application Development Software (CScADS) and the Performance Engineering Research Institute (PERI), Rice University is building software tools for performance analysis of scientific applications on the leadership-class platforms. In this poster abstract, we briefly describe the HPCToolkit performance tools and how they can be used to pinpoint bottlenecks in SPMD and multi-threaded parallel codes. We demonstrate HPCToolkit's utility by applying it to two SciDAC applications: the S3D code for simulation of turbulent combustion and the MFDn code for ab initio calculations of microscopic structure of nuclei.

  3. HPCToolkit: performance tools for scientific computing

    International Nuclear Information System (INIS)

    Tallent, N; Mellor-Crummey, J; Adhianto, L; Fagan, M; Krentel, M

    2008-01-01

    As part of the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC) program, science teams are tackling problems that require simulation and modeling on petascale computers. As part of activities associated with the SciDAC Center for Scalable Application Development Software (CScADS) and the Performance Engineering Research Institute (PERI), Rice University is building software tools for performance analysis of scientific applications on the leadership-class platforms. In this poster abstract, we briefly describe the HPCToolkit performance tools and how they can be used to pinpoint bottlenecks in SPMD and multi-threaded parallel codes. We demonstrate HPCToolkit's utility by applying it to two SciDAC applications: the S3D code for simulation of turbulent combustion and the MFDn code for ab initio calculations of microscopic structure of nuclei

  4. Software Engineering for Scientific Computer Simulations

    Science.gov (United States)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.

  5. Scientific computing in electrical engineering SCEE 2010

    Energy Technology Data Exchange (ETDEWEB)

    Michielsen, Bastiaan [Office National d' Etudes et de Recherches Aerospatiales (ONERA), 31 - Toulouse (France); Poirier, Jean-Rene (eds.) [LAPLACE-ENSEEIHT, Toulouse (France)

    2012-07-01

    Selected from papers presented at the 8th Scientific Computation in Electrical Engineering conference in Toulouse in 2010, the contributions to this volume cover every angle of numerically modelling electronic and electrical systems, including computational electromagnetics, circuit theory and simulation and device modelling. On computational electromagnetics, the chapters examine cutting-edge material ranging from low-frequency electrical machine modelling problems to issues in high-frequency scattering. Regarding circuit theory and simulation, the book details the most advanced techniques for modelling networks with many thousands of components. Modelling devices at microscopic levels is covered by a number of fundamental mathematical physics papers, while numerous papers on model order reduction help engineers and systems designers to bring their modelling of industrial-scale systems within the reach of present-day computational power. Complementing these more specific papers, the volume also contains a selection of mathematical methods which can be used in any application domain. (orig.)

  6. Scientific computing vol III - approximation and integration

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

  7. Scientific computing vol II - eigenvalues and optimization

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the second of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses more advanced topics than volume one, and is largely not a prerequisite for volume three. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 49 examples, 110 exercises, 66 algorithms, 24 interactive JavaScript programs, 77 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in LAPACK, GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either upper level undergraduate...

  8. Scientific computing with MATLAB and Octave

    CERN Document Server

    Quarteroni, Alfio; Gervasio, Paola

    2014-01-01

    This textbook is an introduction to Scientific Computing, in which several numerical methods for the computer-based solution of certain classes of mathematical problems are illustrated. The authors show how to compute the zeros, the extrema, and the integrals of continuous functions, solve linear systems, approximate functions using polynomials and construct accurate approximations for the solution of ordinary and partial differential equations. To make the format concrete and appealing, the programming environments Matlab and Octave are adopted as faithful companions. The book contains the solutions to several problems posed in exercises and examples, often originating from important applications. At the end of each chapter, a specific section is devoted to subjects which were not addressed in the book and contains bibliographical references for a more comprehensive treatment of the material. From the review: ".... This carefully written textbook, the third English edition, contains substantial new developme...

  9. Scientific Computing in the CH Programming Language

    Directory of Open Access Journals (Sweden)

    Harry H. Cheng

    1993-01-01

    We have developed a general-purpose block-structured interpretive programming language. The syntax and semantics of this language, called CH, are similar to C. CH retains most features of C from the scientific computing point of view. In this paper, the extension of C to CH for numerical computation of real numbers will be described. Metanumbers of −0.0, 0.0, Inf, −Inf, and NaN are introduced in CH. Through these metanumbers, the power of the IEEE 754 arithmetic standard is easily available to the programmer. These metanumbers are extended to commonly used mathematical functions in the spirit of the IEEE 754 standard and ANSI C. The definitions for manipulation of these metanumbers in I/O; arithmetic, relational, and logic operations; and built-in polymorphic mathematical functions are defined. The capabilities of bitwise, assignment, address and indirection, increment and decrement, as well as type conversion operations in ANSI C are extended in CH. In this paper, mainly new linguistic features of CH in comparison to C will be described. Example programs programmed in CH with metanumbers and polymorphic mathematical functions will demonstrate capabilities of CH in scientific computing.
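
    The metanumber semantics described here follow the IEEE 754 behaviour that most modern environments now expose. A sketch of the same special values in Python (not CH):

      import math

      inf, nan = float('inf'), float('nan')
      print(1.0 / inf)                   # 0.0
      print(inf - inf)                   # nan: indeterminate forms yield NaN
      print(nan == nan)                  # False: NaN compares unequal to everything
      print(math.copysign(1.0, -0.0))    # -1.0: negative zero keeps its sign bit
      print(math.atan2(0.0, -1.0), math.atan2(-0.0, -1.0))  # pi and -pi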

  10. Visual computing scientific visualization and imaging systems

    CERN Document Server

    2014-01-01

    This volume aims to stimulate discussions on research involving the use of data and digital images as an understanding approach for analysis and visualization of phenomena and experiments. The emphasis is put not only on graphically representing data as a way of increasing its visual analysis, but also on the imaging systems which contribute greatly to the comprehension of real cases. Scientific Visualization and Imaging Systems encompass multidisciplinary areas, with applications in many knowledge fields such as Engineering, Medicine, Material Science, Physics, Geology, Geographic Information Systems, among others. This book is a selection of 13 revised and extended research papers presented in the International Conference on Advanced Computational Engineering and Experimenting -ACE-X conferences 2010 (Paris), 2011 (Algarve), 2012 (Istanbul) and 2013 (Madrid). The examples were particularly chosen from materials research, medical applications, general concepts applied in simulations and image analysis and ot...

  11. Designing Scientific Software for Heterogeneous Computing

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig

    To obtain high performance, algorithms and data structures must be designed to utilize the underlying parallel architecture. The architectural changes in hardware design within the last decade, from single-core to multi- and many-core architectures, require software developers to identify and properly implement methods that exploit the available parallelism; this makes parallel software design not only applicable but also a challenge for scientific software developers at all levels. We have developed a generic C++ library for fast prototyping of large-scale PDE solvers based on flexible-order finite difference approximations on structured regular grids. The library is designed with a high-abstraction interface to improve developer productivity, and is based on modern template-based design concepts as described in Glimberg, Engsig-Karup, Nielsen & Dammann (2013). It utilizes heterogeneous CPU/GPU environments in order to maximize computational throughput...

  12. OPENING REMARKS: Scientific Discovery through Advanced Computing

    Science.gov (United States)

    Strayer, Michael

    2006-01-01

    Good morning. Welcome to SciDAC 2006 and Denver. I share greetings from the new Undersecretary for Energy, Ray Orbach. Five years ago SciDAC was launched as an experiment in computational science. The goal was to form partnerships among science applications, computer scientists, and applied mathematicians to take advantage of the potential of emerging terascale computers. This experiment has been a resounding success. SciDAC has emerged as a powerful concept for addressing some of the biggest challenges facing our world. As significant as these successes were, I believe there is also significance in the teams that achieved them. In addition to their scientific aims these teams have advanced the overall field of computational science and set the stage for even larger accomplishments as we look ahead to SciDAC-2. I am sure that many of you are expecting to hear about the results of our current solicitation for SciDAC-2. I’m afraid we are not quite ready to make that announcement. Decisions are still being made and we will announce the results later this summer. Nearly 250 unique proposals were received and evaluated, involving literally thousands of researchers, postdocs, and students. These collectively requested more than five times our expected budget. This response is a testament to the success of SciDAC in the community. In SciDAC-2 our budget has been increased to about $70 million for FY 2007 and our partnerships have expanded to include the Environment and National Security missions of the Department. The National Science Foundation has also joined as a partner. These new partnerships are expected to expand the application space of SciDAC, and broaden the impact and visibility of the program. We have, with our recent solicitation, expanded to turbulence, computational biology, and groundwater reactive modeling and simulation. We are currently talking with the Department’s applied energy programs about risk assessment, optimization of complex systems - such

  13. International Symposium on Scientific Computing, Computer Arithmetic and Validated Numerics

    CERN Document Server

    DEVELOPMENTS IN RELIABLE COMPUTING

    1999-01-01

    The SCAN conference, the International Symposium on Scientific Computing, Computer Arithmetic and Validated Numerics, takes place biannually under the joint auspices of GAMM (Gesellschaft für Angewandte Mathematik und Mechanik) and IMACS (International Association for Mathematics and Computers in Simulation). SCAN-98 attracted more than 100 participants from 21 countries all over the world. During the four days from September 22 to 25, nine highlighted plenary lectures and over 70 contributed talks were given. These figures indicate a large participation, which was partly caused by the attraction of the organizing country, Hungary, but the effective support system also contributed to the success. The conference was substantially supported by the Hungarian Research Fund OTKA, GAMM, the National Technology Development Board OMFB, and by the József Attila University. Due to this funding, it was possible to subsidize the participation of over 20 scientists, mainly from Eastern European countries. I...

  14. Computational Simulations and the Scientific Method

    Science.gov (United States)

    Kleb, Bil; Wood, Bill

    2005-01-01

    As scientific simulation software becomes more complicated, the scientific-software implementor's need for component tests from new model developers becomes more crucial. The community's ability to follow the basic premise of the Scientific Method requires independently repeatable experiments, and model innovators are in the best position to create these test fixtures. Scientific software developers also need to quickly judge the value of the new model, i.e., its cost-to-benefit ratio in terms of gains provided by the new model and implementation risks such as cost, time, and quality. This paper asks two questions. The first is whether other scientific software developers would find published component tests useful, and the second is whether model innovators think publishing test fixtures is a feasible approach.
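
    A minimal sketch of what such a published component test might look like (a hypothetical model and tolerance, in pytest style):

      import math

      def decay_model_euler(y0, k, t, n=10000):
          # The "new model" under test: forward-Euler integration of y' = -k*y.
          dt, y = t / n, y0
          for _ in range(n):
              y -= k * y * dt
          return y

      def test_decay_against_analytic_solution():
          # Independently repeatable fixture: compare against the exact solution.
          y0, k, t = 2.0, 0.5, 3.0
          expected = y0 * math.exp(-k * t)
          assert abs(decay_model_euler(y0, k, t) - expected) < 1e-3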

  15. A high performance scientific cloud computing environment for materials simulations

    OpenAIRE

    Jorissen, Kevin; Vila, Fernando D.; Rehr, John J.

    2011-01-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including...

  16. Studying Scientific Discovery by Computer Simulation.

    Science.gov (United States)

    1983-03-30

    Mendel’s laws of inheritance, the law of Gay-Lussac for gaseous reactions, the law of Dulong and Petit, the derivation of atomic weights by Avogadro... [Report documentation page; only the keyword and abstract blocks are recoverable.] Keywords: scientific discovery; intrinsic properties; physical laws; extensive terms; data-driven heuristics; intensive terms; theory-driven heuristics; conservation laws. Abstract: Scientific discovery...

  17. Berkeley Lab Computing Sciences: Accelerating Scientific Discovery

    International Nuclear Information System (INIS)

    Hules, John A.

    2008-01-01

    Scientists today rely on advances in computer science, mathematics, and computational science, as well as large-scale computing and networking facilities, to increase our understanding of ourselves, our planet, and our universe. Berkeley Lab's Computing Sciences organization researches, develops, and deploys new tools and technologies to meet these needs and to advance research in such areas as global climate change, combustion, fusion energy, nanotechnology, biology, and astrophysics

  18. XVIS: Visualization for the Extreme-Scale Scientific-Computation Ecosystem Final Scientific/Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States); Maynard, Robert [Kitware, Inc., Clifton Park, NY (United States)

    2017-10-27

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. The XVis project brought together collaborators from predominant DOE projects for visualization on accelerators and combining their respective features into a new visualization toolkit called VTK-m.

  19. Good enough practices in scientific computing.

    Science.gov (United States)

    Wilson, Greg; Bryan, Jennifer; Cranston, Karen; Kitzes, Justin; Nederbragt, Lex; Teal, Tracy K

    2017-06-01

    Computers are now essential in all branches of science, but most researchers are never taught the equivalent of basic lab skills for research computing. As a result, data can get lost, analyses can take much longer than necessary, and researchers are limited in how effectively they can work with software and data. Computing workflows need to follow the same practices as lab projects and notebooks, with organized data, documented steps, and the project structured for reproducibility, but researchers new to computing often don't know where to start. This paper presents a set of good computing practices that every researcher can adopt, regardless of their current level of computational skill. These practices, which encompass data management, programming, collaborating with colleagues, organizing projects, tracking work, and writing manuscripts, are drawn from a wide variety of published sources from our daily lives and from our work with volunteer organizations that have delivered workshops to over 11,000 people since 2010.

  20. Computer-supported analysis of scientific measurements

    NARCIS (Netherlands)

    de Jong, Hidde

    1998-01-01

    In the past decade, large-scale databases and knowledge bases have become available to researchers working in a range of scientific disciplines. In many cases these databases and knowledge bases contain measurements of properties of physical objects which have been obtained in experiments or at

  1. A high performance scientific cloud computing environment for materials simulations

    Science.gov (United States)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a JAVA-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.

  2. 3rd International Conference on High Performance Scientific Computing

    CERN Document Server

    Kostina, Ekaterina; Phu, Hoang; Rannacher, Rolf

    2008-01-01

    This proceedings volume contains a selection of papers presented at the Third International Conference on High Performance Scientific Computing held at the Hanoi Institute of Mathematics, Vietnamese Academy of Science and Technology (VAST), March 6-10, 2006. The conference has been organized by the Hanoi Institute of Mathematics, Interdisciplinary Center for Scientific Computing (IWR), Heidelberg, and its International PhD Program ``Complex Processes: Modeling, Simulation and Optimization'', and Ho Chi Minh City University of Technology. The contributions cover the broad interdisciplinary spectrum of scientific computing and present recent advances in theory, development of methods, and applications in practice. Subjects covered are mathematical modelling, numerical simulation, methods for optimization and control, parallel computing, software development, applications of scientific computing in physics, chemistry, biology and mechanics, environmental and hydrology problems, transport, logistics and site loca...

  3. 5th International Conference on High Performance Scientific Computing

    CERN Document Server

    Hoang, Xuan; Rannacher, Rolf; Schlöder, Johannes

    2014-01-01

    This proceedings volume gathers a selection of papers presented at the Fifth International Conference on High Performance Scientific Computing, which took place in Hanoi on March 5-9, 2012. The conference was organized by the Institute of Mathematics of the Vietnam Academy of Science and Technology (VAST), the Interdisciplinary Center for Scientific Computing (IWR) of Heidelberg University, Ho Chi Minh City University of Technology, and the Vietnam Institute for Advanced Study in Mathematics. The contributions cover the broad interdisciplinary spectrum of scientific computing and present recent advances in theory, development of methods, and practical applications. Subjects covered include mathematical modeling; numerical simulation; methods for optimization and control; parallel computing; software development; and applications of scientific computing in physics, mechanics and biomechanics, material science, hydrology, chemistry, biology, biotechnology, medicine, sports, psychology, transport, logistics, com...

  4. 6th International Conference on High Performance Scientific Computing

    CERN Document Server

    Phu, Hoang; Rannacher, Rolf; Schlöder, Johannes

    2017-01-01

    This proceedings volume highlights a selection of papers presented at the Sixth International Conference on High Performance Scientific Computing, which took place in Hanoi, Vietnam on March 16-20, 2015. The conference was jointly organized by the Heidelberg Institute of Theoretical Studies (HITS), the Institute of Mathematics of the Vietnam Academy of Science and Technology (VAST), the Interdisciplinary Center for Scientific Computing (IWR) at Heidelberg University, and the Vietnam Institute for Advanced Study in Mathematics, Ministry of Education. The contributions cover a broad, interdisciplinary spectrum of scientific computing and showcase recent advances in theory, methods, and practical applications. Subjects covered include numerical simulation, methods for optimization and control, parallel computing, and software development, as well as the applications of scientific computing in physics, mechanics, biomechanics and robotics, material science, hydrology, biotechnology, medicine, transport, scheduling, and in...

  5. Frontiers of massively parallel scientific computation

    International Nuclear Information System (INIS)

    Fischer, J.R.

    1987-07-01

    Practical applications using massively parallel computer hardware first appeared during the 1980s. Their development was motivated by the need for computing power orders of magnitude beyond that available today for tasks such as numerical simulation of complex physical and biological processes, generation of interactive visual displays, satellite image analysis, and knowledge based systems. Representative of the first generation of this new class of computers is the Massively Parallel Processor (MPP). A team of scientists was provided the opportunity to test and implement their algorithms on the MPP. The first results are presented. The research spans a broad variety of applications including Earth sciences, physics, signal and image processing, computer science, and graphics. The performance of the MPP was very good. Results obtained using the Connection Machine and the Distributed Array Processor (DAP) are presented

  6. Monte Carlo strategies in scientific computing

    CERN Document Server

    Liu, Jun S

    2008-01-01

    This paperback edition is a reprint of the 2001 Springer edition. This book provides a self-contained and up-to-date treatment of the Monte Carlo method and develops a common framework under which various Monte Carlo techniques can be "standardized" and compared. Given the interdisciplinary nature of the topics and a moderate prerequisite for the reader, this book should be of interest to a broad audience of quantitative researchers such as computational biologists, computer scientists, econometricians, engineers, probabilists, and statisticians. It can also be used as the textbook for a graduate-level course on Monte Carlo methods. Many problems discussed in the later chapters can be potential thesis topics for master's or PhD students in statistics or computer science departments. Jun Liu is Professor of Statistics at Harvard University, with a courtesy Professor appointment at the Harvard Biostatistics Department. Professor Liu was the recipient of the 2002 COPSS Presidents' Award, the most prestigious one for sta...
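
    To make the subject concrete, here is an illustrative sketch (not an example from the book) of the basic Monte Carlo method: estimate an integral by a sample mean and attach a standard error:

      import random, math

      def mc_integral(f, a, b, n=100_000):
          # Estimate the integral of f over [a, b] from n uniform samples.
          samples = [f(random.uniform(a, b)) for _ in range(n)]
          mean = sum(samples) / n
          var = sum((s - mean) ** 2 for s in samples) / (n - 1)
          width = b - a
          return width * mean, width * math.sqrt(var / n)  # estimate, std. error

      est, err = mc_integral(math.sin, 0.0, math.pi)  # exact value is 2
      print(f"{est:.4f} +/- {err:.4f}")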

  7. Introduction to scientific computing and data analysis

    CERN Document Server

    Holmes, Mark H

    2016-01-01

    This textbook provides an introduction to numerical computing and its applications in science and engineering. The topics covered include those usually found in an introductory course, as well as those that arise in data analysis. This includes optimization and regression-based methods using the singular value decomposition. The emphasis is on problem solving, and there are numerous exercises throughout the text concerning applications in engineering and science. The essential role of the mathematical theory underlying the methods is also considered, both for understanding how the method works, as well as how the error in the computation depends on the method being used. The MATLAB codes used to produce most of the figures and data tables in the text are available on the author’s website and SpringerLink.
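
    As a brief illustration of one such topic, least-squares regression via the singular value decomposition (a sketch in Python/NumPy; the book's own codes are in MATLAB):

      import numpy as np

      rng = np.random.default_rng(0)
      x = np.linspace(0.0, 1.0, 50)
      y = 1.0 + 2.0 * x + 0.05 * rng.standard_normal(50)  # noisy straight line

      A = np.column_stack([np.ones_like(x), x])  # design matrix for y = c0 + c1*x
      U, s, Vt = np.linalg.svd(A, full_matrices=False)
      coeffs = Vt.T @ ((U.T @ y) / s)            # pseudoinverse solution
      print(coeffs)                              # close to [1.0, 2.0]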

  8. Highly parallel machines and future of scientific computing

    International Nuclear Information System (INIS)

    Singh, G.S.

    1992-01-01

    The computing requirements of large-scale scientific computing have always been ahead of what state-of-the-art hardware could supply in the form of the supercomputers of the day, and for any single-processor system the limit to growth in computing power was recognized some years ago. Now, with the advent of parallel computing systems, the availability of machines with the required computing power seems a reality. In this paper the author tries to visualize the future of large-scale scientific computing in the penultimate decade of the present century. The author summarizes trends in parallel computers and emphasizes the need for a better programming environment and software tools for optimal performance. The author concludes this paper with a critique of parallel architectures, software tools, and algorithms. (author). 10 refs., 2 tabs

  9. Python for Scientific Computing Education: Modeling of Queueing Systems

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2014-01-01

    In this paper, we present the methodology for the introduction to scientific computing based on model-centered learning. We propose multiphase queueing systems as a basis for learning objects. We use Python and parallel programming for implementing the models and present the computer code and results of stochastic simulations.
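
    For flavor, a simplified sketch of my own (a single M/M/1 queue rather than the authors' multiphase systems): waiting times can be simulated with Lindley's recurrence:

      import random

      def mm1_mean_wait(lam, mu, n=100_000, seed=1):
          # Simulate n customers of an M/M/1 queue; return the mean waiting time.
          rng = random.Random(seed)
          wait, total = 0.0, 0.0
          for _ in range(n):
              arrival_gap = rng.expovariate(lam)
              service = rng.expovariate(mu)
              # Lindley's recurrence: W(k+1) = max(0, W(k) + S(k) - A(k+1))
              wait = max(0.0, wait + service - arrival_gap)
              total += wait
          return total / n

      # Queueing theory predicts E[W] = rho / (mu - lam) with rho = lam / mu,
      # i.e. 1.0 for lam = 0.5, mu = 1.0.
      print(mm1_mean_wait(0.5, 1.0))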

  10. ASCR Cybersecurity for Scientific Computing Integrity

    Energy Technology Data Exchange (ETDEWEB)

    Peisert, Sean

    2015-02-27

    The Department of Energy (DOE) has the responsibility to address the energy, environmental, and nuclear security challenges that face our nation. Much of DOE’s enterprise involves distributed, collaborative teams; a significant fraction involves “open science,” which depends on multi-institutional, often international collaborations that must access or share significant amounts of information between institutions and over networks around the world. The mission of the Office of Science is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security of the United States. The ability of DOE to execute its responsibilities depends critically on its ability to assure the integrity and availability of scientific facilities and computer systems, and of the scientific, engineering, and operational software and data that support its mission.

  11. Scientific computing an introduction using Maple and Matlab

    CERN Document Server

    Gander, Walter; Kwok, Felix

    2014-01-01

    Scientific computing is the study of how to use computers effectively to solve problems that arise from the mathematical modeling of phenomena in science and engineering. It is based on mathematics, numerical and symbolic/algebraic computations and visualization. This book serves as an introduction to both the theory and practice of scientific computing, with each chapter presenting the basic algorithms that serve as the workhorses of many scientific codes; we explain both the theory behind these algorithms and how they must be implemented in order to work reliably in finite-precision arithmetic. The book includes many programs written in Matlab and Maple – Maple is often used to derive numerical algorithms, whereas Matlab is used to implement them. The theory is developed in such a way that students can learn by themselves as they work through the text. Each chapter contains numerous examples and problems to help readers understand the material “hands-on”.

  12. Introduction to the LaRC central scientific computing complex

    Science.gov (United States)

    Shoosmith, John N.

    1993-01-01

    The computers and associated equipment that make up the Central Scientific Computing Complex of the Langley Research Center are briefly described. The electronic networks that provide access to the various components of the complex and a number of areas that can be used by Langley and contractor staff for special applications (scientific visualization, image processing, software engineering, and grid generation) are also described. Flight simulation facilities that use the central computers are described. Management of the complex, procedures for its use, and available services and resources are discussed. This document is intended for new users of the complex, for current users who wish to keep apprised of changes, and for visitors who need to understand the role of central scientific computers at Langley.

  13. Ferrofluids: Modeling, numerical analysis, and scientific computation

    Science.gov (United States)

    Tomas, Ignacio

    This dissertation presents some developments in the Numerical Analysis of Partial Differential Equations (PDEs) describing the behavior of ferrofluids. The most widely accepted PDE model for ferrofluids is the Micropolar model proposed by R.E. Rosensweig. The Micropolar Navier-Stokes Equations (MNSE) is a subsystem of PDEs within the Rosensweig model. Being a simplified version of the much bigger system of PDEs proposed by Rosensweig, the MNSE are a natural starting point of this thesis. The MNSE couple linear velocity u, angular velocity w, and pressure p. We propose and analyze a first-order semi-implicit fully-discrete scheme for the MNSE, which decouples the computation of the linear and angular velocities, is unconditionally stable and delivers optimal convergence rates under assumptions analogous to those used for the Navier-Stokes equations. Moving onto the much more complex Rosensweig's model, we provide a definition (approximation) for the effective magnetizing field h, and explain the assumptions behind this definition. Unlike previous definitions available in the literature, this new definition is able to accommodate the effect of external magnetic fields. Using this definition we set up the system of PDEs coupling linear velocity u, pressure p, angular velocity w, magnetization m, and magnetic potential ϕ. We show that this system is energy-stable and devise a numerical scheme that mimics the same stability property. We prove that solutions of the numerical scheme always exist and, under certain simplifying assumptions, that the discrete solutions converge. A notable outcome of the analysis of the numerical scheme for the Rosensweig's model is the choice of finite element spaces that allow the construction of an energy-stable scheme. Finally, with the lessons learned from Rosensweig's model, we develop a diffuse-interface model describing the behavior of two-phase ferrofluid flows and present an energy-stable numerical scheme for this model. For a
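
    For context, the MNSE coupling u, w, and p are commonly written as follows (the generic form found in the literature, e.g. Lukaszewicz's monograph on micropolar fluids; the dissertation's exact coefficients may differ, and c_1, c_2 stand in for the angular viscosities):

      \begin{aligned}
        u_t + (u\cdot\nabla)\,u - (\nu+\nu_r)\,\Delta u + \nabla p
          &= 2\nu_r\,\nabla\times w + f, \qquad \nabla\cdot u = 0,\\
        j\,\bigl(w_t + (u\cdot\nabla)\,w\bigr) - c_1\,\Delta w
          - c_2\,\nabla(\nabla\cdot w) + 4\nu_r\,w
          &= 2\nu_r\,\nabla\times u + g.
      \end{aligned}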

  14. Building a High Performance Computing Infrastructure for Novosibirsk Scientific Center

    International Nuclear Information System (INIS)

    Adakin, A; Chubarov, D; Nikultsev, V; Belov, S; Kaplin, V; Sukharev, A; Zaytsev, A; Kalyuzhny, V; Kuchin, N; Lomakin, S

    2011-01-01

    Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of Russian Academy of Sciences including Budker Institute of Nuclear Physics (BINP), Institute of Computational Technologies (ICT), and Institute of Computational Mathematics and Mathematical Geophysics (ICM and MG). Since each institute has specific requirements on the architecture of the computing farms involved in its research field, currently we've got several computing facilities hosted by NSC institutes, each optimized for the particular set of tasks, of which the largest are the NSU Supercomputer Center, Siberian Supercomputer Center (ICM and MG), and a Grid Computing Facility of BINP. Recently a dedicated optical network with the initial bandwidth of 10 Gbps connecting these three facilities was built in order to make it possible to share the computing resources among the research communities of participating institutes, thus providing a common platform for building the computing infrastructure for various scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technologies based on XEN and KVM platforms. The solution implemented was tested thoroughly within the computing environment of KEDR detector experiment which is being carried out at BINP, and foreseen to be applied to the use cases of other HEP experiments in the upcoming future.

  15. InSAR Scientific Computing Environment

    Science.gov (United States)

    Rosen, Paul A.; Sacco, Gian Franco; Gurrola, Eric M.; Zebker, Howard A.

    2011-01-01

    This computing environment is the next generation of geodetic image processing technology for repeat-pass Interferometric Synthetic Aperture Radar (InSAR) sensors, identified by the community as a needed capability to provide flexibility and extensibility in reducing measurements from radar satellites and aircraft to new geophysical products. This software allows users of interferometric radar data the flexibility to process from Level 0 to Level 4 products using a variety of algorithms and for a range of available sensors. There are many radar satellites in orbit today delivering to the science community data of unprecedented quantity and quality, making possible large-scale studies in climate research, natural hazards, and the Earth's ecosystem. The proposed DESDynI mission, now under consideration by NASA for launch later in this decade, would provide time series and multiimage measurements that permit 4D models of Earth surface processes so that, for example, climate-induced changes over time would become apparent and quantifiable. This advanced data processing technology, applied to a global data set such as from the proposed DESDynI mission, enables a new class of analyses at time and spatial scales unavailable using current approaches. This software implements an accurate, extensible, and modular processing system designed to realize the full potential of InSAR data from future missions such as the proposed DESDynI, existing radar satellite data, as well as data from the NASA UAVSAR (Uninhabited Aerial Vehicle Synthetic Aperture Radar), and other airborne platforms. The processing approach has been re-thought in order to enable multi-scene analysis by adding new algorithms and data interfaces, to permit user-reconfigurable operation and extensibility, and to capitalize on codes already developed by NASA and the science community. The framework incorporates modern programming methods based on recent research, including object-oriented scripts controlling legacy and

  16. Learning SciPy for numerical and scientific computing

    CERN Document Server

    Silva

    2013-01-01

    A step-by-step practical tutorial with plenty of examples on research-based problems from various areas of science that prove how simple, yet effective, it is to provide solutions based on SciPy. This book is targeted at anyone with basic knowledge of Python, a somewhat advanced command of mathematics/physics, and an interest in engineering or scientific applications---this is broadly what we refer to as scientific computing. This book will be of critical importance to programmers and scientists who have basic Python knowledge and would like to be able to do scientific and numerical computatio
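
    In the spirit of the book, a small taste of SciPy: a standard optimization problem and a numerical integral, each in a few lines (a generic illustration, not an excerpt from the book):

```python
import numpy as np
from scipy import integrate, optimize

# Minimize the Rosenbrock function, a standard optimization test problem.
result = optimize.minimize(optimize.rosen, x0=np.zeros(5), method="BFGS")
print("minimum found near all-ones:", result.x.round(3))

# Numerically integrate exp(-x^2) over the real line; exact value is sqrt(pi).
value, error = integrate.quad(lambda x: np.exp(-x**2), -np.inf, np.inf)
print(value, "vs", np.sqrt(np.pi))
```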

  17. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Subcommittee Report on Scientific and Technical Information

    Energy Technology Data Exchange (ETDEWEB)

    Hey, Tony [eScience Institute, University of Washington]; Agarwal, Deborah [Lawrence Berkeley National Laboratory]; Borgman, Christine [University of California, Los Angeles]; Cartaro, Concetta [SLAC National Accelerator Laboratory]; Crivelli, Silvia [Lawrence Berkeley National Laboratory]; Van Dam, Kerstin Kleese [Pacific Northwest National Laboratory]; Luce, Richard [University of Oklahoma]; Shankar, Arjun [CADES, Oak Ridge National Laboratory]; Trefethen, Anne [University of Oxford]; Wade, Alex [Microsoft Research, Microsoft Corporation]; Williams, Dean [Lawrence Livermore National Laboratory]

    2015-09-04

    The Advanced Scientific Computing Advisory Committee (ASCAC) was charged to form a standing subcommittee to review the Department of Energy’s Office of Scientific and Technical Information (OSTI) and to begin by assessing the quality and effectiveness of OSTI’s recent and current products and services and to comment on its mission and future directions in the rapidly changing environment for scientific publication and data. The Committee met with OSTI staff and reviewed available products, services and other materials. This report summarizes their initial findings and recommendations.

  18. Component-based software for high-performance scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  19. Component-based software for high-performance scientific computing

    International Nuclear Information System (INIS)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly

  20. High-performance scientific computing in the cloud

    Science.gov (United States)

    Jorissen, Kevin; Vila, Fernando; Rehr, John

    2011-03-01

    Cloud computing has the potential to open up high-performance computational science to a much broader class of researchers, owing to its ability to provide on-demand, virtualized computational resources. However, before such approaches can become commonplace, user-friendly tools must be developed that hide the unfamiliar cloud environment and streamline the management of cloud resources for many scientific applications. We have recently shown that high-performance cloud computing is feasible for parallelized x-ray spectroscopy calculations. We now present benchmark results for a wider selection of scientific applications focusing on electronic structure and spectroscopic simulation software in condensed matter physics. These applications are driven by an improved portable interface that can manage virtual clusters and run various applications in the cloud. We also describe a next generation of cluster tools, aimed at improved performance and a more robust cluster deployment. Supported by NSF grant OCI-1048052.

  1. Scientific Discovery through Advanced Computing in Plasma Science

    Science.gov (United States)

    Tang, William

    2005-03-01

    Advanced computing is generally recognized to be an increasingly vital tool for accelerating progress in scientific research during the 21st Century. For example, the Department of Energy's ``Scientific Discovery through Advanced Computing'' (SciDAC) Program was motivated in large measure by the fact that formidable scientific challenges in its research portfolio could best be addressed by utilizing the combination of the rapid advances in supercomputing technology together with the emergence of effective new algorithms and computational methodologies. The imperative is to translate such progress into corresponding increases in the performance of the scientific codes used to model complex physical systems such as those encountered in high temperature plasma research. If properly validated against experimental measurements and analytic benchmarks, these codes can provide reliable predictive capability for the behavior of a broad range of complex natural and engineered systems. This talk reviews recent progress and future directions for advanced simulations with some illustrative examples taken from the plasma science applications area. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by the combination of access to powerful new computational resources together with innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning a huge range in time and space scales. In particular, the plasma science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPPs). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations

  2. Scientific Computing Strategic Plan for the Idaho National Laboratory

    International Nuclear Information System (INIS)

    Whiting, Eric Todd

    2015-01-01

    Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory's (INL's) challenge and charge, and is central to INL's ongoing success. Computing is an essential part of INL's future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.

  3. PARA'04, State-of-the-art in scientific computing

    DEFF Research Database (Denmark)

    Madsen, Kaj; Wasniewski, Jerzy

    This meeting in the series, the PARA'04 Workshop with the title ``State of the Art in Scientific Computing'', was held in Lyngby, Denmark, June 20-23, 2004. The PARA'04 Workshop was organized by Jack Dongarra from the University of Tennessee and Oak Ridge National Laboratory, and Kaj Madsen and J...

  4. Topics in numerical partial differential equations and scientific computing

    CERN Document Server

    2016-01-01

    Numerical partial differential equations (PDEs) are an important part of numerical simulation, the third component of the modern methodology for science and engineering alongside the traditional components of theory and experiment. This volume contains papers that originated with the collaborative research of the teams that participated in the IMA Workshop for Women in Applied Mathematics: Numerical Partial Differential Equations and Scientific Computing in August 2014.
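
    As a flavor of the subject matter, a minimal explicit finite-difference solver for the 1D heat equation, checked against its exact solution (an illustrative sketch, not drawn from the volume):

```python
import numpy as np

# Explicit finite differences for u_t = u_xx on [0, 1] with u = 0 at both ends.
nx, nt = 101, 2000
dx = 1.0 / (nx - 1)
dt = 0.4 * dx**2                       # respects the stability bound dt <= dx^2/2
x = np.linspace(0.0, 1.0, nx)
u = np.sin(np.pi * x)                  # initial condition

for _ in range(nt):
    # Second-difference update on interior points; RHS is evaluated first.
    u[1:-1] += dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])

# Exact solution of this problem is sin(pi x) * exp(-pi^2 t).
t_final = nt * dt
exact = np.sin(np.pi * x) * np.exp(-np.pi**2 * t_final)
print("max error:", np.abs(u - exact).max())
```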

  5. Data-flow oriented visual programming libraries for scientific computing

    NARCIS (Netherlands)

    Maubach, J.M.L.; Drenth, W.D.; Sloot, P.M.A.

    2002-01-01

    The growing release of scientific computational software does not seem to aid the implementation of complex numerical algorithms. Released libraries lack a common standard interface with regard to, for instance, finite element, finite difference, or finite volume discretizations. And, libraries written in standard

  6. Ontology-Driven Discovery of Scientific Computational Entities

    Science.gov (United States)

    Brazier, Pearl W.

    2010-01-01

    Many geoscientists use modern computational resources, such as software applications, Web services, scientific workflows and datasets that are readily available on the Internet, to support their research and many common tasks. These resources are often shared via human contact and sometimes stored in data portals; however, they are not necessarily…

  7. National Energy Research Scientific Computing Center 2007 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Hules, John A.; Bashor, Jon; Wang, Ucilia; Yarris, Lynn; Preuss, Paul

    2008-10-23

    This report presents highlights of the research conducted on NERSC computers in a variety of scientific disciplines during the year 2007. It also reports on changes and upgrades to NERSC's systems and services as well as activities of NERSC staff.

  8. Research initiatives for plug-and-play scientific computing

    International Nuclear Information System (INIS)

    McInnes, Lois Curfman; Dahlgren, Tamara; Nieplocha, Jarek; Bernholdt, David; Allan, Ben; Armstrong, Rob; Chavarria, Daniel; Elwasif, Wael; Gorton, Ian; Kenny, Joe; Krishnan, Manoj; Malony, Allen; Norris, Boyana; Ray, Jaideep; Shende, Sameer

    2007-01-01

    This paper introduces three component technology initiatives within the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS) that address ever-increasing productivity challenges in creating, managing, and applying simulation software to scientific discovery. By leveraging the Common Component Architecture (CCA), a new component standard for high-performance scientific computing, these initiatives tackle difficulties at different but related levels in the development of component-based scientific software: (1) deploying applications on massively parallel and heterogeneous architectures, (2) investigating new approaches to the runtime enforcement of behavioral semantics, and (3) developing tools to facilitate dynamic composition, substitution, and reconfiguration of component implementations and parameters, so that application scientists can explore tradeoffs among factors such as accuracy, reliability, and performance

  9. New tools to aid in scientific computing and visualization

    International Nuclear Information System (INIS)

    Wallace, M.G.; Christian-Frear, T.L.

    1992-01-01

    In this paper, two computer programs are described which aid in the pre- and post-processing of computer generated data. CoMeT (Computational Mechanics Toolkit) is a customizable, interactive, graphical, menu-driven program that provides the analyst with a consistent user-friendly interface to analysis codes. Trans Vol (Transparent Volume Visualization) is a specialized tool for the scientific three-dimensional visualization of complex solids by the technique of volume rendering. Both tools are described in basic detail along with an application example concerning the simulation of contaminant migration from an underground nuclear repository
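
    Volume rendering of the kind TransVol performs can be illustrated compactly: rays are cast through a scalar field and the samples along each ray are composited. A NumPy sketch with axis-aligned rays, showing both maximum-intensity projection and a simple front-to-back emission-absorption model (illustrative only, not the TransVol algorithm):

```python
import numpy as np

# Synthetic scalar volume: a smooth bright sphere plus a little noise.
n = 64
x, y, z = np.mgrid[0:n, 0:n, 0:n]
volume = np.exp(-((x - 32) ** 2 + (y - 32) ** 2 + (z - 32) ** 2) / 200.0)
volume += 0.05 * np.random.default_rng(1).random((n, n, n))

# Axis-aligned ray casting: each (row, col) ray marches along the z axis.
mip = volume.max(axis=2)               # maximum-intensity projection

# Front-to-back emission-absorption compositing with constant opacity.
alpha = 0.1
atten = 1.0 - alpha * volume
# Transmittance reaching sample k is the product of attenuations before it.
trans = np.concatenate(
    [np.ones((n, n, 1)), np.cumprod(atten, axis=2)[:, :, :-1]], axis=2
)
composite = (volume * alpha * trans).sum(axis=2)
print(mip.shape, composite.shape)
```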

  10. A Computing Environment to Support Repeatable Scientific Big Data Experimentation of World-Wide Scientific Literature

    Energy Technology Data Exchange (ETDEWEB)

    Schlicher, Bob G [ORNL; Kulesz, James J [ORNL; Abercrombie, Robert K [ORNL; Kruse, Kara L [ORNL

    2015-01-01

    A principal tenet of the scientific method is that experiments must be repeatable, relying on ceteris paribus (i.e., all other things being equal). As a scientific community involved in data sciences, we must investigate ways to establish an environment where experiments can be repeated. We can no longer merely allude to where the data comes from; we must add rigor to the data collection and management process from which our analysis is conducted. This paper describes a computing environment to support repeatable scientific big data experimentation on world-wide scientific literature, and recommends a system housed at the Oak Ridge National Laboratory in order to provide value to investigators from government agencies, academic institutions, and industry entities. The described computing environment also adheres to the recently instituted digital data management plan mandated by multiple US government agencies, which involves all stages of the digital data life cycle including capture, analysis, sharing, and preservation. It particularly focuses on the sharing and preservation of digital research data. The details of this computing environment are explained within the context of cloud services by the three-layer classification of Software as a Service, Platform as a Service, and Infrastructure as a Service.

  11. Initial explorations of ARM processors for scientific computing

    International Nuclear Information System (INIS)

    Abdurachmanov, David; Elmer, Peter; Eulisse, Giulio; Muzaffar, Shahzad

    2014-01-01

    Power efficiency is becoming an ever more important metric for both high performance and high throughput computing. Over the course of the next decade it is expected that flops/watt will be a major driver for the evolution of computer architecture. Servers with large numbers of ARM processors, already ubiquitous in mobile computing, are a promising alternative to traditional x86-64 computing. We present the results of our initial investigations into the use of ARM processors for scientific computing applications. In particular we report the results from our work with a current generation ARMv7 development board to explore ARM-specific issues regarding the software development environment, operating system, performance benchmarks and issues for porting High Energy Physics software
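
    A first benchmarking step of the kind described, portable across ARM and x86-64, can be as simple as timing a dense kernel and reporting a sustained rate (a generic sketch, not the authors' benchmark suite):

```python
import platform
import time
import numpy as np

def matmul_gflops(n=512, repeats=5):
    """Time an n x n matrix multiply and report the best GFLOP/s rate."""
    rng = np.random.default_rng(0)
    a, b = rng.random((n, n)), rng.random((n, n))
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        a @ b                                  # the timed kernel
        best = min(best, time.perf_counter() - t0)
    return 2.0 * n**3 / best / 1e9             # ~2n^3 flops per dense matmul

# Report the architecture alongside the rate so runs are comparable.
print(platform.machine(), platform.processor())
print(f"{matmul_gflops():.2f} GFLOP/s")
```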

  12. Educational NASA Computational and Scientific Studies (enCOMPASS)

    Science.gov (United States)

    Memarsadeghi, Nargess

    2013-01-01

    Educational NASA Computational and Scientific Studies (enCOMPASS) is an educational project of NASA Goddard Space Flight Center aimed at bridging the gap between computational objectives and needs of NASA's scientific research, missions, and projects, and academia's latest advances in applied mathematics and computer science. enCOMPASS achieves this goal via bidirectional collaboration and communication between NASA and academia. Using developed NASA Computational Case Studies in university computer science/engineering and applied mathematics classes is a way of addressing NASA's goals of contributing to the Science, Technology, Engineering, and Math (STEM) National Objective. The enCOMPASS Web site at http://encompass.gsfc.nasa.gov provides additional information. There are currently nine enCOMPASS case studies developed in areas of earth sciences, planetary sciences, and astrophysics. Some of these case studies have been published in AIP and IEEE's Computing in Science and Engineering magazines. A few university professors have used enCOMPASS case studies in their computational classes and contributed their findings to NASA scientists. In these case studies, after introducing the science area, the specific problem, and related NASA missions, students are first asked to solve a known problem using NASA data and approaches used in the past, often published in a scientific/research paper. Then, after learning about the NASA application and related computational tools and approaches for solving the proposed problem, students are given a harder problem as a challenge for them to research and develop solutions for. This project provides a model for NASA scientists and engineers on one side, and university students, faculty, and researchers in computer science and applied mathematics on the other side, to learn from each other's areas of work, computational needs and solutions, and the latest advances in research and development. This innovation takes NASA science and

  13. Trend Analysis of the Brazilian Scientific Production in Computer Science

    Directory of Open Access Journals (Sweden)

    TRUCOLO, C. C.

    2014-12-01

    The growth in the volume and diversity of scientific information brings new challenges in understanding the reasons, the process, and the real essence that propel this growth. This information can be used as the basis for the development of strategies and public policies to improve education and innovation services. Trend analysis is one of the steps in this direction. In this work, a trend analysis of the Brazilian scientific production of graduate programs in the computer science area is performed to identify the main subjects being studied by these programs, both in general and individually.
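
    A basic form of such trend analysis is a least-squares fit of publication counts against time; the sketch below uses made-up counts, not the paper's data:

```python
import numpy as np

# Hypothetical yearly publication counts for one graduate program.
years = np.arange(2004, 2014)
papers = np.array([12, 15, 14, 18, 22, 21, 27, 30, 33, 38])

# Least-squares linear trend: papers ~ slope * year + intercept.
slope, intercept = np.polyfit(years, papers, deg=1)
print(f"growth of about {slope:.1f} papers/year")

# Strength of the trend: Pearson correlation of counts with time.
r = np.corrcoef(years, papers)[0, 1]
print(f"correlation with time: r = {r:.2f}")
```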

  14. Implementation of Scientific Computing Applications on the Cell Broadband Engine

    Directory of Open Access Journals (Sweden)

    Guochun Shi

    2009-01-01

    The Cell Broadband Engine architecture is a revolutionary processor architecture well suited for many scientific codes. This paper reports on an effort to implement several traditional high-performance scientific computing applications on the Cell Broadband Engine processor, including molecular dynamics, quantum chromodynamics and quantum chemistry codes. The paper discusses data and code restructuring strategies necessary to adapt the applications to the intrinsic properties of the Cell processor and demonstrates performance improvements achieved on the Cell architecture. It concludes with the lessons learned and provides practical recommendations on optimization techniques that are believed to be most appropriate.

  15. The Potential of the Cell Processor for Scientific Computing

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Samuel; Shalf, John; Oliker, Leonid; Husbands, Parry; Kamil, Shoaib; Yelick, Katherine

    2005-10-14

    The slowing pace of commodity microprocessor performance improvements combined with ever-increasing chip power demands has become of utmost concern to computational scientists. As a result, the high performance computing community is examining alternative architectures that address the limitations of modern cache-based designs. In this work, we examine the potential of the forthcoming STI Cell processor as a building block for future high-end computing systems. Our work contains several novel contributions. We are the first to present quantitative Cell performance data on scientific kernels and show direct comparisons against leading superscalar (AMD Opteron), VLIW (Intel Itanium2), and vector (Cray X1) architectures. Since neither Cell hardware nor cycle-accurate simulators are currently publicly available, we develop both analytical models and simulators to predict kernel performance. Our work also explores the complexity of mapping several important scientific algorithms onto the Cell's unique architecture. Additionally, we propose modest microarchitectural modifications that could significantly increase the efficiency of double-precision calculations. Overall results demonstrate the tremendous potential of the Cell architecture for scientific computations in terms of both raw performance and power efficiency.
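
    Analytical models of the kind the authors describe often reduce to a roofline-style bound: attainable performance is the lesser of the compute peak and memory bandwidth times arithmetic intensity. A sketch with illustrative numbers (not the paper's measured figures):

```python
def attainable_gflops(peak_gflops, peak_gbps, flops_per_byte):
    """Roofline-style bound: min of compute peak and bandwidth * intensity."""
    return min(peak_gflops, peak_gbps * flops_per_byte)

# Illustrative parameters only: a Cell-like chip with ~200 GFLOP/s
# single-precision peak and ~25.6 GB/s of memory bandwidth.
for kernel, intensity in [("SpMV", 0.25), ("stencil", 0.5), ("GEMM", 8.0)]:
    bound = attainable_gflops(200.0, 25.6, intensity)
    print(f"{kernel:8s} intensity {intensity:4.2f} -> <= {bound:6.1f} GFLOP/s")
```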

  16. Computer network access to scientific information systems for minority universities

    Science.gov (United States)

    Thomas, Valerie L.; Wakim, Nagi T.

    1993-08-01

    The evolution of computer networking technology has lead to the establishment of a massive networking infrastructure which interconnects various types of computing resources at many government, academic, and corporate institutions. A large segment of this infrastructure has been developed to facilitate information exchange and resource sharing within the scientific community. The National Aeronautics and Space Administration (NASA) supports both the development and the application of computer networks which provide its community with access to many valuable multi-disciplinary scientific information systems and on-line databases. Recognizing the need to extend the benefits of this advanced networking technology to the under-represented community, the National Space Science Data Center (NSSDC) in the Space Data and Computing Division at the Goddard Space Flight Center has developed the Minority University-Space Interdisciplinary Network (MU-SPIN) Program: a major networking and education initiative for Historically Black Colleges and Universities (HBCUs) and Minority Universities (MUs). In this paper, we will briefly explain the various components of the MU-SPIN Program while highlighting how, by providing access to scientific information systems and on-line data, it promotes a higher level of collaboration among faculty and students and NASA scientists.

  17. JINR CLOUD SERVICE FOR SCIENTIFIC AND ENGINEERING COMPUTATIONS

    Directory of Open Access Journals (Sweden)

    Nikita A. Balashov

    2018-03-01

    Quite often, small scientific research groups do not have access to computational resources powerful enough for their research work to be productive. Global computational infrastructures used by large scientific collaborations can be challenging for small research teams because of the bureaucracy overhead as well as the usage complexity of the underlying tools. Some researchers buy a set of powerful servers to cover their own needs in computational resources. A drawback of such an approach is the necessity to take care of a proper hosting environment for this hardware and its maintenance, which requires a certain level of expertise. Moreover, much of the time such resources may be underutilized, because a researcher needs to spend a certain amount of time preparing computations and analyzing results, and does not always need all the resources of modern multi-core CPU servers. The JINR cloud team developed a service which provides access for scientists of small research groups from JINR and its Member State organizations to computational resources via a problem-oriented (i.e. application-specific) web interface. It allows a scientist to focus on his research domain by interacting with the service in a convenient way via a browser, abstracting away from the underlying infrastructure and its maintenance. A user just sets the required values for his job via the web interface and specifies a location for uploading the result. The computational workloads are run on virtual machines deployed in the JINR cloud infrastructure.
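
    The abstract does not specify the service's actual interface, but a problem-oriented job submission of this kind typically reduces to a couple of HTTP calls; every endpoint and field name below is hypothetical:

```python
import requests

# Hypothetical endpoint and field names -- the real JINR service's
# web interface is not described in the abstract.
BASE = "https://cloud.example.jinr.ru/api"

job = {
    "application": "sample-app",                 # problem-oriented app (made up)
    "parameters": {"grid": 1024, "steps": 10000},
    "result_location": "https://storage.example.org/user/results/",
}

# Submit the job, then poll its status by id.
resp = requests.post(f"{BASE}/jobs", json=job, timeout=30)
resp.raise_for_status()
job_id = resp.json()["id"]

status = requests.get(f"{BASE}/jobs/{job_id}", timeout=30).json()["status"]
print(job_id, status)
```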

  18. Emerging Nanophotonic Applications Explored with Advanced Scientific Parallel Computing

    Science.gov (United States)

    Meng, Xiang

    The domain of nanoscale optical science and technology is a combination of the classical world of electromagnetics and the quantum mechanical regime of atoms and molecules. Recent advancements in fabrication technology allow optical structures to be scaled down to nanoscale size or even to the atomic level, far smaller than the wavelength they are designed for. These nanostructures can have unique, controllable, and tunable optical properties, and their interactions with quantum materials can have important near-field and far-field optical responses. Undoubtedly, these optical properties can have many important applications, ranging from efficient and tunable light sources, detectors, filters, modulators, and high-speed all-optical switches to next-generation classical and quantum computation and biophotonic medical sensors. This emerging field of nanoscience, known as nanophotonics, is highly interdisciplinary, requiring expertise in materials science, physics, electrical engineering, and scientific computing, modeling and simulation. It has also become an important research field for investigating the science and engineering of light-matter interactions that take place on wavelength and subwavelength scales, where the nature of the nanostructured matter controls the interactions. In addition, fast advancements in computing capabilities, such as parallel computing, have become a critical element for investigating advanced nanophotonic devices. This role has taken on even greater urgency with the scale-down of device dimensions, since designs for these devices require extensive memory and extremely long core hours. Thus distributed computing platforms associated with parallel computing are required for faster design processes. Scientific parallel computing constructs mathematical models and quantitative analysis techniques, and uses computing machines to analyze and solve otherwise intractable scientific challenges. In

  19. Technologies for Large Data Management in Scientific Computing

    CERN Document Server

    Pace, A

    2014-01-01

    In recent years, intense usage of computing has been the main strategy of investigation in several scientific research projects. The progress in computing technology has opened unprecedented opportunities for the systematic collection of experimental data and the associated analysis that were considered impossible only a few years ago. This paper focuses on the strategies in use: it reviews the various components that are necessary for an effective solution that ensures the storage, the long-term preservation, and the worldwide distribution of large quantities of data that are necessary in a large scientific research project. The paper also mentions several examples of data management solutions used in High Energy Physics for the CERN Large Hadron Collider (LHC) experiments in Geneva, Switzerland, which generate more than 30,000 terabytes of data every year that need to be preserved, analyzed, and made available to a community of several tens of thousands of scientists worldwide.

  20. Computer simulations and the changing face of scientific experimentation

    CERN Document Server

    Duran, Juan M

    2013-01-01

    Computer simulations have become a central tool for scientific practice. Their use has replaced, in many cases, standard experimental procedures, to say nothing of cases where the target system is empirical but there are no techniques for direct manipulation of the system, such as astronomical observation. In these cases, computer simulations have proved to be of central importance. The question about their use and implementation, therefore, is not only a technical one but represents a challenge for the humanities as well. In this volume, scientists, historians, and philosophers joi

  1. Performance evaluation of scientific programs on advanced architecture computers

    International Nuclear Information System (INIS)

    Walker, D.W.; Messina, P.; Baillie, C.F.

    1988-01-01

    Recently a number of advanced architecture machines have become commercially available. These new machines promise better cost-performance than traditional computers, and some of them have the potential of competing with current supercomputers, such as the Cray X-MP, in terms of maximum performance. This paper describes an on-going project to evaluate a broad range of advanced architecture computers using a number of complete scientific application programs. The computers to be evaluated include distributed-memory machines such as the NCUBE, INTEL and Caltech/JPL hypercubes and the Meiko computing surface; shared-memory, bus architecture machines such as the Sequent Balance and the Alliant; very long instruction word machines such as the Multiflow Trace 7/200 computer; traditional supercomputers such as the Cray X-MP and Cray-2; and SIMD machines such as the Connection Machine. Currently 11 application codes from a number of scientific disciplines have been selected, although it is not intended to run all codes on all machines. Results are presented for two of the codes (QCD and missile tracking), and future work is proposed

  2. Strategic Plan for a Scientific Cloud Computing infrastructure for Europe

    CERN Document Server

    Lengert, Maryline

    2011-01-01

    Here we present the vision, concept and direction for forming a European Industrial Strategy for a Scientific Cloud Computing Infrastructure to be implemented by 2020. This will be the framework for decisions and for securing support and approval in establishing, initially, an R&D European Cloud Computing Infrastructure that serves the needs of the European Research Area (ERA) and Space Agencies. This Cloud Infrastructure will have the potential beyond this initial user base to evolve to provide similar services to a broad range of customers including government and SMEs. We explain how this plan aims to support the broader strategic goals of our organisations and identify the benefits to be realised by adopting an industrial Cloud Computing model. We also outline the prerequisites and commitment needed to achieve these objectives.

  3. 10th International Conference on Scientific Computing in Electrical Engineering

    CERN Document Server

    Clemens, Markus; Günther, Michael; Maten, E

    2016-01-01

    This book is a collection of selected papers presented at the 10th International Conference on Scientific Computing in Electrical Engineering (SCEE), held in Wuppertal, Germany in 2014. The book is divided into five parts, reflecting the main directions of SCEE 2014: 1. Device Modeling, Electric Circuits and Simulation, 2. Computational Electromagnetics, 3. Coupled Problems, 4. Model Order Reduction, and 5. Uncertainty Quantification. Each part starts with a general introduction followed by the actual papers. The aim of the SCEE 2014 conference was to bring together scientists from academia and industry, mathematicians, electrical engineers, computer scientists, and physicists, with the goal of fostering intensive discussions on industrially relevant mathematical problems, with an emphasis on the modeling and numerical simulation of electronic circuits and devices, electromagnetic fields, and coupled problems. The methodological focus was on model order reduction and uncertainty quantification.

  4. High-integrity software, computation and the scientific method

    International Nuclear Information System (INIS)

    Hatton, L.

    2012-01-01

    Computation rightly occupies a central role in modern science. Datasets are enormous and the processing implications of some algorithms are equally staggering. With the continuing difficulties in quantifying the results of complex computations, it is of increasing importance to understand the role of computation in the essentially Popperian scientific method. In this paper, some of the problems with computation (for example, the long-term unquantifiable presence of undiscovered defects, problems with programming languages, and process issues) will be explored with numerous examples. One of the aims of the paper is to understand the implications of trying to produce high-integrity software and the limitations which still exist. Unfortunately Computer Science itself suffers from an inability to be suitably critical of its practices and has operated in a largely measurement-free vacuum since its earliest days. Within computer science itself, this has not been so damaging in that it simply leads to unconstrained creativity and a rapid turnover of new technologies. In the applied sciences, however, which have to depend on computational results, such unquantifiability significantly undermines trust. It is time this particular demon was put to rest. (author)

  5. Integrating multiple scientific computing needs via a Private Cloud infrastructure

    International Nuclear Information System (INIS)

    Bagnasco, S; Berzano, D; Brunetti, R; Lusso, S; Vallero, S

    2014-01-01

    In a typical scientific computing centre, diverse applications coexist and share a single physical infrastructure. An underlying Private Cloud facility eases the management and maintenance of heterogeneous use cases such as multipurpose or application-specific batch farms, Grid sites catering to different communities, parallel interactive data analysis facilities and others. It allows resources to be allocated dynamically and efficiently to any application, and virtual machines to be tailored to the applications' requirements. Furthermore, the maintenance of large deployments of complex and rapidly evolving middleware and application software is eased by the use of virtual images and contextualization techniques; for example, rolling updates can be performed easily while minimizing downtime. In this contribution we describe the Private Cloud infrastructure at the INFN-Torino Computer Centre, which hosts a full-fledged WLCG Tier-2 site and a dynamically expandable PROOF-based Interactive Analysis Facility for the ALICE experiment at the CERN LHC, as well as several smaller scientific computing applications. The Private Cloud building blocks include the OpenNebula software stack, the GlusterFS filesystem (used in two different configurations for worker- and service-class hypervisors) and the OpenWRT Linux distribution (used for network virtualization). A future integration into a federated higher-level infrastructure is made possible by exposing commonly used APIs like EC2 and by using mainstream contextualization tools like CloudInit.
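
    Since the site exposes commonly used APIs like EC2, a client can drive it with standard tooling such as boto3; in this sketch the endpoint URL, credentials, image id, and instance type are all placeholders, not the site's real values:

```python
import boto3

# Point a standard EC2 client at a hypothetical EC2-compatible endpoint.
ec2 = boto3.client(
    "ec2",
    endpoint_url="https://cloud.example.infn.it/ec2",  # placeholder URL
    aws_access_key_id="EXAMPLE",                       # placeholder credentials
    aws_secret_access_key="EXAMPLE",
    region_name="site",
)

# Start one worker-class VM from a pre-built virtual image (placeholder id).
resp = ec2.run_instances(
    ImageId="ami-wlcg-worker", InstanceType="m1.large", MinCount=1, MaxCount=1
)
print(resp["Instances"][0]["InstanceId"])
```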

  6. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    Energy Technology Data Exchange (ETDEWEB)

    Hules, J. [ed.]

    1996-11-01

    National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  7. OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing

    Science.gov (United States)

    Strayer, Michael

    2005-01-01

    Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side-by-side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new cornerstone for Computational Science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines. There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations

  8. Molecular Science Computing Facility Scientific Challenges: Linking Across Scales

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.; Windus, Theresa L.

    2005-07-01

    The purpose of this document is to define the evolving science drivers for performing environmental molecular research at the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and to provide guidance associated with the next-generation high-performance computing center that must be developed at EMSL's Molecular Science Computing Facility (MSCF) in order to address this critical research. The MSCF is the pre-eminent computing facility, supported by the U.S. Department of Energy's (DOE's) Office of Biological and Environmental Research (BER), tailored to provide the fastest time-to-solution for current computational challenges in chemistry and biology, as well as providing the means for broad research in the molecular and environmental sciences. The MSCF provides integral resources and expertise to emerging EMSL Scientific Grand Challenges and Collaborative Access Teams that are designed to leverage the multiple integrated research capabilities of EMSL, thereby creating a synergy between computation and experiment to address environmental molecular science challenges critical to DOE and the nation.

  9. Applications of industrial computed tomography at Los Alamos Scientific Laboratory

    International Nuclear Information System (INIS)

    Kruger, R.P.; Morris, R.A.; Wecksung, G.W.

    1980-01-01

    A research and development program was begun three years ago at the Los Alamos Scientific Laboratory (LASL) to study nonmedical applications of computed tomography. This program had several goals. The first goal was to develop the necessary reconstruction algorithms to accurately reconstruct cross sections of nonmedical industrial objects. The second goal was to be able to perform extensive tomographic simulations to determine the efficacy of tomographic reconstruction with a variety of hardware configurations. The final goal was to construct an inexpensive industrial prototype scanner with a high degree of design flexibility. The implementation of these program goals is described
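
    The reconstruction step at the heart of such a program is classically filtered back-projection; recent versions of scikit-image ship the textbook pipeline, sketched here on a standard phantom (illustrative only, not the LASL algorithms):

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import iradon, radon, rescale

# Forward-project a standard test object to a sinogram, then reconstruct
# the cross section with filtered back-projection.
image = rescale(shepp_logan_phantom(), 0.5)        # 200 x 200 test object
angles = np.linspace(0.0, 180.0, 180, endpoint=False)

sinogram = radon(image, theta=angles)              # simulated projections
recon = iradon(sinogram, theta=angles, filter_name="ramp")

print("RMS reconstruction error:", np.sqrt(np.mean((recon - image) ** 2)))
```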

  10. Computer-assisted estimating for the Los Alamos Scientific Laboratory

    International Nuclear Information System (INIS)

    Spooner, J.E.

    1976-02-01

    An analysis is made of the cost estimating system currently in use at the Los Alamos Scientific Laboratory (LASL) and the benefits of computer assistance are evaluated. A computer-assisted estimating system (CAE) is proposed for LASL. CAE can decrease turnaround and provide more flexible response to management requests for cost information and analyses. It can enhance value optimization at the design stage, improve cost control and change-order justification, and widen the use of cost information in the design process. CAE costs are not well defined at this time although they appear to break even with present operations. It is recommended that a CAE system description be submitted for contractor consideration and bid while LASL system development continues concurrently

  11. Scientific and Computational Challenges of the Fusion Simulation Program (FSP)

    International Nuclear Information System (INIS)

    Tang, William M.

    2011-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP), a major national initiative in the United States with the primary objective being to enable scientific discovery of important new plasma phenomena with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically-confined fusion plasmas that are properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly-coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e

  12. Scientific and computational challenges of the fusion simulation program (FSP)

    International Nuclear Information System (INIS)

    Tang, William M.

    2011-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP) - a major national initiative in the United States with the primary objective being to enable scientific discovery of important new plasma phenomena with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically-confined fusion plasmas that are properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly-coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e

  13. Domain analysis of computational science - Fifty years of a scientific computing group

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, M.

    2010-02-23

    I employed bibliometric and historical methods to study the domain of the Scientific Computing group at Brookhaven National Laboratory (BNL) for an extended period of fifty years, from 1958 to 2007. I noted and confirmed the growing emergence of interdisciplinarity within the group. I also identified a strong, consistent mathematics and physics orientation within it.

  14. A data management system for engineering and scientific computing

    Science.gov (United States)

    Elliot, L.; Kunii, H. S.; Browne, J. C.

    1978-01-01

    Data elements and relationship definition capabilities for this data management system are explicitly tailored to the needs of engineering and scientific computing. System design was based upon studies of data management problems currently being handled through explicit programming. The system-defined data element types include real scalar numbers, vectors, arrays and special classes of arrays such as sparse arrays and triangular arrays. The data model is hierarchical (tree structured). Multiple views of data are provided at two levels. Subschemas provide multiple structural views of the total data base and multiple mappings for individual record types are supported through the use of a REDEFINES capability. The data definition language and the data manipulation language are designed as extensions to FORTRAN. Examples of the coding of real problems taken from existing practice in the data definition language and the data manipulation language are given.
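
    The hierarchical (tree-structured) model with typed scientific elements can be sketched in a few lines; this toy Python analogue (the original system extended FORTRAN, so everything here is merely illustrative) shows records nesting child records with path-style lookup:

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    """Toy tree-structured record: typed scalar and array elements, children."""
    name: str
    scalars: dict[str, float] = field(default_factory=dict)
    arrays: dict[str, list[float]] = field(default_factory=dict)
    children: list["Record"] = field(default_factory=list)

    def find(self, path: str) -> "Record":
        """Resolve 'a/b/c' style paths down the tree."""
        node = self
        for part in path.split("/"):
            node = next(c for c in node.children if c.name == part)
        return node

# Build a small hierarchy and look an element up by path.
model = Record("structure")
model.children.append(Record("element-1", scalars={"E": 2.1e11}))
model.children[0].arrays["stiffness"] = [1.0, 0.5, 0.5, 1.0]  # flattened 2x2
print(model.find("element-1").scalars["E"])
```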

  15. The radar signature of revolution objects in scientific computing

    International Nuclear Information System (INIS)

    Bonnemason, P.; Le Martret, R.; Scheurer, B.; Stupfel, B.

    1990-12-01

    This work is motivated by the study of stealthy (or discreet) objects of revolution vis-à-vis radar. Efficient algorithms, specific numerical methods and two original industrial software packages (SHF 89 and SHF C) have been developed. These are reliable tools for intensive scientific computing. In particular, they have enabled the precise numerical modeling of complex objects of very general shape in the high-frequency regime and a thorough understanding of the physics of the problems involved. The purpose of this note is a general description of the work and its context, illustrated by examples of numerical applications (presented in Appendix 4). The technical aspects are detailed in reports and publications (a list is attached to this note) [fr

  16. Scientific and computational challenges of the fusion simulation project (FSP)

    International Nuclear Information System (INIS)

    Tang, W M

    2008-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Project (FSP). The primary objective is to develop advanced software designed to use leadership-class computers for carrying out multiscale physics simulations to provide information vital to delivering a realistic integrated fusion simulation model with unprecedented physics fidelity. This multiphysics capability will be unprecedented in that in the current FES applications domain, the largest-scale codes are used to carry out first-principles simulations of mostly individual phenomena in realistic 3D geometry while the integrated models are much smaller-scale, lower-dimensionality codes with significant empirical elements used for modeling and designing experiments. The FSP is expected to be the most up-to-date embodiment of the theoretical and experimental understanding of magnetically confined thermonuclear plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing a reliable ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices on all relevant time and space scales. From a computational perspective, the fusion energy science application goal to produce high-fidelity, whole-device modeling capabilities will demand computing resources in the petascale range and beyond, together with the associated multicore algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative device involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics

  17. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Report: Exascale Computing Initiative Review

    Energy Technology Data Exchange (ETDEWEB)

    Reed, Daniel [University of Iowa; Berzins, Martin [University of Utah; Pennington, Robert; Sarkar, Vivek [Rice University; Taylor, Valerie [Texas A& M University

    2015-08-01

    On November 19, 2014, the Advanced Scientific Computing Advisory Committee (ASCAC) was charged with reviewing the Department of Energy’s conceptual design for the Exascale Computing Initiative (ECI). In particular, this included assessing whether there are significant gaps in the ECI plan or areas that need to be given priority or extra management attention. Given the breadth and depth of previous reviews of the technical challenges inherent in exascale system design and deployment, the subcommittee focused its assessment on organizational and management issues, considering technical issues only as they informed organizational or management priorities and structures. This report presents the observations and recommendations of the subcommittee.

  18. Advanced scientific computational methods and their applications to nuclear technologies. (4) Overview of scientific computational methods, introduction of continuum simulation methods and their applications (4)

    International Nuclear Information System (INIS)

    Sekimura, Naoto; Okita, Taira

    2006-01-01

    Scientific computational methods have advanced remarkably with the progress of nuclear development. They have played the role of a weft connecting the various realms of nuclear engineering, and an introductory course on advanced scientific computational methods and their applications to nuclear technologies has been prepared in serial form. This is the fourth issue, showing the overview of scientific computational methods with the introduction of continuum simulation methods and their applications. Simulation methods for physical radiation effects on materials are reviewed based on processes such as binary collision approximation, molecular dynamics, kinetic Monte Carlo method, reaction rate method and dislocation dynamics. (T. Tanaka)
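
    Of the simulation methods listed, kinetic Monte Carlo is compact enough to sketch: the residence-time algorithm picks an event in proportion to its rate and advances the clock by an exponential waiting time (illustrative rates, not taken from the course):

```python
import numpy as np

def kmc_run(rates, n_steps, seed=0):
    """Residence-time (BKL) kinetic Monte Carlo over a fixed event catalog."""
    rng = np.random.default_rng(seed)
    t, counts = 0.0, np.zeros(len(rates), dtype=int)
    for _ in range(n_steps):
        total = rates.sum()                        # rates could change per step
        k = rng.choice(len(rates), p=rates / total)  # pick event k by its rate
        t += -np.log(rng.random()) / total           # exponential waiting time
        counts[k] += 1
    return t, counts

# Made-up rates, e.g. a fast vacancy hop vs. a slow cluster dissociation.
rates = np.array([1.0e6, 2.5e4])                   # events per second
t, counts = kmc_run(rates, 10000)
print(f"simulated {t:.3e} s, event counts = {counts}")
```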

  19. PS3 CELL Development for Scientific Computation and Research

    Science.gov (United States)

    Christiansen, M.; Sevre, E.; Wang, S. M.; Yuen, D. A.; Liu, S.; Lyness, M. D.; Broten, M.

    2007-12-01

    The Cell processor is one of the most powerful processors on the market, and researchers in the earth sciences may find its parallel architecture to be very useful. A cell processor, with 7 cores, can easily be obtained for experimentation by purchasing a PlayStation 3 (PS3) and installing linux and the IBM SDK. Each core of the PS3 is capable of 25 GFLOPS, giving a potential limit of 150 GFLOPS when using all 6 SPUs (synergistic processing units) with vectorized algorithms. We have used the Cell's computational power to create a program which takes simulated tsunami datasets, parses them, and returns a colorized height field image using ray casting techniques. As expected, the time required to create an image is inversely proportional to the number of SPUs used. We believe that this trend will continue when multiple PS3s are chained using OpenMP functionality and are in the process of researching this. By using the Cell to visualize tsunami data, we have found that its greatest feature is its power. This fact entwines well with the needs of the scientific community, where the limiting factor is time. Any algorithm, such as the heat equation, that can be subdivided into multiple parts can take advantage of the PS3 Cell's ability to split the computations across the 6 SPUs, reducing required run time by one sixth. Further vectorization of the code can allow for 4 simultaneous floating point operations by using the SIMD (single instruction multiple data) capabilities of the SPU, increasing efficiency 24 times.
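
    The subdivision strategy mentioned for the heat equation amounts to domain decomposition with halo exchange. A serial NumPy sketch of splitting one explicit update across six worker slabs (an analogy for the SPUs, not actual Cell SDK code):

```python
import numpy as np

def step_chunk(u, left, right, alpha):
    """One explicit heat-equation update of a slab, given its halo values."""
    padded = np.concatenate(([left], u, [right]))
    return u + alpha * (padded[2:] - 2.0 * u + padded[:-2])

n, workers, alpha = 600, 6, 0.4      # alpha = dt/dx^2, within stability bound
u = np.sin(np.pi * np.linspace(0.0, 1.0, n))

for _ in range(100):
    chunks = np.array_split(u, workers)          # one slab per "SPU"
    new = []
    for i, c in enumerate(chunks):
        # Halo cells come from neighboring slabs; domain boundary is u = 0.
        left = chunks[i - 1][-1] if i > 0 else 0.0
        right = chunks[i + 1][0] if i < workers - 1 else 0.0
        new.append(step_chunk(c, left, right, alpha))
    u = np.concatenate(new)

print("max temperature:", u.max())
```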

  20. I - Template Metaprogramming for Massively Parallel Scientific Computing - Expression Templates

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    Large scale scientific computing raises questions on different levels, ranging from the formulation of the problems to the choice of the best algorithms and their implementation for a specific platform. There are similarities in these different topics that can be exploited by modern-style C++ template metaprogramming techniques to produce readable, maintainable and generic code. Traditional low-level code tends to be fast but platform-dependent, and it obfuscates the meaning of the algorithm. On the other hand, the object-oriented approach is nice to read, but may come with an inherent performance penalty. These lectures aim to present the basics of the Expression Template (ET) idiom, which allows us to keep the object-oriented approach without sacrificing performance. We will in particular show how to enhance ET to include SIMD vectorization. We will then introduce techniques for abstracting iteration, and introduce thread-level parallelism for use in heavy data-centric loads. We will show how to apply these methods i...
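
    The ET idiom itself is C++-specific, but its essence, operators building a lazy expression tree that a single fused loop then evaluates, can be transliterated into a short Python sketch (illustrative only, not the lecture material):

```python
import numpy as np

# Operators build a lazy expression tree; assign() runs one fused loop,
# so compound right-hand sides like a + b * c create no temporaries.
class Expr:
    def __add__(self, other): return BinOp(np.add, self, wrap(other))
    def __mul__(self, other): return BinOp(np.multiply, self, wrap(other))

class Vec(Expr):
    def __init__(self, data): self.data = np.asarray(data, dtype=float)
    def eval_at(self, i): return self.data[i]

class Scalar(Expr):
    def __init__(self, v): self.v = v
    def eval_at(self, i): return self.v

class BinOp(Expr):
    def __init__(self, op, lhs, rhs): self.op, self.lhs, self.rhs = op, lhs, rhs
    def eval_at(self, i): return self.op(self.lhs.eval_at(i), self.rhs.eval_at(i))

def wrap(x): return x if isinstance(x, Expr) else Scalar(x)

def assign(expr, n):
    """Single fused loop: evaluates the whole tree once per element."""
    return np.array([expr.eval_at(i) for i in range(n)])

a, b, c = Vec([1, 2, 3]), Vec([4, 5, 6]), Vec([7, 8, 9])
print(assign(a + b * c + 2.0, 3))   # -> [31. 44. 59.]
```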

  1. Advanced scientific computational methods and their applications to nuclear technologies. (1) Overview of scientific computational methods, introduction of continuum simulation methods and their applications (1)

    International Nuclear Information System (INIS)

    Oka, Yoshiaki; Okuda, Hiroshi

    2006-01-01

    Scientific computational methods have advanced remarkably with the progress of nuclear development. They have served as the weft connecting the various realms of nuclear engineering, and an introductory course on advanced scientific computational methods and their applications to nuclear technologies was therefore prepared in serial form. This first issue presents an overview of the methods and an introduction to continuum simulation methods. The finite element method is also reviewed as a representative application. (T. Tanaka)

  2. Data management, code deployment, and scientific visualization to enhance scientific discovery in fusion research through advanced computing

    International Nuclear Information System (INIS)

    Schissel, D.P.; Finkelstein, A.; Foster, I.T.; Fredian, T.W.; Greenwald, M.J.; Hansen, C.D.; Johnson, C.R.; Keahey, K.; Klasky, S.A.; Li, K.; McCune, D.C.; Peng, Q.; Stevens, R.; Thompson, M.R.

    2002-01-01

    The long-term vision of the Fusion Collaboratory described in this paper is to transform fusion research and accelerate scientific understanding and innovation so as to revolutionize the design of a fusion energy source. The Collaboratory will create and deploy collaborative software tools that will enable more efficient utilization of existing experimental facilities and more effective integration of experiment, theory, and modeling. The computer science research necessary to create the Collaboratory is centered on three activities: security, remote and distributed computing, and scientific visualization. It is anticipated that the presently envisioned Fusion Collaboratory software tools will require 3 years to complete

  3. Earth observation scientific workflows in a distributed computing environment

    CSIR Research Space (South Africa)

    Van Zyl, TL

    2011-09-01

    …capabilities has focused on the web services approach, as exemplified by the OGC's Web Processing Service, and on GRID computing. The approach to leveraging distributed computing resources described in this paper instead uses remote objects via RPy...

  4. Second Annual AEC Scientific Computer Information Exchange Meeting. Proceedings of the technical program theme: computer graphics

    Energy Technology Data Exchange (ETDEWEB)

    Peskin, A.M.; Shimamoto, Y.

    1974-01-01

    The topic of computer graphics serves well to illustrate that AEC affiliated scientific computing installations are well represented in the forefront of computing science activities. The participant response to the technical program was overwhelming--both in number of contributions and quality of the work described. Session I, entitled Advanced Systems, contains presentations describing systems that contain features not generally found in graphics facilities. These features can be roughly classified as extensions of standard two-dimensional monochromatic imaging to higher dimensions including color and time as well as multidimensional metrics. Session II presents seven diverse applications ranging from high energy physics to medicine. Session III describes a number of important developments in establishing facilities, techniques and enhancements in the computer graphics area. Although an attempt was made to schedule as many of these worthwhile presentations as possible, it appeared impossible to do so given the scheduling constraints of the meeting. A number of prospective presenters 'came to the rescue' by graciously withdrawing from the sessions. Some of their abstracts have been included in the Proceedings.

  5. Resilient and Robust High Performance Computing Platforms for Scientific Computing Integrity

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Yier [Univ. of Central Florida, Orlando, FL (United States)

    2017-07-14

    As technology advances, computer systems are subject to increasingly sophisticated cyber-attacks that compromise both their security and integrity. High performance computing platforms used in commercial and scientific applications involving sensitive, or even classified, data are frequently targeted by powerful adversaries. This situation is made worse by a lack of fundamental security solutions that both perform efficiently and are effective at preventing threats. Current security solutions fail to address the threat landscape and ensure the integrity of sensitive data. As challenges rise, both the private and public sectors will require robust technologies to protect their computing infrastructure. The research outcomes from this project address these challenges. For example, we present LAZARUS, a novel technique to harden kernel Address Space Layout Randomization (KASLR) against paging-based side-channel attacks. In particular, our scheme allows for fine-grained protection of the virtual memory mappings that implement the randomization. We demonstrate the effectiveness of our approach by hardening a recent Linux kernel with LAZARUS, mitigating all of the previously presented side-channel attacks on KASLR. Our extensive evaluation shows that LAZARUS incurs only 0.943% overhead for standard benchmarks, and is therefore highly practical. We also introduce HA2lloc, a hardware-assisted allocator that is capable of leveraging an extended memory management unit to detect memory errors in the heap. Testing HA2lloc in a simulation environment shows that the approach is capable of preventing common memory vulnerabilities.

  6. Engineering of systems for application of scientific computing in industry

    OpenAIRE

    Loeve, W.; Loeve, W.

    1992-01-01

    Mathematics software is of growing importance for computer simulation in industrial computer-aided engineering. To be applicable in industry, mathematics software and its supporting software must be structured in such a way that functionality and performance can be maintained easily. The present paper describes a method for developing mathematics software that meets this requirement.

  7. High throughput computing: a solution for scientific analysis

    Science.gov (United States)

    O'Donnell, M.

    2011-01-01

    Public land management agencies continually face resource management problems that are exacerbated by climate warming, land-use change, and other human activities. As the U.S. Geological Survey (USGS) Fort Collins Science Center (FORT) works with managers in U.S. Department of the Interior (DOI) agencies and other federal, state, and private entities, researchers are finding that the science needed to address these complex ecological questions across time and space produces substantial amounts of data. The additional data and the volume of computations needed to analyze it require expanded computing resources well beyond single- or even multiple-computer workstations. To meet this need for greater computational capacity, FORT investigated how to resolve the many computational shortfalls previously encountered when analyzing data for such projects. Our objectives included finding a solution that would:

  8. Scholarly literature and the press: scientific impact and social perception of physics computing

    CERN Document Server

    Pia, Maria Grazia; Bell, Zane W; Dressendorfer, Paul V

    2014-01-01

    The broad coverage of the search for the Higgs boson in the mainstream media is a relative novelty for high energy physics (HEP) research, whose achievements have traditionally been limited to scholarly literature. This paper illustrates the results of a scientometric analysis of HEP computing in scientific literature, institutional media and the press, and a comparative overview of similar metrics concerning representative particle physics measurements. The picture emerging from these scientometric data documents the scientific impact and social perception of HEP computing. The results of this analysis suggest that improved communication of the scientific and social role of HEP computing would be beneficial to the high energy physics community.

  9. Multicore Challenges and Benefits for High Performance Scientific Computing

    Directory of Open Access Journals (Sweden)

    Ida M.B. Nielsen

    2008-01-01

    Until recently, performance gains in processors were achieved largely by improvements in clock speeds and instruction level parallelism. Thus, applications could obtain performance increases with relatively minor changes by upgrading to the latest generation of computing hardware. Currently, however, processor performance improvements are realized by using multicore technology and hardware support for multiple threads within each core, and taking full advantage of this technology to improve the performance of applications requires exposure of extreme levels of software parallelism. We will here discuss the architecture of parallel computers constructed from many multicore chips as well as techniques for managing the complexity of programming such computers, including the hybrid message-passing/multi-threading programming model. We will illustrate these ideas with a hybrid distributed memory matrix multiply and a quantum chemistry algorithm for energy computation using Møller–Plesset perturbation theory.
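
    The multi-threading half of the hybrid model can be sketched with a row-blocked matrix multiply in portable C++. This is an illustration, not the authors' implementation; in the full hybrid scheme, a message-passing layer such as MPI would first distribute the row blocks across distributed-memory nodes.

      // Row-blocked matrix multiply: each thread computes a contiguous
      // block of rows of C = A * B.
      #include <algorithm>
      #include <cstddef>
      #include <cstdio>
      #include <thread>
      #include <vector>

      int main() {
          const std::size_t n = 512;
          std::vector<double> A(n * n, 1.0), B(n * n, 2.0), C(n * n, 0.0);

          const std::size_t nthreads = std::max(1u, std::thread::hardware_concurrency());
          const std::size_t rows_per = (n + nthreads - 1) / nthreads;
          std::vector<std::thread> pool;
          for (std::size_t t = 0; t < nthreads; ++t) {
              const std::size_t lo = t * rows_per;
              const std::size_t hi = std::min(n, lo + rows_per);
              pool.emplace_back([&A, &B, &C, n, lo, hi] {
                  for (std::size_t i = lo; i < hi; ++i)
                      for (std::size_t k = 0; k < n; ++k)   // ikj order for cache locality
                          for (std::size_t j = 0; j < n; ++j)
                              C[i * n + j] += A[i * n + k] * B[k * n + j];
              });
          }
          for (auto& th : pool) th.join();
          std::printf("C[0][0] = %f (expected %f)\n", C[0], 2.0 * n);
      }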

  10. Network and computing infrastructure for scientific applications in Georgia

    Science.gov (United States)

    Kvatadze, R.; Modebadze, Z.

    2016-09-01

    The status of the network and computing infrastructure and the services available to the research and education community of Georgia are presented. The Research and Educational Networking Association - GRENA provides the following network services: Internet connectivity, network services, cyber security, technical support, etc. Computing resources used by the research teams are located at GRENA and at the major state universities. The GE-01-GRENA site is included in the European Grid Infrastructure. The paper also contains information about the programs of the Learning Center and the research and development projects in which GRENA participates.

  11. A look back: 57 years of scientific computing

    DEFF Research Database (Denmark)

    Wasniewski, Jerzy

    2012-01-01

    This document outlines my 57-year career in computational mathematics, a career that took me from Poland to Canada and finally to Denmark. It of course spans a period in which both hardware and software developed enormously. Along the way I was fortunate to be faced with fascinating technical challenges and privileged to be able to share them with inspiring colleagues. From the beginning, my work to a great extent was concerned, directly or indirectly, with computational linear algebra, an interest I maintain even today.

  12. Trends in scientific computing applied to petroleum exploration and production

    International Nuclear Information System (INIS)

    Guevara, Saul E; Piedrahita, Carlos E; Arroyo, Elkin R; Soto Rodolfo

    2002-01-01

    Current trends in computational tools for the upstream sector of the petroleum industry are presented herein. Several results and images obtained through commercial programs and in-house software developments illustrate the topics discussed. They include several types of problems and programming paradigms. Emphasis is placed on the future of parallel processing through the use of affordable, open systems, such as the Linux system. This kind of technology will likely make possible new research and industry applications, since quite advanced computational resources will become available to many people working in the area.

  13. Modeling with data tools and techniques for scientific computing

    CERN Document Server

    Klemens, Ben

    2009-01-01

    Modeling with Data fully explains how to execute computationally intensive analyses on very large data sets, showing readers how to determine the best methods for solving a variety of different problems, how to create and debug statistical models, and how to run an analysis and evaluate the results. Ben Klemens introduces a set of open and unlimited tools, and uses them to demonstrate data management, analysis, and simulation techniques essential for dealing with large data sets and computationally intensive procedures. He then demonstrates how to easily apply these tools to the many threads of statistical technique, including classical, Bayesian, maximum likelihood, and Monte Carlo methods

  14. The Y2K program for scientific-analysis computer programs at AECL

    International Nuclear Information System (INIS)

    Popovic, J.; Gaver, C.; Chapman, D.

    1999-01-01

    The evaluation of scientific-analysis computer programs for year-2000 compliance is part of AECL's year-2000 (Y2K) initiative, which addresses both the infrastructure systems at AECL and AECL's products and services. This paper describes the Y2K-compliance program for scientific-analysis computer codes. This program involves the integrated evaluation of the computer hardware, middleware, and third-party software in addition to the scientific codes developed in-house. The project involves several steps: the assessment of the scientific computer programs for Y2K compliance, performing any required corrective actions, porting the programs to Y2K-compliant platforms, and verification of the programs after porting. Some programs or program versions, deemed no longer required in the year 2000 and beyond, will be retired and archived. (author)

  15. The Y2K program for scientific-analysis computer programs at AECL

    International Nuclear Information System (INIS)

    Popovic, J.; Gaver, C.; Chapman, D.

    1999-01-01

    The evaluation of scientific analysis computer programs for year-2000 compliance is part of AECL's year-2000 (Y2K) initiative, which addresses both the infrastructure systems at AECL and AECL's products and services. This paper describes the Y2K-compliance program for scientific-analysis computer codes. This program involves the integrated evaluation of the computer hardware, middleware, and third-party software in addition to the scientific codes developed in-house. The project involves several steps: the assessment of the scientific computer programs for Y2K compliance, performing any required corrective actions, porting the programs to Y2K-compliant platforms, and verification of the programs after porting. Some programs or program versions, deemed no longer required in the year 2000 and beyond, will be retired and archived. (author)

  16. New Chicago-Indiana computer network will handle dataflow from world's largest scientific experiment

    CERN Multimedia

    2006-01-01

    "Massive quantities of data will soon begin flowing from the largest scientific instrument ever built into an international netword of computer centers, including one operated jointly by the University of Chicago and Indiana University." (1,5 page)

  17. Position Paper: Applying Machine Learning to Software Analysis to Achieve Trusted, Repeatable Scientific Computing

    Energy Technology Data Exchange (ETDEWEB)

    Prowell, Stacy J [ORNL; Symons, Christopher T [ORNL

    2015-01-01

    Producing trusted results from high-performance codes is essential for policy and has significant economic impact. We propose combining rigorous analytical methods with machine learning techniques to achieve the goal of repeatable, trustworthy scientific computing.

  18. From Mars to Minerva: The origins of scientific computing in the AEC labs

    Energy Technology Data Exchange (ETDEWEB)

    Seidel, R.W. [ERA Land Grant Professor of the History of Technology; Charles Babbage Institute, University of Minnesota, Minneapolis, Minnesota (United States)]

    1996-10-01

    Although the AEC laboratories are renowned for the development of nuclear weapons, their largess in promoting scientific computing also had a profound effect on scientific and technological development in the second half of the 20th century. © 1996 American Institute of Physics.

  19. Computer-Supported Aids to Making Sense of Scientific Articles: Cognitive, Motivational, and Attitudinal Effects

    Science.gov (United States)

    Gegner, Julie A.; Mackay, Donald H. J.; Mayer, Richard E.

    2009-01-01

    High school students can access original scientific research articles on the Internet, but may have trouble understanding them. To address this problem of online literacy, the authors developed a computer-based prototype for guiding students' comprehension of scientific articles. High school students were asked to read an original scientific…

  20. 77 FR 12823 - Advanced Scientific Computing Advisory Committee

    Science.gov (United States)

    2012-03-02

    ... Early Career technical talks Summary of Applied Math and Computer Science Workshops ASCR's new SBIR... least 5 business days prior to the meeting. Reasonable provision will be made to include the scheduled... the orderly conduct of business. Public comment will follow the 10-minute rule. Minutes: The minutes...

  1. Scientific Computing: A New Way of Looking at Mathematics

    Indian Academy of Sciences (India)

    Amiya Kumar Pani

    repose faith on the numbers being crunched. To design and develop reliable and efficient algorithms for numerical solutions to PDEs. By reliability, we mean that for a given tolerance and measurement, the computed solution stays near to the exact unknown solution within the prescribed tolerance with respect to the given.

  2. Advanced Scientific Computing Research Exascale Requirements Review. An Office of Science review sponsored by Advanced Scientific Computing Research, September 27-29, 2016, Rockville, Maryland

    Energy Technology Data Exchange (ETDEWEB)

    Almgren, Ann [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); DeMar, Phil [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Riley, Katherine [Argonne Leadership Computing Facility, Argonne, IL (United States); Antypas, Katie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bard, Deborah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Coffey, Richard [Argonne National Lab. (ANL), Argonne, IL (United States); Dart, Eli [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Science Network; Dosanjh, Sudip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gerber, Richard [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hack, James [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Monga, Inder [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Science Network; Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Rotman, Lauren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Science Network; Straatsma, Tjerk [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wells, Jack [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bernholdt, David E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bethel, Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bosilca, George [Univ. of Tennessee, Knoxville, TN (United States); Cappello, Frank [Argonne National Lab. (ANL), Argonne, IL (United States); Gamblin, Todd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Habib, Salman [Argonne National Lab. (ANL), Argonne, IL (United States); Hill, Judy [Oak Ridge Leadership Computing Facility, Oak Ridge, TN (United States); Hollingsworth, Jeffrey K. [Univ. of Maryland, College Park, MD (United States); McInnes, Lois Curfman [Argonne National Lab. (ANL), Argonne, IL (United States); Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Moore, Shirley [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Moreland, Ken [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Roser, Rob [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Shende, Sameer [Univ. of Oregon, Eugene, OR (United States); Shipman, Galen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-06-20

    The widespread use of computing in the American economy would not be possible without a thoughtful, exploratory research and development (R&D) community pushing the performance edge of operating systems, computer languages, and software libraries. These are the tools and building blocks — the hammers, chisels, bricks, and mortar — of the smartphone, the cloud, and the computing services on which we rely. Engineers and scientists need ever-more specialized computing tools to discover new material properties for manufacturing, make energy generation safer and more efficient, and provide insight into the fundamentals of the universe, for example. The research division of the U.S. Department of Energy's (DOE's) Office of Advanced Scientific Computing Research (ASCR Research) ensures that these tools and building blocks are being developed and honed to meet the extreme needs of modern science. See also http://exascaleage.org/ascr/ for additional information.

  3. New challenges in grid generation and adaptivity for scientific computing

    CERN Document Server

    Formaggia, Luca

    2015-01-01

    This volume collects selected contributions from the “Fourth Tetrahedron Workshop on Grid Generation for Numerical Computations”, which was held in Verbania, Italy in July 2013. The previous editions of this Workshop were hosted by the Weierstrass Institute in Berlin (2005), by INRIA Rocquencourt in Paris (2007), and by Swansea University (2010). This book covers different, though related, aspects of the field: the generation of quality grids for complex three-dimensional geometries; parallel mesh generation algorithms; mesh adaptation, including both theoretical and implementation aspects; grid generation and adaptation on surfaces – all with an interesting mix of numerical analysis, computer science and strongly application-oriented problems.

  4. The Julia programming language: the future of scientific computing

    Science.gov (United States)

    Gibson, John

    2017-11-01

    Julia is an innovative new open-source programming language for high-level, high-performance numerical computing. Julia combines the general-purpose breadth and extensibility of Python, the ease-of-use and numeric focus of Matlab, the speed of C and Fortran, and the metaprogramming power of Lisp. Julia uses type inference and just-in-time compilation to compile high-level user code to machine code on the fly. A rich set of numeric types and extensive numerical libraries are built-in. As a result, Julia is competitive with Matlab for interactive graphical exploration and with C and Fortran for high-performance computing. This talk interactively demonstrates Julia's numerical features and benchmarks Julia against C, C++, Fortran, Matlab, and Python on a spectral time-stepping algorithm for a 1d nonlinear partial differential equation. The Julia code is nearly as compact as Matlab and nearly as fast as Fortran. This material is based upon work supported by the National Science Foundation under Grant No. 1554149.

  5. Performance of scientific computing platforms with MCNP4B

    International Nuclear Information System (INIS)

    McLaughlin, H.E.; Hendricks, J.S.

    1998-01-01

    Several computing platforms were evaluated with the MCNP4B Monte Carlo radiation transport code. The DEC AlphaStation 500/500 was the fastest to run MCNP4B. Compared to the HP 9000-735, the fastest platform 4 yr ago, the AlphaStation is 335% faster, the HP C180 is 133% faster, the SGI Origin 2000 is 82% faster, the Cray T94/4128 is 1% faster, the IBM RS/6000-590 is 93% as fast, the DEC 3000/600 is 81% as fast, the Sun Sparc20 is 57% as fast, the Cray YMP 8/8128 is 57% as fast, the Sun Sparc5 is 33% as fast, and the Sun Sparc2 is 13% as fast. All results presented are reproducible and allow for comparison to computer platforms not included in this study. Timing studies are seen to be very problem dependent. The performance gains resulting from advances in software were also investigated. Various compilers and operating systems were seen to have a modest impact on performance, whereas hardware improvements have resulted in a factor of 4 improvement. MCNP4B also ran approximately as fast as MCNP4A.

  6. Quality assurance of analytical, scientific, and design computer programs for nuclear power plants

    International Nuclear Information System (INIS)

    1994-06-01

    This Standard applies to the design and development, modification, documentation, execution, and configuration management of computer programs used to perform analytical, scientific, and design computations during the design and analysis of safety-related nuclear power plant equipment, systems, structures, and components as identified by the owner. 2 figs

  7. Quality assurance of analytical, scientific, and design computer programs for nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-06-01

    This Standard applies to the design and development, modification, documentation, execution, and configuration management of computer programs used to perform analytical, scientific, and design computations during the design and analysis of safety-related nuclear power plant equipment, systems, structures, and components as identified by the owner. 2 figs.

  8. Scientific visualization in computational aerodynamics at NASA Ames Research Center

    Science.gov (United States)

    Bancroft, Gordon V.; Plessel, Todd; Merritt, Fergus; Walatka, Pamela P.; Watson, Val

    1989-01-01

    The visualization methods used in computational fluid dynamics research at the NASA-Ames Numerical Aerodynamic Simulation facility are examined, including postprocessing, tracking, and steering methods. The visualization requirements of the facility's three-dimensional graphical workstation are outlined, and the types of hardware and software used to meet these requirements are discussed. The main features of the facility's current and next-generation workstations are listed. Emphasis is given to postprocessing techniques, such as dynamic interactive viewing on the workstation and recording and playback on videodisk, tape, and 16-mm film. Postprocessing software packages are described, including a three-dimensional plotter, a surface modeler, a graphical animation system, a flow analysis software toolkit, and a real-time interactive particle-tracer.

  9. Scientific computing and algorithms in industrial simulations projects and products of Fraunhofer SCAI

    CERN Document Server

    Schüller, Anton; Schweitzer, Marc

    2017-01-01

    The contributions gathered here provide an overview of current research projects and selected software products of the Fraunhofer Institute for Algorithms and Scientific Computing SCAI. They show the wide range of challenges that scientific computing currently faces, the solutions it offers, and its important role in developing applications for industry. Given the exciting field of applied collaborative research and development it discusses, the book will appeal to scientists, practitioners, and students alike. The Fraunhofer Institute for Algorithms and Scientific Computing SCAI combines excellent research and application-oriented development to provide added value for our partners. SCAI develops numerical techniques, parallel algorithms and specialized software tools to support and optimize industrial simulations. Moreover, it implements custom software solutions for production and logistics, and offers calculations on high-performance computers. Its services and products are based on state-of-the-art metho...

  10. UNEDF: Advanced Scientific Computing Collaboration Transforms the Low-Energy Nuclear Many-Body Problem

    International Nuclear Information System (INIS)

    Nam, H; Stoitsov, M; Nazarewicz, W; Hagen, G; Kortelainen, M; Pei, J C; Bulgac, A; Maris, P; Vary, J P; Roche, K J; Schunck, N; Thompson, I; Wild, S M

    2012-01-01

    The demands of cutting-edge science are driving the need for larger and faster computing resources. With the rapidly growing scale of computing systems and the prospect of technologically disruptive architectures to meet these needs, scientists face the challenge of effectively using complex computational resources to advance scientific discovery. Multi-disciplinary collaborating networks of researchers with diverse scientific backgrounds are needed to address these complex challenges. The UNEDF SciDAC collaboration of nuclear theorists, applied mathematicians, and computer scientists is developing a comprehensive description of nuclei and their reactions that delivers maximum predictive power with quantified uncertainties. This paper describes UNEDF and identifies attributes that classify it as a successful computational collaboration. We illustrate significant milestones accomplished by UNEDF through integrative solutions using the most reliable theoretical approaches, most advanced algorithms, and leadership-class computational resources.

  11. Multidimensional Environmental Data Resource Brokering on Computational Grids and Scientific Clouds

    Science.gov (United States)

    Montella, Raffaele; Giunta, Giulio; Laccetti, Giuliano

    Grid computing has evolved widely over the past years, and its capabilities have found their way even into business products and are no longer relegated to scientific applications. Today, grid computing technology is not restricted to a set of specific grid open source or industrial products; rather, it comprises a set of capabilities virtually within any kind of software, used to create shared and highly collaborative production environments. These environments are focused on computational (workload) capabilities and the integration of information (data) into those computational capabilities. An active application field of grid computing is the full virtualization of scientific instruments in order to increase their availability and decrease operational and maintenance costs. Computational and information grids make it possible to manage real-world objects in a service-oriented way using industrial, widely adopted standards.

  12. Computational brain connectivity mapping: A core health and scientific challenge.

    Science.gov (United States)

    Deriche, Rachid

    2016-10-01

    One third of the burden of all diseases in Europe is due to problems caused by diseases affecting the brain. Although exceptional progress has been made in exploring the brain during the past decades, it is still terra incognita and calls for specific research efforts to better understand its architecture and functioning. To take up this great challenge of modern science and to overcome the limited view of the brain provided by any single imaging modality, this article advocates the idea, developed in my research group, of a global approach involving new generations of models for brain connectivity mapping and strong interactions between structural and functional connectivities. Capitalizing on the strengths of integrated and complementary non-invasive imaging modalities such as diffusion Magnetic Resonance Imaging (dMRI) and Electro- and Magneto-Encephalography (EEG & MEG) will contribute to achieving new frontiers for identifying and characterizing structural and functional brain connectivities and to providing a detailed mapping of brain connectivity, both in space and time, thus leading to added clinical value for high-impact diseases and new perspectives in computational neuro-imaging and cognitive neuroscience. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Certainty in Stockpile Computing: Recommending a Verification and Validation Program for Scientific Software

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J.R.

    1998-11-01

    As computing assumes a more central role in managing the nuclear stockpile, the consequences of an erroneous computer simulation could be severe. Computational failures are common in other endeavors and have caused project failures, significant economic loss, and loss of life. This report examines the causes of software failure and proposes steps to mitigate them. A formal verification and validation program for scientific software is recommended and described.

  14. National facility for advanced computational science: A sustainable path to scientific discovery

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-04-02

    Lawrence Berkeley National Laboratory (Berkeley Lab) proposes to create a National Facility for Advanced Computational Science (NFACS) and to establish a new partnership between the American computer industry and a national consortium of laboratories, universities, and computing facilities. NFACS will provide leadership-class scientific computing capability to scientists and engineers nationwide, independent of their institutional affiliation or source of funding. This partnership will bring into existence a new class of computational capability in the United States that is optimal for science and will create a sustainable path towards petaflops performance.

  15. Scholarly literature and the press: scientific impact and social perception of physics computing

    International Nuclear Information System (INIS)

    Pia, M G; Basaglia, T; Bell, Z W; Dressendorfer, P V

    2014-01-01

    The broad coverage of the search for the Higgs boson in the mainstream media is a relative novelty for high energy physics (HEP) research, whose achievements have traditionally been limited to scholarly literature. This paper illustrates the results of a scientometric analysis of HEP computing in scientific literature, institutional media and the press, and a comparative overview of similar metrics concerning representative particle physics measurements. The picture emerging from these scientometric data documents the relationship between the scientific impact and the social perception of HEP research versus that of HEP computing. The results of this analysis suggest that improved communication of the scientific and social role of HEP computing via press releases from the major HEP laboratories would be beneficial to the high energy physics community.

  16. ASCR Cybersecurity for Scientific Computing Integrity - Research Pathways and Ideas Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Peisert, Sean [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Univ. of California, Davis, CA (United States); Potok, Thomas E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Jones, Todd [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-06-03

    At the request of the U.S. Department of Energy's (DOE) Office of Science (SC) Advanced Scientific Computing Research (ASCR) program office, a workshop was held June 2-3, 2015, in Gaithersburg, MD, to identify potential long-term (10 to 20+ year) cybersecurity fundamental research and development challenges, strategies and roadmaps facing future high performance computing (HPC), networks, data centers, and extreme-scale scientific user facilities. This workshop was a follow-on to the workshop held January 7-9, 2015, in Rockville, MD, that examined higher-level ideas about scientific computing integrity specific to the mission of the DOE Office of Science. Issues included research computation and simulation that takes place on ASCR computing facilities and networks, as well as network-connected scientific instruments, such as those run by various DOE Office of Science programs. Workshop participants included researchers and operational staff from DOE national laboratories, as well as academic researchers and industry experts. Participants were selected based on the submission of abstracts relating to the topics discussed in the previous workshop report [1] and also from other ASCR reports, including "Abstract Machine Models and Proxy Architectures for Exascale Computing" [27], the DOE "Preliminary Conceptual Design for an Exascale Computing Initiative" [28], and the January 2015 machine learning workshop [29]. The workshop was also attended by several observers from DOE and other government agencies. The workshop was divided into three topic areas: (1) Trustworthy Supercomputing, (2) Extreme-Scale Data, Knowledge, and Analytics for Understanding and Improving Cybersecurity, and (3) Trust within High-end Networking and Data Centers. Participants were divided into three corresponding teams based on the category of their abstracts. The workshop began with a series of talks from the program manager and workshop chair, followed by the leaders for each of the

  17. Heterogeneous High Throughput Scientific Computing with APM X-Gene and Intel Xeon Phi

    CERN Document Server

    Abdurachmanov, David; Elmer, Peter; Eulisse, Giulio; Knight, Robert; Muzaffar, Shahzad

    2014-01-01

    Electrical power requirements will be a constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics. Performance-per-watt is a critical metric for the evaluation of computer architectures for cost-efficient computing. Additionally, future performance growth will come from heterogeneous, many-core, and high computing density platforms with specialized processors. In this paper, we examine the Intel Xeon Phi Many Integrated Cores (MIC) co-processor and Applied Micro X-Gene ARMv8 64-bit low-power server system-on-a-chip (SoC) solutions for scientific computing applications. We report our experience on software porting, performance and energy efficiency and evaluate the potential for use of such technologies in the context of distributed computing systems such as the Worldwide LHC Computing Grid (WLCG).

  18. Heterogeneous High Throughput Scientific Computing with APM X-Gene and Intel Xeon Phi

    Science.gov (United States)

    Abdurachmanov, David; Bockelman, Brian; Elmer, Peter; Eulisse, Giulio; Knight, Robert; Muzaffar, Shahzad

    2015-05-01

    Electrical power requirements will be a constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics. Performance-per-watt is a critical metric for the evaluation of computer architectures for cost-efficient computing. Additionally, future performance growth will come from heterogeneous, many-core, and high computing density platforms with specialized processors. In this paper, we examine the Intel Xeon Phi Many Integrated Cores (MIC) co-processor and Applied Micro X-Gene ARMv8 64-bit low-power server system-on-a-chip (SoC) solutions for scientific computing applications. We report our experience on software porting, performance and energy efficiency and evaluate the potential for use of such technologies in the context of distributed computing systems such as the Worldwide LHC Computing Grid (WLCG).

  19. Heterogeneous High Throughput Scientific Computing with APM X-Gene and Intel Xeon Phi

    International Nuclear Information System (INIS)

    Abdurachmanov, David; Bockelman, Brian; Elmer, Peter; Eulisse, Giulio; Muzaffar, Shahzad; Knight, Robert

    2015-01-01

    Electrical power requirements will be a constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics. Performance-per-watt is a critical metric for the evaluation of computer architectures for cost-efficient computing. Additionally, future performance growth will come from heterogeneous, many-core, and high computing density platforms with specialized processors. In this paper, we examine the Intel Xeon Phi Many Integrated Cores (MIC) co-processor and Applied Micro X-Gene ARMv8 64-bit low-power server system-on-a-chip (SoC) solutions for scientific computing applications. We report our experience on software porting, performance and energy efficiency and evaluate the potential for use of such technologies in the context of distributed computing systems such as the Worldwide LHC Computing Grid (WLCG). (paper)

  20. RAPPORT: running scientific high-performance computing applications on the cloud.

    Science.gov (United States)

    Cohen, Jeremy; Filippis, Ioannis; Woodbridge, Mark; Bauer, Daniela; Hong, Neil Chue; Jackson, Mike; Butcher, Sarah; Colling, David; Darlington, John; Fuchs, Brian; Harvey, Matt

    2013-01-28

    Cloud computing infrastructure is now widely used in many domains, but one area where there has been more limited adoption is research computing, in particular for running scientific high-performance computing (HPC) software. The Robust Application Porting for HPC in the Cloud (RAPPORT) project took advantage of existing links between computing researchers and application scientists in the fields of bioinformatics, high-energy physics (HEP) and digital humanities, to investigate running a set of scientific HPC applications from these domains on cloud infrastructure. In this paper, we focus on the bioinformatics and HEP domains, describing the applications and target cloud platforms. We conclude that, while there are many factors that need consideration, there is no fundamental impediment to the use of cloud infrastructure for running many types of HPC applications and, in some cases, there is potential for researchers to benefit significantly from the flexibility offered by cloud platforms.

  1. The Observation of Bahasa Indonesia Official Computer Terms Implementation in Scientific Publication

    Science.gov (United States)

    Gunawan, D.; Amalia, A.; Lydia, M. S.; Muthaqin, M. I.

    2018-03-01

    The government of the Republic of Indonesia issued a regulation to substitute the computer terms in foreign languages that had been used earlier with official computer terms in Bahasa Indonesia. This regulation was stipulated in Presidential Decree No. 2 of 2001 concerning the introduction of official computer terms in Bahasa Indonesia (known as Senarai Padanan Istilah/SPI). Sixteen years on, the people of Indonesia, particularly academics, should have implemented the official computer terms in their official publications. This observation was conducted to discover the extent of official computer term usage in scientific publications written in Bahasa Indonesia. The data source used in this observation is publications by academics, particularly in the computer science field. The method used in the observation is divided into four stages. The first stage is metadata harvesting using the Open Archive Initiative - Protocol for Metadata Harvesting (OAI-PMH). The second is converting the harvested documents (in PDF format) to plain text. The third stage is text preprocessing in preparation for string matching. The final stage is searching the text against the 629 SPI terms using the Boyer-Moore algorithm. We observed 240,781 occurrences of foreign computer terms in 1,156 scientific publications from six universities. This result shows that foreign computer terms are still widely used by academics. A sketch of the final matching stage appears below.
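
    The final matching stage has a direct standard-library realization in C++17. The sketch below is illustrative rather than the authors' code, and the two term pairs and the input sentence are examples, not a quotation of the 629-entry SPI list.

      // Scan extracted plain text for foreign terms with the C++17
      // standard-library Boyer-Moore searcher.
      #include <algorithm>
      #include <cstdio>
      #include <functional>
      #include <string>
      #include <utility>
      #include <vector>

      int main() {
          // (foreign term, official Bahasa Indonesia term) -- examples only
          const std::vector<std::pair<std::string, std::string>> terms = {
              {"mouse", "tetikus"}, {"keyboard", "papan ketik"},
          };
          const std::string text = "gunakan mouse dan keyboard untuk memasukkan data";

          for (const auto& [foreign, official] : terms) {
              const std::boyer_moore_searcher searcher(foreign.begin(), foreign.end());
              if (std::search(text.begin(), text.end(), searcher) != text.end())
                  std::printf("found \"%s\" (official term: \"%s\")\n",
                              foreign.c_str(), official.c_str());
          }
      }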

  2. Instrumentation for Scientific Computing in Neural Networks, Information Science, Artificial Intelligence, and Applied Mathematics.

    Science.gov (United States)

    1987-10-01

    Instrumentation grant to purchase equipment for support of research in neural networks, information science, artificial intelligence, and applied mathematics. Contract AFOSR 86-0282. Principal Investigator: Stephen

  3. Automated and Assistive Tools for Accelerated Code migration of Scientific Computing on to Heterogeneous MultiCore Systems

    Science.gov (United States)

    2017-04-13

    AFRL-AFOSR-UK-TR-2017-0029. Final report (2012 - 01/25/2015) for the project "Automated and Assistive Tools for Accelerated Code migration of Scientific Computing on to Heterogeneous MultiCore Systems."

  4. The application of cloud computing to scientific workflows: a study of cost and performance.

    Science.gov (United States)

    Berriman, G Bruce; Deelman, Ewa; Juve, Gideon; Rynge, Mats; Vöckler, Jens-S

    2013-01-28

    The current model of transferring data from data centres to desktops for analysis will soon be rendered impractical by the accelerating growth in the volume of science datasets. Processing will instead often take place on high-performance servers co-located with data. Evaluations of how new technologies such as cloud computing would support such a new distributed computing model are urgently needed. Cloud computing is a new way of purchasing computing and storage resources on demand through virtualization technologies. We report here the results of investigations of the applicability of commercial cloud computing to scientific computing, with an emphasis on astronomy, including investigations of what types of applications can be run cheaply and efficiently on the cloud, and an example of an application well suited to the cloud: processing a large dataset to create a new science product.

  5. On the impact of quantum computing technology on future developments in high-performance scientific computing

    OpenAIRE

    Möller, Matthias; Vuik, Cornelis

    2017-01-01

    Quantum computing technologies have become a hot topic in academia and industry receiving much attention and financial support from all sides. Building a quantum computer that can be used practically is in itself an outstanding challenge that has become the ‘new race to the moon’. Next to researchers and vendors of future computing technologies, national authorities are showing strong interest in maturing this technology due to its known potential to break many of today’s encryption technique...

  6. Performance analysis of cloud computing services for many-tasks scientific computing

    NARCIS (Netherlands)

    Iosup, A.; Ostermann, S.; Yigitbasi, M.N.; Prodan, R.; Fahringer, T.; Epema, D.H.J.

    2011-01-01

    Cloud computing is an emerging commercial infrastructure paradigm that promises to eliminate the need for maintaining expensive computing facilities by companies and institutes alike. Through the use of virtualization and resource time sharing, clouds serve with a single set of physical resources a

  7. A performance analysis of EC2 cloud computing services for scientific computing

    NARCIS (Netherlands)

    Ostermann, S.; Iosup, A.; Yigitbasi, M.N.; Prodan, R.; Fahringer, T.; Epema, D.H.J.; Avresky, D.; Diaz, M.; Bode, A.; Bruno, C.; Dekel, E.

    2010-01-01

    Cloud Computing is emerging today as a commercial infrastructure that eliminates the need for maintaining expensive computing hardware. Through the use of virtualization, clouds promise to address with the same shared set of physical resources a large user base with different needs. Thus, clouds

  8. On the impact of quantum computing technology on future developments in high-performance scientific computing

    NARCIS (Netherlands)

    Möller, M.; Vuik, C.

    2017-01-01

    Quantum computing technologies have become a hot topic in academia and industry receiving much attention and financial support from all sides. Building a quantum computer that can be used practically is in itself an outstanding challenge that has become the ‘new race to the moon’. Next to

  9. Topic 14+16: High-performance and scientific applications and extreme-scale computing (Introduction)

    KAUST Repository

    Downes, Turlough P.

    2013-01-01

    As our understanding of the world around us increases, it becomes more challenging to make use of what we already know, and to increase our understanding still further. Computational modeling and simulation have become critical tools in addressing this challenge. The requirements of high-resolution, accurate modeling have outstripped the ability of desktop computers and even small clusters to provide the necessary compute power. Many applications in the scientific and engineering domains now need very large amounts of compute time, while other applications, particularly in the life sciences, frequently have large data I/O requirements. There is thus a growing need for a range of high performance applications which can utilize parallel compute systems effectively, which have efficient data handling strategies and which have the capacity to utilize current and future systems. The High Performance and Scientific Applications topic aims to highlight recent progress in the use of advanced computing and algorithms to address the varied, complex and increasing challenges of modern research throughout both the "hard" and "soft" sciences. This necessitates being able to use large numbers of compute nodes, many of which are equipped with accelerators, and to deal with difficult I/O requirements. © 2013 Springer-Verlag.

  10. Scientific Grand Challenges: Forefront Questions in Nuclear Science and the Role of High Performance Computing

    International Nuclear Information System (INIS)

    Khaleel, Mohammad A.

    2009-01-01

    This report is an account of the deliberations and conclusions of the workshop on 'Forefront Questions in Nuclear Science and the Role of High Performance Computing' held January 26-28, 2009, co-sponsored by the U.S. Department of Energy (DOE) Office of Nuclear Physics (ONP) and the DOE Office of Advanced Scientific Computing (ASCR). Representatives from the national and international nuclear physics communities, as well as from the high performance computing community, participated. The purpose of this workshop was to (1) identify forefront scientific challenges in nuclear physics and then determine which-if any-of these could be aided by high performance computing at the extreme scale; (2) establish how and why new high performance computing capabilities could address issues at the frontiers of nuclear science; (3) provide nuclear physicists the opportunity to influence the development of high performance computing; and (4) provide the nuclear physics community with plans for development of future high performance computing capability by DOE ASCR.

  11. Scientific Grand Challenges: Forefront Questions in Nuclear Science and the Role of High Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.

    2009-10-01

    This report is an account of the deliberations and conclusions of the workshop on "Forefront Questions in Nuclear Science and the Role of High Performance Computing" held January 26-28, 2009, co-sponsored by the U.S. Department of Energy (DOE) Office of Nuclear Physics (ONP) and the DOE Office of Advanced Scientific Computing (ASCR). Representatives from the national and international nuclear physics communities, as well as from the high performance computing community, participated. The purpose of this workshop was to 1) identify forefront scientific challenges in nuclear physics and then determine which-if any-of these could be aided by high performance computing at the extreme scale; 2) establish how and why new high performance computing capabilities could address issues at the frontiers of nuclear science; 3) provide nuclear physicists the opportunity to influence the development of high performance computing; and 4) provide the nuclear physics community with plans for development of future high performance computing capability by DOE ASCR.

  12. Cloud Bursting with GlideinWMS: Means to satisfy ever increasing computing needs for Scientific Workflows

    International Nuclear Information System (INIS)

    Mhashilkar, Parag; Tiradani, Anthony; Holzman, Burt; Larson, Krista; Sfiligoi, Igor; Rynge, Mats

    2014-01-01

    Scientific communities have been at the forefront of adopting new technologies and methodologies in computing. Scientific computing has influenced how science is done today, achieving breakthroughs that were impossible several decades ago. For the past decade, several such communities in the Open Science Grid (OSG) and the European Grid Infrastructure (EGI) have been using GlideinWMS to run complex application workflows and effectively share computational resources over the grid. GlideinWMS is a pilot-based workload management system (WMS) that creates, on demand, a dynamically sized overlay HTCondor batch system on grid resources. At present, the computational resources shared over the grid are just adequate to sustain the computing needs. We envision that the complexity of the science driven by 'Big Data' will further push the need for computational resources. To fulfill their increasing demands and/or to run specialized workflows, some of the big communities, like CMS, are investigating the use of cloud computing as Infrastructure-as-a-Service (IaaS) with GlideinWMS as a potential alternative to fill the void. Similarly, communities with no previous access to computing resources can use GlideinWMS to set up a batch system on cloud infrastructure. To enable this, the architecture of GlideinWMS has been extended to support interfacing GlideinWMS with different scientific and commercial cloud providers such as HLT, FutureGrid, FermiCloud and Amazon EC2. In this paper, we describe a solution for cloud bursting with GlideinWMS. The paper describes the approach, architectural changes and lessons learned while enabling support for cloud infrastructures in GlideinWMS.

  13. Elastic Scheduling of Scientific Workflows under Deadline Constraints in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Nazia Anwar

    2018-01-01

    Scientific workflow applications are collections of several structured activities and fine-grained computational tasks. Scientific workflow scheduling in cloud computing is a challenging research topic due to its distinctive features. In cloud environments, it has become critical to perform efficient task scheduling that reduces scheduling overhead, minimizes cost and maximizes resource utilization while still meeting the user-specified overall deadline. This paper proposes a strategy, Dynamic Scheduling of Bag of Tasks based workflows (DSB), for scheduling scientific workflows with the aim of minimizing the financial cost of leasing Virtual Machines (VMs) under a user-defined deadline constraint. The proposed model groups the workflow into Bags of Tasks (BoTs) based on data dependency and priority constraints and thereafter optimizes the allocation and scheduling of BoTs on elastic, heterogeneous and dynamically provisioned cloud resources (VMs) in order to attain the proposed method's objectives. The proposed approach considers pay-as-you-go Infrastructure as a Service (IaaS) clouds having inherent features such as elasticity, abundance, heterogeneity and VM provisioning delays. A trace-based simulation using benchmark scientific workflows representing real-world applications demonstrates a significant reduction in workflow computation cost while the workflow deadline is met. The results validate that the proposed model produces better success rates in meeting deadlines and cost efficiencies in comparison to adapted state-of-the-art algorithms for similar problems. A toy sketch of the cost/deadline trade-off follows.
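
    The cost/deadline trade-off that DSB optimizes can be illustrated with a toy greedy rule: give each BoT an equal share of the deadline and lease the cheapest VM type that still fits. This is an illustration only, not the DSB algorithm; the VM catalogue, prices and workloads are made up.

      // Toy greedy sketch of deadline-constrained cost minimization.
      #include <cstdio>
      #include <vector>

      struct VmType { const char* name; double speedup; double price_per_hour; };

      int main() {
          const std::vector<VmType> catalogue = {    // hypothetical IaaS offerings
              {"small", 1.0, 0.10}, {"medium", 2.0, 0.25}, {"large", 4.0, 0.60},
          };
          const std::vector<double> bot_work = {8.0, 4.0, 12.0};  // CPU-hours per BoT
          const double deadline = 10.0;                           // hours
          const double share = deadline / bot_work.size();        // naive equal split

          double total_cost = 0.0;
          for (double work : bot_work) {
              const VmType* pick = nullptr;          // cheapest type meeting the share
              for (const auto& vm : catalogue)
                  if (work / vm.speedup <= share &&
                      (pick == nullptr || vm.price_per_hour < pick->price_per_hour))
                      pick = &vm;
              if (pick == nullptr) pick = &catalogue.back();  // fall back to fastest
              const double hours = work / pick->speedup;
              total_cost += hours * pick->price_per_hour;
              std::printf("BoT of %.0f CPU-h -> %s VM for %.2f h ($%.2f)\n",
                          work, pick->name, hours, hours * pick->price_per_hour);
          }
          std::printf("total cost $%.2f against a %.1f h deadline\n", total_cost, deadline);
      }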

  14. Applied and numerical partial differential equations scientific computing in simulation, optimization and control in a multidisciplinary context

    CERN Document Server

    Glowinski, R; Kuznetsov, Y A; Periaux, Jacques; Neittaanmaki, Pekka; Pironneau, Olivier

    2010-01-01

    Standing at the intersection of mathematics and scientific computing, this collection of state-of-the-art papers in nonlinear PDEs examines their applications to subjects as diverse as dynamical systems, computational mechanics, and the mathematics of finance.

  15. Sign use and cognition in automated scientific discovery: are computers only special kinds of signs?

    Science.gov (United States)

    Giza, Piotr

    2018-04-01

    James Fetzer criticizes the computational paradigm prevailing in cognitive science by questioning what he takes to be its most elementary ingredient: that cognition is computation across representations. He argues that if cognition is taken to be a purposive, meaningful, algorithmic problem-solving activity, then computers are incapable of cognition. Instead, they appear to be signs of a special kind that can facilitate computation. He proposes the conception of minds as semiotic systems as an alternative paradigm for understanding mental phenomena, one that seems to overcome the difficulties of computationalism. I argue that with computer systems dealing with scientific discovery, the matter is not so simple. The alleged superiority of humans, who use signs to stand for something else, over computers, which are merely "physical symbol systems" or "automatic formal systems", is easy to establish in everyday life but far from obvious when scientific discovery is at stake. In science, as opposed to everyday life, the meaning of symbols is, apart from very low-level experimental investigations, defined implicitly by the way the symbols are used in explanatory theories or experimental laws relevant to the field, and in consequence human and machine discoverers are much more on a par. Moreover, the great practical success of the genetic programming method and recent attempts to apply it to the automatic generation of cognitive theories seem to show that computer systems are capable of very efficient problem-solving activity in science which is neither purposive, nor meaningful, nor algorithmic. This, I think, undermines Fetzer's argument that computer systems are incapable of cognition because computation across representations is bound to be a purposive, meaningful, algorithmic problem-solving activity.

  16. Space and Earth Sciences, Computer Systems, and Scientific Data Analysis Support, Volume 1

    Science.gov (United States)

    Estes, Ronald H. (Editor)

    1993-01-01

    This Final Progress Report covers the specific technical activities of Hughes STX Corporation for the last contract triannual period of 1 June through 30 Sep. 1993, in support of assigned task activities at Goddard Space Flight Center (GSFC). It also provides a brief summary of work throughout the contract period of performance on each active task. Technical activity is presented in Volume 1, while financial and level-of-effort data are presented in Volume 2. Technical support was provided to all Divisions and Laboratories of Goddard's Space Sciences and Earth Sciences Directorates. Types of support include: scientific programming, systems programming, computer management, mission planning, scientific investigation, data analysis, data processing, database creation and maintenance, instrumentation development, and management services. Missions and instruments supported include: ROSAT, Astro-D, BBXRT, XTE, AXAF, GRO, COBE, WIND, UIT, SMM, STIS, HEIDI, DE, URAP, CRRES, Voyagers, ISEE, San Marco, LAGEOS, TOPEX/Poseidon, Pioneer-Venus, Galileo, Cassini, Nimbus-7/TOMS, Meteor-3/TOMS, FIFE, BOREAS, TRMM, AVHRR, and Landsat. Accomplishments include: development of computing programs for mission science and data analysis, supercomputer applications support, computer network support, computational upgrades for data archival and analysis centers, end-to-end management of mission data flow, scientific modeling and results in the fields of space and Earth physics, planning and design of the GSFC VO DAAC and VO IMS, fabrication, assembly, and testing of mission instrumentation, and design of a mission operations center.

  17. ScalaLab and GroovyLab: Comparing Scala and Groovy for Scientific Computing

    Directory of Open Access Journals (Sweden)

    Stergios Papadimitriou

    2015-01-01

    Full Text Available ScalaLab and GroovyLab are both MATLAB-like environments for the Java Virtual Machine. ScalaLab is based on the Scala programming language and GroovyLab on the Groovy programming language. They present similar user interfaces and functionality to the user, and they share the same set of Java scientific libraries and native code libraries. From the programmer's point of view, though, they have significant differences. This paper compares some aspects of the two environments and highlights some of the strengths and weaknesses of Scala versus Groovy for scientific computing. The discussion also examines some aspects of the dilemma of dynamic versus static typing for scientific programming. The performance of the Java platform is improving continuously and at a fast pace; today Java can effectively support demanding high-performance computing and scales well on multicore platforms. Thus, both systems can challenge the performance of traditional C/C++/Fortran scientific code with an easier-to-use and more productive programming environment.

  18. Impact of configuration management system of computer center on support of scientific projects throughout their lifecycle

    International Nuclear Information System (INIS)

    Bogdanov, A.V.; Yuzhanin, N.V.; Zolotarev, V.I.; Ezhakova, T.R.

    2017-01-01

    In this article, the problem of supporting scientific projects throughout their lifecycle in a computer center is considered in each of its aspects. The Configuration Management system plays a connecting role in the processes related to the provision and support of computer center services. In view of the strong integration of IT infrastructure components through the use of virtualization, control of the infrastructure becomes even more critical to the support of research projects, which imposes higher requirements on the Configuration Management system. For every aspect of research project support, the influence of the Configuration Management system is reviewed and the development of the corresponding elements of the system is described in the present paper.

  19. DB90: A Fortran Callable Relational Database Routine for Scientific and Engineering Computer Programs

    Science.gov (United States)

    Wrenn, Gregory A.

    2005-01-01

    This report describes a database routine called DB90 which is intended for use with scientific and engineering computer programs. The software is written in the Fortran 90/95 programming language standard with file input and output routines written in the C programming language. These routines should be completely portable to any computing platform and operating system that has Fortran 90/95 and C compilers. DB90 allows a program to supply relation names and up to 5 integer key values to uniquely identify each record of each relation. This permits the user to select records or retrieve data in any desired order.
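
    DB90's access pattern, a relation name plus up to five integer keys that uniquely identify a record, can be sketched with a Python analogue (hypothetical code for illustration, not the Fortran API):

        # A hedged Python sketch of DB90's access pattern (not its Fortran API):
        # a record is addressed by a relation name plus up to 5 integer keys.
        class RelationalStore:
            def __init__(self):
                self._tables = {}

            def put(self, relation, keys, record):
                assert 1 <= len(keys) <= 5, "DB90 allows up to 5 integer keys"
                self._tables.setdefault(relation, {})[tuple(keys)] = record

            def get(self, relation, keys):
                return self._tables[relation][tuple(keys)]

            def records(self, relation):
                # retrieve data in any desired order: here, sorted by key tuple
                return [r for _, r in sorted(self._tables[relation].items())]

        db = RelationalStore()
        db.put("loads", (2, 1), {"node": 2, "case": 1, "force": 13.5})
        db.put("loads", (1, 1), {"node": 1, "case": 1, "force": 10.0})
        print(db.get("loads", (2, 1))["force"])          # 13.5
        print([r["node"] for r in db.records("loads")])  # [1, 2]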

  20. Impact of configuration management system of computer center on support of scientific projects throughout their lifecycle

    Science.gov (United States)

    Bogdanov, A. V.; Iuzhanin, N. V.; Zolotarev, V. I.; Ezhakova, T. R.

    2017-12-01

    In this article, the problem of supporting scientific projects throughout their lifecycle in a computer center is considered in each of its aspects. The Configuration Management system plays a connecting role in the processes related to the provision and support of computer center services. In view of the strong integration of IT infrastructure components through the use of virtualization, control of the infrastructure becomes even more critical to the support of research projects, which imposes higher requirements on the Configuration Management system. For every aspect of research project support, the influence of the Configuration Management system is reviewed and the development of the corresponding elements of the system is described in the present paper.

  1. An integrated IaaS and PaaS architecture for scientific computing

    OpenAIRE

    Donvito, Giacinto; Blanquer, Ignacio

    2015-01-01

    Scientific applications often require multiple computing resources deployed in a coordinated way. The deployment of multiple resources requires installing and configuring special software applications, which should be updated when changes in the virtual infrastructure take place. When working in hybrid and federated cloud environments, restrictions on the hypervisor or cloud management platform must be minimised to facilitate geographic-wide brokering and cross-site deployments. Moreover, prese...

  2. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Lucas, Robert [University of Southern California, Information Sciences Institute]; Ang, James [Sandia National Laboratories]; Bergman, Keren [Columbia University]; Borkar, Shekhar [Intel]; Carlson, William [Institute for Defense Analyses]; Carrington, Laura [University of California, San Diego]; Chiu, George [IBM]; Colwell, Robert [DARPA]; Dally, William [NVIDIA]; Dongarra, Jack [University of Tennessee]; Geist, Al [Oak Ridge National Laboratory]; Haring, Rud [IBM]; Hittinger, Jeffrey [Lawrence Livermore National Laboratory]; Hoisie, Adolfy [Pacific Northwest National Laboratory]; Klein, Dean [Micron]; Kogge, Peter [University of Notre Dame]; Lethin, Richard [Reservoir Labs]; Sarkar, Vivek [Rice University]; Schreiber, Robert [Hewlett Packard]; Shalf, John [Lawrence Berkeley National Laboratory]; Sterling, Thomas [Indiana University]; Stevens, Rick [Argonne National Laboratory]; Bashor, Jon [Lawrence Berkeley National Laboratory]; Brightwell, Ron [Sandia National Laboratories]; Coteus, Paul [IBM]; Debenedictus, Erik [Sandia National Laboratories]; Hiller, Jon [Science and Technology Associates]; Kim, K. H. [IBM]; Langston, Harper [Reservoir Labs]; Murphy, Richard [Micron]; Webster, Clayton [Oak Ridge National Laboratory]; Wild, Stefan [Argonne National Laboratory]; Grider, Gary [Los Alamos National Laboratory]; Ross, Rob [Argonne National Laboratory]; Leyffer, Sven [Argonne National Laboratory]; Laros III, James [Sandia National Laboratories]

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  3. Secure Scientific Applications Scheduling Technique for Cloud Computing Environment Using Global League Championship Algorithm

    Science.gov (United States)

    Abdulhamid, Shafi’i Muhammad; Abd Latiff, Muhammad Shafie; Abdul-Salaam, Gaddafi; Hussain Madni, Syed Hamid

    2016-01-01

    A cloud computing system is a huge cluster of interconnected servers residing in a datacenter, dynamically provisioned to clients on demand via a front-end interface. Scientific application scheduling in the cloud computing environment is an NP-hard problem due to the dynamic nature of heterogeneous resources. Recently, a number of metaheuristic optimization schemes have been applied to address the challenges of application scheduling in the cloud, without much emphasis on the issue of secure global scheduling. In this paper, a scientific application scheduling technique using the Global League Championship Algorithm (GBLCA) is first presented for global task scheduling in the cloud environment. The experiment is carried out using the CloudSim simulator. The experimental results show that the proposed GBLCA technique produces a remarkable performance improvement in makespan, ranging from 14.44% to 46.41%. It also shows a significant reduction in the time taken to securely schedule applications, parametrically measured in terms of response time. In view of the experimental results, the proposed technique provides a better-quality scheduling solution for scientific application task execution in the cloud computing environment than the MinMin, MaxMin, Genetic Algorithm (GA) and Ant Colony Optimization (ACO) scheduling techniques. PMID:27384239

  4. Secure Scientific Applications Scheduling Technique for Cloud Computing Environment Using Global League Championship Algorithm.

    Science.gov (United States)

    Abdulhamid, Shafi'i Muhammad; Abd Latiff, Muhammad Shafie; Abdul-Salaam, Gaddafi; Hussain Madni, Syed Hamid

    2016-01-01

    A cloud computing system is a huge cluster of interconnected servers residing in a datacenter, dynamically provisioned to clients on demand via a front-end interface. Scientific application scheduling in the cloud computing environment is an NP-hard problem due to the dynamic nature of heterogeneous resources. Recently, a number of metaheuristic optimization schemes have been applied to address the challenges of application scheduling in the cloud, without much emphasis on the issue of secure global scheduling. In this paper, a scientific application scheduling technique using the Global League Championship Algorithm (GBLCA) is first presented for global task scheduling in the cloud environment. The experiment is carried out using the CloudSim simulator. The experimental results show that the proposed GBLCA technique produces a remarkable performance improvement in makespan, ranging from 14.44% to 46.41%. It also shows a significant reduction in the time taken to securely schedule applications, parametrically measured in terms of response time. In view of the experimental results, the proposed technique provides a better-quality scheduling solution for scientific application task execution in the cloud computing environment than the MinMin, MaxMin, Genetic Algorithm (GA) and Ant Colony Optimization (ACO) scheduling techniques.
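
    The GBLCA itself is beyond a short sketch, but every such metaheuristic repeatedly evaluates candidate schedules against objectives such as makespan. A minimal, hypothetical fitness evaluation in Python (task lengths and VM speeds are invented for illustration):

        # A hedged sketch of the fitness function a scheduler like GBLCA must
        # evaluate: the makespan of one candidate assignment of tasks to VMs.
        task_len = [8, 3, 5, 9, 2, 7]         # abstract work units
        vm_speed = [1.0, 2.0]                 # units processed per second
        assignment = [0, 1, 1, 0, 1, 1]       # task i runs on VM assignment[i]

        def makespan(assignment):
            load = [0.0] * len(vm_speed)
            for task, vm in enumerate(assignment):
                load[vm] += task_len[task] / vm_speed[vm]
            return max(load)                  # the slowest VM bounds completion

        print(f"makespan: {makespan(assignment):.1f} s")
        # A metaheuristic would now perturb `assignment` (the "matches" of a
        # league championship schedule) and keep changes that lower the makespan.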

  5. Scientific Grand Challenges: Crosscutting Technologies for Computing at the Exascale - February 2-4, 2010, Washington, D.C.

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.

    2011-02-06

    The goal of the "Scientific Grand Challenges - Crosscutting Technologies for Computing at the Exascale" workshop in February 2010, jointly sponsored by the U.S. Department of Energy's Office of Advanced Scientific Computing Research and the National Nuclear Security Administration, was to identify the elements of a research and development agenda that will address the challenges of computing at the exascale and create a comprehensive exascale computing environment. This exascale computing environment will enable the science applications identified in the eight workshops previously held in the Scientific Grand Challenges Workshop Series.

  6. Teaching Scientific Computing: A Model-Centered Approach to Pipeline and Parallel Programming with C

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2015-01-01

    Full Text Available The aim of this study is to present an approach to introducing pipeline and parallel computing, using a model of a multiphase queueing system. Pipeline computing, including software pipelines, is among the key concepts in modern computing and electronics engineering. Modern computer science and engineering education requires a comprehensive curriculum, so an introduction to pipeline and parallel computing is an essential topic to be included. At the same time, the topic is among the most motivating due to its comprehensive multidisciplinary and technical requirements. To enhance the educational process, the paper proposes a novel model-centered framework and develops the relevant learning objects. This allows implementing an educational platform for a constructivist learning process, enabling learners to experiment with the provided programming models, acquire competences in modern scientific research and computational thinking, and capture the relevant technical knowledge. It also provides an integral platform for a simultaneous and comparative introduction to pipelining and parallel computing. The programming language C was chosen for developing the programming models, with the Message Passing Interface (MPI) and OpenMP as the parallelization tools.
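
    The pipeline concept itself fits in a few lines. Below is a minimal sketch, in Python rather than the C/MPI used in the paper, of a two-stage software pipeline in which both stages process different items concurrently; the stage functions and data are illustrative.

        import multiprocessing as mp

        def square(x):
            return x * x

        def negate(x):
            return -x

        def stage(inbox, outbox, func):
            # One pipeline stage: consume items until the None sentinel arrives.
            for item in iter(inbox.get, None):
                outbox.put(func(item))
            outbox.put(None)  # forward the shutdown signal to the next stage

        if __name__ == "__main__":
            q1, q2, q3 = mp.Queue(), mp.Queue(), mp.Queue()
            stages = [
                mp.Process(target=stage, args=(q1, q2, square)),
                mp.Process(target=stage, args=(q2, q3, negate)),
            ]
            for p in stages:
                p.start()
            for x in range(5):    # feed the pipeline while stages already run
                q1.put(x)
            q1.put(None)          # sentinel: no more input
            results = list(iter(q3.get, None))
            for p in stages:
                p.join()
            print(results)        # [0, -1, -4, -9, -16]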

  7. The graphics future in scientific applications-trends and developments in computer graphics

    CERN Document Server

    Enderle, G

    1982-01-01

    Computer graphics methods and tools are used to a great extent in scientific research. Future development in this area will be influenced both by new hardware developments and by software advances. In the hardware sector, the development of raster technology will lead to the increased use of colour workstations with more local processing power. Colour hardcopy devices for creating plots, slides, or movies will be available at a lower price than today. The first real 3D workstations will appear on the marketplace. One of the main activities in the software sector is the standardization of computer graphics systems, graphical files, and device interfaces. This will lead to more portable graphical application programs and to a common base for computer graphics education.

  8. 11th International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing

    CERN Document Server

    Nuyens, Dirk

    2016-01-01

    This book presents the refereed proceedings of the Eleventh International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing that was held at the University of Leuven (Belgium) in April 2014. These biennial conferences are major events for Monte Carlo and quasi-Monte Carlo researchers. The proceedings include articles based on invited lectures as well as carefully selected contributed papers on all theoretical aspects and applications of Monte Carlo and quasi-Monte Carlo methods. Offering information on the latest developments in these very active areas, this book is an excellent reference resource for theoreticians and practitioners interested in solving high-dimensional computational problems, arising, in particular, in finance, statistics and computer graphics.
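
    As a hedged illustration of the methods the conference covers (not an example from the proceedings), the following compares plain Monte Carlo with a scrambled Sobol' quasi-Monte Carlo sequence on a simple two-dimensional problem; it requires NumPy and SciPy >= 1.7.

        import numpy as np
        from scipy.stats import qmc

        def estimate_pi(points):
            # Fraction of points inside the quarter disc, scaled to pi.
            x, y = points[:, 0], points[:, 1]
            return 4.0 * np.mean(x**2 + y**2 <= 1.0)

        n = 2**12
        rng = np.random.default_rng(0)
        mc_points = rng.random((n, 2))                       # pseudo-random
        qmc_points = qmc.Sobol(d=2, scramble=True, seed=0).random_base2(m=12)

        print(f"MC : {estimate_pi(mc_points):.5f}")
        print(f"QMC: {estimate_pi(qmc_points):.5f}  (pi = {np.pi:.5f})")
        # For sufficiently regular integrands, the QMC error typically decays
        # close to O(log(n)^d / n), versus O(n^(-1/2)) for plain Monte Carlo.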

  9. Large-scale computation at PSI scientific achievements and future requirements

    International Nuclear Information System (INIS)

    Adelmann, A.; Markushin, V.

    2008-11-01

    Computational modelling and simulation are among the disciplines that have seen the most dramatic growth in capabilities in the 20th century. Within the past two decades, scientific computing has become an important contributor to all scientific research programs. Computational modelling and simulation are particularly indispensable for solving research problems that are unsolvable by traditional theoretical and experimental approaches, hazardous to study, or time-consuming or expensive to solve by traditional means. Many such research areas are found in PSI's research portfolio. Advances in computing technologies (including hardware and software) during the past decade have set the stage for a major step forward in modelling and simulation. We have now arrived at a situation where we have a number of otherwise unsolvable problems, and where simulations are as complex as the systems under study. In 2008 the High-Performance Computing (HPC) community entered the petascale era with the heterogeneous Opteron/Cell machine called Roadrunner, built by IBM for the Los Alamos National Laboratory. We are on the brink of a time when the availability of many hundreds of thousands of cores will open up new challenging possibilities in physics, algorithms (numerical mathematics) and computer science. However, to deliver on this promise, it is not enough to provide 'peak' performance in terms of petaflops, the maximum theoretical speed a computer can attain. Most importantly, this must be translated into a corresponding increase in the capabilities of scientific codes. This is a daunting problem that can only be solved by increasing investment in hardware, in the accompanying system software that enables the reliable use of high-end computers, in scientific competence, i.e. the mathematical (parallel) algorithms that are the basis of the codes, and in education. In the case of Switzerland, the white paper 'Swiss National Strategic Plan for High Performance Computing and Networking

  10. Large-scale computation at PSI scientific achievements and future requirements

    Energy Technology Data Exchange (ETDEWEB)

    Adelmann, A.; Markushin, V

    2008-11-15

    Computational modelling and simulation are among the disciplines that have seen the most dramatic growth in capabilities in the 20th century. Within the past two decades, scientific computing has become an important contributor to all scientific research programs. Computational modelling and simulation are particularly indispensable for solving research problems that are unsolvable by traditional theoretical and experimental approaches, hazardous to study, or time-consuming or expensive to solve by traditional means. Many such research areas are found in PSI's research portfolio. Advances in computing technologies (including hardware and software) during the past decade have set the stage for a major step forward in modelling and simulation. We have now arrived at a situation where we have a number of otherwise unsolvable problems, and where simulations are as complex as the systems under study. In 2008 the High-Performance Computing (HPC) community entered the petascale era with the heterogeneous Opteron/Cell machine called Roadrunner, built by IBM for the Los Alamos National Laboratory. We are on the brink of a time when the availability of many hundreds of thousands of cores will open up new challenging possibilities in physics, algorithms (numerical mathematics) and computer science. However, to deliver on this promise, it is not enough to provide 'peak' performance in terms of petaflops, the maximum theoretical speed a computer can attain. Most importantly, this must be translated into a corresponding increase in the capabilities of scientific codes. This is a daunting problem that can only be solved by increasing investment in hardware, in the accompanying system software that enables the reliable use of high-end computers, in scientific competence, i.e. the mathematical (parallel) algorithms that are the basis of the codes, and in education. In the case of Switzerland, the white paper 'Swiss National Strategic Plan for High Performance Computing

  11. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY15 Q4.

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Sewell, Christopher [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]; Childs, Hank [Univ. of Oregon, Eugene, OR (United States)]; Ma, Kwan-Liu [Univ. of California, Davis, CA (United States)]; Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States)]; Meredith, Jeremy [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)]

    2015-12-01

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  12. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY17.

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Pugmire, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)]; Rogers, David [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]; Childs, Hank [Univ. of Oregon, Eugene, OR (United States)]; Ma, Kwan-Liu [Univ. of California, Davis, CA (United States)]; Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States)]

    2017-10-01

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  13. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem. Mid-year report FY16 Q2

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth D.; Sewell, Christopher (LANL); Childs, Hank (U of Oregon); Ma, Kwan-Liu (UC Davis); Geveci, Berk (Kitware); Meredith, Jeremy (ORNL)

    2016-05-01

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  14. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Mid-year report FY17 Q2

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Pugmire, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)]; Rogers, David [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]; Childs, Hank [Univ. of Oregon, Eugene, OR (United States)]; Ma, Kwan-Liu [Univ. of California, Davis, CA (United States)]; Geveci, Berk [Kitware Inc., Clifton Park, NY (United States)]

    2017-05-01

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  15. Advanced Scientific Computing Research Network Requirements: ASCR Network Requirements Review Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Bacon, Charles [Argonne National Lab. (ANL), Argonne, IL (United States)]; Bell, Greg [ESnet, Berkeley, CA (United States)]; Canon, Shane [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]; Dart, Eli [ESnet, Berkeley, CA (United States)]; Dattoria, Vince [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR)]; Goodwin, Dave [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR)]; Lee, Jason [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]; Hicks, Susan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)]; Holohan, Ed [Argonne National Lab. (ANL), Argonne, IL (United States)]; Klasky, Scott [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)]; Lauzon, Carolyn [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR)]; Rogers, Jim [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)]; Shipman, Galen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)]; Skinner, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]; Tierney, Brian [ESnet, Berkeley, CA (United States)]

    2013-03-08

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  16. On the Performance of the Python Programming Language for Serial and Parallel Scientific Computations

    Directory of Open Access Journals (Sweden)

    Xing Cai

    2005-01-01

    Full Text Available This article addresses the performance of scientific applications that use the Python programming language. First, we investigate several techniques for improving the computational efficiency of serial Python codes. Then, we discuss the basic programming techniques in Python for parallelizing serial scientific applications. It is shown that an efficient implementation of the array-related operations is essential for achieving good parallel performance, as for the serial case. Once the array-related operations are efficiently implemented, probably using a mixed-language implementation, good serial and parallel performance become achievable. This is confirmed by a set of numerical experiments. Python is also shown to be well suited for writing high-level parallel programs.
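
    The article's central point can be demonstrated in a few lines: an interpreted Python loop pays per-element overhead, while the equivalent vectorized NumPy call runs in compiled code. A minimal, self-contained timing sketch (timings will vary by machine):

        import time
        import numpy as np

        a = np.random.default_rng(0).random(5_000_000)

        t0 = time.perf_counter()
        s_loop = 0.0
        for x in a:              # interpreted loop: bytecode dispatch per element
            s_loop += x * x
        t1 = time.perf_counter()
        s_vec = float(np.dot(a, a))  # one call into compiled, possibly SIMD, code
        t2 = time.perf_counter()

        print(f"loop: {t1 - t0:.2f} s, vectorized: {t2 - t1:.4f} s")
        assert abs(s_loop - s_vec) < 1e-3 * s_vec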

  17. Smolyak's algorithm: A powerful black box for the acceleration of scientific computations

    KAUST Repository

    Tempone, Raul

    2017-03-26

    We provide a general discussion of Smolyak's algorithm for the acceleration of scientific computations. The algorithm first appeared in Smolyak's work on multidimensional integration and interpolation. Since then, it has been generalized in multiple directions and has been associated with the keywords: sparse grids, hyperbolic cross approximation, combination technique, and multilevel methods. Variants of Smolyak's algorithm have been employed in the computation of high-dimensional integrals in finance, chemistry, and physics, in the numerical solution of partial and stochastic differential equations, and in uncertainty quantification. Motivated by this broad and ever-increasing range of applications, we describe a general framework that summarizes fundamental results and assumptions in a concise application-independent manner.

  18. Smolyak's algorithm: A powerful black box for the acceleration of scientific computations

    KAUST Repository

    Tempone, Raul; Wolfers, Soeren

    2017-01-01

    We provide a general discussion of Smolyak's algorithm for the acceleration of scientific computations. The algorithm first appeared in Smolyak's work on multidimensional integration and interpolation. Since then, it has been generalized in multiple directions and has been associated with the keywords: sparse grids, hyperbolic cross approximation, combination technique, and multilevel methods. Variants of Smolyak's algorithm have been employed in the computation of high-dimensional integrals in finance, chemistry, and physics, in the numerical solution of partial and stochastic differential equations, and in uncertainty quantification. Motivated by this broad and ever-increasing range of applications, we describe a general framework that summarizes fundamental results and assumptions in a concise application-independent manner.
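
    One standard statement of Smolyak's algorithm, in its combination-technique form, is worth recalling here (a common formulation from the sparse-grid literature; the notation is not necessarily this paper's). Given one-dimensional approximation operators U^i, e.g. quadrature or interpolation rules of level i, the d-dimensional Smolyak operator at level q is

        A(q,d) = \sum_{q-d+1 \le |\mathbf{i}| \le q} (-1)^{q-|\mathbf{i}|} \binom{d-1}{q-|\mathbf{i}|} \left( U^{i_1} \otimes \cdots \otimes U^{i_d} \right),

    where \mathbf{i} = (i_1, \ldots, i_d) and |\mathbf{i}| = i_1 + \cdots + i_d. The sparse sum retains far fewer tensor-product terms than the full product U^q \otimes \cdots \otimes U^q, which is what mitigates the curse of dimensionality for sufficiently smooth problems.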

  19. Using Just-in-Time Information to Support Scientific Discovery Learning in a Computer-Based Simulation

    Science.gov (United States)

    Hulshof, Casper D.; de Jong, Ton

    2006-01-01

    Students encounter many obstacles during scientific discovery learning with computer-based simulations. It is hypothesized that an effective type of support, one that does not interfere with the scientific discovery learning process, should be delivered on a "just-in-time" basis. This study explores the effect of facilitating access to…

  20. Advanced scientific computational methods and their applications to nuclear technologies. (3) Introduction of continuum simulation methods and their applications (3)

    International Nuclear Information System (INIS)

    Satake, Shin-ichi; Kunugi, Tomoaki

    2006-01-01

    Scientific computational methods have advanced remarkably with the progress of nuclear development. They have played the role of a weft connecting the various realms of nuclear engineering, and an introductory course on advanced scientific computational methods and their applications to nuclear technologies was therefore prepared in serial form. This third issue introduces continuum simulation methods and their applications. Spectral methods and multi-interface calculation methods in fluid dynamics are reviewed. (T. Tanaka)

  1. Investigation of Storage Options for Scientific Computing on Grid and Cloud Facilities

    International Nuclear Information System (INIS)

    Garzoglio, Gabriele

    2012-01-01

    In recent years, several new storage technologies, such as Lustre, Hadoop, OrangeFS, and BlueArc, have emerged. While several groups have run benchmarks to characterize them under a variety of configurations, more work is needed to evaluate these technologies for the use cases of scientific computing on Grid clusters and Cloud facilities. This paper discusses our evaluation of the technologies as deployed on a test bed at FermiCloud, one of the Fermilab infrastructure-as-a-service Cloud facilities. The test bed consists of 4 server-class nodes with 40 TB of disk space and up to 50 virtual machine clients, some running on the storage server nodes themselves. With this configuration, the evaluation compares the performance of some of these technologies when deployed on virtual machines and on “bare metal” nodes. In addition to running standard benchmarks such as IOZone to check the sanity of our installation, we have run I/O intensive tests using physics-analysis applications. This paper presents how the storage solutions perform in a variety of realistic use cases of scientific computing. One interesting difference among the storage systems tested is found in a decrease in total read throughput with increasing number of client processes, which occurs in some implementations but not others.

  2. The Centre of High-Performance Scientific Computing, Geoverbund, ABC/J - Geosciences enabled by HPSC

    Science.gov (United States)

    Kollet, Stefan; Görgen, Klaus; Vereecken, Harry; Gasper, Fabian; Hendricks-Franssen, Harrie-Jan; Keune, Jessica; Kulkarni, Ketan; Kurtz, Wolfgang; Sharples, Wendy; Shrestha, Prabhakar; Simmer, Clemens; Sulis, Mauro; Vanderborght, Jan

    2016-04-01

    The Centre of High-Performance Scientific Computing (HPSC TerrSys) was founded in 2011 to establish a centre of competence in high-performance scientific computing in terrestrial systems and the geosciences, enabling fundamental and applied geoscientific research in the Geoverbund ABC/J (the geoscientific research alliance of the Universities of Aachen, Cologne, Bonn and the Research Centre Jülich, Germany). The specific goals of HPSC TerrSys are to achieve relevance at the national and international level in (i) the development and application of HPSC technologies in the geoscientific community; (ii) student education; (iii) HPSC services and support, also to the wider geoscientific community; and (iv) the industry and public sectors via, e.g., useful applications and data products. A key feature of HPSC TerrSys is the Simulation Laboratory Terrestrial Systems, which is located at the Jülich Supercomputing Centre (JSC) and provides extensive capabilities with respect to porting, profiling, tuning and performance monitoring of geoscientific software in JSC's supercomputing environment. We will present a summary of success stories of HPSC applications, including integrated terrestrial model development, parallel profiling and its application from watersheds to the continent; massively parallel data assimilation using physics-based models and ensemble methods; quasi-operational terrestrial water and energy monitoring; and convection-permitting climate simulations over Europe. The success stories stress the need for a formalized education of students in the application of HPSC technologies in the future.

  3. Availability measurement of grid services from the perspective of a scientific computing centre

    International Nuclear Information System (INIS)

    Marten, H; Koenig, T

    2011-01-01

    The Karlsruhe Institute of Technology (KIT) is the merger of Forschungszentrum Karlsruhe and the Technical University Karlsruhe. The Steinbuch Centre for Computing (SCC) was one of the first new organizational units of KIT, combining the former Institute for Scientific Computing of Forschungszentrum Karlsruhe and the Computing Centre of the University. IT service management according to the worldwide de facto standard 'IT Infrastructure Library (ITIL)' was chosen by SCC as a strategic element to support the merging of the two existing computing centres, located at a distance of about 10 km. The availability and reliability of IT services directly influence customer satisfaction as well as the reputation of the service provider, and unscheduled loss of availability due to hardware or software failures may even result in severe consequences like data loss. Fault-tolerant and error-correcting design features reduce the risk of IT component failures and help to improve the delivered availability. The ITIL process controlling the respective design is called Availability Management. This paper discusses Availability Management with regard to grid services delivered to WLCG and provides a few elementary guidelines for availability measurements and calculations of services consisting of arbitrary numbers of components.
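
    As a toy example of the kind of calculation such guidelines cover (illustrative numbers, not SCC's): the availability of a compound service follows from its components, multiplying availabilities for components in series and combining failure probabilities for redundant replicas.

        # Hedged example of an availability calculation for a compound service
        # (invented numbers). Components in series must all be up; a redundant
        # group is up if at least one replica is.
        from math import prod

        def series(*a):        # service fails if any component fails
            return prod(a)

        def redundant(*a):     # service fails only if every replica fails
            return 1.0 - prod(1.0 - x for x in a)

        # A grid service behind two redundant front-ends, in series with
        # a storage system and a batch system:
        A = series(redundant(0.99, 0.99), 0.995, 0.98)
        print(f"expected availability: {A:.4f}")          # ~0.9750
        print(f"expected downtime: {(1 - A) * 24 * 365:.0f} hours/year")  # ~219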

  4. The OSG Open Facility: an on-ramp for opportunistic scientific computing

    Science.gov (United States)

    Jayatilaka, B.; Levshina, T.; Sehgal, C.; Gardner, R.; Rynge, M.; Würthwein, F.

    2017-10-01

    The Open Science Grid (OSG) is a large, robust computing grid that started primarily as a collection of sites associated with large HEP experiments such as ATLAS, CDF, CMS, and DZero, but has evolved in recent years to a much larger user and resource platform. In addition to meeting the US LHC community’s computational needs, the OSG continues to be one of the largest providers of distributed high-throughput computing (DHTC) to researchers from a wide variety of disciplines via the OSG Open Facility. The Open Facility consists of OSG resources that are available opportunistically to users other than resource owners and their collaborators. In the past two years, the Open Facility has doubled its annual throughput to over 200 million wall hours. More than half of these resources are used by over 100 individual researchers from over 60 institutions in fields such as biology, medicine, math, economics, and many others. Over 10% of these individual users utilized in excess of 1 million computational hours each in the past year. The largest source of these cycles is temporary unused capacity at institutions affiliated with US LHC computational sites. An increasing fraction, however, comes from university HPC clusters and large national infrastructure supercomputers offering unused capacity. Such expansions have allowed the OSG to provide ample computational resources to both individual researchers and small groups as well as sizable international science collaborations such as LIGO, AMS, IceCube, and sPHENIX. Opening up access to the Fermilab FabrIc for Frontier Experiments (FIFE) project has also allowed experiments such as mu2e and NOvA to make substantial use of Open Facility resources, the former with over 40 million wall hours in a year. We present how this expansion was accomplished as well as future plans for keeping the OSG Open Facility at the forefront of enabling scientific research by way of DHTC.

  5. The OSG Open Facility: An On-Ramp for Opportunistic Scientific Computing

    Energy Technology Data Exchange (ETDEWEB)

    Jayatilaka, B. [Fermilab]; Levshina, T. [Fermilab]; Sehgal, C. [Fermilab]; Gardner, R. [Chicago U.]; Rynge, M. [USC - ISI, Marina del Rey]; Würthwein, F. [UC, San Diego]

    2017-11-22

    The Open Science Grid (OSG) is a large, robust computing grid that started primarily as a collection of sites associated with large HEP experiments such as ATLAS, CDF, CMS, and DZero, but has evolved in recent years to a much larger user and resource platform. In addition to meeting the US LHC community’s computational needs, the OSG continues to be one of the largest providers of distributed high-throughput computing (DHTC) to researchers from a wide variety of disciplines via the OSG Open Facility. The Open Facility consists of OSG resources that are available opportunistically to users other than resource owners and their collaborators. In the past two years, the Open Facility has doubled its annual throughput to over 200 million wall hours. More than half of these resources are used by over 100 individual researchers from over 60 institutions in fields such as biology, medicine, math, economics, and many others. Over 10% of these individual users utilized in excess of 1 million computational hours each in the past year. The largest source of these cycles is temporary unused capacity at institutions affiliated with US LHC computational sites. An increasing fraction, however, comes from university HPC clusters and large national infrastructure supercomputers offering unused capacity. Such expansions have allowed the OSG to provide ample computational resources to both individual researchers and small groups as well as sizable international science collaborations such as LIGO, AMS, IceCube, and sPHENIX. Opening up access to the Fermilab FabrIc for Frontier Experiments (FIFE) project has also allowed experiments such as mu2e and NOvA to make substantial use of Open Facility resources, the former with over 40 million wall hours in a year. We present how this expansion was accomplished as well as future plans for keeping the OSG Open Facility at the forefront of enabling scientific research by way of DHTC.

  6. DOE High Performance Computing Operational Review (HPCOR): Enabling Data-Driven Scientific Discovery at HPC Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard; Allcock, William; Beggio, Chris; Campbell, Stuart; Cherry, Andrew; Cholia, Shreyas; Dart, Eli; England, Clay; Fahey, Tim; Foertter, Fernanda; Goldstone, Robin; Hick, Jason; Karelitz, David; Kelly, Kaki; Monroe, Laura; Prabhat; Skinner, David; White, Julia

    2014-10-17

    U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014, representatives from six DOE HPC centers met in Oakland, CA, at the DOE High Performance Computing Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.

  7. Efficient Machine Learning Approach for Optimizing Scientific Computing Applications on Emerging HPC Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Arumugam, Kamesh [Old Dominion Univ., Norfolk, VA (United States)]

    2017-05-01

    Efficient parallel implementation of scientific applications on multi-core CPUs with accelerators such as GPUs and Xeon Phis is challenging. It requires exploiting the data-parallel architecture of the accelerator along with the vector pipelines of modern x86 CPU architectures, load balancing, and efficient memory transfer between different devices. It is relatively easy to meet these requirements for highly structured scientific applications. In contrast, a number of scientific and engineering applications are unstructured. Getting performance on accelerators for these applications is extremely challenging because many of them employ irregular algorithms which exhibit data-dependent control-flow and irregular memory accesses. Furthermore, these applications are often iterative with dependencies between steps, making it hard to parallelize across steps. As a result, parallelism in these applications is often limited to a single step. Numerical simulation of charged-particle beam dynamics is one such application, where the distribution of work and the memory access pattern at each time step are irregular. Applications with these properties tend to present significant branch and memory divergence, load imbalance between processor cores, and poor compute and memory utilization. Prior research on parallelizing such irregular applications has focused on optimizing the irregular, data-dependent memory accesses and control-flow during a single step of the application, independent of the other steps, with the assumption that these patterns are completely unpredictable. We observed that the structure of computation leading to control-flow divergence and irregular memory accesses in one step is similar to that in the next step. It is possible to predict this structure in the current step by observing the computation structure of previous steps. In this dissertation, we present novel machine learning based optimization techniques to address

  8. Measuring scientific reasoning through behavioral analysis in a computer-based problem solving exercise

    Science.gov (United States)

    Mead, C.; Horodyskyj, L.; Buxner, S.; Semken, S. C.; Anbar, A. D.

    2016-12-01

    Developing scientific reasoning skills is a common learning objective for general-education science courses. However, effective assessments for such skills typically involve open-ended questions or tasks, which must be hand-scored and may not be usable online. Using computer-based learning environments, reasoning can be assessed automatically by analyzing student actions within the learning environment. We describe such an assessment under development and present pilot results. In our content-neutral instrument, students solve a problem by collecting and interpreting data in a logical, systematic manner. We then infer reasoning skill automatically based on student actions. Specifically, students investigate why Earth has seasons, a scientifically simple but commonly misunderstood topic. Students are given three possible explanations and asked to select a set of locations on a world map from which to collect temperature data. They then explain how the data support or refute each explanation. The best approaches will use locations in both the Northern and Southern hemispheres to argue that the contrasting seasonality of the hemispheres supports only the correct explanation. We administered a pilot version to students at the beginning of an online, introductory science course (n = 223) as an optional extra credit exercise. We were able to categorize students' data collection decisions as more and less logically sound. Students who chose the most logical measurement locations earned higher course grades, but not significantly higher. This result is encouraging, but not definitive. In the future, we will clarify our results in two ways. First, we plan to incorporate more open-ended interactions into the assessment to improve the resolving power of this tool. Second, to avoid relying on course grades, we will independently measure reasoning skill with one of the existing hand-scored assessments (e.g., Critical Thinking Assessment Test) to cross-validate our new

  9. Multithreaded transactions in scientific computing. The Growth06_v2 program

    Science.gov (United States)

    Daniluk, Andrzej

    2009-07-01

    efficient than the previous ones [3]. Summary of revisions: The design pattern (see Fig. 2 of Ref. [3]) has been modified according to the scheme shown in Fig. 1. A graphical user interface (GUI) for the program has been reconstructed; Fig. 2 presents a hybrid diagram of the GUI that shows how onscreen objects connect to use cases. The program has been compiled with English/USA regional and language options. Note: the figures mentioned above are contained in the program distribution file. Unusual features: The program is distributed in the form of the source project GROWTH06_v2.dpr with associated files, and should be compiled using Borland Delphi compilers version 6 or later (including Borland Developer Studio 2006 and CodeGear compilers for Delphi). Additional comments: Two figures are included in the program distribution file, captioned 'Static classes model for Transaction design pattern' and 'A model of a window that shows how onscreen objects connect to use cases'. Running time: The typical running time is machine- and user-parameter dependent. References: [1] A. Daniluk, Comput. Phys. Comm. 170 (2005) 265. [2] W.H. Press, B.P. Flannery, S.A. Teukolsky, W.T. Vetterling, Numerical Recipes in Pascal: The Art of Scientific Computing, first ed., Cambridge University Press, 1989. [3] M. Brzuszek, A. Daniluk, Comput. Phys. Comm. 175 (2006) 678.

  10. Computer simulation, rhetoric, and the scientific imagination how virtual evidence shapes science in the making and in the news

    CERN Document Server

    Roundtree, Aimee Kendall

    2013-01-01

    Computer simulations help advance climatology, astrophysics, and other scientific disciplines. They are also at the crux of several high-profile cases of science in the news. How do simulation scientists, with little or no direct observations, make decisions about what to represent? What is the nature of simulated evidence, and how do we evaluate its strength? Aimee Kendall Roundtree suggests answers in Computer Simulation, Rhetoric, and the Scientific Imagination. She interprets simulations in the sciences by uncovering the argumentative strategies that underpin their production and dissemination.

  11. A Scientific Calculator for Exact Real Number Computation Based on LRT, GMP and FC++

    Directory of Open Access Journals (Sweden)

    J. A. Hernández

    2012-03-01

    Full Text Available Language for Redundant Test (LRT) is a programming language for exact real number computation. Its lazy evaluation mechanism (also called call-by-need) and its infinite-list requirement make the language appropriate for implementation in a functional programming language such as Haskell. However, a direct translation of the operational semantics of LRT into Haskell, as well as of the algorithms implementing basic operations (addition, subtraction, multiplication, division) and trigonometric functions (sine, cosine, tangent, etc.), makes the resulting scientific calculator slow and inefficient. In this paper, we present an alternative implementation of the scientific calculator using FC++ and GMP. FC++ is a functional C++ library, while GMP is the GNU multiple-precision library. We show that a direct translation of LRT into FC++ results in a faster scientific calculator than the one presented in Haskell.

  12. II - Template Metaprogramming for Massively Parallel Scientific Computing - Vectorization with Expression Templates

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    Large-scale scientific computing raises questions on different levels, ranging from the formulation of problems to the choice of the best algorithms and their implementation for a specific platform. There are similarities among these different topics that can be exploited by modern-style C++ template metaprogramming techniques to produce readable, maintainable and generic code. Traditional low-level code tends to be fast but platform-dependent, and it obfuscates the meaning of the algorithm. On the other hand, an object-oriented approach is nice to read, but may come with an inherent performance penalty. These lectures aim to present the basics of the Expression Template (ET) idiom, which allows us to keep the object-oriented approach without sacrificing performance. We will in particular show how to enhance ET to include SIMD vectorization. We will then introduce techniques for abstracting iteration, and introduce thread-level parallelism for use in heavy data-centric loads. We will show how to apply these methods i...
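
    The ET idiom itself is C++-specific, but its core idea, overloaded operators that build a lightweight expression tree which is then evaluated in a single fused pass without temporaries, can be sketched in a few lines of Python (an analogue for illustration, not the lecture's C++ code):

        # Overloaded operators build an expression tree instead of allocating
        # temporary arrays; arithmetic happens lazily, element by element.
        class Expr:
            def __add__(self, other):
                return BinOp(self, other, lambda a, b: a + b)
            def __mul__(self, other):
                return BinOp(self, other, lambda a, b: a * b)

        class Vec(Expr):
            def __init__(self, data):
                self.data = list(data)
            def __getitem__(self, i):
                return self.data[i]
            def __len__(self):
                return len(self.data)

        class BinOp(Expr):
            def __init__(self, lhs, rhs, op):
                self.lhs, self.rhs, self.op = lhs, rhs, op
            def __getitem__(self, i):   # evaluation happens here, lazily
                return self.op(self.lhs[i], self.rhs[i])
            def __len__(self):
                return len(self.lhs)

        x = Vec([1.0, 2.0, 3.0])
        y = Vec([4.0, 5.0, 6.0])
        z = x + y * x   # builds a tree; no arithmetic has been performed yet
        print([z[i] for i in range(len(z))])   # one fused loop: [5.0, 12.0, 21.0]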

  13. A document-driven method for certifying scientific computing software for use in nuclear safety analysis

    International Nuclear Information System (INIS)

    Smith, W. Spencer; Koothoor, Mimitha

    2016-01-01

    This paper presents a documentation and development method to facilitate the certification of scientific computing software used in the safety analysis of nuclear facilities. To study the problems faced during quality assurance and certification activities, a case study was performed on legacy software used for thermal analysis of a fuel pin in a nuclear reactor. Although no errors were uncovered in the code, 27 issues of incompleteness and inconsistency were found in the documentation. This work proposes that software documentation follow a rational process, which includes a software requirements specification following a template that is reusable, maintainable, and understandable. To develop the design and implementation, this paper suggests literate programming as an alternative to traditional structured programming. Literate programming allows numerical algorithms and code to be documented together in what is termed the literate programmer's manual. This manual is developed with explicit traceability to the software requirements specification. The traceability between the theory, numerical algorithms, and implementation facilitates achieving completeness and consistency, and simplifies the process of verification and the associated certification.

  14. Implementation of a Curriculum-Integrated Computer Game for Introducing Scientific Argumentation

    Science.gov (United States)

    Wallon, Robert C.; Jasti, Chandana; Lauren, Hillary Z. G.; Hug, Barbara

    2017-11-01

    Argumentation has been emphasized in recent US science education reform efforts (NGSS Lead States 2013; NRC 2012), and while existing studies have investigated approaches to introducing and supporting argumentation (e.g., McNeill and Krajcik in Journal of Research in Science Teaching, 45(1), 53-78, 2008; Kang et al. in Science Education, 98(4), 674-704, 2014), few studies have investigated how game-based approaches may be used to introduce argumentation to students. In this paper, we report findings from a design-based study of a teacher's use of a computer game intended to introduce the claim, evidence, reasoning (CER) framework (McNeill and Krajcik 2012) for scientific argumentation. We studied the implementation of the game over two iterations of development in a high school biology teacher's classes. The results of this study include aspects of enactment of the activities and student argument scores. We found the teacher used the game in aspects of explicit instruction of argumentation during both iterations, although the ways in which the game was used differed. Also, students' scores in the second iteration were significantly higher than the first iteration. These findings support the notion that students can learn argumentation through a game, especially when used in conjunction with explicit instruction and support in student materials. These findings also highlight the importance of analyzing classroom implementation in studies of game-based learning.

  15. Enhancing reproducibility in scientific computing: Metrics and registry for Singularity containers.

    Directory of Open Access Journals (Sweden)

    Vanessa V Sochat

    Full Text Available Here we present Singularity Hub, a framework to build and deploy Singularity containers for mobility of compute, and the singularity-python software with novel metrics for assessing reproducibility of such containers. Singularity containers make it possible for scientists and developers to package reproducible software, and Singularity Hub adds automation to this workflow by building, capturing metadata for, visualizing, and serving containers programmatically. Our novel metrics, based on custom filters of content hashes of container contents, allow for comparison of an entire container, including operating system, custom software, and metadata. First we will review Singularity Hub's primary use cases and how the infrastructure has been designed to support modern, common workflows. Next, we conduct three analyses to demonstrate build consistency, reproducibility metric performance and interpretability, and potential for discovery. This is the first effort to demonstrate a rigorous assessment of measurable similarity between containers and operating systems. We provide these capabilities within Singularity Hub, as well as the source software singularity-python that provides the underlying functionality. Singularity Hub is available at https://singularity-hub.org, and we are excited to provide it as an openly available platform for building and deploying scientific containers.
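    A minimal sketch of the general idea behind such a content-hash metric, assuming the comparison reduces to the Jaccard overlap of filtered per-file digests (the actual singularity-python filters and scoring differ in detail; all names below are illustrative):

```python
import hashlib
import os

# Sketch of a content-hash reproducibility metric in the spirit of the
# paper: hash file contents under a custom filter and compare two
# container trees by the Jaccard overlap of their hash sets.

def content_hashes(root, skip=("tmp", "var")):
    hashes = set()
    for dirpath, _, files in os.walk(root):
        rel = os.path.relpath(dirpath, root)
        if rel.split(os.sep)[0] in skip:      # custom filter on paths
            continue
        for name in files:
            with open(os.path.join(dirpath, name), "rb") as fh:
                hashes.add(hashlib.sha256(fh.read()).hexdigest())
    return hashes

def similarity(root_a, root_b):
    """Jaccard similarity of two trees' file-content hashes."""
    a, b = content_hashes(root_a), content_hashes(root_b)
    return len(a & b) / len(a | b) if a | b else 1.0

# e.g. similarity("/mnt/container_a", "/mnt/container_b")
```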

  16. Enhancing reproducibility in scientific computing: Metrics and registry for Singularity containers

    Science.gov (United States)

    Prybol, Cameron J.; Kurtzer, Gregory M.

    2017-01-01

    Here we present Singularity Hub, a framework to build and deploy Singularity containers for mobility of compute, and the singularity-python software with novel metrics for assessing reproducibility of such containers. Singularity containers make it possible for scientists and developers to package reproducible software, and Singularity Hub adds automation to this workflow by building, capturing metadata for, visualizing, and serving containers programmatically. Our novel metrics, based on custom filters of content hashes of container contents, allow for comparison of an entire container, including operating system, custom software, and metadata. First we will review Singularity Hub’s primary use cases and how the infrastructure has been designed to support modern, common workflows. Next, we conduct three analyses to demonstrate build consistency, reproducibility metric performance and interpretability, and potential for discovery. This is the first effort to demonstrate a rigorous assessment of measurable similarity between containers and operating systems. We provide these capabilities within Singularity Hub, as well as the source software singularity-python that provides the underlying functionality. Singularity Hub is available at https://singularity-hub.org, and we are excited to provide it as an openly available platform for building and deploying scientific containers. PMID:29186161

  17. A document-driven method for certifying scientific computing software for use in nuclear safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, W. Spencer; Koothoor, Mimitha [Computing and Software Department, McMaster University, Hamilton (Canada)

    2016-04-15

    This paper presents a documentation and development method to facilitate the certification of scientific computing software used in the safety analysis of nuclear facilities. To study the problems faced during quality assurance and certification activities, a case study was performed on legacy software used for thermal analysis of a fuel pin in a nuclear reactor. Although no errors were uncovered in the code, 27 issues of incompleteness and inconsistency were found in the documentation. This work proposes that software documentation follow a rational process, which includes a software requirements specification following a template that is reusable, maintainable, and understandable. To develop the design and implementation, this paper suggests literate programming as an alternative to traditional structured programming. Literate programming allows numerical algorithms and code to be documented together in what is termed the literate programmer's manual. This manual is developed with explicit traceability to the software requirements specification. The traceability between the theory, numerical algorithms, and implementation facilitates achieving completeness and consistency, and simplifies the process of verification and the associated certification.

  18. Scientific profile of brain-computer interfaces: Bibliometric analysis in a 10-year period.

    Science.gov (United States)

    Hu, Kejia; Chen, Chao; Meng, Qingyao; Williams, Ziv; Xu, Wendong

    2016-12-02

    With the tremendous advances in the field of brain-computer interfaces (BCI), the literature in this field has grown exponentially; examination of highly cited articles is a tool that can help identify outstanding scientific studies and landmark papers. This study examined the characteristics of 100 highly cited BCI papers over the past 10 years. The Web of Science was searched for highly cited papers related to BCI research published from 2006 to 2015. The top 100 highly cited articles were identified. The number of citations, countries, corresponding institutions, year of publication, study design, and research area were noted and analyzed. The 100 highly cited articles had a mean of 137.1 (SE: 15.38) citations. These articles were published in 45 high-impact journals, most frequently in TRANSACTIONS ON BIOMEDICAL ENGINEERING (n=14). Of the 100 articles, 72 were original articles and the rest were review articles. These articles came from 15 countries, with the USA contributing most of the highly cited articles (n=52). Fifty-seven institutions produced these 100 highly cited articles, led by Duke University (n=7). This study provides a historical perspective on the progress in the field of BCI, allows recognition of the most influential reports, and provides useful information that can indicate areas requiring further investigation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  19. III - Template Metaprogramming for massively parallel scientific computing - Templates for Iteration; Thread-level Parallelism

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    Large scale scientific computing raises questions on different levels, ranging from the formulation of the problems to the choice of the best algorithms and their implementation for a specific platform. There are similarities across these different topics that can be exploited by modern-style C++ template metaprogramming techniques to produce readable, maintainable and generic code. Traditional low-level code tends to be fast but platform-dependent, and it obfuscates the meaning of the algorithm. On the other hand, an object-oriented approach is nice to read, but may come with an inherent performance penalty. These lectures aim to present the basics of the Expression Template (ET) idiom, which allows us to keep the object-oriented approach without sacrificing performance. We will in particular show how to enhance ET to include SIMD vectorization. We will then introduce techniques for abstracting iteration, and introduce thread-level parallelism for use in heavy data-centric loads. We will show how to apply these methods ...

  20. The Goal Specificity Effect on Strategy Use and Instructional Efficiency during Computer-Based Scientific Discovery Learning

    Science.gov (United States)

    Kunsting, Josef; Wirth, Joachim; Paas, Fred

    2011-01-01

    Using a computer-based scientific discovery learning environment on buoyancy in fluids we investigated the "effects of goal specificity" (nonspecific goals vs. specific goals) for two goal types (problem solving goals vs. learning goals) on "strategy use" and "instructional efficiency". Our empirical findings close an important research gap,…

  1. Using Cloud-Computing Applications to Support Collaborative Scientific Inquiry: Examining Pre-Service Teachers' Perceived Barriers to Integration

    Science.gov (United States)

    Donna, Joel D.; Miller, Brant G.

    2013-01-01

    Technology plays a crucial role in facilitating collaboration within the scientific community. Cloud-computing applications, such as Google Drive, can be used to model such collaboration and support inquiry within the secondary science classroom. Little is known about pre-service teachers' beliefs related to the envisioned use of collaborative,…

  2. The InSAR Scientific Computing Environment (ISCE): A Python Framework for Earth Science

    Science.gov (United States)

    Rosen, P. A.; Gurrola, E. M.; Agram, P. S.; Sacco, G. F.; Lavalle, M.

    2015-12-01

    The InSAR Scientific Computing Environment (ISCE, funded by NASA ESTO) provides a modern computing framework for geodetic image processing of InSAR data from a diverse array of radar satellites and aircraft. ISCE is both a modular, flexible, and extensible framework for building software components and applications and a toolbox of applications for processing raw or focused InSAR and Polarimetric InSAR data. The ISCE framework contains object-oriented Python components layered to construct Python InSAR components that manage legacy Fortran/C InSAR programs. Components are independently configurable in a layered manner to provide maximum control. Polymorphism is used to define a workflow in terms of abstract facilities for each processing step that are realized by specific components at run-time. This enables a single workflow to work on either raw or focused data from all sensors. ISCE can serve as the core of a production center to process Level-0 radar data to Level-3 products, but is amenable to interactive processing approaches that allow scientists to experiment with data to explore new ways of doing science with InSAR data. The NASA-ISRO SAR (NISAR) Mission will deliver data of unprecedented quantity and quality, making possible global-scale studies in climate research, natural hazards, and Earth's ecosystems. ISCE is planned as the foundational element in processing NISAR data, enabling a new class of analyses that take greater advantage of the long time and large spatial scales of these new data. NISAR will be but one mission in a constellation of radar satellites in the future delivering such data. ISCE currently supports all publicly available strip map mode space-borne SAR data since ERS and is expected to include support for upcoming missions. ISCE has been incorporated into two prototype cloud-based systems that have demonstrated its elasticity in addressing larger data processing problems in a "production" context and its ability to be ...
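    The run-time binding of abstract processing steps to concrete components that the abstract describes can be sketched generically in Python as follows (this illustrates the pattern only, not ISCE's actual API; all class and function names are invented):

```python
from abc import ABC, abstractmethod

# Generic sketch of workflow polymorphism: the workflow is written
# against an abstract facility, and a concrete component is bound at
# run time, so one workflow handles raw or already-focused data.

class Focuser(ABC):
    @abstractmethod
    def focus(self, scene):
        ...

class AlreadyFocused(Focuser):
    def focus(self, scene):
        return scene                      # focused input: nothing to do

class RangeDopplerFocuser(Focuser):
    def focus(self, scene):
        return f"focused({scene})"        # stand-in for a legacy processing step

def workflow(scene, focuser: Focuser):
    """Single workflow; behavior chosen by the component supplied."""
    return focuser.focus(scene)

print(workflow("raw_scene", RangeDopplerFocuser()))   # focused(raw_scene)
print(workflow("slc_scene", AlreadyFocused()))        # slc_scene
```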

  3. Certification of version 1.2 of the PORFLO-3 code for the WHC scientific and engineering computational center

    International Nuclear Information System (INIS)

    Kline, N.W.

    1994-01-01

    Version 1.2 of the PORFLO-3 Code has migrated from the Hanford Cray computer to workstations in the WHC Scientific and Engineering Computational Center. The workstation-based configuration and acceptance testing are inherited from the CRAY-based configuration. The purpose of this report is to document differences in the new configuration as compared to the parent Cray configuration, and summarize some of the acceptance test results which have shown that the migrated code is functioning correctly in the new environment

  4. [Text mining, a method for computer-assisted analysis of scientific texts, demonstrated by an analysis of author networks].

    Science.gov (United States)

    Hahn, P; Dullweber, F; Unglaub, F; Spies, C K

    2014-06-01

    Searching for relevant publications is becoming more difficult with the increasing number of scientific articles. Text mining, as a specific form of computer-based data analysis, may be helpful in this context. Highlighting relations between authors and finding relevant publications on a specific subject using text analysis programs are illustrated graphically by two worked examples. © Georg Thieme Verlag KG Stuttgart · New York.
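    One simple form of such an author-network analysis is counting co-authorship pairs across a set of publication records; a minimal sketch (with invented input data) follows:

```python
from collections import Counter
from itertools import combinations

# Minimal sketch of a co-authorship network: edge weight is the number
# of jointly authored papers. The records below are invented.

papers = [
    ["Hahn P", "Unglaub F"],
    ["Hahn P", "Spies C K", "Unglaub F"],
    ["Dullweber F", "Hahn P"],
]

edges = Counter()
for authors in papers:
    for pair in combinations(sorted(authors), 2):
        edges[pair] += 1

for (a, b), weight in edges.most_common():
    print(f"{a} -- {b}: {weight} joint paper(s)")
```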

  5. Scientific Grand Challenges: Challenges in Climate Change Science and the Role of Computing at the Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.; Johnson, Gary M.; Washington, Warren M.

    2009-07-02

    The U.S. Department of Energy (DOE) Office of Biological and Environmental Research (BER) in partnership with the Office of Advanced Scientific Computing Research (ASCR) held a workshop on the challenges in climate change science and the role of computing at the extreme scale, November 6-7, 2008, in Bethesda, Maryland. At the workshop, participants identified the scientific challenges facing the field of climate science and outlined the research directions of highest priority that should be pursued to meet these challenges. Representatives from the national and international climate change research community as well as representatives from the high-performance computing community attended the workshop. This group represented a broad mix of expertise. Of the 99 participants, 6 were from international institutions. Before the workshop, each of the four panels prepared a white paper, which provided the starting place for the workshop discussions. The four panels of workshop attendees devoted their efforts to the following themes: Model Development and Integrated Assessment; Algorithms and Computational Environment; Decadal Predictability and Prediction; Data, Visualization, and Computing Productivity. The recommendations of the panels are summarized in the body of this report.

  6. N286.7-99, A Canadian standard specifying software quality management system requirements for analytical, scientific, and design computer programs and its implementation at AECL

    International Nuclear Information System (INIS)

    Abel, R.

    2000-01-01

    Analytical, scientific, and design computer programs (referred to in this paper as 'scientific computer programs') are developed for use in a large number of ways by the user-engineer to support and prove engineering calculations and assumptions. These computer programs are subject to frequent modifications inherent in their application and are often used for critical calculations and analysis relative to safety and functionality of equipment and systems. N286.7-99(4) was developed to establish appropriate quality management system requirements to deal with the development, modification, and application of scientific computer programs. N286.7-99 provides particular guidance regarding the treatment of legacy codes

  7. SCEE 2008 book of abstracts. The 7. international conference on scientific computing in electrical engineering (SCEE 2008)

    Energy Technology Data Exchange (ETDEWEB)

    Roos, J.; Costa, L.R.J. (ed.)

    2008-09-15

    SCEE is an international conference series dedicated to Scientific Computing in Electrical Engineering. The 7th International Conference on Scientific Computing in Electrical Engineering (SCEE 2008) in Espoo, Finland, is organized by the Helsinki University of Technology (TKK); Faculty of Electronics, Communications and Automation (ECA); Department of Radio Science and Engineering (RAD); Circuit Theory Group. (SCEE 2008 web site: http://www.ct.tkk.fi/scee2008/). The aim of the SCEE 2008 conference is to bring together scientists from academia and industry with the goal of intensive discussions on modeling and numerical simulation of electronic circuits and of electromagnetic fields. The conference is mainly directed towards mathematicians and electrical engineers. The SCEE 2008 conference has the following four main topics: 1. Computational Electromagnetics (CE), 2. Circuit Simulation (CS), 3. Coupled Problems (CP), 4. Mathematical and Computational Methods (CM). The selection of abstracts in this book was carried out by the Program Committee; each abstract was reviewed by two or three reviewers. The authors of all accepted abstracts were invited to submit an extended full paper, which will be reviewed as well. The accepted full papers will later on be published in a separate post-conference book

  8. CUDA/GPU Technology : Parallel Programming For High Performance Scientific Computing

    OpenAIRE

    YUHENDRA; KUZE, Hiroaki; JOSAPHAT, Tetuko Sri Sumantyo

    2009-01-01

    [ABSTRACT] Graphics processing units (GPUs), originally designed for computer video cards, have emerged as the most powerful chips in a high-performance workstation. In terms of high-performance computation capabilities, GPUs deliver far more powerful performance than conventional CPUs by means of parallel processing. In 2007, the birth of the Compute Unified Device Architecture (CUDA) and CUDA-enabled GPUs by NVIDIA Corporation brought a revolution in general-purpose GPU a...

  9. Computer technologies of future teachers of fine art training as an object of scientific educational research

    Directory of Open Access Journals (Sweden)

    Bohdan Cherniavskyi

    2017-03-01

    Full Text Available The article deals with computer technology training, highlights the current state of computerization of the educational process in teacher training colleges, and reveals specific techniques for the professional training of teachers of fine arts to use computer technology in their teaching careers. Key words: methods of professional training, professional activities, computer technology training of future teachers of Fine Arts, the subject of research.

  10. UNEDF: Advanced Scientific Computing Transforms the Low-Energy Nuclear Many-Body Problem

    International Nuclear Information System (INIS)

    Stoitsov, Mario; Nam, Hai Ah; Nazarewicz, Witold; Bulgac, Aurel; Hagen, Gaute; Kortelainen, E.M.; Pei, Junchen; Roche, K.J.; Schunck, N.; Thompson, I.; Vary, J.P.; Wild, S.

    2011-01-01

    The UNEDF SciDAC collaboration of nuclear theorists, applied mathematicians, and computer scientists is developing a comprehensive description of nuclei and their reactions that delivers maximum predictive power with quantified uncertainties. This paper illustrates significant milestones accomplished by UNEDF through integration of the theoretical approaches, advanced numerical algorithms, and leadership class computational resources.

  11. Parallel scientific computing theory, algorithms, and applications of mesh based and meshless methods

    CERN Document Server

    Trobec, Roman

    2015-01-01

    This book concentrates on the synergy between computer science and numerical analysis. It is written to provide a firm understanding of the described approaches to computer scientists, engineers or other experts who have to solve real problems. The meshless solution approach is described in more detail, with a description of the required algorithms and the methods that are needed for the design of an efficient computer program. Most of the details are demonstrated on solutions of practical problems, from basic to more complicated ones. This book will be a useful tool for any reader interested ...

  12. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, Forest M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bochev, Pavel B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cameron-Smith, Philip J.. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Easter, Richard C [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Elliott, Scott M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ghan, Steven J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Xiaohong [Univ. of Wyoming, Laramie, WY (United States); Lowrie, Robert B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lucas, Donald D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ma, Po-lun [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sacks, William J. [National Center for Atmospheric Research (NCAR), Boulder, CO (United States); Shrivastava, Manish [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Singh, Balwinder [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tautges, Timothy J. [Argonne National Lab. (ANL), Argonne, IL (United States); Taylor, Mark A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Vertenstein, Mariana [National Center for Atmospheric Research (NCAR), Boulder, CO (United States); Worley, Patrick H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2014-01-15

    The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.

  13. Scientific Computers at the Helsinki University of Technology during the Post Pioneering Stage

    Science.gov (United States)

    Nykänen, Panu; Andersin, Hans

    The paper describes the process leading from the pioneering phase when the university was free to develop and build its own computers through the period when the university was dependent on cooperation with the local computer companies to the stage when a bureaucratic state organization took over the power to decide on acquiring computing equipment to the universities. This stage ended in the late 1970s when computing power gradually became a commodity that the individual laboratories and research projects could acquire just like any resource. This development paralleled the situation in many other countries and universities as well. We have chosen the Helsinki University of Technology (TKK) as a case to illustrate this development process, which for the researchers was very annoying and frustrating when it happened.

  14. Topic 14+16: High-performance and scientific applications and extreme-scale computing (Introduction)

    KAUST Repository

    Downes, Turlough P.; Roller, Sabine P.; Seitsonen, Ari Paavo; Valcke, Sophie; Keyes, David E.; Sawley, Marie Christine; Schulthess, Thomas C.; Shalf, John M.

    2013-01-01

    ... applications and algorithms to address the varied, complex and increasing challenges of modern research throughout both the "hard" and "soft" sciences. This necessitates being able to use large numbers of compute nodes, many of which are equipped with accelerators ...

  15. Sudden Cardiac Risk Stratification with Electrocardiographic Indices - A Review on Computational Processing, Technology Transfer, and Scientific Evidence

    Directory of Open Access Journals (Sweden)

    Francisco Javier Gimeno-Blanes

    2016-03-01

    Full Text Available Great effort has been devoted in recent years to the development of sudden cardiac risk predictors based on electric cardiac signals, mainly obtained from electrocardiogram (ECG) analysis. These prediction techniques are still seldom used in clinical practice, however, partly due to their limited diagnostic accuracy and to the lack of consensus about the appropriate computational signal processing implementation. This paper takes a three-fold approach, based on ECG indices, to structure this review on sudden cardiac risk stratification: first, the computational techniques that have been widely proposed in the technical literature for obtaining these indices; second, the scientific evidence, which, although supported by observational clinical studies, is not always representative enough; and third, the limited technology transfer of academy-accepted algorithms, which requires further consideration for future systems. We focus on three families of ECG-derived indices that are tackled from these viewpoints, namely heart rate turbulence, heart rate variability, and T-wave alternans. In terms of computational algorithms, we still need clearer scientific evidence, standardization, and benchmarking, resting on advanced algorithms applied over large and representative datasets. New scenarios such as electronic health records, big data, long-term monitoring, and cloud databases will eventually open new frameworks to foresee suitable new paradigms in the near future.
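    As a small, hedged illustration of the kind of index the review covers, here are two standard time-domain heart rate variability measures (SDNN and RMSSD) computed from a series of RR intervals; the interval values below are invented:

```python
import math

# Two standard time-domain heart rate variability indices, computed
# from RR intervals in milliseconds; the sample series is invented.

def sdnn(rr):
    """Standard deviation of the RR intervals."""
    mean = sum(rr) / len(rr)
    return math.sqrt(sum((x - mean) ** 2 for x in rr) / (len(rr) - 1))

def rmssd(rr):
    """Root mean square of successive RR differences."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr_ms = [812, 845, 790, 860, 825, 840]
print(f"SDNN = {sdnn(rr_ms):.1f} ms, RMSSD = {rmssd(rr_ms):.1f} ms")
```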

  16. Reduced-order modeling (ROM) for simulation and optimization powerful algorithms as key enablers for scientific computing

    CERN Document Server

    Milde, Anja; Volkwein, Stefan

    2018-01-01

    This edited monograph collects research contributions and addresses the advancement of efficient numerical procedures in the area of model order reduction (MOR) for simulation, optimization and control. The topical scope includes, but is not limited to, new out-of-the-box algorithmic solutions for scientific computing, e.g. reduced basis methods for industrial problems and MOR approaches for electrochemical processes. The target audience comprises research experts and practitioners in the field of simulation, optimization and control, but the book may also be beneficial for graduate students alike.

  17. Science gateways for distributed computing infrastructures development framework and exploitation by scientific user communities

    CERN Document Server

    Kacsuk, Péter

    2014-01-01

    The book describes the science gateway building technology developed in the SCI-BUS European project and its adoption and customization method, by which user communities, such as biologists, chemists, and astrophysicists, can build customized, domain-specific science gateways. Many aspects of the core technology are explained in detail, including its workflow capability, job submission mechanism to various grids and clouds, and its data transfer mechanisms among several distributed infrastructures. The book will be useful for scientific researchers and IT professionals engaged in the development ...

  18. Portable computing - A fielded interactive scientific application in a small off-the-shelf package

    Science.gov (United States)

    Groleau, Nicolas; Hazelton, Lyman; Frainier, Rich; Compton, Michael; Colombano, Silvano; Szolovits, Peter

    1993-01-01

    Experience with the design and implementation of a portable computing system for STS crew-conducted science is discussed. Principal-Investigator-in-a-Box (PI) will help the SLS-2 astronauts perform vestibular (human orientation system) experiments in flight. PI is an interactive system that provides data acquisition and analysis, experiment step rescheduling, and various other forms of reasoning to astronaut users. The hardware architecture of PI consists of a computer and an analog interface box. 'Off-the-shelf' equipment is employed in the system wherever possible in an effort to use widely available tools and then to add custom functionality and application codes to them. Other projects which can help prospective teams to learn more about portable computing in space are also discussed.

  19. The Impact of Misspelled Words on Automated Computer Scoring: A Case Study of Scientific Explanations

    Science.gov (United States)

    Ha, Minsu; Nehm, Ross H.

    2016-06-01

    Automated computerized scoring systems (ACSSs) are being increasingly used to analyze text in many educational settings. Nevertheless, the impact of misspelled words (MSW) on scoring accuracy remains to be investigated in many domains, particularly jargon-rich disciplines such as the life sciences. Empirical studies confirm that MSW are a pervasive feature of human-generated text and that despite improvements, spell-check and auto-replace programs continue to be characterized by significant errors. Our study explored four research questions relating to MSW and text-based computer assessments: (1) Do English language learners (ELLs) produce equivalent magnitudes and types of spelling errors as non-ELLs? (2) To what degree do MSW impact concept-specific computer scoring rules? (3) What impact do MSW have on computer scoring accuracy? and (4) Are MSW more likely to impact false-positive or false-negative feedback to students? We found that although ELLs produced twice as many MSW as non-ELLs, MSW were relatively uncommon in our corpora. The MSW in the corpora were found to be important features of the computer scoring models. Although MSW did not significantly or meaningfully impact computer scoring efficacy across nine different computer scoring models, MSW had a greater impact on the scoring algorithms for naïve ideas than key concepts. Linguistic and concept redundancy in student responses explains the weak connection between MSW and scoring accuracy. Lastly, we found that MSW tend to have a greater impact on false-positive feedback. We discuss the implications of these findings for the development of next-generation science assessments.
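    One common mitigation the study's findings speak to is normalizing misspelled domain terms before applying concept-specific scoring rules; a minimal sketch using fuzzy matching against a domain lexicon (lexicon, response, and cutoff are invented for illustration):

```python
import difflib

# Sketch of one mitigation suggested by the findings: map student
# tokens to the nearest term in a domain lexicon before applying
# keyword-based scoring rules.

lexicon = ["mutation", "selection", "heritable", "variation", "fitness"]

def normalize(token, cutoff=0.8):
    match = difflib.get_close_matches(token.lower(), lexicon, n=1, cutoff=cutoff)
    return match[0] if match else token.lower()

response = "natural selektion acts on heritible variaton"
print([normalize(t) for t in response.split()])
# ['natural', 'selection', 'acts', 'on', 'heritable', 'variation']
```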

  20. Computer-aided software understanding systems to enhance confidence of scientific codes

    International Nuclear Information System (INIS)

    Sheng, G.; Oeren, T.I.

    1991-01-01

    A unique characteristic of nuclear waste disposal is the very long time span over which the combined engineered and natural containment system must remain effective: hundreds of thousands of years. Since there is no precedent in human history for such an endeavour, simulation with the use of computers is the only means we have of forecasting possible future outcomes quantitatively. The need for reliable models and software to make such forecasts so far into the future is obvious. One of the critical elements necessary to ensure reliability is the degree of reviewability of the computer program. Among others, there are two very important reasons for this. Firstly, if there is to be any chance at all of validating the conceptual models as implemented by the computer code, peer reviewers must be able to see and understand what the program is doing. It is all but impossible to achieve this understanding by just looking at the code, due to possible unfamiliarity with the language and often also to the length and complexity of the code. Secondly, a thorough understanding of the code is also necessary to carry out code maintenance activities, which include, among others, error detection, error correction and code modification for purposes of enhancing its performance, functionality or adapting it to a changed environment. The emerging concepts of computer-aided software understanding and reverse engineering can answer precisely these needs. This paper will discuss the role they can play in enhancing the confidence one has in computer codes, and several examples will be provided. Finally, a brief discussion of combining state-of-the-art forward engineering systems with reverse engineering systems will show how powerfully they can contribute to the overall quality assurance of a computer program. (13 refs., 7 figs.)

  1. Distributed management of scientific projects - An analysis of two computer-conferencing experiments at NASA

    Science.gov (United States)

    Vallee, J.; Gibbs, B.

    1976-01-01

    Between August 1975 and March 1976, two NASA projects with geographically separated participants used a computer-conferencing system developed by the Institute for the Future for portions of their work. Monthly usage statistics for the system were collected in order to examine the group and individual participation figures for all conferences. The conference transcripts were analysed to derive observations about the use of the medium. In addition to the results of these analyses, the attitudes of users and the major components of the costs of computer conferencing are discussed.

  2. SciCADE 95: International conference on scientific computation and differential equations

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-31

    This report consists of abstracts from the conference. Topics include algorithms, computer codes, and numerical solutions for differential equations. Linear and nonlinear as well as boundary-value and initial-value problems are covered. Various applications of these problems are also included.

  3. The Transition and Adoption to Modern Programming Concepts for Scientific Computing in Fortran

    Directory of Open Access Journals (Sweden)

    Charles D. Norton

    2007-01-01

    Full Text Available This paper describes our experiences in the early exploration of modern concepts introduced in Fortran90 for large-scale scientific programming. We review our early work in expressing object-oriented concepts based on the new Fortran90 constructs – foreign to most programmers at the time – our experimental work in applying them to various applications, the impact on the WG5/J3 standards committees to consider formalizing object-oriented constructs for later versions of Fortran, and work in exploring how other modern programming techniques such as Design Patterns can and have impacted our software development. Applications will be drawn from plasma particle simulation and finite element adaptive mesh refinement for solid earth crustal deformation modeling.

  4. Porting of Scientific Applications to Grid Computing on GridWay

    Directory of Open Access Journals (Sweden)

    J. Herrera

    2005-01-01

    Full Text Available The expansion and adoption of Grid technologies is prevented by the lack of a standard programming paradigm to port existing applications among different environments. The Distributed Resource Management Application API has been proposed to aid the rapid development and distribution of these applications across different Distributed Resource Management Systems. In this paper we describe an implementation of the DRMAA standard on a Globus-based testbed, and show its suitability to express typical scientific applications, like High-Throughput and Master-Worker applications. The DRMAA routines are supported by the functionality offered by the GridWay2 framework, which provides the runtime mechanisms needed for transparently executing jobs on a dynamic Grid environment based on Globus. As cases of study, we consider the implementation with DRMAA of a bioinformatics application, a genetic algorithm and the NAS Grid Benchmarks.
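    As a hedged sketch of the High-Throughput pattern the paper expresses with DRMAA: N independent jobs submitted through the scheduler-neutral API and then awaited. The paper itself used the C API on a Globus/GridWay testbed; this sketch assumes the drmaa Python binding and a DRMAA-enabled resource manager, and the command is a placeholder.

```python
import drmaa  # Python binding of the DRMAA standard

# High-Throughput submission sketch: one job per parameter value,
# submitted through the scheduler-neutral DRMAA API.

with drmaa.Session() as session:
    template = session.createJobTemplate()
    template.remoteCommand = "/bin/echo"
    job_ids = []
    for i in range(10):
        template.args = [f"task-{i}"]    # template is snapshotted at submit
        job_ids.append(session.runJob(template))
    for jid in job_ids:
        info = session.wait(jid, drmaa.Session.TIMEOUT_WAIT_FOREVER)
        print(jid, "exit status:", info.exitStatus)
    session.deleteJobTemplate(template)
```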

  5. From curve fitting to machine learning an illustrative guide to scientific data analysis and computational intelligence

    CERN Document Server

    Zielesny, Achim

    2016-01-01

    This successful book provides in its second edition an interactive and illustrative guide from two-dimensional curve fitting to multidimensional clustering and machine learning with neural networks or support vector machines. Along the way, topics like mathematical optimization or evolutionary algorithms are touched upon. All concepts and ideas are outlined in a clear-cut manner with graphically depicted plausibility arguments and a little elementary mathematics. The major topics are extensively outlined with exploratory examples and applications. The primary goal is to be as illustrative as possible without hiding problems and pitfalls but to address them. The character of an illustrative cookbook is complemented with specific sections that address more fundamental questions like the relation between machine learning and human intelligence. All topics are completely demonstrated with the computing platform Mathematica and the Computational Intelligence Packages (CIP), a high-level function library developed with M...

  6. Tools for 3D scientific visualization in computational aerodynamics at NASA Ames Research Center

    International Nuclear Information System (INIS)

    Bancroft, G.; Plessel, T.; Merritt, F.; Watson, V.

    1989-01-01

    Hardware, software, and techniques used by the Fluid Dynamics Division (NASA) for performing visualization of computational aerodynamics, which can be applied to the visualization of flow fields from computer simulations of fluid dynamics about the Space Shuttle, are discussed. Three visualization techniques applied, post-processing, tracking, and steering, are described, as well as the post-processing software packages used, PLOT3D, SURF (Surface Modeller), GAS (Graphical Animation System), and FAST (Flow Analysis software Toolkit). Using post-processing methods a flow simulation was executed on a supercomputer and, after the simulation was complete, the results were processed for viewing. It is shown that the high-resolution, high-performance three-dimensional workstation combined with specially developed display and animation software provides a good tool for analyzing flow field solutions obtained from supercomputers. 7 refs

  7. Eighth SIAM conference on parallel processing for scientific computing: Final program and abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    This SIAM conference is the premier forum for developments in parallel numerical algorithms, a field that has seen very lively and fruitful developments over the past decade, and whose health is still robust. Themes for this conference were: combinatorial optimization; data-parallel languages; large-scale parallel applications; message-passing; molecular modeling; parallel I/O; parallel libraries; parallel software tools; parallel compilers; particle simulations; problem-solving environments; and sparse matrix computations.

  8. USSR and Eastern Europe Scientific Abstracts: Cybernetics, Computers, and Automation Technology No. 25

    Science.gov (United States)

    1976-12-01

    number). In addition, the microcomputer works with constants (the "K" key) and negative numbers. Performance time is less than 0.5 seconds for all ... been added qualitative evaluations such as its suitability for mechanical milking, the albumin content in its milk, its resistance to mastitis, and ... opinion it is advisable to create at the All-Union Academy of Agricultural Sciences imeni V. I. Lenin a special council on the use of computer ...

  9. INFN-Pisa scientific computation environment (GRID, HPC and Interactive Analysis)

    International Nuclear Information System (INIS)

    Arezzini, S; Carboni, A; Caruso, G; Ciampa, A; Coscetti, S; Mazzoni, E; Piras, S

    2014-01-01

    The INFN-Pisa Tier2 infrastructure is described, optimized not only for GRID CPU and storage access, but also for more interactive use of the resources in order to provide good solutions for the final data analysis step. The data center, equipped with about 6700 production cores, permits the use of modern analysis techniques realized via advanced statistical tools (like RooFit and RooStats) implemented in multicore systems. In particular, POSIX file storage access integrated with standard SRM access is provided. The unified storage infrastructure, based on GPFS and Xrootd, is therefore described; it is used both as the SRM data repository and for interactive POSIX access. Such a common infrastructure allows users transparent access to the Tier2 data for their interactive analysis. The organization of a specialized many-core CPU facility devoted to interactive analysis is also described, along with the login mechanism, which is integrated with the INFN-AAI (national INFN infrastructure) to extend site access and use to a geographically distributed community. This infrastructure also serves as a national computing facility for the INFN theoretical community, enabling a synergic use of computing and storage resources. Our center, initially developed for the HEP community, is now growing and also includes fully integrated HPC resources. In recent years a cluster facility (1000 cores, parallel use via an InfiniBand connection) has been installed and managed, and we are now updating this facility to provide resources for all the intermediate-level HPC computing needs of the INFN national theoretical community.

  10. Development of high performance scientific components for interoperability of computing packages

    Energy Technology Data Exchange (ETDEWEB)

    Gulabani, Teena Pratap [Iowa State Univ., Ames, IA (United States)

    2008-01-01

    Three major high-performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software designs of each of these packages. Chemistry algorithms are hard and time-consuming to develop; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: quantum mechanics and molecular mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.

  11. Statistical physics of fracture: scientific discovery through high-performance computing

    International Nuclear Information System (INIS)

    Kumar, Phani; Nukala, V V; Simunovic, Srdan; Mills, Richard T

    2006-01-01

    The paper presents the state-of-the-art algorithmic developments for simulating the fracture of disordered quasi-brittle materials using discrete lattice systems. Large scale simulations are often required to obtain accurate scaling laws; however, due to computational complexity, simulations using the traditional algorithms were limited to small system sizes. We have developed two algorithms: a multiple sparse Cholesky downdating scheme for simulating 2D random fuse model systems, and a block-circulant preconditioner for simulating 3D random fuse model systems. Using these algorithms, we were able to simulate fracture of the largest-ever lattice system sizes (L = 1024 in 2D, and L = 64 in 3D) with extensive statistical sampling. Our recent simulations on 1024 processors of Cray-XT3 and IBM Blue-Gene/L have further enabled us to explore fracture of 3D lattice systems of size L = 200, which is a significant computational achievement. These largest-ever numerical simulations have enhanced our understanding of the physics of fracture; in particular, we analyze damage localization and its deviation from percolation behavior, scaling laws for damage density, universality of the fracture strength distribution, the size effect on the mean fracture strength, and finally the scaling of crack surface roughness.
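    A toy version of the lattice setup behind such simulations, assuming unit conductances and a unit voltage drop imposed across a small 2D grid (the node potentials come from the reduced graph Laplacian; the full random fuse model would draw a random threshold per bond and repeatedly burn the most overloaded bond and re-solve; the size and names are illustrative):

```python
import numpy as np

# Toy 2D fuse lattice: unit conductances on an L x L grid, top row held
# at V = 1 and bottom row at V = 0, node potentials from the reduced
# graph Laplacian.

L = 8
n = L * L
idx = lambda i, j: i * L + j

G = np.zeros((n, n))                      # graph Laplacian of the lattice
for i in range(L):
    for j in range(L):
        for di, dj in ((0, 1), (1, 0)):   # right and down neighbours
            ii, jj = i + di, j + dj
            if ii < L and jj < L:
                a, b = idx(i, j), idx(ii, jj)
                G[a, a] += 1; G[b, b] += 1
                G[a, b] -= 1; G[b, a] -= 1

top = [idx(0, j) for j in range(L)]
bottom = [idx(L - 1, j) for j in range(L)]
free = [k for k in range(n) if k not in set(top + bottom)]

V = np.zeros(n)
V[top] = 1.0                              # Dirichlet boundary values
rhs = -G[np.ix_(free, top)] @ V[top]      # move known potentials to RHS
V[free] = np.linalg.solve(G[np.ix_(free, free)], rhs)
print(V.reshape(L, L).round(2))           # potentials interpolate 1 -> 0
```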

  12. The Automatic Parallelisation of Scientific Application Codes Using a Computer Aided Parallelisation Toolkit

    Science.gov (United States)

    Ierotheou, C.; Johnson, S.; Leggett, P.; Cross, M.; Evans, E.; Jin, Hao-Qiang; Frumkin, M.; Yan, J.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    The shared-memory programming model is a very effective way to achieve parallelism on shared-memory parallel computers. Historically, the lack of a programming standard for using directives and the rather limited performance due to scalability have affected the take-up of this programming-model approach. Significant progress has been made in hardware and software technologies, and as a result the performance of parallel programs with compiler directives has also improved. The introduction of an industrial standard for shared-memory programming with directives, OpenMP, has also addressed the issue of portability. In this study, we have extended the computer-aided parallelization toolkit (developed at the University of Greenwich) to automatically generate OpenMP-based parallel programs with nominal user assistance. We outline the way in which loop types are categorized and how efficient OpenMP directives can be defined and placed using the in-depth interprocedural analysis carried out by the toolkit. We also discuss the application of the toolkit to the NAS Parallel Benchmarks and a number of real-world application codes. This work not only demonstrates the great potential of using the toolkit to quickly parallelize serial programs but also the good performance achievable on up to 300 processors for hybrid message-passing and directive-based parallelizations.
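    The toolkit inserts directives around loops whose iterations are provably independent; as a loose Python analogue of that transformation (the toolkit itself targets OpenMP in Fortran/C codes), here is a serial loop and a parallelized counterpart, with f standing in for the loop body:

```python
from concurrent.futures import ProcessPoolExecutor

# Loose analogue of the loop transformation the toolkit automates for
# OpenMP: a loop with independent iterations dispatched across workers.

def f(i):
    return i * i          # no cross-iteration dependence: safe to parallelize

def serial(n):
    return [f(i) for i in range(n)]

def parallel(n, workers=4):
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(f, range(n)))

if __name__ == "__main__":
    assert serial(1000) == parallel(1000)  # same result, parallel schedule
```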

  13. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L

    2009-05-01

    Many complex systems of importance to the U.S. Department of Energy consist of networks of discrete components. Examples are cyber networks, such as the internet and local area networks over which nearly all DOE scientific, technical and administrative data must travel, the electric power grid, social networks whose behavior can drive energy demand, and biological networks such as genetic regulatory networks and metabolic networks. In spite of the importance of these complex networked systems to all aspects of DOE's operations, the scientific basis for understanding these systems lags seriously behind the strong foundations that exist for the 'physically-based' systems usually associated with DOE research programs that focus on such areas as climate modeling, fusion energy, high-energy and nuclear physics, nano-science, combustion, and astrophysics. DOE has a clear opportunity to develop a similarly strong scientific basis for understanding the structure and dynamics of networked systems by supporting a strong basic research program in this area. Such knowledge will provide a broad basis for, e.g., understanding and quantifying the efficacy of new security approaches for computer networks, improving the design of computer or communication networks to be more robust against failures or attacks, detecting potential catastrophic failure on the power grid and preventing or mitigating its effects, understanding how populations will respond to the availability of new energy sources or changes in energy policy, and detecting subtle vulnerabilities in large software systems to intentional attack. This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE. It will focus principally on four research areas: (1) understanding network structure, (2) understanding network dynamics, (3) predictive modeling and simulation for complex

  14. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    International Nuclear Information System (INIS)

    Brown, D.L.

    2009-01-01

    Many complex systems of importance to the U.S. Department of Energy consist of networks of discrete components. Examples are cyber networks, such as the internet and local area networks over which nearly all DOE scientific, technical and administrative data must travel, the electric power grid, social networks whose behavior can drive energy demand, and biological networks such as genetic regulatory networks and metabolic networks. In spite of the importance of these complex networked systems to all aspects of DOE's operations, the scientific basis for understanding these systems lags seriously behind the strong foundations that exist for the 'physically-based' systems usually associated with DOE research programs that focus on such areas as climate modeling, fusion energy, high-energy and nuclear physics, nano-science, combustion, and astrophysics. DOE has a clear opportunity to develop a similarly strong scientific basis for understanding the structure and dynamics of networked systems by supporting a strong basic research program in this area. Such knowledge will provide a broad basis for, e.g., understanding and quantifying the efficacy of new security approaches for computer networks, improving the design of computer or communication networks to be more robust against failures or attacks, detecting potential catastrophic failure on the power grid and preventing or mitigating its effects, understanding how populations will respond to the availability of new energy sources or changes in energy policy, and detecting subtle vulnerabilities in large software systems to intentional attack. This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE. It will focus principally on four research areas: (1) understanding network structure, (2) understanding network dynamics, (3) predictive modeling and simulation for complex networked systems

  15. An Analysis on the Effect of Computer Self-Efficacy over Scientific Research Self-Efficacy and Information Literacy Self-Efficacy

    Science.gov (United States)

    Tuncer, Murat

    2013-01-01

    The present research investigates reciprocal relations among computer self-efficacy, scientific research self-efficacy, and information literacy self-efficacy. Research findings demonstrate that, according to standardized regression coefficients, computer self-efficacy has a positive effect on information literacy self-efficacy. Likewise it has been detected…

  16. UIMX: A User Interface Management System For Scientific Computing With X Windows

    Science.gov (United States)

    Foody, Michael

    1989-09-01

    Applications with iconic user interfaces (for example, interfaces with pulldown menus, radio buttons, and scroll bars), such as those found on Apple's Macintosh computer and on the IBM PC under Microsoft's Presentation Manager, have become very popular, and for good reason. They are much easier to use than applications with traditional keyboard-oriented interfaces, so training costs are much lower and just about anyone can use them. They are standardized between applications, so once you learn one application you are well along the way to learning another; the use of one reinforces the interface elements common to all, and, as a result, you remember how to use them longer. Finally, for the developer, support costs can be much lower because of this ease of use.

  17. The MicroGrid: A Scientific Tool for Modeling Computational Grids

    Directory of Open Access Journals (Sweden)

    H.J. Song

    2000-01-01

    Full Text Available The complexity and dynamic nature of the Internet (and the emerging Computational Grid) demand that middleware and applications adapt to the changes in configuration and availability of resources. However, to the best of our knowledge there are no simulation tools which support systematic exploration of dynamic Grid software (or Grid resource) behavior. We describe our vision and initial efforts to build tools to meet these needs. Our MicroGrid simulation tools enable Globus applications to be run in arbitrary virtual grid resource environments, enabling broad experimentation. We describe the design of these tools, and their validation on micro-benchmarks, the NAS parallel benchmarks, and an entire Grid application. These validation experiments show that the MicroGrid can match actual experiments within a few percent (2% to 4%).

  18. The role of scientific middleware in the future of HEP computing

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    In the 18 months since the CHEP03 meeting in San Diego, the HEP community deployed the current generation of grid technologies in a variety of settings. Legacy software as well as recently developed applications were interfaced with middleware tools to deliver end-to-end capabilities to HEP experiments in different stages of their life cycles. In a series of data challenges, reprocessing efforts and data distribution activities the community demonstrated the benefits distributed computing can offer and the power a range of middleware tools can deliver. After running millions of jobs, moving terabytes of data, creating millions of files and resolving hundreds of bug reports, the community also exposed the limitations of these middleware tools. As we move to the next level of challenges, requirements and expectations, we must also examine the methods and procedures we employ to develop, implement and maintain our common suite of middleware tools. The talk will focus on the role common middleware ...

  19. Man versus Computer: Difference of the Essences. The Problem of the Scientific Creation

    Directory of Open Access Journals (Sweden)

    Temur Z. Kalanov

    2017-07-01

Full Text Available This study proposes a critical analysis of the creation of Artificial Intelligence (AI) and of Artificial General Intelligence (AGI). The unity of formal logic and rational dialectics is the methodological basis of the analysis. The main results of the analysis are as follows: (1) the model of man represents the unity of two material aspects: the "physiological body" (controllable aspect) and the "psychical body" (controlling aspect); (2) the "psychical body" is the subsystem "subconsciousness + consciousness"; (3) in the comprehensive sense of the word, thinking is an attribute of the complete system "physiological body + psychical body + environment"; (4) in the broad sense of the word, thinking and creativity are an essential feature of the subsystem "subconsciousness + consciousness"; (5) in the narrow (concise) sense of the word, thinking and creativity are attributes of the instinct of the conservation (preservation, retention, maintenance) of life (i.e., the self-preservation instinct, the survival instinct); the instinct of the conservation of life exists in subconsciousness; (6) the instinct of life conservation is a system of elementary (basic) instincts; thinking is an attribute of each elementary instinct; (7) the mechanism of thinking and the essence of creation cannot be cognized by men; (8) a computer as a device cannot think and create (in particular, it cannot prove theorems), because a computer does not have the subconsciousness; (9) the modeling of human thinking, Human Intellect, and the creation of AI and AGI are impossible because the essential properties of the complete system "man + environment" cannot be cognized and modeled; (10) the existence of AI and AGI conflicts with the essence of thinking; (11) the existence of AI and AGI contradicts formal-logical and rational-dialectical laws.

  20. Modular Approaches to Earth Science Scientific Computing: 3D Electromagnetic Induction Modeling as an Example

    Science.gov (United States)

    Tandon, K.; Egbert, G.; Siripunvaraporn, W.

    2003-12-01

We are developing a modular system for three-dimensional inversion of electromagnetic (EM) induction data, using an object-oriented programming approach. This approach allows us to modify the individual components of the proposed inversion scheme and to reuse the components for a variety of problems in earth science computing, however diverse they might be. In particular, the modularity allows us to (a) change modeling codes independently of inversion algorithm details; (b) experiment with new inversion algorithms; and (c) modify the way prior information is imposed in the inversion to test competing hypotheses and techniques required to solve an earth science problem. Our initial code development is for EM induction equations on a staggered grid, using iterative solution techniques in 3D. An example illustrated here is an experiment with the sensitivity of 3D magnetotelluric inversion to uncertainties in the boundary conditions required for regional induction problems. These boundary conditions should reflect the large-scale geoelectric structure of the study area, which is usually poorly constrained. In general, for inversion of MT data one fixes boundary conditions at the edge of the model domain and adjusts the earth's conductivity structure within the modeling domain. Allowing for errors in the specification of the open boundary values is simple in principle, but no existing inversion codes that we are aware of have this feature. Adding such a feature is straightforward within the context of the modular approach. More generally, a modular approach provides an efficient methodology for setting up earth science computing problems to test various ideas. As a concrete illustration relevant to EM induction problems, we investigate the sensitivity of MT data near the San Andreas Fault at Parkfield (California) to uncertainties in the regional geoelectric structure.
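
    A minimal sketch of the modular design this abstract describes; the class and method names are illustrative, not taken from the authors' code. The inversion algorithm sees only abstract interfaces, so the forward modeling code or the boundary-condition treatment can be swapped independently:

```python
# Illustrative sketch of a modular inversion architecture (hypothetical names).
from abc import ABC, abstractmethod

class ForwardModel(ABC):
    """Modeling code, e.g. a staggered-grid 3D EM induction solver."""
    @abstractmethod
    def predict(self, conductivity, boundary): ...

class BoundaryCondition(ABC):
    """Regional boundary values; may be fixed or treated as inversion unknowns."""
    @abstractmethod
    def values(self, params): ...

class Inversion:
    """Depends only on the abstract interfaces above, so each component can
    be replaced without touching the others."""
    def __init__(self, forward: ForwardModel, boundary: BoundaryCondition):
        self.forward = forward
        self.boundary = boundary

    def misfit(self, params, data):
        pred = self.forward.predict(params['sigma'], self.boundary.values(params))
        return sum((p - d) ** 2 for p, d in zip(pred, data))

class ToyModel(ForwardModel):
    def predict(self, conductivity, boundary):
        # toy response: data scale with conductivity plus boundary leakage
        return [conductivity * x + boundary for x in (1.0, 2.0, 3.0)]

class FixedBoundary(BoundaryCondition):
    def values(self, params):
        return params.get('boundary', 0.0)  # could instead be an inversion unknown

inv = Inversion(ToyModel(), FixedBoundary())
print(inv.misfit({'sigma': 2.0, 'boundary': 0.1}, [2.0, 4.0, 6.0]))
```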

  1. A Computational Unification of Scientific Law:. Spelling out a Universal Semantics for Physical Reality

    Science.gov (United States)

    Marcer, Peter J.; Rowlands, Peter

    2013-09-01

The principal criteria Cn (n = 1 to 23) and grammatical production rules are set out for a universal computational rewrite language spelling out a semantic description of an emergent, self-organizing architecture for the cosmos. These language productions already predicate: (1) Einstein's conservation law of energy, momentum and mass and, subsequently, (2) with respect to gauge-invariant relativistic space time (both Lorentz special and Einstein general); (3) Standard Model elementary particle physics; (4) the periodic table of the elements and chemical valence; and (5) the molecular biological basis of the DNA/RNA genetic code; so enabling the Cybernetic Machine Specialist Group's Mission Statement premise (6) that natural semantic language thinking at the higher level of the self-organized emergent chemical molecular complexity of the human brain (only surpassed by that of the cosmos itself!) would be realized (7) by this same universal semantic language via (8) an architecture of a conscious human brain/mind and self which, it predicates, consists of its neural/glia and microtubule substrates respectively, so as to endow it with (9) the intelligent semantic capability to specify, symbolize, spell out and understand the cosmos that conceived it; and (10) provide a quantum physical explanation of consciousness and of how (11) the dichotomy between first-person subjectivity and third-person objectivity, the 'hard problem', is resolved.

  2. A dry EEG-system for scientific research and brain-computer interfaces

    Directory of Open Access Journals (Sweden)

    Thorsten Oliver Zander

    2011-05-01

Full Text Available Although it ranks among the oldest tools in neuroscientific research, electroencephalography (EEG) still forms the method of choice in a wide variety of clinical and research applications. In the context of Brain-Computer Interfacing (BCI), EEG recently has become a tool to enhance Human-Machine Interaction (HMI). EEG could be employed in a wider range of environments, especially for the use of BCI systems in a clinical context or at the homes of patients. However, the application of EEG in these contexts is impeded by the cumbersome preparation of the electrodes with conductive gel that is necessary to lower the impedance between electrodes and scalp. Dry electrodes could provide a solution to this barrier and allow for EEG applications outside the laboratory. In addition, dry electrodes may reduce the time needed for neurological exams in clinical practice. This study evaluates a prototype of a three-channel dry electrode EEG system, comparing it to state-of-the-art conventional EEG electrodes. Two experimental paradigms were used: first, Event-Related Potentials (ERPs) were investigated with a variant of the oddball paradigm. Second, features of the frequency domain were compared by a paradigm inducing occipital alpha. Furthermore, both paradigms were used to evaluate BCI classification accuracies of both EEG systems. Amplitude and temporal structure of ERPs as well as features in the frequency domain did not differ significantly between the EEG systems. BCI classification accuracies were equally high in both systems when the frequency domain was considered. With respect to the oddball classification accuracy, there were slight differences between the wet and dry electrode systems. We conclude that the tested dry electrodes were capable of detecting EEG signals with good quality and that these signals can be used for research or BCI applications. Easy-to-handle electrodes may help to foster the use of EEG among a wider range of potential users.
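
    As an illustration of the frequency-domain comparison described above, a minimal sketch computing occipital alpha (8-12 Hz) band power with Welch's method; the sampling rate, the synthetic signals, and the band limits are assumptions for illustration, not the study's actual pipeline:

```python
# Hedged sketch: alpha band power from two recordings (e.g. gel vs. dry).
import numpy as np
from scipy.signal import welch

def alpha_power(eeg, fs=250.0, band=(8.0, 12.0)):
    """Mean power spectral density in the alpha band for one channel."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

rng = np.random.default_rng(1)
t = np.arange(0, 60, 1 / 250.0)
wet = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)  # synthetic 10 Hz alpha
dry = np.sin(2 * np.pi * 10 * t) + 0.7 * rng.normal(size=t.size)  # same rhythm, more noise
print(alpha_power(wet), alpha_power(dry))
```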

  3. Towards Monitoring-as-a-service for Scientific Computing Cloud applications using the ElasticSearch ecosystem

    CERN Document Server

    Bagnasco, S; Guarise, A; Lusso, S; Masera, M; Vallero, S

    2015-01-01

    The INFN computing centre in Torino hosts a private Cloud, which is managed with the OpenNebula cloud controller. The infrastructure offers Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) services to different scientific computing applications. The main stakeholders of the facility are a grid Tier-2 site for the ALICE collaboration at LHC, an interactive analysis facility for the same experiment and a grid Tier-2 site for the BESIII collaboration, plus an increasing number of other small tenants. The dynamic allocation of resources to tenants is partially automated. This feature requires detailed monitoring and accounting of the resource usage. We set up a monitoring framework to inspect the site activities both in terms of IaaS and applications running on the hosted virtual instances. For this purpose we used the ElasticSearch, Logstash and Kibana (ELK) stack. The infrastructure relies on a MySQL database back-end for data preservation and to ensure flexibility to choose a different monit...

  4. Towards Monitoring-as-a-service for Scientific Computing Cloud applications using the ElasticSearch ecosystem

    Science.gov (United States)

    Bagnasco, S.; Berzano, D.; Guarise, A.; Lusso, S.; Masera, M.; Vallero, S.

    2015-12-01

The INFN computing centre in Torino hosts a private Cloud, which is managed with the OpenNebula cloud controller. The infrastructure offers Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) services to different scientific computing applications. The main stakeholders of the facility are a grid Tier-2 site for the ALICE collaboration at LHC, an interactive analysis facility for the same experiment and a grid Tier-2 site for the BESIII collaboration, plus an increasing number of other small tenants. The dynamic allocation of resources to tenants is partially automated. This feature requires detailed monitoring and accounting of the resource usage. We set up a monitoring framework to inspect the site activities both in terms of IaaS and applications running on the hosted virtual instances. For this purpose we used the ElasticSearch, Logstash and Kibana (ELK) stack. The infrastructure relies on a MySQL database back-end for data preservation and to ensure flexibility to choose a different monitoring solution if needed. The heterogeneous accounting information is transferred from the database to the ElasticSearch engine via a custom Logstash plugin. Each use-case is indexed separately in ElasticSearch and we set up a set of Kibana dashboards with pre-defined queries in order to monitor the relevant information in each case. For the IaaS metering, we developed sensors for the OpenNebula API. The IaaS level information gathered through the API is sent to the MySQL database through an ad hoc RESTful web service developed for this purpose. Moreover, we have developed a billing system for our private Cloud, which relies on the RabbitMQ message queue for asynchronous communication to the database and on the ELK stack for its graphical interface. The Italian Grid accounting framework is also migrating to a similar set-up. Concerning the application level, we used the Root plugin TProofMonSenderSQL to collect accounting data from the interactive analysis facility. The BESIII
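
    To make the last hop of the accounting flow concrete, a minimal sketch (using the elasticsearch-py 8.x client) that indexes one usage record so a Kibana dashboard can query it; the index name and field names are invented for illustration and are not the site's actual schema:

```python
# Hedged sketch of pushing a usage record into ElasticSearch for Kibana.
from datetime import datetime, timezone
from elasticsearch import Elasticsearch  # pip install elasticsearch

es = Elasticsearch("http://localhost:9200")  # placeholder endpoint

record = {
    "tenant": "alice-tier2",  # hypothetical tenant name
    "cpu_hours": 1234.5,
    "wall_hours": 1500.0,
    "timestamp": datetime.now(timezone.utc).isoformat(),
}
# Each use-case would go to its own index, mirroring the setup described above.
es.index(index="cloud-accounting", document=record)
```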

  5. Wavelets in scientific computing

    DEFF Research Database (Denmark)

    Nielsen, Ole Møller

    1998-01-01

Part I treats compactly supported wavelets in the context of multiresolution analysis. These wavelets are particularly attractive because they lead to a stable and very efficient algorithm, namely the fast wavelet transform (FWT). We give estimates for the approximation characteristics of wavelets and demonstrate how and why the FWT can be used as a front-end for efficient image compression schemes. Part II deals with vector-parallel implementations of several variants of the Fast Wavelet Transform. We develop an efficient and scalable parallel algorithm for the FWT and derive a model for its performance. Part III is an investigation of the potential for using the special properties of wavelets for solving partial differential equations numerically. Several approaches are identified and two of them are described in detail. The algorithms developed are applied to the nonlinear Schrödinger equation and Burgers' equation...
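
    As a concrete instance of the FWT, here is a one-dimensional Haar transform, the simplest of the compactly supported wavelets; this is an illustration, not the thesis code, and it assumes a power-of-two input length:

```python
# Minimal 1D Haar fast wavelet transform (O(n) total work).
import numpy as np

def haar_fwt(signal):
    x = np.asarray(signal, dtype=float)
    out, coeffs = x.copy(), []
    n = x.size
    while n > 1:
        a = (out[:n:2] + out[1:n:2]) / np.sqrt(2)  # approximation (low-pass)
        d = (out[:n:2] - out[1:n:2]) / np.sqrt(2)  # detail (high-pass)
        out[: n // 2] = a
        coeffs.append(d)
        n //= 2
    return out[0], coeffs[::-1]  # coarsest approximation + detail levels

approx, details = haar_fwt([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
print(approx, [d.tolist() for d in details])
```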

  6. Improving the Efficiency of the Nodal Integral Method With the Portable, Extensible Tool-kit for Scientific Computation

    International Nuclear Information System (INIS)

    Toreja, Allen J.; Uddin, Rizwan

    2002-01-01

    An existing implementation of the nodal integral method for the time-dependent convection-diffusion equation is modified to incorporate various PETSc (Portable, Extensible Tool-kit for Scientific Computation) solver and pre-conditioner routines. In the modified implementation, the default iterative Gauss-Seidel solver is replaced with one of the following PETSc iterative linear solver routines: Generalized Minimal Residuals, Stabilized Bi-conjugate Gradients, or Transpose-Free Quasi-Minimal Residuals. For each solver, a Jacobi or a Successive Over-Relaxation pre-conditioner is used. Two sample problems, one with a low Peclet number and one with a high Peclet number, are solved using the new implementation. In all the cases tested, the new implementation with the PETSc solver routines outperforms the original Gauss-Seidel implementation. Moreover, the PETSc Stabilized Bi-conjugate Gradients routine performs the best on the two sample problems leading to CPU times that are less than half the CPU times of the original implementation. (authors)
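
    A hedged petsc4py sketch of the kind of swap the paper describes: a KSP configured for Stabilized Bi-conjugate Gradients with a Jacobi preconditioner, replacing a hand-written Gauss-Seidel iteration. The tridiagonal matrix below is a stand-in for the nodal-integral discretization, which is not reproduced here:

```python
# Sketch: solve A x = b with PETSc's BiCGSTAB + Jacobi via petsc4py.
from petsc4py import PETSc

n = 100
A = PETSc.Mat().createAIJ([n, n], nnz=3)  # toy 1D Laplacian as a placeholder
for i in range(n):
    A.setValue(i, i, 2.0)
    if i > 0:
        A.setValue(i, i - 1, -1.0)
    if i < n - 1:
        A.setValue(i, i + 1, -1.0)
A.assemble()

b = A.createVecLeft(); b.set(1.0)
x = A.createVecRight()

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setType(PETSc.KSP.Type.BCGS)           # Stabilized Bi-conjugate Gradients
ksp.getPC().setType(PETSc.PC.Type.JACOBI)  # or PETSc.PC.Type.SOR
ksp.solve(b, x)
print(ksp.getIterationNumber(), ksp.getResidualNorm())
```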

  7. Scientific workflow and support for high resolution global climate modeling at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, V.; Mayer, B.; Wang, F.; Hack, J.; McKenna, D.; Hartman-Baker, R.

    2012-04-01

    The Oak Ridge Leadership Computing Facility (OLCF) facilitates the execution of computational experiments that require tens of millions of CPU hours (typically using thousands of processors simultaneously) while generating hundreds of terabytes of data. A set of ultra high resolution climate experiments in progress, using the Community Earth System Model (CESM), will produce over 35,000 files, ranging in sizes from 21 MB to 110 GB each. The execution of the experiments will require nearly 70 Million CPU hours on the Jaguar and Titan supercomputers at OLCF. The total volume of the output from these climate modeling experiments will be in excess of 300 TB. This model output must then be archived, analyzed, distributed to the project partners in a timely manner, and also made available more broadly. Meeting this challenge would require efficient movement of the data, staging the simulation output to a large and fast file system that provides high volume access to other computational systems used to analyze the data and synthesize results. This file system also needs to be accessible via high speed networks to an archival system that can provide long term reliable storage. Ideally this archival system is itself directly available to other systems that can be used to host services making the data and analysis available to the participants in the distributed research project and to the broader climate community. The various resources available at the OLCF now support this workflow. The available systems include the new Jaguar Cray XK6 2.63 petaflops (estimated) supercomputer, the 10 PB Spider center-wide parallel file system, the Lens/EVEREST analysis and visualization system, the HPSS archival storage system, the Earth System Grid (ESG), and the ORNL Climate Data Server (CDS). The ESG features federated services, search & discovery, extensive data handling capabilities, deep storage access, and Live Access Server (LAS) integration. The scientific workflow enabled on

  8. Analysis of Scientific Attitude, Computer Anxiety, Educational Internet Use, Problematic Internet Use, and Academic Achievement of Middle School Students According to Demographic Variables

    Science.gov (United States)

    Bekmezci, Mehmet; Celik, Ismail; Sahin, Ismail; Kiray, Ahmet; Akturk, Ahmet Oguz

    2015-01-01

    In this research, students' scientific attitude, computer anxiety, educational use of the Internet, academic achievement, and problematic use of the Internet are analyzed based on different variables (gender, parents' educational level and daily access to the Internet). The research group involves 361 students from two middle schools which are…

  9. Advanced Artificial Science. The development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation.

    Energy Technology Data Exchange (ETDEWEB)

    Saffer, Shelley (Sam) I.

    2014-12-01

This is the final report of DOE award DE-SC0001132, Advanced Artificial Science: the development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation. This document describes the achievement of the project's goals and the resulting research made possible by this award.

  10. Scientific Grand Challenges: Discovery In Basic Energy Sciences: The Role of Computing at the Extreme Scale - August 13-15, 2009, Washington, D.C.

    Energy Technology Data Exchange (ETDEWEB)

    Galli, Giulia [Univ. of California, Davis, CA (United States). Workshop Chair; Dunning, Thom [Univ. of Illinois, Urbana, IL (United States). Workshop Chair

    2009-08-13

    The U.S. Department of Energy’s (DOE) Office of Basic Energy Sciences (BES) and Office of Advanced Scientific Computing Research (ASCR) workshop in August 2009 on extreme-scale computing provided a forum for more than 130 researchers to explore the needs and opportunities that will arise due to expected dramatic advances in computing power over the next decade. This scientific community firmly believes that the development of advanced theoretical tools within chemistry, physics, and materials science—combined with the development of efficient computational techniques and algorithms—has the potential to revolutionize the discovery process for materials and molecules with desirable properties. Doing so is necessary to meet the energy and environmental challenges of the 21st century as described in various DOE BES Basic Research Needs reports. Furthermore, computational modeling and simulation are a crucial complement to experimental studies, particularly when quantum mechanical processes controlling energy production, transformations, and storage are not directly observable and/or controllable. Many processes related to the Earth’s climate and subsurface need better modeling capabilities at the molecular level, which will be enabled by extreme-scale computing.

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  12. PARA'04 Workshop on State-of-the-art in Scientific Computing, June 20-23, 2004: Complementary Proceedings

    DEFF Research Database (Denmark)

    Dongarra, Jack; Madsen, Kaj; Wasniewski, Jerzy

    2004-01-01

The PARA workshops in the past have been devoted to parallel computing methods in science and technology. There have been seven PARA meetings to date: PARA'94, PARA'95 and PARA'96 in Lyngby, Denmark, PARA'98 in Umeå, Sweden, PARA'2000 in Bergen, Norway, PARA'02 in Espoo, Finland, and PARA'04 again in Lyngby, Denmark. PARA'04 was held June 20-23, 2004, and was organized by Jack Dongarra from the University of Tennessee and Oak Ridge National Laboratory, and Kaj Madsen and Jerzy Wasniewski from the Technical University of Denmark. The emphasis here was shifted to High-Performance Computing (HPC). The ongoing development of ever more advanced computers provides the potential for solving increasingly difficult computational problems. However, given the complexity of modern computer architectures, the task of realizing this potential needs careful attention. For example, the failure...

  13. Cloud-based opportunities in scientific computing: insights from processing Suomi National Polar-Orbiting Partnership (S-NPP) Direct Broadcast data

    Science.gov (United States)

    Evans, J. D.; Hao, W.; Chettri, S.

    2013-12-01

The cloud is proving to be a uniquely promising platform for scientific computing. Our experience with processing satellite data using Amazon Web Services highlights several opportunities for enhanced performance, flexibility, and cost effectiveness in the cloud relative to traditional computing, for example:

    - Direct readout from a polar-orbiting satellite such as the Suomi National Polar-Orbiting Partnership (S-NPP) requires bursts of processing a few times a day, separated by quiet periods when the satellite is out of receiving range. In the cloud, by starting and stopping virtual machines in minutes, we can marshal significant computing resources quickly when needed, but not pay for them when not needed. To take advantage of this capability, we are automating a data-driven approach to the management of cloud computing resources, in which new data availability triggers the creation of new virtual machines (of variable size and processing power) which last only until the processing workflow is complete.

    - 'Spot instances' are virtual machines that run as long as one's asking price is higher than the provider's variable spot price. Spot instances can greatly reduce the cost of computing for software systems that are engineered to withstand unpredictable interruptions in service (as occurs when a spot price exceeds the asking price). We are implementing an approach to workflow management that allows data processing workflows to resume with minimal delays after temporary spot price spikes. This will allow systems to take full advantage of variably-priced 'utility computing.'

    - Thanks to virtual machine images, we can easily launch multiple, identical machines differentiated only by 'user data' containing individualized instructions (e.g., to fetch particular datasets or to perform certain workflows or algorithms). This is particularly useful when (as is the case with S-NPP data) we need to launch many very similar machines to process an unpredictable number of
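
    A minimal boto3 sketch of the data-driven, user-data-parameterized spot launch described above; the AMI ID, instance type, bucket path, and bootstrap script are placeholders, not the authors' actual system:

```python
# Hedged sketch: launch one spot instance whose user data tells it what to process.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

user_data = """#!/bin/bash
# hypothetical worker bootstrap: fetch one S-NPP granule and run the workflow
/opt/sdr/process_granule.sh s3://example-bucket/rdr/latest
shutdown -h now
"""

ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI
    InstanceType="c5.xlarge",
    MinCount=1,
    MaxCount=1,
    UserData=user_data,
    InstanceMarketOptions={"MarketType": "spot"},  # interruptible, cheaper capacity
)
```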

  14. 1 March 2012 - British University of Oxford Head of the Mathematical, Physical & Life Sciences Division A. N. Halliday FRS signing the guest book with Director for Research and Scientific Computing S. Bertolucci.

    CERN Multimedia

    Jean-Claude Gadmer

    2012-01-01

    1 March 2012 - British University of Oxford Head of the Mathematical, Physical & Life Sciences Division A. N. Halliday FRS signing the guest book with Director for Research and Scientific Computing S. Bertolucci.

  15. 28 October 2013- Former US Vice President A. Gore signing the guest book with Technology Department Head F. Bordry, Head of International Relations R. Voss, Director for Research and Scientific Computing S. Bertolucci and CMS Collaboration Spokesperson J. Incandela.

    CERN Multimedia

    Maximilien Brice

    2013-01-01

    28 October 2013- Former US Vice President A. Gore signing the guest book with Technology Department Head F. Bordry, Head of International Relations R. Voss, Director for Research and Scientific Computing S. Bertolucci and CMS Collaboration Spokesperson J. Incandela.

  16. 1st October 2010 - Chinese Vice President of the Academy of Sciences signing the guest book and exchanging gifts with CERN Director for Research and Scientific Computing S. Bertolucci, witnessed by Adviser R. Voss

    CERN Multimedia

    Maximilien Brice

    2010-01-01

    1st October 2010 - Chinese Vice President of the Academy of Sciences signing the guest book and exchanging gifts with CERN Director for Research and Scientific Computing S. Bertolucci, witnessed by Adviser R. Voss

  17. 24 October 2014 - President of the Republic of Ecuador R. Correa Delgado signing the guest book with Vice President L. Moreno and Director for Research and Scientific Computing S. Bertolucci.

    CERN Multimedia

    Guillaume, Jeanneret

    2014-01-01

visiting the ATLAS experimental cavern with Collaboration Spokesperson D. Charlton and ATLAS User F. Monticelli; throughout accompanied by Adviser for Ecuador J. Salicio Diez and Director for Research and Scientific Computing S. Bertolucci.

  18. Modern Trends Of Computation, Simulation, and Communication, And Their Impacts On The Progress Of Scientific And Engineering Research, Development, And Education

    International Nuclear Information System (INIS)

    Bunjamin, Muhammad

    2001-01-01

A short report on the modern trends of computation, simulation, and communication in the 1990s is presented, along with their impacts on the progress of scientific and engineering research, development, and education. A full description of this giant issue is certainly a 'mission impossible' for the author. Nevertheless, it is the author's hope that it will at least give an overall view of what is going on in this very dynamic field in the advanced countries. After 'thinking globally' through reading this report, we should then decide on what and how to 'act locally' to respond to these global trends. The main sources of information reported here were the computational science and engineering journals and books issued during the 1990s, as listed in the references below

  19. 28 March 2014 - Italian Minister of Education, University and Research S. Giannini welcomed by CERN Director-General R. Heuer and Director for Research and Scientific Computing S. Bertolucci in the ATLAS experimental cavern with Former Collaboration Spokesperson F. Gianotti. Signature of the guest book with Belgian State Secretary for the Scientific Policy P. Courard.

    CERN Multimedia

    Gadmer, Jean-Claude

    2014-01-01

    28 March 2014 - Italian Minister of Education, University and Research S. Giannini welcomed by CERN Director-General R. Heuer and Director for Research and Scientific Computing S. Bertolucci in the ATLAS experimental cavern with Former Collaboration Spokesperson F. Gianotti. Signature of the guest book with Belgian State Secretary for the Scientific Policy P. Courard.

  20. Facilitating Preschoolers' Scientific Knowledge Construction via Computer Games Regarding Light and Shadow: The Effect of the Prediction-Observation-Explanation (POE) Strategy

    Science.gov (United States)

    Hsu, Chung-Yuan; Tsai, Chin-Chung; Liang, Jyh-Chong

    2011-10-01

    Educational researchers have suggested that computer games have a profound influence on students' motivation, knowledge construction, and learning performance, but little empirical research has targeted preschoolers. Thus, the purpose of the present study was to investigate the effects of implementing a computer game that integrates the prediction-observation-explanation (POE) strategy (White and Gunstone in Probing understanding. Routledge, New York, 1992) on facilitating preschoolers' acquisition of scientific concepts regarding light and shadow. The children's alternative conceptions were explored as well. Fifty participants were randomly assigned into either an experimental group that played a computer game integrating the POE model or a control group that played a non-POE computer game. By assessing the students' conceptual understanding through interviews, this study revealed that the students in the experimental group significantly outperformed their counterparts in the concepts regarding "shadow formation in daylight" and "shadow orientation." However, children in both groups, after playing the games, still expressed some alternative conceptions such as "Shadows always appear behind a person" and "Shadows should be on the same side as the sun."

  1. Computer simulation of plasma behavior in open-ended linear theta machines. Scientific report 81-5

    Energy Technology Data Exchange (ETDEWEB)

    Stover, E. K.

    1981-04-01

Zero-dimensional and one-dimensional fluid plasma computer models have been developed to study the behavior of linear theta pinch plasmas. Computer simulation results generated from these codes are compared with data obtained from two theta pinch experiments so that significant machine plasma behavior can be identified. The experiments examined are a collisional experiment, Tᵢ ≈ 50 eV, nₑ ≈ 10¹⁷ cm⁻³, where the plasma mean free path was significantly less than the plasma column length, and a hot-ion-species experiment, Tᵢ ≈ 3 keV, nₑ ≈ 10¹⁶ cm⁻³, where the ion mean free path was on the order of the plasma column length.

  2. Computer simulation of plasma behavior in open-ended linear theta machines. Scientific report 81-5

    International Nuclear Information System (INIS)

    Stover, E.K.

    1981-04-01

Zero-dimensional and one-dimensional fluid plasma computer models have been developed to study the behavior of linear theta pinch plasmas. Computer simulation results generated from these codes are compared with data obtained from two theta pinch experiments so that significant machine plasma behavior can be identified. The experiments examined are a collisional experiment, Tᵢ ≈ 50 eV, nₑ ≈ 10¹⁷ cm⁻³, where the plasma mean free path was significantly less than the plasma column length, and a hot-ion-species experiment, Tᵢ ≈ 3 keV, nₑ ≈ 10¹⁶ cm⁻³, where the ion mean free path was on the order of the plasma column length

  3. Designing scientific applications on GPUs

    CERN Document Server

    Couturier, Raphael

    2013-01-01

Many of today's complex scientific applications now require a vast amount of computational power. General-purpose graphics processing units (GPGPUs) enable researchers in a variety of fields to benefit from the computational power of all the cores available inside graphics cards. Understand the benefits of using GPUs for many scientific applications: Designing Scientific Applications on GPUs shows you how to use GPUs for applications in diverse scientific fields, from physics and mathematics to computer science. The book explains the methods necessary for designing or porting your scientific appl

  4. Toward executable scientific publications

    NARCIS (Netherlands)

    Strijkers, R.J.; Cushing, R.; Vasyunin, D.; Laat, C. de; Belloum, A.S.Z.; Meijer, R.J.

    2011-01-01

Reproducibility of experiments is considered one of the main principles of the scientific method. Recent developments in data- and computation-intensive science, i.e. e-Science, and the state of the art in Cloud computing provide the necessary components to preserve data sets and re-run code and

  5. Software Innovations Speed Scientific Computing

    Science.gov (United States)

    2012-01-01

    To help reduce the time needed to analyze data from missions like those studying the Sun, Goddard Space Flight Center awarded SBIR funding to Tech-X Corporation of Boulder, Colorado. That work led to commercial technologies that help scientists accelerate their data analysis tasks. Thanks to its NASA work, the company doubled its number of headquarters employees to 70 and generated about $190,000 in revenue from its NASA-derived products.

  6. Nuclear Physics Exascale Requirements Review: An Office of Science review sponsored jointly by Advanced Scientific Computing Research and Nuclear Physics, June 15 - 17, 2016, Gaithersburg, Maryland

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, Joseph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Savage, Martin J. [Univ. of Washington, Seattle, WA (United States); Gerber, Richard [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Antypas, Katie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Bard, Deborah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Coffey, Richard [Argonne National Lab. (ANL), Argonne, IL (United States); Dart, Eli [Energy Sciences Network (ESnet), Berkeley, CA (United States); Dosanjh, Sudip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hack, James [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Monga, Inder [Energy Sciences Network (ESnet), Berkeley, CA (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Riley, Katherine [Argonne National Lab. (ANL), Argonne, IL (United States). Advanced Photon Source (APS); Rotman, Lauren [Energy Sciences Network (ESnet), Berkeley, CA (United States); Straatsma, Tjerk [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wells, Jack [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Avakian, Harut [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Ayyad, Yassid [Michigan State Univ., East Lansing, MI (United States). Dept. of Physics and Astronomy. National Superconducting Cyclotron Lab.; Bass, Steffen A. [Duke Univ., Durham, NC (United States); Bazin, Daniel [Michigan State Univ., East Lansing, MI (United States). Dept. of Physics and Astronomy. National Superconducting Cyclotron Lab.; Boehnlein, Amber [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Bollen, Georg [Michigan State Univ., East Lansing, MI (United States). Facility for Rare Isotope Beams; Broussard, Leah J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Calder, Alan [Stony Brook Univ., NY (United States); Couch, Sean [Michigan State Univ., East Lansing, MI (United States); Couture, Aaron [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Cromaz, Mario [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Detmold, William [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Detwiler, Jason [Univ. of Washington, Seattle, WA (United States); Duan, Huaiyu [Univ. of New Mexico, Albuquerque, NM (United States); Edwards, Robert [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Engel, Jonathan [Univ. of North Carolina, Chapel Hill, NC (United States); Fryer, Chris [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Fuller, George M. [Univ. of California, San Diego, CA (United States); Gandolfi, Stefano [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gavalian, Gagik [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Georgobiani, Dali [Michigan State Univ., East Lansing, MI (United States); Gupta, Rajan [Los Alamos National Lab. 
(LANL), Los Alamos, NM (United States); Gyurjyan, Vardan [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Hausmann, Marc [Michigan State Univ., East Lansing, MI (United States); Heyes, Graham [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Hix, W. Ralph [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ito, Mark [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Jansen, Gustav [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Jones, Richard [Univ. of Connecticut, Storrs, CT (United States); Joo, Balint [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Kaczmarek, Olaf [Bielefeld Univ. (Germany); Kasen, Dan [Univ. of California, Berkeley, CA (United States); Kostin, Mikhail [Michigan State Univ., East Lansing, MI (United States); Kurth, Thorsten [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center; Lauret, Jerome [Brookhaven National Lab. (BNL), Upton, NY (United States); Lawrence, David [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Lin, Huey-Wen [Michigan State Univ., East Lansing, MI (United States); Lin, Meifeng [Brookhaven National Lab. (BNL), Upton, NY (United States); Mantica, Paul [Michigan State Univ., East Lansing, MI (United States); Maris, Peter [Iowa State Univ., Ames, IA (United States); Messer, Bronson [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Mittig, Wolfgang [Michigan State Univ., East Lansing, MI (United States); Mosby, Shea [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mukherjee, Swagato [Brookhaven National Lab. (BNL), Upton, NY (United States); Nam, Hai Ah [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Navratil, Petr [Tri-Univ. Meson Facility (TRIUMF), Vancouver, BC (Canada); Nazarewicz, Witek [Michigan State Univ., East Lansing, MI (United States); Ng, Esmond [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); O'Donnell, Tommy [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Orginos, Konstantinos [College of William and Mary, Williamsburg, VA (United States); Pellemoine, Frederique [Michigan State Univ., East Lansing, MI (United States). Facility for Rare Isotope Beams; Petreczky, Peter [Brookhaven National Lab. (BNL), Upton, NY (United States); Pieper, Steven C. [Argonne National Lab. (ANL), Argonne, IL (United States); Pinkenburg, Christopher H. [Brookhaven National Lab. (BNL), Upton, NY (United States); Plaster, Brad [Univ. of Kent, Canterbury (United Kingdom); Porter, R. Jefferson [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Portillo, Mauricio [Michigan State Univ., East Lansing, MI (United States). Facility for Rare Isotope Beams; Pratt, Scott [Michigan State Univ., East Lansing, MI (United States); Purschke, Martin L. [Brookhaven National Lab. (BNL), Upton, NY (United States); Qiang, Ji [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Quaglioni, Sofia [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Richards, David [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Roblin, Yves [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Schenke, Bjorn [Brookhaven National Lab.
(BNL), Upton, NY (United States); Schiavilla, Rocco [Old Dominion Univ., Norfolk, VA (United States); Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Schlichting, Soren [Brookhaven National Lab. (BNL), Upton, NY (United States); Schunck, Nicolas [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Steinbrecher, Patrick [Brookhaven National Lab. (BNL), Upton, NY (United States); Strickland, Michael [Kent State Univ., Kent, OH (United States); Syritsyn, Sergey [Stony Brook Univ., NY (United States); Terzic, Balsa [Old Dominion Univ., Norfolk, VA (United States); Varner, Robert [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Vary, James [Iowa State Univ., Ames, IA (United States); Wild, Stefan [Argonne National Lab. (ANL), Argonne, IL (United States); Winter, Frank [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Zegers, Remco [Michigan State Univ., East Lansing, MI (United States); Zhang, He [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Ziegler, Veronique [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Zingale, Michael [Stony Brook Univ., NY (United States)

    2017-02-28

    Imagine being able to predict — with unprecedented accuracy and precision — the structure of the proton and neutron, and the forces between them, directly from the dynamics of quarks and gluons, and then using this information in calculations of the structure and reactions of atomic nuclei and of the properties of dense neutron stars (NSs). Also imagine discovering new and exotic states of matter, and new laws of nature, by being able to collect more experimental data than we dream possible today, analyzing it in real time to feed back into an experiment, and curating the data with full tracking capabilities and with fully distributed data mining capabilities. Making this vision a reality would improve basic scientific understanding, enabling us to precisely calculate, for example, the spectrum of gravity waves emitted during NS coalescence, and would have important societal applications in nuclear energy research, stockpile stewardship, and other areas. This review presents the components and characteristics of the exascale computing ecosystems necessary to realize this vision.

  7. Nuclear Physics Exascale Requirements Review: An Office of Science review sponsored jointly by Advanced Scientific Computing Research and Nuclear Physics, June 15 - 17, 2016, Gaithersburg, Maryland

    International Nuclear Information System (INIS)

    Carlson, Joseph; Savage, Martin J.; Gerber, Richard; Antypas, Katie; Bard, Deborah; Coffey, Richard; Dart, Eli; Dosanjh, Sudip; Hack, James; Monga, Inder; Papka, Michael E.; Riley, Katherine; Rotman, Lauren; Straatsma, Tjerk; Wells, Jack; Avakian, Harut; Ayyad, Yassid; Bazin, Daniel; Bollen, Georg; Calder, Alan; Couch, Sean; Couture, Aaron; Cromaz, Mario; Detmold, William; Detwiler, Jason; Duan, Huaiyu; Edwards, Robert; Engel, Jonathan; Fryer, Chris; Fuller, George M.; Gandolfi, Stefano; Gavalian, Gagik; Georgobiani, Dali; Gupta, Rajan; Gyurjyan, Vardan; Hausmann, Marc; Heyes, Graham; Hix, W. Ralph; Ito, Mark; Jansen, Gustav; Jones, Richard; Joo, Balint; Kaczmarek, Olaf; Kasen, Dan; Kostin, Mikhail; Kurth, Thorsten; Lawrence, David; Lin, Huey-Wen; Lin, Meifeng; Mantica, Paul; Maris, Peter; Messer, Bronson; Mittig, Wolfgang; Mosby, Shea; Mukherjee, Swagato; Nam, Hai Ah; Navratil, Petr; Nazarewicz, Witek; Ng, Esmond; O'Donnell, Tommy; Orginos, Konstantinos; Pellemoine, Frederique; Pieper, Steven C.; Pinkenburg, Christopher H.; Plaster, Brad; Porter, R. Jefferson; Portillo, Mauricio; Purschke, Martin L.; Qiang, Ji; Quaglioni, Sofia; Richards, David; Roblin, Yves; Schenke, Bjorn; Schiavilla, Rocco; Schlichting, Soren; Schunck, Nicolas; Steinbrecher, Patrick; Strickland, Michael; Syritsyn, Sergey; Terzic, Balsa; Varner, Robert; Vary, James; Wild, Stefan; Winter, Frank; Zegers, Remco; Zhang, He; Ziegler, Veronique; Zingale, Michael

    2017-01-01

    Imagine being able to predict - with unprecedented accuracy and precision - the structure of the proton and neutron, and the forces between them, directly from the dynamics of quarks and gluons, and then using this information in calculations of the structure and reactions of atomic nuclei and of the properties of dense neutron stars (NSs). Also imagine discovering new and exotic states of matter, and new laws of nature, by being able to collect more experimental data than we dream possible today, analyzing it in real time to feed back into an experiment, and curating the data with full tracking capabilities and with fully distributed data mining capabilities. Making this vision a reality would improve basic scientific understanding, enabling us to precisely calculate, for example, the spectrum of gravity waves emitted during NS coalescence, and would have important societal applications in nuclear energy research, stockpile stewardship, and other areas. This review presents the components and characteristics of the exascale computing ecosystems necessary to realize this vision.

  8. Scientific Services on the Cloud

    Science.gov (United States)

    Chapman, David; Joshi, Karuna P.; Yesha, Yelena; Halem, Milt; Yesha, Yaacov; Nguyen, Phuong

Scientific computing was one of the first ever applications for parallel and distributed computation. To this day, scientific applications remain some of the most compute-intensive, and have inspired the creation of petaflop compute infrastructure such as the Oak Ridge Jaguar and Los Alamos RoadRunner. Large dedicated hardware infrastructure has become both a blessing and a curse to the scientific community. Scientists are interested in cloud computing for much the same reasons as businesses and other professionals. The hardware is provided, maintained, and administered by a third party. Software abstraction and virtualization provide reliability and fault tolerance. Graduated fees allow for multi-scale prototyping and execution. Cloud computing resources are only a few clicks away, and are by far the easiest high-performance distributed platform to gain access to. There may still be dedicated infrastructure for ultra-scale science, but the cloud can easily play a major part in the scientific computing initiative.

  9. Scientific Inquiry Self-Efficacy and Computer Game Self-Efficacy as Predictors and Outcomes of Middle School Boys' and Girls' Performance in a Science Assessment in a Virtual Environment

    Science.gov (United States)

    Bergey, Bradley W.; Ketelhut, Diane Jass; Liang, Senfeng; Natarajan, Uma; Karakus, Melissa

    2015-01-01

    The primary aim of the study was to examine whether performance on a science assessment in an immersive virtual environment was associated with changes in scientific inquiry self-efficacy. A secondary aim of the study was to examine whether performance on the science assessment was equitable for students with different levels of computer game…

  10. 30th November 2010 - Norwegian Ministry of Government Administration, Reform and Church Affairs State Secretary R. Valle signing the guest book with Head of International Relations F. Pauss and Director for Research and Scientific Computing S. Bertolucci; visiting CERN Computer Centre with Information Technology Department Head F. Hemmer.

    CERN Multimedia

    Maximilien Brice

    2010-01-01

    30th November 2010 - Norwegian Ministry of Government Administration, Reform and Church Affairs State Secretary R. Valle signing the guest book with Head of International Relations F. Pauss and Director for Research and Scientific Computing S. Bertolucci; visiting CERN Computer Centre with Information Technology Department Head F. Hemmer.

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  13. COMPUTING

    CERN Multimedia

    P. McBride

The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  15. Crosscut report: Exascale Requirements Reviews, March 9–10, 2017 – Tysons Corner, Virginia. An Office of Science review sponsored by: Advanced Scientific Computing Research, Basic Energy Sciences, Biological and Environmental Research, Fusion Energy Sciences, High Energy Physics, Nuclear Physics

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Hack, James [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility (OLCF); Riley, Katherine [Argonne National Lab., IL (United States). Argonne Leadership Computing Facility (ALCF); Antypas, Katie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Coffey, Richard [Argonne National Lab. (ANL), Argonne, IL (United States). Argonne Leadership Computing Facility (ALCF); Dart, Eli [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). ESnet; Straatsma, Tjerk [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility (OLCF); Wells, Jack [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility (OLCF); Bard, Deborah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Dosanjh, Sudip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Monga, Inder [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). ESnet; Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States). Argonne Leadership Computing Facility; Rotman, Lauren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). ESnet

    2018-01-22

The mission of the U.S. Department of Energy Office of Science (DOE SC) is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security missions of the United States. To achieve these goals in today’s world requires investments in not only the traditional scientific endeavors of theory and experiment, but also in computational science and the facilities that support large-scale simulation and data analysis. The Advanced Scientific Computing Research (ASCR) program addresses these challenges in the Office of Science. ASCR’s mission is to discover, develop, and deploy computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to DOE. ASCR supports research in computational science, three high-performance computing (HPC) facilities — the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory and Leadership Computing Facilities at Argonne (ALCF) and Oak Ridge (OLCF) National Laboratories — and the Energy Sciences Network (ESnet) at Berkeley Lab. ASCR is guided by science needs as it develops research programs, computers, and networks at the leading edge of technologies. As we approach the era of exascale computing, technology changes are creating challenges for science programs in SC for those who need to use high performance computing and data systems effectively. Numerous significant modifications to today’s tools and techniques will be needed to realize the full potential of emerging computing systems and other novel computing architectures. To assess these needs and challenges, ASCR held a series of Exascale Requirements Reviews in 2015–2017, one with each of the six SC program offices, and a subsequent Crosscut Review that sought to integrate the findings from each. Participants at the reviews were drawn from the communities of leading domain

  16. Scientific progress report 1980

    International Nuclear Information System (INIS)

    1981-01-01

The R+D projects in this field and the infrastructural tasks mentioned are handled in seven working groups and two project groups: Computer systems; Numerical and applied mathematics; Software development; Process calculation systems hardware; Nuclear electronics, measuring and automatic control technique; Research of component parts and irradiation tests; Central data processing; Processing of process data in medicine; and co-operation in the BERNET project at the 'Wissenschaftliches Rechenzentrum Berlin (WRB)' (scientific computing centre in Berlin). (orig./WB)

  17. Cooperative and competitive concurrency in scientific computing. A full open-source upgrade of the program for dynamical calculations of RHEED intensity oscillations

    Science.gov (United States)

    Daniluk, Andrzej

    2011-06-01

    A computational model is a computer program that attempts to simulate an abstract model of a particular system. Computational models involve enormous numbers of calculations and often require supercomputer speed. As personal computers become more and more powerful, more laboratory experiments can be converted into computer models that can be interactively examined by scientists and students without the risk and cost of the actual experiments. The future of programming is concurrent programming. The threaded programming model provides application programmers with a useful abstraction of concurrent execution of multiple tasks. The objective of this release is to address the design of an architecture for scientific applications that execute as multiple threads, together with implementations of the related shared data structures.
    New version program summary
    Program title: GrowthCP
    Catalogue identifier: ADVL_v4_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVL_v4_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 32 269
    No. of bytes in distributed program, including test data, etc.: 8 234 229
    Distribution format: tar.gz
    Programming language: Free Object Pascal
    Computer: multi-core x64-based PC
    Operating system: Windows XP, Vista, 7
    Has the code been vectorised or parallelized?: No
    RAM: more than 1 GB. The program requires a 32-bit or 64-bit processor to run the generated code. Memory is addressed using 32-bit (on 32-bit processors) or 64-bit (on 64-bit processors with 64-bit addressing) pointers. The amount of addressed memory is limited only by the available amount of virtual memory.
    Supplementary material: The figures mentioned in the "Summary of revisions" section can be obtained here.
    Classification: 4.3, 7.2, 6.2, 8, 14
    External routines: Lazarus [1] Catalogue
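
    The threaded model this release describes can be illustrated with a minimal sketch. The fragment below is not GrowthCP code (the program itself is written in Free Object Pascal); it is a hypothetical Python example of the general pattern: several worker threads computing independently and cooperating through a lock-protected shared data structure.

        # Minimal sketch of cooperative concurrency on a shared structure.
        # All names (simulate_layer, results) are illustrative, not GrowthCP's.
        import threading

        results = {}                       # shared data structure
        results_lock = threading.Lock()    # guards concurrent updates

        def simulate_layer(layer_id, steps):
            """Stand-in for one intensity calculation running in its own thread."""
            intensity = 0.0
            for t in range(steps):
                intensity += (layer_id + 1) / (t + 1)   # placeholder arithmetic
            with results_lock:             # cooperative access to shared state
                results[layer_id] = intensity

        threads = [threading.Thread(target=simulate_layer, args=(i, 100_000))
                   for i in range(4)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()                       # wait for every worker to finish
        print(results)

    In CPython the global interpreter lock serializes pure-Python arithmetic, so the sketch illustrates the programming model rather than a real speed-up; a natively compiled program, as described here, can run its threads truly in parallel on a multi-core PC.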

  18. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high-performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, in which vendors provide computing resources to customers ...

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  20. Scientific Misconduct.

    Science.gov (United States)

    Goodstein, David

    2002-01-01

    Explores scientific fraud, asserting that while few scientists actually falsify results, the field has become so competitive that many are misbehaving in other ways; an example would be unreasonable criticism by anonymous peer reviewers. (EV)

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  2. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride. Edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  3. 23rd October 2010 - UNESCO Director-General I. Bokova signing the Guest Book with CERN Director for Research and Scientific Computing S. Bertolucci and CERN Director-General R. Heuer.

    CERN Multimedia

    Maximilien Brice

    2010-01-01

    CERN-HI-1010244 37: In the SM18 hall: Communication Officer Jasmina Sopova; Director, Division of Basic & Engineering Sciences M. Nalecz; Assistant Director-General for the Natural Sciences G. Kalonji; former CERN Director-General H. Schopper; CERN Head of Education R. Landua; UNESCO Director-General I. Bokova; CERN Adviser M. Bona; CERN Director for Research and Scientific Computing S. Bertolucci; and UNESCO Office in Geneva Director Luis M. Tiburcio.

  4. Scientific Computation Application Partnerships in Materials and Chemical Sciences, Charge Transfer and Charge Transport in Photoactivated Systems, Developing Electron-Correlated Methods for Excited State Structure and Dynamics in the NWChem Software Suite

    Energy Technology Data Exchange (ETDEWEB)

    Cramer, Christopher J. [Univ. of Minnesota, Minneapolis, MN (United States)

    2017-11-12

    Charge transfer and charge transport in photoactivated systems are fundamental processes that underlie solar energy capture, solar energy conversion, and photoactivated catalysis, both organometallic and enzymatic. We developed methods, algorithms, and software tools needed for reliable treatment of the underlying physics for charge transfer and charge transport, an undertaking with broad applicability to the goals of the fundamental-interaction component of the Department of Energy Office of Basic Energy Sciences and the exascale initiative of the Office of Advanced Scientific Computing Research.

  5. Using cloud-computing applications to support collaborative scientific inquiry: Examining pre-service teachers’ perceived barriers towards integration / Utilisation d'applications infonuagiques pour appuyer la recherche scientifique collaborative

    OpenAIRE

    Joel Donna; Brant G Miller

    2013-01-01

    Technology plays a crucial role in facilitating collaboration within the scientific community. Cloud-computing applications can be used to model such collaboration and support inquiry within the secondary science classroom. Little is known about pre-service teachers’ beliefs related to the envisioned use of this technology in their teaching. These beliefs may influence future integration. This study finds several first-order barriers, such as perceptions that these tools would take too much t...

  6. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real-data reconstruction and re-reconstructions, and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenge (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier-0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier-0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier-1s: response time, data transfer rate and success rate for tape-to-buffer staging of files kept exclusively on tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations laid the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  12. Scientific communication

    Directory of Open Access Journals (Sweden)

    Aleksander Kobylarek

    2017-09-01

    Full Text Available The article tackles the problem of models of communication in science. The formal division of communication processes into oral and written does not resolve the problem of attitude. The author defines successful communication as a win-win game, based on the respect and equality of the partners, regardless of their position in the world of science. The core characteristics of the process of scientific communication are indicated, such as openness, fairness, support, and creation. The task of creating the right atmosphere for science communication belongs to moderators, who should not allow privilege and differentiation of position to affect scientific communication processes.

  13. Scientific millenarianism

    International Nuclear Information System (INIS)

    Weinberg, A.M.

    1997-01-01

    Today, for the first time, scientific concerns are seriously being addressed that span future times--hundreds, even thousands, or more years in the future. One is witnessing what the author calls scientific millenarianism. Are such concerns for the distant future exercises in futility, or are they real issues that, to the everlasting gratitude of future generations, this generation has identified, warned about and even suggested how to cope with in the distant future? Can the four potential catastrophes--bolide impact, CO2 warming, radioactive wastes and thermonuclear war--be avoided by technical fixes, institutional responses, religion, or by doing nothing? These are the questions addressed in this paper

  14. Scientific meetings

    International Nuclear Information System (INIS)

    1973-01-01

    One of the main aims of the IAEA is to foster the exchange of scientific and technical information and one of the main ways of doing this is to convene international scientific meetings. They range from large international conferences bringing together several hundred scientists, smaller symposia attended by an average of 150 to 250 participants and seminars designed to instruct rather than inform, to smaller panels and study groups of 10 to 30 experts brought together to advise on a particular programme or to develop a set of regulations. The topics of these meetings cover every part of the Agency's activities and form a backbone of many of its programmes. (author)

  15. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  16. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  18. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier-1 and Tier-2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug-tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  19. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently, transferring on average close to 520 TB per week with peaks close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  2. The next scientific revolution.

    Science.gov (United States)

    Hey, Tony

    2010-11-01

    For decades, computer scientists have tried to teach computers to think like human experts. Until recently, most of those efforts have failed to come close to generating the creative insights and solutions that seem to come naturally to the best researchers, doctors, and engineers. But now, Tony Hey, a VP of Microsoft Research, says we're witnessing the dawn of a new generation of powerful computer tools that can "mash up" vast quantities of data from many sources, analyze them, and help produce revolutionary scientific discoveries. Hey and his colleagues call this new method of scientific exploration "machine learning." At Microsoft, a team has already used it to innovate a method of predicting with impressive accuracy whether a patient with congestive heart failure who is released from the hospital will be readmitted within 30 days. It was developed by directing a computer program to pore through hundreds of thousands of data points on 300,000 patients and "learn" the profiles of patients most likely to be rehospitalized. The economic impact of this prediction tool could be huge: If a hospital understands the likelihood that a patient will "bounce back," it can design programs to keep him stable and save thousands of dollars in health care costs. Similar efforts to uncover important correlations that could lead to scientific breakthroughs are under way in oceanography, conservation, and AIDS research. And in business, deep data exploration has the potential to unearth critical insights about customers, supply chains, advertising effectiveness, and more.
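
    The pattern Hey describes, learning a readmission-risk profile from a large set of historical patient records, can be sketched in a few lines. The example below is purely illustrative: it uses synthetic data and scikit-learn's logistic regression as one simple choice of learner, and makes no claim about the actual Microsoft model.

        # Hypothetical sketch: learn which patients are likely to be readmitted.
        # Features and labels are synthetic stand-ins for real clinical records.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 5000
        X = rng.normal(size=(n, 10))      # stand-ins for labs, vitals, history
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n) > 1).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

        # Estimated probability that each held-out patient "bounces back"
        p_readmit = model.predict_proba(X_te)[:, 1]
        print("mean predicted risk:", round(float(p_readmit.mean()), 3))
        print("held-out accuracy:", round(model.score(X_te, y_te), 3))

    A real system would draw features from hundreds of thousands of records and validate far more carefully; the point is only the train-then-score shape of the workflow.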

  3. Why not make a PC cluster of your own? 5. AppleSeed: A Parallel Macintosh Cluster for Scientific Computing

    Science.gov (United States)

    Decyk, Viktor K.; Dauger, Dean E.

    We have constructed a parallel cluster consisting of Apple Macintosh G4 computers running both Classic Mac OS as well as the Unix-based Mac OS X, and have achieved very good performance on numerically intensive, parallel plasma particle-in-cell simulations. Unlike other Unix-based clusters, no special expertise in operating systems is required to build and run the cluster. This enables us to move parallel computing from the realm of experts to the mainstream of computing.
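
    The style of computation such a cluster supports can be suggested with a small message-passing sketch. Assuming mpi4py and an MPI runtime are available, the hypothetical fragment below advances a local slice of particles on each node and combines a diagnostic across the cluster; it is far simpler than a real particle-in-cell code and is not the AppleSeed software itself.

        # Toy parallel particle push (no fields): each rank owns a slice.
        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        rng = np.random.default_rng(rank)
        x = rng.uniform(0.0, 1.0, 100_000)     # particle positions
        v = rng.normal(0.0, 0.1, 100_000)      # particle velocities

        dt = 1e-3
        for _ in range(100):
            x = (x + v * dt) % 1.0             # periodic boundary

        local_ke = 0.5 * float(np.sum(v * v))  # per-node diagnostic
        total_ke = comm.allreduce(local_ke, op=MPI.SUM)
        if rank == 0:
            print(f"kinetic energy summed over {size} nodes: {total_ke:.2f}")

    Launched with, for example, mpiexec -n 4 python push.py, the same script runs unchanged on one machine or a whole cluster, which is exactly the portability that made such clusters attractive.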

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS components are now also deployed at CERN, adding to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  5. Facilitating Preschoolers' Scientific Knowledge Construction via Computer Games Regarding Light and Shadow: The Effect of the Prediction-Observation-Explanation (POE) Strategy

    Science.gov (United States)

    Hsu, Chung-Yuan; Tsai, Chin-Chung; Liang, Jyh-Chong

    2011-01-01

    Educational researchers have suggested that computer games have a profound influence on students' motivation, knowledge construction, and learning performance, but little empirical research has targeted preschoolers. Thus, the purpose of the present study was to investigate the effects of implementing a computer game that integrates the…

  6. Computer-based communication in support of scientific and technical work. [conferences on management information systems used by scientists of NASA programs

    Science.gov (United States)

    Vallee, J.; Wilson, T.

    1976-01-01

    Results are reported of the first experiments for a computer conference management information system at the National Aeronautics and Space Administration. Between August 1975 and March 1976, two NASA projects with geographically separated participants (NASA scientists) used the PLANET computer conferencing system for portions of their work. The first project was a technology assessment of future transportation systems. The second project involved experiments with the Communication Technology Satellite. As part of this project, pre- and postlaunch operations were discussed in a computer conference. These conferences also provided the context for an analysis of the cost of computer conferencing. In particular, six cost components were identified: (1) terminal equipment, (2) communication with a network port, (3) network connection, (4) computer utilization, (5) data storage and (6) administrative overhead.
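
    The six components lend themselves to a simple additive cost model. The sketch below is hypothetical: every rate is an invented placeholder, not a figure from the NASA experiments, and it exists only to show how the components combine.

        # Toy monthly cost model for computer conferencing; all rates invented.
        def monthly_cost(connect_hours, storage_mb,
                         terminal_rental=150.0,  # (1) terminal equipment
                         port_rate=8.0,          # (2) communication with a network port, per hour
                         network_rate=5.0,       # (3) network connection, per hour
                         cpu_rate=12.0,          # (4) computer utilization, per hour
                         storage_rate=0.50,      # (5) data storage, per MB-month
                         admin_fraction=0.15):   # (6) administrative overhead
            usage = connect_hours * (port_rate + network_rate + cpu_rate)
            base = terminal_rental + usage + storage_mb * storage_rate
            return base * (1.0 + admin_fraction)

        print(round(monthly_cost(connect_hours=40, storage_mb=200), 2))

    Any real analysis would substitute measured rates for each component; the structure of the sum is the recoverable part of the study's breakdown.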

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites, such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  8. Functional requirements document for the Earth Observing System Data and Information System (EOSDIS) Scientific Computing Facilities (SCF) of the NASA/MSFC Earth Science and Applications Division, 1992

    Science.gov (United States)

    Botts, Michael E.; Phillips, Ron J.; Parker, John V.; Wright, Patrick D.

    1992-01-01

    Five scientists at MSFC/ESAD have EOS SCF investigator status. Each SCF has unique tasks which require the establishment of a computing facility dedicated to accomplishing those tasks. A SCF Working Group was established at ESAD with the charter of defining the computing requirements of the individual SCFs and recommending options for meeting these requirements. The primary goal of the working group was to determine which computing needs can be satisfied using either shared resources or separate but compatible resources, and which needs require unique individual resources. The requirements investigated included CPU-intensive vector and scalar processing, visualization, data storage, connectivity, and I/O peripherals. A review of computer industry directions and a market survey of computing hardware provided information regarding important industry standards and candidate computing platforms. It was determined that the total SCF computing requirements might be most effectively met using a hierarchy consisting of shared and individual resources. This hierarchy is composed of five major system types: (1) a supercomputer class vector processor; (2) a high-end scalar multiprocessor workstation; (3) a file server; (4) a few medium- to high-end visualization workstations; and (5) several low- to medium-range personal graphics workstations. Specific recommendations for meeting the needs of each of these types are presented.

  9. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down into three areas: the theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  10. Efficient Scheduling of Scientific Workflows with Energy Reduction Using Novel Discrete Particle Swarm Optimization and Dynamic Voltage Scaling for Computational Grids

    Directory of Open Access Journals (Sweden)

    M. Christobel

    2015-01-01

    Full Text Available Energy is one of the most significant parameters in real-world computing environments. Minimizing energy brings benefits such as reduced power consumption, lower cooling requirements for the computing processors, and a greener environment. In fact, computation time and energy are directly proportional to each other, and minimizing computation time may yield cost-effective energy consumption. Proficient scheduling of Bag-of-Tasks applications in the grid environment results in minimum computation time. In this paper, a novel discrete particle swarm optimization (DPSO) algorithm based on the particle's best position (pbDPSO) and global best position (gbDPSO) is adopted to find the global optimal solution for higher dimensions. This novel DPSO yields better schedules with minimum computation time compared to the Earliest Deadline First (EDF) and First Come First Serve (FCFS) algorithms, which comparably reduces energy. Other scheduling parameters, such as job completion ratio and lateness, are also calculated and compared with EDF and FCFS. An energy improvement of up to 28% was obtained when Makespan Conservative Energy Reduction (MCER) and Dynamic Voltage Scaling (DVS) were used in the proposed DPSO algorithm.
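
    The idea can be made concrete with a compact sketch. The code below is a generic discrete PSO for assigning tasks to machines so as to minimize makespan; it is hypothetical, much simpler than the paper's pbDPSO/gbDPSO variants, and omits the MCER and DVS energy steps entirely.

        # Generic discrete PSO for Bag-of-Tasks scheduling (illustrative only).
        import numpy as np

        rng = np.random.default_rng(1)
        n_tasks, n_machines, n_particles = 20, 4, 30
        runtime = rng.uniform(1.0, 10.0, n_tasks)   # invented task lengths

        def makespan(assign):
            loads = np.zeros(n_machines)
            np.add.at(loads, assign, runtime)       # sum work per machine
            return loads.max()

        pos = rng.integers(0, n_machines, (n_particles, n_tasks))
        vel = rng.normal(0.0, 1.0, (n_particles, n_tasks))
        pbest = pos.copy()
        pbest_val = np.array([makespan(p) for p in pos])
        gbest = pbest[pbest_val.argmin()].copy()

        for _ in range(200):
            r1, r2 = rng.random((2, n_particles, n_tasks))
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            pos = np.mod(np.rint(pos + vel), n_machines).astype(int)  # discretize
            vals = np.array([makespan(p) for p in pos])
            better = vals < pbest_val
            pbest[better], pbest_val[better] = pos[better], vals[better]
            gbest = pbest[pbest_val.argmin()].copy()

        print("best makespan found:", round(float(pbest_val.min()), 2))

    Shorter makespans keep processors busy for less time, which is the handle that techniques such as MCER and DVS then use to reduce energy.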

  11. Efficient Scheduling of Scientific Workflows with Energy Reduction Using Novel Discrete Particle Swarm Optimization and Dynamic Voltage Scaling for Computational Grids

    Science.gov (United States)

    Christobel, M.; Tamil Selvi, S.; Benedict, Shajulin

    2015-01-01

    Energy is one of the most significant parameters in real-world computing environments. Minimizing energy brings benefits such as reduced power consumption, lower cooling requirements for the computing processors, and a greener environment. In fact, computation time and energy are directly proportional to each other, and minimizing computation time may yield cost-effective energy consumption. Proficient scheduling of Bag-of-Tasks applications in the grid environment results in minimum computation time. In this paper, a novel discrete particle swarm optimization (DPSO) algorithm based on the particle's best position (pbDPSO) and global best position (gbDPSO) is adopted to find the global optimal solution for higher dimensions. This novel DPSO yields better schedules with minimum computation time compared to the Earliest Deadline First (EDF) and First Come First Serve (FCFS) algorithms, which comparably reduces energy. Other scheduling parameters, such as job completion ratio and lateness, are also calculated and compared with EDF and FCFS. An energy improvement of up to 28% was obtained when Makespan Conservative Energy Reduction (MCER) and Dynamic Voltage Scaling (DVS) were used in the proposed DPSO algorithm. PMID:26075296

  12. Construction of Blaze at the University of Illinois at Chicago: A Shared, High-Performance, Visual Computer for Next-Generation Cyberinfrastructure-Accelerated Scientific, Engineering, Medical and Public Policy Research

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Maxine D. [Acting Director, EVL; Leigh, Jason [PI

    2014-02-17

    The Blaze high-performance visual computing system serves the high-performance computing research and education needs of the University of Illinois at Chicago (UIC). Blaze consists of a state-of-the-art, networked computer cluster and an ultra-high-resolution visualization system called CAVE2(TM) that is not currently available anywhere else in Illinois. This system is connected via a high-speed 100-Gigabit network to the State of Illinois' I-WIRE optical network, as well as to national and international high-speed networks, such as Internet2 and the Global Lambda Integrated Facility. This enables Blaze to serve as an on-ramp to national cyberinfrastructure, such as the National Science Foundation’s Blue Waters petascale computer at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign and the Department of Energy’s Argonne Leadership Computing Facility (ALCF) at Argonne National Laboratory. DOE award # DE-SC005067, leveraged with NSF award #CNS-0959053 for “Development of the Next-Generation CAVE Virtual Environment (NG-CAVE),” enabled us to create a first-of-its-kind high-performance visual computing system. The UIC Electronic Visualization Laboratory (EVL) worked with two U.S. companies to advance their commercial products and maintain U.S. leadership in the global information technology economy. New applications are being enabled with the CAVE2/Blaze visual computing system that are advancing scientific research and education in the U.S. and globally, and helping to train the next-generation workforce.

  13. Numerical Recipes in C++: The Art of Scientific Computing (2nd edn). Numerical Recipes Example Book (C++) (2nd edn). Numerical Recipes Multi-Language Code CD ROM with LINUX or UNIX Single-Screen License Revised Version

    International Nuclear Information System (INIS)

    Borcherds, P

    2003-01-01

    The two Numerical Recipes books are marvellous. The principal book, The Art of Scientific Computing, contains program listings for almost every conceivable requirement, and it also contains a well written discussion of the algorithms and the numerical methods involved. The Example Book provides a complete driving program, with helpful notes, for nearly all the routines in the principal book. The first edition of Numerical Recipes: The Art of Scientific Computing was published in 1986 in two versions, one with programs in Fortran, the other with programs in Pascal. There were subsequent versions with programs in BASIC and in C. The second, enlarged edition was published in 1992, again in two versions, one with programs in Fortran (NR(F)), the other with programs in C (NR(C)). In 1996 the authors produced Numerical Recipes in Fortran 90: The Art of Parallel Scientific Computing as a supplement, called Volume 2, with the original (Fortran) version referred to as Volume 1. Numerical Recipes in C++ (NR(C++)) is another version of the 1992 edition. The numerical recipes are also available on a CD ROM: if you want to use any of the recipes, I would strongly advise you to buy the CD ROM. The CD ROM contains the programs in all the languages. When the first edition was published I bought it, and have also bought copies of the other editions as they have appeared. Anyone involved in scientific computing ought to have a copy of at least one version of Numerical Recipes, and there also ought to be copies in every library. If you already have NR(F), should you buy the NR(C++) and, if not, which version should you buy? In the preface to Volume 2 of NR(F), the authors say 'C and C++ programmers have not been far from our minds as we have written this volume, and we think that you will find that time spent in absorbing its principal lessons will be amply repaid in the future as C and C++ eventually develop standard parallel extensions'. In the preface and introduction to NR

  14. Grid Computing

    Indian Academy of Sciences (India)

    IAS Admin

    The emergence of supercomputers led to the use of computer simulation as an ... Scientific and engineering applications (e.g., TeraGrid secure gateway). Collaborative ... Encryption, privacy, protection from malicious software. Physical layer.

  15. Blueprint and First Experiences Bridging Hardware Virtualization and Global Grids for Advanced Scientific Computing: Designing and Building a Global Edge Services Framework (ESF) for OSG, EGEE, and LCG

    CERN Document Server

    Rana, A S; Vaniachine, A; Wurthwein, F; Foster, I; Sotomayor, B; Freeman, T

    2006-01-01

    We report on first experiences with building and operating an edge services framework (ESF) based on Xen virtual machines instantiated via the workspace service in Globus toolkit, and developed as a joint project between EGEE, LCG, and OSG. Many computing facilities are architected with their compute and storage clusters behind firewalls. Edge services (ES) are instantiated on a small set of gateways to provide access to these clusters via standard grid interfaces. Experience on EGEE, LCG, and OSG has shown that at least two issues are of critical importance when designing an infrastructure in support of ES. The first concerns ES configuration. It is impractical to assume that each virtual organization (VO) using a facility will employ the same ES configuration, or that different configurations will coexist easily. Even within a VO, it should be possible to run different versions of the same ES simultaneously. The second issue concerns resource allocation: it is essential that an ESF be able to effectively gu...

  16. High Energy Physics Exascale Requirements Review. An Office of Science review sponsored jointly by Advanced Scientific Computing Research and High Energy Physics, June 10-12, 2015, Bethesda, Maryland

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Argonne National Lab. (ANL), Argonne, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Gerber, Richard [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Antypas, Katie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dart, Eli [Esnet, Berkeley, CA (United States); Dosanjh, Sudip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hack, James [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Monga, Inder [Esnet, Berkeley, CA (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Riley, Katherine [Argonne National Lab. (ANL), Argonne, IL (United States); Rotman, Lauren [Esnet, Berkeley, CA (United States); Straatsma, Tjerk [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wells, Jack [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Williams, Tim [Argonne National Lab. (ANL), Argonne, IL (United States); Almgren, A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Amundson, J. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Bailey, Stephen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bard, Deborah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bloom, Ken [Univ. of Nebraska, Lincoln, NE (United States); Bockelman, Brian [Univ. of Nebraska, Lincoln, NE (United States); Borgland, Anders [SLAC National Accelerator Lab., Menlo Park, CA (United States); Borrill, Julian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Boughezal, Radja [Argonne National Lab. (ANL), Argonne, IL (United States); Brower, Richard [Boston Univ., MA (United States); Cowan, Benjamin [SLAC National Accelerator Lab., Menlo Park, CA (United States); Finkel, Hal [Argonne National Lab. (ANL), Argonne, IL (United States); Frontiere, Nicholas [Argonne National Lab. (ANL), Argonne, IL (United States); Fuess, Stuart [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Ge, Lixin [SLAC National Accelerator Lab., Menlo Park, CA (United States); Gnedin, Nick [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Gottlieb, Steven [Indiana Univ., Bloomington, IN (United States); Gutsche, Oliver [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Han, T. [Indiana Univ., Bloomington, IN (United States); Heitmann, Katrin [Argonne National Lab. (ANL), Argonne, IL (United States); Hoeche, Stefan [SLAC National Accelerator Lab., Menlo Park, CA (United States); Ko, Kwok [SLAC National Accelerator Lab., Menlo Park, CA (United States); Kononenko, Oleksiy [SLAC National Accelerator Lab., Menlo Park, CA (United States); LeCompte, Thomas [Argonne National Lab. (ANL), Argonne, IL (United States); Li, Zheng [SLAC National Accelerator Lab., Menlo Park, CA (United States); Lukic, Zarija [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Mori, Warren [Univ. of California, Los Angeles, CA (United States); Ng, Cho-Kuen [SLAC National Accelerator Lab., Menlo Park, CA (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oleynik, Gene [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); O’Shea, Brian [Michigan State Univ., East Lansing, MI (United States); Padmanabhan, Nikhil [Yale Univ., New Haven, CT (United States); Petravick, Donald [Univ. of Illinois, Urbana, IL (United States). 
National Center for Supercomputing Applications; Petriello, Frank J. [Argonne National Lab. (ANL), Argonne, IL (United States); Pope, Adrian [Argonne National Lab. (ANL), Argonne, IL (United States); Power, John [Argonne National Lab. (ANL), Argonne, IL (United States); Qiang, Ji [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Reina, Laura [Florida State Univ., Tallahassee, FL (United States); Rizzo, Thomas Gerard [SLAC National Accelerator Lab., Menlo Park, CA (United States); Ryne, Robert [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Schram, Malachi [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Spentzouris, P. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Toussaint, Doug [Univ. of Arizona, Tucson, AZ (United States); Vay, Jean Luc [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Viren, B. [Brookhaven National Lab. (BNL), Upton, NY (United States); Wuerthwein, Frank [Univ. of California, San Diego, CA (United States); Xiao, Liling [SLAC National Accelerator Lab., Menlo Park, CA (United States); Coffey, Richard [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-11-29

    The U.S. Department of Energy (DOE) Office of Science (SC) Offices of High Energy Physics (HEP) and Advanced Scientific Computing Research (ASCR) convened a programmatic Exascale Requirements Review on June 10–12, 2015, in Bethesda, Maryland. This report summarizes the findings, results, and recommendations derived from that meeting. The high-level findings and observations are as follows. Larger, more capable computing and data facilities are needed to support HEP science goals in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of the demand at the 2025 timescale is at least two orders of magnitude — and in some cases greater — than that available currently. The growth rate of data produced by simulations is overwhelming the current ability of both facilities and researchers to store and analyze it. Additional resources and new techniques for data analysis are urgently needed. Data rates and volumes from experimental facilities are also straining the current HEP infrastructure in its ability to store and analyze large and complex data volumes. Appropriately configured leadership-class facilities can play a transformational role in enabling scientific discovery from these datasets. A close integration of high-performance computing (HPC) simulation and data analysis will greatly aid in interpreting the results of HEP experiments. Such an integration will minimize data movement and facilitate interdependent workflows. Long-range planning between HEP and ASCR will be required to meet HEP’s research needs. To best use ASCR HPC resources, the experimental HEP program needs (1) an established, long-term plan for access to ASCR computational and data resources, (2) the ability to map workflows to HPC resources, (3) the ability for ASCR facilities to accommodate workflows run by collaborations potentially comprising thousands of individual members, (4) to transition codes to the next-generation HPC platforms that will be available at ASCR

  17. Basic Energy Sciences Exascale Requirements Review. An Office of Science review sponsored jointly by Advanced Scientific Computing Research and Basic Energy Sciences, November 3-5, 2015, Rockville, Maryland

    Energy Technology Data Exchange (ETDEWEB)

    Windus, Theresa [Ames Lab., Ames, IA (United States); Banda, Michael [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Devereaux, Thomas [SLAC National Accelerator Lab., Menlo Park, CA (United States); White, Julia C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Antypas, Katie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Coffey, Richard [Argonne National Lab. (ANL), Argonne, IL (United States); Dart, Eli [Energy Sciences Network (ESNet), Berkeley, CA (United States); Dosanjh, Sudip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gerber, Richard [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hack, James [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Monga, Inder [Energy Sciences Network (ESNet), Berkeley, CA (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Riley, Katherine [Argonne National Lab. (ANL), Argonne, IL (United States); Rotman, Lauren [Energy Sciences Network (ESNet), Berkeley, CA (United States); Straatsma, Tjerk [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wells, Jack [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Baruah, Tunna [Univ. of Texas, El Paso, TX (United States); Benali, Anouar [Argonne National Lab. (ANL), Argonne, IL (United States); Borland, Michael [Argonne National Lab. (ANL), Argonne, IL (United States); Brabec, Jiri [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Carter, Emily [Princeton Univ., NJ (United States); Ceperley, David [Univ. of Illinois, Urbana-Champaign, IL (United States); Chan, Maria [Argonne National Lab. (ANL), Argonne, IL (United States); Chelikowsky, James [Univ. of Texas, Austin, TX (United States); Chen, Jackie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cheng, Hai-Ping [Univ. of Florida, Gainesville, FL (United States); Clark, Aurora [Washington State Univ., Pullman, WA (United States); Darancet, Pierre [Argonne National Lab. (ANL), Argonne, IL (United States); DeJong, Wibe [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Deslippe, Jack [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Dixon, David [Univ. of Alabama, Tuscaloosa, AL (United States); Donatelli, Jeffrey [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dunning, Thomas [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fernandez-Serra, Marivi [Stony Brook Univ., NY (United States); Freericks, James [Georgetown Univ., Washington, DC (United States); Gagliardi, Laura [Univ. of Minnesota, Minneapolis, MN (United States); Galli, Giulia [Univ. of Chicago, IL (United States); Garrett, Bruce [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Glezakou, Vassiliki-Alexandra [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gordon, Mark [Iowa State Univ., Ames, IA (United States); Govind, Niri [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gray, Stephen [Argonne National Lab. (ANL), Argonne, IL (United States); Gull, Emanuel [Univ. of Michigan, Ann Arbor, MI (United States); Gygi, Francois [Univ. of California, Davis, CA (United States); Hexemer, Alexander [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Isborn, Christine [Univ. 
of California, Merced, CA (United States); Jarrell, Mark [Louisiana State Univ., Baton Rouge, LA (United States); Kalia, Rajiv K. [Univ. of Southern California, Los Angeles, CA (United States); Kent, Paul [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Klippenstein, Stephen [Argonne National Lab. (ANL), Argonne, IL (United States); Kowalski, Karol [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Krishnamurthy, Hulikal [Indian Inst. of Science, Bangalore (India); Kumar, Dinesh [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lena, Charles [Univ. of Texas, Austin, TX (United States); Li, Xiaosong [Univ. of Washington, Seattle, WA (United States); Maier, Thomas [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Markland, Thomas [Stanford Univ., CA (United States); McNulty, Ian [Argonne National Lab. (ANL), Argonne, IL (United States); Millis, Andrew [Columbia Univ., New York, NY (United States); Mundy, Chris [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Nakano, Aiichiro [Univ. of Southern California, Los Angeles, CA (United States); Niklasson, A.M.N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Panagiotopoulos, Thanos [Princeton Univ., NJ (United States); Pandolfi, Ron [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Parkinson, Dula [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Pask, John [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Perazzo, Amedeo [SLAC National Accelerator Lab., Menlo Park, CA (United States); Rehr, John [Univ. of Washington, Seattle, WA (United States); Rousseau, Roger [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sankaranarayanan, Subramanian [Argonne National Lab. (ANL), Argonne, IL (United States); Schenter, Greg [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Selloni, Annabella [Princeton Univ., NJ (United States); Sethian, Jamie [Univ. of California, Berkeley, CA (United States); Siepmann, Ilja [Univ. of Minnesota, Minneapolis, MN (United States); Slipchenko, Lyudmila [Purdue Univ., West Lafayette, IN (United States); Sternberg, Michael [Argonne National Lab. (ANL), Argonne, IL (United States); Stevens, Mark [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Summers, Michael [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sumpter, Bobby [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sushko, Peter [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Thayer, Jana [SLAC National Accelerator Lab., Menlo Park, CA (United States); Toby, Brian [Argonne National Lab. (ANL), Argonne, IL (United States); Tull, Craig [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Valeev, Edward [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Vashishta, Priya [Univ. of Southern California, Los Angeles, CA (United States); Venkatakrishnan, V. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yang, C. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yang, Ping [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Zwart, Peter H. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-02-03

    Computers have revolutionized every aspect of our lives. Yet in science, the most tantalizing applications of computing lie just beyond our reach. The current quest to build an exascale computer with one thousand times the capability of today’s fastest machines (and more than a million times that of a laptop) will take researchers over the next horizon. The field of materials, chemical reactions, and compounds is inherently complex. Imagine millions of new materials with new functionalities waiting to be discovered — while researchers also seek to extend those materials that are known to a dizzying number of new forms. We could translate massive amounts of data from high precision experiments into new understanding through data mining and analysis. We could have at our disposal the ability to predict the properties of these materials, to follow their transformations during reactions on an atom-by-atom basis, and to discover completely new chemical pathways or physical states of matter. Extending these predictions from the nanoscale to the mesoscale, from the ultrafast world of reactions to long-time simulations to predict the lifetime performance of materials, and to the discovery of new materials and processes will have a profound impact on energy technology. In addition, discovery of new materials is vital to move computing beyond Moore’s law. To realize this vision, more than hardware is needed. New algorithms to take advantage of the increase in computing power, new programming paradigms, and new ways of mining massive data sets are needed as well. This report summarizes the opportunities and the requisite computing ecosystem needed to realize the potential before us. In addition to pursuing new and more complete physical models and theoretical frameworks, this review found that the following broadly grouped areas relevant to the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR) would directly affect the Basic Energy

  18. Scientific activities 1979

    International Nuclear Information System (INIS)

    1981-01-01

    The scientific activities and achievements of the Nuclear Research Center Democritus for the year 1979 are presented in the form of a list of 78 projects giving the title, objectives, commencement year, person responsible for each project, activities carried out, and the pertaining lists of publications. The 15 chapters of this work cover the activities of the main Divisions of the Democritus NRC: Electronics, Biology, Physics, Chemistry, Health Physics, Reactor, Radioisotopes, Environmental Radioactivity, Soil Science, Computer Center, Uranium Exploration, Medical Service, Technological Applications and Training. (T.A.)

  19. Scientific visualization and radiology

    International Nuclear Information System (INIS)

    Lawrance, D.P.; Hoyer, C.E.; Wrestler, F.A.; Kuhn, M.J.; Moore, W.D.; Anderson, D.R.

    1989-01-01

    Scientific visualization is the visual presentation of numerical data. The National Center for Supercomputing Applications (NCSA) has developed methods for visualizing computer-based simulations of digital imaging data. The applicability of these various tools for unique and potentially medically beneficial display of MR images is investigated. Raw data are obtained from MR images of the brain, neck, spine, and brachial plexus acquired on a 1.5-T imager with multiple pulse sequences. A supercomputer and other mainframe resources run a variety of graphic and imaging programs using these data. An interdisciplinary team of imaging scientists, computer graphics programmers, and physicians works together to achieve useful information

  20. Coordinated Use of Heterogeneous Infrastructures for Scientific Computing at CIEMAT by means of Grid Technologies; Aprovechamiento Coordinado de las Infraestructuras Heterogeneas para Calculo Cientifico Participadas por el CIEMAT por medio de Tecnologias Grid

    Energy Technology Data Exchange (ETDEWEB)

    Rubio-Montero, A. J.

    2008-08-06

    Usually, research data centres maintain platforms from a wide range of architectures to cover the computational needs of their scientists. These centres are also frequently involved in diverse national and international Grid projects. However, it is very difficult to achieve complete and efficient utilization of these resources, due to the heterogeneity of their hardware and software configurations and their uneven use over time. This report offers a solution to the problem of enabling simultaneous and coordinated access to the variety of computing infrastructures and platforms available in large research organizations such as CIEMAT. For this purpose, new Grid technologies have been deployed to provide a common interface through which the final user can access internal and external resources. The pre-existing computing infrastructure has not been modified, and the independence of its administration has been guaranteed. For the sake of comparison, a feasibility study has been performed with the execution of the Drift Kinetic Equation solver (Dikes) tool, a high-throughput scientific application used in the TJ-II flexible heliac at the National Fusion Laboratory. (Author) 35 refs.

  1. Accelerating the scientific exploration process with scientific workflows

    International Nuclear Information System (INIS)

    Altintas, Ilkay; Barney, Oscar; Cheng, Zhengang; Critchlow, Terence; Ludaescher, Bertram; Parker, Steve; Shoshani, Arie; Vouk, Mladen

    2006-01-01

    Although an increasing amount of middleware has emerged in the last few years to achieve remote data access, distributed job execution, and data management, orchestrating these technologies with minimal overhead still remains a difficult task for scientists. Scientific workflow systems improve this situation by creating interfaces to a variety of technologies and automating the execution and monitoring of the workflows. Workflow systems provide domain-independent, customizable interfaces that combine different tools and technologies along with efficient methods for using them. As simulations and experiments move into the petascale regime, the orchestration of long-running, data- and compute-intensive tasks is becoming a major requirement for the successful steering and completion of scientific investigations. A scientific workflow is the process of combining data and processes into a configurable, structured set of steps that implement semi-automated computational solutions of a scientific problem. Kepler is a cross-project collaboration, co-founded by the SciDAC Scientific Data Management (SDM) Center, whose purpose is to develop a domain-independent scientific workflow system. It provides a workflow environment in which scientists design and execute scientific workflows by specifying the desired sequence of computational actions and the appropriate data flow, including required data transformations, between these steps. Currently deployed workflows range from local analytical pipelines to distributed, high-performance and high-throughput applications, which can be both data- and compute-intensive. The scientific workflow approach offers a number of advantages over traditional scripting-based approaches, including ease of configuration, improved reusability and maintenance of workflows and components (called actors), automated provenance management, 'smart' re-running of different versions of workflow instances, on-the-fly updateable parameters, monitoring
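
    A scientific workflow of the kind described above chains computational actions with explicit data flow. The Python sketch below is only a schematic illustration of that idea, not Kepler's actual actor API; the three step functions and the sample parameters are invented for the example.

    ```python
    # Minimal sketch of a workflow as an ordered chain of "actors",
    # each consuming the previous actor's output (not Kepler's API).

    def fetch(params):
        # stand-in for a remote data-access step
        return list(range(params["n"]))

    def transform(data):
        # stand-in for a data-transformation step
        return [x * x for x in data]

    def analyze(data):
        # stand-in for a compute-intensive analysis step
        return sum(data) / len(data)

    def run_workflow(actors, params):
        """Execute actors in sequence, recording provenance per step."""
        provenance, result = [], params
        for actor in actors:
            result = actor(result)
            provenance.append((actor.__name__, repr(result)[:40]))
        return result, provenance

    result, log = run_workflow([fetch, transform, analyze], {"n": 5})
    print(result)          # 6.0
    for step in log:       # the provenance record each real system keeps
        print(*step)
    ```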

  2. Processing of English scientific articles on an IBM 1401 computer for linguistic and statistic purposes; Traitement sur ordinateur IBM 1401 de textes scientifiques anglais en vue d'etudes linguistiques et statistiques

    Energy Technology Data Exchange (ETDEWEB)

    Vandeputte, N.

    1961-07-01

    The author reports how he addressed the processing of a large set of scientific articles written in English and stored on a punched tape. The objective is to build up an alphabetically ordered list of occurrence of a set of pre-defined words, and also to build up statistics on the presence of these words within the concerned set of articles. He presents how data are organised on the storage support, comments how data are read by the tape reading device and thus how records are processed. He indicates which characters are ignored, or converted or detected. He indicates the different processing steps: tape reading, data storage, alphabetic sorting, outputs, calculation of component frequency and frequency sorting. He also discusses how tape punching errors are handled. He proposes a brief quantitative assessment of the article set processing, notably with a reference to computing time.
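
    The pipeline described above (read, filter characters, count pre-defined words, then sort alphabetically and by frequency) is easy to express in a modern language. A minimal Python sketch, assuming a plain text file stands in for the punched tape and using an invented target word list:

    ```python
    import re
    from collections import Counter

    TARGET_WORDS = {"neutron", "reactor", "energy"}   # illustrative list

    def count_occurrences(path):
        with open(path, encoding="utf-8") as f:
            # ignore punctuation and case, much as the tape processing
            # ignored or converted certain characters
            tokens = re.findall(r"[a-z]+", f.read().lower())
        return Counter(t for t in tokens if t in TARGET_WORDS)

    counts = count_occurrences("articles.txt")        # hypothetical file
    for word in sorted(counts):                       # alphabetic order
        print(word, counts[word])
    for word, n in counts.most_common():              # frequency order
        print(word, n)
    ```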

  3. Computational Testing for Automated Preprocessing 2: Practical Demonstration of a System for Scientific Data-Processing Workflow Management for High-Volume EEG.

    Science.gov (United States)

    Cowley, Benjamin U; Korpela, Jussi

    2018-01-01

    Existing tools for the preprocessing of EEG data provide a large choice of methods to suitably prepare and analyse a given dataset. Yet it remains a challenge for the average user to integrate methods for batch processing of the increasingly large datasets of modern research, and compare methods to choose an optimal approach across the many possible parameter configurations. Additionally, many tools still require a high degree of manual decision making for, e.g., the classification of artifacts in channels, epochs or segments. This introduces extra subjectivity, is slow, and is not reproducible. Batching and well-designed automation can help to regularize EEG preprocessing, and thus reduce human effort, subjectivity, and consequent error. The Computational Testing for Automated Preprocessing (CTAP) toolbox facilitates: (i) batch processing that is easy for experts and novices alike; (ii) testing and comparison of preprocessing methods. Here we demonstrate the application of CTAP to high-resolution EEG data in three modes of use. First, a linear processing pipeline with mostly default parameters illustrates ease-of-use for naive users. Second, a branching pipeline illustrates CTAP's support for comparison of competing methods. Third, a pipeline with built-in parameter-sweeping illustrates CTAP's capability to support data-driven method parameterization. CTAP extends the existing functions and data structure from the well-known EEGLAB toolbox, based on Matlab, and produces extensive quality control outputs. CTAP is available under MIT open-source licence from https://github.com/bwrc/ctap.
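
    CTAP itself extends EEGLAB in Matlab; the Python sketch below only illustrates the third mode of use, a parameter sweep scored by a quality metric. The toy filter, the metric, and the sweep values are assumptions for illustration, not CTAP functions.

    ```python
    import numpy as np

    def highpass(signal, alpha):
        # toy one-pole high-pass filter standing in for a real EEG method
        out = np.zeros_like(signal)
        for i in range(1, len(signal)):
            out[i] = alpha * (out[i - 1] + signal[i] - signal[i - 1])
        return out

    def quality(signal):
        # toy metric: prefer output with smaller low-frequency drift
        return -abs(signal.mean())

    rng = np.random.default_rng(0)
    eeg = np.cumsum(rng.normal(size=1000))   # drifting synthetic channel

    # data-driven parameterization: keep the best-scoring sweep value
    best = max((quality(highpass(eeg, a)), a) for a in (0.90, 0.95, 0.99))
    print("best alpha:", best[1])
    ```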

  4. Manual on JSSL (JAERI scientific subroutine library)

    International Nuclear Information System (INIS)

    Fujimura, Toichiro; Nishida, Takahiko; Asai, Kiyoshi

    1977-05-01

    A manual on the revised JAERI scientific subroutine library is presented. The library is a collection of subroutines developed or modified in JAERI which complements the library installed on the FACOM 230-75 computer. It is subject to further extension in the future, since the present one is still insufficient for scientific calculations. (auth.)

  5. Scientific Library Offers New Training Options | Poster

    Science.gov (United States)

    The Scientific Library is expanding its current training opportunities by offering webinars, allowing employees to take advantage of trainings from the comfort of their own offices. Due to the nature of their work, some employees find it inconvenient to attend in-person training classes; others simply prefer to use their own computers. The Scientific Library has been

  6. Symbolic/Numeric Approaches to Scientific Computing

    DEFF Research Database (Denmark)

    Sinclair, Robert

    1998-01-01

    A course held at the University of Basel (CH) in the winter semester 1997/98. The coupling of Maple with Fortran 77 code is discussed in detail, with many mathematical applications.
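
    Symbolic-numeric coupling of the kind taught in the course can be approximated today with SymPy's code generator: a result is derived symbolically (Maple's role), then emitted as compilable Fortran. A minimal sketch; the function name f and the integrand are arbitrary choices for illustration:

    ```python
    from sympy import symbols, integrate
    from sympy.utilities.codegen import codegen

    x = symbols("x")
    expr = integrate(x**2 * (1 - x), x)    # symbolic step: x**3/3 - x**4/4

    # emit Fortran 95 source (plus an interface file) for the result
    for filename, contents in codegen(("f", expr), "F95"):
        print("!", filename)
        print(contents)
    ```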

  7. Scientific instruments, scientific progress and the cyclotron

    International Nuclear Information System (INIS)

    Baird, David; Faust, Thomas

    1990-01-01

    Philosophers speak of science in terms of theory and experiment, yet when they speak of the progress of scientific knowledge they speak in terms of theory alone. In this article it is claimed that scientific knowledge consists of, among other things, scientific instruments and instrumental techniques, and not simply of some kind of justified beliefs. It is argued that one aspect of scientific progress can be characterized relatively straightforwardly: the accumulation of new scientific instruments. The development of the cyclotron is taken to illustrate this point. Eight different activities which promoted the successful completion of the cyclotron are recognised. The emphasis is on the machine rather than the experiments that could be run on it, and on how the cyclotron came into being rather than on how it was subsequently used. The completed instrument is seen as a useful unit of scientific progress in its own right. (UK)

  8. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  9. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C³P), a five-year project that focused on answering the question: can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  10. Load Balancing Scientific Applications

    Energy Technology Data Exchange (ETDEWEB)

    Pearce, Olga Tkachyshyn [Texas A & M Univ., College Station, TX (United States)

    2014-12-01

    The largest supercomputers have millions of independent processors, and concurrency levels are rapidly increasing. For ideal efficiency, developers of the simulations that run on these machines must ensure that computational work is evenly balanced among processors. Assigning work evenly is challenging because many large modern parallel codes simulate behavior of physical systems that evolve over time, and their workloads change over time. Furthermore, the cost of imbalanced load increases with scale because most large-scale scientific simulations today use a Single Program Multiple Data (SPMD) parallel programming model, and an increasing number of processors will wait for the slowest one at the synchronization points. To address load imbalance, many large-scale parallel applications use dynamic load balance algorithms to redistribute work evenly. The research objective of this dissertation is to develop methods to decide when and how to load balance the application, and to balance it effectively and affordably. We measure and evaluate the computational load of the application, and develop strategies to decide when and how to correct the imbalance. Depending on the simulation, a fast, local load balance algorithm may be suitable, or a more sophisticated and expensive algorithm may be required. We developed a model for comparison of load balance algorithms for a specific state of the simulation that enables the selection of a balancing algorithm that will minimize overall runtime.
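
    As a toy version of such a "when to balance" decision, one can weigh the waiting time that the current imbalance will cost over the coming steps against the one-time cost of rebalancing. The sketch below is only a schematic illustration under assumed numbers, not the dissertation's model:

    ```python
    # In an SPMD code every processor waits for the slowest one at each
    # synchronization point, so the per-step penalty is max - mean load.

    def imbalance_penalty(loads, steps):
        return (max(loads) - sum(loads) / len(loads)) * steps

    def should_rebalance(loads, steps, rebalance_cost):
        # rebalance when projected waiting time exceeds the fix's cost
        return imbalance_penalty(loads, steps) > rebalance_cost

    loads = [1.0, 1.1, 1.0, 2.4]   # assumed per-processor work per step (s)
    print(should_rebalance(loads, steps=100, rebalance_cost=5.0))  # True
    ```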

  11. 30 January 2012 - Danish National Research Foundation Chairman of board K. Bock and University of Copenhagen Rector R. Hemmingsen visiting ATLAS underground experimental area, CERN Control Centre and ALICE underground experimental area, throughout accompanied by J. Dines Hansen and B. Svane Nielsen; signing the guest book with CERN Director for Research and Scientific Computing S. Bertolucci and Head of International Relations F. Pauss.

    CERN Document Server

    Jean-Claude Gadmer

    2012-01-01

    30 January 2012 - Danish National Research Foundation Chairman of board K. Bock and University of Copenhagen Rector R. Hemmingsen visiting ATLAS underground experimental area, CERN Control Centre and ALICE underground experimental area, throughout accompanied by J. Dines Hansen and B. Svane Nielsen; signing the guest book with CERN Director for Research and Scientific Computing S. Bertolucci and Head of International Relations F. Pauss.

  12. 28th February 2011 - Turkish Minister of Foreign Affairs A. Davutoğlu signing the guest book with CERN Director for Research and Scientific Computing S. Bertolucci and Head of International Relations F. Pauss; meeting the CERN Turkish Community at Point 1; visiting the ATLAS control room with Former Collaboration Spokesperson P. Jenni.

    CERN Document Server

    Maximilien Brice

    2011-01-01

    28th February 2011 - Turkish Minister of Foreign Affairs A. Davutoğlu signing the guest book with CERN Director for Research and Scientific Computing S. Bertolucci and Head of International Relations F. Pauss; meeting the CERN Turkish Community at Point 1; visiting the ATLAS control room with Former Collaboration Spokesperson P. Jenni.

  13. 10 March 2008 - Swedish Minister for Higher Education and Research L. Leijonborg signing the guest book with CERN Chief Scientific Officer J. Engelen, followed by the signature of the Swedish Computing Memorandum of Understanding by the Director General of the Swedish Research Council P. Ömling.

    CERN Multimedia

    Maximilien Brice

    2008-01-01

    10 March 2008 - Swedish Minister for Higher Education and Research L. Leijonborg signing the guest book with CERN Chief Scientific Officer J. Engelen, followed by the signature of the Swedish Computing Memorandum of Understanding by the Director General of the Swedish Research Council P. Ömling.

  14. 11 July 2011 - Carleton University Ottawa, Canada Vice President (Research and International) K. Matheson in the ATLAS visitor centre with Collaboration Spokesperson F. Gianotti, accompanied by Adviser J. Ellis and signing the guest book with CERN Director for Research and Scientific Computing S. Bertolucci.

    CERN Multimedia

    Jean-Claude Gadmer

    2011-01-01

    11 July 2011 - Carleton University Ottawa, Canada Vice President (Research and International) K. Matheson in the ATLAS visitor centre with Collaboration Spokesperson F. Gianotti, accompanied by Adviser J. Ellis and signing the guest book with CERN Director for Research and Scientific Computing S. Bertolucci.

  15. SALTON SEA SCIENTIFIC DRILLING PROJECT: SCIENTIFIC PROGRAM.

    Science.gov (United States)

    Sass, J.H.; Elders, W.A.

    1986-01-01

    The Salton Sea Scientific Drilling Project was spudded on 24 October 1985, and reached a total depth of 10,564 ft (3.2 km) on 17 March 1986. There followed a period of logging, a flow test, and downhole scientific measurements. The scientific goals were integrated smoothly with the engineering and economic objectives of the program, and the ideal of 'science driving the drill' in continental scientific drilling projects was achieved in large measure. The principal scientific goals of the project were to study the physical and chemical processes involved in an active, magmatically driven hydrothermal system. To facilitate these studies, high priority was attached to four areas of sample and data collection, namely: (1) core and cuttings, (2) formation fluids, (3) geophysical logging, and (4) downhole physical measurements, particularly temperatures and pressures.

  16. Biomedical ontologies: toward scientific debate.

    Science.gov (United States)

    Maojo, V; Crespo, J; García-Remesal, M; de la Iglesia, D; Perez-Rey, D; Kulikowski, C

    2011-01-01

    Biomedical ontologies have been very successful in structuring knowledge for many different applications, receiving widespread praise for their utility and potential. Yet, the role of computational ontologies in scientific research, as opposed to knowledge management applications, has not been extensively discussed. We aim to stimulate further discussion on the advantages and challenges presented by biomedical ontologies from a scientific perspective. We review various aspects of biomedical ontologies going beyond their practical successes, and focus on some key scientific questions in two ways. First, we analyze and discuss current approaches to improve biomedical ontologies that are based largely on classical, Aristotelian ontological models of reality. Second, we raise various open questions about biomedical ontologies that require further research, analyzing in more detail those related to visual reasoning and spatial ontologies. We outline significant scientific issues that biomedical ontologies should consider, beyond current efforts of building practical consensus between them. For spatial ontologies, we suggest an approach for building "morphospatial" taxonomies, as an example that could stimulate research on fundamental open issues for biomedical ontologies. Analysis of a large number of problems with biomedical ontologies suggests that the field is very much open to alternative interpretations of current work, and in need of scientific debate and discussion that can lead to new ideas and research directions.

  17. Computability and unsolvability

    CERN Document Server

    Davis, Martin

    1985-01-01

    ""A clearly written, well-presented survey of an intriguing subject."" - Scientific American. Classic text considers general theory of computability, computable functions, operations on computable functions, Turing machines self-applied, unsolvable decision problems, applications of general theory, mathematical logic, Kleene hierarchy, computable functionals, classification of unsolvable decision problems and more.

  18. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility, and right to a place in Russian education. Methods. The research is based on an analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education, and on a comparison of the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved along with the development of computer hardware and software. The practice-oriented interpretation of computational thinking which is dominant among educators is described, along with some ways of its formation. It is shown that computational thinking is a metasubject result of general education as well as its tool. From the point of view of the author, purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described, a process connected with the evolution of computer and information technologies and with the increasing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  19. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  20. Scientific integrity in Brazil.

    Science.gov (United States)

    Lins, Liliane; Carvalho, Fernando Martins

    2014-09-01

    This article focuses on scientific integrity and the identification of predisposing factors to scientific misconduct in Brazil. Brazilian scientific production has increased in the last ten years, but the quality of the articles has decreased. Pressure on researchers and students for increasing scientific production may contribute to scientific misconduct. Cases of misconduct in science have been recently denounced in the country. Brazil has important institutions for controlling ethical and safety aspects of human research, but there is a lack of specific offices to investigate suspected cases of misconduct and policies to deal with scientific dishonesty.

  1. The Scientific Enterprise

    Indian Academy of Sciences (India)

    The Scientific Enterprise - Assumptions, Problems, and Goals in the Modern Scientific Framework. V V Raman. Reflections, Volume 13, Issue 9, September 2008, pp. 885-894 ...

  2. Extensional scientific realism vs. intensional scientific realism.

    Science.gov (United States)

    Park, Seungbae

    2016-10-01

    Extensional scientific realism is the view that each believable scientific theory is supported by the unique first-order evidence for it and that if we want to believe that it is true, we should rely on its unique first-order evidence. In contrast, intensional scientific realism is the view that all believable scientific theories have a common feature and that we should rely on it to determine whether a theory is believable or not. Fitzpatrick argues that extensional realism is immune, while intensional realism is not, to the pessimistic induction. I reply that if extensional realism overcomes the pessimistic induction at all, that is because it implicitly relies on the theoretical resource of intensional realism. I also argue that extensional realism, by nature, cannot embed a criterion for distinguishing between believable and unbelievable theories. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Top scientific research center deploys Zambeel Aztera (TM) network storage system in high performance environment

    CERN Multimedia

    2002-01-01

    " The National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory has implemented a Zambeel Aztera storage system and software to accelerate the productivity of scientists running high performance scientific simulations and computations" (1 page).

  4. WWW: The Scientific Method

    Science.gov (United States)

    Blystone, Robert V.; Blodgett, Kevin

    2006-01-01

    The scientific method is the principal methodology by which biological knowledge is gained and disseminated. As fundamental as the scientific method may be, its historical development is poorly understood, its definition is variable, and its deployment is uneven. Scientific progress may occur without the strictures imposed by the formal…

  5. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol

  6. Scientific Assistant Virtual Laboratory (SAVL)

    Science.gov (United States)

    Alaghband, Gita; Fardi, Hamid; Gnabasik, David

    2007-03-01

    The Scientific Assistant Virtual Laboratory (SAVL) is a scientific discovery environment, an interactive simulated virtual laboratory, for learning physics and mathematics. The purpose of this computer-assisted intervention is to improve middle and high school student interest, insight and scores in physics and mathematics. SAVL develops scientific and mathematical imagination in a visual, symbolic, and experimental simulation environment. It directly addresses the issues of scientific and technological competency by providing critical thinking training through integrated modules. This on-going research provides a virtual laboratory environment in which the student directs the building of the experiment rather than observing a packaged simulation. SAVL: * Engages the persistent interest of young minds in physics and math by visually linking simulation objects and events with mathematical relations. * Teaches integrated concepts by the hands-on exploration and focused visualization of classic physics experiments within software. * Systematically and uniformly assesses and scores students by their ability to answer their own questions within the context of a Master Question Network. We will demonstrate how the Master Question Network uses polymorphic interfaces and C# lambda expressions to manage simulation objects.
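
    SAVL is described as using polymorphic interfaces and C# lambda expressions to manage simulation objects; the sketch below only gestures at that idea in Python, where answer checks are attached to question objects as lambdas. All names, prompts, and tolerances are invented for illustration.

    ```python
    # Toy "question network": each question carries its scoring rule
    # as a lambda, so heterogeneous checks share one interface.
    questions = [
        {"prompt": "period of a 1 m pendulum (s)?",
         "check": lambda ans: abs(ans - 2.0) < 0.1},
        {"prompt": "g on Earth (m/s^2)?",
         "check": lambda ans: abs(ans - 9.8) < 0.1},
    ]

    def score(answers):
        # uniform scoring over polymorphic checks
        return sum(q["check"](a) for q, a in zip(questions, answers))

    print(score([2.01, 9.81]))   # 2
    ```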

  7. Software support for students engaging in scientific activity and scientific controversy

    Science.gov (United States)

    Cavalli-Sforza, Violetta; Weiner, Arlene W.; Lesgold, Alan M.

    Computer environments could support students in engaging in cognitive activities that are essential to scientific practice and to the understanding of the nature of scientific knowledge, but that are difficult to manage in science classrooms. The authors describe a design for a computer-based environment to assist students in conducting dialectical activities of constructing, comparing, and evaluating arguments for competing scientific theories. Their choice of activities and their design respond to educators' and theorists' criticisms of current science curricula. They give detailed specifications of portions of the environment.

  8. Scientific Journal Indexing

    Directory of Open Access Journals (Sweden)

    Getulio Teixeira Batista

    2007-08-01

    Full Text Available It is quite impressive how much more visible online publishing is than offline publishing. Lawrence (2001) computed the percentage increase across 1,494 venues containing at least five offline and five online articles. Results showed an average of 336% more citations to online articles compared to offline articles published in the same venue. If articles published in the same venue are of similar quality, then, they concluded, online articles are more highly cited because of their easier access. Thomson Scientific, traditionally concerned with printed journals, announced on November 28, 2005, the launch of the Web Citation Index™, the multidisciplinary citation index of scholarly content from institutional and subject-based repositories (http://scientific.thomson.com/press/2005/8298416/). The Web Citation Index, from the abstracting and indexing (A&I) world, connects together pre-print articles, institutional repositories and open access (OA) journals (Chillingworth, 2005). Basically all research funds are government-granted, taxpayer-supported funds, and therefore results should be made freely available to the community. Free online availability facilitates access to research findings, maximizes interaction among research groups, and optimizes efforts and research funds efficiency. Therefore, Ambi-Água is committed to providing free access to its articles. An important aspect of Ambi-Água is the publication and management system of this journal. It uses the Electronic System for Journal Publishing (SEER - http://www.ibict.br/secao.php?cat=SEER). This system was translated and customized by the Brazilian Institute for Science and Technology Information (IBICT) based on the software developed by the Public Knowledge Project (Open Journal Systems) of the University of British Columbia (http://pkp.sfu.ca/ojs/). The big advantage of using this system is that it is compatible with the OAI-PMH protocol for metadata harvesting, which greatly promotes published articles
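
    For illustration, a metadata harvest over OAI-PMH, the protocol mentioned above, can be as simple as one HTTP request. A minimal Python sketch against a hypothetical repository endpoint (the URL is a placeholder, not Ambi-Água's):

    ```python
    import urllib.parse
    import urllib.request
    import xml.etree.ElementTree as ET

    BASE = "https://example.org/oai"   # hypothetical OAI-PMH endpoint
    params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}
    url = BASE + "?" + urllib.parse.urlencode(params)

    with urllib.request.urlopen(url) as response:
        tree = ET.fromstring(response.read())

    # Dublin Core titles carried in the ListRecords response
    for title in tree.iter("{http://purl.org/dc/elements/1.1/}title"):
        print(title.text)
    ```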

  9. CERN Scientific Book Fair 2013

    CERN Multimedia

    CERN Library

    2013-01-01

    The CERN Bookshop and CERN Library invite you to attend the 2013 CERN Book Fair, a two-day scientific event offering you the opportunity to meet key publishers and to browse and purchase books at significant discounts.   Key publishers will present a selection of titles in physics, technology, mathematics, engineering, computing and popular science. You are welcome to come along and meet the publishers’ representatives or simply have a look at the books on sale. The fair will take place in the Main Building (Bldg. 500) on the ground floor near Restaurant 1 on Monday 9 and Tuesday 10 September. Participating or represented publishers include: Oxford University Press, Princeton University Press, Springer, Wiley, and World Scientific-Imperial College Press. Fair opening times:  - Monday 9 September 9:00 - 18:00  - Tuesday 10 September 9:00 - 18:00

  10. CERN scientific book fair 2010

    CERN Document Server

    CERN Library

    2010-01-01

    The CERN Bookshop and CERN Library invite you to attend the 2010 CERN Book Fair, a two-day scientific event offering you the opportunity to meet key publishers and to browse and purchase books at significant discounts.   Some twelve companies will be present and will bring with them a selection of titles in physics, technology, mathematics, engineering, computing and popular science. You are welcome to come along and meet the publishers' representatives or simply have a look at the books on offer. The Fair will take place in the Main Building (bldg. 500) on the ground floor near Restaurant 1 on Tuesday 7th and Wednesday 8th September. Participating or represented publishers include: Cambridge University Press, EPFL Press – PPUR, Oxford University Press, Imperial College Press, McGraw-Hill, Pearson Education, Princeton University Press, Springer, Taylor and Francis, Wiley, World Scientific. Fair opening times: Tuesday 7 September 9:00 – ...

  11. Using cloud-computing applications to support collaborative scientific inquiry: Examining pre-service teachers’ perceived barriers towards integration / Utilisation d'applications infonuagiques pour appuyer la recherche scientifique collaborative

    Directory of Open Access Journals (Sweden)

    Joel Donna

    2013-07-01

    Full Text Available Technology plays a crucial role in facilitating collaboration within the scientific community. Cloud-computing applications can be used to model such collaboration and support inquiry within the secondary science classroom. Little is known about pre-service teachers' beliefs related to the envisioned use of this technology in their teaching. These beliefs may influence future integration. This study finds several first-order barriers, such as perceptions that these tools would take too much time to use. Second-order barriers include perceptions that this technology would not promote face-to-face collaboration skills, would create social loafing situations, and beliefs that the technology does not help students understand the nature of science. Suggestions for mitigating these barriers within pre-service education technology courses are discussed.

  12. Center for Technology for Advanced Scientific Component Software (TASCS)

    Energy Technology Data Exchange (ETDEWEB)

    Damevski, Kostadin [Virginia State Univ., Petersburg, VA (United States)

    2009-03-30

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit the unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  13. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  14. The scaling issue: scientific opportunities

    Science.gov (United States)

    Orbach, Raymond L.

    2009-07-01

    A brief history of the Leadership Computing Facility (LCF) initiative is presented, along with the importance of SciDAC to the initiative. The initiative led to the initiation of the Innovative and Novel Computational Impact on Theory and Experiment program (INCITE), open to all researchers in the US and abroad, and based solely on scientific merit through peer review, awarding sizeable allocations (typically millions of processor-hours per project). The development of the nation's LCFs has enabled available INCITE processor-hours to double roughly every eight months since its inception in 2004. The 'top ten' LCF accomplishments in 2009 illustrate the breadth of the scientific program, while the 75 million processor hours allocated to American business since 2006 highlight INCITE contributions to US competitiveness. The extrapolation of INCITE processor hours into the future brings new possibilities for many 'classic' scaling problems. Complex systems and atomic displacements to cracks are but two examples. However, even with increasing computational speeds, the development of theory, numerical representations, algorithms, and efficient implementation are required for substantial success, exhibiting the crucial role that SciDAC will play.

  15. The scaling issue: scientific opportunities

    International Nuclear Information System (INIS)

    Orbach, Raymond L

    2009-01-01

    A brief history of the Leadership Computing Facility (LCF) initiative is presented, along with the importance of SciDAC to the initiative. The initiative led to the initiation of the Innovative and Novel Computational Impact on Theory and Experiment program (INCITE), open to all researchers in the US and abroad, and based solely on scientific merit through peer review, awarding sizeable allocations (typically millions of processor-hours per project). The development of the nation's LCFs has enabled available INCITE processor-hours to double roughly every eight months since its inception in 2004. The 'top ten' LCF accomplishments in 2009 illustrate the breadth of the scientific program, while the 75 million processor hours allocated to American business since 2006 highlight INCITE contributions to US competitiveness. The extrapolation of INCITE processor hours into the future brings new possibilities for many 'classic' scaling problems. Complex systems and atomic displacements to cracks are but two examples. However, even with increasing computational speeds, the development of theory, numerical representations, algorithms, and efficient implementation are required for substantial success, exhibiting the crucial role that SciDAC will play.

  16. XML Based Scientific Data Management Facility

    Science.gov (United States)

    Mehrotra, P.; Zubair, M.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The World Wide Web Consortium has developed the Extensible Markup Language (XML) to support the building of better information management infrastructures. The scientific computing community, realizing the benefits of XML, has designed markup languages for scientific data. In this paper, we propose an XML-based scientific data management facility, XDMF. The project is motivated by the fact that even though a lot of scientific data is being generated, it is not being shared because of the lack of standards and infrastructure support for discovering and transforming the data. The proposed data management facility can be used to discover the scientific data itself and the transformation functions, and also to apply the required transformations. We have built a prototype system of the proposed data management facility that can work on different platforms. We have implemented the system using Java and the Apache XSLT engine, Xalan. To support remote data and transformation functions, we had to extend the XSLT specification and the Xalan package.
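
    The facility described above was built on Java and Xalan; the same kind of XSLT-driven transformation step can be sketched in Python with lxml. Both file names below are placeholders for illustration, not part of XDMF:

    ```python
    from lxml import etree

    # compile the stylesheet into a callable transformation
    transform = etree.XSLT(etree.parse("to_csv.xsl"))

    # apply it to a scientific data document and serialize the result
    result = transform(etree.parse("simulation.xml"))
    print(str(result))
    ```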

  17. Cyber warfare building the scientific foundation

    CERN Document Server

    Jajodia, Sushil; Subrahmanian, VS; Swarup, Vipin; Wang, Cliff

    2015-01-01

    This book features a wide spectrum of the latest computer science research relating to cyber warfare, including military and policy dimensions. It is the first book to explore the scientific foundation of cyber warfare and features research from the areas of artificial intelligence, game theory, programming languages, graph theory and more. The high-level approach and emphasis on scientific rigor provides insights on ways to improve cyber warfare defense worldwide. Cyber Warfare: Building the Scientific Foundation targets researchers and practitioners working in cyber security, especially gove

  18. The role of data in scientific progress

    International Nuclear Information System (INIS)

    Glaeser, P.S.

    1985-01-01

    This book contains 109 papers presented at the 9th Int. CODATA Conference and illustrates two main themes (1) new computer-based methods for storing, manipulating and disseminating scientific and technical data, and (2) the use of such computerized data files to give new scientific insights. The broad range of scientific disciplines covered includes geology and geochemistry, oceanography and ecology, molecular biology and biotechnology, chemical engineering, materials properties, energy systems, data base design and management - theory and practice, and finally, a last section on data retrieval and library systems. 12 items are included in Atomindex separately. (Auth.)

  19. Manual for JSSL (JAERI scientific subroutine library)

    International Nuclear Information System (INIS)

    Inoue, Shuji; Fujimura, Toichiro; Tsutsui, Tsuneo; Nishida, Takahiko

    1982-09-01

    A manual is presented on the revised version of the JAERI scientific subroutine library, a collection of scientific subroutines developed or modified in JAERI. They are classified into fifteen fields (Special Functions, Linear Problems, Eigenvalue and Eigenvector Problems, Nonlinear Problems, Mathematical Programming, Extreme Value Problems, Transformations, Functional Approximation Methods, Numerical Differentiation and Integration Methods, Differential and Integral Equations, Statistical Functions, Physical Problems, I/O Routines, Plotter Routines, Computer System Functions and Others). The main expansion in this version is in the fields of mathematical programming and statistical functions. The present library may be said to be a comprehensive compilation of scientific subroutines covering almost all the important fields. (author)

  20. The Computational Sensorimotor Systems Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Computational Sensorimotor Systems Lab focuses on the exploration, analysis, modeling and implementation of biological sensorimotor systems for both scientific...

  1. Scientific activities 1977 and 1978

    International Nuclear Information System (INIS)

    1980-01-01

    The scientific activities and achievements of the Nuclear Research Center Democritus for the years 1977 and 1978 are presented in the form of a list of 79 projects giving the title, objectives, commencement year, person responsible for each project, and the pertaining lists of publications. The 14 chapters of this work cover the activities of the main Divisions of the Democritus NRC: Exploration of Radioactive Minerals, Computer Center, Environmental Radioactivity, Chemistry, Physics, Biology, Soil Science, Electronics, Reactor, Health Physics, Radioisotopes, Technological Applications and Medical Service. (T.A.)

  2. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  3. Amplify scientific discovery with artificial intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Gil, Yolanda; Greaves, Mark T.; Hendler, James; Hirsch, Hyam

    2014-10-10

    Computing innovations have fundamentally changed many aspects of scientific inquiry. For example, advances in robotics, high-end computing, networking, and databases now underlie much of what we do in science such as gene sequencing, general number crunching, sharing information between scientists, and analyzing large amounts of data. As computing has evolved at a rapid pace, so too has its impact in science, with the most recent computing innovations repeatedly being brought to bear to facilitate new forms of inquiry. Recently, advances in Artificial Intelligence (AI) have deeply penetrated many consumer sectors, including for example Apple’s Siri™ speech recognition system, real-time automated language translation services, and a new generation of self-driving cars and self-navigating drones. However, AI has yet to achieve comparable levels of penetration in scientific inquiry, despite its tremendous potential in aiding computers to help scientists tackle tasks that require scientific reasoning. We contend that advances in AI will transform the practice of science as we are increasingly able to effectively and jointly harness human and machine intelligence in the pursuit of major scientific challenges.

  4. Age and Scientific Performance.

    Science.gov (United States)

    Cole, Stephen

    1979-01-01

    The long-standing belief that age is negatively associated with scientific productivity and creativity is shown to be based upon incorrect analysis of data. Studies reported in this article suggest that the relationship between age and scientific performance is influenced by the operation of the reward system. (Author)

  5. Scientific Notation Watercolor

    Science.gov (United States)

    Linford, Kyle; Oltman, Kathleen; Daisey, Peggy

    2016-01-01

    (Purpose) The purpose of this paper is to describe visual literacy, an adapted version of Visual Thinking Strategy (VTS), and an art-integrated middle school mathematics lesson about scientific notation. The intent of this lesson was to provide students with a real life use of scientific notation and exponents, and to motivate them to apply their…

  6. Scientific rigor through videogames.

    Science.gov (United States)

    Treuille, Adrien; Das, Rhiju

    2014-11-01

    Hypothesis-driven experimentation - the scientific method - can be subverted by fraud, irreproducibility, and lack of rigorous predictive tests. A robust solution to these problems may be the 'massive open laboratory' model, recently embodied in the internet-scale videogame EteRNA. Deploying similar platforms throughout biology could enforce the scientific method more broadly. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Rediscovering the scientific ethos

    DEFF Research Database (Denmark)

    Djørup, Stine

    The doctoral dissertation discusses some of the moral standards of good scientific practice that are underexposed in the literature. In particular, attempts are made to correct the conceptual confusion surrounding the norm of 'disinterestedness' in science ('uhildethed'), and the norm of scientific

  8. Testing Scientific Software: A Systematic Literature Review

    Science.gov (United States)

    Kanewala, Upulee; Bieman, James M.

    2014-01-01

    Context: Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models, and computation of evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. Objective: This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. Method: We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. Results: We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software, such as oracle problems, and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community, such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally, we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Conclusions: Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software, make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software such as oracle problems when developing testing techniques. PMID:25125798

  9. Testing Scientific Software: A Systematic Literature Review.

    Science.gov (United States)

    Kanewala, Upulee; Bieman, James M

    2014-10-01

    Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models, and computation of evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software such as oracle problems and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software such as oracle problems when developing testing techniques.
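
    One established response to the oracle problem named in both reviews is metamorphic testing: instead of checking exact outputs, one checks relations that must hold between related runs. A minimal Python sketch, with sin standing in for a routine that lacks an exact oracle:

    ```python
    import math
    import random

    def simulate(x):
        # stand-in for scientific code whose exact output is unknown
        return math.sin(x)

    for _ in range(1000):
        x = random.uniform(-10, 10)
        # metamorphic relation: sin(pi - x) == sin(x) up to rounding
        assert abs(simulate(math.pi - x) - simulate(x)) < 1e-9
    print("metamorphic relation held on all sampled inputs")
    ```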

  10. Spatial computing in interactive architecture

    NARCIS (Netherlands)

    S.O. Dulman (Stefan); M. Krezer; L. Hovestad

    2014-01-01

    Distributed computing is the theoretical foundation for applications and technologies like interactive architecture, wearable computing, and smart materials. It evolves continuously, following needs arising from scientific developments, novel uses of technology, or simply the curiosity to

  11. Applications of computer algebra

    CERN Document Server

    1985-01-01

    Today, certain computer software systems exist which surpass the computational ability of researchers when their mathematical techniques are applied to many areas of science and engineering. These computer systems can perform a large portion of the calculations seen in mathematical analysis. Despite this massive power, thousands of people use these systems as a routine resource for everyday calculations. These software programs are commonly called "Computer Algebra" systems. They have names such as MACSYMA, MAPLE, muMATH, REDUCE and SMP. They are receiving credit as a computational aid with increasing regularity in articles in the scientific and engineering literature. When most people think about computers and scientific research these days, they imagine a machine grinding away, processing numbers arithmetically. It is not generally realized that, for a number of years, computers have been performing non-numeric computations. This means, for example, that one inputs an equation and obtains a closed for...
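
    As a small illustration of such non-numeric computation, a present-day open-source counterpart of the systems named above (SymPy) can take an equation as input and return its solutions in closed form:

    ```python
    from sympy import symbols, solve, Eq

    x = symbols("x")
    # input an equation, obtain a closed-form answer rather than numbers
    print(solve(Eq(x**2 - 2*x - 15, 0), x))   # [-3, 5]
    ```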

  12. The Revista Scientific

    Directory of Open Access Journals (Sweden)

    Oscar Antonio Martínez Molina

    2017-02-01

    Full Text Available The Revista Scientific aims to publish quality papers that bring an analytical perspective to educational settings. Together with www.indtec.com.ve, this electronic publication aims to promote and disseminate, with seriousness and rigor, the academic production in this field. Editorial of the new stage: Revista Scientific was created with the aim of constituting a reference space for scientific research in the field of educational analysis carried out within the universities of Latin America, once the distribution list hosted on the INDTEC platform (http://www.indtec.com.ve) was consolidated as a space for the dissemination and development of new ideas and initiatives. The first presentation of the INDTEC magazine was held in August 2016 in Venezuela. Thanks to the support of the INDTEC platform, Revista Scientific has been able to develop through the cooperative work of the people who make up its Editorial Committee, Academic Committee and Scientific Committee for the electronic edition, and of the referees of each of its issues. Part of the success is due to the motivation of its co-editors, excellent professionals from different parts of the world (Argentina, Belgium, Colombia, Cuba, Ecuador, Spain, Mexico and Venezuela) who form the various committees and participate in this project with enthusiasm and joy; its organizational structure is presented in this edition, and the project continues in crescendo. Also, the strategy adopted of editing monographic issues from the various events organized within the universities has contributed to making Revista Scientific a reference point for intellectual progress in the field of education. Revista Scientific is currently indexed in ISI, International Scientific Indexing, Dubai - UAE; ROAD, the Directory of Open Access Scholarly Resources (ISSN International Center, France); REVENCYT-ULA, Venezuela; and Google Scholar (International Index); and is published in Calaméo; ISSUU; Academia

  13. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    OpenAIRE

    Karlheinz Schwarz; Rainer Breitling; Christian Allen

    2013-01-01

    Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized ...

  14. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery that have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but they extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  15. Scientific meeting abstracts

    International Nuclear Information System (INIS)

    1999-01-01

    The document is a collection of scientific meeting abstracts in the fields of nuclear physics, medical sciences, chemistry, agriculture, environment, engineering, and different aspects of energy; it presents research done in these fields in 1999.

  16. Identifying Strategic Scientific Opportunities

    Science.gov (United States)

    As NCI's central scientific strategy office, CRS collaborates with the institute's divisions, offices, and centers to identify research opportunities to advance NCI's vision for the future of cancer research.

  17. The Scientific Enterprise

    Indian Academy of Sciences (India)

    Srimath

    The phrase pre-modern scientific may be used to describe certain attitudes and … But unfortunately, in the general atmosphere of poor education and collective fears … present day science and technology that old time beliefs and traditional …

  18. WITHER SCIENTIFIC AND TECHNOLOGICAL

    African Journals Online (AJOL)

    No library or information service and especially in a developing … Good public relations, consultancy services including bilateral and … project proposal for the creation of a scientific and technological information … For example, in 1995 the …

  19. Shaping a Scientific Self

    DEFF Research Database (Denmark)

    Andrade-Molina, Melissa; Valero, Paola

    …us to understand how a truth is reproduced, circulating among diverse fields of human knowledge. It will also show why we accept and reproduce a particular discourse. Finally, we state Euclidean geometry as a truth that circulates in scientific discourse and performs a scientific self. We unfold the importance of having students follow the path of what schools perceive a real scientist to be, not to become a scientist, but to become a logical thinker, a problem solver, a productive citizen who uses reason.

  20. Scientific information processing procedures

    Directory of Open Access Journals (Sweden)

    García, Maylin

    2013-07-01

    Full Text Available The paper systematizes several theoretical viewpoints on the skill of scientific information processing and decomposes it into sub-skills. Several methods, such as analysis, synthesis, induction, deduction and document analysis, were used to build a theoretical framework. Interviews, a survey of professionals in training, and a case study were carried out to evaluate the results. All professionals in the sample improved their performance in scientific information processing.

  1. Open scientific communication urged

    Science.gov (United States)

    Richman, Barbara T.

    In a report released last week the National Academy of Sciences' Panel on Scientific Communication and National Security concluded that the ‘limited and uncertain benefits’ of controls on the dissemination of scientific and technological research are ‘outweighed by the importance of scientific progress, which open communication accelerates, to the overall welfare of the nation.’ The 18-member panel, chaired by Dale R. Corson, president emeritus of Cornell University, was created last spring (Eos, April 20, 1982, p. 241) to examine the delicate balance between open dissemination of scientific and technical information and the U.S. government's desire to protect scientific and technological achievements from being translated into military advantages for our political adversaries.The panel dealt almost exclusively with the relationship between the United States and the Soviet Union but noted that there are ‘clear problems in scientific communication and national security involving Third World countries.’ Further study of this matter is necessary.

  2. Writing software or writing scientific articles?

    CERN Document Server

    Basaglia, Tullio; Dressendorfer, P V; Larkin, A; Pia, M G

    2008-01-01

    An analysis of publications related to high energy physics computing in refereed journals is presented. The distribution of papers associated with various fields of computing relevant to high energy physics is critically analyzed. The relative publication rate of software papers is evaluated in comparison to other closely related physics disciplines, such as nuclear physics, radiation protection and medical physics, and to hardware publications. The results hint that, in spite of the significant effort invested in high energy physics computing and its fundamental role in the experiments, this research area is underrepresented in the scientific literature; nevertheless, the analysis of citations highlights the significant impact of software publications in experimental research.

  3. A primer on scientific programming with Python

    CERN Document Server

    Langtangen, Hans Petter

    2014-01-01

    The book serves as a first introduction to computer programming of scientific applications, using the high-level Python language. The exposition is example and problem-oriented, where the applications are taken from mathematics, numerical calculus, statistics, physics, biology and finance. The book teaches "Matlab-style" and procedural programming as well as object-oriented programming. High school mathematics is a required background and it is advantageous to study classical and numerical one-variable calculus in parallel with reading this book. Besides learning how to program computers, the reader will also learn how to solve mathematical problems, arising in various branches of science and engineering, with the aid of numerical methods and programming. By blending programming, mathematics and scientific applications, the book lays a solid foundation for practicing computational science. From the reviews: Langtangen … does an excellent job of introducing programming as a set of skills in problem solving. ...

  4. A primer on scientific programming with Python

    CERN Document Server

    Langtangen, Hans Petter

    2016-01-01

    The book serves as a first introduction to computer programming of scientific applications, using the high-level Python language. The exposition is example and problem-oriented, where the applications are taken from mathematics, numerical calculus, statistics, physics, biology and finance. The book teaches "Matlab-style" and procedural programming as well as object-oriented programming. High school mathematics is a required background and it is advantageous to study classical and numerical one-variable calculus in parallel with reading this book. Besides learning how to program computers, the reader will also learn how to solve mathematical problems, arising in various branches of science and engineering, with the aid of numerical methods and programming. By blending programming, mathematics and scientific applications, the book lays a solid foundation for practicing computational science. From the reviews: Langtangen … does an excellent job of introducing programming as a set of skills in problem solving. ...

  5. Accelerating scientific discovery : 2007 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Beckman, P.; Dave, P.; Drugan, C.

    2008-11-14

    As a gateway for scientific discovery, the Argonne Leadership Computing Facility (ALCF) works hand in hand with the world's best computational scientists to advance research in a diverse span of scientific domains, ranging from chemistry, applied mathematics, and materials science to engineering physics and life sciences. Sponsored by the U.S. Department of Energy's (DOE) Office of Science, researchers are using the IBM Blue Gene/L supercomputer at the ALCF to study and explore key scientific problems that underlie important challenges facing our society. For instance, a research team at the University of California-San Diego/SDSC is studying the molecular basis of Parkinson's disease. The researchers plan to use the knowledge they gain to discover new drugs to treat the disease and to identify risk factors for other diseases that are equally prevalent. Likewise, scientists from Pratt & Whitney are using the Blue Gene to understand the complex processes within aircraft engines. Expanding our understanding of jet engine combustors is the secret to improved fuel efficiency and reduced emissions. Lessons learned from the scientific simulations of jet engine combustors have already led Pratt & Whitney to newer designs with unprecedented reductions in emissions, noise, and cost of ownership. ALCF staff members provide in-depth expertise and assistance in using the Blue Gene/L and optimizing user applications. Both the Catalyst team and the Applications Performance Engineering and Data Analytics (APEDA) team support users' projects. In addition to working with scientists running experiments on the Blue Gene/L, we have become a nexus for the broader global community. In partnership with the Mathematics and Computer Science Division at Argonne National Laboratory, we have created an environment where the world's most challenging computational science problems can be addressed. Our expertise in high-end scientific computing enables us to provide

  6. Scientific collaboratories in higher education

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.; Li, Bin

    2003-01-01

    Scientific collaboratories hold the promise of providing students access to specialized scientific instruments, data and experts, enabling learning opportunities perhaps otherwise not available. However, evaluation of scientific collaboratories in higher education has lagged behind...

  7. Making better scientific figures

    Science.gov (United States)

    Hawkins, Ed; McNeall, Doug

    2016-04-01

    In the words of the UK government chief scientific adviser "Science is not finished until it's communicated" (Walport 2013). The tools to produce good visual communication have never been so easily accessible to scientists as at the present. Correspondingly, it has never been easier to produce and disseminate poor graphics. In this presentation, we highlight some good practice and offer some practical advice in preparing scientific figures for presentation to peers or to the public. We identify common mistakes in visualisation, including some made by the authors, and offer some good reasons not to trust defaults in graphics software. In particular, we discuss the use of colour scales and share our experiences in running a social media campaign (http://tiny.cc/endrainbow) to replace the "rainbow" (also "jet", or "spectral") colour scale as the default in (climate) scientific visualisation.
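
    As an illustrative sketch of the kind of software default worth overriding (assuming Python with numpy and matplotlib, which the abstract does not specify), the snippet below renders the same smooth field with the "jet" rainbow scale and with the perceptually uniform "viridis" scale, so the difference can be judged side by side.

        import numpy as np
        import matplotlib.pyplot as plt

        # A smooth scalar field whose gradations a colour scale should render evenly.
        x, y = np.meshgrid(np.linspace(-2, 2, 200), np.linspace(-2, 2, 200))
        z = np.exp(-(x**2 + y**2))

        fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
        ax1.pcolormesh(x, y, z, cmap='jet')      # rainbow: uneven perceptual steps
        ax1.set_title('jet (avoid)')
        ax2.pcolormesh(x, y, z, cmap='viridis')  # perceptually uniform alternative
        ax2.set_title('viridis')
        plt.show()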

  8. Plagiarism in scientific publishing.

    Science.gov (United States)

    Masic, Izet

    2012-12-01

    Scientific publishing is the ultimate product of a scientist's work. The number of publications and their citations are measures of a scientist's success, while unpublished research is invisible to the scientific community and, as such, nonexistent. Researchers build on the work of their predecessors, and the extent to which one scientist's work is used as a source by other authors verifies its contribution to the growth of human knowledge. If an author has published an article in a scientific journal, he cannot publish the article in any other journal with only a few minor adjustments, or without quoting the parts of the first article that are used in the second. Copyright infringement occurs when the author of a new article, with or without mentioning the original author, uses substantial portions of previously published articles, including tables and figures. In accordance with the principles of Good Scientific Practice (GSP) and Good Laboratory Practice (GLP), scientific institutions and universities should have a centre for monitoring, security, promotion and development of research quality. Establishing rules of good scientific practice, and complying with them, are obligations of every research institution, university and individual researcher, regardless of the area of science investigated. In this way, internal quality control ensures that a research institution such as a university assumes responsibility for creating an environment that promotes standards of excellence, intellectual honesty and legality. Although truth should be the aim of scientific research, it is not the guiding principle of all scientists. The best way to reach the truth and to avoid methodological and ethical mistakes is to consistently apply scientific methods and ethical standards in research. Although variously defined, plagiarism is basically intended to deceive the reader about one's own scientific contribution. There is no general regulation of control of

  10. NASA's Scientific Visualization Studio

    Science.gov (United States)

    Mitchell, Horace G.

    2003-01-01

    Since 1988, the Scientific Visualization Studio (SVS) at NASA Goddard Space Flight Center has produced scientific visualizations of NASA's scientific research and remote sensing data for public outreach. These visualizations take the form of images, animations, and end-to-end systems and have been used in many venues: from the network news to science programs such as NOVA, from museum exhibits at the Smithsonian to White House briefings. This presentation will give an overview of the major activities and accomplishments of the SVS, and some of the most interesting projects and systems developed at the SVS will be described. Particular emphasis will be given to the practices and procedures by which the SVS creates visualizations, from the hardware and software used to the structures and collaborations by which products are designed, developed, and delivered to customers. The web-based archival and delivery system for SVS visualizations at svs.gsfc.nasa.gov will also be described.

  11. Biological and Environmental Research Exascale Requirements Review. An Office of Science review sponsored jointly by Advanced Scientific Computing Research and Biological and Environmental Research, March 28-31, 2016, Rockville, Maryland

    Energy Technology Data Exchange (ETDEWEB)

    Arkin, Adam [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bader, David C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Coffey, Richard [Argonne National Lab. (ANL), Argonne, IL (United States); Antypas, Katie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bard, Deborah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Dart, Eli [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Esnet; Dosanjh, Sudip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gerber, Richard [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hack, James [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Monga, Inder [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Esnet; Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Riley, Katherine [Argonne National Lab. (ANL), Argonne, IL (United States); Rotman, Lauren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Esnet; Straatsma, Tjerk [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wells, Jack [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Aluru, Srinivas [Georgia Inst. of Technology, Atlanta, GA (United States); Andersen, Amity [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Aprá, Edoardo [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). EMSL; Azad, Ariful [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bates, Susan [National Center for Atmospheric Research, Boulder, CO (United States); Blaby, Ian [Brookhaven National Lab. (BNL), Upton, NY (United States); Blaby-Haas, Crysten [Brookhaven National Lab. (BNL), Upton, NY (United States); Bonneau, Rich [New York Univ. (NYU), NY (United States); Bowen, Ben [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bradford, Mark A. [Yale Univ., New Haven, CT (United States); Brodie, Eoin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Brown, James (Ben) [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Buluc, Aydin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bernholdt, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bylaska, Eric [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Calvin, Kate [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Cannon, Bill [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chen, Xingyuan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Cheng, Xiaolin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Cheung, Margaret [Univ. of Houston, Houston, TX (United States); Chowdhary, Kenny [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Colella, Phillip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Collins, Bill [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Compo, Gil [National Oceanic and Atmospheric Administration (NOAA), Boulder, CO (United States); Crowley, Mike [National Renewable Energy Lab. (NREL), Golden, CO (United States); Debusschere, Bert [Sandia National Lab. (SNL-CA), Livermore, CA (United States); D’Imperio, Nicholas [Brookhaven National Lab. 
(BNL), Upton, NY (United States); Dror, Ron [Stanford Univ., Stanford, CA (United States); Egan, Rob [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Evans, Katherine [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Friedberg, Iddo [Iowa State Univ., Ames, IA (United States); Fyke, Jeremy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gao, Zheng [Stony Brook Univ., Stony Brook, NY (United States); Georganas, Evangelos [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Giraldo, Frank [Naval Postgraduate School, Monterey, CA (United States); Gnanakaran, Gnana [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Govind, Niri [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). EMSL; Grandy, Stuart [Univ. of New Hampshire, Durham, NH (United States); Gustafson, Bill [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hammond, Glenn [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hargrove, William [USDA Forest Service, Washington, D.C. (United States); Heroux, Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoffman, Forrest [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hofmeyr, Steven [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hunke, Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Jackson, Charles [Univ. of Texas-Austin, Austin, TX (United States); Jacob, Rob [Argonne National Lab. (ANL), Argonne, IL (United States); Jacobson, Dan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Jacobson, Matt [Univ. of California, San Francisco, CA (United States); Jain, Chirag [Georgia Inst. of Technology, Atlanta, GA (United States); Johansen, Hans [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Johnson, Jeff [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jones, Andy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jones, Phil [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kalyanaraman, Ananth [Washington State Univ., Pullman, WA (United States); Kang, Senghwa [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); King, Eric [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Koanantakool, Penporn [Univ. of California, Berkeley, CA (United States); Kollias, Pavlos [Stony Brook Univ., Stony Brook, NY (United States); Kopera, Michal [Univ. of California, Santa Cruz, CA (United States); Kotamarthi, Rao [Argonne National Lab. (ANL), Argonne, IL (United States); Kowalski, Karol [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). EMSL; Kumar, Jitendra [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kyrpides, Nikos [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Leung, Ruby [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Li, Xiaolin [Stony Brook Univ., Stony Brook, NY (United States); Lin, Wuyin [Brookhaven National Lab. (BNL), Upton, NY (United States); Link, Robert [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Yangang [Brookhaven National Lab. (BNL), Upton, NY (United States); Loew, Leslie [Univ. of Connecticut, Storrs, CT (United States); Luke, Edward [Brookhaven National Lab. (BNL), Upton, NY (United States); Ma, Hsi -Yen [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mahadevan, Radhakrishnan [Univ. 
of Toronto, Toronto, ON (Canada); Maranas, Costas [Pennsylvania State Univ., University Park, PA (United States); Martin, Daniel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Maslowski, Wieslaw [Naval Postgraduate School, Monterey, CA (United States); McCue, Lee Ann [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McInnes, Lois Curfman [Argonne National Lab. (ANL), Argonne, IL (United States); Mills, Richard [Intel Corp., Santa Clara, CA (United States); Molins Rafa, Sergi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Morozov, Dmitriy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Mostafavi, Sara [Center for Molecular Medicine and Therapeutics, Vancouver, BC (Canada); Moulton, David J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mourao, Zenaida [Univ. of Cambridge (United Kingdom); Najm, Habib [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ng, Bernard [Center for Molecular Medicine and Therapeutics, Vancouver, BC (Canada); Ng, Esmond [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Norman, Matt [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Oh, Sang -Yun [Univ. of California, Santa Barbara, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Pan, Chongle [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pass, Rebecca [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Pau, George S. H. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Petridis, Loukas [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Prakash, Giri [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Price, Stephen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Randall, David [Colorado State Univ., Fort Collins, CO (United States); Renslow, Ryan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Riihimaki, Laura [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ringler, Todd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Roberts, Andrew [Naval Postgraduate School, Monterey, CA (United States); Rokhsar, Dan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ruebel, Oliver [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Salinger, Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Scheibe, Tim [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Schulz, Roland [Intel, Mountain View, CA (United States); Sivaraman, Chitra [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, Jeremy [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sreepathi, Sarat [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Steefel, Carl [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Talbot, Jenifer [Boston Univ., Boston, MA (United States); Tantillo, D. J. [Univ. of California, Davis, CA (United States); Tartakovsky, Alex [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Taylor, Mark [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Taylor, Ronald [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Trebotich, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Urban, Nathan [Los Alamos National Lab. 
(LANL), Los Alamos, NM (United States); Valiev, Marat [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). EMSL; Wagner, Allon [Univ. of California, Berkeley, CA (United States); Wainwright, Haruko [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wieder, Will [NCAR/Univ. of Colorado, Boulder, CO (United States); Wiley, Steven [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Williams, Dean [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Worley, Pat [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Xie, Shaocheng [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Yelick, Kathy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yoo, Shinjae [Brookhaven National Lab. (BNL), Upton, NY (United States); Yosef, Niri [Univ. of California, Berkeley, CA (United States); Zhang, Minghua [Stony Brook Univ., Stony Brook, NY (United States)

    2016-03-31

    Understanding the fundamentals of genomic systems or the processes governing impactful weather patterns are examples of the types of simulation and modeling performed on the most advanced computing resources in America. High-performance computing and computational science together provide a necessary platform for the mission science conducted by the Biological and Environmental Research (BER) office at the U.S. Department of Energy (DOE). This report reviews BER’s computing needs and their importance for solving some of the toughest problems in BER’s portfolio. BER’s impact on science has been transformative. Mapping the human genome, including the U.S.-supported international Human Genome Project that DOE began in 1987, initiated the era of modern biotechnology and genomics-based systems biology. And since the 1950s, BER has been a core contributor to atmospheric, environmental, and climate science research, beginning with atmospheric circulation studies that were the forerunners of modern Earth system models (ESMs) and by pioneering the implementation of climate codes onto high-performance computers. See http://exascaleage.org/ber/ for more information.

  12. Fusion Energy Sciences Exascale Requirements Review. An Office of Science review sponsored jointly by Advanced Scientific Computing Research and Fusion Energy Sciences, January 27-29, 2016, Gaithersburg, Maryland

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Choong-Seock [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Greenwald, Martin [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Riley, Katherine [Argonne Leadership Computing Facility, Argonne, IL (United States); Antypas, Katie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Coffey, Richard [Argonne National Lab. (ANL), Argonne, IL (United States); Dart, Eli [Esnet, Berkeley, CA (United States); Dosanjh, Sudip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gerber, Richard [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hack, James [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Monga, Inder [Esnet, Berkeley, CA (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Rotman, Lauren [Esnet, Berkeley, CA (United States); Straatsma, Tjerk [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wells, Jack [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Andre, R. [TRANSP Group, Princeton, NJ (United States); Bernholdt, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bhattacharjee, Amitava [Princeton Univ., NJ (United States); Bonoli, Paul [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Boyd, Iain [Univ. of Michigan, Ann Arbor, MI (United States); Bulanov, Stepan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cary, John R. [Tech-X Corporation, Boulder, CO (United States); Chen, Yang [Univ. of Colorado, Boulder, CO (United States); Curreli, Davide [Univ. of Illinois at Urbana-Champaign, Urbana, IL (United States); Ernst, Darin R. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Ethier, Stephane [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Green, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hager, Robert [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Hakim, Ammar [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Hassanein, A. [Purdue Univ., West Lafayette, IN (United States); Hatch, David [Univ. of Texas, Austin, TX (United States); Held, E. D. [Utah State Univ., Logan, UT (United States); Howard, Nathan [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Izzo, Valerie A. [Univ. of California, San Diego, CA (United States); Jardin, Steve [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Jenkins, T. G. [Tech-X Corp., Boulder, CO (United States); Jenko, Frank [Univ. of California, Los Angeles, CA (United States); Kemp, Andreas [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); King, Jacob [Tech-X Corp., Boulder, CO (United States); Kritz, Arnold [Lehigh Univ., Bethlehem, PA (United States); Krstic, Predrag [Stony Brook Univ., NY (United States); Kruger, Scott E. [Tech-X Corp., Boulder, CO (United States); Kurtz, Rick [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lin, Zhihong [Univ. of California, Irvine, CA (United States); Loring, Burlen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Nandipati, Giridhar [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Pankin, A. Y. [Tech-X Corp., Boulder, CO (United States); Parker, Scott [Univ. of Colorado, Boulder, CO (United States); Perez, Danny [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pigarov, Alex Y. [Univ. 
of California, San Diego, CA (United States); Poli, Francesca [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Pueschel, M. J. [Univ. of Wisconsin, Madison, WI (United States); Rafiq, Tariq [Lehigh Univ., Bethlehem, PA (United States); Rübel, Oliver [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Setyawan, Wahyu [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sizyuk, Valeryi A. [Purdue Univ., West Lafayette, IN (United States); Smithe, D. N. [Tech-X Corp., Boulder, CO (United States); Sovinec, C. R. [Univ. of Wisconsin, Madison, WI (United States); Turner, Miles [Dublin City University, Leinster (Ireland); Umansky, Maxim [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Vay, Jean-Luc [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Verboncoeur, John [Michigan State Univ., East Lansing, MI (United States); Vincenti, Henri [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Voter, Arthur [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wang, Weixing [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Wirth, Brian [Univ. of Tennessee, Knoxville, TN (United States); Wright, John [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Yuan, X. [TRANSP Group, Princeton, NJ (United States)

    2017-02-01

    The additional computing power offered by the planned exascale facilities could be transformational across the spectrum of plasma and fusion research — provided that the new architectures can be efficiently applied to our problem space. The collaboration that will be required to succeed should be viewed as an opportunity to identify and exploit cross-disciplinary synergies. To assess the opportunities and requirements as part of the development of an overall strategy for computing in the exascale era, the Exascale Requirements Review meeting of the Fusion Energy Sciences (FES) community was convened January 27–29, 2016, with participation from a broad range of fusion and plasma scientists, specialists in applied mathematics and computer science, and representatives from the U.S. Department of Energy (DOE) and its major computing facilities. This report is a summary of that meeting and the preparatory activities for it and includes a wealth of detail to support the findings. Technical opportunities, requirements, and challenges are detailed in this report (and in the recent report on the Workshop on Integrated Simulation). Science applications are described, along with mathematical and computational enabling technologies. Also see http://exascaleage.org/fes/ for more information.

  13. Recording Scientific Knowledge

    International Nuclear Information System (INIS)

    Bowker, Geof

    2006-01-01

    The way we record knowledge, and the web of technical, formal, and social practices that surrounds it, inevitably affects the knowledge that we record. The ways we hold knowledge about the past - in handwritten manuscripts, in printed books, in file folders, in databases - shape the kind of stories we tell about that past. In this talk, I look at how over the past two hundred years, information technology has affected the nature and production of scientific knowledge. Further, I explore ways in which the emergent new cyberinfrastructure is changing our relationship to scientific practice.

  14. Usability in Scientific Databases

    Directory of Open Access Journals (Sweden)

    Ana-Maria Suduc

    2012-07-01

    Full Text Available Usability, most often defined as the ease of use and acceptability of a system, affects users' performance and their job satisfaction when working with a machine. Usability is therefore a very important aspect to consider in system development. The paper presents numerical data on the history of scientific research into the usability of information systems, as reflected in the results returned by three important scientific databases, Science Direct, ACM Digital Library and IEEE Xplore Digital Library, for different queries related to this field.

  15. Advances in Computer Entertainment.

    NARCIS (Netherlands)

    Nijholt, Antinus; Romão, T.; Reidsma, Dennis

    2012-01-01

    These are the proceedings of the 9th International Conference on Advances in Computer Entertainment (ACE 2012). ACE has become the leading scientific forum for dissemination of cutting-edge research results in the area of entertainment computing. Interactive entertainment is one of the most vibrant

  16. Surviving Scientific Academia . . . and Beyond

    Energy Technology Data Exchange (ETDEWEB)

    Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-02-03

    It's been 16 years since I first took a physics class at Weber State University. Since then, I've survived graduate school in Nuclear Engineering and a postdoc appointment in nuclear nonproliferation. Now I'm a Technical Staff Member at Los Alamos National Laboratory working with nuclear data, the physics behind the numerical simulations of nuclear reactors and nuclear weapons. Along the way, I've learned a few things. First, scientific computing is everywhere in science. If you are not writing codes, you will be analyzing their output, and generally there will be more output than a human can correctly and accurately interpret in a timely manner. Second, a career in science or engineering can be very rewarding, with opportunities to collaborate with and build friendships with very bright people from all over the world.

  17. Scientific inference learning from data

    CERN Document Server

    Vaughan, Simon

    2013-01-01

    Providing the knowledge and practical experience to begin analysing scientific data, this book is ideal for physical sciences students wishing to improve their data handling skills. The book focuses on explaining and developing the practice and understanding of basic statistical analysis, concentrating on a few core ideas, such as the visual display of information, modelling using the likelihood function, and simulating random data. Key concepts are developed through a combination of graphical explanations, worked examples, example computer code and case studies using real data. Students will develop an understanding of the ideas behind statistical methods and gain experience in applying them in practice. Further resources are available at www.cambridge.org/9781107607590, including data files for the case studies so students can practise analysing data, and exercises to test students' understanding.
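
    As a minimal sketch of two of the core ideas named above, simulating random data and modelling with the likelihood function (assuming Python with numpy and scipy; the Gaussian model and the numbers are illustrative and not taken from the book):

        import numpy as np
        from scipy import optimize, stats

        # Simulate random data: 1000 measurements from a normal distribution.
        rng = np.random.default_rng(42)
        data = rng.normal(loc=5.0, scale=2.0, size=1000)

        # Negative log-likelihood of a normal model with parameters (mu, sigma).
        def neg_log_likelihood(params):
            mu, sigma = params
            if sigma <= 0:
                return np.inf
            return -np.sum(stats.norm.logpdf(data, loc=mu, scale=sigma))

        # Maximum-likelihood estimation: minimise the negative log-likelihood.
        result = optimize.minimize(neg_log_likelihood, x0=[0.0, 1.0],
                                   method='Nelder-Mead')
        print(result.x)  # estimates close to the true values (5.0, 2.0)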

  18. 10 December 2015 - Director-General for Research, Italian Ministry of Education, Research and University V. Di Felice visiting LHC superconducting magnet assembly hall and CERN Control centre with Director for Research and Scientific Computing S. Bertolucci.

    CERN Multimedia

    Gadmer, Jean-Claude Robert

    2015-01-01

    Dr Vincenzo Di Felice, Director-General for Research, Ministry of Education, Research and University, Italian Republic. Also present were: A. Di Donato, MIUR; M. Gargano, MIUR - INFN Auditor; F. Ciardiello, MIUR - INFN Auditor; A. Mondera, Court of Auditors - INFN Auditor; S. Odorizzi, AD Tassullo S.p.A.; M. Dalpiaz, Tassullo S.p.A.; F. Conforti, Tassullo S.p.A.; A. Sartor, Tassullo S.p.A.; D. Bonn, Tassullo S.p.A.; M. Allegri, INFN; F. Ferroni, INFN President; S. Falciano, INFN Vice President; A. Zoccoli, INFN Executive Member; U. Dosselli, Scientific Attaché, Permanent Mission to the UNOG.

  19. Scientific Equipment Division - Overview

    International Nuclear Information System (INIS)

    Halik, J.

    2001-01-01

    Full text: The Scientific Equipment Division consists of the Design Group and the Mechanical Workshop. The activity of the Division includes the following: - designing of devices and equipment for experiments in physics, their mechanical construction and assembly, in particular vacuum chambers and installations for HV and UHV; - maintenance and upgrading of the existing installations and equipment in our Institute; - participation of our engineers and technicians in design work, equipment assembly and maintenance for experiments in foreign laboratories. The Design Group is equipped with PC computers, AutoCAD graphic software (release 2000 and Mechanical Desktop 4.0) and an A0 plotter, which allows us to make drawings and 2- and 3-dimensional mechanical documentation to world standards. The Mechanical Workshop offers a wide range of machining and treatment methods with satisfactory tolerances and surface quality: - turning - cylindrical elements of a length up to 2000 mm and a diameter up to 400 mm, and also disc-type elements of a diameter up to 600 mm and a length not exceeding 300 mm; - milling - elements of length up to 1000 mm and gear wheels of diameter up to 300 mm; - grinding - flat surfaces of dimensions up to 300 mm x 1000 mm and cylindrical elements of a diameter up to 200 mm and a length up to 800 mm; - drilling - holes of a diameter up to 50 mm; - welding - electrical and gas welding, including TIG vacuum-tight welding; - soft and hard soldering; - mechanical works including precision engineering; - plastics treatment - machining and polishing using diamond milling, modelling, and lamination of various shapes and materials, including plexiglas, scintillators and light-guides; - painting - paint spraying, with the possibility of using a furnace-fired drier of internal dimensions 800 mm x 800 mm x 800 mm. Our workshop possesses a CNC milling machine which can be used for machining of work-pieces up to 500 kg

  20. Topological data analysis for scientific visualization

    CERN Document Server

    Tierny, Julien

    2017-01-01

    Combining theoretical and practical aspects of topology, this book delivers a comprehensive and self-contained introduction to topological methods for the analysis and visualization of scientific data. Theoretical concepts are presented in a thorough but intuitive manner, with many high-quality color illustrations. Key algorithms for the computation and simplification of topological data representations are described in detail, and their application is carefully illustrated in a chapter dedicated to concrete use cases. With its fine balance between theory and practice, "Topological Data Analysis for Scientific Visualization" constitutes an appealing introduction to the increasingly important topic of topological data analysis, for lecturers, students and researchers.
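
    As a deliberately simplified, self-contained illustration (not an algorithm from the book), the snippet below computes one of the simplest topological summaries of a point cloud: the number of connected components at a fixed distance threshold. Persistent-homology software of the kind the book describes tracks how such summaries change as the threshold varies.

        import numpy as np

        def connected_components(points, threshold):
            # Count clusters of points linked by edges shorter than threshold,
            # using a union-find forest over the point indices.
            n = len(points)
            parent = list(range(n))

            def find(i):
                while parent[i] != i:
                    parent[i] = parent[parent[i]]  # path halving
                    i = parent[i]
                return i

            for i in range(n):
                for j in range(i + 1, n):
                    if np.linalg.norm(points[i] - points[j]) <= threshold:
                        parent[find(i)] = find(j)  # merge the two clusters

            return len({find(i) for i in range(n)})

        rng = np.random.default_rng(0)
        cloud = np.vstack([rng.normal(0, 0.1, (20, 2)),
                           rng.normal(3, 0.1, (20, 2))])
        print(connected_components(cloud, 0.5))  # two separated clusters -> 2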

  1. Manual for JSSL (JAERI Scientific Subroutine Library)

    International Nuclear Information System (INIS)

    Fujimura, Toichiro; Tsutsui, Tsuneo

    1991-09-01

    JSSL (JAERI Scientific Subroutine Library) is a library of scientific subroutines developed or modified in JAERI. They are classified into sixteen fields (Special Functions, Linear Problems, Eigenvalue and Eigenvector Problems, Non-Linear Problems, Mathematical Programming, Extreme Value Problems, Transformations, Functional Approximation Methods, Numerical Differential and Integral Methods, Numerical Differential and Integral Equations, Statistical Functions, Physical Problems, I/O Routines, Plotter Routines, Computer System Functions, and Others). This report is the user manual for the revised version of JSSL, which contains evaluated subroutines selected from the previous compilation of JSSL and covers almost all of these fields. (author)

  2. Scientific activities 1980 Nuclear Research Center ''Democritos''

    International Nuclear Information System (INIS)

    1982-01-01

    The scientific activities and achievements of the Nuclear Research Center Democritos for the year 1980 are presented as a list of 76 projects, giving the title, objectives, person responsible for each project, activities carried out, and the pertaining lists of publications. The 16 chapters of this work cover the activities of the main Divisions of the Democritos NRC: Electronics, Biology, Physics, Chemistry, Health Physics, Reactor, Scientific Directorate, Radioisotopes, Environmental Radioactivity, Soil Science, Computer Center, Uranium Exploration, Medical Service, Technological Applications, Radioimmunoassay and Training. (N.C.)

  3. Advanced Excel for scientific data analysis

    CERN Document Server

    De Levie, Robert

    2004-01-01

    Excel is by far the most widely distributed data analysis software, but few users are aware of its full powers. Advanced Excel For Scientific Data Analysis takes off from where most books dealing with scientific applications of Excel end. It focuses on three areas (least squares, Fourier transformation, and digital simulation) and illustrates these with extensive examples, often taken from the literature. It also includes and describes a number of sample macros and functions to facilitate common data analysis tasks. These macros and functions are provided in uncompiled, computer-readable, easily
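
    For readers working outside Excel, here is a minimal sketch of the first of the book's three focus areas, linear least squares, fitting a straight line y = a + b*x to noisy data (shown in Python with numpy as an assumed stand-in for the book's Excel macros; the data are synthetic):

        import numpy as np

        rng = np.random.default_rng(1)
        x = np.linspace(0, 10, 50)
        y = 2.0 + 0.5 * x + rng.normal(0, 0.2, x.size)  # noisy measurements

        # Design matrix for the straight-line model; solve min ||A p - y||_2.
        A = np.column_stack([np.ones_like(x), x])
        coeffs, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)
        print(coeffs)  # approximately [2.0, 0.5]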

  4. Collaborative e-Science Experiments and Scientific Workflows

    NARCIS (Netherlands)

    Belloum, A.; Inda, M.A.; Vasunin, D.; Korkhov, V.; Zhao, Z.; Rauwerda, H.; Breit, T.M.; Bubak, M.; Hertzberger, L.O.

    2011-01-01

    Recent advances in Internet and grid technologies have greatly enhanced scientific experiments' life cycle. In addition to compute- and data-intensive tasks, large-scale collaborations involving geographically distributed scientists and e-infrastructure are now possible. Scientific workflows, which

  5. An Imagination Effect in Learning from Scientific Text

    Science.gov (United States)

    Leopold, Claudia; Mayer, Richard E.

    2015-01-01

    Asking students to imagine the spatial arrangement of the elements in a scientific text constitutes a learning strategy intended to foster deep processing of the instructional material. Two experiments investigated the effects of mental imagery prompts on learning from scientific text. Students read a computer-based text on the human respiratory…

  6. Scientific annual report 1972

    International Nuclear Information System (INIS)

    This is a report on scientific research at DESY in 1972. The activities in the fields of electron-nucleon scattering, photoproduction and synchrotron radiation receive special mention. Work on the double storage ring and on the extension of the synchrotron is also reported. (WL/LN) [de

  7. Funding scientific open access

    International Nuclear Information System (INIS)

    Canessa, E.; Fonda, C.; Zennaro, M.

    2006-11-01

    In order to reduce the knowledge divide, more Open Access Journals (OAJ) are needed in all languages and scholarly subject areas that exercise peer review or editorial quality control. To finance the costs involved, we discuss why and how to sell targeted advertising by associating ads with given scientific keywords. (author)

  8. Scientific Report 2007

    International Nuclear Information System (INIS)

    2009-09-01

    This annual scientific report gives a concise overview of research and development activities at the Belgian Nuclear Research Centre SCK-CEN in 2007. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge and fusion research

  9. Report of scientific results

    International Nuclear Information System (INIS)

    1978-01-01

    The findings of R+D activities of the HMI radiation chemistry department in the fields of pulsed radiolysis, reaction kinetics, insulators and plastics are presented, as well as the scientific publications and lectures of HMI staff and visitors, including theoretical contributions, theses and dissertations, and conference papers. (HK) [de

  10. Scientific Report 2001

    International Nuclear Information System (INIS)

    2002-04-01

    The annual scientific report gives an overview of the R and D activities at the Belgian Nuclear Research Centre SCK-CEN in 2001. The report discusses progress and main achievements in four principal areas: Radiation Protection, Radioactive Waste and Clean-up, Reactor Safety and the BR2 Reactor

  11. Scientific Report 2005

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-04-15

    The annual scientific report gives a summary overview of the research and development activities at the Belgian Nuclear Research Centre SCK-CEN in 2005. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge and fusion research.

  12. Dorky Poll Scientific Fears

    CERN Multimedia

    2008-01-01

    The questions posed in yesterday's posts about hopes for 2008 were half of what we were asked by the Powers That Be. The other half: What scientific development do you fear you'll be blogging or reading about in 2008?

  13. Scientific Report 2004

    International Nuclear Information System (INIS)

    2005-04-01

    The annual scientific report gives a summary overview of the research and development activities at the Belgian Nuclear Research Centre SCK-CEN in 2004. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge and fusion research

  14. Scientific Report 2004

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-04-01

    The annual scientific report gives a summary overview of the research and development activities at the Belgian Nuclear Research Centre SCK-CEN in 2004. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge and fusion research.

  15. Is risk analysis scientific?

    Science.gov (United States)

    Hansson, Sven Ove; Aven, Terje

    2014-07-01

    This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks. The model covers five elements (evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision) and relates these elements to the domains of experts and decision makers and to the fact-based and value-based domains. We conclude that risk analysis is a scientific field of study when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.

  16. Scientific Medical Journal

    African Journals Online (AJOL)

    Scientific Medical Journal, an official journal of Egyptian Medical Education, provides a forum for the dissemination of knowledge and the exchange of ideas, information and experience among workers, investigators and clinicians in all disciplines of medicine, with emphasis on treatment and prevention.

  17. Scientific Report 2001

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-04-01

    The annual scientific report gives an overview of the R and D activities at the Belgian Nuclear Research Centre SCK-CEN in 2001. The report discusses progress and main achievements in four principal areas: Radiation Protection, Radioactive Waste and Clean-up, Reactor Safety and the BR2 Reactor.

  18. Assessing Scientific Performance.

    Science.gov (United States)

    Weiner, John M.; And Others

    1984-01-01

    A method for assessing scientific performance based on relationships displayed numerically in published documents is proposed and illustrated using published documents in pediatric oncology for the period 1979-1982. Contributions of a major clinical investigations group, the Childrens Cancer Study Group, are analyzed. Twenty-nine references are…

  19. Scientific Report 2006

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-09-15

    The annual scientific report gives a summary overview of the research and development activities at the Belgian Nuclear Research Centre SCK-CEN in 2006. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge and fusion research.

  20. Scientific Report 2006

    International Nuclear Information System (INIS)

    2007-09-01

    The annual scientific report gives a summary overview of the research and development activities at the Belgian Nuclear Research Centre SCK-CEN in 2006. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge and fusion research

  1. Scientific Report 2003

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-01-01

    The annual scientific report gives an overview of the R and D activities at the Belgian Nuclear Research Centre SCK-CEN in 2003. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge, and fusion research.

  2. 3 CFR - Scientific Integrity

    Science.gov (United States)

    2010-01-01

    ... information in policymaking. The selection of scientists and technology professionals for positions in the... Administration on a wide range of issues, including improvement of public health, protection of the environment... technological findings and conclusions. If scientific and technological information is developed and used by the...

  3. Scientific annual report 1973

    International Nuclear Information System (INIS)

    A report is given on the scientific research at DESY in 1973, which included the first storage of electrons in the double storage ring DORIS. Also mentioned are the two large spectrometers PLUTO and DASP, and experiments relating to elementary particles, synchrotron radiation, and the improvement of the equipment are described. (WL/AK) [de

  4. Scientific Report 2005

    International Nuclear Information System (INIS)

    2006-04-01

    The annual scientific report gives a summary overview of the research and development activities at the Belgian Nuclear Research Centre SCK-CEN in 2005. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge and fusion research

  5. Scientific Report 2003

    International Nuclear Information System (INIS)

    2004-01-01

    The annual scientific report gives an overview of the R and D activities at the Belgian Nuclear Research Centre SCK-CEN in 2003. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge, and fusion research

  6. Mario Bunge's Scientific Realism

    Science.gov (United States)

    Cordero, Alberto

    2012-01-01

    This paper presents and comments on Mario Bunge's scientific realism. After a brief introduction in Sect. 1, Sect. 2 outlines Bunge's conception of realism. Focusing on the case of quantum mechanics, Sect. 3 explores how his approach plays out for problematic theories. Section 4 comments on Bunge's project against the background of the current…

  7. 1995 Scientific Report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-11-01

    This annual scientific report of SCK-CEN presents comprehensive coverage of the research activities in the fields of (a) waste and site restoration, (b) reactor safety and radiation protection, (c) operation of the BR2 Materials Testing Reactor and (d) services provided by the centre (analysis for characterization of waste packages, nuclear measurements, low-level radioactivity measurements).

  8. 2003 Scientific Technological Report

    International Nuclear Information System (INIS)

    Prado Cuba, A.; Gayoso Caballero, C.; Robles Nique, A.; Olivera Lescano, P.

    2004-08-01

    This annual scientific-technological report provides an overview of research and development activities at the Peruvian Institute of Nuclear Energy (IPEN) during the period from 1 January to 31 December 2003. The report includes 54 papers divided into nine subject areas: physics and nuclear chemistry, nuclear engineering, materials science, radiochemistry, industrial applications, medical applications, environmental applications, protection and radiological safety, and management aspects

  9. Scientific Tourism in Armenia

    Science.gov (United States)

    Tashchyan, Davit

    2016-12-01

    Scientific tourism is a relatively new direction worldwide, yet it has already gained great popularity. It arose in the 1980s, but its ideological basis goes back to the earliest periods of human history. In Armenia it is a completely new phenomenon, still unfamiliar to many people. At the global level, scientific tourism has several definitions: for example, Mrs. Pichelerova, a member of the scientific tourist centre of Zlovlen, explains that "The essence of the scientific tourism is based on the provision of the educational, cultural and entertainment needs of a group of people who are interested in the same thing", which in our opinion is a very comprehensive and discreet definition. We also have our own views on this type of tourism. Our philosophy is that, while keeping to the general principles, we put the emphasis on strengthening science-individual ties. Our main emphasis is on scientific-experimental tourism, but this does not mean that we take no steps towards other forms of tourism. Studying the global experience and combining it with our resources, we are trying to arrive at a new interdisciplinary science, which will bring together a number of different professionals as well as individuals, and as a result will yield a new lore. It is in this way that an astronomer will become an archaeologist, an archaeologist will become an astrophysicist, etc. Speaking of interdisciplinary sciences, it is worth mentioning that in recent years their role at the global level has been considered more and more important. In these terms, tourism is an excellent platform for the creation of interdisciplinary sciences and, therefore, for the preparation of corresponding scholars. Nevertheless, scientific tourism is very important for the revelation, appreciation and promotion of the country's historical-cultural heritage and scientific potential. Let us not forget either that tourism in all its

  10. The PBase Scientific Workflow Provenance Repository

    Directory of Open Access Journals (Sweden)

    Víctor Cuevas-Vicenttín

    2014-10-01

    Full Text Available Scientific workflows and their supporting systems are becoming increasingly popular for compute-intensive and data-intensive scientific experiments. The advantages scientific workflows offer include rapid and easy workflow design, software and data reuse, scalable execution, sharing and collaboration, and other advantages that altogether facilitate “reproducible science”. In this context, provenance – information about the origin, context, derivation, ownership, or history of some artifact – plays a key role, since scientists are interested in examining and auditing the results of scientific experiments. However, in order to perform such analyses on scientific results as part of extended research collaborations, an adequate environment and tools are required. Concretely, the need arises for a repository that will facilitate the sharing of scientific workflows and their associated execution traces in an interoperable manner, also enabling querying and visualization. Furthermore, such functionality should be supported while taking performance and scalability into account. With this purpose in mind, we introduce PBase: a scientific workflow provenance repository implementing the ProvONE proposed standard, which extends the emerging W3C PROV standard for provenance data with workflow specific concepts. PBase is built on the Neo4j graph database, thus offering capabilities such as declarative and efficient querying. Our experiences demonstrate the power gained by supporting various types of queries for provenance data. In addition, PBase is equipped with a user friendly interface tailored for the visualization of scientific workflow provenance data, making the specification of queries and the interpretation of their results easier and more effective.
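
    Since PBase stores ProvONE provenance graphs in Neo4j, lineage questions can be answered with short declarative graph queries. The sketch below is illustrative only: it assumes the standard neo4j Python driver, and the node labels, relationship types and connection details are hypothetical stand-ins for a ProvONE-style schema, not PBase's actual data model.

        # Hypothetical sketch: querying a ProvONE-style provenance graph in Neo4j.
        from neo4j import GraphDatabase

        driver = GraphDatabase.driver("bolt://localhost:7687",
                                      auth=("neo4j", "password"))  # placeholder credentials

        # Which workflow executions consumed a given data item, and when?
        QUERY = """
        MATCH (e:Execution)-[:USED]->(d:Data {title: $title})
        MATCH (e)-[:WAS_ASSOCIATED_WITH]->(w:Workflow)
        RETURN w.name AS workflow, e.startTime AS started
        """

        with driver.session() as session:
            for record in session.run(QUERY, title="input.csv"):
                print(record["workflow"], record["started"])

        driver.close()

    A query like this traverses the provenance graph directly instead of joining tables, which is the capability the Neo4j backing is meant to provide.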

  11. Scientific progress report 1979

    International Nuclear Information System (INIS)

    1979-01-01

    The report discusses planning and development activities in the field of computer systems, user software, process computer systems, measurement and control technology, processing of process data in medicine, research on structural elements and radiation tests carried out in the data processing and electronics department. (WB) [de

  12. Scientific progress report 1978

    International Nuclear Information System (INIS)

    1978-01-01

    Recent work of HMI in the following fields is reported: HMI interconnected computer operation, numerical and graphical user software, process computer systems (software and hardware), measurement and control, process data processing in medicine, and research on structural element radiation tests. (WB) [de

  13. Parallel computing works!

    CERN Document Server

    Fox, Geoffrey C; Messina, Guiseppe C

    2014-01-01

    A clear illustration of how parallel computers can be successfully applied to large-scale scientific computations. This book demonstrates how a variety of applications in physics, biology, mathematics and other sciences were implemented on real parallel computers to produce new scientific results. It investigates issues of fine-grained parallelism relevant for future supercomputers with particular emphasis on hypercube architecture. The authors describe how they used an experimental approach to configure different massively parallel machines, design and implement basic system software, and develop

  14. Turning Scientific Presentations into Stories

    Science.gov (United States)

    Aruffo, Christopher

    2015-01-01

    To increase students' confidence in giving scientific presentations, students were shown how to present scientific findings as a narrative story. Students who were preparing to give a scientific talk attended a workshop in which they were encouraged to experience the similarities between telling a personal anecdote and presenting scientific data.…

  15. A training program for scientific supercomputing users

    Energy Technology Data Exchange (ETDEWEB)

    Hanson, F.; Moher, T.; Sabelli, N.; Solem, A.

    1988-01-01

    There is need for a mechanism to transfer supercomputing technology into the hands of scientists and engineers in such a way that they will acquire a foundation of knowledge that will permit integration of supercomputing as a tool in their research. Most computing center training emphasizes computer-specific information about how to use a particular computer system; most academic programs teach concepts to computer scientists. Only a few brief courses and new programs are designed for computational scientists. This paper describes an eleven-week training program aimed principally at graduate and postdoctoral students in computationally-intensive fields. The program is designed to balance the specificity of computing center courses, the abstractness of computer science courses, and the personal contact of traditional apprentice approaches. It is based on the experience of computer scientists and computational scientists, and consists of seminars and clinics given by many visiting and local faculty. It covers a variety of supercomputing concepts, issues, and practices related to architecture, operating systems, software design, numerical considerations, code optimization, graphics, communications, and networks. Its research component encourages understanding of scientific computing and supercomputer hardware issues. Flexibility in thinking about computing needs is emphasized by the use of several different supercomputer architectures, such as the Cray X/MP48 at the National Center for Supercomputing Applications at University of Illinois at Urbana-Champaign, IBM 3090 600E/VF at the Cornell National Supercomputer Facility, and Alliant FX/8 at the Advanced Computing Research Facility at Argonne National Laboratory. 11 refs., 6 tabs.

  16. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  17. The philosophy of scientific experimentation: a review

    Science.gov (United States)

    2009-01-01

    Practicing and studying automated experimentation may benefit from philosophical reflection on experimental science in general. This paper reviews the relevant literature and discusses central issues in the philosophy of scientific experimentation. The first two sections present brief accounts of the rise of experimental science and of its philosophical study. The next sections discuss three central issues of scientific experimentation: the scientific and philosophical significance of intervention and production, the relationship between experimental science and technology, and the interactions between experimental and theoretical work. The concluding section identifies three issues for further research: the role of computing and, more specifically, automating, in experimental research, the nature of experimentation in the social and human sciences, and the significance of normative, including ethical, problems in experimental science. PMID:20098589

  18. Sergio Bertolucci - Towards dynamic scientific research

    CERN Multimedia

    2009-01-01

    Sergio Bertolucci has become Director for Research and Scientific Computing at the moment when the LHC is almost ready to deliver its first physics data. In this interview, he explains the importance of the perfect mix of collaboration and competition that will make the LHC scientific programme successful. Sergio Bertolucci’s enthusiasm for being at CERN at this historic time is evident from the first minute of the interview and has not waned after an hour speaking with us. Bertolucci’s recipe for a successful start-up of the physics delivery phase of the LHC is "Festina lente", a Latin motto that means something like ‘hasten slowly’. "The LHC is probably the biggest and most complex scientific enterprise ever undertaken by humanity," says Bertolucci. "It will certainly lead us towards a new phase of our understanding of the Universe. Nature is already giving us some indications but only the LHC will allow us to observe the ne...

  19. The need for scientific software engineering in the pharmaceutical industry.

    Science.gov (United States)

    Luty, Brock; Rose, Peter W

    2017-03-01

    Scientific software engineering is a distinct discipline from both computational chemistry project support and research informatics. A scientific software engineer not only has a deep understanding of the science of drug discovery but also the desire, skills and time to apply good software engineering practices. A good team of scientific software engineers can create a software foundation that is maintainable, validated and robust. If done correctly, this foundation enables the organization to investigate new and novel computational ideas with a very high level of efficiency.

  20. Business and scientific workflows a web service-oriented approach

    CERN Document Server

    Tan, Wei

    2013-01-01

    Focuses on how to use web service computing and service-based workflow technologies to develop timely, effective workflows for both business and scientific fields Utilizing web computing and Service-Oriented Architecture (SOA), Business and Scientific Workflows: A Web Service-Oriented Approach focuses on how to design, analyze, and deploy web service-based workflows for both business and scientific applications in many areas of healthcare and biomedicine. It also discusses and presents the recent research and development results. This informative reference features app

  1. The need for scientific software engineering in the pharmaceutical industry

    Science.gov (United States)

    Luty, Brock; Rose, Peter W.

    2017-03-01

    Scientific software engineering is a distinct discipline from both computational chemistry project support and research informatics. A scientific software engineer not only has a deep understanding of the science of drug discovery but also the desire, skills and time to apply good software engineering practices. A good team of scientific software engineers can create a software foundation that is maintainable, validated and robust. If done correctly, this foundation enables the organization to investigate new and novel computational ideas with a very high level of efficiency.

  2. Scientific annual report 1977

    International Nuclear Information System (INIS)

    1977-01-01

    In the section data processing and electronics, application-oriented R+D activities in the field of general and process-oriented data processing, digital and analogue measuring systems and electronic components are carried out in seven working groups and two project groups: HMI computer network; mathematical software and computer graphics; software development; nuclear electronics, measurement and control; research on components and irradiation tests; operation of the central computers; process data processing in medicine; cooperation with the Wissenschaftliches Rechenzentrum Berlin (WRB) in the project BERNET. (orig./WB) [de

  3. Scientific Programming in Fortran

    Directory of Open Access Journals (Sweden)

    W. Van Snyder

    2007-01-01

    Full Text Available The Fortran programming language was designed by John Backus and his colleagues at IBM to reduce the cost of programming scientific applications. IBM delivered the first compiler for its model 704 in 1957. IBM's competitors soon offered incompatible versions. ANSI (ASA at the time) developed a standard, largely based on IBM's Fortran IV, in 1966. Revisions of the standard were produced in 1977, 1990, 1995 and 2003. Development of a revision, scheduled for 2008, is under way. Unlike most other programming languages, Fortran is periodically revised to keep pace with developments in language and processor design, while revisions largely preserve compatibility with previous versions. Throughout, the focus on scientific programming, and especially on efficient generated programs, has been maintained.

  4. 1997 Scientific Report

    International Nuclear Information System (INIS)

    Govaerts, P.

    1998-01-01

    The 1997 Scientific Report of the Belgian Nuclear Research Centre SCK-CEN describes progress achieved in nuclear safety, radioactive waste management, radiation protection and safeguards. In the field of nuclear research, the main projects concern the behaviour of high-burnup and MOX fuel, the embrittlement of reactor pressure vessels, the irradiation-assisted stress corrosion cracking of reactor internals, and irradiation effects on materials of fusion reactors. In the field of radioactive waste management, progress in the following domains is reported: the disposal of high-level radioactive waste and spent fuel in a clay formation, the decommissioning of nuclear installations, the study of alternative waste-processing techniques. For radiation protection and safeguards, the main activities reported on are in the field of site and environmental restoration, emergency planning and response and scientific support to national and international programmes

  5. Scientific report 1999

    International Nuclear Information System (INIS)

    1999-01-01

    The aim of this report is to outline the main developments of the 'Departement des Reacteurs Nucleaires' (DRN) during the year 1999. DRN is one of the CEA institutions. This report is divided into three main parts: the DRN scientific programs, the scientific and technical publications (with abstracts in English) and economic data on staff, budget and communication. The main results of the Department for the year 1999 are presented, giving information on the simulation of low Mach number compressible flow, experimental irradiation of multi-materials, progress in the dry-route conversion process of UF6 to UO2, neutronics, the CASCADE installation, corium, BWR-type reactor core technology, reactor safety, the transmutation of americium, fuel cell flow studies, crack propagation, hybrid systems and the improvement of the CEA sites. (A.L.B.)

  6. Scientific publications in Nepal.

    Science.gov (United States)

    Magar, A

    2012-09-01

    Scientific publications have become a mainstay of communication among readers, academicians, researchers and scientists worldwide. Although their existence dates back to the 17th century in the West, Nepal has been struggling for the last 50 years to take even a few steps towards improving its local science. Since the start of the first medical journal in 1963, the challenges regarding the roles of authors, peer reviewers, editors and even publishers in Nepal remain as they were decades ago. Although there has been some development in terms of the number of articles being published and the appearance of the journals, there is still a long way to go. This article analyzes the past and present scenario, and the future perspective, for scientific publications in Nepal.

  7. Sherlock Holmes: scientific detective.

    Science.gov (United States)

    Snyder, Laura J

    2004-09-01

    Sherlock Holmes was intended by his creator, Arthur Conan Doyle, to be a 'scientific detective'. Conan Doyle criticized his predecessor Edgar Allan Poe for giving his creation - Inspector Dupin - only the 'illusion' of scientific method. Conan Doyle believed that he had succeeded where Poe had failed; thus, he has Watson remark that Holmes has 'brought detection as near an exact science as it will ever be brought into the world.' By examining Holmes' methods, it becomes clear that Conan Doyle modelled them on certain images of science that were popular in mid- to late-19th century Britain. Contrary to a common view, it is also evident that rather than being responsible for the invention of forensic science, the creation of Holmes was influenced by the early development of it.

  8. Collaboration in scientific practice

    DEFF Research Database (Denmark)

    Wagenknecht, Susann

    2014-01-01

    This monograph investigates the collaborative creation of scientific knowledge in research groups. To do so, I combine philosophical analysis with a first-hand comparative case study of two research groups in experimental science. Qualitative data are gained through observation and interviews, and I combine empirical insights with existing approaches to knowledge creation in philosophy of science and social epistemology. On the basis of my empirically-grounded analysis I make several conceptual contributions. I study scientific collaboration as the interaction of scientists within research groups … to their publication. Specifically, I suggest epistemic difference and the porosity of social structure as two conceptual leitmotifs in the study of group collaboration. With epistemic difference, I emphasize the value of socio-cognitive heterogeneity in group collaboration. With porosity, I underline the fact…

  9. Scientific report 1998

    International Nuclear Information System (INIS)

    1998-01-01

    The aim of this report is to outline the main developments of the ''Departement des Reacteurs Nucleaires'' (DRN) during the year 1998. DRN is one of the CEA institutions. This report is divided into three main parts: the DRN scientific programs, the scientific and technical publications (with abstracts in English) and economic data on staff, budget and communication. The main results of the Department for the year 1998 are presented, giving information on reactor technology and safety, neutronics, transmutation and hybrid systems, dismantling and site improvement, nuclear accidents, nuclear matter transport, thermonuclear fusion safety, fuel cladding materials and radioactive waste control. (A.L.B.)

  10. Scientific Resource EXplorer

    Science.gov (United States)

    Xing, Z.; Wormuth, A.; Smith, A.; Arca, J.; Lu, Y.; Sayfi, E.

    2014-12-01

    Inquisitive minds in our society are never satisfied with curated images released by a typical public affairs office. They always want to look deeper and play directly on original data. However, most scientific data products are notoriously hard to use. They are immensely large, highly distributed and diverse in format. In this presentation, we will demonstrate Resource EXplorer (REX), a novel webtop application that allows anyone to conveniently explore and visualize rich scientific data repositories, using only a standard web browser. This tool leverages the power of Webification Science (w10n-sci), a powerful enabling technology that simplifies the use of scientific data on the web platform. W10n-sci is now being deployed at an increasing number of NASA data centers, some of which are the largest digital treasure troves in our nation. With REX, these wonderful scientific resources are open for teachers and students to learn and play.

  11. Applications of artificial intelligence to scientific research

    Science.gov (United States)

    Prince, Mary Ellen

    1986-01-01

    Artificial intelligence (AI) is a growing field which is just beginning to make an impact on disciplines other than computer science. While a number of military and commercial applications were undertaken in recent years, few attempts were made to apply AI techniques to basic scientific research. There is no inherent reason for the discrepancy. The characteristics of the problem, rather than its domain, determine whether or not it is suitable for an AI approach. Expert systems, intelligent tutoring systems, and learning programs are examples of theoretical topics which can be applied to certain areas of scientific research. Further research and experimentation should eventually make it possible for computers to act as intelligent assistants to scientists.

  12. Professional scientific blog

    Directory of Open Access Journals (Sweden)

    Tamás Beke

    2009-03-01

    Full Text Available The professional blog is a weblog that on the whole meets the requirements of scientific publication. In my opinion it resembles a digital notice board, where the competent specialists of a given branch of science can post their ideas, questions and possible solutions, and can raise problems. Its most important function can be the collectivization of knowledge. In this article I examine the characteristics of the scientific blog as a genre. Conventional learning is a rather solitary activity. If students have access to the materials of each other and of the teacher, their sense of solitude diminishes, and this model is also closer to the constructivist approach that reflects the way most people think and learn. Learning does not mean passively collecting tiny pieces of knowledge; it much more resembles 'spinning a conceptual net' made up of the experiences and observations of the individual. With the spread of the Internet, many universities and colleges worldwide have tried on-line educational methods, but the most efficient one has not been found yet. The publication of the curriculum (the material of the lectures) and the handling of electronic mail are not sufficient; much more is needed for collaborative learning. Our scholastic scientific blog can be a suitable field for starting a knowledge-building process based on cooperation. The Rocard report states that developing the education of the natural sciences is crucial for the future of Europe, and that this requires action at the local, regional, national and EU levels. Beyond the traditional actors (child, parent, teacher), others (scientists, professionals, universities, local institutions, the actors of the economic sphere, etc.) should be involved in the educational processes. The scholastic scientific blog answers these purposes as a collaborative knowledge-sharing forum.

  13. Scientific Technological Report 2002

    International Nuclear Information System (INIS)

    Gayoso C, C.; Cuya G, T.; Robles N, A.; Prado C, A.

    2003-07-01

    This annual scientific-technological report provides an overview of research and development activities at the Peruvian Institute of Nuclear Energy (IPEN) during the period from 1 January to 31 December 2002. This report includes 58 papers divided into ten subject areas: physics and nuclear chemistry, nuclear engineering, materials, industrial applications, biological applications, medical applications, environmental applications, protection and radiological safety, nuclear safety, and management aspects

  14. Evaluating a scientific collaboratory

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.; Whitton, Mary C.; Maglaughlin, Kelly L.

    2003-01-01

    of the system, and post-interviews to understand the participants' views of doing science under both conditions. We hypothesized that study participants would be less effective, report more difficulty, and be less favorably inclined to adopt the system when collaborating remotely. Contrary to expectations...... of collaborating remotely. While the data analysis produced null results, considered as a whole, the analysis leads us to conclude there is positive potential for the development and adoption of scientific collaboratory systems....

  15. National nuclear scientific program

    International Nuclear Information System (INIS)

    Plecas, I.; Matausek, M.V.; Neskovic, N.

    2001-01-01

    National scientific program of the Vinca Institute Nuclear Reactors And Radioactive Waste comprises research and development in the following fields: application of energy of nuclear fission, application of neutron beams, analyses of nuclear safety and radiation protection. In the first phase preparatory activities, conceptual design and design of certain processes and facilities should be accomplished. In the second phase realization of the projects is expected. (author)

  16. CAD/CAM and scientific data management at Dassault

    Science.gov (United States)

    Bohn, P.

    1984-01-01

    The history of CAD/CAM and scientific data management at Dassault is presented. Emphasis is put on the targets of the now commercially available software CATIA. The links with scientific computations such as aerodynamics and structural analysis are presented, and comments are made on the principles followed within the company. The consequences of the approximative nature of scientific data are examined. The main consequence of the new history function is its protection against copying or alteration. Future plans at Dassault for scientific data appear to be in the opposite direction compared to some general tendencies.

  17. PROSCENIUM OF SCIENTIFIC MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Vasile Berlingher

    2013-09-01

    Full Text Available During the last three decades of the nineteenth century, organizations developed rapidly, and their managers began to realize that they faced managerial problems too frequently; this awareness led to a new phase in the development of scientific management. Examining the titles published in that period, it can be concluded that the management issues of interest related to payroll and payroll systems and to work efficiency, problems exacerbated by the industrial revolution. Noting that large organizations were losing the power of direct supervision, managers looked for incentives to replace this power. One of the first practitioners of this new management system was Henry R. Towne, the president of the well-known enterprise "Yale and Towne Manufacturing Company", who applied the management methods in his company's workshops. The publishers of the magazines "Industrial Management" and "The Engineering Magazine" stated that H. R. Towne is, indisputably, the pioneer of scientific management. He initiated the systematic application of effective management methods, and his famous article "The Engineer as Economist", presented to the American Society of Mechanical Engineers in 1886, probably inspired Frederick W. Taylor to devote his entire life and work to scientific management.

  18. Scientific visualization - past, present and future

    International Nuclear Information System (INIS)

    Brodlie, K.

    1995-01-01

    This paper presents a general overview of scientific visualization from a historical orientation. It looks first at visualization before the advent of computers, and then goes on to describe the development of early visualization tools in the 'computer age'. There was a surge of interest in visualization in the latter part of the 1980s, following the publication of an NSF report. This sparked the development of a number of major visualization software systems such as AVS and IRIS Explorer. These are described, and the paper concludes with a look at future developments. ((orig.))

  19. Scientific work environments in the next decade

    Science.gov (United States)

    Gomez, Julian E.

    1989-01-01

    The applications of contemporary computer graphics to scientific visualization is described, with emphasis on the nonintuitive problems. A radically different approach is proposed which centers on the idea of the scientist being in the simulation display space rather than observing it on a screen. Interaction is performed with nonstandard input devices to preserve the feeling of being immersed in the three-dimensional display space. Construction of such a system could begin now with currently available technology.

  20. The paradox of scientific expertise

    DEFF Research Database (Denmark)

    Alrøe, Hugo Fjelsted; Noe, Egon

    2011-01-01

    Modern societies depend on a growing production of scientific knowledge, which is based on the functional differentiation of science into still more specialised scientific disciplines and subdisciplines. This is the basis for the paradox of scientific expertise: the growth of science leads to a fragmentation of scientific expertise. To resolve this paradox, the present paper investigates three hypotheses: 1) All scientific knowledge is perspectival. 2) The perspectival structure of science leads to specific forms of knowledge asymmetries. 3) Such perspectival knowledge asymmetries must be handled … in cross-disciplinary research and in the collective use of different kinds of scientific expertise, and thereby make society better able to solve complex, real-world problems.

  1. Predicting future discoveries from current scientific literature.

    Science.gov (United States)

    Petrič, Ingrid; Cestnik, Bojan

    2014-01-01

    Knowledge discovery in biomedicine is a time-consuming process starting from basic research, through preclinical testing, towards possible clinical applications. Crossing conceptual boundaries is often needed for groundbreaking biomedical research that generates highly inventive discoveries. We demonstrate the ability of a creative literature mining method to advance valuable new discoveries based on rare ideas from existing literature. When emerging ideas from scientific literature are put together as fragments of knowledge in a systematic way, they may lead to original, sometimes surprising, research findings. If enough scientific evidence is already published for the association of such findings, they can be considered as scientific hypotheses. In this chapter, we describe a method for the computer-aided generation of such hypotheses based on the existing scientific literature. Our literature-based discovery of NF-kappaB and its possible connections to autism was recently confirmed by the scientific community, which demonstrates the ability of our literature mining methodology to accelerate future discoveries based on rare ideas from existing literature.
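
    The chapter's method itself is not reproduced here, but the underlying idea of literature-based discovery is often summarized as Swanson's ABC model: if concept A co-occurs with B in some papers and B with C in others, while A and C never appear together, then the A-C link is a candidate hypothesis. A minimal sketch under that assumption, taking documents already reduced to sets of concept terms (it illustrates the general principle, not the authors' actual pipeline):

        # Toy Swanson-style ABC discovery over documents given as concept sets.
        from collections import defaultdict
        from itertools import combinations

        def abc_hypotheses(doc_concepts):
            cooccur = defaultdict(set)  # concept -> concepts seen with it
            for concepts in doc_concepts:
                for a, b in combinations(set(concepts), 2):
                    cooccur[a].add(b)
                    cooccur[b].add(a)
            hypotheses = set()
            for a, linked in cooccur.items():
                for b in linked:
                    for c in cooccur[b]:
                        # A and C are bridged by B but never co-occur themselves
                        if c != a and c not in cooccur[a]:
                            hypotheses.add(tuple(sorted((a, c))))
            return hypotheses

        docs = [{"NF-kappaB", "inflammation"}, {"inflammation", "autism"}]
        print(abc_hypotheses(docs))  # {('NF-kappaB', 'autism')}

    Real systems add ranking, for example by the rarity or recency of the bridging terms, to keep the candidate list manageable.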

  2. 2015 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  3. 2014 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  4. On the Possibility of a Scientific Theory of Scientific Method.

    Science.gov (United States)

    Nola, Robert

    1999-01-01

    Discusses the philosophical strengths and weaknesses of Laudan's normative naturalism, which understands the principles of scientific method to be akin to scientific hypotheses, and therefore open to test like any principle of science. Contains 19 references. (Author/WRM)

  5. The science of computing shaping a discipline

    CERN Document Server

    Tedre, Matti

    2014-01-01

    The identity of computing has been fiercely debated throughout its short history. Why is it still so hard to define computing as an academic discipline? Is computing a scientific, mathematical, or engineering discipline? By describing the mathematical, engineering, and scientific traditions of computing, The Science of Computing: Shaping a Discipline presents a rich picture of computing from the viewpoints of the field's champions. The book helps readers understand the debates about computing as a discipline. It explains the context of computing's central debates and portrays a broad perspecti

  6. Marie Curie: scientific entrepreneur

    International Nuclear Information System (INIS)

    Boudia, S.

    1998-01-01

    Marie Curie is best known for her discovery of radium one hundred years ago this month, but she also worked closely with industry in developing methods to make and monitor radioactive material, as Soraya Boudia explains. One hundred years ago this month, on 28 December 1898, Pierre Curie, Marie Sklodowska-Curie and Gustave Bemont published a paper in Comptes-rendus - the journal of the French Academy of Sciences. In the paper they announced that they had discovered a new element with astonishing properties: radium. But for one of the authors, Marie Curie, the paper was more than just the result of outstanding work: it showed that a woman could succeed in what was then very much a male-dominated scientific world. Having arrived in Paris from Poland in 1891, Marie Curie became the first woman in France to obtain a PhD in physics, the first woman to win a Nobel prize and the first woman to teach at the Sorbonne. She also helped to found a new scientific discipline: the study of radioactivity. She became an icon and a role-model for other women to follow, someone who succeeded - despite many difficulties - in imposing herself on the world of science. Although Curie's life story is a familiar and well documented one, there is one side to her that is less well known: her interaction with industry. As well as training many nuclear physicists and radiochemists in her laboratory, she also became a scientific pioneer in industrial collaboration. In this article the author describes this side of Marie Curie. (UK)

  7. Scientific (Wo)manpower?

    DEFF Research Database (Denmark)

    Amilon, Anna; Persson, Inga

    2013-01-01

    Purpose – The purpose of this paper is to investigate to what extent male and female PhDs choose academic vs non‐academic employment. Further, it analyses gender earnings differences in the academic and non‐academic labour markets. Design/methodology/approach – Rich Swedish cross‐sectional regist...... scientific human capital. Originality/value – The study is the first to investigate career‐choice and earnings of Swedish PhDs. Further, the study is the first to investigate both the academic and the non‐academic labour markets....

  8. Scientific report 1999

    International Nuclear Information System (INIS)

    2000-01-01

    This scientific report of the Fuel Cycle Direction of the CEA presents the Direction's activities and research programs in the fuel cycle domain during the year 1999. The first chapter is devoted to the front end of the fuel cycle, with the SILVA process as its main topic. The second chapter is largely devoted to the separation chemistry of the back end of the cycle. The third and fourth chapters present more applied studies and technical developments, within the nuclear industry and beyond. (A.L.B.)

  9. Scientific report 1997

    International Nuclear Information System (INIS)

    Gosset, J.; Gueneau, C.; Doizi, D.

    1998-01-01

    This book presents technical and scientific papers on the main work of the Direction of the Fuel Cycle (DCC) in France. The fields of study are: the front end of the nuclear fuel cycle, with theoretical studies (plasma simulation) and technological developments and instrumentation (laser diodes, carbide plasma projection, carbon-13 enrichment); the back end of the nuclear fuel cycle, with theoretical studies (Eu3+ ion complexation simulation, decay simulation, uranium and plutonium diffusion studies, electrolyser operating simulation), scenario studies (recycling, waste management) and experimental studies; dismantling and cleaning (soil cleaning, surface-active agents for decontamination, fault tree analysis); and analysis with expert systems and mass spectrometry. (A.L.B.)

  10. Annual scientific report 1974

    International Nuclear Information System (INIS)

    Billiau, R.; Bobin, K.; Michiels, G.; Proost, J.

    1975-01-01

    The main activities of SCK/CEN during 1974 are reported in individual summaries. Fields of research are the following: sodium cooled fast reactors, gas cooled reactors, light water reactors, applied nuclear research (including waste disposal, safeguards and fusion research), basic and exploratory research (including materials science, nuclear physics and radiobiology). The BR2 Materials testing reactor and associated facilities are described. The technical and administrative support activities are also presented. A list of publications issued by the SCK/CEN Scientific staff is given

  11. SCIENTIFIC BASIS OF DENTISTRY

    Directory of Open Access Journals (Sweden)

    Yegane GÜVEN

    2017-10-01

    Full Text Available Technological and scientific innovations in the dentistry profession have increased exponentially over the past years. In this article, these developments are evaluated both in terms of clinical practice and their place in the educational program. The effects of the biologic and digital revolutions on dental education and daily clinical practice are also reviewed. Biomimetics, personalized dental medicine, regenerative dentistry, nanotechnology, high-end simulations providing virtual reality, genomic information, and stem cell studies will gain more importance in the coming years, moving dentistry to a different dimension.

  12. Annual scientific report 1975

    International Nuclear Information System (INIS)

    Billiau, R.; Bobin, K.; Michiels, G.; Proost, J.

    1976-01-01

    The main activities of SCK/CEN during 1975 are reported in individual summaries. Fields of research are the following: sodium cooled fast reactors, gas cooled reactors, light water reactors, applied nuclear research (including waste disposal, safeguards and fusion research), basic and exploratory research (including materials science, nuclear physics and radiobiology). The BR2 Materials testing reactor and associated facilities are described. The technical and administrative support activities are also presented. A list of publications issued by the SCK/CEN Scientific staff is given

  13. Energy and scientific communication

    Science.gov (United States)

    De Sanctis, E.

    2013-06-01

    Energy communication is a paradigmatic case of scientific communication. It is particularly important today, when the world is confronted with a number of immediate, urgent problems. Science communication has become a real duty and a big challenge for scientists. It serves to create and foster a climate of reciprocal knowledge and trust between science and society, and to establish a good level of interest and enthusiasm for research. For an effective communication it is important to establish an open dialogue with the audience, and a close collaboration among scientists and science communicators. An international collaboration in energy communication is appropriate to better support international and interdisciplinary research and projects.

  14. Evolution of the Virtualized HPC Infrastructure of Novosibirsk Scientific Center

    International Nuclear Information System (INIS)

    Adakin, A; Chubarov, D; Nikultsev, V; Anisenkov, A; Belov, S; Kaplin, V; Korol, A; Skovpen, K; Sukharev, A; Zaytsev, A; Kalyuzhny, V; Kuchin, N; Lomakin, S

    2012-01-01

    Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of Russian Academy of Sciences including Budker Institute of Nuclear Physics (BINP), Institute of Computational Technologies, and Institute of Computational Mathematics and Mathematical Geophysics (ICM and MG). Since each institute has specific requirements on the architecture of computing farms involved in its research field, currently we've got several computing facilities hosted by NSC institutes, each optimized for a particular set of tasks, of which the largest are the NSU Supercomputer Center, Siberian Supercomputer Center (ICM and MG), and a Grid Computing Facility of BINP. A dedicated optical network with the initial bandwidth of 10 Gb/s connecting these three facilities was built in order to make it possible to share the computing resources among the research communities, thus increasing the efficiency of operating the existing computing facilities and offering a common platform for building the computing infrastructure for future scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technology based on XEN and KVM platforms. This contribution gives a thorough review of the present status and future development prospects for the NSC virtualized computing infrastructure and the experience gained while using it for running production data analysis jobs related to HEP experiments being carried out at BINP, especially the KEDR detector experiment at the VEPP-4M electron-positron collider.

  15. Reversible computing fundamentals, quantum computing, and applications

    CERN Document Server

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique.Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergr
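
    The defining property the book builds on, that a reversible gate computes a bijection so no input information is destroyed, is easy to demonstrate. A minimal sketch (a generic illustration, not an example from the book) using the Toffoli gate, which is universal for reversible Boolean logic and is its own inverse:

        # Toffoli (controlled-controlled-NOT): flips target c iff both controls are 1.
        from itertools import product

        def toffoli(a, b, c):
            return a, b, c ^ (a & b)

        # Reversibility: applying the gate twice restores every 3-bit input.
        for bits in product((0, 1), repeat=3):
            assert toffoli(*toffoli(*bits)) == bits

        # Irreversible AND embedded reversibly: with c = 0 the target output is a AND b.
        print(toffoli(1, 1, 0))  # (1, 1, 1)

    Because no information is erased, such circuits are not subject to Landauer's lower bound on the energy dissipated per lost bit, which is the link to low-power computing.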

  16. The Scientific Case against Astrology.

    Science.gov (United States)

    Kelly, Ivan

    1980-01-01

    Discussed is the lack of a scientific foundation and scientific evidence favoring astrology. Included are several research studies conducted to examine astrological tenets which yield generally negative results. (Author/DS)

  17. Expectations for a scientific collaboratory

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.

    2003-01-01

    In the past decade, a number of scientific collaboratories have emerged, yet adoption of scientific collaboratories remains limited. Meeting expectations is one factor that influences adoption of innovations, including scientific collaboratories. This paper investigates expectations scientists have...... with respect to scientific collaboratories. Interviews were conducted with 17 scientists who work in a variety of settings and have a range of experience conducting and managing scientific research. Results indicate that scientists expect a collaboratory to: support their strategic plans; facilitate management...... of the scientific process; have a positive or neutral impact on scientific outcomes; provide advantages and disadvantages for scientific task execution; and provide personal conveniences when collaborating across distances. These results both confirm existing knowledge and raise new issues for the design...

  18. Metadata in Scientific Dialects

    Science.gov (United States)

    Habermann, T.

    2011-12-01

    Discussions of standards in the scientific community have been compared to religious wars for many years. The only things scientists agree on in these battles are either "standards are not useful" or "everyone can benefit from using my standard". Instead of achieving the goal of facilitating interoperable communities, in many cases the standards have served to build yet another barrier between communities. Some important progress towards diminishing these obstacles has been made in the data layer with the merger of the NetCDF and HDF scientific data formats. The universal adoption of XML as the standard for representing metadata and the recent adoption of ISO metadata standards by many groups around the world suggests that similar convergence is underway in the metadata layer. At the same time, scientists and tools will likely need support for native tongues for some time. I will describe an approach that combines re-usable metadata "components" and restful web services that provide those components in many dialects. This approach uses advanced XML concepts of referencing and linking to construct complete records that include reusable components and builds on the ISO Standards as the "unabridged dictionary" that encompasses the content of many other dialects.
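
    The component-and-reference approach described here can be made concrete with ordinary XML linking: a record points at a reusable component instead of repeating it, and a service resolves the reference into whichever dialect the caller asked for. In the sketch below the element names and the component URL are invented for illustration; only the W3C XLink namespace and Python's standard library are assumed.

        # Hypothetical sketch: a metadata record referencing a reusable component.
        import xml.etree.ElementTree as ET

        XLINK = "http://www.w3.org/1999/xlink"
        ET.register_namespace("xlink", XLINK)

        record = ET.Element("metadataRecord")
        ET.SubElement(record, "title").text = "Sea surface temperature, 2010"
        # Reference a shared contact component rather than repeating it inline;
        # a resolver can splice in the component, translated into the target dialect.
        ET.SubElement(record, "contact",
                      {f"{{{XLINK}}}href": "https://example.org/components/contacts/noaa.xml"})

        print(ET.tostring(record, encoding="unicode"))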

  19. Budapest scientific a guidebook

    CERN Document Server

    Hargittai, István

    2015-01-01

    This guidebook introduces the reader—the scientific tourist and others—to the visible memorabilia of science and scientists in Budapest—statues, busts, plaques, buildings, and other artefacts. According to the Hungarian–American Nobel laureate Albert Szent-Györgyi, this metropolis at the crossroads of Europe has a special atmosphere of respect for science. It has been the venue of numerous scientific achievements and the cradle, literally, of many individuals who in Hungary, and even more beyond its borders became world-renowned contributors to science and culture. Six of the eight chapters of the book cover the Hungarian Nobel laureates, the Hungarian Academy of Sciences, the university, the medical school, agricultural sciences, and technology and engineering. One chapter is about selected gimnáziums from which seven Nobel laureates (Szent-Györgyi, de Hevesy, Wigner, Gabor, Harsanyi, Olah, and Kertész) and the five “Martians of Science” (von Kármán, Szilard, Wigner, von Neumann, and Teller...

  20. Compendium of Scientific Linacs

    Energy Technology Data Exchange (ETDEWEB)

    Clendenin, James E

    2003-05-16

    The International Committee supported the proposal of the Chairman of the XVIII International Linac Conference to issue a new Compendium of linear accelerators. The last one was published in 1976. The Local Organizing Committee of Linac96 decided to set up a sub-committee for this purpose. Contrary to the catalogues of the High Energy Accelerators which compile accelerators with energies above 1 GeV, we have not defined a specific limit in energy. Microtrons and cyclotrons are not in this compendium. Also data from thousands of medical and industrial linacs has not been collected. Therefore, only scientific linacs are listed in the present compendium. Each linac found in this research and involved in a physics context was considered. It could be used, for example, either as an injector for high energy accelerators, or in nuclear physics, materials physics, free electron lasers or synchrotron light machines. Linear accelerators are developed in three continents only: America, Asia, and Europe. This geographical distribution is kept as a basis. The compendium contains the parameters and status of scientific linacs. Most of these linacs are operational. However, many facilities under construction or design studies are also included. A special mention has been made at the end for the studies of future linear colliders.

  1. Verified scientific findings

    International Nuclear Information System (INIS)

    Bullinger, M.G.

    1982-01-01

    In this essay, the author attempts to clarify for the reader the meaning of the term ''verified scientific findings'' in section 13, sub-section 1, sentence 2 of the new Chemicals Control Law. The examples given here are the generally accepted regulations with regard to technology (sections 7a and 18b of the WHG (law on water economy), section 3, sub-section 1 of the machine- and engine-protection laws), the status of technology (section 3, sub-section 6 of the BImSchG (Federal law on prevention of air-borne pollution)), and the status of science (section 5, sub-section 2 of the AMG (drug legislation)). The ''status of science and technology'' as defined in sections 4 ff of the Atomic Energy Law (AtomG) and in sections 3, 4, 12, 2) of the First Radiation Protection Ordinance (1.StrlSch. VO) is also discussed. The author defines this, in his opinion dynamic, term as the generally recognized results of scientific research and the corresponding possibilities for the practical utilization of technology. (orig.) [de

  2. Drilling for scientific purpose

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, Shoichi

    1987-09-01

    Drilling for scientific purposes is a process of conducting geophysical exploration deep underground and drilling to collect crust samples directly. It has emerged because earth science has advanced to a good understanding of the top of the crust and has shifted its main interest to the lower layers of the crust in land regions. The on-land drilling plan in Japan has just started, and the planned drilling spots are areas around the Minami River, the Hidaka Mts., bodies of Mesozoic and Cenozoic granite in the outside zone, the extension of the Japan Sea, Ogasawara Is., Minami-Tori Is., and active volcanos. The paper also outlines the present situation of on-land drilling in the world, focusing on the super-deep well SG-3 on the Kola Peninsula, USSR, the Satori well SG-1 in Azerbaidzhan S.S.R., Sweden's wells, Cyprus' wells, the Bayearn well plan in West Germany, and the Salton Sea Scientific Drilling Program in the U.S. At its end, the paper explains the present situation and future themes of Japanese drilling technique and points out the necessity of developing equipment and techniques. (14 figs, 5 tabs, 26 refs)

  3. A survey of computational physics introductory computational science

    CERN Document Server

    Landau, Rubin H; Bordeianu, Cristian C

    2008-01-01

    Computational physics is a rapidly growing subfield of computational science, in large part because computers can solve previously intractable problems or simulate natural processes that do not have analytic solutions. The next step beyond Landau's First Course in Scientific Computing and a follow-up to Landau and Páez's Computational Physics, this text presents a broad survey of key topics in computational physics for advanced undergraduates and beginning graduate students, including new discussions of visualization tools, wavelet analysis, molecular dynamics, and computational fluid dynamics

  4. Scientific Modeling and simulations

    CERN Document Server

    Diaz de la Rubia, Tomás

    2009-01-01

    Showcases the conceptual advantages of modeling which, coupled with unprecedented computing power through simulations, allows scientists to tackle the formidable problems of our society, such as the search for hydrocarbons, understanding the structure of a virus, or the intersection between simulations and real data in extreme environments

  5. The Scientific Competitiveness of Nations.

    Science.gov (United States)

    Cimini, Giulio; Gabrielli, Andrea; Sylos Labini, Francesco

    2014-01-01

    We use citation data of scientific articles produced by individual nations in different scientific domains to determine the structure and efficiency of national research systems. We characterize the scientific fitness of each nation, that is, the competitiveness of its research system, and the complexity of each scientific domain by means of a non-linear iterative algorithm able to assess quantitatively the advantage of scientific diversification. We find that technologically leading nations, beyond having the largest production of scientific papers and the largest number of citations, do not specialize in a few scientific domains. Rather, they diversify their research systems as much as possible. On the other hand, less developed nations are competitive only in scientific domains where many other nations are also present. Diversification thus represents the key element that correlates with scientific and technological competitiveness. A remarkable implication of this structure of scientific competition is that the scientific domains playing the role of "markers" of national scientific competitiveness are not necessarily those with high technological requirements, but rather those addressing the most "sophisticated" needs of society.
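
    The record names a non-linear iterative algorithm but does not give its equations. As a hedged illustration only, the Python sketch below implements the fitness-complexity style of iteration that this description most resembles; the binary nation-by-domain matrix layout, the per-step normalization and the iteration count are assumptions for the sketch, not the authors' published method.

        import numpy as np

        def fitness_complexity(M, n_iter=200):
            """Fitness-complexity style iteration on a binary matrix M,
            where M[c, p] = 1 if nation c is competitive in domain p
            (hypothetical layout; see lead-in)."""
            F = np.ones(M.shape[0])  # nation fitness
            Q = np.ones(M.shape[1])  # domain complexity
            for _ in range(n_iter):
                F_new = M @ Q                    # diversified nations gain fitness
                Q_new = 1.0 / (M.T @ (1.0 / F))  # low-fitness producers drag complexity down
                F = F_new / F_new.mean()         # renormalize at every step
                Q = Q_new / Q_new.mean()
            return F, Q

        # Toy example: 3 nations, 4 scientific domains.
        M = np.array([[1, 1, 1, 1],   # fully diversified nation
                      [1, 1, 0, 0],
                      [1, 0, 0, 0]])  # specialized nation
        F, Q = fitness_complexity(M)
        print(F, Q)

    On this toy matrix the diversified nation converges to the highest fitness, which is the qualitative behaviour the abstract reports: diversification, not specialization, marks competitiveness.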

  6. Should scientific realists be platonists?

    DEFF Research Database (Denmark)

    Busch, Jacob; Morrison, Joe

    2015-01-01

    an appropriate use of the resources of Scientific Realism (in particular, IBE) to achieve platonism? (§2) We argue that just because a variety of different inferential strategies can be employed by Scientific Realists does not mean that ontological conclusions concerning which things we should be Scientific...

  7. EFSA Scientific Committee; Scientific Opinion on Risk Assessment Terminology

    DEFF Research Database (Denmark)

    Hald, Tine

    The Scientific Committee of the European Food Safety Authority (EFSA) reviewed the use of risk assessment terminology within its Scientific Panels. An external report, commissioned by EFSA, analysed 219 opinions published by the Scientific Committee and Panels to recommend possible ways of improving the expression and communication of risk and/or uncertainties in the selected opinions. The Scientific Committee concluded that risk assessment terminology is not fully harmonised within EFSA. In part this is caused by sectoral legislation defining specific terminology and international standards... the Scientific Committee concludes that particular care must be taken that the principles of CAC, OIE or IPPC are followed strictly. EFSA Scientific Panels should identify which specific approach is most useful in dealing with their individual mandates. The Scientific Committee considered detailed aspects...

  8. 11 March 2009 - Italian Minister of Education, University and Research M. Gelmini, visiting ATLAS and CMS underground experimental areas and LHC tunnel with Director for Research and Scientific Computing S. Bertolucci. Signature of the guest book with CERN Director-General R. Heuer and S. Bertolucci at CMS Point 5.

    CERN Multimedia

    Maximilien Brice

    2009-01-01

    Members of the Ministerial delegation: Amb. Sebastiano FULCI, Diplomatic Advisor; Dr Elisa GREGORINI, Private Secretary to the Minister; Dr Massimo ZENNARO, Press Relations Officer; Prof. Roberto PETRONZIO, President of INFN (Istituto Nazionale di Fisica Nucleare); Dr Luciano CRISCUOLI, Director General for Research, MIUR; Dr Andrea MARINONI, Scientific Advisor to the Minister. CERN delegation present throughout the programme: Prof. Sergio Bertolucci, Director for Research and Scientific Computing; Prof. Fabiola Gianotti, ATLAS Collaboration Spokesperson; Prof. Paolo Giubellino, ALICE Deputy Spokesperson, Universita & INFN, Torino; Prof. Guido Tonelli, CMS Collaboration Deputy Spokesperson, INFN Pisa; Dr Monica Pepe-Altarelli, LHCb Collaboration CERN Team Leader. Guests in the ATLAS exhibition area: Dr Marcello Givoletti, President of CAEN; Dr Davide Malacalza, President of ASG Ansaldo Superconductors; and users: Prof. Clara Matteuzzi, LHCb Collaboration, Universita' d...

  9. Experiences of Scientific Thinking in Physics Classrooms

    Directory of Open Access Journals (Sweden)

    Alexandre Fagundes Faria

    2018-04-01

    There is a contemporary demand on STEM education to support learning experiences in which students use scientific thinking to solve tasks. Scientific thinking involves domain-specific knowledge and domain-general strategies of thinking. The object of interest in this research was the set of students' experiences of scientific thinking in which they articulate domain-general strategies and domain-specific knowledge to solve physics tasks. Our goal was to characterize the experiences of scientific thinking of two groups of four students engaged in tasks about Newtonian Mechanics. The volunteers were 19 students, 15-17 years old, enrolled in electronics or computer science courses (11th grade) of a Brazilian vocational high school in Belo Horizonte, Minas Gerais. All class activities proposed to the students have been regularly used since 2010; therefore, we made no special intervention to conduct the study. Data collection occurred during the classes and involved audio and video recordings of students working in groups; field notes; and photographs of students' notebooks and of the posters they made for their oral presentations. The groups were chosen based on how assiduous their members were. We transcribed episodes in which we identified experiences of scientific thinking. These transcriptions, the field notes and the photographs were analyzed together, in interaction with each other. Data analysis is based upon John Dewey's Theory of Experience. Our results show that the experiences of scientific thinking of the two groups were educative experiences, although qualitatively different. This difference was due to the way students interacted with the conditions given to solve the tasks. Additional information is given about the school circumstances in which the study was conducted to allow a better evaluation of the quality of the results.

  10. 2005 Annual Scientific Conference. Program and Abstracts

    International Nuclear Information System (INIS)

    Barborica, Andrei; Bulinski, Mircea; Stefan, Sabina

    2005-01-01

    Every year the Physics Department of the University of Bucharest organizes the 'Annual Scientific Conference' to present the most interesting scientific results obtained within the department. The scientific session is also open to interested physics researchers from other institutes and universities in the country. This scientific event represents a recognition and continuation of the prestigious tradition of physics research performed within the University. Scientific research in the Physics Department is performed in groups and research centers, with final-year undergraduate students and graduate students involved to a large extent in the research work. There are 5 research centers with the status of Center of Excellence in research. The long-term strategy adopted by the faculty is focused on developing scientific research in modern topics of theoretical, experimental and applied physics, as well as in interdisciplinary fields such as biophysics, medical physics, physics and protection of the environment, and physics - computer science. Following this strategy, the Faculty of Physics has diversified its research activity, developing new research laboratories and encouraging the academic community to pursue modern and competitive research projects. The Faculty of Physics is a partner in many joint research programs with prestigious foreign universities and institutes. The 2005 session covered the following 8 topics: 1. Atmosphere and Earth Science; Environment Protection (21 papers); 2. Atomic and Molecular Physics; Astrophysics (12 papers); 3. Electricity and Biophysics (19 papers); 4. Nuclear and Elementary Particles Physics (17 papers); 5. Optics, Spectroscopy, Plasma and Lasers (19 papers); 6. Polymer Physics (10 papers); 7. Solid State Physics and Materials Science (10 papers); 8. Theoretical Physics and Applied Mathematics Seminar (12 papers)

  11. 10 September 2013 - Italian Minister for Economic Development F. Zanonato visiting the ATLAS cavern with Collaboration Spokesperson D. Charlton and Italian scientists F. Gianotti and A. Di Ciaccio; signing the guest book with CERN Director-General R. Heuer and Director for Research and Scientific Computing S. Bertolucci; in the LHC tunnel with S. Bertolucci, Technology Deputy Department Head L. Rossi and Engineering Department Head R. Saban; visiting CMS cavern with Scientists G. Rolandi and P. Checchia.

    CERN Multimedia

    Jean-Claude Gadmer

    2013-01-01

    10 September 2013 - Italian Minister for Economic Development F. Zanonato visiting the ATLAS cavern with Collaboration Spokesperson D. Charlton and Italian scientists F. Gianotti and A. Di Ciaccio; signing the guest book with CERN Director-General R. Heuer and Director for Research and Scientific Computing S. Bertolucci; in the LHC tunnel with S. Bertolucci, Technology Deputy Department Head L. Rossi and Engineering Department Head R. Saban; visiting CMS cavern with Scientists G. Rolandi and P. Checchia.

  12. Wikimedia as a platform for scientific information

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    During this presentation, the topics touched upon include the various ways in which wikis are currently being used in scientific research and publishing, as well as some that are more speculative. Many of the examples are drawn from the biological sciences, and the talk is intended to stimulate debate as to how the physics community - and CERN in particular - can enhance its interaction with the Wikimedia community, or via Wikimedia with the public at large. For instance: PLOS Computational Biology Topic Pages (Wodak SJ, Mietchen D, Collings AM, Russell RB, Bourne PE (2012) "Topic Pages: PLoS Computational Biology Meets Wikipedia". PLoS Comput Biol 8(3): e1002446); the Open Access Media Importer; a proposed Wiki Journal, a peer-reviewed journal to encourage academics to contribute Wikipedia articles; the Encyclopedia of Original Research and JATS-to-MediaWiki; the Gene Wiki; Wikigenes; Wikis in Scholarly Publishing; the Journal of the Future. His talk is being draft...

  13. The graphics future in scientific applications

    International Nuclear Information System (INIS)

    Enderle, G.

    1982-01-01

    Computer graphics methods and tools are used to a great extent in scientific research. Future development in this area will be influenced both by new hardware developments and by software advances. On the hardware side, the development of raster technology will lead to increased use of colour workstations with more local processing power. Colour hardcopy devices for creating plots, slides, or movies will be available at a lower price than today. The first real 3D workstations are appearing on the marketplace. One of the main activities in the software sector is the standardization of computer graphics systems, graphical files, and device interfaces. This will lead to more portable graphical application programs and to a common base for computer graphics education. (orig.)

  14. A method to build and analyze scientific workflows from provenance through process mining

    NARCIS (Netherlands)

    Zeng, R.; He, X.; Li, Jiafei; Liu, Zheng; Aalst, van der W.M.P.

    2011-01-01

    Scientific workflows have recently emerged as a new paradigm for representing and managing complex distributed scientific computations and are used to accelerate the pace of scientific discovery. In many disciplines, individual workflows are large due to the large quantities of data used. As
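
    The abstract is cut off, but its named ingredients, provenance data and process mining, can be illustrated. The Python sketch below is only a minimal, assumed reduction of the idea: it flattens provenance into (run, activity, timestamp) events, a hypothetical format, and counts the directly-follows relation from which many process-discovery algorithms start; mature tools such as ProM go far beyond this.

        from collections import Counter, defaultdict

        # Hypothetical flattened provenance log: (workflow_run, activity, timestamp).
        events = [
            ("run1", "fetch", 1), ("run1", "align", 2), ("run1", "plot", 3),
            ("run2", "fetch", 1), ("run2", "filter", 2), ("run2", "align", 3),
            ("run2", "plot", 4),
        ]

        def directly_follows(log):
            """Count how often one activity directly follows another within
            the same workflow run -- the basic relation behind discovery."""
            traces = defaultdict(list)
            for run, activity, ts in sorted(log, key=lambda e: (e[0], e[2])):
                traces[run].append(activity)
            dfg = Counter()
            for trace in traces.values():
                dfg.update(zip(trace, trace[1:]))  # adjacent pairs in one run
            return dfg

        for (a, b), n in directly_follows(events).items():
            print(f"{a} -> {b}: {n}")

    The resulting counts form a directly-follows graph, one common starting point for assembling a workflow model from recorded executions.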

  15. A method to mine workflows from provenance for assisting scientific workflow composition

    NARCIS (Netherlands)

    Zeng, R.; He, X.; Aalst, van der W.M.P.

    2011-01-01

    Scientific workflows have recently emerged as a new paradigm for representing and managing complex distributed scientific computations and are used to accelerate the pace of scientific discovery. In many disciplines, individual workflows are large and complicated due to the large quantities of data

  16. Final Scientific EFNUDAT Workshop

    CERN Multimedia

    CERN. Geneva

    2010-01-01

    The Final Scientific EFNUDAT Workshop - organized by the CERN/EN-STI group on behalf of the n_TOF Collaboration - will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: data evaluation; cross section measurements; experimental techniques; uncertainties and covariances; fission properties; current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Géraldine Jean

  17. Scientific developments ISFD3

    Science.gov (United States)

    Schropp, M.H.I.; Soong, T.W.

    2006-01-01

    Highlights, trends, and consensus from the 63 papers submitted to the Scientific Developments theme of the Third International Symposium on Flood Defence (ISFD) are presented. Realizing that absolute protection against flooding can never be guaranteed, trends in flood management have shifted: (1) from flood protection to flood-risk management, (2) from reinforcing structural protection to lowering flood levels, and (3) to sustainable management through integrated problem solving. Improved understanding of watershed responses, climate change, applications of GIS and remote-sensing technologies, and advanced analytical tools appear to be the driving forces for renewing flood-risk management strategies. Technical competence in integrating analytical tools into basin-wide management systems is demonstrated by several large, transnational models. However, analyses from social, economic and environmental points of view are generally found to lag behind. © 2006 Taylor & Francis Group.

  18. Dishonesty in scientific research.

    Science.gov (United States)

    Mazar, Nina; Ariely, Dan

    2015-11-02

    Fraudulent business practices, such as those leading to the Enron scandal and the conviction of Bernard Madoff, evoke a strong sense of public outrage. But fraudulent or dishonest actions are not exclusive to the realm of big corporations or to evil individuals without consciences. Dishonest actions are all too prevalent in everyone's daily lives, because people are constantly encountering situations in which they can gain advantages by cutting corners. Whether it's adding a few dollars in value to the stolen items reported on an insurance claim form or dropping outlier data points from a figure to make a paper sound more interesting, dishonesty is part of the human condition. Here, we explore how people rationalize dishonesty, the implications for scientific research, and what can be done to foster a culture of research integrity.

  19. Annual scientific report 1978

    International Nuclear Information System (INIS)

    Proost, J.; Billiau, R.; Kirk, F.

    1979-01-01

    This report of the Centre d'Etude de l'Energie Nucleaire - Studiecentrum voor Kernenergie gives a survey of the scientific and technical work done in 1978. The research areas are: 1. the sodium-cooled fast reactor, namely mixed oxide fuels, carbide fuel, materials development, reprocessing, fast reactor physics, safety and instrumentation, and sodium technology; 2. gas-cooled reactors, namely gas-cooled fast and high temperature reactors; 3. light water reactors, namely the BR3 reactor, light water reactor fuels and plutonium recycling; 4. applied nuclear research, such as waste conditioning and disposal, safeguards, fusion research and lithium technology; 5. basic and exploratory research, namely materials science and nuclear physics; and finally 6. non-nuclear research and development, such as air pollution, pollution abatement and waste handling, fuel cells and applied electrochemistry. (AF)