WorldWideScience

Sample records for high-performance astrophysical visualization

  1. Large-Scale Astrophysical Visualization on Smartphones

    Science.gov (United States)

    Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.

    2011-07-01

    Nowadays, digital sky surveys and long-duration, high-resolution numerical simulations using high performance computing and grid systems produce multidimensional astrophysical datasets on the order of several petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover, educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., the formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom-designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.

  2. WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code

    Energy Technology Data Exchange (ETDEWEB)

    Mendygral, P. J.; Radcliffe, N.; Kandalla, K. [Cray Inc., St. Paul, MN 55101 (United States); Porter, D. [Minnesota Supercomputing Institute for Advanced Computational Research, Minneapolis, MN USA (United States); O’Neill, B. J.; Nolting, C.; Donnert, J. M. F.; Jones, T. W. [School of Physics and Astronomy, University of Minnesota, Minneapolis, MN 55455 (United States); Edmon, P., E-mail: pjm@cray.com, E-mail: nradclif@cray.com, E-mail: kkandalla@cray.com, E-mail: oneill@astro.umn.edu, E-mail: nolt0040@umn.edu, E-mail: donnert@ira.inaf.it, E-mail: twj@umn.edu, E-mail: dhp@umn.edu, E-mail: pedmon@cfa.harvard.edu [Institute for Theory and Computation, Center for Astrophysics, Harvard University, Cambridge, MA 02138 (United States)

    2017-02-01

    We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.
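    The hybrid MPI-RMA/OpenMP design described above is easiest to picture as one-sided ghost-cell exchange: each rank exposes its local patch as an RMA window and neighbours write boundary data directly into it, with no matching receives. The sketch below is a toy, single-threaded, 1D stand-in for that pattern using mpi4py; it is not the WOMBAT code itself, and the thread-driven, multi-dimensional patch machinery is omitted.

```python
# A minimal sketch of one-sided (MPI-RMA) ghost-cell exchange, assuming mpi4py
# and an MPI installation are available.  Run with e.g.:
#   mpiexec -n 4 python rma_ghost_exchange.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n = 8                                  # interior cells per rank (toy 1D patch)
u = np.full(n + 2, float(rank))        # one ghost cell at each end

# Expose the local patch as an RMA window so neighbours can write ghost cells.
win = MPI.Win.Create(u, u.itemsize, comm=comm)
left, right = (rank - 1) % size, (rank + 1) % size

win.Fence()
# One-sided puts: write my edge values directly into the neighbours' ghost cells.
win.Put(u[1:2],     target_rank=left,  target=[n + 1, 1, MPI.DOUBLE])
win.Put(u[n:n + 1], target_rank=right, target=[0,     1, MPI.DOUBLE])
win.Fence()

print(f"rank {rank}: left ghost = {u[0]}, right ghost = {u[-1]}")
win.Free()
```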

  3. WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code

    International Nuclear Information System (INIS)

    Mendygral, P. J.; Radcliffe, N.; Kandalla, K.; Porter, D.; O’Neill, B. J.; Nolting, C.; Donnert, J. M. F.; Jones, T. W.; Edmon, P.

    2017-01-01

    We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.

  4. High performance visual display for HENP detectors

    CERN Document Server

    McGuigan, M; Spiletic, J; Fine, V; Nevski, P

    2001-01-01

    A high-end visual display for High Energy Nuclear Physics (HENP) detectors is necessary because of the sheer size and complexity of the detector. For BNL this display will be of special interest because of STAR and ATLAS. To load, rotate, query, and debug simulation code with a modern detector simply takes too long even on a powerful workstation. To visualize the HENP detectors with maximal performance we have developed software with the following characteristics. We develop a visual display of HENP detectors on a BNL multiprocessor visualization server at multiple levels of detail. We work with a general and generic detector framework consistent with ROOT, GAUDI, etc., to avoid conflicting with the many graphics development groups associated with specific detectors like STAR and ATLAS. We develop advanced OpenGL features such as transparency and polarized stereoscopy. We enable collaborative viewing of detectors and events by directly running the analysis in the BNL stereoscopic theatre. We construct enhanced interactive control...

  5. High performance visual display for HENP detectors

    International Nuclear Information System (INIS)

    McGuigan, Michael; Smith, Gordon; Spiletic, John; Fine, Valeri; Nevski, Pavel

    2001-01-01

    A high-end visual display for High Energy Nuclear Physics (HENP) detectors is necessary because of the sheer size and complexity of the detector. For BNL this display will be of special interest because of STAR and ATLAS. To load, rotate, query, and debug simulation code with a modern detector simply takes too long even on a powerful workstation. To visualize the HENP detectors with maximal performance we have developed software with the following characteristics. We develop a visual display of HENP detectors on a BNL multiprocessor visualization server at multiple levels of detail. We work with a general and generic detector framework consistent with ROOT, GAUDI, etc., to avoid conflicting with the many graphics development groups associated with specific detectors like STAR and ATLAS. We develop advanced OpenGL features such as transparency and polarized stereoscopy. We enable collaborative viewing of detectors and events by directly running the analysis in the BNL stereoscopic theatre. We construct enhanced interactive control, including the ability to slice, search and mark areas of the detector. We incorporate the ability to make a high-quality still image of a view of the detector and the ability to generate animations and a fly-through of the detector and output these to MPEG or VRML models. We develop data compression hardware and software so that remote interactive visualization will be possible among dispersed collaborators. We obtain real-time visual display for events accumulated during simulations.

  6. Visualization needs and techniques for astrophysical simulations

    International Nuclear Information System (INIS)

    Kapferer, W; Riser, T

    2008-01-01

    Numerical simulations have evolved into an important field of astrophysics, on a par with theory and observation. Owing to the enormous developments in computing, in both hardware and software architecture, state-of-the-art simulations produce huge amounts of raw data of increasing complexity. In this paper, some problems in the field of visualization for numerical astrophysics are discussed, together with possible solutions. Commonly used visualization packages are presented, along with a newly developed approach to real-time visualization that incorporates shader programming to harness the computational power of modern graphics cards. With these techniques at hand, real-time visualizations help scientists to understand the relationships within the results of their numerical simulations. Furthermore, a fundamental problem in data analysis, namely the capture of metadata describing how a visualization was created, is highlighted.

  7. High Performance Interactive System Dynamics Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Bush, Brian W [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Brunhart-Lupo, Nicholas J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gruchalla, Kenny M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Duckworth, Jonathan C [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-14

    This brochure describes a system dynamics (SD) simulation framework that supports an end-to-end analysis workflow and is optimized for deployment on ESIF facilities (Peregrine and the Insight Center). It includes (i) parallel and distributed simulation of SD models, (ii) real-time 3D visualization of running simulations, and (iii) comprehensive database-oriented persistence of simulation metadata, inputs, and outputs.

  8. High Performance Interactive System Dynamics Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Bush, Brian W [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Brunhart-Lupo, Nicholas J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gruchalla, Kenny M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Duckworth, Jonathan C [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-14

    This presentation describes a system dynamics (SD) simulation framework that supports an end-to-end analysis workflow and is optimized for deployment on ESIF facilities (Peregrine and the Insight Center). It includes (i) parallel and distributed simulation of SD models, (ii) real-time 3D visualization of running simulations, and (iii) comprehensive database-oriented persistence of simulation metadata, inputs, and outputs.

  9. Spherical Panoramas for Astrophysical Data Visualization

    Science.gov (United States)

    Kent, Brian R.

    2017-05-01

    Data immersion has advantages in astrophysical visualization. Complex multi-dimensional data and phase spaces can be explored in a seamless and interactive viewing environment. Putting the user in the data is a first step toward immersive data analysis. We present a technique for creating 360° spherical panoramas with astrophysical data. The three-dimensional software package Blender and the Google Spatial Media module are used together to immerse users in data exploration. Several examples employing these methods exhibit how the technique works using different types of astronomical data.

  10. Visualization and Data Analysis for High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Sewell, Christopher Meyer [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-27

    This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, "big data", and then an analysis example.

  11. Java 3D Interactive Visualization for Astrophysics

    Science.gov (United States)

    Chae, K.; Edirisinghe, D.; Lingerfelt, E. J.; Guidry, M. W.

    2003-05-01

    We are developing a series of interactive 3D visualization tools that employ the Java 3D API. We have applied this approach initially to a simple 3-dimensional galaxy collision model (restricted 3-body approximation), with quite satisfactory results. Running either as an applet under Web browser control, or as a Java standalone application, this program permits real-time zooming, panning, and 3-dimensional rotation of the galaxy collision simulation under user mouse and keyboard control. We shall also discuss applications of this technology to 3-dimensional visualization for other problems of astrophysical interest such as neutron star mergers and the time evolution of element/energy production networks in X-ray bursts. *Managed by UT-Battelle, LLC, for the U.S. Department of Energy under contract DE-AC05-00OR22725.

  12. High Performance Multivariate Visual Data Exploration for Extremely Large Data

    International Nuclear Information System (INIS)

    Ruebel, Oliver; Wu, Kesheng; Childs, Hank; Meredith, Jeremy; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Ahern, Sean; Weber, Gunther H.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes; Prabhat

    2008-01-01

    One of the central challenges in modern science is the need to quickly derive knowledge and understanding from large, complex collections of data. We present a new approach that deals with this challenge by combining and extending techniques from high performance visual data analysis and scientific data management. This approach is demonstrated within the context of gaining insight from complex, time-varying datasets produced by a laser wakefield accelerator simulation. Our approach leverages histogram-based parallel coordinates for both visual information display as well as a vehicle for guiding a data mining operation. Data extraction and subsetting are implemented with state-of-the-art index/query technology. This approach, while applied here to accelerator science, is generally applicable to a broad set of science applications, and is implemented in a production-quality visual data analysis infrastructure. We conduct a detailed performance analysis and demonstrate good scalability on a distributed memory Cray XT4 system
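    A minimal sketch of the two ingredients named above, under the assumption of a synthetic toy particle dataset (positions, momenta, energies): a query first selects the records of interest, and the selected records are then binned into 2D histograms between adjacent axes, which is what histogram-based parallel coordinates render instead of one polyline per particle. In the production system the selection is performed with index/query (FastBit) technology; a NumPy boolean mask stands in here.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy particle data standing in for a laser-wakefield dump: position, momentum, energy.
n = 1_000_000
x  = rng.uniform(0.0, 1.0, n)
px = rng.normal(0.0, 1.0, n)
en = np.abs(rng.normal(5.0, 2.0, n))

# "Query-driven" subsetting: keep only high-energy, forward-moving particles.
mask = (en > 8.0) & (px > 0.5)

# Histogram-based parallel coordinates: instead of drawing one polyline per
# particle, bin the joint distribution between each pair of adjacent axes.
bins = 64
h_x_px, _, _ = np.histogram2d(x[mask], px[mask], bins=bins)
h_px_en, _, _ = np.histogram2d(px[mask], en[mask], bins=bins)

print("selected particles:", mask.sum())
print("non-empty (x, px) bins:", np.count_nonzero(h_x_px))
```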

  13. High Performance Multivariate Visual Data Exploration for Extremely Large Data

    Energy Technology Data Exchange (ETDEWEB)

    Rubel, Oliver; Wu, Kesheng; Childs, Hank; Meredith, Jeremy; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Ahern, Sean; Weber, Gunther H.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes; Prabhat,

    2008-08-22

    One of the central challenges in modern science is the need to quickly derive knowledge and understanding from large, complex collections of data. We present a new approach that deals with this challenge by combining and extending techniques from high performance visual data analysis and scientific data management. This approach is demonstrated within the context of gaining insight from complex, time-varying datasets produced by a laser wakefield accelerator simulation. Our approach leverages histogram-based parallel coordinates for both visual information display as well as a vehicle for guiding a data mining operation. Data extraction and subsetting are implemented with state-of-the-art index/query technology. This approach, while applied here to accelerator science, is generally applicable to a broad set of science applications, and is implemented in a production-quality visual data analysis infrastructure. We conduct a detailed performance analysis and demonstrate good scalability on a distributed memory Cray XT4 system.

  14. Sunfall: a collaborative visual analytics system for astrophysics

    International Nuclear Information System (INIS)

    Aragon, C R; Bailey, S J; Poon, S; Runge, K; Thomas, R C

    2008-01-01

    Computational and experimental sciences produce and collect ever-larger and complex datasets, often in large-scale, multi-institution projects. The inability to gain insight into complex scientific phenomena using current software tools is a bottleneck facing virtually all endeavors of science. In this paper, we introduce Sunfall, a collaborative visual analytics system developed for the Nearby Supernova Factory, an international astrophysics experiment and the largest data volume supernova search currently in operation. Sunfall utilizes novel interactive visualization and analysis techniques to facilitate deeper scientific insight into complex, noisy, high-dimensional, high-volume, time-critical data. The system combines novel image processing algorithms, statistical analysis, and machine learning with highly interactive visual interfaces to enable collaborative, user-driven scientific exploration of supernova image and spectral data. Sunfall is currently in operation at the Nearby Supernova Factory; it is the first visual analytics system in production use at a major astrophysics project

  15. Sunfall: a collaborative visual analytics system for astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Aragon, Cecilia R.; Aragon, Cecilia R.; Bailey, Stephen J.; Poon, Sarah; Runge, Karl; Thomas, Rollin C.

    2008-07-07

    Computational and experimental sciences produce and collect ever-larger and complex datasets, often in large-scale, multi-institution projects. The inability to gain insight into complex scientific phenomena using current software tools is a bottleneck facing virtually all endeavors of science. In this paper, we introduce Sunfall, a collaborative visual analytics system developed for the Nearby Supernova Factory, an international astrophysics experiment and the largest data volume supernova search currently in operation. Sunfall utilizes novel interactive visualization and analysis techniques to facilitate deeper scientific insight into complex, noisy, high-dimensional, high-volume, time-critical data. The system combines novel image processing algorithms, statistical analysis, and machine learning with highly interactive visual interfaces to enable collaborative, user-driven scientific exploration of supernova image and spectral data. Sunfall is currently in operation at the Nearby Supernova Factory; it is the first visual analytics system in production use at a major astrophysics project.

  16. Sunfall: a collaborative visual analytics system for astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Aragon, C R; Bailey, S J; Poon, S; Runge, K; Thomas, R C [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)], E-mail: CRAragon@lbl.gov

    2008-07-15

    Computational and experimental sciences produce and collect ever-larger and complex datasets, often in large-scale, multi-institution projects. The inability to gain insight into complex scientific phenomena using current software tools is a bottleneck facing virtually all endeavors of science. In this paper, we introduce Sunfall, a collaborative visual analytics system developed for the Nearby Supernova Factory, an international astrophysics experiment and the largest data volume supernova search currently in operation. Sunfall utilizes novel interactive visualization and analysis techniques to facilitate deeper scientific insight into complex, noisy, high-dimensional, high-volume, time-critical data. The system combines novel image processing algorithms, statistical analysis, and machine learning with highly interactive visual interfaces to enable collaborative, user-driven scientific exploration of supernova image and spectral data. Sunfall is currently in operation at the Nearby Supernova Factory; it is the first visual analytics system in production use at a major astrophysics project.

  17. Visualizing astrophysical N-body systems

    International Nuclear Information System (INIS)

    Dubinski, John

    2008-01-01

    I begin with a brief history of N-body simulation and visualization and then go on to describe various methods for creating images and animations of modern simulations in cosmology and galactic dynamics. These techniques are incorporated into a specialized particle visualization software library called MYRIAD that is designed to render images within large parallel N-body simulations as they run. I present several case studies that explore the application of these methods to animations in star clusters, interacting galaxies and cosmological structure formation.
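    A common starting point for the particle-rendering techniques mentioned above is to deposit ("splat") the particles onto a pixel grid and display the log-scaled surface density. The sketch below is a generic NumPy/Matplotlib illustration of that step, using a synthetic Gaussian particle cloud rather than MYRIAD or an actual simulation snapshot.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
# Toy Gaussian particle cloud standing in for an N-body snapshot.
n = 200_000
x = rng.normal(0.0, 1.0, n)
y = rng.normal(0.0, 1.0, n)

# Deposit particles onto a pixel grid and log-scale the surface density.
img, _, _ = np.histogram2d(x, y, bins=512, range=[[-5, 5], [-5, 5]])
plt.imshow(np.log10(img + 1).T, origin="lower", cmap="inferno",
           extent=[-5, 5, -5, 5])
plt.xlabel("x"); plt.ylabel("y")
plt.savefig("nbody_projection.png", dpi=150)
```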

  18. AstroBlend: An astrophysical visualization package for Blender

    Science.gov (United States)

    Naiman, J. P.

    2016-04-01

    The rapid growth in scale and complexity of both computational and observational astrophysics over the past decade necessitates efficient and intuitive methods for examining and visualizing large datasets. Here, I present AstroBlend, an open-source Python library for use within the three dimensional modeling software, Blender. While Blender has been a popular open-source software among animators and visual effects artists, in recent years it has also become a tool for visualizing astrophysical datasets. AstroBlend combines the three dimensional capabilities of Blender with the analysis tools of the widely used astrophysical toolset, yt, to afford both computational and observational astrophysicists the ability to simultaneously analyze their data and create informative and appealing visualizations. The introduction of this package includes a description of features, work flow, and various example visualizations. A website - www.astroblend.com - has been developed which includes tutorials, and a gallery of example images and movies, along with links to downloadable data, three dimensional artistic models, and various other resources.

  19. Using Visual Analytics to Maintain Situation Awareness in Astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Aragon, Cecilia R.; Poon, Sarah S.; Aldering, Gregory S.; Thomas, Rollin C.; Quimby, Robert

    2008-07-01

    We present a novel collaborative visual analytics application for cognitively overloaded users in the astrophysics domain. The system was developed for scientists needing to analyze heterogeneous, complex data under time pressure, and then make predictions and time-critical decisions rapidly and correctly under a constant influx of changing data. The Sunfall Data Taking system utilizes several novel visualization and analysis techniques to enable a team of geographically distributed domain specialists to effectively and remotely maneuver a custom-built instrument under challenging operational conditions. Sunfall Data Taking has been in use for over eighteen months by a major international astrophysics collaboration (the largest data volume supernova search currently in operation), and has substantially improved the operational efficiency of its users. We describe the system design process by an interdisciplinary team, the system architecture, and the results of an informal usability evaluation of the production system by domain experts in the context of Endsley's three levels of situation awareness.

  20. HST Observations of Astrophysically Important Visual Binaries

    Science.gov (United States)

    Bond, Howard

    2015-10-01

    We propose to continue our long-term program of astrometry of close visual binaries, with the primary goal of determining purely dynamical masses for 3 important main-sequence stars and 9 white dwarfs (WDs). A secondary aim is to set limits on third bodies in the systems down to planetary mass. Three of our targets are naked-eye stars with much fainter companions that are extremely difficult to image from the ground. Our other 2 targets are double WDs, whose small separations and faintness likewise make them difficult to measure using ground-based techniques. Observations have been completed for a 3rd double WD. The bright stars, to be imaged with WFC3, are: (1) Procyon (P = 40.83 yr), containing a bright F star and a much fainter WD companion. With the continued monitoring proposed here, we will obtain masses to an accuracy of better than 1%, providing a testbed for theories of both Sun-like stars and WDs. (2) Sirius (P = 50.14 yr), an A-type star also having a faint WD companion, Sirius B, the nearest and brightest of all WDs. (3) Mu Cas (P = 21.08 yr), a nearby metal-deficient G dwarf for which accurate masses will lead to the stars' helium contents, with cosmological implications. The faint double WDs, to be observed with FGS, are: (1) G 107-70 (P = 18.84 yr), and (2) WD 1818+126 (P = 12.19 yr). Our astrometry of these systems will add 4 accurate masses to the handful of WD masses that are directly known from dynamical measurements. The FGS measurements will also provide precise parallaxes for the systems, a necessary ingredient in the mass determinations.
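    The dynamical masses referred to above follow from Kepler's third law once the visual orbit and parallax are known: M1 + M2 = a^3/P^2 in solar masses when a is in AU and P in years, with the individual masses set by the ratio of the components' separations from the barycentre. The numbers in the sketch below are round, purely illustrative placeholders, not the measured parameters of any of the proposal's targets.

```python
# Dynamical masses of a visual binary from Kepler's third law.
# All numbers are illustrative placeholders, not measured values.
P_yr       = 40.0    # orbital period [yr]
a_arcsec   = 4.0     # angular semi-major axis of the relative orbit [arcsec]
plx_arcsec = 0.25    # trigonometric parallax [arcsec]
f_primary  = 0.35    # a1/(a1+a2): primary's fractional offset from the barycentre

a_au    = a_arcsec / plx_arcsec      # semi-major axis in AU
m_total = a_au**3 / P_yr**2          # total mass in solar masses
m_2     = f_primary * m_total        # M2/(M1+M2) = a1/(a1+a2)
m_1     = m_total - m_2
print(f"M_total = {m_total:.2f} Msun, M1 = {m_1:.2f}, M2 = {m_2:.2f}")
```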

  1. From Big Data to Big Displays High-Performance Visualization at Blue Brain

    KAUST Repository

    Eilemann, Stefan; Abdellah, Marwan; Antille, Nicolas; Bilgili, Ahmet; Chevtchenko, Grigory; Dumusc, Raphael; Favreau, Cyrille; Hernando, Juan; Nachbaur, Daniel; Podhajski, Pawel; Villafranca, Jafet; Schürmann, Felix

    2017-01-01

    Blue Brain has pushed high-performance visualization (HPV) to complement its HPC strategy since its inception in 2007. In 2011, this strategy has been accelerated to develop innovative visualization solutions through increased funding and strategic partnerships with other research institutions.

  2. Applications of Java and Vector Graphics to Astrophysical Visualization

    Science.gov (United States)

    Edirisinghe, D.; Budiardja, R.; Chae, K.; Edirisinghe, G.; Lingerfelt, E.; Guidry, M.

    2002-12-01

    We describe a series of projects utilizing the portability of Java programming coupled with the compact nature of vector graphics (SVG and SWF formats) for setup and control of calculations, local and collaborative visualization, and interactive 2D and 3D animation presentations in astrophysics. Through a set of examples, we demonstrate how such an approach can allow efficient and user-friendly control of calculations in compiled languages such as Fortran 90 or C++ through portable graphical interfaces written in Java, and how the output of such calculations can be packaged in vector-based animation having interactive controls and extremely high visual quality, but very low bandwidth requirements.

  3. From Big Data to Big Displays High-Performance Visualization at Blue Brain

    KAUST Repository

    Eilemann, Stefan

    2017-10-19

    Blue Brain has pushed high-performance visualization (HPV) to complement its HPC strategy since its inception in 2007. In 2011, this strategy has been accelerated to develop innovative visualization solutions through increased funding and strategic partnerships with other research institutions. We present the key elements of this HPV ecosystem, which integrates C++ visualization applications with novel collaborative display systems. We motivate how our strategy of transforming visualization engines into services enables a variety of use cases, not only for the integration with high-fidelity displays, but also to build service oriented architectures, to link into web applications and to provide remote services to Python applications.

  4. High Performance Molecular Visualization: In-Situ and Parallel Rendering with EGL

    Science.gov (United States)

    Stone, John E.; Messmer, Peter; Sisneros, Robert; Schulten, Klaus

    2016-01-01

    Large scale molecular dynamics simulations produce terabytes of data that is impractical to transfer to remote facilities. It is therefore necessary to perform visualization tasks in-situ as the data are generated, or by running interactive remote visualization sessions and batch analyses co-located with direct access to high performance storage systems. A significant challenge for deploying visualization software within clouds, clusters, and supercomputers involves the operating system software required to initialize and manage graphics acceleration hardware. Recently, it has become possible for applications to use the Embedded-system Graphics Library (EGL) to eliminate the requirement for windowing system software on compute nodes, thereby eliminating a significant obstacle to broader use of high performance visualization applications. We outline the potential benefits of this approach in the context of visualization applications used in the cloud, on commodity clusters, and supercomputers. We discuss the implementation of EGL support in VMD, a widely used molecular visualization application, and we outline benefits of the approach for molecular visualization tasks on petascale computers, clouds, and remote visualization servers. We then provide a brief evaluation of the use of EGL in VMD, with tests using developmental graphics drivers on conventional workstations and on Amazon EC2 G2 GPU-accelerated cloud instance types. We expect that the techniques described here will be of broad benefit to many other visualization applications. PMID:27747137

  5. Visualization of Distributed Data Structures for High Performance Fortran-Like Languages

    Directory of Open Access Journals (Sweden)

    Rainer Koppler

    1997-01-01

    This article motivates the use of graphics and visualization for efficient utilization of High Performance Fortran's (HPF's) data distribution facilities. It proposes a graphical toolkit consisting of exploratory and estimation tools which allow the programmer to navigate through complex distributions and to obtain graphical ratings with respect to load distribution and communication. The toolkit has been implemented in a mapping design and visualization tool which is coupled with a compilation system for the HPF predecessor Vienna Fortran. Since this language covers a superset of HPF's facilities, the tool may also be used for visualization of HPF data structures.
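    To make the data-distribution facilities concrete: HPF (and Vienna Fortran) lets the programmer map array indices to processors with directives such as BLOCK and CYCLIC, and a tool like the one described rates the resulting load balance. The sketch below is a plain Python illustration of those two standard mappings and the per-processor load they induce; it is not derived from the toolkit itself.

```python
import numpy as np

def block_owner(i, n, p):
    """Processor owning index i of an n-element array under BLOCK distribution."""
    b = -(-n // p)          # ceil(n / p), the block size
    return i // b

def cyclic_owner(i, p):
    """Processor owning index i under CYCLIC(1) distribution."""
    return i % p

n, p = 20, 4
owners_block  = [block_owner(i, n, p) for i in range(n)]
owners_cyclic = [cyclic_owner(i, p) for i in range(n)]
# Per-processor load: what a distribution-visualization tool would rate.
print("BLOCK  owners:", owners_block,  "load:", np.bincount(owners_block,  minlength=p))
print("CYCLIC owners:", owners_cyclic, "load:", np.bincount(owners_cyclic, minlength=p))
```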

  6. Application of High-performance Visual Analysis Methods to Laser Wakefield Particle Acceleration Data

    International Nuclear Information System (INIS)

    Rubel, Oliver; Prabhat, Mr.; Wu, Kesheng; Childs, Hank; Meredith, Jeremy; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Ahern, Sean; Weber, Gunther H.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes

    2008-01-01

    Our work combines and extends techniques from high-performance scientific data management and visualization to enable scientific researchers to gain insight from extremely large, complex, time-varying laser wakefield particle accelerator simulation data. We extend histogram-based parallel coordinates for use in visual information display as well as an interface for guiding and performing data mining operations, which are based upon multi-dimensional and temporal thresholding and data subsetting operations. To achieve very high performance on parallel computing platforms, we leverage FastBit, a state-of-the-art index/query technology, to accelerate data mining and multi-dimensional histogram computation. We show how these techniques are used in practice by scientific researchers to identify, visualize and analyze a particle beam in a large, time-varying dataset

  7. Visualization of Octree Adaptive Mesh Refinement (AMR) in Astrophysical Simulations

    Science.gov (United States)

    Labadens, M.; Chapon, D.; Pomaréde, D.; Teyssier, R.

    2012-09-01

    Computer simulations are important in current cosmological research. Such simulations run in parallel on thousands of processors and produce huge amounts of data. Adaptive mesh refinement (AMR) is used to reduce the computing cost while keeping good numerical accuracy in regions of interest. RAMSES is a cosmological code developed by the Commissariat à l'énergie atomique et aux énergies alternatives (English: Atomic Energy and Alternative Energies Commission) which uses octree adaptive mesh refinement. Compared to grid-based AMR, octree AMR has the advantage of fitting the adaptive resolution of the grid very precisely to the local problem complexity. However, this specific octree data type needs dedicated software to be visualized, as generic visualization tools work on Cartesian grid data types. This is why the PYMSES software has also been developed by our team. It relies on the Python scripting language to ensure modular and easy access for exploring these specific data. In order to take advantage of the high-performance computer which runs the RAMSES simulation, it also uses MPI and multiprocessing to run some parallel code. We present our PYMSES software in more detail, with some performance benchmarks. PYMSES currently has two visualization techniques which work directly on the AMR. The first is a splatting technique, and the second is a custom ray-tracing technique. Both have their own advantages and drawbacks. We have also compared two parallel programming techniques: the Python multiprocessing library versus MPI runs. The load-balancing strategy has to be defined carefully in order to achieve a good speed-up in our computation. Results obtained with this software are illustrated in the context of a massive, 9000-processor parallel simulation of a Milky Way-like galaxy.
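    The splatting technique mentioned above can be pictured as depositing each AMR cell's footprint onto a pixel grid, weighted by the quantity the cell carries. The following sketch uses a synthetic set of cells at two refinement levels; it is a toy illustration of the idea, not PYMSES code, and the RAMSES-specific octree traversal is omitted.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
# Toy AMR cells: centre coordinates, cell size (finer near the middle), density.
ncell = 5000
xc, yc = rng.uniform(0, 1, (2, ncell))
level = np.where(np.hypot(xc - 0.5, yc - 0.5) < 0.2, 7, 5)   # two refinement levels
dx = 0.5 ** level
rho = 1.0 / dx**2                                            # denser where refined

npix = 256
img = np.zeros((npix, npix))
for x, y, d, r in zip(xc, yc, dx, rho):
    # Splat the square cell footprint onto the pixel grid (deposit mass = rho * area).
    i0, i1 = int((x - d / 2) * npix), int(np.ceil((x + d / 2) * npix))
    j0, j1 = int((y - d / 2) * npix), int(np.ceil((y + d / 2) * npix))
    img[max(j0, 0):j1, max(i0, 0):i1] += r * d**2

plt.imshow(np.log10(img + 1e-6), origin="lower", extent=[0, 1, 0, 1])
plt.savefig("amr_splat.png", dpi=150)
```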

  8. Integrative Genomics Viewer (IGV): high-performance genomics data visualization and exploration.

    Science.gov (United States)

    Thorvaldsdóttir, Helga; Robinson, James T; Mesirov, Jill P

    2013-03-01

    Data visualization is an essential component of genomic data analysis. However, the size and diversity of the data sets produced by today's sequencing and array-based profiling methods present major challenges to visualization tools. The Integrative Genomics Viewer (IGV) is a high-performance viewer that efficiently handles large heterogeneous data sets, while providing a smooth and intuitive user experience at all levels of genome resolution. A key characteristic of IGV is its focus on the integrative nature of genomic studies, with support for both array-based and next-generation sequencing data, and the integration of clinical and phenotypic data. Although IGV is often used to view genomic data from public sources, its primary emphasis is to support researchers who wish to visualize and explore their own data sets or those from colleagues. To that end, IGV supports flexible loading of local and remote data sets, and is optimized to provide high-performance data visualization and exploration on standard desktop systems. IGV is freely available for download from http://www.broadinstitute.org/igv, under a GNU LGPL open-source license.

  9. 360-degree videos: a new visualization technique for astrophysical simulations

    Science.gov (United States)

    Russell, Christopher M. P.

    2017-11-01

    360-degree videos are a new type of movie that renders over all 4π steradians. Video sharing sites such as YouTube now allow this unique content to be shared via virtual reality (VR) goggles, hand-held smartphones/tablets, and computers. Creating 360° videos from astrophysical simulations is not only a new way to view these simulations, since you are immersed in them, but also a way to create engaging content for outreach to the public. We present what we believe is the first 360° video of an astrophysical simulation: a hydrodynamics calculation of the central parsec of the Galactic centre. We also describe how to create such movies, and briefly comment on what new science can be extracted from astrophysical simulations using 360° videos.
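    Rendering "over all 4π steradians" amounts to assigning a viewing direction to every pixel of a 2:1 equirectangular frame and shading each direction from the scene; video platforms then treat the frame as a spherical panorama once the appropriate spatial metadata is attached (a separate step, e.g. with Google's spatial-media tools). The NumPy sketch below shades a toy scene of point sources; it illustrates the pixel-to-direction mapping only, not the paper's hydrodynamic rendering.

```python
import numpy as np
import matplotlib.pyplot as plt

W, H = 1024, 512                      # a 2:1 equirectangular frame covers 4*pi sr
j, i = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
lon = (i + 0.5) / W * 2 * np.pi - np.pi        # azimuth, -pi..pi
lat = np.pi / 2 - (j + 0.5) / H * np.pi        # altitude, +pi/2..-pi/2

# Unit viewing direction for every pixel.
dirs = np.stack([np.cos(lat) * np.cos(lon),
                 np.cos(lat) * np.sin(lon),
                 np.sin(lat)], axis=-1)

# Toy "scene": brightness from a few point sources surrounding the camera.
rng = np.random.default_rng(3)
stars = rng.normal(size=(50, 3))
stars /= np.linalg.norm(stars, axis=1, keepdims=True)
frame = np.zeros((H, W))
for s in stars:
    cosang = np.clip(dirs @ s, -1, 1)
    frame += np.exp(-(np.arccos(cosang) / 0.05) ** 2)   # Gaussian blob per source

plt.imsave("frame_360.png", frame, cmap="magma")
```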

  10. Cactus and Visapult: A case study of ultra-high performance distributed visualization using connectionless protocols

    Energy Technology Data Exchange (ETDEWEB)

    Shalf, John; Bethel, E. Wes

    2002-05-07

    This past decade has seen rapid growth in the size, resolution, and complexity of Grand Challenge simulation codes. Many such problems still require interactive visualization tools to make sense of multi-terabyte data stores. Visapult is a parallel volume rendering tool that employs distributed components, latency tolerant algorithms, and high performance network I/O for effective remote visualization of massive datasets. In this paper we discuss using connectionless protocols to accelerate Visapult network I/O and interfacing Visapult to the Cactus General Relativity code to enable scalable remote monitoring and steering capabilities. With these modifications, network utilization has moved from 25 percent of line-rate using tuned multi-streamed TCP to sustaining 88 percent of line rate using the new UDP-based transport protocol.
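    The "connectionless protocols" referred to here are UDP-style transports: the sender streams datagrams without handshakes or retransmission, and the application, rather than TCP's congestion control, decides how to cope with loss. The sketch below is a minimal, generic Python illustration of that pattern (with sequence numbers so the receiver can detect gaps); it is not Visapult's actual transport protocol.

```python
import socket
import threading
import time

def sender(chunks, host="127.0.0.1", port=9999):
    """Stream data chunks as UDP datagrams: no handshake, no retransmission."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for seq, payload in enumerate(chunks):
        # Prepend a sequence number so the receiver can detect loss/reordering.
        sock.sendto(seq.to_bytes(4, "big") + payload, (host, port))
    sock.close()

def receiver(port=9999, expected=10):
    """Collect datagrams; missing sequence numbers are tolerated, not re-requested."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    sock.settimeout(1.0)
    got = {}
    try:
        while len(got) < expected:
            data, _ = sock.recvfrom(65507)
            got[int.from_bytes(data[:4], "big")] = data[4:]
    except socket.timeout:
        pass
    sock.close()
    print(f"received {len(got)}/{expected} chunks")

if __name__ == "__main__":
    t = threading.Thread(target=receiver)
    t.start()
    time.sleep(0.2)                      # give the receiver time to bind
    sender([b"x" * 1024 for _ in range(10)])
    t.join()
```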

  11. Multi-scale data visualization for computational astrophysics and climate dynamics at Oak Ridge National Laboratory

    International Nuclear Information System (INIS)

    Ahern, Sean; Daniel, Jamison R; Gao, Jinzhu; Ostrouchov, George; Toedte, Ross J; Wang, Chaoli

    2006-01-01

    Computational astrophysics and climate dynamics are two principal application foci at the Center for Computational Sciences (CCS) at Oak Ridge National Laboratory (ORNL). We identify a dataset frontier that is shared by several SciDAC computational science domains and present an exploration of traditional production visualization techniques enhanced with new enabling research technologies such as advanced parallel occlusion culling and high resolution small multiples statistical analysis. In collaboration with our research partners, these techniques will allow the visual exploration of a new generation of peta-scale datasets that cross this data frontier along all axes

  12. High performance geospatial and climate data visualization using GeoJS

    Science.gov (United States)

    Chaudhary, A.; Beezley, J. D.

    2015-12-01

    GeoJS (https://github.com/OpenGeoscience/geojs) is an open-source library developed to support interactive scientific and geospatial visualization of climate and earth science datasets in a web environment. GeoJS has a convenient application programming interface (API) that enables users to harness the fast performance of the WebGL and Canvas 2D APIs with sophisticated Scalable Vector Graphics (SVG) features in a consistent and convenient manner. We started the project in response to the need for an open-source JavaScript library that can combine traditional geographic information systems (GIS) and scientific visualization on the web. Many libraries, some of which are open source, support mapping or other GIS capabilities, but lack the features required to visualize scientific and other geospatial datasets. For instance, such libraries are not capable of rendering climate plots from NetCDF files, and some libraries are limited with regard to geoinformatics (infovis in a geospatial environment). While libraries such as d3.js are extremely powerful for these kinds of plots, in order to integrate them into other GIS libraries, the construction of geoinformatics visualizations must be completed manually and separately, or the code must somehow be mixed in an unintuitive way. We developed GeoJS with the following motivations: • To create an open-source geovisualization and GIS library that combines scientific visualization with GIS and informatics • To develop an extensible library that can combine data from multiple sources and render them using multiple backends • To build a library that works well with existing scientific visualization tools such as VTK. We have successfully deployed GeoJS-based applications for multiple domains across various projects. The ClimatePipes project funded by the Department of Energy, for example, used GeoJS to visualize NetCDF datasets from climate data archives. Other projects built visualizations using GeoJS for interactively exploring

  13. Detecting Distributed Scans Using High-Performance Query-DrivenVisualization

    Energy Technology Data Exchange (ETDEWEB)

    Stockinger, Kurt; Bethel, E. Wes; Campbell, Scott; Dart, Eli; Wu,Kesheng

    2006-09-01

    Modern forensic analytics applications, like network traffic analysis, perform high-performance hypothesis testing, knowledge discovery and data mining on very large datasets. One essential strategy to reduce the time required for these operations is to select only the most relevant data records for a given computation. In this paper, we present a set of parallel algorithms that demonstrate how an efficient selection mechanism -- bitmap indexing -- significantly speeds up a common analysis task, namely, computing conditional histograms on very large datasets. We present a thorough study of the performance characteristics of the parallel conditional histogram algorithms. As a case study, we compute conditional histograms for detecting distributed scans hidden in a dataset consisting of approximately 2.5 billion network connection records. We show that these conditional histograms can be computed on interactive timescales (i.e., in seconds). We also show how to progressively modify the selection criteria to narrow the analysis and find the sources of the distributed scans.
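    A conditional histogram in this setting is a histogram computed only over records satisfying a query, e.g. the number of connections per source address restricted to one destination port; a source with anomalously high fan-out is a scan candidate. The sketch below uses synthetic connection records and a NumPy boolean mask where the paper uses FastBit bitmap indexes and parallel algorithms.

```python
import numpy as np

rng = np.random.default_rng(4)
# Toy connection records: (source, destination, destination port).
n = 1_000_000
src   = rng.integers(0, 5000, n)
dst   = rng.integers(0, 20000, n)
dport = rng.integers(0, 1024, n)
# Inject a "scanner": one source probing many destinations on one port.
src[:3000] = 4242; dport[:3000] = 22; dst[:3000] = np.arange(3000)

# Conditional histogram: restrict to the queried traffic first (boolean mask
# standing in for a bitmap-index selection), then count records per source.
cond = dport == 22
per_source = np.bincount(src[cond], minlength=5000)
suspects = np.argsort(per_source)[-3:][::-1]
print("top sources by port-22 fan-out:", list(zip(suspects, per_source[suspects])))
```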

  14. High-performance execution of psychophysical tasks with complex visual stimuli in MATLAB

    Science.gov (United States)

    Asaad, Wael F.; Santhanam, Navaneethan; McClellan, Steven

    2013-01-01

    Behavioral, psychological, and physiological experiments often require the ability to present sensory stimuli, monitor and record subjects' responses, interface with a wide range of devices, and precisely control the timing of events within a behavioral task. Here, we describe our recent progress developing an accessible and full-featured software system for controlling such studies using the MATLAB environment. Compared with earlier reports on this software, key new features have been implemented to allow the presentation of more complex visual stimuli, increase temporal precision, and enhance user interaction. These features greatly improve the performance of the system and broaden its applicability to a wider range of possible experiments. This report describes these new features and improvements, current limitations, and quantifies the performance of the system in a real-world experimental setting. PMID:23034363

  15. 3D Printing Meets Astrophysics: A New Way to Visualize and Communicate Science

    Science.gov (United States)

    Madura, Thomas Ignatius; Steffen, Wolfgang; Clementel, Nicola; Gull, Theodore R.

    2015-08-01

    3D printing has the potential to improve the astronomy community’s ability to visualize, understand, interpret, and communicate important scientific results. I summarize recent efforts to use 3D printing to understand in detail the 3D structure of a complex astrophysical system, the supermassive binary star Eta Carinae and its surrounding bipolar ‘Homunculus’ nebula. Using mapping observations of molecular hydrogen line emission obtained with the ESO Very Large Telescope, we obtained a full 3D model of the Homunculus, allowing us to 3D print, for the first time, a detailed replica of a nebula (Steffen et al. 2014, MNRAS, 442, 3316). I also present 3D prints of output from supercomputer simulations of the colliding stellar winds in the highly eccentric binary located near the center of the Homunculus (Madura et al. 2015, arXiv:1503.00716). These 3D prints, the first of their kind, reveal previously unknown ‘finger-like’ structures at orbital phases shortly after periastron (when the two stars are closest to each other) that protrude outward from the spiral wind-wind collision region. The results of both efforts have received significant media attention in recent months, including two NASA press releases (http://www.nasa.gov/content/goddard/astronomers-bring-the-third-dimension-to-a-doomed-stars-outburst/ and http://www.nasa.gov/content/goddard/nasa-observatories-take-an-unprecedented-look-into-superstar-eta-carinae/), demonstrating the potential of using 3D printing for astronomy outreach and education. Perhaps more importantly, 3D printing makes it possible to bring the wonders of astronomy to new, often neglected, audiences, i.e. the blind and visually impaired.

  16. A study of visual double stars with early type primaries. IV. Astrophysical data

    International Nuclear Information System (INIS)

    Lindroos, K.P.

    1985-01-01

    Astrophysical parameters (MK class, colour excess, absolute magnitude, distance, effective temperature, mass and age) are derived from calibrations of the uvbyβ indices for the members of 253 double stars with O or B type primaries and faint secondaries. The photometric spectral classification is compared to the MK classes and the agreement is very good. The derived data, together with spectroscopic and JHKL data, are used to decide which pairs are likely to be physical and which are optical, and it is shown that 98 (34%) of the secondaries are likely to be members of physical systems. For 90% of the physical pairs the projected separation between the components is less than 25000 AU. A majority of the physical secondaries are late-type stars and 50% of them are contracting towards the zero-age main sequence. Also presented are new uvbyβ data for 43 secondaries and a computer programme for determining astrophysical parameters from uvbyβ data.
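    One step in deriving such parameters is obtaining the distance from the extinction-corrected distance modulus once the calibrations have supplied an absolute magnitude and a colour excess. The numbers below are illustrative placeholders, and the ratio A_V ≈ 4.3 E(b-y) is only an approximate standard extinction-law value, not a figure taken from the paper.

```python
# Distance from the extinction-corrected distance modulus (illustrative values only).
V    = 9.80           # apparent visual magnitude
M_V  = 1.20           # absolute magnitude from the uvby-beta calibration
E_by = 0.05           # colour excess E(b-y)
A_V  = 4.3 * E_by     # visual extinction from an approximate standard extinction law

# m - M = 5 log10(d / 10 pc) + A_V  =>  d = 10 ** (0.2 * (V - M_V - A_V + 5)) pc
d_pc = 10 ** (0.2 * (V - M_V - A_V + 5))
print(f"distance ≈ {d_pc:.0f} pc")
```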

  17. 360-degree videos: a new visualization technique for astrophysical simulations, applied to the Galactic Center

    Science.gov (United States)

    Russell, Christopher

    2018-01-01

    360-degree videos are a new type of movie that renders over all 4π steradians. Video sharing sites such as YouTube now allow this unique content to be shared via virtual reality (VR) goggles, hand-held smartphones/tablets, and computers. Creating 360-degree videos from astrophysical simulations not only provides a new way to view these simulations, owing to their immersive nature, but also yields engaging content for outreach to the public. We present our 360-degree video of an astrophysical simulation of the Galactic center: a hydrodynamics calculation of the colliding and accreting winds of the 30 Wolf-Rayet stars orbiting within the central parsec. Viewing the movie, which renders column density, from the location of the supermassive black hole gives a unique and immersive perspective of the shocked wind material inspiraling and tidally stretching as it plummets toward the black hole. We also describe how to create such movies, discuss what type of content does and does not look appealing in the 360-degree format, and briefly comment on what new science can be extracted from astrophysical simulations using 360-degree videos.

  18. ISC High Performance 2017 International Workshops, DRBSD, ExaComm, HCPM, HPC-IODC, IWOPH, IXPUG, P^3MA, VHPC, Visualization at Scale, WOPSSS

    CERN Document Server

    Yokota, Rio; Taufer, Michela; Shalf, John

    2017-01-01

    This book constitutes revised selected papers from 10 workshops that were held at the ISC High Performance 2017 conference in Frankfurt, Germany, in June 2017. The 59 papers presented in this volume were carefully reviewed and selected for inclusion in this book. They stem from the following workshops: Workshop on Virtualization in High-Performance Cloud Computing (VHPC); Visualization at Scale: Deployment Case Studies and Experience Reports; International Workshop on Performance Portable Programming Models for Accelerators (P^3MA); OpenPOWER for HPC (IWOPH); International Workshop on Data Reduction for Big Scientific Data (DRBSD); International Workshop on Communication Architectures for HPC, Big Data, Deep Learning and Clouds at Extreme Scale; Workshop on HPC Computing in a Post Moore's Law World (HCPM); HPC I/O in the Data Center (HPC-IODC); Workshop on Performance and Scalability of Storage Systems (WOPSSS); IXPUG: Experiences on Intel Knights Landing at the One Year Mark; International Workshop on Communicati...

  19. Relativistic Astrophysics

    International Nuclear Information System (INIS)

    Font, J. A.

    2015-01-01

    Relativistic astrophysics is the branch of astrophysics that employs Einstein's theory of relativity as its physical and mathematical model for studying the Universe. This discipline analyzes astronomical contexts in which classical Newtonian mechanics and Newton's law of gravitation are not valid. (Author)

  20. Essential astrophysics

    CERN Document Server

    Lang, Kenneth R

    2013-01-01

    Essential Astrophysics is a book to learn or teach from, as well as a fundamental reference volume for anyone interested in astronomy and astrophysics. It presents astrophysics from basic principles without requiring any previous study of astronomy or astrophysics. It serves as a comprehensive introductory text, which takes the student through the field of astrophysics in lecture-sized chapters of basic physical principles applied to the cosmos. This one-semester overview will be enjoyed by undergraduate students with an interest in the physical sciences, such as astronomy, chemistry, engineering or physics, as well as by any curious student interested in learning about our celestial science. The mathematics required for understanding the text is on the level of simple algebra, for that is all that is needed to describe the fundamental principles. The text is of sufficient breadth and depth to prepare the interested student for more advanced specialized courses in the future. Astronomical examples are provide...

  1. Astrophysical Flows

    Science.gov (United States)

    Pringle, James E.; King, Andrew

    2003-07-01

    Almost all conventional matter in the Universe is fluid, and fluid dynamics plays a crucial role in astrophysics. This new graduate textbook provides a basic understanding of the fluid dynamical processes relevant to astrophysics. The mathematics used to describe these processes is simplified to bring out the underlying physics. The authors cover many topics, including wave propagation, shocks, spherical flows, stellar oscillations, the instabilities caused by effects such as magnetic fields, thermal driving, gravity, shear flows, and the basic concepts of compressible fluid dynamics and magnetohydrodynamics. The authors are Directors of the UK Astrophysical Fluids Facility (UKAFF) at the University of Leicester, and editors of the Cambridge Astrophysics Series. This book has been developed from a course in astrophysical fluid dynamics taught at the University of Cambridge. It is suitable for graduate students in astrophysics, physics and applied mathematics, and requires only a basic familiarity with fluid dynamics. • Provides coverage of the fundamental fluid dynamical processes an astrophysical theorist needs to know • Introduces new mathematical theory and techniques in a straightforward manner • Includes end-of-chapter problems to illustrate the course and introduce additional ideas

  2. Nuclear astrophysics

    International Nuclear Information System (INIS)

    Haxton, W.C.

    1992-01-01

    The problem of core-collapse supernovae is used to illustrate the many connections between nuclear astrophysics and the problems nuclear physicists study in terrestrial laboratories. Efforts to better understand the collapse and mantle ejection are also motivated by a variety of interdisciplinary issues in nuclear, particle, and astrophysics, including galactic chemical evolution, neutrino masses and mixing, and stellar cooling by the emission of new particles. The current status of theory and observations is summarized

  3. Relativistic astrophysics

    CERN Document Server

    Demianski, Marek

    2013-01-01

    Relativistic Astrophysics brings together important astronomical discoveries and the significant achievements, as well as the difficulties in the field of relativistic astrophysics. This book is divided into 10 chapters that tackle some aspects of the field, including the gravitational field, stellar equilibrium, black holes, and cosmology. The opening chapters introduce the theories to delineate gravitational field and the elements of relativistic thermodynamics and hydrodynamics. The succeeding chapters deal with the gravitational fields in matter; stellar equilibrium and general relativity

  4. Nuclear astrophysics

    International Nuclear Information System (INIS)

    Lehoucq, Roland; Klotz, Gregory

    2015-11-01

    Astronomy deals with the position and observation of the objects in our Universe, from planets to galaxies. It is the oldest of the sciences. Astrophysics is the study of the physical properties of these objects. It dates from the start of the 20th century. Nuclear astrophysics is the marriage of nuclear physics, a laboratory science concerned with the infinitely small, and astrophysics, the science of what is far away and infinitely large. Its aim is to explain the origin, evolution and abundance of the elements in the Universe. It was born in 1938 with the work of Hans Bethe, an American physicist who won the Nobel Prize for physics in 1967, on the nuclear reactions that can occur at the center of stars. It explains where the incredible energy of the stars and the Sun comes from and enables us to understand how they are born, live and die. The matter all around us, and from which we are made, is made up of ninety-two chemical elements that can be found in every corner of the Universe. Nuclear astrophysics explains the origin of these chemical elements by nucleosynthesis, which is the synthesis of atomic nuclei in different astrophysical environments such as stars. Nuclear astrophysics provides answers to fundamental questions: - Our Sun and the stars in general shine because nuclear reactions are taking place within them. - The stars follow a sequence of nuclear reaction cycles. Nucleosynthesis in the stars enables us to explain the origin and abundance of elements essential to life, such as carbon, oxygen, nitrogen and iron. - Star explosions, in the form of supernovae, disperse the nuclei formed by nucleosynthesis into space and explain the formation of the heaviest chemical elements such as gold, platinum and lead. Nuclear astrophysics is still a growing area of science. (authors)

  5. Construction of Blaze at the University of Illinois at Chicago: A Shared, High-Performance, Visual Computer for Next-Generation Cyberinfrastructure-Accelerated Scientific, Engineering, Medical and Public Policy Research

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Maxine D. [Acting Director, EVL; Leigh, Jason [PI

    2014-02-17

    The Blaze high-performance visual computing system serves the high-performance computing research and education needs of the University of Illinois at Chicago (UIC). Blaze consists of a state-of-the-art, networked computer cluster and an ultra-high-resolution visualization system called CAVE2(TM) that is currently not available anywhere else in Illinois. This system is connected via a high-speed 100-Gigabit network to the State of Illinois' I-WIRE optical network, as well as to national and international high-speed networks, such as Internet2 and the Global Lambda Integrated Facility. This enables Blaze to serve as an on-ramp to national cyberinfrastructure, such as the National Science Foundation's Blue Waters petascale computer at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign and the Department of Energy's Argonne Leadership Computing Facility (ALCF) at Argonne National Laboratory. DOE award # DE-SC005067, leveraged with NSF award #CNS-0959053 for “Development of the Next-Generation CAVE Virtual Environment (NG-CAVE),” enabled us to create a first-of-its-kind high-performance visual computing system. The UIC Electronic Visualization Laboratory (EVL) worked with two U.S. companies to advance their commercial products and maintain U.S. leadership in the global information technology economy. New applications are being enabled with the CAVE2/Blaze visual computing system that is advancing scientific research and education in the U.S. and globally, and helping to train the next-generation workforce.

  6. High performance homes

    DEFF Research Database (Denmark)

    Beim, Anne; Vibæk, Kasper Sánchez

    2014-01-01

    Can prefabrication contribute to the development of high performance homes? To answer this question, this chapter defines high performance in more broadly inclusive terms, acknowledging the technical, architectural, social and economic conditions under which energy consumption and production occur. ... Consideration of all these factors is a precondition for a truly integrated practice and, as this chapter demonstrates, innovative project delivery methods founded on the manufacturing of prefabricated buildings contribute to the production of high performance homes that are cost effective to construct, energy...

  7. Astrophysical Concepts

    CERN Document Server

    Harwit, Martin

    2006-01-01

    This classic text, aimed at senior undergraduates and beginning graduate students in physics and astronomy, presents a wide range of astrophysical concepts in sufficient depth to give the reader a quantitative understanding of the subject. Emphasizing physical concepts, the book outlines cosmic events but does not portray them in detail: it provides a series of astrophysical sketches. For this fourth edition, nearly every part of the text has been reconsidered and rewritten, new sections have been added to cover recent developments, and others have been extensively revised and brought up to date. The book begins with an outline of the scope of modern astrophysics and enumerates some of the outstanding problems faced in the field today. The basic physics needed to tackle these questions is developed in the next few chapters using specific astronomical processes as examples. The second half of the book enlarges on these topics and shows how we can obtain quantitative insight into the structure and evolution of...

  8. Nuclear astrophysics

    International Nuclear Information System (INIS)

    Arnould, M.; Takahashi, K.

    1999-01-01

    Nuclear astrophysics is that branch of astrophysics which helps understanding of the Universe, or at least some of its many faces, through the knowledge of the microcosm of the atomic nucleus. It attempts to find as many nuclear physics imprints as possible in the macrocosm, and to decipher what those messages are telling us about the varied constituent objects in the Universe at present and in the past. In the last decades much advance has been made in nuclear astrophysics thanks to the sometimes spectacular progress made in the modelling of the structure and evolution of the stars, in the quality and diversity of the astronomical observations, as well as in the experimental and theoretical understanding of the atomic nucleus and of its spontaneous or induced transformations. Developments in other subfields of physics and chemistry have also contributed to that advance. Notwithstanding the accomplishment, many long-standing problems remain to be solved, and the theoretical understanding of a large variety of observational facts needs to be put on safer grounds. In addition, new questions are continuously emerging, and new facts endangering old ideas. This review shows that astrophysics has been, and still is, highly demanding to nuclear physics in both its experimental and theoretical components. On top of the fact that large varieties of nuclei have to be dealt with, these nuclei are immersed in highly unusual environments which may have a significant impact on their static properties, the diversity of their transmutation modes, and on the probabilities of these modes. In order to have a chance of solving some of the problems nuclear astrophysics is facing, the astrophysicists and nuclear physicists are obviously bound to put their competence in common, and have sometimes to benefit from the help of other fields of physics, like particle physics, plasma physics or solid-state physics. Given the highly varied and complex aspects, we pick here some specific nuclear

  9. Plasma astrophysics

    CERN Document Server

    Kaplan, S A; ter Haar, D

    2013-01-01

    Plasma Astrophysics is a translation from the Russian language; the topics discussed are based on lectures given by V.N. Tsytovich at several universities. The book describes the physics of the various phenomena connected with plasma astrophysics and their mathematical formulation. This book also explains the theory of the interaction of fast particles with plasma and their radiation, as well as the behavior of plasma exposed to a very strong magnetic field. The text describes the nature of collective plasma processes and of plasma turbulence. One author explains the method of elementary

  10. Neutrino astrophysics

    International Nuclear Information System (INIS)

    Roulet, E.

    2001-01-01

    A general overview of neutrino physics and astrophysics is given, starting with a historical account of the development of our understanding of neutrinos and how they helped to unravel the structure of the Standard Model. We discuss why it is so important to establish if neutrinos are massive and introduce the main scenarios to provide them a mass. The present bounds and the positive indications in favor of non-zero neutrino masses are discussed, including the recent results on atmospheric and solar neutrinos. The major role that neutrinos play in astrophysics and cosmology is illustrated. (author)

  11. Relativistic astrophysics

    CERN Document Server

    Price, R H

    1993-01-01

    Work reported in the workshop on relativistic astrophysics spanned a wide variety of topics. Two specific areas seemed of particular interest. Much attention was focussed on gravitational wave sources, especially on the waveforms they produce, and progress was reported in theoretical and observational aspects of accretion disks.

  12. Astrophysics today

    International Nuclear Information System (INIS)

    Cameron, A.G.W.

    1984-01-01

    Examining recent history, current trends, and future possibilities, the author reports on the frontiers of research on the solar system, stars, galactic physics, and cosmological physics. The book discusses the great discoveries in astronomy and astrophysics and examines the circumstances in which they occurred. It discusses the physics of white dwarfs, the inflationary universe, the extinction of the dinosaurs, black holes, cosmological models, and much more.

  13. INL High Performance Building Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Jennifer D. Morton

    2010-02-01

    High performance buildings, also known as sustainable buildings and green buildings, are resource efficient structures that minimize the impact on the environment by using less energy and water, reduce solid waste and pollutants, and limit the depletion of natural resources while also providing a thermally and visually comfortable working environment that increases productivity for building occupants. As Idaho National Laboratory (INL) becomes the nation’s premier nuclear energy research laboratory, the physical infrastructure will be established to help accomplish this mission. This infrastructure, particularly the buildings, should incorporate high performance sustainable design features in order to be environmentally responsible and reflect an image of progressiveness and innovation to the public and prospective employees. Additionally, INL is a large consumer of energy that contributes to both carbon emissions and resource inefficiency. In the current climate of rising energy prices and political pressure for carbon reduction, this guide will help new construction project teams to design facilities that are sustainable and reduce energy costs, thereby reducing carbon emissions. With these concerns in mind, the recommendations described in the INL High Performance Building Strategy (previously called the INL Green Building Strategy) are intended to form the INL foundation for high performance building standards. This revised strategy incorporates the latest federal and DOE orders (Executive Order [EO] 13514, “Federal Leadership in Environmental, Energy, and Economic Performance” [2009], EO 13423, “Strengthening Federal Environmental, Energy, and Transportation Management” [2007], and DOE Order 430.2B, “Departmental Energy, Renewable Energy, and Transportation Management” [2008]), the latest guidelines, trends, and observations in high performance building construction, and the latest changes to the Leadership in Energy and Environmental Design

  14. High Performance Marine Vessels

    CERN Document Server

    Yun, Liang

    2012-01-01

    High Performance Marine Vessels (HPMVs) range from the Fast Ferries to the latest high speed Navy Craft, including competition power boats and hydroplanes, hydrofoils, hovercraft, catamarans and other multi-hull craft. High Performance Marine Vessels covers the main concepts of HPMVs and discusses historical background, design features, services that have been successful and not so successful, and some sample data of the range of HPMVs to date. Included is a comparison of all HPMVs craft and the differences between them and descriptions of performance (hydrodynamics and aerodynamics). Readers will find a comprehensive overview of the design, development and building of HPMVs. In summary, this book: Focuses on technology at the aero-marine interface Covers the full range of high performance marine vessel concepts Explains the historical development of various HPMVs Discusses ferries, racing and pleasure craft, as well as utility and military missions High Performance Marine Vessels is an ideal book for student...

  15. High Performance Macromolecular Material

    National Research Council Canada - National Science Library

    Forest, M

    2002-01-01

    .... In essence, most commercial high-performance polymers are processed through fiber spinning, following Nature and spider silk, which is still pound-for-pound the toughest liquid crystalline polymer...

  16. Observational astrophysics

    CERN Document Server

    Léna, Pierre; Lebrun, François; Mignard, François; Pelat, Didier

    2012-01-01

    This is the updated, widely revised, restructured and expanded third edition of Léna et al.'s successful work Observational Astrophysics. It presents a synthesis on tools and methods of observational astrophysics of the early 21st century. Written specifically for astrophysicists and graduate students, this textbook focuses on fundamental and sometimes practical limitations on the ultimate performance that an astronomical system may reach, rather than presenting particular systems in detail. In little more than a decade there has been extraordinary progress in imaging and detection technologies, in the fields of adaptive optics, optical interferometry, in the sub-millimetre waveband, observation of neutrinos, discovery of exoplanets, to name but a few examples. The work deals with ground-based and space-based astronomy and their respective fields. And it also presents the ambitious concepts behind space missions aimed for the next decades. Avoiding particulars, it covers the whole of the electromagnetic spec...

  17. astrophysical significance

    Directory of Open Access Journals (Sweden)

    Dartois E.

    2014-02-01

    Clathrate hydrates, ice inclusion compounds, are of major importance for the Earth's permafrost regions and may control the stability of gases in many astrophysical bodies such as planets, comets and possibly interstellar grains. Their physical behavior may provide a trapping mechanism to modify the absolute and relative composition of icy bodies that could be the source of late-time injection of gaseous species into planetary atmospheres or hot cores. In this study, we provide and discuss laboratory-recorded infrared signatures of clathrate hydrates in the near to mid-infrared and the implications for space-based astrophysical remote detection, in order to constrain their possible presence.

  18. Observational astrophysics

    CERN Document Server

    Smith, Robert C

    1995-01-01

    Combining a critical account of observational methods (telescopes and instrumentation) with a lucid description of the Universe, including stars, galaxies and cosmology, Smith provides a comprehensive introduction to the whole of modern astrophysics beyond the solar system. The first half describes the techniques used by astronomers to observe the Universe: optical telescopes and instruments are discussed in detail, but observations at all wavelengths are covered, from radio to gamma-rays. After a short interlude describing the appearance of the sky at all wavelengths, the role of positional astronomy is highlighted. In the second half, a clear description is given of the contents of the Universe, including accounts of stellar evolution and cosmological models. Fully illustrated throughout, with exercises given in each chapter, this textbook provides a thorough introduction to astrophysics for all physics undergraduates, and a valuable background for physics graduates turning to research in astronomy.

  19. Stellar astrophysics

    International Nuclear Information System (INIS)

    1987-01-01

    A number of studies in the field of stellar astrophysics were undertaken by the South African Astronomical Observatory in 1986. These studies included: evolutionary effects on the surface abundances of an early-type supergiant; hydrogen-deficient stars; T Tauri stars; rotational modulation and flares on RS CVn and BY Dra stars; carbon and heavy-element stars; and slow variability and circumstellar shells of red variable stars. 4 figs

  20. Reduction and processing of astrophysical data by visualization and creation of merger trees from dark matter particle simulations

    International Nuclear Information System (INIS)

    Riser, T.

    2012-01-01

    State-of-the-art dark matter particle simulations of galaxy clusters produce vast amounts of raw data that need to be interpreted and scientifically understood. In this thesis two cornerstones involved in this process are presented. First, a unique and robust algorithm is shown, which extracts a so-called "merger tree" from dark matter particle data. It represents the development and history of every galaxy that lives within the gravitational potential of the dark matter halos formed by the simulated structure formation process, with a special focus on the merging of smaller halos into bigger ones through the course of time. Second, a modern approach is discussed that harnesses the massively parallel computing power of state-of-the-art graphics cards to greatly improve the image quality of real-time particle visualizations without the requirement of additional geometric data. (author)
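
    To make the linking idea concrete, a minimal sketch of the general approach behind merger-tree construction (not the specific algorithm of the thesis) is given below in Python: halos in consecutive snapshots are linked by the fraction of dark matter particle IDs they share, and the progenitor contributing the most particles is treated as the main branch. The toy snapshot data and the matching threshold are illustrative assumptions.

      # Minimal sketch of merger-tree linking between two simulation snapshots.
      # Assumptions (not from the thesis): each halo is represented by the set of
      # IDs of its dark matter particles, and progenitors are earlier halos that
      # contribute at least a minimum fraction of their particles.
      def link_progenitors(halos_early, halos_late, min_shared_fraction=0.1):
          """Return {late_halo_id: [(early_halo_id, shared_fraction), ...]}."""
          tree = {}
          for late_id, late_particles in halos_late.items():
              links = []
              for early_id, early_particles in halos_early.items():
                  shared = len(late_particles & early_particles)
                  fraction = shared / len(early_particles)
                  if fraction >= min_shared_fraction:
                      links.append((early_id, fraction))
              # The main progenitor (largest contribution) comes first.
              links.sort(key=lambda pair: pair[1], reverse=True)
              tree[late_id] = links
          return tree

      # Toy example: halo 2 in the later snapshot forms by the merger of halos 0 and 1.
      snapshot_early = {0: {1, 2, 3, 4}, 1: {5, 6, 7}}
      snapshot_late = {2: {1, 2, 3, 4, 5, 6, 7, 8}}
      print(link_progenitors(snapshot_early, snapshot_late))   # {2: [(0, 1.0), (1, 1.0)]}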

  1. High performance conductometry

    International Nuclear Information System (INIS)

    Saha, B.

    2000-01-01

    Inexpensive but high performance systems have emerged progressively for basic and applied measurements in physical and analytical chemistry on one hand, and for on-line monitoring and leak detection in plants and facilities on the other. Salient features of the developments will be presented with specific examples

  2. High performance systems

    Energy Technology Data Exchange (ETDEWEB)

    Vigil, M.B. [comp.

    1995-03-01

    This document provides a written compilation of the presentations and viewgraphs from the 1994 Conference on High Speed Computing given at the High Speed Computing Conference, "High Performance Systems," held at Gleneden Beach, Oregon, on April 18 through 21, 1994.

  3. Danish High Performance Concretes

    DEFF Research Database (Denmark)

    Nielsen, M. P.; Christoffersen, J.; Frederiksen, J.

    1994-01-01

    In this paper the main results obtained in the research program High Performance Concretes in the 90's are presented. This program was financed by the Danish government and was carried out in cooperation between The Technical University of Denmark, several private companies, and Aalborg University...... concretes, workability, ductility, and confinement problems....

  4. High performance homes

    DEFF Research Database (Denmark)

    Beim, Anne; Vibæk, Kasper Sánchez

    2014-01-01

    . Consideration of all these factors is a precondition for a truly integrated practice and as this chapter demonstrates, innovative project delivery methods founded on the manufacturing of prefabricated buildings contribute to the production of high performance homes that are cost effective to construct, energy...

  5. Astrophysical cosmology

    Science.gov (United States)

    Bardeen, J. M.

    The last several years have seen a tremendous ferment of activity in astrophysical cosmology. Much of the theoretical impetus has come from particle physics theories of the early universe and candidates for dark matter, but what promise to be even more significant are improved direct observations of high z galaxies and intergalactic matter, deeper and more comprehensive redshift surveys, and the increasing power of computer simulations of the dynamical evolution of large scale structure. Upper limits on the anisotropy of the microwave background radiation are gradually getting tighter and constraining more severely theoretical scenarios for the evolution of the universe.

  6. Astrophysical cosmology

    International Nuclear Information System (INIS)

    Bardeen, J.M.

    1986-01-01

    The last several years have seen a tremendous ferment of activity in astrophysical cosmology. Much of the theoretical impetus has come from particle physics theories of the early universe and candidates for dark matter, but what promise to be even more significant are improved direct observations of high z galaxies and intergalactic matter, deeper and more comprehensive redshift surveys, and the increasing power of computer simulations of the dynamical evolution of large scale structure. Upper limits on the anisotropy of the microwave background radiation are gradually getting tighter and constraining more severely theoretical scenarios for the evolution of the universe. 47 refs

  7. High-Performance Networking

    CERN Multimedia

    CERN. Geneva

    2003-01-01

    The series will start with an historical introduction about what people saw as high performance message communication in their time and how that developed into what is known today as standard computer network communication. It will be followed by a far more technical part that uses the High Performance Computer Network standards of the 90's, with 1 Gbit/sec systems, as an introduction for an in-depth explanation of the three new 10 Gbit/s network and interconnect technology standards that exist already or are emerging. Where necessary for a good understanding, some sidesteps will be included to explain important protocols as well as relevant details of the Wide Area Network (WAN) standards concerned, including some basics of wavelength multiplexing (DWDM). Some remarks will be made concerning the rapidly expanding applications of networked storage.

  8. High Performance Concrete

    Directory of Open Access Journals (Sweden)

    Traian Oneţ

    2009-01-01

    The paper presents the latest studies and research carried out in Cluj-Napoca related to high performance concrete, high strength concrete and self-compacting concrete. The purpose of this paper is to review the advantages and drawbacks of using a particular concrete type. Two concrete recipes are presented, namely one for the concrete used in rigid pavement for roads and another one for self-compacting concrete.

  9. High performance polymeric foams

    International Nuclear Information System (INIS)

    Gargiulo, M.; Sorrentino, L.; Iannace, S.

    2008-01-01

    The aim of this work was to investigate the foamability of high-performance polymers (polyethersulfone, polyphenylsulfone, polyetherimide and polyethylene naphthalate). Two different methods have been used to prepare the foam samples: high temperature expansion and a two-stage batch process. The effects of processing parameters (saturation time and pressure, foaming temperature) on the densities and microcellular structures of these foams were analyzed by using scanning electron microscopy

  10. Clojure high performance programming

    CERN Document Server

    Kumar, Shantanu

    2013-01-01

    This is a short, practical guide that will teach you everything you need to know to start writing high performance Clojure code.This book is ideal for intermediate Clojure developers who are looking to get a good grip on how to achieve optimum performance. You should already have some experience with Clojure and it would help if you already know a little bit of Java. Knowledge of performance analysis and engineering is not required. For hands-on practice, you should have access to Clojure REPL with Leiningen.

  11. High performance data transfer

    Science.gov (United States)

    Cottrell, R.; Fang, C.; Hanushevsky, A.; Kreuger, W.; Yang, W.

    2017-10-01

    The exponentially increasing need for high speed data transfer is driven by big data and cloud computing, together with the needs of data-intensive science, High Performance Computing (HPC), defense, the oil and gas industry, etc. We report on the Zettar ZX software. This has been developed since 2013 to meet these growing needs by providing high performance data transfer and encryption in a scalable, balanced, easy to deploy and use way while minimizing power and space utilization. In collaboration with several commercial vendors, Proofs of Concept (PoC) consisting of clusters have been put together using off-the-shelf components to test the ZX scalability and ability to balance services using multiple cores and links. The PoCs are based on SSD flash storage that is managed by a parallel file system. Each cluster occupies 4 rack units. Using the PoCs, between clusters we have achieved almost 200 Gbps memory to memory over two 100 Gbps links, and 70 Gbps parallel file to parallel file with encryption over a 5000 mile 100 Gbps link.

  12. High performance sapphire windows

    Science.gov (United States)

    Bates, Stephen C.; Liou, Larry

    1993-02-01

    High-quality, wide-aperture optical access is usually required for the advanced laser diagnostics that can now make a wide variety of non-intrusive measurements of combustion processes. Specially processed and mounted sapphire windows are proposed to provide this optical access to extreme environments. Through surface treatments and proper thermal stress design, single crystal sapphire can be a mechanically equivalent replacement for high strength steel. A prototype sapphire window and mounting system have been developed in a successful NASA SBIR Phase 1 project. A large and reliable increase in sapphire design strength (as much as 10x) has been achieved, and the initial specifications necessary for these gains have been defined. Failure testing of small windows has conclusively demonstrated the increased sapphire strength, indicating that a nearly flawless surface polish is the primary cause of strengthening, while an unusual mounting arrangement also significantly contributes to a larger effective strength. Phase 2 work will complete specification and demonstration of these windows, and will fabricate a set for use at NASA. The enhanced capabilities of these high performance sapphire windows will lead to many diagnostic capabilities not previously possible, as well as new applications for sapphire.

  13. Astrophysical Hydrodynamics An Introduction

    CERN Document Server

    Shore, Steven N

    2007-01-01

    This latest edition of the proven and comprehensive treatment on the topic -- from the bestselling author of "Tapestry of Modern Astrophysics" -- has been updated and revised to reflect the newest research results. Suitable for AS0000 and AS0200 courses, as well as advanced astrophysics and astronomy lectures, this is an indispensable theoretical backup for studies on celestial body formation and astrophysics. Includes exercises with solutions.

  14. Gravity, particles and astrophysics

    International Nuclear Information System (INIS)

    Wesson, P.S.

    1980-01-01

    The author deals with the relationship between gravitation and elementary particle physics, and the implications of these subjects for astrophysics. The text is split up into two parts. The first part represents a relatively non-technical overview of the subject, while the second part represents a technical examination of the most important aspects of non-Einsteinian gravitational theory and its relation to astrophysics. Relevant references from the fields of gravitation, elementary particle theory and astrophysics are included. (Auth.)

  15. Particle Physics & Astrophysics (PPA)

    Data.gov (United States)

    Federal Laboratory Consortium — Scientists at SLAC's Particle Physics and Astrophysics develop and utilize unique instruments from underground to outer space to explore the ultimate laws of nature...

  16. RavenDB high performance

    CERN Document Server

    Ritchie, Brian

    2013-01-01

    RavenDB High Performance is a comprehensive yet concise tutorial that developers can use to… This book is for developers & software architects who are designing systems in order to achieve high performance right from the start. A basic understanding of RavenDB is recommended, but not required. While the book focuses on advanced topics, it does not assume that the reader has a great deal of prior knowledge of working with RavenDB.

  17. Astrophysical Institute, Potsdam

    Science.gov (United States)

    Murdin, P.

    2000-11-01

    Built upon a tradition of almost 300 years, the Astrophysical Institute Potsdam (AIP) is in an historical sense the successor of one of the oldest astronomical observatories in Germany. It is the first institute in the world which incorporated the term `astrophysical' in its name, and is connected with distinguished scientists such as Karl Schwarzschild and Albert Einstein. The AIP constitutes on...

  18. Black hole astrophysics

    International Nuclear Information System (INIS)

    Blandford, R.D.; Thorne, K.S.

    1979-01-01

    Following an introductory section, the subject is discussed under the headings: on the character of research in black hole astrophysics; isolated holes produced by collapse of normal stars; black holes in binary systems; black holes in globular clusters; black holes in quasars and active galactic nuclei; primordial black holes; concluding remarks on the present state of research in black hole astrophysics. (U.K.)

  19. Plasma in astrophysics

    International Nuclear Information System (INIS)

    Kulsrud, R.M.

    1982-10-01

    Two examples of plasma phenomena of importance to astrophysics are reviewed. These are examples where astrophysical understanding hinges on further progress in plasma physics understanding. The two examples are magnetic reconnection and the collisionless interaction between a population of energetic particles and a cooler gas or plasma, in particular the interaction between galactic cosmic rays and the interstellar medium

  20. "Journey to the Stars": Presenting What Stars Are to Global Planetarium Audiences by Blending Astrophysical Visualizations Into a Single Immersive Production at the American Museum of Natural History

    Science.gov (United States)

    Emmart, Carter; Mac Low, M.; Oppenheimer, B. R.; Kinzler, R.; Paglione, T. A. D.; Abbott, B. P.

    2010-01-01

    "Journey to the Stars" is the latest and fourth space show based on storytelling from data visualization at the Rose Center for Earth and Space at the American Museum of Natural History. This twenty five minute, full dome movie production presents to planetarium audiences what the stars are, where they come from, how they vary in type and over time, and why they are important to life of Earth. Over forty scientists from around the world contributed their research to what is visualized into roughly fifteen major scenes. How this production is directed into a consolidated immersive informal science experience with learning goals is an integrative process with many inputs and concerns for scientific accuracy. The goal is a seamless merger of visualizations at varying spatial and temporal scales with acuity toward depth perception, revealing unseen phenomena, and the layering of concepts together to build an understanding of stars; to blend our common experience of them in the sky with the uncommon meaning we have come to know through science. Scripted by Louise Gikow who has worked for Children's Television Workshop, narrated by Whoopie Goldberg, and musically scored by Robert Miller, this production strives to guide audiences through challenging scientific concepts by complimenting the natural beauty the subject matter presents with understandable prose and musical grandeur. "Journey to the Stars" was produced in cooperation with NASA's Science Mission Directorate, Heliophysics Division and is in release at major planetariums, worldwide.

  1. High Performance Bulk Thermoelectric Materials

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Zhifeng [Boston College, Chestnut Hill, MA (United States)

    2013-03-31

    Over more than 13 years, we have carried out research on the electron pairing symmetry of superconductors, the growth and field emission properties of carbon nanotubes and semiconducting nanowires, high performance thermoelectric materials, and other interesting materials. As a result of this research, we have published 104 papers and have trained six undergraduate students, twenty graduate students, nine postdocs, nine visitors, and one technician.

  2. High-Performance Operating Systems

    DEFF Research Database (Denmark)

    Sharp, Robin

    1999-01-01

    Notes prepared for the DTU course 49421 "High Performance Operating Systems". The notes deal with quantitative and qualitative techniques for use in the design and evaluation of operating systems in computer systems for which performance is an important parameter, such as real-time applications......, communication systems and multimedia systems....

  3. Astrophysics in a nutshell

    CERN Document Server

    Maoz, Dan

    2007-01-01

    A concise but thorough introduction to the observational data and theoretical concepts underlying modern astronomy, Astrophysics in a Nutshell is designed for advanced undergraduate science majors taking a one-semester course. This well-balanced and up-to-date textbook covers the essentials of modern astrophysics--from stars to cosmology--emphasizing the common, familiar physical principles that govern astronomical phenomena, and the interplay between theory and observation. In addition to traditional topics such as stellar remnants, galaxies, and the interstellar medium, Astrophysics in a N

  4. An invitation to astrophysics

    CERN Document Server

    Padmanabhan, Thanu

    2006-01-01

    This unique book provides a clear and lucid description of several aspects of astrophysics and cosmology in a language understandable to a physicist or beginner in astrophysics. It presents the key topics in all branches of astrophysics and cosmology in a simple and concise language. The emphasis is on currently active research areas and exciting new frontiers rather than on more pedantic topics. Many complicated results are introduced with simple, novel derivations which strengthen the conceptual understanding of the subject. The book also contains over one hundred exercises which will help s

  5. A Linux Workstation for High Performance Graphics

    Science.gov (United States)

    Geist, Robert; Westall, James

    2000-01-01

    The primary goal of this effort was to provide a low-cost method of obtaining high-performance 3-D graphics using an industry standard library (OpenGL) on PC class computers. Previously, users interested in doing substantial visualization or graphical manipulation were constrained to using specialized, custom hardware most often found in computers from Silicon Graphics (SGI). We provided an alternative to expensive SGI hardware by taking advantage of third-party, 3-D graphics accelerators that have now become available at very affordable prices. To make use of this hardware our goal was to provide a free, redistributable, and fully-compatible OpenGL work-alike library so that existing bodies of code could simply be recompiled for PC class machines running a free version of Unix. This should allow substantial cost savings while greatly expanding the population of people with access to a serious graphics development and viewing environment. This should offer a means for NASA to provide a spectrum of graphics performance to its scientists, supplying high-end specialized SGI hardware for high-performance visualization while fulfilling the requirements of medium and lower performance applications with generic, off-the-shelf components and still maintaining compatibility between the two.

  6. High performance in software development

    CERN Multimedia

    CERN. Geneva; Haapio, Petri; Liukkonen, Juha-Matti

    2015-01-01

    What are the ingredients of high-performing software? Software development, especially for large high-performance systems, is one the most complex tasks mankind has ever tried. Technological change leads to huge opportunities but challenges our old ways of working. Processing large data sets, possibly in real time or with other tight computational constraints, requires an efficient solution architecture. Efficiency requirements span from the distributed storage and large-scale organization of computation and data onto the lowest level of processor and data bus behavior. Integrating performance behavior over these levels is especially important when the computation is resource-bounded, as it is in numerics: physical simulation, machine learning, estimation of statistical models, etc. For example, memory locality and utilization of vector processing are essential for harnessing the computing power of modern processor architectures due to the deep memory hierarchies of modern general-purpose computers. As a r...

  7. Identifying High Performance ERP Projects

    OpenAIRE

    Stensrud, Erik; Myrtveit, Ingunn

    2002-01-01

    Learning from high performance projects is crucial for software process improvement. Therefore, we need to identify outstanding projects that may serve as role models. It is common to measure productivity as an indicator of performance. It is vital that productivity measurements deal correctly with variable returns to scale and multivariate data. Software projects generally exhibit variable returns to scale, and the output from ERP projects is multivariate. We propose to use Data Envelopment ...

  8. Neo4j high performance

    CERN Document Server

    Raj, Sonal

    2015-01-01

    If you are a professional or enthusiast who has a basic understanding of graphs or has basic knowledge of Neo4j operations, this is the book for you. Although it is targeted at an advanced user base, this book can be used by beginners as it touches upon the basics. So, if you are passionate about taming complex data with the help of graphs and building high performance applications, you will be able to get valuable insights from this book.

  9. High energy astrophysics

    International Nuclear Information System (INIS)

    Engel, A.R.

    1979-01-01

    High energy astrophysical research carried out at the Blackett Laboratory, Imperial College, London is reviewed. Work considered includes cosmic ray particle detection, x-ray astronomy, gamma-ray astronomy, gamma and x-ray bursts. (U.K.)

  10. 2004 ASTRONOMY & ASTROPHYSICS

    Indian Academy of Sciences (India)

    user

    This publication of the Academy on Astronomy and Astrophysics is unique in … bring out position papers on societal issues where science plays a major … funding agencies, the Astronomical Society of … orbit very close to the parent star.

  11. Topics in Nuclear Astrophysics

    International Nuclear Information System (INIS)

    Chung, K.C.

    1982-01-01

    Some topics in nuclear astrophysics are discussed, e.g.: highly evolved stellar cores, stellar evolution (through the temperature analysis of stellar surface), nucleosynthesis and finally the solar neutrino problem. (L.C.) [pt

  12. Astrophysics Decoding the cosmos

    CERN Document Server

    Irwin, Judith A

    2007-01-01

    Astrophysics: Decoding the Cosmos is an accessible introduction to the key principles and theories underlying astrophysics. This text takes a close look at the radiation and particles that we receive from astronomical objects, providing a thorough understanding of what this tells us, drawing the information together using examples to illustrate the process of astrophysics. Chapters dedicated to objects showing complex processes are written in an accessible manner and pull relevant background information together to put the subject firmly into context. The intention of the author is that the book will be a 'tool chest' for undergraduate astronomers wanting to know the how of astrophysics. Students will gain a thorough grasp of the key principles, ensuring that this often-difficult subject becomes more accessible.

  13. Collisionless plasmas in astrophysics

    CERN Document Server

    Belmont, Gerard; Mottez, Fabrice; Pantellini, Filippo; Pelletier, Guy

    2013-01-01

    Collisionless Plasmas in Astrophysics examines the unique properties of media without collisions in plasma physics. Experts in this field, the authors present the first book to concentrate on collisionless conditions in plasmas, whether close or not to thermal equilibrium. Filling a void in scientific literature, Collisionless Plasmas in Astrophysics explains the possibilities of modeling such plasmas, using a fluid or a kinetic framework. It also addresses common misconceptions that even professionals may possess, on phenomena such as "collisionless (Landau) damping". Abundant illustrations

  14. Nonlinear dynamics and astrophysics

    International Nuclear Information System (INIS)

    Vallejo, J. C.; Sanjuan, M. A. F.

    2000-01-01

    Concepts and techniques from Nonlinear Dynamics, also known as Chaos Theory, have been applied successfully to several astrophysical fields such as orbital motion, time series analysis or galactic dynamics, providing answers to old questions but also opening a few new ones. Some of these topics are described in this review article, showing the basis of Nonlinear Dynamics, and how it is applied in Astrophysics. (Author)

  15. High Performance Proactive Digital Forensics

    International Nuclear Information System (INIS)

    Alharbi, Soltan; Traore, Issa; Moa, Belaid; Weber-Jahnke, Jens

    2012-01-01

    With the increase in the number of digital crimes and in their sophistication, High Performance Computing (HPC) is becoming a must in Digital Forensics (DF). According to the FBI annual report, the size of data processed during the 2010 fiscal year reached 3,086 TB (compared to 2,334 TB in 2009), and the number of agencies that requested Regional Computer Forensics Laboratory assistance increased from 689 in 2009 to 722 in 2010. Since most investigation tools are both I/O and CPU bound, the next-generation DF tools are required to be distributed and offer HPC capabilities. The need for HPC is even more evident in investigating crimes on clouds or when proactive DF analysis and on-site investigation, requiring semi-real-time processing, are performed. Although overcoming the performance challenge is a major goal in DF, as far as we know, there is almost no research on HPC-DF except for a few papers. As such, in this work, we extend our work on the need for a proactive system and present a high performance automated proactive digital forensic system. The most expensive phase of the system, namely proactive analysis and detection, uses a parallel extension of the iterative z algorithm. It also implements new parallel information-based outlier detection algorithms to proactively and forensically handle suspicious activities. To analyse a large number of targets and events and continuously do so (to capture the dynamics of the system), we rely on a multi-resolution approach to explore the digital forensic space. A data set from the Honeynet Forensic Challenge in 2001 is used to evaluate the system from DF and HPC perspectives.
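
    As a rough, hedged illustration of statistics-based outlier flagging of the kind used in such proactive analysis (a generic z-score sketch, not the cited "iterative z algorithm" or its information-based detectors), the following Python snippet scores batches of event counts in parallel across CPU cores; the batch contents and the threshold are assumptions made for the example.

      # Rough sketch of parallel z-score outlier flagging over event batches.
      # Generic illustration only; the threshold and data layout are assumptions.
      from multiprocessing import Pool
      from statistics import mean, pstdev

      def flag_outliers(batch, threshold=2.0):
          """Return (index, value, z-score) for values far from the batch mean."""
          mu = mean(batch)
          sigma = pstdev(batch) or 1.0          # guard against zero spread
          return [(i, x, round((x - mu) / sigma, 2))
                  for i, x in enumerate(batch)
                  if abs(x - mu) / sigma > threshold]

      if __name__ == "__main__":
          # Hypothetical per-host event counts, split into batches.
          batches = [
              [12, 15, 11, 14, 13, 12, 300],    # 300 stands out from the rest
              [8, 9, 10, 9, 8, 9, 8],
          ]
          with Pool() as pool:
              results = pool.map(flag_outliers, batches)
          for batch_id, outliers in enumerate(results):
              print(batch_id, outliers)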

  16. Laboratory Astrophysics Prize: Laboratory Astrophysics with Nuclei

    Science.gov (United States)

    Wiescher, Michael

    2018-06-01

    Nuclear astrophysics is concerned with nuclear reaction and decay processes, from the Big Bang to the present star generation, that control the chemical evolution of our universe. Such nuclear reactions maintain stellar life, determine stellar evolution, and finally drive stellar explosions in the cycle of stellar life. Laboratory nuclear astrophysics seeks to simulate and understand the underlying processes using a broad portfolio of nuclear instrumentation, from reactors to accelerators and from stable to radioactive beams, to map the broad spectrum of nucleosynthesis processes. This talk focuses on only two aspects of the broad field: the need for deep underground accelerator facilities in cosmic-ray-free environments in order to understand nucleosynthesis in stars, and the need for high intensity radioactive beam facilities to recreate the conditions found in stellar explosions. Both concepts represent the two main frontiers of the field, which are being pursued in the US with the CASPAR accelerator at the Sanford Underground Research Facility in South Dakota and the FRIB facility at Michigan State University.

  17. Toward observational neutrino astrophysics

    International Nuclear Information System (INIS)

    Koshiba, M.

    1988-01-01

    It is true that (1) the first observation of the neutrino burst from the supernova SN1987A by Kamiokande-II, which was immediately confirmed by IMB, and (2) the first real-time, directional, and spectral observation of solar 8B neutrinos, also by Kamiokande-II, could perhaps be considered as signalling the birth of observational neutrino astrophysics. The field, however, is still in its infancy and is crying out for tender loving care. Namely, while the construction of astronomy requires the time and the direction of the signal, and that of astrophysics requires, in addition, the spectral information, the observations of (1) could not give the directional information, and the results of both (1) and (2) still suffer from meager statistics. How do we remedy this situation to let this newborn science of observational neutrino astrophysics grow healthy? This is what the author addresses in this talk. 15 refs., 8 figs

  18. Numerical simulation in astrophysics

    International Nuclear Information System (INIS)

    Miyama, Shoken

    1985-01-01

    There have been many numerical simulations of hydrodynamical problems in astrophysics, e.g. processes of star formation, supernova explosions and the formation of neutron stars, and the general relativistic collapse of stars to form black holes. The codes are made to be suitable for computing such problems. Astrophysical hydrodynamical problems have characteristic features: self-gravity or external gravity acting, scales that are very large or very small, objects changing on short periods or long time scales, and magnetic and/or centrifugal forces acting. In this paper, we present one class of numerical simulation methods that may satisfy these requirements, the so-called smoothed particle methods. We then introduce the methods briefly. Then, we show one application of the methods to an astrophysical problem (fragmentation and collapse of a rotating isothermal cloud). (Mori, K.)
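
    A flavour of such smoothed particle methods can be conveyed with a short, illustrative sketch: the density at a point is estimated by summing neighbouring particle masses weighted by a smoothing kernel. The one-dimensional cubic-spline kernel below is one common choice; this is a generic illustration of the technique, not the code described in the record, and the particle data are assumptions.

      # Minimal 1D smoothed-particle density estimate with a cubic-spline kernel.
      # Illustrative only; real astrophysical SPH codes work in 3D and add
      # neighbour search, variable smoothing lengths, gravity and hydrodynamics.
      def cubic_spline_kernel(r, h):
          """Standard cubic-spline kernel W(r, h) in one dimension."""
          q = abs(r) / h
          sigma = 2.0 / (3.0 * h)               # 1D normalization constant
          if q < 1.0:
              return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
          if q < 2.0:
              return sigma * 0.25 * (2.0 - q)**3
          return 0.0

      def sph_density(x, positions, masses, h):
          """Density estimate at position x from particle positions and masses."""
          return sum(m * cubic_spline_kernel(x - xi, h)
                     for xi, m in zip(positions, masses))

      # Toy example: equal-mass particles spaced 0.5 apart give a density near 2.
      positions = [0.0, 0.5, 1.0, 1.5, 2.0]
      masses = [1.0] * len(positions)
      print(sph_density(1.0, positions, masses, h=1.0))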

  19. Nuclear reactions in astrophysics

    International Nuclear Information System (INIS)

    Arnould, M.; Rayet, M.

    1990-01-01

    At all times and at all astrophysical scales, nuclear reactions have played and continue to play a key role. This concerns the energetics as well as the production of nuclides (nucleosynthesis). After a brief review of the observed composition of various objects in the universe, and especially of the solar system, the basic ingredients that are required in order to build up models for the chemical evolution of galaxies are sketched. Special attention is paid to the evaluation of the stellar yields through an overview of the important burning episodes and nucleosynthetic processes that can develop in non-exploding or exploding stars. Emphasis is put on the remaining astrophysical and nuclear physics uncertainties that hamper a clear understanding of the observed characteristics, and especially compositions, of a large variety of astrophysical objects

  20. The new astrophysics

    International Nuclear Information System (INIS)

    Longair, M.

    1989-01-01

    The author offers a review of advances in astrophysics since 1945 when astronomers started to explore the universe beyond the bounds of the optical wavelength of the electromagnetic spectrum, especially in the fields of radio, x ray and gamma ray, cosmic ray, ultraviolet and infrared astronomies, as well as neutral hydrogen and molecular line studies. Theoretical and technological advances have also kept pace. An overview of the new astrophysics is offered focusing on the large-scale distribution of matter and the background microwave radiation, galaxies, stellar evolution and the interstellar media (dust, gas and high energy particles). Nucleosynthesis in stars is mentioned in a broader discussion of stellar evolution, and dead stars including supernovae. Active galaxies and quasars are discussed. After considering what should be included in astrophysical cosmology, the author looks to the future of the science. (U.K.)

  1. Astrophysics Update 2

    CERN Document Server

    Mason, John W

    2006-01-01

    "Astrophysics Updates" is intended to serve the information needs of professional astronomers and postgraduate students about areas of astronomy, astrophysics and cosmology that are rich and active research spheres. Observational methods and the latest results of astronomical research are presented as well as their theoretical foundations and interrelations. The contributed commissioned articles are written by leading exponents in a format that will appeal to professional astronomers and astrophysicists who are interested in topics outside their own specific areas of research. This collection of timely reviews may also attract the interest of advanced amateur astronomers seeking scientifically rigorous coverage.

  2. Astrophysical opacity library

    International Nuclear Information System (INIS)

    Huebner, W.F.; Merts, A.L.; Magee, N.H. Jr.; Argo, M.F.

    1977-08-01

    The astrophysical elements opacity library includes equation of state data, various mean opacities, and 2000 values of the frequency-dependent extinction coefficients in equally spaced intervals of u ≡ hν/kT from 0 to 20, for 41 degeneracy parameters η from -28 (nondegenerate) to 500 and 46 temperatures kT from 1 eV to 100 keV. Among the available auxiliary quantities are the free electron density, mass density, and plasma cutoff frequency. A library-associated program can produce opacities for mixtures with up to 20 astrophysically abundant constituent elements at 4 levels of utility for the user
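
    To make the tabulation scheme concrete, the sketch below (with placeholder table values, not the library's actual data or format) builds the equally spaced grid in u ≡ hν/kT described above and interpolates a frequency-dependent extinction coefficient at an arbitrary photon energy and temperature.

      # Sketch of looking up a frequency-dependent extinction coefficient on an
      # equally spaced grid in u = h*nu / (k*T), 0 <= u <= 20, as described above.
      # The tabulated values here are hypothetical placeholders, not library data.
      import numpy as np

      N_POINTS = 2000
      u_grid = np.linspace(0.0, 20.0, N_POINTS)          # equally spaced in u

      # Placeholder extinction coefficients for one (kT, eta) pair; a real
      # application would read the tabulated values from the library instead.
      extinction = np.exp(-u_grid) + 1e-3

      def extinction_at(u):
          """Linearly interpolate the tabulated extinction coefficient at u."""
          return np.interp(u, u_grid, extinction)

      # Example: photon energy h*nu = 5 keV in a plasma with kT = 2 keV -> u = 2.5
      h_nu_keV, kT_keV = 5.0, 2.0
      print(extinction_at(h_nu_keV / kT_keV))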

  3. Theoretical astrophysics an introduction

    CERN Document Server

    Bartelmann, Matthias

    2013-01-01

    A concise yet comprehensive introduction to the central theoretical concepts of modern astrophysics, presenting hydrodynamics, radiation, and stellar dynamics all in one textbook. Adopting a modular structure, the author illustrates a small number of fundamental physical methods and principles, which are sufficient to describe and understand a wide range of seemingly very diverse astrophysical phenomena and processes. For example, the formulae that define the macroscopic behavior of stellar systems are all derived in the same way from the microscopic distribution function. This function it

  4. Astrophysics in a nutshell

    CERN Document Server

    Maoz, Dan

    2016-01-01

    Winner of the American Astronomical Society's Chambliss Award, Astrophysics in a Nutshell has become the text of choice in astrophysics courses for science majors at top universities in North America and beyond. In this expanded and fully updated second edition, the book gets even better, with a new chapter on extrasolar planets; a greatly expanded chapter on the interstellar medium; fully updated facts and figures on all subjects, from the observed properties of white dwarfs to the latest results from precision cosmology; and additional instructive problem sets. Throughout, the text features the same focused, concise style and emphasis on physics intuition that have made the book a favorite of students and teachers.

  5. Introduction to Nuclear Astrophysics

    International Nuclear Information System (INIS)

    Iliadis, Christian

    2010-01-01

    In the first lecture of this volume, we will present the basic fundamental ideas regarding nuclear processes occurring in stars. We start from stellar observations, will then elaborate on some important quantum-mechanical phenomena governing nuclear reactions, continue with how nuclear reactions proceed in a hot stellar plasma and, finally, we will provide an overview of stellar burning stages. At the end, the current knowledge regarding the origin of the elements is briefly summarized. This lecture is directed towards the student of nuclear astrophysics. Our intention is to present seemingly unrelated phenomena of nuclear physics and astrophysics in a coherent framework.

  6. Development of high performance cladding

    International Nuclear Information System (INIS)

    Kiuchi, Kiyoshi

    2003-01-01

    The development of a superior next-generation light water reactor is called for, from the general viewpoints of improved safety, economics, reduction of radioactive waste and effective utilization of plutonium, by 2030, when conventional reactor plants should be renovated. The Japan Atomic Energy Research Institute is carrying out improvement of stainless steel cladding for conventional high burn-up reactors to more than 100 GWd/t, development of manufacturing technology for the reduced-moderation light water reactor (RMWR) with a breeding ratio beyond 1.0, and research on water-materials interaction in the supercritical-pressure water cooled reactor. A stable austenitic stainless steel has been selected for the fuel element cladding of the advanced boiling water reactor (ABWR); the austenitic stainless steel is superior in irradiation resistance, corrosion resistance and mechanical strength. A hard neutron spectrum, above 0.1 MeV, occurs in the core of the reduced-moderation light water reactor, as in the liquid metal fast breeder reactor (LMFBR). High performance cladding for the RMWR fuel elements is likewise required to provide irradiation resistance, corrosion resistance and mechanical strength. Slow strain rate tests (SSRT) of SUS 304 and SUS 316 are carried out to study stress corrosion cracking (SCC). Irradiation tests in an LMFBR are intended to obtain data on irradiation damage to the cladding materials. (M. Suetake)

  7. High performance fuel technology development

    Energy Technology Data Exchange (ETDEWEB)

    Koon, Yang Hyun; Kim, Keon Sik; Park, Jeong Yong; Yang, Yong Sik; In, Wang Kee; Kim, Hyung Kyu [KAERI, Daejeon (Korea, Republic of)

    2012-01-15

    ○ Development of High Plasticity and Annular Pellet - Development of strong candidates of ultra high burn-up fuel pellets for a PCI remedy - Development of fabrication technology of annular fuel pellet ○ Development of High Performance Cladding Materials - Irradiation test of HANA claddings in Halden research reactor and the evaluation of the in-pile performance - Development of the final candidates for the next generation cladding materials. - Development of the manufacturing technology for the dual-cooled fuel cladding tubes. ○ Irradiated Fuel Performance Evaluation Technology Development - Development of performance analysis code system for the dual-cooled fuel - Development of fuel performance-proving technology ○ Feasibility Studies on Dual-Cooled Annular Fuel Core - Analysis on the property of a reactor core with dual-cooled fuel - Feasibility evaluation on the dual-cooled fuel core ○ Development of Design Technology for Dual-Cooled Fuel Structure - Definition of technical issues and invention of concept for dual-cooled fuel structure - Basic design and development of main structure components for dual-cooled fuel - Basic design of a dual-cooled fuel rod.

  8. Astrophysical Russian Dolls

    OpenAIRE

    Loeb, Abraham; Imara, Nia

    2017-01-01

    Are there examples of "astrophysical Russian dolls," and what could we learn from their similarities? In this article, we list a few such examples, including disks, filaments, and clusters. We suggest that forging connections across disciplinary borders enhances our perception of beauty, while simultaneously leading to a more comprehensive understanding of the Universe.

  9. High energy astrophysics

    International Nuclear Information System (INIS)

    Shklorsky, I.S.

    1979-01-01

    A selected list of accessible recent review articles and conference reports, wherein up-to-date summaries of various topics in the field of high energy astrophysics can be found, is presented. A special report outlines work done in the Soviet Union in this area. (Auth.)

  10. The NASA Astrophysics Program

    Science.gov (United States)

    Zebulum, Ricardo S.

    2011-01-01

    NASA's scientists are enjoying unprecedented access to astronomy data from space, both from missions launched and operated only by NASA, as well as missions led by other space agencies to which NASA contributed instruments or technology. This paper describes the NASA astrophysics program for the next decade, including NASA's response to the ASTRO2010 Decadal Survey.

  11. The usage of numerical code FLASH in plasma astrophysics

    OpenAIRE

    BROŽ, Jaroslav

    2013-01-01

    My diploma thesis focuses on the use of numerical computer codes for simulation in plasma astrophysics. Readers will learn the basic characteristics of the Sun, with a closer focus on the solar corona and the coronal heating problem. The following section is devoted to simulation software in plasma astrophysics, its installation, and the display of results using visualization software. The conclusion demonstrates the use of this software on a model example and a simulation that performs s...
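
    For readers who want to try this kind of workflow themselves, one commonly used route (not necessarily the tool chain of the thesis) is the yt package, which reads FLASH HDF5 plot files directly; the plot-file name below is a hypothetical placeholder.

      # Minimal sketch: visualize a FLASH HDF5 plot file with the yt package.
      # yt is one common option for this task, not necessarily what the thesis
      # used; the plot-file name is a hypothetical placeholder.
      import yt

      ds = yt.load("sedov_hdf5_plt_cnt_0100")           # FLASH plot file (assumed path)
      slc = yt.SlicePlot(ds, "z", ("gas", "density"))   # density slice through the domain
      slc.annotate_timestamp()
      slc.save("flash_density_slice.png")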

  12. Encyclopedia of Astronomy and Astrophysics

    CERN Document Server

    2002-01-01

    The Encyclopedia of Astronomy and Astrophysics covers 30 major subject areas, such as Active galaxies, Astrometry, Astrophysical theory, Atmospheres, Binary stars, Biography, Clusters, Coordinates, Cosmology, Earth, Education, Galaxies, Interstellar medium, Light, Magnetosphere, Matter, Planet Earth, Public Impact, Solar Activity, Solar Heliosphere, Solar Interior, Solar Systems, Space, Stellar Astrophysics, Stellar Populations, Telescopes, Time

  13. Astronomical optical interferometry, II: Astrophysical results

    Directory of Open Access Journals (Sweden)

    Jankov S.

    2011-01-01

    Optical interferometry is entering a new age, with several ground-based long-baseline observatories now making observations of unprecedented spatial resolution. Based on a great leap forward in the quality and quantity of interferometric data, the astrophysical applications are no longer limited to classical subjects, such as the determination of fundamental properties of stars, namely their effective temperatures, radii, luminosities and masses; the present rapid development in this field has led to a situation where optical interferometry is a general tool in studies of many astrophysical phenomena. In particular, the advent of long-baseline interferometers making use of very large pupils has opened the way to faint-object science, and first results on extragalactic objects have made it a reality. The first decade of the XXI century is also remarkable for aperture synthesis in the visual and near-infrared wavelength regimes, which has provided image reconstructions from stellar surfaces to Active Galactic Nuclei. Here I review the numerous astrophysical results obtained to date, except for milliarcsecond astrometry of binary and multiple stars, which should be the subject of an independent detailed review, taking into account its importance and the results expected at the microarcsecond precision level. For the results obtained with currently available interferometers, I also give the adopted instrumental settings, in order to provide a guide for potential users concerning the appropriate instruments that can be used to obtain the desired astrophysical information.

  14. Indirect techniques in nuclear astrophysics

    International Nuclear Information System (INIS)

    Mukhamedzhanov, A.M.; Tribble, R.E.; Blokhintsev, L.D.; Cherubini, S.; Spitaleri, C.; Kroha, V.; Nunes, F.M.

    2005-01-01

    It is very difficult, and often impossible, to measure nuclear cross sections at astrophysically relevant energies under laboratory conditions. That is why different indirect techniques are used to extract astrophysical information. In this talk, different experimental possibilities for obtaining astrophysical information using radioactive and stable beams will be addressed: 1. The asymptotic normalization coefficient (ANC) method. 2. Radiative neutron captures are determined by the spectroscopic factors (SP); a new experimental technique to determine the neutron SPs will be addressed. 3. The 'Trojan Horse' method is another unique indirect method, which allows one to extract the astrophysical factors for direct and resonant nuclear reactions at astrophysically relevant energies. (author)
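
    For orientation, the "astrophysical factor" mentioned in point 3 is related to the charged-particle cross section through the textbook relation sigma(E) = S(E)/E * exp(-2*pi*eta), where the Gamow factor is approximately 2*pi*eta = 31.29 Z1 Z2 (mu/E)^(1/2) with mu in amu and E in keV. The short sketch below applies this standard relation with purely illustrative numbers that are not taken from the talk.

      # Sketch of the standard relation between the astrophysical S-factor S(E)
      # and the charged-particle cross section:
      #   sigma(E) = S(E) / E * exp(-2*pi*eta),
      #   2*pi*eta ~= 31.29 * Z1 * Z2 * sqrt(mu / E)   (mu in amu, E in keV)
      # Textbook approximation; all numbers below are purely illustrative.
      import math

      def cross_section_from_s_factor(S_keV_barn, E_keV, Z1, Z2, mu_amu):
          """Return sigma in barn, given S(E) in keV*barn at c.m. energy E in keV."""
          two_pi_eta = 31.29 * Z1 * Z2 * math.sqrt(mu_amu / E_keV)
          return (S_keV_barn / E_keV) * math.exp(-two_pi_eta)

      # Illustrative values loosely resembling a proton-capture system.
      print(cross_section_from_s_factor(S_keV_barn=0.02, E_keV=500.0,
                                        Z1=1, Z2=4, mu_amu=0.88))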

  15. Scaling law in laboratory astrophysics

    International Nuclear Information System (INIS)

    Xia Jiangfan; Zhang Jie

    2001-01-01

    The use of state-of-the-art lasers makes it possible to produce, in the laboratory, extreme conditions similar to those in astrophysical processes. The introduction of astrophysics-relevant ideas into laser-plasma interaction experiments aids the understanding of astrophysical phenomena. However, the great difference in scale between laser-produced plasmas and astrophysical objects makes it awkward to model the latter by laser-plasma experiments. The author presents the physical reasons for modeling astrophysical plasmas by laser plasmas, connecting these two kinds of plasmas by scaling laws. This allows the creation of experimental test beds where observations and models can be quantitatively compared with laboratory data
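
    As a concrete, if simplified, flavour of such scaling laws, the sketch below applies the ideal-hydrodynamic (Euler) similarity frequently invoked in laboratory astrophysics: two systems with the same value of v*sqrt(rho/p) evolve identically once lengths, densities and pressures are rescaled, with characteristic times scaling as t ~ L/v. The choice of similarity and all numerical values are illustrative assumptions, not taken from the article.

      # Sketch of the ideal-hydrodynamic (Euler) similarity used in laboratory
      # astrophysics: systems with equal Euler number Eu = v * sqrt(rho / p)
      # evolve alike once rescaled, with characteristic times t ~ L / v.
      # All numbers below are illustrative assumptions (SI units).
      import math

      def euler_number(velocity, density, pressure):
          return velocity * math.sqrt(density / pressure)

      def scaled_time(t_astro, L_astro, v_astro, L_lab, v_lab):
          """Map an astrophysical time scale onto the laboratory system."""
          return t_astro * (L_lab / L_astro) * (v_astro / v_lab)

      # Supernova-remnant-like system (illustrative values only).
      astro = dict(L=1e15, v=2e6, rho=1e-20, p=4e-8)
      # Laser-driven laboratory plasma (illustrative values only).
      lab = dict(L=1e-4, v=5e4, rho=1.0, p=2.5e9)

      print("Eu_astro =", euler_number(astro["v"], astro["rho"], astro["p"]))
      print("Eu_lab   =", euler_number(lab["v"], lab["rho"], lab["p"]))
      print("1000 yr maps to",
            scaled_time(1000 * 3.15e7, astro["L"], astro["v"], lab["L"], lab["v"]),
            "s in the laboratory")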

  16. High performance light water reactor

    International Nuclear Information System (INIS)

    Squarer, D.; Schulenberg, T.; Struwe, D.; Oka, Y.; Bittermann, D.; Aksan, N.; Maraczy, C.; Kyrki-Rajamaeki, R.; Souyri, A.; Dumaz, P.

    2003-01-01

    The objective of the high performance light water reactor (HPLWR) project is to assess the merit and economic feasibility of a high-efficiency LWR operating in the thermodynamically supercritical regime. An efficiency of approximately 44% is expected. To accomplish this objective, a highly qualified team of European research institutes and industrial partners, together with the University of Tokyo, is assessing the major issues pertaining to a new reactor concept, under the co-sponsorship of the European Commission. The assessment has emphasized the recent advancement achieved in this area by Japan. Additionally, it accounts for advanced European reactor design requirements, recent improvements, practical design aspects, availability of plant components and the availability of high temperature materials. The final objective of this project is to reach a conclusion on the potential of the HPLWR to help sustain the nuclear option, by supplying competitively priced electricity, as well as to continue the nuclear competence in LWR technology. The following is a brief summary of the main project achievements: (i) a state-of-the-art review of supercritical water-cooled reactors has been performed for the HPLWR project; (ii) extensive studies have been performed over the last 10 years by the University of Tokyo, and a 'reference design' developed by the University of Tokyo was therefore selected in order to assess the available technological tools (i.e. computer codes, analyses, advanced materials, water chemistry, etc.), with design data and results of the analysis supplied by the University of Tokyo; (iii) a benchmark problem based on the 'reference design' was defined for neutronics calculations, and several partners of the HPLWR project carried out independent analyses whose results, which in addition help to 'calibrate' the codes, have guided the assessment of the core and the design of an improved HPLWR fuel assembly; (iv) a preliminary selection was made for the HPLWR scale

  17. High Time Resolution Astrophysics

    CERN Document Server

    Phelan, Don; Shearer, Andrew

    2008-01-01

    High Time Resolution Astrophysics (HTRA) is an important new window to the universe and a vital tool in understanding a range of phenomena from diverse objects and radiative processes. This importance is demonstrated in this volume with the description of a number of topics in astrophysics, including quantum optics, cataclysmic variables, pulsars, X-ray binaries and stellar pulsations to name a few. Underlining this science foundation, technological developments in both instrumentation and detectors are described. These instruments and detectors combined cover a wide range of timescales and can measure fluxes, spectra and polarisation. These advances make it possible for HTRA to make a big contribution to our understanding of the Universe in the next decade.

  18. Astrophysics a new approach

    CERN Document Server

    Kundt, Wolfgang

    2005-01-01

    For a quantitative understanding of the physics of the universe - from the solar system through the milky way to clusters of galaxies all the way to cosmology - these edited lecture notes are perhaps among the most concise and also among the most critical ones: Astrophysics has not yet stood the redundancy test of laboratory physics, hence should be wary of early interpretations. Special chapters are devoted to magnetic and radiation processes, supernovae, disks, black-hole candidacy, bipolar flows, cosmic rays, gamma-ray bursts, image distortions, and special sources. At the same time, planet earth is viewed as the arena for life, with plants and animals having evolved to homo sapiens during cosmic time. -- This text is unique in covering the basic qualitative and quantitative tools, formulae as well as numbers, needed for the precise interpretation of frontline phenomena in astrophysical research. The author compares mainstream interpretations with new and even controversial ones he wishes to emphasize. The...

  19. Astrophysics of Red Supergiants

    Science.gov (United States)

    Levesque, Emily M.

    2017-12-01

    'Astrophysics of Red Supergiants' is the first book of its kind devoted to our current knowledge of red supergiant stars, a key evolutionary phase that is critical to our larger understanding of massive stars. It provides a comprehensive overview of the fundamental physical properties of red supergiants, their evolution, and their extragalactic and cosmological applications. It serves as a reference for researchers from a broad range of fields (including stellar astrophysics, supernovae, and high-redshift galaxies) who are interested in red supergiants as extreme stages of stellar evolution, dust producers, supernova progenitors, extragalactic metallicity indicators, members of massive binaries and mergers, or simply as compelling objects in their own right. The book is accessible to a range of experience levels, from graduate students up to senior researchers.

  20. Astrophysical black holes

    CERN Document Server

    Gorini, Vittorio; Moschella, Ugo; Treves, Aldo; Colpi, Monica

    2016-01-01

    Based on graduate school lectures in contemporary relativity and gravitational physics, this book gives a complete and unified picture of the present status of theoretical and observational properties of astrophysical black holes. The chapters are written by internationally recognized specialists. They cover general theoretical aspects of black hole astrophysics, the theory of accretion and ejection of gas and jets, stellar-sized black holes observed in the Milky Way, the formation and evolution of supermassive black holes in galactic centers and quasars as well as their influence on the dynamics in galactic nuclei. The final chapter addresses analytical relativity of black holes supporting theoretical understanding of the coalescence of black holes as well as being of great relevance in identifying gravitational wave signals. With its introductory chapters the book is aimed at advanced graduate and post-graduate students, but it will also be useful for specialists.

  1. Nuclear astrophysics at DRAGON

    International Nuclear Information System (INIS)

    Hager, U.

    2014-01-01

    The DRAGON recoil separator is located at the ISAC facility at TRIUMF, Vancouver. It is designed to measure radiative alpha and proton capture reactions of astrophysical importance. Over the last few years, the DRAGON collaboration has measured several reactions using both radioactive and high-intensity stable beams. For example, the 16O(α,γ) cross section was recently measured. The reaction plays a role in steady-state helium burning in massive stars, where it follows the 12C(α,γ) reaction. At astrophysically relevant energies, the reaction proceeds exclusively via direct capture, resulting in a low rate. In this measurement, the unique capabilities of DRAGON enabled determination not only of the total reaction rates, but also of decay branching ratios. In addition, results from other recent measurements will be presented

  2. Allen's astrophysical quantities

    CERN Document Server

    2000-01-01

    This new, fourth, edition of Allen's classic Astrophysical Quantities belongs on every astronomer's bookshelf. It has been thoroughly revised and brought up to date by a team of more than ninety internationally renowned astronomers and astrophysicists. While it follows the basic format of the original, this indispensable reference has grown to more than twice the size of the earlier editions to accommodate the great strides made in astronomy and astrophysics. It includes detailed tables of the most recent data on: - General constants and units - Atoms, molecules, and spectra - Observational astronomy at all wavelengths from radio to gamma-rays, and neutrinos - Planetary astronomy: Earth, planets and satellites, and solar system small bodies - The Sun, normal stars, and stars with special characteristics - Stellar populations - Cataclysmic and symbiotic variables, supernovae - Theoretical stellar evolution - Circumstellar and interstellar material - Star clusters, galaxies, quasars, and active galactic nuclei ...

  3. Nuclear reactions in astrophysics

    International Nuclear Information System (INIS)

    Cardenas, M.

    1976-01-01

    The nuclear reactions of astrophysical interest are reviewed with regard to the explanation of problems such as the relative abundances of the elements and the structure and evolution of stars. The principal object of the study is to determine the experimental possibilities, in the field of astrophysics, of a 700 keV Van de Graaff accelerator. Approximately two hundred nuclear reactions were found for which nothing, or very little, has been done in the energy intervals of interest. Since the bombardment energies and the cross sections involved are low in some cases, there are real possibilities of obtaining important data with the above-mentioned accelerator, taking some necessary precautions. (author)

  4. The new astrophysics

    International Nuclear Information System (INIS)

    Longair, M.

    1993-01-01

    The various themes developed are: radioastronomy, X-ray and gamma-ray astronomy, cosmic ray, ultraviolet, neutral hydrogen and molecular line astronomy, optical and theoretical astronomy; the large scale distribution of matter and radiation in the universe, the galaxies, stars and stellar evolution, the interstellar medium (gas, dust) and star formation, galaxies and clusters of galaxies, active galaxies and quasars, astrophysical cosmology, the astronomy of the future. 86 figs., 60 refs

  5. Astrophysical fluid dynamics

    Science.gov (United States)

    Ogilvie, Gordon I.

    2016-06-01

    These lecture notes and example problems are based on a course given at the University of Cambridge in Part III of the Mathematical Tripos. Fluid dynamics is involved in a very wide range of astrophysical phenomena, such as the formation and internal dynamics of stars and giant planets, the workings of jets and accretion discs around stars and black holes and the dynamics of the expanding Universe. Effects that can be important in astrophysical fluids include compressibility, self-gravitation and the dynamical influence of the magnetic field that is 'frozen in' to a highly conducting plasma. The basic models introduced and applied in this course are Newtonian gas dynamics and magnetohydrodynamics (MHD) for an ideal compressible fluid. The mathematical structure of the governing equations and the associated conservation laws are explored in some detail because of their importance for both analytical and numerical methods of solution, as well as for physical interpretation. Linear and nonlinear waves, including shocks and other discontinuities, are discussed. The spherical blast wave resulting from a supernova, and involving a strong shock, is a classic problem that can be solved analytically. Steady solutions with spherical or axial symmetry reveal the physics of winds and jets from stars and discs. The linearized equations determine the oscillation modes of astrophysical bodies, as well as their stability and their response to tidal forcing.
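
    For the classic blast-wave problem mentioned above, the Sedov-Taylor self-similar solution gives the shock radius R(t) = ξ0 (E t² / ρ0)^(1/5), with ξ0 ≈ 1.15 for an ideal gas with γ = 5/3. The short sketch below evaluates this for illustrative supernova-like numbers; it is a worked example added here, not part of the lecture notes.

    ```python
    # Sedov-Taylor blast-wave radius R(t) = xi0 * (E * t**2 / rho0)**0.2, in cgs units.
    XI0 = 1.15            # dimensionless constant for an ideal gas with gamma = 5/3
    PC_IN_CM = 3.086e18
    YR_IN_S = 3.156e7

    def sedov_radius_cm(energy_erg, rho_g_cm3, time_s):
        return XI0 * (energy_erg * time_s**2 / rho_g_cm3) ** 0.2

    # Illustrative numbers: E = 1e51 erg, ambient hydrogen density 1 cm^-3, age 1000 yr.
    rho = 1.0 * 1.67e-24  # g/cm^3
    r = sedov_radius_cm(1.0e51, rho, 1000.0 * YR_IN_S)
    print(f"shock radius ~ {r / PC_IN_CM:.1f} pc")  # of order a few parsecs
    ```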

  6. Indoor Air Quality in High Performance Schools

    Science.gov (United States)

    High performance schools are facilities that improve the learning environment while saving energy, resources, and money. The key is understanding the lifetime value of high performance schools and effectively managing priorities, time, and budget.

  7. Low-Cost High-Performance MRI

    Science.gov (United States)

    Sarracanie, Mathieu; Lapierre, Cristen D.; Salameh, Najat; Waddington, David E. J.; Witzel, Thomas; Rosen, Matthew S.

    2015-10-01

    Magnetic Resonance Imaging (MRI) is unparalleled in its ability to visualize anatomical structure and function non-invasively with high spatial and temporal resolution. Yet to overcome the low sensitivity inherent in inductive detection of weakly polarized nuclear spins, the vast majority of clinical MRI scanners employ superconducting magnets producing very high magnetic fields. Commonly found at 1.5-3 tesla (T), these powerful magnets are massive and have very strict infrastructure demands that preclude operation in many environments. MRI scanners are costly to purchase, site, and maintain, with the purchase price approaching $1 M per tesla of magnetic field. We present here a remarkably simple, non-cryogenic approach to high-performance human MRI at ultra-low magnetic field, whereby modern under-sampling strategies are combined with fully-refocused dynamic spin control using steady-state free precession techniques. At 6.5 mT (more than 450 times lower than clinical MRI scanners) we demonstrate (2.5 × 3.5 × 8.5) mm3 imaging resolution in the living human brain using a simple, open-geometry electromagnet, with 3D image acquisition over the entire brain in 6 minutes. We contend that practical ultra-low magnetic field implementations of MRI such as this can set new standards for affordable (<$50,000) and robust portable devices.
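
    A quick way to appreciate the operating regime is the proton Larmor frequency f = γ̄B, with γ̄ ≈ 42.58 MHz/T; at the quoted 6.5 mT this is only a few hundred kilohertz, versus tens of megahertz for clinical scanners. The sketch below (added here for orientation, not taken from the paper) does the arithmetic.

    ```python
    GAMMA_BAR_MHZ_PER_T = 42.58  # proton gyromagnetic ratio divided by 2*pi

    def larmor_mhz(b_tesla):
        """Proton Larmor (resonance) frequency in MHz for a field of b_tesla."""
        return GAMMA_BAR_MHZ_PER_T * b_tesla

    for b in (6.5e-3, 1.5, 3.0):
        print(f"B = {b:6.4f} T  ->  f ~ {larmor_mhz(b):8.3f} MHz")
    ```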

  8. Carpet Aids Learning in High Performance Schools

    Science.gov (United States)

    Hurd, Frank

    2009-01-01

    The Healthy and High Performance Schools Act of 2002 has set specific federal guidelines for school design, and developed a federal/state partnership program to assist local districts in their school planning. According to the Collaborative for High Performance Schools (CHPS), high-performance schools are, among other things, healthy, comfortable,…

  9. A high-performance visual profiler for games

    NARCIS (Netherlands)

    Roza, M.; Schroders, M.; Wetering, van de H.M.M.

    2009-01-01

    Video games are software products whose purpose is to entertain their players. Unfortunately, the performance of a video game can suddenly decrease; this phenomenon is called a frame drop, and it causes the amount of fun experienced by players to drop as well. To avoid this behavior, usually the process of

  10. High-Performance Neural Networks for Visual Object Classification

    OpenAIRE

    Cireşan, Dan C.; Meier, Ueli; Masci, Jonathan; Gambardella, Luca M.; Schmidhuber, Jürgen

    2011-01-01

    We present a fast, fully parameterizable GPU implementation of Convolutional Neural Network variants. Our feature extractors are neither carefully designed nor pre-wired, but rather learned in a supervised way. Our deep hierarchical architectures achieve the best published results on benchmarks for object classification (NORB, CIFAR10) and handwritten digit recognition (MNIST), with error rates of 2.53%, 19.51%, 0.35%, respectively. Deep nets trained by simple back-propagation perform better ...
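
    As a rough modern stand-in for the kind of GPU-resident convolutional architecture described (this is not the authors' implementation; PyTorch is used purely for illustration), a minimal convolutional classifier for 28x28 single-channel images could be sketched as follows.

    ```python
    import torch
    import torch.nn as nn

    # Minimal convolutional classifier for 28x28 grayscale images (MNIST-like input).
    model = nn.Sequential(
        nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(),
        nn.Linear(64 * 7 * 7, 128), nn.ReLU(),
        nn.Linear(128, 10),
    )

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = model.to(device)

    x = torch.randn(8, 1, 28, 28, device=device)  # dummy batch of 8 images
    logits = model(x)                              # forward pass; training would add backprop
    print(logits.shape)                            # torch.Size([8, 10])
    ```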

  11. Research in nuclear astrophysics

    International Nuclear Information System (INIS)

    Lattimer, J.M.; Yahil, A.

    1989-01-01

    The interaction between nuclear theory and some outstanding problems in astrophysics is examined. We are actively researching both the astrophysics of gravitational collapse, neutron star birth, and the emission of neutrinos from supernovae, on the one hand, and the nuclear physics of the equation of state of hot, dense matter on the other hand. There is close coupling between nuclear theory and the supernova phenomenon; in fact, nuclear matter properties, especially at supernuclear densities, might be best delineated by astrophysical considerations. Our research has also focused on the neutrinos emitted from supernovae, since they are the only available observables of the internal supernova mechanism. The recent observations of neutrinos from SN 1987A proved to be in remarkable agreement with models we pioneered in the one and a half years prior to its explosion in February 1987. We have also developed a novel hydrodynamical code in which shocks are treated via Riemann resolution rather than with artificial viscosity. We propose to modify it to use implicit differencing and to include multi-group neutrino diffusion and General Relativity. In parallel, we are extending calculations of the birth of a neutron star to include convection and mass accretion, by incorporating a hydrodynamic envelope onto a hydrostatic core. In view of the possible recent discovery of a pulsar in SN1987A, we are including the effects of rotation. We are undertaking a detailed comparison of current equations of state, focusing on disagreements regarding the nuclear incompressibility, symmetry energy and specific heat. Especially important is the symmetry energy, which below nuclear density controls free proton fractions and weak interaction rates and above this density critically influences the neutron star maximum mass and binding energy. 60 refs

  12. Experimental studies of nuclear astrophysics

    International Nuclear Information System (INIS)

    He Jianjun; Zhou Xiaohong; Zhang Yuhu

    2013-01-01

    Nuclear astrophysics is an interdisciplinary subject combining micro-scale nuclear physics and macro-scale astrophysics. Its main aims are to understand the origin and evolution of the elements in the universe, the time scale of stellar evolution, the stellar environment and sites, the energy generation of stars from thermonuclear processes and its impact on stellar evolution and the mechanisms driving astrophysical phenomena, and the structure and properties of compact stars. This paper presents the significance and current research status of nuclear astrophysics; we introduce some fundamental concepts, the nuclear physics input parameters required by certain astrophysics models, and some widely-used experimental approaches in nuclear astrophysics research. The potential and feasibility of research in this field using China’s current and planned large-scale scientific facilities are analyzed briefly. Finally, the prospects of establishing a deep underground science and engineering laboratory in China are envisaged. (authors)

  13. Astrophysics in 1999

    OpenAIRE

    Trimble, V; Aschwanden, MJ

    2000-01-01

    The year 1999 saw the arrival of a star with three planets, a universe with three parameters, and a solar corona that could be heated at least three ways. In addition, there were at least three papers on every question that has ever been asked in astrophysics, from "Will the Universe expand forever?" to "Does mantle convection occur in one or two layers?" The answers generally were, "Yes," "No," and "None of the above," to each of the questions. The authors have done their best to organize th...

  14. Design and Implementation of High-Performance GIS Dynamic Objects Rendering Engine

    Science.gov (United States)

    Zhong, Y.; Wang, S.; Li, R.; Yun, W.; Song, G.

    2017-12-01

    Spatio-temporal dynamic visualization is more vivid than static visualization. It is important to use dynamic visualization techniques to reveal variation processes and trends vividly and comprehensively for geographical phenomena. Dealing with the challenges posed by dynamic visualization of both 2D and 3D spatial dynamic targets, especially across different spatial data types, requires a high-performance GIS dynamic objects rendering engine. The main approach to improving a rendering engine that handles vast numbers of dynamic targets relies on key technologies of high-performance GIS, including memory computing, parallel computing, GPU computing and high-performance algorithms. In this study, a high-performance GIS dynamic objects rendering engine is designed and implemented to solve this problem, based on hybrid acceleration techniques. The rendering engine combines GPU computing, OpenGL technology and high-performance algorithms with the advantage of 64-bit memory computing. It processes 2D and 3D dynamic target data efficiently and runs smoothly with vast amounts of dynamic target data. A prototype system of the high-performance GIS dynamic objects rendering engine was developed based on SuperMap GIS iObjects. Experiments were designed for large-scale spatial data visualization; the results showed that the engine achieves high performance, rendering two-dimensional and three-dimensional dynamic objects roughly 20 times faster on the GPU than on the CPU.

  15. High performance carbon nanocomposites for ultracapacitors

    Science.gov (United States)

    Lu, Wen

    2012-10-02

    The present invention relates to composite electrodes for electrochemical devices, particularly to carbon nanotube composite electrodes for high performance electrochemical devices, such as ultracapacitors.

  16. Nuclear Data on Unstable Nuclei for Astrophysics

    International Nuclear Information System (INIS)

    Smith, Michael Scott; Meyer, Richard A; Lingerfelt, Eric; Scott, J.P.; Hix, William Raphael; Ma, Zhanwen; Bardayan, Daniel W.; Blackmon, Jeff C.; Guidry, Mike W.; KOZUB, RAYMOND L.; Chae, Kyung YuK.

    2004-01-01

    Recent measurements with radioactive beams at ORNL's Holifield Radioactive Ion Beam Facility (HRIBF) have prompted the evaluation of a number of reactions involving unstable nuclei needed for stellar explosion studies. We discuss these evaluations, as well as the development of a new computational infrastructure to enable the rapid incorporation of the latest nuclear physics results in astrophysics models. This infrastructure includes programs that simplify the generation of reaction rates, manage rate databases, and visualize reaction rates, all hosted at a new website http://www.nucastrodata.org
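
    Reaction-rate generation of the kind mentioned typically starts from a standard parameterization; the widely used seven-coefficient REACLIB form expresses a rate as a function of temperature T9 (in units of 10^9 K). The sketch below evaluates that form with hypothetical placeholder coefficients, purely to illustrate the shape of such tools; it is not code or data from nucastrodata.org.

    ```python
    import math

    def reaclib_rate(t9, a):
        """Evaluate a REACLIB-style rate set at temperature t9 (in 10^9 K).

        rate = exp(a0 + a1/T9 + a2*T9**(-1/3) + a3*T9**(1/3) + a4*T9 + a5*T9**(5/3) + a6*ln(T9))
        """
        return math.exp(a[0] + a[1] / t9 + a[2] * t9 ** (-1.0 / 3.0)
                        + a[3] * t9 ** (1.0 / 3.0) + a[4] * t9
                        + a[5] * t9 ** (5.0 / 3.0) + a[6] * math.log(t9))

    # Hypothetical coefficients, for illustration only:
    coeffs = [10.0, -1.5, 0.0, -3.0, 0.2, -0.01, 1.5]
    for t9 in (0.1, 0.5, 1.0, 3.0):
        print(t9, reaclib_rate(t9, coeffs))
    ```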

  17. Important plasma problems in astrophysics

    International Nuclear Information System (INIS)

    Kulsrud, R.M.

    1995-01-01

    In astrophysics, plasmas occur under very extreme conditions. For example, there are ultrastrong magnetic fields in neutron stars, relativistic plasmas around black holes and in jets, extremely energetic particles such as cosmic rays in the interstellar medium, extremely dense plasmas in accretion disks, and extremely large magnetic Reynolds numbers in the interstellar medium. These extreme limits for astrophysical plasmas make plasma phenomena much simpler to analyze in astrophysics than in the laboratory. An understanding of such phenomena often emerges in an interesting way, simply by taking the extreme limiting case of a known plasma theory. The author will describe one of the more exciting examples and will attempt to convey the excitement he felt when he was first exposed to it. However, not all plasma astrophysical phenomena are so simple. There are certain important plasma phenomena in astrophysics that have not been so easily resolved. In fact, a resolution of them is blocking significant progress in astrophysical research. They have not yet yielded to attacks by theoretical astrophysicists nor to extensive numerical simulation. The author will attempt to describe one of the more important of these plasma-astrophysical problems, and discuss why its resolution is so important to astrophysics. This significant example is fast magnetic reconnection. Another significant example is large-magnetic-Reynolds-number magnetohydrodynamic (MHD) dynamos
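
    To see why fast reconnection is so stubborn, the classical Sweet-Parker estimate is useful background (stated here independently of the article): for Lundquist number S = L v_A / η the predicted inflow speed is only v_in ≈ v_A S^(-1/2), which is hopelessly slow at astrophysical values of S. The sketch below evaluates that scaling for illustrative placeholder parameters.

    ```python
    import math

    def sweet_parker(length_cm, v_alfven_cm_s, eta_cm2_s):
        """Return (Lundquist number, inflow speed, reconnection time) in the Sweet-Parker model."""
        s = length_cm * v_alfven_cm_s / eta_cm2_s   # Lundquist number S
        v_in = v_alfven_cm_s / math.sqrt(s)         # inflow (reconnection) speed
        t_rec = length_cm / v_in                    # time to reconnect a layer of size L
        return s, v_in, t_rec

    # Illustrative, interstellar-medium-like placeholder values:
    S, v_in, t_rec = sweet_parker(length_cm=3.0e18, v_alfven_cm_s=1.0e6, eta_cm2_s=1.0e7)
    print(f"S ~ {S:.1e}, v_in ~ {v_in:.1e} cm/s, t_rec ~ {t_rec / 3.156e7:.1e} yr")
    ```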

  18. Computational Infrastructure for Nuclear Astrophysics

    International Nuclear Information System (INIS)

    Smith, Michael S.; Hix, W. Raphael; Bardayan, Daniel W.; Blackmon, Jeffery C.; Lingerfelt, Eric J.; Scott, Jason P.; Nesaraja, Caroline D.; Chae, Kyungyuk; Guidry, Michael W.; Koura, Hiroyuki; Meyer, Richard A.

    2006-01-01

    A Computational Infrastructure for Nuclear Astrophysics has been developed to streamline the inclusion of the latest nuclear physics data in astrophysics simulations. The infrastructure consists of a platform-independent suite of computer codes that is freely available online at nucastrodata.org. Features of, and future plans for, this software suite are given

  19. Cyberinfrastructure for Computational Relativistic Astrophysics

    OpenAIRE

    Ott, Christian

    2012-01-01

    Poster presented at the NSF Office of Cyberinfrastructure CyberBridges CAREER PI workshop. This poster discusses the computational challenges involved in the modeling of complex relativistic astrophysical systems. The Einstein Toolkit is introduced. It is an open-source community infrastructure for numerical relativity and computational astrophysics.

  20. Radiation processes in astrophysics

    CERN Document Server

    Tucker, Wallace H

    1975-01-01

    The purpose of this book is twofold: to provide a brief, simple introduction to the theory of radiation and its application in astrophysics and to serve as a reference manual for researchers. The first part of the book consists of a discussion of the basic formulas and concepts that underlie the classical and quantum descriptions of radiation processes. The rest of the book is concerned with applications. The spirit of the discussion is to present simple derivations that will provide some insight into the basic physics involved and then to state the exact results in a form useful for applications. The reader is referred to the original literature and to reviews for rigorous derivations. The wide range of topics covered is illustrated by the following table of contents: Basic Formulas for Classical Radiation Processes; Basic Formulas for Quantum Radiation Processes; Cyclotron and Synchrotron Radiation; Electron Scattering; Bremsstrahlung and Collision Losses; Radiative Recombination; The Photoelectric Effect; a...
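
    In the same spirit of simple working formulas, the characteristic synchrotron frequency of an electron of Lorentz factor γ in a magnetic field B is ν_c ≈ (3/2) γ² ν_g sin α, where ν_g = eB/(2π m_e c) ≈ 2.8 MHz per gauss is the non-relativistic gyrofrequency. The sketch below (a worked example added here, not an excerpt from the book) evaluates it for illustrative numbers.

    ```python
    import math

    NU_G_MHZ_PER_GAUSS = 2.8  # non-relativistic electron gyrofrequency eB/(2*pi*m_e*c)

    def synchrotron_nu_c_hz(gamma, b_gauss, pitch_deg=90.0):
        """Characteristic synchrotron frequency nu_c ~ 1.5 * gamma**2 * nu_g * sin(alpha)."""
        nu_g_hz = NU_G_MHZ_PER_GAUSS * 1.0e6 * b_gauss
        return 1.5 * gamma**2 * nu_g_hz * math.sin(math.radians(pitch_deg))

    # Illustrative: a gamma = 1e4 electron in a 10 microgauss interstellar field
    print(f"{synchrotron_nu_c_hz(1.0e4, 1.0e-5):.2e} Hz")  # of order a few GHz
    ```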

  1. NASA's Astrophysics Data Archives

    Science.gov (United States)

    Hasan, H.; Hanisch, R.; Bredekamp, J.

    2000-09-01

    The NASA Office of Space Science has established a series of archival centers where science data acquired through its space science missions is deposited. The availability of high quality data to the general public through these open archives enables the maximization of science return of the flight missions. The Astrophysics Data Centers Coordinating Council, an informal collaboration of archival centers, coordinates data from five archival centers distinguished primarily by the wavelength range of the data deposited there. Data are available in FITS format. An overview of NASA's data centers and services is presented in this paper. A standard front-end modifier called 'Astrobrowse' is described. Other catalog browsers and tools include WISARD and AMASE supported by the National Space Science Data Center, as well as ISAIA, a follow-on to Astrobrowse.

  2. Black-hole astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Bender, P. [Univ. of Colorado, Boulder, CO (United States)]; Bloom, E. [Stanford Linear Accelerator Center, Menlo Park, CA (United States)]; Cominsky, L. [Sonoma State Univ., Rohnert Park, CA (United States). Dept. of Physics and Astronomy] [and others]

    1995-07-01

    Black-hole astrophysics is not just the investigation of yet another, even if extremely remarkable, type of celestial body, but a test of the correctness of the understanding of the very properties of space and time in very strong gravitational fields. Physicists' excitement at this new prospect for testing theories of fundamental processes is matched by that of astronomers at the possibility of discovering and studying a new and dramatically different kind of astronomical object. Here the authors review the currently known ways that black holes can be identified by their effects on their neighborhood--since, of course, the hole itself does not yield any direct evidence of its existence or information about its properties. The two most important empirical considerations are determination of masses, or lower limits thereof, of unseen companions in binary star systems, and measurement of luminosity fluctuations on very short time scales.

  3. Astrophysics Faces the Millennium

    Science.gov (United States)

    Trimble, Virginia

    2001-03-01

    The Medieval synthesis of Aristotelian philosophy and church doctrine, due largely to Thomas Aquinas, insisted that the universe outside the earth's atmosphere must be immutable, single-centered, fully inventoried, immaculate or perfect, including perfectly spherical, and much else that sounds strange to modern ears. The beginnings of modern astronomy can be largely described as the overthrow of these various concepts by a combination of new technologies and new ways of thinking, and many current questions in astrophysics can be directly tied to developments of those same concepts. Indeed they probably all can be, but not over time, ending with questions like: Do other stars have spots? What does it mean when quasar jets look like they are moving faster than the speed of light? Is there anything special about our star, our galaxy, our planet, or our universe? How did these all form, and what is their long-term fate?

  4. Numerical relativity beyond astrophysics

    Science.gov (United States)

    Garfinkle, David

    2017-01-01

    Though the main applications of computer simulations in relativity are to astrophysical systems such as black holes and neutron stars, nonetheless there are important applications of numerical methods to the investigation of general relativity as a fundamental theory of the nature of space and time. This paper gives an overview of some of these applications. In particular we cover (i) investigations of the properties of spacetime singularities such as those that occur in the interior of black holes and in big bang cosmology. (ii) investigations of critical behavior at the threshold of black hole formation in gravitational collapse. (iii) investigations inspired by string theory, in particular analogs of black holes in more than 4 spacetime dimensions and gravitational collapse in spacetimes with a negative cosmological constant.

  5. High energy astrophysical techniques

    CERN Document Server

    Poggiani, Rosa

    2017-01-01

    This textbook presents ultraviolet and X-ray astronomy, gamma-ray astronomy, cosmic ray astronomy, neutrino astronomy, and gravitational wave astronomy as distinct research areas, focusing on the astrophysics targets and the requirements with respect to instrumentation and observation methods. The purpose of the book is to bridge the gap between the reference books and the specialized literature. For each type of astronomy, the discussion proceeds from the orders of magnitude for observable quantities. The physical principles of photon and particle detectors are then addressed, and the specific telescopes and combinations of detectors, presented. Finally the instruments and their limits are discussed with a view to assisting readers in the planning and execution of observations. Astronomical observations with high-energy photons and particles represent the newest additions to multimessenger astronomy and this book will be of value to all with an interest in the field.

  6. Exotic nuclei and astrophysics

    Directory of Open Access Journals (Sweden)

    Penionzhkevich Yu.

    2012-12-01

    Full Text Available In recent years, nuclear physics investigations of the laws of the microscopic world have contributed significantly to the extension of our knowledge of phenomena occurring in the macroscopic world (the Universe) and have made a formidable contribution to the development of astrophysical and cosmological theories. First of all, this concerns the expanding universe model, the evolution of stars, and the abundances of elements, as well as the properties of various stars and cosmic objects, including “cold” and neutron stars, black holes, and pulsars. Without claiming to give a full account of all cosmological problems, we will dwell upon those that, in my opinion, have much in common with nuclear-matter properties manifesting themselves in nuclear interactions.

  7. Numerical relativity beyond astrophysics.

    Science.gov (United States)

    Garfinkle, David

    2017-01-01

    Though the main applications of computer simulations in relativity are to astrophysical systems such as black holes and neutron stars, nonetheless there are important applications of numerical methods to the investigation of general relativity as a fundamental theory of the nature of space and time. This paper gives an overview of some of these applications. In particular we cover (i) investigations of the properties of spacetime singularities such as those that occur in the interior of black holes and in big bang cosmology. (ii) investigations of critical behavior at the threshold of black hole formation in gravitational collapse. (iii) investigations inspired by string theory, in particular analogs of black holes in more than 4 spacetime dimensions and gravitational collapse in spacetimes with a negative cosmological constant.

  8. Astrophysics days and MHD

    International Nuclear Information System (INIS)

    Falgarone, Edith; Rieutord, Michel; Richard, Denis; Zahn, Jean-Paul; Dauchot, Olivier; Daviaud, Francois; Dubrulle, Berengere; Laval, Jean-Philippe; Noullez, Alain; Bourgoin, Mickael; Odier, Philippe; Pinton, Jean-Francois; Leveque, Emmanuel; Chainais, Pierre; Abry, Patrice; Mordant, Nicolas; Michel, Olivier; Marie, Louis; Chiffaudel, Arnaud; Daviaud, Francois; Petrelis, Francois; Fauve, Stephan; Nore, C.; Brachet, M.-E.; Politano, H.; Pouquet, A.; Leorat, Jacques; Grapin, Roland; Brun, Sacha; Delour, Jean; Arneodo, Alain; Muzy, Jean-Francois; Magnaudet, Jacques; Braza, Marianna; Boree, Jacques; Maurel, S.; Ben, L.; Moreau, J.; Bazile, R.; Charnay, G.; Lewandowski, Roger; Laveder, Dimitri; Bouchet, Freddy; Sommeria, Joel; Le Gal, P.; Eloy, C.; Le Dizes, S.; Schneider, Kai; Farge, Marie; Bottausci, Frederic; Petitjeans, Philippe; Maurel, Agnes; Carlier, Johan; Anselmet, Fabien

    2001-05-01

    This publication gathers extended summaries of presentations proposed during two days on astrophysics and magnetohydrodynamics (MHD). The first session addressed astrophysics and MHD: The cold interstellar medium, a low ionized turbulent plasma; Turbulent convection in stars; Turbulence in differential rotation; Protoplanetary disks and washing machines; gravitational instability and large structures; MHD turbulence in the sodium von Karman flow; Numerical study of the dynamo effect in the Taylor-Green eddy geometry; Solar turbulent convection under the influence of rotation and of the magnetic field. The second session addressed the description of turbulence: Should we give up cascade models to describe the spatial complexity of the velocity field in a developed turbulence?; What do we learn with RDT about the turbulence at the vicinity of a plane surface?; Qualitative explanation of intermittency; Reduced model of Navier-Stokes equations: quickly extinguished energy cascade; Some mathematical properties of turbulent closure models. The third session addressed turbulence and coherent structures: Alfven wave filamentation and formation of coherent structures in dispersive MHD; Statistical mechanics for quasi-geo-strophic turbulence: applications to Jupiter's coherent structures; Elliptic instabilities; Physics and modelling of turbulent detached unsteady flows in aerodynamics and fluid-structure interaction; Intermittency and coherent structures in a washing machine: a wavelet analysis of joint pressure/velocity measurements; CVS filtering of 3D turbulent mixing layer using orthogonal wavelets. The last session addressed experimental methods: Lagrangian velocity measurements; Energy dissipation and instabilities within a locally stretched vortex; Study by laser imagery of the generation and breakage of a compressed eddy flow; Study of coherent structures of turbulent boundary layer at high Reynolds number

  9. Nuclear physics and astrophysics

    International Nuclear Information System (INIS)

    Schramm, D.N.; Olinto, A.V.

    1992-09-01

    We have investigated a variety of research topics on the interface of nuclear physics and astrophysics during the past year. We have continued our study of dihyperon states in dense matter and have started to make a connection between their properties in the core of neutron stars and the ongoing experimental searches at Brookhaven National Laboratory. We started to build a scenario for the origin of gamma-ray bursts using the conversion of neutron stars to strange stars close to an active galactic nucleus. We have been reconsidering the constraints due to neutron star cooling rates on the equation of state for high density matter in the light of recent findings which show that the faster direct Urca cooling process is possible for a range of nuclear compositions. We have developed a model for the formation of primordial magnetic fields due to the dynamics of the quark-hadron phase transition. Encouraged by the most recent observational developments, we have investigated the possible origin of the boron and beryllium abundances. We have greatly improved the calculations of the primordial abundances of these elements by augmenting the reaction networks and by updating the most recent experimental nuclear reaction rates. Our calculations have shown that the primordial abundances are much higher than previously thought but that the observed abundances cannot be explained by primordial sources alone. We have also studied the origin of the boron and beryllium abundances due to cosmic ray spallation. Finally, we have continued to address the solar neutrino problem by investigating the impact of astrophysical uncertainties on the MSW solution for a full three-family treatment of MSW mixing

  10. Nuclear physics and astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Schramm, D.N.; Olinto, A.V.

    1992-09-01

    We have investigated a variety of research topics on the interface of nuclear physics and astrophysics during the past year. We have continued our study of dihyperon states in dense matter and have started to make a connection between their properties in the core of neutron stars and the ongoing experimental searches at Brookhaven National Laboratory. We started to build a scenario for the origin of gamma-ray bursts using the conversion of neutron stars to strange stars close to an active galactic nucleus. We have been reconsidering the constraints due to neutron star cooling rates on the equation of state for high density matter in the light of recent findings which show that the faster direct Urca cooling process is possible for a range of nuclear compositions. We have developed a model for the formation of primordial magnetic fields due to the dynamics of the quark-hadron phase transition. Encouraged by the most recent observational developments, we have investigated the possible origin of the boron and beryllium abundances. We have greatly improved the calculations of the primordial abundances of these elements by augmenting the reaction networks and by updating the most recent experimental nuclear reaction rates. Our calculations have shown that the primordial abundances are much higher than previously thought but that the observed abundances cannot be explained by primordial sources alone. We have also studied the origin of the boron and beryllium abundances due to cosmic ray spallation. Finally, we have continued to address the solar neutrino problem by investigating the impact of astrophysical uncertainties on the MSW solution for a full three-family treatment of MSW mixing.

  11. Advanced Architectures for Astrophysical Supercomputing

    Science.gov (United States)

    Barsdell, B. R.; Barnes, D. G.; Fluke, C. J.

    2010-12-01

    Astronomers have come to rely on the increasing performance of computers to reduce, analyze, simulate and visualize their data. In this environment, faster computation can mean more science outcomes or the opening up of new parameter spaces for investigation. If we are to avoid major issues when implementing codes on advanced architectures, it is important that we have a solid understanding of our algorithms. A recent addition to the high-performance computing scene that highlights this point is the graphics processing unit (GPU). The hardware originally designed for speeding up graphics rendering in video games is now achieving speed-ups of O(100×) in general-purpose computation - performance that cannot be ignored. We are using a generalized approach, based on the analysis of astronomy algorithms, to identify the optimal problem types and techniques for taking advantage of both current GPU hardware and future developments in computing architectures.
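
    The point about understanding algorithms before porting them can be made quantitative with Amdahl's law: if a fraction p of the run time can be accelerated by a factor s, the overall speed-up is 1 / ((1 - p) + p / s). A minimal sketch (added for illustration, not from the paper):

    ```python
    def amdahl_speedup(parallel_fraction, accel_factor):
        """Overall speed-up when a fraction of the work is accelerated by accel_factor."""
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / accel_factor)

    # Even with a 100x GPU kernel, a 10% serial portion caps the overall gain near 10x:
    for p in (0.5, 0.9, 0.99):
        print(f"p = {p:4.2f}  ->  overall speed-up ~ {amdahl_speedup(p, 100.0):5.1f}x")
    ```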

  12. Delivering high performance BWR fuel reliably

    International Nuclear Information System (INIS)

    Schardt, J.F.

    1998-01-01

    Utilities are under intense pressure to reduce their production costs in order to compete in the increasingly deregulated marketplace. They need fuel, which can deliver high performance to meet demanding operating strategies. GE's latest BWR fuel design, GE14, provides that high performance capability. GE's product introduction process assures that this performance will be delivered reliably, with little risk to the utility. (author)

  13. VisIVO: A Library and Integrated Tools for Large Astrophysical Dataset Exploration

    Science.gov (United States)

    Becciani, U.; Costa, A.; Ersotelos, N.; Krokos, M.; Massimino, P.; Petta, C.; Vitello, F.

    2012-09-01

    VisIVO provides an integrated suite of tools and services that can be used in many scientific fields. VisIVO development started within the Virtual Observatory framework. VisIVO allows users to produce meaningful visualizations of highly complex, large-scale datasets and to create movies of these visualizations based on distributed infrastructures. VisIVO supports high-performance, multi-dimensional visualization of large-scale astrophysical datasets. Users can rapidly obtain meaningful visualizations while preserving full and intuitive control of the relevant parameters. VisIVO consists of VisIVO Desktop - a stand-alone application for interactive visualization on standard PCs, VisIVO Server - a platform for high performance visualization, VisIVO Web - a custom designed web portal, VisIVO Smartphone - an application to exploit the VisIVO Server functionality, and the latest VisIVO features: VisIVO Library allows a job running on a computational system (grid, HPC, etc.) to produce movies directly with the code internal data arrays without the need to produce intermediate files. This is particularly important when running on large computational facilities, where the user wants to have a look at the results during the data production phase. For example, in grid computing facilities, images can be produced directly in the grid catalogue while the user code is running in a system that cannot be directly accessed by the user (a worker node). The deployment of VisIVO on the DG and gLite is carried out with the support of the EDGI and EGI-Inspire projects. Depending on the structure and size of datasets under consideration, the data exploration process could take several hours of CPU time to create customized views, and the production of movies could potentially last several days. For this reason an MPI parallel version of VisIVO could play a fundamental role in increasing performance, e.g. it could be automatically deployed on nodes that are MPI aware. A central concept in our development is thus to

  14. High-performance ceramics. Fabrication, structure, properties

    International Nuclear Information System (INIS)

    Petzow, G.; Tobolski, J.; Telle, R.

    1996-01-01

    The program "Ceramic High-Performance Materials" pursued the objective of understanding the chain of cause and effect in the development of high-performance ceramics. This chain of problems begins with the chemical reactions for the production of powders, comprises the characterization, processing, shaping and compacting of powders, structural optimization, heat treatment, production and finishing, and leads to issues of materials testing and of a design appropriate to the material. The program "Ceramic High-Performance Materials" has resulted in contributions to the understanding of fundamental interrelationships in terms of materials science, which are summarized in the present volume - broken down into eight special aspects. (orig./RHM)

  15. High Performance Grinding and Advanced Cutting Tools

    CERN Document Server

    Jackson, Mark J

    2013-01-01

    High Performance Grinding and Advanced Cutting Tools discusses the fundamentals and advances in high performance grinding processes, and provides a complete overview of newly-developing areas in the field. Topics covered are grinding tool formulation and structure, grinding wheel design and conditioning and applications using high performance grinding wheels. Also included are heat treatment strategies for grinding tools, using grinding tools for high speed applications, laser-based and diamond dressing techniques, high-efficiency deep grinding, VIPER grinding, and new grinding wheels.

  16. Strategy Guideline: High Performance Residential Lighting

    Energy Technology Data Exchange (ETDEWEB)

    Holton, J.

    2012-02-01

    The Strategy Guideline: High Performance Residential Lighting has been developed to provide a tool for the understanding and application of high performance lighting in the home. The high performance lighting strategies featured in this guide are drawn from recent advances in commercial lighting for application to typical spaces found in residential buildings. This guide offers strategies to greatly reduce lighting energy use through the application of high quality fluorescent and light emitting diode (LED) technologies. It is important to note that these strategies not only save energy in the home but also serve to satisfy the homeowner's expectations for high quality lighting.

  17. Nuclear astrophysics with radioactive beams

    International Nuclear Information System (INIS)

    Bertulani, C.A.; Gade, A.

    2010-01-01

    The quest to comprehend how nuclear processes influence astrophysical phenomena is driving experimental and theoretical research programs worldwide. One of the main goals in nuclear astrophysics is to understand how energy is generated in stars, how elements are synthesized in stellar events and what the nature of neutron stars is. New experimental capabilities, the availability of radioactive beams and increased computational power paired with new astronomical observations have advanced the present knowledge. This review summarizes the progress in the field of nuclear astrophysics with a focus on the role of indirect methods and reactions involving beams of rare isotopes.

  18. Nuclear Astrophysics Experiments at CIAE

    International Nuclear Information System (INIS)

    Liu Weiping; Li Zhihong; Bai Xixiang; Lian Gang; Guo Bing; Zeng, Sheng; Yan Shengquan; Wang Baoxiang; Shu Nengchuan; Wu Kaisu; Chen Yongshou

    2005-01-01

    This paper describes nuclear astrophysical studies using the unstable ion beam facility GIRAFFE. We measured the angular distributions for some low energy reactions, such as 7Be(d,n)8B, 11C(d,n)12N, 8Li(d,n)9Be and 8Li(d,p)9Li in inverse kinematics, and indirectly derived the astrophysical S-factors or reaction rates of 7Be(p,γ)8B, 11C(p,γ)12N and 8Li(n,γ)9Li at astrophysically relevant energies

  19. High performance liquid chromatographic determination of ...

    African Journals Online (AJOL)

    STORAGESEVER

    2010-02-08

    ) high performance liquid chromatography (HPLC) grade .... applications. These are important requirements if the reagent is to be applicable to on-line pre or post column derivatisation in a possible automation of the analytical.

  20. Analog circuit design designing high performance amplifiers

    CERN Document Server

    Feucht, Dennis

    2010-01-01

    The third volume Designing High Performance Amplifiers applies the concepts from the first two volumes. It is an advanced treatment of amplifier design/analysis emphasizing both wideband and precision amplification.

  1. Strategies and Experiences Using High Performance Fortran

    National Research Council Canada - National Science Library

    Shires, Dale

    2001-01-01

    .... High Performance Fortran (HPF) is a relatively new addition to the Fortran dialect. It is an attempt to provide an efficient high-level Fortran parallel programming language for the latest generation of parallel machines, but its success has been debatable...

  2. High-performance computing using FPGAs

    CERN Document Server

    Benkrid, Khaled

    2013-01-01

    This book is concerned with the emerging field of High Performance Reconfigurable Computing (HPRC), which aims to harness the high performance and relatively low power of reconfigurable hardware, in the form of Field Programmable Gate Arrays (FPGAs), in High Performance Computing (HPC) applications. It presents the latest developments in this field from the points of view of applications, architecture, and tools and methodologies. We hope that this work will form a reference for existing researchers in the field, and entice new researchers and developers to join the HPRC community. The book includes: Thirteen application chapters which present the most important application areas tackled by high performance reconfigurable computers, namely: financial computing, bioinformatics and computational biology, data search and processing, stencil computation (e.g. computational fluid dynamics and seismic modeling), cryptanalysis, astronomical N-body simulation, and circuit simulation. Seven architecture chapters which...

  3. Embedded High Performance Scalable Computing Systems

    National Research Council Canada - National Science Library

    Ngo, David

    2003-01-01

    The Embedded High Performance Scalable Computing Systems (EHPSCS) program is a cooperative agreement between Sanders, A Lockheed Martin Company and DARPA that ran for three years, from Apr 1995 - Apr 1998...

  4. Gradient High Performance Liquid Chromatography Method ...

    African Journals Online (AJOL)

    Purpose: To develop a gradient high performance liquid chromatography (HPLC) method for the simultaneous determination of phenylephrine (PHE) and ibuprofen (IBU) in solid ..... nimesulide, phenylephrine. Hydrochloride, chlorpheniramine maleate and caffeine anhydrous in pharmaceutical dosage form. Acta Pol.

  5. Atoms in astrophysics

    CERN Document Server

    Eissner, W; Hummer, D; Percival, I

    1983-01-01

    It is hard to appreciate but nevertheless true that Michael John Seaton, known internationally for the enthusiasm and skill with which he pursues his research in atomic physics and astrophysics, will be sixty years old on the 16th of January 1983. To mark this occasion some of his colleagues and former students have prepared this volume. It contains articles that de­ scribe some of the topics that have attracted his attention since he first started his research work at University College London so many years ago. Seaton's association with University College London has now stretched over a period of some 37 years, first as an undergraduate student, then as a research student, and then, successively, as Assistant Lecturer, Lecturer, Reader, and Professor. Seaton arrived at University College London in 1946 to become an undergraduate in the Physics Department, having just left the Royal Air Force in which he had served as a navigator in the Pathfinder Force of Bomber Command. There are a number of stories of ho...

  6. Photonuclear reactions: astrophysical implications

    International Nuclear Information System (INIS)

    Nedorezov, V.G.

    2005-01-01

    Full text: A brief review of astrophysical aspects of photonuclear studies is presented. Attention is paid mainly to two kinds of experiments. The first was performed at the ESRF by the GRAAL collaboration, using the backscattered laser photon technique to study light-speed anisotropy with respect to the dipole of the Cosmic Microwave Background (CMB) radiation. This is a modern analog of the Michelson-Morley experiment. The results obtained are not only methodologically different from those of the above-mentioned experiments but also provide stronger constraints on light-speed anisotropy in the CMB frame. The second subject is related to electron scattering on exotic nuclei, which can play a significant role in explosive phenomena such as novae, supernovae and neutron stars. Such an approach may be considered an alternative to traditional low-energy accelerator experiments. Exotic nuclei for these purposes can be obtained at GSI (the ELISe project). The experiment is foreseen to be installed at the New Experimental Storage Ring (NESR) at FAIR, where cooled secondary beams of radioactive ions will collide with an intense electron beam circulating in a small electron storage ring

  7. Astrophysical implications of periodicity

    International Nuclear Information System (INIS)

    Muller, R.A.

    1988-01-01

    Two remarkable discoveries of the last decade have profound implications for astrophysics and for geophysics. These are the discovery by Alvarez et al. that certain mass extinctions are caused by the impact on the earth of a large asteroid or comet, and the discovery by Raup and Sepkoski that such extinctions are periodic, with a cycle time of 26 to 30 million years. The validity of both of these discoveries is assumed and the implications are examined. Most of the phenomena described depend not on periodicity, but just on the weaker assumption that the impacts on the earth take place primarily in showers. Proposed explanations for the periodicity include galactic oscillations, the Planet X model, and the possibility of Nemesis, a solar companion star. These hypotheses are critically examined. Results of the search for the solar companion are reported. The Deccan flood basalts of India have been proposed as the impact site for the Cretaceous impact, but this hypothesis is in contradiction with the conclusion of Courtillot et al. that the magma flow began during a period of normal magnetic field. A possible resolution of this contradiction is proposed

  8. Nuclear physics and astrophysics

    International Nuclear Information System (INIS)

    Schramm, D.N.; Olinto, A.V.

    1993-06-01

    The authors report on recent progress of research at the interface of nuclear physics and astrophysics. During the past year, the authors continued to work on Big Bang and stellar nucleosynthesis, the solar neutrino problem, the equation of state for dense matter, the quark-hadron phase transition, and the origin of gamma-ray bursts; and began studying the consequences of nuclear reaction rates in the presence of strong magnetic fields. They have shown that the primordial production of B and Be cannot explain recent detections of these elements in halo stars and have looked at spallation as the likely source of these elements. By looking at nucleosynthesis with inhomogeneous initial conditions, they concluded that the Universe must have been very smooth before nucleosynthesis. They have also constrained neutrino oscillations and primordial magnetic fields by Big Bang nucleosynthesis. On the solar neutrino problem, they have analyzed the implications of the SAGE and GALLEX experiments. They also showed that the presence of dibaryons in neutron stars depends weakly on uncertainties of nuclear equations of state. They have started to investigate the consequences of strong magnetic fields on nuclear reactions and implications for neutron star cooling and supernova nucleosynthesis

  9. High performance computing in Windows Azure cloud

    OpenAIRE

    Ambruš, Dejan

    2013-01-01

    High performance, security, availability, scalability, flexibility and lower maintenance costs have contributed substantially to the growing popularity of cloud computing in all spheres of life, especially in business. In fact, cloud computing offers even more than this. With the use of virtual computing clusters, a runtime environment for high performance computing can also be implemented efficiently in a cloud. There are many advantages but also some disadvantages of cloud computing, some ...

  10. Carbon nanomaterials for high-performance supercapacitors

    OpenAIRE

    Tao Chen; Liming Dai

    2013-01-01

    Owing to their high energy density and power density, supercapacitors exhibit great potential as high-performance energy sources for advanced technologies. Recently, carbon nanomaterials (especially carbon nanotubes and graphene) have been widely investigated as effective electrodes in supercapacitors due to their high specific surface area and excellent electrical and mechanical properties. This article summarizes recent progress on the development of high-performance supercapacitors bas...

  11. Delivering high performance BWR fuel reliably

    Energy Technology Data Exchange (ETDEWEB)

    Schardt, J.F. [GE Nuclear Energy, Wilmington, NC (United States)

    1998-07-01

    Utilities are under intense pressure to reduce their production costs in order to compete in the increasingly deregulated marketplace. They need fuel that can deliver high performance to meet demanding operating strategies. GE's latest BWR fuel design, GE14, provides that high performance capability. GE's product introduction process assures that this performance will be delivered reliably, with little risk to the utility. (author)

  12. HPTA: High-Performance Text Analytics

    OpenAIRE

    Vandierendonck, Hans; Murphy, Karen; Arif, Mahwish; Nikolopoulos, Dimitrios S.

    2017-01-01

    One of the main targets of data analytics is unstructured data, which primarily involves textual data. High-performance processing of textual data is non-trivial. We present the HPTA library for high-performance text analytics. The library helps programmers to map textual data to a dense numeric representation, which can be handled more efficiently. HPTA encapsulates three performance optimizations: (i) efficient memory management for textual data, (ii) parallel computation on associative dat...

  13. High-performance computing — an overview

    Science.gov (United States)

    Marksteiner, Peter

    1996-08-01

    An overview of high-performance computing (HPC) is given. Different types of computer architectures used in HPC are discussed: vector supercomputers, high-performance RISC processors, various parallel computers like symmetric multiprocessors, workstation clusters, massively parallel processors. Software tools and programming techniques used in HPC are reviewed: vectorizing compilers, optimization and vector tuning, optimization for RISC processors; parallel programming techniques like shared-memory parallelism, message passing and data parallelism; and numerical libraries.
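
    As a purely illustrative aside (not part of the overview above), the message-passing style this record mentions can be sketched with the mpi4py package; the file name and the four-rank launch command are assumptions, not anything prescribed by the article.

```python
# Minimal message-passing sketch (assumes mpi4py is installed).
# Launch with, e.g.:  mpiexec -n 4 python mpi_hello.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

if rank == 0:
    # Rank 0 collects a short greeting from every other rank.
    for source in range(1, size):
        message = comm.recv(source=source, tag=0)
        print(f"rank 0 received: {message}")
else:
    comm.send(f"hello from rank {rank} of {size}", dest=0, tag=0)
```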

  14. An introduction to observational astrophysics

    CERN Document Server

    Gallaway, Mark

    2016-01-01

    Observational Astrophysics follows the general outline of an astrophysics undergraduate curriculum, targeting practical observing information to what will be covered at the university level. This includes the basics of optics and coordinate systems to the technical details of CCD imaging, photometry, spectrography and radio astronomy. General enough to be used by students at a variety of institutions and advanced enough to be far more useful than observing guides targeted at amateurs, the author provides a comprehensive and up-to-date treatment of observational astrophysics at undergraduate level to be used with a university’s teaching telescope. The practical approach takes the reader from basic first year techniques to those required for a final year project. Using this textbook as a resource, students can easily become conversant in the practical aspects of astrophysics in the field as opposed to the classroom.

  15. Recent progress on astrophysical opacity

    International Nuclear Information System (INIS)

    Rogers, F.J.; Iglesias, C.A.

    1992-08-01

    Improvements in the calculation of the opacity of astrophysical plasmas have helped to resolve several long-standing puzzles in the modeling of variable stars. The most significant opacity enhancements over the Los Alamos Astrophysical Library (LAOL) are due to improvements in the equation of state and atomic physics. Comparison with experiment has corroborated the predicted large opacity increases due to transitions in M-shell iron. We give a summary of recent developments

  16. An introduction to astrophysical hydrodynamics

    CERN Document Server

    Shore, Steven N

    1992-01-01

    This book is an introduction to astrophysical hydrodynamics for both astronomy and physics students. It provides a comprehensive and unified view of the general problems associated with fluids in a cosmic context, with a discussion of fluid dynamics and plasma physics. It is the only book on hydrodynamics that addresses the astrophysical context. Researchers and students will find this work to be an exceptional reference. Contents include chapters on irrotational and rotational flows, turbulence, magnetohydrodynamics, and instabilities.

  17. Minicourses in Astrophysics, Modular Approach, Vol. I.

    Science.gov (United States)

    Illinois Univ., Chicago.

    This is the first volume of a two-volume minicourse in astrophysics. It contains chapters on the following topics: planetary atmospheres; X-ray astronomy; radio astrophysics; molecular astrophysics; and gamma-ray astrophysics. Each chapter gives much technical discussion, mathematical treatment, diagrams, and examples. References are included with…

  18. Relativistic astrophysics and theory of gravity

    International Nuclear Information System (INIS)

    Zel'dovich, Ya.B.

    1982-01-01

    A brief historical review of the development of astrophysical science at the Shternberg State Astrophysical Institute (SAISh) is given in popular form. The main directions of the SAISh astrophysical investigations are presented: the relativistic theory of gravity, relativistic astrophysics of the interplanetary medium, and cosmology

  19. High Performance Descriptive Semantic Analysis of Semantic Graph Databases

    Energy Technology Data Exchange (ETDEWEB)

    Joslyn, Cliff A.; Adolf, Robert D.; al-Saffar, Sinan; Feo, John T.; Haglin, David J.; Mackey, Greg E.; Mizell, David W.

    2011-06-02

    As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to understand their inherent semantic structure, whether codified in explicit ontologies or not. Our group is researching novel methods for what we call descriptive semantic analysis of RDF triplestores, to serve purposes of analysis, interpretation, visualization, and optimization. But data size and computational complexity make it increasingly necessary to bring high performance computational resources to bear on this task. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing utilizing the large multi-threaded architecture of the Cray XMT platform, conventional servers, and large data stores. In this paper we describe that architecture and our methods, and present the results of our analyses of basic properties, connected components, namespace interaction, and typed paths for the Billion Triple Challenge 2010 dataset.
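
    To make the connected-components analysis mentioned above concrete, here is a small hypothetical sketch, not the Cray XMT implementation described in the record: a union-find pass over toy RDF-style triples that links each subject to its object.

```python
# Illustrative only: connected components over RDF-like triples via union-find.
# The triples and helper names are invented for this example.
def find(parent, x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path halving
        x = parent[x]
    return x

def union(parent, a, b):
    ra, rb = find(parent, a), find(parent, b)
    if ra != rb:
        parent[rb] = ra

triples = [
    ("ex:alice", "foaf:knows", "ex:bob"),
    ("ex:bob", "foaf:knows", "ex:carol"),
    ("ex:dave", "rdf:type", "ex:Person"),
]

parent = {}
for subject, _, obj in triples:
    parent.setdefault(subject, subject)
    parent.setdefault(obj, obj)
    union(parent, subject, obj)

components = {}
for node in parent:
    components.setdefault(find(parent, node), set()).add(node)
print(components)
```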

  20. High performance bio-integrated devices

    Science.gov (United States)

    Kim, Dae-Hyeong; Lee, Jongha; Park, Minjoon

    2014-06-01

    In recent years, personalized electronics for medical applications have attracted much attention with the rise of smartphones, because the coupling of such devices and smartphones enables continuous health monitoring in patients' daily lives. In particular, high performance biomedical electronics integrated with the human body are expected to open new opportunities in ubiquitous healthcare. However, the mechanical and geometrical constraints inherent in all standard forms of high performance rigid wafer-based electronics raise unique integration challenges with biotic entities. Here, we describe materials and design constructs for high performance skin-mountable bio-integrated electronic devices, which incorporate arrays of single crystalline inorganic nanomembranes. The resulting electronic devices include flexible and stretchable electrophysiology electrodes and sensors coupled with active electronic components. These advances in bio-integrated systems create new directions in personalized health monitoring and/or human-machine interfaces.

  1. Strategy Guideline. Partnering for High Performance Homes

    Energy Technology Data Exchange (ETDEWEB)

    Prahl, Duncan [IBACOS, Inc., Pittsburgh, PA (United States)

    2013-01-01

    High performance houses require a high degree of coordination and have significant interdependencies between various systems in order to perform properly, meet customer expectations, and minimize risks for the builder. Responsibility for the key performance attributes is shared across the project team and can be well coordinated through advanced partnering strategies. For high performance homes, traditional partnerships need to be matured to the next level and be expanded to all members of the project team including trades, suppliers, manufacturers, HERS raters, designers, architects, and building officials as appropriate. This guide is intended for use by all parties associated in the design and construction of high performance homes. It serves as a starting point and features initial tools and resources for teams to collaborate to continually improve the energy efficiency and durability of new houses.

  2. High performance parallel I/O

    CERN Document Server

    Prabhat

    2014-01-01

    Gain Critical Insight into the Parallel I/O Ecosystem. Parallel I/O is an integral component of modern high performance computing (HPC), especially in storing and processing very large datasets to facilitate scientific discovery. Revealing the state of the art in this field, High Performance Parallel I/O draws on insights from leading practitioners, researchers, software architects, developers, and scientists who shed light on the parallel I/O ecosystem. The first part of the book explains how large-scale HPC facilities scope, configure, and operate systems, with an emphasis on choices of I/O har

  3. High energy astrophysics. An introduction

    Energy Technology Data Exchange (ETDEWEB)

    Courvoisier, Thierry J.L. [Geneva Univ., Versoix (Switzerland). ISDC, Data Centre for Astrophysics

    2013-07-01

    Based on observational examples, this book reveals and explains high-energy astrophysical processes. It presents the theory of astrophysical processes in a didactic approach by deriving equations step by step, with several attractive astronomical pictures. High-energy astrophysics has unveiled a Universe very different from that known only from optical observations. It has revealed many types of objects in which typical variability timescales are as short as years, months, days, and hours (in quasars, X-ray binaries, and other objects), and even down to milliseconds in gamma-ray bursts. The sources of energy that are encountered are only very seldom nuclear fusion, and most of the time gravitation, a paradox when one thinks that gravitation is, by many orders of magnitude, the weakest of the fundamental interactions. The understanding of these objects' physical conditions and the processes revealed by high-energy astrophysics in the last decades is nowadays part of astrophysicists' culture, even of those active in other domains of astronomy. This book evolved from lectures given to master and PhD students at the University of Geneva since the early 1990s. It aims at providing astronomers and physicists intending to be active in high-energy astrophysics a broad basis on which they should be able to build the more specific knowledge they will need. While in the first part of the book the physical processes are described and derived in detail, the second part studies astrophysical objects in which high-energy astrophysics plays a crucial role. This two-pronged approach will help students recognise physical processes by their observational signatures in contexts that may differ widely from those presented here.

  4. Nuclear Data for Astrophysics: Resources, Challenges, Strategies, and Software Solutions

    International Nuclear Information System (INIS)

    Smith, Michael Scott; Lingerfelt, Eric J.; Nesaraja, Caroline D.; Hix, William Raphael; Roberts, Luke F.; Koura, Hiroyuki; Fuller, George M.; Tytler, David

    2008-01-01

    One of the most exciting utilizations of nuclear data is to help unlock the mysteries of the Cosmos -- the creation of the chemical elements, the evolution and explosion of stars, and the origin and fate of the Universe. There are now many nuclear data sets, tools, and other resources online to help address these important questions. However, numerous serious challenges make it important to develop strategies now to ensure a sustainable future for this work. A number of strategies are advocated, including: enlisting additional manpower to evaluate the newest data; devising ways to streamline evaluation activities; and improving communication and coordination between existing efforts. Software projects are central to some of these strategies. Examples include: creating a virtual 'pipeline' leading from the nuclear laboratory to astrophysics simulations; improving data visualization and management to get the most science out of the existing datasets; and creating a nuclear astrophysics data virtual (online) community. Recent examples will be detailed, including the development of two first-generation software pipelines, the Computational Infrastructure for Nuclear Astrophysics for stellar astrophysics and the bigbangonline suite of codes for cosmology, and the coupling of nuclear data to sensitivity studies with astrophysical simulation codes to guide future research.

  5. Nuclear data for astrophysics: resources, challenges, strategies, and software solutions

    International Nuclear Information System (INIS)

    Smith, M.S.; Lingerfelt, E.J.; Nesaraja, C.D.; Raphael Hix, W.; Roberts, L.F.; Hiroyuki, Koura; Fuller, G.M.; Tytler, D.

    2008-01-01

    One of the most exciting utilizations of nuclear data is to help unlock the mysteries of the Cosmos - the creation of the chemical elements, the evolution and explosion of stars, and the origin and fate of the Universe. There are now many nuclear data sets, tools, and other resources online to help address these important questions. However, numerous serious challenges make it important to develop strategies now to ensure a sustainable future for this work. A number of strategies are advocated, including: enlisting additional manpower to evaluate the newest data; devising ways to streamline evaluation activities; and improving communication and coordination between existing efforts. Software projects are central to some of these strategies. Examples include: creating a virtual 'pipeline' leading from the nuclear laboratory to astrophysics simulations; improving data visualization and management to get the most science out of the existing datasets; and creating a nuclear astrophysics data virtual (online) community. Recent examples will be detailed, including the development of two first-generation software pipelines, the Computational Infrastructure for Nuclear Astrophysics for stellar astrophysics and the Bigbangonline suite of codes for cosmology, and the coupling of nuclear data to sensitivity studies with astrophysical simulation codes to guide future research. (authors)

  6. Team Development for High Performance Management.

    Science.gov (United States)

    Schermerhorn, John R., Jr.

    1986-01-01

    The author examines a team development approach to management that creates shared commitments to performance improvement by focusing the attention of managers on individual workers and their task accomplishments. It uses the "high-performance equation" to help managers confront shared beliefs and concerns about performance and develop realistic…

  7. Validated High Performance Liquid Chromatography Method for ...

    African Journals Online (AJOL)

    Purpose: To develop a simple, rapid and sensitive high performance liquid chromatography (HPLC) method for the determination of cefadroxil monohydrate in human plasma. Methods: Shimadzu HPLC with LC solution software was used with a Waters Spherisorb C18 (5 μm, 150 mm × 4.5 mm) column. The mobile phase ...

  8. An Introduction to High Performance Fortran

    Directory of Open Access Journals (Sweden)

    John Merlin

    1995-01-01

    High Performance Fortran (HPF) is an informal standard for extensions to Fortran 90 to assist its implementation on parallel architectures, particularly for data-parallel computation. Among other things, it includes directives for specifying data distribution across multiple memories, and concurrent execution features. This article provides a tutorial introduction to the main features of HPF.

  9. High performance computing on vector systems

    CERN Document Server

    Roller, Sabine

    2008-01-01

    Presents the developments in high-performance computing and simulation on modern supercomputer architectures. This book covers trends in hardware and software development in general and specifically the vector-based systems and heterogeneous architectures. It presents innovative fields like coupled multi-physics or multi-scale simulations.

  10. High Performance Work Systems for Online Education

    Science.gov (United States)

    Contacos-Sawyer, Jonna; Revels, Mark; Ciampa, Mark

    2010-01-01

    The purpose of this paper is to identify the key elements of a High Performance Work System (HPWS) and explore the possibility of implementation in an online institution of higher learning. With the projected rapid growth of the demand for online education and its importance in post-secondary education, providing high quality curriculum, excellent…

  11. Debugging a high performance computing program

    Science.gov (United States)

    Gooding, Thomas M.

    2013-08-20

    Methods, apparatus, and computer program products are disclosed for debugging a high performance computing program by gathering lists of addresses of calling instructions for a plurality of threads of execution of the program, assigning the threads to groups in dependence upon the addresses, and displaying the groups to identify defective threads.
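
    The grouping idea summarized above can be illustrated with a small single-process Python sketch; it is only an analogy to the disclosed method, using code-object ids obtained from sys._current_frames() as a stand-in for calling-instruction addresses.

```python
# Illustrative only: group threads by their current call-stack "addresses".
import sys
import threading
import time
from collections import defaultdict

def worker(delay):
    time.sleep(delay)

threads = [threading.Thread(target=worker, args=(1.0,)) for _ in range(4)]
for t in threads:
    t.start()
time.sleep(0.2)  # let the workers reach time.sleep()

groups = defaultdict(list)
for thread_id, frame in sys._current_frames().items():
    signature = []
    while frame is not None:
        # id(frame.f_code) stands in for the address of the calling instruction.
        signature.append(id(frame.f_code))
        frame = frame.f_back
    groups[tuple(signature)].append(thread_id)

# Threads sharing an identical stack signature end up in the same group;
# an outlier group would point at a potentially defective thread.
for signature, members in groups.items():
    print(f"{len(members)} thread(s) share call-stack signature {hash(signature)}")

for t in threads:
    t.join()
```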

  12. High Performance Networks for High Impact Science

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Mary A.; Bair, Raymond A.

    2003-02-13

    This workshop was the first major activity in developing a strategic plan for high-performance networking in the Office of Science. Held August 13 through 15, 2002, it brought together a selection of end users, especially representing the emerging, high-visibility initiatives, and network visionaries to identify opportunities and begin defining the path forward.

  13. Teacher Accountability at High Performing Charter Schools

    Science.gov (United States)

    Aguirre, Moises G.

    2016-01-01

    This study will examine the teacher accountability and evaluation policies and practices at three high performing charter schools located in San Diego County, California. Charter schools are exempted from many laws, rules, and regulations that apply to traditional school systems. By examining the teacher accountability systems at high performing…

  14. Technology Leadership in Malaysia's High Performance School

    Science.gov (United States)

    Yieng, Wong Ai; Daud, Khadijah Binti

    2017-01-01

    The headmaster, as leader of the school, also plays a role as a technology leader. This applies to the high performance schools (HPS) headmaster as well. The HPS excel in all aspects of education. In this study, the researcher is interested in examining the role of the headmaster as a technology leader through interviews with three headmasters of high…

  15. Toward High Performance in Industrial Refrigeration Systems

    DEFF Research Database (Denmark)

    Thybo, C.; Izadi-Zamanabadi, Roozbeh; Niemann, H.

    2002-01-01

    Achieving high performance in complex industrial systems requires information manipulation at different system levels. The paper shows how different models of same subsystems, but using different quality of information/data, are used for fault diagnosis as well as robust control design...

  16. Towards high performance in industrial refrigeration systems

    DEFF Research Database (Denmark)

    Thybo, C.; Izadi-Zamanabadi, R.; Niemann, Hans Henrik

    2002-01-01

    Achieving high performance in complex industrial systems requires information manipulation at different system levels. The paper shows how different models of same subsystems, but using different quality of information/data, are used for fault diagnosis as well as robust control design...

  17. Validated high performance liquid chromatographic (HPLC) method ...

    African Journals Online (AJOL)

    2010-02-22

    Feb 22, 2010 ... specific and accurate high performance liquid chromatographic method for determination of ZER in micro-volumes ... traditional medicine as a cure for swelling, sores, loss of appetite and ... Receptor Activator for Nuclear Factor κB Ligand ... The effect of ... be suitable for preclinical pharmacokinetic studies.

  18. Validated High Performance Liquid Chromatography Method for ...

    African Journals Online (AJOL)

    Purpose: To develop a simple, rapid and sensitive high performance liquid ... response, tailing factor and resolution of six replicate injections was < 3 %. ... Cefadroxil monohydrate, Human plasma, Pharmacokinetics, Bioequivalence ... Drug-free plasma was obtained from the local ... Influence of probenecid on the renal ...

  19. High-performance OPCPA laser system

    International Nuclear Information System (INIS)

    Zuegel, J.D.; Bagnoud, V.; Bromage, J.; Begishev, I.A.; Puth, J.

    2006-01-01

    Optical parametric chirped-pulse amplification (OPCPA) is ideally suited for amplifying ultra-fast laser pulses since it provides broadband gain across a wide range of wavelengths without many of the disadvantages of regenerative amplification. A high-performance OPCPA system has been demonstrated as a prototype for the front end of the OMEGA Extended Performance (EP) Laser System. (authors)

  20. High-performance OPCPA laser system

    Energy Technology Data Exchange (ETDEWEB)

    Zuegel, J.D.; Bagnoud, V.; Bromage, J.; Begishev, I.A.; Puth, J. [Rochester Univ., Lab. for Laser Energetics, NY (United States)

    2006-06-15

    Optical parametric chirped-pulse amplification (OPCPA) is ideally suited for amplifying ultra-fast laser pulses since it provides broadband gain across a wide range of wavelengths without many of the disadvantages of regenerative amplification. A high-performance OPCPA system has been demonstrated as a prototype for the front end of the OMEGA Extended Performance (EP) Laser System. (authors)

  1. Comparing Dutch and British high performing managers

    NARCIS (Netherlands)

    Waal, A.A. de; Heijden, B.I.J.M. van der; Selvarajah, C.; Meyer, D.

    2016-01-01

    National cultures have a strong influence on the performance of organizations and should be taken into account when studying the traits of high performing managers. At the same time, many studies that focus upon the attributes of successful managers show that there are attributes that are similar

  2. Project materials [Commercial High Performance Buildings Project

    Energy Technology Data Exchange (ETDEWEB)

    None

    2001-01-01

    The Consortium for High Performance Buildings (ChiPB) is an outgrowth of DOE's Commercial Whole Buildings Roadmapping initiatives. It is a team-driven public/private partnership that seeks to enable and demonstrate the benefits of buildings that are designed, built and operated to be energy efficient, environmentally sustainable, superior quality, and cost effective.

  3. High performance structural ceramics for nuclear industry

    International Nuclear Information System (INIS)

    Pujari, Vimal K.; Faker, Paul

    2006-01-01

    A family of Saint-Gobain structural ceramic materials and products produced by its High Performance Refractory Division is described. Over the last fifty years or so, Saint-Gobain has been a leader in developing novel non-oxide-ceramic-based materials, processes and products for application in the Nuclear, Chemical, Automotive, Defense and Mining industries

  4. A new high performance current transducer

    International Nuclear Information System (INIS)

    Tang Lijun; Lu Songlin; Li Deming

    2003-01-01

    A DC to 100 kHz current transducer has been developed using a new technique based on the zero-flux detection principle. It is shown that the new current transducer offers high performance, that its magnetic core need not be selected very stringently, and that it is easy to manufacture

  5. Recent results in nuclear astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Coc, Alain; Kiener, Juergen [CNRS/IN2P3 et Universite Paris Sud 11, UMR 8609, Centre de Sciences Nucleaires et de Sciences de la Matiere (CSNSM), Orsay Campus (France); Hammache, Fairouz [CNRS/IN2P3 et Universite Paris Sud 11, UMR 8608, Institut de Physique Nucleaire d' Orsay (IPNO), Orsay Campus (France)

    2015-03-01

    In this review, we emphasize the interplay between astrophysical observations, modeling, and nuclear physics laboratory experiments. Several important nuclear cross sections for astrophysics have long been identified, e.g., ¹²C(α, γ)¹⁶O for stellar evolution, or ¹³C(α, n)¹⁶O and ²²Ne(α, n)²⁵Mg as neutron sources for the s-process. More recently, observations of lithium abundances in the oldest stars, or of nuclear gamma-ray lines from space, have required new laboratory experiments. New evaluation of thermonuclear reaction rates now includes the associated rate uncertainties that are used in astrophysical models to i) estimate final uncertainties on nucleosynthesis yields and ii) identify those reactions that require further experimental investigation. Sometimes direct cross section measurements are possible, but more generally the use of indirect methods is compulsory in view of the very low cross sections. Non-thermal processes are often overlooked but are also important for nuclear astrophysics, e.g., in gamma-ray emission from solar flares or in the interaction of cosmic rays with matter, and also motivate laboratory experiments. Finally, we show that beyond the historical motivations of nuclear astrophysics, understanding i) the energy sources that drive stellar evolution and ii) the origin of the elements can also be used to give new insights into physics beyond the standard model. (orig.)

  6. Recent results in nuclear astrophysics

    International Nuclear Information System (INIS)

    Coc, Alain; Kiener, Juergen; Hammache, Fairouz

    2015-01-01

    In this review, we emphasize the interplay between astrophysical observations, modeling, and nuclear physics laboratory experiments. Several important nuclear cross sections for astrophysics have long been identified, e.g., ¹²C(α, γ)¹⁶O for stellar evolution, or ¹³C(α, n)¹⁶O and ²²Ne(α, n)²⁵Mg as neutron sources for the s-process. More recently, observations of lithium abundances in the oldest stars, or of nuclear gamma-ray lines from space, have required new laboratory experiments. New evaluation of thermonuclear reaction rates now includes the associated rate uncertainties that are used in astrophysical models to i) estimate final uncertainties on nucleosynthesis yields and ii) identify those reactions that require further experimental investigation. Sometimes direct cross section measurements are possible, but more generally the use of indirect methods is compulsory in view of the very low cross sections. Non-thermal processes are often overlooked but are also important for nuclear astrophysics, e.g., in gamma-ray emission from solar flares or in the interaction of cosmic rays with matter, and also motivate laboratory experiments. Finally, we show that beyond the historical motivations of nuclear astrophysics, understanding i) the energy sources that drive stellar evolution and ii) the origin of the elements can also be used to give new insights into physics beyond the standard model. (orig.)
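
    As background to the thermonuclear reaction rates discussed in the two records above (a standard textbook relation, not a formula quoted from them), the Maxwellian-averaged rate per particle pair for a reaction with cross section σ(E), reduced mass μ, and temperature T is

```latex
\langle \sigma v \rangle \;=\;
  \sqrt{\frac{8}{\pi\mu}}\;\frac{1}{(k_{\mathrm B}T)^{3/2}}
  \int_{0}^{\infty} \sigma(E)\, E\,
  \exp\!\left(-\frac{E}{k_{\mathrm B}T}\right)\mathrm{d}E .
```

    The uncertainty of an evaluated rate at a given temperature therefore follows from the uncertainty of the cross section over the energy window where the integrand peaks (the Gamow window).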

  7. Art as a Vehicle for Nuclear Astrophysics

    Science.gov (United States)

    Kilburn, Micha

    2013-04-01

    One aim of the Joint Institute for Nuclear Astrophysics (JINA) is to teach K-12 students concepts and ideas related to nuclear astrophysics. For students who have not yet seen the periodic table, this can be daunting, and we often begin with astronomy concepts. The field of astronomy naturally lends itself to an art connection through its beautiful images. Our Art 2 Science programming adopts a hands-on approach by teaching astronomy through student-created art projects. This approach engages the students through tactile, visual, and spatial means. For younger students, we also include physics-based craft projects that facilitate the assimilation of problem solving skills. The arts can be useful for aural and kinesthetic learners as well. Our program also includes singing and dancing to songs with lyrics that teach physics and astronomy concepts. The Art 2 Science programming has been successfully used in after-school programs at schools, community centers, and art studios. We have even expanded the program into a popular week-long summer camp. I will discuss our methods, projects, specific goals, and survey results for JINA's Art 2 Science programs.

  8. High Performance Data Distribution for Scientific Community

    Science.gov (United States)

    Tirado, Juan M.; Higuero, Daniel; Carretero, Jesus

    2010-05-01

    Institutions such as NASA, ESA or JAXA must find solutions for distributing data from their missions to the scientific community and to their long-term archives. This is a complex problem, as it involves a vast amount of data, several geographically distributed archives, heterogeneous architectures with heterogeneous networks, and users spread around the world. We propose a novel architecture (HIDDRA) that solves this problem, aiming to reduce user intervention in data acquisition and processing. HIDDRA is a modular system that provides a highly efficient parallel multiprotocol download engine, using a publish/subscribe policy which helps the final user obtain data of interest transparently. Our system can deal simultaneously with multiple protocols (HTTP, HTTPS, FTP, and GridFTP, among others) to obtain the maximum bandwidth, reducing the workload on the data server and increasing flexibility. It can also provide high reliability and fault tolerance, as several sources of data can be used to perform one file download. The HIDDRA architecture can be arranged into a data distribution network deployed on several sites that cooperate to provide these features. HIDDRA has been cited by the 2009 e-IRG Report on Data Management as a promising initiative for data interoperability. Our first prototype has been evaluated in collaboration with the ESAC centre in Villafranca del Castillo (Spain) and shows high scalability and performance, opening a wide spectrum of opportunities. Some preliminary results have been published in the Journal of Astrophysics and Space Science [1]. [1] D. Higuero, J.M. Tirado, J. Carretero, F. Félix, and A. de La Fuente. HIDDRA: a highly independent data distribution and retrieval architecture for space observation missions. Astrophysics and Space Science, 321(3):169-175, 2009
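
    The multi-source, parallel download idea described above can be sketched as follows; this is not the HIDDRA code or API, and the mirror URLs, chunk size, and helper names are invented for illustration.

```python
# Illustrative sketch only: fetch byte ranges of one file from several mirrors
# in parallel using HTTP Range requests.
import concurrent.futures
import urllib.request

MIRRORS = [
    "https://mirror-a.example.org/data/product.fits",  # hypothetical mirrors
    "https://mirror-b.example.org/data/product.fits",
]
CHUNK = 1 << 20  # 1 MiB per request

def fetch_range(task):
    url, start, end = task
    req = urllib.request.Request(url, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(req) as resp:
        return start, resp.read()

def parallel_download(size, path):
    # Round-robin the chunks over the available mirrors.
    tasks = [
        (MIRRORS[i % len(MIRRORS)], offset, min(offset + CHUNK, size) - 1)
        for i, offset in enumerate(range(0, size, CHUNK))
    ]
    with open(path, "wb") as out, concurrent.futures.ThreadPoolExecutor() as pool:
        for start, data in pool.map(fetch_range, tasks):
            out.seek(start)
            out.write(data)

# e.g. parallel_download(total_size_in_bytes, "product.fits")
```

    Fetching byte ranges from several servers is one simple way to aggregate bandwidth across heterogeneous archives, which is the effect the record attributes to its multiprotocol engine.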

  9. Astrophysics a very short introduction

    CERN Document Server

    Binney, James

    2016-01-01

    Astrophysics is the physics of the stars, and more widely the physics of the Universe. It enables us to understand the structure and evolution of planetary systems, stars, galaxies, interstellar gas, and the cosmos as a whole. In this Very Short Introduction, the leading astrophysicist James Binney shows how the field of astrophysics has expanded rapidly in the past century, with vast quantities of data gathered by telescopes exploiting all parts of the electromagnetic spectrum, combined with the rapid advance of computing power, which has allowed increasingly effective mathematical modelling. He illustrates how the application of fundamental principles of physics - the consideration of energy, mass, and momentum - and the two pillars of relativity and quantum mechanics has provided insights into phenomena ranging from rapidly spinning millisecond pulsars to the collision of giant spiral galaxies. This is a clear, rigorous introduction to astrophysics for those keen to cut their teeth on a conceptual trea...

  10. Magnetohydrodynamic models of astrophysical jets

    International Nuclear Information System (INIS)

    Beskin, Vasily S

    2010-01-01

    In this review, analytical results obtained for a wide class of stationary axisymmetric flows in the vicinity of compact astrophysical objects are analyzed, with an emphasis on quantitative predictions for specific sources. Recent years have witnessed a great increase in understanding the formation and properties of astrophysical jets. This is due not only to new observations but also to advances in analytical theory which has produced fairly simple relations, and to what can undoubtedly be called a breakthrough in numerical simulation which has enabled confirmation of theoretical predictions. Of course, we are still very far from fully understanding the physical processes occurring in compact sources. Nevertheless, the progress made raises hopes for near-future test observations that can give insight into the physical processes occurring in active astrophysical objects. (reviews of topical problems)

  11. High Energy Density Laboratory Astrophysics

    CERN Document Server

    Lebedev, Sergey V

    2007-01-01

    During the past decade, research teams around the world have developed astrophysics-relevant research utilizing high energy-density facilities such as intense lasers and z-pinches. Every two years, at the International Conference on High Energy Density Laboratory Astrophysics, scientists interested in this emerging field discuss the progress in topics covering: stellar evolution, stellar envelopes, opacities, radiation transport; planetary interiors, high-pressure EOS, dense plasma atomic physics; supernovae, gamma-ray bursts, exploding systems, strong shocks, turbulent mixing; supernova remnants, shock processing, radiative shocks; astrophysical jets, high-Mach-number flows, magnetized radiative jets, magnetic reconnection; compact object accretion disks, x-ray photoionized plasmas; and ultrastrong fields, particle acceleration, collisionless shocks. These proceedings cover many of the invited and contributed papers presented at the 6th International Conference on High Energy Density Laboratory Astrophys...

  12. EDITORIAL: Focus on Visualization in Physics FOCUS ON VISUALIZATION IN PHYSICS

    Science.gov (United States)

    Sanders, Barry C.; Senden, Tim; Springel, Volker

    2008-12-01

    Advances in physics are intimately connected with developments in new technology: the telescope, precision clocks, even the computer have all heralded a shift in thinking. These landmark developments open new opportunities, accelerating research and in turn creating new scientific directions. These technological drivers often correspond to new instruments, but might equally well flag a new mathematical tool, an algorithm or even a means to visualize physics in a new way. Early on in this twenty-first century, scientific communities are just starting to explore the potential of digital visualization. Whether visualization is used to represent and communicate complex concepts, or to understand and interpret experimental data, or to visualize solutions to complex dynamical equations, the basic tools of visualization are shared in each of these applications and implementations. High-performance computing exemplifies the integration of visualization with leading research. Visualization is an indispensable tool for analyzing and interpreting complex three-dimensional dynamics as well as for diagnosing numerical problems in intricate parallel calculation algorithms. The effectiveness of visualization arises by exploiting the unmatched capability of the human eye and visual cortex to process the large information content of images. In a brief glance, we recognize patterns or identify subtle features even in noisy data, something that is difficult or impossible to achieve with more traditional forms of data analysis. Importantly, visualizations guide the intuition of researchers and help to comprehend physical phenomena that lie far outside of direct experience. In fact, visualizations literally allow us to see what would otherwise remain completely invisible. For example, artificial imagery created to visualize the distribution of dark matter in the Universe has been instrumental in developing the notion of a cosmic web, and in helping to establish the current standard model of
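
    In the spirit of the editorial above, a minimal particle-distribution visualization might look like the following sketch; the data are mock numpy random draws, not output from any simulation.

```python
# Illustrative sketch: render a mock 3D particle distribution so spatial
# structure (two "halos" over a diffuse background) becomes visible.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
halo1 = rng.normal(loc=(0.3, 0.3, 0.5), scale=0.05, size=(2000, 3))
halo2 = rng.normal(loc=(0.7, 0.6, 0.4), scale=0.08, size=(3000, 3))
background = rng.uniform(0.0, 1.0, size=(1000, 3))
points = np.vstack([halo1, halo2, background])

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.scatter(points[:, 0], points[:, 1], points[:, 2], s=1, alpha=0.3)
ax.set_xlabel("x")
ax.set_ylabel("y")
ax.set_zlabel("z")
plt.show()
```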

  13. Advances in astronomy and astrophysics

    CERN Document Server

    Kopal, Zdenek

    1963-01-01

    Advances in Astronomy and Astrophysics, Volume 2 brings together numerous research works on different aspects of astronomy and astrophysics. This volume is composed of six chapters and begins with a summary of observational record on twilight extensions of the Venus cusps. The next chapter deals with the common and related properties of binary stars, with emphasis on the evaluation of their cataclysmic variables. Cataclysmic variables refer to an object in one of three classes: dwarf nova, nova, or supernova. These topics are followed by discussions on the eclipse phenomena and the eclipses i

  14. Advances in astronomy and astrophysics

    CERN Document Server

    Kopal, Zdenek

    1962-01-01

    Advances in Astronomy and Astrophysics, Volume 1 brings together numerous research works on different aspects of astronomy and astrophysics. This book is divided into five chapters and begins with an observational summary of the shock-wave theory of novae. The subsequent chapter provides the properties and problems of T tauri stars and related objects. These topics are followed by discussions on the structure and origin of meteorites and cosmic dust, as well as the models for evaluation of mass distribution in oblate stellar systems. The final chapter describes the methods of polarization mea

  15. Advances in astronomy and astrophysics

    CERN Document Server

    Kopal, Zdenek

    1966-01-01

    Advances in Astronomy and Astrophysics, Volume 4 brings together numerous research works on different aspects of astronomy and astrophysics. This volume is composed of five chapters, and starts with a description of objective prism and its application in space observations. The next chapter deals with the possibilities of deriving reliable models of the figure, density distribution, and gravity field of the Moon based on data obtained through Earth-bound telescopes. These topics are followed by a discussion on the ideal partially relativistic, partially degenerate gas in an exact manner. A ch

  16. Advanced LIGO: sources and astrophysics

    International Nuclear Information System (INIS)

    Creighton, Teviet

    2003-01-01

    Second-generation detectors in LIGO will take us from the discovery phase of gravitational-wave observations to the phase of true gravitational-wave astrophysics, with hundreds or thousands of potential sources. This paper surveys the most likely and interesting potential sources for Advanced LIGO, and the astrophysical processes that each one will probe. I conclude that binary inspiral signals are expected, while continuous signals from pulsars are plausible but not guaranteed. Other sources, such as core-collapse bursts, cosmic strings and primordial stochastic backgrounds, are speculative sources for Advanced LIGO, but also potentially the most interesting, since they push the limits of our theoretical knowledge

  17. Nuclear astrophysics away from stability

    International Nuclear Information System (INIS)

    Mathews, G.J.; Howard, W.M.; Takahashi, K.; Ward, R.A.

    1985-08-01

    Explosive astrophysical environments invariably lead to the production of nuclei away from stability. An understanding of the dynamics and nucleosynthesis in such environments is inextricably coupled to an understanding of the properties of the synthesized nuclei. In this talk a review is presented of the basic explosive nucleosynthesis mechanisms (s-process, r-process, n-process, p-process, and rp-process). Specific stellar model calculations are discussed and a summary of the pertinent nuclear data is presented. Possible experiments and nuclear-model calculations are suggested that could facilitate a better understanding of the astrophysical scenarios. 39 refs., 4 figs

  18. White Paper on Nuclear Astrophysics

    OpenAIRE

    Arcones, Almudena; Bardayan, Dan W.; Beers, Timothy C.; Berstein, Lee A.; Blackmon, Jeffrey C.; Messer, Bronson; Brown, B. Alex; Brown, Edward F.; Brune, Carl R.; Champagne, Art E.; Chieffi, Alessandro; Couture, Aaron J.; Danielewicz, Pawel; Diehl, Roland; El-Eid, Mounib

    2016-01-01

    This white paper informs the nuclear astrophysics community and funding agencies about the scientific directions and priorities of the field and provides input from this community for the 2015 Nuclear Science Long Range Plan. It summarizes the outcome of the nuclear astrophysics town meeting that was held on August 21-23, 2014 in College Station at the campus of Texas A&M University in preparation of the NSAC Nuclear Science Long Range Plan. It also reflects the outcome of an earlier town mee...

  19. Nuclear astrophysics lessons from INTEGRAL.

    Science.gov (United States)

    Diehl, Roland

    2013-02-01

    Measurements of high-energy photons from cosmic sources of nuclear radiation through ESA's INTEGRAL mission have advanced our knowledge: new data with high spectral resolution showed that characteristic gamma-ray lines from radioactive decays occur throughout the Galaxy in its interstellar medium. Although the number of detected sources and often the significance of the astrophysical results remain modest, conclusions derived from this unique astronomical window of radiation originating from nuclear processes are important, complementing the widely-employed atomic-line based spectroscopy. We review the results and insights obtained in the past decade from gamma-ray line measurements of cosmic sources in the context of their astrophysical questions.

  20. Advances in astronomy and astrophysics

    CERN Document Server

    Kopal, Zdenek

    1968-01-01

    Advances in Astronomy and Astrophysics, Volume 6 brings together numerous research works on different aspects of astronomy and astrophysics. This volume is composed of five chapters, and starts with the description of improved methods for analyzing and classifying families of periodic orbits in a conservative dynamical system with two degrees of freedom. The next chapter describes the variation of fractional luminosity of distorted components of close binary systems in the course of their revolution, or the accompanying changes in radial velocity. This topic is followed by discussions on vari

  1. Nuclear astrophysics data at ORNL

    International Nuclear Information System (INIS)

    Smith, M.S.; Blackmon, J.C.

    1998-01-01

    There is a new program of evaluation and dissemination of nuclear data of critical importance for nuclear astrophysics within the Physics Division of Oak Ridge National Laboratory. Recent activities include determining the rates of the important ¹⁴O(α,p)¹⁷F and ¹⁷F(p,γ)¹⁸Ne reactions, disseminating the Caughlan and Fowler reaction rate compilation on the World Wide Web, and evaluating the ¹⁷O(p,α)¹⁴N reaction rate. These projects, which are closely coupled to current ORNL nuclear astrophysics research, are briefly discussed along with future plans

  2. Evaluation of high-performance computing software

    Energy Technology Data Exchange (ETDEWEB)

    Browne, S.; Dongarra, J. [Univ. of Tennessee, Knoxville, TN (United States); Rowan, T. [Oak Ridge National Lab., TN (United States)

    1996-12-31

    The absence of unbiased and up-to-date comparative evaluations of high-performance computing software complicates a user's search for the appropriate software package. The National HPCC Software Exchange (NHSE) is attacking this problem using an approach that includes independent evaluations of software, incorporation of author and user feedback into the evaluations, and Web access to the evaluations. We are applying this approach to the Parallel Tools Library (PTLIB), a new software repository for parallel systems software and tools, and HPC-Netlib, a high performance branch of the Netlib mathematical software repository. Updating the evaluations with feedback and making them available via the Web helps ensure accuracy and timeliness, and using independent reviewers produces unbiased comparative evaluations difficult to find elsewhere.

  3. Architecting Web Sites for High Performance

    Directory of Open Access Journals (Sweden)

    Arun Iyengar

    2002-01-01

    Web site applications are some of the most challenging high-performance applications currently being developed and deployed. The challenges emerge from the specific combination of high variability in workload characteristics and of high performance demands regarding the service level, scalability, availability, and costs. In recent years, a large body of research has addressed the Web site application domain, and a host of innovative software and hardware solutions have been proposed and deployed. This paper is an overview of recent solutions concerning the architectures and the software infrastructures used in building Web site applications. The presentation emphasizes three of the main functions in a complex Web site: the processing of client requests, the control of service levels, and the interaction with remote network caches.
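
    One of the request-processing techniques surveyed above, response caching, can be sketched minimally as follows; render_page and handle_request are hypothetical stand-ins, not functions from the article.

```python
# Minimal illustration of in-process response caching for repeated requests.
from functools import lru_cache

@lru_cache(maxsize=1024)
def render_page(path: str) -> str:
    # Stand-in for an expensive dynamic page build (templates, DB queries, ...).
    return f"<html><body>content for {path}</body></html>"

def handle_request(path: str) -> str:
    # Repeated requests for the same path are served from the cache.
    return render_page(path)

if __name__ == "__main__":
    for p in ["/home", "/home", "/about"]:
        handle_request(p)
    print(render_page.cache_info())  # shows hits vs. misses
```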

  4. High performance cloud auditing and applications

    CERN Document Server

    Choi, Baek-Young; Song, Sejun

    2014-01-01

    This book mainly focuses on cloud security and high performance computing for cloud auditing. The book discusses emerging challenges and techniques developed for high performance semantic cloud auditing, and presents the state of the art in cloud auditing, computing and security techniques with focus on technical aspects and feasibility of auditing issues in federated cloud computing environments.   In summer 2011, the United States Air Force Research Laboratory (AFRL) CyberBAT Cloud Security and Auditing Team initiated the exploration of the cloud security challenges and future cloud auditing research directions that are covered in this book. This work was supported by the United States government funds from the Air Force Office of Scientific Research (AFOSR), the AFOSR Summer Faculty Fellowship Program (SFFP), the Air Force Research Laboratory (AFRL) Visiting Faculty Research Program (VFRP), the National Science Foundation (NSF) and the National Institute of Health (NIH). All chapters were partially suppor...

  5. Monitoring SLAC High Performance UNIX Computing Systems

    International Nuclear Information System (INIS)

    Lettsome, Annette K.

    2005-01-01

    Knowledge of the effectiveness and efficiency of computers is important when working with high performance systems. Monitoring such systems is advantageous in order to foresee possible misfortunes or system failures. Ganglia is a software system designed for high performance computing systems to retrieve specific monitoring information. An alternative storage facility for Ganglia's collected data is needed, since its default storage system, the round-robin database (RRD), struggles with data integrity. The creation of a script-driven MySQL database solves this dilemma. This paper describes the process taken in the creation and implementation of the MySQL database for use by Ganglia. Comparisons between data storage by both databases are made using gnuplot and Ganglia's real-time graphical user interface
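
    A rough sketch of the script-driven approach described above: poll a gmond XML endpoint and store the samples in a relational table. SQLite stands in for MySQL here, and the port, host, and schema are assumptions rather than SLAC's actual configuration.

```python
# Illustrative only: read Ganglia gmond XML and store metric samples in a table.
import socket
import sqlite3
import time
import xml.etree.ElementTree as ET

GMOND_HOST, GMOND_PORT = "localhost", 8649  # assumed gmond XML endpoint

def read_gmond_xml(host, port):
    with socket.create_connection((host, port), timeout=5) as sock:
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks)

def store_metrics(xml_bytes, db_path="ganglia.db"):
    root = ET.fromstring(xml_bytes)
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS metrics (ts REAL, host TEXT, name TEXT, value TEXT)"
    )
    now = time.time()
    for host in root.iter("HOST"):
        for metric in host.iter("METRIC"):
            con.execute(
                "INSERT INTO metrics VALUES (?, ?, ?, ?)",
                (now, host.get("NAME"), metric.get("NAME"), metric.get("VAL")),
            )
    con.commit()
    con.close()
```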

  6. High-performance phase-field modeling

    KAUST Repository

    Vignal, Philippe; Sarmiento, Adel; Cortes, Adriano Mauricio; Dalcin, L.; Collier, N.; Calo, Victor M.

    2015-01-01

    and phase-field crystal equation will be presented, which corroborate the theoretical findings, and illustrate the robustness of the method. Results related to more challenging examples, namely the Navier-Stokes Cahn-Hilliard and a diffusion-reaction Cahn-Hilliard system, will also be presented. The implementation was done in PetIGA and PetIGA-MF, high-performance Isogeometric Analysis frameworks [1, 3], designed to handle non-linear, time-dependent problems.
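
    For reference only (a standard form, not an equation reproduced from the record), the Cahn-Hilliard equation that such phase-field solvers address can be written, for an order parameter φ, mobility M, and interface coefficient ε, as

```latex
\frac{\partial \phi}{\partial t}
  = \nabla \cdot \Bigl( M \,\nabla \bigl( f'(\phi) - \varepsilon^{2}\nabla^{2}\phi \bigr) \Bigr),
\qquad f(\phi) = \tfrac{1}{4}\bigl(\phi^{2}-1\bigr)^{2}.
```

    It is a fourth-order, nonlinear parabolic equation, which is one reason stable time integration and higher-order spatial discretizations, such as the isogeometric bases in PetIGA, are attractive for it.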

  7. Designing a High Performance Parallel Personal Cluster

    OpenAIRE

    Kapanova, K. G.; Sellier, J. M.

    2016-01-01

    Today, many scientific and engineering areas require high performance computing to perform computationally intensive experiments. For example, many advances in transport phenomena, thermodynamics, material properties, computational chemistry and physics are possible only because of the availability of such large scale computing infrastructures. Yet many challenges are still open. The cost of energy consumption, cooling, competition for resources have been some of the reasons why the scientifi...

  8. AHPCRC - Army High Performance Computing Research Center

    Science.gov (United States)

    2010-01-01

    Of particular interest is the ability of a distributed jamming network (DJN) to jam signals in all or part of a sensor or communications network. Army High Performance Computing Research Center, www.ahpcrc.org.

  9. Governance among Malaysian high performing companies

    Directory of Open Access Journals (Sweden)

    Asri Marsidi

    2016-07-01

    Well-performing companies have long been linked with effective governance, which is generally reflected in an effective board of directors. However, many issues concerning the attributes of an effective board of directors remain unresolved. Diversity is now perceived as able to influence corporate performance, owing to the likelihood of meeting the variety of needs and demands of diverse customers and clients. The study therefore aims to provide a fundamental understanding of governance among high performing companies in Malaysia.

  10. DURIP: High Performance Computing in Biomathematics Applications

    Science.gov (United States)

    2017-05-10

    The goal of this award was to enhance the capabilities of the Department of Applied Mathematics and Statistics (AMS) at the University of California, Santa Cruz (UCSC) to conduct research and research-related education in areas of high performance computing in biomathematics applications.

  11. High Performance Computing Operations Review Report

    Energy Technology Data Exchange (ETDEWEB)

    Cupps, Kimberly C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-19

    The High Performance Computing Operations Review (HPCOR) meeting—requested by the ASC and ASCR program headquarters at DOE—was held November 5 and 6, 2013, at the Marriott Hotel in San Francisco, CA. The purpose of the review was to discuss the processes and practices for HPC integration and its related software and facilities. Experiences and lessons learned from the most recent systems deployed were covered in order to benefit the deployment of new systems.

  12. Planning for high performance project teams

    International Nuclear Information System (INIS)

    Reed, W.; Keeney, J.; Westney, R.

    1997-01-01

    Both industry-wide research and corporate benchmarking studies confirm the significant savings in cost and time that result from early planning of a project. Amoco's Team Planning Workshop combines long-term strategic project planning and short-term tactical planning with team building to provide the basis for high performing project teams, better project planning, and effective implementation of the Amoco Common Process for managing projects

  13. vSphere high performance cookbook

    CERN Document Server

    Sarkar, Prasenjit

    2013-01-01

    vSphere High Performance Cookbook is written in a practical, helpful style with numerous recipes focusing on answering and providing solutions to common, and not-so-common, performance issues and problems. The book is primarily written for technical professionals with system administration skills and some VMware experience who wish to learn about advanced optimization and the configuration features and functions for vSphere 5.1.

  14. High performance work practices, innovation and performance

    DEFF Research Database (Denmark)

    Jørgensen, Frances; Newton, Cameron; Johnston, Kim

    2013-01-01

    Research spanning nearly 20 years has provided considerable empirical evidence for relationships between High Performance Work Practices (HPWPs) and various measures of performance including increased productivity, improved customer service, and reduced turnover. What stands out from ... and Africa to examine these various questions relating to the HPWP-innovation-performance relationship. Each paper discusses a practice that has been identified in the HPWP literature and potential variables that can facilitate or hinder the effects of these practices on innovation and performance...

  15. High Performance Electronics on Flexible Silicon

    KAUST Repository

    Sevilla, Galo T.

    2016-09-01

    Over the last few years, flexible electronic systems have gained increased attention from researchers around the world because of their potential to create new applications such as flexible displays, flexible energy harvesters, artificial skin, and health monitoring systems that cannot be integrated with conventional wafer-based complementary metal oxide semiconductor processes. Most of the current efforts to create flexible high performance devices are based on the use of organic semiconductors. However, inherent material limitations make them unsuitable for big data processing and high speed communications. The objective of my doctoral dissertation is to develop integration processes that allow the transformation of rigid high performance electronics into flexible ones while maintaining their performance and cost. In this work, two different techniques to transform inorganic complementary metal-oxide-semiconductor electronics into flexible ones have been developed using industry compatible processes. Furthermore, these techniques were used to realize flexible discrete devices and circuits which include metal-oxide-semiconductor field-effect-transistors, the first demonstration of flexible Fin-field-effect-transistors, and metal-oxide-semiconductor-based circuits. Finally, this thesis presents a new technique to package, integrate, and interconnect flexible high performance electronics using low cost additive manufacturing techniques such as 3D printing and inkjet printing. This thesis contains in-depth studies on the electrical, mechanical, and thermal properties of the fabricated devices.

  16. Computational Biology and High Performance Computing 2000

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  17. ADVANCED HIGH PERFORMANCE SOLID WALL BLANKET CONCEPTS

    International Nuclear Information System (INIS)

    WONG, CPC; MALANG, S; NISHIO, S; RAFFRAY, R; SAGARA, S

    2002-01-01

    First wall and blanket (FW/blanket) design is a crucial element in the performance and acceptance of a fusion power plant. High temperature structural and breeding materials are needed for high thermal performance. A suitable combination of structural design with the selected materials is necessary for D-T fuel sufficiency. Whenever possible, low afterheat, low chemical reactivity and low activation materials are desired to achieve passive safety and minimize the amount of high-level waste. Of course, the selected fusion FW/blanket design will have to match the operational scenarios of high performance plasma. The key characteristics of eight advanced high performance FW/blanket concepts are presented in this paper. Design configurations, performance characteristics, unique advantages and issues are summarized. All reviewed designs can satisfy most of the necessary design goals. For further development, in concert with the advancement in plasma control and scrape off layer physics, additional emphasis will be needed in the areas of first wall coating material selection, design of plasma stabilization coils, consideration of reactor startup and transient events. To validate the projected performance of the advanced FW/blanket concepts, the critical element is the need for 14 MeV neutron irradiation facilities for the generation of necessary engineering design data and the prediction of FW/blanket components' lifetime and availability.

  18. Statistical learning in high energy and astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J.

    2005-06-16

    This thesis studies the performance of statistical learning methods in high energy and astrophysics where they have become a standard tool in physics analysis. They are used to perform complex classification or regression by intelligent pattern recognition. This kind of artificial intelligence is achieved by the principle "learning from examples": the examples describe the relationship between detector events and their classification. The application of statistical learning methods is either motivated by the lack of knowledge about this relationship or by tight time restrictions. In the first case, learning from examples is the only possibility, since no theory is available that would allow one to build an algorithm in the classical way. In the second case, a classical algorithm exists but is too slow to cope with the time restrictions. It is therefore replaced by a pattern recognition machine which implements a fast statistical learning method. But even in applications where some kind of classical algorithm had done a good job, statistical learning methods have convinced through their remarkable performance. This thesis gives an introduction to statistical learning methods and how they are applied correctly in physics analysis. Their flexibility and high performance will be discussed by showing intriguing results from high energy and astrophysics. These include the development of highly efficient triggers, powerful purification of event samples and exact reconstruction of hidden event parameters. The presented studies also show typical problems in the application of statistical learning methods. They should only be the second choice in all cases where an algorithm based on prior knowledge exists. Some examples in physics analyses are found where these methods are not used in the right way, leading either to wrong predictions or to bad performance. Physicists also often hesitate to profit from these methods because they fear that statistical learning methods cannot
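
    As a concrete, deliberately simplified illustration of the "learning from examples" principle described above, the sketch below classifies detector events as signal or background with a nearest-centroid rule. It is not taken from the thesis; the two features, the numerical values and the centroid rule are invented for illustration, whereas real analyses use far more powerful learners such as neural networks or boosted decision trees.

        #include <array>
        #include <iostream>
        #include <vector>

        // Each event is described by two detector features (invented for the example).
        using Event = std::array<double, 2>;

        // "Learning" step: summarize the labelled examples of one class by their centroid.
        Event centroid(const std::vector<Event>& events) {
            Event c{0.0, 0.0};
            for (const auto& e : events) { c[0] += e[0]; c[1] += e[1]; }
            c[0] /= events.size();
            c[1] /= events.size();
            return c;
        }

        // Squared Euclidean distance between two events in feature space.
        double dist2(const Event& a, const Event& b) {
            return (a[0] - b[0]) * (a[0] - b[0]) + (a[1] - b[1]) * (a[1] - b[1]);
        }

        int main() {
            // Labelled training examples: {feature1, feature2}.
            std::vector<Event> signal     = {{2.9, 1.1}, {3.1, 0.9}, {3.0, 1.0}};
            std::vector<Event> background = {{1.0, 2.0}, {0.8, 2.2}, {1.2, 1.8}};

            Event cs = centroid(signal);
            Event cb = centroid(background);

            // "Prediction" step: assign an unlabelled event to the nearer centroid.
            Event candidate{2.7, 1.2};
            bool isSignal = dist2(candidate, cs) < dist2(candidate, cb);
            std::cout << (isSignal ? "signal" : "background") << "\n";
        }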

  19. Statistical learning in high energy and astrophysics

    International Nuclear Information System (INIS)

    Zimmermann, J.

    2005-01-01

    This thesis studies the performance of statistical learning methods in high energy and astrophysics where they have become a standard tool in physics analysis. They are used to perform complex classification or regression by intelligent pattern recognition. This kind of artificial intelligence is achieved by the principle "learning from examples": the examples describe the relationship between detector events and their classification. The application of statistical learning methods is either motivated by the lack of knowledge about this relationship or by tight time restrictions. In the first case, learning from examples is the only possibility, since no theory is available that would allow one to build an algorithm in the classical way. In the second case, a classical algorithm exists but is too slow to cope with the time restrictions. It is therefore replaced by a pattern recognition machine which implements a fast statistical learning method. But even in applications where some kind of classical algorithm had done a good job, statistical learning methods have convinced through their remarkable performance. This thesis gives an introduction to statistical learning methods and how they are applied correctly in physics analysis. Their flexibility and high performance will be discussed by showing intriguing results from high energy and astrophysics. These include the development of highly efficient triggers, powerful purification of event samples and exact reconstruction of hidden event parameters. The presented studies also show typical problems in the application of statistical learning methods. They should only be the second choice in all cases where an algorithm based on prior knowledge exists. Some examples in physics analyses are found where these methods are not used in the right way, leading either to wrong predictions or to bad performance. Physicists also often hesitate to profit from these methods because they fear that statistical learning methods cannot be controlled in a

  20. Journal of Astrophysics and Astronomy

    Indian Academy of Sciences (India)

    27

    Indian Institute of Astrophysics, Koramangala 2nd Block, Bangalore. 560034, India .... the hydrogen rich thermosphere so significantly that the internal energy of the gas becomes greater than the gravitational potential energy. This leads ... way greenhouse, water vapor would reach the stratosphere where it would.

  1. Journal of Astrophysics and Astronomy

    Indian Academy of Sciences (India)

    65

    Northern IMF as simulated by PIC code in parallel with MHD model-Journal of Astrophysics ... The global structure of the collisionless bow shock was investigated by ..... international research community, access to modern space science simulations. ......

  2. An introduction to nuclear astrophysics

    International Nuclear Information System (INIS)

    Norman, E.B.

    1987-09-01

    The role of nuclear reactions in astrophysics is described. Stellar energy generation and heavy element nucleosynthesis are explained in terms of specific sequences of charged-particle and neutron induced reactions. The evolution and final states of stars are examined. 20 refs., 11 figs., 2 tabs.

  3. Nuclear astrophysics of light nuclei

    DEFF Research Database (Denmark)

    Fynbo, Hans Otto Uldall

    2013-01-01

    A review of nuclear astrophysics of light nuclei using radioactive beams or techniques developed for radioactive beams is given. We discuss Big Bang nucleosynthesis, with special focus on the lithium problem, aspects of neutrino-physics, helium-burning and finally selected examples of studies...

  4. Astronomy & Astrophysics: an international journal

    Science.gov (United States)

    Bertout, C.

    2011-07-01

    After a brief historical introduction, we review the scope, editorial process, and production organization of A&A, one of the leading journals worldwide dedicated to publishing the results of astrophysical research. We then briefly discuss the economic model of the Journal and some current issues in scientific publishing.

  5. Compressed Baryonic Matter of Astrophysics

    OpenAIRE

    Guo, Yanjun; Xu, Renxin

    2013-01-01

    Baryonic matter in the core of a massive and evolved star is compressed significantly to form a supra-nuclear object, and compressed baryonic matter (CBM) is then produced after the supernova. The state of cold matter at a few times nuclear density is pedagogically reviewed, with significant attention paid to a possible quark-cluster state conjectured from an astrophysical point of view.

  6. Electric Currents along Astrophysical Jets

    Directory of Open Access Journals (Sweden)

    Ioannis Contopoulos

    2017-10-01

    Astrophysical black holes and their surrounding accretion disks are believed to be threaded by grand design helical magnetic fields. There is strong theoretical evidence that the main driver of their winds and jets is the Lorentz force generated by these fields and their associated electric currents. Several researchers have reported direct evidence for large scale electric currents along astrophysical jets. Quite unexpectedly, their directions are not random as would have been the case if the magnetic field were generated by a magnetohydrodynamic dynamo. Instead, in all kpc-scale detections, the inferred electric currents are found to flow away from the galactic nucleus. This unexpected break of symmetry suggests that a battery mechanism is operating around the central black hole. In the present article, we summarize observational evidence for the existence of large scale electric currents and their associated grand design helical magnetic fields in kpc-scale astrophysical jets. We also present recent results of general relativistic radiation magnetohydrodynamic simulations which show the action of the Cosmic Battery in the vicinity of astrophysical black holes.

  7. Astrophysics at very high energies

    International Nuclear Information System (INIS)

    Aharonian, Felix; Bergstroem, Lars; Dermer, Charles

    2013-01-01

    Presents three complementary lectures on very-high-energy astrophysics given by worldwide leaders in the field. Reviews the recent advances in and prospects of gamma-ray astrophysics and of multi-messenger astronomy. Prepares readers for using space and ground-based gamma-ray observatories, as well as neutrino and other multi-messenger detectors. With the success of Cherenkov Astronomy and more recently with the launch of NASA's Fermi mission, very-high-energy astrophysics has undergone a revolution in recent years. This book provides three comprehensive and up-to-date reviews of the recent advances in gamma-ray astrophysics and of multi-messenger astronomy. Felix Aharonian and Charles Dermer address our current knowledge on the sources of GeV and TeV photons, gleaned from the precise measurements made by the new instrumentation. Lars Bergstroem presents the challenges and prospects of astro-particle physics with a particular emphasis on the detection of dark matter candidates. The topics covered by the 40th Saas-Fee Course present the capabilities of current instrumentation and the physics at play in sources of very-high-energy radiation to students and researchers alike. This book will encourage and prepare readers for using space and ground-based gamma-ray observatories, as well as neutrino and other multi-messenger detectors.

  8. Indirect methods in nuclear astrophysics

    International Nuclear Information System (INIS)

    Bertulani, C.A.; Shubhchintak; Mukhamedzhanov, A.; Kadyrov, A. S.; Kruppa, A.; Pang, D. Y.

    2016-01-01

    We discuss recent developments in indirect methods used in nuclear astrophysics to determine the capture cross sections and subsequent rates of various stellar burning processes, when it is difficult to perform the corresponding direct measurements. We discuss, in brief, the basic concepts of Asymptotic Normalization Coefficients, the Trojan Horse Method, the Coulomb Dissociation Method, (d,p) reactions, and charge-exchange reactions. (paper)

  9. Astrophysics on the Lab Bench

    Science.gov (United States)

    Hughes, Stephen W.

    2010-01-01

    In this article some basic laboratory bench experiments are described that are useful for teaching high school students some of the basic principles of stellar astrophysics. For example, in one experiment, students slam a plastic water-filled bottle down onto a bench, ejecting water towards the ceiling, illustrating the physics associated with a…

  10. Nuclear astrophysics: An application of nuclear physics

    International Nuclear Information System (INIS)

    Fueloep, Z.

    2005-01-01

    Nuclear astrophysics, a fruitful combination of nuclear physics and astrophysics, can be viewed as a special application of nuclear physics where the study of nuclei and their reactions is motivated by astrophysical problems. Nuclear astrophysics is also a good example of state-of-the-art interdisciplinary research. The origin of elements studied by geologists is explored by astrophysicists using nuclear reaction rates provided by the nuclear physics community. Due to the high interest in the field, two recent Nuclear Physics Divisional Conferences of the European Physical Society were devoted to nuclear astrophysics and a new conference series entitled 'Nuclear Physics in Astrophysics' has been established. Selected problems of nuclear astrophysics will be presented, emphasizing the interplay between nuclear physics and astrophysics. As an example, the role of the 14N(p,γ)15O reaction rate in the determination of the age of globular clusters will be discussed in detail.

  11. International Olympiad on Astronomy and Astrophysics

    Science.gov (United States)

    Soonthornthum, B.; Kunjaya, C.

    2011-01-01

    The International Olympiad on Astronomy and Astrophysics, an annual astronomy and astrophysics competition for high school students, is described. Examples of problems and solutions from the competition are also given. (Contains 3 figures.)

  12. A simulation package for soft X-ray and EUV spectroscopy of astrophysical and laboratory plasmas in different environments

    International Nuclear Information System (INIS)

    Liang, G Y; Li, F; Wang, F L; Zhong, J Y; Zhao, G; Wu, Y

    2014-01-01

    Spectroscopic research in astronomy is strongly dependent on theoretical modelling tools, such as Chianti, Xstar, Cloudy, etc. Recently, a different research community, laboratory astrophysics, has tried to benchmark these theoretical models or to simulate astrophysical phenomena directly under conditions accessible in ground-based laboratories. The unavoidable differences between astrophysical objects and the laboratory call for a self-consistent model to bridge the two cases. We have therefore set up a visualized simulation package for soft X-ray and EUV spectroscopy in astrophysical and laboratory plasmas.

  13. Toward a theory of high performance.

    Science.gov (United States)

    Kirby, Julia

    2005-01-01

    What does it mean to be a high-performance company? The process of measuring relative performance across industries and eras, declaring top performers, and finding the common drivers of their success is such a difficult one that it might seem a fool's errand to attempt. In fact, no one did for the first thousand or so years of business history. The question didn't even occur to many scholars until Tom Peters and Bob Waterman released In Search of Excellence in 1982. Twenty-three years later, we've witnessed several more attempts--and, just maybe, we're getting closer to answers. In this reported piece, HBR senior editor Julia Kirby explores why it's so difficult to study high performance and how various research efforts--including those from John Kotter and Jim Heskett; Jim Collins and Jerry Porras; Bill Joyce, Nitin Nohria, and Bruce Roberson; and several others outlined in a summary chart--have attacked the problem. The challenge starts with deciding which companies to study closely. Are the stars the ones with the highest market caps, the ones with the greatest sales growth, or simply the ones that remain standing at the end of the game? (And when's the end of the game?) Each major study differs in how it defines success, which companies it therefore declares to be worthy of emulation, and the patterns of activity and attitude it finds in common among them. Yet, Kirby concludes, as each study's method incrementally solves problems others have faced, we are progressing toward a consensus theory of high performance.

  14. Utilities for high performance dispersion model PHYSIC

    International Nuclear Information System (INIS)

    Yamazawa, Hiromi

    1992-09-01

    The description and usage of the utilities for the dispersion calculation model PHYSIC are summarized. The model was developed in a study on a high-performance SPEEDI system, with the purpose of introducing a meteorological forecast function into the environmental emergency response system. The procedure of a PHYSIC calculation consists of three steps: preparation of relevant files, creation and submission of JCL, and graphic output of results. A user can carry out the above procedure with the help of the Geographical Data Processing Utility, the Model Control Utility, and the Graphic Output Utility. (author)

  15. Playa: High-Performance Programmable Linear Algebra

    Directory of Open Access Journals (Sweden)

    Victoria E. Howle

    2012-01-01

    This paper introduces Playa, a high-level user interface layer for composing algorithms for complex multiphysics problems out of objects from other Trilinos packages. Among other features, Playa provides very high-performance overloaded operators implemented through an expression template mechanism. In this paper, we give an overview of the central Playa objects from a user's perspective, show application to a sequence of increasingly complex solver algorithms, provide timing results for Playa's overloaded operators and other functions, and briefly survey some of the implementation issues involved.
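
    As background on the expression-template mechanism mentioned above, the following minimal sketch shows how overloaded operators can build a lightweight expression object that is evaluated element by element in a single loop, avoiding temporary vectors. This is not the actual Playa API; all type and function names here are invented for illustration, and Playa additionally layers such operators over distributed Trilinos vector and solver objects.

        #include <cstddef>
        #include <iostream>
        #include <vector>

        // CRTP base so that the operators only match expression types, not arbitrary ones.
        template <class T>
        struct Expr {
            const T& self() const { return static_cast<const T&>(*this); }
        };

        struct Vec : Expr<Vec> {
            std::vector<double> data;
            explicit Vec(std::size_t n, double v = 0.0) : data(n, v) {}
            double  operator[](std::size_t i) const { return data[i]; }
            double& operator[](std::size_t i)       { return data[i]; }

            // Assigning from any expression evaluates it in one pass, with no temporaries.
            template <class E>
            Vec& operator=(const Expr<E>& e) {
                for (std::size_t i = 0; i < data.size(); ++i) data[i] = e.self()[i];
                return *this;
            }
        };

        // Expression node: elementwise sum of two sub-expressions.
        template <class L, class R>
        struct Add : Expr<Add<L, R>> {
            const L& l; const R& r;
            Add(const L& l, const R& r) : l(l), r(r) {}
            double operator[](std::size_t i) const { return l[i] + r[i]; }
        };

        // Expression node: sub-expression scaled by a constant.
        template <class E>
        struct Scale : Expr<Scale<E>> {
            double s; const E& e;
            Scale(double s, const E& e) : s(s), e(e) {}
            double operator[](std::size_t i) const { return s * e[i]; }
        };

        template <class L, class R>
        Add<L, R> operator+(const Expr<L>& l, const Expr<R>& r) { return {l.self(), r.self()}; }

        template <class E>
        Scale<E> operator*(double s, const Expr<E>& e) { return {s, e.self()}; }

        int main() {
            Vec x(5, 1.0), b(5, 2.0), y(5);
            y = 3.0 * x + b;            // captured lazily, evaluated in one loop
            std::cout << y[0] << "\n";  // prints 5
        }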

  16. An integrated high performance fastbus slave interface

    International Nuclear Information System (INIS)

    Christiansen, J.; Ljuslin, C.

    1992-01-01

    A high performance Fastbus slave interface ASIC is presented. The Fastbus slave integrated circuit (FASIC) is a programmable device, enabling its direct use in many different applications. The FASIC acts as an interface between Fastbus and a 'standard' processor/memory bus. It can work stand-alone or together with a microprocessor. A set of address mapping windows can map Fastbus addresses to convenient memory addresses and at the same time act as address decoding logic. Data rates of 100 MBytes/s to Fastbus can be obtained using an internal FIFO buffer in the FASIC. (orig.)
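
    To make the role of the address-mapping windows concrete, the sketch below shows how a small table of windows can simultaneously decode an incoming Fastbus address (deciding whether the slave responds at all) and translate it to a local processor/memory bus address. This is only an illustration under assumed conventions: the field names, widths and example addresses are invented and do not reflect the actual FASIC register layout.

        #include <cstdint>
        #include <iostream>
        #include <optional>
        #include <vector>

        // One address-mapping window: a Fastbus address range and the local base it maps to.
        struct Window {
            std::uint32_t fastbusBase;  // start of the Fastbus address range
            std::uint32_t size;         // length of the window in bytes
            std::uint32_t localBase;    // corresponding base address on the local bus
        };

        // Returns the translated local address if some window matches,
        // or no value if the slave should not respond to this address.
        std::optional<std::uint32_t> translate(const std::vector<Window>& windows,
                                               std::uint32_t fastbusAddr) {
            for (const auto& w : windows) {
                if (fastbusAddr >= w.fastbusBase && fastbusAddr < w.fastbusBase + w.size)
                    return w.localBase + (fastbusAddr - w.fastbusBase);
            }
            return std::nullopt;
        }

        int main() {
            std::vector<Window> windows = {
                {0x80000000u, 0x1000u, 0x00000000u},  // e.g. control registers
                {0x80010000u, 0x8000u, 0x00100000u},  // e.g. event buffer memory
            };
            if (auto local = translate(windows, 0x80010044u))
                std::cout << "local address 0x" << std::hex << *local << "\n";
        }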

  17. Strategy Guideline. High Performance Residential Lighting

    Energy Technology Data Exchange (ETDEWEB)

    Holton, J. [IBACOS, Inc., Pittsburgh, PA (United States)

    2012-02-01

    This report has been developed to provide a tool for the understanding and application of high performance lighting in the home. The strategies featured in this guide are drawn from recent advances in commercial lighting for application to typical spaces found in residential buildings. This guide offers strategies to greatly reduce lighting energy use through the application of high quality fluorescent and light emitting diode (LED) technologies. It is important to note that these strategies not only save energy in the home but also serve to satisfy the homeowner’s expectations for high quality lighting.

  18. The Explorer program for astronomy and astrophysics

    International Nuclear Information System (INIS)

    Savage, B.D.; Becklin, E.E.; Cassinelli, J.P.; Dupree, A.K.; Elliot, J.L.; Hoffmann, W.F.; Hudson, H.S.; Jura, M.; Kurfess, J.; Murray, S.S.

    1986-01-01

    This report was prepared to provide NASA with a strategy for proceeding with Explorer-class programs for research in space astronomy and astrophysics. The role of Explorers in astronomy and astrophysics and their past accomplishments are discussed, as are current and future astronomy and astrophysics Explorers. Specific cost needs for an effective Explorer program are considered

  19. The importance of CNO isotopes in astrophysics

    International Nuclear Information System (INIS)

    Audoze, J.

    1977-01-01

    The research into CNO isotopes in astrophysics includes many different subfields of astrophysics such as meteoritical studies, experimental and theoretical nuclear astrophysics, optical astronomy, radio astronomy, etc. The purpose of this paper is to give an overview of the topic and a guide to these different subfields. (G.T.H.)

  20. High-performance computing in seismology

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-09-01

    The scientific, technical, and economic importance of the issues discussed here presents a clear agenda for future research in computational seismology. In this way these problems will drive advances in high-performance computing in the field of seismology. There is a broad community that will benefit from this work, including the petroleum industry, research geophysicists, engineers concerned with seismic hazard mitigation, and governments charged with enforcing a comprehensive test ban treaty. These advances may also lead to new applications for seismological research. The recent application of high-resolution seismic imaging of the shallow subsurface for the environmental remediation industry is an example of this activity. This report makes the following recommendations: (1) focused efforts to develop validated documented software for seismological computations should be supported, with special emphasis on scalable algorithms for parallel processors; (2) the education of seismologists in high-performance computing technologies and methodologies should be improved; (3) collaborations between seismologists and computational scientists and engineers should be increased; (4) the infrastructure for archiving, disseminating, and processing large volumes of seismological data should be improved.

  1. Transport in JET high performance plasmas

    International Nuclear Information System (INIS)

    2001-01-01

    Two types of high performance scenarios have been produced in JET during the DTE1 campaign. One of them is the well-known and extensively used ELM-free hot-ion H-mode scenario, which has two distinct regions: the plasma core and the edge transport barrier. The results obtained during the DTE1 campaign with D, DT and pure T plasmas confirm our previous conclusion that the core transport scales as gyroBohm in the inner half of the plasma volume, recovers its Bohm nature closer to the separatrix and behaves as ion neoclassical in the transport barrier. Measurements on the top of the barrier suggest that the width of the barrier is dependent upon isotope and moreover suggest that fast ions play a key role. The other high performance scenario is the relatively recently developed Optimised Shear Scenario with small or slightly negative magnetic shear in the plasma core. Different mechanisms of Internal Transport Barrier (ITB) formation have been tested by predictive modelling and the results are compared with experimentally observed phenomena. The experimentally observed non-penetration of heavy impurities through the strong ITB, which contradicts a prediction of conventional neo-classical theory, is discussed. (author)
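
    For reference, the Bohm and gyroBohm scalings contrasted above are commonly written as (standard tokamak transport expressions quoted from general knowledge, not taken from this paper; here T is the temperature, B the magnetic field, rho_i the ion gyroradius and a the minor radius):

        \chi_{\mathrm{Bohm}} \sim \frac{k_B T}{16\, e B}, \qquad
        \chi_{\mathrm{gyroBohm}} \sim \chi_{\mathrm{Bohm}} \, \frac{\rho_i}{a}

    so gyroBohm transport carries an extra factor of the normalized gyroradius and therefore scales more favourably with machine size and magnetic field.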

  2. Advanced high performance solid wall blanket concepts

    International Nuclear Information System (INIS)

    Wong, C.P.C.; Malang, S.; Nishio, S.; Raffray, R.; Sagara, A.

    2002-01-01

    First wall and blanket (FW/blanket) design is a crucial element in the performance and acceptance of a fusion power plant. High temperature structural and breeding materials are needed for high thermal performance. A suitable combination of structural design with the selected materials is necessary for D-T fuel sufficiency. Whenever possible, low afterheat, low chemical reactivity and low activation materials are desired to achieve passive safety and minimize the amount of high-level waste. Of course, the selected fusion FW/blanket design will have to match the operational scenarios of high performance plasma. The key characteristics of eight advanced high performance FW/blanket concepts are presented in this paper. Design configurations, performance characteristics, unique advantages and issues are summarized. All reviewed designs can satisfy most of the necessary design goals. For further development, in concert with the advancement in plasma control and scrape off layer physics, additional emphasis will be needed in the areas of first wall coating material selection, design of plasma stabilization coils, consideration of reactor startup and transient events. To validate the projected performance of the advanced FW/blanket concepts, the critical element is the need for 14 MeV neutron irradiation facilities for the generation of necessary engineering design data and the prediction of FW/blanket components' lifetime and availability.

  3. A High Performance COTS Based Computer Architecture

    Science.gov (United States)

    Patte, Mathieu; Grimoldi, Raoul; Trautner, Roland

    2014-08-01

    Using Commercial Off The Shelf (COTS) electronic components for space applications is a long-standing idea. Indeed, the difference in processing performance and energy efficiency between radiation hardened components and COTS components is so important that COTS components are very attractive for use in mass and power constrained systems. However, using COTS components in space is not straightforward as one must account for the effects of the space environment on the COTS components' behavior. In the frame of the ESA-funded activity called High Performance COTS Based Computer, Airbus Defense and Space and its subcontractor OHB CGS have developed and prototyped a versatile COTS based architecture for high performance processing. The rest of the paper is organized as follows: in a first section we will start by recapitulating the interests and constraints of using COTS components for space applications; then we will briefly describe existing fault mitigation architectures and present our solution for fault mitigation based on a component called the SmartIO; in the last part of the paper we will describe the prototyping activities executed during the HiP CBC project.

  4. High-performance computing for airborne applications

    International Nuclear Information System (INIS)

    Quinn, Heather M.; Manuzatto, Andrea; Fairbanks, Tom; Dallmann, Nicholas; Desgeorges, Rose

    2010-01-01

    Recently, there have been attempts to move common satellite tasks to unmanned aerial vehicles (UAVs). UAVs are significantly cheaper to buy than satellites and easier to deploy on an as-needed basis. The more benign radiation environment also allows for an aggressive adoption of state-of-the-art commercial computational devices, which increases the amount of data that can be collected. There are a number of commercial computing devices currently available that are well-suited to high-performance computing. These devices range from specialized computational devices, such as field-programmable gate arrays (FPGAs) and digital signal processors (DSPs), to traditional computing platforms, such as microprocessors. Even though the radiation environment is relatively benign, these devices could be susceptible to single-event effects. In this paper, we will present radiation data for high-performance computing devices in an accelerated neutron environment. These devices include a multi-core digital signal processor, two field-programmable gate arrays, and a microprocessor. From these results, we found that all of these devices are suitable for many airplane environments without reliability problems.

  5. Development of high performance cladding materials

    International Nuclear Information System (INIS)

    Park, Jeong Yong; Jeong, Y. H.; Park, S. Y.

    2010-04-01

    The irradiation test for HANA claddings was conducted, and a series of evaluations for next-generation HANA claddings, as well as their in-pile and out-of-pile performance tests, were also carried out at the Halden research reactor. The 6th irradiation test has been completed successfully in the Halden research reactor. As a result, HANA claddings showed high performance, such as corrosion resistance improved by 40% compared to Zircaloy-4. The high performance of HANA claddings in the Halden test has enabled a lead test rod program as the first step in the commercialization of HANA claddings. A database has been established for thermal and LOCA-related properties. It was confirmed from the thermal shock test that the integrity of HANA claddings was maintained over a wider region than required by the criteria regulated by the NRC. The manufacturing process of strips was established in order to apply HANA alloys, which were originally developed for the claddings, to the spacer grids. 250 kinds of model alloys for the next-generation claddings were designed and manufactured over four rounds and used to select the preliminary candidate alloys for the next-generation claddings. The selected candidate alloys showed 50% better corrosion resistance and 20% improved high temperature oxidation resistance compared to the foreign advanced claddings. We established the manufacturing conditions controlling the performance of the dual-cooled claddings by changing the reduction rate in the cold working steps.

  6. Integrated plasma control for high performance tokamaks

    International Nuclear Information System (INIS)

    Humphreys, D.A.; Deranian, R.D.; Ferron, J.R.; Johnson, R.D.; LaHaye, R.J.; Leuer, J.A.; Penaflor, B.G.; Walker, M.L.; Welander, A.S.; Jayakumar, R.J.; Makowski, M.A.; Khayrutdinov, R.R.

    2005-01-01

    Sustaining high performance in a tokamak requires controlling many equilibrium shape and profile characteristics simultaneously with high accuracy and reliability, while suppressing a variety of MHD instabilities. Integrated plasma control, the process of designing high-performance tokamak controllers based on validated system response models and confirming their performance in detailed simulations, provides a systematic method for achieving and ensuring good control performance. For present-day devices, this approach can greatly reduce the need for machine time traditionally dedicated to control optimization, and can allow determination of high-reliability controllers prior to ever producing the target equilibrium experimentally. A full set of tools needed for this approach has recently been completed and applied to present-day devices including DIII-D, NSTX and MAST. This approach has proven essential in the design of several next-generation devices including KSTAR, EAST, JT-60SC, and ITER. We describe the method, results of design and simulation tool development, and recent research producing novel approaches to equilibrium and MHD control in DIII-D. (author)

  7. Strategy Guideline: Partnering for High Performance Homes

    Energy Technology Data Exchange (ETDEWEB)

    Prahl, D.

    2013-01-01

    High performance houses require a high degree of coordination and have significant interdependencies between various systems in order to perform properly, meet customer expectations, and minimize risks for the builder. Responsibility for the key performance attributes is shared across the project team and can be well coordinated through advanced partnering strategies. For high performance homes, traditional partnerships need to be matured to the next level and be expanded to all members of the project team including trades, suppliers, manufacturers, HERS raters, designers, architects, and building officials as appropriate. In an environment where the builder is the only source of communication between trades and consultants and where relationships are, in general, adversarial as opposed to cooperative, the chances that any one building system will fail are greater. Furthermore, it is much harder for the builder to identify and capitalize on synergistic opportunities. Partnering can help bridge the cross-functional aspects of the systems approach and achieve performance-based criteria. Critical success factors for partnering include support from top management, mutual trust, effective and open communication, effective coordination around common goals, team building, appropriate use of an outside facilitator, a partnering charter, progress toward common goals, an effective problem-solving process, long-term commitment, continuous improvement, and a positive experience for all involved.

  8. Management issues for high performance storage systems

    Energy Technology Data Exchange (ETDEWEB)

    Louis, S. [Lawrence Livermore National Lab., CA (United States); Burris, R. [Oak Ridge National Lab., TN (United States)

    1995-03-01

    Managing distributed high-performance storage systems is complex and, although sharing common ground with traditional network and systems management, presents unique storage-related issues. Integration technologies and frameworks exist to help manage distributed network and system environments. Industry-driven consortia provide open forums where vendors and users cooperate to leverage solutions. But these new approaches to open management fall short of addressing the needs of scalable, distributed storage. We discuss the motivation and requirements for storage system management (SSM) capabilities and describe how SSM manages distributed servers and storage resource objects in the High Performance Storage System (HPSS), a new storage facility for data-intensive applications and large-scale computing. Modern storage systems, such as HPSS, require many SSM capabilities, including server and resource configuration control, performance monitoring, quality of service, flexible policies, file migration, file repacking, accounting, and quotas. We present results of initial HPSS SSM development including design decisions and implementation trade-offs. We conclude with plans for follow-on work and provide storage-related recommendations for vendors and standards groups seeking enterprise-wide management solutions.

  9. Transport in JET high performance plasmas

    International Nuclear Information System (INIS)

    1999-01-01

    Two types of high performance scenarios have been produced in JET during the DTE1 campaign. One of them is the well-known and extensively used ELM-free hot-ion H-mode scenario, which has two distinct regions: the plasma core and the edge transport barrier. The results obtained during the DTE1 campaign with D, DT and pure T plasmas confirm our previous conclusion that the core transport scales as gyroBohm in the inner half of the plasma volume, recovers its Bohm nature closer to the separatrix and behaves as ion neoclassical in the transport barrier. Measurements on the top of the barrier suggest that the width of the barrier is dependent upon isotope and moreover suggest that fast ions play a key role. The other high performance scenario is the relatively recently developed Optimised Shear Scenario with small or slightly negative magnetic shear in the plasma core. Different mechanisms of Internal Transport Barrier (ITB) formation have been tested by predictive modelling and the results are compared with experimentally observed phenomena. The experimentally observed non-penetration of heavy impurities through the strong ITB, which contradicts a prediction of conventional neo-classical theory, is discussed. (author)

  10. High-performance vertical organic transistors.

    Science.gov (United States)

    Kleemann, Hans; Günther, Alrun A; Leo, Karl; Lüssem, Björn

    2013-11-11

    Vertical organic thin-film transistors (VOTFTs) are promising devices to overcome the transconductance and cut-off frequency restrictions of horizontal organic thin-film transistors. The basic physical mechanisms of VOTFT operation, however, are not well understood, and VOTFTs often require complex patterning techniques using self-assembly processes, which impedes future large-area production. In this contribution, high-performance vertical organic transistors comprising pentacene for p-type operation and C60 for n-type operation are presented. The static current-voltage behavior as well as the fundamental scaling laws of such transistors are studied, disclosing a remarkable transistor operation with a behavior limited by injection of charge carriers. The transistors are manufactured by photolithography, in contrast to other VOTFT concepts using self-assembled source electrodes. Fluorinated photoresist and solvent compounds allow for photolithographical patterning directly and strongly onto the organic materials, simplifying the fabrication protocol and making VOTFTs a prospective candidate for future high-performance applications of organic transistors. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. High performance separation of lanthanides and actinides

    International Nuclear Information System (INIS)

    Sivaraman, N.; Vasudeva Rao, P.R.

    2011-01-01

    The major advantage of High Performance Liquid Chromatography (HPLC) is its ability to provide rapid, high-performance separations. It is evident from the Van Deemter curve for particle size versus resolution that packing materials with particle sizes less than 2 μm provide better resolution for high speed separations and for resolving complex mixtures compared to 5 μm based supports. In the recent past, chromatographic support materials based on monoliths have been studied extensively at our laboratory. A monolith column consists of a single piece of porous, rigid material containing mesopores and micropores, which provide fast analyte mass transfer. A monolith support provides significantly higher separation efficiency than particle-packed columns. A clear advantage of a monolith is that it can be operated at higher flow rates but with lower back pressure. A higher operating flow rate results in higher column permeability, which drastically reduces analysis time and provides high separation efficiency. The fast separation methods developed above were applied to assay the lanthanides and actinides in the dissolver solutions of nuclear reactor fuels.
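
    The particle-size argument above follows from the Van Deemter relation, quoted here from standard chromatography practice rather than from the paper itself:

        H = A + \frac{B}{u} + C\,u

    where H is the plate height (smaller means higher resolution), u the linear velocity of the mobile phase, A the eddy-diffusion term, B the longitudinal-diffusion term and C the resistance-to-mass-transfer term. Since A and C decrease with particle size (C roughly as the square of the particle diameter), sub-2 μm packings, and monoliths with their fast mass transfer, keep H small even at the high flow rates used for fast separations.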

  12. Building Trust in High-Performing Teams

    Directory of Open Access Journals (Sweden)

    Aki Soudunsaari

    2012-06-01

    Facilitation of growth is more about good, trustworthy contacts than capital. Trust is a driving force for business creation, and to create a global business you need to build a team that is capable of meeting the challenge. Trust is a key factor in team building and a needed enabler for cooperation. In general, trust building is a slow process, but it can be accelerated with open interaction and good communication skills. The fast-growing and ever-changing nature of global business sets demands for cooperation and team building, especially for startup companies. Trust building needs personal knowledge and regular face-to-face interaction, but it also requires empathy, respect, and genuine listening. Trust increases communication, and rich and open communication is essential for the building of high-performing teams. Other building materials are a shared vision, clear roles and responsibilities, willingness for cooperation, and supporting and encouraging leadership. This study focuses on trust in high-performing teams. It asks whether it is possible to manage trust and which tools and operation models should be used to speed up the building of trust. In this article, preliminary results from the authors’ research are presented to highlight the importance of sharing critical information and having a high level of communication through constant interaction.

  13. The path toward HEP High Performance Computing

    CERN Document Server

    Apostolakis, John; Carminati, Federico; Gheata, Andrei; Wenzel, Sandro

    2014-01-01

    High Energy Physics code has been known for making poor use of high performance computing architectures. Efforts in optimising HEP code on vector and RISC architectures have yielded limited results, and recent studies have shown that, on modern architectures, it achieves a performance between 10% and 50% of the peak one. Although several successful attempts have been made to port selected codes on GPUs, no major HEP code suite has a 'High Performance' implementation. With LHC undergoing a major upgrade and a number of challenging experiments on the drawing board, HEP can no longer neglect the less-than-optimal performance of its code and it has to try making the best usage of the hardware. This activity is one of the foci of the SFT group at CERN, which hosts, among others, the Root and Geant4 projects. The activity of the experiments is shared and coordinated via a Concurrency Forum, where the experience in optimising HEP code is presented and discussed. Another activity is the Geant-V project, centred on th...

  14. Hydrodynamic Instability, Integrated Code, Laboratory Astrophysics, and Astrophysics

    Science.gov (United States)

    Takabe, Hideaki

    2016-10-01

    This is an article for the Edward Teller Medal memorial lecture, presented at the IFSA03 conference held on September 12th, 2003, at Monterey, CA. The author focuses on his main contributions to fusion science and its extension to astrophysics in the field of theory and computation by picking up five topics. The first one is the anomalous resistivity to hot electrons penetrating the over-dense region through the ion wave turbulence driven by the return current compensating the current flow by the hot electrons. It is concluded that almost the same value of potential as the average kinetic energy of the hot electrons is realized to prevent the penetration of the hot electrons. The second is the ablative stabilization of the Rayleigh-Taylor instability at the ablation front and its dispersion relation, the so-called Takabe formula. This formula gave a principal guideline for stable target design. The author has developed an integrated code ILESTA (1D & 2D) for analyses and design of laser produced plasma including implosion dynamics. It is also applied to design high gain targets. The third is the development of the integrated code ILESTA. The fourth is on Laboratory Astrophysics with intense lasers. This consists of two parts; one is a review of its historical background and the other is on how we relate laser plasma to wide-ranging astrophysics and the purposes for promoting such research. In relation to one purpose, I gave a comment on anomalous transport of relativistic electrons in the Fast Ignition laser fusion scheme. Finally, I briefly summarize recent activity in relation to application of the author's experience to the development of an integrated code for studying extreme phenomena in astrophysics.
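
    The dispersion relation referred to as the Takabe formula is usually quoted in the literature in a form similar to (stated here from general knowledge rather than from this article; k is the perturbation wavenumber, g the acceleration of the ablation front and v_a the ablation velocity):

        \gamma \simeq \alpha \sqrt{k g} - \beta\, k\, v_a, \qquad \alpha \approx 0.9, \quad \beta \approx 3\text{--}4

    which shows how ablation reduces the classical Rayleigh-Taylor growth rate \sqrt{k g} and fully stabilizes modes above a cutoff wavenumber.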

  15. High performance computing in science and engineering Garching/Munich 2016

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, Siegfried; Bode, Arndt; Bruechle, Helmut; Brehm, Matthias (eds.)

    2016-11-01

    Computer simulations are the well-established third pillar of natural sciences along with theory and experimentation. Particularly high performance computing is growing fast and constantly demands more and more powerful machines. To keep pace with this development, in spring 2015, the Leibniz Supercomputing Centre installed the high performance computing system SuperMUC Phase 2, only three years after the inauguration of its sibling SuperMUC Phase 1. Thereby, the compute capabilities were more than doubled. This book covers the time-frame June 2014 until June 2016. Readers will find many examples of outstanding research in the more than 130 projects that are covered in this book, with each one of these projects using at least 4 million core-hours on SuperMUC. The largest scientific communities using SuperMUC in the last two years were computational fluid dynamics simulations, chemistry and material sciences, astrophysics, and life sciences.

  16. Focusing Telescopes in Nuclear Astrophysics

    CERN Document Server

    Ballmoos, Peter von

    2007-01-01

    This volume is the first of its kind on focusing gamma-ray telescopes. Forty-eight refereed papers provide a comprehensive overview of the scientific potential and technical challenges of this nascent tool for nuclear astrophysics. The book features articles dealing with pivotal technologies such as grazing incident mirrors, multilayer coatings, Laue- and Fresnel-lenses - and even an optic using the curvature of space-time. The volume also presents an overview of detectors matching the ambitious objectives of gamma ray optics, and facilities for operating such systems on the ground and in space. The extraordinary scientific potential of focusing gamma-ray telescopes for the study of the most powerful sources and the most violent events in the Universe is emphasized in a series of introductory articles. Practicing professionals, and students interested in experimental high-energy astrophysics, will find this book a useful reference

  17. Intel Xeon Phi coprocessor high performance programming

    CERN Document Server

    Jeffers, James

    2013-01-01

    Authors Jim Jeffers and James Reinders spent two years helping educate customers about the prototype and pre-production hardware before Intel introduced the first Intel Xeon Phi coprocessor. They have distilled their own experiences coupled with insights from many expert customers, Intel Field Engineers, Application Engineers and Technical Consulting Engineers, to create this authoritative first book on the essentials of programming for this new architecture and these new products. This book is useful even before you ever touch a system with an Intel Xeon Phi coprocessor. To ensure that your applications run at maximum efficiency, the authors emphasize key techniques for programming any modern parallel computing system whether based on Intel Xeon processors, Intel Xeon Phi coprocessors, or other high performance microprocessors. Applying these techniques will generally increase your program performance on any system, and better prepare you for Intel Xeon Phi coprocessors and the Intel MIC architecture. It off...

  18. Robust High Performance Aquaporin based Biomimetic Membranes

    DEFF Research Database (Denmark)

    Helix Nielsen, Claus; Zhao, Yichun; Qiu, C.

    2013-01-01

    Aquaporins are water channel proteins with high water permeability and solute rejection, which makes them promising for preparing high-performance biomimetic membranes. Despite the growing interest in aquaporin-based biomimetic membranes (ABMs), it is challenging to produce robust and defect-free ABMs ... on top of a support membrane. Control membranes, either without aquaporins or with the inactive AqpZ R189A mutant aquaporin, served as controls. The separation performance of the membranes was evaluated by cross-flow forward osmosis (FO) and reverse osmosis (RO) tests. In RO the ABM achieved a water permeability of ~ 4 L/(m2 h bar) with a NaCl rejection > 97% at an applied hydraulic pressure of 5 bar. The water permeability was ~40% higher compared to a commercial brackish water RO membrane (BW30) and an order of magnitude higher compared to a seawater RO membrane (SW30HR). In FO, the ABMs had > 90...

  19. High performance nano-composite technology development

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Whung Whoe; Rhee, C. K.; Kim, S. J.; Park, S. D. [KAERI, Taejon (Korea, Republic of); Kim, E. K.; Jung, S. Y.; Ryu, H. J. [KRICT, Taejon (Korea, Republic of); Hwang, S. S.; Kim, J. K.; Hong, S. M. [KIST, Taejon (Korea, Republic of); Chea, Y. B. [KIGAM, Taejon (Korea, Republic of); Choi, C. H.; Kim, S. D. [ATS, Taejon (Korea, Republic of); Cho, B. G.; Lee, S. H. [HGREC, Taejon (Korea, Republic of)

    1999-06-15

    The trend in new material development is toward not only high performance but also environmental friendliness. In particular, nano-composite materials, which enhance the functional properties of components and extend component life, thereby reducing waste and environmental contamination, have a great effect on various industrial areas. The applications of nano-composites, depending on the polymer matrix and filler materials, range from the semiconductor to the medical field. In spite of these merits, nano-composite studies are confined to a few special materials at the laboratory scale because several technical difficulties remain unresolved. Therefore, the purpose of this study is to establish systematic planning for carrying out next-generation projects, in order to compete with other countries and overcome the protective policies of advanced countries, by grasping overseas development trends and our present status. (author).

  20. High Performance OLED Panel and Luminaire

    Energy Technology Data Exchange (ETDEWEB)

    Spindler, Jeffrey [OLEDWorks LLC, Rochester, NY (United States)

    2017-02-20

    In this project, OLEDWorks developed and demonstrated the technology required to produce OLED lighting panels with high energy efficiency and excellent light quality. OLED panels developed in this program produce high quality warm white light with CRI greater than 85 and efficacy up to 80 lumens per watt (LPW). An OLED luminaire employing 24 of the high performance panels produces practical levels of illumination for general lighting, with a flux of over 2200 lumens at 60 LPW. This is a significant advance in the state of the art for OLED solid-state lighting (SSL), which is expected to be a complementary light source to the more advanced LED SSL technology that is rapidly replacing all other traditional forms of lighting.

  1. How to create high-performing teams.

    Science.gov (United States)

    Lam, Samuel M

    2010-02-01

    This article is intended to discuss inspirational aspects of how to lead a high-performance team. Cogent topics discussed include how to hire staff through methods of "topgrading" with reference to Geoff Smart and "getting the right people on the bus" referencing Jim Collins' work. In addition, once the staff is hired, this article covers how to separate the "eagles from the ducks" and how to inspire one's staff by creating the right culture with suggestions for further reading by Don Miguel Ruiz (The four agreements) and John Maxwell (21 Irrefutable laws of leadership). In addition, Simon Sinek's concept of "Start with Why" is elaborated to help a leader know what the core element should be with any superior culture. Thieme Medical Publishers.

  2. High performance nano-composite technology development

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Whung Whoe; Rhee, C. K.; Kim, S. J.; Park, S. D. [KAERI, Taejon (Korea, Republic of); Kim, E. K.; Jung, S. Y.; Ryu, H. J. [KRICT, Taejon (Korea, Republic of); Hwang, S. S.; Kim, J. K.; Hong, S. M. [KIST, Taejon (Korea, Republic of); Chea, Y. B. [KIGAM, Taejon (Korea, Republic of); Choi, C. H.; Kim, S. D. [ATS, Taejon (Korea, Republic of); Cho, B. G.; Lee, S. H. [HGREC, Taejon (Korea, Republic of)

    1999-06-15

    The trend in new material development is toward not only high performance but also environmental friendliness. In particular, nano-composite materials, which enhance the functional properties of components and extend component life, thereby reducing waste and environmental contamination, have a great effect on various industrial areas. The applications of nano-composites, depending on the polymer matrix and filler materials, range from the semiconductor to the medical field. In spite of these merits, nano-composite studies are confined to a few special materials at the laboratory scale because several technical difficulties remain unresolved. Therefore, the purpose of this study is to establish systematic planning for carrying out next-generation projects, in order to compete with other countries and overcome the protective policies of advanced countries, by grasping overseas development trends and our present status. (author).

  3. High performance nano-composite technology development

    International Nuclear Information System (INIS)

    Kim, Whung Whoe; Rhee, C. K.; Kim, S. J.; Park, S. D.; Kim, E. K.; Jung, S. Y.; Ryu, H. J.; Hwang, S. S.; Kim, J. K.; Hong, S. M.; Chea, Y. B.; Choi, C. H.; Kim, S. D.; Cho, B. G.; Lee, S. H.

    1999-06-01

    The trend in new material development is toward not only high performance but also environmental friendliness. In particular, nano-composite materials, which enhance the functional properties of components and extend component life, thereby reducing waste and environmental contamination, have a great effect on various industrial areas. The applications of nano-composites, depending on the polymer matrix and filler materials, range from the semiconductor to the medical field. In spite of these merits, nano-composite studies are confined to a few special materials at the laboratory scale because several technical difficulties remain unresolved. Therefore, the purpose of this study is to establish systematic planning for carrying out next-generation projects, in order to compete with other countries and overcome the protective policies of advanced countries, by grasping overseas development trends and our present status. (author).

  4. Development of high-performance blended cements

    Science.gov (United States)

    Wu, Zichao

    2000-10-01

    This thesis presents the development of high-performance blended cements from industrial by-products. To overcome the low early strength of blended cements, several chemicals were studied as activators for cement hydration. Sodium sulfate was found to be the best activator. The blending proportions were optimized by Taguchi experimental design. The optimized blended cements containing up to 80% fly ash performed better than Type I cement in strength development and durability. Maintaining a constant cement content, concrete produced from the optimized blended cements had equal or higher strength and higher durability than that produced from Type I cement alone. The key to the activation mechanism was the reaction between the added SO4^2- and the Ca^2+ dissolved from cement hydration products.
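
    A plausible reading of that activation step, offered here only as a sketch of the standard sulfate-activation chemistry and not as a statement of the thesis's findings, is

        Na2SO4 + Ca(OH)2 + 2 H2O -> CaSO4·2H2O + 2 NaOH

    in which the NaOH raises the pore-solution alkalinity and accelerates dissolution of the fly ash glass, while the sulfate combines with aluminate phases to form additional ettringite; both effects would help offset the low early strength of high-fly-ash blends.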

  5. High performance parallel computers for science

    International Nuclear Information System (INIS)

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Cook, A.; Deppe, J.; Edel, M.; Fischler, M.; Gaines, I.; Hance, R.

    1989-01-01

    This paper reports that Fermilab's Advanced Computer Program (ACP) has been developing cost effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN or C programmable pipelined 20 Mflops (peak), 10 MByte single board computer. These are plugged into a 16 port crossbar switch crate which handles both inter and intra crate communication. The crates are connected in a hypercube. Site oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256-node, 5 GFlop system is under construction.

  6. High Performance with Prescriptive Optimization and Debugging

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo

    Automatic parallelization and automatic vectorization are attractive as they transparently optimize programs. The thesis contributes an improved dependence analysis for explicitly parallel programs. These improvements lead to more loops being vectorized; on average we achieve a speedup of 1.46 over the existing dependence analysis and vectorizer in GCC. Automatic optimizations often fail for theoretical and practical reasons. When they fail, we argue that a hybrid approach can be effective. Using compiler feedback, we propose to use the programmer’s intuition and insight to achieve high performance. Compiler feedback enlightens the programmer as to why a given optimization was not applied, and suggests how to change the source code to make it more amenable to optimizations. We show how this can yield significant speedups and achieve 2.4 times faster execution on a real industrial use case. To aid in parallel debugging we propose...
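
    The distinction such a dependence analysis has to make can be illustrated with two small loops (a generic example, not code from the thesis): the first has no cross-iteration dependence and can safely be vectorized, while the second carries a dependence from one iteration to the next and cannot be vectorized naively.

        #include <cstddef>
        #include <vector>

        // No loop-carried dependence: each iteration writes a[i] using only b[i] and a[i],
        // so the iterations are independent and the loop can be vectorized
        // (the OpenMP simd pragma is a hint honoured when compiling with OpenMP support).
        void independent(std::vector<float>& a, const std::vector<float>& b, float s) {
            #pragma omp simd
            for (std::size_t i = 0; i < a.size(); ++i)
                a[i] = s * b[i] + a[i];
        }

        // Loop-carried dependence: a[i] depends on a[i-1] computed in the previous
        // iteration, so a straightforward vectorization would change the result.
        void prefixSum(std::vector<float>& a) {
            for (std::size_t i = 1; i < a.size(); ++i)
                a[i] = a[i] + a[i - 1];
        }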

  7. The path toward HEP High Performance Computing

    International Nuclear Information System (INIS)

    Apostolakis, John; Brun, René; Gheata, Andrei; Wenzel, Sandro; Carminati, Federico

    2014-01-01

    High Energy Physics code has been known for making poor use of high performance computing architectures. Efforts in optimising HEP code on vector and RISC architectures have yielded limited results and recent studies have shown that, on modern architectures, it achieves a performance between 10% and 50% of the peak one. Although several successful attempts have been made to port selected codes on GPUs, no major HEP code suite has a 'High Performance' implementation. With LHC undergoing a major upgrade and a number of challenging experiments on the drawing board, HEP can no longer neglect the less-than-optimal performance of its code and has to try to make the best use of the hardware. This activity is one of the foci of the SFT group at CERN, which hosts, among others, the Root and Geant4 projects. The activity of the experiments is shared and coordinated via a Concurrency Forum, where the experience in optimising HEP code is presented and discussed. Another activity is the Geant-V project, centred on the development of a high-performance prototype for particle transport. Achieving a good concurrency level on the emerging parallel architectures without a complete redesign of the framework can only be done by parallelizing at event level, or with a much larger effort at track level. Apart from the shareable data structures, this typically implies a multiplication factor in terms of memory consumption compared to the single threaded version, together with sub-optimal handling of event processing tails. Besides this, the low level instruction pipelining of modern processors cannot be used efficiently to speed up the program. We have implemented a framework that allows scheduling vectors of particles to an arbitrary number of computing resources in a fine grain parallel approach. The talk will review the current optimisation activities within the SFT group with a particular emphasis on the development perspectives towards a simulation framework able to profit

  8. High performance anode for advanced Li batteries

    Energy Technology Data Exchange (ETDEWEB)

    Lake, Carla [Applied Sciences, Inc., Cedarville, OH (United States)

    2015-11-02

    The overall objective of this Phase I SBIR effort was to advance the manufacturing technology for ASI’s Si-CNF high-performance anode by creating a framework for large volume production and utilization of low-cost Si-coated carbon nanofibers (Si-CNF) for the battery industry. This project explores the use of nano-structured silicon which is deposited on a nano-scale carbon filament to achieve the benefits of high cycle life and high charge capacity without the capacity fading or failure that result from stress-induced fracturing of the Si particles and their de-coupling from the electrode. ASI’s patented coating process distinguishes itself from others in that it is highly reproducible, readily scalable and results in a Si-CNF composite structure containing 25-30% silicon, with a compositionally graded interface at the Si-CNF interface that significantly improves cycling stability and enhances adhesion of silicon to the carbon fiber support. In Phase I, the team demonstrated that the production of the Si-CNF anode material can successfully be transitioned from a static bench-scale reactor into a fluidized bed reactor. In addition, ASI made significant progress in the development of low cost, quick testing methods which can be performed on silicon coated CNFs as a means of quality control. To date, weight change, density, and cycling performance have been the key metrics used to validate the high performance anode material. Under this effort, ASI made strides to establish a quality control protocol for the large volume production of Si-CNFs and has identified several key technical thrusts for future work. Using the results of this Phase I effort as a foundation, ASI has defined a path forward to commercialize and deliver high volume and low-cost production of Si-CNF material for anodes in Li-ion batteries.

  9. Multifragmentation model for astrophysical strangelets

    International Nuclear Information System (INIS)

    Biswas, Sayan; De, J.N.; Joarder, Partha S.; Raha, Sibaji; Syam, Debapriyo

    2012-01-01

    A model for the possible size distribution of astrophysical strangelets, that fragment out of the warm strange quark matter ejected during the merger of binary strange stars in the Galaxy, is presented here by invoking the statistical multifragmentation model. A simplified assumption of zero quark mass has been considered to obtain such a mass spectrum for the strangelets. An approximate estimate of the intensity of such strangelets in the galactic cosmic rays is also attempted by using a diffusion approximation.

  10. Ongoing Space Physics - Astrophysics Connections

    OpenAIRE

    Eichler, David

    2005-01-01

    I review several ongoing connections between space physics and astrophysics: a) Measurements of energetic particle spectra have confirmed theoretical prediction of the highest energy to which shocks can accelerate particles, and this has direct bearing on the origin of the highest energy cosmic rays. b) Mass ejection in solar flares may help us understand photon ejection in the giant flares of magnetar outbursts. c) Measurements of electron heat fluxes in the solar wind can help us understand...

  11. Rounding Up the Astrophysical Weeds

    Science.gov (United States)

    McMillan, James P.

    2016-09-01

    New instruments used for astronomy such as ALMA, Herschel, and SOFIA have greatly increased the quality of available astrophysical data. These improved data contain spectral lines and features which are not accounted for in the quantum mechanical (QM) catalogs. A class of molecules has been identified as being particularly problematic, the so-called "weeds". These molecules have numerous transitions, of non-trivial intensity, which are difficult to model due to highly perturbed low lying vibrational states. The inability to properly describe the complete contribution of these weeds to the astrophysical data has led directly to the misidentification of other target molecules. Ohio State's Microwave Laboratory has developed an alternative approach to this problem. Rather than relying on complex QM calculations, we have developed a temperature dependent approach to laboratory based terahertz spectroscopy. We have developed a set of simple packages, in addition to traditional line list catalogs, that enable astronomers to successfully remove the weed signals from their data. This dissertation will detail my laboratory work and analysis of three key weeds: methanol, methyl formate and methyl cyanide. Also discussed will be the analytical technique I used to apply these laboratory results to astrophysical data.

  12. High energy astrophysics an introduction

    CERN Document Server

    Courvoisier, Thierry J -L

    2013-01-01

    High-energy astrophysics has unveiled a Universe very different from the one known from optical observations alone. It has revealed many types of objects in which typical variability timescales are as short as years, months, days, and hours (in quasars, X-ray binaries, and other objects), and even down to milliseconds in gamma ray bursts. The sources of energy that are encountered are only very seldom nuclear fusion, and most of the time gravitation, a paradox when one thinks that gravitation is, by many orders of magnitude, the weakest of the fundamental interactions. The understanding of these objects' physical conditions and the processes revealed by high-energy astrophysics in the last decades is nowadays part of astrophysicists' culture, even of those active in other domains of astronomy. This book evolved from lectures given to master and PhD students at the University of Geneva since the early 1990s. It aims at providing astronomers and physicists intending to be active in high-energy astrophysics a broad...

  13. High Performance Computing and Visualization Infrastructure for Simultaneous Parallel Computing and Parallel Visualization Research

    Science.gov (United States)

    2016-11-09

    [Report form (DD882) residue; the only recoverable technical detail describes the compute nodes: (2) Intel Xeon E5-2680 v3 2.5 GHz CPUs (12C/24T, 30M cache, 9.60 GT/s QPI, Turbo, HT, 120 W) with a Broadcom 5720 QP 1Gb network daughter card.]

  14. Improving UV Resistance of High Performance Fibers

    Science.gov (United States)

    Hassanin, Ahmed

    High performance fibers are characterized by their superior properties compared to traditional textile fibers. High strength fibers have high moduli, high strength to weight ratios, high chemical resistance, and usually high temperature resistance. They are used in applications where superior properties are needed, such as bulletproof vests, ropes and cables, cut resistant products, load tendons for giant scientific balloons, fishing rods, tennis racket strings, parachute cords, adhesives and sealants, protective apparel and tire cords. Unfortunately, ultraviolet (UV) radiation causes serious degradation to most high performance fibers. UV light, either natural or artificial, causes organic compounds to decompose and degrade, because the energy of the photons of UV light is high enough to break chemical bonds, causing chain scission. This work aims at achieving maximum protection of high performance fibers using sheathing approaches. The proposed sheaths are lightweight, to maintain the key advantage of high performance fibers, namely their high strength to weight ratio. This study involves developing three different types of sheathing. The product of interest that needs to be protected from UV is a braid of PBO. The first approach is extruding a sheath of Low Density Polyethylene (LDPE) loaded with different rutile TiO2 nanoparticle percentages around the PBO braid. The results of this approach showed that the LDPE sheath loaded with 10% TiO2 by weight achieved the highest protection compared to 0% and 5% TiO2. Protection here is judged by the strength loss of the PBO. This trend was observed in different weathering environments, where the sheathed samples were exposed to UV-VIS radiation in different weatherometer equipment as well as to a high altitude environment using a NASA BRDL balloon. The second approach focuses on developing a protective porous membrane of polyurethane loaded with rutile TiO2 nanoparticles. Membrane from polyurethane loaded with 4

  15. SISYPHUS: A high performance seismic inversion factory

    Science.gov (United States)

    Gokhberg, Alexey; Simutė, Saulė; Boehm, Christian; Fichtner, Andreas

    2016-04-01

    In the recent years the massively parallel high performance computers became the standard instruments for solving the forward and inverse problems in seismology. The respective software packages dedicated to forward and inverse waveform modelling specially designed for such computers (SPECFEM3D, SES3D) became mature and widely available. These packages achieve significant computational performance and provide researchers with an opportunity to solve problems of bigger size at higher resolution within a shorter time. However, a typical seismic inversion process contains various activities that are beyond the common solver functionality. They include management of information on seismic events and stations, 3D models, observed and synthetic seismograms, pre-processing of the observed signals, computation of misfits and adjoint sources, minimization of misfits, and process workflow management. These activities are time consuming, seldom sufficiently automated, and therefore represent a bottleneck that can substantially offset performance benefits provided by even the most powerful modern supercomputers. Furthermore, a typical system architecture of modern supercomputing platforms is oriented towards the maximum computational performance and provides limited standard facilities for automation of the supporting activities. We present a prototype solution that automates all aspects of the seismic inversion process and is tuned for the modern massively parallel high performance computing systems. We address several major aspects of the solution architecture, which include (1) design of an inversion state database for tracing all relevant aspects of the entire solution process, (2) design of an extensible workflow management framework, (3) integration with wave propagation solvers, (4) integration with optimization packages, (5) computation of misfits and adjoint sources, and (6) process monitoring. The inversion state database represents a hierarchical structure with
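
    The two architectural ideas singled out above, an inversion-state database that traces every step and a workflow framework that strings the supporting activities together, can be illustrated with a toy sketch. The stage names, functions and the JSON state file below are hypothetical placeholders chosen for illustration; they are not the SISYPHUS interfaces.

```python
# Toy sketch (not the SISYPHUS API): every supporting activity of one inversion
# iteration is run through a tiny workflow loop, and each step appends a record
# to a persistent "inversion state" file so the whole process can be traced.
import json
from datetime import datetime, timezone

STATE_FILE = "inversion_state.json"   # hypothetical state database

def record(state: dict, stage: str, **info) -> None:
    """Append a trace entry for one workflow stage and persist the state."""
    state.setdefault("history", []).append(
        {"stage": stage, "time": datetime.now(timezone.utc).isoformat(), **info})
    with open(STATE_FILE, "w") as fh:
        json.dump(state, fh, indent=2)

# Placeholder activities corresponding to the supporting steps listed above.
def preprocess_observations(state):    record(state, "preprocess")
def run_forward_solver(state):         record(state, "forward_model")
def compute_misfit_and_adjoint(state): record(state, "misfit_and_adjoint")
def update_model(state):               record(state, "model_update")

state = {"iteration": 0}
for it in range(1, 4):                 # three illustrative nonlinear iterations
    state["iteration"] = it
    for step in (preprocess_observations, run_forward_solver,
                 compute_misfit_and_adjoint, update_model):
        step(state)                    # each activity leaves a trace in the state file
```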

  16. NCI's Transdisciplinary High Performance Scientific Data Platform

    Science.gov (United States)

    Evans, Ben; Antony, Joseph; Bastrakova, Irina; Car, Nicholas; Cox, Simon; Druken, Kelsey; Evans, Bradley; Fraser, Ryan; Ip, Alex; Kemp, Carina; King, Edward; Minchin, Stuart; Larraondo, Pablo; Pugh, Tim; Richards, Clare; Santana, Fabiana; Smillie, Jon; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley

    2016-04-01

    The Australian National Computational Infrastructure (NCI) manages Earth Systems data collections sourced from several domains and organisations onto a single High Performance Data (HPD) Node to further Australia's national priority research and innovation agenda. The NCI HPD Node has rapidly established its value, currently managing over 10 PBytes of datasets from collections that span a wide range of disciplines including climate, weather, environment, geoscience, geophysics, water resources and social sciences. Importantly, in order to facilitate broad user uptake, maximise reuse and enable transdisciplinary access through software and standardised interfaces, the datasets, associated information systems and processes have been incorporated into the design and operation of a unified platform that NCI has called the National Environmental Research Data Interoperability Platform (NERDIP). The key goal of the NERDIP is to regularise data access so that it is easily discoverable, interoperable for different domains and enabled for high performance methods. It adopts and implements international standards and data conventions, and promotes scientific integrity within a high performance computing and data analysis environment. NCI has established a rich and flexible computing environment to access this data, through the NCI supercomputer; a private cloud that supports both domain focused virtual laboratories and in-common interactive analysis interfaces; as well as remotely through scalable data services. Data collections of this importance must be managed with careful consideration of both their current use and the needs of the end-communities, as well as their future potential use, such as transitioning to more advanced software and improved methods. It is therefore critical that the data platform is both well-managed and trusted for stable production use (including transparency and reproducibility), agile enough to incorporate new technological advances and

  17. High performance liquid chromatography in pharmaceutical analyses

    Directory of Open Access Journals (Sweden)

    Branko Nikolin

    2004-05-01

    In pre-sale testing procedures and in the marketing and control of drugs over the last ten years, high performance liquid chromatography has replaced numerous spectroscopic methods and gas chromatography in quantitative and qualitative analysis. In the first period of HPLC application it was thought that it would become a method complementary to gas chromatography; today, however, it has nearly completely replaced gas chromatography in pharmaceutical analysis. The use of a liquid mobile phase, with the possibility of changing the mobile phase polarity during chromatography and all other modifications of the mobile phase depending upon the characteristics of the substance being tested, is a great advantage in the separation process in comparison to other methods. The wider choice of stationary phases is the next factor which enables good separation to be achieved. The separation column connected to specific and sensitive detector systems (spectrofluorimeter, diode detector, electrochemical detector), as well as hyphenated systems such as HPLC-MS and HPLC-NMR, are the basic elements on which such wide and effective application of the HPLC method is based. The purpose of high performance liquid chromatography (HPLC) analysis of any drug is to confirm the identity of the drug, provide quantitative results, and monitor the progress of the therapy of a disease. Presented in Fig. 1 is a chromatogram obtained for the plasma of depressed patients 12 h before oral administration of dexamethasone. HPLC may also be used to further our understanding of normal and disease processes in the human body through biomedical and therapeutic research during investigations before drug registration. The analysis of drugs and metabolites in biological fluids, particularly plasma, serum or urine, is one of the most demanding but also one of the most common uses of high performance liquid chromatography. Blood, plasma or

  18. Optimizing High Performance Self Compacting Concrete

    Directory of Open Access Journals (Sweden)

    Raymond A Yonathan

    2017-01-01

    This paper's objectives are to study the effects of glass powder, silica fume, polycarboxylate ether, and gravel, and to optimize the proportion of each factor in making high performance SCC. The Taguchi method is proposed in this paper as the best way to reduce the number of specimens, which would otherwise exceed 80 variations. Taguchi data analysis is applied to provide the composition, the optimization, and the effect of the contributing materials for nine specimen variants. The concrete's workability was analyzed using the slump flow test, V-funnel test, and L-box test. Compressive strength and porosity tests were performed in the hardened state. Cylindrical specimens with dimensions of 100×200 mm were cast for the compressive test at ages of 3, 7, 14, 21 and 28 days. The porosity test was conducted at 28 days. It is revealed that silica fume contributes greatly to slump flow and porosity. Coarse aggregate is the greatest contributing factor to the L-box and compressive tests. However, all factors show no clear effect on the V-funnel test.
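
    As a rough illustration of the Taguchi analysis mentioned above, the sketch below evaluates a standard L9 orthogonal array (four factors at three levels) with a larger-is-better signal-to-noise ratio. The factor names echo the paper's variables, but the level assignments and the strength values are invented placeholders, not the paper's data.

```python
# Minimal sketch of a Taguchi-style main-effects analysis (larger-is-better S/N).
# Factor names, levels and the response values below are hypothetical placeholders.
import numpy as np

# Standard L9 orthogonal array: 9 runs, 4 factors, 3 levels (coded 0, 1, 2).
L9 = np.array([
    [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
    [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
    [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
])
factors = ["glass powder", "silica fume", "PCE dosage", "coarse aggregate"]

# Hypothetical 28-day compressive strengths (MPa) for the nine runs.
y = np.array([62.0, 68.5, 71.2, 65.3, 74.1, 69.8, 70.4, 66.9, 73.5])

# Larger-is-better signal-to-noise ratio for each run (single replicate).
sn = -10.0 * np.log10(1.0 / y**2)

for j, name in enumerate(factors):
    level_sn = [sn[L9[:, j] == lvl].mean() for lvl in range(3)]
    best = int(np.argmax(level_sn))
    # The spread of the level means indicates how strongly the factor contributes.
    print(f"{name:16s} level S/N = {np.round(level_sn, 2)} -> best level {best}, "
          f"range {max(level_sn) - min(level_sn):.2f} dB")
```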

  19. An integrated high performance Fastbus slave interface

    International Nuclear Information System (INIS)

    Christiansen, J.; Ljuslin, C.

    1993-01-01

    A high performance CMOS Fastbus slave interface ASIC (Application Specific Integrated Circuit) supporting all addressing and data transfer modes defined in the IEEE 960 - 1986 standard is presented. The FAstbus Slave Integrated Circuit (FASIC) is an interface between the asynchronous Fastbus and a clock synchronous processor/memory bus. It can work stand-alone or together with a 32 bit microprocessor. The FASIC is a programmable device enabling its direct use in many different applications. A set of programmable address mapping windows can map Fastbus addresses to convenient memory addresses and at the same time act as address decoding logic. Data rates of 100 MBytes/sec to Fastbus can be obtained using an internal FIFO in the FASIC to buffer data between the two buses during block transfers. Message passing from Fastbus to a microprocessor on the slave module is supported. A compact (70 mm x 170 mm) Fastbus slave piggy back sub-card interface including level conversion between ECL and TTL signal levels has been implemented using surface mount components and the 208 pin FASIC chip

  20. A high performance architecture for accelerator controls

    International Nuclear Information System (INIS)

    Allen, M.; Hunt, S.M; Lue, H.; Saltmarsh, C.G.; Parker, C.R.C.B.

    1991-01-01

    The demands placed on the Superconducting Super Collider (SSC) control system due to large distances, high bandwidth and fast response time required for operation will require a fresh approach to the data communications architecture of the accelerator. The prototype design effort aims at providing deterministic communication across the accelerator complex with a response time of < 100 ms and total bandwidth of 2 Gbits/sec. It will offer a consistent interface for a large number of equipment types, from vacuum pumps to beam position monitors, providing appropriate communications performance for each equipment type. It will consist of highly parallel links to all equipment: those with computing resources, non-intelligent direct control interfaces, and data concentrators. This system will give each piece of equipment a dedicated link of fixed bandwidth to the control system. Application programs will have access to all accelerator devices which will be memory mapped into a global virtual addressing scheme. Links to devices in the same geographical area will be multiplexed using commercial Time Division Multiplexing equipment. Low-level access will use reflective memory techniques, eliminating processing overhead and complexity of traditional data communication protocols. The use of commercial standards and equipment will enable a high performance system to be built at low cost

  1. A high performance architecture for accelerator controls

    International Nuclear Information System (INIS)

    Allen, M.; Hunt, S.M.; Lue, H.; Saltmarsh, C.G.; Parker, C.R.C.B.

    1991-03-01

    The demands placed on the Superconducting Super Collider (SSC) control system due to large distances, high bandwidth and fast response time required for operation will require a fresh approach to the data communications architecture of the accelerator. The prototype design effort aims at providing deterministic communication across the accelerator complex with a response time of <100 ms and total bandwidth of 2 Gbits/sec. It will offer a consistent interface for a large number of equipment types, from vacuum pumps to beam position monitors, providing appropriate communications performance for each equipment type. It will consist of highly parallel links to all equipment: those with computing resources, non-intelligent direct control interfaces, and data concentrators. This system will give each piece of equipment a dedicated link of fixed bandwidth to the control system. Application programs will have access to all accelerator devices which will be memory mapped into a global virtual addressing scheme. Links to devices in the same geographical area will be multiplexed using commercial Time Division Multiplexing equipment. Low-level access will use reflective memory techniques, eliminating processing overhead and complexity of traditional data communication protocols. The use of commercial standards and equipment will enable a high performance system to be built at low cost. 1 fig

  2. High Performance Graphene Oxide Based Rubber Composites

    Science.gov (United States)

    Mao, Yingyan; Wen, Shipeng; Chen, Yulong; Zhang, Fazhong; Panine, Pierre; Chan, Tung W.; Zhang, Liqun; Liang, Yongri; Liu, Li

    2013-01-01

    In this paper, graphene oxide/styrene-butadiene rubber (GO/SBR) composites with complete exfoliation of GO sheets were prepared by aqueous-phase mixing of GO colloid with SBR latex and a small loading of butadiene-styrene-vinyl-pyridine rubber (VPR) latex, followed by their co-coagulation. During co-coagulation, VPR not only plays a key role in the prevention of aggregation of GO sheets but also acts as an interface-bridge between GO and SBR. The results demonstrated that the mechanical properties of the GO/SBR composite with 2.0 vol.% GO is comparable with those of the SBR composite reinforced with 13.1 vol.% of carbon black (CB), with a low mass density and a good gas barrier ability to boot. The present work also showed that GO-silica/SBR composite exhibited outstanding wear resistance and low-rolling resistance which make GO-silica/SBR very competitive for the green tire application, opening up enormous opportunities to prepare high performance rubber composites for future engineering applications. PMID:23974435

  3. Initial rheological description of high performance concretes

    Directory of Open Access Journals (Sweden)

    Alessandra Lorenzetti de Castro

    2006-12-01

    Concrete is defined as a composite material and, in rheological terms, it can be understood as a concentrated suspension of solid particles (aggregates) in a viscous liquid (cement paste). On a macroscopic scale, concrete flows as a liquid. It is known that the rheological behavior of concrete is close to that of a Bingham fluid, and two rheological parameters are needed for its description: yield stress and plastic viscosity. The aim of this paper is to present an initial rheological description of high performance concretes using the modified slump test. According to the results, an increase of yield stress was observed over time, while a slight variation in plastic viscosity was noticed. The incorporation of silica fume changed the rheological properties of the fresh concrete. The behavior of these materials also varied with the mixing procedure employed in their production. The addition of superplasticizer produced a large reduction in the mixture's yield stress, while plastic viscosity remained practically constant.
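
    For reference, the Bingham model invoked above relates shear stress to shear rate through exactly the two parameters the paper measures; the notation below is the conventional textbook one, not necessarily that of the paper.

```latex
% Bingham fluid: no flow below the yield stress \tau_0; above it, stress grows
% linearly with shear rate \dot{\gamma} through the plastic viscosity \mu_p.
\dot{\gamma} = 0 \quad \text{for } \tau \le \tau_0 ,
\qquad
\tau = \tau_0 + \mu_p\,\dot{\gamma} \quad \text{for } \tau > \tau_0 .
```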

  4. High performance computing in linear control

    International Nuclear Information System (INIS)

    Datta, B.N.

    1993-01-01

    Remarkable progress has been made in both theory and applications of all important areas of control. The theory is rich and very sophisticated. Some beautiful applications of control theory are presently being made in aerospace, biomedical engineering, industrial engineering, robotics, economics, power systems, etc. Unfortunately, the same assessment of progress does not hold in general for computations in control theory. Control Theory is lagging behind other areas of science and engineering in this respect. Nowadays there is a revolution going on in the world of high performance scientific computing. Many powerful computers with vector and parallel processing have been built and have been available in recent years. These supercomputers offer very high speed in computations. Highly efficient software, based on powerful algorithms, has been developed to use on these advanced computers, and has also contributed to increased performance. While workers in many areas of science and engineering have taken great advantage of these hardware and software developments, control scientists and engineers, unfortunately, have not been able to take much advantage of these developments

  5. Development of a High Performance Spacer Grid

    Energy Technology Data Exchange (ETDEWEB)

    Song, Kee Nam; Song, K. N.; Yoon, K. H. (and others)

    2007-03-15

    A spacer grid in a LWR fuel assembly is a key structural component to support fuel rods and to enhance the heat transfer from the fuel rod to the coolant. In this research, the main research items are the development of inherent and high performance spacer grid shapes, the establishment of mechanical/structural analysis and test technology, and the set-up of basic test facilities for the spacer grid. The main research areas and results are as follows. 1. 18 different spacer grid candidates have been invented and applied for domestic and US patents. Among the candidates 16 are chosen from the patent. 2. Two kinds of spacer grids are finally selected for the advanced LWR fuel after detailed performance tests on the candidates and commercial spacer grids from a mechanical/structural point of view. According to the test results the features of the selected spacer grids are better than those of the commercial spacer grids. 3. Four kinds of basic test facilities are set up and the relevant test technologies are established. 4. Mechanical/structural analysis models and technology for spacer grid performance are developed and the analysis results are compared with the test results to enhance the reliability of the models.

  6. Energy Efficient Graphene Based High Performance Capacitors.

    Science.gov (United States)

    Bae, Joonwon; Kwon, Oh Seok; Lee, Chang-Soo

    2017-07-10

    Graphene (GRP) is an interesting class of nano-structured electronic materials for various cutting-edge applications. To date, extensive research activities have been performed on the investigation of diverse properties of GRP. The incorporation of this elegant material can be very lucrative in terms of practical applications in energy storage/conversion systems. Among those various systems, high performance electrochemical capacitors (ECs) have become popular due to the recent need for energy efficient and portable devices. Therefore, in this article, the application of GRP for capacitors is described succinctly. In particular, a concise summary of previous research activities regarding GRP based capacitors is also provided. It was revealed that many secondary materials such as polymers and metal oxides have been introduced to improve the performance. Also, diverse devices have been combined with capacitors for better use. More importantly, recent patents related to the preparation and application of GRP based capacitors are also introduced briefly. This article can provide essential information for future study.

  7. Durability of high performance concrete in seawater

    International Nuclear Information System (INIS)

    Amjad Hussain Memon; Salihuddin Radin Sumadi; Rabitah Handan

    2000-01-01

    This paper presents a report on the effects of blended cements on the durability of high performance concrete (HPC) in seawater. In this research the effect of seawater was investigated. The specimens were initially subjected to water curing for seven days inside the laboratory at room temperature, followed by seawater curing exposed to tidal zone until testing. In this study three levels of cement replacement (0%, 30% and 70%) were used. The combined use of chemical and mineral admixtures has resulted in a new generation of concrete called HPC. The HPC has been identified as one of the most important advanced materials necessary in the effort to build a nation's infrastructure. HPC opens new opportunities in the utilization of the industrial by-products (mineral admixtures) in the construction industry. As a matter of fact permeability is considered as one of the fundamental properties governing the durability of concrete in the marine environment. Results of this investigation indicated that the oxygen permeability values for the blended cement concretes at the age of one year are reduced by a factor of about 2 as compared to OPC control mix concrete. Therefore both blended cement concretes are expected to withstand in the seawater exposed to tidal zone without serious deterioration. (Author)

  8. Automatic Energy Schemes for High Performance Applications

    Energy Technology Data Exchange (ETDEWEB)

    Sundriyal, Vaibhav [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    Although high-performance computing traditionally focuses on the efficient execution of large-scale applications, both energy and power have become critical concerns when approaching exascale. Drastic increases in the power consumption of supercomputers affect significantly their operating costs and failure rates. In modern microprocessor architectures, equipped with dynamic voltage and frequency scaling (DVFS) and CPU clock modulation (throttling), the power consumption may be controlled in software. Additionally, network interconnect, such as Infiniband, may be exploited to maximize energy savings while the application performance loss and frequency switching overheads must be carefully balanced. This work first studies two important collective communication operations, all-to-all and allgather and proposes energy saving strategies on the per-call basis. Next, it targets point-to-point communications to group them into phases and apply frequency scaling to them to save energy by exploiting the architectural and communication stalls. Finally, it proposes an automatic runtime system which combines both collective and point-to-point communications into phases, and applies throttling to them apart from DVFS to maximize energy savings. The experimental results are presented for NAS parallel benchmark problems as well as for the realistic parallel electronic structure calculations performed by the widely used quantum chemistry package GAMESS. Close to the maximum energy savings were obtained with a substantially low performance loss on the given platform.
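
    A minimal sketch of the phase-based idea described above is given below: communication-dominated phases are run at a reduced core frequency and the nominal frequency is restored for compute phases. The set_cpu_frequency helper is a hypothetical stand-in for a platform interface (e.g. a cpufreq driver); it is not the runtime system developed in the thesis.

```python
# Minimal sketch of phase-based DVFS: lower the CPU frequency while the code is
# dominated by communication stalls, restore it for compute.  set_cpu_frequency()
# is a hypothetical placeholder, not a real runtime interface.
from contextlib import contextmanager
import time

NOMINAL_KHZ = 2_600_000
REDUCED_KHZ = 1_200_000

def set_cpu_frequency(khz: int) -> None:
    # Placeholder: a real implementation would use a privileged cpufreq interface.
    print(f"[dvfs] requesting {khz} kHz")

@contextmanager
def communication_phase():
    """Scale the core down for the duration of a communication-dominated phase."""
    set_cpu_frequency(REDUCED_KHZ)
    try:
        yield
    finally:
        set_cpu_frequency(NOMINAL_KHZ)

def exchange_halos():
    time.sleep(0.01)   # stand-in for point-to-point message traffic

def compute_step():
    time.sleep(0.02)   # stand-in for the local computation

for step in range(3):
    with communication_phase():   # frequency is low only while communicating
        exchange_halos()
    compute_step()                # back at nominal frequency for compute
```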

  9. Ultra high performance concrete dematerialization study

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-03-01

    Concrete is the most widely used building material in the world and its use is expected to grow. It is well recognized that the production of portland cement results in the release of large amounts of carbon dioxide, a greenhouse gas (GHG). The main challenge facing the industry is to produce concrete in an environmentally sustainable manner. Reclaimed industrial by-products such as fly ash, silica fume and slag can reduce the amount of portland cement needed to make concrete, thereby reducing the amount of GHGs released to the atmosphere. The use of these supplementary cementing materials (SCM) can also enhance the long-term strength and durability of concrete. The intention of the EcoSmart™ Concrete Project is to develop sustainable concrete through innovation in supply, design and construction. In particular, the project focuses on finding a way to minimize the GHG signature of concrete by maximizing the replacement of portland cement in the concrete mix with SCM while improving the cost, performance and constructability. This paper describes the use of Ductal® Ultra High Performance Concrete (UHPC) for ramps in a condominium. It examined the relationship between the selection of UHPC and the overall environmental performance, cost, constructability, maintenance and operational efficiency as it relates to the EcoSmart Program. The advantages and challenges of using UHPC were outlined. In addition to its very high strength, UHPC has been shown to have very good potential for GHG emission reduction due to the reduced material requirements, reduced transport costs and increased SCM content. refs., tabs., figs.

  10. High-performance laboratories and cleanrooms; TOPICAL

    International Nuclear Information System (INIS)

    Tschudi, William; Sartor, Dale; Mills, Evan; Xu, Tengfang

    2002-01-01

    The California Energy Commission sponsored this roadmap to guide energy efficiency research and deployment for high performance cleanrooms and laboratories. Industries and institutions utilizing these building types (termed high-tech buildings) have played an important part in the vitality of the California economy. This roadmap's key objective is to present a multi-year agenda to prioritize and coordinate research efforts. It also addresses delivery mechanisms to get the research products into the market. Because of the importance to the California economy, it is appropriate and important for California to take the lead in assessing the energy efficiency research needs, opportunities, and priorities for this market. In addition to the importance to California's economy, energy demand for this market segment is large and growing (estimated at 9400 GWH for 1996, Mills et al. 1996). With their 24-hour continuous operation, high tech facilities are a major contributor to the peak electrical demand. Laboratories and cleanrooms constitute the high tech building market, and although each building type has its unique features, they are similar in that they are extremely energy intensive, involve special environmental considerations, have very high ventilation requirements, and are subject to regulations, primarily safety driven, that tend to have adverse energy implications. High-tech buildings have largely been overlooked in past energy efficiency research. Many industries and institutions utilize laboratories and cleanrooms. As illustrated, there are many industries operating cleanrooms in California. These include semiconductor manufacturing, semiconductor suppliers, pharmaceutical, biotechnology, disk drive manufacturing, flat panel displays, automotive, aerospace, food, hospitals, medical devices, universities, and federal research facilities

  11. High-performance phase-field modeling

    KAUST Repository

    Vignal, Philippe

    2015-04-27

    Many processes in engineering and sciences involve the evolution of interfaces. Among the mathematical frameworks developed to model these types of problems, the phase-field method has emerged as a possible solution. Phase-fields nonetheless lead to complex nonlinear, high-order partial differential equations, whose solution poses mathematical and computational challenges. Guaranteeing some of the physical properties of the equations has led to the development of efficient algorithms and discretizations capable of recovering said properties by construction [2, 5]. This work builds on these ideas, and proposes novel discretization strategies that guarantee numerical energy dissipation for both conserved and non-conserved phase-field models. The temporal discretization is based on a novel method which relies on Taylor series and ensures strong energy stability. It is second-order accurate, and can also be rendered linear to speed up the solution process [4]. The spatial discretization relies on Isogeometric Analysis, a finite element method that possesses the k-refinement technology and enables the generation of high-order, high-continuity basis functions. These basis functions are well suited to handle the high-order operators present in phase-field models. Two-dimensional and three-dimensional results of the Allen-Cahn, Cahn-Hilliard, Swift-Hohenberg and phase-field crystal equations will be presented, which corroborate the theoretical findings, and illustrate the robustness of the method. Results related to more challenging examples, namely the Navier-Stokes Cahn-Hilliard and a diffusion-reaction Cahn-Hilliard system, will also be presented. The implementation was done in PetIGA and PetIGA-MF, high-performance Isogeometric Analysis frameworks [1, 3], designed to handle non-linear, time-dependent problems.
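
    For orientation, the standard forms of two of the models named above are recalled below for a generic free energy; the notation is the textbook one and is not taken from the thesis.

```latex
% Free energy with a double-well bulk term f(\phi) and an interface term:
F[\phi] = \int_\Omega \Big( f(\phi) + \tfrac{\epsilon^2}{2}\,|\nabla\phi|^2 \Big)\,\mathrm{d}x .
% Non-conserved (Allen--Cahn) and conserved (Cahn--Hilliard) gradient flows of F:
\frac{\partial \phi}{\partial t} = -M\,\frac{\delta F}{\delta \phi}
  = M\big(\epsilon^2 \Delta \phi - f'(\phi)\big) ,
\qquad
\frac{\partial \phi}{\partial t} = \nabla \cdot \Big( M\,\nabla \frac{\delta F}{\delta \phi} \Big) .
```

    Discretizations that dissipate F in time, as the schemes in the abstract are designed to do, inherit the physical energy decay of both flows.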

  12. Dark matter: the astrophysical case

    International Nuclear Information System (INIS)

    Silk, J.

    2012-01-01

    The identification of dark matter is one of the most urgent problems in cosmology. I describe the astrophysical case for dark matter, from both an observational and a theoretical perspective. This overview will therefore focus on the observational motivations rather than the particle physics aspects of dark matter constraints on specific dark matter candidates. First, however, I summarize the astronomical evidence for dark matter, then I highlight the weaknesses of the standard cold dark matter model (LCDM) to provide a robust explanation of some observations. The greatest weakness in the dark matter saga is that we have not yet identified the nature of dark matter itself

  13. Nuclear astrophysics with indirect methods

    International Nuclear Information System (INIS)

    Shubhchintak

    2016-01-01

    In the area of astrophysics, it is well known that several different types of nuclear reactions are involved in the production of elements and in energy generation in stars. Knowledge of the rates and cross sections of these reactions is necessary in order to understand the origin of the elements in the universe. There is particular interest in processes like the pp-chain, the CNO cycle, the r-process and the s-process, which are responsible for the formation of the majority of nuclei via various reactions like (p, γ), (n, γ), (α, γ) etc

  14. Nuclear physics in astrophysics. Part 2. Abstracts

    International Nuclear Information System (INIS)

    Gyuerky, Gy.; Fueloep, Zs.

    2005-01-01

    The proceedings of the 20. International Nuclear Physics Divisional Conference of the European Physical Society covers a wide range of topics in nuclear astrophysics. The topics addressed are big bang nucleosynthesis, stellar nucleosynthesis, measurements and nuclear data for astrophysics, nuclear structure far from stability, neutrino physics, and rare-ion-beam facilities and experiments. The perspectives of nuclear physics and astrophysics are also overviewed. 77 items are indexed separately for the INIS database. (K.A.)

  15. High Energy Astrophysics Science Archive Research Center

    Data.gov (United States)

    National Aeronautics and Space Administration — The High Energy Astrophysics Science Archive Research Center (HEASARC) is the primary archive for NASA missions dealing with extremely energetic phenomena, from...

  16. The need for high performance breeder reactors

    International Nuclear Information System (INIS)

    Vaughan, R.D.; Chermanne, J.

    1977-01-01

    It can be easily demonstrated, on the basis of realistic estimates of continued high oil costs, that an increasing portion of the growth in energy demand must be supplied by nuclear power and that nuclear power might account for 20% of all energy production by the end of the century. Such assumptions lead very quickly to the conclusion that the discovery, extraction and processing of uranium will not be able to follow the demand; the bottleneck will essentially be related to the rate at which the ore can be discovered and extracted, and not to the existing quantities or their grade. Figures as high as 150,000 t/annum and more would be quickly reached, and it is necessary to ask already now whether enough capital can be attracted to meet these requirements. There is only one solution to this problem: improve the conversion ratio of the nuclear system and quickly reach breeding; this would lead to the reduction of the natural uranium consumption by a factor of about 50. However, this condition is not sufficient; the commercial breeder must have a breeding gain as high as possible because the Pu out-of-pile time and the Pu losses in the cycle could lead to an unacceptable doubling time for the system, if the breeding gain is too low. That is the reason why it is vital to develop high performance breeder reactors. The present paper indicates how the Gas-cooled Breeder Reactor [GBR] can meet the problems mentioned above, on the basis of recent and realistic studies. It briefly describes the present status of GBR development, from the predecessors in the gas cooled reactor line, particularly the AGR. It shows how the GBR fuel benefits greatly from the LMFBR fuel irradiation experience. It compares the GBR performance on a consistent basis with that of the LMFBR. The GBR capital and fuel cycle costs are compared with those of thermal and fast reactors respectively. The conclusion is, based on a cost-benefit study, that the GBR must be quickly developed in order

  17. Integrating advanced facades into high performance buildings

    International Nuclear Information System (INIS)

    Selkowitz, Stephen E.

    2001-01-01

    Glass is a remarkable material but its functionality is significantly enhanced when it is processed or altered to provide added intrinsic capabilities. The overall performance of glass elements in a building can be further enhanced when they are designed to be part of a complete facade system. Finally, the facade system delivers the greatest performance to the building owner and occupants when it becomes an essential element of a fully integrated building design. This presentation examines the growing interest in incorporating advanced glazing elements into more comprehensive facade and building systems in a manner that increases comfort, productivity and amenity for occupants, reduces operating costs for building owners, and contributes to improving the health of the planet by reducing overall energy use and negative environmental impacts. We explore the role of glazing systems in dynamic and responsive facades that provide the following functionality: Enhanced sun protection and cooling load control while improving thermal comfort and providing most of the light needed with daylighting; Enhanced air quality and reduced cooling loads using natural ventilation schemes employing the facade as an active air control element; Reduced operating costs by minimizing lighting, cooling and heating energy use by optimizing the daylighting-thermal tradeoffs; Net positive contributions to the energy balance of the building using integrated photovoltaic systems; Improved indoor environments leading to enhanced occupant health, comfort and performance. In addressing these issues facade system solutions must, of course, respect the constraints of latitude, location, solar orientation, acoustics, earthquake and fire safety, etc. Since climate and occupant needs are dynamic variables, in a high performance building the facade solution must have the capacity to respond and adapt to these variable exterior conditions and to changing occupant needs. This responsive performance capability

  18. JT-60U high performance regimes

    International Nuclear Information System (INIS)

    Ishida, S.

    1999-01-01

    High performance regimes of JT-60U plasmas are presented with an emphasis upon the results from the use of a semi-closed pumped divertor with W-shaped geometry. Plasma performance in transient and quasi steady states has been significantly improved in reversed shear and high-βp regimes. The reversed shear regime elevated the equivalent fusion gain Q_DT^eq transiently up to 1.25 (n_D(0)·τ_E·T_i(0) = 8.6×10^20 m^-3·s·keV) in a reactor-relevant thermonuclear dominant regime. Long sustainment of enhanced confinement with internal transport barriers (ITBs) with a fully non-inductive current drive in a reversed shear discharge was successfully demonstrated with LH wave injection. Performance sustainment has been extended in the high-βp regime with a high triangularity, achieving long sustainment of plasma conditions equivalent to Q_DT^eq ∼ 0.16 (n_D(0)·τ_E·T_i(0) ∼ 1.4×10^20 m^-3·s·keV) for ∼4.5 s with a large non-inductive current drive fraction of 60-70% of the plasma current. Thermal and particle transport analyses show significant reduction of thermal and particle diffusivities around the ITB, resulting in a strong E_r shear in the ITB region. The W-shaped divertor is effective for He ash exhaust, demonstrating a steady exhaust capability of τ_He*/τ_E ∼ 3-10 in support of ITER. Suppression of neutral back flow and a chemical sputtering effect have been observed, while the MARFE onset density is rather decreased. Negative-ion based neutral beam injection (N-NBI) experiments have created a clear H-mode transition. An enhanced ionization cross-section due to multi-step ionization processes was confirmed, as theoretically predicted. The current density profile driven by N-NBI is measured in good agreement with theoretical prediction. N-NBI induced TAE modes, characterized as persistent and bursting oscillations, have been observed from a low hot beta of β_h ∼ 0.1-0.2% without a significant loss of fast ions. (author)
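
    The performance figures quoted above combine density, confinement time and temperature into what is commonly called the fusion triple product; the relation below is the conventional definition and is given only to make the quoted numbers easier to read.

```latex
% Fusion triple product: central deuteron density x energy confinement time x
% central ion temperature, quoted in the abstract in units of m^{-3} s keV.
n_D(0)\,\tau_E\,T_i(0) \approx 8.6\times10^{20}\ \mathrm{m^{-3}\,s\,keV}
  \ \text{(reversed shear, transient)}, \qquad
\sim 1.4\times10^{20}\ \mathrm{m^{-3}\,s\,keV}
  \ \text{(high-}\beta_p\text{, sustained)} .
```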

  19. High Performance Commercial Fenestration Framing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Mike Manteghi; Sneh Kumar; Joshua Early; Bhaskar Adusumalli

    2010-01-31

    A major objective of the U.S. Department of Energy is to have a zero energy commercial building by the year 2025. Windows have a major influence on the energy performance of the building envelope as they control over 55% of building energy load, and represent one important area where technologies can be developed to save energy. Aluminum framing systems are used in over 80% of commercial fenestration products (i.e. windows, curtain walls, store fronts, etc.). Aluminum framing systems are often required in commercial buildings because of their inherent good structural properties and long service life, which are required of commercial and architectural frames. At the same time, they are lightweight and durable, requiring very little maintenance, and offer design flexibility. An additional benefit of aluminum framing systems is their relatively low cost and easy manufacturability. Aluminum, being an easily recyclable material, also offers sustainable features. However, from an energy efficiency point of view, aluminum frames have lower thermal performance due to the very high thermal conductivity of aluminum. Fenestration systems constructed of aluminum alloys therefore have lower performance in terms of being an effective barrier to energy transfer (heat loss or gain). Despite the lower energy performance, aluminum is the choice material for commercial framing systems and dominates the commercial/architectural fenestration market because of the reasons mentioned above. In addition, there is no other cost effective and energy efficient replacement material available to take the place of aluminum in the commercial/architectural market. Hence it is imperative to improve the performance of aluminum framing systems to improve the energy performance of commercial fenestration systems and in turn reduce the energy consumption of commercial buildings and achieve zero energy buildings by 2025. The objective of this project was to develop high performance, energy efficient commercial

  20. High Performance, Three-Dimensional Bilateral Filtering

    International Nuclear Information System (INIS)

    Bethel, E. Wes

    2008-01-01

    Image smoothing is a fundamental operation in computer vision and image processing. This work has two main thrusts: (1) implementation of a bilateral filter suitable for use in smoothing, or denoising, 3D volumetric data; (2) implementation of the 3D bilateral filter in three different parallelization models, along with parallel performance studies on two modern HPC architectures. Our bilateral filter formulation is based upon the work of Tomasi [11], but extended to 3D for use on volumetric data. Our three parallel implementations use POSIX threads, the Message Passing Interface (MPI), and Unified Parallel C (UPC), a Partitioned Global Address Space (PGAS) language. Our parallel performance studies, which were conducted on a Cray XT4 supercomputer and a quad-socket, quad-core Opteron workstation, show our algorithm to have near-perfect scalability up to 120 processors. Parallel algorithms, such as the one we present here, will have an increasingly important role for use in production visual analysis systems as the underlying computational platforms transition from single- to multi-core architectures in the future.
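
    A serial NumPy sketch of a 3D bilateral filter in the spirit described above is given below: each voxel is replaced by a weighted average of its neighbourhood, with weights combining spatial closeness and intensity similarity. This is not the paper's POSIX-threads/MPI/UPC implementation; the window radius and the two sigma parameters are illustrative.

```python
# Serial 3-D bilateral filter sketch: spatial Gaussian x range (intensity) Gaussian.
import numpy as np

def bilateral_filter_3d(vol, sigma_s=1.0, sigma_r=0.1, radius=2):
    pad = np.pad(vol, radius, mode="reflect")
    out = np.zeros_like(vol, dtype=np.float64)
    wsum = np.zeros_like(vol, dtype=np.float64)
    offsets = range(-radius, radius + 1)
    for dz in offsets:
        for dy in offsets:
            for dx in offsets:
                # Neighbour values at offset (dz, dy, dx) for every voxel at once.
                shifted = pad[radius + dz: radius + dz + vol.shape[0],
                              radius + dy: radius + dy + vol.shape[1],
                              radius + dx: radius + dx + vol.shape[2]]
                w_spatial = np.exp(-(dz*dz + dy*dy + dx*dx) / (2.0 * sigma_s**2))
                w_range = np.exp(-((shifted - vol) ** 2) / (2.0 * sigma_r**2))
                w = w_spatial * w_range
                out += w * shifted
                wsum += w
    return out / wsum

noisy = np.random.rand(32, 32, 32).astype(np.float64)
smoothed = bilateral_filter_3d(noisy, sigma_s=1.5, sigma_r=0.2, radius=2)
```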

  1. High Performance, Three-Dimensional Bilateral Filtering

    Energy Technology Data Exchange (ETDEWEB)

    Bethel, E. Wes

    2008-06-05

    Image smoothing is a fundamental operation in computer vision and image processing. This work has two main thrusts: (1) implementation of a bilateral filter suitable for use in smoothing, or denoising, 3D volumetric data; (2) implementation of the 3D bilateral filter in three different parallelization models, along with parallel performance studies on two modern HPC architectures. Our bilateral filter formulation is based upon the work of Tomasi [11], but extended to 3D for use on volumetric data. Our three parallel implementations use POSIX threads, the Message Passing Interface (MPI), and Unified Parallel C (UPC), a Partitioned Global Address Space (PGAS) language. Our parallel performance studies, which were conducted on a Cray XT4 supercomputer and a quad-socket, quad-core Opteron workstation, show our algorithm to have near-perfect scalability up to 120 processors. Parallel algorithms, such as the one we present here, will have an increasingly important role for use in production visual analysis systems as the underlying computational platforms transition from single- to multi-core architectures in the future.

  2. High Performance Computing in Science and Engineering '15 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2016-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2015. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  3. High Performance Computing in Science and Engineering '17 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael; HLRS 2017

    2018-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2017. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance.The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  4. Thermal interface pastes nanostructured for high performance

    Science.gov (United States)

    Lin, Chuangang

    Thermal interface materials in the form of pastes are needed to improve thermal contacts, such as that between a microprocessor and a heat sink of a computer. High-performance and low-cost thermal pastes have been developed in this dissertation by using polyol esters as the vehicle and various nanoscale solid components. The proportion of a solid component needs to be optimized, as an excessive amount degrades the performance, due to the increase in the bond line thickness. The optimum solid volume fraction tends to be lower when the mating surfaces are smoother, and higher when the thermal conductivity is higher. Both a low bond line thickness and a high thermal conductivity help the performance. When the surfaces are smooth, a low bond line thickness can be even more important than a high thermal conductivity, as shown by the outstanding performance of the nanoclay paste of low thermal conductivity in the smooth case (0.009 μm), with the bond line thickness less than 1 μm, as enabled by low storage modulus G′, low loss modulus G″ and high tan delta. However, for rough surfaces, the thermal conductivity is important. The rheology affects the bond line thickness, but it does not correlate well with the performance. This study found that the structure of carbon black is an important parameter that governs the effectiveness of a carbon black for use in a thermal paste. By using a carbon black with a lower structure (i.e., a lower DBP value), a thermal paste that is more effective than the previously reported carbon black paste was obtained. Graphite nanoplatelet (GNP) was found to be comparable in effectiveness to carbon black (CB) pastes for rough surfaces, but it is less effective for smooth surfaces. At the same filler volume fraction, GNP gives higher thermal conductivity than carbon black paste. At the same pressure, GNP gives higher bond line thickness than CB (Tokai or Cabot). The effectiveness of GNP is limited, due to the high bond line thickness. A
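
    The trade-off discussed above, that a thin bond line can matter as much as a high thermal conductivity, follows from the simple series model commonly used for thermal interface joints; the symbols below are generic and are not taken from the dissertation.

```latex
% Specific thermal resistance of a paste joint: two contact resistances in series
% with the bulk resistance of the paste layer of bond line thickness t_{BLT}.
R_{\mathrm{joint}} \;=\; R_{c,1} \;+\; \frac{t_{\mathrm{BLT}}}{k_{\mathrm{paste}}} \;+\; R_{c,2} .
```

    Reducing t_BLT or increasing k_paste lowers the bulk term; on very smooth surfaces the contact terms are small and the bond line term dominates, which is consistent with the nanoclay result above.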

  5. Recent astrophysical applications of the Trojan Horse Method to nuclear astrophysics

    International Nuclear Information System (INIS)

    Spitaleri, C.; Cherubini, S.; Crucilla, V.; Gulino, M.; La Cognata, M.; Lamia, L.; Pizzone, R. G.; Puglia, S. M. R.; Rapisarda, G. G.; Romano, S.; Sergi, M. L.; Tumino, A.; Fu, C.; Tribble, R.; Banu, A.; Al-Abdullah, T.; Goldberg, V.; Mukhamedzhanov, A.; Tabacaru, G.; Trache, L.

    2008-01-01

    The Trojan Horse Method (THM) is a unique indirect technique that allows rearrangement reactions to be measured down to astrophysically relevant energies. The basic principle and a review of the recent applications of the Trojan Horse Method are presented. The applications aimed at the extraction of the bare astrophysical factor S_b(E) for some two-body processes are discussed.

  6. Intelligent Facades for High Performance Green Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Dyson, Anna [Rensselaer Polytechnic Inst., Troy, NY (United States)

    2017-03-01

    Progress Towards Net-Zero and Net-Positive-Energy Commercial Buildings and Urban Districts Through Intelligent Building Envelope Strategies. Previous research and development of intelligent facade systems has been limited in its contribution towards national goals for achieving on-site net-zero buildings, because this R&D has failed to couple the many qualitative requirements of building envelopes, such as the provision of daylighting, access to exterior views, and satisfying aesthetic and cultural characteristics, with the quantitative metrics of energy harvesting, storage and redistribution. To achieve energy self-sufficiency from on-site solar resources, building envelopes can and must address this gamut of concerns simultaneously. With this project, we have undertaken a high-performance building-integrated combined heat and power concentrating photovoltaic system with high-temperature thermal capture, storage and transport towards multiple applications (BICPV/T). The critical contribution we are offering, the Integrated Concentrating Solar Façade (ICSF), is conceived to improve daylighting quality for improved health of occupants and to mitigate solar heat gain while maximally capturing and transferring on-site solar energy. The ICSF accomplishes this multi-functionality by intercepting only the direct-normal component of solar energy (which is responsible for elevated cooling loads), thereby transforming a previously problematic source of energy into a high-quality resource that can be applied to building demands such as heating, cooling, dehumidification, domestic hot water, and possible further augmentation of electrical generation through organic Rankine cycles. With the ICSF technology, our team is addressing the global challenge of transitioning commercial and residential building stock towards on-site clean energy self-sufficiency, by fully integrating innovative environmental control system strategies within an intelligent and responsively dynamic building envelope.

  7. Alternative High-Performance Ceramic Waste Forms

    Energy Technology Data Exchange (ETDEWEB)

    Sundaram, S. K. [Alfred Univ., NY (United States)

    2017-02-01

    This final report (M5NU-12-NY-AU # 0202-0410) summarizes the results of the project titled “Alternative High-Performance Ceramic Waste Forms,” funded in FY12 by the Nuclear Energy University Program (NEUP Project # 12-3809) and led by Alfred University in collaboration with Savannah River National Laboratory (SRNL). The overall focus of the project is to advance fundamental understanding of crystalline ceramic waste forms and to demonstrate their viability as alternative waste forms to borosilicate glasses. We processed single- and multiphase hollandite waste forms based on simulated waste stream compositions provided by SRNL, derived from the advanced fuel cycle initiative (AFCI) aqueous separation process developed in the Fuel Cycle Research and Development (FCR&D) program. For multiphase simulated waste forms, oxide and carbonate precursors were mixed together via ball milling with deionized water using zirconia media in a polyethylene jar for 2 h. The slurry was dried overnight and then separated from the media. The blended powders were then subjected to melting or spark plasma sintering (SPS) processes. Microstructural evolution and phase assemblages of these samples were studied using x-ray diffraction (XRD), scanning electron microscopy (SEM), energy dispersive analysis of x-rays (EDAX), wavelength dispersive spectrometry (WDS), transmission electron microscopy (TEM), selective area x-ray diffraction (SAXD), and electron backscatter diffraction (EBSD). These results showed that the processing methods have a significant effect on the microstructure and thus the performance of these waste forms. The Ce substitution into zirconolite and pyrochlore materials was investigated using a combination of experimental (in situ XRD and x-ray absorption near edge structure (XANES)) and modeling techniques to study these single phases independently. In zirconolite materials, a transition from the 2M to the 4M polymorph was observed with increasing Ce content.

  8. Space astronomy and astrophysics program by NASA

    Science.gov (United States)

    Hertz, Paul L.

    2014-07-01

    The National Aeronautics and Space Administration recently released the NASA Strategic Plan 2014 [1], and the NASA Science Mission Directorate released the NASA 2014 Science Plan [3]. These strategic documents establish NASA's astrophysics strategic objectives to be (i) to discover how the universe works, (ii) to explore how it began and evolved, and (iii) to search for life on planets around other stars. The multidisciplinary nature of astrophysics makes it imperative to strive for a balanced science and technology portfolio, both in terms of science goals addressed and in missions to address these goals. NASA uses the prioritized recommendations and decision rules of the National Research Council's 2010 decadal survey in astronomy and astrophysics [2] to set the priorities for its investments. The NASA Astrophysics Division has laid out its strategy for advancing the priorities of the decadal survey in its Astrophysics 2012 Implementation Plan [4]. With substantial input from the astrophysics community, the NASA Advisory Council's Astrophysics Subcommittee has developed an astrophysics visionary roadmap, Enduring Quests, Daring Visions [5], to examine possible longer-term futures. The successful development of the James Webb Space Telescope leading to a 2018 launch is an Agency priority. One important goal of the Astrophysics Division is to begin a strategic mission, subject to the availability of funds, which follows from the 2010 decadal survey and is launched after the James Webb Space Telescope. NASA is studying a Wide Field Infrared Survey Telescope as its next large astrophysics mission. NASA is also planning to partner with other space agencies on their missions as well as increase the cadence of smaller Principal Investigator-led, competitively selected Astrophysics Explorers missions.

  9. High-performance commercial building systems

    Energy Technology Data Exchange (ETDEWEB)

    Selkowitz, Stephen

    2003-10-01

    This report summarizes key technical accomplishments resulting from the three year PIER-funded R&D program, "High Performance Commercial Building Systems" (HPCBS). The program targets the commercial building sector in California, an end-use sector that accounts for about one-third of all California electricity consumption and an even larger fraction of peak demand, at a cost of over $10B/year. Commercial buildings also have a major impact on occupant health, comfort and productivity. Building design and operations practices that influence energy use are deeply engrained in a fragmented, risk-averse industry that is slow to change. Although California's aggressive standards efforts have resulted in new buildings designed to use less energy than those constructed 20 years ago, the actual savings realized are still well below technical and economic potentials. The broad goal of this program is to develop and deploy a set of energy-saving technologies, strategies, and techniques, and improve processes for designing, commissioning, and operating commercial buildings, while improving health, comfort, and performance of occupants, all in a manner consistent with sound economic investment practices. Results are to be broadly applicable to the commercial sector for different building sizes and types, e.g., offices and schools, for different classes of ownership, both public and private, and for owner-occupied as well as speculative buildings. The program aims to facilitate significant electricity use savings in the California commercial sector by 2015, while assuring that these savings are affordable and promote high quality indoor environments. The five linked technical program elements contain 14 projects with 41 distinct R&D tasks. Collectively they form a comprehensive Research, Development, and Demonstration (RD&D) program with the potential to capture large savings in the commercial building sector, providing significant economic benefits to

  10. Massive magnetic monopoles in cosmology and astrophysics

    International Nuclear Information System (INIS)

    Kolb, E.W.

    1984-01-01

    The astrophysical and cosmological consequences of magnetic monopoles are discussed. The production of monopoles during phase transition in the early universe is addressed, and proposals which have been made to alleviate the monopole problem are summarized. Astrophysical limits on galactic magnetic monopoles are discussed along with experimental efforts to detect monopoles. Finally, monopole-induced proton decay is addressed. 48 references

  11. Highlights of modern astrophysics: Concepts and controversies

    International Nuclear Information System (INIS)

    Shapiro, S.L.; Teukolsky, S.A.

    1986-01-01

    In this book, physicists and astronomers review issues in astrophysics. The book stresses accomplishments of observational and theoretical work, and demonstrates how to reveal information about stars and galaxies by applying the basic principles of physics. It pinpoints conflicting views and findings on important topics and indicates possibilities for future research in the field of modern astrophysics

  12. Astrophysics with small satellites in Scandinavia

    DEFF Research Database (Denmark)

    Lund, Niels

    2003-01-01

    The small-satellites activities in the Scandinavian countries are briefly surveyed with emphasis on astrophysics research. (C) 2002 COSPAR. Published by Elsevier Science Ltd. All rights reserved.

  13. Development of a high performance liquid chromatography method ...

    African Journals Online (AJOL)

    Development of a high performance liquid chromatography method for simultaneous ... Purpose: To develop and validate a new low-cost high performance liquid chromatography (HPLC) method for ..... Several papers have reported the use of ...

  14. High Performance Home Building Guide for Habitat for Humanity Affiliates

    Energy Technology Data Exchange (ETDEWEB)

    Lindsey Marburger

    2010-10-01

    This guide covers basic principles of high performance Habitat construction, steps to achieving high performance Habitat construction, resources to help improve building practices, materials, etc., and affiliate profiles and recommendations.

  15. Demographics in Astronomy and Astrophysics

    Science.gov (United States)

    Ulvestad, James S.

    2011-05-01

    Astronomy has been undergoing a significant demographic shift over the last several decades, as shown by data presented in the 2000 National Research Council (NRC) report "Federal Funding of Astronomical Research," and the 2010 NRC report, "New Worlds, New Horizons in Astronomy and Astrophysics." For example, the number of advertised postdoctoral positions in astronomy has increased much more rapidly than the number of faculty positions, contributing to a holding pattern of early-career astronomers in multiple postdoctoral positions. This talk will summarize some of the current demographic trends in astronomy, including information about gender and ethnic diversity, and describe some of the possible implications for the future. I thank the members of the Astro2010 Demographics Study Group, as well as numerous white-paper contributors to Astro2010, for providing data and analyses.

  16. Silica aerogel and space astrophysics

    International Nuclear Information System (INIS)

    Koch-Miramond, L.

    1985-09-01

    Silica aerogels have been produced in large and transparent blocks for space astrophysics experiments since the beginning of the 1970s. They were used in cosmic ray experiments on board balloons by the Saclay group. A new space venture where aerogel Cerenkov radiators will play a decisive role is currently being prepared by a large collaboration of European and US institutes. It will be part of the so-called International Solar Polar Mission (ISPM), which will explore the heliosphere over the full range of solar latitudes from the ecliptic (equatorial) plane to the magnetic poles of the Sun. Comments on the properties and long-term behaviour of silica aerogel Cerenkov radiators in the space environment are given.

  17. Axions in astrophysics and cosmology

    International Nuclear Information System (INIS)

    Sikivie, P.

    1984-07-01

    Axion models often have a spontaneously broken exact discrete symmetry. In that case, they have discretely degenerate vacua and hence domain walls. The properties of the domain walls, the cosmological catastrophe they produce and the ways in which this catastrophe may be avoided are explained. Cosmology and astrophysics provide arguments that imply the axion decay constant should lie in the range 10^8 GeV <= f_a <= 10^12 GeV. Reasons are given why axions are an excellent candidate to constitute the dark matter of galactic halos. Using the coupling of the axions to the electromagnetic field, detectors are described to look for axions floating about in the halo of our galaxy and for axions emitted by the sun.
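
    Stated as a math block for clarity, the allowed window for the axion decay constant quoted in the abstract is

        $$ 10^{8}\,\mathrm{GeV} \;\lesssim\; f_a \;\lesssim\; 10^{12}\,\mathrm{GeV}, $$

    and the commonly quoted conversion to an axion mass (a textbook relation, not stated in the abstract) is roughly $m_a \simeq 6\,\mu\mathrm{eV}\times(10^{12}\,\mathrm{GeV}/f_a)$, so this window corresponds to masses from a few µeV (at f_a near 10^12 GeV) up to a few times 10^-2 eV (at f_a near 10^8 GeV).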

  18. The fundamentals of stellar astrophysics

    International Nuclear Information System (INIS)

    Collins, G.W. II.

    1989-01-01

    A broad overview of theoretical stellar astrophysics is presented in a textbook intended for graduate students. Chapters are devoted to fundamental principles, assumptions, theorems, and polytropes; energy sources and sinks; the flow of energy through the star and the construction of stellar models; the theory of stellar evolution; relativistic stellar structure; the structure of distorted stars; stellar pulsation and oscillation. Also discussed are the flow of radiation through the stellar atmosphere, the solution of the radiative-transfer equation, the environment of the radiation field, the construction of a stellar model atmosphere, the formation and shape of spectral lines, LTE breakdown, illuminated and extended stellar atmospheres, and the transfer of polarized radiation. Diagrams, graphs, and sample problems are provided. 164 refs

  19. Focusing telescopes in nuclear astrophysics

    International Nuclear Information System (INIS)

    Von Ballmoos, P.; Knodlseder, R.; Sazonov, S.; Griffiths, R.; Bastie, P.; Halloin, H.; Pareschi, G.; Ramsey, B.; Jensen, C.; Buis, E.J.; Ulmer, M.; Giommi, P.; Colafrancesco, S.; Comastri, A.; Barret, D.; Leising, M.; Hernanz, M.; Smith, D.; Abrosimov, N.; Smither, B.; Ubertini, P.; Olive, J.F.; Lund, N.; Pisa, A.; Courtois, P.; Roa, D.; Harrison, F.; Pareschi, G.; Frontera, F.; Von Ballmoos, P.; Barriere, N.; Rando, N.; Borde, J.; Hinglais, E.; Cledassou, R.; Duchon, P.; Sghedoni, M.; Huet, B.; Takahashi, T.; Caroli, E.; Quadrinin, L.; Buis, E.J.; Skinner, G.; Krizmanic, J.; Pareschi, G.; Loffredo, G.; Wunderer, C.; Weidenspointner, G.; Wunderer, C.; Koechlin, L.; Bignami, G.; Von Ballmoos, P.; Tueller, J.; Andritschke, T.; Laurens, A.; Evrard, J.

    2005-01-01

    The objective of this workshop is to consider the next generation of instrumentation to be required within the domain of nuclear astrophysics. A small, but growing community has been pursuing various techniques for the focusing of hard X-rays and gamma-rays with the aim of achieving a factor of up to 100 improvement in sensitivity over present technologies. Balloon flight tests of both multilayer mirrors and a Laue lens have been performed and ideas abound. At present, implementation scenarios for space missions are being studied at ESA, CNES, and elsewhere. The workshop will provide a first opportunity for this new community to meet, exchange technological know-how, discuss scientific objectives and synergies, and consolidate implementation approaches within National and European Space Science programs. This document gathers the slides of all the presentations.

  20. Focusing telescopes in nuclear astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Von Ballmoos, P; Knodlseder, R; Sazonov, S; Griffiths, R; Bastie, P; Halloin, H; Pareschi, G; Ramsey, B; Jensen, C; Buis, E J; Ulmer, M; Giommi, P; Colafrancesco, S; Comastri, A; Barret, D; Leising, M; Hernanz, M; Smith, D; Abrosimov, N; Smither, B; Ubertini, P; Olive, J F; Lund, N; Pisa, A; Courtois, P; Roa, D; Harrison, F; Pareschi, G; Frontera, F; Von Ballmoos, P; Barriere, N; Rando, N; Borde, J; Hinglais, E; Cledassou, R; Duchon, P; Sghedoni, M; Huet, B; Takahashi, T; Caroli, E; Quadrinin, L; Buis, E J; Skinner, G; Krizmanic, J; Pareschi, G; Loffredo, G; Wunderer, C; Weidenspointner, G; Wunderer, C; Koechlin, L; Bignami, G; Von Ballmoos, P; Tueller, J; Andritschke, T; Laurens, A; Evrard, J

    2005-07-01

    The objective of this workshop is to consider the next generation of instrumentation to be required within the domain of nuclear astrophysics. A small, but growing community has been pursuing various techniques for the focusing of hard X-rays and gamma-rays with the aim of achieving a factor of up to 100 improvement in sensitivity over present technologies. Balloon flight tests of both multilayer mirrors and a Laue lens have been performed and ideas abound. At present, implementation scenarios for space missions are being studied at ESA, CNES, and elsewhere. The workshop will provide a first opportunity for this new community to meet, exchange technological know-how, discuss scientific objectives and synergies, and consolidate implementation approaches within National and European Space Science programs. This document gathers the slides of all the presentations.

  1. Nuclear astrophysics of the sun

    International Nuclear Information System (INIS)

    Kocharov, G.E.

    1980-01-01

    In the first chapter we will discuss the problem of nuclear reactions in the interior of the Sun and consider the modern aspects of the neutrino astrophysics of the Sun. The second chapter is devoted to the high-energy interactions in the solar atmosphere during flares. Among a great number of events during solar flares we shall consider mainly the nuclear reactions. Special attention will be paid to the genetic connection between the different components of solar electromagnetic and corpuscular radiation. The idea of the unity of processes in different parts of the Sun, from the hot and dense interior up to the rarefied plasma of the solar corona, will be the main line of the book. (orig./WL)

  2. High Performance Computing in Science and Engineering '99 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2000-01-01

    The book contains reports about the most significant projects from science and engineering of the Federal High Performance Computing Center Stuttgart (HLRS). They were carefully selected in a peer-review process and are showcases of an innovative combination of state-of-the-art modeling, novel algorithms and the use of leading-edge parallel computer technology. The projects of HLRS are using supercomputer systems operated jointly by university and industry and therefore a special emphasis has been put on the industrial relevance of results and methods.

  3. High Performance Computing in Science and Engineering '98 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    1999-01-01

    The book contains reports about the most significant projects from science and industry that are using the supercomputers of the Federal High Performance Computing Center Stuttgart (HLRS). These projects are from different scientific disciplines, with a focus on engineering, physics and chemistry. They were carefully selected in a peer-review process and are showcases for an innovative combination of state-of-the-art physical modeling, novel algorithms and the use of leading-edge parallel computer technology. As HLRS is in close cooperation with industrial companies, special emphasis has been put on the industrial relevance of results and methods.

  4. High performance graphics processors for medical imaging applications

    International Nuclear Information System (INIS)

    Goldwasser, S.M.; Reynolds, R.A.; Talton, D.A.; Walsh, E.S.

    1989-01-01

    This paper describes a family of high-performance graphics processors with special hardware for interactive visualization of 3D human anatomy. The basic architecture expands to multiple parallel processors, each processor using pipelined arithmetic and logical units for high-speed rendering of Computed Tomography (CT), Magnetic Resonance (MR) and Positron Emission Tomography (PET) data. User-selectable display alternatives include multiple 2D axial slices, reformatted images in sagittal or coronal planes and shaded 3D views. Special facilities support applications requiring color-coded display of multiple datasets (such as radiation therapy planning), or dynamic replay of time-varying volumetric data (such as cine-CT or gated MR studies of the beating heart). The current implementation is a single processor system which generates reformatted images in true real time (30 frames per second), and shaded 3D views in a few seconds per frame. It accepts full scale medical datasets in their native formats, so that minimal preprocessing delay exists between data acquisition and display.
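
    As an illustration of the slice-reformatting and projection operations the abstract describes implementing in hardware, the following is a minimal software sketch in Python with NumPy; the volume is a synthetic placeholder rather than real CT/MR data.

        import numpy as np

        # Synthetic volume standing in for a CT/MR dataset, indexed (z, y, x).
        rng = np.random.default_rng(0)
        volume = rng.random((64, 256, 256), dtype=np.float32)

        # Multiplanar reformatting: extract orthogonal planes from the volume.
        axial    = volume[32, :, :]   # axial slice at a fixed z
        coronal  = volume[:, 128, :]  # reformatted coronal plane at a fixed y
        sagittal = volume[:, :, 128]  # reformatted sagittal plane at a fixed x

        # A very simple surrogate for a shaded 3D view: a maximum-intensity
        # projection along the viewing axis.
        mip = volume.max(axis=0)

    Real systems interpolate along arbitrary oblique planes and apply shading, but the underlying data-access pattern is the same.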

  5. Enabling Efficient Climate Science Workflows in High Performance Computing Environments

    Science.gov (United States)

    Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.

    2015-12-01

    A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks presents a myriad of challenges when running in a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a great deal of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well-tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project we highlight a stack of tools our team utilizes and has developed to make large-scale simulation and analysis work commonplace, providing operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth Systems Grid Federation (ESGF), while executing everything in between in a scalable environment in a task-parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases they have been applied to.
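
    The task-parallel (MPI) analysis pattern mentioned above can be sketched with mpi4py; the file names and the analyze() routine below are hypothetical placeholders, not the CASCADE project's actual tooling.

        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        size = comm.Get_size()

        # Hypothetical list of per-file analysis tasks.
        files = [f"climate_chunk_{i:04d}.nc" for i in range(1000)]

        def analyze(path):
            # Placeholder for an extreme-event statistic computed on one file.
            return {"file": path, "value": 0.0}

        # Static round-robin distribution of tasks across MPI ranks.
        local_results = [analyze(f) for f in files[rank::size]]

        # Collect per-rank results on rank 0 for later publication/archiving.
        gathered = comm.gather(local_results, root=0)
        if rank == 0:
            flat = [r for chunk in gathered for r in chunk]
            print(f"analyzed {len(flat)} files across {size} ranks")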

  6. Can Knowledge of the Characteristics of "High Performers" Be Generalised?

    Science.gov (United States)

    McKenna, Stephen

    2002-01-01

    Two managers described as high performing constructed complexity maps of their organization/world. The maps suggested that high performance is socially constructed and negotiated in specific contexts and management competencies associated with it are context specific. Development of high performers thus requires personalized coaching more than…

  7. Powder metallurgical high performance materials. Proceedings. Volume 1: high performance P/M metals

    International Nuclear Information System (INIS)

    Kneringer, G.; Roedhammer, P.; Wildner, H.

    2001-01-01

    The proceedings of this sequence of seminars form an impressive chronicle of the continued progress in the understanding of refractory metals and cemented carbides and in their manufacture and application. There the ingenuity and assiduous work of thousands of scientists and engineers striving for progress in the field of powder metallurgy is documented in more than 2000 contributions covering some 30000 pages. The 15th Plansee Seminar was convened under the general theme 'Powder Metallurgical High Performance Materials'. Under this broadened perspective the seminar will strive to look beyond the refractory metals and cemented carbides, which remain at its focus, to novel classes of materials, such as intermetallic compounds, with potential for high temperature applications. (author)

  8. Powder metallurgical high performance materials. Proceedings. Volume 1: high performance P/M metals

    Energy Technology Data Exchange (ETDEWEB)

    Kneringer, G; Roedhammer, P; Wildner, H [eds.

    2001-07-01

    The proceedings of this sequence of seminars form an impressive chronicle of the continued progress in the understanding of refractory metals and cemented carbides and in their manufacture and application. There the ingenuity and assiduous work of thousands of scientists and engineers striving for progress in the field of powder metallurgy is documented in more than 2000 contributions covering some 30000 pages. The 15th Plansee Seminar was convened under the general theme 'Powder Metallurgical High Performance Materials'. Under this broadened perspective the seminar will strive to look beyond the refractory metals and cemented carbides, which remain at its focus, to novel classes of materials, such as intermetallic compounds, with potential for high temperature applications. (author)

  9. High Performance Computing in Science and Engineering '02 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2003-01-01

    This book presents the state-of-the-art in modeling and simulation on supercomputers. Leading German research groups present their results achieved on high-end systems of the High Performance Computing Center Stuttgart (HLRS) for the year 2002. Reports cover all fields of supercomputing simulation ranging from computational fluid dynamics to computer science. Special emphasis is given to industrially relevant applications. Moreover, by presenting results for both vector systems and microprocessor-based systems the book allows readers to compare performance levels and usability of a variety of supercomputer architectures. It therefore becomes an indispensable guidebook to assess the impact of the Japanese Earth Simulator project on supercomputing in the years to come.

  10. Frontier Research in Astrophysics - II

    Science.gov (United States)

    2016-05-01

    The purpose of this international workshop is to bring together astrophysicists and physicists who are involved in various topics at the forefront of modern astrophysics and particle physics. The workshop will discuss the most recent experimental and theoretical results in order to advance our understanding of the physics governing our Universe. To accomplish the goals of the workshop, we believe it is necessary to use data from ground-based and space-based experiments and results from theoretical developments: work on the forefront of science which has resulted (or promises to result) in high-impact scientific papers. Hence, the main purpose of the workshop is to discuss in a unique and collaborative setting a broad range of topics in modern astrophysics, from the Big Bang to Planets and Exoplanets. We believe that this can provide a suitable framework for each participant, who (while obviously not involved in all the topics discussed) will be able to acquire a general view of the main experimental and theoretical results currently obtained. Such an up-to-date view of the current research on cosmic sources can help guide future research projects by the participants, and will encourage collaborative efforts across various topical areas of research. The proceedings will be published in Proceedings of Science (PoS) - SISSA and will provide a powerful resource for the whole scientific community and will be especially helpful for PhD students. The following items will be reviewed: Cosmology: Cosmic Background, Dark Matter, Dark Energy, Clusters of Galaxies. Physics of the Diffuse Cosmic Sources. Physics of Cosmic Rays. Physics of Discrete Cosmic Sources. Extragalactic Sources: Active Galaxies, Normal Galaxies, Gamma-Ray Bursts. Galactic Sources: Star Formation, Pre-Main-Sequence and Main-Sequence Stars, the Sun, Cataclysmic Variables and Novae, Supernovae and SNRs, X-Ray Binary Systems, Pulsars, Black Holes, Gamma-Ray Sources, Nucleosynthesis, Asteroseismology

  11. Astrophysics at RIA (ARIA) Working Group

    International Nuclear Information System (INIS)

    Smith, Michael S.; Schatz, Hendrik; Timmes, Frank X.; Wiescher, Michael; Greife, Uwe

    2006-01-01

    The Astrophysics at RIA (ARIA) Working Group has been established to develop and promote the nuclear astrophysics research anticipated at the Rare Isotope Accelerator (RIA). RIA is a proposed next-generation nuclear science facility in the U.S. that will enable significant progress in studies of core collapse supernovae, thermonuclear supernovae, X-ray bursts, novae, and other astrophysical sites. Many of the topics addressed by the Working Group are relevant for the RIKEN RI Beam Factory, the planned GSI-Fair facility, and other advanced radioactive beam facilities

  12. SiMon: Simulation Monitor for Computational Astrophysics

    Science.gov (United States)

    Xuran Qian, Penny; Cai, Maxwell Xu; Portegies Zwart, Simon; Zhu, Ming

    2017-09-01

    Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible due to the development of reliable numerical algorithms and the high performance of modern computing technologies. These enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but often other factors cause extensive numerical approaches to break down at an earlier stage. In those cases, processes tend to be interrupted due to unexpected events in the software or the hardware, and the scientist has to handle the interruption manually, which is time-consuming and prone to errors. We present the Simulation Monitor (SiMon) to automate the farming of large and extensive simulation processes. Our method is lightweight, fully automates the entire workflow management, operates concurrently across multiple platforms and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field, and running a simulation becomes analogous to growing crops. With the development of SiMon we relax the technical aspects of simulation management. The initial package was developed for extensive parameter searches in numerical simulations, but it turns out to work equally well for automating the computational processing and reduction of observational data.
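
    The core idea of such a monitor can be sketched in a few lines of Python; the commands below are hypothetical simulation jobs, and this illustration is not SiMon's actual interface.

        import subprocess
        import time

        # Hypothetical simulation jobs to "farm".
        commands = [
            ["python", "run_nbody.py", "--model", "A"],
            ["python", "run_nbody.py", "--model", "B"],
        ]

        procs = {i: subprocess.Popen(cmd) for i, cmd in enumerate(commands)}

        while procs:
            time.sleep(30)  # polling interval
            for i, proc in list(procs.items()):
                code = proc.poll()
                if code is None:
                    continue              # still running
                if code == 0:
                    del procs[i]          # reached its termination condition
                else:
                    # Crashed (software or hardware event): restart it instead
                    # of waiting for a scientist to notice the interruption.
                    procs[i] = subprocess.Popen(commands[i])

    A production tool would additionally restart from the latest checkpoint or snapshot rather than from scratch.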

  13. Local models of astrophysical discs

    Science.gov (United States)

    Latter, Henrik N.; Papaloizou, John

    2017-12-01

    Local models of gaseous accretion discs have been successfully employed for decades to describe an assortment of small-scale phenomena, from instabilities and turbulence, to dust dynamics and planet formation. For the most part, they have been derived in a physically motivated but essentially ad hoc fashion, with some of the mathematical assumptions never made explicit nor checked for consistency. This approach is susceptible to error, and it is easy to derive local models that support spurious instabilities or fail to conserve key quantities. In this paper we present rigorous derivations, based on an asymptotic ordering, and formulate a hierarchy of local models (incompressible, Boussinesq and compressible), making clear which is best suited for a particular flow or phenomenon, while spelling out explicitly the assumptions and approximations of each. We also discuss the merits of the anelastic approximation, emphasizing that anelastic systems struggle to conserve energy unless strong restrictions are imposed on the flow. The problems encountered by the anelastic approximation are exacerbated by the disc's differential rotation, but also attend non-rotating systems such as stellar interiors. We conclude with a defence of local models and their continued utility in astrophysical research.
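
    For concreteness, the simplest member of the family of local models discussed here is the textbook shearing sheet, in which a test particle in a frame corotating with the disc at angular frequency Omega obeys Hill's equations (a standard result, not reproduced from the paper):

        $$ \ddot{x} - 2\Omega\,\dot{y} = 2q\Omega^{2}x, \qquad \ddot{y} + 2\Omega\,\dot{x} = 0, \qquad \ddot{z} = -\Omega^{2}z, $$

    where x, y, z are the local radial, azimuthal and vertical coordinates and q = -d ln Omega / d ln r is the shear rate (q = 3/2 for a Keplerian disc).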

  14. Mass-23 nuclei in astrophysics

    International Nuclear Information System (INIS)

    Fraser, P R; Amos, K; Van der Kniff, D; Canton, L; Karataglidis, S; Svenne, J P

    2015-01-01

    The formation of mass-23 nuclei by radiative capture is of great interest in astrophysics. A topical problem associated with these isobars is the so-called 22Na puzzle of ONe white dwarf novae, where the abundance of 22Na observed is not as predicted by current stellar models, indicating there is more to learn about how the distribution of elements in the universe came about. Another concerns unexplained variations in element abundances on the surface of aging red giant stars. One method for theoretically studying nuclear scattering is the Multi-Channel Algebraic Scattering (MCAS) formalism. Studies to date have used a simple collective-rotor prescription to model the target states which couple to projectile nucleons. While, in general, the target states considered all belong to the ground-state rotor band, for some systems it is necessary to include coupling to states outside of this band. Herein we discuss an extension of MCAS to allow coupling of different strengths between such states and the ground-state band. This consideration is essential when studying the scattering of neutrons from 22Ne, a necessary step in studying the mass-23 nuclei mentioned above. (paper)

  15. FFT-based high-performance spherical harmonic transformation

    Czech Academy of Sciences Publication Activity Database

    Gruber, Ch.; Novák, P.; Sebera, Josef

    2011-01-01

    Roč. 55, č. 3 (2011), s. 489-500 ISSN 0039-3169 Institutional research plan: CEZ:AV0Z10030501 Keywords : 2-D Fourier expansion * geopotential * spherical harmonics Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics Impact factor: 0.700, year: 2011

  16. DOE research in utilization of high-performance computers

    International Nuclear Information System (INIS)

    Buzbee, B.L.; Worlton, W.J.; Michael, G.; Rodrigue, G.

    1980-12-01

    Department of Energy (DOE) and other Government research laboratories depend on high-performance computer systems to accomplish their programmatic goals. As the most powerful computer systems become available, they are acquired by these laboratories so that advances can be made in their disciplines. These advances are often the result of added sophistication to numerical models whose execution is made possible by high-performance computer systems. However, high-performance computer systems have become increasingly complex; consequently, it has become increasingly difficult to realize their potential performance. The result is a need for research on issues related to the utilization of these systems. This report gives a brief description of high-performance computers, and then addresses the use of and future needs for high-performance computers within DOE, the growing complexity of applications within DOE, and areas of high-performance computer systems warranting research. 1 figure

  17. Isometric embeddings in cosmology and astrophysics

    Indian Academy of Sciences (India)

    embedding theory, a given spacetime (or 'brane') is embedded in a higher- ..... If one recalls that the motivation (at least in part) for non-compact extra ... to successfully embed (apparently perfect fluid) astrophysical models, we typically need to.

  18. Transport processes in space physics and astrophysics

    CERN Document Server

    Zank, Gary P

    2014-01-01

    'Transport Processes in Space Physics and Astrophysics' is aimed at graduate level students to provide the necessary mathematical and physics background to understand the transport of gases, charged particle gases, energetic charged particles, turbulence, and radiation in an astrophysical and space physics context. Subjects emphasized in the work include collisional and collisionless processes in gases (neutral or plasma), analogous processes in turbulence fields and radiation fields, and allows for a simplified treatment of the statistical description of the system. A systematic study that addresses the common tools at a graduate level allows students to progress to a point where they can begin their research in a variety of fields within space physics and astrophysics. This book is for graduate students who expect to complete their research in an area of plasma space physics or plasma astrophysics. By providing a broad synthesis in several areas of transport theory and modeling, the work also benefits resear...

  19. Advances in instrumentation for nuclear astrophysics

    Directory of Open Access Journals (Sweden)

    S. D. Pain

    2014-04-01

    The study of the nuclear physics properties which govern energy generation and nucleosynthesis in the astrophysical phenomena we observe in the universe is crucial to understanding how these objects behave and how the chemical history of the universe evolved to its present state. The low cross sections and short nuclear lifetimes involved in many of these reactions make their experimental determination challenging, requiring developments in beams and instrumentation. A selection of developments in nuclear astrophysics instrumentation is discussed, using as examples projects involving the nuclear astrophysics group at Oak Ridge National Laboratory. These developments will be key to the instrumentation necessary to fully exploit nuclear astrophysics opportunities at the Facility for Rare Isotope Beams which is currently under construction.

  20. High Performance Computing in Science and Engineering '08 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2009-01-01

    The discussions and plans on all scientific, advisory, and political levels to realize an even larger “European Supercomputer” in Germany, where the hardware costs alone will be hundreds of millions of Euro – much more than in the past – are getting closer to realization. As part of the strategy, the three national supercomputing centres HLRS (Stuttgart), NIC/JSC (Jülich) and LRZ (Munich) have formed the Gauss Centre for Supercomputing (GCS) as a new virtual organization enabled by an agreement between the Federal Ministry of Education and Research (BMBF) and the state ministries for research of Baden-Württemberg, Bayern, and Nordrhein-Westfalen. Already today, the GCS provides the most powerful high-performance computing infrastructure in Europe. Through GCS, HLRS participates in the European project PRACE (Partnership for Advanced Computing in Europe) and extends its reach to all European member countries. These activities align well with the activities of HLRS in the European HPC infrastructur...

  1. Nuclear astrophysics and nuclei far from stability

    International Nuclear Information System (INIS)

    Schatz, H.

    2003-01-01

    Unstable nuclei play a critical role in a number of astrophysical scenarios and are important for our understanding of the origin of the elements. Among the most important scenarios are the r-process (Supernovae), Novae, X-ray bursters, and Superbursters. For these astrophysical events I review the open questions, recent developments in astronomy, and how nuclear physics, in particular experiments with radioactive beams, needs to contribute to find the answers. (orig.)

  2. Cosmological and astrophysical neutrino mass measurements

    DEFF Research Database (Denmark)

    Abazajian, K.N.; Calabrese, E.; Cooray, A.

    2011-01-01

    Cosmological and astrophysical measurements provide powerful constraints on neutrino masses complementary to those from accelerators and reactors. Here we provide a guide to these different probes, for each explaining its physical basis, underlying assumptions, current and future reach.
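
    A worked example of the cosmological side of this complementarity is the standard relation between the summed neutrino mass and the relic-neutrino energy density (a textbook relation, not quoted from the abstract):

        $$ \Omega_{\nu} h^{2} \simeq \frac{\sum m_{\nu}}{93\ \mathrm{eV}}, $$

    so a summed mass of 0.1 eV contributes Omega_nu h^2 of about 10^-3, which is the level at which large-scale structure and CMB surveys can bound the masses.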

  3. Technology Development for a Neutrino Astrophysical Observatory

    International Nuclear Information System (INIS)

    Chaloupka, V.; Cole, T.; Crawford, H.J.; He, Y.D.; Jackson, S.; Kleinfelder, S.; Lai, K.W.; Learned, J.; Ling, J.; Liu, D.; Lowder, D.; Moorhead, M.; Morookian, J.M.; Nygren, D.R.; Price, P.B.; Richards, A.; Shapiro, G.; Shen, B.; Smoot, George F.; Stokstad, R.G.; VanDalen, G.; Wilkes, J.; Wright, F.; Young, K.

    1996-01-01

    We propose a set of technology developments relevant to the design of an optimized Cerenkov detector for the study of neutrino interactions of astrophysical interest. Emphasis is placed on signal processing innovations that enhance significantly the quality of primary data. These technical advances, combined with field experience from a follow-on test deployment, are intended to provide a basis for the engineering design for a kilometer-scale Neutrino Astrophysical Observatory

  4. Astrophysics at nTOF facility

    International Nuclear Information System (INIS)

    Tagliente, G.; Colonna, N.; Maronne, S.; Terlizzi, R.; Abondanno, U.; Fujii, K.; Milazzo, P.M.; Moreau, C.; Belloni, F.; Aerts, G.; Berthoumieux, E.; Andriamonje, S.; Dridi, W.; Gunsing, F.; Pancin, J.; Perrot, L.; Alvarez, H.; Duran, I.; Paradela, C.; Alvarez-Velarde, F.; Cano-Ott, D.; Embid-Segura, M.; Guerrero, C.; Martinez, T.; Villamarin, D.; Vincente, M.C.; Gonzalez-Romero, E.; Andrzejewski, J.; Marganiec, J.; Assimakopoulos, P.; Karamanis, D.; Audouin, L.; Dillman, I.; Heil, M.; Kappeler, F.; Mosconi, M.; Plag, R.; Voss, F.; Walter, S.; Wissak, K.; Badurek, G.; Jericha, E.; Leeb, H.; Oberhummer, H.; Pigni, M.T.; Baumann, P.; David, S.; Kerveno, M.; Rudolf, G.; Lukic, S.; Becvar, F.; Krticka, M.; Bisterzo, S.; Ferrant, L.; Gallino, R.; Calvino, F.; Poch, A.; Pretel, C.; Calviani, M.; Gramegna, F.; Mastinu, P.; Capote, R.; Mengoni, A.; Capote, R.; Lozano, M.; Quesada, J.; Carrapico, C.; Salgado, J.; Santos, C.; Tavora, L.; Vaz, P.; Cennini, P.; Chiaveri, E.; Dahlfors, M.; Kadi, Y.; Sarchiapone, L.; Vlachoudis, V.; Wendler, H.; Chepel, V.; Ferreira-Marques, R.; Goncalves, I.; Lindote, A.; Lopes, I.; Neves, F.; Couture, A.; Cox, J.; O'Brien, S.; Wiescher, M.; Dominga-Pardo, C.; Tain, J.L.; Eleftheriadis, C.; Lamboudis, C.; Savvidis, I.; Stephan, C.; Tassan-Got, L.; Furman, W.; Haas, B.; Haight, R.; Reifarth, R.; Igashira, M.; Koehler, P.; Massimi, C.; Vannini, G.; Papadopoulos, C.; Pavlik, A.; Pavlopoulos, P.; Plomen, A.; Rullhusen, P.; Rauscher, T.; Rubbia, C.; Ventura, A.

    2009-01-01

    The neutron time-of-flight (n_TOF) facility at CERN is a neutron spallation source; its white neutron energy spectrum ranges from thermal to several GeV, covering the full energy range of interest for nuclear astrophysics, in particular for measurements of the neutron capture cross-sections required in s-process nucleosynthesis. This contribution gives an overview of the astrophysics program carried out at the n_TOF facility; the results and their implications are considered.
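
    The quantity such capture measurements feed into stellar s-process models is the Maxwellian-averaged cross section (MACS); its standard definition (a textbook expression, not taken from this contribution) is

        $$ \langle\sigma\rangle_{kT} = \frac{2}{\sqrt{\pi}}\,\frac{1}{(kT)^{2}} \int_{0}^{\infty} \sigma(E)\, E\, e^{-E/kT}\, dE, $$

    evaluated at thermal energies kT from a few keV up to roughly 100 keV, the range relevant to stellar s-process sites; the wide white spectrum of a spallation source is what allows sigma(E) to be measured across the whole integrand.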

  5. Inclusive vision for high performance computing at the CSIR

    CSIR Research Space (South Africa)

    Gazendam, A

    2006-02-01

    and computationally intensive applications. A number of different technologies and standards were identified as core to the open and distributed high-performance infrastructure envisaged...

  6. Hydrodynamic instabilities in astrophysics and ICF

    International Nuclear Information System (INIS)

    Paul Drake, R.

    2005-01-01

    Inertial fusion systems and astrophysical systems both involve hydrodynamic effects, including sources of pressure, shock waves, rarefactions, and plasma flows. In the evolution of such systems, hydrodynamic instabilities naturally evolve. As a result, a fundamental understanding of hydrodynamic instabilities is necessary to understand their behavior. In addition, high-energy-density facilities designed for ICF purposes can be used to provide an experimental basis for understanding astrophysical processes. In this talk, I will discuss the instabilities that appear in astrophysics and ICF from the common perspective of the basic mechanisms at work. Examples will be taken from experiments aimed at ICF, from astrophysical systems, and from experiments using ICF systems to address issues in astrophysics. The high-energy-density research facilities of today can accelerate small but macroscopic amounts of material to velocities above 100 km/s, can heat such material to temperatures above 100 eV, can produce pressures far above a million atmospheres (10^12 dynes/cm^2 or 0.1 TPa), and can do experiments under these conditions that address basic physics issues. This enables one to devise experiments aimed directly at important processes such as the Rayleigh-Taylor instability at an ablating surface or at an embedded interface that is accelerating, the Richtmyer-Meshkov evolution of shocked interfaces, and the Kelvin-Helmholtz instability of shear flows. The talk will include examples of such phenomena from the laboratory and from astrophysics, and will discuss experiments to study them. (Author)
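
    As a reminder of the basic mechanism common to both settings, the classical linear growth rate of the Rayleigh-Taylor instability for an incompressible, sharp interface (a textbook result, not specific to this talk) is

        $$ \gamma = \sqrt{A\,k\,g}, \qquad A = \frac{\rho_{2}-\rho_{1}}{\rho_{2}+\rho_{1}}, $$

    where k is the perturbation wavenumber, g the effective acceleration, and A the Atwood number; ablation, compressibility and finite density gradients modify this rate in the ICF and astrophysical cases discussed above.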

  7. Nuclear astrophysics: a new era

    Energy Technology Data Exchange (ETDEWEB)

    Wiescher, Michael; Aprahamian, Ani [Department of Physics, University of Notre Dame (United States); Regan, Paddy [Department of Physics, University of Surrey (United Kingdom)

    2002-02-01

    The latest generation of radioactive-ion-beam facilities promises to shed light on the complex nuclear processes that control the evolution of stars and stellar explosions. The most fundamental question in nature is where do we come from, or, put another way, what are we made of? The late Carl Sagan poetically said that we are all made of stardust, but the origin of the elements has fascinated scientists for thousands of years. Many of the greatest medieval and renaissance scientists dabbled in alchemy, trying to create the elements that make up the cosmos, but we had to wait until the early 20th century to recognize that elements are really defined by the number of protons in the nucleus. According to our current understanding, after the big bang most of the normal or baryonic material in the universe consisted of the lightest two elements, hydrogen and helium, with only trace amounts of lithium and beryllium. All the heavier elements that occur naturally on Earth were created from this original material via a series of nuclear reactions in the cores of stars or in stellar explosions. Over the last decade, ground-based telescopes and satellite-based observatories have opened new windows on the stars across the electromagnetic spectrum, from infrared to gamma radiation. New technology now makes it possible to observe and analyse short-lived stellar explosions. Indeed, the distribution of elements in 'planetary nebulae' and in the ejecta of supernovae and novae gives a direct glimpse of individual nucleosynthesis processes. In the February issue of Physics World, Michael Wiescher, Paddy Regan and Ani Aprahamian describe how state-of-the-art facilities are set to plug many of the gaps in our understanding of nuclear astrophysics. (U.K.)

  8. Distance Measurement Solves Astrophysical Mysteries

    Science.gov (United States)

    2003-08-01

    Location, location, and location. The old real-estate adage about what's really important proved applicable to astrophysics as astronomers used the sharp radio "vision" of the National Science Foundation's Very Long Baseline Array (VLBA) to pinpoint the distance to a pulsar. Their accurate distance measurement then resolved a dispute over the pulsar's birthplace, allowed the astronomers to determine the size of its neutron star and possibly solve a mystery about cosmic rays. "Getting an accurate distance to this pulsar gave us a real bonanza," said Walter Brisken, of the National Radio Astronomy Observatory (NRAO) in Socorro, NM. [Image: the Monogem Ring in X-rays, observed by the ROSAT satellite. Credit: Max-Planck Institute / American Astronomical Society.] The pulsar, called PSR B0656+14, is in the constellation Gemini, and appears to be near the center of a circular supernova remnant that straddles Gemini and its neighboring constellation, Monoceros, and is thus called the Monogem Ring. Since pulsars are superdense, spinning neutron stars left over when a massive star explodes as a supernova, it was logical to assume that the Monogem Ring, the shell of debris from a supernova explosion, was the remnant of the blast that created the pulsar. However, astronomers using indirect methods of determining the distance to the pulsar had concluded that it was nearly 2500 light-years from Earth. On the other hand, the supernova remnant was determined to be only about 1000 light-years from Earth. It seemed unlikely that the two were related, but instead appeared nearby in the sky purely by a chance juxtaposition. Brisken and his colleagues used the VLBA to make precise measurements of the sky position of PSR B0656+14 from 2000 to 2002. They were able to detect the slight offset in the object's apparent position when viewed from opposite sides of Earth's orbit around the Sun. This effect, called parallax, provides a direct measurement of the pulsar's distance.
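
    The parallax-to-distance conversion behind this measurement is the elementary astrometric relation (standard, not taken from the press release):

        $$ d\,[\mathrm{pc}] = \frac{1}{\pi\,[\mathrm{arcsec}]}, $$

    so a parallax of a few milliarcseconds, the level the VLBA can reach, corresponds to a distance of a few hundred parsecs (roughly a thousand light-years), which is what allows the technique to discriminate between the two distance estimates quoted above.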

  9. Control switching in high performance and fault tolerant control

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Poulsen, Niels Kjølstad

    2010-01-01

    The problem of reliability in high performance control and in fault tolerant control is considered in this paper. A feedback controller architecture for high performance and fault tolerance is considered. The architecture is based on the Youla-Jabr-Bongiorno-Kucera (YJBK) parameterization. By usi...

  10. Mechanical Properties of High Performance Cementitious Grout (II)

    DEFF Research Database (Denmark)

    Sørensen, Eigil V.

    The present report is an update of the report “Mechanical Properties of High Performance Cementitious Grout (I)” [1] and describes tests carried out on the high performance grout MASTERFLOW 9500, marked “WMG 7145 FP”, developed by BASF Construction Chemicals A/S and designed for use in grouted...

  11. Development of new high-performance stainless steels

    International Nuclear Information System (INIS)

    Park, Yong Soo

    2002-01-01

    This paper focused on high-performance stainless steels and their development status. The effect of nitrogen addition on super-stainless steel was discussed. Research activities at Yonsei University on austenitic and martensitic high-performance stainless steels and the next-generation duplex stainless steels were introduced.

  12. YT: A Multi-Code Analysis Toolkit for Astrophysical Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Turk, Matthew J.; /San Diego, CASS; Smith, Britton D.; /Michigan State U.; Oishi, Jeffrey S.; /KIPAC, Menlo Park /Stanford U., Phys. Dept.; Skory, Stephen; Skillman, Samuel W.; /Colorado U., CASA; Abel, Tom; /KIPAC, Menlo Park /Stanford U., Phys. Dept.; Norman, Michael L.; /aff San Diego, CASS

    2011-06-23

    The analysis of complex multiphysics astrophysical simulations presents a unique and rapidly growing set of challenges: reproducibility, parallelization, and vast increases in data size and complexity chief among them. In order to meet these challenges, and in order to open up new avenues for collaboration between users of multiple simulation platforms, we present yt (available at http://yt.enzotools.org/), an open source, community-developed astrophysical analysis and visualization toolkit. Analysis and visualization with yt are oriented around physically relevant quantities rather than quantities native to astrophysical simulation codes. While originally designed for handling Enzo's structured adaptive mesh refinement data, yt has been extended to work with several different simulation methods and simulation codes including Orion, RAMSES, and FLASH. We report on its methods for reading, handling, and visualizing data, including projections, multivariate volume rendering, multi-dimensional histograms, halo finding, light cone generation, and topologically connected isocontour identification. Furthermore, we discuss the underlying algorithms yt uses for processing and visualizing data, and its mechanisms for parallelization of analysis tasks.
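
    A minimal example of the physically oriented workflow described here, using the modern yt interface (the field and plot conventions follow current yt releases, which postdate this paper, and the dataset path is a placeholder):

        import yt

        # Load a simulation output; yt auto-detects Enzo, FLASH, RAMSES, etc.
        ds = yt.load("simulation_output_0010")

        # Work with physically meaningful derived quantities rather than
        # code-native arrays.
        ad = ds.all_data()
        print(ad.quantities.total_mass())

        # Project gas density along the z-axis and save the image.
        plot = yt.ProjectionPlot(ds, "z", ("gas", "density"))
        plot.save("density_projection.png")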

  13. yt: A MULTI-CODE ANALYSIS TOOLKIT FOR ASTROPHYSICAL SIMULATION DATA

    International Nuclear Information System (INIS)

    Turk, Matthew J.; Norman, Michael L.; Smith, Britton D.; Oishi, Jeffrey S.; Abel, Tom; Skory, Stephen; Skillman, Samuel W.

    2011-01-01

    The analysis of complex multiphysics astrophysical simulations presents a unique and rapidly growing set of challenges: reproducibility, parallelization, and vast increases in data size and complexity chief among them. In order to meet these challenges, and in order to open up new avenues for collaboration between users of multiple simulation platforms, we present yt (available at http://yt.enzotools.org/), an open source, community-developed astrophysical analysis and visualization toolkit. Analysis and visualization with yt are oriented around physically relevant quantities rather than quantities native to astrophysical simulation codes. While originally designed for handling Enzo's structured adaptive mesh refinement data, yt has been extended to work with several different simulation methods and simulation codes including Orion, RAMSES, and FLASH. We report on its methods for reading, handling, and visualizing data, including projections, multivariate volume rendering, multi-dimensional histograms, halo finding, light cone generation, and topologically connected isocontour identification. Furthermore, we discuss the underlying algorithms yt uses for processing and visualizing data, and its mechanisms for parallelization of analysis tasks.

  14. A low cost, high performance, 1.2m off-axis telescope built with NG-Xinetics silicon carbide

    Science.gov (United States)

    Rey, Justin J.; Wellman, John A.; Egan, Richard G.; Wollensak, Richard J.

    2011-09-01

    The search for extrasolar habitable planets is one of three major astrophysics priorities identified for the next decade. These missions demand very high performance visible-wavelength optical imaging systems. Such high performance space telescopes are typically extremely expensive and can be difficult for government agencies to afford in today's economic climate, and most lower cost systems offer little benefit because they fall short on at least one of the following three key performance parameters: imaging wavelength, total system-level wavefront error and aperture diameter. Northrop Grumman Xinetics has developed a simple, lightweight, low-cost telescope design that will address the near-term science objectives of this astrophysics theme with the required optical performance, while reducing the telescope cost by an order of magnitude. Breakthroughs in SiC mirror manufacturing, integrated wavefront sensing, and high-TRL deformable mirror technology have finally been combined within the same organization to offer a complete end-to-end telescope system in the lower end of the Class D cost range. This paper presents the latest results of real OAP polishing and metrology data, an optimized optical design, and finite-element-derived WFE.

  15. Plasma Astrophysics, part II Reconnection and Flares

    CERN Document Server

    Somov, Boris V

    2007-01-01

    This well-illustrated monograph is devoted to classic fundamentals, current practice, and perspectives of modern plasma astrophysics. The first part is unique in covering all the basic principles and practical tools required for understanding and working in plasma astrophysics. The second part presents the physics of magnetic reconnection and flares of electromagnetic origin in space plasmas within the solar system; single and double stars, relativistic objects, accretion disks, and their coronae are also covered. This book is designed mainly for professional researchers in astrophysics. However, it will also be interesting and useful to graduate students in space sciences, geophysics, as well as advanced students in applied physics and mathematics seeking a unified view of plasma physics and fluid mechanics.

  16. Plasma Astrophysics, Part I Fundamentals and Practice

    CERN Document Server

    Somov, Boris V

    2006-01-01

    This well-illustrated monograph is devoted to classic fundamentals, current practice, and perspectives of modern plasma astrophysics. The first part is unique in covering all the basic principles and practical tools required for understanding and working in plasma astrophysics. The second part presents the physics of magnetic reconnection and flares of electromagnetic origin in space plasmas within the solar system; single and double stars, relativistic objects, accretion disks, and their coronae are also covered. This book is designed mainly for professional researchers in astrophysics. However, it will also be interesting and useful to graduate students in space sciences, geophysics, as well as advanced students in applied physics and mathematics seeking a unified view of plasma physics and fluid mechanics.

  17. Astrophysical disks Collective and Stochastic Phenomena

    CERN Document Server

    Fridman, Alexei M; Kovalenko, Ilya G

    2006-01-01

    The book deals with collective and stochastic processes in astrophysical discs involving theory, observations, and the results of modelling. Among others, it examines the spiral-vortex structure in galactic and accretion disks , stochastic and ordered structures in the developed turbulence. It also describes sources of turbulence in the accretion disks, internal structure of disk in the vicinity of a black hole, numerical modelling of Be envelopes in binaries, gaseous disks in spiral galaxies with shock waves formation, observation of accretion disks in a binary system and mass distribution of luminous matter in disk galaxies. The editors adaptly brought together collective and stochastic phenomena in the modern field of astrophysical discs, their formation, structure, and evolution involving the methodology to deal with, the results of observation and modelling, thereby advancing the study in this important branch of astrophysics and benefiting Professional Researchers, Lecturers, and Graduate Students.

  18. The Astrophysical Multimessenger Observatory Network (AMON)

    Science.gov (United States)

    Smith, M. W. E.; Fox, D. B.; Cowen, D. F.; Meszaros, P.; Tesic, G.; Fixelle, J.; Bartos, I.; Sommers, P.; Ashtekar, Abhay; Babu, G. Jogesh; et al.

    2013-01-01

    We summarize the science opportunity, design elements, current and projected partner observatories, and anticipated science returns of the Astrophysical Multimessenger Observatory Network (AMON). AMON will link multiple current and future high-energy, multimessenger, and follow-up observatories together into a single network, enabling near real-time coincidence searches for multimessenger astrophysical transients and their electromagnetic counterparts. Candidate and high-confidence multimessenger transient events will be identified, characterized, and distributed as AMON alerts within the network and to interested external observers, leading to follow-up observations across the electromagnetic spectrum. In this way, AMON aims to evoke the discovery of multimessenger transients from within observatory subthreshold data streams and facilitate the exploitation of these transients for purposes of astronomy and fundamental physics. As a central hub of global multimessenger science, AMON will also enable cross-collaboration analyses of archival datasets in search of rare or exotic astrophysical phenomena.

  19. Astrophysical observations: lensing and eclipsing Einstein's theories.

    Science.gov (United States)

    Bennett, Charles L

    2005-02-11

    Albert Einstein postulated the equivalence of energy and mass, developed the theory of special relativity, explained the photoelectric effect, and described Brownian motion in five papers, all published in 1905, 100 years ago. With these papers, Einstein provided the framework for understanding modern astrophysical phenomena. Conversely, astrophysical observations provide one of the most effective means for testing Einstein's theories. Here, I review astrophysical advances precipitated by Einstein's insights, including gravitational redshifts, gravitational lensing, gravitational waves, the Lense-Thirring effect, and modern cosmology. A complete understanding of cosmology, from the earliest moments to the ultimate fate of the universe, will require developments in physics beyond Einstein, to a unified theory of gravity and quantum physics.

  20. Astrophysical hints of axion-like particles

    Science.gov (United States)

    Roncadelli, M.; Galanti, G.; Tavecchio, F.; Bonnoli, G.

    2015-01-01

    After reviewing three astrophysical hints of the existence of axion-like particles (ALPs), we describe in more detail a new similar hint involving flat spectrum radio quasars (FSRQs). Detection of FSRQs above about 20 GeV poses a challenge to very-high-energy (VHE) astrophysics, because at those energies the ultraviolet emission from their broad line region should prevent photons produced by the central engine from leaving the source. Although a few astrophysical explanations have been put forward, they are totally ad hoc. We show that a natural explanation instead arises within the conventional models of FSRQs provided that photon-ALP oscillations occur inside the source. Our analysis takes the FSRQ PKS 1222+216 as an example, and it looks tantalizing that basically the same choice of the free model parameters adopted in this case is consistent with those that provide the other three hints of the existence of ALPs.

  1. The Astrophysics Science Division Annual Report 2008

    Science.gov (United States)

    Oegerle, William; Reddy, Francis; Tyler, Pat

    2009-01-01

    The Astrophysics Science Division (ASD) at Goddard Space Flight Center (GSFC) is one of the largest and most diverse astrophysical organizations in the world, with activities spanning a broad range of topics in theory, observation, and mission and technology development. Scientific research is carried out over the entire electromagnetic spectrum, from gamma rays to radio wavelengths, as well as in particle physics and gravitational radiation. Members of ASD also provide the scientific operations for three orbiting astrophysics missions: WMAP, RXTE, and Swift, as well as the Science Support Center for the Fermi Gamma-ray Space Telescope. A number of key technologies for future missions are also under development in the Division, including X-ray mirrors and new detectors operating at gamma-ray, X-ray, ultraviolet, infrared, and radio wavelengths. This report includes the Division's activities during 2008.

  2. Doppler tomography in fusion plasmas and astrophysics

    DEFF Research Database (Denmark)

    Salewski, Mirko; Geiger, B.; Heidbrink, W. W.

    2015-01-01

    Doppler tomography is a well-known method in astrophysics to image the accretion flow, often in the shape of thin discs, in compact binary stars. As accretion discs rotate, all emitted line radiation is Doppler-shifted. In fast-ion Dα (FIDA) spectroscopy measurements in magnetically confined plasma …, the Dα-photons are likewise Doppler-shifted ultimately due to gyration of the fast ions. In either case, spectra of Doppler-shifted line emission are sensitive to the velocity distribution of the emitters. Astrophysical Doppler tomography has led to images of accretion discs of binaries revealing bright … and limits, analogies and differences in astrophysical and fusion plasma Doppler tomography and what can be learned by comparison of these applications …
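
    Both applications rest on the same elementary relation, reproduced here only as a reminder (standard textbook physics, not taken from the paper): line emission from a source moving with line-of-sight velocity v_los is shifted from its rest wavelength, so a measured spectrum encodes a projection of the emitters' velocity distribution.

```latex
% Non-relativistic Doppler relation (v_los << c) underlying both
% the accretion-disc and FIDA applications discussed above.
\[
  \lambda_{\mathrm{obs}} \simeq \lambda_0 \left(1 + \frac{v_{\mathrm{los}}}{c}\right),
  \qquad
  \Delta\lambda = \lambda_{\mathrm{obs}} - \lambda_0 \;\propto\; v_{\mathrm{los}} .
\]
```

    Tomographic inversion then attempts to recover the full velocity-space distribution of the emitters from many such spectral projections.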

  3. Energy Design Guidelines for High Performance Schools: Tropical Island Climates

    Energy Technology Data Exchange (ETDEWEB)

    None

    2004-11-01

    Design guidelines outline high performance principles for the new or retrofit design of K-12 schools in tropical island climates. By incorporating energy improvements into construction or renovation plans, schools can reduce energy consumption and costs.

  4. Decal electronics for printed high performance cmos electronic systems

    KAUST Repository

    Hussain, Muhammad Mustafa; Sevilla, Galo Torres; Cordero, Marlon Diaz; Kutbee, Arwa T.

    2017-01-01

    High performance complementary metal oxide semiconductor (CMOS) electronics are critical for any full-fledged electronic system. However, state-of-the-art CMOS electronics are rigid and bulky, making them unusable for flexible electronic applications.

  5. Brain inspired high performance electronics on flexible silicon

    KAUST Repository

    Sevilla, Galo T.; Rojas, Jhonathan Prieto; Hussain, Muhammad Mustafa

    2014-01-01

    The brain's stunning speed, energy efficiency and massive parallelism make it the role model for upcoming high performance computation systems. Although human brain components are a million times slower than state-of-the-art silicon industry components

  6. Enabling High-Performance Computing as a Service

    KAUST Repository

    AbdelBaky, Moustafa; Parashar, Manish; Kim, Hyunjoo; Jordan, Kirk E.; Sachdeva, Vipin; Sexton, James; Jamjoom, Hani; Shae, Zon-Yin; Pencheva, Gergina; Tavakoli, Reza; Wheeler, Mary F.

    2012-01-01

    With the right software infrastructure, clouds can provide scientists with as-a-service access to high-performance computing resources. An award-winning prototype framework transforms the Blue Gene/P system into an elastic cloud to run a

  7. Mechanical Properties of High Performance Cementitious Grout Masterflow 9200

    DEFF Research Database (Denmark)

    Sørensen, Eigil V.

    The present report describes tests carried out on the high performance grout Masterflow 9200, developed by BASF Construction Chemicals A/S and designed for use in grouted connections of windmill foundations.

  8. Implementation of a high performance parallel finite element micromagnetics package

    International Nuclear Information System (INIS)

    Scholz, W.; Suess, D.; Dittrich, R.; Schrefl, T.; Tsiantos, V.; Forster, H.; Fidler, J.

    2004-01-01

    A new high performance scalable parallel finite element micromagnetics package has been implemented. It includes solvers for static energy minimization, time integration of the Landau-Lifshitz-Gilbert equation, and the nudged elastic band method
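
    For reference, the time-integration target named in this record is the standard Landau-Lifshitz-Gilbert equation, quoted here in its usual Gilbert form (textbook notation, not specific to this package):

```latex
% Landau-Lifshitz-Gilbert equation in Gilbert form:
% gamma = gyromagnetic ratio, alpha = Gilbert damping constant,
% M_s = saturation magnetization, H_eff = effective field.
\[
  \frac{\partial \mathbf{M}}{\partial t}
    = -\gamma\, \mathbf{M} \times \mathbf{H}_{\mathrm{eff}}
      + \frac{\alpha}{M_s}\, \mathbf{M} \times \frac{\partial \mathbf{M}}{\partial t} .
\]
```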

  9. High Performance Low Mass Nanowire Enabled Heatpipe, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Illuminex Corporation proposes a NASA Phase I SBIR project to develop high performance, lightweight, low-profile heat pipes with enhanced thermal transfer properties...

  10. High Performance Thin-Film Composite Forward Osmosis Membrane

    KAUST Repository

    Yip, Ngai Yin; Tiraferri, Alberto; Phillip, William A.; Schiffman, Jessica D.; Elimelech, Menachem

    2010-01-01

    obstacle hindering further advancements of this technology. This work presents the development of a high performance thin-film composite membrane for forward osmosis applications. The membrane consists of a selective polyamide active layer formed

  11. High Performance Low Mass Nanowire Enabled Heatpipe, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Heat pipes are widely used for passive, two-phase electronics cooling. As advanced high power, high performance electronics in space based and terrestrial...

  12. Laboratory Astrophysics Division of The AAS (LAD)

    Science.gov (United States)

    Salama, Farid; Drake, R. P.; Federman, S. R.; Haxton, W. C.; Savin, D. W.

    2012-10-01

    The purpose of the Laboratory Astrophysics Division (LAD) is to advance our understanding of the Universe through the promotion of fundamental theoretical and experimental research into the underlying processes that drive the Cosmos. LAD represents all areas of astrophysics and planetary sciences. The first new AAS Division in more than 30 years, the LAD traces its history back to the recommendation from the scientific community via the White Paper from the 2006 NASA-sponsored Laboratory Astrophysics Workshop. This recommendation was endorsed by the Astronomy and Astrophysics Advisory Committee (AAAC), which advises the National Science Foundation (NSF), the National Aeronautics and Space Administration (NASA), and the U.S. Department of Energy (DOE) on selected issues within the fields of astronomy and astrophysics that are of mutual interest and concern to the agencies. In January 2007, at the 209th AAS meeting, the AAS Council set up a Steering Committee to formulate Bylaws for a Working Group on Laboratory Astrophysics (WGLA). The AAS Council formally established the WGLA with a five-year mandate in May 2007, at the 210th AAS meeting. From 2008 through 2012, the WGLA annually sponsored Meetings in-a-Meeting at the AAS Summer Meetings. In May 2011, at the 218th AAS meeting, the AAS Council voted to convert the WGLA, at the end of its mandate, into a Division of the AAS and requested draft Bylaws from the Steering Committee. In January 2012, at the 219th AAS Meeting, the AAS Council formally approved the Bylaws and the creation of the LAD. The inaugural gathering and the first business meeting of the LAD were held at the 220th AAS meeting in Anchorage in June 2012. You can learn more about LAD by visiting its website at http://lad.aas.org/ and by subscribing to its mailing list.

  13. Laboratory Astrophysics Division of the AAS (LAD)

    Science.gov (United States)

    Salama, Farid; Drake, R. P.; Federman, S. R.; Haxton, W. C.; Savin, D. W.

    2012-01-01

    The purpose of the Laboratory Astrophysics Division (LAD) is to advance our understanding of the Universe through the promotion of fundamental theoretical and experimental research into the underlying processes that drive the Cosmos. LAD represents all areas of astrophysics and planetary sciences. The first new AAS Division in more than 30 years, the LAD traces its history back to the recommendation from the scientific community via the White Paper from the 2006 NASA-sponsored Laboratory Astrophysics Workshop. This recommendation was endorsed by the Astronomy and Astrophysics Advisory Committee (AAAC), which advises the National Science Foundation (NSF), the National Aeronautics and Space Administration (NASA), and the U.S. Department of Energy (DOE) on selected issues within the fields of astronomy and astrophysics that are of mutual interest and concern to the agencies. In January 2007, at the 209th AAS meeting, the AAS Council set up a Steering Committee to formulate Bylaws for a Working Group on Laboratory Astrophysics (WGLA). The AAS Council formally established the WGLA with a five-year mandate in May 2007, at the 210th AAS meeting. From 2008 through 2012, the WGLA annually sponsored Meetings in-a-Meeting at the AAS Summer Meetings. In May 2011, at the 218th AAS meeting, the AAS Council voted to convert the WGLA, at the end of its mandate, into a Division of the AAS and requested draft Bylaws from the Steering Committee. In January 2012, at the 219th AAS Meeting, the AAS Council formally approved the Bylaws and the creation of the LAD. The inaugural gathering and the first business meeting of the LAD were held at the 220th AAS meeting in Anchorage in June 2012. You can learn more about LAD by visiting its website at http://lad.aas.org/ and by subscribing to its mailing list.

  14. High Performing Greenways Design: A Case Study of Gainesville, GA

    OpenAIRE

    AKPINAR, Abdullah

    2015-01-01

    Greenways play a significant role in structuring and developing our living environment in urban as well as suburban areas. They provide many recreational, environmental, ecological, social, educational, and economical benefits to cities. This article questions what makes high performing greenways by exploring the concept, history, and development of greenways in the United States. The paper illustrates the concept of linked open spaces and high performing urban greenways in residential commun...

  15. High Performing Greenways Design: A Case Study of Gainesville, GA

    OpenAIRE

    AKPINAR, Abdullah

    2014-01-01

    Greenways play a significant role in structuring and developing our living environment in urban as well as suburban areas. They provide many recreational, environmental, ecological, social, educational, and economical benefits to cities. This article questions what makes high performing greenways by exploring the concept, history, and development of greenways in the United States. The paper illustrates the concept of linked open spaces and high performing urban greenways in residential commun...

  16. Highlighting High Performance: Whitman Hanson Regional High School; Whitman, Massachusetts

    Energy Technology Data Exchange (ETDEWEB)

    2006-06-01

    This brochure describes the key high-performance building features of the Whitman-Hanson Regional High School. The brochure was paid for by the Massachusetts Technology Collaborative as part of their Green Schools Initiative. High-performance features described are daylighting and energy-efficient lighting, indoor air quality, solar and wind energy, building envelope, heating and cooling systems, water conservation, and acoustics. Energy cost savings are also discussed.

  17. High Performance Computing Modernization Program Kerberos Throughput Test Report

    Science.gov (United States)

    2017-10-26

    Naval Research Laboratory, Washington, DC 20375-5320. NRL/MR/5524--17-9751: High Performance Computing Modernization Program Kerberos Throughput Test Report, by Daniel G. Gdula et al.

  18. Energy Design Guidelines for High Performance Schools: Tropical Island Climates

    Energy Technology Data Exchange (ETDEWEB)

    2004-11-01

    The Energy Design Guidelines for High Performance Schools--Tropical Island Climates provides school boards, administrators, and design staff with guidance to help them make informed decisions about energy and environmental issues important to school systems and communities. These design guidelines outline high performance principles for the new or retrofit design of your K-12 school in tropical island climates. By incorporating energy improvements into their construction or renovation plans, schools can significantly reduce energy consumption and costs.

  19. MICA: The Meta-Institute for Computational Astrophysics

    Science.gov (United States)

    McMillan, Stephen L. W.; Djorgovski, S. G.; Hut, P.; Vesperini, E.; Knop, R.; Portegies Zwart, S.

    2009-05-01

    We describe MICA, the Meta Institute for Computational Astrophysics, the first professional scientific and educational, non-profit organization based in virtual worlds [VWs]. Most MICA activities are currently conducted in Second Life, arguably the most popular and best developed VW; we plan to expand our presence into other VWs as those venues evolve. The goals of MICA include (1) exploration, development and promotion of VWs and virtual reality [VR] technologies for professional research in astronomy and related fields; (2) development of novel networking venues and mechanisms for virtual scientific communication and interaction, including professional meetings, visualization, and telecollaboration; (3) use of VWs and VR technologies for education and public outreach; and (4) exchange of ideas and joint efforts with other scientific disciplines in promoting these goals for science and scholarship in general. We present representative examples of MICA activities and achievements, and outline plans for expansion of the organization. For more information on MICA, please visit http://mica-vw.org.

  20. Byurakan Astrophysical Observatory as Cultural Centre

    Science.gov (United States)

    Mickaelian, A. M.; Farmanyan, S. V.

    2017-07-01

    NAS RA V. Ambartsumian Byurakan Astrophysical Observatory is presented as a cultural centre for Armenia and the Armenian nation in general. Besides being a scientific and educational centre, the Observatory is famous for its unique architectural ensemble, rich botanical garden and world of birds, and it is one of the most frequently visited sights in Armenia. In recent years, the Observatory has also taken the initiative to coordinate Cultural Astronomy in Armenia and, in this field, unites astronomers, historians, archaeologists, ethnographers, culturologists, literary critics, linguists, art historians and other experts. Keywords: Byurakan Astrophysical Observatory, architecture, botanic garden, tourism, Cultural Astronomy.

  1. Theoretical nuclear structure and astrophysics at FAIR

    International Nuclear Information System (INIS)

    Rodríguez, Tomás R

    2014-01-01

    The next generation of radioactive ion beam facilities, such as FAIR, will open a bright future for nuclear structure and nuclear astrophysics research. In particular, very exotic (mainly neutron-rich) isotopes will be produced, and a wealth of new experimental data will help to test and improve the current nuclear models. In addition, these data (masses, reaction cross sections, beta decay half-lives, etc.), combined with the development of better theoretical approaches, will be used as the nuclear physics input for astrophysical simulations. In this presentation I will review some of the state-of-the-art nuclear structure methods and their applications.

  2. Sources and astrophysical effects of gravitational waves

    International Nuclear Information System (INIS)

    Rees, M.J.

    1974-01-01

    The probable sources of short intense gravitational wave emissions are discussed and it is concluded, on the basis of current astrophysical ideas, that the number of events detected by an apparatus such as Weber's would not be more than one pulse per century. Some proposed explanations of a higher event rate are examined briefly, but it is suggested that the sensitivity would probably have to be improved by a factor of 10^8 if a few events per year due to extragalactic supernovae are to be detectable. The article concludes by mentioning several other kinds of gravitational waves of potential interest in astrophysics.

  3. Advances in astronomy and astrophysics 9

    CERN Document Server

    Kopal, Zdenek

    1972-01-01

    Advances in Astronomy and Astrophysics, Volume 9 covers reviews on the advances in astronomy and astrophysics. The book presents reviews on the Roche model and its applications to close binary systems. The text then describes the part played by lunar eclipses in the evolution of astronomy; the classical theory of lunar eclipses; deviations from geometrical theory; and the methods of photometric observations of eclipses. The problems of other phenomena related in one way or another to lunar eclipses are also considered. The book further tackles the infrared observation on the eclipsed moon, as

  4. Advances in astronomy and astrophysics 7

    CERN Document Server

    Kopal, Zdenek

    2013-01-01

    Advances in Astronomy and Astrophysics, Volume 7 covers reviews about the advances in astronomy and astrophysics. The book presents reviews on the scattering of electrons by diatomic molecules and on Babcock's theory of the 22-year solar cycle and the latitude drift of the sunspot zone. The text then describes reviews on the structures of the terrestrial planets (Earth, Venus, Mars, Mercury) and on type III solar radio bursts. The compact and dispersed cosmic matter is also considered with regard to the search for new cosmic objects and phenomena and on the nature of the red shift from compact

  5. Astrophysical relevance of γ transition energies

    International Nuclear Information System (INIS)

    Rauscher, Thomas

    2008-01-01

    The relevant γ energy range is explicitly identified where additional γ strength must be located to have an impact on astrophysically relevant reactions. It is shown that folding the energy dependences of the transmission coefficients and the level density leads to maximal contributions for γ energies of 2 ≤ E_γ ≤ 4 MeV unless quantum selection rules allow isolated states to contribute. Under this condition, electric dipole transitions dominate. These findings allow us to more accurately judge the relevance of modifications of the γ strength for astrophysics.

  6. On the saturation of astrophysical dynamos

    DEFF Research Database (Denmark)

    Dorch, Bertil; Archontis, Vasilis

    2004-01-01

    In the context of astrophysical dynamos we illustrate that the no-cosines flow, with zero mean helicity, can drive fast dynamo action, and we study the dynamo's mode of operation during both the linear and non-linear saturation regimes. It turns out that in addition to a high growth rate in the linear regime …

  7. Bibliometric indicators of young authors in astrophysics

    DEFF Research Database (Denmark)

    Havemann, Frank; Larsen, Birger

    2015-01-01

    We test 16 bibliometric indicators with respect to their validity at the level of the individual researcher by estimating their power to predict later successful researchers. We compare the indicators of a sample of astrophysics researchers who later co-authored highly cited papers before … their first landmark paper with the distributions of these indicators over a random control group of young authors in astronomy and astrophysics. We find that field and citation-window normalisation substantially improves the predicting power of citation indicators. The sum of citation numbers normalised …

  8. Magnetic processes in astrophysics theory, simulations, experiments

    CERN Document Server

    Rüdiger, Günther; Hollerbach, Rainer

    2013-01-01

    In this work the authors draw upon their expertise in geophysical and astrophysical MHD to explore the motion of electrically conducting fluids, the so-called dynamo effect, and describe the similarities and differences between different magnetized objects. They also explain why magnetic fields are crucial to the formation of the stars, and discuss promising experiments currently being designed to investigate some of the relevant physics in the laboratory. This interdisciplinary approach will appeal to a wide audience in physics, astrophysics and geophysics. This second edition covers such add

  9. White Paper on Nuclear Astrophysics and Low Energy Nuclear Physics - Part 1. Nuclear Astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Arcones, Almudena [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Escher, Jutta E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Others, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-04-04

    This white paper informs the nuclear astrophysics community and funding agencies about the scientific directions and priorities of the field and provides input from this community for the 2015 Nuclear Science Long Range Plan. It summarizes the outcome of the nuclear astrophysics town meeting that was held on August 21-23, 2014, in College Station on the campus of Texas A&M University in preparation for the NSAC Nuclear Science Long Range Plan. It also reflects the outcome of an earlier town meeting of the nuclear astrophysics community organized by the Joint Institute for Nuclear Astrophysics (JINA) on October 9-10, 2012, in Detroit, Michigan, with the purpose of developing a vision for nuclear astrophysics in light of the recent NRC decadal surveys in nuclear physics (NP2010) and astronomy (ASTRO2010). The white paper is furthermore informed by the town meeting of the Association of Research at University Nuclear Accelerators (ARUNA) that took place at the University of Notre Dame on June 12-13, 2014. In summary, we find that nuclear astrophysics is a modern and vibrant field addressing fundamental science questions at the intersection of nuclear physics and astrophysics. These questions relate to the origin of the elements, the nuclear engines that drive the life and death of stars, and the properties of dense matter. A broad range of nuclear accelerator facilities, astronomical observatories, theory efforts, and computational capabilities are needed. With the developments outlined in this white paper, answers to long-standing key questions are well within reach in the coming decade.

  10. Cactus and Visapult: An ultra-high performance grid-distributedvisualization architecture using connectionless protocols

    Energy Technology Data Exchange (ETDEWEB)

    Bethel, E. Wes; Shalf, John

    2002-08-31

    This past decade has seen rapid growth in the size, resolution, and complexity of Grand Challenge simulation codes. This trend is accompanied by a trend towards multinational, multidisciplinary teams who carry out this research in distributed teams, and the corresponding growth of Grid infrastructure to support these widely distributed Virtual Organizations. As the number and diversity of distributed teams grow, the need for visualization tools to analyze and display multi-terabyte, remote data becomes more pronounced and more urgent. One such tool that has been successfully used to address this problem is Visapult. Visapult is a parallel visualization tool that employs Grid-distributed components, latency tolerant visualization and graphics algorithms, along with high performance network I/O in order to achieve effective remote analysis of massive datasets. In this paper we discuss improvements to network bandwidth utilization and responsiveness of the Visapult application that result from using connectionless protocols to move data payload between the distributed Visapult components and a Grid-enabled, high performance physics simulation used to study gravitational waveforms of colliding black holes: the Cactus code. These improvements have boosted Visapult's network efficiency to 88-96 percent of the maximum theoretical available bandwidth on multi-gigabit Wide Area Networks, and greatly enhanced interactivity. Such improvements are critically important for future development of effective interactive Grid applications.
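
    The abstract attributes the bandwidth gains to connectionless transport. The Visapult/Cactus wire format is not described in this record, so the snippet below is only a generic sketch of the idea: sending sequence-tagged payload chunks over UDP without per-packet acknowledgement. The chunk size, host, and port are arbitrary placeholders.

```python
# Generic connectionless (UDP) payload streaming sketch -- illustrative
# only, not the actual Visapult protocol. Each datagram carries a 4-byte
# sequence number so the receiver can place chunks without retransmission.
import socket
import struct

CHUNK = 8192                    # payload bytes per datagram (placeholder)
DEST = ("127.0.0.1", 9999)      # placeholder receiver address

def stream(data: bytes) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for seq, offset in enumerate(range(0, len(data), CHUNK)):
            payload = data[offset:offset + CHUNK]
            sock.sendto(struct.pack("!I", seq) + payload, DEST)
    finally:
        sock.close()

if __name__ == "__main__":
    stream(b"\x00" * (10 * CHUNK))   # dummy 80 KiB payload
```

    Dropping delivery guarantees trades occasional packet loss for sustained throughput, which is an acceptable trade for visualization payloads that are continuously refreshed.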

  11. High Performance Numerical Computing for High Energy Physics: A New Challenge for Big Data Science

    International Nuclear Information System (INIS)

    Pop, Florin

    2014-01-01

    Modern physics is based on both theoretical analysis and experimental validation. Complex scenarios like subatomic dimensions, high energy, and lower absolute temperature are frontiers for many theoretical models. Simulation with stable numerical methods represents an excellent instrument for high accuracy analysis, experimental validation, and visualization. High performance computing support offers possibility to make simulations at large scale, in parallel, but the volume of data generated by these experiments creates a new challenge for Big Data Science. This paper presents existing computational methods for high energy physics (HEP) analyzed from two perspectives: numerical methods and high performance computing. The computational methods presented are Monte Carlo methods and simulations of HEP processes, Markovian Monte Carlo, unfolding methods in particle physics, kernel estimation in HEP, and Random Matrix Theory used in analysis of particles spectrum. All of these methods produce data-intensive applications, which introduce new challenges and requirements for ICT systems architecture, programming paradigms, and storage capabilities.
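
    As a toy illustration of the first class of methods listed (Monte Carlo estimation, whose statistical error shrinks as 1/sqrt(N)), the following self-contained sketch integrates a simple function by uniform sampling; it is not drawn from the paper.

```python
# Toy Monte Carlo integration: estimate I = integral_0^1 exp(-x^2) dx
# by uniform sampling; the standard error scales as 1/sqrt(N).
import math
import random

def mc_integrate(f, n_samples: int, seed: int = 42):
    rng = random.Random(seed)
    values = [f(rng.random()) for _ in range(n_samples)]
    mean = sum(values) / n_samples
    var = sum((v - mean) ** 2 for v in values) / (n_samples - 1)
    return mean, math.sqrt(var / n_samples)   # estimate, standard error

estimate, err = mc_integrate(lambda x: math.exp(-x * x), 100_000)
print(f"I ~= {estimate:.5f} +/- {err:.5f} (exact ~= 0.74682)")
```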

  12. CRKSPH: A new meshfree hydrodynamics method with applications to astrophysics

    Science.gov (United States)

    Owen, John Michael; Raskin, Cody; Frontiere, Nicholas

    2018-01-01

    The study of astrophysical phenomena such as supernovae, accretion disks, galaxy formation, and large-scale structure formation requires computational modeling of, at a minimum, hydrodynamics and gravity. Developing numerical methods appropriate for these kinds of problems requires a number of properties: shock-capturing hydrodynamics benefits from rigorous conservation of invariants such as total energy, linear momentum, and mass; lack of obvious symmetries or a simplified spatial geometry to exploit necessitate 3D methods that ideally are Galilean invariant; the dynamic range of mass and spatial scales that need to be resolved can span many orders of magnitude, requiring methods that are highly adaptable in their space and time resolution. We have developed a new Lagrangian meshfree hydrodynamics method called Conservative Reproducing Kernel Smoothed Particle Hydrodynamics, or CRKSPH, in order to meet these goals. CRKSPH is a conservative generalization of the meshfree reproducing kernel method, combining the high-order accuracy of reproducing kernels with the explicit conservation of mass, linear momentum, and energy necessary to study shock-driven hydrodynamics in compressible fluids. CRKSPH's Lagrangian, particle-like nature makes it simple to combine with well-known N-body methods for modeling gravitation, similar to the older Smoothed Particle Hydrodynamics (SPH) method. Indeed, CRKSPH can be substituted for SPH in existing SPH codes due to these similarities. In comparison to SPH, CRKSPH is able to achieve substantially higher accuracy for a given number of points due to the explicitly consistent (and higher-order) interpolation theory of reproducing kernels, while maintaining the same conservation principles (and therefore applicability) as SPH. There are currently two coded implementations of CRKSPH available: one in the open-source research code Spheral, and the other in the high-performance cosmological code HACC. Using these codes we have applied
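
    CRKSPH itself is not reproduced here; as a point of reference, the sketch below shows the plain SPH kernel density estimate that reproducing-kernel corrections improve upon, using a standard 3D cubic-spline kernel and a brute-force neighbour loop (adequate only for the tiny particle set in the example).

```python
# Plain SPH density estimate rho_i = sum_j m_j W(|r_i - r_j|, h) --
# the baseline that CRKSPH's reproducing-kernel corrections build on.
import numpy as np

def cubic_spline_W(r: np.ndarray, h: float) -> np.ndarray:
    """Standard 3D cubic-spline (M4) kernel with support radius 2h."""
    q = r / h
    sigma = 1.0 / (np.pi * h**3)                      # 3D normalization
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q) ** 3, 0.0))
    return sigma * w

def sph_density(pos: np.ndarray, mass: np.ndarray, h: float) -> np.ndarray:
    """Brute-force O(N^2) density summation (adequate for small N)."""
    dist = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    return (mass[None, :] * cubic_spline_W(dist, h)).sum(axis=1)

rng = np.random.default_rng(0)
pos = rng.random((100, 3))               # 100 particles in a unit cube
mass = np.full(100, 1.0 / 100)           # total mass 1
print(sph_density(pos, mass, h=0.2).mean())
```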

  13. Evolution and seismic tools for stellar astrophysics

    CERN Document Server

    Monteiro, Mario JPFG

    2008-01-01

    A collection of articles published by the journal "Astrophysics and Space Science, Volume 316, Number 1-4", August 2008. This work covers 10 evolution codes and 9 oscillation codes. It is suitable for researchers and research students working on the modeling of stars and on the implementation of seismic test of stellar models.

  14. The Trojan Horse Method in nuclear astrophysics

    International Nuclear Information System (INIS)

    Spitaleri, C.; Cherubini, S.; Del Zoppo, A.; Di Pietro, A.; Figuera, P.; Gulino, M.; Lattuada, M.; Miljanić, Đ.; Musumarra, A.; Pellegriti, M.G.; Pizzone, R.G.; Rolfs, C.; Romano, S.; Tudisco, S.; Tumino, A.

    2003-01-01

    The basic features of the Trojan Horse Method are discussed together with a review of recent applications, aimed at extracting the bare astrophysical S(E)-factor for several two-body processes. In this framework, information on the electron screening potential U_e was obtained from the comparison with direct experiments.
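
    For readers outside the field, the quantity being extracted is defined by factoring the steep Coulomb-barrier penetration out of the cross section; this is the standard definition, not something specific to the Trojan Horse papers.

```latex
% Astrophysical S(E)-factor: the cross section sigma(E) with the Coulomb
% penetration factored out; eta is the Sommerfeld parameter for charges
% Z1, Z2 and relative velocity v.
\[
  \sigma(E) \;=\; \frac{S(E)}{E}\, e^{-2\pi\eta},
  \qquad
  \eta \;=\; \frac{Z_1 Z_2 e^2}{\hbar v}.
\]
```

    The "bare" S(E) varies slowly with energy, so comparing it with the screened values obtained in direct measurements yields the electron screening potential U_e mentioned above.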

  15. Nuclear astrophysics and the Trojan Horse Method

    Energy Technology Data Exchange (ETDEWEB)

    Spitaleri, C. [University of Catania, Dipartimento di Fisica e Astronomia, Catania (Italy); Laboratori Nazionali del Sud - INFN, Catania (Italy); La Cognata, M.; Pizzone, R.G. [Laboratori Nazionali del Sud - INFN, Catania (Italy); Lamia, L. [University of Catania, Dipartimento di Fisica e Astronomia, Catania (Italy); Mukhamedzhanov, A.M. [Texas A and M University, Cyclotron Institute, College Station, TX (United States)

    2016-04-15

    In this review, we discuss the new recent results of the Trojan Horse Method that is used to determine reaction rates for nuclear processes in several astrophysical scenarios. The theory behind this technique is shortly presented. This is followed by an overview of some new experiments that have been carried out using this indirect approach. (orig.)

  16. Studying shocks in model astrophysical flows

    International Nuclear Information System (INIS)

    Chakrabarti, S.K.

    1989-01-01

    We briefly discuss some properties of the shocks in the existing models for quasi-two-dimensional astrophysical flows. All of these models which allow the study of shocks analytically have some unphysical characteristics due to the inherent assumptions made. We propose a hybrid model for a thin flow which has fewer unpleasant features and is suitable for the study of shocks. (author). 5 refs

  17. Neutrino Masses and Mixings and Astrophysics

    Science.gov (United States)

    Fuller, George M.

    1998-10-01

    Here we discuss the implications of light neutrino masses and neutrino flavor/type mixing for dark matter, big bang nucleosynthesis, and models of heavy element nucleosynthesis in supernovae. We will also argue the other way and discuss possible constraints on neutrino physics from these astrophysical considerations.

  18. Nuclear astrophysics experiments with Pohang neutron facility

    International Nuclear Information System (INIS)

    Kim, Yeong Duk; Yoo, Gwang Ho

    1998-01-01

    Nuclear astrophysics experiments aimed at a fundamental understanding of Big Bang nucleosynthesis were performed at the Pohang Neutron Facility. Laboratory experiments, inhomogeneous Big Bang nucleosynthesis, and the s-process were used to study nucleosynthesis. For future work, further study of the s-process to obtain the desired data, together with nuclear network calculations, is necessary.

  19. Astrophysics, cosmology and high energy physics

    International Nuclear Information System (INIS)

    Rees, M.J.

    1983-01-01

    A brief survey is given of some topics in astrophysics and cosmology, with special emphasis on the inter-relation between the properties of the early Universe and recent ideas in high energy physics, and on simple order-of-magnitude arguments showing how the scales and dimensions of cosmic phenomena are related to basic physical constants. (orig.)

  20. Particle physics-astrophysics working group

    International Nuclear Information System (INIS)

    Cronin, J.W.; Kolb, E.W.

    1989-01-01

    The working group met each afternoon and listened to mini-symposia on a broad range of subjects covering all aspects of particle physics-astrophysics, both theoretical and experimental. This paper reports that, as a result, a number of the papers which follow were commissioned to reflect the present status and future prospects of the field.

  1. Workshop on gravitational waves and relativistic astrophysics

    Indian Academy of Sciences (India)

    Discussions related to gravitational wave experiments viz. LIGO and LISA as well as to observations of supermassive black holes dominated the workshop sessions on gravitational waves and relativistic astrophysics in the ICGC-2004. A summary of seven papers that were presented in these workshop sessions has been ...

  2. Exotic nuclear beta transitions astrophysical examples

    CERN Document Server

    Takahashi, K

    1981-01-01

    A theoretical study of nuclear beta-transitions under various astrophysical circumstances is reviewed through illustrative examples: i) continuum-state electron captures in matter in nuclear statistical equilibrium, and ii) bound-state beta-decays in stars in connection with a cosmochronometer and with the s-process branchings. (45 refs).

  3. Challenges and opportunities in laboratory plasma astrophysics

    Science.gov (United States)

    Drake, R. Paul

    2017-06-01

    We are in a period of explosive success and opportunity in the laboratory study of plasma phenomena that are relevant to astrophysics. In this talk I will share with you several areas in which recent work, often foreshadowed 20 or 30 years ago, has produced dramatic initial success with prospects for much more. To begin, the talk will provide a brief look at the types of devices used and the regimes they access, showing how they span many orders of magnitude in parameters of interest. It will then illustrate the types of work one can do with laboratory plasmas that are relevant to astrophysics, which range from direct measurement of material properties to the production of scaled models of certain dynamics to the pursuit of complementary understanding. Examples will be drawn from the flow of energy and momentum in astrophysics, the formation and structure of astrophysical systems, and magnetization and its consequences. I hope to include some discussion of collisionless shocks, very dense plasmas, work relevant to the end of the Dark Ages, reconnection, and dynamos. The talk will conclude by highlighting some topics where it seems that we may be on the verge of exciting new progress. The originators of work discussed, and collaborators and funding sources when appropriate, will be included in the talk.

  4. Statistics and Informatics in Space Astrophysics

    Science.gov (United States)

    Feigelson, E.

    2017-12-01

    The interest in statistical and computational methodology has seen rapid growth in space-based astrophysics, parallel to the growth seen in Earth remote sensing. There is widespread agreement that scientific interpretation of the cosmic microwave background, discovery of exoplanets, and classifying multiwavelength surveys is too complex to be accomplished with traditional techniques. NASA operates several well-functioning Science Archive Research Centers providing 0.5 PBy datasets to the research community. These databases are integrated with full-text journal articles in the NASA Astrophysics Data System (200K pageviews/day). Data products use interoperable formats and protocols established by the International Virtual Observatory Alliance. NASA supercomputers also support complex astrophysical models of systems such as accretion disks and planet formation. Academic researcher interest in methodology has significantly grown in areas such as Bayesian inference and machine learning, and statistical research is underway to treat problems such as irregularly spaced time series and astrophysical model uncertainties. Several scholarly societies have created interest groups in astrostatistics and astroinformatics. Improvements are needed on several fronts. Community education in advanced methodology is not sufficiently rapid to meet the research needs. Statistical procedures within NASA science analysis software are sometimes not optimal, and pipeline development may not use modern software engineering techniques. NASA offers few grant opportunities supporting research in astroinformatics and astrostatistics.

  5. Balance in the NASA Astrophysics Program

    Science.gov (United States)

    Elvis, Martin

    2017-08-01

    The Decadal studies are usually instructed to come up with a “balanced program” for the coming decade of astrophysics initiatives, both on the ground and in space. The meaning of “balance” is left up to the Decadal panels. One meaning is that there should be a diversity of mission costs in the portfolio. Another is that there should be a diversity of science questions addressed. A third is that there should be a diversity of signals (across electromagnetic wavebands, and of non-em carriers). It is timely for the astronomy community to debate the meaning of balance in the NASA astrophysics program as the “Statement of Task” (SoT) that defines the goals and process of the 2020 Astrophysics Decadal review is now being formulated. Here I propose some ways in which the Astro2020 SoT could be made more specific in order to make balance more evident and so avoid the tendency for a single science question, and a single mission to answer that question, to dominate the program. As an example of an alternative ambitious approach, I present a proof-of-principle program of six mostly “probe-class” missions that would fit the nominal funding profile for the 2025-2035 NASA Astrophysics Program, while being more diverse in ambitious science goals and in wavelength coverage.

  6. Microphysics, cosmology, and high energy astrophysics

    International Nuclear Information System (INIS)

    Hoyle, F.

    1974-01-01

    The discussion of microphysics, cosmology, and high energy astrophysics includes particle motion in an electromagnetic field, conformal transformations, conformally invariant theory of gravitation, particle orbits, Friedman models with k = 0, ±1, the history and present status of steady-state cosmology, and the nature of mass. (U.S.)

  7. Pulsars as tools for fundamental physics & astrophysics

    NARCIS (Netherlands)

    Cordes, J.M.; Kramer, M.; Lazio, T.J.W.; Stappers, B.W.; Backer, D.C.; Johnston, S.

    2004-01-01

    The sheer number of pulsars discovered by the SKA, in combination with the exceptional timing precision it can provide, will revolutionize the field of pulsar astrophysics. The SKA will provide a complete census of pulsars in both the Galaxy and in Galactic globular clusters that can be used to

  8. Cosmological grand unification monopoles: astrophysical constraints

    International Nuclear Information System (INIS)

    Fry, J.N.

    1982-01-01

    I review the general arguments which suggest that relic GU magnetic monopoles should emerge from the early universe, and I discuss several astrophysical settings in which their effects could be, but are not, observed. This places limits on their possible flux, and their abundance bound to more ordinary material

  9. Astrophysical contributions of the International Ultraviolet Explorer

    International Nuclear Information System (INIS)

    Kondo, Y.; Boggess, A.; Maran, S.P.

    1989-01-01

    Findings that have been made by the IUE in a variety of astrophysical areas are reviewed. Results on stellar chromospheres and transition regions, evolutionary processes in interacting binaries, winds from early-type stars, the ISM, SN 1987A, active galactic nuclei, and solar system objects are addressed. 158 refs

  10. Minicourses in Astrophysics, Modular Approach, Vol. II.

    Science.gov (United States)

    Illinois Univ., Chicago.

    This is the second of a two-volume minicourse in astrophysics. It contains chapters on the following topics: stellar nuclear energy sources and nucleosynthesis; stellar evolution; stellar structure and its determination; and pulsars. Each chapter gives much technical discussion, mathematical treatment, diagrams, and examples. References are…

  11. Teaching Astrophysics to Upper Level Undergraduates

    Science.gov (United States)

    Van Dorn Bradt, Hale

    2010-03-01

    A Socratic peer-instruction method for teaching upper level undergraduates is presented. Basically, the instructor sits with the students and guides their presentations of the material. My two textbooks* (on display) as well as many others are amenable to this type of teaching. *Astronomy Methods - A Physical Approach to Astronomical Observations (CUP 2004) *Astrophysics Processes-The Physics of Astronomical Phenomena (CUP 2008)

  12. Astronomy and astrophysics in the new millennium: Panel reports

    National Research Council Canada - National Science Library

    Astronomy and Astrophysics Survey Committee, Board on Physics and Astronomy, Space Studies Board, National Research Council

    2001-01-01

    In preparing the report, Astronomy and Astrophysics in the New Millenium , the AASC made use of a series of panel reports that address various aspects of ground- and space-based astronomy and astrophysic...

  13. Journal of Astrophysics and Astronomy | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Since 1977, papers in Astrophysics and Astronomy appeared as a special section in Pramana. ... The journal publishes original research papers on all aspects of astrophysics and ... Articles are also visible in Web of Science immediately.

  14. Study of aluminum emission spectra in astrophysical plasmas

    International Nuclear Information System (INIS)

    Jin Zhan; Zhang Jie

    2001-01-01

    High temperature, high density and strong magnetic fields in plasmas produced by ultra-high intensity and ultrashort laser pulses are similar to the main characteristics of astrophysical plasmas. This makes it possible to simulate some astrophysical processes in laboratories. The author presents the theoretical simulation of aluminum emission spectra in astrophysical plasmas. It can be concluded that, using laser produced plasmas, the authors can obtain rich information on astrophysical spectroscopy, which is unobservable for astronomers.

  15. High Performance Fiber Reinforced Cement Composites 6 HPFRCC 6

    CERN Document Server

    Reinhardt, Hans; Naaman, A

    2012-01-01

    High Performance Fiber Reinforced Cement Composites (HPFRCC) represent a class of cement composites whose stress-strain response in tension undergoes strain hardening behaviour accompanied by multiple cracking, leading to a high strain prior to failure. The primary objective of this International Workshop was to provide a compendium of up-to-date information on the most recent developments and research advances in the field of High Performance Fiber Reinforced Cement Composites. Approximately 65 contributions from leading world experts are assembled in these proceedings and provide an authoritative perspective on the subject. Special topics include fresh and hardening state properties; self-compacting mixtures; mechanical behavior under compressive, tensile, and shear loading; structural applications; impact, earthquake and fire resistance; durability issues; ultra-high performance fiber reinforced concrete; and textile reinforced concrete. Target readers: graduate students, researchers, fiber producers, desi...

  16. High performance leadership in unusually challenging educational circumstances

    Directory of Open Access Journals (Sweden)

    Andy Hargreaves

    2015-04-01

    This paper draws on findings from the results of a study of leadership in high performing organizations in three sectors. Organizations were sampled and included on the basis of high performance in relation to no performance, past performance, performance among similar peers and performance in the face of limited resources or challenging circumstances. The paper concentrates on leadership in four schools that met the sample criteria. It draws connections to explanations of the high performance of Estonia on the OECD PISA tests of educational achievement. The article argues that leadership in these four schools that performed above expectations comprised more than a set of competencies. Instead, leadership took the form of a narrative or quest that pursued an inspiring dream with relentless determination; took improvement pathways that were more innovative than comparable peers; built collaboration and community, including with competing schools; and connected short-term success to long-term sustainability.

  17. High Performance Computing Software Applications for Space Situational Awareness

    Science.gov (United States)

    Giuliano, C.; Schumacher, P.; Matson, C.; Chun, F.; Duncan, B.; Borelli, K.; Desonia, R.; Gusciora, G.; Roe, K.

    The High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA) has completed its first full year of applications development. The emphasis of our work in this first year was on improving space surveillance sensor models and image enhancement software. These applications are the Space Surveillance Network Analysis Model (SSNAM), the Air Force Space Fence simulation (SimFence), and the physically constrained iterative de-convolution (PCID) image enhancement software tool. Specifically, we have demonstrated an order of magnitude speed-up in those codes running on the latest Cray XD-1 Linux supercomputer (Hoku) at the Maui High Performance Computing Center. The software application improvements that HSAI-SSA has made have had a significant impact on the warfighter and have fundamentally changed the role of high performance computing in SSA.

  18. Journal of Astrophysics and Astronomy | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Articles written in the Journal of Astrophysics and Astronomy by Anjan A. Sen include: Volume 37, Issue 4, December 2016, p. 33 (Review), "Cosmology and Astrophysics using the Post-Reionization HI", by Tapomoy Guha Sarkar and Anjan A. Sen.

  19. 3rd Session of the Sant Cugat Forum on Astrophysics

    CERN Document Server

    Gravitational wave astrophysics

    2015-01-01

    This book offers review chapters written by invited speakers of the 3rd Session of the Sant Cugat Forum on Astrophysics — Gravitational Waves Astrophysics. All chapters have been peer reviewed. The book goes beyond normal conference proceedings in that it provides a wide panorama of the astrophysics of gravitational waves and serves as a reference work for researchers in the field.

  20. Journal of Astrophysics and Astronomy | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Indian Institute of Astrophysics, Koramangala, Bangalore 560 034, India. Kavli Institute for Astronomy & Astrophysics, Peking University, Beijing 100871, China. National Research Council of Canada, Herzberg Institute of Astrophysics, 5071 West Saanich Road, Victoria, British Columbia, Canada. Thirty Meter Project Office, ...

  1. Contemporary high performance computing from petascale toward exascale

    CERN Document Server

    Vetter, Jeffrey S

    2013-01-01

    Contemporary High Performance Computing: From Petascale toward Exascale focuses on the ecosystems surrounding the world's leading centers for high performance computing (HPC). It covers many of the important factors involved in each ecosystem: computer architectures, software, applications, facilities, and sponsors. The first part of the book examines significant trends in HPC systems, including computer architectures, applications, performance, and software. It discusses the growth from terascale to petascale computing and the influence of the TOP500 and Green500 lists. The second part of the

  2. Turbostratic stacked CVD graphene for high-performance devices

    Science.gov (United States)

    Uemura, Kohei; Ikuta, Takashi; Maehashi, Kenzo

    2018-03-01

    We have fabricated turbostratic stacked graphene with high-transport properties by the repeated transfer of CVD monolayer graphene. The turbostratic stacked CVD graphene exhibited higher carrier mobility and conductivity than CVD monolayer graphene. The electron mobility for the three-layer turbostratic stacked CVD graphene surpassed 10,000 cm² V⁻¹ s⁻¹ at room temperature, which is five times greater than that for CVD monolayer graphene. The results indicate that the high performance is derived from maintenance of the linear band dispersion, suppression of the carrier scattering, and parallel conduction. Therefore, turbostratic stacked CVD graphene is a superior material for high-performance devices.

  3. Micromagnetics on high-performance workstation and mobile computational platforms

    Science.gov (United States)

    Fu, S.; Chang, R.; Couture, S.; Menarini, M.; Escobar, M. A.; Kuteifan, M.; Lubarda, M.; Gabay, D.; Lomakin, V.

    2015-05-01

    The feasibility of using high-performance desktop and embedded mobile computational platforms is presented, including multi-core Intel central processing unit, Nvidia desktop graphics processing units, and Nvidia Jetson TK1 Platform. FastMag finite element method-based micromagnetic simulator is used as a testbed, showing high efficiency on all the platforms. Optimization aspects of improving the performance of the mobile systems are discussed. The high performance, low cost, low power consumption, and rapid performance increase of the embedded mobile systems make them a promising candidate for micromagnetic simulations. Such architectures can be used as standalone systems or can be built as low-power computing clusters.

  4. High performance computing and communications: FY 1997 implementation plan

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The High Performance Computing and Communications (HPCC) Program was formally authorized by passage, with bipartisan support, of the High-Performance Computing Act of 1991, signed on December 9, 1991. The original Program, in which eight Federal agencies participated, has now grown to twelve agencies. This Plan provides a detailed description of the agencies' FY 1996 HPCC accomplishments and FY 1997 HPCC plans. Section 3 of this Plan provides an overview of the HPCC Program. Section 4 contains more detailed definitions of the Program Component Areas, with an emphasis on the overall directions and milestones planned for each PCA. Appendix A provides a detailed look at HPCC Program activities within each agency.

  5. 75 FR 2893 - NASA Advisory Council; Science Committee; Astrophysics Subcommittee; Meeting

    Science.gov (United States)

    2010-01-19

    ... Committee; Astrophysics Subcommittee; Meeting AGENCY: National Aeronautics and Space Administration. ACTION... of the Astrophysics Subcommittee of the NASA Advisory Council (NAC). This Subcommittee reports to the... following topics: --Astrophysics Division Update --Updates on Select Astrophysics Missions --Discussion of...

  6. 78 FR 2293 - NASA Advisory Council; Science Committee; Astrophysics Subcommittee; Meeting

    Science.gov (United States)

    2013-01-10

    ... Committee; Astrophysics Subcommittee; Meeting AGENCY: National Aeronautics and Space Administration. ACTION... amended, the National Aeronautics and Space Administration (NASA) announces a meeting of the Astrophysics... meeting includes the following topics: --Astrophysics Division Update --NASA Astrophysics Roadmapping It...

  7. 78 FR 66384 - NASA Advisory Council; Science Committee; Astrophysics Subcommittee; Meeting

    Science.gov (United States)

    2013-11-05

    ... Committee; Astrophysics Subcommittee; Meeting AGENCY: National Aeronautics and Space Administration. ACTION... amended, the National Aeronautics and Space Administration (NASA) announces a meeting of the Astrophysics...: --Astrophysics Division Update --Presentation of Astrophysics Roadmap --Reports from Program Analysis Groups...

  8. 75 FR 51116 - NASA Advisory Council; Science Committee; Astrophysics Subcommittee; Meeting

    Science.gov (United States)

    2010-08-18

    ... Committee; Astrophysics Subcommittee; Meeting AGENCY: National Aeronautics and Space Administration. ACTION... amended, the National Aeronautics and Space Administration (NASA) announces a meeting of the Astrophysics... topics: --Astrophysics Division Update --2010 Astronomy and Astrophysics Decadal Survey --Update on...

  9. Distributed GIS Computing for High Performance Simulation and Visualization, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Today, the ability of sensors to generate geographical data is virtually limitless. Although NASA now provides (together with other agencies such as the USGS) a...

  10. NASA Astrophysics Funds Strategic Technology Development

    Science.gov (United States)

    Seery, Bernard D.; Ganel, Opher; Pham, Bruce

    2016-01-01

    The COR and PCOS Program Offices (POs) reside at the NASA Goddard Space Flight Center (GSFC), serving as the NASA Astrophysics Division's implementation arm for matters relating to the two programs. One aspect of the PO's activities is managing the COR and PCOS Strategic Astrophysics Technology (SAT) program, helping mature technologies to enable and enhance future astrophysics missions. For example, the SAT program is expected to fund key technology developments needed to close gaps identified by Science and Technology Definition Teams (STDTs) planned to study several large mission concept studies in preparation for the 2020 Decadal Survey. The POs are guided by the National Research Council's "New Worlds, New Horizons in Astronomy and Astrophysics" Decadal Survey report, NASA's Astrophysics Implementation Plan, and the visionary Astrophysics Roadmap, "Enduring Quests, Daring Visions." Strategic goals include dark energy, gravitational waves, and X-ray observatories. Future missions pursuing these goals include, e.g., US participation in ESA's Euclid, Athena, and L3 missions; Inflation probe; and a large UV/Optical/IR (LUVOIR) telescope. To date, 65 COR and 71 PCOS SAT proposals have been received, of which 15 COR and 22 PCOS projects were funded. Notable successes include maturation of a new far-IR detector, later adopted by the SOFIA HAWC instrument; maturation of the H4RG near-IR detector, adopted by WFIRST; development of an antenna-coupled transition-edge superconducting bolometer, a technology deployed by BICEP2/BICEP3/Keck to measure polarization in the CMB signal; advanced UV reflective coatings implemented on the optics of GOLD and ICON, two heliophysics Explorers; and finally, the REXIS instrument on OSIRIS-REx is incorporating CCDs with directly deposited optical blocking filters developed by another SAT-funded project. We discuss our technology development process, with community input and strategic prioritization informing calls for SAT proposals and

  11. VI School on Cosmic Rays and Astrophysics

    International Nuclear Information System (INIS)

    2017-01-01

    VI School on Cosmic Rays and Astrophysics 17-25 November 2015, Chiapas, Mexico The VI School on Cosmic Rays and Astrophysics was held at the MCTP, at the Autonomous University of Chiapas (UNACH), Tuxtla Gutiérrez, Chiapas, Mexico thanks to the Science for Development ICTP-UNACH-UNESCO Regional Seminar, 17-25 November 2015 (http://mctp.mx/e-VI-School-on-Cosmic-Rays-and-Astrophysics.html). The School series started in La Paz, Bolivia in 2004 and it has been, since then, hosted by several Latin American countries: 1.- La Paz, Bolivia (August, 2004), 2.- Puebla, Mexico (September, 2006), 3.- Arequipa, Peru (September, 2008), 4.- Santo André, Brazil (September, 2010), 5.- La Paz, Bolivia (August, 2012). It aims to promote Cosmic Ray (CR) Physics and Astrophysics in the Latin American community and to provide a general overview of theoretical and experimental issues on these topics. It is directed to undergraduates, postgraduates and active researchers in the field. The lectures introduce fundamental Cosmic Ray Physics and Astrophysics with a review of standards of the field. The school is expected to continue in the coming years, following this tradition. In this edition, the list of seminars included topics such as experimental techniques of CR detection, development of CR showers and hadronic interactions, composition and energy spectrum of primary CR, Gamma-Ray Bursts (GRBs), neutrino Astrophysics, spacecraft detectors, simulations, solar modulation, and the current state of development and results of several astroparticle physics experiments such as The Pierre Auger Observatory in Argentina, HAWC in Mexico, KASCADE and KASCADE Grande, HESS, IceCube, JEM-EUSO, Fermi-LAT, and others. This time the school was complemented with the ICTP-UNACH-UNESCO Seminar of theory on Particle and Astroparticle Physics. The organization was done by MCTP, the Mesoamerican Centre for Theoretical Physics. The school had 46 participants, 30 students from Honduras, Brazil

  12. High-performance-vehicle technology. [fighter aircraft propulsion

    Science.gov (United States)

    Povinelli, L. A.

    1979-01-01

    Propulsion needs of high performance military aircraft are discussed. Inlet performance, nozzle performance and cooling, and afterburner performance are covered. It is concluded that nonaxisymmetric nozzles provide cleaner external lines and enhanced maneuverability, but the internal flows are more complex. Swirl afterburners show promise for enhanced performance in the high altitude, low Mach number region.

  13. Optical interconnection networks for high-performance computing systems

    International Nuclear Information System (INIS)

    Biberman, Aleksandr; Bergman, Keren

    2012-01-01

    Enabled by silicon photonic technology, optical interconnection networks have the potential to be a key disruptive technology in computing and communication industries. The enduring pursuit of performance gains in computing, combined with stringent power constraints, has fostered the ever-growing computational parallelism associated with chip multiprocessors, memory systems, high-performance computing systems and data centers. Sustaining these parallelism growths introduces unique challenges for on- and off-chip communications, shifting the focus toward novel and fundamentally different communication approaches. Chip-scale photonic interconnection networks, enabled by high-performance silicon photonic devices, offer unprecedented bandwidth scalability with reduced power consumption. We demonstrate that the silicon photonic platforms have already produced all the high-performance photonic devices required to realize these types of networks. Through extensive empirical characterization in much of our work, we demonstrate such feasibility of waveguides, modulators, switches and photodetectors. We also demonstrate systems that simultaneously combine many functionalities to achieve more complex building blocks. We propose novel silicon photonic devices, subsystems, network topologies and architectures to enable unprecedented performance of these photonic interconnection networks. Furthermore, the advantages of photonic interconnection networks extend far beyond the chip, offering advanced communication environments for memory systems, high-performance computing systems, and data centers. (review article)

  14. Frictional behaviour of high performance fibrous tows: Friction experiments

    NARCIS (Netherlands)

    Cornelissen, Bo; Rietman, Bert; Akkerman, Remko

    2013-01-01

    Tow friction is an important mechanism in the production and processing of high performance fibrous tows. The frictional behaviour of these tows is anisotropic due to the texture of the filaments as well as the tows. This work describes capstan experiments that were performed to measure the

  15. Determination of Caffeine in Beverages by High Performance Liquid Chromatography.

    Science.gov (United States)

    DiNunzio, James E.

    1985-01-01

    Describes the equipment, procedures, and results for the determination of caffeine in beverages by high performance liquid chromatography. The method is simple, fast, accurate, and, because sample preparation is minimal, it is well suited for use in a teaching laboratory. (JN)

  16. Enabling High-Performance Computing as a Service

    KAUST Repository

    AbdelBaky, Moustafa

    2012-10-01

    With the right software infrastructure, clouds can provide scientists with as-a-service access to high-performance computing resources. An award-winning prototype framework transforms the Blue Gene/P system into an elastic cloud to run a representative HPC application. © 2012 IEEE.

  17. High Performance Skiing. How to Become a Better Alpine Skier.

    Science.gov (United States)

    Yacenda, John

    This book is intended for people who desire to improve their skiing by exploring high performance techniques leading to: (1) more consistent performance; (2) less fatigue and more endurance; (3) greater strength and flexibility; (4) greater versatility; (5) greater confidence in all skiing conditions; and (6) the knowledge to participate in…

  18. Computer science of the high performance; Informatica del alto rendimiento

    Energy Technology Data Exchange (ETDEWEB)

    Moraleda, A.

    2008-07-01

    High performance computing is taking shape as a powerful accelerator of innovation, drastically reducing the waiting times for access to results and findings in a growing number of processes and activities as complex and important as medicine, genetics, pharmacology, the environment, natural resources management, or the simulation of complex processes in a wide variety of industries. (Author)

  19. Sensitive high performance liquid chromatographic method for the ...

    African Journals Online (AJOL)

    A new simple, sensitive, cost-effective and reproducible high performance liquid chromatographic (HPLC) method for the determination of proguanil (PG) and its metabolites, cycloguanil (CG) and 4-chlorophenylbiguanide (4-CPB) in urine and plasma is described. The extraction procedure is a simple three-step process ...

  20. Contemporary high performance computing from petascale toward exascale

    CERN Document Server

    Vetter, Jeffrey S

    2015-01-01

    A continuation of Contemporary High Performance Computing: From Petascale toward Exascale, this second volume continues the discussion of HPC flagship systems, major application workloads, facilities, and sponsors. The book includes figures and pictures that capture the state of existing systems: pictures of buildings, systems in production, floorplans, and many block diagrams and charts to illustrate system design and performance.

  1. High-Performance Management Practices and Employee Outcomes in Denmark

    DEFF Research Database (Denmark)

    Cristini, Annalisa; Eriksson, Tor; Pozzoli, Dario

    High-performance work practices are frequently considered to have positive effects on corporate performance, but what do they do for employees? After showing that organizational innovation is indeed positively associated with firm performance, we investigate whether high-involvement work practices...

  2. Fatigue Behaviour of High Performance Cementitious Grout Masterflow 9500

    DEFF Research Database (Denmark)

    Sørensen, Eigil V.

    The present report describes the fatigue behaviour of the high performance grout MASTERFLOW 9500 subjected to cyclic loading, in air as well as submerged in water, at various frequencies and levels of maximum stress. Part of the results were also reported in [1] together with other mechanical...

  3. Two Profiles of the Dutch High Performing Employee

    Science.gov (United States)

    de Waal, A. A.; Oudshoorn, Michella

    2015-01-01

    Purpose: The purpose of this study is to explore the profile of an ideal employee, to be more precise the behavioral characteristics of the Dutch high-performing employee (HPE). Organizational performance depends for a large part on the commitment of employees. Employees provide their knowledge, skills, experiences and creativity to the…

  4. A high performance electrometer amplifier of hybrid design

    International Nuclear Information System (INIS)

    Rao, N.V.; Nazare, C.K.

    1979-01-01

    A high-performance, reliable electrometer amplifier of hybrid design for low-current measurements in mass spectrometers has been developed. The short-term instability with a 5 × 10¹¹ Ω input resistor is less than 1 × 10⁻¹⁵ A. The drift is better than 1 mV/hour. The design steps are illustrated with typical amplifier performance details. (auth.)

  5. Development and validation of a reversed phase High Performance ...

    African Journals Online (AJOL)

    A simple, rapid, accurate and economical isocratic Reversed Phase High Performance Liquid Chromatography (RPHPLC) method was developed, validated and used for the evaluation of content of different brands of paracetamol tablets. The method was validated according to ICH guidelines and may be adopted for the ...

  6. Dynamic Social Networks in High Performance Football Coaching

    Science.gov (United States)

    Occhino, Joseph; Mallett, Cliff; Rynne, Steven

    2013-01-01

    Background: Sports coaching is largely a social activity where engagement with athletes and support staff can enhance the experiences for all involved. This paper examines how high performance football coaches develop knowledge through their interactions with others within a social learning theory framework. Purpose: The key purpose of this study…

  7. Resolution of RNA using high-performance liquid chromatography

    NARCIS (Netherlands)

    Mclaughlin, L.W.; Bischoff, Rainer

    1987-01-01

    High-performance liquid chromatographic techniques can be very effective for the resolution and isolation of nucleic acids. The characteristic ionic (phosphodiesters) and hydrophobic (nucleobases) properties of RNAs can be exploited for their separation. In this respect anion-exchange and

  8. Mallow carotenoids determined by high-performance liquid chromatography

    Science.gov (United States)

    Mallow (Corchorus olitorius) is a green vegetable which is widely consumed either fresh or dry by the Middle East population. This study was carried out to quantitatively determine the contents of major carotenoids in mallow, using high performance liquid chromatography (HPLC) equipped with a Bis...

  9. High-Performance Liquid Chromatography-Mass Spectrometry.

    Science.gov (United States)

    Vestal, Marvin L.

    1984-01-01

    Reviews techniques for online coupling of high-performance liquid chromatography with mass spectrometry, emphasizing those suitable for application to nonvolatile samples. Also summarizes the present status, strengths, and weaknesses of various techniques and discusses potential applications of recently developed techniques for combined liquid…

  10. Developments on HNF based high performance and green solid propellants

    NARCIS (Netherlands)

    Keizers, H.L.J.; Heijden, A.E.D.M. van der; Vliet, L.D. van; Welland-Veltmans, W.H.M.; Ciucci, A.

    2001-01-01

    Worldwide developments are ongoing to develop new and more energetic composite solid propellant formulations for space transportation and military applications. Since the 1990s, the use of HNF as a new high performance oxidiser has been reinvestigated. Within European development programmes,

  11. High-Performance Matrix-Vector Multiplication on the GPU

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik Brandenborg

    2012-01-01

    In this paper, we develop a high-performance GPU kernel for one of the most popular dense linear algebra operations, the matrix-vector multiplication. The target hardware is the most recent Nvidia Tesla 20-series (Fermi architecture), which is designed from the ground up for scientific computing...
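
    The abstract does not reproduce the kernel, but the row-parallel decomposition that dense matrix-vector products typically use on GPUs is easy to sketch. The Python snippet below is an illustrative emulation under that assumption, not the authors' kernel: each loop iteration stands in for the thread block (or warp) that would reduce one row's dot product on the Fermi-class hardware mentioned above.

    import numpy as np

    def matvec_row_parallel(A, x):
        """Compute y = A @ x row by row, mirroring a per-row GPU work mapping."""
        m, _ = A.shape
        y = np.empty(m, dtype=A.dtype)
        for row in range(m):            # on a GPU, this loop becomes the parallel grid
            y[row] = np.dot(A[row], x)  # each thread block reduces one row's dot product
        return y

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        A = rng.standard_normal((256, 128))
        x = rng.standard_normal(128)
        assert np.allclose(matvec_row_parallel(A, x), A @ x)
        print("row-parallel matvec matches the NumPy reference")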

  12. High-performance carbon nanotube-reinforced bioplastic

    CSIR Research Space (South Africa)

    Ramontja, J

    2009-12-01

    Full Text Available. High-Performance Carbon Nanotube-Reinforced Bioplastic. James Ramontja, Suprakas Sinha Ray, Sreejarani K. Pillai, Adriaan S. Luyt. DST/CSIR Nanotechnology Innovation Centre, National Centre for Nano-Structured Materials...

  13. High performance current controller for particle accelerator magnets supply

    DEFF Research Database (Denmark)

    Maheshwari, Ram Krishan; Bidoggia, Benoit; Munk-Nielsen, Stig

    2013-01-01

    The electromagnets in modern particle accelerators require a high performance power supply whose output must track the current reference with very high accuracy (down to 50 ppm). This demands a very high bandwidth controller design. A converter based on the buck converter topology is used...

  14. The design of high performance weak current integrated amplifier

    International Nuclear Information System (INIS)

    Chen Guojie; Cao Hui

    2005-01-01

    A design method for a high performance weak-current integrated amplifier using the ICL7650 operational amplifier is introduced. The operating principle of the circuits and the steps taken to improve the amplifier's performance are illustrated. Finally, the experimental results are given. The amplifier has a programmable measurement range of 10⁻⁹–10⁻¹² A, automatic zero-correction, accurate measurement, and good stability. (authors)

  15. Monitoring aged reversed-phase high performance liquid chromatography columns

    NARCIS (Netherlands)

    Bolck, A; Smilde, AK; Bruins, CHP

    1999-01-01

    In this paper, a new approach for the quality assessment of routinely used reversed-phase high performance liquid chromatography columns is presented. A used column is not directly considered deteriorated when changes in retention occur. If attention is paid to the type and magnitude of the changes,

  16. Ultra high performance liquid chromatography of seized drugs

    NARCIS (Netherlands)

    Lurie, I.S.

    2010-01-01

    The primary goal of this thesis is to investigate the use of ultra high performance liquid chromatography (UHPLC) for the analysis of seized drugs. This goal was largely achieved and significant progress was made in achieving improved separation and detection of drugs of forensic interest.

  17. Comparative Studies of Some Polypores Using High Performance ...

    African Journals Online (AJOL)

    ... these polypores in a previous work. The ability of the polypores to produce triterpenoids is affected by their age, period of collection, geographical location and method of drying, which also affected the High Performance Liquid Chromatography characteristics of their secondary metabolites. African Research Review Vol.

  18. Manufacturing Advantage: Why High-Performance Work Systems Pay Off.

    Science.gov (United States)

    Appelbaum, Eileen; Bailey, Thomas; Berg, Peter; Kalleberg, Arne L.

    A study examined the relationship between high-performance workplace practices and the performance of plants in the following manufacturing industries: steel, apparel, and medical electronic instruments and imaging. The multilevel research methodology combined the following data collection activities: (1) site visits; (2) collection of plant…

  19. Quantification of Tea Flavonoids by High Performance Liquid Chromatography

    Science.gov (United States)

    Freeman, Jessica D.; Niemeyer, Emily D.

    2008-01-01

    We have developed a laboratory experiment that uses high performance liquid chromatography (HPLC) to quantify flavonoid levels in a variety of commercial teas. Specifically, this experiment analyzes a group of flavonoids known as catechins, plant-derived polyphenolic compounds commonly found in many foods and beverages, including green and black…

  20. High performance co-polyimide nanofiber reinforced composites

    NARCIS (Netherlands)

    Yao, Jian; Li, Guang; Bastiaansen, Cees; Peijs, Ton

    2015-01-01

    Electrospun co-polyimide BPDA (3, 3′, 4, 4′-Biphenyltetracarboxylic dianhydride)/PDA (p-Phenylenediamine)/ODA (4, 4′-oxydianiline) nanofiber reinforced flexible composites were manufactured by impregnating these high performance nanofibers with styrene-butadiene-styrene (SBS) triblock copolymer

  1. High performance flexible CMOS SOI FinFETs

    KAUST Repository

    Fahad, Hossain M.; Sevilla, Galo T.; Ghoneim, Mohamed T.; Hussain, Muhammad Mustafa

    2014-01-01

    We demonstrate the first ever CMOS compatible soft etch back based high performance flexible CMOS SOI FinFETs. The move from planar to non-planar FinFETs has enabled continued scaling down to the 14 nm technology node. This has been possible due

  2. Flexible nanoscale high-performance FinFETs

    KAUST Repository

    Sevilla, Galo T.; Ghoneim, Mohamed T.; Fahad, Hossain M.; Rojas, Jhonathan Prieto; Hussain, Aftab M.; Hussain, Muhammad Mustafa

    2014-01-01

    With the emergence of the Internet of Things (IoT), flexible high-performance nanoscale electronics are more desired. At the moment, FinFET is the most advanced transistor architecture used in the state-of-the-art microprocessors. Therefore, we show

  3. Radioactivity monitor for high-performance liquid chromatography

    International Nuclear Information System (INIS)

    Reeve, D.R.; Crozier, A.

    1977-01-01

    The coupling of a homogeneous radioactivity monitor to a liquid chromatograph involves compromises between the sensitivity of the monitor and the resolution and speed of analysis of the chromatograph. The theoretical relationships between these parameters are considered and expressions derived which make it possible to calculate suitable monitor operating conditions for most types of high-performance liquid chromatography

  4. Cobra Strikes! High-Performance Car Inspires Students, Markets Program

    Science.gov (United States)

    Jenkins, Bonita

    2008-01-01

    Nestled in the Lower Piedmont region of upstate South Carolina, Piedmont Technical College (PTC) is one of 16 technical colleges in the state. Automotive technology is one of its most popular programs. The program features an instructive, motivating activity that the author describes in this article: building a high-performance car. The Cobra…

  5. Neural Correlates of High Performance in Foreign Language Vocabulary Learning

    Science.gov (United States)

    Macedonia, Manuela; Muller, Karsten; Friederici, Angela D.

    2010-01-01

    Learning vocabulary in a foreign language is a laborious task which people perform with varying levels of success. Here, we investigated the neural underpinning of high performance on this task. In a within-subjects paradigm, participants learned 92 vocabulary items under two multimodal conditions: one condition paired novel words with iconic…

  6. A high-performance, low-cost, leading edge discriminator

    Indian Academy of Sciences (India)

    A high-performance, low-cost, leading edge discriminator has been designed with a timing performance comparable to state-of-the-art, commercially available discriminators. A timing error of 16 ps is achieved under ideal operating conditions. Under more realistic operating conditions the discriminator displays a ...

  7. Buffer-Free High Performance Liquid Chromatography Method for ...

    African Journals Online (AJOL)

    Purpose: To develop and validate a simple, economical and reproducible high performance liquid chromatographic (HPLC) method for the determination of theophylline in pharmaceutical dosage forms. Method: Caffeine was used as the internal standard and reversed phase C-18 column was used to elute the drug and ...

  8. Solid-Phase Extraction Combined with High Performance Liquid ...

    African Journals Online (AJOL)

    Methods: Solid-phase extraction method was employed for the extraction of the estrogen from milk and high performance liquid chromatography-diode array detector (HPLC-DAD) was used for the determination of estrogen. Results: Optimal chromatographic conditions were achieved on an Eclipse XDB-C18 column at a ...

  9. White paper on nuclear astrophysics and low energy nuclear physics Part 1: Nuclear astrophysics

    International Nuclear Information System (INIS)

    Arcones, Almudena; Bardayan, Dan W.

    2016-01-01

    This white paper informs the nuclear astrophysics community and funding agencies about the scientific directions and priorities of the field and provides input from this community for the 2015 Nuclear Science Long Range Plan. It also summarizes the outcome of the nuclear astrophysics town meeting that was held on August 21–23, 2014 in College Station at the campus of Texas A&M University in preparation for the NSAC Nuclear Science Long Range Plan. It also reflects the outcome of an earlier town meeting of the nuclear astrophysics community organized by the Joint Institute for Nuclear Astrophysics (JINA) on October 9–10, 2012 in Detroit, Michigan, with the purpose of developing a vision for nuclear astrophysics in light of the recent NRC decadal surveys in nuclear physics (NP2010) and astronomy (ASTRO2010). Our white paper is informed by the town meeting of the Association of Research at University Nuclear Accelerators (ARUNA) that took place at the University of Notre Dame on June 12–13, 2014. In summary we find that nuclear astrophysics is a modern and vibrant field addressing fundamental science questions at the intersection of nuclear physics and astrophysics. These questions relate to the origin of the elements, the nuclear engines that drive the life and death of stars, and the properties of dense matter. A broad range of nuclear accelerator facilities, astronomical observatories, theory efforts, and computational capabilities are needed. Answers to long-standing key questions are well within reach in the coming decade because of the developments outlined in this white paper.

  10. High-Performance 3D Articulated Robot Display

    Science.gov (United States)

    Powell, Mark W.; Torres, Recaredo J.; Mittman, David S.; Kurien, James A.; Abramyan, Lucy

    2011-01-01

    In the domain of telerobotic operations, the primary challenge facing the operator is to understand the state of the robotic platform. One key aspect of understanding the state is to visualize the physical location and configuration of the platform. As there is a wide variety of mobile robots, the requirements for visualizing their configurations vary diversely across different platforms. There can also be diversity in the mechanical mobility, such as wheeled, tracked, or legged mobility over surfaces. Adaptable 3D articulated robot visualization software can accommodate a wide variety of robotic platforms and environments. The visualization has been used for surface, aerial, space, and water robotic vehicle visualization during field testing. It has been used to enable operations of wheeled and legged surface vehicles, and can be readily adapted to facilitate other mechanical mobility solutions. The 3D visualization can render an articulated 3D model of a robotic platform for any environment. Given the model, the software receives real-time telemetry from the avionics system onboard the vehicle and animates the robot visualization to reflect the telemetered physical state. This is used to track the position and attitude in real time to monitor the progress of the vehicle as it traverses its environment. It is also used to monitor the state of any or all articulated elements of the vehicle, such as arms, legs, or control surfaces. The visualization can also render other sorts of telemetered states visually, such as stress or strains that are measured by the avionics. Such data can be used to color or annotate the virtual vehicle to indicate nominal or off-nominal states during operation. The visualization is also able to render the simulated environment where the vehicle is operating. For surface and aerial vehicles, it can render the terrain under the vehicle as the avionics sends it location information (GPS, odometry, or star tracking), and locate the vehicle
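
    As a rough illustration of the telemetry-to-pose step described above, the following Python sketch re-poses a planar two-joint arm from a telemetry sample using simple forward kinematics. It is a hypothetical toy, not the NASA visualization software itself: the data structure, field names and link lengths are assumptions, and a real display would drive a full 3D scene graph rather than print coordinates.

    import math
    from dataclasses import dataclass

    @dataclass
    class Telemetry:
        """One telemetry sample: base position (m) and two joint angles (rad)."""
        base_x: float
        base_y: float
        joint1: float
        joint2: float

    def forward_kinematics(t, l1=1.0, l2=0.7):
        """Return the base, elbow and end-effector positions for redrawing the model."""
        elbow = (t.base_x + l1 * math.cos(t.joint1),
                 t.base_y + l1 * math.sin(t.joint1))
        tip = (elbow[0] + l2 * math.cos(t.joint1 + t.joint2),
               elbow[1] + l2 * math.sin(t.joint1 + t.joint2))
        return (t.base_x, t.base_y), elbow, tip

    if __name__ == "__main__":
        # Each incoming telemetry sample simply re-poses the articulated model.
        sample = Telemetry(base_x=0.0, base_y=0.0, joint1=math.pi / 4, joint2=-math.pi / 6)
        for name, (px, py) in zip(("base", "elbow", "tip"), forward_kinematics(sample)):
            print(f"{name}: ({px:.3f}, {py:.3f})")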

  11. Critical ionisation velocity effects in astrophysical plasmas

    International Nuclear Information System (INIS)

    Raadu, M.A.

    1979-08-01

    Critical ionisation velocity effects are relevant to astrophysical situations where neutral gas moves through a magnetised plasma. The experimental significance of the critical velocity is well established and the physical basis is now becoming clear. The underlying mechanism depends on the combined effects of electron impact ionisation and electron energisation by collective plasma interactions. For low density plasmas a theory based on a circular process involving electron heating through a modified two stream instability has been developed. Several applications of critical velocity effects to astrophysical plasmas have been discussed in the literature. The importance of the effect in any particular case may be determined from a detailed consideration of energy and momentum balance, using appropriate atomic rate coefficients and taking full account of collective plasma processes. (Auth.)

  12. International Conference on Particle Physics and Astrophysics

    CERN Document Server

    2015-01-01

    The International Conference on Particle Physics and Astrophysics (ICPPA-2015) will be held in Moscow, Russia, from October 5 to 10, 2015. The conference is organized by the Center of Basic Research and Particle Physics of the National Research Nuclear University “MEPhI”. The aim of the Conference is to promote contacts between scientists and the development of new ideas in fundamental research. Therefore we will bring together experts and young scientists working on experimental and theoretical aspects of nuclear, particle, astroparticle physics and cosmology. ICPPA-2015 aims to present the most recent results in astrophysics and collider physics and reports from the main experiments currently taking data. The working languages of the conference are English and Russian.

  13. Underground nuclear astrophysics at the Dresden Felsenkeller

    Energy Technology Data Exchange (ETDEWEB)

    Bemmerer, Daniel; Ilgner, Christoph; Junghans, Arnd R.; Mueller, Stefan; Rimarzig, Bernd; Schwengner, Ronald; Szuecs, Tamas; Wagner, Andreas [Helmholtz-Zentrum Dresden-Rossendorf (HZDR), Dresden (Germany); Cowan, Thomas E.; Gohl, Stefan; Grieger, Marcel; Reinicke, Stefan; Roeder, Marko; Schmidt, Konrad; Stoeckel, Klaus; Takacs, Marcell P.; Wagner, Louis [Helmholtz-Zentrum Dresden-Rossendorf (HZDR), Dresden (Germany); Technische Universitaet Dresden (Germany); Reinhardt, Tobias P.; Zuber, Kai [Technische Universitaet Dresden (Germany)

    2015-07-01

    Favored by the low background underground, accelerator-based experiments are an important tool to study nuclear astrophysics reactions involving stable charged particles. This technique has been used with great success at the 0.4 MV LUNA accelerator in the Gran Sasso laboratory in Italy. However, the nuclear reactions of helium and carbon burning and the neutron source reactions for the astrophysical s-process require higher beam energies, as does the continuation of solar fusion studies. As a result, NuPECC strongly recommended the installation of one or more higher-energy underground accelerators. Such a project is underway in Dresden. A 5 MV Pelletron accelerator is currently being refurbished by installing an ion source on the high voltage terminal, enabling intense helium beams. The preparation of the underground site is funded, and the civil engineering project is being updated. The science case, operational strategy and project status are reported.

  14. Numerical Methods for Radiation Magnetohydrodynamics in Astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Klein, R I; Stone, J M

    2007-11-20

    We describe numerical methods for solving the equations of radiation magnetohydrodynamics (MHD) for astrophysical fluid flow. Such methods are essential for the investigation of the time-dependent and multidimensional dynamics of a variety of astrophysical systems, although our particular interest is motivated by problems in star formation. Over the past few years, the authors have been members of two parallel code development efforts, and this review reflects that organization. In particular, we discuss numerical methods for MHD as implemented in the Athena code, and numerical methods for radiation hydrodynamics as implemented in the Orion code. We discuss the challenges introduced by the use of adaptive mesh refinement in both codes, as well as the most promising directions for future developments.

  15. Astrophysical data analysis with information field theory

    International Nuclear Information System (INIS)

    Enßlin, Torsten

    2014-01-01

    Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular, as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented
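
    A minimal example of the correlation-exploiting signal recovery that IFT generalizes is the classical Wiener filter, sketched below in Python. The prior power spectrum and noise level are illustrative assumptions, not values from the article; the point is only that Fourier modes where the assumed signal power dominates are kept, while noise-dominated modes are suppressed.

    import numpy as np

    def wiener_filter(data, prior_power, noise_power):
        """Down-weight Fourier modes where the assumed noise power dominates the prior."""
        d_k = np.fft.fft(data)
        weight = prior_power / (prior_power + noise_power)   # per-mode Wiener weight
        return np.real(np.fft.ifft(weight * d_k))

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        n = 512
        x = np.arange(n)
        signal = np.sin(2 * np.pi * 3 * x / n) + 0.5 * np.sin(2 * np.pi * 7 * x / n)
        noise_var = 0.25
        data = signal + rng.normal(scale=np.sqrt(noise_var), size=n)

        # Illustrative prior: power concentrated at low wavenumbers (smooth signals);
        # white noise has flat per-mode power n * noise_var under NumPy's FFT convention.
        k = np.fft.fftfreq(n, d=1.0 / n)
        prior_power = (n ** 2 / 4.0) / (1.0 + (np.abs(k) / 5.0) ** 4)
        noise_power = np.full(n, n * noise_var)

        recovered = wiener_filter(data, prior_power, noise_power)
        print("rms error, noisy data:", np.sqrt(np.mean((data - signal) ** 2)))
        print("rms error, recovered :", np.sqrt(np.mean((recovered - signal) ** 2)))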

  16. Status reports of supercomputing astrophysics in Japan

    International Nuclear Information System (INIS)

    Nakamura, Takashi; Nagasawa, Mikio

    1990-01-01

    The Workshop on Supercomputing Astrophysics was held at the National Laboratory for High Energy Physics (KEK, Tsukuba) from August 31 to September 2, 1989. More than 40 participants, physicists and astronomers, attended and discussed many topics in an informal atmosphere. The main purpose of this workshop was to survey theoretical activities in computational astrophysics in Japan. It also aimed to promote effective collaboration among the numerical experimentalists working on supercomputing techniques. The presented papers covered stimulating subjects in hydrodynamics, plasma physics, gravitating systems, radiative transfer and general relativity. These numerical calculations have now become possible in Japan owing to the power of Japanese supercomputers such as the HITAC S820, Fujitsu VP400E and NEC SX-2. (J.P.N.)

  17. Modern fluid dynamics for physics and astrophysics

    CERN Document Server

    Regev, Oded; Yecko, Philip A

    2016-01-01

    This book grew out of the need to provide students with a solid introduction to modern fluid dynamics. It offers a broad grounding in the underlying principles and techniques used, with some emphasis on applications in astrophysics and planetary science. The book comprehensively covers recent developments, methods and techniques, including, for example, new ideas on transitions to turbulence (via transiently growing stable linear modes), new approaches to turbulence (which remains the enigma of fluid dynamics), and the use of asymptotic approximation methods, which can give analytical or semi-analytical results and complement fully numerical treatments. The authors also briefly discuss some important considerations to be taken into account when developing a numerical code for computer simulation of fluid flows. Although the text is populated throughout with examples and problems from the field of astrophysics and planetary science, the text is eminently suitable as a general introduction to fluid dynamics. It...

  18. The Future of Gamma Ray Astrophysics

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    Over the past decade, gamma ray astrophysics has entered the astrophysical mainstream. Extremely successful space-borne (GeV) and ground-based (TeV) detectors, combined with a multitude of partner telescopes, have revealed a fascinating “astroscape” of active galactic nuclei, pulsars, gamma ray bursts, supernova remnants, binary stars, star-forming galaxies, novae, and much more, exhibiting major pathways along which large energy releases can flow. From a basic physics perspective, exquisitely sensitive measurements have constrained the nature of dark matter, the cosmological origin of magnetic fields and the properties of black holes. These advances have motivated the development of new facilities, including HAWC, DAMPE, CTA and SVOM, which will further our understanding of the high energy universe. Topics that will receive special attention include merging neutron star binaries, clusters of galaxies, galactic cosmic rays and putative TeV dark matter.

  19. Neutrino particle astrophysics: status and outlook

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    The discovery of astrophysical neutrinos at high energy by IceCube raises a host of questions: What are the sources? Is there a Galactic as well as an extragalactic component? How does the astrophysical spectrum continue to lower energy where the dominant signal is from atmospheric neutrinos? Is there a measurable flux of cosmogenic neutrinos at higher energy? What is the connection to cosmic rays? At what level and in what energy region should we expect to see evidence of the π⁰ decay photons that must accompany the neutrinos at production? Such questions are stimulating much theoretical activity and many multi-wavelength follow-up observations as well as driving plans for new detectors. My goal in this presentation will be to connect the neutrino data and their possible interpretations to ongoing multi-messenger observations and to the design of future detectors.

  20. Exploring Astrophysical Magnetohydrodynamics in the Laboratory

    Science.gov (United States)

    Manuel, Mario

    2014-10-01

    Plasma evolution in many astrophysical systems is dominated by magnetohydrodynamics. Of specific interest to this talk are collimated outflows from accretion systems. Away from the central object, the Euler equations can represent the plasma dynamics well and may be scaled to a laboratory system. We have performed experiments to investigate the effects of a background magnetic field on an otherwise hydrodynamically collimated plasma. Laser-irradiated cone targets produce hydrodynamically collimated plasma jets, and a pulse-powered solenoid provides a constant background magnetic field. The application of this field is shown to completely disrupt the original flow, and a new magnetically-collimated, hollow envelope is produced. Results from these experiments and potential implications for their astrophysical analogs will be discussed.