WorldWideScience

Sample records for aim program compilation

  1. A software methodology for compiling quantum programs

    Science.gov (United States)

    Häner, Thomas; Steiger, Damian S.; Svore, Krysta; Troyer, Matthias

    2018-04-01

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. To achieve scalable quantum computation, optimizing compilers and a corresponding software design flow will be essential. We present a software architecture for compiling quantum programs from a high-level language program to hardware-specific instructions. We describe the necessary layers of abstraction and their differences and similarities to classical layers of a computer-aided design flow. For each layer of the stack, we discuss the underlying methods for compilation and optimization. Our software methodology facilitates more rapid innovation among quantum algorithm designers, quantum hardware engineers, and experimentalists. It enables scalable compilation of complex quantum algorithms and can be targeted to any specific quantum hardware implementation.
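One layer of such a stack, gate-level rewriting into a hardware-native gate set, can be sketched in a few lines. The rewrite rule shown (CNOT into Hadamards and a controlled-Z, a common native set) is a textbook identity used for illustration, not code from the paper:

```python
import numpy as np

# One rewrite rule a gate-level compiler pass might apply: express CNOT
# in a native gate set {H, CZ}, using the identity CNOT = (I (x) H) CZ (I (x) H).
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)   # Hadamard
I2 = np.eye(2)
CZ = np.diag([1.0, 1.0, 1.0, -1.0])                      # controlled-Z
CNOT = np.array([[1.0, 0, 0, 0],
                 [0, 1.0, 0, 0],
                 [0, 0, 0, 1.0],
                 [0, 0, 1.0, 0]])

def compile_cnot():
    # Apply H to the target qubit, CZ across both qubits, then H again.
    return np.kron(I2, H) @ CZ @ np.kron(I2, H)

assert np.allclose(compile_cnot(), CNOT)   # the rewrite preserves the unitary
```

A real optimizing pass would apply many such rules while minimizing depth and gate count for the target hardware.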

  2. Advanced Industrial Materials (AIM) Program: Compilation of project summaries and significant accomplishments, FY 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-04-01

    In many ways, the Advanced Industrial Materials (AIM) Program underwent a major transformation in Fiscal Year 1995, and these changes have continued to the present. When the Program was established in 1990 as the Advanced Industrial Concepts (AIC) Materials Program, its mission was to conduct applied research and development to bring materials and processing technologies from the knowledge derived from basic research to the maturity required by the end-use sectors for commercialization. In 1995, the Office of Industrial Technologies (OIT) made radical changes in structure and procedures. All technology development was directed toward the seven "Vision Industries" that use about 80% of industrial energy and generate about 90% of industrial wastes. The mission of AIM has, therefore, changed to "Support development and commercialization of new or improved materials to improve productivity, product quality, and energy efficiency in the major process industries." Though AIM remains essentially a National Laboratory Program, it is essential that each project have industrial partners, including suppliers to, and customers of, the seven industries. Now, well into FY 1996, the transition is nearly complete and the AIM Program remains reasonably healthy and productive, thanks to the superb investigators and Laboratory Program Managers. This report contains the technical details of some very remarkable work by the best materials scientists and engineers in the world. Subject areas covered are: advanced metals and composites; advanced ceramics and composites; polymers and biobased materials; and new materials and processes.

  3. Perspex machine: V. Compilation of C programs

    Science.gov (United States)

    Spanner, Matthew P.; Anderson, James A. D. W.

    2006-01-01

    The perspex machine arose from the unification of the Turing machine with projective geometry. The original, constructive proof used four special, perspective transformations to implement the Turing machine in projective geometry. These four transformations are now generalised and applied in a compiler, implemented in Pop11, that converts a subset of the C programming language into perspexes. This is interesting both from a geometrical and a computational point of view. Geometrically, it is interesting that program source can be converted automatically to a sequence of perspective transformations and conditional jumps, though we find that the product of homogeneous transformations with normalisation can be non-associative. Computationally, it is interesting that program source can be compiled for a Reduced Instruction Set Computer (RISC), the perspex machine, that is a Single Instruction, Zero Exception (SIZE) computer.

  4. Compiling the parallel programming language NestStep to the CELL processor

    OpenAIRE

    Holm, Magnus

    2010-01-01

    The goal of this project is to create a source-to-source compiler which will translate NestStep code to C code. The compiler's job is to replace NestStep constructs with a series of function calls to the NestStep runtime system. NestStep is a parallel programming language extension based on the BSP model. It adds constructs for parallel programming on top of an imperative programming language. For this project, only constructs extending the C language are relevant. The output code will compil...
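The construct-to-runtime-call replacement described here can be illustrated with a toy source-to-source rewriter. The `step { ... }` syntax and the `NestStep_*` function names below are invented for illustration and are not the actual NestStep language or runtime API:

```python
# Toy source-to-source lowering in the spirit described above: a hypothetical
# "step { ... }" superstep construct is replaced by calls into a runtime
# system. Assumes non-nested, well-bracketed "step {" occurrences.
def lower_steps(src: str) -> str:
    key = "step {"
    while (start := src.find(key)) != -1:
        depth, i = 1, start + len(key)
        while depth:                       # scan to the matching closing brace
            depth += {"{": 1, "}": -1}.get(src[i], 0)
            i += 1
        body = src[start + len(key):i - 1]
        src = (src[:start] + "NestStep_begin_superstep();" + body
               + "NestStep_end_superstep();" + src[i:])
    return src

c_in = "int x; step { x = combine(x); } done();"
c_out = lower_steps(c_in)   # construct replaced by runtime calls
```

The real compiler works on a parse tree rather than raw text, but the shape of the transformation is the same.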

  5. abc, the aspectBench compiler for aspectJ: a workbench for aspect-oriented programming language and compilers research

    DEFF Research Database (Denmark)

    Allan, Chris; Avgustinov, Pavel; Christensen, Aske Simon

    2005-01-01

    Aspect-oriented programming (AOP) is gaining popularity as a new way of modularising cross-cutting concerns. The aspectbench compiler (abc) is a new workbench for AOP research which provides an extensible research framework for both new language features and new compiler optimisations. This poste...

  6. Scientific Programming with High Performance Fortran: A Case Study Using the xHPF Compiler

    Directory of Open Access Journals (Sweden)

    Eric De Sturler

    1997-01-01

    Recently, the first commercial High Performance Fortran (HPF) subset compilers have appeared. This article reports on our experiences with the xHPF compiler of Applied Parallel Research, version 1.2, for the Intel Paragon. At this stage, we do not expect very High Performance from our HPF programs, even though performance will eventually be of paramount importance for the acceptance of HPF. Instead, our primary objective is to study how to convert large Fortran 77 (F77) programs to HPF such that the compiler generates reasonably efficient parallel code. We report on a case study that identifies several problems when parallelizing code with HPF; most of these problems affect current HPF compiler technology in general, although some are specific to the xHPF compiler. We discuss our solutions from the perspective of the scientific programmer, and present timing results on the Intel Paragon. The case study comprises three programs of different complexity with respect to parallelization. We use the dense matrix-matrix product to show that the distribution of arrays and the order of nested loops significantly influence the performance of the parallel program. We use Gaussian elimination with partial pivoting to study the parallelization strategy of the compiler. There are various ways to structure this algorithm for a particular data distribution. This example shows how much effort may be demanded from the programmer to support the compiler in generating an efficient parallel implementation. Finally, we use a small application to show that the more complicated structure of a larger program may introduce problems for the parallelization, even though all subroutines of the application are easy to parallelize by themselves. The application consists of a finite volume discretization on a structured grid and a nested iterative solver. Our case study shows that it is possible to obtain reasonably efficient parallel programs with xHPF, although the compiler
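The loop-order observation for the dense matrix-matrix product can be reproduced in miniature. Both orderings below compute the same product, but the ikj order makes the innermost loop stream contiguous rows of B and C, which interacts more favorably with caches and row-wise array distributions (a generic sketch, not the article's F77/HPF code):

```python
# Two loop orders for C = A * B. Both are correct; ikj streams rows of B and C
# in the inner loop, the kind of access-pattern difference that determines
# whether a data distribution yields an efficient parallel program.
def matmul_ijk(A, B):
    n, m, p = len(A), len(B), len(B[0])
    C = [[0.0] * p for _ in range(n)]
    for i in range(n):
        for j in range(p):
            for k in range(m):           # inner loop strides down a column of B
                C[i][j] += A[i][k] * B[k][j]
    return C

def matmul_ikj(A, B):
    n, m, p = len(A), len(B), len(B[0])
    C = [[0.0] * p for _ in range(n)]
    for i in range(n):
        for k in range(m):
            aik = A[i][k]
            for j in range(p):           # inner loop streams a row of B and C
                C[i][j] += aik * B[k][j]
    return C

assert matmul_ijk([[1, 2], [3, 4]], [[5, 6], [7, 8]]) == [[19, 22], [43, 50]]
```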

  7. Languages, compilers and run-time environments for distributed memory machines

    CERN Document Server

    Saltz, J

    1992-01-01

    Papers presented within this volume cover a wide range of topics related to programming distributed memory machines. Distributed memory architectures, although having the potential to supply the very high levels of performance required to support future computing needs, present awkward programming problems. The major issue is to design methods which enable compilers to generate efficient distributed memory programs from relatively machine independent program specifications. This book is the compilation of papers describing a wide range of research efforts aimed at easing the task of programmin

  8. A program-compiling method of nuclear data on-line fast analysis

    International Nuclear Information System (INIS)

    Li Shangbai

    1990-01-01

    This paper discusses how to perform assembly floating-point operations using subroutines of the Applesoft system, and introduces a program-compiling method for fast on-line analysis of nuclear data on an Apple microcomputer

  9. CRECTJ: a computer program for compilation of evaluated nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1999-09-01

    In order to compile evaluated nuclear data in the ENDF format, the computer program CRECTJ has been developed. CRECTJ has two versions; CRECTJ5 treats the data in the ENDF/B-IV and ENDF/B-V format, and CRECTJ6 the data in the ENDF-6 format. These programs have been frequently used to make Japanese Evaluated Nuclear Data Library (JENDL). This report describes input data and examples of CRECTJ. (author)

  10. Run-Time and Compiler Support for Programming in Adaptive Parallel Environments

    Directory of Open Access Journals (Sweden)

    Guy Edjlali

    1997-01-01

    For better utilization of computing resources, it is important to consider parallel programming environments in which the number of available processors varies at run-time. In this article, we discuss run-time support for data-parallel programming in such an adaptive environment. Executing programs in an adaptive environment requires redistributing data when the number of processors changes, and also requires determining new loop bounds and communication patterns for the new set of processors. We have developed a run-time library to provide this support. We discuss how the run-time library can be used by compilers of High Performance Fortran (HPF)-like languages to generate code for an adaptive environment. We present performance results for a Navier-Stokes solver and a multigrid template run on a network of workstations and an IBM SP-2. Our experiments show that if the number of processors is not varied frequently, the cost of data redistribution is not significant compared to the time required for the actual computation. Overall, our work establishes the feasibility of compiling HPF for a network of nondedicated workstations, which are likely to be an important resource for parallel programming in the future.
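The core of such run-time support, recomputing block-distribution loop bounds when the processor count changes and moving data accordingly, can be sketched as follows (a generic sketch, not the library described in the article):

```python
# Recompute block-distributed loop bounds for a new processor count, then
# redistribute the data to match; this is the adaptive-environment step
# the run-time library performs when processors join or leave.
def block_bounds(n, nprocs):
    """Per-processor (lower, upper-exclusive) iteration bounds, block layout."""
    base, extra = divmod(n, nprocs)
    bounds, lo = [], 0
    for p in range(nprocs):
        hi = lo + base + (1 if p < extra else 0)   # first `extra` procs get +1
        bounds.append((lo, hi))
        lo = hi
    return bounds

def redistribute(blocks, new_nprocs):
    """Flatten the old per-processor blocks and re-split for the new count."""
    flat = [x for blk in blocks for x in blk]
    return [flat[lo:hi] for lo, hi in block_bounds(len(flat), new_nprocs)]

# Shrinking from 2 to 3 processors: data is re-blocked over the new set.
new_blocks = redistribute([[1, 2, 3], [4, 5]], 3)
```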

  11. Evaluation of HAL/S language compilability using SAMSO's Compiler Writing System (CWS)

    Science.gov (United States)

    Feliciano, M.; Anderson, H. D.; Bond, J. W., III

    1976-01-01

    NASA/Langley is engaged in a program to develop an adaptable guidance and control software concept for spacecraft such as shuttle-launched payloads. It is envisioned that this flight software be written in a higher-order language, such as HAL/S, to facilitate changes or additions. To make this adaptable software transferable to various onboard computers, a compiler writing system capability is necessary. A joint program with the Air Force Space and Missile Systems Organization was initiated to determine if the Compiler Writing System (CWS) owned by the Air Force could be utilized for this purpose. The present study explores the feasibility of including the HAL/S language constructs in CWS and the effort required to implement these constructs. This will determine the compilability of HAL/S using CWS and permit NASA/Langley to identify the HAL/S constructs desired for their applications. The study consisted of comparing the implementation of the Space Programming Language using CWS with the requirements for the implementation of HAL/S. It is the conclusion of the study that CWS already contains many of the language features of HAL/S and that it can be expanded for compiling part or all of HAL/S. It is assumed that persons reading and evaluating this report have a basic familiarity with (1) the principles of compiler construction and operation, and (2) the logical structure and applications characteristics of HAL/S and SPL.

  12. Compiler issues associated with safety-related software

    International Nuclear Information System (INIS)

    Feinauer, L.R.

    1991-01-01

    A critical issue in the quality assurance of safety-related software is the ability of the software to produce identical results, independent of the host machine, operating system, or compiler version under which the software is installed. A study is performed using the VIPRE-01, FREY-01, and RETRAN-02 safety-related codes. Results from an IBM 3083 computer are compared with results from a CYBER 860 computer. All three of the computer programs examined are written in FORTRAN; the VIPRE code uses the FORTRAN 66 compiler, whereas the FREY and RETRAN codes use the FORTRAN 77 compiler. Various compiler options are studied to determine their effect on the output between machines. Since the Control Data Corporation and IBM machines inherently represent numerical data differently, methods of producing equivalent accuracy of data representation were an important focus of the study. This paper identifies particular problems in the automatic double-precision option (AUTODBL) of the IBM FORTRAN 1.4.x series of compilers. The IBM FORTRAN version 2 compilers provide much more stable, reliable compilation for engineering software. Careful selection of compilers and compiler options can help guarantee identical results between different machines. To ensure reproducibility of results, the same compiler and compiler options should be used to install the program as were used in the development and testing of the program.
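The precision sensitivity at issue is easy to demonstrate: the same accumulation carried out in 32-bit and 64-bit arithmetic yields different results, which is why a single-precision build and an AUTODBL (automatic double-precision) build of the same source need not agree. This is a generic illustration, not the VIPRE/FREY/RETRAN codes:

```python
import numpy as np

# Same source-level computation, two precisions: accumulate 0.1 one thousand
# times in float32 and in float64. Neither result is exactly 100, and the two
# disagree with each other, so safety-analysis results can only be reproduced
# by fixing the compiler's precision options along with the compiler version.
acc32 = np.float32(0.0)
acc64 = np.float64(0.0)
for _ in range(1000):
    acc32 += np.float32(0.1)
    acc64 += np.float64(0.1)

assert float(acc32) != float(acc64)                         # precisions disagree
assert abs(float(acc32) - 100.0) > abs(float(acc64) - 100.0)  # 32-bit is worse
```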

  13. Algorithmic synthesis using Python compiler

    Science.gov (United States)

    Cieszewski, Radoslaw; Romaniuk, Ryszard; Pozniak, Krzysztof; Linczuk, Maciej

    2015-09-01

    This paper presents a Python-to-VHDL compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and translates it to VHDL. FPGA combines many benefits of both software and ASIC implementations. Like software, the programmed circuit is flexible, and can be reconfigured over the lifetime of the system. FPGAs have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. This can be achieved by using many computational resources at the same time. Creating parallel programs implemented in FPGAs in pure HDL is difficult and time consuming. Using a higher level of abstraction and a High-Level Synthesis compiler, implementation time can be reduced. The compiler has been implemented in the Python language. This article describes the design, implementation, and results of the tool created.
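The flavor of such a translation can be sketched with Python's `ast` module. This toy handles only a single two-input boolean function, and the VHDL it emits is schematic, far short of the compiler the paper describes:

```python
import ast

# Toy Python-to-VHDL translation step: parse a one-line boolean function and
# emit a VHDL entity/architecture pair for it. Illustration only; the real
# compiler handles loops, arithmetic, scheduling and much more.
OPS = {ast.BitAnd: "and", ast.BitOr: "or", ast.BitXor: "xor"}

def to_vhdl(src: str) -> str:
    fn = ast.parse(src).body[0]              # expects: def f(a, b): return a <op> b
    args = [a.arg for a in fn.args.args]
    expr = fn.body[0].value                  # the BinOp in the return statement
    op = OPS[type(expr.op)]
    ports = ";\n    ".join(f"{a} : in std_logic" for a in args)
    return (f"entity {fn.name} is\n  port (\n    {ports};\n"
            f"    y : out std_logic);\nend {fn.name};\n\n"
            f"architecture rtl of {fn.name} is\nbegin\n"
            f"  y <= {expr.left.id} {op} {expr.right.id};\nend rtl;")

vhdl = to_vhdl("def gate(a, b): return a & b")
```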

  14. SERKON program for compiling a multigroup library to be used in BETTY calculation

    International Nuclear Information System (INIS)

    Nguyen Phuoc Lan.

    1982-11-01

    A SERKON-type program was written to compile data sets generated by FEDGROUP-3 into a multigroup library for BETTY calculation. A multigroup library was generated from the ENDF/B-IV data file and tested against the TRX-1 and TRX-2 lattices with good results. (author)

  15. Automatic Parallelization An Overview of Fundamental Compiler Techniques

    CERN Document Server

    Midkiff, Samuel P

    2012-01-01

    Compiling for parallelism is a longstanding topic of compiler research. This book describes the fundamental principles of compiling "regular" numerical programs for parallelism. We begin with an explanation of analyses that allow a compiler to understand the interaction of data reads and writes in different statements and loop iterations during program execution. These analyses include dependence analysis, use-def analysis and pointer analysis. Next, we describe how the results of these analyses are used to enable transformations that make loops more amenable to parallelization, and
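One of the classic dependence analyses the book covers, the GCD test, fits in a few lines: for subscripts A[a*i + b] and A[c*j + d] inside a loop, the dependence equation a*i - c*j = d - b has an integer solution only if gcd(a, c) divides d - b (a standard textbook formulation, independent of this book's code):

```python
from math import gcd

# GCD dependence test: returns False when the two array references are
# provably independent, True when a dependence cannot be ruled out.
def gcd_test_may_depend(a, b, c, d):
    # Accesses A[a*i + b] and A[c*j + d]; assumes a and c are not both zero.
    return (d - b) % gcd(a, c) == 0

# A[2*i] vs A[2*j + 1]: even vs odd subscripts never collide.
assert gcd_test_may_depend(2, 0, 2, 1) is False
# A[2*i] vs A[4*j + 6]: gcd 2 divides 6, so a dependence may exist.
assert gcd_test_may_depend(2, 0, 4, 6) is True
```

When the test returns False the compiler may parallelize the loop; True only means further (e.g. Banerjee or polyhedral) analysis is needed.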

  16. Compiler Construction Using Java, JavaCC, and Yacc

    CERN Document Server

    Dos Reis, Anthony J

    2012-01-01

    Broad in scope, involving theory, the application of that theory, and programming technology, compiler construction is a moving target, with constant advances in compiler technology taking place. Today, a renewed focus on do-it-yourself programming makes a quality textbook on compilers, that both students and instructors will enjoy using, of even more vital importance. This book covers every topic essential to learning compilers from the ground up and is accompanied by a powerful and flexible software package for evaluating projects, as well as several tutorials, well-defined projects, and tes

  17. Automatic compilation from high-level biologically-oriented programming language to genetic regulatory networks.

    Science.gov (United States)

    Beal, Jacob; Lu, Ting; Weiss, Ron

    2011-01-01

    The field of synthetic biology promises to revolutionize our ability to engineer biological systems, providing important benefits for a variety of applications. Recent advances in DNA synthesis and automated DNA assembly technologies suggest that it is now possible to construct synthetic systems of significant complexity. However, while a variety of novel genetic devices and small engineered gene networks have been successfully demonstrated, the regulatory complexity of synthetic systems that have been reported recently has somewhat plateaued due to a variety of factors, including the complexity of biology itself and the lag in our ability to design and optimize sophisticated biological circuitry. To address the gap between DNA synthesis and circuit design capabilities, we present a platform that enables synthetic biologists to express desired behavior using a convenient high-level biologically-oriented programming language, Proto. The high level specification is compiled, using a regulatory motif based mechanism, to a gene network, optimized, and then converted to a computational simulation for numerical verification. Through several example programs we illustrate the automated process of biological system design with our platform, and show that our compiler optimizations can yield significant reductions in the number of genes (~ 50%) and latency of the optimized engineered gene networks. Our platform provides a convenient and accessible tool for the automated design of sophisticated synthetic biological systems, bridging an important gap between DNA synthesis and circuit design capabilities. Our platform is user-friendly and features biologically relevant compiler optimizations, providing an important foundation for the development of sophisticated biological systems.
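The kind of regulatory building block such motif-based compilation assembles can be illustrated with a textbook Hill-repression model; this is a generic motif simulation, not Proto's actual compiler output:

```python
# Textbook Hill-repression motif, a common building block of engineered gene
# networks:  dx/dt = beta / (1 + (R/K)**n) - gamma * x
# (beta: max expression rate, R: repressor level, K: binding constant,
#  n: Hill coefficient, gamma: degradation rate)
def simulate(R, beta=10.0, K=1.0, n=2, gamma=1.0, dt=1e-3, T=20.0):
    x = 0.0
    for _ in range(int(T / dt)):           # forward Euler integration
        x += dt * (beta / (1.0 + (R / K) ** n) - gamma * x)
    return x

unrepressed = simulate(R=0.0)   # settles near beta/gamma = 10
repressed = simulate(R=3.0)     # settles near beta/(gamma*(1+(R/K)**n)) = 1
```

A compiler optimization like the gene-count reduction reported above amounts to rewriting a network of such motifs into a smaller one with equivalent input/output behavior.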

  18. CAPS OpenACC Compilers: Performance and Portability

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    The announcement in late 2011 of the new OpenACC directive-based programming standard, supported by the CAPS, CRAY and PGI compilers, has opened the door to more scientific applications that can be ported to many-core systems. Following a porting methodology, this talk will first review the principles of programming with OpenACC and then the advanced features available in the CAPS compilers to further optimize OpenACC applications: library integration, tuning directives with auto-tune mechanisms to build applications adaptive to different GPUs. CAPS compilers use hardware vendors' backends such as NVIDIA CUDA and OpenCL, making them the only OpenACC compilers supporting various many-core architectures. About the speaker: Stéphane Bihan is co-founder and currently Director of Sales and Marketing at CAPS enterprise. He has held several R&D positions in companies such as ARC International plc in London, Canon Research Center France, ACE compiler experts in Amsterdam and the INRIA r...

  19. Bellman's GAP--a language and compiler for dynamic programming in sequence analysis.

    Science.gov (United States)

    Sauthoff, Georg; Möhl, Mathias; Janssen, Stefan; Giegerich, Robert

    2013-03-01

    Dynamic programming is ubiquitous in bioinformatics. Developing and implementing non-trivial dynamic programming algorithms is often error prone and tedious. Bellman's GAP is a new programming system, designed to ease the development of bioinformatics tools based on the dynamic programming technique. In Bellman's GAP, dynamic programming algorithms are described in a declarative style by tree grammars, evaluation algebras and products formed thereof. This bypasses the design of explicit dynamic programming recurrences and yields programs that are free of subscript errors, modular and easy to modify. The declarative modules are compiled into C++ code that is competitive to carefully hand-crafted implementations. This article introduces the Bellman's GAP system and its language, GAP-L. It then demonstrates the ease of development and the degree of re-use by creating variants of two common bioinformatics algorithms. Finally, it evaluates Bellman's GAP as an implementation platform of 'real-world' bioinformatics tools. Bellman's GAP is available under GPL license from http://bibiserv.cebitec.uni-bielefeld.de/bellmansgap. This Web site includes a repository of re-usable modules for RNA folding based on thermodynamics.
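The grammar/algebra separation at the heart of Bellman's GAP can be mimicked in miniature: one recursion over a search space (here, monotone paths in a cost grid, rather than a GAP-L tree grammar) is reused unchanged with interchangeable evaluation algebras:

```python
from functools import lru_cache

# One recursion ("grammar"), many scorings ("algebras"): the same DP over
# monotone grid paths counts paths or finds the cheapest one, depending on
# which algebra is plugged in. This mirrors the GAP-L style in plain Python.
def dp(n, m, cost, alg):
    @lru_cache(maxsize=None)
    def go(i, j):
        if i == 0 and j == 0:
            return alg["base"]
        cands = []
        if i > 0:
            cands.append(alg["extend"](go(i - 1, j), cost[i][j]))
        if j > 0:
            cands.append(alg["extend"](go(i, j - 1), cost[i][j]))
        return alg["combine"](cands)
    return go(n, m)

count_paths = dict(base=1, extend=lambda v, c: v, combine=sum)   # enumeration
min_cost = dict(base=0, extend=lambda v, c: v + c, combine=min)  # optimization

grid = [[0, 3], [1, 2]]
n_paths = dp(1, 1, grid, count_paths)    # 2 monotone paths in a 2x2 grid
best = dp(1, 1, grid, min_cost)          # cheapest path costs 1 + 2 = 3
```

Swapping algebras without touching the recursion is exactly what eliminates the subscript-error-prone rewriting of DP recurrences.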

  20. The Concert system - Compiler and runtime technology for efficient concurrent object-oriented programming

    Science.gov (United States)

    Chien, Andrew A.; Karamcheti, Vijay; Plevyak, John; Sahrawat, Deepak

    1993-01-01

    Concurrent object-oriented languages, particularly fine-grained approaches, reduce the difficulty of large scale concurrent programming by providing modularity through encapsulation while exposing large degrees of concurrency. Despite these programmability advantages, such languages have historically suffered from poor efficiency. This paper describes the Concert project whose goal is to develop portable, efficient implementations of fine-grained concurrent object-oriented languages. Our approach incorporates aggressive program analysis and program transformation with careful information management at every stage from the compiler to the runtime system. The paper discusses the basic elements of the Concert approach along with a description of the potential payoffs. Initial performance results and specific plans for system development are also detailed.

  1. VFC: The Vienna Fortran Compiler

    Directory of Open Access Journals (Sweden)

    Siegfried Benkner

    1999-01-01

    High Performance Fortran (HPF) offers an attractive high-level language interface for programming scalable parallel architectures, providing the user with directives for the specification of data distribution and delegating to the compiler the task of generating an explicitly parallel program. Available HPF compilers can handle regular codes quite efficiently, but dramatic performance losses may be encountered for applications which are based on highly irregular, dynamically changing data structures and access patterns. In this paper we introduce the Vienna Fortran Compiler (VFC), a new source-to-source parallelization system for HPF+, an optimized version of HPF, which addresses the requirements of irregular applications. In addition to extended data distribution and work distribution mechanisms, HPF+ provides the user with language features for specifying certain information that decisively influences a program's performance. This comprises data locality assertions, non-local access specifications and the possibility of reusing runtime-generated communication schedules of irregular loops. Performance measurements of kernels from advanced applications demonstrate that with a high-level data-parallel language such as HPF+ a performance close to hand-written message-passing programs can be achieved even for highly irregular codes.
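The schedule reuse mentioned for irregular loops follows the classic inspector/executor pattern: an inspector scans the irregular index array once to build a communication schedule, which the executor then reuses on every sweep. A generic sketch, not VFC's implementation:

```python
# Inspector/executor pattern for an irregular gather x[j] = data[index[j]].
# The inspector runs once per index array; the (cheap) executor runs every
# iteration, reusing the schedule as long as the indices do not change.
def inspector(index, owner, me):
    """Per remote processor, the indices this processor must fetch."""
    sched = {}
    for i in sorted(set(index)):
        p = owner(i)                     # owner() maps index -> processor
        if p != me:
            sched.setdefault(p, []).append(i)
    return sched

def executor(index, local, fetched):
    """One sweep of the irregular loop using prefetched remote values."""
    return [local[i] if i in local else fetched[i] for i in index]

# Block distribution of 4 elements per processor; processor 0 owns 0..3.
sched = inspector([0, 5, 2, 5, 1], owner=lambda i: i // 4, me=0)
swept = executor([0, 5, 2], local={0: 10, 1: 11, 2: 12}, fetched={5: 15})
```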

  2. Fiscal 2000 report on advanced parallelized compiler technology. Outlines; 2000 nendo advanced heiretsuka compiler gijutsu hokokusho (Gaiyo hen)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    Research and development was carried out on automatic parallelizing compiler technology, which improves the practical performance, cost/performance ratio, and ease of use of the multiprocessor systems now used for constructing supercomputers and expected to provide a fundamental architecture for microprocessors in the 21st century. Efforts were made to develop an automatic multigrain parallelization technology for extracting multigrain parallelism from a program and exploiting it fully, and a parallelizing tuning technology for accelerating parallelization by feeding back to the compiler the dynamic information and user knowledge acquired during execution. Moreover, a benchmark program was selected and studies were made to set execution rules and evaluation indexes for objectively evaluating the performance of parallelizing compilers for existing commercial parallel processing computers, achieved through the implementation and evaluation of the 'Advanced parallelizing compiler technology research and development project.' (NEDO)

  3. Compilation of functional languages using flow graph analysis

    NARCIS (Netherlands)

    Hartel, Pieter H.; Glaser, Hugh; Wild, John M.

    A system based on the notion of a flow graph is used to specify formally and to implement a compiler for a lazy functional language. The compiler takes a simple functional language as input and generates C. The generated C program can then be compiled, and loaded with an extensive run-time system to

  4. HAL/S-360 compiler test activity report

    Science.gov (United States)

    Helmers, C. T.

    1974-01-01

    The levels of testing employed in verifying the HAL/S-360 compiler were as follows: (1) typical applications program case testing; (2) functional testing of the compiler system and its generated code; and (3) machine oriented testing of compiler implementation on operational computers. Details of the initial test plan and subsequent adaptation are reported, along with complete test results for each phase which examined the production of object codes for every possible source statement.

  5. C to VHDL compiler

    Science.gov (United States)

    Berdychowski, Piotr P.; Zabolotny, Wojciech M.

    2010-09-01

    The main goal of the C to VHDL compiler project is to make the FPGA platform more accessible to scientists and software developers. The FPGA platform offers the unique ability to configure the hardware to implement virtually any dedicated architecture, and modern devices provide a sufficient number of hardware resources to implement parallel execution platforms with complex processing units. All this makes the FPGA platform very attractive for those looking for an efficient, heterogeneous computing environment. The current industry standard in the development of digital systems on the FPGA platform is based on HDLs. Although very effective and expressive in the hands of hardware development specialists, these languages require specific knowledge and experience that are out of reach for most scientists and software programmers. The C to VHDL compiler project attempts to remedy that by creating an application that derives an initial VHDL description of a digital system (for further compilation and synthesis) from a purely algorithmic description in the C programming language. This idea itself is not new, and the C to VHDL compiler combines the best approaches from existing solutions developed over many previous years, with the introduction of some new unique improvements.

  6. Python based high-level synthesis compiler

    Science.gov (United States)

    Cieszewski, Radosław; Pozniak, Krzysztof; Romaniuk, Ryszard

    2014-11-01

    This paper presents a Python-based High-Level Synthesis (HLS) compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and maps it to VHDL. FPGA combines many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible, and can be reconfigured over the lifetime of the system. FPGAs have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. Creating parallel programs implemented in FPGAs is not trivial. This article describes the design, implementation and first results of the compiler created.

  7. Semantics-Based Compiling: A Case Study in Type-Directed Partial Evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    , block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs......-directed compilation, in the spirit of Scott and Strachey. Our conclusion is that lambda-calculus normalization suffices for compiling by specializing an interpreter....

  9. A Compilation of Internship Reports - 2012

    Energy Technology Data Exchange (ETDEWEB)

    Stegman M.; Morris, M.; Blackburn, N.

    2012-08-08

    This compilation documents all research projects undertaken by the summer 2012 Department of Energy - Workforce Development for Teachers and Scientists interns during their internship program at Brookhaven National Laboratory.

  10. Bellman’s GAP—a language and compiler for dynamic programming in sequence analysis

    Science.gov (United States)

    Sauthoff, Georg; Möhl, Mathias; Janssen, Stefan; Giegerich, Robert

    2013-01-01

    Motivation: Dynamic programming is ubiquitous in bioinformatics. Developing and implementing non-trivial dynamic programming algorithms is often error prone and tedious. Bellman’s GAP is a new programming system, designed to ease the development of bioinformatics tools based on the dynamic programming technique. Results: In Bellman’s GAP, dynamic programming algorithms are described in a declarative style by tree grammars, evaluation algebras and products formed thereof. This bypasses the design of explicit dynamic programming recurrences and yields programs that are free of subscript errors, modular and easy to modify. The declarative modules are compiled into C++ code that is competitive to carefully hand-crafted implementations. This article introduces the Bellman’s GAP system and its language, GAP-L. It then demonstrates the ease of development and the degree of re-use by creating variants of two common bioinformatics algorithms. Finally, it evaluates Bellman’s GAP as an implementation platform of ‘real-world’ bioinformatics tools. Availability: Bellman’s GAP is available under GPL license from http://bibiserv.cebitec.uni-bielefeld.de/bellmansgap. This Web site includes a repository of re-usable modules for RNA folding based on thermodynamics. Contact: robert@techfak.uni-bielefeld.de Supplementary information: Supplementary data are available at Bioinformatics online PMID:23355290

  11. Compiler Technology for Parallel Scientific Computation

    Directory of Open Access Journals (Sweden)

    Can Özturan

    1994-01-01

Full Text Available There is a need for compiler technology that, given the source program, will generate efficient parallel codes for different architectures with minimal user involvement. Parallel computation is becoming indispensable in solving large-scale problems in science and engineering. Yet, the use of parallel computation is limited by the high costs of developing the needed software. To overcome this difficulty we advocate a comprehensive approach to the development of scalable architecture-independent software for scientific computation based on our experience with the equational programming language (EPL). Our approach is based on program decomposition, parallel code synthesis, and run-time support for parallel scientific computation. The program decomposition is guided by source program annotations provided by the user. The synthesis of parallel code is based on configurations that describe the overall computation as a set of interacting components. Run-time support is provided by compiler-generated code that redistributes computation and data during object program execution. The generated parallel code is optimized using techniques of data alignment, operator placement, wavefront determination, and memory optimization. In this article we discuss annotations, configurations, parallel code generation, and run-time support suitable for parallel programs written in the functional parallel programming language EPL and in Fortran.

  12. Program for accident and incident management support, AIMS

    International Nuclear Information System (INIS)

    Putra, M.A.

    1993-12-01

A prototype of an advisory computer program is presented which could be used in monitoring and analyzing an ongoing incident in a nuclear power plant. The advisory computer program, called the Accident and Incident Management Support (AIMS), focuses on processing a set of data that is to be transmitted from a nuclear power plant to a national or regional emergency center during an incident. The AIMS program will assess the reactor conditions by processing the measured plant parameters. The applied model of the power plant contains a level of complexity comparable with the simplified plant model that the power plant operator uses. A standardized decay heat function and a steam-water property library are used in the integral balance equations for mass and energy. A simulation of the station blackout accident of the Borssele plant is used to test the program. The program predicts successively: (1) the time of dryout of the steam generators, (2) the time of saturation of the primary system, and (3) the onset of core uncovery. The coolant system with the actual water levels is displayed on the screen. (orig./HP)
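The abstract does not reproduce the standardized decay heat function AIMS uses; as a hedged illustration of the kind of term that enters the energy balance, here is the classic Way-Wigner-style approximation (an assumption on our part, not necessarily the function in the report):

```python
# Illustrative decay heat fraction after shutdown, NOT the AIMS function:
# P(t)/P0 ~= 0.066 * (t**-0.2 - (t + T)**-0.2)
# with t = time since shutdown and T = prior operating time, both in seconds.
def decay_heat_fraction(t, T):
    """Approximate fraction of full power produced as decay heat."""
    return 0.066 * (t ** -0.2 - (t + T) ** -0.2)
```

Integrating such a function against the primary-side mass and energy balances is what lets a simplified model predict dryout, saturation, and core uncovery times.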

  13. An Initial Evaluation of the NAG f90 Compiler

    Directory of Open Access Journals (Sweden)

    Michael Metcalf

    1992-01-01

    Full Text Available A few weeks before the formal publication of the ISO Fortran 90 Standard, NAG announced the world's first f90 compiler. We have evaluated the compiler by using it to assess the impact of Fortran 90 on the CERN Program Library.

  14. Semantics-Based Compiling: A Case Study in Type-Directed Partial Evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    in the style of denotational semantics; – the output of the generated compiler is effectively three-address code, in the fashion and efficiency of the Dragon Book; – the generated compiler processes several hundred lines of source code per second. The source language considered in this case study is imperative......, block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs...... by specializing a definitional interpreter with respect to the program. Specialization is carried out using type-directed partial evaluation, which is a mild version of partial evaluation akin to lambda-calculus normalization. Our definitional interpreter follows the format of denotational semantics, with a clear...
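The first Futamura projection mentioned above can be sketched in a few lines (an illustrative Python toy, not the authors' type-directed partial evaluator): compiling a program amounts to specializing an interpreter with respect to that program.

```python
# Toy first Futamura projection: compile(p) = specialize(interpreter, p).
# Specialization is modelled crudely by closing the interpreter over a
# fixed program; a real partial evaluator would emit residual code.
def interpret(program, x):
    # program: list of ('add', n) / ('mul', n) instructions
    for op, n in program:
        x = x + n if op == 'add' else x * n
    return x

def specialize(interpreter, program):
    # the residual "compiled" version: the program argument is now fixed
    return lambda x: interpreter(program, x)

double_plus_one = specialize(interpret, [('mul', 2), ('add', 1)])
```

Type-directed partial evaluation, as used in the paper, performs this specialization by normalization rather than by mere closure creation, but the staging idea is the same.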

  15. Compiled MPI: Cost-Effective Exascale Applications Development

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G; Quinlan, D; Lumsdaine, A; Hoefler, T

    2012-04-10

    The complexity of petascale and exascale machines makes it increasingly difficult to develop applications that can take advantage of them. Future systems are expected to feature billion-way parallelism, complex heterogeneous compute nodes and poor availability of memory (Peter Kogge, 2008). This new challenge for application development is motivating a significant amount of research and development on new programming models and runtime systems designed to simplify large-scale application development. Unfortunately, DoE has significant multi-decadal investment in a large family of mission-critical scientific applications. Scaling these applications to exascale machines will require a significant investment that will dwarf the costs of hardware procurement. A key reason for the difficulty in transitioning today's applications to exascale hardware is their reliance on explicit programming techniques, such as the Message Passing Interface (MPI) programming model to enable parallelism. MPI provides a portable and high performance message-passing system that enables scalable performance on a wide variety of platforms. However, it also forces developers to lock the details of parallelization together with application logic, making it very difficult to adapt the application to significant changes in the underlying system. Further, MPI's explicit interface makes it difficult to separate the application's synchronization and communication structure, reducing the amount of support that can be provided by compiler and run-time tools. This is in contrast to the recent research on more implicit parallel programming models such as Chapel, OpenMP and OpenCL, which promise to provide significantly more flexibility at the cost of reimplementing significant portions of the application. We are developing CoMPI, a novel compiler-driven approach to enable existing MPI applications to scale to exascale systems with minimal modifications that can be made incrementally over

  16. A Language for Specifying Compiler Optimizations for Generic Software

    Energy Technology Data Exchange (ETDEWEB)

    Willcock, Jeremiah J. [Indiana Univ., Bloomington, IN (United States)

    2007-01-01

Compiler optimization is important to software performance, and modern processor architectures make optimization even more critical. However, many modern software applications use libraries providing high levels of abstraction. Such libraries often hinder effective optimization — the libraries are difficult to analyze using current compiler technology. For example, high-level libraries often use dynamic memory allocation and indirectly expressed control structures, such as iterator-based loops. Programs using these libraries often cannot achieve an optimal level of performance. On the other hand, software libraries have also been recognized as potentially aiding in program optimization. One proposed implementation of library-based optimization is to allow the library author, or a library user, to define custom analyses and optimizations. Only limited systems have been created to take advantage of this potential, however. One problem in creating a framework for defining new optimizations and analyses is how users are to specify them: implementing them by hand inside a compiler is difficult and prone to errors. Thus, a domain-specific language for library-based compiler optimizations would be beneficial. Many optimization specification languages have appeared in the literature, but they tend to be either limited in power or unnecessarily difficult to use. Therefore, I have designed, implemented, and evaluated the Pavilion language for specifying program analyses and optimizations, designed for library authors and users. These analyses and optimizations can be based on the implementation of a particular library, its use in a specific program, or on the properties of a broad range of types, expressed through concepts. The new system is intended to provide a high level of expressiveness, even though the intended users are unlikely to be compiler experts.

  17. Vectorization vs. compilation in query execution

    NARCIS (Netherlands)

    J. Sompolski (Juliusz); M. Zukowski (Marcin); P.A. Boncz (Peter)

    2011-01-01

Compiling database queries into executable (sub-)programs provides substantial benefits compared to traditional interpreted execution. Many of these benefits, such as reduced interpretation overhead, better instruction code locality, and providing opportunities to use SIMD
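The two execution styles being compared can be sketched as follows (illustrative Python, not the paper's implementation): tuple-at-a-time interpretation pays a dispatch overhead per tuple, while vectorized execution amortizes that overhead over a whole batch.

```python
# Tuple-at-a-time: one interpreter round-trip per tuple.
def select_gt_interpreted(rows, threshold):
    out = []
    for r in rows:                       # per-tuple dispatch overhead here
        if r > threshold:
            out.append(r)
    return out

# Vectorized: each primitive call processes a whole batch of tuples,
# amortizing interpretation overhead and improving code locality.
def select_gt_vectorized(rows, threshold, batch=1024):
    out = []
    for i in range(0, len(rows), batch):
        chunk = rows[i:i + batch]
        out.extend([r for r in chunk if r > threshold])
    return out
```

Query compilation goes one step further and generates a single fused loop per query fragment, which is what the paper weighs against vectorization.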

  18. Installation of a new Fortran compiler and effective programming method on the vector supercomputer

    International Nuclear Information System (INIS)

    Nemoto, Toshiyuki; Suzuki, Koichiro; Watanabe, Kenji; Machida, Masahiko; Osanai, Seiji; Isobe, Nobuo; Harada, Hiroo; Yokokawa, Mitsuo

    1992-07-01

The Fortran compiler, version 10, has been replaced with the new version 12 (V12) on the Fujitsu computer system at JAERI since May 1992. A benchmark test of the performance of the V12 compiler was carried out with 16 representative nuclear codes in advance of the installation of the compiler. The compiler achieves a performance improvement by a factor of 1.13 on average. The effect of the enhanced functions of the compiler and its compatibility with the nuclear codes are also examined. The assistant tool for vectorization, TOP10EX, has been developed. In this report, the results of the evaluation of the V12 compiler and the usage of the tools for vectorization are presented. (author)

  19. PGHPF – An Optimizing High Performance Fortran Compiler for Distributed Memory Machines

    Directory of Open Access Journals (Sweden)

    Zeki Bozkus

    1997-01-01

    Full Text Available High Performance Fortran (HPF is the first widely supported, efficient, and portable parallel programming language for shared and distributed memory systems. HPF is realized through a set of directive-based extensions to Fortran 90. It enables application developers and Fortran end-users to write compact, portable, and efficient software that will compile and execute on workstations, shared memory servers, clusters, traditional supercomputers, or massively parallel processors. This article describes a production-quality HPF compiler for a set of parallel machines. Compilation techniques such as data and computation distribution, communication generation, run-time support, and optimization issues are elaborated as the basis for an HPF compiler implementation on distributed memory machines. The performance of this compiler on benchmark programs demonstrates that high efficiency can be achieved executing HPF code on parallel architectures.
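A core task of such a compiler is mapping globally addressed array elements onto processors. A minimal sketch of a block data distribution, assuming ceiling-sized blocks (illustrative Python, not PGHPF internals; function names are invented for this example):

```python
# Block distribution of an n-element array over p processors, as an HPF
# compiler must compute for DISTRIBUTE(BLOCK)-style mappings.
def block_owner(i, n, p):
    """Processor that owns global index i."""
    size = -(-n // p)          # ceil(n / p): elements per block
    return i // size

def local_index(i, n, p):
    """Index of global element i within its owner's local block."""
    size = -(-n // p)
    return i % size
```

From this owner computation the compiler derives which loop iterations run locally and which array references require generated communication.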

  20. 1988 Bulletin compilation and index

    International Nuclear Information System (INIS)

    1989-02-01

This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1988 calendar year. A table of contents and one index have been provided to assist in finding information.

  1. 1988 Bulletin compilation and index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-02-01

    This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1988 calendar year. A table of contents and one index have been provided to assist in finding information.

  2. Engineering a compiler

    CERN Document Server

    Cooper, Keith D

    2012-01-01

As computing has changed, so has the role of both the compiler and the compiler writer. The proliferation of processors, environments, and constraints demands an equally large number of compilers. To adapt, compiler writers retarget code generators, add optimizations, and work on issues such as code space or power consumption. Engineering a Compiler re-balances the curriculum for an introductory course in compiler construction to reflect the issues that arise in today's practice. Authors Keith Cooper and Linda Torczon convey both the art and the science of compiler construction and show best-practice algorithms for the major problems inside a compiler. · Focuses on the back end of the compiler, reflecting the focus of research and development over the last decade. · Applies the well-developed theory behind scanning and parsing to introduce concepts that play a critical role in optimization and code generation. · Introduces the student to optimization through data-flow analysis, SSA form, and a selection of sc...

  3. Advanced Industrial Materials (AIM) Program: Annual progress report FY 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-04-01

In many ways, the Advanced Industrial Materials (AIM) Program underwent a major transformation in Fiscal Year 1995 and these changes have continued to the present. When the Program was established in 1990 as the Advanced Industrial Concepts (AIC) Materials Program, the mission was to conduct applied research and development to bring materials and processing technologies from the knowledge derived from basic research to the maturity required for the end use sectors for commercialization. In 1995, the Office of Industrial Technologies (OIT) made radical changes in structure and procedures. All technology development was directed toward the seven "Vision Industries" that use about 80% of industrial energy and generate about 90% of industrial wastes. The mission of AIM has, therefore, changed to "Support development and commercialization of new or improved materials to improve productivity, product quality, and energy efficiency in the major process industries." Though AIM remains essentially a National Laboratory Program, it is essential that each project have industrial partners, including suppliers to, and customers of, the seven industries. Now, well into FY 1996, the transition is nearly complete and the AIM Program remains reasonably healthy and productive, thanks to the superb investigators and Laboratory Program Managers. This Annual Report for FY 1995 contains the technical details of some very remarkable work by the best materials scientists and engineers in the world. Areas covered here are: advanced metals and composites; advanced ceramics and composites; polymers and biobased materials; and new materials and processes.

  4. ZettaBricks: A Language Compiler and Runtime System for Anyscale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Amarasinghe, Saman [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2015-03-27

    This grant supported the ZettaBricks and OpenTuner projects. ZettaBricks is a new implicitly parallel language and compiler where defining multiple implementations of multiple algorithms to solve a problem is the natural way of programming. ZettaBricks makes algorithmic choice a first class construct of the language. Choices are provided in a way that also allows our compiler to tune at a finer granularity. The ZettaBricks compiler autotunes programs by making both fine-grained as well as algorithmic choices. Choices also include different automatic parallelization techniques, data distributions, algorithmic parameters, transformations, and blocking. Additionally, ZettaBricks introduces novel techniques to autotune algorithms for different convergence criteria. When choosing between various direct and iterative methods, the ZettaBricks compiler is able to tune a program in such a way that delivers near-optimal efficiency for any desired level of accuracy. The compiler has the flexibility of utilizing different convergence criteria for the various components within a single algorithm, providing the user with accuracy choice alongside algorithmic choice. OpenTuner is a generalization of the experience gained in building an autotuner for ZettaBricks. OpenTuner is a new open source framework for building domain-specific multi-objective program autotuners. OpenTuner supports fully-customizable configuration representations, an extensible technique representation to allow for domain-specific techniques, and an easy to use interface for communicating with the program to be autotuned. A key capability inside OpenTuner is the use of ensembles of disparate search techniques simultaneously; techniques that perform well will dynamically be allocated a larger proportion of tests.

  5. Automated Instrumentation, Monitoring and Visualization of PVM Programs Using AIMS

    Science.gov (United States)

    Mehra, Pankaj; VanVoorst, Brian; Yan, Jerry; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

We present views and analysis of the execution of several PVM (Parallel Virtual Machine) codes for Computational Fluid Dynamics on networks of Sparcstations, including: (1) NAS Parallel Benchmarks CG and MG; (2) a multi-partitioning algorithm for NAS Parallel Benchmark SP; and (3) an overset grid flowsolver. These views and analysis were obtained using our Automated Instrumentation and Monitoring System (AIMS) version 3.0, a toolkit for debugging the performance of PVM programs. We will describe the architecture, operation and application of AIMS. The AIMS toolkit contains: (1) Xinstrument, which can automatically instrument various computational and communication constructs in message-passing parallel programs; (2) Monitor, a library of runtime trace-collection routines; (3) VK (Visual Kernel), an execution-animation tool with source-code clickback; and (4) Tally, a tool for statistical analysis of execution profiles. Currently, Xinstrument can handle C and Fortran 77 programs using PVM 3.2.x; Monitor has been implemented and tested on Sun 4 systems running SunOS 4.1.2; and VK uses X11R5 and Motif 1.2. Data and views obtained using AIMS clearly illustrate several characteristic features of executing parallel programs on networked workstations: (1) the impact of long message latencies; (2) the impact of multiprogramming overheads and associated load imbalance; (3) cache and virtual-memory effects; and (4) significant skews between workstation clocks. Interestingly, AIMS can compensate for constant skew (zero drift) by calibrating the skew between a parent and its spawned children. In addition, AIMS' skew-compensation algorithm can adjust timestamps in a way that eliminates physically impossible communications (e.g., messages going backwards in time). Our current efforts are directed toward creating new views to explain the observed performance of PVM programs.
Some of the features planned for the near future include: (1) ConfigView, showing the physical topology
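The skew-compensation idea can be sketched as follows (a hedged illustration; the abstract does not give AIMS's actual calibration algorithm): shift the receiver's clock by the smallest amount that makes every observed message latency non-negative, so no message appears to travel backwards in time.

```python
# Constant-skew compensation between one sender/receiver pair.
# events: list of (send_ts_on_sender_clock, recv_ts_on_receiver_clock)
def compensate_skew(events):
    # smallest receiver-clock shift making every latency non-negative
    shift = max(0, max(send - recv for send, recv in events))
    return [(send, recv + shift) for send, recv in events]
```

This handles only constant skew (zero drift), which matches the calibration between a parent and its spawned children described above; drifting clocks need a per-interval correction instead.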

  6. Regular expressions compiler and some applications

    International Nuclear Information System (INIS)

    Saldana A, H.

    1978-01-01

We deal with the high-level programming language of a Regular Expressions Compiler (REC). The first chapter is an introduction in which the history of the REC development and the problems related to its numerous applications are described. The syntactic and semantic rules as well as the language features are discussed just after the introduction. Concerning the applications, as examples, an adaptation is given in order to solve numerical problems and another for data manipulation. The last chapter is an exposition of ideas and techniques about compiler construction. Examples of the adaptation to numerical problems show applications to education, vector analysis, quantum mechanics, physics, mathematics and other sciences. The rudiments of an operating system for a minicomputer are the examples of the adaptation to symbolic data manipulation. REC is a programming language that could be applied to solve problems in almost any human activity. Handling of computer graphics, control equipment, research on languages, microprocessors and general research are some of the fields in which this programming language can be applied and developed. (author)

  7. abc: The AspectBench Compiler for AspectJ

    DEFF Research Database (Denmark)

    Allan, Chris; Avgustinov, Pavel; Christensen, Aske Simon

    2005-01-01

    abc is an extensible, optimising compiler for AspectJ. It has been designed as a workbench for experimental research in aspect-oriented programming languages and compilers. We outline a programme of research in these areas, and we review how abc can help in achieving those research goals...

  8. SVM Support in the Vienna Fortran Compilation System

    OpenAIRE

    Brezany, Peter; Gerndt, Michael; Sipkova, Viera

    1994-01-01

    Vienna Fortran, a machine-independent language extension to Fortran which allows the user to write programs for distributed-memory systems using global addresses, provides the forall-loop construct for specifying irregular computations that do not cause inter-iteration dependences. Compilers for distributed-memory systems generate code that is based on runtime analysis techniques and is only efficient if, in addition, aggressive compile-time optimizations are applied. Since these optimization...

  9. Advanced Industrial Materials (AIM) program. Annual progress report. FY 1996

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-04-01

The Advanced Industrial Materials (AIM) Program underwent a major transformation in Fiscal Year 1995 and these changes have continued to the present. When the Program was established in 1990 as the Advanced Industrial Concepts (AIC) Materials Program, the mission was to conduct applied research and development to bring materials and processing technologies from the knowledge derived from basic research to the maturity required for the end use sectors for commercialization. In 1995, the Office of Industrial Technologies (OIT) made radical changes in structure and procedures. All technology development was directed toward the seven "Vision Industries" that use about 80% of industrial energy and generate about 90% of industrial wastes. These are: aluminium; chemical; forest products; glass; metal casting; refineries; and steel. OIT is working with these industries, through appropriate organizations, to develop Visions of the desired condition of each industry some 20 or 25 years in the future and then to prepare Road Maps and Implementation Plans to enable them to reach their goals. The mission of AIM has, therefore, changed to "Support development and commercialization of new or improved materials to improve productivity, product quality, and energy efficiency in the major process industries." Though AIM remains essentially a National Laboratory Program, it is necessary that each project have industrial partners, including suppliers to, and customers of, the seven industries. Now, well into FY 1996, the transition is nearly complete and the AIM Program remains healthy and productive, thanks to the superb investigators and Laboratory Program Managers. Separate abstracts have been indexed into the energy database for articles from this report.

  10. Expert Programmer versus Parallelizing Compiler: A Comparative Study of Two Approaches for Distributed Shared Memory

    Directory of Open Access Journals (Sweden)

    M. F. P. O'Boyle

    1996-01-01

    Full Text Available This article critically examines current parallel programming practice and optimizing compiler development. The general strategies employed by compiler and programmer to optimize a Fortran program are described, and then illustrated for a specific case by applying them to a well-known scientific program, TRED2, using the KSR-1 as the target architecture. Extensive measurement is applied to the resulting versions of the program, which are compared with a version produced by a commercial optimizing compiler, KAP. The compiler strategy significantly outperforms KAP and does not fall far short of the performance achieved by the programmer. Following the experimental section each approach is critiqued by the other. Perceived flaws, advantages, and common ground are outlined, with an eye to improving both schemes.

  11. HAL/S-FC and HAL/S-360 compiler system program description

    Science.gov (United States)

    1976-01-01

    The compiler is a large multi-phase design and can be broken into four phases: Phase 1 inputs the source language and does a syntactic and semantic analysis generating the source listing, a file of instructions in an internal format (HALMAT) and a collection of tables to be used in subsequent phases. Phase 1.5 massages the code produced by Phase 1, performing machine independent optimization. Phase 2 inputs the HALMAT produced by Phase 1 and outputs machine language object modules in a form suitable for the OS-360 or FCOS linkage editor. Phase 3 produces the SDF tables. The four phases described are written in XPL, a language specifically designed for compiler implementation. In addition to the compiler, there is a large library containing all the routines that can be explicitly called by the source language programmer plus a large collection of routines for implementing various facilities of the language.
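The phase structure can be illustrated with a toy two-phase pipeline (purely illustrative Python; HAL/S itself is written in XPL and targets OS-360): phase 1 performs syntactic analysis into an internal form, and a later phase lowers that form into three-address-style instructions, analogous to HALMAT followed by code generation.

```python
import ast
import itertools

def phase1(src):
    # syntactic analysis into an internal tree (a stand-in for HALMAT)
    return ast.parse(src, mode='eval').body

def phase2(tree):
    # lower the internal form to three-address instructions
    code, fresh = [], itertools.count(1)
    def lower(node):
        if isinstance(node, ast.Constant):
            return str(node.value)
        if isinstance(node, ast.BinOp):
            a, b = lower(node.left), lower(node.right)
            t = f"t{next(fresh)}"
            op = {ast.Add: '+', ast.Mult: '*'}[type(node.op)]
            code.append(f"{t} = {a} {op} {b}")
            return t
        raise ValueError("unsupported expression")
    lower(tree)
    return code
```

A machine-independent optimization pass (the analogue of Phase 1.5) would sit between these two, rewriting the internal form before lowering.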

  12. Analysis of computer programming languages

    International Nuclear Information System (INIS)

    Risset, Claude Alain

    1967-01-01

This research thesis aims at identifying methods of syntax analysis that can be used for computer programming languages, while setting aside the hardware considerations that influence the choice of the programming language and of the methods of analysis and compilation. In a first part, the author proposes attempts at formalizing Chomsky grammar languages. In a second part, he studies analytical grammars, and then studies a compiler, or analytic grammar, for the Fortran language.

  13. Nuclear fuel cycle risk assessment: survey and computer compilation of risk-related literature

    International Nuclear Information System (INIS)

    Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.

    1982-10-01

    The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched and documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included

  14. Compiler Feedback using Continuous Dynamic Compilation during Development

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven; Probst, Christian W.

    2014-01-01

    to optimization. This tool can help programmers understand what the optimizing compiler has done and suggest automatic source code changes in cases where the compiler refrains from optimizing. We have integrated our tool into an integrated development environment, interactively giving feedback as part...

  15. Proceedings of the workshop on Compilation of (Symbolic) Languages for Parallel Computers

    Energy Technology Data Exchange (ETDEWEB)

    Foster, I.; Tick, E. (comp.)

    1991-11-01

This report comprises the abstracts and papers for the talks presented at the Workshop on Compilation of (Symbolic) Languages for Parallel Computers, held October 31--November 1, 1991, in San Diego. These unrefereed contributions were provided by the participants for the purpose of this workshop; many of them will be published elsewhere in peer-reviewed conferences and publications. Our goal in planning this workshop was to bring together researchers from different disciplines with common problems in compilation. In particular, we wished to encourage interaction between researchers working on compilation of symbolic languages and those working on compilation of conventional, imperative languages. The fundamental problems facing researchers interested in compilation of logic, functional, and procedural programming languages for parallel computers are essentially the same. However, differences in the basic programming paradigms have led different communities to emphasize different species of the parallel compilation problem. For example, parallel logic and functional languages provide dataflow-like formalisms in which control dependencies are unimportant. Hence, a major focus of research in compilation has been on techniques that try to infer when sequential control flow can safely be imposed. Granularity analysis for scheduling is a related problem. The single-assignment property leads to a need for analysis of memory use in order to detect opportunities for reuse. Much of the work in each of these areas relies on the use of abstract interpretation techniques.

  16. Solidify, An LLVM pass to compile LLVM IR into Solidity

    Energy Technology Data Exchange (ETDEWEB)

    2017-07-12

The software currently compiles LLVM IR into Solidity (Ethereum's dominant programming language) using LLVM's pass library. Specifically, this compiler allows us to convert an arbitrary DSL into Solidity. We focus specifically on converting domain-specific languages into Solidity due to their ease of use and provable properties. By creating a toolchain to compile lightweight domain-specific languages into Ethereum's dominant language, Solidity, we allow non-specialists to effectively develop safe and useful smart contracts. For example, lawyers from a firm could have a proprietary DSL that codifies basic laws safely converted to Solidity to be securely executed on the blockchain. In another example, a simple provenance-tracking language can be compiled and securely executed on the blockchain.

  17. Design of methodology for incremental compiler construction

    Directory of Open Access Journals (Sweden)

    Pavel Haluza

    2011-01-01

Full Text Available The paper deals with possibilities of incremental compiler construction. It presents compiler construction possibilities both for languages with a fixed set of lexical units and for languages with a variable set of lexical units. The methodology design for incremental compiler construction is based on the known algorithms for standard compiler construction and is derived for both groups of languages. The group of languages with a fixed set of lexical units comprises languages in which each lexical unit has a constant meaning, e.g., common programming languages. For this group, the paper addresses the problem of incremental semantic analysis, which is based on incremental parsing. In the group of languages with a variable set of lexical units (e.g., the professional typographic system TeX), it is possible to change arbitrarily the meaning of each character of the input file at any time during processing. The change takes effect immediately and its validity can be somehow limited or is given by the end of the input. For this group, the paper addresses the case in which macros temporarily change the category of arbitrary characters.
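The variable-category situation can be sketched with a toy scanner whose character categories are consulted afresh on every character, so changing the mapping mid-run changes how subsequent input is tokenized (illustrative Python in the spirit of TeX's category codes; names and token shapes are assumptions, not from the paper):

```python
# A scanner driven by a mutable character-category table.
# categories: mapping char -> 'letter' | 'escape' | 'other'
def scan(text, categories):
    tokens, i = [], 0
    while i < len(text):
        c = text[i]
        if categories.get(c) == 'escape':
            # an escape character starts a control word made of letters
            j = i + 1
            while j < len(text) and categories.get(text[j]) == 'letter':
                j += 1
            tokens.append(('control', text[i + 1:j]))
            i = j
        else:
            tokens.append((categories.get(c, 'other'), c))
            i += 1
    return tokens
```

Because the table is consulted per character, reassigning a category between two scans (or, in a real TeX-like system, mid-scan via a macro) immediately changes the lexical analysis, which is what defeats naive incremental compilation.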

  18. Advanced Industrial Materials (AIM) Program annual progress report, FY 1997

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-05-01

    The Advanced Industrial Materials (AIM) Program is a part of the Office of Industrial Technologies (OIT), Energy Efficiency and Renewable Energy, US Department of Energy (DOE). The mission of AIM is to support development and commercialization of new or improved materials to improve energy efficiency, productivity, and product quality, and to reduce waste in the major process industries. OIT has embarked on a fundamentally new way of working with industries--the Industries of the Future (IOF) strategy--concentrating on the major process industries that consume about 90% of the energy and generate about 90% of the waste in the industrial sector. These are the aluminum, chemical, forest products, glass, metalcasting, and steel industries. OIT has encouraged and assisted these industries in developing visions of what they will be like 20 or 30 years into the future, defining the drivers, technology needs, and barriers to realization of their visions. These visions provide a framework for development of technology roadmaps and implementation plans, some of which have been completed. The AIM Program supports IOF by conducting research and development on materials to solve problems identified in the roadmaps. This is done by National Laboratory/industry/university teams with the facilities and expertise needed to develop new and improved materials. Each project in the AIM Program has active industrial participation and support.

  19. Asian collaboration on nuclear reaction data compilation

    International Nuclear Information System (INIS)

    Aikawa, Masayuki; Furutachi, Naoya; Kato, Kiyoshi; Makinaga, Ayano; Devi, Vidya; Ichinkhorloo, Dagvadorj; Odsuren, Myagmarjav; Tsubakihara, Kohsuke; Katayama, Toshiyuki; Otuka, Naohiko

    2013-01-01

    Nuclear reaction data are essential for research and development in nuclear engineering, radiation therapy, nuclear physics and astrophysics. Experimental data must be compiled in a database and be accessible to nuclear data users. One of the nuclear reaction databases is the EXFOR database maintained by the International Network of Nuclear Reaction Data Centres (NRDC) under the auspices of the International Atomic Energy Agency. Recently, collaboration among the Asian NRDC members has been further developed under the support of the Asia-Africa Science Platform Program of the Japan Society for the Promotion of Science. We report the activity over three years to develop the Asian collaboration on nuclear reaction data compilation. (author)

  20. Compilation Techniques Specific for a Hardware Cryptography-Embedded Multimedia Mobile Processor

    Directory of Open Access Journals (Sweden)

    Masa-aki FUKASE

    2007-12-01

    Full Text Available The development of single-chip VLSI processors is the key technology of ever-growing pervasive computing to answer overall demands for usability, mobility, speed, security, etc. We have so far developed a hardware cryptography-embedded multimedia mobile processor architecture, HCgorilla. Since HCgorilla integrates a wide range of techniques from architectures to applications and languages, a one-sided design approach is not always useful. HCgorilla needs a more complicated strategy, that is, hardware/software (H/S) codesign. Thus, we exploit the software support of HCgorilla, composed of a Java interface and parallelizing compilers. They are assumed to be installed in servers in order to reduce the load and increase the performance of HCgorilla-embedded clients. Since compilers are the essence of software's responsibility, we focus in this article on our recent results on the design, specifications, and prototyping of parallelizing compilers for HCgorilla. The parallelizing compilers are composed of a multicore compiler and a LIW compiler. They are specified to extract parallelism from executable serial codes or the Java interface output, and to emit codes executable in parallel by HCgorilla. The prototype compilers are written in Java. Evaluation using an arithmetic test program shows the reasonableness of the prototype compilers compared with hand compilation.

  1. Fusing a Transformation Language with an Open Compiler

    NARCIS (Netherlands)

    Kalleberg, K.T.; Visser, E.

    2007-01-01

    Program transformation systems provide powerful analysis and transformation frameworks as well as concise languages for language processing, but instantiating them for every subject language is an arduous task, most often resulting in half-completed frontends. Compilers provide mature frontends with

  2. GRESS, FORTRAN Pre-compiler with Differentiation Enhancement

    International Nuclear Information System (INIS)

    1999-01-01

    1 - Description of program or function: The GRESS FORTRAN pre-compiler (SYMG) and run-time library are used to enhance conventional FORTRAN-77 programs with analytic differentiation of arithmetic statements for automatic differentiation in either forward or reverse mode. GRESS 3.0 is functionally equivalent to GRESS 2.1. GRESS 2.1 is an improved and updated version of the previously released GRESS 1.1. Improvements in the implementation of the CHAIN option have resulted in a 70 to 85% reduction in execution time and up to a 50% reduction in memory required for forward chaining applications. 2 - Method of solution: GRESS uses a pre-compiler to analyze FORTRAN statements and determine the mathematical operations embodied in them. As each arithmetic assignment statement in a program is analyzed, SYMG generates the partial derivatives of the term on the left with respect to each floating-point variable on the right. The result of the pre-compilation step is a new FORTRAN program that can produce derivatives for any REAL (i.e., single or double precision) variable calculated by the model. Consequently, GRESS enhances FORTRAN programs or subprograms by adding the calculation of derivatives along with the original output. Derivatives from a GRESS-enhanced model can be used internally (e.g., iteration acceleration) or externally (e.g., sensitivity studies). By calling GRESS run-time routines, derivatives can be propagated through the code via the chain rule (referred to as the CHAIN option) or accumulated to create an adjoint matrix (referred to as the ADGEN option). A third option, GENSUB, makes it possible to process a subset of a program (i.e., a do loop, subroutine, function, a sequence of subroutines, or a whole program) for calculating derivatives of dependent variables with respect to independent variables. A code enhanced with the GENSUB option can use forward mode, reverse mode, or a hybrid of the two modes.
3 - Restrictions on the complexity of the problem: GRESS
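The per-statement partial derivatives that GRESS propagates via the chain rule can be illustrated with a minimal forward-mode automatic differentiation sketch using dual numbers. This shows the general technique only, not the GRESS pre-compiler itself (which rewrites FORTRAN source):

```python
# Minimal forward-mode automatic differentiation via dual numbers: each
# value carries its derivative, and every arithmetic operation applies
# the chain rule to both components.
class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: d(uv) = u'v + uv'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1    # f'(x) = 6x + 2

x = Dual(2.0, 1.0)                  # seed dx/dx = 1
y = f(x)                            # y.value = f(2) = 17, y.deriv = f'(2) = 14
```

A source-transformation tool like GRESS achieves the same effect statically, by emitting the derivative statements alongside the original assignments instead of overloading operators at run time.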

  3. 1991 OCRWM bulletin compilation and index

    International Nuclear Information System (INIS)

    1992-05-01

    The OCRWM Bulletin is published by the Department of Energy, Office of Civilian Radioactive Waste Management, to provide current information about the national program for managing spent fuel and high-level radioactive waste. The document is a compilation of issues from the 1991 calendar year. A table of contents and an index have been provided to reference information contained in this year's Bulletins

  4. Compiler-Enforced Cache Coherence Using a Functional Language

    Directory of Open Access Journals (Sweden)

    Rich Wolski

    1996-01-01

    Full Text Available The cost of hardware cache coherence, both in terms of execution delay and operational cost, is substantial for scalable systems. Fortunately, compiler-generated cache management can reduce program serialization due to cache contention; increase execution performance; and reduce the cost of parallel systems by eliminating the need for more expensive hardware support. In this article, we use the Sisal functional language system as a vehicle to implement and investigate automatic, compiler-based cache management. We describe our implementation of Sisal for the IBM Power/4. The Power/4, briefly available as a product, represents an early attempt to build a shared memory machine that relies strictly on the language system for cache coherence. We discuss the issues associated with deterministic execution and program correctness on a system without hardware coherence, and demonstrate how Sisal (as a functional language) is able to address those issues.

  5. The Katydid system for compiling KEE applications to Ada

    Science.gov (United States)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    Components of a system known as Katydid are developed in an effort to compile knowledge-based systems developed in a multimechanism integrated environment (KEE) to Ada. The Katydid core is an Ada library supporting KEE object functionality, and the other elements include a rule compiler, a LISP-to-Ada translator, and a knowledge-base dumper. Katydid employs translation mechanisms that convert LISP knowledge structures and rules to Ada and utilizes basic prototypes of a run-time KEE object-structure library module for Ada. Preliminary results include the semiautomatic compilation of portions of a simple expert system to run in an Ada environment with the described algorithms. It is suggested that Ada can be employed for AI programming and implementation, and the Katydid system is being developed to include concurrency and synchronization mechanisms.

  6. Ada Integrated Environment III Computer Program Development Specification. Volume III. Ada Optimizing Compiler.

    Science.gov (United States)

    1981-12-01

    [Abstract garbled during extraction. The recoverable fragments describe compiler library file naming conventions (Symbol Map library-file.library-unit{.subunit}.SYMAP, Statement Map .SMAP, Type Map .TMAP), the code generator components that produce each map, the PUNIT command, and an example compiler command stream for the code generator of the Texas Instruments Ada Optimizing Compiler.]

  7. Programming time-multiplexed reconfigurable hardware using a scalable neuromorphic compiler.

    Science.gov (United States)

    Minkovich, Kirill; Srinivasa, Narayan; Cruz-Albrecht, Jose M; Cho, Youngkwan; Nogin, Aleksey

    2012-06-01

    Scalability and connectivity are two key challenges in designing neuromorphic hardware that can match biological levels. In this paper, we describe a neuromorphic system architecture design that addresses these challenges using traditional complementary metal-oxide-semiconductor (CMOS) hardware. A key requirement in realizing such neural architectures in hardware is the ability to automatically configure the hardware to emulate any neural architecture or model. The focus of this paper is to describe the details of such a programmable front-end. This programmable front-end is composed of a neuromorphic compiler and a digital memory, and is designed based on the concept of synaptic time-multiplexing (STM). The neuromorphic compiler automatically translates any given neural architecture to hardware switch states, and these states are stored in digital memory to realize the desired neural architectures. STM enables our proposed architecture to address scalability and connectivity using traditional CMOS hardware. We describe the details of the proposed design and the programmable front-end, and provide examples to illustrate its capabilities. We also provide perspectives for future extensions and potential applications.
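The idea behind synaptic time-multiplexing can be sketched abstractly: a connectivity list larger than the physical fan-in is partitioned into time slots, each of which fits the hardware. The function below is an illustrative simplification with invented names and limits, not the paper's compiler:

```python
# Illustrative sketch of synaptic time-multiplexing: partition a synapse
# list into time slots so that each slot respects a limited physical
# fan-in per post-synaptic neuron (greedy first-fit; names are invented).
def time_multiplex(synapses, fan_in_per_slot):
    """synapses: list of (pre, post) pairs. Returns a list of slots, each
    a list of pairs with at most fan_in_per_slot inputs per post neuron."""
    slots = []
    for pre, post in synapses:
        for slot in slots:
            if sum(1 for _, p in slot if p == post) < fan_in_per_slot:
                slot.append((pre, post))
                break
        else:
            slots.append([(pre, post)])
    return slots

conns = [(0, 2), (1, 2), (3, 2), (0, 3)]   # neuron 2 has fan-in 3
plan = time_multiplex(conns, fan_in_per_slot=2)
```

Each slot would correspond to one configuration of the hardware switch states; cycling through the slots realizes the full connectivity over time.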

  8. Discussion on the Criterion for the Safety Certification Basis Compilation - Brazilian Space Program Case

    Science.gov (United States)

    Niwa, M.; Alves, N. C.; Caetano, A. O.; Andrade, N. S. O.

    2012-01-01

    The recent advent of the commercial launch and re- entry activities, for promoting the expansion of human access to space for tourism and hypersonic travel, in the already complex ambience of the global space activities, brought additional difficulties over the development of a harmonized framework of international safety rules. In the present work, with the purpose of providing some complementary elements for global safety rule development, the certification-related activities conducted in the Brazilian space program are depicted and discussed, focusing mainly on the criterion for certification basis compilation. The results suggest that the composition of a certification basis with the preferential use of internationally-recognized standards, as is the case of ISO standards, can be a first step toward the development of an international safety regulation for commercial space activities.

  9. ERES: A PC program for nuclear data compilation in EXFOR format

    International Nuclear Information System (INIS)

    Li Shubing; Liang Qichang; Liu Tingin

    1994-01-01

    This document describes the use of the personal computer software package ERES for compilation of experimental nuclear reaction data in the internationally agreed EXFOR format. The software is available upon request from the IAEA Nuclear Data Section. (author)

  10. ERES: A PC program for nuclear data compilation in EXFOR format

    Energy Technology Data Exchange (ETDEWEB)

    Shubing, Li [NanKai University, Tianjin (China); Qichang, Liang; Tingin, Liu [Chinese Nuclear Data Center, Institute of Atomic Energy, Beijing (China)

    1994-02-01

    This document describes the use of the personal computer software package ERES for compilation of experimental nuclear reaction data in the internationally agreed EXFOR format. The software is available upon request from the IAEA Nuclear Data Section. (author)

  11. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    1981-03-01

    A request list for nuclear data which was produced from a computerized data file by the National Nuclear Data Center is presented. The request list is given by target nucleus (isotope) and then reaction type. The purpose of the compilation is to summarize the current needs of US Nuclear Energy programs and other applied technologies for nuclear data. Requesters are identified by laboratory, last name, and sponsoring US government agency

  12. An Extensible Open-Source Compiler Infrastructure for Testing

    Energy Technology Data Exchange (ETDEWEB)

    Quinlan, D; Ur, S; Vuduc, R

    2005-12-09

    Testing forms a critical part of the development process for large-scale software, and there is a growing need for automated tools that can read, represent, analyze, and transform the application's source code to help carry out testing tasks. However, the support required to compile applications written in common general purpose languages is generally inaccessible to the testing research community. In this paper, we report on an extensible, open-source compiler infrastructure called ROSE, which is currently in development at Lawrence Livermore National Laboratory. ROSE specifically targets developers who wish to build source-based tools that implement customized analyses and optimizations for large-scale C, C++, and Fortran90 scientific computing applications (on the order of a million lines of code or more). However, much of this infrastructure can also be used to address problems in testing, and ROSE is by design broadly accessible to those without a formal compiler background. This paper details the interactions between testing of applications and the ways in which compiler technology can aid in the understanding of those applications. We emphasize the particular aspects of ROSE, such as support for the general analysis of whole programs, that are particularly well-suited to the testing research community and the scale of the problems that community solves.

  13. Advanced compiler design and implementation

    CERN Document Server

    Muchnick, Steven S

    1997-01-01

    From the Foreword by Susan L. Graham: This book takes on the challenges of contemporary languages and architectures, and prepares the reader for the new compiling problems that will inevitably arise in the future. The definitive book on advanced compiler design This comprehensive, up-to-date work examines advanced issues in the design and implementation of compilers for modern processors. Written for professionals and graduate students, the book guides readers in designing and implementing efficient structures for highly optimizing compilers for real-world languages. Covering advanced issues in fundamental areas of compiler design, this book discusses a wide array of possible code optimizations, determining the relative importance of optimizations, and selecting the most effective methods of implementation. * Lays the foundation for understanding the major issues of advanced compiler design * Treats optimization in-depth * Uses four case studies of commercial compiling suites to illustrate different approache...

  14. Engineering Amorphous Systems, Using Global-to-Local Compilation

    Science.gov (United States)

    Nagpal, Radhika

    Emerging technologies are making it possible to assemble systems that incorporate myriads of information-processing units at almost no cost: smart materials, self-assembling structures, vast sensor networks, pervasive computing. How does one engineer robust and prespecified global behavior from the local interactions of immense numbers of unreliable parts? We discuss organizing principles and programming methodologies that have emerged from Amorphous Computing research that allow us to compile a specification of global behavior into a robust program for local behavior.

  15. Using the RE-AIM framework to evaluate physical activity public health programs in México.

    Science.gov (United States)

    Jauregui, Edtna; Pacheco, Ann M; Soltero, Erica G; O'Connor, Teresia M; Castro, Cynthia M; Estabrooks, Paul A; McNeill, Lorna H; Lee, Rebecca E

    2015-02-19

    Physical activity (PA) public health programming has been widely used in Mexico; however, few studies have documented individual and organizational factors that might be used to evaluate their public health impact. The RE-AIM framework is an evaluation tool that examines individual and organizational factors of public health programs. The purpose of this study was to use the RE-AIM framework to determine the degree to which PA programs in Mexico reported individual and organizational factors and to investigate whether reporting differed by the program's funding source. Public health programs promoting PA were systematically identified during 2008-2013 and had to have an active program website. Initial searches produced 23 possible programs with 12 meeting inclusion criteria. A coding sheet was developed to capture behavioral, outcome and RE-AIM indicators from program websites. In addition to targeting PA, five (42%) programs also targeted dietary habits and the most commonly reported outcome was change in body composition (58%). Programs reported an average of 11.1 (±3.9) RE-AIM indicator items (out of 27 total). On average, 45% reported reach indicators, 34% reported efficacy/effectiveness indicators, 60% reported adoption indicators, 40% reported implementation indicators, and 35% reported maintenance indicators. The proportion of RE-AIM indicators reported did not differ significantly for programs that were government supported (M = 10, SD = 3.1) and programs that were partially or wholly privately or corporately supported (M = 12.0, SD = 4.4). While reach and adoption of these programs were most commonly reported, there is a need for stronger evaluation of behavioral and health outcomes before the public health impact of these programs can be established.

  16. Advanced Industrial Materials (AIM) Program. Annual progress report, FY 1994

    Energy Technology Data Exchange (ETDEWEB)

    Sorrell, C.A.

    1995-05-01

    The Advanced Industrial Materials Program is a part of the Office of Industrial Technologies (OIT), Energy Efficiency and Renewable Energy in the Department of Energy. The mission of the AIM Program is to conduct applied research, development, and applications engineering work, in partnership with industry, to commercialize new or improved materials and materials processing methods that will improve energy efficiency, productivity, and competitiveness. AIM is responsible for identifying, supporting, and coordinating multidisciplinary projects to solve identified industrial needs and transferring the technology to the industrial sector. Program investigators in the DOE National Laboratories are working closely with approximately 100 companies, including 15 partners in Cooperative Research and Development Agreements. Work is being done in a wide variety of materials technologies, including intermetallic alloys, ceramic composites, metal composites, polymers, engineered porous materials, and surface modification. The Program supports other efforts in the Office of Industrial Technologies to assist the energy consuming process industries, including forest products, glass, steel, aluminum, foundries, chemicals, and refineries. To support OIT's "Industries of the Future" initiatives and to improve the relevance of materials research, assessments of materials needs and opportunities in the process industries are being made. These assessments are being used for program planning and priority setting; support of work to satisfy those needs is being provided. Many new materials that have come into the marketplace in recent years, or that will be available for commercial use within a few more years, offer substantial benefits to society. This document contains 28 reports on advanced materials research. Individual reports have been processed separately for entry onto the Department of Energy databases.

  17. Compiling gate networks on an Ising quantum computer

    International Nuclear Information System (INIS)

    Bowdrey, M.D.; Jones, J.A.; Knill, E.; Laflamme, R.

    2005-01-01

    Here we describe a simple mechanical procedure for compiling a quantum gate network into the natural gates (pulses and delays) for an Ising quantum computer. The aim is not necessarily to generate the most efficient pulse sequence, but rather to develop an efficient compilation algorithm that can be easily implemented in large spin systems. The key observation is that it is not always necessary to refocus all the undesired couplings in a spin system. Instead, the coupling evolution can simply be tracked and then corrected at some later time. Although described within the language of NMR, the algorithm is applicable to any design of quantum computer based on Ising couplings
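The key observation above (track undesired coupling evolution instead of refocusing it at every step) can be caricatured in a few lines. This toy sketch just accumulates the ZZ phase each coupled pair accrues and computes the single correction that restores a multiple of 2π at the end; it is an illustration of the bookkeeping idea, not the paper's compilation algorithm:

```python
import math

# Toy sketch of coupling tracking: let every coupled pair accrue Ising (ZZ)
# phase freely during each evolution step, then compute one correction at
# the end instead of refocusing after every step (units of coupling strength).
def compile_with_tracking(steps, pairs):
    """steps: list of evolution durations; pairs: coupled qubit pairs.
    Returns the accumulated phase per pair and the correction that brings
    each accumulated phase to a multiple of 2*pi."""
    phase = {p: 0.0 for p in pairs}
    for dt in steps:
        for p in pairs:
            phase[p] += dt
    correction = {p: (-phase[p]) % (2 * math.pi) for p in pairs}
    return phase, correction

phase, correction = compile_with_tracking([1.0, 2.5], [("q0", "q1")])
```

In a real pulse sequence the correction would be realized by delays and refocusing pulses placed once, at a convenient point, rather than interleaved throughout.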

  18. A Journey from Interpreters to Compilers and Virtual Machines

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2003-01-01

    We review a simple sequence of steps to stage a programming-language interpreter into a compiler and virtual machine. We illustrate the applicability of this derivation with a number of existing virtual machines, mostly for functional languages. We then outline its relevance for todays language...

  19. Compiling knowledge-based systems from KEE to Ada

    Science.gov (United States)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    The dominant technology for developing AI applications is to work in a multi-mechanism, integrated, knowledge-based system (KBS) development environment. Unfortunately, systems developed in such environments are inappropriate for delivering many applications - most importantly, they carry the baggage of the entire Lisp environment and are not written in conventional languages. One resolution of this problem would be to compile applications from complex environments to conventional languages. Described here are the first efforts to develop a system for compiling KBSs developed in KEE to Ada (trademark). This system is called KATYDID, for KEE/Ada Translation Yields Development Into Delivery. KATYDID includes early prototypes of a run-time KEE core (object-structure) library module for Ada, and translation mechanisms for knowledge structures, rules, and Lisp code to Ada. Using these tools, part of a simple expert system was compiled (not quite automatically) to run in a purely Ada environment. This experience has given us various insights on Ada as an artificial intelligence programming language, potential solutions to some of the engineering difficulties encountered in early work, and inspiration for future system development.

  20. Combining Compile-Time and Run-Time Parallelization

    Directory of Open Access Journals (Sweden)

    Sungdo Moon

    1999-01-01

    Full Text Available This paper demonstrates that significant improvements to automatic parallelization technology require that existing systems be extended in two ways: (1) they must combine high-quality compile-time analysis with low-cost run-time testing; and (2) they must take control flow into account during analysis. We support this claim with the results of an experiment that measures the safety of parallelization at run time for loops left unparallelized by the Stanford SUIF compiler's automatic parallelization system. We present results of measurements on programs from two benchmark suites - SPECFP95 and NAS sample benchmarks - which identify inherently parallel loops in these programs that are missed by the compiler. We characterize remaining parallelization opportunities, and find that most of the loops require run-time testing, analysis of control flow, or some combination of the two. We present a new compile-time analysis technique that can be used to parallelize most of these remaining loops. This technique is designed not only to improve the results of compile-time parallelization, but also to produce low-cost, directed run-time tests that allow the system to defer binding of parallelization until run time when safety cannot be proven statically. We call this approach predicated array data-flow analysis. We augment array data-flow analysis, which the compiler uses to identify independent and privatizable arrays, by associating predicates with array data-flow values. Predicated array data-flow analysis allows the compiler to derive "optimistic" data-flow values guarded by predicates; these predicates can be used to derive a run-time test guaranteeing the safety of parallelization.
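The flavor of a generated run-time safety test can be sketched as follows: the compiler cannot prove statically that the read and write regions of a copy loop are disjoint, so it emits a cheap interval check and defers the parallel/serial decision to run time. This is an invented miniature, not the SUIF implementation:

```python
# Illustrative sketch of a compiler-generated run-time test: parallelize a
# loop only if the write region and read region are provably disjoint at
# run time (interval check; names and structure are invented).
def regions_disjoint(write_lo, write_hi, read_lo, read_hi):
    # the generated "predicate": safe iff [write_lo, write_hi] and
    # [read_lo, read_hi] do not overlap
    return write_hi < read_lo or read_hi < write_lo

def copy_loop(a, dst_base, src_base, n):
    safe = regions_disjoint(dst_base, dst_base + n - 1,
                            src_base, src_base + n - 1)
    # in a real system the safe branch would run the iterations in
    # parallel; here we only record which version was selected
    for i in range(n):
        a[dst_base + i] = a[src_base + i]
    return "parallel" if safe else "serial"

a = list(range(10))
mode = copy_loop(a, 0, 5, 5)   # writes [0..4], reads [5..9]: disjoint
```

The predicate is derived once at compile time from the array data-flow values; only its evaluation (a few comparisons) is paid at run time.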

  1. Compilation of the nuclear codes available in CTA

    International Nuclear Information System (INIS)

    D'Oliveira, A.B.; Moura Neto, C. de; Amorim, E.S. do; Ferreira, W.J.

    1979-07-01

    The present work is a compilation of some nuclear codes available in the Divisao de Estudos Avancados of the Instituto de Atividades Espaciais, (EAV/IAE/CTA). The codes are organized as the classification given by the Argonne National Laboratory. In each code are given: author, institution of origin, abstract, programming language and existent bibliography. (Author) [pt

  2. ACE - an algebraic compiler and encoder for the Chalk River datatron computer

    International Nuclear Information System (INIS)

    Kennedy, J.M.; Okazaki, E.A.; Millican, M.

    1960-03-01

    ACE is a program written for the Chalk River Datatron (Burroughs 205) Computer to enable the machine to compile a program for solving a problem from instructions supplied by the user in a notation related much more closely to algebra than to the machine's own code. (author)

  3. HAL/S-FC compiler system specifications

    Science.gov (United States)

    1976-01-01

    This document specifies the informational interfaces within the HAL/S-FC compiler, and between the compiler and the external environment. This Compiler System Specification is for the HAL/S-FC compiler and its associated run time facilities which implement the full HAL/S language. The HAL/S-FC compiler is designed to operate stand-alone on any compatible IBM 360/370 computer and within the Software Development Laboratory (SDL) at NASA/JSC, Houston, Texas.

  4. A Class-Specific Optimizing Compiler

    Directory of Open Access Journals (Sweden)

    Michael D. Sharp

    1993-01-01

    Full Text Available Class-specific optimizations are compiler optimizations specified by the class implementor to the compiler. They allow the compiler to take advantage of the semantics of the particular class so as to produce better code. Optimizations of interest include the strength reduction of class::array address calculations, elimination of large temporaries, and the placement of asynchronous send/recv calls so as to achieve computation/communication overlap. We will outline our progress towards the implementation of a C++ compiler capable of incorporating class-specific optimizations.
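Strength reduction of array address calculations, mentioned above, is the classic transformation of replacing a per-iteration multiply with a running add. A minimal illustration (in Python for clarity; the abstract's compiler targets C++):

```python
# Classic strength reduction: the address calculation base + i*stride is
# replaced by an accumulator that adds stride each iteration, trading a
# multiply per element for an add.
def addresses_naive(base, stride, n):
    return [base + i * stride for i in range(n)]      # one multiply per element

def addresses_reduced(base, stride, n):
    out, addr = [], base
    for _ in range(n):
        out.append(addr)
        addr += stride                                 # multiply replaced by add
    return out
```

A class-specific variant lets the class author tell the compiler that, say, `Matrix::operator()` addresses follow this affine pattern, so the same reduction applies across the abstraction boundary.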

  5. The FORTRAN NALAP code adapted to a microcomputer compiler

    International Nuclear Information System (INIS)

    Lobo, Paulo David de Castro; Borges, Eduardo Madeira; Braz Filho, Francisco Antonio; Guimaraes, Lamartine Nogueira Frutuoso

    2010-01-01

    The Nuclear Energy Division of the Institute for Advanced Studies (IEAv) is conducting the TERRA project (TEcnologia de Reatores Rapidos Avancados), Technology for Advanced Fast Reactors project, aimed at a space reactor application. In this work, in support of the TERRA project, the NALAP code adapted to a microcomputer compiler called Compaq Visual Fortran (Version 6.6) is presented. This code, adapted from the light water reactor transient code RELAP 3B, simulates thermal-hydraulic responses for sodium cooled fast reactors. The strategy to run the code on a PC was divided into several steps, mainly to remove unnecessary routines, eliminate old statements, introduce new ones, and include extended precision mode. The source program was able to solve three sample cases under conditions of protected transients suggested in the literature: the normal reactor shutdown, with a delay of 200 ms to start the control rod movement and a delay of 500 ms to stop the pumps; reactor scram after a transient of loss of flow; and transients protected from overpower. Comparisons were made with results from the time when the NALAP code was acquired by the IEAv, back in the 80's. All the responses for these three simulations reproduced the calculations performed with the CDC compiler in 1985. Further modifications will include the usage of gas as coolant for the nuclear reactor to allow a Closed Brayton Cycle Loop - CBCL - to be used as a heat/electric converter. (author)

  6. The FORTRAN NALAP code adapted to a microcomputer compiler

    Energy Technology Data Exchange (ETDEWEB)

    Lobo, Paulo David de Castro; Borges, Eduardo Madeira; Braz Filho, Francisco Antonio; Guimaraes, Lamartine Nogueira Frutuoso, E-mail: plobo.a@uol.com.b, E-mail: eduardo@ieav.cta.b, E-mail: fbraz@ieav.cta.b, E-mail: guimarae@ieav.cta.b [Instituto de Estudos Avancados (IEAv/CTA), Sao Jose dos Campos, SP (Brazil)

    2010-07-01

    The Nuclear Energy Division of the Institute for Advanced Studies (IEAv) is conducting the TERRA project (TEcnologia de Reatores Rapidos Avancados), Technology for Advanced Fast Reactors project, aimed at a space reactor application. In this work, in support of the TERRA project, the NALAP code adapted to a microcomputer compiler called Compaq Visual Fortran (Version 6.6) is presented. This code, adapted from the light water reactor transient code RELAP 3B, simulates thermal-hydraulic responses for sodium cooled fast reactors. The strategy to run the code on a PC was divided into several steps, mainly to remove unnecessary routines, eliminate old statements, introduce new ones, and include extended precision mode. The source program was able to solve three sample cases under conditions of protected transients suggested in the literature: the normal reactor shutdown, with a delay of 200 ms to start the control rod movement and a delay of 500 ms to stop the pumps; reactor scram after a transient of loss of flow; and transients protected from overpower. Comparisons were made with results from the time when the NALAP code was acquired by the IEAv, back in the 80's. All the responses for these three simulations reproduced the calculations performed with the CDC compiler in 1985. Further modifications will include the usage of gas as coolant for the nuclear reactor to allow a Closed Brayton Cycle Loop - CBCL - to be used as a heat/electric converter. (author)

  7. A Note on Compiling Fortran

    Energy Technology Data Exchange (ETDEWEB)

    Busby, L. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-01

    Fortran modules tend to serialize compilation of large Fortran projects, by introducing dependencies among the source files. If file A depends on file B (A uses a module defined by B), you must finish compiling B before you can begin compiling A. Some Fortran compilers (Intel ifort, GNU gfortran and IBM xlf, at least) offer an option to "verify syntax", with the side effect of also producing any associated Fortran module files. As it happens, this option usually runs much faster than the object code generation and optimization phases. For some projects on some machines, it can be advantageous to compile in two passes: the first pass generates the module files, quickly; the second pass produces the object files, in parallel. We achieve a 3.8× speedup in the case study below.
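The serialization described here comes from module dependencies: a file can only be compiled after the files defining the modules it `use`s. As a rough illustration of how such an ordering can be derived (the file names and simplified regexes are invented for this sketch, and edge cases such as submodules are ignored), a Python script might scan the sources and topologically sort them:

```python
import re

# Hypothetical Fortran sources: file name -> contents (in practice, read from disk).
sources = {
    "b.f90": "module b_mod\nend module b_mod\n",
    "a.f90": "module a_mod\nuse b_mod\nend module a_mod\n",
    "main.f90": "program main\nuse a_mod\nend program main\n",
}

def module_info(text):
    """Return (modules defined, external modules used) for one source file."""
    defined = set(re.findall(r"^\s*module\s+(?!procedure\b)(\w+)", text, re.I | re.M))
    used = set(re.findall(r"^\s*use\s+(\w+)", text, re.I | re.M))
    return defined, used - defined

def compile_order(sources):
    """Topological order: a file compiles after the files defining its modules.
    Assumes no circular module dependencies (Fortran forbids them anyway)."""
    info = {f: module_info(t) for f, t in sources.items()}
    provider = {m: f for f, (defs, _) in info.items() for m in defs}
    order, done = [], set()

    def visit(f):
        if f in done:
            return
        done.add(f)
        for m in info[f][1]:          # visit providers of used modules first
            if m in provider:
                visit(provider[m])
        order.append(f)

    for f in sources:
        visit(f)
    return order

print(compile_order(sources))  # → ['b.f90', 'a.f90', 'main.f90']
```

In the note's two-pass scheme, this dependency order constrains only the fast syntax-check pass; once all module files exist, the slow object-code pass can run on every file in parallel.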

  8. Nuclear fuel cycle risk assessment: survey and computer compilation of risk-related literature. [Once-through Cycle and Plutonium Recycle

    Energy Technology Data Exchange (ETDEWEB)

    Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.

    1982-10-01

    The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched and documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included.

  9. Technique to increase performance of C-program for control systems. Compiler technique for low-cost CPU; Seigyoyo C gengo program no kosokuka gijutsu. Tei cost CPU no tame no gengo compiler gijutsu

    Energy Technology Data Exchange (ETDEWEB)

    Yamada, Y [Mazda Motor Corp., Hiroshima (Japan)

    1997-10-01

    The software of automotive control systems has become increasingly large and complex. High-level languages (primarily C) and their compilers have become more important for reducing coding time. Most compilers represent real numbers in the floating-point format specified by IEEE standard 754. Due to cost requirements, most microprocessors in the automotive industry have no hardware for operations using the IEEE standard, resulting in slow execution speed and large code size. Alternative formats to increase execution speed and reduce code size are proposed. Experimental results for the alternative formats show the improvement in execution speed and code size. 4 refs., 3 figs., 2 tabs.
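The abstract does not spell out the alternative formats it proposes. As a generic illustration of the idea, here is a minimal fixed-point sketch in Python: values are stored as plain integers scaled by a power of two, so arithmetic needs only integer operations and shifts, which FPU-less microprocessors execute quickly. The Q16.16 layout and all names below are assumptions for illustration, not the paper's actual proposal.

```python
# Q16.16 fixed-point: a value x is stored as round(x * 2**16) in an integer.
# This is a common alternative to IEEE-754 floats on CPUs without an FPU.
FRAC_BITS = 16
SCALE = 1 << FRAC_BITS

def to_fix(x):
    """float -> fixed-point integer."""
    return int(round(x * SCALE))

def from_fix(f):
    """fixed-point integer -> float (for display only)."""
    return f / SCALE

def fix_add(a, b):
    """Addition works directly on the scaled integers."""
    return a + b

def fix_mul(a, b):
    """Multiplication doubles the scale, so rescale with a right shift."""
    return (a * b) >> FRAC_BITS

a, b = to_fix(1.5), to_fix(2.25)
print(from_fix(fix_add(a, b)))  # → 3.75
print(from_fix(fix_mul(a, b)))  # → 3.375
```

The trade-off the abstract alludes to is visible here: every operation compiles to a few integer instructions, at the price of a fixed dynamic range and precision instead of IEEE-754's floating exponent.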

  10. Compiling software for a hierarchical distributed processing system

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-12-31

    Compiling software for a hierarchical distributed processing system, including: providing to one or more compiling nodes software to be compiled, wherein at least a portion of the software to be compiled is to be executed by one or more nodes; compiling, by the compiling node, the software; maintaining, by the compiling node, any compiled software to be executed on the compiling node; selecting, by the compiling node, one or more nodes in a next tier of the hierarchy of the distributed processing system in dependence upon whether any compiled software is for the selected node or the selected node's descendants; and sending to the selected node only the compiled software to be executed by the selected node or the selected node's descendants.
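The selection rule in this patent-style abstract — forward to each next-tier node only the binaries destined for that node or its descendants — can be sketched as follows. The tree shape and artifact names are invented for illustration; the actual system is not specified beyond the abstract.

```python
# Hypothetical hierarchy: root compiles everything, then each child receives
# only the compiled artifacts for its own subtree.
tree = {"root": ["c1", "c2"], "c1": ["leaf1"], "c2": ["leaf2"],
        "leaf1": [], "leaf2": []}
binaries = {"root": "svc_root", "c1": "svc_c1", "leaf1": "svc_l1",
            "c2": "svc_c2", "leaf2": "svc_l2"}

def subtree(node):
    """All nodes in the subtree rooted at `node`, including itself."""
    nodes = {node}
    for child in tree[node]:
        nodes |= subtree(child)
    return nodes

def distribute(node):
    """Keep this node's own binary; send each child only its subtree's binaries."""
    return {child: {n: binaries[n] for n in subtree(child) if n in binaries}
            for child in tree[node]}

sent = distribute("root")
# c1 receives svc_c1 and svc_l1, but never c2's or leaf2's binaries.
```

Applied recursively at every tier, this keeps network traffic proportional to each subtree's needs rather than shipping the whole build everywhere.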

  11. Final report: Compiled MPI. Cost-Effective Exascale Application Development

    Energy Technology Data Exchange (ETDEWEB)

    Gropp, William Douglas [Univ. of Illinois, Urbana-Champaign, IL (United States)

    2015-12-21

    This is the final report on Compiled MPI: Cost-Effective Exascale Application Development, and summarizes the results of this project. The project investigated runtime environments that improve the performance of MPI (Message-Passing Interface) programs; work at Illinois in the last period of this project looked at optimizing data accesses expressed with MPI datatypes.

  12. Charged particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, C.

    1999-01-01

    We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal reason for setting up the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates, issued by Fowler and collaborators. The main goal of the NACRE network was transparency in the procedure of calculating the rates. More specifically, this compilation aims at: 1. updating the experimental and theoretical data; 2. distinctly identifying the sources of the data used in rate calculation; 3. evaluating the uncertainties and errors; 4. providing numerically integrated reaction rates; 5. providing reverse reaction rates and analytical approximations of the adopted rates. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given, and the corresponding reaction rates are calculated and given in tabular form. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. The compilation is concerned with reaction rates that are large enough for target lifetimes shorter than the age of the Universe, taken equal to 15 x 10^9 y. The reaction rates are provided for temperatures lower than T = 10^10 K. In parallel with the rate compilation a cross section database has been created and located at the site http://pntpm.ulb.ac.be/nacre..htm. (authors)

  13. Just-in-Time Compilation-Inspired Methodology for Parallelization of Compute Intensive Java Code

    Directory of Open Access Journals (Sweden)

    GHULAM MUSTAFA

    2017-01-01

    Full Text Available Compute intensive programs generally consume a significant fraction of execution time in a small amount of repetitive code. Such repetitive code is commonly known as hotspot code. We observed that compute intensive hotspots often possess exploitable loop level parallelism. A JIT (Just-in-Time) compiler profiles a running program to identify its hotspots. Hotspots are then translated into native code for efficient execution. Using a similar approach, we propose a methodology to identify hotspots and exploit their parallelization potential on multicore systems. The proposed methodology selects and parallelizes each DOALL loop that is either contained in a hotspot method or calls a hotspot method. The methodology could be integrated into the front-end of a JIT compiler to parallelize sequential code just before native translation. However, compilation to native code is out of the scope of this work. As a case study, we analyze eighteen JGF (Java Grande Forum) benchmarks to determine the parallelization potential of hotspots. Eight benchmarks demonstrate a speedup of up to 7.6x on an 8-core system.

  14. Just-in-time compilation-inspired methodology for parallelization of compute intensive java code

    International Nuclear Information System (INIS)

    Mustafa, G.; Ghani, M.U.

    2017-01-01

    Compute intensive programs generally consume a significant fraction of execution time in a small amount of repetitive code. Such repetitive code is commonly known as hotspot code. We observed that compute intensive hotspots often possess exploitable loop level parallelism. A JIT (Just-in-Time) compiler profiles a running program to identify its hotspots. Hotspots are then translated into native code for efficient execution. Using a similar approach, we propose a methodology to identify hotspots and exploit their parallelization potential on multicore systems. The proposed methodology selects and parallelizes each DOALL loop that is either contained in a hotspot method or calls a hotspot method. The methodology could be integrated into the front-end of a JIT compiler to parallelize sequential code just before native translation. However, compilation to native code is out of the scope of this work. As a case study, we analyze eighteen JGF (Java Grande Forum) benchmarks to determine the parallelization potential of hotspots. Eight benchmarks demonstrate a speedup of up to 7.6x on an 8-core system. (author)
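The key property these two records rely on is that a DOALL loop has no cross-iteration dependences, so its iterations can be distributed across cores freely. The following sketch shows that idea in Python with `multiprocessing` standing in for the paper's Java/JIT machinery; the function names and the toy loop body are invented for illustration.

```python
from multiprocessing import Pool

def body(i):
    """One independent (DOALL) loop iteration: reads only `i`, writes nothing shared."""
    return i * i

def hotspot_sequential(n):
    """The original hotspot loop, run iteration by iteration."""
    return [body(i) for i in range(n)]

def hotspot_parallel(n, workers=4):
    """Because iterations are independent, the loop can be split across cores
    and the per-iteration results reassembled in order."""
    with Pool(workers) as pool:
        return pool.map(body, range(n))

if __name__ == "__main__":
    assert hotspot_parallel(1000) == hotspot_sequential(1000)
```

A loop with a cross-iteration dependence (e.g. `acc += body(i)` where order matters, or writes to `a[i-1]`) would not qualify as DOALL, which is why the methodology must select loops before parallelizing them.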

  15. HAL/S-FC compiler system functional specification

    Science.gov (United States)

    1974-01-01

    Compiler organization is discussed, including overall compiler structure, internal data transfer, compiler development, and code optimization. The user, system, and SDL interfaces are described, along with compiler system requirements. The run-time software support package and the restrictions and dependencies of the HAL/S-FC system are also considered.

  16. SPARQL compiler for Bobox

    OpenAIRE

    Čermák, Miroslav

    2013-01-01

    The goal of this work is to design and implement a SPARQL compiler for the Bobox system. In addition to the lexical and syntactic analysis corresponding to the W3C standard for the SPARQL language, it performs semantic analysis and optimization of queries. The compiler constructs an appropriate model for execution in Bobox, which depends on the physical database schema.

  17. The National Cancer Informatics Program (NCIP) Annotation and Image Markup (AIM) Foundation model.

    Science.gov (United States)

    Mongkolwat, Pattanasak; Kleper, Vladimir; Talbot, Skip; Rubin, Daniel

    2014-12-01

    Knowledge contained within in vivo imaging annotated by human experts or computer programs is typically stored as unstructured text and separated from other associated information. The National Cancer Informatics Program (NCIP) Annotation and Image Markup (AIM) Foundation information model is an evolution of the National Institute of Health's (NIH) National Cancer Institute's (NCI) Cancer Bioinformatics Grid (caBIG®) AIM model. The model applies to various image types created by various techniques and disciplines. It has evolved in response to the feedback and changing demands from the imaging community at NCI. The foundation model serves as a base for other imaging disciplines that want to extend the type of information the model collects. The model captures physical entities and their characteristics, imaging observation entities and their characteristics, markups (two- and three-dimensional), AIM statements, calculations, image source, inferences, annotation role, task context or workflow, audit trail, AIM creator details, equipment used to create AIM instances, subject demographics, and adjudication observations. An AIM instance can be stored as a Digital Imaging and Communications in Medicine (DICOM) structured reporting (SR) object or Extensible Markup Language (XML) document for further processing and analysis. An AIM instance consists of one or more annotations and associated markups of a single finding along with other ancillary information in the AIM model. An annotation describes information about the meaning of pixel data in an image. A markup is a graphical drawing placed on the image that depicts a region of interest. This paper describes fundamental AIM concepts and how to use and extend AIM for various imaging disciplines.

  18. PEMILIHAN PROGRAM PENGENTASAN KEMISKINAN MELALUI PENGEMBANGAN MODEL PEMBERDAYAAN MASYARAKAT DENGAN PENDEKATAN SISTEM (Selection of Poverty Alleviation Programs through the Development of a Community Empowerment Model with a Systems Approach)

    Directory of Open Access Journals (Sweden)

    Sutikno Sutikno

    2015-06-01

    Full Text Available This research aims to compile poverty alleviation programs through a community empowerment model and to review program selection as an evaluation of the effectiveness of poverty alleviation programs that have not yet worked properly. The stages of compiling the poverty alleviation programs are: mapping the socioeconomic conditions of the poor, basic infrastructure conditions, socio-cultural issues, and potential issues; identifying hopes and predicting economic development opportunities; creating the poverty alleviation programs through SWOT analysis; and planning the implementation program with KPD. Based on the results of the SWOT and scoring analyses, the selected programs are training and assistance, the establishment of a savings and loan cooperative, clean water for poor households, rural development with the utilization of clean water, household waste management, and education package programs A, B, and C.

  19. Some measurements of Java-to-bytecode compiler performance in the Java Virtual Machine

    OpenAIRE

    Daly, Charles; Horgan, Jane; Power, James; Waldron, John

    2001-01-01

    In this paper we present a platform independent analysis of the dynamic profiles of Java programs when executing on the Java Virtual Machine. The Java programs selected are taken from the Java Grande Forum benchmark suite, and five different Java-to-bytecode compilers are analysed. The results presented describe the dynamic instruction usage frequencies.

  20. Compilation of actinide neutron nuclear data

    International Nuclear Information System (INIS)

    1979-01-01

    The Swedish nuclear data committee has compiled a selected set of neutron cross section data for the 16 most important actinide isotopes. The aim of the report is to present the available data in a comprehensible way, to allow a comparison between different evaluated libraries and to judge the reliability of these libraries from the experimental data. The data are given in graphical form below about 1 eV and above about 10 keV, while the 2200 m/s cross sections and resonance integrals are given in numerical form. (G.B.)

  1. Rubus: A compiler for seamless and extensible parallelism

    Science.gov (United States)

    Adnan, Muhammad; Aslam, Faisal; Sarwar, Syed Mansoor

    2017-01-01

    Nowadays, a typical processor may have multiple processing cores on a single chip. Furthermore, a special purpose processing unit called the Graphic Processing Unit (GPU), originally designed for 2D/3D games, is now available for general purpose use in computers and mobile devices. However, traditional programming languages, which were designed to work with machines having single core CPUs, cannot utilize the parallelism available on multi-core processors efficiently. Therefore, to exploit the extraordinary processing power of multi-core processors, researchers are working on new tools and techniques to facilitate parallel programming. To this end, languages like CUDA and OpenCL have been introduced, which can be used to write code with parallelism. The main shortcoming of these languages is that the programmer needs to specify all the complex details manually in order to parallelize the code across multiple cores. Therefore, code written in these languages is difficult to understand, debug and maintain. Furthermore, parallelizing legacy code can require rewriting a significant portion of it in CUDA or OpenCL, which can consume significant time and resources. Thus, the amount of parallelism achieved is proportional to the skills of the programmer and the time spent on code optimizations. This paper proposes a new open source compiler, Rubus, to achieve seamless parallelism. The Rubus compiler relieves the programmer from manually specifying the low-level details. It analyses and transforms a sequential program into a parallel program automatically, without any user intervention. This achieves massive speedup and better utilization of the underlying hardware without a programmer's expertise in parallel programming. For five different benchmarks, on average a speedup of 34.54 times has been achieved by Rubus as compared to Java on a basic GPU having only 96 cores, whereas for a matrix multiplication benchmark an average execution speedup of 84 times has been achieved.

  2. Rubus: A compiler for seamless and extensible parallelism.

    Directory of Open Access Journals (Sweden)

    Muhammad Adnan

    Full Text Available Nowadays, a typical processor may have multiple processing cores on a single chip. Furthermore, a special purpose processing unit called the Graphic Processing Unit (GPU), originally designed for 2D/3D games, is now available for general purpose use in computers and mobile devices. However, traditional programming languages, which were designed to work with machines having single core CPUs, cannot utilize the parallelism available on multi-core processors efficiently. Therefore, to exploit the extraordinary processing power of multi-core processors, researchers are working on new tools and techniques to facilitate parallel programming. To this end, languages like CUDA and OpenCL have been introduced, which can be used to write code with parallelism. The main shortcoming of these languages is that the programmer needs to specify all the complex details manually in order to parallelize the code across multiple cores. Therefore, code written in these languages is difficult to understand, debug and maintain. Furthermore, parallelizing legacy code can require rewriting a significant portion of it in CUDA or OpenCL, which can consume significant time and resources. Thus, the amount of parallelism achieved is proportional to the skills of the programmer and the time spent on code optimizations. This paper proposes a new open source compiler, Rubus, to achieve seamless parallelism. The Rubus compiler relieves the programmer from manually specifying the low-level details. It analyses and transforms a sequential program into a parallel program automatically, without any user intervention. This achieves massive speedup and better utilization of the underlying hardware without a programmer's expertise in parallel programming. For five different benchmarks, on average a speedup of 34.54 times has been achieved by Rubus as compared to Java on a basic GPU having only 96 cores.
Whereas, for a matrix multiplication benchmark, an average execution speedup of 84 times has been achieved.

  3. Compiling the First Monolingual Lusoga Dictionary

    Directory of Open Access Journals (Sweden)

    Minah Nabirye

    2011-10-01

    Full Text Available

    Abstract: In this research article a study is made of the approach followed to compile the first-ever monolingual dictionary for Lusoga. Lusoga is a Bantu language spoken in Uganda by slightly over two million people. Being an under-resourced language, the Lusoga orthography had to be designed, a grammar written, and a corpus built, before embarking on the compilation of the dictionary. This compilation was aimed at attaining an academic degree, hence requiring a rigorous research methodology. Firstly, the prevailing methods for compiling dictionaries were mainly practical and insufficient in explaining the theoretical linguistic basis for dictionary compilation. Since dictionaries are based on meaning, the theory of meaning was used to account for all linguistic data considered in dictionaries. However, meaning is considered at a very abstract level, far removed from the process of compiling dictionaries. Another theory, the theory of modularity, was used to bridge the gap between the theory of meaning and the compilation process. The modular theory explains how the different modules of a language contribute information to the different parts of the dictionary article or dictionary information in general. Secondly, the research also had to contend with the different approaches for analysing Bantu languages for Bantu and European audiences. A description of the Bantu- and European-centred approaches to Bantu studies was undertaken in respect of (a) the classification of Lusoga words, and (b) the specification of their citations. As a result, Lusoga lexicography deviates from the prevailing Bantu classification and citation of nouns, adjectives and verbs in particular. The dictionary was tested on two separate occasions and all the feedback was considered in the compilation process. This article, then, gives an overall summary of all the steps involved in the compilation of the Eiwanika ly'Olusoga, i.e. the Monolingual Lusoga Dictionary.

  4. Automating Visualization Service Generation with the WATT Compiler

    Science.gov (United States)

    Bollig, E. F.; Lyness, M. D.; Erlebacher, G.; Yuen, D. A.

    2007-12-01

    As tasks and workflows become increasingly complex, software developers are devoting increasing attention to automation tools. Among many examples, the Automator tool from Apple collects components of a workflow into a single script, with very little effort on the part of the user. Tasks are most often described as a series of instructions. The granularity of the tasks dictates the tools to use. Compilers translate fine-grained instructions to assembler code, while scripting languages (ruby, perl) are used to describe a series of tasks at a higher level. Compilers can also be viewed as transformational tools: a cross-compiler can translate executable code written on one computer to assembler code understood on another, while transformational tools can translate from one high-level language to another. We are interested in creating visualization web services automatically, starting from stand-alone VTK (Visualization Toolkit) code written in Tcl. To this end, using the OCaml programming language, we have developed a compiler that translates Tcl into C++, including all the stubs, classes and methods to interface with gSOAP, a C++ implementation of the Soap 1.1/1.2 protocols. This compiler, referred to as the Web Automation and Translation Toolkit (WATT), is the first step towards automated creation of specialized visualization web services without input from the user. The WATT compiler seeks to automate all aspects of web service generation, including the transport layer, the division of labor and the details related to interface generation. The WATT compiler is part of ongoing efforts within the NSF funded VLab consortium [1] to facilitate and automate time-consuming tasks for the science related to understanding planetary materials. Through examples of services produced by WATT for the VLab portal, we will illustrate features, limitations and the improvements necessary to achieve the ultimate goal of complete and transparent automation in the generation of web services.

  5. Materials Sciences Programs

    International Nuclear Information System (INIS)

    1977-01-01

    A compilation and index of the ERDA materials sciences program is presented. This compilation is intended for use by administrators, managers, and scientists to help coordinate research and as an aid in selecting new programs.

  6. Programming system for analytic geometry

    International Nuclear Information System (INIS)

    Raymond, Jacques

    1970-01-01

    After having outlined the characteristics of computing centres which are not suited to engineering tasks, notably the time required by the different tasks to be performed when developing software (assembly, compilation, link editing, loading, running), and identified constraints specific to engineering, the author identifies the characteristics a programming system should have to suit engineering tasks. He discusses existing conversational systems and their programming languages, and their main drawbacks. Then, he presents a system which aims at facilitating programming and addressing problems of analytic geometry and trigonometry.

  7. Using a Mixed-Methods RE-AIM Framework to Evaluate Community Health Programs for Older Latinas.

    Science.gov (United States)

    Schwingel, Andiara; Gálvez, Patricia; Linares, Deborah; Sebastião, Emerson

    2017-06-01

    This study used the RE-AIM (Reach, Effectiveness, Adoption, Implementation, and Maintenance) framework to evaluate a promotora-led community health program designed for Latinas ages 50 and older that sought to improve physical activity, nutrition, and stress management. A mixed-methods evaluation approach was administered at participant and organizational levels with a focus on the efficacy, adoption, implementation, and maintenance components of the RE-AIM theoretical model. The program was shown to be effective at improving participants' eating behaviors, increasing their physical activity levels, and lowering their depressive symptoms. Promotoras felt motivated and sufficiently prepared to deliver the program. Some implementation challenges were reported. More child care opportunities and an increased focus on mental well-being were suggested. The promotora delivery model has promise for program sustainability with both promotoras and participants alike expressing interest in leading future programs.

  8. Parallelizing Compiler Framework and API for Power Reduction and Software Productivity of Real-Time Heterogeneous Multicores

    Science.gov (United States)

    Hayashi, Akihiro; Wada, Yasutaka; Watanabe, Takeshi; Sekiguchi, Takeshi; Mase, Masayoshi; Shirako, Jun; Kimura, Keiji; Kasahara, Hironori

    Heterogeneous multicores have been attracting much attention as a way to attain high performance while keeping power consumption low in a wide spread of areas. However, heterogeneous multicores make programming very difficult, and the long application program development period lowers product competitiveness. In order to overcome this situation, this paper proposes a compilation framework which bridges the gap between programmers and heterogeneous multicores. In particular, this paper describes a compilation framework based on the OSCAR compiler. It realizes coarse grain task parallel processing, data transfer using a DMA controller, and power reduction control from user programs with DVFS and clock gating on various heterogeneous multicores from different vendors. This paper also evaluates the processing performance and power reduction attained by the proposed framework on a newly developed 15-core heterogeneous multicore chip named RP-X, integrating 8 general purpose processor cores and 3 types of accelerator cores, which was developed by Renesas Electronics, Hitachi, Tokyo Institute of Technology and Waseda University. The framework attains speedups of up to 32x for an optical flow program with eight general purpose processor cores and four DRP (Dynamically Reconfigurable Processor) accelerator cores against sequential execution by a single processor core, and an 80% power reduction for real-time AAC encoding.

  9. Compiling a 50-year journey

    DEFF Research Database (Denmark)

    Hutton, Graham; Bahr, Patrick

    2017-01-01

    Fifty years ago, John McCarthy and James Painter published the first paper on compiler verification, in which they showed how to formally prove the correctness of a compiler that translates arithmetic expressions into code for a register-based machine. In this article, we revisit this example...
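McCarthy and Painter's 1967 result was a hand proof of correctness for a compiler from arithmetic expressions to a register machine. As a miniature, mechanically checkable version of the same idea, the sketch below compiles expressions to a tiny stack machine instead (a simplification of their setting) and asserts, for a sample input, that running the compiled code agrees with direct evaluation. All names are invented for this sketch.

```python
# A toy version of the McCarthy-Painter exercise: compile arithmetic
# expressions to stack-machine code, then check that running the compiled
# code agrees with direct evaluation of the source expression.

def eval_expr(e):
    """Source semantics. Expressions are numbers or ('+', left, right)."""
    if isinstance(e, (int, float)):
        return e
    op, l, r = e
    return eval_expr(l) + eval_expr(r)

def compile_expr(e):
    """Compiler: flatten an expression into PUSH/ADD instructions."""
    if isinstance(e, (int, float)):
        return [("PUSH", e)]
    op, l, r = e
    return compile_expr(l) + compile_expr(r) + [("ADD",)]

def run(code):
    """Target semantics: a stack machine with two instructions."""
    stack = []
    for instr in code:
        if instr[0] == "PUSH":
            stack.append(instr[1])
        else:  # ADD: pop two operands, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
    return stack[-1]

e = ("+", 1, ("+", 2, 3))
assert run(compile_expr(e)) == eval_expr(e) == 6  # correctness on one input
```

A full verification, in the spirit of the paper, would prove `run(compile_expr(e)) == eval_expr(e)` for *every* expression by structural induction, rather than testing single inputs.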

  10. Recent Efforts in Data Compilations for Nuclear Astrophysics

    International Nuclear Information System (INIS)

    Dillmann, Iris

    2008-01-01

    Some recent efforts in compiling data for astrophysical purposes are introduced, which were discussed during a JINA-CARINA Collaboration meeting on 'Nuclear Physics Data Compilation for Nucleosynthesis Modeling' held at the ECT* in Trento/Italy from May 29th-June 3rd, 2007. The main goal of this collaboration is to develop an updated and unified nuclear reaction database for modeling a wide variety of stellar nucleosynthesis scenarios. Presently a large number of different reaction libraries (REACLIB) are used by the astrophysics community. The 'JINA Reaclib Database' on http://www.nscl.msu.edu/~nero/db/ aims to merge and fit the latest experimental stellar cross sections and reaction rate data of various compilations, e.g. NACRE and its extension for Big Bang nucleosynthesis, Caughlan and Fowler, Iliadis et al., and KADoNiS. The KADoNiS (Karlsruhe Astrophysical Database of Nucleosynthesis in Stars, http://nuclear-astrophysics.fzk.de/kadonis) project is an online database for neutron capture cross sections relevant to the s process. The present version v0.2 is already included in a REACLIB file from Basel university (http://download.nucastro.org/astro/reaclib). The present status of experimental stellar (n,γ) cross sections in KADoNiS is shown. It contains recommended cross sections for 355 isotopes between 1H and 210Bi, over 80% of them deduced from experimental data. A 'high priority list' for measurements and evaluations for light charged-particle reactions set up by the JINA-CARINA collaboration is presented. The central web access point to submit and evaluate new data is provided by the Oak Ridge group via the http://www.nucastrodata.org homepage. 'Workflow tools' aim to make the evaluation process transparent and allow users to follow the progress.

  11. Recent Efforts in Data Compilations for Nuclear Astrophysics

    Science.gov (United States)

    Dillmann, Iris

    2008-05-01

    Some recent efforts in compiling data for astrophysical purposes are introduced, which were discussed during a JINA-CARINA Collaboration meeting on "Nuclear Physics Data Compilation for Nucleosynthesis Modeling" held at the ECT* in Trento/Italy from May 29th-June 3rd, 2007. The main goal of this collaboration is to develop an updated and unified nuclear reaction database for modeling a wide variety of stellar nucleosynthesis scenarios. Presently a large number of different reaction libraries (REACLIB) are used by the astrophysics community. The "JINA Reaclib Database" on http://www.nscl.msu.edu/~nero/db/ aims to merge and fit the latest experimental stellar cross sections and reaction rate data of various compilations, e.g. NACRE and its extension for Big Bang nucleosynthesis, Caughlan and Fowler, Iliadis et al., and KADoNiS. The KADoNiS (Karlsruhe Astrophysical Database of Nucleosynthesis in Stars, http://nuclear-astrophysics.fzk.de/kadonis) project is an online database for neutron capture cross sections relevant to the s process. The present version v0.2 is already included in a REACLIB file from Basel university (http://download.nucastro.org/astro/reaclib). The present status of experimental stellar (n,γ) cross sections in KADoNiS is shown. It contains recommended cross sections for 355 isotopes between 1H and 210Bi, over 80% of them deduced from experimental data. A "high priority list" for measurements and evaluations for light charged-particle reactions set up by the JINA-CARINA collaboration is presented. The central web access point to submit and evaluate new data is provided by the Oak Ridge group via the http://www.nucastrodata.org homepage. "Workflow tools" aim to make the evaluation process transparent and allow users to follow the progress.

  12. An exploratory discussion on business files compilation

    International Nuclear Information System (INIS)

    Gao Chunying

    2014-01-01

    Business files compilation for an enterprise is a distillation and recreation of its intellectual assets, making applicable information available to those who need it in a fast, extensive and precise way. Proceeding from the effects of business files compilation on scientific research, production, construction and development, this paper discusses in five points how to define topics, analyze historical materials, search and select data, and process it into an enterprise archives collection. Firstly, it expounds the importance and necessity of business files compilation in the production, operation and development of a company; secondly, it presents processing methods from topic definition, material searching and data selection to final examination and correction; thirdly, it defines principles and classifications so that different categories and levels of processing methods are available for business files compilation; fourthly, it discusses the specific method of implementing a file compilation through a documentation collection, on the principle that topic definition should gear with demand; fifthly, it addresses the application of information technology to business files compilation, in view of the wide demand for business files, so as to raise the level of enterprise archives management. The present discussion focuses on the examination and correction principles of enterprise historical material compilation and the basic classifications as well as the major forms of business files compilation achievements. (author)

  13. Computer and compiler effects on code results: status report

    International Nuclear Information System (INIS)

    1996-01-01

    Within the framework of the international effort on the assessment of computer codes designed to describe the overall reactor coolant system (RCS) thermalhydraulic response, core damage progression, and fission product release and transport during severe accidents, there has been a continuous debate as to whether code results are influenced by different code users or by different computers or compilers. The first aspect, the 'Code User Effect', has already been investigated. In this paper the other aspects are discussed and proposals are given on how to make large system codes insensitive to different computers and compilers. Hardware errors and memory problems are not considered in this report. The codes investigated herein are integrated code systems (e.g. ESTER, MELCOR) and thermalhydraulic system codes with extensions for severe accident simulation (e.g. SCDAP/RELAP, ICARE/CATHARE, ATHLET-CD), and codes to simulate fission product transport (e.g. TRAPMELT, SOPHAEROS). Since all of these codes are programmed in Fortran 77, the discussion herein is based on this programming language, although some remarks are made about Fortran 90. Some observations about different code results obtained on different computers are reported and possible reasons for this unexpected behaviour are listed. Methods to avoid portability problems are then discussed
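
    One concrete mechanism behind such compiler-dependent results, sketched here in Python for illustration (the codes in the report are Fortran 77): floating-point addition is not associative, so a compiler that reassociates a sum while optimizing can change the numerical result of unchanged source code.

```python
# Two evaluation orders of the same source expression give different
# results, because floating-point addition is not associative.
x, y, z = 0.1, 1e16, -1e16

strict = (x + y) + z        # left-to-right, as written in the source
reassociated = x + (y + z)  # order an optimizing compiler might choose

print(strict)        # 0.0: the 0.1 is absorbed when added to 1e16
print(reassociated)  # 0.1: the large terms cancel first
```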

  14. Argonne Code Center: compilation of program abstracts

    Energy Technology Data Exchange (ETDEWEB)

    Butler, M.K.; DeBruler, M.; Edwards, H.S.

    1976-08-01

    This publication is the tenth supplement to, and revision of, ANL-7411. It contains additional abstracts and revisions to some earlier abstracts and other pages. Sections of the document are as follows: preface; history and acknowledgements; abstract format; recommended program package contents; program classification guide and thesaurus; and abstract collection. (RWR)

  15. Argonne Code Center: compilation of program abstracts

    International Nuclear Information System (INIS)

    Butler, M.K.; DeBruler, M.; Edwards, H.S.

    1976-08-01

    This publication is the tenth supplement to, and revision of, ANL-7411. It contains additional abstracts and revisions to some earlier abstracts and other pages. Sections of the document are as follows: preface; history and acknowledgements; abstract format; recommended program package contents; program classification guide and thesaurus; and abstract collection

  16. Generation of Efficient High-Level Hardware Code from Dataflow Programs

    OpenAIRE

    Siret , Nicolas; Wipliez , Matthieu; Nezan , Jean François; Palumbo , Francesca

    2012-01-01

    High-level synthesis (HLS) aims at reducing the time-to-market by providing an automated design process that interprets and compiles high-level abstraction programs into hardware. However, HLS tools still face limitations regarding the performance of the generated code, due to the difficulties of compiling input imperative languages into efficient hardware code. Moreover, the hardware code generated by the HLS tools is usually target-dependent and at a low level of abstraction (i.e. gate-level...

  17. Argonne Code Center: compilation of program abstracts

    Energy Technology Data Exchange (ETDEWEB)

    Butler, M.K.; DeBruler, M.; Edwards, H.S.; Harrison, C. Jr.; Hughes, C.E.; Jorgensen, R.; Legan, M.; Menozzi, T.; Ranzini, L.; Strecok, A.J.

    1977-08-01

    This publication is the eleventh supplement to, and revision of, ANL-7411. It contains additional abstracts and revisions to some earlier abstracts and other pages. Sections of the complete document ANL-7411 are as follows: preface, history and acknowledgements, abstract format, recommended program package contents, program classification guide and thesaurus, and the abstract collection. (RWR)

  18. Argonne Code Center: compilation of program abstracts

    International Nuclear Information System (INIS)

    Butler, M.K.; DeBruler, M.; Edwards, H.S.; Harrison, C. Jr.; Hughes, C.E.; Jorgensen, R.; Legan, M.; Menozzi, T.; Ranzini, L.; Strecok, A.J.

    1977-08-01

    This publication is the eleventh supplement to, and revision of, ANL-7411. It contains additional abstracts and revisions to some earlier abstracts and other pages. Sections of the complete document ANL-7411 are as follows: preface, history and acknowledgements, abstract format, recommended program package contents, program classification guide and thesaurus, and the abstract collection

  19. Sharing analysis in the Pawns compiler

    Directory of Open Access Journals (Sweden)

    Lee Naish

    2015-09-01

    Pawns is a programming language under development that supports algebraic data types, polymorphism, higher order functions and “pure” declarative programming. It also supports impure imperative features including destructive update of shared data structures via pointers, allowing significantly increased efficiency for some operations. A novelty of Pawns is that all impure “effects” must be made obvious in the source code and they can be safely encapsulated in pure functions in a way that is checked by the compiler. Execution of a pure function can perform destructive updates on data structures that are local to or eventually returned from the function without risking modification of the data structures passed to the function. This paper describes the sharing analysis which allows impurity to be encapsulated. Aspects of the analysis are similar to other published work, but in addition it handles explicit pointers and destructive update, higher order functions including closures and pre- and post-conditions concerning sharing for functions.
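
    The hazard that such a sharing analysis rules out can be illustrated outside Pawns. A minimal Python sketch (hypothetical function names, for illustration only) of a destructive update leaking through shared structure, versus one safely encapsulated on a fresh, unshared copy:

```python
def sorted_copy_buggy(xs):
    ys = xs          # ys *shares* structure with the caller's list
    ys.sort()        # destructive update leaks through the alias
    return ys

def sorted_copy_safe(xs):
    ys = list(xs)    # fresh, unshared structure: the update is encapsulated
    ys.sort()
    return ys

data = [3, 1, 2]
print(sorted_copy_buggy(data), data)  # [1, 2, 3] [1, 2, 3] -- caller's list mutated
data = [3, 1, 2]
print(sorted_copy_safe(data), data)   # [1, 2, 3] [3, 1, 2] -- caller's list untouched
```

    Pawns makes the first kind of function legal only if the sharing is declared, which is what the compile-time analysis checks.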

  20. Data compilation of single pion photoproduction below 2 GeV

    International Nuclear Information System (INIS)

    Inagaki, Y.; Nakamura, T.; Ukai, K.

    1976-01-01

    The compilation of data of single pion photoproduction experiment below 2 GeV is presented with the keywords which specify the experiment. These data are written on a magnetic tape. Data format and the indices for the keywords are given. Various programs of using this tape are also presented. The results of the compilation are divided into two types. The one is the reference card on which the information of the experiment is given. The other is the data card. These reference and data cards are written using all A-type format on an original tape. The copy tapes are available, which are written by various types on request. There are two kinds of the copy tapes. The one is same as the original tape, and the other is the one different in the data card. Namely, this card is written by F-type following the data type. One experiment on this tape is represented by 3 kinds of the cards. One reference card with A-type format, many data cards with F-type format and one identifying card. Various programs which are written by FORTRAN are ready for these original and copy tapes. (Kato, T.)

  1. Proving correctness of compilers using structured graphs

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2014-01-01

    it into a compiler implementation using a graph type along with a correctness proof. The implementation and correctness proof of a compiler using a tree type without explicit jumps is simple, but yields code duplication. Our method provides a convenient way of improving such a compiler without giving up the benefits...

  2. A 3-Month Jump-Landing Training Program: A Feasibility Study Using the RE-AIM Framework

    NARCIS (Netherlands)

    Aerts, I.; Cumps, E.; Verhagen, E.A.L.M.; Mathieu, N.; Van Schuerbeeck, S.; Meeusen, R.

    2013-01-01

    Context: Evaluating the translatability and feasibility of an intervention program has become as important as determining the effectiveness of the intervention. Objective: To evaluate the applicability of a 3-month jump-landing training program in basketball players, using the RE-AIM (reach,

  3. 12 CFR 411.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Semi-annual compilation. 411.600 Section 411.600 Banks and Banking EXPORT-IMPORT BANK OF THE UNITED STATES NEW RESTRICTIONS ON LOBBYING Agency Reports § 411.600 Semi-annual compilation. (a) The head of each agency shall collect and compile the...

  4. ALGOL compiler. Syntax and semantic analysis

    International Nuclear Information System (INIS)

    Tarbouriech, Robert

    1971-01-01

    In this research thesis, the author reports the development of an ALGOL compiler which performs the following main tasks: systematic scanning of the source programme to recognise its different components (identifiers, reserved words, constants, separators); analysis of the source programme structure to build up its statements and arithmetic expressions; processing of symbolic names (identifiers) to associate them with the values they represent; and memory allocation for data and programme. Several issues are thus addressed: the characteristics of the machine for which the compiler is developed; the exact definition of the language (grammar, identifier and constant formation); the syntax-processing programme that provides the compiler with the necessary elements (language vocabulary, precedence matrix); and a description of the first two phases of compilation: lexicographic analysis and syntax analysis. The last phase (machine-code generation) is not addressed
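
    The first phase described above, lexicographic analysis, can be sketched as follows. This is a hypothetical Python illustration, not the thesis's implementation, and the reserved-word list is a small subset of ALGOL 60:

```python
import re

# Token classes mirror the components listed in the abstract: reserved
# words, identifiers, constants, and separators.
RESERVED = {"begin", "end", "if", "then", "else", "for", "do", "integer", "real"}
TOKEN_RE = re.compile(
    r"\s*(?:(?P<const>\d+(?:\.\d+)?)"     # numeric constants
    r"|(?P<word>[A-Za-z][A-Za-z0-9]*)"    # identifiers / reserved words
    r"|(?P<sep>:=|[;+\-*/()<>,=:]))"      # separators; ':=' tried first
)

def tokenize(src):
    """Scan src left to right and classify each token."""
    tokens, pos = [], 0
    src = src.rstrip()
    while pos < len(src):
        m = TOKEN_RE.match(src, pos)
        if m is None:
            raise SyntaxError(f"unexpected character at {pos}: {src[pos]!r}")
        if m.group("const") is not None:
            tokens.append(("CONST", m.group("const")))
        elif m.group("word") is not None:
            w = m.group("word")
            tokens.append(("RESERVED" if w in RESERVED else "IDENT", w))
        else:
            tokens.append(("SEP", m.group("sep")))
        pos = m.end()
    return tokens

print(tokenize("begin x := x + 1 end"))
```

    The token stream produced here is what the second phase, syntax analysis, would consume to build statements and arithmetic expressions.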

  5. Compilation of a preliminary checklist for the differential diagnosis of neurogenic stuttering

    Directory of Open Access Journals (Sweden)

    Mariska Lundie

    2014-06-01

    Objectives: The aim of this study was to describe and highlight the characteristics of NS in order to compile a preliminary checklist for accurate diagnosis and intervention. Method: An explorative, applied mixed method, multiple case study research design was followed. Purposive sampling was used to select four participants. A comprehensive assessment battery was compiled for data collection. Results: The results revealed a distinct pattern of core stuttering behaviours in NS, although discrepancies existed regarding stuttering severity and frequency. It was also found that DS and NS can co-occur. The case history and the core stuttering pattern are important considerations during differential diagnosis, as these are the only consistent characteristics in people with NS. Conclusion: It is unlikely that all the symptoms of NS are present in an individual. The researchers scrutinised the findings of this study and the findings of previous literature to compile a potentially workable checklist.

  6. The Advanced Industrial Materials (AIM) program office of industrial technologies fiscal year 1995

    Energy Technology Data Exchange (ETDEWEB)

    Sorrell, C.A.

    1997-04-01

    In many ways, the Advanced Industrial Materials (AIM) Program underwent a major transformation in FY95 and these changes have continued to the present. When the Program was established in 1990 as the Advanced Industrial Concepts (AIC) Materials Program, the mission was to conduct applied research and development to bring materials and processing technologies from the knowledge derived from basic research to the maturity required for the end use sectors for commercialization. In 1995, the Office of Industrial Technologies (OIT) made radical changes in structure and procedures. All technology development was directed toward the seven 'Vision Industries' that use about 80% of industrial energy and generate about 90% of industrial wastes. These are: aluminium; chemical; forest products; glass; metal casting; refineries; and steel. OIT is working with these industries, through appropriate organizations, to develop Visions of the desired condition of each industry some 20 to 25 years in the future and then to prepare Road Maps and Implementation Plans to enable them to reach their goals. The mission of AIM has, therefore, changed to 'Support development and commercialization of new or improved materials to improve productivity, product quality, and energy efficiency in the major process industries.'

  7. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming, Volume 4 is a collection of papers that deals with the GIER ALGOL compiler, a parameterized compiler based on mechanical linguistics, and the JOVIAL language. A couple of papers describe a commercial use of stacks, an IBM system, and what an ideal computer program support system should be. One paper reviews the system of compilation, the development of a more advanced language, programming techniques, machine independence, and program transfer to other machines. Another paper describes the ALGOL 60 system for the GIER machine including running ALGOL pro

  8. The compiled catalogue of galaxies in machine-readable form and its statistical investigation

    International Nuclear Information System (INIS)

    Kogoshvili, N.G.

    1982-01-01

    The compilation of a machine-readable catalogue of relatively bright galaxies was undertaken in Abastumani Astrophysical Observatory in order to facilitate the statistical analysis of a large observational material on galaxies from the Palomar Sky Survey. In compiling the catalogue of galaxies the following problems were considered: the collection of existing information for each galaxy; a critical approach to data aimed at the selection of the most important features of the galaxies; the recording of data in computer-readable form; and the permanent updating of the catalogue. (Auth.)

  9. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1977-10-01

    This is the third issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation is published and distributed by the IAEA Nuclear Data Section approximately every six months. This compilation of compilations and evaluations is designed to keep the nuclear scientific community informed of the availability of compiled or evaluated NSD data, and contains references to laboratory reports, journal articles and books containing selected compilations and evaluations

  10. Feasibility of Python in teaching programming

    Directory of Open Access Journals (Sweden)

    Rafael Martínez Estévez

    2014-03-01

    Given the diversity of the objectives of the programming courses in the Cuban educational system and the training of teachers, it is not easy to decide the language to be used in each case. The intention of this article is to bring to debate, in our context, a trend that has been growing in the last decade: Python as a first programming language. The aim of this study is to compile some international experiences in the use of Python in introductory programming courses, also analyzing their advantages and disadvantages.

  11. Array abstractions for GPU programming

    DEFF Research Database (Denmark)

    Dybdal, Martin

    The shift towards massively parallel hardware platforms for high-performance computing tasks has introduced a need for improved programming models that facilitate ease of reasoning for both users and compiler optimization. A promising direction is the field of functional data-parallel programming......, for which functional invariants can be utilized by optimizing compilers to perform large program transformations automatically. However, the previous work in this area allows users only limited ability to reason about the performance of algorithms. For this reason, such languages have yet to see wide...... industrial adoption. We present two programming languages that attempt at both supporting industrial applications and providing reasoning tools for hierarchical data-parallel architectures, such as GPUs. First, we present TAIL, an array based intermediate language and compiler framework for compiling a large

  12. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1977-03-01

    This is the second issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation of compilations and evaluations is designed to keep the nuclear scientific community informed of the availability of compiled or evaluated NSD data, and contains references to laboratory reports, journal articles and books containing selected compilations and evaluations. It excludes references to ''mass-chain'' evaluations normally published in the ''Nuclear Data Sheets'' and ''Nuclear Physics''. The material contained in this compilation is sorted according to eight subject categories: general compilations; basic isotopic properties; nuclear structure properties; nuclear decay processes; half-lives, energies and spectra; nuclear decay processes: gamma-rays; nuclear decay processes: fission products; nuclear decay processes: (others); atomic processes

  13. Technology Commercialization Program 1991

    Energy Technology Data Exchange (ETDEWEB)

    1991-11-01

    This reference compilation describes the Technology Commercialization Program of the Department of Energy, Defense Programs. The compilation consists of two sections. Section 1, Plans and Procedures, describes the plans and procedures of the Defense Programs Technology Commercialization Program. The second section, Legislation and Policy, identifies legislation and policy related to the Program. The procedures for implementing statutory and regulatory requirements are evolving with time. This document will be periodically updated to reflect changes and new material.

  14. Compilation of anatomical, physiological and dietary characteristics for a Filipino Reference Man

    International Nuclear Information System (INIS)

    Natera, E.S.; Cuevas, C.D.; Azanon, E.M.; Palattao, M.B.; Espiritu, R.T.; Cobar, M.C.; Palad, L.H.; Torres, B.; Shiraishi, K.

    1998-01-01

    The Asian Reference Man is the study of the biological characteristics of the different ethnic populations in the Asian Region. Its aim is to update the existing International Reference Values, called the ICRP Reference Man, which are used for the calculation of radiation exposure. The Philippines is a participant in the study of the formulation of the Asian Reference Man and is represented by the Philippine Nuclear Research Institute. The biological parameters included in the study are the physical, anatomical, physiological and dietary characteristics representing the Filipino race and customs. The normal Filipino values were obtained from past nationwide and regional surveys, from medical records of private and government institutions and from random sampling of the population. Results of the study are presented in tabulations by gender and age group. Statistical analysis of the data is presented as the mean, standard deviation and median, using Microsoft Excel software and a Clipper-compiled program. (author)

  15. Compilation of anatomical, physiological and dietary characteristics for a Filipino reference man

    International Nuclear Information System (INIS)

    Natera, E.S.; Cuevas, G.D.; Azanon, E.M.; Palattao, M.B.; Espiritu, R.T.; Cobar, M.C.; Palad, L.H.; Torres, B.; Kawamura, H.; Shiraishi, K.

    1995-01-01

    The Asian reference man is a study of the biological characteristics of the different ethnic populations in the Asian region. Its aim is to update the existing international values, called the ICRP Reference Man, which are used for the calculation of radiation exposure. The Philippines is a participant in the study of the formulation of the Asian reference man and is represented by the Philippine Nuclear Research Institute. The biological parameters included in this study are the physical, anatomical, physiological and dietary characteristics representing the Filipino race and customs. The normal Filipino values were obtained from past nationwide and regional surveys, from medical records of private and government institutions and from random sampling of the population. Results of the study are presented in tabulations by gender and age group. Statistical analysis of the data is presented as the mean, standard deviation and median, using Microsoft Excel software and a Clipper-compiled program. (author). 18 refs., 12 tabs., 1 fig

  16. Methodology and procedures for compilation of historical earthquake data

    International Nuclear Information System (INIS)

    1987-10-01

    This report was prepared subsequent to the recommendations of the project initiation meeting in Vienna, November 25-29, 1985, under the IAEA Interregional project INT/9/066 Seismic Data for Nuclear Power Plant Siting. The aim of the project is to co-ordinate national efforts of Member States in the Mediterranean region in the compilation and processing of historical earthquake data in the siting of nuclear facilities. The main objective of the document is to assist the participating Member States, especially those who are initiating an NPP siting programme, in their effort to compile and process historical earthquake data and to provide a uniform interregional framework for this task. Although the document is directed mainly to the Mediterranean countries using illustrative examples from this region, the basic procedures and methods herein described may be applicable to other parts of the world such as Southeast Asia, Himalayan belt, Latin America, etc. 101 refs, 7 figs

  17. A 3-month jump-landing training program: a feasibility study using the RE-AIM framework.

    Science.gov (United States)

    Aerts, Inne; Cumps, Elke; Verhagen, Evert; Mathieu, Niels; Van Schuerbeeck, Sander; Meeusen, Romain

    2013-01-01

    Evaluating the translatability and feasibility of an intervention program has become as important as determining the effectiveness of the intervention. To evaluate the applicability of a 3-month jump-landing training program in basketball players, using the RE-AIM (reach, effectiveness, adoption, implementation, and maintenance) framework. Randomized controlled trial. National and regional basketball teams. Twenty-four teams of the second highest national division and regional basketball divisions in Flanders, Belgium, were randomly assigned (1:1) to a control group and intervention group. A total of 243 athletes (control group = 129, intervention group = 114), ages 15 to 41 years, volunteered. All exercises in the intervention program followed a progressive development, emphasizing lower extremity alignment during jump-landing activities. The results of the process evaluation of the intervention program were based on the 5 dimensions of the RE-AIM framework. The injury incidence density, hazard ratios, and 95% confidence intervals were determined. The participation rate of the total sample was 100% (reach). The hazard ratio was different between the intervention group and the control group (0.40 [95% confidence interval = 0.16, 0.99]; effectiveness). Of the 12 teams in the intervention group, 8 teams (66.7%) agreed to participate in the study (adoption). Eight of the participating coaches (66.7%) felt positively about the intervention program and stated that they had implemented the training sessions of the program as intended (implementation). All coaches except 1 (87.5%) intended to continue the intervention program the next season (maintenance). Compliance of the coaches in this coach-supervised jump-landing training program was high. In addition, the program was effective in preventing lower extremity injuries.

  18. A compiler for variational forms

    OpenAIRE

    Kirby, Robert C.; Logg, Anders

    2011-01-01

    As a key step towards a complete automation of the finite element method, we present a new algorithm for automatic and efficient evaluation of multilinear variational forms. The algorithm has been implemented in the form of a compiler, the FEniCS Form Compiler FFC. We present benchmark results for a series of standard variational forms, including the incompressible Navier-Stokes equations and linear elasticity. The speedup compared to the standard quadrature-based approach is impressive; in s...

  19. PIG 3 - A simple compiler for mercury

    Energy Technology Data Exchange (ETDEWEB)

    Bindon, D C [Computer Branch, Technical Assessments and Services Division, Atomic Energy Establishment, Winfrith, Dorchester, Dorset (United Kingdom)

    1961-06-15

    A short machine language compilation scheme is described which will read programmes from paper tape, punched cards, or magnetic tape. The compiler occupies pages 8-15 of the ferrite store during translation. (author)

  20. PIG 3 - A simple compiler for mercury

    International Nuclear Information System (INIS)

    Bindon, D.C.

    1961-06-01

    A short machine language compilation scheme is described which will read programmes from paper tape, punched cards, or magnetic tape. The compiler occupies pages 8-15 of the ferrite store during translation. (author)

  1. The Danish national return-to-work program - aims, content, and design of the process and effect evaluation

    NARCIS (Netherlands)

    Aust, Birgit; Helverskov, Trine; Nielsen, Maj Britt D.; Bjorner, Jakob Bue; Rugulies, Reiner; Nielsen, Karina; Sorensen, Ole H.; Grundtvig, Gry; Andersen, Malene F.; Hansen, Jorgen V.; Buchardt, Helle L.; Nielsen, Lisbeth; Lund, Trine L.; Andersen, Irene; Andersen, Mogens H.; Clausen, Aksel S.; Heinesen, Eskil; Mortensen, Ole S.; Ektor-Andersen, John; Orbaek, Palle; Winzor, Glen; Bultmann, Ute; Poulsen, Otto M.

    The Danish national return-to-work (RTW) program aims to improve the management of municipal sickness benefit in Denmark. A study is currently ongoing to evaluate the RTW program. The purpose of this article is to describe the study protocol. The program includes 21 municipalities encompassing

  2. Compilation of data on elementary particles

    International Nuclear Information System (INIS)

    Trippe, T.G.

    1984-09-01

    The most widely used data compilation in the field of elementary particle physics is the Review of Particle Properties. The origin, development and current state of this compilation are described with emphasis on the features which have contributed to its success: active involvement of particle physicists; critical evaluation and review of the data; completeness of coverage; regular distribution of reliable summaries including a pocket edition; heavy involvement of expert consultants; and international collaboration. The current state of the Review and new developments such as providing interactive access to the Review's database are described. Problems and solutions related to maintaining a strong and supportive relationship between compilation groups and the researchers who produce and use the data are discussed

  3. Research and Practice of the News Map Compilation Service

    Science.gov (United States)

    Zhao, T.; Liu, W.; Ma, W.

    2018-04-01

    Based on the needs of the news media on the map, this paper researches the news map compilation service, conducts demand research on the service of compiling news maps, designs and compiles a public-authority base map suitable for media publication, and constructs a news base map material library. It studies the compilation of domestic and international news maps with timeliness, strong pertinence and cross-regional characteristics, constructs the hot news thematic gallery and news map customization services, conducts research on types of news maps, establishes closer liaison and cooperation methods with news media, and guides news media to use correct maps. Through the practice of the news map compilation service, this paper lists two cases of news map preparation services used by different media, compares and analyses the cases, summarizes the research situation of the news map compilation service, and puts forward outstanding problems and development suggestions for the news map compilation service.

  4. RESEARCH AND PRACTICE OF THE NEWS MAP COMPILATION SERVICE

    Directory of Open Access Journals (Sweden)

    T. Zhao

    2018-04-01

    Based on the needs of the news media on the map, this paper researches the news map compilation service, conducts demand research on the service of compiling news maps, designs and compiles a public-authority base map suitable for media publication, and constructs a news base map material library. It studies the compilation of domestic and international news maps with timeliness, strong pertinence and cross-regional characteristics, constructs the hot news thematic gallery and news map customization services, conducts research on types of news maps, establishes closer liaison and cooperation methods with news media, and guides news media to use correct maps. Through the practice of the news map compilation service, this paper lists two cases of news map preparation services used by different media, compares and analyses the cases, summarizes the research situation of the news map compilation service, and puts forward outstanding problems and development suggestions for the news map compilation service.

  5. LISP software generative compilation within the frame of a SLIP system; La compilation generative de programmes LISP dans le cadre d'un systeme SLIP

    Energy Technology Data Exchange (ETDEWEB)

    Sitbon, Andre

    1968-04-24

    After outlining the limitations associated with the use of some programming languages (Fortran, Algol, assembler, and so on), and the advantages of the LISP structure and its associated language, the author notes that some problems remain regarding the memorisation of the computing process obtained by interpretation. He therefore introduces a generative compiler which produces an executable programme, written in a language very close to the machine language used, i.e. the FAP assembler language.

  6. Selecting informative food items for compiling food-frequency questionnaires: Comparison of procedures

    NARCIS (Netherlands)

    Molag, M.L.; Vries, J.H.M. de; Duif, N.; Ocké, M.C.; Dagnelie, P.C.; Goldbohm, R.A.; Veer, P. van 't

    2010-01-01

    The authors automated the selection of foods in a computer system that compiles and processes tailored FFQ. For the selection of food items, several methods are available. The aim of the present study was to compare food lists made by MOM2, which identifies food items with highest between-person

  7. Compiling Scientific Programs for Scalable Parallel Systems

    National Research Council Canada - National Science Library

    Kennedy, Ken

    2001-01-01

    ...). The research performed in this project included new techniques for recognizing implicit parallelism in sequential programs, a powerful and precise set-based framework for analysis and transformation...

  8. A compilation of energy costs of physical activities.

    Science.gov (United States)

    Vaz, Mario; Karaolis, Nadine; Draper, Alizon; Shetty, Prakash

    2005-10-01

    There were two objectives: first, to review the existing data on energy costs of specified activities in the light of the recommendations made by the Joint Food and Agriculture Organization/World Health Organization/United Nations University (FAO/WHO/UNU) Expert Consultation of 1985; second, to compile existing data on the energy costs of physical activities for an updated annexure of the current Expert Consultation on Energy and Protein Requirements. An electronic and manual search of the literature (predominantly English) was carried out to obtain published data on the energy costs of physical activities. The majority of the data prior to 1955 were obtained from an earlier compilation by Passmore and Durnin. Energy costs were expressed as the physical activity ratio (PAR): the energy cost of the activity divided by either the measured or predicted basal metabolic rate (BMR). The compilation provides PARs for an expanded range of activities that includes general personal activities, transport, domestic chores, occupational activities, sports and other recreational activities for men and women, separately, where available. The present compilation is largely in agreement with the 1985 compilation for activities that are common to both. The present compilation has been based on the need to provide data on adults for a wide spectrum of human activity. There are, however, lacunae in the available data for many activities, between genders, across age groups and in various physiological states.
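
    The PAR convention makes absolute energy costs easy to recover: expenditure is PAR times BMR times duration. A small worked example (all BMR and PAR values below are hypothetical, for illustration only):

```python
# PAR (physical activity ratio) expresses the energy cost of an activity
# as a multiple of basal metabolic rate (BMR), so absolute expenditure
# over an activity is PAR * BMR * duration.

def energy_cost(par, bmr_kj_per_min, duration_min):
    """Energy expended (kJ) over duration_min minutes of an activity."""
    return par * bmr_kj_per_min * duration_min

bmr = 4.5  # hypothetical BMR in kJ/min
activities = {"sitting": 1.2, "walking": 3.2, "cycling": 5.6}  # hypothetical PARs

for name, par in activities.items():
    print(f"{name}: {energy_cost(par, bmr, 30.0):.1f} kJ per 30 min")
```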

  9. Compilation and evaluation of atomic and molecular data relevant to controlled thermonuclear research needs: USA programs

    International Nuclear Information System (INIS)

    Barnett, C.F.

    1976-01-01

    The U.S. role in the compilation and evaluation of atomic data for controlled thermonuclear research is discussed in the following three areas: (1) atomic structure data, (2) atomic collision data, and (3) surface data

  10. Regulatory and technical reports compilation for 1980

    International Nuclear Information System (INIS)

    Oliu, W.E.; McKenzi, L.

    1981-04-01

    This compilation lists formal regulatory and technical reports and conference proceedings issued in 1980 by the US Nuclear Regulatory Commission. The compilation is divided into four major sections. The first major section consists of a sequential listing of all NRC reports in report-number order. The second major section of this compilation consists of a key-word index to report titles. The third major section contains an alphabetically arranged listing of contractor report numbers cross-referenced to their corresponding NRC report numbers. Finally, the fourth section is an errata supplement

  11. Compiling quantum circuits to realistic hardware architectures using temporal planners

    Science.gov (United States)

    Venturelli, Davide; Do, Minh; Rieffel, Eleanor; Frank, Jeremy

    2018-04-01

    To run quantum algorithms on emerging gate-model quantum hardware, quantum circuits must be compiled to take into account constraints on the hardware. For near-term hardware, with only limited means to mitigate decoherence, it is critical to minimize the duration of the circuit. We investigate the application of temporal planners to the problem of compiling quantum circuits to newly emerging quantum hardware. While our approach is general, we focus on compiling to superconducting hardware architectures with nearest-neighbor constraints. Our initial experiments focus on compiling Quantum Alternating Operator Ansatz (QAOA) circuits, whose high number of commuting gates allows great flexibility in the order in which the gates can be applied. That freedom makes it more challenging to find optimal compilations, but also means there is a greater potential win from more optimized compilation than for less flexible circuits. We map this quantum circuit compilation problem to a temporal planning problem and generate a test suite of compilation problems for QAOA circuits of various sizes on a realistic hardware architecture. We report compilation results from several state-of-the-art temporal planners on this test set. This early empirical evaluation demonstrates that temporal planning is a viable approach to quantum circuit compilation.

  12. Proceedings Fifth International Workshop on Verification and Program Transformation

    OpenAIRE

    Lisitsa, Alexei; Nemytykh, Andrei P.; Proietti, Maurizio

    2017-01-01

    We extend a technique called Compiling Control. The technique transforms coroutining logic programs into logic programs that, when executed under the standard left-to-right selection rule (and not using any delay features), have the same computational behavior as the coroutining program. In recent work, we revised Compiling Control and reformulated it as an instance of Abstract Conjunctive Partial Deduction. This work was mostly focused on the program analysis performed in Compiling Control. I...

  13. Compiling an OPEC Word List: A Corpus-Informed Lexical Analysis

    Directory of Open Access Journals (Sweden)

    Ebtisam Saleh Aluthman

    2017-01-01

    The present study is conducted within the borders of lexicographic research, where corpora have increasingly become all-pervasive. The overall goal of this study is to compile an open-source OPEC Word List (OWL) that is available for lexicographic research and for vocabulary learning related to English language learning for the purposes of oil marketing and the oil industry. To achieve this goal, an OPEC Monthly Reports Corpus (OMRC) comprising 1,004,542 words was compiled. The OMRC consists of 40 OPEC monthly reports released between 2003 and 2015. Consideration was given to both range and frequency criteria when compiling the OWL, which consists of 255 word types. Along with this basic goal, this study aims to investigate the coverage in the OMRC corpus of the most well-recognised word lists: the General Service List of English Words (GSL) (West, 1953) and the Academic Word List (AWL) (Coxhead, 2000). The 255 word types included in the OWL overlap with neither the AWL nor the GSL. Results suggest the necessity of this discipline-specific word list for ESL students in the oil marketing industry. The availability of the OWL makes significant pedagogical contributions to curriculum design, learning activities and the overall process of vocabulary learning in the context of teaching English for specific purposes (ESP). OPEC stands for the Organisation of Petroleum Exporting Countries.

  14. Regulatory and technical reports: compilation for 1975-1978

    International Nuclear Information System (INIS)

    1982-04-01

    This brief compilation lists formal reports issued by the US Nuclear Regulatory Commission in 1975 through 1978 that were not listed in the Regulatory and Technical Reports Compilation for 1975 to 1978, NUREG-0304, Vol. 3. This compilation is divided into two sections. The first consists of a sequential listing of all reports in report-number order. The second section consists of an index developed from keywords in report titles and abstracts

  15. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1978-10-01

    This is the fourth issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation is published and distributed by the IAEA Nuclear Data Section every year. The material contained in this compilation is sorted according to eight subject categories: General compilations; basic isotopic properties; nuclear structure properties; nuclear decay processes, half-lives, energies and spectra; nuclear decay processes, gamma-rays; nuclear decay processes, fission products; nuclear decay processes (others); atomic processes

  16. Advanced C and C++ compiling

    CERN Document Server

    Stevanovic, Milan

    2014-01-01

    Learning how to write C/C++ code is only the first step. To be a serious programmer, you need to understand the structure and purpose of the binary files produced by the compiler: object files, static libraries, shared libraries, and, of course, executables.Advanced C and C++ Compiling explains the build process in detail and shows how to integrate code from other developers in the form of deployed libraries as well as how to resolve issues and potential mismatches between your own and external code trees.With the proliferation of open source, understanding these issues is increasingly the res

  17. Depleted uranium hexafluoride management program : data compilation for the Paducah site

    International Nuclear Information System (INIS)

    Hartmann, H.

    2001-01-01

    This report is a compilation of data and analyses for the Paducah site, near Paducah, Kentucky. The data were collected and the analyses were done in support of the U.S. Department of Energy (DOE) 1999 Programmatic Environmental Impact Statement for Alternative Strategies for the Long-Term Management and Use of Depleted Uranium Hexafluoride (DOE/EIS-0269). The report describes the affected environment at the Paducah site and summarizes potential environmental impacts that could result from conducting the following depleted uranium hexafluoride (UF6) activities at the site: continued cylinder storage, preparation of cylinders for shipment, conversion, and long-term storage. DOE's preferred alternative is to begin converting the depleted UF6 inventory as soon as possible to either uranium oxide, uranium metal, or a combination of both, while allowing for use of as much of this inventory as possible

  18. Depleted uranium hexafluoride management program : data compilation for the Portsmouth site

    International Nuclear Information System (INIS)

    Hartmann, H. M.

    2001-01-01

    This report is a compilation of data and analyses for the Portsmouth site, near Portsmouth, Ohio. The data were collected and the analyses were done in support of the U.S. Department of Energy (DOE) 1999 Programmatic Environmental Impact Statement for Alternative Strategies for the Long-Term Management and Use of Depleted Uranium Hexafluoride (DOE/EIS-0269). The report describes the affected environment at the Portsmouth site and summarizes potential environmental impacts that could result from conducting the following depleted uranium hexafluoride (UF6) management activities at the site: continued cylinder storage, preparation of cylinders for shipment, conversion, and long-term storage. DOE's preferred alternative is to begin converting the depleted UF6 inventory as soon as possible to either uranium oxide, uranium metal, or a combination of both, while allowing for use of as much of this inventory as possible

  19. PPC - an interactive preprocessor/compiler for the DSNP simulation language

    International Nuclear Information System (INIS)

    Mahannah, J.A.; Schor, A.L.

    1986-01-01

    The PPC preprocessor/compiler was developed for the DSNP (Dynamic Simulator for Nuclear Power plants) simulation language. The goal of PPC is to provide an easy-to-use, interactive programming environment that will aid both the beginner and the well-seasoned DSNP programmer. PPC simplifies the steps of the simulation development process for any user. All will benefit from the on-line help facilities, easy manipulation of modules, the elimination of syntax errors, and the general systematic approach. PPC is a very structured and modular program that allows for easy expansion and modification. Written entirely in C, it is fast, compact, and portable. Used as a front end, it greatly enhances the desirability of DSNP as a simulation tool for education and research

  20. HAL/S - The programming language for Shuttle

    Science.gov (United States)

    Martin, F. H.

    1974-01-01

    HAL/S is a higher order language and system, now operational, adopted by NASA for programming Space Shuttle on-board software. Program reliability is enhanced through language clarity and readability, modularity through program structure, and protection of code and data. Salient features of HAL/S include output orientation, automatic checking (with strictly enforced compiler rules), the availability of linear algebra, real-time control, a statement-level simulator, and compiler transferability (for applying HAL/S to additional object and host computers). The compiler is described briefly.

  1. Maps compiled by the ESSO Minerals Company during their exploration program for uranium in South Africa

    International Nuclear Information System (INIS)

    Bertolini, A.; Pretorius, L.; Weideman, M.; Scheepers, T.

    1985-09-01

    The report is a bibliography of approximately one thousand maps. The maps contain information on the ESSO Minerals Company's prospecting activities, mainly for uranium, in South Africa. ESSO explored for uranium in the Karoo, the Northwestern Cape and the Bushveld. The bibliography contains two indexes: one is a list of prospects and projects per geological province, and the other is an alphabetical list of projects and prospects. Three geological provinces are distinguished, namely the Bushveld province, the Karoo province and the Namaqualand province. The annotations contain information on the location and geographic area of the map, the name of the project or prospect, the title, a statement of responsibility (this includes the compilers, i.e. geologists and/or draftsmen), the statement of scale, which is always expressed as a ratio, the date of compilation and/or revision, and a few keywords to indicate the topical subject matter

  2. Compilation of Sandia Laboratories technical capabilities

    International Nuclear Information System (INIS)

    Lundergan, C.D.; Mead, P.L.

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078)

  3. Guide to Good Practice in using Open Source Compilers with the AGCC Lexical Analyzer

    Directory of Open Access Journals (Sweden)

    2009-01-01

    Quality software always demands a compromise between users' needs and hardware resources. To be faster means expensive devices: powerful processors and virtually unlimited amounts of RAM. Alternatively, the code can be reengineered to adapt the software to the client's hardware architecture. This is the purpose of optimizing code: to get the utmost software performance from a program under given conditions. There are tools for designing and writing the code, but the ultimate tool for optimizing remains the modest compiler, this often neglected software jewel, the result of hundreds of working hours by the best specialists in the world. Even so, only two compilers fulfil the needs of professional developers: a proprietary solution from a giant of the IT industry, and the open-source GNU compiler, for which we developed the AGCC lexical analyzer that helps produce even more efficient software applications. It relies on the most popular hacks and tricks used by professionals and discovered by the author, who is proud to present them below.

  4. Evaluation of the FIR Example using Xilinx Vivado High-Level Synthesis Compiler

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Zheming [Argonne National Lab. (ANL), Argonne, IL (United States); Finkel, Hal [Argonne National Lab. (ANL), Argonne, IL (United States); Yoshii, Kazutomo [Argonne National Lab. (ANL), Argonne, IL (United States); Cappello, Franck [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-07-28

    Compared to central processing units (CPUs) and graphics processing units (GPUs), field programmable gate arrays (FPGAs) have major advantages in reconfigurability and performance achieved per watt. The FPGA development flow has been augmented with a high-level synthesis (HLS) flow that can convert programs written in a high-level programming language to a Hardware Description Language (HDL). Using high-level programming languages such as C, C++, and OpenCL for FPGA-based development could allow software developers, who have little FPGA knowledge, to take advantage of FPGA-based application acceleration. This improves developer productivity and makes FPGA-based acceleration accessible to hardware and software developers. The Xilinx Vivado HLS compiler is a high-level synthesis tool that enables C, C++ and SystemC specifications to be targeted directly to Xilinx FPGAs without the need to create RTL manually. The white paper [1] published recently by Xilinx uses a finite impulse response (FIR) example to demonstrate the variable-precision features in the Vivado HLS compiler and the resource and power benefits of converting a design from floating point to fixed point. To get a better understanding of the variable-precision features in terms of resource usage and performance, this report presents the experimental results of evaluating the FIR example using Vivado HLS 2017.1 and a Kintex UltraScale FPGA. In addition, we evaluated the half-precision floating-point data type against the double-precision and single-precision data types and present the detailed results.
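The float-to-fixed trade-off that the FIR example demonstrates can be sketched in plain Python. This is not Vivado HLS code and the coefficients are made up; it only shows why fixed point is attractive (the inner loop becomes pure integer multiply-accumulates, which map to cheap FPGA DSP resources) and that the cost is a bounded quantization error:

```python
# Float-vs-fixed-point trade-off for a direct-form FIR filter.
# Hypothetical coefficients and input; illustrative only.

def fir_float(samples, coeffs):
    """Direct-form FIR in double precision."""
    out, hist = [], [0.0] * len(coeffs)
    for x in samples:
        hist = [x] + hist[:-1]
        out.append(sum(c * h for c, h in zip(coeffs, hist)))
    return out

def to_fixed(x, frac_bits=8):
    """Quantize to a signed integer with frac_bits fractional bits."""
    return round(x * (1 << frac_bits))

def fir_fixed(samples, coeffs, frac_bits=8):
    """Same FIR with quantized coefficients and samples.

    Each tap is an integer multiply-accumulate; a product of two
    values with frac_bits fractional bits carries 2*frac_bits, so the
    accumulated sum is rescaled once per output.
    """
    qc = [to_fixed(c, frac_bits) for c in coeffs]
    out, hist = [], [0] * len(qc)
    for x in samples:
        hist = [to_fixed(x, frac_bits)] + hist[:-1]
        acc = sum(c * h for c, h in zip(qc, hist))     # integer MACs only
        out.append(acc / float(1 << (2 * frac_bits)))  # back to real units
    return out

coeffs = [0.1, 0.25, 0.3, 0.25, 0.1]
samples = [0.0, 1.0, 0.5, -0.25, 0.0, 0.0]
err = max(abs(a - b) for a, b in
          zip(fir_float(samples, coeffs), fir_fixed(samples, coeffs)))
print(err < 0.02)  # quantization error bounded by the chosen precision
```

Widening or narrowing `frac_bits` trades accuracy against the bit-width (and hence resource cost) of the integer datapath, which is the knob the variable-precision HLS types expose.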

  5. National Energy Software Center: compilation of program abstracts

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.M.; Butler, M.K.; De Bruler, M.M.

    1979-05-01

    This is the third complete revision of program abstracts undertaken by the Center. Programs of the IBM 7040, 7090, and CDC 3600 vintage have been removed. Historical data and information on abstract format, program package contents, and subject classification are given. The following subject areas are included in the library: cross section and resonance integral calculations; spectrum calculations, generation of group constants, lattice and cell problems; static design studies; depletion, fuel management, cost analysis, and power plant economics; space-independent kinetics; space--time kinetics, coupled neutronics--hydrodynamics--thermodynamics and excursion simulations; radiological safety, hazard and accident analysis; heat transfer and fluid flow; deformation and stress distribution computations, structural analysis and engineering design studies; gamma heating and shield design; reactor systems analysis; data preparation; data management; subsidiary calculations; experimental data processing; general mathematical and computing system routines; materials; environmental and earth sciences; electronics, engineering equipment, and energy systems studies; chemistry; particle accelerators and high-voltage machines; physics; magnetic fusion research; data. (RWR)

  6. National Energy Software Center: compilation of program abstracts

    International Nuclear Information System (INIS)

    Brown, J.M.; Butler, M.K.; De Bruler, M.M.

    1979-05-01

    This is the third complete revision of program abstracts undertaken by the Center. Programs of the IBM 7040, 7090, and CDC 3600 vintage have been removed. Historical data and information on abstract format, program package contents, and subject classification are given. The following subject areas are included in the library: cross section and resonance integral calculations; spectrum calculations, generation of group constants, lattice and cell problems; static design studies; depletion, fuel management, cost analysis, and power plant economics; space-independent kinetics; space--time kinetics, coupled neutronics--hydrodynamics--thermodynamics and excursion simulations; radiological safety, hazard and accident analysis; heat transfer and fluid flow; deformation and stress distribution computations, structural analysis and engineering design studies; gamma heating and shield design; reactor systems analysis; data preparation; data management; subsidiary calculations; experimental data processing; general mathematical and computing system routines; materials; environmental and earth sciences; electronics, engineering equipment, and energy systems studies; chemistry; particle accelerators and high-voltage machines; physics; magnetic fusion research; data

  7. Compiling the functional data-parallel language SaC for Microgrids of Self-Adaptive Virtual Processors

    NARCIS (Netherlands)

    Grelck, C.; Herhut, S.; Jesshope, C.; Joslin, C.; Lankamp, M.; Scholz, S.-B.; Shafarenko, A.

    2009-01-01

    We present preliminary results from compiling the high-level, functional and data-parallel programming language SaC into a novel multi-core design: Microgrids of Self-Adaptive Virtual Processors (SVPs). The side-effect free nature of SaC in conjunction with its data-parallel foundation make it an

  8. Compilation of Sandia Laboratories technical capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Lundergan, C. D.; Mead, P. L. [eds.

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078). (RWR)

  9. Extension of Alvis compiler front-end

    Energy Technology Data Exchange (ETDEWEB)

    Wypych, Michał; Szpyrka, Marcin; Matyasik, Piotr, E-mail: mwypych@agh.edu.pl, E-mail: mszpyrka@agh.edu.pl, E-mail: ptm@agh.edu.pl [AGH University of Science and Technology, Department of Applied Computer Science, Al. Mickiewicza 30, 30-059 Krakow (Poland)

    2015-12-31

    Alvis is a formal modelling language that enables verification of distributed concurrent systems. The semantics of an Alvis model finds expression in an LTS graph (labelled transition system): the execution of any language statement is expressed as a transition between formally defined states of such a model. An LTS graph is generated using a middle-stage Haskell representation of an Alvis model. Moreover, Haskell is used as a part of the Alvis language to define parameters' types and the operations on them. Thanks to the compiler's modular construction, many aspects of the compilation of an Alvis model may be modified. Providing new plugins for the Alvis compiler that support languages like Java or C makes it possible to use these languages as a part of Alvis instead of Haskell. The paper presents the compiler's internal model and describes how the default specification language can be altered by new plugins.

  10. A Symmetric Approach to Compilation and Decompilation

    DEFF Research Database (Denmark)

    Ager, Mads Sig; Danvy, Olivier; Goldberg, Mayer

    2002-01-01

    Just as an interpreter for a source language can be turned into a compiler from the source language to a target language, we observe that an interpreter for a target language can be turned into a compiler from the target language to a source language. In both cases, the key issue is the choice of...
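The interpreter/compiler duality the abstract alludes to can be made concrete in a few lines: the same recursive traversal that *evaluates* an expression tree can instead *emit* target code for it. A toy sketch for a hypothetical mini-language of arithmetic expression tuples:

```python
# Toy illustration: one traversal evaluates, the other emits code.
# The ('add'|'mul', left, right) mini-language is hypothetical.

def interpret(expr):
    """Evaluate an expression tree directly; ints are literals."""
    if isinstance(expr, int):
        return expr
    op, left, right = expr
    l, r = interpret(left), interpret(right)
    return l + r if op == 'add' else l * r

def compile_to_python(expr):
    """Same traversal shape, but produce Python source (the 'target')."""
    if isinstance(expr, int):
        return str(expr)
    op, left, right = expr
    sym = '+' if op == 'add' else '*'
    return f"({compile_to_python(left)} {sym} {compile_to_python(right)})"

e = ('add', 2, ('mul', 3, 4))
assert interpret(e) == eval(compile_to_python(e)) == 14
```

Reading `compile_to_python` with source and target roles swapped hints at the symmetry the paper develops: a traversal of target-language terms can likewise emit source-language text, i.e. a decompiler.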

  11. Compilation of current high energy physics experiments

    International Nuclear Information System (INIS)

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978, and had not completed taking of data by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche

  12. Workflow with pitfalls to derive a regional airborne magnetic compilation

    Science.gov (United States)

    Brönner, Marco; Baykiev, Eldar; Ebbing, Jörg

    2017-04-01

    Today, large-scale magnetic maps are usually a patchwork of different airborne surveys of different size, different resolution and different vintage. Airborne magnetic acquisition is a fast and economic method of mapping and gaining geological and tectonic information for large areas, onshore and offshore. Depending on the aim of a survey, acquisition parameters like altitude and profile spacing are usually adjusted to match the purpose of the investigation. The subsequent data processing commonly follows a standardized workflow comprising core-field subtraction and line levelling to yield a coherent crustal-field magnetic grid for a survey area. The resulting data make it possible to correlate with geological and tectonic features in the subsurface, which is of importance for, e.g., oil and mineral exploration. Crustal-scale magnetic interpretation and modelling demand regional compilations of magnetic data and the merging of adjacent magnetic surveys. These studies not only focus on shallower sources, reflected by short- to intermediate-wavelength magnetic anomalies, but also have a particular interest in the long wavelengths deriving from deep-seated sources. However, whilst the workflow to produce such a merger is supported by quite a few powerful routines, the resulting compilation contains several pitfalls and limitations, which have been discussed before but are still very little recognized. The maximum wavelength that can be resolved in each individual survey is directly related to the survey size, and consequently a merger will contribute erroneous long-wavelength components to the magnetic data compilation. To minimize this problem and to homogenize the longer wavelengths, a first-order approach is the combination of airborne and satellite magnetic data, which is sufficient only under particular preconditions. A more advanced approach considers the gap in frequencies between airborne and satellite data, which motivated

  13. Compiler design handbook optimizations and machine code generation

    CERN Document Server

    Srikant, YN

    2003-01-01

    The widespread use of object-oriented languages and Internet security concerns are just the beginning. Add embedded systems, multiple memory banks, highly pipelined units operating in parallel, and a host of other advances, and it becomes clear that current and future computer architectures pose immense challenges to compiler designers, challenges that already exceed the capabilities of traditional compilation techniques. The Compiler Design Handbook: Optimizations and Machine Code Generation is designed to help you meet those challenges. Written by top researchers and designers from around the

  14. LISP software generative compilation within the frame of a SLIP system

    International Nuclear Information System (INIS)

    Sitbon, Andre

    1968-01-01

    After having outlined the limitations associated with the use of some programming languages (Fortran, Algol, assembler, and so on), and the interest of the use of the LISP structure and its associated language, the author notices that some problems remain regarding the memorisation of the computing process obtained by interpretation. Thus, he introduces a generative compiler which produces an executable programme, and which is written in a language very close to the used machine language, i.e. the FAP assembler language

  15. Writing Compilers and Interpreters A Software Engineering Approach

    CERN Document Server

    Mak, Ronald

    2011-01-01

    Long-awaited revision to a unique guide that covers both compilers and interpreters. Revised, updated, and now focusing on Java instead of C++, this long-awaited, latest edition of this popular book teaches programmers and software engineering students how to write compilers and interpreters using Java. You'll write compilers and interpreters as case studies, generating general assembly code for a Java Virtual Machine that takes advantage of the Java Collections Framework to shorten and simplify the code. In addition, coverage includes Java Collections Framework, UML modeling, object-oriented p

  16. Promising Compilation to ARMv8 POP

    OpenAIRE

    Podkopaev, Anton; Lahav, Ori; Vafeiadis, Viktor

    2017-01-01

    We prove the correctness of compilation of relaxed memory accesses and release-acquire fences from the "promising" semantics of [Kang et al. POPL'17] to the ARMv8 POP machine of [Flur et al. POPL'16]. The proof is highly non-trivial because both the ARMv8 POP and the promising semantics provide some extremely weak consistency guarantees for normal memory accesses; however, they do so in rather different ways. Our proof of compilation correctness to ARMv8 POP strengthens the results of the Kan...

  17. Functional Programming in Computer Science

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Loren James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Davis, Marion Kei [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-01-19

    We explore functional programming through a 16-week internship at Los Alamos National Laboratory. Functional programming is a branch of computer science that has exploded in popularity over the past decade due to its high-level syntax, ease of parallelization, and abundant applications. First, we summarize functional programming by listing the advantages of functional programming languages over the usual imperative languages, and we introduce the concept of parsing. Second, we discuss the importance of lambda calculus in the theory of functional programming. Lambda calculus was invented by Alonzo Church in the 1930s to formalize the concept of effective computability, and every functional language is essentially some implementation of lambda calculus. Finally, we display the lasting products of the internship: additions to a compiler and runtime system for the pure functional language STG, including both a set of tests that indicate the validity of updates to the compiler and a compiler pass that checks for illegal instances of duplicate names.
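The claim that "every functional language is essentially some implementation of lambda calculus" can be illustrated with Church numerals, which encode the number n as "apply a function n times" using nothing but single-argument functions. A small sketch, here written in Python rather than STG:

```python
# Church numerals: arithmetic built from single-argument functions only,
# as in the pure lambda calculus.

zero = lambda f: lambda x: x                              # apply f 0 times
succ = lambda n: lambda f: lambda x: f(n(f)(x))           # one more application
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by counting applications of +1."""
    return n(lambda k: k + 1)(0)

two   = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # -> 5
```

Compilers for lazy functional languages such as STG work on a core calculus in essentially this spirit, with data and control both expressed as function application.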

  18. AICPA allows low-cost options for compiled financial statements.

    Science.gov (United States)

    Reinstein, Alan; Luecke, Randall W

    2002-02-01

    The AICPA Accounting and Review Services Committee's (ARSC) SSARS No. 8, Amendment to Statement on Standards for Accounting and Review Services No. 1, Compilation and Review of Financial Statements, issued in October 2000, allows financial managers to provide plain-paper, compiled financial statements for the exclusive use of management. Such financial statements were disallowed in 1979 when the AICPA issued SSARS No. 1, Compilation and Review of Financial Statements. With the issuance of SSARS No. 8, financial managers can prepare plain-paper, compiled financial statements when third parties are not expected to rely on the financial statements, management acknowledges such restrictions in writing, and management acknowledges its primary responsibility for the adequacy of the financial statements.

  19. Electronic circuits for communications systems: A compilation

    Science.gov (United States)

    1972-01-01

    The compilation of electronic circuits for communications systems is divided into thirteen basic categories, each representing an area of circuit design and application. The compilation items are moderately complex and, as such, would appeal to the applications engineer. However, the rationale for the selection criteria was tailored so that the circuits would reflect fundamental design principles and applications, with an additional requirement for simplicity whenever possible.

  20. Type Soundness in the Dart Programming Language

    DEFF Research Database (Denmark)

    Strocco, Fabio

    Many mainstream programming languages are dynamically typed. This allows for rapid software development and programming flexibility because it gives programmers the freedom to use powerful programming patterns that are not allowed in statically typed programming languages. Nevertheless..., this freedom does not come without drawbacks: static bug detection, IDE support, and compiler optimization techniques are harder to implement. In the last decades, the research literature and mainstream programming languages have been aiming to reach a trade-off between statically typed and dynamically typed... languages. We investigate this trade-off, focusing on the area of optional typing, which allows programmers to choose when to use static type checking in parts of programs. Our primary focus is Dart, an optionally typed programming language with a type system that is unsound by design. What makes Dart

  1. Automatic Loop Parallelization via Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    For many parallel applications, performance relies not on instruction-level parallelism, but on loop-level parallelism. Unfortunately, many modern applications are written in ways that obstruct automatic loop parallelization. Since we cannot identify sufficient parallelization opportunities...... for these codes in a static, off-line compiler, we developed an interactive compilation feedback system that guides the programmer in iteratively modifying application source, thereby improving the compiler’s ability to generate loop-parallel code. We use this compilation system to modify two sequential...... benchmarks, finding that the code parallelized in this way runs up to 8.3 times faster on an octo-core Intel Xeon 5570 system and up to 12.5 times faster on a quad-core IBM POWER6 system. Benchmark performance varies significantly between the systems. This suggests that semi-automatic parallelization should...
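The kind of source modification such a feedback system suggests can be sketched in miniature (Python here for brevity; the paper's benchmarks are C codes, so this is only an analogy): a loop that updates a shared accumulator carries a dependence across iterations, while an independent map followed by a reduction exposes loop-level parallelism.

```python
from concurrent.futures import ThreadPoolExecutor

def work(x):
    return x * x + 1

def sequential_sum(xs):
    total = 0
    for x in xs:
        total += work(x)              # loop-carried dependence on `total`
    return total

def parallel_sum(xs):
    # Independent iterations, then a reduction: parallelizable form.
    with ThreadPoolExecutor(max_workers=4) as pool:
        return sum(pool.map(work, xs))

xs = list(range(100))
print(sequential_sum(xs) == parallel_sum(xs))  # True
```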

  2. Basic circuit compilation techniques for an ion-trap quantum machine

    International Nuclear Information System (INIS)

    Maslov, Dmitri

    2017-01-01

    We study the problem of compilation of quantum algorithms into optimized physical-level circuits executable in a quantum information processing (QIP) experiment based on trapped atomic ions. We report a complete strategy: starting with an algorithm in the form of a quantum computer program, we compile it into a high-level logical circuit that goes through multiple stages of decomposition into progressively lower-level circuits until we reach the physical execution-level specification. We skip the fault-tolerance layer, as it is not within the scope of this work. The different stages are structured so as to best assist with the overall optimization while taking into account numerous optimization criteria, including minimizing the number of expensive two-qubit gates, minimizing the number of less expensive single-qubit gates, optimizing the runtime, minimizing the overall circuit error, and optimizing classical control sequences. Our approach allows a trade-off between circuit runtime and quantum error, as well as to accommodate future changes in the optimization criteria that may likely arise as a result of the anticipated improvements in the physical-level control of the experiment. (paper)
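One of the optimization criteria named above, minimizing the number of expensive two-qubit gates, can be illustrated with a toy peephole pass (a simplified sketch, not the paper's actual pipeline): adjacent occurrences of a self-inverse gate, such as back-to-back CNOTs on the same qubits, cancel to the identity.

```python
# Gates are tuples like ("H", 0) or ("CNOT", 0, 1); the listed gates
# are self-inverse, so G followed by G is the identity.
def cancel_pairs(circuit):
    out = []
    for gate in circuit:
        if out and out[-1] == gate and gate[0] in ("CNOT", "H", "X"):
            out.pop()          # G . G = identity: drop both gates
        else:
            out.append(gate)
    return out

circ = [("H", 0), ("CNOT", 0, 1), ("CNOT", 0, 1), ("X", 1)]
print(cancel_pairs(circ))  # [('H', 0), ('X', 1)]
```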

  3. 1989 OCRWM [Office of Civilian Radioactive Waste Management] Bulletin compilation and index

    International Nuclear Information System (INIS)

    1990-02-01

    The OCRWM Bulletin is published by the Department of Energy, Office of Civilian Radioactive Waste Management to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1989 calendar year. A table of contents and one index have been provided to assist in finding information contained in this year's Bulletins. The pages have been numbered consecutively at the bottom for easy reference. 7 figs

  4. Compiler generation and autotuning of communication-avoiding operators for geometric multigrid

    Energy Technology Data Exchange (ETDEWEB)

    Basu, Protonu [Univ. of Utah, Salt Lake City, UT (United States); Venkat, Anand [Univ. of Utah, Salt Lake City, UT (United States); Hall, Mary [Univ. of Utah, Salt Lake City, UT (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Van Straalen, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2014-04-17

This paper describes a compiler approach to introducing communication-avoiding optimizations in geometric multigrid (GMG), one of the most popular methods for solving partial differential equations. Communication-avoiding optimizations reduce vertical communication through the memory hierarchy and horizontal communication across processes or threads, usually at the expense of introducing redundant computation. We focus on applying these optimizations to the smooth operator, which successively reduces the error and accounts for the largest fraction of the GMG execution time. Our compiler technology applies both novel and known transformations to derive an implementation comparable to manually-tuned code. To make the approach portable, an underlying autotuning system explores the tradeoff between reduced communication and increased computation, as well as tradeoffs in threading schemes, to automatically identify the best implementation for a particular architecture and at each computation phase. Results show that we are able to quadruple the performance of the smooth operation on the finest grids while attaining performance within 94% of manually-tuned code. Overall, we improve the multigrid solve time by 2.5× without sacrificing programmer productivity.

  5. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    Weston, L.W.; Larson, D.C.

    1993-02-01

This compilation represents the current needs for nuclear data measurements and evaluations as expressed by interested fission and fusion reactor designers, medical users of nuclear data, nuclear data evaluators, CSEWG members and other interested parties. The requests and justifications are reviewed by the Data Request and Status Subcommittee of CSEWG as well as most of the general CSEWG membership. The basic format and computer programs for the Request List were produced by the National Nuclear Data Center (NNDC) at Brookhaven National Laboratory. The NNDC produced the Request List for many years. The Request List is compiled from a computerized data file. Each request has a unique isotope, reaction type, requestor and identifying number. The first two digits of the identifying number are the year in which the request was initiated. Every effort has been made to restrict the notations to those used in common nuclear physics textbooks. Most requests are for individual isotopes, as are most ENDF evaluations; however, there are some requests for elemental measurements. Each request gives a priority rating, which is discussed in Section 2; the neutron energy range for which the request is made; the accuracy requested in terms of one standard deviation; and the requested energy resolution in terms of one standard deviation. Also given are the requestor and the comments furnished with the request. The addresses and telephone numbers of the requestors are given in Appendix 1. ENDF evaluators who may be contacted concerning evaluations are given in Appendix 2. Experimentalists contemplating making one of the requested measurements are encouraged to contact both the requestor and the evaluator, who may provide valuable information. This is a working document in that it will change with time. New requests or comments may be submitted to the editors or a regular CSEWG member at any time.

  6. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    1983-01-01

The purpose of this compilation is to summarize the current needs of US Nuclear Energy programs and other applied technologies for nuclear data. It is the result of a biennial review in which the Department of Energy (DOE) and contractors, Department of Defense Laboratories and contractors, and other interested groups have been asked to review and revise their requests for nuclear data. It was felt that the evaluators of cross section data and the users of these evaluations should be involved in the review of the data requests to make this compilation more useful. This request list is ordered by target nucleus (Isotope) and then reaction type (Quantity). Each request is assigned a unique identifying number. The first two digits of this number give the year the request was initiated. All requests for a given Isotope and Quantity are grouped (or blocked) together. The requests in a block are followed by any status comments. Each request has a unique Isotope, Quantity and Requester. The requester is identified by laboratory, last name, and sponsoring US government agency, e.g., BET, DEI, DNR. All requesters, together with their addresses and phone numbers, are given in appendix B. A list of the evaluators responsible for ENDF/B-V evaluations, with their affiliations, appears in appendix C. All requests must give the energy (or range of energy) for the incident particle when appropriate. The accuracy needed in percent is also given. The error quoted is assumed to be 1-sigma at each measured point in the energy range requested unless a comment specifies otherwise. Sometimes a range of accuracy, indicated by two values, is given, or a statement appears in the free-text comments. An incident particle energy resolution in percent is sometimes given.

  7. Preventing Run-Time Bugs at Compile-Time Using Advanced C++

    Energy Technology Data Exchange (ETDEWEB)

    Neswold, Richard [Fermilab

    2018-01-01

    When writing software, we develop algorithms that tell the computer what to do at run-time. Our solutions are easier to understand and debug when they are properly modeled using class hierarchies, enumerations, and a well-factored API. Unfortunately, even with these design tools, we end up having to debug our programs at run-time. Worse still, debugging an embedded system changes its dynamics, making it tough to find and fix concurrency issues. This paper describes techniques using C++ to detect run-time bugs *at compile time*. A concurrency library, developed at Fermilab, is used for examples in illustrating these techniques.

  8. 49 CFR 801.57 - Records compiled for law enforcement purposes.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 7 2010-10-01 2010-10-01 false Records compiled for law enforcement purposes. 801... compiled for law enforcement purposes. Pursuant to 5 U.S.C. 552(b)(7), any records compiled for law or..., would disclose investigative procedures and practices, or would endanger the life or security of law...

  9. HOPE: A Python just-in-time compiler for astrophysical computations

    Science.gov (United States)

    Akeret, J.; Gamper, L.; Amara, A.; Refregier, A.

    2015-04-01

    The Python programming language is becoming increasingly popular for scientific applications due to its simplicity, versatility, and the broad range of its libraries. A drawback of this dynamic language, however, is its low runtime performance which limits its applicability for large simulations and for the analysis of large data sets, as is common in astrophysics and cosmology. While various frameworks have been developed to address this limitation, most focus on covering the complete language set, and either force the user to alter the code or are not able to reach the full speed of an optimised native compiled language. In order to combine the ease of Python and the speed of C++, we developed HOPE, a specialised Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimisation on mathematical expressions at runtime. To enable the JIT compilation, the user only needs to add a decorator to the function definition. We assess the performance of HOPE by performing a series of benchmarks and compare its execution speed with that of plain Python, C++ and the other existing frameworks. We find that HOPE improves the performance compared to plain Python by a factor of 2 to 120, achieves speeds comparable to that of C++, and often exceeds the speed of the existing solutions. We discuss the differences between HOPE and the other frameworks, as well as future extensions of its capabilities. The fully documented HOPE package is available at http://hope.phys.ethz.ch and is published under the GPLv3 license on PyPI and GitHub.
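The decorator-driven workflow described above can be sketched with a toy stand-in (an illustration of the idea only, not HOPE's implementation): on the first call with a given argument-type signature, a real JIT would translate the function body to C++ and load the compiled extension; the hypothetical `jit` below merely caches per signature.

```python
import functools

def jit(fn):
    """Toy stand-in for a JIT decorator: specialize-and-cache per
    argument-type signature. A real JIT compiles here instead."""
    cache = {}
    @functools.wraps(fn)
    def wrapper(*args):
        key = tuple(type(a) for a in args)
        if key not in cache:
            cache[key] = fn   # stand-in for "compile and load"
        return cache[key](*args)
    return wrapper

@jit
def poly(x):
    return 2.0 * x * x + 3.0 * x + 1.0

print(poly(2.0))  # 15.0
```

The user-facing contract is the point: numerical code stays ordinary Python, and opting into compilation is a one-line annotation.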

  10. An Efficient Compiler for Weighted Rewrite Rules

    OpenAIRE

    Mohri, Mehryar; Sproat, Richard

    1996-01-01

    Context-dependent rewrite rules are used in many areas of natural language and speech processing. Work in computational phonology has demonstrated that, given certain conditions, such rewrite rules can be represented as finite-state transducers (FSTs). We describe a new algorithm for compiling rewrite rules into FSTs. We show the algorithm to be simpler and more efficient than existing algorithms. Further, many of our applications demand the ability to compile weighted rules into weighted FST...

  11. Programming languages for circuit design.

    Science.gov (United States)

    Pedersen, Michael; Yordanov, Boyan

    2015-01-01

This chapter provides an overview of a programming language for Genetic Engineering of Cells (GEC). A GEC program specifies a genetic circuit at a high level of abstraction through constraints on otherwise unspecified DNA parts. The GEC compiler then selects parts which satisfy the constraints from a given parts database. GEC further provides more conventional programming language constructs for abstraction, e.g., through modularity. The GEC language and compiler are available through a Web tool which also provides functionality, e.g., for simulation of designed circuits.

  12. HOPE: Just-in-time Python compiler for astrophysical computations

    Science.gov (United States)

    Akeret, Joel; Gamper, Lukas; Amara, Adam; Refregier, Alexandre

    2014-11-01

    HOPE is a specialized Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimization on mathematical expressions at runtime. To enable the JIT compilation, the user only needs to add a decorator to the function definition. By using HOPE, the user benefits from being able to write common numerical code in Python while getting the performance of compiled implementation.

  13. Compiling the First Monolingual Lusoga Dictionary | Nabirye | Lexikos

    African Journals Online (AJOL)

    Another theory, the theory of modularity, was used to bridge the gap between the theory of meaning and the compilation process. The modular ... This article, then, gives an overall summary of all the steps involved in the compilation of the Eiwanika ly'Olusoga, i.e. the Monolingual Lusoga Dictionary. Keywords: lexicography ...

  14. Inference of Program Properties with Attribute Grammars, Revisited

    NARCIS (Netherlands)

    Middelkoop, A.

    2012-01-01

A programming language is an essential ingredient for writing concise, maintainable, and error-free computer programs. A compiler takes a text written in such a language and compiles it into machine instructions, and is usually implemented as a number of traversals over the abstract syntax of the

  15. Irradiation of red meat. A compilation of technical data for its authorization and control

    International Nuclear Information System (INIS)

    1996-08-01

The aim of this monograph is to provide the rationale and justification for treating red meats with ionizing radiation for improving microbiological safety, parasite control and extending non-frozen shelf-life. It is intended to complement a previous publication "Irradiation of Poultry Meat and its Products - A Compilation of Technical Data for its Authorization and Control". 146 refs

  16. Irradiation of red meat. A compilation of technical data for its authorization and control

    Energy Technology Data Exchange (ETDEWEB)

    International consultative group on food irradiation

    1996-08-01

The aim of this monograph is to provide the rationale and justification for treating red meats with ionizing radiation for improving microbiological safety, parasite control and extending non-frozen shelf-life. It is intended to complement a previous publication "Irradiation of Poultry Meat and its Products - A Compilation of Technical Data for its Authorization and Control". 146 refs.

  17. Data compilation for particle-impact desorption, 2

    International Nuclear Information System (INIS)

    Oshiyama, Takashi; Nagai, Siro; Ozawa, Kunio; Takeutchi, Fujio.

    1985-07-01

Particle impact desorption is one of the elementary processes of hydrogen recycling in controlled thermonuclear fusion reactors. We have surveyed the literature concerning ion impact desorption and photon stimulated desorption published through the end of 1984 and compiled the data on the desorption cross sections and yields with the aid of a computer. This report presents the results of the compilation in graphs and tables as functions of incident energy, surface temperature and surface coverage. (author)

  18. Architectural and compiler techniques for energy reduction in high-performance microprocessors

    Science.gov (United States)

    Bellas, Nikolaos

    1999-11-01

The microprocessor industry has started viewing power, along with area and performance, as a decisive design factor in today's microprocessors. The increasing cost of packaging and cooling systems poses stringent requirements on the maximum allowable power dissipation. Most of the research in recent years has focused on the circuit, gate, and register-transfer (RT) levels of the design. In this research, we focus on the software running on a microprocessor and we view the program as a power consumer. Our work concentrates on the role of the compiler in the construction of "power-efficient" code, and especially its interaction with the hardware so that unnecessary processor activity is saved. We propose techniques that use extra hardware features and compiler-driven code transformations that specifically target activity reduction in certain parts of the CPU which are known to be large power and energy consumers. Design for low power/energy at this level of abstraction entails larger energy gains than in the lower stages of the design hierarchy, in which the design team has already made the most important design commitments. The role of the compiler in generating code which exploits the processor organization is also fundamental in energy minimization. Hence, we propose a hardware/software co-design paradigm, and we show what code transformations the compiler must perform so that "wasted" power in a modern microprocessor can be trimmed. More specifically, we propose a technique that uses an additional mini cache located between the instruction cache (I-Cache) and the CPU core; the mini cache buffers instructions that are nested within loops and are continuously fetched from the I-Cache. This mechanism can create very substantial energy savings, since the I-Cache unit is one of the main power consumers in most of today's high-performance microprocessors. Results are reported for the SPEC95 benchmarks on the R-4400 processor, which implements the MIPS2 instruction
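The benefit of such a mini cache is easy to see in a simplified software model (illustrative only; the buffer size and replacement policy here are assumptions, and the paper's mechanism is a hardware structure): instruction fetches from a tight loop hit a small buffer and never touch the larger, power-hungry I-Cache.

```python
from collections import deque

def simulate(trace, size):
    """Count fetches served by a small FIFO buffer of recently
    fetched instruction addresses (a toy mini-cache model)."""
    buf, hits = deque(maxlen=size), 0
    for addr in trace:
        if addr in buf:
            hits += 1          # served by the mini cache
        else:
            buf.append(addr)   # miss: fetch from the I-Cache
    return hits

loop = [0, 4, 8, 12] * 10      # a 4-instruction loop, executed 10 times
print(simulate(loop, size=8))  # 36 of the 40 fetches hit the buffer
```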

  19. Programming in Fortran M

    Energy Technology Data Exchange (ETDEWEB)

    Foster, I.; Olson, R.; Tuecke, S.

    1993-08-01

    Fortran M is a small set of extensions to Fortran that supports a modular approach to the construction of sequential and parallel programs. Fortran M programs use channels to plug together processes which may be written in Fortran M or Fortran 77. Processes communicate by sending and receiving messages on channels. Channels and processes can be created dynamically, but programs remain deterministic unless specialized nondeterministic constructs are used. Fortran M programs can execute on a range of sequential, parallel, and networked computers. This report incorporates both a tutorial introduction to Fortran M and a users guide for the Fortran M compiler developed at Argonne National Laboratory. The Fortran M compiler, supporting software, and documentation are made available free of charge by Argonne National Laboratory, but are protected by a copyright which places certain restrictions on how they may be redistributed. See the software for details. The latest version of both the compiler and this manual can be obtained by anonymous ftp from Argonne National Laboratory in the directory pub/fortran-m at info.mcs.anl.gov.
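The channel-and-process model can be sketched by analogy in Python (queues standing in for Fortran M channels; this illustrates the model, not the Fortran M syntax): processes are plugged together by a channel and communicate only by sending and receiving messages on it.

```python
import threading, queue

def producer(chan):
    for i in range(3):
        chan.put(i * i)       # send a message on the channel
    chan.put(None)            # end-of-stream marker

def consumer(chan, results):
    while True:
        msg = chan.get()      # receive a message from the channel
        if msg is None:
            break
        results.append(msg)

channel = queue.Queue()
results = []
t1 = threading.Thread(target=producer, args=(channel,))
t2 = threading.Thread(target=consumer, args=(channel, results))
t1.start(); t2.start(); t1.join(); t2.join()
print(results)  # [0, 1, 4]
```

With one sender and one receiver per channel, the outcome is deterministic regardless of scheduling, mirroring Fortran M's determinism guarantee.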

  20. Higher-Order Program Generation

    DEFF Research Database (Denmark)

    Rhiger, Morten

This dissertation addresses the challenges of embedding programming languages, specializing generic programs to specific parameters, and generating specialized instances of programs directly as executable code. Our main tools are higher-order programming techniques and automatic program generation. It is our thesis that they synergize well in the development of customizable software. Recent research on domain-specific languages proposes to embed them into existing general-purpose languages. Typed higher-order languages have proven especially useful as meta languages because they provide a rich...... for OCaml, a dialect of ML, that provides run-time code generation for OCaml programs. We apply these byte-code combinators in semantics-directed compilation for an imperative language and in run-time specialization using type-directed partial evaluation. Finally, we present an approach to compiling goal...

  1. “Frontload” in complex project program management to aim for lifetime sustainability of offshore windmill parks

    DEFF Research Database (Denmark)

    Brink, Tove

    2015-01-01

This paper reveals how project program management can aim for lifetime sustainability of offshore windmill parks through innovation. The research is based on a qualitative focus group interview with 11 enterprises and 6 individual semi-structured interviews with 6 enterprises. Offshore windmill...

  2. Domain-Specific Acceleration and Auto-Parallelization of Legacy Scientific Code in FORTRAN 77 using Source-to-Source Compilation

    OpenAIRE

    Vanderbauwhede, Wim; Davidson, Gavin

    2017-01-01

    Massively parallel accelerators such as GPGPUs, manycores and FPGAs represent a powerful and affordable tool for scientists who look to speed up simulations of complex systems. However, porting code to such devices requires a detailed understanding of heterogeneous programming tools and effective strategies for parallelization. In this paper we present a source to source compilation approach with whole-program analysis to automatically transform single-threaded FORTRAN 77 legacy code into Ope...

  3. SEGY to ASCII: Conversion and Plotting Program

    Science.gov (United States)

    Goldman, Mark R.

    1999-01-01

This report documents a computer program to convert standard 4 byte, IBM floating point SEGY files to ASCII xyz format. The program then optionally plots the seismic data using the GMT plotting package. The material for this publication is contained in a standard tar file (of99-126.tar) that is uncompressed and 726 K in size. It can be downloaded on any Unix machine. Move the tar file to the directory you wish to use it in, then type 'tar xvf of99-126.tar'. The archive files (and diskette) contain a NOTE file, a README file, a version-history file, source code, a makefile for easy compilation, and an ASCII version of the documentation. The archive files (and diskette) also contain example test files, including a typical SEGY file along with the resulting ASCII xyz and postscript files. Compiling the source code into an executable requires a C++ compiler. The program has been successfully compiled using Gnu's g++ version 2.8.1, and use of other compilers may require modifications to the existing source code. The g++ compiler is a free, high quality C++ compiler and may be downloaded from the ftp site: ftp://ftp.gnu.org/gnu Plotting the seismic data requires the GMT plotting package, which may be downloaded from the web site: http://www.soest.hawaii.edu/gmt/
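The heart of such a conversion is decoding IBM single-precision floats, which use a sign bit, a 7-bit excess-64 base-16 exponent, and a 24-bit fraction. A minimal sketch of the decoding step (in Python, independent of the report's C++ source):

```python
import struct

def ibm_to_ieee(word):
    """Convert a 4-byte IBM single-precision float, given as a 32-bit
    big-endian word (the SEGY trace sample format), to a Python float.
    Layout: 1 sign bit | 7-bit base-16 exponent (excess-64) | 24-bit
    fraction, with value = sign * (fraction / 2**24) * 16**(exp - 64)."""
    sign = -1.0 if word & 0x80000000 else 1.0
    exponent = (word >> 24) & 0x7F
    fraction = word & 0x00FFFFFF
    return sign * (fraction / float(1 << 24)) * 16.0 ** (exponent - 64)

# Decode one big-endian sample word (a standard test value).
raw = struct.pack(">I", 0xC276A000)
(word,) = struct.unpack(">I", raw)
print(ibm_to_ieee(word))  # -118.625
```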

  4. A compilation of structural property data for computer impact calculation (1/5)

    International Nuclear Information System (INIS)

    Ikushima, Takeshi; Nagata, Norio.

    1988-10-01

The paper describes structural property data for computer impact calculations of nuclear fuel shipping casks. Data for four kinds of materials (mild steel, stainless steel, lead and wood) are compiled. These materials are the main structural elements of shipping casks. Structural data, such as the coefficient of thermal expansion, the modulus of longitudinal elasticity, the modulus of transverse elasticity, Poisson's ratio, and stress-strain relationships, have been tabulated against temperature or strain rate. This volume 1 contains the structural property data and the data-processing computer program. (author)

  5. Software support for Motorola 68000 microprocessor at CERN. CERN convention for programming the MC68000 family

    International Nuclear Information System (INIS)

    Cailliau, R.; Carpenter, B.

    1984-01-01

    The CERN convention for programming the MC68000 family of microprocessors gives a set of rules describing the layout of the memory and stack frames used by routines as they should appear before and after their calling sequences. It does not deal with the instructions used to achieve these states. The aim of the convention is to allow programming language mixing as well as debugging of programs built from units written in different languages. It is to be followed by programmers and programming-language compilers. (orig.)

  6. Testing New Programming Paradigms with NAS Parallel Benchmarks

    Science.gov (United States)

    Jin, H.; Frumkin, M.; Schultz, M.; Yan, J.

    2000-01-01

Over the past decade, high performance computing has evolved rapidly, not only in hardware architectures but also in the increasing complexity of real applications. Technologies have been developed with the aim of scaling up to thousands of processors on both distributed and shared memory systems. Developing parallel programs on these computers is always a challenging task. Today, writing parallel programs with message passing (e.g. MPI) is the most popular way of achieving scalability and high performance. However, writing message passing programs is difficult and error prone. In recent years, new efforts have been made to define new parallel programming paradigms. The best examples are HPF (based on data parallelism) and OpenMP (based on shared memory parallelism). Both provide simple and clear extensions to sequential programs, thus greatly simplifying the tedious tasks encountered in writing message passing programs. HPF is independent of the memory hierarchy; however, due to the immaturity of compiler technology, its performance is still questionable. Although the use of parallel compiler directives is not new, OpenMP offers a portable solution in the shared-memory domain. Another important development involves the tremendous progress in the internet and its associated technology. Although still in its infancy, Java promises portability in a heterogeneous environment and offers the possibility to "compile once and run anywhere." To test these new technologies, we implemented new parallel versions of the NAS Parallel Benchmarks (NPBs) with HPF and OpenMP directives, and extended the work with Java and Java threads. The purpose of this study is to examine the effectiveness of alternative programming paradigms. NPBs consist of five kernels and three simulated applications that mimic the computation and data movement of large scale computational fluid dynamics (CFD) applications. We started with the serial version included in NPB2.3 and optimized memory and cache usage.

  7. A compilation of reports of the Advisory Committee on Reactor Safeguards: 1993 annual

    International Nuclear Information System (INIS)

    1994-04-01

This compilation contains 47 ACRS reports submitted to the Commission, the Executive Director for Operations, or the Office of Nuclear Regulatory Research during calendar year 1993. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room and the US Library of Congress. The reports are categorized by the most appropriate generic subject area and ordered chronologically within each subject area.

  8. JANUS: A Compilation System for Balancing Parallelism and Performance in OpenVX

    Science.gov (United States)

    Omidian, Hossein; Lemieux, Guy G. F.

    2018-04-01

Embedded systems typically do not have enough on-chip memory for an entire image buffer. Programming systems like OpenCV operate on entire image frames at each step, making them use excessive memory bandwidth and power. In contrast, the paradigm used by OpenVX is much more efficient; it uses image tiling, and the compilation system is allowed to analyze and optimize the operation sequence, specified as a compute graph, before doing any pixel processing. In this work, we are building a compilation system for OpenVX that can analyze and optimize the compute graph to take advantage of parallel resources in many-core systems or FPGAs. Using a database of prewritten OpenVX kernels, it automatically adjusts the image tile size as well as using kernel duplication and coalescing to meet a defined area (resource) target, or to meet a specified throughput target. This allows a single compute graph to target implementations with a wide range of performance needs or capabilities, e.g., from handheld to datacenter, using minimal resources and power to reach the performance target.
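The tiling idea underlying this paradigm can be sketched in a few lines (a minimal illustration of tile enumeration, not JANUS itself): rather than streaming whole frames, the compiler emits per-tile work whose footprint fits on-chip memory.

```python
def tiles(h, w, th, tw):
    """Enumerate tile bounds (y0, y1, x0, x1) covering an h x w image
    with tiles of at most th x tw pixels (edge tiles may be smaller)."""
    for y in range(0, h, th):
        for x in range(0, w, tw):
            yield (y, min(y + th, h), x, min(x + tw, w))

# A 4x6 image split into 2x4 tiles: four tiles, two of them edge tiles.
print(list(tiles(4, 6, 2, 4)))
```

Shrinking the tile size trades more loop overhead for a smaller working set, which is exactly the knob such a compilation system tunes against an area or throughput target.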

  9. High Speed Simulation Framework for Reliable Logic Programs

    International Nuclear Information System (INIS)

    Lee, Wan-Bok; Kim, Seog-Ju

    2006-01-01

This paper presents a case study of a PLC logic simulator that was developed to simulate and verify PLC control programs for nuclear plant systems. A nuclear control system is subject to stricter restrictions than a normal process control system, since it operates nuclear power plants that require high reliability under severe environments. One restriction is the safety of the control programs, which can be assured by rigorous testing. Another is the simulation speed of the control programs, which should be fast enough to control multiple devices concurrently in real time. To cope with these restrictions, we devised a logic compiler which generates C-code programs from given PLC logic programs. Once a logic program has been translated into C code, it can be analyzed by conventional software analysis tools and, after cross-compiling, used to construct a fast logic simulator; in effect, a form of compiled-code simulation.

  10. Modern programming language

    Science.gov (United States)

    Feldman, G. H.; Johnson, J. A.

    1980-01-01

A structured-programming language especially tailored for producing assembly language programs for the MODCOMP II and IV minicomputers. The modern programming language consists of a set of simple and powerful control structures that include sequencing, alternative selection, looping, sub-module linking, comment insertion, statement continuation, and compilation termination capabilities.

  11. Compilation of selected marine radioecological data for the US Subseabed Program: Summaries of available radioecological concentration factors and biological half-lives

    International Nuclear Information System (INIS)

    Gomez, L.S.; Marietta, M.G.; Jackson, D.W.

    1987-04-01

    The US Subseabed Disposal Program has compiled an extensive concentration factor and biological half-life data base from the international marine radioecological literature. A microcomputer-based data management system has been implemented to provide statistical and graphic summaries of these data. The data base is constructed in a manner which allows subsets to be sorted using a number of interstudy variables such as organism category, tissue/organ category, geographic location (for in situ studies), and several laboratory-related conditions (e.g., exposure time and exposure concentration). This report updates earlier reviews and provides summaries of the tabulated data. In addition to the concentration factor/biological half-life data base, we provide an outline of other published marine radioecological works. Our goal is to present these data in a form that enables those concerned with predictive assessment of radiation dose in the marine environment to make a more judicious selection of data for a given application. 555 refs., 19 figs., 7 tabs

  12. Compilation of selected marine radioecological data for the US Subseabed Program: Summaries of available radioecological concentration factors and biological half-lives

    Energy Technology Data Exchange (ETDEWEB)

    Gomez, L.S.; Marietta, M.G.; Jackson, D.W.

    1987-04-01

    The US Subseabed Disposal Program has compiled an extensive concentration factor and biological half-life data base from the international marine radioecological literature. A microcomputer-based data management system has been implemented to provide statistical and graphic summaries of these data. The data base is constructed in a manner which allows subsets to be sorted using a number of interstudy variables such as organism category, tissue/organ category, geographic location (for in situ studies), and several laboratory-related conditions (e.g., exposure time and exposure concentration). This report updates earlier reviews and provides summaries of the tabulated data. In addition to the concentration factor/biological half-life data base, we provide an outline of other published marine radioecological works. Our goal is to present these data in a form that enables those concerned with predictive assessment of radiation dose in the marine environment to make a more judicious selection of data for a given application. 555 refs., 19 figs., 7 tabs.

  13. Design and Implementation of the Futhark Programming Language

    DEFF Research Database (Denmark)

    Henriksen, Troels

    In this thesis we describe the design and implementation of Futhark, a small data-parallel purely functional array language that offers a machine-neutral programming model, and an optimising compiler that generates efficient OpenCL code for GPUs. The overall philosophy is based on seeking a middle...... a lightweight system of size-dependent types that enables the compiler to reason symbolically about the size of arrays in the program, and that reuses general-purpose compiler optimisations to infer relationships between sizes. Third, we furnish Futhark with novel parallel combinators capable of expressing...... reasoning. Fifth, we perform an evaluation on 21 benchmarks that demonstrates the impact of the language and compiler features, and shows application-level performance that is in many cases competitive with hand-written GPU code. Sixth, we make the Futhark compiler freely available with full source code...

  14. Taking a Teen Pregnancy Prevention Program to the Home: The AIM 4 Teen Moms Experience, Implementation Report

    OpenAIRE

    Subuhi Asheer; Ellen Kisker

    2014-01-01

    This report discusses findings from the first 18 months of a program implementation evaluation of AIM 4 Teen Moms, a teen pregnancy intervention designed to delay rapid repeat pregnancies among parenting teen mothers in Los Angeles.

  15. C++QEDv2: The multi-array concept and compile-time algorithms in the definition of composite quantum systems

    Science.gov (United States)

    Vukics, András

    2012-06-01

    C++QED is a versatile framework for simulating open quantum dynamics. It allows arbitrarily complex quantum systems to be built from elementary free subsystems and interactions, and their time evolution to be simulated with the available time-evolution drivers. Through this framework, we introduce a design which should be generic for high-level representations of composite quantum systems. It relies heavily on the object-oriented and generic programming paradigms on one hand, and on the other hand on compile-time algorithms, in particular C++ template-metaprogramming techniques. The core of the design is the data structure which represents the state vectors of composite quantum systems. This data structure models the multi-array concept. The use of template metaprogramming is not only crucial to the design; with its use, all computations pertaining to the layout of the simulated system can be shifted to compile time, thereby reducing runtime. Program summary: Program title: C++QED. Catalogue identifier: AELU_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELU_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: http://cpc.cs.qub.ac.uk/licence/aelu_v1_0.html. The C++QED package contains other software packages, Blitz, Boost and FLENS, all of which may be distributed freely but have individual license requirements. Please see individual packages for license conditions. No. of lines in distributed program, including test data, etc.: 597 974. No. of bytes in distributed program, including test data, etc.: 4 874 839. Distribution format: tar.gz. Programming language: C++. Computer: i386-i686, x86_64. Operating system: In principle cross-platform, as yet tested only on UNIX-like systems (including Mac OS X). RAM: The framework itself takes about 60 MB, which is fully shared. The additional memory taken by the program which defines the actual physical system (script) is typically less than 1 MB.
The memory storing

  16. Extending R packages to support 64-bit compiled code: An illustration with spam64 and GIMMS NDVI3g data

    Science.gov (United States)

    Gerber, Florian; Mösinger, Kaspar; Furrer, Reinhard

    2017-07-01

    Software packages for spatial data often implement a hybrid approach of interpreted and compiled programming languages. The compiled parts are usually written in C, C++, or Fortran, and are efficient in terms of computational speed and memory usage. Conversely, the interpreted part serves as a convenient user interface and calls the compiled code for computationally demanding operations. The price paid for the user friendliness of the interpreted component is, besides performance, limited access to low-level and optimized code. An example of such a restriction is the 64-bit vector support of the widely used statistical language R. On the R side, users do not need to change existing code and may not even notice the extension. On the other hand, interfacing 64-bit compiled code efficiently is challenging. Since many R packages for spatial data could benefit from 64-bit vectors, we investigate strategies to efficiently pass 64-bit vectors to compiled languages. More precisely, we show how to extend existing R packages using the foreign function interface to seamlessly support 64-bit vectors. This extension is shown with the sparse matrix algebra R package spam. The new capabilities are illustrated with an example of GIMMS NDVI3g data featuring a parametric modeling approach for a non-stationary covariance matrix.

  17. Compilation of current high-energy-physics experiments

    International Nuclear Information System (INIS)

    Wohl, C.G.; Kelly, R.L.; Armstrong, F.E.

    1980-04-01

    This is the third edition of a compilation of current high energy physics experiments. It is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and ten participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), the Institute for Nuclear Study, Tokyo (INS), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. The compilation includes summaries of all high energy physics experiments at the above laboratories that (1) were approved (and not subsequently withdrawn) before about January 1980, and (2) had not completed taking of data by 1 January 1976

  18. Process evaluation of a multifaceted health program aiming to improve physical activity levels and dietary patterns among construction workers

    NARCIS (Netherlands)

    Viester, L.; Verhagen, E.A.L.M.; Bongers, P.M.; Beek, A.J. van der

    2014-01-01

    Objective: To evaluate the process of a health promotion program aiming to improve physical activity levels and diet among construction workers. Methods: The process evaluation was conducted according to the RE-AIM framework for the evaluation of the public health impact of health promotion interventions.

  19. Experiences in Data-Parallel Programming

    Directory of Open Access Journals (Sweden)

    Terry W. Clark

    1997-01-01

    To efficiently parallelize a scientific application with a data-parallel compiler requires certain structural properties in the source program, and conversely, the absence of others. A recent parallelization effort of ours reinforced this observation and motivated this correspondence. Specifically, we have transformed a Fortran 77 version of GROMOS, a popular dusty-deck program for molecular dynamics, into Fortran D, a data-parallel dialect of Fortran. During this transformation we have encountered a number of difficulties that probably are neither limited to this particular application nor do they seem likely to be addressed by improved compiler technology in the near future. Our experience with GROMOS suggests a number of points to keep in mind when developing software that may at some time in its life cycle be parallelized with a data-parallel compiler. This note presents some guidelines for engineering data-parallel applications that are compatible with Fortran D or High Performance Fortran compilers.

  20. Compilation of results 1987

    International Nuclear Information System (INIS)

    1987-01-01

    A compilation is presented which, in concentrated form, contains reports on research and development within the nuclear energy field covering a period of two and a half years. The previous report was edited in December 1984. The projects are presented with title, project number, responsible unit, contact person, and short result reports consisting of brief summaries of each project. (L.F.)

  1. Compiler-Assisted Multiple Instruction Rollback Recovery Using a Read Buffer. Ph.D. Thesis

    Science.gov (United States)

    Alewine, Neal Jon

    1993-01-01

    Multiple instruction rollback (MIR) is a technique for rapid recovery from transient processor failures that has been implemented in hardware by researchers and in mainframe computers. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs were also developed which remove rollback data hazards directly with data-flow manipulations, thus eliminating the need for most data redundancy hardware. Compiler-assisted techniques to achieve multiple instruction rollback recovery are addressed. It is observed that some data hazards resulting from instruction rollback can be resolved more efficiently by providing hardware redundancy, while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard removal transformations. Experimental performance evaluations indicate improved efficiency over previous hardware-based and compiler-based schemes. Various enhancements to the compiler transformations and to the data redundancy hardware developed for the compiler-assisted MIR scheme are described and evaluated. The final topic deals with the application of compiler-assisted MIR techniques to aid in exception repair and branch repair in a speculative execution architecture.

  2. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    The material contained in this compilation is sorted according to eight subject categories: 1. General Compilations; 2. Basic Isotopic Properties; 3. Nuclear Structure Properties; 4. Nuclear Decay Processes: Half-lives, Energies and Spectra; 5. Nuclear Decay Processes: Gamma-rays; 6. Nuclear Decay Processes: Fission Products; 7. Nuclear Decay Processes: (Others); 8. Atomic Processes

  3. Compilation of current high energy physics experiments - Sept. 1978

    Energy Technology Data Exchange (ETDEWEB)

    Addis, L.; Odian, A.; Row, G. M.; Ward, C. E. W.; Wanderer, P.; Armenteros, R.; Joos, P.; Groves, T. H.; Oyanagi, Y.; Arnison, G. T. J.; Antipov, Yu; Barinov, N.

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978, and had not completed taking of data by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche. (RWR)

  4. [Steps aimed at upgrading a pharmaceutical care sector: the case of surgery].

    Science.gov (United States)

    Guérin, A; Thibault, M; Nguyen, C; Lebel, D; Bussières, J-F

    2014-07-01

    While the concept of clinical pharmacy was developed in the 1960s, clinical programs are characterized by their great variety and disparity when it comes to the presence of pharmacists in healthcare sectors. This article aims to describe a method by which pharmaceutical care sectors in healthcare facilities can be upgraded. This is a descriptive study supporting the upgrade of pharmaceutical care practiced in the surgery sector of a 500-bed mother-child university hospital center, the CHU Sainte-Justine. The pharmacy department employs more than 70 healthcare professionals. The study involved these proposed upgrading steps: firstly, a review of the literature; secondly, a description of the profile of the sector; thirdly, a description of the upgrading of pharmacist practice in surgery. A total of 137 articles were compiled; seven evaluating the impact of pharmacists and eight describing the pharmacist's role in surgery were selected. The authors did not identify any particular pharmaceutical activity based on very good quality data (A). However, there were five based on good quality data (B) and seven that lacked adequate proof (C, D) in relation to the practice of surgery. Nevertheless, a number of other authors described the development of the pharmacist's clinical role in surgery. There are few data on the impact of pharmacists in surgery. This descriptive study proposes a number of steps aimed at upgrading pharmaceutical care within a Quebec university hospital center. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  5. Compilation and synthesis for embedded reconfigurable systems an aspect-oriented approach

    CERN Document Server

    Diniz, Pedro; Coutinho, José; Petrov, Zlatko

    2013-01-01

    This book provides techniques to tackle the design challenges raised by the increasing diversity and complexity of emerging, heterogeneous architectures for embedded systems. It describes an approach based on techniques from software engineering called aspect-oriented programming, which allow designers to control today’s sophisticated design tool chains, while maintaining a single application source code.  Readers are introduced to the basic concepts of an aspect-oriented, domain specific language that enables control of a wide range of compilation and synthesis tools in the partitioning and mapping of an application to a heterogeneous (and possibly multi-core) target architecture.  Several examples are presented that illustrate the benefits of the approach developed for applications from avionics and digital signal processing. Using the aspect-oriented programming techniques presented in this book, developers can reuse extensive sections of their designs, while preserving the original application source-...

  6. Analysis of the structure and operation of the level F PL/1 compiler: problems raised by the coupling of programmes written in different languages

    International Nuclear Information System (INIS)

    Rambou Sek, Jiri

    1974-01-01

    Because the PL/1 programming language can address a large range of commercial and scientific problems, it raises specific problems for the development of a compiler on the one hand, and for the implementation of a management system within an installation comprising software written in different programming languages on the other. The author reports an analysis of the level F PL/1 compiler developed by IBM and discusses issues related to the management of PL/1 software within the framework of the OS/360 operating system. He reports a study of the linking conditions between PL/1 and the main existing programming languages under OS/360. He presents an interface system which allows common exploitation of software written in different programming languages. He describes the syntax of the DECLARE statement and its analysis by the interface system. The last part covers the generation of the various data description vectors that are necessary for argument transmission [fr]

  7. Programs for Testing Processor-in-Memory Computing Systems

    Science.gov (United States)

    Katz, Daniel S.

    2006-01-01

    The Multithreaded Microbenchmarks for Processor-In-Memory (PIM) Compilers, Simulators, and Hardware are computer programs arranged in a series for use in testing the performances of PIM computing systems, including compilers, simulators, and hardware. The programs at the beginning of the series test basic functionality; the programs at subsequent positions in the series test increasingly complex functionality. The programs are intended to be used while designing a PIM system, and can be used to verify that compilers, simulators, and hardware work correctly. The programs can also be used to enable designers of these system components to examine tradeoffs in implementation. Finally, these programs can be run on non-PIM hardware (either single-threaded or multithreaded) using the POSIX pthreads standard to verify that the benchmarks themselves operate correctly. [POSIX (Portable Operating System Interface for UNIX) is a set of standards that define how programs and operating systems interact with each other. pthreads is a library of pre-emptive thread routines that comply with one of the POSIX standards.]

  8. DrawCompileEvolve: Sparking interactive evolutionary art with human creations

    DEFF Research Database (Denmark)

    Zhang, Jinhong; Taarnby, Rasmus; Liapis, Antonios

    2015-01-01

    This paper presents DrawCompileEvolve, a web-based drawing tool which allows users to draw simple primitive shapes, group them together or define patterns in their groupings (e.g. symmetry, repetition). The user’s vector drawing is then compiled into an indirectly encoded genetic representation......, which can be evolved interactively, allowing the user to change the image’s colors, patterns and ultimately transform it. The human artist has direct control while drawing the initial seed of an evolutionary run and indirect control while interactively evolving it, thus making DrawCompileEvolve a mixed...

  9. Compilation of electron collision excitation cross sections for neutral argon

    International Nuclear Information System (INIS)

    Blanco, F.

    1993-01-01

    The present work presents a compilation and critical analysis of the available data on electron collision excitation cross sections for neutral Argon levels. This study includes: 1. A detailed description in intermediate coupling of all the levels belonging to the 20 configurations 3p5 ns (n=4 to 12), np (n=4 to 8) and nd (n=3 to 8) of neutral Argon. 2. Calculation of the electron collision excitation cross sections in the Born and Born-Oppenheimer-Ochkur approximations for all the levels in the 14 configurations 3p5 ns (n=4 to 7), np (n=4 to 7) and nd (n=3 to 8). 3. Comparison and discussion of the compiled data, comprising the experimental and theoretical values available from the literature and those from this work. 4. Analysis of the regularities and systematic behaviors in order to determine which values can be considered more reliable. It is shown that the concept of a one-electron cross section proves quite useful for this purpose. In some cases it has been possible to obtain in this way approximate analytical expressions interpolating the experimental data. 5. Graphical presentation and comparison of all the experimental and theoretical values studied. 6. The last part of the work includes a listing of several general-purpose programs for atomic physics calculations developed for this work. (Author) 35 refs

  10. Compilation of electron collision excitation cross sections for neutral argon

    International Nuclear Information System (INIS)

    Blanco Ramos, F.

    1993-01-01

    The present work presents a compilation and critical analysis of the available data on electron collision excitation cross sections for neutral Argon levels. This study includes: 1. A detailed description in intermediate coupling of all the levels belonging to the 20 configurations 3p5 ns (n=4 to 12), np (n=4 to 8) and nd (n=3 to 8) of neutral Argon. 2. Calculation of the electron collision excitation cross sections in the Born and Born-Oppenheimer-Ochkur approximations for all the levels in the 14 configurations 3p5 ns (n=4 to 7), np (n=4 to 7) and nd (n=3 to 8). 3. Comparison and discussion of the compiled data, comprising the experimental and theoretical values available from the literature and those from this work. 4. Analysis of the regularities and systematic behaviors in order to determine which values can be considered more reliable. It is shown that the concept of a one-electron cross section proves quite useful for this purpose. In some cases it has been possible to obtain in this way approximate analytical expressions interpolating the experimental data. 5. Graphical presentation and comparison of all the experimental and theoretical values studied. 6. The last part of the work includes a listing of several general-purpose programs for atomic physics calculations developed for this work. (Author)

  11. System programming languages

    OpenAIRE

    Šmit, Matej

    2016-01-01

    Most operating systems are written in the C programming language. The same holds for system software such as device drivers, compilers, debuggers, and disk checkers. Recently, some new programming languages have emerged that are intended to be suitable for system programming. In this thesis we present the programming languages D, Go, Nim and Rust. We defined the criteria that are important for deciding whether a programming language is suitable for system programming. We examine programming langua...

  12. Depleted uranium hexafluoride management program : data compilation for the K-25 site

    International Nuclear Information System (INIS)

    Hartmann, H. M.

    2001-01-01

    This report is a compilation of data and analyses for the K-25 site on the Oak Ridge Reservation, Oak Ridge, Tennessee. The data were collected and the analyses were done in support of the U.S. Department of Energy (DOE) 1999 Programmatic Environmental Impact Statement for Alternative Strategies for the Long-Term Management and Use of Depleted Uranium Hexafluoride (DOE/EIS-0269). The report describes the affected environment at the K-25 site and summarizes the potential environmental impacts that could result from continued cylinder storage and preparation of cylinders for shipment at the site. It is probable that the cylinders at the K-25 site will be shipped to another site for conversion. Because conversion and long-term storage of the entire inventory at the K-25 site are highly unlikely, these data are not presented in this report. DOE's preferred alternative is to begin converting the depleted uranium hexafluoride inventory as soon as possible to either uranium oxide, uranium metal, or a combination of both, while allowing for use of as much of this inventory as possible

  13. Programming a DSP card for generating an ECG signal with possibility of anomalies

    International Nuclear Information System (INIS)

    Hamrouni, Sayma

    2013-01-01

    This project consists of programming a DSP card designed to generate an ECG signal with a probability of anomaly. First, we become familiar with the characteristics of a DSP card and its architecture. Second, we programmed the DSP32C using the D3CC compiler together with Textpad in order to obtain an analog signal at the respective outputs. Finally, we developed a graphical user interface using the LabVIEW programming software that aims at monitoring the correct operation of the DSP. The tests performed demonstrated the correct operation of the application.

  14. Mineralogy and geochemistry of rocks and fracture fillings from Forsmark and Oskarshamn: Compilation of data for SR-Can

    Energy Technology Data Exchange (ETDEWEB)

    Drake, Henrik; Sandstroem, Bjoern [Isochron GeoConsulting HB, Goeteborg (Sweden); Tullborg, Eva-Lena [Terralogica AB, Graabo (Sweden)

    2006-11-15

    This report is a compilation of the data available so far for the safety assessment SR-Can carried out by SKB. The data consist of mineralogy, geochemistry, porosity, density and redox properties for both the dominant rock types and fracture fillings at the Forsmark and Oskarshamn candidate areas. In addition to the compilation of existing information, the aim has been to identify missing data and to clarify some concepts, e.g. deformation zones. The objective of the report is to present the available data requested for the modelling of the chemical stability of the two sites. The report includes no interpretation of the data.

  15. Production compilation : A simple mechanism to model complex skill acquisition

    NARCIS (Netherlands)

    Taatgen, N.A.; Lee, F.J.

    2003-01-01

    In this article we describe production compilation, a mechanism for modeling skill acquisition. Production compilation has been developed within the ACT-Rational (ACT-R; J. R. Anderson, D. Bothell, M. D. Byrne, & C. Lebiere, 2002) cognitive architecture and consists of combining and specializing

  16. The Research and Compilation of City Maps in the National Geomatics Atlas of the PEOPLE'S Republic of China

    Science.gov (United States)

    Wang, G.; Wang, D.; Zhou, W.; Chen, M.; Zhao, T.

    2018-04-01

    The research and compilation of the new-century version of the National Huge Atlas of the People's Republic of China is a special basic work project of the Ministry of Science and Technology of the People's Republic of China. Among its contents, the research and compilation of the National Geomatics Atlas of the People's Republic of China is the main one. The National Geomatics Atlas of China consists of 4 groups of maps and a place name index. The 4 groups of maps are, respectively, a nationwide thematic map group, a provincial fundamental geographical map group, a landcover map group, and a city map group. The city map group is an important component of the National Geomatics Atlas of China and mainly shows the process of urbanization in China. This paper, aimed at the design and compilation of 39 city-wide maps, briefly introduces mapping-area research and scale design, the mapping technical route, content selection and cartographic generalization, symbol design and map visualization, etc.

  17. 32 CFR 806b.19 - Information compiled in anticipation of civil action.

    Science.gov (United States)

    2010-07-01

    Section 806b.19, 32 CFR (National Defense, Department of Defense, Department of the Air Force): Information compiled in anticipation of civil action. Withhold records compiled in connection with a civil action or...

  18. A compilation of reports of the Advisory Committee on Reactor Safeguards. 1994 annual. Volume 16

    International Nuclear Information System (INIS)

    1995-04-01

    This compilation contains 30 ACRS reports submitted to the Commission, or to the Executive Director for Operations, during calendar year 1994. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room and the U.S. Library of Congress. The reports are categorized by the most appropriate generic subject area and by chronological order within subject area

  19. Fault-tolerant digital microfluidic biochips compilation and synthesis

    CERN Document Server

    Pop, Paul; Stuart, Elena; Madsen, Jan

    2016-01-01

    This book describes for researchers in the fields of compiler technology, design and test, and electronic design automation the new area of digital microfluidic biochips (DMBs), and thus offers a new application area for their methods.  The authors present a routing-based model of operation execution, along with several associated compilation approaches, which progressively relax the assumption that operations execute inside fixed rectangular modules.  Since operations can experience transient faults during the execution of a bioassay, the authors show how to use both offline (design time) and online (runtime) recovery strategies. The book also presents methods for the synthesis of fault-tolerant application-specific DMB architectures. ·         Presents the current models used for the research on compilation and synthesis techniques of DMBs in a tutorial fashion; ·         Includes a set of “benchmarks”, which are presented in great detail and includes the source code of most of the t...

  20. A survey of compiler development aids. [concerning lexical, syntax, and semantic analysis

    Science.gov (United States)

    Buckles, B. P.; Hodges, B. C.; Hsia, P.

    1977-01-01

    A theoretical background was established for the compilation process by dividing it into five phases and explaining the concepts and algorithms that underpin each. The five phases were lexical analysis, syntax analysis, semantic analysis, optimization, and code generation. Graph-theoretical optimization techniques were presented, and approaches to code generation were described for both one-pass and multipass compilation environments. Following the initial tutorial sections, more than 20 tools developed to aid in the process of writing compilers were surveyed. Eight of the more recent compiler development aids were selected for special attention: SIMCMP/STAGE2, LANG-PAK, COGENT, XPL, AED, CWIC, LIS, and JOCIT. The impact of compiler development aids was assessed, some of their shortcomings were noted, and some areas of research currently in progress were examined.

  1. Healthcare team training programs aimed at improving depression management in primary care: A systematic review.

    Science.gov (United States)

    Vöhringer, Paul A; Castro, Ariel; Martínez, Pablo; Tala, Álvaro; Medina, Simón; Rojas, Graciela

    2016-08-01

    Although evidence from Latin America and the Caribbean suggests that depression can be effectively treated in primary care settings, depression management remains unevenly performed. This systematic review evaluates all the international evidence on healthcare team training programs aimed at improving the outcomes of patients with depression. Three databases were searched for articles in English or Spanish indexed up to November 20, 2014. Studies were included if they fulfilled the following conditions: clinical trials, meta-analyses, or systematic reviews; and if they evaluated a training or educational program intended to improve the management of depression by primary healthcare teams, and assessed change in depressive symptoms, diagnosis or response rates, referral rates, patients' satisfaction and/or quality of life, and the effectiveness of treatments. Nine studies were included in this systematic review. Five trials tested the effectiveness of multi-component interventions (training included), and the remaining studies evaluated the effectiveness of specific training programs for depression management. All the studies that implemented multi-component interventions were efficacious, and half of the training trials were shown to be effective. Contribution of training programs alone to the effectiveness of multi-component interventions is yet to be established. The lack of specificity regarding health providers' characteristics might be a confounding factor. The review conducted suggests that stand-alone training programs are less effective than multi-component interventions. In applying the evidence gathered from developed countries to Latin America and the Caribbean, these training programs must consider and address local conditions of mental health systems, and therefore multi-component interventions may be warranted. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. The Danish national return-to-work program - aims, content, and design of the process and effect evaluation

    DEFF Research Database (Denmark)

    Aust, Birgit; Helverskov, Trine; Nielsen, Maj Britt D.

    2012-01-01

    The Danish national return-to-work (RTW) program aims to improve the management of municipal sickness benefit in Denmark. A study is currently ongoing to evaluate the RTW program. The purpose of this article is to describe the study protocol. The program includes 21 municipalities encompassing approximately 19 500 working-age adults on long-term sickness absence, regardless of reason for sickness absence or employment status. It consists of three core elements: (i) establishment of multidisciplinary RTW teams, (ii) introduction of standardized workability assessments and sickness absence management procedures, and (iii) a comprehensive training course for the RTW teams. The effect evaluation is based on a parallel group randomized trial and a stratified cluster controlled trial and focuses on register-based primary outcomes - duration of sickness absence and RTW - and questionnaire-based secondary...

  3. GENGTC-JB: a computer program to calculate temperature distribution for cylindrical geometry capsule

    International Nuclear Information System (INIS)

    Someya, Hiroyuki; Kobayashi, Toshiki; Niimi, Motoji; Hoshiya, Taiji; Harayama, Yasuo

    1987-09-01

    In the design of JMTR irradiation capsules containing specimens, a program named GENGTC has generally been used to evaluate temperature distributions in the capsules. The program was originally written at ORNL (USA) and consists of very simple calculation methods. Because of these methods, the program is easy to use and has many applications in capsule design. However, when the program was reviewed against the abilities of recent computers, it was considered desirable to replace the original computing methods with more advanced ones, and the data input was also found to be cumbersome. The program was therefore upgraded with the aim of improving both the calculations and the input method. The present report describes the revised calculation methods and the input/output guide of the upgraded program. (author)
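    The GENGTC methods themselves are not reproduced in this record; as an illustration of the kind of calculation such capsule-design codes perform, here is a minimal steady-state sketch of radial heat conduction through concentric cylindrical layers. All radii, conductivities, and names below are hypothetical and not taken from GENGTC.

```python
import math

def radial_temperatures(t_outer, linear_power, layers):
    """Steady-state temperatures across concentric cylindrical layers.

    t_outer      -- temperature at the outermost surface (deg C)
    linear_power -- heat generated per unit length inside all layers (W/m)
    layers       -- list of (r_inner, r_outer, conductivity W/m-K), outermost first
    Returns the temperature at each layer's inner surface, outermost first.
    """
    temps = []
    t = t_outer
    for r_in, r_out, k in layers:
        # Fourier's law for an annulus: dT = q' * ln(r_out/r_in) / (2*pi*k)
        t += linear_power * math.log(r_out / r_in) / (2.0 * math.pi * k)
        temps.append(t)
    return temps

# Hypothetical capsule: steel wall around a helium gas gap
layers = [
    (0.010, 0.011, 16.0),   # stainless-steel capsule wall
    (0.009, 0.010, 0.25),   # helium gas gap (dominant thermal resistance)
]
print(radial_temperatures(50.0, 2000.0, layers))
```

In such capsule designs the low-conductivity gas gap produces by far the largest temperature rise, which is why gap sizing dominates the thermal design.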

  4. AIM satellite-based research bridges the unique scientific aspects of the mission to informal education programs globally

    Science.gov (United States)

    Robinson, D.; Maggi, B.

    2003-04-01

    The Education and Public Outreach (EPO) component of the satellite-based research mission "Aeronomy of Ice In the Mesosphere" (AIM) will bridge the unique scientific aspects of the mission to informal education organizations. The informal education materials developed by the EPO will utilize AIM data and educate the public about the environmental implications associated with the data. This will assist with creating a scientifically literate workforce and in developing a citizenry capable of making educated decisions related to environmental policies and laws. The objective of the AIM mission is to understand the mechanisms that cause Polar Mesospheric Clouds (PMCs) to form, how their presence affects the atmosphere, and how change in the atmosphere affects them. PMCs are sometimes known as Noctilucent Clouds (NLCs) because of their visibility during the night from appropriate locations. The phenomenon of PMCs is an observable indicator of global change, a concern to all citizens. Recent sightings of these clouds over populated regions have compelled AIM educators to expand informal education opportunities to communities worldwide. Collaborations with informal organizations include: Museums/Science Centers; NASA Sun-Earth Connection Forum; Alaska Native Ways of Knowing Project; Amateur Noctilucent Cloud Observers Organization; National Parks Education Programs; After School Science Clubs; Public Broadcasting Associations; and National Public Radio. The Native Ways of Knowing Project is an excellent example of informal collaboration with the AIM EPO. This Alaska based project will assist native peoples of the state with photographing NLCs for the EPO website. It will also aid the EPO with developing materials for informal organizations that incorporate traditional native knowledge and science, related to the sky. Another AIM collaboration that will offer citizens lasting informal education opportunities is the one established with the United States National Parks

  5. SKI review of SKB research programs 1992. Compilation of scientific reports

    International Nuclear Information System (INIS)

    1993-03-01

    Swedish Nuclear Power Inspectorate (SKI) has reviewed the research programs 1992 of the Swedish Nuclear Fuel and Waste Management Co (SKB). This report presents the examination of the individual programs

  6. DLVM: A modern compiler infrastructure for deep learning systems

    OpenAIRE

    Wei, Richard; Schwartz, Lane; Adve, Vikram

    2017-01-01

    Deep learning software demands reliability and performance. However, many of the existing deep learning frameworks are software libraries that act as an unsafe DSL in Python and a computation graph interpreter. We present DLVM, a design and implementation of a compiler infrastructure with a linear algebra intermediate representation, algorithmic differentiation by adjoint code generation, domain-specific optimizations and a code generator targeting GPU via LLVM. Designed as a modern compiler ...
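    DLVM's actual intermediate representation is not shown in the abstract; the idea of "algorithmic differentiation by adjoint code generation" it mentions can be illustrated with a minimal reverse-mode sketch in Python. All names here are hypothetical and unrelated to DLVM's implementation.

```python
class Var:
    """Scalar node in an expression graph; .grad is filled by the adjoint pass."""
    def __init__(self, value, parents=()):
        self.value, self.parents, self.grad = value, parents, 0.0

    def __mul__(self, other):
        # record local partial derivatives for the adjoint sweep
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

def backward(out):
    """Propagate adjoints from the output back to every input.

    A naive stack walk suffices for this small expression; a full
    implementation would visit nodes in reverse topological order.
    """
    out.grad = 1.0
    stack = [out]
    while stack:
        node = stack.pop()
        for parent, local in node.parents:
            parent.grad += node.grad * local
            stack.append(parent)

x, y = Var(3.0), Var(4.0)
z = x * y + x          # z = x*y + x, so dz/dx = y + 1, dz/dy = x
backward(z)
print(x.grad, y.grad)  # 5.0 3.0
```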

  7. Materials Sciences programs. Fiscal year 1982

    International Nuclear Information System (INIS)

    1982-09-01

    The purpose of this report is to provide a convenient compilation and index of the DOE Materials Sciences Division programs. This compilation is intended for use by administrators, managers, and scientists to help coordinate research and as an aid in selecting new programs. The report is divided into five sections. Section A contains all laboratory projects, Section B has all contract research projects, Section C has information on DOE collaborative research centers, Section D shows distribution of funding, and Section E has various indices

  8. A comparative study of programming languages for next-generation astrodynamics systems

    Science.gov (United States)

    Eichhorn, Helge; Cano, Juan Luis; McLean, Frazer; Anderl, Reiner

    2018-03-01

    Due to the computationally intensive nature of astrodynamics tasks, astrodynamicists have relied on compiled programming languages such as Fortran for the development of astrodynamics software. Interpreted languages such as Python, on the other hand, offer higher flexibility and development speed thereby increasing the productivity of the programmer. While interpreted languages are generally slower than compiled languages, recent developments such as just-in-time (JIT) compilers or transpilers have been able to close this speed gap significantly. Another important factor for the usefulness of a programming language is its wider ecosystem which consists of the available open-source packages and development tools such as integrated development environments or debuggers. This study compares three compiled languages and three interpreted languages, which were selected based on their popularity within the scientific programming community and technical merit. The three compiled candidate languages are Fortran, C++, and Java. Python, Matlab, and Julia were selected as the interpreted candidate languages. All six languages are assessed and compared to each other based on their features, performance, and ease-of-use through the implementation of idiomatic solutions to classical astrodynamics problems. We show that compiled languages still provide the best performance for astrodynamics applications, but JIT-compiled dynamic languages have reached a competitive level of speed and offer an attractive compromise between numerical performance and programmer productivity.
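    The study's benchmark implementations are not included in the abstract; a typical "classical astrodynamics problem" used in such language comparisons is Kepler's equation M = E - e*sin(E), solved here by Newton iteration as a minimal Python sketch (function name and tolerances are illustrative choices, not the paper's):

```python
import math

def kepler_E(M, e, tol=1e-12, max_iter=50):
    """Eccentric anomaly E from mean anomaly M (rad) and eccentricity e,
    via Newton's method on f(E) = E - e*sin(E) - M."""
    E = M if e < 0.8 else math.pi   # common starting guess
    for _ in range(max_iter):
        f = E - e * math.sin(E) - M
        E -= f / (1.0 - e * math.cos(E))
        if abs(f) < tol:
            return E
    raise RuntimeError("Newton iteration did not converge")

E = kepler_E(math.radians(30.0), 0.1)
print(math.degrees(E))
```

Tight scalar loops like this one are exactly where JIT compilers for dynamic languages close most of the gap to compiled code.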

  9. The technical results of the Swedish nuclear weapons programme - a compilation of FOAs annual reports 1945-1972

    International Nuclear Information System (INIS)

    Oliver, L.; Stenholm, L.

    2002-02-01

    The aim of this report is to summarise FOA's nuclear-weapons-related research performed in 1945-1972. The report is a compilation of FOA's annual reports, which were originally classified but have now - mostly - been declassified. References to separate reports in the different research areas are included in the report

  10. Cross-compilation of ATLAS online software to the power PC-Vx works system

    International Nuclear Information System (INIS)

    Tian Yuren; Li Jin; Ren Zhengyu; Zhu Kejun

    2005-01-01

    BES III selected the ATLAS online software as the framework of its run-control system. Since BES III uses the PowerPC-VxWorks system on its front-end readout system, it was necessary to cross-compile this software for PowerPC-VxWorks. The article discusses several aspects of this project, such as the structure and organization of the ATLAS online software, the application of the CMT tool in cross-compiling, the selection and configuration of the cross-compiler, and methods to solve various problems arising from differences in compiler and operating system. After cross-compilation the software runs normally and, together with the software running on the Linux system, makes up a complete run-control system. (authors)

  11. Transportation legislative data base: State radioactive materials transportation statute compilation, 1989--1993

    International Nuclear Information System (INIS)

    1994-04-01

    The Transportation Legislative Data Base (TLDB) is a computer-based information service containing summaries of federal, state and certain local government statutes and regulations relating to the transportation of radioactive materials in the United States. The TLDB has been operated by the National Conference of State Legislatures (NCSL) under cooperative agreement with the US Department of Energy's (DOE) Office of Civilian Radioactive Waste Management since 1992. The data base system serves the legislative and regulatory information needs of federal, state, tribal and local governments, the affected private sector and interested members of the general public. Users must be approved by DOE and NCSL. This report is a state statute compilation that updates the 1989 compilation produced by Battelle Memorial Institute, the previous manager of the data base. This compilation includes statutes not included in the prior compilation, as well as newly enacted laws. Statutes not included in the prior compilation show an enactment date prior to 1989. Statutes that deal with low-level radioactive waste transportation are included in the data base as are statutes from the states of Alaska and Hawaii. Over 155 new entries to the data base are summarized in this compilation

  12. ARIANE: a scientific programming assisting system

    International Nuclear Information System (INIS)

    Kavenoky, A.; Lautard, J.J.; Robeau, M.F.

    1982-06-01

    The ARIANE system was designed to ease the development, maintenance and operation of scientific programs. ARIANE is divided into three elementary functions: (1) a pre-compiler processes a superset of FORTRAN allowing virtual memory simulation (LAGD translator), with the OTOMAT library used at run time to perform storage management; (2) a dynamic loader makes the standard linkage-editor step and the generation of overlays unnecessary; (3) the logical chaining of the mathematical modules is controlled by the ARIANE language: the user submits to the ARIANE compiler a program describing the logical algorithm to be performed, and the compiler output is executed. The ARIANE system was designed for IBM computers running under OS/VS1 or VS2; a Cray version has been generated and is now operational [fr]

  13. Compilation of solar abundance data

    International Nuclear Information System (INIS)

    Hauge, Oe.; Engvold, O.

    1977-01-01

    Interest in the previous compilations of solar abundance data by the same authors (ITA--31 and ITA--39) has led to this third, revised edition. Solar abundance data of 67 elements are tabulated and in addition upper limits for the abundances of 5 elements are listed. References are made to 167 papers. A recommended abundance value is given for each element. (JIW)

  14. Compilation of piping benchmark problems - Cooperative international effort

    Energy Technology Data Exchange (ETDEWEB)

    McAfee, W J [comp.

    1979-06-01

    This report is the culmination of an effort initiated in 1976 by the IWGFR to evaluate detailed and simplified analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IWGFR countries descriptions of tests and test results for piping systems or bends, to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analysis results. The Oak Ridge National Laboratory agreed to coordinate this activity, including compilation of the original problems and the final analyses results. Of the problem descriptions submitted three were selected to be used. These were issued in December 1977. As a follow-on activity, addenda were issued that provided additional data or corrections to the original problem statement. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this report. All solutions submitted have also been included in order to provide users of this report the information necessary to make their own comparisons or evaluations.

  15. Compilation of piping benchmark problems - Cooperative international effort

    International Nuclear Information System (INIS)

    McAfee, W.J.

    1979-06-01

    This report is the culmination of an effort initiated in 1976 by the IWGFR to evaluate detailed and simplified analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IWGFR countries descriptions of tests and test results for piping systems or bends, to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analysis results. The Oak Ridge National Laboratory agreed to coordinate this activity, including compilation of the original problems and the final analyses results. Of the problem descriptions submitted three were selected to be used. These were issued in December 1977. As a follow-on activity, addenda were issued that provided additional data or corrections to the original problem statement. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this report. All solutions submitted have also been included in order to provide users of this report the information necessary to make their own comparisons or evaluations

  16. Computer Program Development Specification for Ada Integrated Environment. Ada Compiler Phases B5-AIE (1). COMP (1).

    Science.gov (United States)

    1982-11-05

    Recoverable fragments of the scanned specification describe the B5-AIE(1).COMP(1) assembly layout - an exception-handler map (EZMAP) followed by the code (instructions and literals), with the BODY label marking the entry point to the unit - and note stated limits, e.g. that the compiler allows at most 200 subdomains to be accessible at once, which limits the number of units that may be WITHed

  17. Compilation of Instantaneous Source Functions for Varying ...

    African Journals Online (AJOL)

    Compilation of Instantaneous Source Functions for Varying Architecture of a Layered Reservoir with Mixed Boundaries and Horizontal Well Completion Part IV: Normal and Inverted Letter 'h' and 'H' Architecture.

  18. Compilation and analysis of Escherichia coli promoter DNA sequences.

    OpenAIRE

    Hawley, D K; McClure, W R

    1983-01-01

    The DNA sequences of 168 promoter regions (-50 to +10) for Escherichia coli RNA polymerase were compiled. The complete listing was divided into two groups depending upon whether or not the promoter had been defined by genetic (promoter mutations) or biochemical (5' end determination) criteria. A consensus promoter sequence based on homologies among 112 well-defined promoters was determined that was in substantial agreement with previous compilations. In addition, we have tabulated 98 promoter ...
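    The Hawley and McClure compilation itself is tabular; the consensus-building step it describes can be sketched as a per-position majority vote over aligned promoter windows. The sequences below are toy examples, not the real data set.

```python
from collections import Counter

def consensus(seqs):
    """Most frequent base at each position of equal-length aligned sequences."""
    assert len({len(s) for s in seqs}) == 1, "sequences must be aligned"
    # zip(*seqs) yields one column of bases per position
    return "".join(Counter(col).most_common(1)[0][0] for col in zip(*seqs))

# Toy -10 hexamer windows; the E. coli -10 consensus reported there is TATAAT
windows = ["TATAAT", "TATACT", "TACAAT", "GATAAT"]
print(consensus(windows))  # TATAAT
```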

  19. Compile-Time Debugging of C Programs Working on Trees

    DEFF Research Database (Denmark)

    Elgaard, Jacob; Møller, Anders; Schwartzbach, Michael I.

    2000-01-01

    ...of an initial store that leads to an error is automatically generated. This extends previous work that uses a similar technique to verify a simpler syntax manipulating only list structures. In that case, programs are translated into WS1S formulas. A naive generalization to recursive data-types determines...

  20. Compiler-Agnostic Function Detection in Binaries

    NARCIS (Netherlands)

    Andriesse, D.A.; Slowinska, J.M.; Bos, H.J.

    2017-01-01

    We propose Nucleus, a novel function detection algorithm for binaries. In contrast to prior work, Nucleus is compiler-agnostic, and does not require any learning phase or signature information. Instead of scanning for signatures, Nucleus detects functions at the Control Flow Graph-level, making it

  1. Materials Sciences programs, Fiscal year 1993

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-02-01

    This report provides a compilation and index of the DOE Materials Sciences Division programs; the compilation is to assist administrators, managers, and scientists to help coordinate research. The report is divided into 7 sections: laboratory projects, contract research projects, small business innovation research, major user facilities, other user facilities, funding level distributions, and indexes.

  2. Regulatory and technical reports (abstract index journal): Annual compilation for 1987

    International Nuclear Information System (INIS)

    1988-03-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually

  3. Materials Sciences Programs. Fiscal Year 1985

    International Nuclear Information System (INIS)

    1985-09-01

    The purpose of this report is to provide a convenient compilation and index of the DOE Materials Sciences Division programs. This compilation is primarily intended for use by administrators, managers, and scientists to help coordinate research. The report is divided into six sections. Section A contains all Laboratory projects, Section B has all contract research projects, Section C has projects funded under the Small Business Innovation Research Program, Sections D and E have information on DOE collaborative research centers, Section F gives distribution of funding, and Section G has various indexes

  4. Compilation of Instantaneous Source Functions for Varying ...

    African Journals Online (AJOL)

    Compilation of Instantaneous Source Functions for Varying Architecture of a Layered Reservoir with Mixed Boundaries and Horizontal Well Completion Part III: B-Shaped Architecture with Vertical Well in the Upper Layer.

  5. Monte Carlo programs and other utilities for high energy physics

    International Nuclear Information System (INIS)

    Palounek, A.P.T.; Youssef, S.

    1990-05-01

    The Software Standards and Documentation Group of the Workshop on Physics and Detector Simulation for SSC Experiments has compiled a list of physics generators, detector simulations, and related programs. This is not meant to be an exhaustive compilation, nor is any judgment made about program quality; it is a starting point for a more complete bibliography. Where possible we have included an author and source for the code. References for most programs are in the final section

  6. Gulf Coast geopressured-geothermal program summary report compilation. Volume 4: Bibliography (annotated only for all major reports)

    Energy Technology Data Exchange (ETDEWEB)

    John, C.J.; Maciasz, G.; Harder, B.J.

    1998-06-01

    This bibliography contains US Department of Energy sponsored Geopressured-Geothermal reports published after 1984. Reports published prior to 1984 are documented in the Geopressured Geothermal bibliography Volumes 1, 2, and 3 that the Center for Energy Studies at the University of Texas at Austin compiled in May 1985. It represents reports, papers and articles covering topics from the scientific and technical aspects of geopressured geothermal reservoirs to the social, environmental, and legal considerations of exploiting those reservoirs for their energy resources.

  7. Regulatory and technical reports. Compilation for second quarter 1982, April to June

    International Nuclear Information System (INIS)

    1982-08-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. A detailed explanation of the entries precedes each index

  8. A Compilation of MATLAB Scripts and Functions for MACGMC Analyses

    Science.gov (United States)

    Murthy, Pappu L. N.; Bednarcyk, Brett A.; Mital, Subodh K.

    2017-01-01

    The primary aim of the current effort is to provide scripts that automate many of the repetitive pre- and post-processing tasks associated with composite materials analyses using the Micromechanics Analysis Code with the Generalized Method of Cells (MACGMC). This document consists of a compilation of hundreds of scripts that were developed in the MATLAB (The MathWorks, Inc., Natick, MA) programming language and consolidated into 16 MATLAB functions. MACGMC is a composite material and laminate analysis software code developed at NASA Glenn Research Center. The software package has been built around the generalized method of cells (GMC) family of micromechanics theories. The computer code is developed with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated to increase the user friendliness, as well as to make it more robust in terms of input preparation and code execution. Finally, classical lamination theory has been implemented within the software, wherein GMC is used to model the composite material response of each ply. Thus, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. The pre-processing tasks include generation of a multitude of different repeating unit cells (RUCs) for CMCs and PMCs, visualization of RUCs from MACGMC input and output files, and generation of the RUC section of a MACGMC input file. The post-processing tasks include visualization of the predicted composite response, such as local stress and strain contours, damage initiation and progression, stress-strain behavior, and fatigue response. In addition to the above, several miscellaneous scripts have been developed that can be used to perform repeated Monte-Carlo simulations to enable probabilistic

  9. Gravity Data for Indiana (300 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity data (300 records) were compiled by Purdue University. This data base was received in February 1993. Principal gravity parameters include Free-air...

  10. Compiler Driven Code Comments and Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Karlsson, Sven

    2011-01-01

    ...We demonstrate the ability of our tool to transform code, and suggest code refactorings that increase its amenability to optimization. The preliminary results show that, with our tool-set, automatic loop parallelization with the GNU C compiler, gcc, yields 8.6x best-case speedup over...

  11. Regulatory and technical reports: compilation for third quarter 1982 July-September

    International Nuclear Information System (INIS)

    1982-11-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. These precede the following indexes: Contractor Report Number Index; Personal Author Index; Subject Index; NRC Originating Organization Index (Staff Reports); NRC Contract Sponsor Index (Contractor Reports); Contractor Index; and Licensed Facility Index

  12. A type-driven approach to concrete meta programming.

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen)

    2005-01-01

    Applications that manipulate programs as data are called meta programs. Examples of meta programs are compilers, source-to-source translators and code generators. Meta programming can be supported by the ability to represent program fragments in concrete syntax instead of abstract

  13. MetaJC++: A flexible and automatic program transformation technique using meta framework

    Science.gov (United States)

    Beevi, Nadera S.; Reghu, M.; Chitraprasad, D.; Vinodchandra, S. S.

    2014-09-01

    A compiler is a tool to translate abstract code containing natural-language terms into machine code. Meta compilers are available that compile more than one language. We have developed a meta framework that combines two dissimilar programming languages, namely C++ and Java, to provide a flexible object-oriented programming platform for the user. Suitable constructs from both languages have been combined, thereby forming a new and stronger meta-language. The framework is developed using the compiler-writing tools Flex and Yacc to design the front end of the compiler. The lexer and parser have been developed to accommodate the complete keyword set and syntax of both languages. Two intermediate representations are used in the translation of the source program to machine code. An abstract syntax tree is used as a high-level intermediate representation that preserves the hierarchical properties of the source program. A new machine-independent stack-based byte-code has also been devised to act as a low-level intermediate representation. The byte-code is organised into an output class file that can be used to produce an interpreted output. The results, especially in the sphere of providing C++ concepts in Java, give an insight into the potentially strong features of the resultant meta-language.
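    The MetaJC++ byte-code format is not specified in the abstract; the AST-to-stack-byte-code step it describes can be illustrated with a minimal expression compiler and stack machine. All opcode names here are hypothetical.

```python
def compile_expr(ast):
    """Post-order walk of a nested-tuple AST -> list of stack byte-codes."""
    if isinstance(ast, (int, float)):
        return [("PUSH", ast)]
    op, left, right = ast
    # operands are compiled first, then the operator: classic stack code
    return compile_expr(left) + compile_expr(right) + [(op, None)]

def run(code):
    """Tiny stack machine interpreting the byte-code."""
    stack = []
    for op, arg in code:
        if op == "PUSH":
            stack.append(arg)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append({"ADD": a + b, "MUL": a * b}[op])
    return stack.pop()

# (2 + 3) * 4
code = compile_expr(("MUL", ("ADD", 2, 3), 4))
print(run(code))  # 20
```

Separating the tree walk from the machine is what makes the byte-code machine-independent: any back end that understands the opcodes can execute the same output.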

  14. Using MaxCompiler for High Level Synthesis of Trigger Algorithms

    CERN Document Server

    Summers, Sioni Paris; Sanders, P.

    2017-01-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java-based tool for developing FPGA applications, which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy-sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman filter track-fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage and has a latency of 187.5 ns per iteration.

  15. Strategic bioenergy research. A knowledge compilation and synthesis of research projects funded by the Swedish Energy Agency's fuel program 2007-2011; Strategisk bioenergiforskning. En kunskapssammanstaellning och syntes av forskningsprojekt finansierade av Energimyndighetens braensleprogram 2007-2011

    Energy Technology Data Exchange (ETDEWEB)

    Gode, Jenny; Gustavsson, Mathias; Hoeglund, Jonas; Hellsten, Sofie; Martinsson, Fredrik; Stadmark, Johanna [IVL Svenska Miljoeinstitutet, Stockholm (Sweden)

    2012-11-01

    During 2007-2011 the Swedish Energy Agency ran the program 'Sustainable supply and processing of biofuels'. To summarise the state of knowledge, identify knowledge gaps and analyse the results in a broader context, three synthesis reports were produced in the program's final phase. This report is one of these synthesis reports and concerns the area of strategic bioenergy research. In this context, 'strategic' means research that is significant from the system, marketing and/or policy perspective. The work is based on research conducted mainly in the research programme 'Sustainable supply and processing of biofuels'. This report constitutes the final report of the synthesis project on strategic bioenergy research and includes knowledge compilation, identification of knowledge gaps and synthesis. The results of the synthesis project provide a basis for planning new research programs under the auspices of the Swedish Energy Agency. The two other synthesis projects concern forest fuels as well as energy crops and fuel quality. The report covers a rather broad field of research, e.g. environmental impact, carbon balances, nitrous oxide, bioenergy systems, scenarios, trade and marketing, standardization and certification. The work has been based on project plans and publications for a predefined number of projects, as well as on interviews and discussions with project leaders. Furthermore, several seminars and workshops also provided information for the compilation. Other studies have also been taken into account to some extent.

  16. Clean translation of an imperative reversible programming language

    DEFF Research Database (Denmark)

    Axelsen, Holger Bock

    2011-01-01

    We describe the translation techniques used for the code generation in a compiler from the high-level reversible imperative programming language Janus to the low-level reversible assembly language PISA. Our translation is both semantics preserving (correct), in that target programs compute exactly the same functions as their source programs (cleanly, with no extraneous garbage output), and efficient, in that target programs conserve the complexities of source programs. In particular, target programs only require a constant amount of temporary garbage space. The given translation methods are generic, and should be applicable to any (imperative) reversible source language described with reversible flowcharts and reversible updates. To our knowledge, this is the first compiler between reversible languages where the source and target languages were independently developed; the first exhibiting both...
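    Neither Janus nor PISA is reproduced in this record; the notion of "reversible updates" that such translations rely on can be sketched with a tiny interpreter whose inverse runs the program backwards with each operation inverted. The operation names and program below are hypothetical.

```python
def run(program, store, reverse=False):
    """Execute reversible updates on a variable store.

    Running the steps in reverse order with each operation inverted
    undoes the forward run exactly, leaving no garbage state behind.
    """
    inverse = {"+=": "-=", "-=": "+="}
    steps = reversed(program) if reverse else program
    for var, op, amount in steps:
        if reverse:
            op = inverse[op]
        store[var] += amount if op == "+=" else -amount
    return store

prog = [("x", "+=", 5), ("y", "-=", 2), ("x", "+=", 3)]
s = run(prog, {"x": 1, "y": 10})
print(s)                           # {'x': 9, 'y': 8}
print(run(prog, s, reverse=True))  # {'x': 1, 'y': 10} -- original store recovered
```

The key property, mirrored in the paper's correctness claim, is that forward-then-inverse execution is the identity on the store.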

  17. 2014 Water Power Program Peer Review: Hydropower Technologies, Compiled Presentations (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    2014-02-01

    This document represents a collection of all presentations given during the EERE Wind and Water Power Program's 2014 Hydropower Peer Review. The purpose of the meeting was to evaluate DOE-funded hydropower and marine and hydrokinetic R&D projects for their contribution to the mission and goals of the Water Power Program and to assess progress made against stated objectives.

  18. Materials Sciences programs, fiscal year 1978: Office of Basic Energy Sciences

    International Nuclear Information System (INIS)

    1978-09-01

    A compilation and index are provided of the DOE Materials Sciences Division programs. This compilation is intended for use by administrators, managers, and scientists to help coordinate research and as an aid in selecting new programs. The report is divided into Sections A and B, listing all the projects; Section C, a summary of funding levels; and Section D, an index.

  19. Materials Sciences Programs. Fiscal Year 1980, Office of Basic Energy Sciences

    International Nuclear Information System (INIS)

    1980-09-01

    This report provides a convenient compilation and index of the DOE Materials Sciences Division programs. This compilation is intended for use by administrators, managers, and scientists to help coordinate research and as an aid in selecting new programs. The report is divided into Sections A and B, listing all the projects; Section C, a summary of funding levels; and Section D, an index.

  20. Compilation of contract research for the Chemical Engineering Branch, Division of Engineering Technology. Annual report for FY 1985

    International Nuclear Information System (INIS)

    1986-07-01

    This compilation of annual research reports by the contractors to the Chemical Engineering Branch, DET, is published to disseminate information from ongoing programs and covers research conducted during fiscal year 1985. The programs covered in this document include research on: (1) engineered safety feature (ESF) system effectiveness in terms of fission product retention under severe accident conditions; (2) effectiveness and safety aspects of selected decontamination methods; (3) decontamination impacts on solidification and waste disposal; (4) evaluation of nuclear facility decommissioning projects and concepts, and (5) operational schemes to prevent or mitigate the effects of hydrogen combustion during LWR accidents

  1. Compilation status and research topics in Hokkaido University Nuclear Reaction Data Centre

    International Nuclear Information System (INIS)

    Aikawa, M.; Furutachi, N.; Katō, K.; Ebata, S.; Ichinkhorloo, D.; Imai, S.; Sarsembayeva, A.; Zhou, B.; Otuka, N.

    2015-01-01

    Nuclear reaction data are necessary and applicable for many application fields. The nuclear reaction data must be compiled into a database for convenient availability. One such database is the EXFOR database maintained by the International Network of Nuclear Reaction Data Centres (NRDC). As a member of the NRDC, the Hokkaido University Nuclear Reaction Data Centre (JCPRG) compiles charged-particle induced reaction data and contributes about 10 percent of the EXFOR database. In this paper, we show the recent compilation status and related research topics of JCPRG. (author)

  2. 36 CFR 902.57 - Investigatory files compiled for law enforcement purposes.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Investigatory files compiled for law enforcement purposes. 902.57 Section 902.57 Parks, Forests, and Public Property PENNSYLVANIA AVENUE DEVELOPMENT CORPORATION FREEDOM OF INFORMATION ACT Exemptions From Public Access to Corporation Records § 902.57 Investigatory files compiled...

  3. Compiling models into real-time systems

    International Nuclear Information System (INIS)

    Dormoy, J.L.; Cherriaux, F.; Ancelin, J.

    1992-08-01

    This paper presents an architecture for building real-time systems from models, and model-compiling techniques. This has been applied for building a real-time model-based monitoring system for nuclear plants, called KSE, which is currently being used in two plants in France. We describe how we used various artificial intelligence techniques for building it: a model-based approach, a logical model of its operation, a declarative implementation of these models, and original knowledge-compiling techniques for automatically generating the real-time expert system from those models. Some of those techniques have just been borrowed from the literature, but we had to modify or invent other techniques which simply did not exist. We also discuss two important problems, which are often underestimated in the artificial intelligence literature: size, and errors. Our architecture, which could be used in other applications, combines the advantages of the model-based approach with the efficiency requirements of real-time applications, while in general model-based approaches present serious drawbacks on this point

  4. Compiling models into real-time systems

    International Nuclear Information System (INIS)

    Dormoy, J.L.; Cherriaux, F.; Ancelin, J.

    1992-08-01

    This paper presents an architecture for building real-time systems from models, and model-compiling techniques. This has been applied for building a real-time model-based monitoring system for nuclear plants, called KSE, which is currently being used in two plants in France. We describe how we used various artificial intelligence techniques for building it: a model-based approach, a logical model of its operation, a declarative implementation of these models, and original knowledge-compiling techniques for automatically generating the real-time expert system from those models. Some of those techniques have just been borrowed from the literature, but we had to modify or invent other techniques which simply did not exist. We also discuss two important problems, which are often underestimated in the artificial intelligence literature: size, and errors. Our architecture, which could be used in other applications, combines the advantages of the model-based approach with the efficiency requirements of real-time applications, while in general model-based approaches present serious drawbacks on this point

  5. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available \\primula\\ tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...
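
The compile-then-evaluate approach the record describes can be sketched in miniature. The circuit below is hand-built for a single binary variable (it is not PRIMULA output, and the node encoding is an illustrative assumption): network parameters and evidence indicators sit at the leaves, and online inference is just a bottom-up evaluation of the circuit.

```python
# Minimal arithmetic-circuit evaluation: parameters (CPT entries) and
# evidence indicators are leaves; sums and products are internal nodes.
# Setting an indicator to 0 asserts that the corresponding value is
# ruled out by the evidence.

def evaluate(node, indicators):
    """Recursively evaluate an AC node given evidence indicator values."""
    kind = node[0]
    if kind == "const":          # network parameter
        return node[1]
    if kind == "ind":            # evidence indicator lambda_{X=x}
        return indicators[node[1]]
    if kind == "*":
        product = 1.0
        for child in node[1:]:
            product *= evaluate(child, indicators)
        return product
    if kind == "+":
        return sum(evaluate(child, indicators) for child in node[1:])
    raise ValueError("unknown node kind: " + kind)

# Circuit for a single binary variable A with P(A=1) = 0.3:
#   0.3 * l_a1  +  0.7 * l_a0
circuit = ("+",
           ("*", ("const", 0.3), ("ind", "a1")),
           ("*", ("const", 0.7), ("ind", "a0")))

no_evidence = evaluate(circuit, {"a1": 1, "a0": 1})  # probability of evidence, = 1 here
a_is_true   = evaluate(circuit, {"a1": 1, "a0": 0})  # P(A=1) = 0.3
```

Each online query costs only one linear pass over the circuit; the expensive part (building the circuit from the network) happens once at compile time, which is the point of the technique.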

  6. Safety and maintenance engineering: A compilation

    Science.gov (United States)

    1974-01-01

    A compilation is presented for the dissemination of information on technological developments which have potential utility outside the aerospace and nuclear communities. Safety of personnel engaged in the handling of hazardous materials and equipment, protection of equipment from fire, high wind, or careless handling by personnel, and techniques for the maintenance of operating equipment are reported.

  7. Compilation of cross-sections. Pt. 1

    International Nuclear Information System (INIS)

    Flaminio, V.; Moorhead, W.G.; Morrison, D.R.O.; Rivoire, N.

    1983-01-01

    A compilation of integral cross-sections for hadronic reactions is presented. This is an updated version of CERN/HERA 79-1, 79-2, 79-3. It contains all data published up to the beginning of 1982, but some more recent data have also been included. Plots of the cross-sections versus incident laboratory momentum are also given. (orig.)

  8. Compilation of information on melter modeling

    International Nuclear Information System (INIS)

    Eyler, L.L.

    1996-03-01

    The objective of the task described in this report is to compile information on modeling capabilities for the High-Temperature Melter and the Cold Crucible Melter and issue a modeling capabilities letter report summarizing existing modeling capabilities. The report is to include strategy recommendations for future modeling efforts to support the High Level Waste (HLW) melter development.

  9. Verified compilation of Concurrent Managed Languages

    Science.gov (United States)

    2017-11-01

    Verified Compilation of Concurrent Managed Languages. Purdue University, November 2017. Final technical report, approved for public release; published by the Information Directorate in the interest of scientific and technical information exchange. Only cover-page and reference-list fragments of the record text are available.

  10. Compilation of cross-sections. Pt. 4

    International Nuclear Information System (INIS)

    Alekhin, S.I.; Ezhela, V.V.; Lugovsky, S.B.; Tolstenkov, A.N.; Yushchenko, O.P.; Baldini, A.; Cobal, M.; Flaminio, V.; Capiluppi, P.; Giacomelli, G.; Mandrioli, G.; Rossi, A.M.; Serra, P.; Moorhead, W.G.; Morrison, D.R.O.; Rivoire, N.

    1987-01-01

    This is the fourth volume in our series of data compilations on integrated cross-sections for weak, electromagnetic, and strong interaction processes. This volume covers data on reactions induced by photons, neutrinos, hyperons, and K_L^0 mesons. It contains all data published up to June 1986. Plots of the cross-sections versus incident laboratory momentum are also given. (orig.)

  11. Compilation of nuclear safety criteria potential application to DOE nonreactor facilities

    International Nuclear Information System (INIS)

    1992-03-01

    This bibliographic document compiles nuclear safety criteria applied to the various areas of nuclear safety addressed in a Safety Analysis Report for a nonreactor nuclear facility (NNF). The criteria listed are derived from federal regulations, Nuclear Regulatory Commission (NRC) guides and publications, DOE and DOE contractor publications, and industry codes and standards. The titles of the chapters and sections of Regulatory Guide 3.26, ''Standard Format and Content of Safety Analysis Reports for Fuel Reprocessing Plants'' were used to format the chapters and sections of this compilation. In each section the criteria are compiled in four groups, namely: (1) Code of Federal Regulations, (2) USNRC Regulatory Guides, (3) Codes and Standards, and (4) Supplementary Information

  12. Multiprocessor programming environment

    Energy Technology Data Exchange (ETDEWEB)

    Smith, M.B.; Fornaro, R.

    1988-12-01

    Programming tools and techniques have been well developed for traditional uniprocessor computer systems. The focus of this research project is on the development of a programming environment for a high speed real time heterogeneous multiprocessor system, with special emphasis on languages and compilers. The new tools and techniques will allow a smooth transition for programmers with experience only on single processor systems.

  13. Parallelizing More Loops with Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    2012-01-01

    an interactive compilation feedback system that guides programmers in iteratively modifying their application source code. This helps leverage the compiler’s ability to generate loop-parallel code. We employ our system to modify two sequential benchmarks dealing with image processing and edge detection...
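
The refactorings such feedback systems suggest typically amount to making loop iterations independent so they can run in parallel. The sketch below is illustrative only (a toy edge-detection kernel, parallelized with a thread pool rather than by a compiler): once each row is processed without cross-iteration writes, the loop is a plain parallel map.

```python
# A loop whose iterations are independent (each row is read and written
# separately) can be turned into a parallel map; the serial and parallel
# versions must produce identical results.
from concurrent.futures import ThreadPoolExecutor

def row_gradient(row):
    """Per-row kernel: absolute horizontal differences."""
    return [abs(row[i + 1] - row[i]) for i in range(len(row) - 1)]

def gradients_serial(image):
    return [row_gradient(r) for r in image]

def gradients_parallel(image):
    # Iterations share no state, so pool.map preserves the semantics.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(row_gradient, image))

image = [[0, 2, 5], [7, 7, 1]]
assert gradients_serial(image) == gradients_parallel(image) == [[2, 3], [0, 6]]
```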

  14. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference...

  15. FY-2007 PNNL Voluntary Protection Program (VPP) Program Evaluation

    International Nuclear Information System (INIS)

    Wright, Patrick A.; Fisher, Julie A.; Goheen, Steven C.; Isern, Nancy G.; Madson, Vernon J.; Meicenheimer, Russell L.; Pugh, Ray; Schneirla, Keri A.; Shockey, Loretta L.; Tinker, Mike R.

    2008-01-01

    This document reports the results of the FY-2007 PNNL VPP Program Evaluation, which is a self-assessment of the operational and programmatic performance of the Laboratory related to worker safety and health. The report was compiled by a team of worker representatives and safety professionals who evaluated the Laboratory's worker safety and health programs on the basis of DOE-VPP criteria. The principal elements of DOE's VPP program are: Management Leadership, Employee Involvement, Worksite Analysis, Hazard Prevention and Control, and Safety and Health Training.

  16. Deep knowledge and knowledge compilation for dynamic systems

    International Nuclear Information System (INIS)

    Mizoguchi, Riichiro

    1994-01-01

    Expert systems are viewed as knowledge-based systems which efficiently solve real-world problems based on the expertise contained in their knowledge bases, elicited from domain experts. Although such expert systems that depend on the heuristics of domain experts have contributed to the current success, they are known to be brittle and hard to build. This paper is concerned with research on model-based diagnosis and knowledge compilation for dynamic systems conducted by the author's group to overcome these difficulties. Firstly, we summarize the advantages and shortcomings of expert systems. Secondly, deep knowledge and knowledge compilation are discussed. Then, the latest results of our research on model-based diagnosis are overviewed. The future direction of knowledge base technology research is also discussed. (author)

  17. Materials sciences programs: Fiscal year 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-05-01

    The purpose of this report is to provide a convenient compilation and index of the DOE Materials Sciences Division programs. This compilation is primarily intended for use by administrators, managers, and scientists to help coordinate research. The report is divided into eight sections. Section A contains all Laboratory projects, Section B has all contract research projects, Section C has projects funded under the Small Business Innovation Research Program, Section D describes the Center of Excellence for the Synthesis and Processing of Advanced Materials, Section E has information on major user facilities, Section F describes other user facilities, Section G is a summary of funding levels, and Section H has indices characterizing research projects.

  18. Materials sciences programs fiscal year 1996

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    The purpose of this report is to provide a convenient compilation and index of the DOE Materials Sciences Division programs. This compilation is primarily intended for use by administrators, managers, and scientists to help coordinate research. The report is divided into eight sections. Section A contains all Laboratory projects, Section B has all contract research projects, Section C has projects funded under the Small Business Innovation Research Program, Section D describes the Center of Excellence for the Synthesis and Processing of Advanced Materials, Section E has information on major user facilities, Section F describes other user facilities, Section G is a summary of funding levels, and Section H has indices characterizing research projects.

  19. Energy efficiency buildings program, FY 1980

    Energy Technology Data Exchange (ETDEWEB)

    1981-05-01

    A separate abstract was prepared on research progress in each group at LBL in the energy efficient buildings program. Two separate abstracts were prepared for the Windows and Lighting Program. Abstracts prepared on other programs are: Energy Performance of Buildings; Building Ventilation and Indoor Air Quality Program; DOE-2.1 Building Energy Analysis; and Building Energy Data Compilation, Analysis, and Demonstration. (MCW)

  20. GSG-GIS development program plan

    International Nuclear Information System (INIS)

    Lee, R.C.

    1992-01-01

    For the past 40 years, the Savannah River Site (SRS) has been subjected to numerous geological and geotechnical investigations in support of facility construction and waste site development and remediation. Over this period, a variety of different subcontractors have collected large quantities of geoscience data. In addition, current programs involve numerous investigators from different departments, and consequently, earth science data and interpretations are scattered among the departments, investigators, and subcontractors at SRS. As a result, scientific and management decisions cannot take advantage of the significant body of information that exists at SRS. Recent DOE Orders (Systematic Evaluation Program, 1991) have put specific requirements on their contractors to compile geological databases to coordinate DOE site data gathering and interpretations, and to assist in compiling safety analysis reports. The Earth Science Advisory Committee and the Environmental Advisory Committee have also made specific recommendations on the management of SRS geoscience data. This plan describes a management system to identify, communicate, and compile SRS geological (including geohydrologic), seismological, and geotechnical data and interpretations on a Geographic Information System (GIS)

  1. National energetic balance. Statistical compilation 1985-1991

    International Nuclear Information System (INIS)

    1992-01-01

    Compiles the statistical information supplied by the governmental and private institutions that make up the national energy sector in Paraguay. The first part refers to the overall energy supply; the second, to energy transformation centres; and the last part presents the energy flows, consolidated balances and other energy-economic indicators

  2. 13 CFR 146.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Semi-annual compilation. 146.600 Section 146.600 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION NEW RESTRICTIONS ON LOBBYING.... (c) Information that involves intelligence matters shall be reported only to the Select Committee on...

  3. Modeling EERE Deployment Programs

    Energy Technology Data Exchange (ETDEWEB)

    Cort, K. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hostick, D. J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Belzer, D. B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Livingston, O. V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2007-11-01

    This report compiles information and conclusions gathered as part of the “Modeling EERE Deployment Programs” project. The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge in which future research is needed.

  4. Research on the Maritime Communication Cryptographic Chip’s Compiler Optimization

    Directory of Open Access Journals (Sweden)

    Sheng Li

    2017-08-01

    Full Text Available In the process of ocean development, the technology for maritime communication systems is a hot research field, in which information security is vital for the normal operation of the whole system and is also one of the difficulties in maritime communication research. In this paper, a maritime communication cryptographic SOC (system on chip) is introduced, and its compiler framework is put forward through analysis of its working mode and the problems faced by the compiler front end. Then, a loop unrolling factor calculating algorithm based on queueing theory, named UFBOQ (unrolling factor based on queue), is proposed to perform parallel optimization in the compiler front end while respecting the instruction memory capacity limit. Finally, the scalar replacement method is used to optimize the unrolled code, reducing the impact of memory access latency on parallel computing efficiency by exploiting the contiguous data storage characteristics of the cryptographic algorithm. The UFBOQ algorithm and scalar replacement prove effective and appropriate, achieving linear speedup.
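
The two source-level transformations the abstract combines can be shown by hand (the UFBOQ factor-selection model itself is not reproduced, and the chained-XOR kernel is an illustrative stand-in for a cryptographic loop): unrolling processes several elements per trip, and scalar replacement keeps the loop-carried value in a local variable instead of re-reading it from the output array.

```python
# Baseline vs. unrolled-by-4 with scalar replacement; both compute
# out[i] = out[i-1] ^ data[i] and must agree element for element.

def chain_xor(data):
    """Baseline: re-reads out[i-1] from the array on every iteration."""
    if not data:
        return []
    out = [0] * len(data)
    out[0] = data[0]
    for i in range(1, len(data)):
        out[i] = out[i - 1] ^ data[i]   # loop-carried memory read
    return out

def chain_xor_opt(data):
    """Unrolled by 4; the carried value lives in the local `acc`."""
    if not data:
        return []
    n = len(data)
    out = [0] * n
    acc = data[0]                       # scalar replacement of out[i-1]
    out[0] = acc
    i = 1
    while i + 3 < n:                    # main body: 4 elements per trip
        acc ^= data[i];     out[i] = acc
        acc ^= data[i + 1]; out[i + 1] = acc
        acc ^= data[i + 2]; out[i + 2] = acc
        acc ^= data[i + 3]; out[i + 3] = acc
        i += 4
    while i < n:                        # epilogue for the remainder
        acc ^= data[i]
        out[i] = acc
        i += 1
    return out
```

The epilogue loop is the price of unrolling when the trip count is not a multiple of the factor; choosing that factor against instruction-memory limits is what the paper's queueing model addresses.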

  5. Fifth Baltic Sea pollution load compilation (PLC-5). An executive summary

    Energy Technology Data Exchange (ETDEWEB)

    Svendsen, L.M.; Staaf, H.; Pyhala, M.; Kotilainen, P.; Bartnicki, J.; Knuuttila, S.; Durkin, M.

    2012-07-01

    This report summarizes and combines the main results of the Fifth Baltic Sea Pollution Load Compilation (HELCOM 2011) which covers waterborne loads to the sea and data on atmospheric loads which are submitted by countries to the co-operative programme for monitoring and evaluation of the long range transmission of air pollutants in Europe (EMEP), which subsequently compiles and reports this information to HELCOM.

  6. A compilation of reports of the Advisory Committee on Reactor Safeguards: 1992 Annual

    International Nuclear Information System (INIS)

    1993-04-01

    This compilation contains 50 ACRS reports submitted to the Commission, Executive Director for Operations, or to the Office of Nuclear Regulatory Research, during calendar year 1992. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room and the US Library of Congress. The reports are divided into two groups: Part 1: ACRS Reports on Project Reviews, and Part 2: ACRS Reports on Generic Subjects. Part 1 contains ACRS reports alphabetized by project name and by chronological order within project name. Part 2 categorizes the reports by the most appropriate generic subject area and by chronological order within subject area

  7. A compilation of reports of the Advisory Committee on Reactor Safeguards, 1990 annual

    International Nuclear Information System (INIS)

    1991-04-01

    This compilation contains 31 Advisory Committee on Reactor Safeguards (ACRS) reports submitted to the Commission or to the Executive Director for Operations during calendar year 1990. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room and the US Library of Congress. The reports are divided into two groups: Part 1: ACRS Reports on Project Reviews, and Part 2: ACRS Reports on Generic Subject. Part 1 contains ACRS reports alphabetized by project name and by chronological order within project name. Part 2 categorizes the reports by the most appropriate generic subject area and by chronological order within subject area

  8. A compilation of reports of the Advisory Committee on Reactor Safeguards: 1987 annual

    International Nuclear Information System (INIS)

    1988-04-01

    This compilation contains 47 ACRS reports submitted to the Commission or to the Executive Director for Operations during calendar year 1987. It also includes a report to the Congress on the NRC Safety Research Program for FY 1988. All reports have been made available to the public through the NRC Public Document Room and the US Library of Congress. The reports are divided into two groups: Part 1: ACRS Reports on Project Reviews, and Part 2: ACRS Reports on Generic Subjects. Part 1 contains ACRS reports alphabetized by project name and within project name by chronological order. Part 2 categorizes the reports by the most appropriate generic subject area and within subject area by chronological order

  9. A compilation of reports of the Advisory Committee on Reactor Safeguards: 1989 annual

    International Nuclear Information System (INIS)

    1990-04-01

    This compilation contains 54 ACRS reports submitted to the Commission or to the Executive Director for Operations during calendar year 1989. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room and the US Library of Congress. The reports are divided into two groups: Part 1 -- ACRS Reports on Project Reviews, and Part 2 -- ACRS Reports on Generic Subjects. Part 1 contains ACRS reports alphabetized by project name and within project name by chronological order. Part 2 categorizes the reports by the most appropriate generic subject area and within subject area by chronological order

  10. A compilation of reports of the Advisory Committee on Reactor Safeguards: 1988 annual

    International Nuclear Information System (INIS)

    1989-04-01

    This compilation contains 47 ACRS reports submitted to the Commission or to the Executive Director for Operations during calendar year 1988. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room and the US Library of Congress. The reports are divided into two groups: Part 1, ACRS Reports on Project Reviews, and Part 2, ACRS Reports on Generic Subjects. Part 1 contains ACRS reports alphabetized by project name and within project name by chronological order. Part 2 categorizes the reports by the most appropriate generic subject area and within subject area by chronological order. 136 refs., 1 tab

  11. FY-2007 PNNL Voluntary Protection Program (VPP) Program Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Patrick A.; Fisher, Julie A.; Goheen, Steven C.; Isern, Nancy G.; Madson, Vernon J.; Meicenheimer, Russell L.; Pugh, Ray; Schneirla, Keri A.; Shockey, Loretta L.; Tinker, Mike R.

    2008-08-15

    This document reports the results of the FY-2007 PNNL VPP Program Evaluation, which is a self-assessment of the operational and programmatic performance of the Laboratory related to worker safety and health. The report was compiled by a team of worker representatives and safety professionals who evaluated the Laboratory's worker safety and health programs on the basis of DOE-VPP criteria. The principal elements of DOE's VPP program are: Management Leadership, Employee Involvement, Worksite Analysis, Hazard Prevention and Control, and Safety and Health Training.

  12. Description of source term data on contaminated sites and buildings compiled for the waste management programmatic environmental impact statement (WMPEIS)

    International Nuclear Information System (INIS)

    Short, S.M.; Smith, D.E.; Hill, J.G.; Lerchen, M.E.

    1995-10-01

    The U.S. Department of Energy (DOE) and its predecessor agencies have historically had responsibility for carrying out various national missions primarily related to nuclear weapons development and energy research. Recently, these missions have been expanded to include remediation of sites and facilities contaminated as a result of past activities. In January 1990, the Secretary of Energy announced that DOE would prepare a Programmatic Environmental Impact Statement on DOE's environmental restoration and waste management program; the primary focus was the evaluation of (1) strategies for conducting remediation of all DOE contaminated sites and facilities and (2) potential configurations for waste management capabilities. Several different environmental restoration strategies were identified for evaluation, ranging from doing no remediation to strategies where the level of remediation was driven by such factors as final land use and health effects. A quantitative assessment of the costs and health effects of remediation activities and residual contamination levels associated with each remediation strategy was made. These analyses required that information be compiled on each individual contaminated site and structure located at each DOE installation and that the information compiled include quantitative measurements and/or estimates of contamination levels and extent of contamination. This document provides a description of the types of information and data compiled for use in the analyses. Also provided is a description of the database used to manage the data, a detailed discussion of the methodology and assumptions used in compiling the data, and a summary of the data compiled into the database as of March 1995. As of this date, over 10,000 contaminated sites and structures and over 8,000 uncontaminated structures had been identified across the DOE complex of installations.

  13. Using MaxCompiler for the high level synthesis of trigger algorithms

    International Nuclear Information System (INIS)

    Summers, S.; Rose, A.; Sanders, P.

    2017-01-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  14. Using MaxCompiler for the high level synthesis of trigger algorithms

    Science.gov (United States)

    Summers, S.; Rose, A.; Sanders, P.

    2017-02-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.
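
The Kalman filter at the heart of the track-fitting design can be sketched in its simplest scalar form. This is plain Python rather than the MaxCompiler/FPGA implementation the records describe, and the state model and noise values are illustrative assumptions, not the CMS track-trigger parameters.

```python
# One predict/update iteration of a scalar Kalman filter estimating a
# constant quantity from noisy measurements.

def kalman_step(x, p, z, q=1e-3, r=0.1):
    """x: state estimate, p: its variance, z: new measurement,
    q: process noise variance, r: measurement noise variance."""
    # Predict: constant-state model, uncertainty grows by process noise.
    x_pred = x
    p_pred = p + q
    # Update: blend prediction and measurement by the Kalman gain.
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

x, p = 0.0, 1.0                      # uninformed initial guess
for z in [0.9, 1.1, 1.0, 0.95]:      # noisy measurements near 1.0
    x, p = kalman_step(x, p, z)
# x converges toward the measurements while the variance p shrinks
```

In the hardware setting, each such iteration is one fixed-latency pipeline pass (187.5 ns in the cited design), which is why the per-iteration structure matters.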

  15. Argonne Nuclear Data Program

    Energy Technology Data Exchange (ETDEWEB)

    Kondev, F. [US Nuclear Data Program, U.S. DOE/SC (United States)

    2013-08-15

    Nuclear data compilations and evaluations: nuclear structure and decay data compilations and evaluations for the international NSDD network (ENSDF and XUNDL); AME12 and NuBase12, in collaboration with G. Audi and M. MacCormick, CSNSM (Orsay), M. Wang, IMP (Lanzhou), and B. Pfeiffer, GSI (Darmstadt) (presentation by M. Wang); DDEP coordinator (completed); horizontal nuclear data evaluation activities, including IAEA CRPs, isomers, and medical isotopes. Complementary nuclear data research activities: CARIBU, FRIB and other RIB facilities, Gretina, and an IAEA CRP, with emphasis on nuclear structure physics and astrophysics and their intersection with applied nuclear physics programs.

  16. A compilation of reports of the Advisory Committee on reactor safeguards. 1996 Annual report

    International Nuclear Information System (INIS)

    1997-04-01

    This compilation contains 47 ACRS reports submitted to the Commission, or to the Executive Director for Operations, during calendar year 1996. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room, the U.S. Library of Congress, and the Internet at http://www.nrc.gov/ACRSACNW. The reports are divided into two groups: Part 1 contains ACRS reports by project name and by chronological order within project name. Part 2 categorizes the reports by the most appropriate generic subject area and by chronological order within subject area

  17. Continued advancement of the programming language HAL to an operational status

    Science.gov (United States)

    1971-01-01

    The continued advancement of the programming language HAL to operational status is reported. It is demonstrated that the compiler itself can be written in HAL. A HAL-in-HAL experiment proves conclusively that HAL can be used successfully as a compiler implementation tool.

  18. Gravity Data for Southwestern Alaska (1294 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (1294 records) were compiled by the Alaska Geological Survey and the U.S. Geological Survey, Menlo Park, California. This data base was...

  19. Making didactic proposals aimed at improving socioeducational programs for youngsters at risk of social exclusion

    Directory of Open Access Journals (Sweden)

    María Violeta Álvarez Fernández

    2013-12-01

    Educators at the Sograndio Juvenile Detention Center in Asturias were confident that they could offer new alternatives to improve their socio-educational intervention. For that reason they became involved in an action-research training process aimed at making didactic proposals to optimize the development of social competence programs for youngsters with criminal behavior. To this end, the Short Version of the Prosocial Thinking Program for Young People (Alba et al., 2005) was applied to nine inmates: eight males between 14 and 20 years old and one 16-year-old female. They were the basis of, and the reflection on, our methodological intervention. The team taking part in this intervention was formed by nine inmates and three external members (two coordinators and a psychologist). The investigation, mainly qualitative, used several instruments (both qualitative and quantitative), such as observation registries, discussion groups, questionnaires and notebooks. After the implementation of the program, positive changes are worth highlighting in the emotional dimension, problem solving, self-control and frustration tolerance. In addition, very high levels of satisfaction were observed among youngsters, educators and external informants. It was important to create a climate of professional commitment towards change and thereby abandon a passive attitude towards training, so that we could begin an active search for practical answers adjusted to our needs and professional interests. Finally, we make several observations and intervention proposals that help generate didactic knowledge thought out and adapted for social education.

  20. CROSSER - CUMULATIVE BINOMIAL PROGRAMS

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, CROSSER, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, CROSSER, CUMBIN (NPO-17555), and NEWTONP (NPO-17556), can be used independently of one another. CROSSER can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. CROSSER calculates the point at which the reliability of a k-out-of-n system equals the common reliability of the n components. It is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. The program is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. It also lists the number of iterations of Newton's method required to calculate the answer within the given error. The CROSSER program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CROSSER was developed in 1988.
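    The crossover computation CROSSER performs can be sketched in a few lines: find the component reliability p at which a k-out-of-n system's reliability equals p itself, by Newton's method on f(p) = R_sys(p) - p. This is a generic sketch with a numerical derivative; the original C program's iteration details are not given in the summary:

```python
from math import comb

def system_reliability(p, k, n):
    """Reliability of a k-out-of-n system whose n components
    each work independently with probability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def crossover(k, n, tol=1e-10, max_iter=100):
    """Solve system_reliability(p, k, n) = p for p in (0, 1)
    by Newton's method on f(p) = R_sys(p) - p."""
    p = 0.5  # initial guess
    for _ in range(max_iter):
        f = system_reliability(p, k, n) - p
        h = 1e-6  # central-difference step for f'(p)
        fprime = (system_reliability(p + h, k, n)
                  - system_reliability(p - h, k, n)) / (2 * h) - 1.0
        step = f / fprime
        p -= step
        if abs(step) < tol:
            break
    return p
```

    For a 2-out-of-3 system, R_sys(p) = 3p^2 - 2p^3, and the nontrivial crossover is at p = 0.5: below it the system is less reliable than a single component, above it more reliable.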

  1. On the performance of the HAL/S-FC compiler. [for space shuttles

    Science.gov (United States)

    Martin, F. H.

    1975-01-01

    The HAL/S compilers which will be used in the space shuttles are described. Acceptance test objectives and procedures are described, the raw results are presented and analyzed, and conclusions and observations are drawn. An appendix is included containing an illustrative set of compiler listings and results for one of the test cases.

  2. Feedback Driven Annotation and Refactoring of Parallel Programs

    DEFF Research Database (Denmark)

    Larsen, Per

    This thesis combines programmer knowledge and feedback to improve modeling and optimization of software. The research is motivated by two observations. First, there is a great need for automatic analysis of software for embedded systems - to expose and model parallelism inherent in programs. Second, … and communication in embedded programs. Runtime checks are developed to ensure that annotations correctly describe observable program behavior. The performance impact of runtime checking is evaluated on several benchmark kernels and is negligible in all cases. The second aspect is compilation feedback. Annotations are not effective unless programmers are told how and when they are beneficial. A prototype compilation feedback system was developed in collaboration with IBM Haifa Research Labs. It reports to the programmer issues that prevent further analysis. Performance evaluation shows that three programs performed significantly …

  3. Program package for multicanonical simulations of U(1) lattice gauge theory-Second version

    Science.gov (United States)

    Bazavov, Alexei; Berg, Bernd A.

    2013-03-01

    A new version STMCMUCA_V1_1 of our program package is available. It eliminates compatibility problems of our Fortran 77 code, originally developed for the g77 compiler, with Fortran 90 and 95 compilers. New version program summary. Program title: STMC_U1MUCA_v1_1. Catalogue identifier: AEET_v1_1. Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html. Programming language: Fortran 77, compatible with Fortran 90 and 95. Computers: any capable of compiling and executing Fortran code. Operating systems: any capable of compiling and executing Fortran code. RAM: 10 MB and up, depending on lattice size used. No. of lines in distributed program, including test data, etc.: 15059. No. of bytes in distributed program, including test data, etc.: 215733. Keywords: Markov chain Monte Carlo, multicanonical, Wang-Landau recursion, Fortran, lattice gauge theory, U(1) gauge group, phase transitions of continuous systems. Classification: 11.5. Catalogue identifier of previous version: AEET_v1_0. Journal reference of previous version: Computer Physics Communications 180 (2009) 2339-2347. Does the new version supersede the previous version?: Yes. Nature of problem: efficient Markov chain Monte Carlo simulation of U(1) lattice gauge theory (or other continuous systems) close to its phase transition; measurements and analysis of the action per plaquette, the specific heat, Polyakov loops and their structure factors. Solution method: multicanonical simulations with an initial Wang-Landau recursion to determine suitable weight factors; reweighting to physical values using logarithmic coding and calculating jackknife error bars. Reasons for the new version: the previous version was developed for the g77 compiler Fortran 77 version; compiler errors were encountered with Fortran 90 and Fortran 95 compilers (specified below). Summary of revisions: epsilon=one/10**10 is replaced by epsilon/10.0D10 in the parameter statements of the subroutines u1_bmha.f, u1_mucabmha.f, u1wl
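    The solution method named above, a Wang-Landau recursion that seeds multicanonical weights, can be illustrated on a toy system. The sketch below estimates ln g(E) for E = number of up spins among n independent spins, where the exact answer g(E) = C(n, E) is known; the Fortran package applies the recursion to U(1) gauge fields instead, and the fixed halving schedule here is a simplification of the usual histogram-flatness criterion:

```python
import math
import random

def wang_landau(n_spins=6, stages=12, sweeps=20000, seed=7):
    """Toy Wang-Landau recursion: estimate ln g(E), where E is the
    number of up spins among n_spins independent spins, so that the
    exact density of states is g(E) = C(n_spins, E).
    The modification factor ln_f is simply halved each stage."""
    rng = random.Random(seed)
    ln_g = [0.0] * (n_spins + 1)   # running estimate of ln g(E)
    spins = [0] * n_spins
    energy = 0
    ln_f = 1.0
    for _ in range(stages):
        for _ in range(sweeps):
            i = rng.randrange(n_spins)
            e_new = energy + (1 - 2 * spins[i])   # a flip changes E by +-1
            # Accept with probability min(1, g(E)/g(E_new)): once ln_g is
            # correct, all energies are visited equally often (flat histogram).
            if math.log(rng.random()) < ln_g[energy] - ln_g[e_new]:
                spins[i] ^= 1
                energy = e_new
            ln_g[energy] += ln_f   # penalize the energy just visited
        ln_f *= 0.5
    # Normalize so ln g(0) = 0, matching the exact ln C(n, 0).
    base = ln_g[0]
    return [v - base for v in ln_g]
```

    In the multicanonical package the resulting weights 1/g(E) are then used for production runs, with reweighting to the physical ensemble afterwards.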

  4. Current Trends in Associate Degree Nursing Programs.

    Science.gov (United States)

    Blackstone, Elaine Grant

    This study was designed to ascertain current trends in associate degree nursing programs and to discover innovative ideas and techniques which could be applied to the existing program at Miami-Dade Community College (Florida). Data was compiled from interviews with representatives of ten associate degree nursing programs in six states. Information…

  5. Expectation Levels in Dictionary Consultation and Compilation ...

    African Journals Online (AJOL)

    Dictionary consultation and compilation is a two-way engagement between two parties, namely a dictionary user and a lexicographer. How well users cope with looking up words in a Bantu language dictionary and to what extent their expectations are met, depends on their consultation skills, their knowledge of the structure ...

  6. A gloomy picture: a meta-analysis of randomized controlled trials reveals disappointing effectiveness of programs aiming at preventing child maltreatment.

    Science.gov (United States)

    Euser, Saskia; Alink, Lenneke Ra; Stoltenborgh, Marije; Bakermans-Kranenburg, Marian J; van IJzendoorn, Marinus H

    2015-10-18

    Consistent findings about the effectiveness of parent programs to prevent or reduce child maltreatment are lacking. In the present meta-analysis we synthesized findings from 27 independent samples from randomized controlled trials (RCTs) on the effectiveness of 20 different intervention programs aimed at (i) preventing the occurrence of child maltreatment in the general population or with at-risk but non-maltreating families, or (ii) reducing the incidence of child maltreatment in maltreating families. A significant combined effect on maltreatment (d = 0.13; N = 4883) disappeared after the trim-and-fill approach that takes into account publication bias against smaller studies without significant outcomes. However, moderator analyses showed that larger effect sizes were found for more recent studies, studies with smaller samples, programs that provide parent training instead of only support, programs that target maltreating instead of at-risk families, and programs with a moderate length (6-12 months) or a moderate number of sessions (16-30). More RCTs are needed to further unravel which factors are associated with program effectiveness. Because currently existing programs appeared to only reduce and not prevent child maltreatment, efforts in the field of preventive intervention should also focus on the development and testing of preventive programs for families at risk for child maltreatment.
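    The combined effect size quoted above (d = 0.13) comes from standard meta-analytic pooling of per-study effects. As a generic illustration only (simple inverse-variance fixed-effect pooling with invented study values, not the trim-and-fill adjustment or the exact model used in this study):

```python
def pool_effects(studies):
    """Fixed-effect inverse-variance pooling of (effect, variance) pairs.
    Returns the pooled effect size and its standard error."""
    weights = [1.0 / v for _, v in studies]
    pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5
    return pooled, se

# Invented example: three studies, each (Cohen's d, variance of d).
studies = [(0.20, 0.010), (0.10, 0.020), (0.05, 0.005)]
d_pooled, se = pool_effects(studies)
```

    Larger studies (smaller variance) pull the pooled estimate toward their own effect, which is why publication bias against small null-result studies, as probed by trim-and-fill, can inflate a naive combined d.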

  7. Classical Fortran programming for engineering and scientific applications

    CERN Document Server

    Kupferschmid, Michael

    2009-01-01

    Introduction; Why Study Programming?; The Evolution of FORTRAN; Why Study FORTRAN?; Classical FORTRAN; About This Book; Advice to Instructors; About the Author; Acknowledgments; Disclaimers; Hello, World!; Case Study: A First FORTRAN Program; Compiling the Program; Running a Program in UNIX; Omissions; Expressions and Assignment Statements; Constants; Variables and Variable Names; Arithmetic Operators; Function References; Expressions; A...

  8. Expectation Levels in Dictionary Consultation and Compilation*

    African Journals Online (AJOL)

    Abstract: Dictionary consultation and compilation is a two-way engagement between two parties, namely a dictionary user and a lexicographer. How well users cope with looking up words in a Bantu language dictionary and to what extent their expectations are met, depends on their consultation skills, their knowledge of ...

  9. Methods for the Compilation of a Core List of Journals in Toxicology.

    Science.gov (United States)

    Kuch, T. D. C.

    Previously reported methods for the compilation of core lists of journals in multidisciplinary areas are first examined, with toxicology used as an example of such an area. Three approaches to the compilation of a core list of journals in toxicology were undertaken and the results analyzed with the aid of models. Analysis of the results of the…

  10. 21 CFR 20.64 - Records or information compiled for law enforcement purposes.

    Science.gov (United States)

    2010-04-01

    21 CFR 20.64 (revised as of April 1, 2010): Records or information compiled for law enforcement purposes. Section 20.64, Food and Drugs; FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES; GENERAL PUBLIC INFORMATION; Exemptions. (a) Records or...

  11. Mode automata and their compilation into fault trees

    International Nuclear Information System (INIS)

    Rauzy, Antoine

    2002-01-01

    In this article, we advocate the use of mode automata as a high-level representation language for reliability studies. Mode automata are state/transition-based representations with the additional notion of flow. They can be seen as a generalization of both finite-capacity Petri nets and block diagrams, and they can be assembled into hierarchies by means of composition operations. The contribution of this article is twofold. First, we introduce mode automata and discuss their relationship with other formalisms. Second, we propose an algorithm to compile mode automata into Boolean equations (fault trees). Such a compilation is of interest for two reasons: assessment tools for Boolean models are much more efficient than those for state/transition models, and the automated generation of fault trees from higher-level representations makes their maintenance easier through the life cycle of the systems under study.
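    The target of such a compilation, a system of Boolean equations, is cheap to assess once produced. A minimal sketch of that output side only (the example system, gate names, and basic events are invented; the compilation from mode automata is the subject of the article and is not reproduced here):

```python
# A fault tree as Boolean equations: each gate maps to (operator, inputs).
# Invented example: output is lost if both redundant pumps fail, or if
# a valve sticks.
AND, OR = all, any

FAULT_TREE = {
    "no_output": (OR, ["both_pumps_down", "valve_stuck"]),
    "both_pumps_down": (AND, ["pump_a_fails", "pump_b_fails"]),
}

def top_event(tree, event, basic_events):
    """Evaluate a gate (or basic event) given basic-event truth values."""
    if event in basic_events:
        return basic_events[event]
    op, inputs = tree[event]
    return op(top_event(tree, e, basic_events) for e in inputs)
```

    Evaluating a Boolean model is a single recursive pass per assignment, which is the efficiency argument made above for compiling state/transition models down to fault trees.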

  12. PNNL FY2005 DOE Voluntary Protection Program (VPP) Program Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Patrick A.; Madson, Vernon J.; Isern, Nancy G.; Haney, Janice M.; Fisher, Julie A.; Goheen, Steven C.; Gulley, Susan E.; Reck, John J.; Collins, Drue A.; Tinker, Mike R.; Walker, Landon A.; Wynn, Clifford L.

    2005-01-31

    This document reports the results of the FY 2005 PNNL VPP Program Evaluation, a self-assessment of the operational and programmatic performance of the Laboratory related to worker safety and health. The report was compiled by a team of worker representatives and safety professionals who evaluated the Laboratory's worker safety and health programs against DOE-VPP criteria. The principal elements of DOE's VPP program are: Management Leadership, Employee Involvement, Worksite Analysis, Hazard Prevention and Control, and Safety and Health Training.

  13. Notes on Compiling a Corpus- Based Dictionary

    Directory of Open Access Journals (Sweden)

    František Čermák

    2011-10-01


    ABSTRACT: On the basis of a sample analysis of a Czech adjective, a definition based on data drawn from the Czech National Corpus (cf. Čermák and Schmiedtová 2003) is gradually compiled and finally offered, pointing out the drawbacks of definitions found in traditional dictionaries. The steps undertaken here are then generalized and used, in an ordered sequence (similar to a work-flow ordering), as topics briefly discussed in the second part, to which lexicographers of monolingual dictionaries should pay attention. These are supplemented by additional remarks and caveats useful in the compilation of a dictionary. Thus, a brief survey of some of the major steps of dictionary compilation is presented here, supplemented by the original Czech data, analyzed in their raw, though semiotically classified, form.


    Keywords: MONOLINGUAL DICTIONARIES, CORPUS LEXICOGRAPHY, SYNTAGMATICS AND PARADIGMATICS IN DICTIONARIES, DICTIONARY ENTRY, TYPES OF LEMMAS, PRAGMATICS, TREATMENT OF

  14. A Performance Tuning Methodology with Compiler Support

    Directory of Open Access Journals (Sweden)

    Oscar Hernandez

    2008-01-01

    We have developed an environment, based upon robust, existing, open source software, for tuning applications written using MPI, OpenMP or both. The goal of this effort, which integrates the OpenUH compiler and several popular performance tools, is to increase user productivity by providing an automated, scalable performance measurement and optimization system. In this paper we describe our environment, show how these complementary tools can work together, and illustrate the synergies possible by exploiting their individual strengths and combined interactions. We also present a methodology for performance tuning that is enabled by this environment. One of the benefits of using compiler technology in this context is that it can direct the performance measurements to capture events at different levels of granularity and help assess their importance, which we have shown to significantly reduce the measurement overheads. The compiler can also help when attempting to understand the performance results: it can supply information on how a code was translated and whether optimizations were applied. Our methodology combines two performance views of the application to find bottlenecks. The first is a high level view that focuses on OpenMP/MPI performance problems such as synchronization cost and load imbalances; the second is a low level view that focuses on hardware counter analysis with derived metrics that assess the efficiency of the code. Our experiments have shown that our approach can significantly reduce overheads for both profiling and tracing to acceptable levels and limit the number of times the application needs to be run with selected hardware counters. In this paper, we demonstrate the workings of this methodology by illustrating its use with selected NAS Parallel Benchmarks and a cloud resolving code.
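    Derived metrics of the kind mentioned above are simple ratios computed from raw hardware-counter readings. A sketch using PAPI-style preset counter names (illustrative of the idea, not tied to the OpenUH toolchain, and the sample readings are invented):

```python
def derived_metrics(counters):
    """Compute derived efficiency metrics from raw hardware-counter
    readings. Keys follow PAPI preset naming; values are event counts."""
    return {
        # Instructions retired per cycle: closer to the machine's issue
        # width indicates better pipeline utilization.
        "ipc": counters["PAPI_TOT_INS"] / counters["PAPI_TOT_CYC"],
        # Fraction of L2 cache accesses that missed.
        "l2_miss_ratio": counters["PAPI_L2_TCM"] / counters["PAPI_L2_TCA"],
    }

# Invented counter readings for one code region.
sample = {
    "PAPI_TOT_INS": 2_000_000,
    "PAPI_TOT_CYC": 1_000_000,
    "PAPI_L2_TCA": 400_000,
    "PAPI_L2_TCM": 20_000,
}
metrics = derived_metrics(sample)
```

    Because each metric needs two counters, and hardware exposes only a few counter registers at once, limiting the number of instrumented runs (as the methodology above aims to do) matters in practice.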

  15. Compilation of data from hadronic atoms

    International Nuclear Information System (INIS)

    Poth, H.

    1979-01-01

    This compilation is a survey of the existing data of hadronic atoms (pionic-atoms, kaonic-atoms, antiprotonic-atoms, sigmonic-atoms). It collects measurements of the energies, intensities and line width of X-rays from hadronic atoms. Averaged values for each hadronic atom are given and the data are summarized. The listing contains data on 58 pionic-atoms, on 54 kaonic-atoms, on 23 antiprotonic-atoms and on 20 sigmonic-atoms. (orig./HB) [de

  16. Compiling standardized information from clinical practice: using content analysis and ICF Linking Rules in a goal-oriented youth rehabilitation program.

    Science.gov (United States)

    Lustenberger, Nadia A; Prodinger, Birgit; Dorjbal, Delgerjargal; Rubinelli, Sara; Schmitt, Klaus; Scheel-Sailer, Anke

    2017-09-23

    The aim is to illustrate how routinely written narrative admission and discharge reports of a rehabilitation program for eight youths with chronic neurological health conditions can be transformed to the International Classification of Functioning, Disability and Health (ICF). First, a qualitative content analysis was conducted by building meaningful units from text segments of the reports assigned to the five elements of the Rehab-Cycle®: goal, assessment, assignment, intervention, and evaluation. Second, the meaningful units were linked to the ICF using the refined ICF Linking Rules. With the first step of the transformation, the emphasis of the narrative reports shifted to a process-oriented interdisciplinary layout, revealing three thematic blocks of goals: mobility, self-care, and mental and social functions. The 95 unique linked ICF codes could be grouped into clinically meaningful, goal-centered ICF codes. Between the two independent linkers, the agreement rate improved after the rules were complemented with additional agreements. The ICF Linking Rules can thus be used to compile standardized health information from narrative reports, provided the reports are structured beforehand; the process requires time and expertise. To implement the ICF in common practice, the findings provide a starting point for reporting rehabilitation in a way that builds upon existing practice and adheres to international standards. Implications for Rehabilitation: This study provides evidence that routinely collected health information from rehabilitation practice can be transformed to the International Classification of Functioning, Disability and Health by using the "ICF Linking Rules"; however, this requires time and expertise. The Rehab-Cycle®, including assessments, assignments, goal setting, interventions and goal evaluation, serves as a feasible framework for structuring this rehabilitation program and ensures that the complexity of local practice is appropriately reflected. The refined "ICF Linking Rules" lead to a standardized

  17. Annual review in automatic programming

    CERN Document Server

    Halpern, Mark I; Bolliet, Louis

    2014-01-01

    Computer Science and Technology and their Application is an eight-chapter book that first presents a tutorial on database organization. Subsequent chapters describe the general concepts of the Simula 67 programming language; incremental compilation and conversational interpretation; dynamic syntax; and ALGOL 68. Other chapters discuss a general-purpose conversational system for graphical programming and automatic theorem proving based on resolution. A survey of extensible programming languages is also given.

  18. A compilation of reports of the Advisory Committee on Reactor Safeguards: 1991 annual

    International Nuclear Information System (INIS)

    1992-04-01

    This compilation contains 41 Advisory Committee on Reactor Safeguards (ACRS) reports submitted to the Commission, Executive Director for Operations, or to the Office of Nuclear Regulatory Research, during calendar year 1991. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room and the US Library of Congress. The reports are divided into two groups: Part 1: ACRS Reports on Project Reviews, and Part 2: ACRS Reports on Generic Subjects. Part 1 contains ACRS reports alphabetized by project name and by chronological order within project name. Part 2 categorizes the reports by the most appropriate generic subject area and by chronological order within subject area

  19. Moderator Chemistry Program

    International Nuclear Information System (INIS)

    Dewitt, L.V.; Gibbs, A.; Lambert, D.P.; Bohrer, S.R.; Fanning, R.L.; Houston, M.W.; Stinson, S.L.; Deible, R.W.; Abdel-Khalik, S.I.

    1990-11-01

    Over the past fifteen months, the Systems Chemistry Group of the Reactor Engineering Department has undertaken a comprehensive study of the Department's moderator chemistry program at the Savannah River Site (SRS). An internal review was developed to formalize and document this program, with objectives as outlined in a mission statement and action plan. In addition to the mission statement and action plan, nine separate task reports have been issued during the course of this study; each is included in this document as a chapter. This document is an organized compilation of the individual reports issued by the Systems Chemistry Group in its assessment of SRS moderator chemistry, undertaken to determine whether there were significant gaps in the program as it existed in October 1989. While these reviews found no significant gaps in that mode of operation, nor any items that adversely affected safety, items were identified that could be improved. Many of these items have already been dealt with or are in the process of completion under this Moderator Chemistry Program and other Reactor Restart programs. A complete list of the items for improvement identified in this assessment is found in Chapter 9, along with a proposed timetable for correcting the remaining items in the chemistry program of SRS reactors. An additional external review of the moderator chemistry processes, recommendations, and responses to/from the Reactor Corrosion Mitigation Committee is included as an Appendix to this compilation.

  20. Internal combustion engines for alcohol motor fuels: a compilation of background technical information

    Energy Technology Data Exchange (ETDEWEB)

    Blaser, Richard

    1980-11-01

    This compilation, a draft training manual containing technical background information on internal combustion engines and alcohol motor fuel technologies, is presented in 3 parts. The first is a compilation of facts from the state of the art on internal combustion engine fuels and their characteristics and requisites and provides an overview of fuel sources, fuels technology and future projections for availability and alternatives. Part two compiles facts about alcohol chemistry, alcohol identification, production, and use, examines ethanol as spirit and as fuel, and provides an overview of modern evaluation of alcohols as motor fuels and of the characteristics of alcohol fuels. The final section compiles cross references on the handling and combustion of fuels for I.C. engines, presents basic evaluations of events leading to the use of alcohols as motor fuels, reviews current applications of alcohols as motor fuels, describes the formulation of alcohol fuels for engines and engine and fuel handling hardware modifications for using alcohol fuels, and introduces the multifuel engines concept. (LCL)

  1. Borrowing and Dictionary Compilation: The Case of the Indigenous ...

    African Journals Online (AJOL)

    rbr

    Keywords: BORROWING, DICTIONARY COMPILATION, INDIGENOUS LANGUAGES, LEXICON, MORPHEME, VOCABULARY, DEVELOPING LANGUAGES, LOAN WORDS, TERMINOLOGY, ETYMOLOGY, LEXICOGRAPHY. Summary: Borrowing and dictionary compilation: the case of indigenous South African ...

  2. Compilation of reactor physics data of the year 1984, AVR reactor

    International Nuclear Information System (INIS)

    Werner, H.; Bergerfurth, A.; Thomas, F.; Geskes, B.

    1985-12-01

    The 'AVR reactor physics data' is a documentation published once a year, the data presented being obtained by a simulation of reactor operation using the AVR-80 numerical model. This model is constantly updated and improved in response to new results and developments in the field of reactor theory and thermohydraulics, and in response to theoretical or practical modifications of reactor operation or in the computer system. The large variety of measured data available in the AVR reactor simulation system also makes it an ideal testing system for verification of the computing programs presented in the compilation. A survey of the history of operations in 1984 and a short explanation of the computerized simulation methods are followed by tables and graphs that serve as a source of topical data for readers interested in the physics of high-temperature pebble-bed reactors. (orig./HP) [de

  3. Compilation of reactor physics data of the year 1983, AVR reactor

    International Nuclear Information System (INIS)

    Werner, H.; Bergerfurth, A.; Thomas, F.; Geskes, B.

    1985-06-01

    The 'AVR reactor physics data' is a documentation published once a year, the data presented being obtained by a simulation of reactor operation using the AVR-80 numerical model. This model is constantly updated and improved in response to new results and developments in the field of reactor theory and thermohydraulics, and in response to theoretical or practical modifications of reactor operation or in the computer system. The large variety of measured data available in the AVR reactor simulation system also makes it an ideal testing system for verification of the computing programs presented in the compilation. A survey of the history of operations in 1983 and a short explanation of the computerized simulation methods are followed by tables and graphs that serve as a source of topical data for readers interested in the physics of high-temperature pebble-bed reactors. (orig./HP) [de

  4. Approximate Compilation of Constraints into Multivalued Decision Diagrams

    DEFF Research Database (Denmark)

    Hadzic, Tarik; Hooker, John N.; O’Sullivan, Barry

    2008-01-01

    We present an incremental refinement algorithm for approximate compilation of constraint satisfaction models into multivalued decision diagrams (MDDs). The algorithm uses a vertex splitting operation that relies on the detection of equivalent paths in the MDD. Although the algorithm is quite gene...
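    The objects involved can be sketched concretely. Below, an exact MDD for a small separable constraint is built layer by layer, merging equivalent partial states into a single node; the paper's algorithm instead maintains a width-bounded approximate MDD and refines it by splitting such merged vertices. The constraint and representation here are invented for illustration:

```python
def build_mdd(n_vars=3, domain=(0, 1, 2), bound=2):
    """Layered MDD for the constraint x1 + ... + xn <= bound.
    Each layer maps a node state (the partial sum) to its outgoing
    edges {value: child_state}; equal partial sums share one node."""
    layers = [{0: {}}]          # layer 0: root node with partial sum 0
    for _ in range(n_vars):
        next_layer = {}
        for state in layers[-1]:
            edges = {}
            for v in domain:
                s = state + v
                if s <= bound:  # prune infeasible extensions
                    edges[v] = s
                    next_layer.setdefault(s, {})
            layers[-1][state] = edges
        layers.append(next_layer)
    return layers

def count_solutions(layers):
    """Count root-to-terminal paths, i.e. solutions of the constraint."""
    counts = {0: 1}
    for layer in layers[:-1]:
        nxt = {}
        for state, edges in layer.items():
            for v, s in edges.items():
                nxt[s] = nxt.get(s, 0) + counts[state]
        counts = nxt
    return sum(counts.values())
```

    For x1 + x2 + x3 <= 2 over {0, 1, 2}, each layer needs at most three nodes; when a width limit forces distinct partial sums into one node, the diagram over-approximates the constraint, and vertex splitting is what tightens it back.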

  5. Compendium of student papers : 2008 Undergraduate Transportation Scholars Program.

    Science.gov (United States)

    2008-08-01

    This report is a compilation of research papers written by students participating in the 2008 Undergraduate Transportation Scholars Program. The ten-week summer program, now in its eighteenth year, provides undergraduate students in Civil Enginee...

  6. Plans for the NKS-program 1998-2001

    International Nuclear Information System (INIS)

    Bennerstedt, T.

    1999-08-01

    The present report is a comprehensive compilation of the adopted NKS project plans for the sixth four-year period, 1998-2001. Most of the plans are in English. One is in both English and Danish. One is in Norwegian, with a brief summary in English. Only two of the six appendices are in English. In spite of this, it is believed that the report will serve as a valuable source of information not only to those actually active in or closely following the NKS work, but also the international scientific community, e.g., within EU and in the Baltic States. The research program incorporates reactor safety, radioactive waste, emergency preparedness, radioecology, cross-disciplinary studies, and information issues. The necessary administrative support program, including the NKS Secretariat, is not described herein. Neither is the aim, scope or organization of NKS, since this has been covered elsewhere. (EHS)

  7. PLOTGEOMX: a program for display of a neutron target assembly by means of a GHOST plotting system

    International Nuclear Information System (INIS)

    Clarke, J.H.

    1978-02-01

    The program PLOTGEOM has been modified to work on the A.E.R.E., Harwell IBM 370-167 computer using the GHOST graphics package. The control data routine has been altered to permit free format input and the program has been compiled and stored using the extended-H FORTRAN optimising compiler. (author)

  8. Low-level waste management program: technical program overview

    International Nuclear Information System (INIS)

    Lowrie, R.S.

    1981-01-01

    The mission of the technical program is to develop the technology component of the Department of Energy's Low-Level Waste Management Program and to manage research and development, demonstration, and documentation of the technical aspects of the program. Some of the major technology objectives are: develop and demonstrate techniques for waste generation reduction; develop and demonstrate waste treatment, handling and packaging techniques; develop and demonstrate the technology for greater confinement; and develop the technology for remedial action at existing sites. In addition there is the technology transfer objective which is to compile and issue a handbook documenting the technology for each of the above technology objectives

  9. A compilation of subsurface hydrogeologic data

    International Nuclear Information System (INIS)

    1986-03-01

    This report presents a compilation of both fracture properties and hydrogeological parameters relevant to the flow of groundwater in fractured rock systems. Methods of data acquisition as well as the scale of and conditions during the measurement are recorded. Measurements and analytical techniques for each of the parameters under consideration have been reviewed with respect to their methodology, assumptions and accuracy. Both the rock type and geologic setting associated with these measurements have also been recorded. 373 refs

  10. IMPLEMENTATION OF AN ABSTRACT DATA TYPE TO MANAGE DATA SETS USING THE C++ PROGRAMMING LANGUAGE

    OpenAIRE

    Ruiz L., Edgar; Hinojosa L., Hilmar

    2014-01-01

    This article presents the implementation of an abstract data type representing the mathematical concept of set theory. The program was written in the C++ programming language, applying the object-oriented programming paradigm, using the Dev-C++ v4.1 compiler, a GNU compiler released under the GPL licence.
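
    The abstract data type the article describes can be sketched in a few lines of C++. The class below is a minimal illustration of the idea (duplicate-free storage plus the classic set operations); the class name and interface are hypothetical, not the authors' actual code.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Minimal sketch of an abstract data type for mathematical sets.
// Illustrative only: names and operations are not the authors' code.
class IntSet {
public:
    // Insert keeps the set duplicate-free, as set theory requires.
    void insert(int x) {
        if (!contains(x)) elems_.push_back(x);
    }
    bool contains(int x) const {
        return std::find(elems_.begin(), elems_.end(), x) != elems_.end();
    }
    std::size_t size() const { return elems_.size(); }

    // Classic set-theoretic operations.
    IntSet unionWith(const IntSet& other) const {
        IntSet r = *this;
        for (int x : other.elems_) r.insert(x);
        return r;
    }
    IntSet intersectWith(const IntSet& other) const {
        IntSet r;
        for (int x : elems_) if (other.contains(x)) r.insert(x);
        return r;
    }

private:
    std::vector<int> elems_;
};
```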

  11. FORTRAN program for calculating liquid-phase and gas-phase thermal diffusion column coefficients

    International Nuclear Information System (INIS)

    Rutherford, W.M.

    1980-01-01

    A computer program (COLCO) was developed for calculating thermal diffusion column coefficients from theory. The program, which is written in FORTRAN IV, can be used for both liquid-phase and gas-phase thermal diffusion columns. Column coefficients for the gas phase can be based on gas properties calculated from kinetic theory using tables of omega integrals or on tables of compiled physical properties as functions of temperature. Column coefficients for the liquid phase can be based on compiled physical property tables. Program listings, test data, sample output, and a user's manual are supplied as appendices

  12. QMODULE: CAMAC modules recognized by the QAL compiler

    International Nuclear Information System (INIS)

    Kellogg, M.; Minor, M.M.; Shlaer, S.; Spencer, N.; Thomas, R.F. Jr.; van der Beken, H.

    1977-10-01

    The compiler for the Q Analyzer Language, QAL, recognizes a certain set of CAMAC modules as having known characteristics. The conventions and procedures used to describe these modules are discussed as well as the tools available to the user for extending this set as required

  13. Bibliography of Literature on Neuro-Linguistic Programming.

    Science.gov (United States)

    McCormick, Donald W.

    Two bibliographies on neurolinguistic programming are updates of an earlier literature review by the same compiler. The two lists contain citations of over 160 books, research reports, dissertations, journal articles, audio and video recordings, and research projects in progress on aspects of neurolinguistic programming. Appended notes suggest…

  14. Silicon compilation: From the circuit to the system

    Science.gov (United States)

    Obrien, Keven

    The methodology used for the compilation of silicon from a behavioral level to a system level is presented. The aim was to link the heretofore unrelated areas of high-level synthesis and system-level design. This link will play an important role in the development of future design automation tools, as it will allow hardware/software co-designs to be synthesized. A design methodology is presented that, through the use of an intermediate representation, SOLAR, allows a system-level design language (SDL) to be combined with a hardware description language (VHDL). Two main steps are required in order to transform this specification into a synthesizable one. First, a system-level synthesis step, including partitioning and communication synthesis, splits the model into a set of interconnected subsystems, each of which is then processed by a high-level synthesis tool. For this latter step AMICAL is used, which allows powerful scheduling techniques that accept very abstract descriptions of control-flow-dominated circuits as input, and generates interconnected RTL blocks that may feed existing logic-level synthesis tools.

  15. A Computer Program to Compile a Flander-Amidon Interaction Analysis Matrix

    Science.gov (United States)

    Hardy, Robert C.

    1970-01-01

    A program was written in FORTRAN IV for an IBM 3600 to produce the Flanders-Amidon Interaction Analysis Matrix and also to produce percentages of certain p… FORTRAN IV and V for the Univac 1108. (Editor/RT)
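
    In a Flanders-style interaction analysis, an observer records one category code (1-10) at fixed intervals, and the matrix tallies each ordered pair of successive codes. The sketch below shows that core tally in C++ rather than the record's FORTRAN; the function names and the percentage helper are illustrative, not the original program.

```cpp
#include <array>
#include <cstddef>
#include <vector>

// Tally successive pairs of Flanders category codes (1-10) into a
// 10x10 matrix: row = earlier code, column = the code that follows.
// Illustrative sketch, not the original FORTRAN program.
std::array<std::array<int, 10>, 10>
buildMatrix(const std::vector<int>& codes) {
    std::array<std::array<int, 10>, 10> m{};  // zero-initialised
    for (std::size_t t = 0; t + 1 < codes.size(); ++t)
        m[codes[t] - 1][codes[t + 1] - 1]++;
    return m;
}

// Percentage of all tallied pairs that fall in one cell (1-based indices).
double cellPercent(const std::array<std::array<int, 10>, 10>& m,
                   int row, int col, int totalPairs) {
    return totalPairs > 0 ? 100.0 * m[row - 1][col - 1] / totalPairs : 0.0;
}
```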

  16. Compilation of nuclear decay data used for dose calculations. Data for radionuclides not listed in ICRP publication 38

    Energy Technology Data Exchange (ETDEWEB)

    Endo, Akira; Yamaguchi, Yasuhiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Tamura, Tsutomu

    1999-07-01

    Nuclear decay data used for dose calculations were compiled for 162 nuclides with half-lives greater than or equal to 10 min that are not listed in ICRP Publication 38 (Publ. 38) and their 28 daughter nuclides. An additional 14 nuclides considered to be important in fusion reactor facilities were also included. The data were compiled using decay data sets of the Evaluated Nuclear Structure Data File (ENSDF), the latest version in August 1997. Investigations of the data sets were performed to check their consistency by referring to recent literature and NUBASE, the database for nuclear and decay properties of nuclides, and by using the utility programs of ENSDF. Possible revisions of the data sets were made for their format and syntax errors, level schemes, normalization records, and so on. The revised data sets were processed by EDISTR in order to calculate the energies and intensities of {alpha} particles, {beta} particles, {gamma} rays including annihilation photons, internal conversion electrons, X rays, and Auger electrons emitted in nuclear transformations of the radionuclides. For spontaneously fissioning nuclides, the average energies and intensities of neutrons, fission fragments, prompt {gamma} rays, delayed {gamma} rays, and {beta} particles were also calculated. The compiled data are presented in two formats: the Publ. 38 format and the NUCDECAY format. This report provides the decay data in the Publ. 38 format along with decay scheme drawings. The data will be widely used for internal and external dose calculations in radiation protection. (author)

  17. Nuclear power plant operational data compilation system

    International Nuclear Information System (INIS)

    Silberberg, S.

    1980-01-01

    Electricite de France R and D Division has set up a nuclear power plant operational data compilation system. This data bank, built up from American documents, provides results on plant operation and the behaviour of operational equipment. At present, French units in commercial operation are taken into account. Results obtained after five years of data bank operation are given. (author)

  18. Compendium of student papers : 2011 undergraduate transportation scholars program.

    Science.gov (United States)

    2012-05-01

    This report is a compilation of research papers written by students participating in the 2011 Undergraduate Transportation Scholars Program. The 10-week summer program, now in its 21st year, provides undergraduate students in Civil Engineering th...

  19. Compendium of student papers : 2012 undergraduate transportation scholars program.

    Science.gov (United States)

    2013-05-01

    This report is a compilation of research papers written by students participating in the 2012 Undergraduate Transportation Scholars Program. The 10-week summer program, now in its 22nd year, provides undergraduate students in Civil Engineering th...

  20. Compendium of student papers : 2010 undergraduate transportation scholars program.

    Science.gov (United States)

    2011-06-01

    This report is a compilation of research papers written by students participating in the 2010 Undergraduate Transportation Scholars Program. The 10-week summer program, now in its 20th year, provides undergraduate students in Civil Engineering th...

  1. Guide to NRC reporting and recordkeeping requirements. Compiled from requirements in Title 10 of the U.S. Code of Federal Regulations as codified on December 31, 1993; Revision 1

    International Nuclear Information System (INIS)

    Collins, M.; Shelton, B.

    1994-07-01

    This compilation includes, in its first two sections, the reporting and recordkeeping requirements applicable to US Nuclear Regulatory Commission (NRC) licensees and applicants and to members of the public. It includes those requirements codified in Title 10 of the Code of Federal Regulations, Chapter 1, on December 31, 1993. It also includes, in a separate section, any of those requirements that were superseded or discontinued between January 1992 and December 1993. Finally, the appendix lists mailing and delivery addresses for NRC Headquarters and Regional Offices mentioned in the compilation. The Office of Information Resources Management staff compiled this listing of reporting and recordkeeping requirements, briefly describing each in a single document, primarily to help licensees readily identify the requirements. The compilation is not a substitute for the regulations, and is not intended to impose any new requirements or technical positions. It is part of NRC's continuing efforts to comply with the Paperwork Reduction Act of 1980 and the Office of Management and Budget regulations that mandate effective and efficient Federal information resources management programs

  2. R programming for the Quarterly National Accounts: Moroccan case

    Directory of Open Access Journals (Sweden)

    Houssam HACHIMI

    2017-11-01

    Full Text Available The compilation of quarterly national accounts (QNA) can follow different methods, depending on the specifics of a country's statistical system. The calibration method adopted by the Moroccan national accounts department involves several steps that indirectly estimate the quarterly components of the Gross Domestic Product (GDP), using statistical indicators as regressors in a linear model. Using R as the statistical software for compiling these official statistics presents some challenges, from the first step of data import through to the export of the results: the statistician responsible for compiling the QNA must have good algorithmic coding skills in order to build the R program, choosing the appropriate packages and version of the R software. The objective of this work is to present the R program and the challenges faced in the Moroccan case.

  3. Digital Bedrock Compilation: A Geodatabase Covering Forest Service Lands in California

    Science.gov (United States)

    Elder, D.; de La Fuente, J. A.; Reichert, M.

    2010-12-01

    This digital database contains bedrock geologic mapping for Forest Service lands within California. This compilation began in 2004 and the first version was completed in 2005. Second publication of this geodatabase was completed in 2010 and filled major gaps in the southern Sierra Nevada and Modoc/Medicine Lake/Warner Mountains areas. This digital map database was compiled from previously published and unpublished geologic mapping, with source mapping and review from California Geological Survey, the U.S. Geological Survey and others. Much of the source data was itself compilation mapping. This geodatabase is huge, containing ~107,000 polygons and ~ 280,000 arcs. Mapping was compiled from more than one thousand individual sources and covers over 41,000,000 acres (~166,000 km2). It was compiled from source maps at various scales - from ~ 1:4,000 to 1:250,000 and represents the best available geologic mapping at largest scale possible. An estimated 70-80% of the source information was digitized from geologic mapping at 1:62,500 scale or better. Forest Service ACT2 Enterprise Team compiled the bedrock mapping and developed a geodatabase to store this information. This geodatabase supports feature classes for polygons (e.g, map units), lines (e.g., contacts, boundaries, faults and structural lines) and points (e.g., orientation data, structural symbology). Lookup tables provide detailed information for feature class items. Lookup/type tables contain legal values and hierarchical groupings for geologic ages and lithologies. Type tables link coded values with descriptions for line and point attributes, such as line type, line location and point type. This digital mapping is at the core of many quantitative analyses and derivative map products. Queries of the database are used to produce maps and to quantify rock types of interest. 
These include the following: (1) ultramafic rocks - where hazards from naturally occurring asbestos are high, (2) granitic rocks - increased

  4. HAL/S-360 compiler system specification

    Science.gov (United States)

    Johnson, A. E.; Newbold, P. N.; Schulenberg, C. W.; Avakian, A. E.; Varga, S.; Helmers, P. H.; Helmers, C. T., Jr.; Hotz, R. L.

    1974-01-01

    A three phase language compiler is described which produces IBM 360/370 compatible object modules and a set of simulation tables to aid in run time verification. A link edit step augments the standard OS linkage editor. A comprehensive run time system and library provide the HAL/S operating environment, error handling, a pseudo real time executive, and an extensive set of mathematical, conversion, I/O, and diagnostic routines. The specifications of the information flow and content for this system are also considered.

  5. T.J. Kriel (original compiler), D.J. Prinsloo and B.P. Sathekge (compilers revised edition). Popular Northern Sotho Dictionary

    OpenAIRE

    Kwena J. Mashamaite

    2011-01-01

    The compilers of this new edition have successfully highlighted the important additions to the last edition of the dictionary. It is important to inform prospective users about new information. It is also a marketing strategy to announce the contents of a new product in both the preface and at the back of the cover page, as is the case with this dictionary.

  6. On the Automatic Parallelization of Sparse and Irregular Fortran Programs

    Directory of Open Access Journals (Sweden)

    Yuan Lin

    1999-01-01

    Full Text Available Automatic parallelization is usually believed to be less effective at exploiting implicit parallelism in sparse/irregular programs than in their dense/regular counterparts. However, not much is really known because there have been few research reports on this topic. In this work, we have studied the possibility of using an automatic parallelizing compiler to detect the parallelism in sparse/irregular programs. The study with a collection of sparse/irregular programs led us to some common loop patterns. Based on these patterns new techniques were derived that produced good speedups when manually applied to our benchmark codes. More importantly, these parallelization methods can be implemented in a parallelizing compiler and can be applied automatically.
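
    A typical loop pattern of the kind such an analysis targets is the irregular reduction, where the element being updated is chosen through an index array at run time. The C++ sketch below is illustrative, not code from the paper: it shows the sequential pattern and the array-privatization transformation a parallelizing compiler can apply, here simulated with two chunks of the iteration space standing in for two threads.

```cpp
#include <cstddef>
#include <vector>

// A common sparse/irregular loop pattern: an irregular reduction.
// The update target y[idx[i]] is known only at run time, so naive
// parallel execution of the loop would risk cross-iteration conflicts.
std::vector<double> irregularReduction(const std::vector<int>& idx,
                                       const std::vector<double>& val,
                                       std::size_t n) {
    std::vector<double> y(n, 0.0);
    for (std::size_t i = 0; i < idx.size(); ++i)
        y[idx[i]] += val[i];
    return y;
}

// The transformation a parallelizing compiler can apply: give each
// thread a private copy of the reduction array, then combine. Two
// simulated "threads" split the iteration space here.
std::vector<double> privatizedReduction(const std::vector<int>& idx,
                                        const std::vector<double>& val,
                                        std::size_t n) {
    std::vector<double> y0(n, 0.0), y1(n, 0.0);
    std::size_t half = idx.size() / 2;
    for (std::size_t i = 0; i < half; ++i)
        y0[idx[i]] += val[i];                 // "thread 0" chunk
    for (std::size_t i = half; i < idx.size(); ++i)
        y1[idx[i]] += val[i];                 // "thread 1" chunk
    std::vector<double> y(n);
    for (std::size_t j = 0; j < n; ++j)
        y[j] = y0[j] + y1[j];                 // combine step
    return y;
}
```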

  7. Compendium of student papers : 2013 undergraduate transportation scholars program.

    Science.gov (United States)

    2013-11-01

    This report is a compilation of research papers written by students participating in the 2013 Undergraduate Transportation Scholars Program. The 10-week summer program, now in its 23rd year, provides undergraduate students in Civil Engineering the op...

  8. Compiler-Directed Transformation for Higher-Order Stencils

    Energy Technology Data Exchange (ETDEWEB)

    Basu, Protonu [Univ. of Utah, Salt Lake City, UT (United States); Hall, Mary [Univ. of Utah, Salt Lake City, UT (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Straalen, Brian Van [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Colella, Phillip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-07-20

    As the cost of data movement increasingly dominates performance, developers of finite-volume and finite-difference solutions for partial differential equations (PDEs) are exploring novel higher-order stencils that increase numerical accuracy and computational intensity. This paper describes a new compiler reordering transformation applied to stencil operators that performs partial sums in buffers, and reuses the partial sums in computing multiple results. This optimization has multiple effects on improving stencil performance that are particularly important to higher-order stencils: it exploits data reuse, reduces floating-point operations, and exposes efficient SIMD parallelism to backend compilers. We study the benefit of this optimization in the context of Geometric Multigrid (GMG), a widely used method to solve PDEs, using four different Jacobi smoothers built from 7-, 13-, 27- and 125-point stencils. We quantify performance, speedup, and numerical accuracy, and use the Roofline model to qualify our results. Ultimately, we obtain over 4× speedup on the smoothers themselves and up to a 3× speedup on the multigrid solver. Finally, we demonstrate that high-order multigrid solvers have the potential of reducing total data movement and energy by several orders of magnitude.
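
    The partial-sum idea can be illustrated on a simple 9-point box stencil: vertical 3-point sums are buffered once per output row, and each output then reuses three buffered sums instead of re-adding all 9 inputs. The C++ sketch below illustrates the general technique only; it is not the paper's transformation as implemented.

```cpp
#include <vector>

using Grid = std::vector<std::vector<double>>;

// Naive 9-point box stencil: 8 additions per interior output point.
Grid naive(const Grid& a) {
    std::size_t n = a.size(), m = a[0].size();
    Grid out(n, std::vector<double>(m, 0.0));
    for (std::size_t i = 1; i + 1 < n; ++i)
        for (std::size_t j = 1; j + 1 < m; ++j) {
            double s = 0.0;
            for (int di = -1; di <= 1; ++di)
                for (int dj = -1; dj <= 1; ++dj)
                    s += a[i + di][j + dj];
            out[i][j] = s;
        }
    return out;
}

// Partial-sum form: buffer vertical 3-sums once per output row, then
// each output reuses three buffered sums shared with its neighbours.
Grid partialSums(const Grid& a) {
    std::size_t n = a.size(), m = a[0].size();
    Grid out(n, std::vector<double>(m, 0.0));
    std::vector<double> col(m);
    for (std::size_t i = 1; i + 1 < n; ++i) {
        for (std::size_t j = 0; j < m; ++j)
            col[j] = a[i - 1][j] + a[i][j] + a[i + 1][j];  // buffered partial sums
        for (std::size_t j = 1; j + 1 < m; ++j)
            out[i][j] = col[j - 1] + col[j] + col[j + 1];  // reused across j
    }
    return out;
}
```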

  9. P3T+: A Performance Estimator for Distributed and Parallel Programs

    Directory of Open Access Journals (Sweden)

    T. Fahringer

    2000-01-01

    Full Text Available Developing distributed and parallel programs on today's multiprocessor architectures is still a challenging task. Particularly distressing is the lack of effective performance tools that support the programmer in evaluating changes in code, problem and machine sizes, and target architectures. In this paper we introduce P3T+, a performance estimator for mostly regular HPF (High Performance Fortran) programs that also partially covers message-passing programs (MPI). P3T+ is unique in modeling programs, compiler code transformations, and parallel and distributed architectures. It computes at compile-time a variety of performance parameters including work distribution, number of transfers, amount of data transferred, transfer times, computation times, and number of cache misses. Several novel technologies are employed to compute these parameters: loop iteration spaces, array access patterns, and data distributions are modeled by employing highly effective symbolic analysis. Communication is estimated by simulating the behavior of a communication library used by the underlying compiler. Computation times are predicted through pre-measured kernels on every target architecture of interest. We carefully model the most critical architecture-specific factors such as cache line sizes, number of cache lines available, startup times, message transfer time per byte, etc. P3T+ has been implemented and is closely integrated with the Vienna High Performance Compiler (VFC) to support programmers in developing parallel and distributed applications. Experimental results for realistic kernel codes taken from real-world applications are presented to demonstrate both the accuracy and usefulness of P3T+.
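
    The startup-time and per-byte parameters mentioned in the abstract combine in the classic latency-bandwidth cost model for a message transfer. The C++ sketch below is a minimal illustration of that model; the function names and parameter values are hypothetical, not part of P3T+.

```cpp
#include <cstddef>
#include <vector>

// Classic latency-bandwidth model of message transfer time:
// a fixed startup cost plus a per-byte cost (inverse bandwidth).
// Names and parameter values are illustrative only.
double transferTimeUs(std::size_t bytes, double startupUs, double perByteUs) {
    return startupUs + static_cast<double>(bytes) * perByteUs;
}

// Estimated total communication time for a list of message sizes.
double totalCommTimeUs(const std::vector<std::size_t>& messageSizes,
                       double startupUs, double perByteUs) {
    double t = 0.0;
    for (std::size_t b : messageSizes)
        t += transferTimeUs(b, startupUs, perByteUs);
    return t;
}
```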

  10. International Fund of Aral Sea rescue:prospects and new aims

    International Nuclear Information System (INIS)

    Aslov, S.M.

    2003-01-01

    In this chapter of the book the author discusses the aims and prospects of the International Fund of Aral Sea rescue. The basic aim of the International Fund of Aral Sea rescue was to attract funds from the governments of Central Asia and from international donor organisations to finance the Program of the Aral Sea basin. The Executive Committee of the International Fund of Aral Sea rescue is now developing the program for the Aral Sea basin for the period up to 2010

  11. Compendium of student papers : 2009 undergraduate transportation engineering fellows program.

    Science.gov (United States)

    2009-10-01

    This report is a compilation of research papers written by students participating in the 2009 Undergraduate Transportation Scholars Program. The ten-week summer program, now in its nineteenth year, provides undergraduate students in Civil Enginee...

  12. The Compilation of the Shona–English Biomedical Dictionary: Problems and Challenges

    Directory of Open Access Journals (Sweden)

    Nomalanga Mpofu

    2011-10-01

    Full Text Available

    ABSTRACT: The bilingual Shona–English dictionary of biomedical terms, Duramazwi reUrapi neUtano, was compiled with the aim of improving the efficiency of communication between doctor and patient. The dictionary is composed of terms from both modern and traditional medicinal practices. The article seeks to look at the methods of production of the dictionary, the presentation of entries in the dictionary and the problems and challenges encountered in the compilation process, namely, developing Shona medical terminology in the cultural context and especially the aspect of equivalence between English and Shona biomedical terms.

    Keywords: BIOMEDICAL, ADOPTIVES, ENTRIES, SYNONYMS, CROSS-REFERENCES, IDIOMS, CIRCUMLOCUTION, STANDARDISATION, HEADWORD, EQUIVALENCE, VARIANTS, DEFINITION, CULTURE, EUPHEMISMS, MODERN, TRADITIONAL, MONOLINGUAL, BILINGUAL, CORPUS, BORROWING, SHONA, COMMUNICATION


  13. Source list of nuclear data bibliographies, compilations, and evaluations

    International Nuclear Information System (INIS)

    Burrows, T.W.; Holden, N.E.

    1978-10-01

    To aid the user of nuclear data, many specialized bibliographies, compilations, and evaluations have been published. This document is an attempt to bring together a list of such publications with an indication of their availability and cost

  14. T.J. Kriel (original compiler), D.J. Prinsloo and B.P. Sathekge (compilers revised edition). Popular Northern Sotho Dictionary

    Directory of Open Access Journals (Sweden)

    Kwena J. Mashamaite

    2011-10-01

    Full Text Available The compilers of this new edition have successfully highlighted the important additions to the last edition of the dictionary. It is important to inform prospective users about new information. It is also a marketing strategy to announce the contents of a new product in both the preface and at the back of the cover page, as is the case with this dictionary.

  15. Design Choices in a Compiler Course or How to Make Undergraduates Love Formal Notation

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff

    2008-01-01

    The undergraduate compiler course offers a unique opportunity to combine many aspects of the Computer Science curriculum. We discuss the many design choices that are available for the instructor and present the current compiler course at the University of Aarhus, the design of which displays at l...

  16. A Coarse-Grained Reconfigurable Architecture with Compilation for High Performance

    Directory of Open Access Journals (Sweden)

    Lu Wan

    2012-01-01

    Full Text Available We propose a fast data relay (FDR) mechanism to enhance existing CGRA (coarse-grained reconfigurable architecture) designs. FDR can not only provide multicycle data transmission concurrently with computations but also convert resource-demanding inter-processing-element global data accesses into local data accesses to avoid communication congestion. We also propose the supporting compiler techniques that can efficiently utilize the FDR feature to achieve higher performance for a variety of applications. Our results on the FDR-based CGRA are compared with two other works in this field: ADRES and RCP. Experimental results for various multimedia applications show that FDR combined with the new compiler delivers up to 29% and 21% higher performance than ADRES and RCP, respectively.

  17. Construction experiences from underground works at Forsmark. Compilation Report

    Energy Technology Data Exchange (ETDEWEB)

    Carlsson, Anders [Vattenfall Power Consultant AB, Stockholm (Sweden); Christiansson, Rolf [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)

    2007-02-15

    The main objective with this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low and intermediate reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR, primarily on rock support solutions. The authors of this report have separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations, and acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements, and advice on tunnel support and grouting). The report is to a great extent based on earlier published material, as presented in the list of references. It stands to reason, however, that during the course of the work with this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible.

  18. Construction experiences from underground works at Forsmark. Compilation Report

    International Nuclear Information System (INIS)

    Carlsson, Anders; Christiansson, Rolf

    2007-02-01

    The main objective with this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low and intermediate reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR, primarily on rock support solutions. The authors of this report have separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations, and acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements, and advice on tunnel support and grouting). The report is to a great extent based on earlier published material, as presented in the list of references. It stands to reason, however, that during the course of the work with this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible

  19. Specification and Compilation of Real-Time Stream Processing Applications

    NARCIS (Netherlands)

    Geuns, S.J.

    2015-01-01

    This thesis is concerned with the specification, compilation and corresponding temporal analysis of real-time stream processing applications that are executed on embedded multiprocessor systems. An example of such applications are software defined radio applications. These applications typically

  20. Charged-particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, Cornelia; Angulo, C.; Arnould, M.

    2000-01-01

    The rapidly growing wealth of nuclear data is becoming less and less easily accessible to the astrophysics community. Mastering this volume of information and making it available in an accurate and usable form for incorporation into stellar evolution or nucleosynthesis models have become urgent goals of prime necessity. We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal motivation for the setting-up of the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates, issued by Fowler and collaborators. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given and the corresponding reaction rates are calculated and given in tabular form. When cross section data are not available in the whole needed range of energies, the theoretical predictions obtained in the framework of the Hauser-Feshbach model are used. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. Reverse reaction rates and analytical approximations of the adopted rates are also provided. (authors)

  1. Charged-particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, Cornelia

    1999-01-01

    The rapidly growing wealth of nuclear data is becoming less and less easily accessible to the astrophysics community. Mastering this volume of information and making it available in an accurate and usable form for incorporation into stellar evolution or nucleosynthesis models have become urgent goals of prime necessity. We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal motivation for the setting-up of the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates, issued by Fowler and collaborators. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given and the corresponding reaction rates are calculated and given in tabular form. When cross section data are not available in the whole needed range of energies, the theoretical predictions obtained in the framework of the Hauser-Feshbach model are used. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. Reverse reaction rates and analytical approximations of the adopted rates are also provided. (author)

  2. A compilation of reports of the Advisory Committee on Reactor Safeguards: 1995 annual. Volume 17

    International Nuclear Information System (INIS)

    1996-04-01

    This compilation contains 44 ACRS reports submitted to the Commission, or to the Executive Director for Operations, during calendar year 1995. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room and the US Library of Congress. The reports are divided into two groups: Part 1: ACRS Reports on Project Reviews, and Part 2: ACRS Reports on Generic Subjects. Part 1 lists ACRS reports by project name and in chronological order within each project. Part 2 categorizes the reports by the most appropriate generic subject area and in chronological order within each subject area.

  3. Not mere lexicographic cosmetics: the compilation and structural ...

    African Journals Online (AJOL)

    This article offers a brief overview of the compilation of the Ndebele music terms dictionary, Isichazamazwi SezoMculo (henceforth the ISM), paying particular attention to its structural features. It emphasises that the reference needs of the users as well as their reference skills should be given a determining role in all ...

  4. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming focuses on the techniques of automatic programming used with digital computers. Topics covered range from the design of machine-independent programming languages to the use of recursive procedures in ALGOL 60. A multi-pass translation scheme for ALGOL 60 is described, along with some commercial source languages. The structure and use of the syntax-directed compiler is also considered. Comprised of 12 chapters, this volume begins with a discussion on the basic ideas involved in the description of a computing process as a program for a computer, expressed in

  5. Regulatory and technical reports (abstract index journal). Compilation for third quarter 1997, July--September

    International Nuclear Information System (INIS)

    Stevenson, L.L.

    1998-01-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. This report contains the third quarter 1997 abstracts.

  6. Development and application of LiveWire program

    International Nuclear Information System (INIS)

    Zhang Bo; Li Jing; Wang Xiaoming

    2007-01-01

    LiveWire is a module of the Netscape Web server that implements CGI-like functionality; with a LiveWire application one can create dynamic web pages on a web site. This article introduces how to write LiveWire application code; how to compile, debug, and manage LiveWire applications; and how to deploy a LiveWire application on a Netscape Web server to create a dynamic web page. (authors)

  7. Final Report: Center for Programming Models for Scalable Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [William Marsh Rice University

    2011-09-13

    As part of the Center for Programming Models for Scalable Parallel Computing, Rice University collaborated with project partners in the design, development and deployment of language, compiler, and runtime support for parallel programming models to support application development for the “leadership-class” computer systems at DOE national laboratories. Work over the course of this project has focused on the design, implementation, and evaluation of a second-generation version of Coarray Fortran. Research and development efforts of the project have focused on the CAF 2.0 language, compiler, runtime system, and supporting infrastructure. This has involved working with the teams that provide infrastructure for CAF that we rely on, implementing new language and runtime features, producing an open source compiler that enabled us to evaluate our ideas, and evaluating our design and implementation through the use of benchmarks. The report details the research, development, findings, and conclusions from this work.

  8. Modeling EERE Deployment Programs

    Energy Technology Data Exchange (ETDEWEB)

    Cort, Katherine A.; Hostick, Donna J.; Belzer, David B.; Livingston, Olga V.

    2007-11-08

    The purpose of this report is to compile information and conclusions gathered as part of three separate tasks undertaken as part of the overall project, “Modeling EERE Deployment Programs,” sponsored by the Planning, Analysis, and Evaluation office within the Department of Energy’s Office of Energy Efficiency and Renewable Energy (EERE). The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address improvements to modeling in the near term, and note gaps in knowledge where future research is needed.

  9. Data compilation for particle impact desorption

    International Nuclear Information System (INIS)

    Oshiyama, Takashi; Nagai, Siro; Ozawa, Kunio; Takeuchi, Fujio.

    1984-05-01

    The desorption of gases from solid surfaces by incident electrons, ions and photons is one of the important processes of hydrogen recycling in the controlled thermonuclear reactors. We have surveyed the literature concerning the particle impact desorption published through 1983 and compiled the data on the desorption cross sections and desorption yields with the aid of a computer. This report presents the results obtained for electron stimulated desorption, the desorption cross sections and yields being given in graphs and tables as functions of incident electron energy, surface temperature and gas exposure. (author)

  10. Compilation of PRF Canyon Floor Pan Sample Analysis Results

    Energy Technology Data Exchange (ETDEWEB)

    Pool, Karl N. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Minette, Michael J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wahl, Jon H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Greenwood, Lawrence R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Coffey, Deborah S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McNamara, Bruce K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bryan, Samuel A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Scheele, Randall D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Delegard, Calvin H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sinkov, Sergey I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Soderquist, Chuck Z. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fiskum, Sandra K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Brown, Garrett N. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clark, Richard A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-06-30

    common anions, gamma spectrometry, metals, corrosivity, organics and alpha spectrometry (note: alpha spectrometry was cancelled during the performance of this work with concurrence from CHPRC). Results may help elucidate the components that led to the unexpected reaction in the canyon as well as inform the radiological and hazardous characteristics. The specific anions, gamma emitters, organics and metals requested by CHPRC are provided in the analytical reports sections. The individual analyses were conducted under the Plutonium Finishing Plant (PFP) Floor Pan Evaluation Project Quality Assurance Project Plan (PFP Floor Pan Evaluation QAPP, Revision 0) developed by PNNL specifically for this project. The final reports for each analysis set are included in this compilation of the results. Each package was reviewed under the PFP Floor Pan Evaluation QAPP, so no additional reviews were conducted in this compilation task. The gas generation rate analyses in Appendix G were conducted under the PNNL “How Do I…” quality assurance program and were NOT conducted under the PFP Floor Pan Evaluation QAPP, Revision 0.

  11. Indexed compilation of experimental high energy physics literature

    International Nuclear Information System (INIS)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

    An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is presented, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state-particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given.

  12. A survey aimed at general citizens of the US and Japan about their attitudes toward electronic medical data handling.

    Science.gov (United States)

    Kimura, Michio; Nakaya, Jun; Watanabe, Hiroshi; Shimizu, Toshiro; Nakayasu, Kazuyuki

    2014-04-25

    To clarify the views of the general population of two countries (US and Japan) concerning the handling of their medical records electronically. We contacted people nationwide in the United States at random via Random Digit Dialing (RDD) to obtain 200 eligible responders. The questionnaire was for obtaining the information on their attitudes towards handling of their medical records, disclosure of the name of disease, secondary usage of information, compiling their records into a lifelong medical record, and access to their medical records on the Internet. We had also surveyed people of Shizuoka prefecture in Japan using the same questionnaires sent by mail, for which we obtained 457 valid answers. Even in an unidentifiable manner, US people feel profit-oriented usage of medical data without specific consent is not acceptable. There is a significant difference between usage of unidentifiable medical data for profit (about 50% feel negatively) and for official/research purposes (about 30% feel negatively). About 60% of the US responders have a negative view on the proposal that unidentifiable medical information be utilized for profit by private companies to attain healthcare cost savings. As regards compiling a lifelong medical record, positive answers and negative answers are almost equally divided in the US (46% vs. 38%) while more positive attitudes are seen in Japan (74% vs. 12%). However, any incentive measures aimed at changing attitudes to such a compiling, including the discount of healthcare costs or insurance fees, are unwelcomed by people regardless of their age or health condition in both surveys. Regarding the access to their own medical record via the Internet, 38% of the US responders feel this is unacceptable while 50.5% were willing to accept it. Participants from the US think that the extent of sharing their identifiable medical records should be limited to the doctors-in-charge and specified doctors referred to by their own doctors. On the other

  13. A Survey Aimed at General Citizens of the US and Japan about Their Attitudes toward Electronic Medical Data Handling

    Directory of Open Access Journals (Sweden)

    Michio Kimura

    2014-04-01

    Objectives: To clarify the views of the general population of two countries (US and Japan) concerning the handling of their medical records electronically. Methods: We contacted people nationwide in the United States at random via Random Digit Dialing (RDD) to obtain 200 eligible responders. The questionnaire was for obtaining the information on their attitudes towards handling of their medical records, disclosure of the name of disease, secondary usage of information, compiling their records into a lifelong medical record, and access to their medical records on the Internet. We had also surveyed people of Shizuoka prefecture in Japan using the same questionnaires sent by mail, for which we obtained 457 valid answers. Results: Even in an unidentifiable manner, US people feel profit-oriented usage of medical data without specific consent is not acceptable. There is a significant difference between usage of unidentifiable medical data for profit (about 50% feel negatively) and for official/research purposes (about 30% feel negatively). About 60% of the US responders have a negative view on the proposal that unidentifiable medical information be utilized for profit by private companies to attain healthcare cost savings. As regards compiling a lifelong medical record, positive answers and negative answers are almost equally divided in the US (46% vs. 38%) while more positive attitudes are seen in Japan (74% vs. 12%). However, any incentive measures aimed at changing attitudes to such a compiling, including the discount of healthcare costs or insurance fees, are unwelcomed by people regardless of their age or health condition in both surveys. Regarding the access to their own medical record via the Internet, 38% of the US responders feel this is unacceptable while 50.5% were willing to accept it. Conclusions: Participants from the US think that the extent of sharing their identifiable medical records should be limited to the doctors-in-charge and specified

  14. A Forth interpreter and compiler's study for computer aided design

    International Nuclear Information System (INIS)

    Djebbar, F. Zohra Widad

    1986-01-01

    The wide field of utilization of FORTH led us to develop an interpreter. It has been implemented on an MC 68000 microprocessor based computer, with ASTERIX, a UNIX-like operating system (a real-time system written by C.E.A.). This work has been done in two different versions: - The first one, fully written in the C language, ensures good portability on a wide variety of microprocessors. But performance estimates revealed excessive execution times, and led to a new optimized version. - This new version is characterized by the compilation of the most frequently used words of the FORTH basis. This gives an interpreter with good performance and an execution speed close to that obtained with the C compiler. (author) [fr

  15. Regulatory and technical reports (abstract index journal). Annual compilation for 1984. Volume 9, No. 4

    International Nuclear Information System (INIS)

    1985-01-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually.

  16. Design Considerations in Developing a Text Messaging Program Aimed at Smoking Cessation

    Science.gov (United States)

    Holtrop, Jodi Summers; Bağci Bosi, A Tülay; Emri, Salih

    2012-01-01

    Background Cell phone text messaging is gaining increasing recognition as an important tool that can be harnessed for prevention and intervention programs across a wide variety of health research applications. Despite the growing body of literature reporting positive outcomes, very little is available about the design decisions that scaffold the development of text messaging-based health interventions. What seems to be missing is documentation of the thought process of investigators in the initial stages of protocol and content development. This omission is of particular concern because many researchers seem to view text messaging as the intervention itself instead of simply a delivery mechanism. Certainly, aspects of this technology may increase participant engagement. Like other interventions, however, the content is a central driver of the behavior change. Objective To address this noted gap in the literature, we discuss the protocol decisions and content development for SMS Turkey (or Cebiniz birakin diyor in Turkish), a smoking cessation text messaging program for adult smokers in Turkey. Methods Content was developed in English and translated into Turkish. Efforts were made to ensure that the protocol and content were grounded in evidence-based smoking cessation theory, while also reflective of the cultural aspects of smoking and quitting in Turkey. Results Methodological considerations included whether to provide cell phones and whether to reimburse participants for texting costs; whether to include supplementary intervention resources (eg, personal contact); and whether to utilize unidirectional versus bidirectional messaging. Program design considerations included how messages were tailored to the quitting curve and one’s smoking status after one’s quit date, the number of messages participants received per day, and over what period of time the intervention lasted. Conclusion The content and methods of effective smoking cessation quitline programs were

  17. Utilizing the RE-AIM Framework in formative evaluation and program planning for a healthy food choice intervention in the Lower Mississippi Delta

    Science.gov (United States)

    A robust approach to program planning is needed for the development and execution of effective and sustainable behavioral interventions with large public health impact. The purpose of this formative research was to apply dimensions of the RE-AIM (i.e., Reach, Effectiveness, Adoption, Implementation,...

  18. Prevention program at construction worksites aimed at improving health and work ability is cost-saving to the employer: Results from an RCT

    NARCIS (Netherlands)

    Oude Hengel, K.M.; Bosmans, J.E.; Dongen, J.M. van; Bongers, P.M.; Beek, A.J. van der; Blatter, B.M.

    2014-01-01

    Background: To prolong sustainable healthy working lives of construction workers, a prevention program was developed which aimed to improve the health and work ability of construction workers. The objective of this study was to analyze the cost-effectiveness and financial return from the employers'

  19. 2014 Water Power Program Peer Review: Marine and Hydrokinetic Technologies, Compiled Presentations (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    2014-02-01

    This document represents a collection of all presentations given during the EERE Wind and Water Power Program's 2014 Marine and Hydrokinetic Peer Review. The purpose of the meeting was to evaluate DOE-funded hydropower and marine and hydrokinetic R&D projects for their contribution to the mission and goals of the Water Power Program and to assess progress made against stated objectives.

  20. Regulatory and technical reports: (Abstract index journal). Compilation for first quarter 1997, January--March

    International Nuclear Information System (INIS)

    Sheehan, M.A.

    1997-06-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. This compilation is published quarterly and cumulated annually. Reports consist of staff-originated reports, NRC-sponsored conference reports, NRC contractor-prepared reports, and international agreement reports.

  1. Assessing the relationship between computational speed and precision: a case study comparing an interpreted versus compiled programming language using a stochastic simulation model in diabetes care.

    Science.gov (United States)

    McEwan, Phil; Bergenheim, Klas; Yuan, Yong; Tetlow, Anthony P; Gordon, Jason P

    2010-01-01

    Simulation techniques are well suited to modelling diseases yet can be computationally intensive. This study explores the relationship between modelled effect size, statistical precision, and efficiency gains achieved using variance reduction and an executable programming language. A published simulation model designed to model a population with type 2 diabetes mellitus based on the UKPDS 68 outcomes equations was coded in both Visual Basic for Applications (VBA) and C++. Efficiency gains due to the programming language were evaluated, as was the impact of antithetic variates to reduce variance, using predicted QALYs over a 40-year time horizon. The use of C++ provided a 75- and 90-fold reduction in simulation run time when using mean and sampled input values, respectively. For a series of 50 one-way sensitivity analyses, this would yield a total run time of 2 minutes when using C++, compared with 155 minutes for VBA when using mean input values. The use of antithetic variates typically resulted in a 53% reduction in the number of simulation replications and run time required. When drawing all input values to the model from distributions, the use of C++ and variance reduction resulted in a 246-fold improvement in computation time compared with VBA - for which the evaluation of 50 scenarios would correspondingly require 3.8 hours (C++) and approximately 14.5 days (VBA). The choice of programming language used in an economic model, as well as the methods for improving precision of model output can have profound effects on computation time. When constructing complex models, more computationally efficient approaches such as C++ and variance reduction should be considered; concerns regarding model transparency using compiled languages are best addressed via thorough documentation and model validation.
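The antithetic-variates technique credited above with roughly halving the number of replications can be sketched in a few lines. The toy outcome function below is purely illustrative (it is not the UKPDS 68 equations); the idea is only that pairing each uniform draw u with its mirror 1 - u induces negative correlation between paired outcomes, which lowers the variance of the mean:

```python
import random

def simulate_qalys(u):
    # Toy stand-in for a disease model: maps a uniform draw to a
    # QALY-like outcome. Illustrative only, not the UKPDS 68 equations.
    return 10.0 * u ** 0.5

def crude_estimate(n, seed=1):
    # Plain Monte Carlo mean over n independent draws.
    rng = random.Random(seed)
    return sum(simulate_qalys(rng.random()) for _ in range(n)) / n

def antithetic_estimate(n, seed=1):
    # Pair each draw u with its antithetic counterpart 1 - u; because
    # the outcome function is monotone in u, paired outcomes are
    # negatively correlated and the estimator variance drops.
    rng = random.Random(seed)
    total = 0.0
    pairs = n // 2
    for _ in range(pairs):
        u = rng.random()
        total += simulate_qalys(u) + simulate_qalys(1.0 - u)
    return total / (2 * pairs)
```

Running both estimators across many seeds and comparing the spread of the estimates shows the reduction directly; the size of the gain depends on how monotone the model's response to its random inputs is.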

  2. Documentation of methods and inventory of irrigation data collected for the 2000 and 2005 U.S. Geological Survey Estimated use of water in the United States, comparison of USGS-compiled irrigation data to other sources, and recommendations for future compilations

    Science.gov (United States)

    Dickens, Jade M.; Forbes, Brandon T.; Cobean, Dylan S.; Tadayon, Saeid

    2011-01-01

    Every five years since 1950, the U.S. Geological Survey (USGS) National Water Use Information Program (NWUIP) has compiled water-use information in the United States and published a circular report titled "Estimated use of water in the United States," which includes estimates of water withdrawals by State, sources of water withdrawals (groundwater or surface water), and water-use category (irrigation, public supply, industrial, thermoelectric, and so forth). This report discusses the impact of important considerations when estimating irrigated acreage and irrigation withdrawals, including estimates of conveyance loss, irrigation-system efficiencies, pasture, horticulture, golf courses, and double cropping.

  3. OMPC: an Open-Source MATLAB-to-Python Compiler.

    Science.gov (United States)

    Jurica, Peter; van Leeuwen, Cees

    2009-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose the Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com.

  4. CUMBIN - CUMULATIVE BINOMIAL PROGRAMS

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, CUMBIN, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), can be used independently of one another. CUMBIN can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. CUMBIN calculates the probability that a system of n components has at least k operating, given that each component operates with probability p and the components are independent. Equivalently, this is the reliability of a k-out-of-n system having independent components with common reliability p. CUMBIN can evaluate the incomplete beta distribution for two positive integer arguments. CUMBIN can also evaluate the cumulative F distribution and the negative binomial distribution, and can determine the sample size in a test design. CUMBIN is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. The program is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. The CUMBIN program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CUMBIN was developed in 1988.
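The k-out-of-n reliability quantity that CUMBIN tabulates is a cumulative binomial tail sum, which can be reproduced in a few lines. This is a sketch of the underlying formula, not NASA's C implementation:

```python
from math import comb

def kofn_reliability(k, n, p):
    """Probability that at least k of n independent components operate,
    each with common reliability p -- the quantity CUMBIN computes."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
```

For example, a 2-out-of-3 system of components each with reliability 0.9 has system reliability 3(0.9)²(0.1) + (0.9)³ = 0.972.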

  5. Updated site compilation of the Latin American Pollen Database

    NARCIS (Netherlands)

    Flantua, S.G.A.; Hooghiemstra, H.; Grimm, E.C.; Behling, H.; Bush, M.B; González-Arrango, C.; Gosling, W.D.; Ledru, M.-P.; Lozano-Garciá, S.; Maldonado, A.; Prieto, A.R.; Rull, V.; van Boxel, J.H.

    2015-01-01

    The updated inventory of the Latin American Pollen Database (LAPD) offers a wide range of new insights. This paper presents a systematic compilation of palynological research in Latin America. A comprehensive inventory of publications in peer-reviewed and grey literature shows a major expansion of

  6. Bibliography of the space processing program. Volume 1: A compilation through June 1974, Parts 1 and 2. [space manufacturing/spacecraft construction materials - aerospace environments

    Science.gov (United States)

    Shoultz, M. B.; Mcclurken, E. W., Jr.

    1975-01-01

    A compilation of NASA research efforts in the area of space environmental effects on materials and processes is presented. Topics considered are: (1) fluid mechanics and heat transfer; (2) crystal growth and containerless melts; (3) acoustics; (4) glass and ceramics; (5) electrophoresis; (6) welding; and (7) exobiology.

  7. Usability Issues in the Design of Novice Programming Systems,

    Science.gov (United States)

    1996-08-01

    lists this as a design principle for novice programming environments. In traditional compiled languages, beginners are also confused by the need to...programming task external knowledge that might interfere with correct understanding of the language. Most beginner programming errors can be...language for text editing, but [Curtis 1988] found that a textual pseudocode and graphical flowcharts were both better than natural language in program

  8. Compilation of accident statistics in PSE

    International Nuclear Information System (INIS)

    Jobst, C.

    1983-04-01

    The objective of the investigations on transportation carried out within the framework of the 'Project - Studies on Safety in Waste Management (PSE II)' is the determination of the risk of accidents in the transportation of radioactive materials by rail. The fault tree analysis is used for the determination of risks in the transportation system. This method offers a possibility for the determination of frequency and consequences of accidents which could lead to an unintended release of radionuclides. The study presented compiles all data obtained from the accident statistics of the Federal German Railways. (orig./RB) [de
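The fault tree analysis mentioned above combines basic-event probabilities through AND and OR gates to obtain a top-event probability. A minimal sketch for independent events follows; all numbers are invented for illustration and are not taken from the PSE study:

```python
def and_gate(probs):
    # Top event requires ALL independent basic events to occur.
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(probs):
    # Top event requires AT LEAST ONE independent basic event to occur:
    # complement of "none occur".
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

# Hypothetical top event: a release occurs if a transport accident happens
# AND (the container fails OR a handling error occurs). All values assumed.
p_accident = 1e-4    # per-shipment accident frequency (illustrative)
p_container = 1e-2   # conditional container failure probability (illustrative)
p_handling = 5e-3    # conditional handling-error probability (illustrative)
p_release = and_gate([p_accident, or_gate([p_container, p_handling])])
```

Real fault trees must also handle dependent and repeated basic events, for which these simple product formulas are no longer exact.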

  9. abc: An extensible AspectJ compiler

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie

    2005-01-01

    checking and code generation, as well as data flow and control flow analyses. The AspectBench Compiler (abc) is an implementation of such a workbench. The base version of abc implements the full AspectJ language. Its frontend is built, using the Polyglot framework, as a modular extension of the Java language. The use of Polyglot gives flexibility of syntax and type checking. The backend is built using the Soot framework, to give modular code generation and analyses. In this paper, we outline the design of abc, focusing mostly on how the design supports extensibility. We then provide a general overview...

  10. The Compilation of Multilingual Concept Literacy Glossaries at the ...

    African Journals Online (AJOL)

    account for the multilingual concept literacy glossaries being compiled under the auspices of .... a theory, i.e. the set of premises, arguments and conclusions required for explaining ... fully address cognitive and communicative needs, especially of laypersons. ..... tion at UCT, and in indigenous languages as auxiliary media.

  11. Individual risk. A compilation of recent British data

    International Nuclear Information System (INIS)

    Grist, D.R.

    1978-08-01

    A compilation of data is presented on individual risk obtained from recent British population and mortality statistics. Risk data presented include: risk of death, as a function of age, due to several important natural causes and due to accidents and violence; risk of death as a function of location of accident; and risk of death from various accidental causes. (author)

  12. The BLAZE language - A parallel language for scientific programming

    Science.gov (United States)

    Mehrotra, Piyush; Van Rosendale, John

    1987-01-01

    A Pascal-like scientific programming language, BLAZE, is described. BLAZE contains array arithmetic, forall loops, and APL-style accumulation operators, which allow natural expression of fine grained parallelism. It also employs an applicative or functional procedure invocation mechanism, which makes it easy for compilers to extract coarse grained parallelism using machine specific program restructuring. Thus BLAZE should allow one to achieve highly parallel execution on multiprocessor architectures, while still providing the user with conceptually sequential control flow. A central goal in the design of BLAZE is portability across a broad range of parallel architectures. The multiple levels of parallelism present in BLAZE code, in principle, allow a compiler to extract the types of parallelism appropriate for the given architecture while neglecting the remainder. The features of BLAZE are described and it is shown how this language would be used in typical scientific programming.

  13. The BLAZE language: A parallel language for scientific programming

    Science.gov (United States)

    Mehrotra, P.; Vanrosendale, J.

    1985-01-01

    A Pascal-like scientific programming language, Blaze, is described. Blaze contains array arithmetic, forall loops, and APL-style accumulation operators, which allow natural expression of fine grained parallelism. It also employs an applicative or functional procedure invocation mechanism, which makes it easy for compilers to extract coarse grained parallelism using machine specific program restructuring. Thus Blaze should allow one to achieve highly parallel execution on multiprocessor architectures, while still providing the user with conceptually sequential control flow. A central goal in the design of Blaze is portability across a broad range of parallel architectures. The multiple levels of parallelism present in Blaze code, in principle, allow a compiler to extract the types of parallelism appropriate for the given architecture while neglecting the remainder. The features of Blaze are described, and it is shown how this language would be used in typical scientific programming.

  14. Shear-wave velocity compilation for Northridge strong-motion recording sites

    Science.gov (United States)

    Borcherdt, Roger D.; Fumal, Thomas E.

    2002-01-01

    Borehole and other geotechnical information collected at the strong-motion recording sites of the Northridge earthquake of January 17, 1994, provide an important new basis for the characterization of local site conditions. These geotechnical data, when combined with analysis of strong-motion recordings, provide an empirical basis to evaluate site coefficients used in current versions of US building codes. Shear-wave-velocity estimates to a depth of 30 meters are derived for 176 strong-motion recording sites. The estimates are based on borehole shear-velocity logs, physical property logs, correlations with physical properties, and digital geologic maps. Surface-wave velocity measurements and standard penetration data are compiled as additional constraints. These data, as compiled from a variety of databases, are presented via GIS maps and corresponding tables to facilitate use by other investigators.

  15. HAPS, a Handy Analog Programming System

    DEFF Research Database (Denmark)

    Højberg, Kristian Søe

    1975-01-01

    HAPS (Hybrid Analog Programming System) is an analog compiler that can be run on a minicomputer in an interactive mode. Essentially HAPS is written in FORTRAN. The equations to be programmed for an analog computer are read in by using a FORTRAN-like notation. The input must contain maximum...... and emphasizes the limitations HAPS puts on equation structure, types of computing circuit, scaling, and static testing....

  16. A compilation of reports of the Advisory Committee on Reactor Safeguards, 1997 annual, U.S. Nuclear Regulatory Commission. Volume 19

    International Nuclear Information System (INIS)

    1998-04-01

    This compilation contains 67 ACRS reports submitted to the Commission, or to the Executive Director for Operations, during calendar year 1997. It also includes a report to the Congress on the NRC Safety Research Program. Specific topics include: (1) advanced reactor designs, (2) emergency core cooling systems, (3) fire protection, (4) generic letters and issues, (5) human factors, (6) instrumentation, control and protection systems, (7) materials engineering, (8) probabilistic risk assessment, (9) regulatory guides and procedures, rules, regulations, and (10) safety research, philosophy, technology and criteria

  17. A compilation of reports of the Advisory Committee on Reactor Safeguards, 1997 annual, U.S. Nuclear Regulatory Commission. Volume 19

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-04-01

    This compilation contains 67 ACRS reports submitted to the Commission, or to the Executive Director for Operations, during calendar year 1997. It also includes a report to the Congress on the NRC Safety Research Program. Specific topics include: (1) advanced reactor designs, (2) emergency core cooling systems, (3) fire protection, (4) generic letters and issues, (5) human factors, (6) instrumentation, control and protection systems, (7) materials engineering, (8) probabilistic risk assessment, (9) regulatory guides and procedures, rules, regulations, and (10) safety research, philosophy, technology and criteria.

  18. Digital compilation bedrock geologic map of the Milton quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-8A Dorsey, R, Doolan, B, Agnew, PC, Carter, CM, Rosencrantz, EJ, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the Milton...

  19. The RHNumtS compilation: Features and bioinformatics approaches to locate and quantify Human NumtS

    Directory of Open Access Journals (Sweden)

    Saccone Cecilia

    2008-06-01

    Abstract. Background: To a greater or lesser extent, eukaryotic nuclear genomes contain fragments of their mitochondrial genome counterpart, deriving from the random insertion of damaged mtDNA fragments. NumtS (Nuclear mt Sequences) are not equally abundant in all species, and are redundant and polymorphic in terms of copy number. In population and clinical genetics, it is important to have a complete overview of NumtS quantity and location. Searching PubMed for NumtS or mitochondrial pseudo-genes yields hundreds of papers reporting Human NumtS compilations produced by in silico or wet-lab approaches. A comparison of published compilations clearly shows significant discrepancies among data, due both to unwise application of bioinformatics methods and to a not yet correctly assembled nuclear genome. To optimize quantification and location of NumtS, we produced a consensus compilation of Human NumtS by applying various bioinformatics approaches. Results: Location and quantification of NumtS may be achieved by applying database similarity searching methods: we have applied various methods such as Blastn, MegaBlast and BLAT, changing both parameters and database; the results were compared, further analysed and checked against the already published compilations, thus producing the Reference Human NumtS (RHNumtS) compilation. The resulting NumtS total 190. Conclusion: The RHNumtS compilation represents a highly reliable reference basis, which may allow designing a lab protocol to test the actual existence of each NumtS. Here we report preliminary results based on PCR amplification and sequencing of 41 NumtS selected from RHNumtS among those with lower score. In parallel, we are currently designing the RHNumtS database structure for implementation in the HmtDB resource.
    In the future, the same database will host NumtS compilations from other organisms, but these will be generated only when the nuclear genome of a specific organism has reached a high

  20. The CCAA program aims to improve the capacity of African countries ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    CCAA

    of African countries to adapt to climate change in ways ... training. • Communications and networking ... in Cities of Portuguese-speaking Small Island Developing States – ... This includes training in topics core to addressing climate ... Program (ACCFP) to deepen Africa's capacity in policy, teaching .... and rural livelihoods.

  1. Checking Java Programs

    CERN Document Server

    Darwin, Ian

    2007-01-01

    This Short Cut tells you about tools that will improve the quality of your Java code, using checking above and beyond what the standard tools do, including: Using javac options, JUnit and assertions Making your IDE work harder Checking your source code with PMD Checking your compiled code (.class files) with FindBugs Checking your program's run-time behavior with Java PathFinder

  2. Consolidated fuel reprocessing. Program progress report, April 1-June 30, 1980

    Energy Technology Data Exchange (ETDEWEB)

    1980-09-01

    This progress report is compiled from major contributions from three programs: (1) the Advanced Fuel Recycle Program at ORNL; (2) the Converter Fuel Reprocessing Program at Savannah River Laboratory; and (3) the reprocessing components of the HTGR Fuel Recycle Program, primarily at General Atomic and ORNL. The coverage is generally overview in nature; experimental details and data are limited.

  3. Digital compilation bedrock geologic map of the Lincoln quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-5A Stanley, R, DelloRusso, V, Haydock, S, Lapp, E, O'Loughlin, S, Prewitt, J,and Tauvers, PR, 1995, Digital compilation bedrock geologic map...

  4. Data compilation of angular distributions of sputtered atoms

    International Nuclear Information System (INIS)

    Yamamura, Yasunori; Takiguchi, Takashi; Tawara, Hiro.

    1990-01-01

    Sputtering on a surface is generally caused by the collision cascade developed near the surface. The process is in principle the same as that causing radiation damage in the bulk of solids. Sputtering has long been regarded as an undesirable dirty effect which destroys the cathodes and grids in gas discharge tubes or ion sources and contaminates plasma and the surrounding walls. However, sputtering is used today for many applications such as sputter ion sources, mass spectrometers and the deposition of thin films. Plasma contamination and the surface erosion of first walls due to sputtering are still major problems in fusion research. The angular distribution of the particles sputtered from solid surfaces can provide detailed information on the collision cascade in the interior of targets. This report presents a compilation of the angular distributions of atoms sputtered from monatomic solids at normal and oblique incidence, for various combinations of incident ions and target atoms. (K.I.)

  5. Analogy Mapping Development for Learning Programming

    Science.gov (United States)

    Sukamto, R. A.; Prabawa, H. W.; Kurniawati, S.

    2017-02-01

    Programming skill is an important skill for computer science students, yet many computer science students in Indonesia currently lack programming skills and information technology knowledge. This is at odds with the implementation of the ASEAN Economic Community (AEC) since the end of 2015, which demands qualified workers. This study supports the acquisition of programming skills by mapping program code to visual analogies used as learning media. The developed media was based on state-machine and compiler principles and was implemented for the C programming language. Every basic construct in a program was successfully mapped to an analogy visualization.
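
The abstract gives no listing of the mapping itself, so the following is a minimal Python sketch, assuming a simple rule-driven classifier: it assigns C statements to the kinds of construct categories that visual analogies could be attached to. The rules and category names are hypothetical, not the authors' actual mapping.

```python
import re

# Ordered pattern rules: the first matching rule decides the category
# (a crude stand-in for the state-machine classification of C code).
RULES = [
    (re.compile(r"^\s*if\s*\("), "branch"),
    (re.compile(r"^\s*(for|while)\s*\("), "loop"),
    (re.compile(r"^\s*(int|float|char|double)\b"), "declaration"),
    (re.compile(r"=[^=]"), "assignment"),
]

def classify(stmt):
    """Return the analogy category for a single C statement."""
    for pattern, category in RULES:
        if pattern.search(stmt):
            return category
    return "other"

labels = [classify(s) for s in ["int x;", "x = 3;", "if (x > 0)", "while (x--)"]]
```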

  6. A new compiler for the GANIL Data Acquisition description

    International Nuclear Information System (INIS)

    Saillant, F.; Raine, B.

    1997-01-01

    An important feature of the GANIL Data Acquisition System is the description of experiments by means of a language developed at GANIL. The philosophy is to attribute to each element (parameters, spectra, etc.) an operational name which can be used at any level of the system. This language references a library of modules to free the user from the technical details of the hardware. The compiler has recently been entirely re-developed using technologies such as the object-oriented language C++ and an object-oriented software development method and tool. This enables us to provide a new functionality, or to support a new electronic module, within a very short time and without any deep modification of the application. A new Dynamic Library of Modules has also been developed. Its complete description is available on the GANIL WEB site http://ganinfo.in2p3.fr/acquisition/homepage.html. The new compiler brings many new functionalities, the most important of which is the notion of a 'register', whatever the module standard. All the registers described in the module provider's documentation can now be accessed by name. Another important new feature is the notion of a 'function' that can be executed on a module. A set of new instructions has also been implemented to execute commands on CAMAC crates. Finally, the new compiler enables the description of specific interfaces with the GANIL Data Acquisition System. This has been used to describe the coupling of the CHIMERA Data Acquisition System with the INDRA one through a shared memory in the VME crate. (authors)

  7. A quantum CISC compiler and scalable assembler for quantum computing on large systems

    Energy Technology Data Exchange (ETDEWEB)

    Schulte-Herbrueggen, Thomas; Spoerl, Andreas; Glaser, Steffen [Dept. Chemistry, Technical University of Munich (TUM), 85747 Garching (Germany)

    2008-07-01

    Using the cutting-edge high-speed parallel cluster HLRB-II (with a total LINPACK performance of 63.3 TFlops/s) we present a quantum CISC compiler into time-optimised or decoherence-protected complex instruction sets. They comprise effective multi-qubit interactions with up to 10 qubits. We show how to assemble these medium-sized CISC modules in a scalable way for quantum computation on large systems. Extending the toolbox of universal gates by optimised complex multi-qubit instruction sets paves the way to fight decoherence in realistic Markovian and non-Markovian settings. The advantage of quantum CISC compilation over standard RISC compilation into one- and two-qubit universal gates is demonstrated inter alia for the quantum Fourier transform (QFT) and for multiply-controlled NOT gates. The speed-up is up to a factor of six, thus giving significantly better performance under decoherence. Implications for upper limits to time complexities are also derived.

  8. Regulatory and technical reports (abstract index journal): Annual compilation for 1994. Volume 19, Number 4

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order. These precede the following indexes: secondary report number index, personal author index, subject index, NRC originating organization index (staff reports), NRC originating organization index (international agreements), NRC contract sponsor index (contractor reports), contractor index, international organization index, and licensed facility index. A detailed explanation of the entries precedes each index.

  9. Digital compilation bedrock geologic map of the Warren quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-4A Walsh, GJ, Haydock, S, Prewitt, J, Kraus, J, Lapp, E, O'Loughlin, S, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the...

  10. Fiscal 1998 research report on super compiler technology; 1998 nendo super konpaira technology no chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    For next-generation supercomputing systems, research was conducted on parallel and distributed compiler technology for enhancing effective performance, and on related software and architectures for enhancing performance in coordination with compilers. As for parallel compiler technology, the development of scalable automated parallel compiler technology, parallel tuning tools, and an operating system that uses multi-processor resources effectively are identified as important concrete technical issues. In addition, by applying these research results to single-chip multi-processor architectures, the possibility of developing and expanding the PC, WS and HPC (high-performance computer) markets, and of creating new industries, is pointed out. Although wide-area distributed computing is attracting attention as a next-generation computing industry, the concrete industrial fields for such computing are not yet clear, and research remains at an exploratory stage. (NEDO)

  11. The Compilation of Multilingual Concept Literacy Glossaries at the ...

    African Journals Online (AJOL)

    In order to support concept literacy, especially for students for whom English is not the native language, a number of universities in South Africa are compiling multilingual glossaries through which the use of languages other than English may be employed as auxiliary media. Terminologies in languages other than English ...

  12. Infrastructure for Rapid Development of Java GUI Programs

    Science.gov (United States)

    Jones, Jeremy; Hostetter, Carl F.; Wheeler, Philip

    2006-01-01

    The Java Application Shell (JAS) is a software framework that accelerates the development of Java graphical-user-interface (GUI) application programs by enabling the reuse of common, proven GUI elements, as distinguished from writing custom code for GUI elements. JAS is a software infrastructure upon which Java interactive application programs and graphical user interfaces (GUIs) for those programs can be built as sets of plug-ins. JAS provides an application-programming interface that is extensible by application-specific plug-ins that describe and encapsulate both specifications of a GUI and application-specific functionality tied to the specified GUI elements. The desired GUI elements are specified in Extensible Markup Language (XML) descriptions instead of in compiled code. JAS reads and interprets these descriptions, then creates and configures a corresponding GUI from a standard set of generic, reusable GUI elements. These elements are then attached (again, according to the XML descriptions) to application-specific compiled code and scripts. An application program constructed by use of JAS as its core can be extended by writing new plug-ins and replacing existing plug-ins. Thus, JAS solves many problems that Java programmers generally solve anew for each project, thereby reducing development and testing time.
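
JAS's actual XML schema is not reproduced in the abstract, so the sketch below only illustrates the general idea: a declarative XML description is interpreted and turned into configured instances of generic, reusable elements. The tag names, attributes, and factory registry here are assumptions for illustration.

```python
import xml.etree.ElementTree as ET

# Hypothetical GUI description in the spirit of JAS: widgets are specified
# declaratively in XML rather than constructed in compiled code.
GUI_XML = """
<gui>
  <button id="run" label="Run"/>
  <textfield id="name" label="Target name"/>
</gui>
"""

# Registry of generic, reusable element factories (stand-ins for real widgets).
FACTORY = {
    "button":    lambda e: {"kind": "button", "id": e.get("id"), "label": e.get("label")},
    "textfield": lambda e: {"kind": "textfield", "id": e.get("id"), "label": e.get("label")},
}

def build_gui(xml_text):
    """Interpret the XML description and create configured GUI elements."""
    root = ET.fromstring(xml_text)
    return [FACTORY[child.tag](child) for child in root]

widgets = build_gui(GUI_XML)
```

Extending such a GUI means editing the XML (or registering a new factory), not recompiling the application, which is the reuse argument the abstract makes.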

  13. Towards Implementation of a Generalized Architecture for High-Level Quantum Programming Language

    Science.gov (United States)

    Ameen, El-Mahdy M.; Ali, Hesham A.; Salem, Mofreh M.; Badawy, Mahmoud

    2017-08-01

    This paper investigates a novel architecture addressing the problem of quantum computer programming. A generalized architecture for a high-level quantum programming language is proposed, enabling the evolution from complicated quantum-based programming to high-level, quantum-independent programming. The proposed architecture receives high-level source code and automatically transforms it into the equivalent quantum representation. The architecture involves two layers, the programmer layer and the compilation layer, implemented in three main stages: pre-classification, classification, and post-classification. The basic building block of each stage is divided into subsequent phases, each implemented to perform the required transformations from one representation to another. A verification process using a case study investigates the ability of the compiler to perform all transformation processes. Experimental results showed that the proposed compiler achieves a correspondence correlation coefficient of about R ≈ 1 between outputs and targets. A clear improvement was also obtained in the time consumed by the optimization process compared to other techniques: in online optimization the consumed time increases exponentially with the amount of accuracy needed, whereas in the proposed offline optimization process it increases only gradually.

  14. Materials Sciences programs, Fiscal Year 1984

    International Nuclear Information System (INIS)

    1984-09-01

    This report provides a convenient compilation and index of the DOE Materials Sciences Division programs. The report is divided into six sections. Section A contains all Laboratory projects, Section B has all contract research projects, Section C has projects funded under the Small Business Innovation Research program, Section D has information on DOE collaborative research centers, Section E gives distributions of funding, and Section F has various indexes

  15. Scientific Programming in Fortran

    Directory of Open Access Journals (Sweden)

    W. Van Snyder

    2007-01-01

    The Fortran programming language was designed by John Backus and his colleagues at IBM to reduce the cost of programming scientific applications. IBM delivered the first compiler for its model 704 in 1957. IBM's competitors soon offered incompatible versions. ANSI (ASA at the time) developed a standard, largely based on IBM's Fortran IV, in 1966. Revisions of the standard were produced in 1977, 1990, 1995 and 2003. Development of a revision, scheduled for 2008, is under way. Unlike most other programming languages, Fortran is periodically revised to keep pace with developments in language and processor design, while revisions largely preserve compatibility with previous versions. Throughout, the focus on scientific programming, and especially on efficient generated programs, has been maintained.

  16. An IBM 370 assembly language program verifier

    Science.gov (United States)

    Maurer, W. D.

    1977-01-01

    The paper describes a program written in SNOBOL which verifies the correctness of programs written in assembly language for the IBM 360 and 370 series of computers. The motivation for using assembly language as a source language for a program verifier was the realization that many errors in programs are caused by misunderstanding or ignorance of the characteristics of specific computers. The proof of correctness of a program written in assembly language must take these characteristics into account. The program has been compiled and is currently running at the Center for Academic and Administrative Computing of The George Washington University.

  17. Thoughts and views on the compilation of monolingual dictionaries ...

    African Journals Online (AJOL)

    The end-products should be of a high lexicographic standard, well-balanced in terms of lemma selection, length of the articles, maximum utilisation of available dictionary space etc. They should also be planned and compiled in such a way that the transition from paper dictionaries to electronic dictionaries could be easily ...

  18. On program restructuring, scheduling, and communication for parallel processor systems

    Energy Technology Data Exchange (ETDEWEB)

    Polychronopoulos, Constantine D. [Univ. of Illinois, Urbana, IL (United States)

    1986-08-01

    This dissertation discusses several software and hardware aspects of program execution on large-scale, high-performance parallel processor systems. The issues covered are program restructuring, partitioning, scheduling and interprocessor communication, synchronization, and hardware design issues of specialized units. All this work was performed focusing on a single goal: to maximize program speedup, or equivalently, to minimize parallel execution time. Parafrase, a Fortran restructuring compiler was used to transform programs in a parallel form and conduct experiments. Two new program restructuring techniques are presented, loop coalescing and subscript blocking. Compile-time and run-time scheduling schemes are covered extensively. Depending on the program construct, these algorithms generate optimal or near-optimal schedules. For the case of arbitrarily nested hybrid loops, two optimal scheduling algorithms for dynamic and static scheduling are presented. Simulation results are given for a new dynamic scheduling algorithm. The performance of this algorithm is compared to that of self-scheduling. Techniques for program partitioning and minimization of interprocessor communication for idealized program models and for real Fortran programs are also discussed. The close relationship between scheduling, interprocessor communication, and synchronization becomes apparent at several points in this work. Finally, the impact of various types of overhead on program speedup and experimental results are presented.
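
Loop coalescing, one of the two restructuring techniques named above, can be shown concretely. The Python sketch below collapses a doubly nested loop into a single loop whose index is decomposed back by div/mod; this is the standard form of the transformation, not code taken from the dissertation.

```python
def coalesced(n, m, body):
    # Loop coalescing: collapse a doubly nested loop over an n-by-m
    # iteration space into a single loop of n*m iterations, giving a
    # scheduler one flat pool of iterations to distribute.
    for k in range(n * m):
        i, j = divmod(k, m)   # recover the original loop indices
        body(i, j)

# The coalesced loop visits exactly the iterations of:
#   for i in range(n):
#       for j in range(m):
#           body(i, j)
visited = []
coalesced(2, 3, lambda i, j: visited.append((i, j)))
```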

  19. Optimizing python-based ROOT I/O with PyPy's tracing just-in-time compiler

    Science.gov (United States)

    Lavrijsen, Wim TLP

    2012-12-01

    The Python programming language allows objects and classes to respond dynamically to the execution environment. Most of this, however, is made possible through language hooks which by definition cannot be optimized and thus tend to be slow. The PyPy implementation of Python includes a tracing just-in-time compiler (JIT), which allows similar dynamic responses but at the interpreter-, rather than the application-level. Therefore, it is possible to fully remove the hooks, leaving only the dynamic response, in the optimization stage for hot loops, if the types of interest are opened up to the JIT. A general opening up of types to the JIT, based on reflection information, has already been developed (cppyy). The work described in this paper takes it one step further by customizing access to ROOT I/O to the JIT, allowing for fully automatic optimizations.

  20. Optimizing python-based ROOT I/O with PyPy's tracing just-in-time compiler

    International Nuclear Information System (INIS)

    Lavrijsen, Wim TLP

    2012-01-01

    The Python programming language allows objects and classes to respond dynamically to the execution environment. Most of this, however, is made possible through language hooks which by definition cannot be optimized and thus tend to be slow. The PyPy implementation of Python includes a tracing just-in-time compiler (JIT), which allows similar dynamic responses but at the interpreter-, rather than the application-level. Therefore, it is possible to fully remove the hooks, leaving only the dynamic response, in the optimization stage for hot loops, if the types of interest are opened up to the JIT. A general opening up of types to the JIT, based on reflection information, has already been developed (cppyy). The work described in this paper takes it one step further by customizing access to ROOT I/O to the JIT, allowing for fully automatic optimizations.
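
The "language hooks" the abstract refers to can be illustrated with Python's `__getattr__`: every attribute lookup that falls through to the hook carries interpretive overhead, which a tracing JIT such as PyPy's can remove once the type is known. The class below is a toy stand-in for this pattern, not cppyy or the actual ROOT bindings.

```python
class Branches:
    # Responds dynamically to attribute access via the __getattr__ hook,
    # similar in spirit to how Python bindings expose ROOT tree branches.
    # Each lookup goes through the hook; an ordinary interpreter cannot
    # optimize this away, whereas a tracing JIT can, for hot loops.
    def __init__(self, data):
        self._data = data

    def __getattr__(self, name):
        try:
            return self._data[name]
        except KeyError:
            raise AttributeError(name)

tree = Branches({"pt": 41.5, "eta": -1.2})
total = 0.0
for _ in range(3):          # a "hot loop" of repeated dynamic lookups
    total += tree.pt
```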

  1. 12 CFR 503.2 - Exemptions of records containing investigatory material compiled for law enforcement purposes.

    Science.gov (United States)

    2010-01-01

    Title 12 (Banks and Banking), vol. 5, 2010-01-01: Exemptions of records containing investigatory material compiled for law enforcement purposes. Section 503.2, Banks and Banking, OFFICE OF THRIFT SUPERVISION, DEPARTMENT OF THE TREASURY, PRIVACY ACT § 503.2 Exemptions of records containing investigatory material compiled for law enforcement...

  2. Compilation of historical information of 300 Area facilities and activities

    International Nuclear Information System (INIS)

    Gerber, M.S.

    1992-12-01

    This document is a compilation of historical information of the 300 Area activities and facilities since the beginning. The 300 Area is shown as it looked in 1945, and also a more recent (1985) look at the 300 Area is provided

  3. Compilation of historical information of 300 Area facilities and activities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, M.S.

    1992-12-01

    This document is a compilation of historical information of the 300 Area activities and facilities since the beginning. The 300 Area is shown as it looked in 1945, and also a more recent (1985) look at the 300 Area is provided.

  4. Compiling language definitions: the ASF+SDF compiler

    NARCIS (Netherlands)

    M.G.J. van den Brand (Mark); J. Heering (Jan); P. Klint (Paul); P.A. Olivier (Pieter)

    2000-01-01

    The ASF+SDF Meta-Environment is an interactive language development environment whose main application areas are definition of domain-specific languages, generation of program analysis and transformation tools, production of software renovation tools, and general specification and

  5. Data compilation for radiation effects on ceramic insulators

    International Nuclear Information System (INIS)

    Fukuya, Koji; Terasawa, Mititaka; Nakahigashi, Shigeo; Ozawa, Kunio.

    1986-08-01

    Data on radiation effects on ceramic insulators were compiled from the literature and summarized from the viewpoint of fast neutron irradiation effects. The data were classified according to property and ceramic. The properties are dimensional stability, mechanical properties, thermal properties, and electrical and dielectric properties. Data sheets were prepared for each table or graph in the literature. The characteristic features of the data base are briefly described. (author)

  6. Just-In-Time compilation of OCaml byte-code

    OpenAIRE

    Meurer, Benedikt

    2010-01-01

    This paper presents various improvements that were applied to OCamlJIT2, a Just-In-Time compiler for the OCaml byte-code virtual machine. OCamlJIT2 currently runs on various Unix-like systems with x86 or x86-64 processors. The improvements, including the new x86 port, are described in detail, and performance measures are given, including a direct comparison of OCamlJIT2 to OCamlJIT.

  7. Indexed compilation of experimental high energy physics literature. [Synopsis

    Energy Technology Data Exchange (ETDEWEB)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

    An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is given, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state-particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given.

  8. Towards programming languages for genetic engineering of living cells.

    Science.gov (United States)

    Pedersen, Michael; Phillips, Andrew

    2009-08-06

    Synthetic biology aims at producing novel biological systems to carry out some desired and well-defined functions. An ultimate dream is to design these systems at a high level of abstraction using engineering-based tools and programming languages, press a button, and have the design translated to DNA sequences that can be synthesized and put to work in living cells. We introduce such a programming language, which allows logical interactions between potentially undetermined proteins and genes to be expressed in a modular manner. Programs can be translated by a compiler into sequences of standard biological parts, a process that relies on logic programming and prototype databases that contain known biological parts and protein interactions. Programs can also be translated to reactions, allowing simulations to be carried out. While current limitations on available data prevent full use of the language in practical applications, the language can be used to develop formal models of synthetic systems, which are otherwise often presented by informal notations. The language can also serve as a concrete proposal on which future language designs can be discussed, and can help to guide the emerging standard of biological parts which so far has focused on biological, rather than logical, properties of parts.
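
The paper's actual language and parts databases are not given here, so the following Python sketch only conveys the compilation idea: a high-level logical statement ("protein A activates gene G") is resolved against a database of standard parts into an ordered DNA design. All part names and database keys below are hypothetical placeholders, not real biological part identifiers.

```python
# Toy database of standard parts, keyed by role and logical property.
# Every entry here is a made-up placeholder.
PARTS_DB = {
    ("promoter", "activated_by", "A"): "P_actA",
    ("rbs", "default"): "RBS_std",
    ("cds", "G"): "CDS_G",
    ("terminator", "default"): "T_std",
}

def compile_expression_unit(activator, gene):
    # Translate "activator turns on gene" into an ordered parts sequence:
    # promoter responsive to the activator, then RBS, coding sequence,
    # and terminator.
    return [
        PARTS_DB[("promoter", "activated_by", activator)],
        PARTS_DB[("rbs", "default")],
        PARTS_DB[("cds", gene)],
        PARTS_DB[("terminator", "default")],
    ]

design = compile_expression_unit("A", "G")
```

A real system, as the abstract notes, would resolve such rules by logic programming over curated databases of parts and protein interactions rather than a flat dictionary lookup.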

  9. Compilation of MCNP data library based on JENDL-3T and test through analysis of benchmark experiment

    International Nuclear Information System (INIS)

    Sakurai, K.; Sasamoto, N.; Kosako, K.; Ishikawa, T.; Sato, O.; Oyama, Y.; Narita, H.; Maekawa, H.; Ueki, K.

    1989-01-01

    Based on an evaluated nuclear data library JENDL-3T, a temporary version of JENDL-3, a pointwise neutron cross section library for the MCNP code is compiled, covering 39 nuclides from H-1 to Am-241 that are important for shielding calculations. Compilation is performed with a code system consisting of the nuclear data processing code NJOY-83 and the library compilation code MACROS. Validity of the code system and reliability of the library are confirmed by analysing benchmark experiments. (author)

  10. A model for the design and programming of multi-cores

    NARCIS (Netherlands)

    Jesshope, C.; Grandinetti, L.

    2008-01-01

    This paper describes a machine/programming model for the era of multi-core chips. It is derived from the sequential model but replaces sequential composition with concurrent composition at all levels in the program except at the level where the compiler is able to make deterministic decisions on

  11. IAEA's experience in compiling a generic component reliability data base

    International Nuclear Information System (INIS)

    Tomic, B.; Lederman, L.

    1991-01-01

    Reliability data are essential in probabilistic safety assessment, with component reliability parameters being particularly important. Plant-specific component failure data would be most appropriate, but such data are rather limited; however, similar components are used in different designs, so generic data (all data that are not specific to the plant being analyzed but relate to components more generally) are important. The International Atomic Energy Agency has compiled the Generic Component Reliability Data Base from data available in the open literature. It is part of the IAEA computer code package for fault/event tree analysis. The Data Base contains 1010 different records including most of the components used in probabilistic safety analyses of nuclear power plants. The data base input was quality controlled and data sources noted. The data compilation procedure and problems associated with using generic data are explained. (UK)

  12. Galveston Head Start Captive Reared Sea Turtle Program 1979 to 2016 (NCEI Accession 0157625)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This is a compilation of several data sets related to the Galveston Texas Seaturtle Headstart program. Most notable is the Kemp's ridley headstart program...

  13. Materials Sciences programs, fiscal year 1986

    International Nuclear Information System (INIS)

    1986-09-01

    Purpose of this report is to provide a convenient compilation and index of the DOE Materials Sciences Division programs. The report is divided into six sections. Section A contains all Laboratory projects, Section B has all contract research projects, Section C has projects funded under the Small Business Innovation Research Program, Sections D and E have information on DOE collaborative research centers, Section F gives distribution of funding, and Section G has various indexes

  14. Compilation of a global inventory of emissions of nitrous oxide

    NARCIS (Netherlands)

    Bouwman, A.F.

    1995-01-01

    A global inventory with 1°x1° resolution was compiled of emissions of nitrous oxide (N 2 O) to the atmosphere, including emissions from soils under natural vegetation, fertilized agricultural land, grasslands and animal excreta, biomass burning, forest clearing,

  15. PC Graphic file programing

    International Nuclear Information System (INIS)

    Yang, Jin Seok

    1993-04-01

    This book covers basic graphics knowledge and the understanding and implementation of graphic file formats. The first part deals with graphic data, its storage and compression, and programming topics such as assembly language, the stack, compiling and linking programs, and practice and debugging. The second part covers graphic file formats such as MacPaint, GEM/IMG, PCX, GIF, and TIFF files, hardware considerations such as monochrome and high-speed color screen drivers, and the basic concepts of dithering and format conversion.

  16. Functional Programming with C++ Template Metaprograms

    Science.gov (United States)

    Porkoláb, Zoltán

    Template metaprogramming is an emerging new direction of generative programming. With clever definitions of templates we can force the C++ compiler to execute algorithms at compilation time. Among the application areas of template metaprograms are expression templates, static interface checking, code optimization with adaptation, language embedding and active libraries. However, as template metaprogramming was not an original design goal, the C++ language is not capable of elegant expression of metaprograms. The complicated syntax leads to the creation of code that is hard to write, understand and maintain. Although template metaprogramming has a strong relationship with functional programming, this is not reflected in the language syntax and existing libraries. In this paper we give a short and incomplete introduction to C++ templates and the basics of template metaprogramming. We highlight the role of template metaprograms, and some important and widely used idioms. We give an overview of the possible application areas as well as debugging and profiling techniques. We suggest a pure functional style programming interface for C++ template metaprograms in the form of embedded Haskell code which is transformed to standard-compliant C++ source.

  17. Summary report of the 1. research co-ordination meeting on compilation and evaluation of photonuclear data for applications

    International Nuclear Information System (INIS)

    1997-04-01

    The present report contains the summary of the first Research Co-ordination Meeting on ''Compilation and Evaluation of Photonuclear Data for Applications'', held in Obninsk, Russia, from 3 to 6 December 1996. The project aims to produce a Technical Document on Photonuclear Data Library for Applications and to develop an IAEA Photonuclear Data Library. Summarized are the conclusions and recommendations of the meeting together with a detailed list of actions. Attached is the information sheet on the project, the agenda of the meeting and the list of participants along with extended abstracts of their presentations. Refs, figs, tabs

  18. NVL-C: Static Analysis Techniques for Efficient, Correct Programming of Non-Volatile Main Memory Systems

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seyong [ORNL; Vetter, Jeffrey S [ORNL

    2016-01-01

    Computer architecture experts expect that non-volatile memory (NVM) hierarchies will play a more significant role in future systems including mobile, enterprise, and HPC architectures. With this expectation in mind, we present NVL-C: a novel programming system that facilitates the efficient and correct programming of NVM main memory systems. The NVL-C programming abstraction extends C with a small set of intuitive language features that target NVM main memory, and can be combined directly with traditional C memory model features for DRAM. We have designed these new features to enable compiler analyses and run-time checks that can improve performance and guard against a number of subtle programming errors, which, when left uncorrected, can corrupt NVM-stored data. Moreover, to enable recovery of data across application or system failures, these NVL-C features include a flexible directive for specifying NVM transactions. So that our implementation might be extended to other compiler front ends and languages, the majority of our compiler analyses are implemented in an extended version of LLVM's intermediate representation (LLVM IR). We evaluate NVL-C on a number of applications to show its flexibility, performance, and correctness.

  19. An Optimizing Compiler for Petascale I/O on Leadership Class Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Choudhary, Alok [Northwestern Univ., Evanston, IL (United States); Kandemir, Mahmut [Pennsylvania State Univ., State College, PA (United States)

    2015-03-18

    In high-performance computing systems, parallel I/O architectures usually have very complex hierarchies with multiple layers that collectively constitute an I/O stack, including high-level I/O libraries such as PnetCDF and HDF5, I/O middleware such as MPI-IO, and parallel file systems such as PVFS and Lustre. Our project explored automated instrumentation and compiler support for I/O intensive applications. We made significant progress towards understanding the complex I/O hierarchies of high-performance storage systems (including storage caches, HDDs, and SSDs), and towards designing and implementing state-of-the-art compiler/runtime system technology for I/O intensive HPC applications targeting leadership-class machines. This final report summarizes the major achievements of the project and also points out promising future directions.

  20. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem...... conspicuously absent from the literature on biomolecular computing; to partially redress this absence, we introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined...... by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways...

  1. Statistical Compilation of the ICT Sector and Policy Analysis | CRDI ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  3. User's manual for the computer-aided plant transient data compilation

    International Nuclear Information System (INIS)

    Langenbuch, S.; Gill, R.; Lerchl, G.; Schwaiger, R.; Voggenberger, T.

    1984-01-01

    The objective of this project is the compilation of data for nuclear power plants needed for transient analyses. The concept has been already described. This user's manual gives a detailed description of all functions of the dialogue system that supports data acquisition and retrieval. (orig.) [de

  4. Recent advances in PC-Linux systems for electronic structure computations by optimized compilers and numerical libraries.

    Science.gov (United States)

    Yu, Jen-Shiang K; Yu, Chin-Hui

    2002-01-01

    One of the most frequently used packages for electronic structure research, GAUSSIAN 98, is compiled on Linux systems with various hardware configurations, including AMD Athlon (with the "Thunderbird" core), AthlonMP, and AthlonXP (with the "Palomino" core) systems as well as Intel Pentium 4 (with the "Willamette" core) machines. The default PGI FORTRAN compiler (pgf77) and the Intel FORTRAN compiler (ifc) are respectively employed with different architectural optimization options to compile GAUSSIAN 98 and test the performance improvement. In addition to the BLAS library included in revision A.11 of this package, the Automatically Tuned Linear Algebra Software (ATLAS) library is linked against the binary executables to improve the performance. Various Hartree-Fock, density-functional theory, and MP2 calculations are performed for benchmarking purposes. It is found that the combination of ifc with the ATLAS library gives the best performance for GAUSSIAN 98 on all of these PC-Linux computers, with both AMD and Intel CPUs. Even on AMD systems, the Intel FORTRAN compiler invariably produces binaries with better performance than pgf77. The enhancement provided by the ATLAS library is more significant for post-Hartree-Fock calculations. The performance on a single CPU is potentially as good as that on an Alpha 21264A workstation or an SGI supercomputer. The SPECfp2000 floating-point scores show trends similar to the GAUSSIAN 98 results.

  5. Compiling a corpus-based dictionary grammar: an example for ...

    African Journals Online (AJOL)

    In this article it is shown how a corpus-based dictionary grammar may be compiled — that is, a mini-grammar fully based on corpus data and specifically written for use in and inte-grated with a dictionary. Such an effort is, to the best of our knowledge, a world's first. We exem-plify our approach for a Northern Sotho ...

  6. NEWTONP - CUMULATIVE BINOMIAL PROGRAMS

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, NEWTONP, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, NEWTONP, CUMBIN (NPO-17555), and CROSSER (NPO-17557), can be used independently of one another. NEWTONP can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. NEWTONP calculates the probability p required to yield a given system reliability V for a k-out-of-n system. It can also be used to determine the Clopper-Pearson confidence limits (either one-sided or two-sided) for the parameter p of a Bernoulli distribution. NEWTONP can determine Bayesian probability limits for a proportion (if the beta prior has positive integer parameters). It can determine the percentiles of incomplete beta distributions with positive integer parameters. It can also determine the percentiles of F distributions and the median plotting positions in probability plotting. NEWTONP is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. NEWTONP is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. It also lists the number of iterations of Newton's method required to calculate the answer within the given error. The NEWTONP program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. NEWTONP was developed in 1988.

  7. Compilation of data on γγ → hadrons

    International Nuclear Information System (INIS)

    Roberts, R.G.; Whalley, M.R.

    1986-06-01

    Data on γγ → hadrons extracted from e⁺e⁻ reactions is compiled. The review includes inclusive cross-sections, structure functions, exclusive cross-sections and resonance widths. Data up to 1st July 1986 are included. All the data in this review can be found and retrieved in the Durham-RAL HEP database, together with a wide range of other reaction data. Users throughout Europe can interactively access the database through CMS on the RAL computer. (author)

  8. Digital compilation bedrock geologic map of the Mt. Ellen quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-6A Stanley, RS, Walsh, G, Tauvers, PR, DiPietro, JA, and DelloRusso, V, 1995,�Digital compilation bedrock geologic map of the Mt. Ellen...

  9. Prognosis of medical and economic efficiency of a patient-oriented program implementation aimed at formation of adherenceto drug therapy among rural population

    Directory of Open Access Journals (Sweden)

    E A Kitaeva

    2018-02-01

    Full Text Available Aim. Development and implementation of novel organizational management technologies of medical care aimed at formation of adherence to drug therapy in patients from rural areas, and calculation of the medical and economic efficiency of implementing this project. Methods. The study population was drawn from the Rybnaya Sloboda district of the Republic of Tatarstan. Patient recruitment into the groups was conducted in the polyclinic of the Rybnaya Sloboda central regional hospital. The duration of the study was 6 months for each of two groups, with further follow-up and evaluation of adherence to therapy for 2 months. Results. Annually, stroke affects 5.6 to 6.6 million people around the world, 35% of whom die in the acute period. Cardiovascular disorders have also been affecting increasingly younger patients. The main reason for this trend is patients' low compliance with drug therapy; good compliance, in turn, can significantly decrease the risk of cardiovascular complications. The article discusses the causes of low compliance with drug therapy and presents methods for building it in patients from rural areas. Examples of foreign and Russian experience in increasing patients' compliance with drug therapy are described, and the key intervention points for patients are determined. On the basis of this analysis, a patient-oriented program aimed at forming adherence to drug therapy among the rural population was developed and proposed. The authors also evaluated the medical and economic efficiency of implementing this program (assessment of expenditures for medications, hospital stay, and incapacity related to the main disease, and evaluation of expenditures for prevention of complications and disability). Conclusion. Effective organization of prophylactic activity is of great importance for prevention of cardiovascular disease.

  10. USERDA computer program summaries. Numbers 177--239

    International Nuclear Information System (INIS)

    1975-10-01

    Since 1960 the Argonne Code Center has served as a U. S. Atomic Energy Commission information center for computer programs developed and used primarily for the solution of problems in nuclear physics, reactor design, reactor engineering and operation. The Center, through a network of registered installations, collects, validates, maintains, and distributes a library of these computer programs and publishes a compilation of abstracts describing them. In 1972 the scope of the Center's activities was officially expanded to include computer programs developed in all of the U. S. Atomic Energy Commission program areas and the compilation and publication of this report. The Computer Program Summary report contains summaries of computer programs at the specification stage, under development, being checked out, in use, or available at ERDA offices, laboratories, and contractor installations. Programs are divided into the following categories: cross section and resonance integral calculations; spectrum calculations, generation of group constants, lattice and cell problems; static design studies; depletion, fuel management, cost analysis, and reactor economics; space-independent kinetics; space--time kinetics, coupled neutronics--hydrodynamics--thermodynamics and excursion simulations; radiological safety, hazard and accident analysis; heat transfer and fluid flow; deformation and stress distribution computations, structural analysis and engineering design studies; gamma heating and shield design programs; reactor systems analysis; data preparation; data management; subsidiary calculations; experimental data processing; general mathematical and computing system routines; materials; environmental and earth sciences; space sciences; electronics and engineering equipment; chemistry; particle accelerators and high-voltage machines; physics; controlled thermonuclear research; biology and medicine; and data

  11. Low-temperature geothermal water in Utah: A compilation of data for thermal wells and springs through 1993

    Energy Technology Data Exchange (ETDEWEB)

    Blackett, R.E.

    1994-07-01

    The Geothermal Division of DOE initiated the Low-Temperature Geothermal Resources and Technology Transfer Program, following a special appropriation by Congress in 1991, to encourage wider use of lower-temperature geothermal resources through direct-use, geothermal heat-pump, and binary-cycle power conversion technologies. The Oregon Institute of Technology (OIT), the University of Utah Research Institute (UURI), and the Idaho Water Resources Research Institute organized the federally-funded program and enlisted the help of ten western states to carry out phase one. This first phase involves updating the inventory of thermal wells and springs with the help of the participating state agencies. The state resource teams inventory thermal wells and springs and compile relevant information on each source. OIT and UURI cooperatively administer the program: OIT provides overall contract management while UURI provides technical direction to the state teams. Phase one of the program focuses on replacing part of GEOTHERM by building a new database of low- and moderate-temperature geothermal systems for use on personal computers. For Utah, this involved (1) identifying sources of geothermal data, (2) designing a database structure, (3) entering the new data, (4) checking for errors, inconsistencies, and duplicate records, (5) organizing the data into reporting formats, and (6) generating a map (1:750,000 scale) of Utah showing the locations and record identification numbers of thermal wells and springs.

  12. Effectiveness of a stress management pilot program aimed at reducing the incidence of sports injuries in young football (soccer) players.

    Science.gov (United States)

    Olmedilla-Zafra, Aurelio; Rubio, Victor J; Ortega, Enrique; García-Mas, Alexandre

    2017-03-01

    Several attempts to reduce the incidence of sport injuries using psychosocial interventions have produced fruitful, although inconclusive, results. This paper presents the effectiveness and implementation issues of a pilot 3-month stress-management and muscle relaxation program aimed at reducing sport injury incidence. Design: pre-post comparison of treatment and non-treatment groups. The program was administered by a trained psychologist on a once-a-week, 1-h session basis. Seventy-four male soccer players from four National Youth league teams voluntarily participated. Teams were randomly assigned to either the treatment or the non-treatment group. Measures: injury protocol, self-monitoring cards, athletes' satisfaction and commitment survey, coaches' interview. The group main effect and the time-group interaction effect were both statistically significant, F(1,60) = 8.30, p = 0.005, η²p = 0.121, with the average number of injuries larger in the post-treatment phase of the non-treatment group (p = 0.005, η²p = 0.077). There was a significant decrease in the average number of injuries for the intervention group before and after implementing the program. The results suggest the intervention reduced youth soccer sport injuries, with a high level of satisfaction and commitment from the athletes, as well as high acceptance from the coaches. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Compilation of excitation cross sections for He atoms by electron impact

    International Nuclear Information System (INIS)

    Kato, T.; Itikawa, Y.; Sakimoto, K.

    1992-03-01

    Experimental and theoretical data are compiled on the cross section for the excitation of He atoms by electron impact. The available data are compared graphically. The survey of the literature has been made through the end 1991. (author)

  14. Digital compilation bedrock geologic map of the South Mountain quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-3A Stanley, R.S., DelloRusso, V., Tauvers, P.R., DiPietro, J.A., Taylor, S., and Prahl, C., 1995, Digital compilation bedrock geologic map of...

  15. Comprehensive Technical Report, General Electric Direct-Air-Cycle Aircraft Nuclear Propulsion Program, Program Summary and References

    Energy Technology Data Exchange (ETDEWEB)

    Thornton, G.; Rothstein, A.J.

    1962-06-28

    This is one of twenty-one volumes summarizing the Aircraft Nuclear Propulsion Program of the General Electric Company. This volume discusses the background to the General Electric program, and summarizes the various direct-air-cycle nuclear test assemblies and power plants that were developed. Because of the requirements of high performance, low weight, and small size, vast improvements in existing technology were required to meet the flight objectives. The technological progress achieved during the program is also summarized. The last appendix contains a compilation of the abstracts, tables of contents, and reference lists of the other twenty volumes.

  16. Software for the ACP [Advanced Computer Program] multiprocessor system

    International Nuclear Information System (INIS)

    Biel, J.; Areti, H.; Atac, R.

    1987-01-01

    Software has been developed for use with the Fermilab Advanced Computer Program (ACP) multiprocessor system. The software was designed to make a system of a hundred independent node processors as easy to use as a single, powerful CPU. Subroutines have been developed by which a user's host program can send data to and get results from the program running in each of his ACP node processors. Utility programs make it easy to compile and link host and node programs, to debug a node program on an ACP development system, and to submit a debugged program to an ACP production system

  17. Language constructs for modular parallel programs

    Energy Technology Data Exchange (ETDEWEB)

    Foster, I.

    1996-03-01

    We describe programming language constructs that facilitate the application of modular design techniques in parallel programming. These constructs allow us to isolate resource management and processor scheduling decisions from the specification of individual modules, which can themselves encapsulate design decisions concerned with concurrency, communication, process mapping, and data distribution. This approach permits development of libraries of reusable parallel program components and the reuse of these components in different contexts. In particular, alternative mapping strategies can be explored without modifying other aspects of program logic. We describe how these constructs are incorporated in two practical parallel programming languages, PCN and Fortran M. Compilers have been developed for both languages, allowing experimentation in substantial applications.

  18. A Global Database of Soil Phosphorus Compiled from Studies Using Hedley Fractionation

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set provides concentrations of soil phosphorus (P) compiled from the peer-reviewed literature that cited the Hedley fractionation method (Hedley...

  19. A Global Database of Soil Phosphorus Compiled from Studies Using Hedley Fractionation

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set provides concentrations of soil phosphorus (P) compiled from the peer-reviewed literature that cited the Hedley fractionation method (Hedley and...

  20. Regulatory and technical reports, compilation for 1979. Volume 4. Bibliographical report Jan-Dec 79

    International Nuclear Information System (INIS)

    Oliu, W.E.; McKenzie, L.; Aragon, R.

    1980-07-01

    The compilation lists formal regulatory and technical reports issued in 1979 by the U.S. Nuclear Regulatory Commission (NRC) staff and by NRC contractors. The compilation is divided into three major sections. The first major section consists of a sequential listing of all NRC reports in report-number order. The first portion of this sequential section lists staff reports, the second portion lists NRC-sponsored conference proceedings, and the third lists contractor reports. Each report citation in the sequential section contains full bibliographic information

  1. LBNL Laboratory Directed Research and Development Program FY2016

    Energy Technology Data Exchange (ETDEWEB)

    Ho, D.

    2017-03-01

    The Berkeley Lab Laboratory Directed Research and Development Program FY2016 report is compiled from annual reports submitted by principal investigators following the close of the fiscal year. This report describes the supported projects and summarizes their accomplishments. It constitutes a part of the LDRD program planning and documentation process that includes an annual planning cycle, project selection, implementation and review.

  2. ANDEX. A PC software assisting the nuclear data compilation in EXFOR

    International Nuclear Information System (INIS)

    Osorio, V.

    1991-01-01

    This document describes the use of personal computer software ANDEX which assists the compilation of experimental nuclear reaction data in the internationally agreed EXFOR format. The software is available upon request, on a set of two diskettes, free of charge. (author)

  3. The Compilation of a Shona Children's Dictionary: Challenges and Solutions

    Directory of Open Access Journals (Sweden)

    Peniah Mabaso

    2011-10-01

    Full Text Available Abstract: This article outlines the challenges encountered by the African Languages Research Institute (ALRI) team members in the compilation of the monolingual Shona Children's Dictionary. The focus is mainly on the problems met in headword selection. Solutions adopted by the team members when dealing with these problems are also presented.

  4. Regulatory and technical reports (Abstract Index Journal). Compilation for third quarter 1985, July-September. Volume 10, No. 3

    International Nuclear Information System (INIS)

    1985-10-01

    This compilation consists of bibliographic data and abstracts for the formal Regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. This compilation covers the period from July through September, 1985

  5. AGRIS: Description of computer programs

    International Nuclear Information System (INIS)

    Schmid, H.; Schallaboeck, G.

    1976-01-01

    The set of computer programs used at the AGRIS (Agricultural Information System) Input Unit at the IAEA, Vienna, Austria to process the AGRIS computer-readable data is described. The processing flow is illustrated. The configuration of the IAEA's computer, a list of error messages generated by the computer, the EBCDIC code table extended for AGRIS and INIS, the AGRIS-6 bit code, the work sheet format, and job control listings are included as appendixes. The programs are written for an IBM 370, model 145, operating system OS or VS, and require a 130K partition. The programming languages are PL/1 (F-compiler) and Assembler

  6. Implementation of evidence-based home visiting programs aimed at reducing child maltreatment: A meta-analytic review.

    Science.gov (United States)

    Casillas, Katherine L; Fauchier, Angèle; Derkash, Bridget T; Garrido, Edward F

    2016-03-01

    In recent years there has been an increase in the popularity of home visitation programs as a means of addressing risk factors for child maltreatment. The evidence supporting the effectiveness of these programs from several meta-analyses, however, is mixed. One potential explanation for this inconsistency explored in the current study involves the manner in which these programs were implemented. In the current study we reviewed 156 studies associated with 9 different home visitation program models targeted to caregivers of children between the ages of 0 and 5. Meta-analytic techniques were used to determine the impact of 18 implementation factors (e.g., staff selection, training, supervision, fidelity monitoring, etc.) and four study characteristics (publication type, target population, study design, comparison group) in predicting program outcomes. Results from analyses revealed that several implementation factors, including training, supervision, and fidelity monitoring, had a significant effect on program outcomes, particularly child maltreatment outcomes. Study characteristics, including the program's target population and the comparison group employed, also had a significant effect on program outcomes. Implications of the study's results for those interested in implementing home visitation programs are discussed. A careful consideration and monitoring of program implementation is advised as a means of achieving optimal study results.
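    The pooling step behind such a meta-analysis can be sketched with a standard DerSimonian-Laird random-effects estimator (a generic illustration of the technique; the effect sizes and variances in the usage example are hypothetical, not the study's data):

    ```python
    import math

    def dl_random_effects(effects, variances):
        """DerSimonian-Laird random-effects pooling of per-study effect sizes."""
        w = [1.0 / v for v in variances]                  # inverse-variance (fixed-effect) weights
        fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
        # Cochran's Q heterogeneity statistic
        q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
        df = len(effects) - 1
        c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - df) / c)                     # between-study variance estimate
        # Re-weight with the between-study variance added in
        w_star = [1.0 / (v + tau2) for v in variances]
        pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
        se = math.sqrt(1.0 / sum(w_star))
        return pooled, se, tau2
    ```

    For example, `dl_random_effects([0.2, 0.35, 0.1], [0.04, 0.03, 0.05])` returns a pooled estimate lying between the smallest and largest study effects, together with its standard error and the estimated between-study variance. Moderator (implementation-factor) analyses then compare such pooled estimates across subgroups of studies.
    
    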

  7. Self-diffusion in electrolyte solutions a critical examination of data compiled from the literature

    CERN Document Server

    Mills, R

    1989-01-01

    This compilation - the first of its kind - fills a real gap in the field of electrolyte data. Virtually all self-diffusion data in electrolyte solutions as reported in the literature have been examined and the book contains over 400 tables covering diffusion in binary and ternary aqueous solutions, in mixed solvents, and of non-electrolytes in various solvents. An important feature of the compilation is that all data have been critically examined and their accuracy assessed. Other features are an introductory chapter in which the methods of measurement are reviewed; appendices containing tables

  8. Statistical Compilation of the ICT Sector and Policy Analysis | Page 5 ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    The project is designed to expand the scope of conventional investigation beyond the telecommunications industry to include other vertically integrated components of the ICT sector such as manufacturing and services. ... Statistical Compilation of the ICT Sector and Policy Analysis project : country experiences; Malaysia.

  9. Building Program Models Incrementally from Informal Descriptions.

    Science.gov (United States)

    1979-10-01

    specified at each step. Since the user controls the interaction, the user may determine the order in which information flows into PMB. Information is received...until only ten years ago the term "automatic programming" referred to the development of the assemblers, macro expanders, and compilers for these

  10. Compilation of the abstracts of nuclear computer codes available at CPD/IPEN

    International Nuclear Information System (INIS)

    Granzotto, A.; Gouveia, A.S. de; Lourencao, E.M.

    1981-06-01

    A compilation of all computer codes available at IPEN in S. Paulo is presented. These computer codes are classified according to the Argonne National Laboratory and Nuclear Energy Agency schedules. (E.G.) [pt

  11. An Optimizing Compiler for Petascale I/O on Leadership-Class Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Kandemir, Mahmut Taylan [PSU; Choudary, Alok [Northwestern; Thakur, Rajeev [ANL

    2014-03-01

    In high-performance computing (HPC), parallel I/O architectures usually have very complex hierarchies with multiple layers that collectively constitute an I/O stack, including high-level I/O libraries such as PnetCDF and HDF5, I/O middleware such as MPI-IO, and parallel file systems such as PVFS and Lustre. Our DOE project explored automated instrumentation and compiler support for I/O-intensive applications. Our project made significant progress towards understanding the complex I/O hierarchies of high-performance storage systems (including storage caches, HDDs, and SSDs), and designing and implementing state-of-the-art compiler/runtime system technology targeting I/O-intensive HPC applications on leadership-class machines. This final report summarizes the major achievements of the project and also points out promising future directions. Two new sections in this report compared to the previous report are IOGenie and SSD/NVM-specific optimizations.

  12. Data compilation of single pion photoproduction below 2 GeV

    International Nuclear Information System (INIS)

    Ukai, K.; Nakamura, T.

    1984-09-01

    An updated data compilation of single pion photoproduction experiments below 2 GeV is presented. This data bank includes not only the data of single pion photoproduction processes but also those of proton Compton scattering (γp → γp) and of the inverse process π⁻p → γn of γn → π⁻p. The total numbers of data points are 6240 for γp → π⁺n, 5715 for γp → π⁰p, 2835 for γn → π⁻p, 177 for γn → π⁰n, 669 for γp → γp, and 112 for π⁻p → γn. The compiled data are stored in the central computer (FACOM M-380R) of the Institute for Nuclear Study, University of Tokyo, for direct use of this data bank, and on magnetic tapes with the standard label for other laboratories. The FACOM computer is compatible with IBM 370 series or IBM 303X or 308X series machines. The data on the magnetic tapes are available on request. (Kato, T.)

  13. Numerical performance and throughput benchmark for electronic structure calculations in PC-Linux systems with new architectures, updated compilers, and libraries.

    Science.gov (United States)

    Yu, Jen-Shiang K; Hwang, Jenn-Kang; Tang, Chuan Yi; Yu, Chin-Hui

    2004-01-01

    A number of recently released numerical libraries, including the Automatically Tuned Linear Algebra Subroutines (ATLAS) library, the Intel Math Kernel Library (MKL), the GOTO numerical library, and the AMD Core Math Library (ACML) for AMD Opteron processors, are linked against the executables of the Gaussian 98 electronic structure calculation package, compiled with updated Fortran compilers such as the Intel Fortran compiler (ifc/efc) 7.1 and the PGI Fortran compiler (pgf77/pgf90) 5.0. The ifc 7.1 delivers about a 3% improvement on 32-bit machines compared to the former version 6.0. The improvement from pgf77 3.3 to 5.0 is also around 3% when the original, unmodified optimization options enclosed in the software are used. Nevertheless, with extensive compiler tuning options the speed can be further accelerated by about 25%. The performances of these fully optimized numerical libraries are similar. The double-precision floating-point (FP) instruction set (SSE2) is also functional on AMD Opteron processors operated in 32-bit compilation, and the Intel Fortran compiler performs the better optimization. Hardware-level tuning is able to improve memory bandwidth by adjusting the DRAM timing, and efficiency in the CL2 mode is a further 2.6% higher than in the CL2.5 mode. The FP throughput is measured by simultaneous execution of two identical copies of each of the test jobs. The resulting performance impact suggests that the IA64 and AMD64 architectures are able to deliver significantly higher throughput than the IA32, which is consistent with the SpecFPrate2000 benchmarks.
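    The throughput measurement described in this record — timing simultaneous execution of identical copies of a job — can be sketched as follows (`fp_job` is a hypothetical stand-in workload, not a Gaussian 98 test job):

    ```python
    import time
    from multiprocessing import Process

    def fp_job(n=200_000):
        # Stand-in floating-point workload; the benchmark used real
        # electronic structure calculations instead.
        s = 0.0
        for i in range(1, n):
            s += (i * 1.000001) ** 0.5
        return s

    def timed_run(copies):
        """Wall-clock time for `copies` identical jobs started simultaneously."""
        procs = [Process(target=fp_job) for _ in range(copies)]
        start = time.perf_counter()
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        return time.perf_counter() - start

    if __name__ == "__main__":
        t1 = timed_run(1)
        t2 = timed_run(2)
        # Scaling near 2.0 means the two copies ran at full speed in parallel;
        # near 1.0 means they serialized (e.g. contention for FP units or memory bus).
        print(f"throughput scaling = {2 * t1 / t2:.2f}")
    ```

    Comparing the one-copy and two-copy wall-clock times in this way exposes contention effects that single-job benchmarks miss, which is the point of the FP-throughput comparison across IA32, IA64, and AMD64 in the record above.
    
    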

  14. JLAPACK – Compiling LAPACK FORTRAN to Java

    Directory of Open Access Journals (Sweden)

    David M. Doolin

    1999-01-01

    Full Text Available The JLAPACK project provides the LAPACK numerical subroutines translated from their subset Fortran 77 source into class files, executable by the Java Virtual Machine (JVM) and suitable for use by Java programmers. This makes it possible for Java applications or applets, distributed on the World Wide Web (WWW), to use established legacy numerical code that was originally written in Fortran. The translation is accomplished using a special-purpose Fortran-to-Java (source-to-source) compiler. The LAPACK API will be considerably simplified to take advantage of Java's object-oriented design. This report describes the research issues involved in the JLAPACK project, and its current implementation and status.

  15. Plans for the NKS-program 1998-2001; Planer for NKS-programmet 1998-2001

    Energy Technology Data Exchange (ETDEWEB)

    Bennerstedt, T [ed.

    1999-08-01

    The present report is a comprehensive compilation of the adopted NKS project plans for the sixth four-year period, 1998-2001. Most of the plans are in English. One is in both English and Danish. One is in Norwegian, with a brief summary in English. Only two of the six appendices are in English. In spite of this, it is believed that the report will serve as a valuable source of information not only to those actually active in or closely following the NKS work, but also the international scientific community, e.g., within EU and in the Baltic States. The research program incorporates reactor safety, radioactive waste, emergency preparedness, radioecology, cross-disciplinary studies, and information issues. The necessary administrative support program, including the NKS Secretariat, is not described herein. Neither is the aim, scope or organization of NKS, since this has been covered elsewhere. (EHS)

  16. Compilation and analysis of national and international OPEX for Safe Enclosure prior to decommissioning

    International Nuclear Information System (INIS)

    Dinner, Paul J.C.; Heimlich, Karel

    2016-01-01

    Around the world, a large number of aging nuclear plants are approaching final shutdown. While this is largely driven by plants reaching the end of their design life, economic factors such as low gas prices (in North America) and the smaller unit size of early commercial reactors are important contributors to this trend. In several instances, economic pressures have resulted in a need for a more rapid transition to Safe Enclosure than originally anticipated. Thus, plans for this transition, taking into account experience with Safe Enclosure periods of varying lengths, are being actively prepared in many jurisdictions. The IAEA as well as other national and international authorities have long recognized the importance of the topic of Safe Enclosure and provided guidance, and the IAEA has recently undertaken a study of 'Lessons Learned from Deferred Decommissioning of Nuclear Facilities'. Beginning with preliminary experience from Canadian CANDU reactors in extended shutdown or safe enclosure, this paper aims to compare this experience with the larger pool of experience from the international community to: - classify the main issues or themes, - examine means to mitigate these, and - formulate general measures of 'good practice'. Compilation of this experience represents the first steps towards a comprehensive, searchable database potentially of use to many in the decommissioning community. Tabulation and analysis of the complete list (comprising approximately 70 cases) has provided the 'short list' of issues presented in Table 1. Examples of the most important listed issues are discussed. The authors' objective is to stimulate interest in extending this compilation. In this way it will continue to grow and benefit all those preparing for transition to decommissioning. (authors)

  17. Compilation and analysis of national and international OPEX for safe enclosure prior to decommissioning

    International Nuclear Information System (INIS)

    Dinner, Paul J.C.; Heimlich, Karel

    2016-01-01

    Around the world, a large number of aging nuclear plants are approaching final shutdown. While this is largely driven by plants reaching the end of their design life, economic factors such as low gas prices (in North America) and the smaller unit size of early commercial reactors are important contributors to this trend. In several instances, economic pressures have resulted in a need for a more rapid transition to Safe Enclosure than originally anticipated. Thus, plans for this transition, taking into account experience with Safe Enclosure periods of varying lengths, are being actively prepared in many jurisdictions. The IAEA as well as other national and international authorities have long recognized the importance of the topic of Safe Enclosure and provided guidance [1-7], and the IAEA has recently undertaken a study of 'Lessons Learned from Deferred Decommissioning of Nuclear Facilities' [8]. Beginning with preliminary experience from Canadian CANDU reactors in extended shutdown or safe enclosure, this paper aims to compare this experience with the larger pool of experience from the international community to: - classify the main issues or themes, - examine means to mitigate these, and - formulate general measures of 'good practice'. Compilation of this experience represents the first steps towards a comprehensive, searchable database potentially of use to many in the decommissioning community. Tabulation and analysis of the complete list (comprising approximately 70 cases) has provided the 'short list' of issues presented. Examples of the most important listed issues are discussed. The authors' objective is to stimulate interest in extending this compilation. In this way it will continue to grow and benefit all those preparing for transition to decommissioning. (authors)

  18. NRC/RSR Data Bank Program

    International Nuclear Information System (INIS)

    Bankert, S.F.; Evans, C.D.; Hardy, H.A.; Litteer, G.L.; Schulz, G.L.; Smith, N.C.

    1978-01-01

    The United States Nuclear Regulatory Commission (NRC) has established the NRC/Reactor Safety Research (RSR) Data Bank Program to provide a means of collecting, processing, and making available experimental data from the many domestic and foreign water reactor safety research programs. The NRC/RSR Data Bank Program collects qualified engineering data from experimental program data bases, stores the data in a single data bank in a common format, and makes the data available to users. The program is designed to be user oriented, to minimize the effort required to obtain and manipulate data of interest. The data bank concept and structure embodied in the data bank processing system are applicable to any program where large quantities of scientific (numeric) data are generated and must be compiled, stored, and accessed in order to be made available to multiple users. 3 figures
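    The "common format" idea described above — mapping each experimental program's native records onto a single shared schema — can be sketched as follows (the field names and the `normalize` helper are illustrative assumptions, not the actual NRC/RSR data bank schema):

    ```python
    from dataclasses import dataclass

    @dataclass
    class DataBankRecord:
        """A record in the common format (fields are illustrative only)."""
        program: str    # originating experimental program
        test_id: str
        channel: str    # measured quantity, e.g. a thermocouple channel
        time_s: float
        value: float
        unit: str

    def normalize(raw: dict, program: str) -> DataBankRecord:
        # Map one program's native field names onto the common format,
        # so that users query a single schema regardless of the data source.
        return DataBankRecord(
            program=program,
            test_id=str(raw["run"]),
            channel=raw["sensor"],
            time_s=float(raw["t"]),
            value=float(raw["reading"]),
            unit=raw.get("unit", "unknown"),
        )
    ```

    With every contributing program supplying its own `normalize`-style adapter, the data bank itself stores only `DataBankRecord` instances, which is what makes a single user-facing query interface possible across heterogeneous sources.
    
    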

  19. Priorities for injury prevention in women's Australian football: a compilation of national data from different sources.

    Science.gov (United States)

    Fortington, Lauren V; Finch, Caroline F

    2016-01-01

    Participation in Australian football (AF) has traditionally been male dominated, and current understanding of injury and priorities for prevention are based solely on reports of injuries in male players. Evidence from other sports indicates that injury types differ between males and females. With increasing participation in AF by females, it is important to consider their specific injury and prevention needs. This study aimed to provide a first injury profile for female AF from existing sources. Injury data were compiled from four prospectively recorded data sets relating to female AF: (1) hospital admissions in Victoria, 2008/09-13/14, n=500 injuries; (2) emergency department (ED) presentations in Victoria, 2008/09-2012/13, n=1,879 injuries; (3) insurance claims across Australia, 2004-2013, n=522 injuries; (4) West Australian Women's Football League (WAWFL), 2014 season club data, n=49 injuries. Descriptive results are presented as injury frequencies, injury types, and injuries to body parts. Hospital admissions and ED presentations were dominated by upper limb injuries, representing 47% and 51% of all injuries, respectively, primarily to the wrist/hand at 32% and 40%. Most (65%) insurance claim injuries involved the lower limb, 27% of which were for knee ligament damage. A high proportion of concussions (33%) was reported in the club-collected data. The results provide the first compilation of existing data sets of women's AF injuries and highlight the need for a rigorous and systematic injury surveillance system to be instituted.

  20. Statistical Compilation of the ICT Sector and Policy Analysis | Page 2 ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... to widen and deepen, so too does its impact on economic development. ... The outcomes of such efforts will subsequently inform policy discourse and ... Studies. Statistical Compilation of the ICT Sector and Policy Analysis project : country experiences; Malaysia ... Asian outlook: New growth dependent on new productivity.